From e545d04d8ed6e95c6a161b3b18135ca0cb930419 Mon Sep 17 00:00:00 2001
From: RodriMora
Date: Wed, 11 Dec 2024 18:27:04 +0100
Subject: [PATCH] Update README.md

Added instructions to clone the repo and change directory into it.
In the following steps there is a "make install" step that would fail
if people have not cloned the repo and cd'd into it, so it may be
confusing for some.

Added a Python venv alternative to conda too.

---
 README.md | 16 ++++++++++++++--
 1 file changed, 14 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index 631a97a2..6beb8281 100644
--- a/README.md
+++ b/README.md
@@ -196,14 +196,26 @@ Detailed blogpost by Adyen on TGI inner workings: [LLM inference at scale with T
 
 You can also opt to install `text-generation-inference` locally.
 
-First [install Rust](https://rustup.rs/) and create a Python virtual environment with at least
-Python 3.9, e.g. using `conda`:
+First clone the repository and change directory into it:
+
+```shell
+git clone https://github.com/huggingface/text-generation-inference
+cd text-generation-inference
+```
+
+Then [install Rust](https://rustup.rs/) and create a Python virtual environment with at least
+Python 3.9, e.g. using `conda` or `python venv`:
 
 ```shell
 curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
+# using conda
 conda create -n text-generation-inference python=3.11
 conda activate text-generation-inference
+
+# using python venv
+python3 -m venv .venv
+source .venv/bin/activate
 ```
 
 You may also need to install Protoc.
 
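The `python venv` path this patch adds can be sanity-checked locally; a minimal sketch, assuming `python3` (3.9 or newer) with the standard-library `venv` module is on `PATH`:

```shell
# Assumption: python3 >= 3.9 with the stdlib `venv` module is installed.
# Create the virtual environment next to the checkout, as in the patch.
python3 -m venv .venv
source .venv/bin/activate
# Inside an activated venv, the interpreter's prefix points at the
# .venv directory rather than the system Python.
python -c 'import sys; print(sys.prefix)' | grep -q '\.venv'
```

If the `grep` succeeds, the environment is active and subsequent `make install` steps will install into `.venv` rather than the system interpreter.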