Update README.md (#2827)

Added instructions to clone the repo and change directory into it. 

In the following steps there is a `make install` step that would fail if people have not cloned the repo and changed into its directory, so it may be confusing for some.

Also added a Python venv alternative to conda.
RodriMora 2024-12-11 19:45:49 +01:00 committed by GitHub
parent 82c24f7420
commit cc66dccbe8


@@ -196,14 +196,26 @@ Detailed blogpost by Adyen on TGI inner workings: [LLM inference at scale with T
You can also opt to install `text-generation-inference` locally.
First [install Rust](https://rustup.rs/) and create a Python virtual environment with at least
Python 3.9, e.g. using `conda`:
First clone the repository and change directory into it:
```shell
git clone https://github.com/huggingface/text-generation-inference
cd text-generation-inference
```
Then [install Rust](https://rustup.rs/) and create a Python virtual environment with at least
Python 3.9, e.g. using `conda` or `python venv`:
```shell
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
# using conda
conda create -n text-generation-inference python=3.11
conda activate text-generation-inference
# using Python venv
python3 -m venv .venv
source .venv/bin/activate
```
You may also need to install Protoc.
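For example, here is one possible way to get Protoc on a Debian/Ubuntu machine; this is only an illustrative sketch (the distribution package or the pinned release version shown may differ from what the project expects), not the project's prescribed install method:
```shell
# Option 1: install from the distribution packages (may be an older version)
sudo apt-get install -y protobuf-compiler

# Option 2: install a prebuilt binary from the official protobuf releases
# (version 21.12 is only an example; pick the release you need)
PROTOC_ZIP=protoc-21.12-linux-x86_64.zip
curl -OL "https://github.com/protocolbuffers/protobuf/releases/download/v21.12/$PROTOC_ZIP"
sudo unzip -o "$PROTOC_ZIP" -d /usr/local bin/protoc
sudo unzip -o "$PROTOC_ZIP" -d /usr/local 'include/*'
rm -f "$PROTOC_ZIP"
```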