From 0966704dd6037815381afa8ceb55fb5e9092751b Mon Sep 17 00:00:00 2001
From: Merve Noyan
Date: Tue, 12 Sep 2023 15:52:30 +0200
Subject: [PATCH] Update docs/source/basic_tutorials/non_core_models.md

Co-authored-by: OlivierDehaene
---
 docs/source/basic_tutorials/non_core_models.md | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/docs/source/basic_tutorials/non_core_models.md b/docs/source/basic_tutorials/non_core_models.md
index 280c0991..6f2e6cfa 100644
--- a/docs/source/basic_tutorials/non_core_models.md
+++ b/docs/source/basic_tutorials/non_core_models.md
@@ -17,7 +17,8 @@ docker run --gpus all --shm-size 1g -p 8080:80 -v $volume:/data ghcr.io/huggingf
 Finally, if the model is not on Hugging Face Hub but on your local, you can pass the path to the folder that contains your model like below 👇
 
 ```bash
-docker run --platform linux/x86_64 --shm-size 1g --net=host -p 8080:80 -v $volume:/data -e CUDA_VISIBLE_DEVICES= ghcr.io/huggingface/text-generation-inference:latest --model-id
+# Make sure your model is in the $volume directory
+docker run --shm-size 1g -p 8080:80 -v $volume:/data ghcr.io/huggingface/text-generation-inference:latest --model-id /data/
 ```
 
 You can refer to [transformers docs on custom models](https://huggingface.co/docs/transformers/main/en/custom_models) for more information.