From 4213eb57daebf397b114bf908ab9fb4f67335931 Mon Sep 17 00:00:00 2001
From: Merve Noyan
Date: Thu, 24 Aug 2023 11:38:06 +0300
Subject: [PATCH] Update non_core_models.md

---
 docs/source/basic_tutorials/non_core_models.md | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/docs/source/basic_tutorials/non_core_models.md b/docs/source/basic_tutorials/non_core_models.md
index 623285a5..0f593571 100644
--- a/docs/source/basic_tutorials/non_core_models.md
+++ b/docs/source/basic_tutorials/non_core_models.md
@@ -8,8 +8,10 @@ You can serve these models using Docker like below 👇
 docker run --gpus all --shm-size 1g -p 8080:80 -v $volume:/data ghcr.io/huggingface/text-generation-inference:latest --model-id gpt2
 ```

-If the model you wish to serve is not a transformers model, but weights and implementation is included in the repository, you can still serve the model by passing `--trust-remote-code` flag to `docker run` command like below 👇
+If the model you wish to serve is a custom transformers model whose weights and implementation are included in the repository, you can still serve the model by passing the `--trust-remote-code` flag to the `docker run` command like below 👇

 ```bash
 docker run --gpus all --shm-size 1g -p 8080:80 -v $volume:/data ghcr.io/huggingface/text-generation-inference:latest --model-id <CUSTOM_MODEL_ID> --trust-remote-code
 ```
+
+You can refer to the [transformers docs on custom models](https://huggingface.co/docs/transformers/main/en/custom_models) for more information.
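
Not part of the patch itself, but for context: once a container launched with either of the `docker run` commands above is running, the deployment can be sanity-checked through TGI's HTTP API. Below is a minimal sketch, assuming the container is up and mapped to host port 8080 as in the commands above.

```bash
# Minimal smoke test against a running TGI container
# (assumes host port 8080, as in the docker run commands above).
curl http://localhost:8080/generate \
    -X POST \
    -H 'Content-Type: application/json' \
    -d '{"inputs": "What is Deep Learning?", "parameters": {"max_new_tokens": 20}}'
```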