Add mention that it just works with api-hosted models

This commit is contained in:
osanseviero 2023-08-15 23:53:32 +02:00
parent f1fb15f4ae
commit 16a679390e


@@ -16,7 +16,7 @@ curl 127.0.0.1:8080/generate \
## Inference Client
[`huggingface-hub`](https://huggingface.co/docs/huggingface_hub/main/en/index) is a Python library to interact with the Hugging Face Hub, including its endpoints. It provides a high-level class, [`~huggingface_hub.InferenceClient`], which makes it easy to make calls to a TGI endpoint. `InferenceClient` also takes care of parameter validation and provides a simple-to-use interface. At the moment, `InferenceClient` only works for models hosted with the Inference API or Inference Endpoints.
You can simply install the `huggingface-hub` package with pip.
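Once installed, calling a TGI endpoint is a few lines of Python. The sketch below is illustrative: the local URL is a placeholder for wherever your TGI server (or Inference Endpoint) is running, and the prompt and `max_new_tokens` value are arbitrary examples.

```python
# Minimal sketch of using InferenceClient against a TGI endpoint.
# The URL below is a placeholder for your own deployment.
from huggingface_hub import InferenceClient

client = InferenceClient(model="http://127.0.0.1:8080")

# With a TGI server running at that address, a generation call looks like:
# output = client.text_generation("What is Deep Learning?", max_new_tokens=20)
# print(output)
```

`InferenceClient` validates the parameters you pass (for example, rejecting an invalid `max_new_tokens`) before sending the request to the endpoint.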