From 58ddedec16d8d9662bf377e6db7dcb30342556f2 Mon Sep 17 00:00:00 2001
From: drbh
Date: Fri, 2 Feb 2024 09:58:29 -0500
Subject: [PATCH] Update docs/source/messages_api.md

Co-authored-by: Philipp Schmid <32632186+philschmid@users.noreply.github.com>
---
 docs/source/messages_api.md | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/docs/source/messages_api.md b/docs/source/messages_api.md
index beb141b6..939850aa 100644
--- a/docs/source/messages_api.md
+++ b/docs/source/messages_api.md
@@ -92,7 +92,8 @@ print(chat_completion)
 
 ## Hugging Face Inference Endpoints
 
-TGI is now integrated with [Inference Endpoints](https://huggingface.co/inference-endpoints/dedicated) and can be easily accessed with only a few lines of code. Here's an example of how to use IE with TGI using OpenAI's Python client library:
+The Messages API is integrated with [Inference Endpoints](https://huggingface.co/inference-endpoints/dedicated).
+Every endpoint that uses "Text Generation Inference" with an LLM that has a chat template can now be used. Below is an example of how to call such an endpoint with OpenAI's Python client library:
 
 > **Note:** Make sure to replace `base_url` with your endpoint URL and to include `v1/` at the end of the URL. The `api_key` should be replaced with your Hugging Face API key.
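
For reference, here is a minimal sketch of the kind of call the added paragraph describes: using OpenAI's Python client against a TGI-backed Inference Endpoint's Messages API. The endpoint URL, API key placeholder, model alias, and prompt below are illustrative assumptions, not values taken from the patch.

```python
# A minimal sketch, not part of the patch: calling a TGI-backed Inference
# Endpoint through OpenAI's Python client. Replace base_url and api_key
# with your own endpoint details before running.
from openai import OpenAI

client = OpenAI(
    # Your endpoint URL, including the trailing "v1/" as the note above requires.
    base_url="https://YOUR-ENDPOINT.endpoints.huggingface.cloud/v1/",
    # Your Hugging Face API key.
    api_key="hf_XXXXX",
)

chat_completion = client.chat.completions.create(
    model="tgi",  # assumed alias for the model served by the endpoint
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Why is open-source software important?"},
    ],
    stream=True,
)

# Stream the generated tokens as they arrive.
for chunk in chat_completion:
    print(chunk.choices[0].delta.content, end="")
```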