From 06ebc7220befa1673db7d95d937bfb50a6980de3 Mon Sep 17 00:00:00 2001
From: osanseviero
Date: Wed, 16 Aug 2023 18:23:01 +0200
Subject: [PATCH] Fix

---
 docs/source/conceptual/streaming.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/source/conceptual/streaming.md b/docs/source/conceptual/streaming.md
index a74db33b..c2d8a2e6 100644
--- a/docs/source/conceptual/streaming.md
+++ b/docs/source/conceptual/streaming.md
@@ -4,7 +4,7 @@
 With streaming, the server returns the tokens as the LLM generates them. This enables showing progressive generations to the user rather than waiting for the whole generation. Streaming is an essential aspect of the end-user experience as it reduces latency, one of the most critical aspects of a smooth experience.
 
-<div class="flex justify-center">
+