From 189465fd60bdd1d4d3b1a36097bab36d21080fd5 Mon Sep 17 00:00:00 2001
From: Guspan Tanadi <36249910+guspan-tanadi@users.noreply.github.com>
Date: Tue, 4 Apr 2023 15:07:28 +0700
Subject: [PATCH] style: mention top-p and top-k in README logits warper entry for readability

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index fbf94825..cd3ae9aa 100644
--- a/README.md
+++ b/README.md
@@ -46,7 +46,7 @@ to power LLMs api-inference widgets.
 - Quantization with [bitsandbytes](https://github.com/TimDettmers/bitsandbytes)
 - [Safetensors](https://github.com/huggingface/safetensors) weight loading
 - Watermarking with [A Watermark for Large Language Models](https://arxiv.org/abs/2301.10226)
-- Logits warper (temperature scaling, TopP, TopK, repetition penalty, more details see [transformers.generation_logits_process](https://huggingface.co/transformers/v4.1.1/_modules/transformers/generation_logits_process.html))
+- Logits warper (temperature scaling, top-p, top-k, repetition penalty; see [transformers.generation_logits_process](https://huggingface.co/transformers/v4.1.1/_modules/transformers/generation_logits_process.html) for more details)
 - Stop sequences
 - Log probabilities
 - Production ready (distributed tracing with Open Telemetry, Prometheus metrics)
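
For context on the bullet this patch touches: temperature scaling, top-p, and top-k are all transformations applied to the next-token logits before sampling. The snippet below is a minimal, illustrative numpy sketch of that behavior only; it is not the implementation behind the linked `transformers.generation_logits_process` module (which operates on batched torch tensors and also covers repetition penalty), and the `warp_logits` helper name is made up for this example.

```python
import numpy as np

def warp_logits(logits, temperature=1.0, top_k=0, top_p=1.0):
    """Illustrative sketch: apply temperature scaling, then top-k and top-p
    filtering, to a 1-D array of raw next-token logits. Filtered tokens get a
    logit of -inf so they receive zero probability after the softmax."""
    logits = np.asarray(logits, dtype=np.float64) / temperature  # temperature scaling

    if top_k > 0:
        # top-k: keep only the k highest-scoring tokens.
        k = min(top_k, logits.size)
        kth_best = np.sort(logits)[-k]
        logits = np.where(logits < kth_best, -np.inf, logits)

    if top_p < 1.0:
        # top-p (nucleus): keep the smallest set of tokens whose cumulative
        # probability reaches top_p, always retaining at least one token.
        order = np.argsort(logits)[::-1]              # tokens sorted by score, descending
        probs = np.exp(logits[order] - np.max(logits))
        probs /= probs.sum()
        cutoff = np.searchsorted(np.cumsum(probs), top_p) + 1
        logits[order[cutoff:]] = -np.inf

    return logits

# Example: warp a toy 4-token vocabulary, then sample from the surviving tokens.
warped = warp_logits([2.0, 1.0, 0.5, -1.0], temperature=0.7, top_k=3, top_p=0.9)
probs = np.exp(warped - np.max(warped))
probs /= probs.sum()
next_token = np.random.choice(len(probs), p=probs)
print(next_token, probs)
```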