mirror of
https://github.com/huggingface/text-generation-inference.git
synced 2025-09-11 12:24:53 +00:00
add clarifying example for n-gram speculation
This commit is contained in:
parent
e041c78de4
commit
f9f23aaf2c
@@ -36,7 +36,7 @@ In order to use medusa models in TGI, simply point to a medusa enabled model, an
 If you don't have a medusa model, or don't have the resource to fine-tune, you can try to use `n-gram`.
-Ngram works by trying to find in the previous sequence existing tokens that match, and use those as speculation.
+N-gram works by trying to find matching tokens in the previous sequence, and use those as speculation for generating new tokens. For example, if the tokens "np.mean" appear multiple times in the sequence, the model can speculate that the next continuation of the tokens "np." is probably also "mean".
 This is an extremely simple method, which works best for code, or highly repetitive text. This might not be beneficial, if the speculation misses too much.
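The n-gram speculation the diff describes can be sketched in a few lines: match the trailing n-gram against earlier positions in the sequence, and propose the tokens that followed the most recent earlier match as speculative continuations. This is an illustrative sketch, not TGI's actual (Rust) implementation; the function name and parameters are hypothetical.

```python
def ngram_speculate(tokens, ngram_size=2, num_speculative=3):
    """Return up to `num_speculative` candidate tokens by matching the
    trailing n-gram of `tokens` against earlier occurrences in the sequence.

    Illustrative only: real speculative decoding then verifies these
    candidates with the model in a single forward pass.
    """
    if len(tokens) < ngram_size:
        return []
    tail = tokens[-ngram_size:]
    # Scan backwards for the most recent *earlier* occurrence of the tail.
    for start in range(len(tokens) - ngram_size - 1, -1, -1):
        if tokens[start:start + ngram_size] == tail:
            # Speculate that the same continuation follows again.
            return tokens[start + ngram_size:start + ngram_size + num_speculative]
    return []

# Example mirroring the docs: "np.mean(" was seen before, so after the
# trailing "np", "." we speculate "mean", "(", ... (tokens shown as strings
# for readability; a tokenizer would produce integer ids).
tokens = ["x", "=", "np", ".", "mean", "(", "a", ")", ";", "y", "=", "np", "."]
print(ngram_speculate(tokens))  # → ['mean', '(', 'a']
```

If the speculated tokens are wrong, the model's verification pass simply rejects them, which is why misses only cost wasted compute rather than incorrect output.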