huggingface / text-generation-inference
Mirror of https://github.com/huggingface/text-generation-inference.git (synced 2025-04-19 22:02:06 +00:00)
Path: text-generation-inference / server / text_generation (at commit b94f30215f)
Latest commit b94f30215f by Nicolas Patry, 2023-01-03 11:07:05 +01:00:
fix(server): Use cleanup_tokenization_spaces=False for lossless decoding (#13)
Fixes #12 in the easiest way I could think of.
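The highlighted commit disables tokenization-space cleanup during decoding so that generated text round-trips losslessly. The repository's actual call site is not shown on this page; the sketch below only illustrates the effect with the transformers keyword clean_up_tokenization_spaces (the commit title spells the flag slightly differently), using gpt2 as a placeholder checkpoint.

    # Minimal sketch, not the server's code: show how disabling tokenization-space
    # cleanup keeps the decoded string faithful to what the token ids represent.
    from transformers import AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("gpt2")  # placeholder checkpoint

    ids = tokenizer.encode("Hello , world !")
    # Cleanup (historically the default) normalizes spaces around punctuation;
    # disabling it preserves the exact text as tokenized.
    print(repr(tokenizer.decode(ids, clean_up_tokenization_spaces=True)))
    print(repr(tokenizer.decode(ids, clean_up_tokenization_spaces=False)))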
Contents of server/text_generation:

models       fix(server): Use cleanup_tokenization_spaces=False for lossless decoding (#13)  2023-01-03 11:07:05 +01:00
pb           feat(server): Support all AutoModelForCausalLM on a best effort basis           2022-10-28 19:24:00 +02:00
__init__.py  feat(server): Support all AutoModelForCausalLM on a best effort basis           2022-10-28 19:24:00 +02:00
cache.py     feat(server): Support AutoModelForSeq2SeqLM                                     2022-11-04 18:03:04 +01:00
cli.py       feat(server): Support all AutoModelForCausalLM on a best effort basis           2022-10-28 19:24:00 +02:00
server.py    feat(server): Support AutoModelForSeq2SeqLM                                     2022-11-04 18:03:04 +01:00
utils.py     fix(server): Fix stop sequences (#11)                                           2022-12-16 16:03:39 +01:00
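Several entries above refer to loading models through AutoModelForCausalLM "on a best effort basis" and to AutoModelForSeq2SeqLM support. The sketch below only illustrates that kind of fallback, not the repository's implementation; load_model and its exception handling are assumptions for illustration.

    # Hypothetical sketch of best-effort loading: try the causal-LM auto class,
    # then fall back to the seq2seq auto class. Not the server's actual logic.
    from transformers import AutoModelForCausalLM, AutoModelForSeq2SeqLM

    def load_model(model_id: str):
        """Best-effort load: causal LM first, then encoder-decoder."""
        try:
            return AutoModelForCausalLM.from_pretrained(model_id)
        except ValueError:
            # Raised when the checkpoint's architecture is not registered
            # for causal LM (e.g. T5-style encoder-decoder models).
            return AutoModelForSeq2SeqLM.from_pretrained(model_id)

    # model = load_model("gpt2")  # placeholder model id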