text-generation-inference/server/text_generation_server/models

Latest commit 7f1816a4e1 by Nicolas Patry (2024-08-27 20:06:11 +02:00):
Change add_special_tokens in order to have the correct tokens for chat input and not (since it's super important with the prefixing now)
custom_modeling Fix: don't apply post layernorm in SiglipVisionTransformer (#2459) 2024-08-26 17:04:46 -04:00
__init__.py feat: validate template variables before apply and improve sliding wi… (#2403) 2024-08-12 10:58:40 -04:00
bloom.py Refactor dead code - Removing all flash_xxx.py files. (#2166) 2024-07-05 10:29:56 +02:00
causal_lm.py Fixing exl2 and other quanize tests again. (#2419) 2024-08-15 11:12:51 +02:00
flash_causal_lm.py Change add_special_tokens in order to have the correct tokens for chat 2024-08-27 20:06:11 +02:00
galactica.py feat: add ruff and resolve issue (#2262) 2024-07-26 10:29:09 -04:00
globals.py This seems to be working. 2024-08-27 20:06:10 +02:00
idefics_causal_lm.py Upgrading exl2. (#2415) 2024-08-14 11:58:08 +02:00
idefics.py Upgrading exl2. (#2415) 2024-08-14 11:58:08 +02:00
mamba.py Fixing exl2 and other quanize tests again. (#2419) 2024-08-15 11:12:51 +02:00
model.py feat: add ruff and resolve issue (#2262) 2024-07-26 10:29:09 -04:00
pali_gemma.py feat: add ruff and resolve issue (#2262) 2024-07-26 10:29:09 -04:00
seq2seq_lm.py Fixing exl2 and other quanize tests again. (#2419) 2024-08-15 11:12:51 +02:00
types.py feat: add ruff and resolve issue (#2262) 2024-07-26 10:29:09 -04:00
vlm_causal_lm.py Prefix caching (#2402) 2024-08-20 11:15:30 +02:00