text-generation-inference/server/text_generation_server/models
Nicolas Patry 849bd93dc3 Using an enum for flash backends (paged/flashdecoding/flashinfer) (#2385)
* Using an enum for flash backends (paged/flashdecoding/flashinfer)
* Early exit on server too.
* Clippy.
* Fix clippy and fmt.
2024-09-25 06:04:51 +00:00
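The latest commit above introduces an enum for the three flash attention backends named in its title (paged, flashdecoding, flashinfer), together with an early exit on the server side. A minimal sketch of that idea, assuming hypothetical names (AttentionBackend, resolve_backend) that are not taken from the repository:

```python
# Hypothetical sketch only: AttentionBackend and resolve_backend are illustrative
# names, not the repository's actual identifiers.
from enum import Enum


class AttentionBackend(Enum):
    # The three flash backends named in the commit title.
    PAGED = "paged"
    FLASHDECODING = "flashdecoding"
    FLASHINFER = "flashinfer"


def resolve_backend(name: str) -> AttentionBackend:
    """Map a backend name to the enum, exiting early on unknown values."""
    try:
        return AttentionBackend(name.lower())
    except ValueError as err:
        valid = ", ".join(b.value for b in AttentionBackend)
        raise ValueError(
            f"Unknown attention backend {name!r}; expected one of: {valid}"
        ) from err


print(resolve_backend("flashinfer"))  # AttentionBackend.FLASHINFER
```

Centralizing the allowed values in an enum lets both the launcher and the server validate the requested backend once, instead of comparing free-form strings at each call site.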
custom_modeling fix: prefer hidden_activation over hidden_act in gemma2 (#2381) 2024-09-25 05:55:39 +00:00
__init__.py Update documentation for Supported models (#2386) 2024-09-25 06:04:51 +00:00
bloom.py Refactor dead code - Removing all flash_xxx.py files. (#2166) 2024-09-25 05:20:28 +00:00
causal_lm.py feat: add ruff and resolve issue (#2262) 2024-09-25 05:46:24 +00:00
flash_causal_lm.py Using an enum for flash backends (paged/flashdecoding/flashinfer) (#2385) 2024-09-25 06:04:51 +00:00
galactica.py feat: add ruff and resolve issue (#2262) 2024-09-25 05:46:24 +00:00
globals.py Using an enum for flash backends (paged/flashdecoding/flashinfer) (#2385) 2024-09-25 06:04:51 +00:00
idefics_causal_lm.py feat: add ruff and resolve issue (#2262) 2024-09-25 05:46:24 +00:00
idefics.py enable HuggingFaceM4/idefics-9b in intel gpu (#2338) 2024-09-25 05:55:39 +00:00
mamba.py feat: add ruff and resolve issue (#2262) 2024-09-25 05:46:24 +00:00
model.py feat: add ruff and resolve issue (#2262) 2024-09-25 05:46:24 +00:00
pali_gemma.py feat: add ruff and resolve issue (#2262) 2024-09-25 05:46:24 +00:00
seq2seq_lm.py feat: add ruff and resolve issue (#2262) 2024-09-25 05:46:24 +00:00
types.py feat: add ruff and resolve issue (#2262) 2024-09-25 05:46:24 +00:00
vlm_causal_lm.py fix crash in multi-modal (#2245) 2024-09-25 05:39:58 +00:00