text-generation-inference/server/text_generation_server/models
drbh 215ed3ad52
fix: attempt forward on flash attn2 to check hardware support (#2335)
* fix: attempt forward on flash attn2 to check hardware support

* fix: warn window_size_left when using flash attn 1

* fix: prefer version check over test op and avoid window_size_left if not flash attn2

* fix: improve conditional and error message

* fix: update sliding window conditional

* fix: simplify changes and revert model changes

* fix: avoid changing conditional

* fix: typo tweak
2024-08-05 09:11:40 -04:00
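
The top commit's idea, per its message, is to probe FlashAttention-2 hardware support by attempting a small forward pass instead of relying only on static checks. A minimal sketch of that probing pattern is below; the helper name `is_fa2_supported`, the dummy tensor shapes, and the fallback behavior are illustrative assumptions and not TGI's actual implementation (only `flash_attn.flash_attn_func` is the real flash-attn v2 API):

```python
# Hypothetical sketch: detect FlashAttention-2 support by running a tiny
# forward pass and catching the runtime error raised on unsupported hardware.
import torch

try:
    import flash_attn  # flash-attn v2 package

    def is_fa2_supported(device: torch.device = torch.device("cuda")) -> bool:
        """Return True if a dummy flash-attn v2 forward pass succeeds on `device`."""
        try:
            # Shape is (batch, seqlen, nheads, headdim); fp16 with headdim 64
            # is a valid minimal input for flash_attn_func.
            q = torch.randn(1, 1, 1, 64, dtype=torch.float16, device=device)
            k = torch.randn(1, 1, 1, 64, dtype=torch.float16, device=device)
            v = torch.randn(1, 1, 1, 64, dtype=torch.float16, device=device)
            flash_attn.flash_attn_func(q, k, v, causal=True)
            return True
        except RuntimeError:
            # Typically raised on GPUs the kernel does not support.
            return False

except ImportError:
    # flash-attn not installed at all: report no FA2 support.
    def is_fa2_supported(device: torch.device = torch.device("cuda")) -> bool:
        return False
```

As the other commit messages suggest, a caller can then gate FA2-only options such as a sliding window (`window_size_left`) on this check, warning or erroring when only FlashAttention-1 is available.
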
custom_modeling Unify attention output handling (#2343) 2024-08-01 17:03:28 +02:00
__init__.py fix: attempt forward on flash attn2 to check hardware support (#2335) 2024-08-05 09:11:40 -04:00
bloom.py Refactor dead code - Removing all flash_xxx.py files. (#2166) 2024-07-05 10:29:56 +02:00
causal_lm.py feat: add ruff and resolve issue (#2262) 2024-07-26 10:29:09 -04:00
flash_causal_lm.py Pr 2290 ci run (#2329) 2024-07-31 10:27:15 -04:00
galactica.py feat: add ruff and resolve issue (#2262) 2024-07-26 10:29:09 -04:00
globals.py Pr 2290 ci run (#2329) 2024-07-31 10:27:15 -04:00
idefics_causal_lm.py feat: add ruff and resolve issue (#2262) 2024-07-26 10:29:09 -04:00
idefics.py enable HuggingFaceM4/idefics-9b in intel gpu (#2338) 2024-08-01 11:08:36 +02:00
mamba.py feat: add ruff and resolve issue (#2262) 2024-07-26 10:29:09 -04:00
model.py feat: add ruff and resolve issue (#2262) 2024-07-26 10:29:09 -04:00
pali_gemma.py feat: add ruff and resolve issue (#2262) 2024-07-26 10:29:09 -04:00
seq2seq_lm.py feat: add ruff and resolve issue (#2262) 2024-07-26 10:29:09 -04:00
types.py feat: add ruff and resolve issue (#2262) 2024-07-26 10:29:09 -04:00
vlm_causal_lm.py fix crash in multi-modal (#2245) 2024-07-24 10:39:08 +02:00