text-generation-inference/server/text_generation_server/models
Nicolas Patry 51506aa57a Mllama flash version (#2585)
* Working loading state.

* Preprocessing.

* Working state? (Broke idefics1 temporarily).

* Cleaner condition.

* Fix idefics.

* Updating config, removing TODO

* Mllama

* Upgrade transformers 4.45

* Flashing mllama.

* Starting to get there.

* Working state.

* Integration tests for mllama (cutting to 10 tokens because there seems to be instability after; meaning the size of the batch matters).

* Updating model link.

* Earlier assert.

* Fix vlm?

* remove log.

* Force ignore all images but last.

* Default dtype bfloat16.

* Update integration test after switch to bf16.

* Remove dead code.

* Removed dead code.

* Upgrade the flake to latest transformers/tokenizers

* Move to hf tgi-nix

* Upgrade to 0.5.0
2024-10-27 04:03:57 +00:00
custom_modeling Mllama flash version (#2585) 2024-10-27 04:03:57 +00:00
__init__.py Mllama flash version (#2585) 2024-10-27 04:03:57 +00:00
bloom.py Make Gaudi adapt to the tgi 2.3.0 2024-09-26 06:04:55 +00:00
causal_lm.py Simplify the warmup 2024-10-25 08:38:59 +00:00
flash_causal_lm.py Update ROCM libs and improvements (#2579) 2024-10-25 09:01:04 +00:00
galactica.py feat: add ruff and resolve issue (#2262) 2024-09-25 05:46:24 +00:00
globals.py Make Gaudi adapt to the tgi 2.3.0 2024-09-26 06:04:55 +00:00
idefics_causal_lm.py Mllama flash version (#2585) 2024-10-27 04:03:57 +00:00
mamba.py Fixing exl2 and other quanize tests again. (#2419) 2024-09-25 06:08:38 +00:00
mllama_causal_lm.py Mllama flash version (#2585) 2024-10-27 04:03:57 +00:00
model.py Pass the max_batch_total_tokens to causal_lm 2024-10-23 08:28:26 +00:00
pali_gemma.py feat: add ruff and resolve issue (#2262) 2024-09-25 05:46:24 +00:00
seq2seq_lm.py Fixing exl2 and other quanize tests again. (#2419) 2024-09-25 06:08:38 +00:00
starcoder.py Make Gaudi adapt to the tgi 2.3.0 2024-09-26 06:04:55 +00:00
types.py feat: add ruff and resolve issue (#2262) 2024-09-25 05:46:24 +00:00
vlm_causal_lm.py Merge branch 'habana-main' into 2.3.0 2024-10-23 16:32:12 +08:00