text-generation-inference/server/text_generation_server/layers/attention

Latest commit: 849bd93dc3 by Nicolas Patry — "Using an enum for flash backens (paged/flashdecoding/flashinfer)" (#2385), 2024-09-25 06:04:51 +00:00

Commit body:
* Early exit on server too.
* Clippy.
* Fix clippy and fmt.
__init__.py feat: add ruff and resolve issue (#2262) 2024-09-25 05:46:24 +00:00
common.py Using an enum for flash backens (paged/flashdecoding/flashinfer) (#2385) 2024-09-25 06:04:51 +00:00
cuda.py Using an enum for flash backens (paged/flashdecoding/flashinfer) (#2385) 2024-09-25 06:04:51 +00:00
flash_attn_triton.py feat: add ruff and resolve issue (#2262) 2024-09-25 05:46:24 +00:00
flash_infer.py Add FlashInfer support (#2354) 2024-09-25 06:01:59 +00:00
ipex.py Pr 2337 ci branch (#2379) 2024-09-25 05:55:39 +00:00
rocm.py Using an enum for flash backens (paged/flashdecoding/flashinfer) (#2385) 2024-09-25 06:04:51 +00:00
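The listing above centers on PR #2385, which replaced ad-hoc backend flags with a single enum covering the paged, flashdecoding, and flashinfer attention paths, with an early exit on unsupported values. The actual TGI implementation lives in `common.py`, `cuda.py`, and `rocm.py`; the sketch below is a hypothetical illustration of that pattern, with all names (`AttentionBackend`, `select_backend`) assumed rather than taken from the source.

```python
from enum import Enum, auto


class AttentionBackend(Enum):
    # Hypothetical enum mirroring the three backends named in PR #2385.
    PAGED = auto()
    FLASHDECODING = auto()
    FLASHINFER = auto()


def select_backend(name: str) -> AttentionBackend:
    """Map a configuration string to a backend, failing fast on unknown values.

    Raising immediately (rather than falling back silently) reflects the
    "early exit on server too" note in the commit body.
    """
    try:
        return AttentionBackend[name.upper()]
    except KeyError:
        raise ValueError(f"Unsupported attention backend: {name!r}")
```

An enum makes the set of supported backends explicit and lets dispatch sites exhaust-match on it, instead of scattering string comparisons across `cuda.py`, `rocm.py`, and friends.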