Name                 Last commit message                                                    Last commit date
attention            Fix num_key_value_heads issue                                          2025-05-20 02:29:12 +00:00
awq                  Gaudi: clean cuda/rocm code in hpu backend, enable flat_hpu (#3113)    2025-04-14 15:58:13 +02:00
gptq                 Gaudi: clean cuda/rocm code in hpu backend, enable flat_hpu (#3113)    2025-04-14 15:58:13 +02:00
moe                  Gaudi: clean cuda/rocm code in hpu backend, enable flat_hpu (#3113)    2025-04-14 15:58:13 +02:00
__init__.py          Add Gaudi Backend (#3055)                                              2025-02-28 12:14:58 +01:00
bnb.py               Add Gaudi Backend (#3055)                                              2025-02-28 12:14:58 +01:00
conv.py              Add Gaudi Backend (#3055)                                              2025-02-28 12:14:58 +01:00
exl2.py              Add Gaudi Backend (#3055)                                              2025-02-28 12:14:58 +01:00
fp8.py               Gaudi: clean cuda/rocm code in hpu backend, enable flat_hpu (#3113)    2025-04-14 15:58:13 +02:00
layernorm.py         Gaudi: clean cuda/rocm code in hpu backend, enable flat_hpu (#3113)    2025-04-14 15:58:13 +02:00
linear.py            Gaudi: clean cuda/rocm code in hpu backend, enable flat_hpu (#3113)    2025-04-14 15:58:13 +02:00
lora.py              Add Gaudi Backend (#3055)                                              2025-02-28 12:14:58 +01:00
medusa.py            Add Gaudi Backend (#3055)                                              2025-02-28 12:14:58 +01:00
mlp.py               Add Gaudi Backend (#3055)                                              2025-02-28 12:14:58 +01:00
rotary.py            Gaudi: clean cuda/rocm code in hpu backend, enable flat_hpu (#3113)    2025-04-14 15:58:13 +02:00
speculative.py       Add Gaudi Backend (#3055)                                              2025-02-28 12:14:58 +01:00
tensor_parallel.py   Add Qwen3                                                              2025-05-16 01:53:23 +00:00