text-generation-inference/backends/gaudi/server/text_generation_server
Latest commit: 778b61c0da by Wang, Yi (2025-07-03 10:03:16 +02:00)
[gaudi] Remove unnecessary reinitialize to HeterogeneousNextTokenChooser to make sampling output correct (#3284)
Signed-off-by: Wang, Yi A <yi.a.wang@intel.com>
Co-authored-by: regisss <15324346+regisss@users.noreply.github.com>
| Name | Last commit | Date |
|------|-------------|------|
| adapters | Add Gaudi Backend (#3055) | 2025-02-28 12:14:58 +01:00 |
| layers | [gaudi] Gemma3 sliding window support (#3280) | 2025-07-01 10:06:01 +02:00 |
| models | [gaudi] Remove unnecessary reinitialize to HeterogeneousNextTokenChooser to make sampling output correct (#3284) | 2025-07-03 10:03:16 +02:00 |
| pb | Add Gaudi Backend (#3055) | 2025-02-28 12:14:58 +01:00 |
| utils | [gaudi] Refine logging for Gaudi warmup (#3222) | 2025-06-18 12:34:00 +02:00 |
| `__init__.py` | Add Gaudi Backend (#3055) | 2025-02-28 12:14:58 +01:00 |
| `cache.py` | Add Gaudi Backend (#3055) | 2025-02-28 12:14:58 +01:00 |
| `cli.py` | [Gaudi] Remove optimum-habana (#3261) | 2025-06-12 22:35:36 +02:00 |
| `interceptor.py` | Add Gaudi Backend (#3055) | 2025-02-28 12:14:58 +01:00 |
| `server.py` | Deepseek R1 for Gaudi backend (#3211) | 2025-05-19 16:36:39 +02:00 |
| `tgi_service.py` | Fix the crash in default ATTENTION path for Gaudi backend (#3235) | 2025-05-20 14:02:32 +02:00 |
| `tracing.py` | Add Gaudi Backend (#3055) | 2025-02-28 12:14:58 +01:00 |