text-generation-inference/backends/gaudi/server
Latest commit 1e56e5fe5c by Wang, Yi A: [gaudi] HuggingFaceM4/idefics2-8b issue fix
batch.prefill_cache_indices is now reset in generate_token instead of forward, so that position_ids can be updated correctly.

Signed-off-by: Wang, Yi A <yi.a.wang@intel.com>
2025-06-12 22:15:33 -07:00
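The ordering described in the commit message can be sketched in a few lines of plain Python. This is a hypothetical, heavily simplified stand-in for the real TGI server code (the `Batch`, `forward`, and `generate_token` names mirror the commit message; the bodies are illustrative only): resetting `prefill_cache_indices` only at the end of `generate_token` keeps the indices visible when `position_ids` are advanced.

```python
# Hypothetical sketch of the fix: clear prefill_cache_indices at the end of
# generate_token, not inside forward, so the position-id update between the
# two steps still sees the prefill state. Not TGI's actual implementation.

class Batch:
    def __init__(self, n):
        self.prefill_cache_indices = list(range(n))  # set during prefill
        self.position_ids = [0] * n

def forward(batch):
    # Model forward pass. After the fix it no longer touches
    # batch.prefill_cache_indices.
    return [0.0] * len(batch.position_ids)  # dummy logits

def generate_token(batch):
    logits = forward(batch)
    if batch.prefill_cache_indices is not None:
        # The indices are still set here, so this step can advance
        # position_ids based on the prefill state.
        batch.position_ids = [p + 1 for p in batch.position_ids]
        batch.prefill_cache_indices = None  # reset only after the update
    return logits
```

Had `forward` cleared the indices (the pre-fix behavior), the `if` branch in `generate_token` would never run and `position_ids` would stay stale.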
integration-tests         Gaudi: Add Integration Test for Gaudi Backend (#3142)   2025-04-07
text_generation_server    [gaudi] HuggingFaceM4/idefics2-8b issue fix             2025-06-12
.gitignore                Add Gaudi Backend (#3055)                               2025-02-28
dill-0.3.7-patch.sh       Add Gaudi Backend (#3055)                               2025-02-28
dill-0.3.8-patch.sh       Add Gaudi Backend (#3055)                               2025-02-28
Makefile                  Add Gaudi Backend (#3055)                               2025-02-28
Makefile-awq              Add Gaudi Backend (#3055)                               2025-02-28
Makefile-eetq             Add Gaudi Backend (#3055)                               2025-02-28
Makefile-fbgemm           Add Gaudi Backend (#3055)                               2025-02-28
Makefile-flash-att        Add Gaudi Backend (#3055)                               2025-02-28
Makefile-flash-att-v2     Add Gaudi Backend (#3055)                               2025-02-28
Makefile-selective-scan   Add Gaudi Backend (#3055)                               2025-02-28
Makefile-vllm             Add Gaudi Backend (#3055)                               2025-02-28
poetry.lock               Remove useless packages (#3253)                         2025-06-03
pyproject.toml            [Gaudi] Remove optimum-habana (#3261)                   2025-06-12
README.md                 Add Gaudi Backend (#3055)                               2025-02-28
requirements.txt          [Gaudi] Remove optimum-habana (#3261)                   2025-06-12

Text Generation Inference Python gRPC Server

A Python gRPC server for Text Generation Inference.

Install

make install

Run

make run-dev