Mirror of https://github.com/huggingface/text-generation-inference.git
* Move to moe-kernels package and switch to common MoE layer

  This change introduces the new `moe-kernels` package:

  - Add `moe-kernels` as a dependency.
  - Introduce a `SparseMoELayer` module that can be used by MoE models.
  - Port over Mixtral and Deepseek.

* Make `cargo check` pass
* Update runner
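For context, a sparse MoE layer routes each token to a small subset of expert feed-forward networks. The sketch below is a minimal plain-PyTorch illustration of that idea, assuming standard top-k softmax routing as used by Mixtral; it is not the `moe-kernels` API, and all names (`SparseMoESketch`, `n_experts`, `topk`, `renormalize`) are hypothetical. The shared `SparseMoELayer` described in the commit wraps fused kernels instead of the dense Python loop shown here.

```python
# Illustrative sketch only: top-k sparse MoE routing in plain PyTorch.
# The real moe-kernels package uses fused kernels; names are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseMoESketch(nn.Module):
    def __init__(self, hidden_size: int, n_experts: int, topk: int,
                 renormalize: bool = True):
        super().__init__()
        self.topk = topk
        self.renormalize = renormalize
        # Router assigns each token a score per expert.
        self.gate = nn.Linear(hidden_size, n_experts, bias=False)
        # Each expert is a small feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(hidden_size, 4 * hidden_size),
                nn.SiLU(),
                nn.Linear(4 * hidden_size, hidden_size),
            )
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, hidden_size)
        logits = self.gate(x)
        # Keep only the top-k experts per token.
        weights, indices = torch.topk(logits, self.topk, dim=-1)
        if self.renormalize:
            # Re-normalize the selected scores so they sum to 1 per token.
            weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        # Dense loop over experts; fused kernels replace this in practice.
        for e, expert in enumerate(self.experts):
            token_idx, slot = torch.where(indices == e)
            if token_idx.numel():
                out[token_idx] += weights[token_idx, slot, None] * expert(x[token_idx])
        return out


# Usage: route a batch of 8 token embeddings through 4 experts, top-2.
moe = SparseMoESketch(hidden_size=64, n_experts=4, topk=2)
y = moe(torch.randn(8, 64))
```

Centralizing this routing logic in one layer is what lets Mixtral and Deepseek share a single implementation instead of each model carrying its own MoE code.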
| File |
|---|
| test_flash_mixtral_all_params.json |
| test_flash_mixtral_load.json |
| test_flash_mixtral.json |