text-generation-inference/server/text_generation_server/layers/moe

Latest commit: d4f995e718 — Daniël de Kok, 2024-10-25 09:01:04 +00:00
  Add DenseMoELayer and wire it up in Mixtral/Deepseek V2 (#2537)
  This replaces the custom layers in both models.
__init__.py     Add DenseMoELayer and wire it up in Mixtral/Deepseek V2 (#2537)     2024-10-25 09:01:04 +00:00
unquantized.py  Move to moe-kernels package and switch to common MoE layer (#2511)  2024-09-25 06:18:05 +00:00