Wang, Yi
53b6f6e604
Apply suggestions from code review
...
Co-authored-by: Daniël de Kok <me@github.danieldk.eu>
2024-11-18 19:28:07 +08:00
Wang, Yi A
d9a8bbc183
add ipex moe implementation to support Mixtral and PhiMoe
...
Signed-off-by: Wang, Yi A <yi.a.wang@intel.com>
2024-10-29 23:54:42 -07:00
Daniël de Kok
64142489b6
Add support for fused MoE Marlin for AWQ ( #2616 )
...
* Add support for fused MoE Marlin for AWQ
This uses the updated MoE Marlin kernels from vLLM.
* Add integration test for AWQ MoE
2024-10-08 11:56:41 +02:00
Daniël de Kok
1c84a30fe6
MoE Marlin: support desc_act for groupsize != -1 ( #2590 )
...
This change uses the updated Marlin MoE kernel from vLLM to support
MoE with activation sorting and groups.
2024-09-30 19:40:25 +02:00
Daniël de Kok
90a1d04a2f
Add support for GPTQ-quantized MoE models using MoE Marlin ( #2557 )
...
This change adds support for MoE models that use GPTQ quantization.
Currently only models with the following properties are supported:
- No `desc_act` with tensor parallelism, unless `group_size=-1`.
- No asymmetric quantization.
- No AWQ.
2024-09-30 11:14:32 +02:00
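The support matrix above can be read as a small predicate. The sketch below is illustrative only: the `QuantConfig` fields and the `is_supported_moe_marlin` helper are hypothetical names, not the repository's API; it simply mirrors the three constraints listed in the commit message.

```python
from dataclasses import dataclass


@dataclass
class QuantConfig:
    """Hypothetical container for the GPTQ settings referenced in the commit."""
    quant_method: str   # e.g. "gptq" or "awq"
    desc_act: bool      # activation-order (act-order) quantization
    group_size: int     # -1 means no grouping
    sym: bool           # symmetric vs. asymmetric quantization


def is_supported_moe_marlin(config: QuantConfig, tensor_parallel: bool) -> bool:
    """Mirror the constraints from the commit message:
    - desc_act with tensor parallelism only when group_size == -1
    - no asymmetric quantization
    - no AWQ (added later in #2616)
    """
    if config.quant_method != "gptq":
        return False
    if not config.sym:
        return False
    if config.desc_act and tensor_parallel and config.group_size != -1:
        return False
    return True


# desc_act + tensor parallelism is rejected unless group_size == -1.
print(is_supported_moe_marlin(QuantConfig("gptq", True, 128, True), tensor_parallel=True))  # False
print(is_supported_moe_marlin(QuantConfig("gptq", True, -1, True), tensor_parallel=True))   # True
```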
Mohit Sharma
f9e561eced
Update ROCm libs and improvements ( #2579 )
...
* style
* update torch
* fix issues
* fix clone
* revert mkl
* added custom PA
* style
* fix style
* style
* hide env var
* fix mixtral model
* add skinny kernel and merge fixes
* fixed style
* fix issue for sliding window models
* addressed review comments
* fix import
* improved error message
* updated default value
* remove import
* fix imports after rebase
* float16 dep
* improve dockerfile
* cleaned dockerfile
2024-09-30 10:54:32 +02:00
Daniël de Kok
3f14cd1420
Add DenseMoELayer and wire it up in Mixtral/Deepseek V2 ( #2537 )
...
This replaces the custom layers in both models.
2024-09-24 14:27:06 +02:00
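For context, a dense MoE layer runs every expert on every token and mixes the outputs with softmax router weights. The sketch below is a minimal, self-contained illustration of that idea in PyTorch; it is not the repository's `DenseMoELayer`, and the module and parameter names are assumptions.

```python
import torch
import torch.nn as nn


class DenseMoE(nn.Module):
    """Minimal dense MoE: every expert processes every token and the
    outputs are combined with softmax router weights (no top-k sparsity)."""

    def __init__(self, hidden_size: int, intermediate_size: int, n_experts: int):
        super().__init__()
        self.router = nn.Linear(hidden_size, n_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(hidden_size, intermediate_size, bias=False),
                nn.SiLU(),
                nn.Linear(intermediate_size, hidden_size, bias=False),
            )
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, hidden_size)
        weights = torch.softmax(self.router(x), dim=-1)                 # (tokens, n_experts)
        expert_out = torch.stack([e(x) for e in self.experts], dim=1)   # (tokens, n_experts, hidden)
        return torch.einsum("te,teh->th", weights, expert_out)


out = DenseMoE(hidden_size=64, intermediate_size=128, n_experts=4)(torch.randn(8, 64))
print(out.shape)  # torch.Size([8, 64])
```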
Daniël de Kok
ce85efa968
Move to moe-kernels package and switch to common MoE layer ( #2511 )
...
* Move to moe-kernels package and switch to common MoE layer
This change introduces the new `moe-kernels` package:
- Add `moe-kernels` as a dependency.
- Introduce a `SparseMoELayer` module that can be used by MoE
models.
- Port over Mixtral and Deepseek.
* Make `cargo check` pass
* Update runner
2024-09-17 18:08:58 +02:00
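A sparse MoE layer, by contrast, routes each token to only its top-k experts and renormalizes the router weights over the selected experts. The sketch below illustrates that routing pattern in plain PyTorch; it is not the `SparseMoELayer` or the `moe-kernels` implementation, and all names are assumptions.

```python
import torch
import torch.nn as nn


class SparseMoE(nn.Module):
    """Minimal top-k sparse MoE: each token is routed to `top_k` experts and
    the selected expert outputs are combined with renormalized router weights."""

    def __init__(self, hidden_size: int, intermediate_size: int,
                 n_experts: int, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(hidden_size, n_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(hidden_size, intermediate_size, bias=False),
                nn.SiLU(),
                nn.Linear(intermediate_size, hidden_size, bias=False),
            )
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, hidden_size)
        logits = self.router(x)                                    # (tokens, n_experts)
        weights, selected = torch.topk(logits, self.top_k, dim=-1)
        weights = torch.softmax(weights, dim=-1)                   # renormalize over the top-k
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for expert_id in range(len(self.experts)):
                mask = selected[:, slot] == expert_id
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * self.experts[expert_id](x[mask])
        return out


out = SparseMoE(hidden_size=64, intermediate_size=128, n_experts=8, top_k=2)(torch.randn(8, 64))
print(out.shape)  # torch.Size([8, 64])
```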