Collections including paper arxiv:2401.13920

- LocMoE: A Low-overhead MoE for Large Language Model Training
  Paper • 2401.13920 • Published • 2
- HyperRouter: Towards Efficient Training and Inference of Sparse Mixture of Experts
  Paper • 2312.07035 • Published • 2
- DEMix Layers: Disentangling Domains for Modular Language Modeling
  Paper • 2108.05036 • Published • 3

- Turn Waste into Worth: Rectifying Top-k Router of MoE
  Paper • 2402.12399 • Published • 2
- CompeteSMoE -- Effective Training of Sparse Mixture of Experts via Competition
  Paper • 2402.02526 • Published • 3
- Buffer Overflow in Mixture of Experts
  Paper • 2402.05526 • Published • 8
- OpenMoE: An Early Effort on Open Mixture-of-Experts Language Models
  Paper • 2402.01739 • Published • 26

- Robust Mixture-of-Expert Training for Convolutional Neural Networks
  Paper • 2308.10110 • Published • 2
- Experts Weights Averaging: A New General Training Scheme for Vision Transformers
  Paper • 2308.06093 • Published • 2
- ConstitutionalExperts: Training a Mixture of Principle-based Prompts
  Paper • 2403.04894 • Published • 2
- Mixture-of-LoRAs: An Efficient Multitask Tuning for Large Language Models
  Paper • 2403.03432 • Published • 1

- Non-asymptotic oracle inequalities for the Lasso in high-dimensional mixture of experts
  Paper • 2009.10622 • Published • 1
- MoE-LLaVA: Mixture of Experts for Large Vision-Language Models
  Paper • 2401.15947 • Published • 48
- MoE-Mamba: Efficient Selective State Space Models with Mixture of Experts
  Paper • 2401.04081 • Published • 70
- MoE-Infinity: Activation-Aware Expert Offloading for Efficient MoE Serving
  Paper • 2401.14361 • Published • 2

- Mixtral of Experts
  Paper • 2401.04088 • Published • 157
- MoE-LLaVA: Mixture of Experts for Large Vision-Language Models
  Paper • 2401.15947 • Published • 48
- MoE-Mamba: Efficient Selective State Space Models with Mixture of Experts
  Paper • 2401.04081 • Published • 70
- EdgeMoE: Fast On-Device Inference of MoE-based Large Language Models
  Paper • 2308.14352 • Published