Published August 18, 2025 | Version v1 | Software | Open
X-MoE: Enabling Scalable Training for Emerging Mixture-of-Experts Architectures on HPC Platforms
Authors/Creators
Description
X-MoE is an optimized cross-platform framework for training large-scale expert-specialized Mixture-of-Experts (MoE) models. It introduces system-level enhancements for improved end-to-end throughput and memory efficiency.
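To illustrate the kind of model X-MoE targets, below is a minimal, hypothetical sketch of top-k expert routing, the core mechanism of an expert-specialized Mixture-of-Experts layer. It is written in PyTorch and is not X-MoE's API; the class and parameter names (SimpleMoE, num_experts, top_k) are illustrative assumptions only.

```python
# Toy top-k MoE layer: a router scores experts per token, each token is sent to
# its top-k experts, and the expert outputs are combined with the router weights.
# Hypothetical example; not taken from the X-MoE codebase.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleMoE(nn.Module):
    def __init__(self, d_model: int, num_experts: int = 4, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(d_model, num_experts)  # router
        self.experts = nn.ModuleList([
            nn.Sequential(
                nn.Linear(d_model, 4 * d_model),
                nn.GELU(),
                nn.Linear(4 * d_model, d_model),
            )
            for _ in range(num_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model). Route each token to its top-k experts.
        scores = F.softmax(self.gate(x), dim=-1)                # (tokens, num_experts)
        weights, indices = scores.topk(self.top_k, dim=-1)      # (tokens, top_k)
        weights = weights / weights.sum(dim=-1, keepdim=True)   # renormalize over the k chosen experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, k] == e                        # tokens whose k-th choice is expert e
                if mask.any():
                    out[mask] += weights[mask, k:k + 1] * expert(x[mask])
        return out


# Example: route 16 tokens of width 64 through the toy layer.
layer = SimpleMoE(d_model=64)
tokens = torch.randn(16, 64)
print(layer(tokens).shape)  # torch.Size([16, 64])
```

In a real training system such as X-MoE, experts are sharded across devices and the per-token routing shown above becomes an all-to-all token exchange, which is where the throughput and memory optimizations matter.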
Files

| Name | Size | Checksum |
|---|---|---|
| X-MoE.zip | 411.9 MB | md5:2d06a064bf92507f670934b50e05c09f |
Additional details
Software
- Repository URL: https://github.com/Supercomputing-System-AI-Lab/X-MoE
- Programming language: Python