Published August 18, 2025 | Version v1
Software · Open

X-MoE: Enabling Scalable Training for Emerging Mixture-of-Experts Architectures on HPC Platforms

  • 1. University of Illinois Urbana-Champaign
  • 2. Oak Ridge National Laboratory

Description

X-MoE is a cross-platform framework for training large-scale expert-specialized Mixture-of-Experts (MoE) models. It introduces system-level optimizations that improve end-to-end training throughput and memory efficiency on HPC platforms.
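The description refers to expert-specialized MoE models, in which a router sends each token to a small subset of expert networks. As a hedged illustration only (this is not X-MoE's implementation, and all names such as `gate_w`, `expert_w`, and `top_k` are assumptions for the sketch), a minimal top-k-routed MoE forward pass might look like:

```python
import numpy as np

rng = np.random.default_rng(0)

d_model, num_experts, top_k = 8, 4, 2
tokens = rng.normal(size=(5, d_model))            # 5 input tokens
gate_w = rng.normal(size=(d_model, num_experts))  # router (gating) weights
expert_w = rng.normal(size=(num_experts, d_model, d_model))  # one weight matrix per expert

def moe_forward(x):
    """Route each token to its top-k experts and mix their outputs."""
    logits = x @ gate_w                                   # (tokens, experts)
    topk = np.argsort(logits, axis=-1)[:, -top_k:]        # indices of the k best experts per token
    sel = np.take_along_axis(logits, topk, axis=-1)       # logits of the selected experts
    probs = np.exp(sel - sel.max(-1, keepdims=True))      # softmax over selected logits only
    probs /= probs.sum(-1, keepdims=True)
    out = np.zeros_like(x)
    for t in range(x.shape[0]):                           # combine weighted expert outputs
        for j in range(top_k):
            e = topk[t, j]
            out[t] += probs[t, j] * (x[t] @ expert_w[e])
    return out

y = moe_forward(tokens)
print(y.shape)  # (5, 8)
```

Because only `top_k` of the `num_experts` expert matrices are applied per token, compute grows sub-linearly with total parameter count; scaling this pattern across many devices is the kind of workload the framework targets.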

Files (411.9 MB)

X-MoE.zip (411.9 MB)
md5:2d06a064bf92507f670934b50e05c09f
