Pathway to exascale: experiences in adopting more scalable algorithms in ESPResSo
- Institute for Computational Physics, University of Stuttgart, Germany
Description
Soft matter systems often exhibit physical phenomena that span different time and length scales, which can only be captured in numerical simulations by a multiscale approach combining particle-based and grid-based methods. These algorithms have hardware-dependent performance characteristics and usually leverage one of the following optimizations: CPU vectorization, shared-memory parallelization, or GPU offloading.
The ESPResSo package[1] combines a molecular dynamics (MD) engine with a lattice-Boltzmann (LB) solver, electrostatics solvers, and Monte Carlo schemes to model reactive and charged matter from the nanoscale to the mesoscale, such as gels, energy materials, and biological structures[2]. The LB method is widely used to model solvents and diffusive species that interact with solid boundaries and particles. Its popularity can be explained by its simplicity, reusability in different contexts, and excellent scalability on massively parallel systems. New LB schemes can be rapidly prototyped in Jupyter notebooks using lbmpy[3] and pystencils[4], which rely on a symbolic formulation of the LB method to generate highly optimized, hardware-specific C++ and CUDA kernels that can be reused in waLBerla[5].
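To make the LB method concrete, the following is a minimal NumPy sketch of one D2Q9 BGK collide-and-stream step on a periodic grid. This is purely illustrative and is not code from ESPResSo, lbmpy, or waLBerla: lbmpy generates far more optimized, hardware-specific kernels from an equivalent symbolic description of the collision operator.

```python
import numpy as np

# D2Q9 lattice: discrete velocities and quadrature weights
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

def equilibrium(rho, u):
    """Second-order discrete Maxwell-Boltzmann equilibrium per population."""
    cu = np.einsum('id,xyd->ixy', c, u)           # c_i . u
    usq = np.einsum('xyd,xyd->xy', u, u)          # |u|^2
    return w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

def lb_step(f, omega):
    """One LB time step: BGK collision, then streaming along lattice links."""
    rho = f.sum(axis=0)                                   # density
    u = np.einsum('id,ixy->xyd', c, f) / rho[..., None]   # velocity
    f += omega * (equilibrium(rho, u) - f)                # relax toward f_eq
    for i in range(9):                                    # stream populations
        f[i] = np.roll(f[i], shift=c[i], axis=(0, 1))
    return f

# tiny demo: a fluid at rest on an 8x8 periodic grid stays at rest
f = equilibrium(np.ones((8, 8)), np.zeros((8, 8, 2)))
f = lb_step(f, omega=1.8)
```

The regular stencil and purely local collision are what make the method so amenable to code generation and to scaling on massively parallel machines: each time step touches only a site and its immediate neighbors.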
Originally designed for high-throughput computing, ESPResSo has recently found new scientific applications that require resources only available at high-performance computing (HPC) facilities. Major structural changes were necessary to make efficient use of these resources[6]: replacing the original LB code with waLBerla, a library tailored for HPC; rewriting the MD engine to support data layouts optimized for memory access; and redesigning the particle management code to reduce communication overhead. These changes make ESPResSo more performant, productive, and portable, as well as easily extensible and reusable in other domains of soft matter physics. In collaboration with our partners in the EuroHPC Centre of Excellence MultiXscale, the software is now available on EasyBuild and will be part of the EESSI[7] pilot.
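The data-layout change mentioned above is, in essence, the classic array-of-structs (AoS) versus struct-of-arrays (SoA) trade-off. ESPResSo's MD engine is written in C++; the NumPy sketch below only contrasts the two layouts, and the `kick_drift_soa` kernel is a hypothetical example, not an ESPResSo function.

```python
import numpy as np

n = 100_000

# Array-of-Structs (AoS): one record per particle, fields interleaved
# in memory; reading only positions strides over velocities and charges.
aos = np.zeros(n, dtype=[('pos', 'f8', 3), ('vel', 'f8', 3), ('q', 'f8')])

# Struct-of-Arrays (SoA): one contiguous array per property; a kernel
# that touches only positions reads contiguous, SIMD-friendly memory.
pos = np.zeros((n, 3))
vel = np.zeros((n, 3))
q = np.zeros(n)

def kick_drift_soa(pos, vel, force, dt, mass=1.0):
    """Velocity-Verlet-style kick and drift over contiguous arrays."""
    vel += dt * force / mass
    pos += dt * vel
    return pos, vel

force = np.ones((n, 3))
pos, vel = kick_drift_soa(pos, vel, force, dt=0.01)
```

In the SoA layout, each property update is a streaming pass over one contiguous buffer, which vectorizes well and maps naturally onto GPU memory coalescing; the AoS layout forces strided access for the same kernel.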
References:
[1] Weik et al. "ESPResSo 4.0 – an extensible software package for simulating soft matter systems". In: European Physical Journal Special Topics 227.14, 2019. doi:10.1140/epjst/e2019-800186-9
[2] Weeber et al. "ESPResSo, a Versatile Open-Source Software Package for Simulating Soft Matter Systems". In: Comprehensive Computational Chemistry. Elsevier, 2024. doi:10.1016/B978-0-12-821978-2.00103-3
[3] Bauer et al. "lbmpy: Automatic code generation for efficient parallel lattice Boltzmann methods". In: Journal of Computational Science 49, 2021. doi:10.1016/j.jocs.2020.101269
[4] Bauer et al. "Code generation for massively parallel phase-field simulations". In: Proceedings of the International Conference for High Performance Computing, Networking, Storage and Analysis, 2019. doi:10.1145/3295500.3356186
[5] Bauer et al. "waLBerla: A block-structured high-performance framework for multiphysics simulations". In: Computers & Mathematics with Applications 81, 2021. doi:10.1016/j.camwa.2020.01.007
[6] Grad, Weeber, "Report on the current scalability of ESPResSo and the planned work to extend it". MultiXscale Deliverable, EuroHPC Centre of Excellence MultiXscale, 2023. doi:10.5281/zenodo.8420222
[7] Dröge et al. "EESSI: A cross-platform ready-to-use optimised scientific software stack". In: Software: Practice and Experience 53(1), 2023. doi:10.1002/spe.3075
Files
- derse24_espresso_exascale.pdf (657.6 kB)
Additional details
Funding
- MultiXscale – Centre of Excellence in exascale-oriented application co-design and delivery for multiscale simulations 101093169
- European Commission
- Joint project MultiXscale: HPC Centre of Excellence for multiscale simulations on supercomputers 16HPC095
- Federal Ministry of Education and Research
Dates
- Available: 2024-03-04 (original publication on HIFIS, https://events.hifis.net/event/994/contributions/7959/)
Software
- Repository URL
- https://github.com/espressomd/espresso
- Programming language
- Python, C++, CUDA
- Development Status
- Active