Published March 1, 2023 | Version 23.03
Software | Open
ECP-WarpX/WarpX: 23.03
Creators
- Vay, Jean-Luc (1)
- Almgren, Ann (1)
- Amorim, Lígia Diana (1)
- Andriyash, Igor (2)
- Belkin, Daniel (1)
- Bizzozero, David (1)
- Blelly, Aurore (1)
- Clark, Stephen Eric (3)
- Fedeli, Luca (4)
- Garten, Marco (1)
- Ge, Lixin (5)
- Gott, Kevin (1)
- Harrison, Cyrus (3)
- Huebl, Axel (1)
- Giacomel, Lorenzo (6)
- Groenewald, Roelof E. (7)
- Grote, David (3)
- Gu, Junmin (1)
- Jambunathan, Revathi (1)
- Klion, Hannah (1)
- Kumar, Prabhat (1)
- Thévenet, Maxence (8)
- Richardson, Glenn (1)
- Shapoval, Olga (1)
- Lehe, Remi (1)
- Loring, Burlen (1)
- Miller, Phil (9)
- Myers, Andrew (1)
- Rheaume, Elisa (1)
- Rowan, Michael E. (1)
- Sandberg, Ryan T. (1)
- Scherpelz, Peter (7)
- Yang, Eloise (1)
- Zhang, Weiqun (1)
- Zhao, Yinjian (1)
- Zhu, Kevin Z. (7)
- Zoni, Edoardo (1)
- Zaim, Neïl (4)
- 1. Lawrence Berkeley National Laboratory
- 2. Laboratoire d'Optique Appliquée
- 3. Lawrence Livermore National Laboratory
- 4. LIDYL, CEA-Université Paris-Saclay, CEA Saclay
- 5. SLAC National Accelerator Laboratory
- 6. CERN
- 7. Modern Electron
- 8. Lawrence Berkeley National Laboratory, now DESY
- 9. Intense Computing
Contributors
- 1. Lawrence Berkeley National Laboratory
- 2. Intel
- 3. Modern Electron
- 4. Bloomberg LP
- 5. Helmholtz Institute Jena
- 6. SLAC National Accelerator Laboratory
Description
Dependencies
AMReX: 23.03
PICSAR (development, incl. QED): 1903ecfff51a31a321d39790af90d8520c10537e
picmistandard: release 0.0.22
openPMD-api: releases 0.14.2-0.14.*
This list was generated with `git log 23.02.. --format='- %s'`:
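As a minimal sketch of how that command works, it can be exercised on a throwaway repository (the temporary repo, the demo author identity, and the "Release 23.02" commit message are invented for the illustration; in an actual WarpX checkout the command is run as-is against the existing 23.02 tag):

```shell
# Build a tiny repo with a tagged "previous release" plus one newer commit,
# then list everything since the tag in the changelog's bullet format.
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "Release 23.02"
git tag 23.02   # stand-in for the previous release tag
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "AMReX: Weekly Update (#3677)"
# The double-dot range selects commits reachable from HEAD but not from 23.02;
# '- %s' prefixes each commit subject with a markdown bullet.
git log 23.02.. --format='- %s'
# prints: - AMReX: Weekly Update (#3677)
```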
- Release 23.03 (#3719)
- GB: Added DOI (#3718)
- AMReX: Weekly Update (#3715)
- Change plotfile RZ allowed diags to r,t from x,y (#3663)
- Reduced diagnostics: charge on the embedded boundary (#3648)
- Doc: Lawrencium (LBNL) Machine (#3700)
- Fix getBulkMomentum for Maxwellian distributions (#3705)
- Add option to increase `max_step` or `stop_time` to fill BTD (#3693)
- Doc: Perlmutter 80 GB GPUs (#3706)
- Fix Silver-Mueller boundary condition in 1D (#3703)
- Doc: HPC build/bin (#3675)
- AMReX: Weekly Update (#3701)
- Renamed "particle_vel" to "particle_mom" (#3672)
- Add warning message in RZ for < 2 azimuthal modes (#3655)
- GetExternalEBField: Use AMReX's CompileTimeOption ParallelFor (#3696)
- Add FieldReduction and ParticleHistogram to PICMI (#3697)
- add flag to picmi `FieldDiagnostic` for whether particle data should be saved along with the field data (#3699)
- Update Crusher Modules to cce/15.0.0 and others (#3688)
- Pass seed for gpu in ResetRandomSeed (#3682)
- Updated gitignore with vscode-specific folder (#3685)
- CI & Bug Fixes: 1D & 2D Compile (#3680)
- Fix bug in mirror with F,G fields (#3681)
- Bug fix: retain particles in boundary buffer during Redistribute() (#3679)
- Pre-Commit: Smaller Files & NB (#3678)
- AMReX: Weekly Update (#3677)
Files
Name | Size
---|---
ECP-WarpX/WarpX-23.03.zip (md5:354cebe1f1f1dd33b985771fa74c6ea8) | 6.4 MB
Additional details
Related works
- Is supplement to
- https://github.com/ECP-WarpX/WarpX/tree/23.03 (URL)
References
- Myers A, Almgren A, Amorim LD, Bell J, Fedeli L, Ge L, Gott K, Grote DP, Hogan M, Huebl A, Jambunathan R, Lehe R, Ng C, Rowan M, Shapoval O, Thevenet M, Vay JL, Vincenti H, Yang E, Zaim N, Zhang W, Zhao Y, Zoni E. Porting WarpX to GPU-accelerated platforms. Parallel Computing. 2021 Sep, 108:102833. https://doi.org/10.1016/j.parco.2021.102833
- Fedeli L, Zaim N, Sainte-Marie A, Thevenet M, Huebl A, Myers A, Vay J-L, Vincenti H. PICSAR-QED: a Monte Carlo module to simulate Strong-Field Quantum Electrodynamics in Particle-In-Cell codes for exascale architectures. New Journal of Physics. in press, 2022. https://arxiv.org/abs/2110.00256
- Zoni E, Lehe R, Shapoval O, Belkin D, Zaim N, Fedeli L, Vincenti H, Vay J-L. A Hybrid Nodal-Staggered Pseudo-Spectral Electromagnetic Particle-In-Cell Method with Finite-Order Centering. under review, 2022. https://arxiv.org/abs/2106.12919
- Shapoval O, Lehe R, Thevenet M, Zoni E, Zhao Y, Vay J-L. Overcoming timestep limitations in boosted-frame Particle-In-Cell simulations of plasma-based acceleration. Phys. Rev. E. Nov 2021, 104:055311. https://doi.org/10.1103/PhysRevE.104.055311
- Vay JL, Huebl A, Almgren A, Amorim LD, Bell J, Fedeli L, Ge L, Gott K, Grote DP, Hogan M, Jambunathan R, Lehe R, Myers A, Ng C, Rowan M, Shapoval O, Thevenet M, Vincenti H, Yang E, Zaim N, Zhang W, Zhao Y, Zoni E. Modeling of a chain of three plasma accelerator stages with the WarpX electromagnetic PIC code on GPUs. Physics of Plasmas. 2021 Feb 9, 28(2):023105. https://doi.org/10.1063/5.0028512
- Rowan ME, Gott KN, Deslippe J, Huebl A, Thevenet M, Lehe R, Vay JL. In-situ assessment of device-side compute work for dynamic load balancing in a GPU-accelerated PIC code. PASC '21: Proceedings of the Platform for Advanced Scientific Computing Conference. 2021 July, 10, pages 1-11. https://doi.org/10.1145/3468267.3470614
- Vay JL, Almgren A, Bell J, Ge L, Grote DP, Hogan M, Kononenko O, Lehe R, Myers A, Ng C, Park J, Ryne R, Shapoval O, Thevenet M, Zhang W. Warp-X: A new exascale computing platform for beam–plasma simulations. Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment. 2018 Nov, 909:476-479. https://doi.org/10.1016/j.nima.2018.01.035