
Published December 13, 2024 | Version 1.0.0

Meson spectroscopy in the Sp(4) gauge theory with three antisymmetric fermions—Analysis workflow

  • 1. Swansea University
  • 2. Pusan National University
  • 3. University of Tsukuba
  • 4. Institute for Basic Science
  • 5. Chung Yuan Christian University
  • 6. University of Plymouth

Description

The workflow in this repository performs the analyses presented in the paper Meson spectroscopy in the Sp(4) gauge theory with three antisymmetric fermions.

Requirements

Notes for Apple silicon users

This project uses Julia, installed via a Conda environment; at the time of writing, this package is not available for Apple silicon. To automate the environment setup process for the workflow, an x86-64 version of Snakemake must therefore be used.

To set this up, create a new x86-64 Conda environment containing Snakemake, forcing Conda to resolve x86-64 (osx-64) packages:

CONDA_SUBDIR=osx-64 conda create -n snakemake_x86 -c conda-forge -c bioconda snakemake

then activate it and install Mamba. At the time of writing, Snakemake is not compatible with Mamba versions 2.0.0 and later, so the version must be constrained for the Conda integration to work correctly:

conda activate snakemake_x86
conda install -c conda-forge 'mamba<2.0.0'

With this environment active, the steps below that involve running snakemake should work correctly.
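As a quick sanity check (this check is a suggestion, not part of the original instructions), you can confirm that the environment resolved x86-64 binaries rather than arm64 ones:

conda activate snakemake_x86
# Should print x86_64; arm64 means the environment was not created for x86-64
python -c 'import platform; print(platform.machine())'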

Setup

  1. Install the dependencies above.

  2. Clone this repository including submodules (or download its Zenodo release and unzip it) and cd into it:

    git clone --recurse-submodules https://github.com/telos-collaboration/antisymmetric_analysis_2024
    cd antisymmetric_analysis_2024
    
  3. Either:

    1. Download the raw_data.zip file from the data release, and extract it into the root of the repository, or

    2. Download the correlators_smear.h5, correlators_wall.h5, flows.h5, and hmc.h5 files from the data release, and place them into the data_assets directory. Instruct Snakemake that these files are up to date by running

      snakemake --touch data_assets/{correlators_smear,correlators_wall,flows,hmc}.h5
      
  4. Download the ensemble_metadata.csv file from the data release, and place it into the metadata directory.
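With setup complete, an optional dry run (a suggestion, not required by the workflow) verifies that Snakemake can find all inputs, listing the jobs it would execute without running anything:

# --dry-run (or -n) reports the planned jobs and stops before executing them
snakemake --cores 1 --use-conda --dry-run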

Running the workflow

The workflow is run using Snakemake:

snakemake --cores 1 --use-conda

where the number 1 may be replaced by the number of CPU cores you wish to allocate to the computation.

Snakemake will automatically download and install all required Python packages. This requires an Internet connection; if you need to run the workflow without Internet access (for example, in some HPC environments), details on how to preinstall the environments can be found in the Snakemake documentation.
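For example, on a node that does have Internet access, the Conda environments can be built ahead of time; this sketch assumes a recent Snakemake version, in which the flag is named --conda-create-envs-only:

# Build all Conda environments needed by the workflow, without running any jobs
snakemake --cores 1 --use-conda --conda-create-envs-only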

Using --cores 6 on a MacBook Pro with an M1 Pro processor, the analysis takes around 13 minutes starting from HDF5 files, and around 5 hours starting from raw data.

Output

Output plots, tables, and definitions are placed in the assets/plots, assets/tables, and assets/definitions directories.

Output data assets are placed into the data_assets directory.

Intermediary data are placed in the intermediary_data directory.

Reusability

This workflow is closely tailored to the data it was originally written to analyse. Additional ensembles may be added to the analysis by placing the relevant files in the raw_data directory and adding corresponding entries to the files in the metadata directory. Tools in the tools directory may help determine some of these metadata inputs, for example the plateaux positions and lengths. However, extending the analysis in this way has not been tested as thoroughly as the rest of the workflow, and is not guaranteed to be straightforward for someone not already familiar with the code.

Acknowledgments

We would like to thank Giacomo Cacciapaglia, Gabriele Ferretti, Thomas Flacke, Anna Hasenfratz, Chulwoo Jung, and Sarada Rajeev, for very helpful discussions during the “PNU Workshop on Composite Higgs: Lattice study and all”, at Haeundae, Busan, in February 2024, where preliminary results of this study were presented. We also thank Will Detmold, Alberto Ramos, and André Walker-Loud for useful discussions. 

The work of EB and BL is supported in part by the EPSRC ExCALIBUR programme ExaTEPP (project EP/X017168/1). 
The work of EB, BL, and MP has been supported by the STFC Consolidated Grant No. ST/X000648/1.
The work of EB has also been supported by the UKRI Science and Technology Facilities Council (STFC) Research Software Engineering Fellowship EP/V052489/1.
The work of DKH was supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (NRF-2017R1D1A1B06033701). 
The work of DKH was further supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (2021R1A4A5031460).
The work of JWL is supported by IBS under the project code, IBS-R018-D1. 
The work of HH and CJDL is supported by the Taiwanese MoST grant 109-2112-M-009-006-MY3 and NSTC grant 112-2112-M-A49-021-MY3. 
The work of CJDL is also supported by Grants No. 112-2639-M-002-006-ASP and No. 113-2119-M-007-013-.
The work of BL and MP has been further supported in part by the STFC Consolidated Grant No. ST/T000813/1.
BL and MP received funding from the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation program under Grant Agreement No. 813942. 
The work of DV is supported by STFC under Consolidated Grant No. ST/X000680/1.

Numerical simulations have been performed on the Swansea University SUNBIRD cluster (part of the Supercomputing Wales project) and AccelerateAI A100 GPU system, on the local HPC clusters in Pusan National University (PNU), in Institute for Basic Science (IBS) and in National Yang Ming Chiao Tung University (NYCU), and on the DiRAC Data Intensive service at Leicester. The Swansea University SUNBIRD system and AccelerateAI are part funded by the European Regional Development Fund (ERDF) via Welsh Government.

The DiRAC Data Intensive service at Leicester is operated by the University of Leicester IT Services, which forms part of the STFC DiRAC HPC Facility (www.dirac.ac.uk). The DiRAC Data Intensive service equipment at Leicester was funded by BEIS capital funding via STFC capital grants ST/K000373/1 and ST/R002363/1 and STFC DiRAC Operations grant ST/R001014/1. DiRAC is part of the National e-Infrastructure.

Files (157.7 kB)

README.md and one further file:

  • 153.3 kB (md5:2b19553ebe7d8ecaf5aa8e1a3e493bbd)
  • 4.4 kB (md5:2133d36835ed875a3799310d0e6270e7)

Additional details

Related works

Is referenced by
Preprint: 10.48550/arXiv.2412.01170 (DOI)
Requires
Dataset: 10.5281/zenodo.13819562 (DOI)

Funding

UK Research and Innovation
Theoretical and Experimental Particle Physics at the Exascale Frontier EP/X017168/1
UK Research and Innovation
Theoretical Particle Physics and Cosmology ST/X000648/1
UK Research and Innovation
Reproducible analysis frameworks in Lattice Field Theory and STFC-enabled computational research in Wales EP/V052489/1
European Commission
EuroPLEx – European network for Particle physics, Lattice field theory and Extreme computing 813942
UK Research and Innovation
Lattice investigations of strongly Interacting theories in the Standard Model and beyond. ST/X000680/1

Software

Repository URL
https://github.com/telos-collaboration/antisymmetric_analysis_2024
Programming language
Python, Snakemake, Julia
Development Status
Inactive