Published January 22, 2024 | Version v1
Dataset Open

Data of "Self-consistency Reinforced minimal Gated Recurrent Unit for surrogate modeling of history-dependent non-linear problems: application to history-dependent homogenized response of heterogeneous materials"

  • 1. University of Liege
  • 2. Université de Liège

Description

Development of the Self-consistency Reinforced minimal Gated Recurrent Unit (SC-MRU)

This directory contains the data and algorithms generated in the publication[^1].

Table of Contents

  1. Dependencies and Prerequisites
  2. Structure of Repository
  3. Part 1: Data preparation
  4. Part 2: RNN training
  5. Part 3: Multiscale analysis
  6. Part 4: Reproduce paper[^1] figures

Dependencies and Prerequisites

 

  • Python, pandas, matplotlib, texttable and latextable are prerequisites for visualizing and navigating the data.

  • For generating meshes and for visualization, gmsh (www.gmsh.info) is required.

  • For running simulations, cm3Libraries (http://www.ltas-cm3.ulg.ac.be/openSource.htm) is required.
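Before running any of the scripts, it can be useful to check that the Python prerequisites are importable. A minimal stdlib sketch (the helper name is hypothetical; the package list simply mirrors the dependencies above):

```python
import importlib.util

def missing_packages(names):
    """Return the subset of package names that cannot be imported."""
    return [n for n in names if importlib.util.find_spec(n) is None]

# Packages used for visualizing and navigating the data
print(missing_packages(["pandas", "matplotlib", "texttable", "latextable"]))
```

An empty list means all prerequisites were found.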

Instructions using apt & pip3 package manager

Instructions for Debian/Ubuntu-based workstations are as follows.

python, pandas and dependencies

 sudo apt install python3 python3-scipy libpython3-dev python3-numpy python3-pandas

matplotlib, texttable and latextable

 pip3 install matplotlib texttable latextable

Pytorch (only for run with cm3Libraries)

  • Without GPU
 pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cpu
  • With GPU
 pip3 install torch torchvision torchaudio

Libtorch (for compiling the cells)

  • Without GPU: In a local directory (e.g. ~/local with export TORCHDIR=$HOME/local/libtorch)
 wget https://download.pytorch.org/libtorch/cpu/libtorch-shared-with-deps-2.1.1%2Bcpu.zip
 unzip libtorch-shared-with-deps-2.1.1+cpu.zip
  • With GPU: In a local directory (e.g. ~/local with export TORCHDIR=$HOME/local/libtorch)
 wget https://download.pytorch.org/libtorch/cu121/libtorch-shared-with-deps-2.1.1%2Bcu121.zip
 unzip libtorch-shared-with-deps-2.1.1+cu121.zip
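After unpacking, the libtorch tree can be sanity-checked before exporting TORCHDIR. A minimal sketch (the helper name is hypothetical; it only verifies the standard top-level layout of the libtorch distribution):

```python
import os

def libtorch_tree_ok(torchdir):
    """Check that an unpacked libtorch tree has the expected top-level layout."""
    return all(os.path.isdir(os.path.join(torchdir, d))
               for d in ("include", "lib", "share"))

# Example: libtorch_tree_ok(os.environ.get("TORCHDIR", ""))
```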

Structure of Repository

Part 1: Data preparation

Generate the loading paths

  • TrainingPaths/testGenerationData.py is used to generate random walk paths, with the options
    • Rmax = 0.11 # bound on the final Green Lagrange strain
    • TimeStep = 1. # in second
    • EvalStep = [1e-4,5e-3] #Bounds on the Green Lagrange increments
    • Nmax = 2500 #maximum length of the sequence
    • k = 4000 # number of path to generate
    • The paths are stored by default in ConstRVE/Paths/. The directory has to exist before launching the script. The name can be changed at line 123: saveDir = '../ConstRVE'+'/Paths/'.
    • Examples of generated paths can be found in ConstRVE/PathsExamples/
    • The command to be run from the directory TrainingPaths is
(mkdir ../ConstRVE/Paths) #if needed
python3 testGenerationData.py
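The random-walk generation can be pictured with the following stdlib sketch. It is a hypothetical scalar illustration using the option values listed above (Rmax, EvalStep, Nmax); the actual testGenerationData.py works on full Green-Lagrange strain paths:

```python
import random

def generate_random_walk(rmax=0.11, dr_bounds=(1e-4, 5e-3), nmax=2500, seed=0):
    """Bounded random walk: |R| stays below rmax, each step within dr_bounds."""
    rng = random.Random(seed)
    path = [0.0]
    while len(path) < nmax:
        step = rng.uniform(*dr_bounds) * rng.choice((-1.0, 1.0))
        nxt = path[-1] + step
        if abs(nxt) > rmax:        # reflect the step at the bound
            nxt = path[-1] - step
        path.append(nxt)
    return path
```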
  • TrainingPaths/generationData_Cyclic.py is used to generate random cyclic paths, with the options
    • Rmax = [np.random.uniform(0.,0.04),np.random.uniform(0.,0.06),np.random.uniform(0.0,0.09),0.12] # bound on the final Green Lagrange strain is random
    • TimeStep = 1. # in second
    • EvalStep = [1e-4,5e-3] #Bounds on the Green Lagrange increments
    • Nmax = 2500 #maximum length of the sequence
    • k = 2000 # number of path to generate
    • The paths are stored by default in ConstRVE/Paths/. The name can be changed at line 123: saveDir = '../ConstRVE'+'/Paths/'.
    • The command to be run from the directory TrainingPaths is
(mkdir ../ConstRVE/Paths) #if needed
 python3 generationData_Cyclic.py
  • TrainingPaths/countPathLength.py gives the average, minimum and maximum lengths of the generated paths and the distribution of the \Delta R increments. By default the paths are read from ConstRVE/Paths/, but the directory can be given as an argument. The script can read either the generated loading paths or the stored simulation results, e.g.
 python3 countPathLength.py '../ConstRVE/PathsExamples'
 python3 countPathLength.py '../All_Path_Res/Path_Res9'
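What countPathLength.py reports can be sketched as follows (hypothetical helper, scalar sequences only; the script additionally plots the increment distribution):

```python
def path_length_stats(paths):
    """Min, mean and max sequence lengths, plus all per-step increments."""
    lengths = [len(p) for p in paths]
    deltas = [abs(b - a) for p in paths for a, b in zip(p, p[1:])]
    return min(lengths), sum(lengths) / len(lengths), max(lengths), deltas
```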

Generate the RVEs direct simulation results

 python3 Rve_withoutInternalVars.py

Collect, normalise and truncate the RVE direct simulation results as training and testing data

 python3 CheckNanData.py
 python3 CollectData.py
 python3 Normalization.py
 python3 Data_Padding.py
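The four scripts roughly correspond to filtering corrupted sequences, gathering them, rescaling, and padding to a common length. A minimal sketch of those steps (hypothetical helpers, not the repository code):

```python
import math

def drop_nan_sequences(seqs):
    """Discard any sequence containing a NaN (cf. CheckNanData.py)."""
    return [s for s in seqs if not any(math.isnan(x) for x in s)]

def normalise(seqs):
    """Min-max rescale all values to [0, 1] (cf. Normalization.py)."""
    flat = [x for s in seqs for x in s]
    lo, hi = min(flat), max(flat)
    return [[(x - lo) / (hi - lo) for x in s] for s in seqs]

def pad(seqs, length, value=0.0):
    """Right-pad every sequence to a common length (cf. Data_Padding.py)."""
    return [s + [value] * (length - len(s)) for s in seqs]
```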

Part 2: RNN training

Available RNN cells

Compile the neural network models

  • In the adequate directory, e.g. SC_MRU/MGRU/NNW_SMRU for the Neural network with SMRU recurrent cell.
 cd build
 rm -rf *
 cmake -DCMAKE_PREFIX_PATH=$TORCHDIR ..
 make
  • This creates the RNN_CELL model in the build directory
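For reference, the minimal gated recurrent unit at the core of these cells blends the previous hidden state with a candidate state through a single forget gate. A scalar pure-Python sketch of the standard MGRU update (the compiled cells are C++/libtorch implementations, and the SC-MRU variants add the self-consistency reinforcement on top):

```python
import math

def sigmoid(a):
    return 1.0 / (1.0 + math.exp(-a))

def mgru_step(x, h, wf, uf, bf, wh, uh, bh):
    """One scalar MGRU update: forget gate f blends h with a candidate state."""
    f = sigmoid(wf * x + uf * h + bf)
    h_cand = math.tanh(wh * x + uh * (f * h) + bh)
    return (1.0 - f) * h + f * h_cand

# Unroll the cell over a short input sequence, starting from h = 0
h = 0.0
for x in [0.1, 0.2, -0.1]:
    h = mgru_step(x, h, 0.5, 0.3, 0.0, 0.8, 0.4, 0.0)
```

Because the update is a convex combination of h and a tanh-bounded candidate, the hidden state stays in (-1, 1) when started at zero.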

Train the neural network models

  • In the adequate directory, e.g. SC_MRU/MGRU/NNW_SMRU for the Neural network with SMRU recurrent cell.
  • Requires to have compiled the RNN model.
  • The file Train.py
    • Uses the RNN_CELL model compiled in the build directory.
    • Uses the trimmed, padded and normalised training and testing data in TrainingData/Processed_Data/ as GS_TrainN and GS_TestN, respectively, with N the sequence length.
    • Can be modified to use the requested ratios of training and testing sequences of different lengths to prepare the mini-batches, e.g.:
      • ratio = [0.2,0.02,0.6,0.06,0.01,0.9]
      • PathIn1 = ['../../../TrainingData/Processed_data/GS_Train200','../../../TrainingData/Processed_data/GS_Train2500','../../../TrainingData/Processed_data/GS_Test200','../../../TrainingData/Processed_data/GS_Test2500']
      • PathIn2 = ['../../../TrainingData/Processed_data/GS_Test200','../../../TrainingData/Processed_data/GS_Test400','../../../TrainingData/Processed_data/GS_Test2500','../../../TrainingData/Processed_data/GS_Test400']
    • Saves the mini-batches in
      • PathOut = "TrainingData.pt"
    • Saves the model module-checkpoint.pt and module-checkpoint-optimizer.pt, and loss evolution Loss.txt in
      • Module/Hn with n the number of hidden variables.
      • The output directory can be changed in NNW_CELL.cpp of the cell named CELL (and recompiling).
      • Warm start can be disabled by commenting out torch::load(net, "./Module/H120/module-checkpoint_CM0.pt"); and torch::load(optimizer, "./Module/H120/module-optimizer-checkpoint_CM0.pt"); in NNW_CELL.cpp of the cell named CELL (and recompiling).
      • Is executed with
 python3 Train.py
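The ratio list controls how many sequences of each length are drawn into the mini-batches. A hedged sketch of such ratio-based sampling (hypothetical helper; names do not match Train.py):

```python
import random

def sample_by_ratio(datasets, ratios, seed=0):
    """Draw round(ratio * len(dataset)) sequences from each dataset."""
    rng = random.Random(seed)
    batch = []
    for data, ratio in zip(datasets, ratios):
        batch.extend(rng.sample(data, round(ratio * len(data))))
    return batch
```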

Convert the trained C++ models to PyTorch (in view of multiscale simulations)

python3 Save_RNN_script.py

Visualize loss evolution and testing results

  • SC_MRU/CheckLoss/Data_prepare.py contains functions used for testing.
  • SC_MRU/CheckLoss/LossVS_insertN.py tests the effect of testing data augmentation:
    • Requires to have compiled the RNN models.
    • Cell type and augmentation kind can be modified:
      • repeat = 10 #number of tests
      • dataType ='random' #'even' or 'random'
      • InpType = cell[0] #choose the recurrent cell type among the available cells `cell=["SC_MRU_T","SC_MRU_I","SMRU"]`
      • reevaluate= False # False to use the saved loss values and True to evaluate the loss values
      • fastshifting=False # False to visualize before fast shifting (Fig. 11) and True after (Fig. 12). When reevaluate == True, this has no effect: the training with or without fast-shifting has to be done manually, see details.
    • Generates new Loss_CELL.txt and TrainingData.pt files when reevaluate == True, and reads the existing ones when reevaluate == False.
    • Generates Figs. 11 and 12 from SC_MRU/CheckLoss/
 python3 LossVS_insertN.py
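The 'even'/'random' augmentation modes can be pictured as inserting N interpolated points into a sequence, either spread evenly over the gaps or at randomly chosen ones. A hypothetical scalar sketch (the script itself operates on strain sequences):

```python
import random

def insert_n(seq, n, mode="even", seed=0):
    """Insert n linearly interpolated midpoints into the gaps of a sequence."""
    rng = random.Random(seed)
    gaps = list(range(len(seq) - 1))
    if mode == "even":
        chosen = [gaps[(i * len(gaps)) // n] for i in range(n)]
    else:  # 'random'
        chosen = [rng.choice(gaps) for _ in range(n)]
    out = list(seq)
    # insert (possibly repeated) midpoints, rightmost gap first
    # so earlier insertion indices stay valid
    for g in sorted(chosen, reverse=True):
        out.insert(g + 1, 0.5 * (seq[g] + seq[g + 1]))
    return out
```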
  • SC_MRU/CheckLoss/Plot_GS.py tests the different NNWs on the RVE testing paths:
    • Requires to have compiled the RNN models.
    • Cell type and augmentation kind can be modified:
      • InpType = cell[0] #choose the recurrent cell type among the available cells `cell=["SC_MRU_T","SC_MRU_I","SMRU"]`
    • Generates Figs. 13, 14 and 15 from SC_MRU/CheckLoss/
    • Command to be run from SC_MRU/CheckLoss/ is
 python3 Plot_GS.py
  • SC_MRU/CheckLoss/Plot_GS_step.py tests the different pyTorch NNWs on the RVE testing paths:
    • Requires to have converted the RNN models.
    • Cell type and augmentation kind can be modified:
      • InpType = cell[0] #choose the recurrent cell type among the available cells `cell=["SC_MRU_T","SC_MRU_I","SMRU"]`
    • Command to be run from SC_MRU/CheckLoss/ is
 python3 Plot_GS_step.py
  • Other loss-evolution figures are generated from SC_MRU/ with
 python3 PlotLoss_HiddenV.py
 python3 PlotLoss_3MRU.py
 python3 PlotLoss_RefNL_H120.py
 python3 PlotLoss_MGRU.py
  • SC_MRU/PlotLoss_NL.py shows the loss evolution for the different non-linear transition layers (quadratic and Leaky ReLU) of the SC-MRU-I cell --after having trained or using saved loss files-- and for 120 hidden variables:
    • Generates Fig. B.19.
    • Case=1 # 0 for quadratic transition blocks and 1 for Leaky ReLU transition blocks
    • Command to be run from SC_MRU/ is
 python3 PlotLoss_NL.py
 python3 PlotLoss_DiffNL_H120.py

Part 3: Multiscale analysis

Trained surrogate models

Run multiscale simulations using the surrogates

python3 model.py

Visualize the multiscale simulation results

  • MultiScale/2D_MultiScale/plot_force.py is used to plot the multiscale simulation results:
    • InpType = cell[0] #choose the recurrent cell type among the available cells cell=["SC_MRU_T","SC_MRU_I","SMRU"].
    • The multiscale simulation results to be plotted are saved in the directories CELL_120_Step, where Step is the number of steps during the reloading stage between points B and C.
    • The command to be run from the directory MultiScale/2D_MultiScale/ is
python3 plot_force.py

Part 4: Reproduce paper[^1] figures

  • Fig. 7: The command to be run from the directory TrainingPaths is
 python3 countPathLength.py '../ConstRVE/PathsExamples'
  • Fig. 9: The command to be run from the directory SC_MRU/ is
 python3 PlotLoss_HiddenV.py
  • Fig. 10: The command to be run from the directory SC_MRU/ is
 python3 PlotLoss_3MRU.py
  • Fig. 11: The command to be run from the directory SC_MRU/ is
 python3 PlotLoss_DiffNL_H120.py
 python3 LossVS_insertN.py
 python3 Plot_GS.py
python3 plot_force.py
 python3 PlotLoss_MGRU.py
 python3 PlotLoss_NL.py
 python3 PlotLoss_DiffNL_H120.py

Disclaimer

This project has received funding from the European Union’s Horizon Europe Framework Programme under grant agreement No. 101056682 for the project “DIgital DEsign strategies to certify and mAnufacture Robust cOmposite sTructures (DIDEAROT)”. The contents of this publication are the sole responsibility of ULiege and do not necessarily reflect the opinion of the European Union. Neither the European Union nor the granting authority can be held responsible for them.

  1. The work is described in:
    "Wu, L. and Noels, L. (2024). Self-consistency Reinforced minimal Gated Recurrent Unit for surrogate modeling of history-dependent non-linear problems: application to history-dependent homogenized response of heterogeneous materials. Computer Methods in Applied Mechanics and Engineering 424: 116881, doi: 10.1016/j.cma.2024.116881". We would be grateful if you could cite this publication in case you use the files.

Files

All_Path_Res.zip

Files (11.2 GB)

md5:28a96ea53f9fca5a3ab0d93ce0642d74 (1.7 GB)
md5:665813980c08d0aea1957f9b7d5536f9 (134.2 MB)
md5:9e344f381c5184e71b3f9cbbcc6c99e4 (64.7 MB)
md5:df5a5be38168f868bc83fe5da000076e (26.3 kB)
md5:84a8471a765a62786273fee803ffd40a (141.5 MB)
md5:2ba488e15847317724a8fdf7f56b95fc (9.2 GB)
md5:4898cb6e5d8dc06deba33d1fc573580a (592.4 kB)

Additional details

Related works

Documents
Publication: 10.1016/j.cma.2024.116881 (DOI)

Funding

DIDEAROT – Digital Design strategies to certify and mAnufacture Robust cOmposite sTructures  101056682
European Commission