Data of "Self-consistency Reinforced minimal Gated Recurrent Unit for surrogate modeling of history-dependent non-linear problems: application to history-dependent homogenized response of heterogeneous materials"
- University of Liège
Description
Development of the Self-Consistency reinforced Minimal gated Recurrent Unit (SC-MRU)
This directory contains the data and algorithms generated in the publication[^1]
Table of Contents
- Dependencies and Prerequisites
- Structure of Repository
- Part 1: Data preparation
- Part 2: RNN training
- Part 3: Multiscale analysis
- Part 4: Reproduce paper[^1] figures
Dependencies and Prerequisites
- Python, pandas, matplotlib, texttable and latextable are prerequisites for visualising and navigating the data.
- For generating meshes and for visualisation, gmsh (www.gmsh.info) is required.
- For running simulations, cm3Libraries (http://www.ltas-cm3.ulg.ac.be/openSource.htm) is required.
Instructions using apt & pip3 package manager
Instructions for Debian/Ubuntu based workstations are as follows.
python, pandas and dependencies
sudo apt install python3 python3-scipy libpython3-dev python3-numpy python3-pandas
matplotlib, texttable and latextable
pip3 install matplotlib texttable latextable
Pytorch (only needed for runs with cm3Libraries; a quick installation check is given at the end of this section)
- Without GPU
pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cpu
- With GPU
pip3 install torch torchvision torchaudio
Libtorch (for compiling the cells)
- Without GPU: in a local directory (e.g. ~/local, with export TORCHDIR=$HOME/local/libtorch)
wget https://download.pytorch.org/libtorch/cpu/libtorch-shared-with-deps-2.1.1%2Bcpu.zip
unzip libtorch-shared-with-deps-2.1.1%2Bcpu.zip
- With GPU: in a local directory (e.g. ~/local, with export TORCHDIR=$HOME/local/libtorch)
wget https://download.pytorch.org/libtorch/cu121/libtorch-shared-with-deps-2.1.1%2Bcu121.zip
unzip libtorch-shared-with-deps-2.1.1+cu121.zip
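A quick way to verify which PyTorch build was installed (CPU-only or CUDA) before compiling the cells is a short check such as the one below; this is only a convenience snippet, not part of the repository.

```python
# check_torch.py -- sanity check of the installed PyTorch build (illustrative only)
import torch

print("torch version :", torch.__version__)          # e.g. 2.1.1+cpu or 2.1.1+cu121
print("CUDA available:", torch.cuda.is_available())  # False for the CPU-only wheels
if torch.cuda.is_available():
    print("CUDA device  :", torch.cuda.get_device_name(0))
```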
Structure of Repository
- All_Path_Res: results of the direct numerical simulations used as training and testing data, see details in Part 1: Data preparation.
- ConstRVE: script to run direct numerical finite element simulations, see details in Part 1: Data preparation.
- MultiScale: scripts to run and visualise the multiscale analyses, see details in Part 3: Multiscale analysis.
- SC_MRU: implementation of the RNN and scripts to train them, see details in Part 2: RNN training.
- TrainingData: scripts to collect, normalise and truncate the RVEs direct simulation results as training and testing data, see details in Part 1: Data preparation. The directory also contains the stored processed data used in [^1].
- TrainingPaths: scripts to generate the different loading paths for the direct numerical simulations used as training and testing data, see details in Part 1: Data preparation.
Part 1: Data preparation
Generate the loading paths
- TrainingPaths/testGenerationData.py is used to generate random walk paths (a minimal illustrative sketch of such a random walk is given at the end of this subsection), with the options
Rmax = 0.11 # bound on the final Green-Lagrange strain
TimeStep = 1. # in second
EvalStep = [1e-4,5e-3] # bounds on the Green-Lagrange increments
Nmax = 2500 # maximum length of the sequence
k = 4000 # number of paths to generate
- The paths are stored by default in ConstRVE/Paths/. The directory must exist before launching the script. You can change the name in line 123
saveDir = '../ConstRVE'+'/Paths/'
- Examples of generated paths can be found in ConstRVE/PathsExamples/
- The command to be run from the directory TrainingPaths is
(mkdir ../ConstRVE/Paths) #if needed
python3 testGenerationData.py
- TrainingPaths/generationData_Cyclic.py is used to generate random cyclic paths, with the options
Rmax = [np.random.uniform(0.,0.04),np.random.uniform(0.,0.06),np.random.uniform(0.0,0.09),0.12] # bound on the final Green-Lagrange strain is random
TimeStep = 1. # in second
EvalStep = [1e-4,5e-3] # bounds on the Green-Lagrange increments
Nmax = 2500 # maximum length of the sequence
k = 2000 # number of paths to generate
- The paths are stored by default in ConstRVE/Paths/. You can change the name in line 123
saveDir = '../ConstRVE'+'/Paths/'
- The command to be run from the directory TrainingPaths is
(mkdir ../ConstRVE/Paths) #if needed
python3 generationData_Cyclic.py
- TrainingPaths/countPathLength.py gives the average, minimum and maximum lengths of the generated paths and the distribution of the increments ΔR. By default the paths are read in ConstRVE/Paths/, but the directory can be given as an argument. The file can be used to read
- either the generated loading paths
python3 countPathLength.py '../ConstRVE/PathsExamples'
- or the results of the simulations
python3 countPathLength.py '../All_Path_Res/Path_Res9'
- TrainingPaths/graphData.py picks paths randomly in ConstRVE/Paths/ and generates png figures illustrating them.
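The sketch below illustrates the kind of random walk controlled by the options above (Rmax, EvalStep, Nmax). It is only an illustration under assumed conventions (6 independent Green-Lagrange components, uniform increment magnitude); the actual sampling strategy is the one implemented in TrainingPaths/testGenerationData.py.

```python
# random_walk_path_sketch.py -- illustrative random-walk strain path generator.
# The option names mirror testGenerationData.py; the sampling below is an
# assumption for illustration, not the repository's implementation.
import numpy as np

Rmax = 0.11               # bound on the final Green-Lagrange strain measure
EvalStep = [1e-4, 5e-3]   # bounds on the Green-Lagrange increment per step
Nmax = 2500               # maximum length of the sequence
nComp = 6                 # assumed number of independent strain components

rng = np.random.default_rng(0)
E = np.zeros(nComp)       # current Green-Lagrange strain (vector form)
path = [E.copy()]
for step in range(Nmax):
    # random direction with an increment magnitude drawn within EvalStep
    direction = rng.normal(size=nComp)
    direction /= np.linalg.norm(direction)
    dR = rng.uniform(*EvalStep)
    E = E + dR * direction
    path.append(E.copy())
    if np.linalg.norm(E) >= Rmax:   # stop once the strain bound is reached
        break

path = np.array(path)
print(f"generated path of length {len(path)}, final |E| = {np.linalg.norm(path[-1]):.4f}")
```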
Generate the RVEs direct simulation results
- Uses the loading paths existing in ConstRVE/Paths/.
- ConstRVE/rve.geo is the RVE geometry file that can be read by gmsh (www.gmsh.info).
- ConstRVE/rve.msh is the RVE mesh file that can be read by gmsh (www.gmsh.info).
- ConstRVE/utilsFunc.py contains python tools to be used.
- ConstRVE/Rve_withoutInternalVars.py is used to run all the RVE simulations:
- This requires cm3Libraries (http://www.ltas-cm3.ulg.ac.be/openSource.htm).
- All the outputs are stored in All_Path_Res/Path_Res12/; you can change the name in line 71
Path_Res = '../All_Path_Res/Path_Res12/'
- The results of the RVE simulations are saved as a sequence (one configuration per line) of the Green-Lagrange strains and second Piola-Kirchhoff stresses (in columns). One example can be found in All_Path_Res/Path_Res1/data_path1000.csv (a minimal reading sketch is given after this list).
- The command to be run from the ConstRVE directory is
python3 Rve_withoutInternalVars.py
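The following snippet shows one way to inspect a stored RVE result such as the example file mentioned above. The column indices used for the plot are illustrative assumptions; the exact column ordering is defined by ConstRVE/Rve_withoutInternalVars.py.

```python
# inspect_rve_result_sketch.py -- quick inspection of a stored RVE simulation result.
import pandas as pd
import matplotlib.pyplot as plt

# add header=None if the file has no header row
df = pd.read_csv('All_Path_Res/Path_Res1/data_path1000.csv')
print(df.shape)                 # (number of configurations, strain + stress columns)
print(df.columns.tolist())

# illustrative plot: first column (assumed Green-Lagrange component) vs.
# last column (assumed second Piola-Kirchhoff component)
plt.plot(df.iloc[:, 0], df.iloc[:, -1], '.-')
plt.xlabel('first stored column (assumed strain component)')
plt.ylabel('last stored column (assumed stress component)')
plt.tight_layout()
plt.savefig('rve_path_inspection.png')
```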
Collect, normalise and truncate the RVEs direct simulation results as training and testing data
- TrainingData/CheckNanData.py is used to check the integrity of the direct numerical simulation results
- DNS results are read from the All_Path_Res/Path_Res1 to All_Path_Res/Path_Res11 subdirectories.
- The command to be run from TrainingData directory is
python3 CheckNanData.py
- TrainingData/CollectData.py is used to gather all the direct numerical simulation results
- DNS results are read from the All_Path_Res/Path_Res1 to All_Path_Res/Path_Res11 subdirectories. This can be changed in line 31
for ll in range(11):
- It saves the bounds and raw data in TrainingData/Processed_Data/Bounds_GS and TrainingData/Processed_Data/Origin_GS, respectively.
- The command to be run from the TrainingData directory is
python3 CollectData.py
- TrainingData/Normalization.py is used to normalise the gathered direct numerical simulation results
- Bounds and raw data are read from TrainingData/Processed_Data/Bounds_GS and TrainingData/Processed_Data/Origin_GS, respectively.
- It saves the normalized training (75%) and testing (25%) data in TrainingData/Processed_Data/Normalized_GS_Train and TrainingData/Processed_Data/Normalized_GS_Test, respectively.
- The command to be run from TrainingData directory is
python3 Normalization.py
- TrainingData/GaussCollectData.py is an alternative using Gaussian normalisation and is used to gather all the direct numerical simulation results.
- TrainingData/GaussNormalization.py is an alternative that normalises the gathered direct numerical simulation results following a Gaussian distribution.
- TrainingData/Data_Padding.py is used to pad and trim the normalised data (a minimal sketch of the idea is given at the end of this subsection)
- Normalised training (75%) and testing (25%) data are read from TrainingData/Processed_Data/Normalized_GS_Train and TrainingData/Processed_Data/Normalized_GS_Test, respectively.
- The final length of the sequences (including zero padding and trimming) is given by
N = 200
- It saves the trimmed and padded normalised training and testing data in TrainingData/Processed_Data/ as 'GS_Train'N and 'GS_Test'N, respectively.
- The command to be run from the TrainingData directory is
python3 Data_Padding.py
- TrainingData/Tool.py is a list of functions used to normalise data.
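The sketch below illustrates the normalisation and padding/trimming idea described above. It is a conceptual example only: the [-1, 1] mapping, the array layout and the toy data are assumptions, and the reference implementations remain TrainingData/Normalization.py and TrainingData/Data_Padding.py.

```python
# normalise_and_pad_sketch.py -- conceptual illustration of the data preparation:
# min-max normalisation (assumed to map to [-1, 1]) followed by trimming or
# zero-padding of each sequence to a fixed length N.
import numpy as np

def normalise(seq, lower, upper):
    """Map each column of seq from [lower, upper] to [-1, 1] (assumed convention)."""
    return 2.0 * (seq - lower) / (upper - lower) - 1.0

def pad_or_trim(seq, N):
    """Trim a (length, features) sequence to N rows, or zero-pad it at the end."""
    if len(seq) >= N:
        return seq[:N]
    padding = np.zeros((N - len(seq), seq.shape[1]))
    return np.vstack([seq, padding])

# toy example: one sequence of 150 configurations with 12 columns (6 strains + 6 stresses)
rng = np.random.default_rng(0)
seq = rng.uniform(-0.1, 0.1, size=(150, 12))
bounds_low, bounds_up = seq.min(axis=0), seq.max(axis=0)

prepared = pad_or_trim(normalise(seq, bounds_low, bounds_up), N=200)
print(prepared.shape)   # (200, 12)
```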
Part 2: RNN training
Available RNN cells
- The different cells are in the following directories (a generic minimal gated unit sketch is given after this list for orientation)
- SC_MRU/MGRU/NNW_SMRU: Neural network with SMRU recurrent cell.
- SC_MRU/MGRU/NNW_SCMRU_T: Neural network with SC-MRU-T recurrent cell.
- SC_MRU/MGRU/NNW_MGRU: Neural network with original MGRU recurrent cell.
- SC_MRU/MGRU/NNW_MGRU_M: Neural network with modified MGRU recurrent cell.
- SC_MRU/FNN_LeakyReLU/NNW_XFw: Neural network with SC-MRU-I recurrent cell using X feed-forward non-linear transition layers.
- SC_MRU/Quandratic_NLT/NNW_XQ, NNW_Q_Fw, NNW_Fw_Q: Neural network with SC-MRU-I recurrent cell using X quadratic or mixed feed-forward-quadratic non-linear transition layers.
- SC_MRU/ReferenceRNN/NNW_XLayers: Neural network with SC-LMSC recurrent cell using X non-linear transition layers.
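For orientation only, the sketch below implements the classical minimal gated unit (a single forget gate) in PyTorch. The SMRU, SC-MRU-T, SC-MRU-I and (modified) MGRU cells of this repository are defined in the paper[^1] and implemented in the C++ sources (NNW_*.cpp); they differ from this generic cell.

```python
# mgu_cell_sketch.py -- generic minimal gated unit (MGU) cell, for orientation only.
import torch
import torch.nn as nn

class MinimalGatedUnitCell(nn.Module):
    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.lin_f = nn.Linear(input_size + hidden_size, hidden_size)  # forget gate
        self.lin_h = nn.Linear(input_size + hidden_size, hidden_size)  # candidate state

    def forward(self, x, h):
        f = torch.sigmoid(self.lin_f(torch.cat([x, h], dim=-1)))        # gate in (0, 1)
        h_tilde = torch.tanh(self.lin_h(torch.cat([x, f * h], dim=-1)))  # candidate update
        return (1.0 - f) * h + f * h_tilde                               # new hidden state

# usage on a dummy sequence: 6 strain inputs, 120 hidden variables (as for the H120 models)
cell = MinimalGatedUnitCell(input_size=6, hidden_size=120)
h = torch.zeros(1, 120)
for x_t in torch.zeros(50, 1, 6):   # 50 dummy time steps
    h = cell(x_t, h)
print(h.shape)   # torch.Size([1, 120])
```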
Compile the neural network models
- In the adequate directory, e.g. SC_MRU/MGRU/NNW_SMRU for the Neural network with SMRU recurrent cell.
cd build
rm -rf *
cmake -DCMAKE_PREFIX_PATH=$TORCHDIR ..
make
- This creates the RNN_CELL model in the build directory
Train the neural network models
- In the adequate directory, e.g. SC_MRU/MGRU/NNW_SMRU for the Neural network with SMRU recurrent cell.
- Requires to have compiled the RNN model.
- The file Train.py
- Uses the RNN_CELL model compiled in the build directory.
- Uses the trimmed and padded normalised training and testing data in TrainingData/Processed_Data/ as 'GS_Train'N and 'GS_Test'N, respectively.
- Can be modified to use the requested ratios of training and testing sequences of different lengths to prepare the mini-batches (a minimal sketch of this idea is given after this list), e.g.:
ratio = [0.2,0.02,0.6,0.06,0.01,0.9]
PathIn1 = ['../../../TrainingData/Processed_data/GS_Train200','../../../TrainingData/Processed_data/GS_Train2500','../../../TrainingData/Processed_data/GS_Test200','../../../TrainingData/Processed_data/GS_Test2500']
PathIn2 = ['../../../TrainingData/Processed_data/GS_Test200','../../../TrainingData/Processed_data/GS_Test400','../../../TrainingData/Processed_data/GS_Test2500','../../../TrainingData/Processed_data/GS_Test400']
- Saves the mini-batches in
PathOut = "TrainingData.pt"
- Saves the model module-checkpoint.pt and module-checkpoint-optimizer.pt, and the loss evolution Loss.txt, in Module/Hn, with n the number of hidden variables.
- The output directory can be changed in NNW_CELL.cpp of the cell named CELL (and recompiling).
- Warm start can be disabled by commenting
torch::load(net, "./Module/H120/module-checkpoint_CM0.pt");
and
torch::load(optimizer, "./Module/H120/module-optimizer-checkpoint_CM0.pt");
in NNW_CELL.cpp of the cell named CELL (and recompiling).
- Is executed with
python3 Train.py
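The following sketch only illustrates the idea of drawing sequences from several prepared datasets according to prescribed ratios, in the spirit of the ratio/PathIn options above. The tensors are random dummies; the actual mini-batch construction is the one in Train.py.

```python
# minibatch_mix_sketch.py -- mixing sequences from several datasets by ratios.
import torch

ratios   = [0.2, 0.6]                       # fraction of each dataset to draw
datasets = [torch.randn(4000, 200, 12),     # e.g. dummy sequences of length 200
            torch.randn(500, 2500, 12)]     # e.g. dummy sequences of length 2500

selected = []
for ratio, data in zip(ratios, datasets):
    n = int(ratio * data.shape[0])
    idx = torch.randperm(data.shape[0])[:n]  # random subset of the sequences
    selected.append(data[idx])

# sequences of different lengths are kept in separate mini-batch pools
for pool in selected:
    print(pool.shape)
```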
Convert trained c++ models for pyTorch (in view of multiscale simulations)
- SC_MRU/CheckLoss/RU.py: functions used to read and use the rnn models.
- SC_MRU/CheckLoss/Save_RNN_script.py:
- Converts C++ trained models to pyTorch models (a minimal loading sketch is given after the command below)
- Reads the trained models in SC_MRU/CELL_Kind/NNW_CELL/Module/HN/module.pt
- Saves the converted models to SC_MRU/CheckLoss/ and to MultiScale/Model/
- The recurrent cell type is chosen with
InpType = cell[0] # choose the recurrent cell type among the different available cells cell=["SC_MRU_T","SC_MRU_I","SMRU"]
- The command to be run from the directory SC_MRU/CheckLoss/ is
python3 Save_RNN_script.py
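A converted model can then be loaded directly in Python. The sketch below assumes the converted files are TorchScript archives and that the surrogate takes a (batch, sequence length, 6 strain components) input; both are assumptions for illustration, the reference usage being the scripts in SC_MRU/CheckLoss/ and MultiScale/.

```python
# load_converted_model_sketch.py -- load one of the converted surrogate models.
import torch

model = torch.jit.load('MultiScale/Model/FullModel.pt')   # SMRU-based surrogate
model.eval()

with torch.no_grad():
    dummy_strain_path = torch.zeros(1, 200, 6)   # assumed normalised dummy loading path
    output = model(dummy_strain_path)
print(type(output))
```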
Visualise loss evolution and testing results
- SC_MRU/CheckLoss/Data_prepare.py contains functions used for testing.
- SC_MRU/CheckLoss/LossVS_insertN.py tests the effect of testing data augmentation:
- Requires to have compiled the RNN models.
- Cell type and augmentation kind can be modified:
repeat = 10 # number of tests
dataType = 'random' # 'even' or 'random'
InpType = cell[0] # choose the recurrent cell type among the different available cells cell=["SC_MRU_T","SC_MRU_I","SMRU"]
reevaluate = False # False to use the saved loss values and True to evaluate the loss values
fastshifting = False # False to visualise before fast shifting (Fig. 11) and True after (Fig. 12). When reevaluate == True, this has no effect: the training with fast-shifting or not has to be done manually, see details.
- Generates new Loss_CELL.txt and TrainingData.pt files in case reevaluate == True, and reads the existing ones if reevaluate == False.
- Generates Figs. 11 and 12 from SC_MRU/CheckLoss/
python3 LossVS_insertN.py
- SC_MRU/CheckLoss/Plot_GS.py tests the different NNWs on the RVE testing paths:
- Requires to have compiled the RNN models.
- Cell type and augmentation kind can be modified:
InpType = cell[0] # choose the recurrent cell type among the different available cells cell=["SC_MRU_T","SC_MRU_I","SMRU"]
- Generates Figs. 13, 14 and 15 from SC_MRU/CheckLoss/
- Command to be run from SC_MRU/CheckLoss/ is
python3 Plot_GS.py
- SC_MRU/CheckLoss/Plot_GS_step.py tests the different pyTorch NNWs on the RVE testing paths:
- Requires to have converted the RNN models.
- Cell type and augmentation kind can be modified:
InpType = cell[0] # choose the recurrent cell type among the different available cells cell=["SC_MRU_T","SC_MRU_I","SMRU"]
- Command to be run from SC_MRU/CheckLoss/ is
python3 Plot_GS_step.py
- SC_MRU/PlotLoss_HiddenV.py shows the loss evolution for the different numbers of hidden variables --after having trained or using saved loss files-- for the SMRU cell (a minimal loss-plotting sketch is given at the end of this subsection):
- Generates Fig. 8.
- Command to be run from SC_MRU/ is
python3 PlotLoss_HiddenV.py
- SC_MRU/PlotLoss_3MRU.py shows the loss evolution for the different recurrent units --after having trained or using saved loss files-- for the 120 hidden variables:
- Generates Fig. 9.
- Command to be run from SC_MRU/ is
python3 PlotLoss_3MRU.py
- SC_MRU/PlotLoss_RefNL_H120.py shows the loss evolution for the different SC-LMSC and SC-MRU-I cells --after having trained or using saved loss files-- for 120 hidden variables:
- Generates Fig. 10.
- Command to be run from SC_MRU/ is
python3 PlotLoss_RefNL_H120.py
- SC_MRU/PlotLoss_MGRU.py shows the loss evolution for the original and modified MGRU --after having trained or using saved loss files-- and for the 120 hidden variables:
- Generates Fig. A.18.
- Command to be run from SC_MRU/ is
python3 PlotLoss_MGRU.py
- SC_MRU/PlotLoss_NL.py shows the loss evolution for the different non-linear transition layers (quadratic and Leaky ReLU) of the SC-MRU-I cell --after having trained or using saved loss files-- and for the 120 hidden variables:
- Generates Fig. B.19.
Case=1 # 0 for quadratic transition blocks and 1 for Leaky ReLU transition blocks
- Command to be run from SC_MRU/ is
python3 PlotLoss_NL.py
- SC_MRU/PlotLoss_DiffNL_H120.py shows the loss evolution for the different non-linear transition layers (hybrid) of the SC-MRU-I cell --after having trained or using saved loss files-- and for the 120 hidden variables:
- Generates Fig. B.20.
- Command to be run from SC_MRU/ is
python3 PlotLoss_DiffNL_H120.py
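The PlotLoss_*.py scripts above are the reference tools for these figures. For a quick look at a single saved loss file, a minimal sketch such as the following can be used; it assumes one loss value per line in Loss.txt and an example path composed from the directory layout described above, both of which are assumptions.

```python
# plot_loss_sketch.py -- minimal loss-curve plot for a saved Loss.txt file.
import numpy as np
import matplotlib.pyplot as plt

# example path composed from the layout described above (adapt as needed)
loss = np.loadtxt('SC_MRU/MGRU/NNW_SMRU/Module/H120/Loss.txt')
if loss.ndim > 1:          # keep only the last column if several are stored
    loss = loss[:, -1]

plt.semilogy(loss)
plt.xlabel('epoch')
plt.ylabel('training loss')
plt.tight_layout()
plt.savefig('loss_evolution.png')
```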
Part 3: Multiscale analysis
Trained surrogate models
- MultiScale/Model: contains the different rnns
- MultiScale/Model/Bounds_GS: Bounds.
- MultiScale/Model/DInpFullModel.pt: trained rnn with SC-MRU-T recurrent cell.
- MultiScale/Model/IncrementModel.pt: trained rnn with SC-MRU-I recurrent cell.
- MultiScale/Model/FullModel.pt: trained rnn with SMRU recurrent cell.
Run multiscale simulations using the surrogates
- MultiScale/2D_MultiScale/model.geo: geometry of the macro-scale model that can be read by gmsh (www.gmsh.info).
- MultiScale/2D_MultiScale/model.msh: mesh of the macro-scale model that can be read by gmsh (www.gmsh.info).
- MultiScale/2D_MultiScale/model.py is used to run all the multiscale simulations:
- This requires cm3Libraries (http://www.ltas-cm3.ulg.ac.be/openSource.htm).
- Uses the bounds and trained models in MultiScale/Model.
InpType = cell[0] # choose the recurrent cell type among the different available cells cell=["SC_MRU_T","SC_MRU_I","SMRU"]
factorStep = 100 # the number of steps x 20 during the reloading stage between points B and C
- The command to be run from the directory MultiScale/2D_MultiScale/ is
python3 model.py
- MultiScale/FE2:
- Contains the reference displacement-force results of the FE2 simulation.
- MultiScale/FE2/Distributions: contains the macro-scale displacement and stress fields. They can be vizualized with gmsh (www.gmsh.info) using the mesh file model.msh.
- MultiScale/SCELL_120_STEPS:
- Contains the reference displacement-force results of the rnn-based multiscale simulations for the different recurrent cells CELL and numbers of steps STEPS during the reloading stage between points B and C.
- For the cases MultiScale/SCELL_120_20, the macro-scale displacement and stress fields are also available and can be visualised with gmsh (www.gmsh.info) using the mesh file model.msh.
Visualise multiscale simulation results
- MultiScale/2D_MultiScale/plot_force.py is used to plot the multiscale simulation results (a hedged plotting sketch is given at the end of this subsection):
InpType = cell[0] # choose the recurrent cell type among the different available cells cell=["SC_MRU_T","SC_MRU_I","SMRU"]
- The multiscale simulation results to be plotted are saved in the directories CELL_120_Step, where Step is the number of steps during the reloading stage between points B and C.
- The command to be run from the directory MultiScale/2D_MultiScale/ is
python3 plot_force.py
- To visualise the macro-scale displacement and stress field distributions:
- MultiScale/FE2/Distributions contains the macro-scale displacement and stress fields of the FE2 simulation. They can be visualised with gmsh (www.gmsh.info) using the mesh file model.msh.
- The MultiScale/SCELL_120_20 directories also contain the macro-scale displacement and stress fields, which can be visualised with gmsh (www.gmsh.info) using the mesh file model.msh.
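For a quick comparison of the stored displacement-force results outside of plot_force.py, a heavily hedged sketch is given below. The file names, the comma-separated format and the two-column (displacement, force) layout are hypothetical placeholders; the actual file layout is the one written by the repository scripts.

```python
# compare_force_sketch.py -- hedged sketch of a force-displacement comparison
# between the FE2 reference and an rnn-based multiscale result. File names and
# column layout are hypothetical; plot_force.py is the reference tool.
import numpy as np
import matplotlib.pyplot as plt

ref = np.loadtxt('MultiScale/FE2/force_displacement.csv', delimiter=',')          # hypothetical name
rnn = np.loadtxt('MultiScale/SSMRU_120_20/force_displacement.csv', delimiter=',')  # hypothetical name

plt.plot(ref[:, 0], ref[:, 1], 'k-', label='FE2 reference')
plt.plot(rnn[:, 0], rnn[:, 1], 'r--', label='SMRU surrogate')
plt.xlabel('displacement')
plt.ylabel('force')
plt.legend()
plt.tight_layout()
plt.savefig('force_displacement_comparison.png')
```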
Part 4: Reproduce paper[^1] figures
- Fig. 7: The command to be run from the directory TrainingPaths is
python3 countPathLength.py '../ConstRVE/PathsExamples'
- Fig. 9: The command to be run from the directory SC_MRU/ is
python3 PlotLoss_HiddenV.py
- Fig. 10: The command to be run from the directory SC_MRU/ is
python3 PlotLoss_3MRU.py
- Fig. 11: The command to be run from the directory SC_MRU/ is
python3 PlotLoss_DiffNL_H120.py
- Figs. 12 and 13: The command to be run from the directory SC_MRU/CheckLoss/ is, see details
python3 LossVS_insertN.py
- Figs. 14, 15 and 16: The command to be run from the directory SC_MRU/CheckLoss/
python3 Plot_GS.py
- Figs. 17(b)(c)(d): The command to be run from the directory MultiScale/2D_MultiScale/ is, see details:
python3 plot_force.py
- Figs. 18-23: Need gmsh to visualise the results stored in MultiScale/FE2/Distributions and MultiScale/SCELL_120_20, see details
- Fig. A.24: The command to be run from the directory SC_MRU/ is, see details
python3 PlotLoss_MGRU.py
- Fig. B.25: The command to be run from the directory SC_MRU/ is, see details
python3 PlotLoss_NL.py
- Fig. B.26: The command to be run from the directory SC_MRU/ is, see details
python3 PlotLoss_DiffNL_H120.py
Disclaimer
This project has received funding from the European Union’s Horizon Europe Framework Programme under grant agreement No. 101056682 for the project “DIgital DEsign strategies to certify and mAnufacture Robust cOmposite sTructures (DIDEAROT)”. The contents of this publication are the sole responsibility of ULiege and do not necessarily reflect the opinion of the European Union. Neither the European Union nor the granting authority can be held responsible for them.
- The work is described in:
[^1]: Wu, L. and Noels, L. (2024). "Self-consistency Reinforced minimal Gated Recurrent Unit for surrogate modeling of history-dependent non-linear problems: application to history-dependent homogenized response of heterogeneous materials", Computer Methods in Applied Mechanics and Engineering 424: 116881, doi: 10.1016/j.cma.2024.116881, which can be downloaded. We would be grateful if you could cite this publication in case you use the files.
Files (11.2 GB in total), including All_Path_Res.zip:

| md5 | Size |
|---|---|
| md5:28a96ea53f9fca5a3ab0d93ce0642d74 | 1.7 GB |
| md5:665813980c08d0aea1957f9b7d5536f9 | 134.2 MB |
| md5:9e344f381c5184e71b3f9cbbcc6c99e4 | 64.7 MB |
| md5:df5a5be38168f868bc83fe5da000076e | 26.3 kB |
| md5:84a8471a765a62786273fee803ffd40a | 141.5 MB |
| md5:2ba488e15847317724a8fdf7f56b95fc | 9.2 GB |
| md5:4898cb6e5d8dc06deba33d1fc573580a | 592.4 kB |
Additional details
Related works
- Publication: 10.1016/j.cma.2024.116881 (DOI)