Published March 22, 2023 | Version v1
Poster

A neural network approach to accelerate chemical kinetics codes

  • 1. Leiden Observatory
  • 2. SRON; Leiden Observatory

Description

This study focuses on implementing neural networks to replace the mathematical framework of the one-dimensional chemical kinetics code VULCAN (Tsai et al. 2017; 2021). The underlying time-dependent ordinary differential equations are very time-consuming to solve with numerical methods; the neural network in this study is designed to replace that step. Our data set consists of 13,291 hot-Jupiter atmospheres. Using the gravity gradient, temperature-pressure profiles, initial mixing ratios, and stellar flux as free parameters, the neural network is built to predict the output mixing ratios. The architecture consists of individual autoencoders for each input variable, which reduce the input dimensionality; their encoded representations are then fed into an LSTM-like neural network that is trained on this sequential data. Results show that the autoencoders for the mixing ratios, stellar spectra, and pressure gradients are highly successful in encoding and decoding the data. The temperature and gravity gradients prove more difficult to reconstruct with autoencoders. Training the core network on the original temperature and gravity gradients together with the encoded data to predict the time-dependent output mixing ratios is successful, with errors comparable to the differences between independent chemical kinetics codes (Venot et al. 2012). In 90% of the cases, the fully trained model predicts the evolved mixing ratios of the species in the hot-Jupiter atmosphere simulations with errors in the range [-0.66, 0.65] orders of magnitude. Due to imbalances in the data set, the model solves some examples more accurately than others. The fully trained model is 10^3 times faster than the VULCAN simulations while making accurate predictions.
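The architecture described above lends itself to a compact illustration. The following is a minimal sketch (not the authors' code), assuming a PyTorch implementation: one autoencoder per input variable compresses each profile, and an LSTM-like core maps the concatenated latent vectors to a latent mixing-ratio trajectory. All layer sizes, species counts, and time-step counts are placeholders, not values taken from the poster.

```python
# Hypothetical sketch of the encode -> LSTM core -> decode idea described in
# the abstract. Shapes and layer widths are illustrative only.
import torch
import torch.nn as nn


class ProfileAutoencoder(nn.Module):
    """Compress a 1-D atmospheric input (e.g. a temperature-pressure profile)."""

    def __init__(self, n_in: int, n_latent: int):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_in, 128), nn.ReLU(),
                                     nn.Linear(128, n_latent))
        self.decoder = nn.Sequential(nn.Linear(n_latent, 128), nn.ReLU(),
                                     nn.Linear(128, n_in))

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z


class ChemistryEmulator(nn.Module):
    """LSTM-like core: encoded initial conditions -> latent mixing-ratio sequence."""

    def __init__(self, n_latent_total: int, n_latent_mix: int, n_steps: int):
        super().__init__()
        self.n_steps = n_steps
        self.lstm = nn.LSTM(n_latent_total, 256, batch_first=True)
        self.head = nn.Linear(256, n_latent_mix)

    def forward(self, z0):
        # Repeat the encoded initial state at every time step and let the LSTM
        # predict the latent trajectory of the mixing ratios.
        seq = z0.unsqueeze(1).repeat(1, self.n_steps, 1)
        hidden, _ = self.lstm(seq)
        return self.head(hidden)


if __name__ == "__main__":
    # Dummy shapes: 150 atmospheric layers, 60 species, 10 output time steps.
    ae_mix = ProfileAutoencoder(n_in=150 * 60, n_latent=32)
    ae_tp = ProfileAutoencoder(n_in=150, n_latent=8)
    core = ChemistryEmulator(n_latent_total=32 + 8, n_latent_mix=32, n_steps=10)

    x_mix = torch.randn(4, 150 * 60)   # flattened initial mixing ratios
    x_tp = torch.randn(4, 150)         # temperature-pressure profile
    _, z_mix = ae_mix(x_mix)
    _, z_tp = ae_tp(x_tp)
    z_traj = core(torch.cat([z_mix, z_tp], dim=-1))
    print(z_traj.shape)                # -> torch.Size([4, 10, 32])
```

In this sketch the latent trajectory would be passed back through the mixing-ratio decoder to recover physical abundances; the speed-up comes from replacing the stiff ODE integration with a single forward pass.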

Files

Poster_ESLAB.pdf (18.2 MB)
md5:3f5a37aec808ad0edc1c862c3dffb806