This work was completed as part of the Flatiron Institute 2022 MLxScience Summer School
This repository contains an implementation of a conditional Glow (cGlow) model used to model neutral hydrogen (HI) maps. The implementation relies heavily on the code from https://github.com/rosinality/glow-pytorch.
The data used to fit the model were maps from the CAMELS simulation project, which simulates galaxy formation using 6 parameters: 2 cosmological and 4 astrophysical. For this work, we were interested in learning a generative model for these HI maps conditional on the two cosmological parameters, $\Omega_m$ and $\sigma_8$.
While primarily focused on HI maps, the code in this repository can be used for any task where a cGlow model conditioned on a few parameters is needed.
The goal of this work was twofold: to generate new HI maps and to perform parameter inference, both within the same model. Normalizing flows, such as Glow, are well suited to this task as they allow for sampling as well as exact likelihood evaluation.
The standard Glow architecture can be summarized by the following schematic:
Here, the flow step shown on the left is repeated $K$ times inside each level, and the multi-scale architecture on the right stacks $L$ such levels.
Glow is built from three types of layers:
- Actnorm: a channel-wise affine transformation
- Invertible convolutions: the core of Glow is the addition of invertible 1x1 convolutions. These layers mix information across the channel dimension while also allowing some scaling
- Affine coupling: the affine coupling layer is the main workhorse of the model. In Glow, a neural network takes half of the channels as input and defines an affine transformation applied to the remaining channels
These layers are then stacked into a flow step (as seen in the image above, left), and these flow steps are in turn stacked within each level of the multi-scale architecture (image above, right).
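As a rough, self-contained sketch of what one flow step looks like (the actual implementation in this repository follows https://github.com/rosinality/glow-pytorch and differs in details such as the data-dependent ActNorm initialization):

```python
import torch
from torch import nn


class ActNorm(nn.Module):
    """Channel-wise affine transformation with learnable scale and bias."""

    def __init__(self, num_channels):
        super().__init__()
        self.log_scale = nn.Parameter(torch.zeros(1, num_channels, 1, 1))
        self.bias = nn.Parameter(torch.zeros(1, num_channels, 1, 1))

    def forward(self, x):
        h, w = x.shape[2], x.shape[3]
        # per-channel scaling applied at every pixel contributes h*w*sum(log_scale)
        logdet = h * w * self.log_scale.sum()
        return (x + self.bias) * torch.exp(self.log_scale), logdet


class InvConv1x1(nn.Module):
    """Invertible 1x1 convolution that mixes the channels."""

    def __init__(self, num_channels):
        super().__init__()
        # initialise with a random rotation so the layer starts invertible
        w, _ = torch.linalg.qr(torch.randn(num_channels, num_channels))
        self.weight = nn.Parameter(w)

    def forward(self, x):
        h, w = x.shape[2], x.shape[3]
        logdet = h * w * torch.slogdet(self.weight)[1]
        out = nn.functional.conv2d(x, self.weight.unsqueeze(-1).unsqueeze(-1))
        return out, logdet


class AffineCoupling(nn.Module):
    """Half of the channels parameterise an affine map of the other half."""

    def __init__(self, num_channels, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(num_channels // 2, hidden, 3, padding=1),
            nn.ReLU(),
            nn.Conv2d(hidden, num_channels, 3, padding=1),
        )

    def forward(self, x):
        xa, xb = x.chunk(2, dim=1)
        log_s, t = self.net(xa).chunk(2, dim=1)
        s = torch.sigmoid(log_s + 2.0)  # keep the scales positive and stable
        yb = (xb + t) * s
        logdet = s.log().flatten(1).sum(dim=1)
        return torch.cat([xa, yb], dim=1), logdet


class FlowStep(nn.Module):
    """ActNorm -> invertible 1x1 convolution -> affine coupling."""

    def __init__(self, num_channels):
        super().__init__()
        self.layers = nn.ModuleList(
            [ActNorm(num_channels), InvConv1x1(num_channels), AffineCoupling(num_channels)]
        )

    def forward(self, x):
        total_logdet = 0.0
        for layer in self.layers:
            x, logdet = layer(x)
            total_logdet = total_logdet + logdet
        return x, total_logdet
```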
As defined, all three layers are invertible, which allows the model to be trained by standard maximum likelihood through the change-of-variables identity. Let $f$ denote the full flow, mapping a map $x$ to a latent $z = f(x)$ with a simple prior $p_Z$ (a standard Gaussian). Then

$$\log p_X(x) = \log p_Z(f(x)) + \log\left|\det\frac{\partial f(x)}{\partial x}\right|,$$

where each layer above contributes a tractable term to the log-determinant.
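In code, assuming the flow returns the latent $z$ and the accumulated log-determinant, the training objective could be computed along these lines (a sketch, not the exact loss used in this repository):

```python
import math
import torch


def nll_loss(z, logdet, num_pixels):
    """Negative log-likelihood per dimension under a standard normal prior.

    log p(x) = log N(z; 0, I) + log|det dz/dx|
    """
    log_prior = -0.5 * (z ** 2 + math.log(2 * math.pi)).flatten(1).sum(dim=1)
    log_likelihood = log_prior + logdet
    # average over the batch and normalise by the number of dimensions
    return -(log_likelihood / num_pixels).mean()
```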
Given a set of conditioning parameters $\theta$ (here the two cosmological parameters), the goal becomes modelling the conditional density $p(x \mid \theta)$ rather than the marginal $p(x)$.
In practice, the way to do this is by changing the layers inside Glow so that they receive $\theta$ as an additional input, in the spirit of the conditional flows of SRFlow.
Essentially, the scale and bias of the affine transformations inside each layer become learnable functions of $\theta$, so that every flow step depends on the conditioning parameters.
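As an illustrative sketch only (the class and argument names below are hypothetical, not the ones used in this repository), a conditional coupling layer could look like:

```python
import torch
from torch import nn


class ConditionalAffineCoupling(nn.Module):
    """Affine coupling whose scale and shift also depend on the conditioning parameters."""

    def __init__(self, num_channels, cond_dim, hidden=128):
        super().__init__()
        # embed the (few) conditioning parameters into per-channel features
        self.cond_net = nn.Sequential(
            nn.Linear(cond_dim, hidden), nn.ReLU(), nn.Linear(hidden, num_channels // 2)
        )
        # input is xa concatenated with the conditioning features (C//2 + C//2 channels)
        self.net = nn.Sequential(
            nn.Conv2d(num_channels, hidden, 3, padding=1),
            nn.ReLU(),
            nn.Conv2d(hidden, num_channels, 3, padding=1),
        )

    def forward(self, x, theta):
        xa, xb = x.chunk(2, dim=1)
        # broadcast the embedded parameters over the spatial dimensions
        cond = self.cond_net(theta)[:, :, None, None].expand(-1, -1, *xa.shape[2:])
        log_s, t = self.net(torch.cat([xa, cond], dim=1)).chunk(2, dim=1)
        s = torch.sigmoid(log_s + 2.0)
        yb = (xb + t) * s
        logdet = s.log().flatten(1).sum(dim=1)
        return torch.cat([xa, yb], dim=1), logdet
```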
HIGlow is a conditional Glow model trained on the CAMELS simulation data. Below are examples of samples generated conditionally on the cosmological parameters defined in the data:
As can be seen, HIGlow is able to generate maps that are very similar to the real data from the CAMELS simulation, even when conditioning on specific parameter values.
To quantify this more explicitly, we can compare the mean power spectrum and its standard deviation computed from generated data with those of the training data:
On the left, the comparison uses images sampled from the marginal distribution of the model; on the right, images generated conditionally on specific parameter values.
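For reference, an isotropically averaged power spectrum of a single 2D map can be computed roughly as follows (a numpy sketch; the binning and normalization conventions here are assumptions, not the exact routine used for the figure):

```python
import numpy as np


def power_spectrum_2d(field, box_size=25.0):
    """Isotropically averaged power spectrum of a square 2D map.

    `box_size` is the physical side length of the map (e.g. 25 Mpc/h for the
    CAMELS maps); the normalisation convention here is only a sketch.
    """
    n = field.shape[0]
    fourier = np.fft.fftn(field)
    power = np.abs(fourier) ** 2 * (box_size / n ** 2) ** 2

    k = 2 * np.pi * np.fft.fftfreq(n, d=box_size / n)
    kx, ky = np.meshgrid(k, k, indexing="ij")
    k_mag = np.sqrt(kx ** 2 + ky ** 2)

    # average the power in logarithmically spaced radial bins (DC mode excluded)
    bins = np.logspace(np.log10(k[1]), np.log10(k_mag.max()), 20)
    which = np.digitize(k_mag.ravel(), bins)
    pk = np.array(
        [
            power.ravel()[which == i].mean() if np.any(which == i) else np.nan
            for i in range(1, len(bins))
        ]
    )
    k_centers = 0.5 * (bins[1:] + bins[:-1])
    return k_centers, pk
```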
When the distribution of the true conditioning parameters $p(\theta)$ is known (in CAMELS the parameters are sampled from known ranges), Bayes' theorem gives the posterior over the parameters for a given map:

$$p(\theta \mid x) = \frac{p(x \mid \theta)\, p(\theta)}{p(x)} \propto p(x \mid \theta)\, p(\theta),$$

where $p(x \mid \theta)$ is the exact conditional likelihood returned by the flow and $p(\theta)$ is the prior over the parameters.
For HIGlow, this means that given a single HI map, the posterior over $(\Omega_m, \sigma_8)$ can be evaluated directly, for example on a grid of parameter values.
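A minimal sketch of such a grid evaluation is given below; `model.log_prob` is a hypothetical method standing in for whatever returns the flow's exact conditional log-likelihood:

```python
import torch


def posterior_grid(model, x, omega_grid, sigma_grid):
    """Evaluate p(theta | x) on a grid, assuming a uniform prior over the grid.

    `model.log_prob(x, theta)` is a hypothetical method returning the exact
    conditional log-likelihood log p(x | theta) of the flow.
    """
    log_like = torch.zeros(len(omega_grid), len(sigma_grid))
    with torch.no_grad():
        for i, om in enumerate(omega_grid):
            for j, s8 in enumerate(sigma_grid):
                theta = torch.tensor([[om, s8]], dtype=torch.float32)
                log_like[i, j] = model.log_prob(x, theta)
    # uniform prior: the posterior is the normalised likelihood over the grid
    posterior = torch.softmax(log_like.flatten(), dim=0).reshape(log_like.shape)
    return posterior
```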
Above, the red cross indicates the true parameter value while the contour lines indicate the posterior distribution $p(\theta \mid x)$ inferred from a single map.
@article{kingma2018glow,
title={Glow: Generative flow with invertible 1x1 convolutions},
author={Kingma, Durk P and Dhariwal, Prafulla},
journal={Advances in neural information processing systems},
volume={31},
year={2018}
}
@inproceedings{lugmayr2020srflow,
title={SRFlow: Learning the super-resolution space with normalizing flow},
author={Lugmayr, Andreas and Danelljan, Martin and Van Gool, Luc and Timofte, Radu},
booktitle={European conference on computer vision},
pages={715--732},
year={2020},
organization={Springer}
}
@article{villaescusa2021camels,
title={The CAMELS Project: Cosmology and Astrophysics with Machine-learning Simulations},
author={Villaescusa-Navarro, Francisco and Angl{\'e}s-Alc{\'a}zar, Daniel and Genel, Shy and Spergel, David N and Somerville, Rachel S and Dave, Romeel and Pillepich, Annalisa and Hernquist, Lars and Nelson, Dylan and Torrey, Paul and others},
journal={The Astrophysical Journal},
volume={915},
number={1},
pages={71},
year={2021},
publisher={IOP Publishing}
}
@article{hassan2021hiflow,
title={HIFlow: Generating Diverse HI Maps Conditioned on Cosmology using Normalizing Flow},
author={Hassan, Sultan and Villaescusa-Navarro, Francisco and Wandelt, Benjamin and Spergel, David N and Angl{\'e}s-Alc{\'a}zar, Daniel and Genel, Shy and Cranmer, Miles and Bryan, Greg L and Dav{\'e}, Romeel and Somerville, Rachel S and others},
journal={arXiv preprint arXiv:2110.02983},
year={2021}
}