# Normalizing flows for random fields in cosmology

@inproceedings{Rouhiainen2021NormalizingFF,
  title={Normalizing flows for random fields in cosmology},
  author={Adam Rouhiainen and U. Giri and Moritz Munchmeyer},
  year={2021}
}

Normalizing flows are a powerful tool for creating flexible probability distributions, with a wide range of potential applications in cosmology. Here we study normalizing flows that represent cosmological observables at the field level, rather than at the level of summary statistics such as the power spectrum. We evaluate the performance of different normalizing flows for both density estimation and sampling of near-Gaussian random fields, and check the quality of samples with different…
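The core mechanism behind the flows described above is the change-of-variables formula: an invertible map turns simple base noise into samples, and its Jacobian determinant gives an exact density. A minimal numpy sketch of that idea, using a single hand-chosen affine layer (the scale `s` and shift `t` are hypothetical illustration values, not a trained model from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# A normalizing flow maps base noise z ~ N(0, I) through an invertible
# transform f to a sample x = f(z); its exact density follows from
#   log p(x) = log N(f^{-1}(x); 0, I) - log|det df/dz|.
# Here f is one affine layer with fixed (untrained) parameters.

s = np.array([0.5, -0.3])   # log-scales (hypothetical values)
t = np.array([1.0, 2.0])    # shifts (hypothetical values)

def forward(z):
    """Sampling direction: base noise -> data."""
    return np.exp(s) * z + t

def inverse(x):
    """Density-estimation direction: data -> base noise."""
    return (x - t) * np.exp(-s)

def log_prob(x):
    """Exact log-density via the change-of-variables formula."""
    z = inverse(x)
    log_base = -0.5 * np.sum(z**2 + np.log(2.0 * np.pi), axis=-1)
    # log|det Jacobian| of forward is sum(s), so subtract it here.
    return log_base - np.sum(s)

z = rng.standard_normal((10_000, 2))
x = forward(z)                       # flow samples
print(np.allclose(inverse(x), z))    # exact invertibility
```

In a trained flow (e.g. a masked autoregressive or coupling architecture), `s` and `t` become neural-network outputs conditioned on part of the input, but the sampling/density duality shown here is unchanged.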
