Bibliography

If you know of a paper or library that ought to be listed in this bibliography, please let us know on the forum or by opening a pull request.

Here is a collection of materials related to the research, engineering, and teaching of normalizing flows. A corresponding BibTeX file can be found here.

Surveys

[bond2021deep] Bond-Taylor, S., Leach, A., Long, Y., and Willcocks, C.G. Deep Generative Modelling: A Comparative Review of VAEs, GANs, Normalizing Flows, Energy-Based and Autoregressive Models. arXiv preprint arXiv:2103.04922, 2021.
[kobyzev2020normalizing] Kobyzev, I., Prince, S., and Brubaker, M. Normalizing flows: An introduction and review of current methods. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2020.
[papamakarios2019normalizing] Papamakarios, G., Nalisnick, E., Rezende, D., Mohamed, S., and Lakshminarayanan, B. Normalizing flows for probabilistic modeling and inference. arXiv preprint arXiv:1912.02762, 2019.

Methodology

[dinh2014nice] Dinh, L., Krueger, D., and Bengio, Y. NICE: Non-linear Independent Components Estimation. Workshop contribution at the International Conference on Learning Representations (ICLR), 2015.
[dinh2016density] Dinh, L., Sohl-Dickstein, J., and Bengio, S. Density estimation using real NVP. Conference paper at the International Conference on Learning Representations (ICLR), 2017.
[durkan2019neural] Durkan, C., Bekasov, A., Murray, I., and Papamakarios, G. Neural spline flows. 33rd Conference on Neural Information Processing Systems (NeurIPS), 2019.
[germain2015made] Germain, M., Gregor, K., Murray, I., and Larochelle, H. MADE: Masked autoencoder for distribution estimation. International Conference on Machine Learning (ICML), 2015.
[kingma2016improving] Kingma, D.P., Salimans, T., Jozefowicz, R., Chen, X., Sutskever, I., and Welling, M. Improving variational inference with inverse autoregressive flow. 30th Conference on Neural Information Processing Systems (NeurIPS), 2016.
[papamakarios2017masked] Papamakarios, G., Pavlakou, T., and Murray, I. Masked autoregressive flow for density estimation. 31st Conference on Neural Information Processing Systems (NeurIPS), 2017.
[rezende2015variational] Rezende, D., and Mohamed, S. Variational inference with normalizing flows. International Conference on Machine Learning (ICML), 2015.
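The coupling-layer construction from [dinh2014nice] and [dinh2016density] above admits a compact sketch: half the dimensions pass through unchanged, while the other half receive an invertible affine transform whose parameters depend only on the first half, so both the inverse and the log-determinant of the Jacobian are cheap to compute. A minimal pure-Python illustration follows; the function names and the toy conditioner are hypothetical stand-ins, not from any library (in practice the conditioner is a neural network).

```python
import math

def scale_and_shift(x1):
    # Stand-in for the conditioner networks s(.) and t(.) of Real NVP;
    # here just fixed nonlinearities so the example is self-contained.
    s = [math.tanh(v) for v in x1]   # log-scales, kept bounded
    t = [0.5 * v for v in x1]        # shifts
    return s, t

def coupling_forward(x):
    # Split the input in half; transform the second half conditioned
    # on the (unchanged) first half: y2 = x2 * exp(s(x1)) + t(x1).
    d = len(x) // 2
    x1, x2 = x[:d], x[d:]
    s, t = scale_and_shift(x1)
    y2 = [x2[i] * math.exp(s[i]) + t[i] for i in range(d)]
    # The Jacobian is triangular, so log|det J| is just the sum of log-scales.
    return x1 + y2, sum(s)

def coupling_inverse(y):
    # Exact inverse: s and t can be recomputed from the untouched half.
    d = len(y) // 2
    y1, y2 = y[:d], y[d:]
    s, t = scale_and_shift(y1)
    x2 = [(y2[i] - t[i]) * math.exp(-s[i]) for i in range(d)]
    return y1 + x2

x = [0.3, -1.2, 0.8, 2.0]
y, log_det = coupling_forward(x)
x_rec = coupling_inverse(y)   # recovers x up to floating-point error
```

Stacking several such layers, with the roles of the two halves alternating, yields an expressive yet tractable density model.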

Applications

[jin2019unsupervised] Jin, L., Doshi-Velez, F., Miller, T., Schwartz, L., and Schuler, W. Unsupervised learning of PCFGs with normalizing flow. 57th Annual Meeting of the Association for Computational Linguistics (ACL), 2019.
[kim2020wavenode] Kim, H., Lee, H., Kang, W.H., Cheon, S.J., Choi, B.J., and Kim, N.S. WaveNODE: A Continuous Normalizing Flow for Speech Synthesis. 2nd Workshop on Invertible Neural Networks, Normalizing Flows, and Explicit Likelihood Models (ICML), 2020.
[yang2019pointflow] Yang, G., Huang, X., Hao, Z., Liu, M., Belongie, S., and Hariharan, B. PointFlow: 3D Point Cloud Generation with Continuous Normalizing Flows. IEEE/CVF International Conference on Computer Vision (ICCV), 2019.

Libraries

PyTorch

[bingham2018pyro] Bingham, E., Chen, J.P., Jankowiak, M., Obermeyer, F., Pradhan, N., Karaletsos, T., Singh, R., Szerlip, P., Horsfall, P., and Goodman, N.D. Pyro: Deep Universal Probabilistic Programming. Journal of Machine Learning Research (JMLR), 2018.

The majority of the bijections in Pyro were written by core developer Stefan Webb, and FlowTorch builds upon that code and the experience gained from it.

[phan2019composable] Phan, D., Pradhan, N., and Jankowiak, M. Composable Effects for Flexible and Accelerated Probabilistic Programming in NumPyro. arXiv preprint arXiv:1912.11554, 2019.

Probabilistic Graphical Models

[webb2017faithful] Webb, S., Golinski, A., Zinkov, R., Siddharth, N., Rainforth, T., Teh, Y.W., and Wood, F. Faithful inversion of generative models for effective amortized inference. 32nd Conference on Neural Information Processing Systems (NeurIPS), 2018.