
I'm a Research Scientist at Facebook AI Research (FAIR) in New York.
My research involves probabilistic deep learning with differentiable numerics, drawing on ideas from interdisciplinary research areas to build structured models of the world.
On the fundamental side, I combine numerical simulation, automatic differentiation, and stochastic estimation, and I enjoy applying these tools to application domains such as probabilistic inference, normalizing flows, and spatiotemporal modeling.
Research
- Semi-Discrete Normalizing Flows through Differentiable Tessellation Preprint. 2022 arxiv | slides
- Infinitely Deep Bayesian Neural Networks with Stochastic Differential Equations International Conference on Artificial Intelligence and Statistics (AISTATS). 2022 arxiv | code
- Fully differentiable optimization protocols for non-equilibrium steady states New Journal of Physics. 2021 arxiv | publisher link | code
- “Hey, that's not an ODE”: Faster ODE Adjoints via Seminorms International Conference on Machine Learning (ICML). 2021 arxiv | code
- Convex Potential Flows: Universal Probability Distributions with Optimal Transport and Convex Optimization International Conference on Learning Representations (ICLR). 2021 arxiv | code
- Learning Neural Event Functions for Ordinary Differential Equations International Conference on Learning Representations (ICLR). 2021 arxiv | slides | poster
- Neural Spatio-Temporal Point Processes International Conference on Learning Representations (ICLR). 2021 arxiv | poster | code
- Self-Tuning Stochastic Optimization with Curvature-Aware Gradient Filtering (ORAL) Workshop on "I Can't Believe It's Not Better!", NeurIPS. 2020 arxiv | bibtex | slides | poster | talk
- Scalable Gradients for Stochastic Differential Equations International Conference on Artificial Intelligence and Statistics (AISTATS). 2020 arxiv | bibtex | code
- SUMO: Unbiased Estimation of Log Marginal Probability for Latent Variable Models (SPOTLIGHT) International Conference on Learning Representations (ICLR). 2020 arxiv | bibtex | poster | colab
- Neural Networks with Cheap Differential Operators (SPOTLIGHT) Advances in Neural Information Processing Systems (NeurIPS). 2019 arxiv | bibtex | slides | talk (@9:45) | poster | code
- Residual Flows for Invertible Generative Modeling (SPOTLIGHT) Advances in Neural Information Processing Systems (NeurIPS). 2019 arxiv | bibtex | slides | talk | poster | code
- Latent ODEs for Irregularly-Sampled Time Series Advances in Neural Information Processing Systems (NeurIPS). 2019 arxiv | code | talk
- Invertible Residual Networks (LONG ORAL) International Conference on Machine Learning (ICML). 2019 arxiv | bibtex | code
- FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models (ORAL) (BEST STUDENT PAPER @ AABI 2018) International Conference on Learning Representations (ICLR). 2019 arxiv | bibtex | poster | code
- Neural Ordinary Differential Equations (BEST PAPER AWARD) Advances in Neural Information Processing Systems (NeurIPS). 2018 arxiv | bibtex | slides | poster | code
- Isolating Sources of Disentanglement in Variational Autoencoders (ORAL) Advances in Neural Information Processing Systems (NeurIPS). 2018 arxiv | bibtex | slides | talk | poster | code
- Fast Patch-based Style Transfer of Arbitrary Style (ORAL) Workshop on Constructive Machine Learning, NIPS. 2016 arxiv | bibtex | slides | poster | code