
I'm a Research Scientist at Facebook AI Research (FAIR) in New York.
My research involves probabilistic deep learning with differentiable numerics, flowing between differential equations and statistics.
I enjoy connecting various problem formulations and methodologies across different domains. A different way of framing a problem can lead to different research questions; a different way of applying a method can help develop novel use cases. My desire to understand (too) many things leads me to seek out interdisciplinary approaches and collaborations. If you find my research interesting, please feel free to reach out.
Research
- Semi-Discrete Normalizing Flows through Differentiable Tessellation Preprint. 2022 arxiv | slides
- Infinitely Deep Bayesian Neural Networks with Stochastic Differential Equations International Conference on Artificial Intelligence and Statistics (AISTATS). 2022 arxiv | code
- Fully differentiable optimization protocols for non-equilibrium steady states New Journal of Physics. 2021 arxiv | publisher link | code
- “Hey, that's not an ODE”: Faster ODE Adjoints via Seminorms International Conference on Machine Learning (ICML). 2021 arxiv | code
- Convex Potential Flows: Universal Probability Distributions with Optimal Transport and Convex Optimization International Conference on Learning Representations (ICLR). 2021 arxiv | code
- Learning Neural Event Functions for Ordinary Differential Equations International Conference on Learning Representations (ICLR). 2021 arxiv | slides | poster
- Neural Spatio-Temporal Point Processes International Conference on Learning Representations (ICLR). 2021 arxiv | poster | code
- Self-Tuning Stochastic Optimization with Curvature-Aware Gradient Filtering (ORAL) Workshop on "I Can't Believe It's Not Better!", NeurIPS. 2020 arxiv | bibtex | slides | poster | talk
- Scalable Gradients for Stochastic Differential Equations International Conference on Artificial Intelligence and Statistics (AISTATS). 2020 arxiv | bibtex | code
- SUMO: Unbiased Estimation of Log Marginal Probability for Latent Variable Models (SPOTLIGHT) International Conference on Learning Representations (ICLR). 2020 arxiv | bibtex | poster | colab
- Neural Networks with Cheap Differential Operators (SPOTLIGHT) Advances in Neural Information Processing Systems (NeurIPS). 2019 arxiv | bibtex | slides | talk (@9:45) | poster | code
- Residual Flows for Invertible Generative Modeling (SPOTLIGHT) Advances in Neural Information Processing Systems (NeurIPS). 2019 arxiv | bibtex | slides | talk | poster | code
- Latent ODEs for Irregularly-Sampled Time Series Advances in Neural Information Processing Systems (NeurIPS). 2019 arxiv | code | talk
- Invertible Residual Networks (LONG ORAL) International Conference on Machine Learning (ICML). 2019 arxiv | bibtex | code
- FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models (ORAL) (BEST STUDENT PAPER @ AABI 2018) International Conference on Learning Representations (ICLR). 2019 arxiv | bibtex | poster | code
- Neural Ordinary Differential Equations (BEST PAPER AWARD) Advances in Neural Information Processing Systems (NeurIPS). 2018 arxiv | bibtex | slides | poster | code
- Isolating Sources of Disentanglement in Variational Autoencoders (ORAL) Advances in Neural Information Processing Systems (NeurIPS). 2018 arxiv | bibtex | slides | talk | poster | code
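
Much of the work above centers on differentiating through ODE solvers; the reference implementation released with the Neural Ordinary Differential Equations paper is the torchdiffeq PyTorch library. Below is a minimal sketch of that idea, assuming torchdiffeq is installed (e.g. `pip install torchdiffeq`); the `Dynamics` module is a hypothetical toy model for illustration, not taken from any paper listed here.

```python
# Minimal sketch: solve an ODE with learned dynamics and backpropagate
# through the solution. Assumes the torchdiffeq package is installed.
import torch
import torch.nn as nn
from torchdiffeq import odeint  # swap in odeint_adjoint for adjoint-method gradients


class Dynamics(nn.Module):
    """Hypothetical toy model: dy/dt = f(t, y), parameterized by a small MLP."""

    def __init__(self, dim=2, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, dim)
        )

    def forward(self, t, y):
        return self.net(y)


func = Dynamics()
y0 = torch.randn(8, 2)             # batch of initial states
t = torch.linspace(0.0, 1.0, 10)   # times at which to evaluate the solution

# Returns the solution at each time in t; shape (10, 8, 2).
ys = odeint(func, y0, t, rtol=1e-5, atol=1e-7)

loss = ys[-1].pow(2).mean()
loss.backward()                    # gradients flow into func's parameters
```

Replacing `odeint` with `odeint_adjoint` computes gradients via the adjoint method, trading extra compute for memory that stays constant in the number of solver steps, the regime studied in the seminorm-adjoint paper above.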