
I'm a Research Scientist at Meta AI (FAIR) in New York.
My research involves probabilistic deep learning with differentiable numerics, flowing between differential equations and statistics.
I generally work on integrating structured transformations into probabilistic modeling, with the goal of improving interpretability, making optimization tractable, or extending into novel areas of application. On the fundamental side, I combine numerical simulation, automatic differentiation, and stochastic estimation, and I enjoy applying these tools to a wide range of scientific and machine learning problems.
CV | GitHub | Twitter | Google Scholar | rtqichen@meta.com
Research
- Riemannian Flow Matching on General Geometries Preprint. 2023 arxiv | code
- Multisample Flow Matching: Straightening Flows with Minibatch Couplings International Conference on Machine Learning (ICML). 2023 arxiv
- On Kinetic Optimal Probability Paths for Generative Models International Conference on Machine Learning (ICML). 2023 arxiv
- Distributional GFlowNets with Quantile Flows Preprint. 2023 arxiv | code
- Flow Matching for Generative Modeling (SPOTLIGHT) International Conference on Learning Representations (ICLR). 2023 arxiv
- Latent State Marginalization as a Low-cost Approach for Improving Exploration International Conference on Learning Representations (ICLR). 2023 arxiv | code
- Neural Conservation Laws: A Divergence-Free Perspective Advances in Neural Information Processing Systems (NeurIPS). 2022 arxiv | bibtex | code
- Semi-Discrete Normalizing Flows through Differentiable Tessellation Advances in Neural Information Processing Systems (NeurIPS). 2022 arxiv | bibtex | poster | code
- Matching Normalizing Flows and Probability Paths on Manifolds International Conference on Machine Learning (ICML). 2022 arxiv | bibtex
- Infinitely Deep Bayesian Neural Networks with Stochastic Differential Equations International Conference on Artificial Intelligence and Statistics (AISTATS). 2022 arxiv | bibtex | code
- Fully differentiable optimization protocols for non-equilibrium steady states New Journal of Physics. 2021 arxiv | bibtex | publisher link | code
- “Hey, that's not an ODE”: Faster ODE Adjoints via Seminorms International Conference on Machine Learning (ICML). 2021 arxiv | bibtex | code
- Convex Potential Flows: Universal Probability Distributions with Optimal Transport and Convex Optimization International Conference on Learning Representations (ICLR). 2021 arxiv | bibtex | code
- Learning Neural Event Functions for Ordinary Differential Equations International Conference on Learning Representations (ICLR). 2021 arxiv | bibtex | slides | poster
- Neural Spatio-Temporal Point Processes International Conference on Learning Representations (ICLR). 2021 arxiv | bibtex | poster | code
- Self-Tuning Stochastic Optimization with Curvature-Aware Gradient Filtering (ORAL) Workshop on "I Can't Believe It's Not Better!", NeurIPS. 2020 arxiv | bibtex | slides | poster | talk
- Scalable Gradients for Stochastic Differential Equations International Conference on Artificial Intelligence and Statistics (AISTATS). 2020 arxiv | bibtex | code
- SUMO: Unbiased Estimation of Log Marginal Probability for Latent Variable Models (SPOTLIGHT) International Conference on Learning Representations (ICLR). 2020 arxiv | bibtex | poster | colab
- Neural Networks with Cheap Differential Operators (SPOTLIGHT) Advances in Neural Information Processing Systems (NeurIPS). 2019 arxiv | bibtex | slides | talk (@9:45) | poster | code
- Residual Flows for Invertible Generative Modeling (SPOTLIGHT) Advances in Neural Information Processing Systems (NeurIPS). 2019 arxiv | bibtex | slides | talk | poster | code
- Latent ODEs for Irregularly-Sampled Time Series Advances in Neural Information Processing Systems (NeurIPS). 2019 arxiv | bibtex | code | talk
- Invertible Residual Networks (LONG ORAL) International Conference on Machine Learning (ICML). 2019 arxiv | bibtex | code
- FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models (ORAL) (BEST STUDENT PAPER @ AABI 2018) International Conference on Learning Representations (ICLR). 2019 arxiv | bibtex | poster | code
- Neural Ordinary Differential Equations (BEST PAPER AWARD) Advances in Neural Information Processing Systems (NeurIPS). 2018 arxiv | bibtex | slides | poster | code
- Isolating Sources of Disentanglement in Variational Autoencoders (ORAL) Advances in Neural Information Processing Systems (NeurIPS). 2018 arxiv | bibtex | slides | talk | poster | code