I'm a Research Scientist at Meta AI (FAIR) in New York.

My research involves probabilistic deep learning with differentiable numerics, flowing between differential equations and statistics.

I generally work on integrating structured transformations into probabilistic modeling, with the goals of improved interpretability, tractable optimization, and extension into novel application areas. On the fundamentals side, I combine numerical simulation, automatic differentiation, and stochastic estimation, and I enjoy applying these tools to a variety of scientific and machine learning problems.
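As a toy illustration of combining numerical simulation with automatic differentiation, here is a minimal sketch (not tooling from any paper below; the names `Dual` and `euler_solve` are made up for this example) that pushes forward-mode AD, via dual numbers, through an explicit Euler solver for the linear ODE dx/dt = θx:

```python
import math

class Dual:
    """Dual number a + b*eps with eps^2 = 0, for forward-mode AD."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)
    __rmul__ = __mul__

def euler_solve(f, x0, t0, t1, steps):
    """Integrate dx/dt = f(x) from t0 to t1 with explicit Euler."""
    h = (t1 - t0) / steps
    x = x0
    for _ in range(steps):
        x = x + h * f(x)
    return x

# Differentiate the solution of dx/dt = theta * x at t = 1 w.r.t. theta.
theta = Dual(0.5, 1.0)  # seed: d(theta)/d(theta) = 1
x1 = euler_solve(lambda x: theta * x, Dual(1.0), 0.0, 1.0, steps=1000)

# Exact answer: x(1) = e^theta, and d x(1)/d theta = e^theta as well,
# so both x1.val and x1.dot approach e^0.5 as the step count grows.
print(x1.val, x1.dot)
```

Because the dual arithmetic is carried through every solver step, the derivative is that of the *discretized* solution, which converges to the true sensitivity as the step size shrinks; the same differentiate-through-the-solver idea underlies far more sophisticated adjoint and stochastic-estimation machinery.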

CV | Github | Twitter | Google Scholar | rtqichen@meta.com

Research

  • Flow Matching for Generative Modeling. Yaron Lipman, Ricky T. Q. Chen, Heli Ben-Hamu, Maximilian Nickel, Matt Le. Preprint, 2022. arxiv
  • Latent State Marginalization as a Low-cost Approach for Improving Exploration. Dinghuai Zhang, Aaron Courville, Yoshua Bengio, Qinqing Zheng, Amy Zhang, Ricky T. Q. Chen. Preprint, 2022. arxiv
  • Neural Conservation Laws: A Divergence-Free Perspective. Jack Richter-Powell, Yaron Lipman, Ricky T. Q. Chen. Advances in Neural Information Processing Systems (NeurIPS), 2022. arxiv
  • Semi-Discrete Normalizing Flows through Differentiable Tessellation. Ricky T. Q. Chen, Brandon Amos, Maximilian Nickel. Advances in Neural Information Processing Systems (NeurIPS), 2022. arxiv
  • Theseus: A Library for Differentiable Nonlinear Optimization. Meta AI and Reality Labs Research. Advances in Neural Information Processing Systems (NeurIPS), 2022. arxiv
  • Matching Normalizing Flows and Probability Paths on Manifolds. Heli Ben-Hamu, Samuel Cohen, Joey Bose, Brandon Amos, Aditya Grover, Maximilian Nickel, Ricky T. Q. Chen, Yaron Lipman. International Conference on Machine Learning (ICML), 2022. arxiv
  • Infinitely Deep Bayesian Neural Networks with Stochastic Differential Equations. Winnie Xu, Ricky T. Q. Chen, Xuechen Li, David Duvenaud. International Conference on Artificial Intelligence and Statistics (AISTATS), 2022. arxiv | code
  • Fully differentiable optimization protocols for non-equilibrium steady states. Rodrigo A. Vargas-Hernández, Ricky T. Q. Chen, Kenneth A. Jung, Paul Brumer. New Journal of Physics, 2021. arxiv | publisher link | code
  • “Hey, that's not an ODE”: Faster ODE Adjoints via Seminorms. Patrick Kidger, Ricky T. Q. Chen, Terry Lyons. International Conference on Machine Learning (ICML), 2021. arxiv | code
  • Convex Potential Flows: Universal Probability Distributions with Optimal Transport and Convex Optimization. Chin-Wei Huang, Ricky T. Q. Chen, Christos Tsirigotis, Aaron Courville. International Conference on Learning Representations (ICLR), 2021. arxiv | code
  • Learning Neural Event Functions for Ordinary Differential Equations. Ricky T. Q. Chen, Brandon Amos, Maximilian Nickel. International Conference on Learning Representations (ICLR), 2021. arxiv | slides | poster
  • Neural Spatio-Temporal Point Processes. Ricky T. Q. Chen, Brandon Amos, Maximilian Nickel. International Conference on Learning Representations (ICLR), 2021. arxiv | poster | code
  • Self-Tuning Stochastic Optimization with Curvature-Aware Gradient Filtering (ORAL). Ricky T. Q. Chen, Dami Choi, Lukas Balles, David Duvenaud, Philipp Hennig. Workshop on "I Can't Believe It's Not Better!", NeurIPS, 2020. arxiv | bibtex | slides | poster | talk
  • Scalable Gradients for Stochastic Differential Equations. Xuechen Li, Ting-Kam Leonard Wong, Ricky T. Q. Chen, David Duvenaud. International Conference on Artificial Intelligence and Statistics (AISTATS), 2020. arxiv | bibtex | code
  • SUMO: Unbiased Estimation of Log Marginal Probability for Latent Variable Models (SPOTLIGHT). Yucen Luo, Alex Beatson, Mohammad Norouzi, Jun Zhu, David Duvenaud, Ryan P. Adams, Ricky T. Q. Chen. International Conference on Learning Representations (ICLR), 2020. arxiv | bibtex | poster | colab
  • Neural Networks with Cheap Differential Operators (SPOTLIGHT). Ricky T. Q. Chen, David Duvenaud. Advances in Neural Information Processing Systems (NeurIPS), 2019. arxiv | bibtex | slides | talk (@9:45) | poster | code
  • Residual Flows for Invertible Generative Modeling (SPOTLIGHT). Ricky T. Q. Chen, Jens Behrmann, David Duvenaud, Jörn-Henrik Jacobsen. Advances in Neural Information Processing Systems (NeurIPS), 2019. arxiv | bibtex | slides | talk | poster | code
  • Latent ODEs for Irregularly-Sampled Time Series. Yulia Rubanova, Ricky T. Q. Chen, David Duvenaud. Advances in Neural Information Processing Systems (NeurIPS), 2019. arxiv | code | talk
  • Invertible Residual Networks (LONG ORAL). Jens Behrmann, Will Grathwohl, Ricky T. Q. Chen, David Duvenaud, Jörn-Henrik Jacobsen. International Conference on Machine Learning (ICML), 2019. arxiv | bibtex | code
  • FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models (ORAL; BEST STUDENT PAPER @ AABI 2018). Will Grathwohl, Ricky T. Q. Chen, Jesse Bettencourt, Ilya Sutskever, David Duvenaud. International Conference on Learning Representations (ICLR), 2019. arxiv | bibtex | poster | code
  • Neural Ordinary Differential Equations (BEST PAPER AWARD). Ricky T. Q. Chen, Yulia Rubanova, Jesse Bettencourt, David Duvenaud. Advances in Neural Information Processing Systems (NeurIPS), 2018. arxiv | bibtex | slides | poster | code
  • Isolating Sources of Disentanglement in Variational Autoencoders (ORAL). Ricky T. Q. Chen, Xuechen Li, Roger Grosse, David Duvenaud. Advances in Neural Information Processing Systems (NeurIPS), 2018. arxiv | bibtex | slides | talk | poster | code