I'm a Research Scientist on the Fundamental AI Research (FAIR) team at Meta in New York. My research is on building simplified abstractions of the world through the lens of dynamical systems and flows.
Lately, I've been exploring the use of stochastic control theory for large-scale generative modeling [Adjoint Matching], as well as constructing discrete generative models through continuous-time Markov chains [Discrete Flow Matching]. Our methods, such as [Flow Matching], have been applied successfully to foundation models for video and audio [Movie Gen] (a minimal sketch of the Flow Matching objective is included below).
CV | GitHub | Twitter | Google Scholar | rtqichen@meta.com
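For illustration, here is a minimal sketch of the conditional Flow Matching training objective with linear (optimal-transport) probability paths, in the spirit of [Flow Matching for Generative Modeling]. The network, data, and hyperparameters below are placeholder assumptions, not code from the released libraries.

```python
# Minimal illustrative sketch of conditional Flow Matching with linear
# (optimal-transport) probability paths. Network, data, and settings
# are placeholders, not the released implementation.
import torch
import torch.nn as nn

dim = 2

# Small velocity-field network v_theta(x_t, t); purely illustrative.
vf = nn.Sequential(nn.Linear(dim + 1, 128), nn.SiLU(), nn.Linear(128, dim))
opt = torch.optim.Adam(vf.parameters(), lr=1e-3)

def cfm_loss(x1):
    """Conditional FM loss: regress v_theta(x_t, t) onto (x1 - x0)
    along the linear path x_t = (1 - t) * x0 + t * x1."""
    x0 = torch.randn_like(x1)              # noise sample
    t = torch.rand(x1.shape[0], 1)         # uniform time in [0, 1]
    xt = (1 - t) * x0 + t * x1             # point on the conditional path
    target = x1 - x0                       # conditional velocity
    pred = vf(torch.cat([xt, t], dim=-1))
    return ((pred - target) ** 2).mean()

# Toy training step on synthetic "data" drawn from a shifted Gaussian.
x1 = torch.randn(256, dim) + 3.0
loss = cfm_loss(x1)
opt.zero_grad()
loss.backward()
opt.step()
```

At sampling time, one would integrate the learned ODE dx/dt = v_theta(x, t) from t = 0 to t = 1, starting from Gaussian noise, with an off-the-shelf ODE solver.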
Research
(See Google Scholar for an exhaustive list.)
- Flow Matching Guide and Code arXiv. 2024 arxiv | code
- Flow Matching with General Discrete Paths: A Kinetic-Optimal Perspective Preprint. 2024 arxiv
- Generator Matching: Generative modeling with arbitrary Markov processes Preprint. 2024 arxiv
- Adjoint Matching: Fine-tuning Flow and Diffusion Generative Models with Memoryless Stochastic Optimal Control Preprint. 2024 arxiv
- Discrete Flow Matching Advances in Neural Information Processing Systems (NeurIPS). 2024 arxiv
- FlowMM: Generating Materials with Riemannian Flow Matching International Conference on Machine Learning (ICML). 2024 arxiv | code
- Bespoke Non-Stationary Solvers for Fast Sampling of Diffusion and Flow Models International Conference on Machine Learning (ICML). 2024 arxiv
- Bespoke Solvers for Generative Flow Models (SPOTLIGHT) International Conference on Learning Representations (ICLR). 2024 arxiv
- Generalized Schrödinger Bridge Matching International Conference on Learning Representations (ICLR). 2024 arxiv | code
- Flow Matching on General Geometries (OUTSTANDING PAPER HONORABLE MENTION) International Conference on Learning Representations (ICLR). 2024 arxiv | code
- Stochastic Optimal Control Matching Preprint. 2023 arxiv | code
- Multisample Flow Matching: Straightening Flows with Minibatch Couplings International Conference on Machine Learning (ICML). 2023 arxiv
- On Kinetic Optimal Probability Paths for Generative Models International Conference on Machine Learning (ICML). 2023 arxiv
- Flow Matching for Generative Modeling (SPOTLIGHT) International Conference on Learning Representations (ICLR). 2023 arxiv
- Latent State Marginalization as a Low-cost Approach for Improving Exploration International Conference on Learning Representations (ICLR). 2023 arxiv | code
- Neural Conservation Laws: A Divergence-Free Perspective Advances in Neural Information Processing Systems (NeurIPS). 2022 arxiv | bibtex | code
- Semi-Discrete Normalizing Flows through Differentiable Tessellation Advances in Neural Information Processing Systems (NeurIPS). 2022 arxiv | bibtex | poster | code
- Matching Normalizing Flows and Probability Paths on Manifolds International Conference on Machine Learning (ICML). 2022 arxiv | bibtex
- Infinitely Deep Bayesian Neural Networks with Stochastic Differential Equations International Conference on Artificial Intelligence and Statistics (AISTATS). 2022 arxiv | bibtex | code
- Fully differentiable optimization protocols for non-equilibrium steady states New Journal of Physics. 2021 arxiv | bibtex | publisher link | code
- “Hey, that's not an ODE”: Faster ODE Adjoints via Seminorms International Conference on Machine Learning (ICML). 2021 arxiv | bibtex | code
- Convex Potential Flows: Universal Probability Distributions with Optimal Transport and Convex Optimization International Conference on Learning Representations (ICLR). 2021 arxiv | bibtex | code
- Learning Neural Event Functions for Ordinary Differential Equations International Conference on Learning Representations (ICLR). 2021 arxiv | bibtex | slides | poster
- Neural Spatio-Temporal Point Processes International Conference on Learning Representations (ICLR). 2021 arxiv | bibtex | poster | code
- Self-Tuning Stochastic Optimization with Curvature-Aware Gradient Filtering (ORAL) Workshop on "I Can't Believe It's Not Better!", NeurIPS. 2020 arxiv | bibtex | slides | poster | talk
- Scalable Gradients for Stochastic Differential Equations International Conference on Artificial Intelligence and Statistics (AISTATS). 2020 arxiv | bibtex | code
- SUMO: Unbiased Estimation of Log Marginal Probability for Latent Variable Models (SPOTLIGHT) International Conference on Learning Representations (ICLR). 2020 arxiv | bibtex | poster | colab
- Neural Networks with Cheap Differential Operators (SPOTLIGHT) Advances in Neural Information Processing Systems (NeurIPS). 2019 arxiv | bibtex | slides | talk (@9:45) | poster | code
- Residual Flows for Invertible Generative Modeling (SPOTLIGHT) Advances in Neural Information Processing Systems (NeurIPS). 2019 arxiv | bibtex | slides | talk | poster | code
- Latent ODEs for Irregularly-Sampled Time Series Advances in Neural Information Processing Systems (NeurIPS). 2019 arxiv | bibtex | code | talk
- Invertible Residual Networks (LONG ORAL) International Conference on Machine Learning (ICML). 2019 arxiv | bibtex | code
- FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models (ORAL) (BEST STUDENT PAPER @ AABI 2018) International Conference on Learning Representations (ICLR). 2019 arxiv | bibtex | poster | code
- Neural Ordinary Differential Equations (BEST PAPER AWARD) Advances in Neural Information Processing Systems (NeurIPS). 2018 arxiv | bibtex | slides | poster | code
- Isolating Sources of Disentanglement in Variational Autoencoders (ORAL) Advances in Neural Information Processing Systems (NeurIPS). 2018 arxiv | bibtex | slides | talk | poster | code