PhD Candidate, Computer Science, NYU
email: goldstein [at] nyu [dot] edu
pronouns: he/him/his
I am a PhD candidate at NYU Courant Institute of Mathematical Sciences, advised by Rajesh Ranganath and Thomas Wies. I work on causal inference, deep generative models, and machine learning for health.
March 2024: Looking forward to being a student researcher at Google DeepMind NYC with Will Grathwohl, starting in May!
March 2024: glad to help out on the survival analysis / estimation side of things on A dynamic risk score for early prediction of cardiogenic shock using machine learning, published in European Heart Journal: Acute Cardiovascular Care, March 2024. Also available on arXiv.
March 2024: some work on probabilistic forecasting with stochastic interpolants and Föllmer processes! Work with Yifan Chen, Mengjian Hua, Michael S. Albergo, Nicholas M. Boffi, and Eric Vanden-Eijnden.
Feb 2024: glad to help out on the survival analysis / estimation side of things on QTNet: Predicting Drug-Induced QT Prolongation with Artificial Intelligence-Enabled Electrocardiograms, published in JACC: Clinical Electrophysiology. Led by Hao Zhang!
Jan 2024: Scalable Interpolant Transformers! A family of generative models built on Diffusion Transformers, but with a flexible stochastic interpolant process replacing the score-based diffusion. Stable to train, performs well at every model size, and offers many choices for sampling!
Fall 2023: glad to take part in the Flatiron Institute’s upcoming second workshop on Measure Transport, Sampling, and Diffusions!
Fall 2023: glad to give a guest lecture on diffusions for the NYU course Inference and Representations, with Prof. Yoav Wald.
Fall 2023: excited to talk about diffusions and flows at the Decisions, Risk and Operations ML reading group at Columbia, organized by Hongseok Namkoong, Kelly Zhang, Tiffany Cai, and others!
Fall 2023: Some work on choosing data-dependent base distributions for continuous-time flows! Stochastic interpolants with data-dependent couplings (preprint), work with Michael Albergo, Nick Boffi, Rajesh Ranganath, and Eric Vanden-Eijnden!
Summer 2023: We are running the second iteration of the workshop on Spurious Correlations, Invariance, and Stability at ICML 2023!
Spring 2023: Where to Diffuse, How to Diffuse, and How to Get Back: Automated Learning for Multivariate Diffusions accepted to ICLR 2023! Joint work with Raghav Singhal and Rajesh Ranganath.
Fall 2022: glad to give a guest lecture with my collaborator Raghav Singhal, on diffusions, at Alfredo and Yann’s deep learning class at NYU
Fall 2022: glad to take part in, and give a short talk at, the Flatiron Institute’s workshop on Sampling, Transport, and Diffusions!
Summer 2022: excited to be co-organizing the ICML Workshop on Spurious Correlations, Invariance, and Stability on July 22. Looking forward to all the talks, and congrats to accepted authors!
Summer 2022: Survival Mixture Density Networks accepted to Machine Learning for Healthcare Conference!
Summer 2022: glad to continue at Apple Health AI for the summer!
Spring 2022: Learning Invariant Representations with Missing Data (a full version) accepted to CLeaR (Causal Learning and Reasoning) 2022!
Fall 2021: Learning Invariant Representations with Missing Data accepted to NeurIPS 2021 DistShift Workshop (on distribution shifts)! Work done as part of my internship at Apple Health AI.
Fall 2021: selected as a recipient of the NeurIPS 2021 Outstanding Reviewer Award. Glad to be a part of it!
Fall 2021: Inverse-Weighted Survival Games accepted to NeurIPS 2021!
Summer 2021: working with Apple’s Health AI team this summer supervised by Andy Miller and team!
Spring 2021: Understanding Failures in Out-of-Distribution Detection with Deep Generative Models accepted to ICML 2021!
Spring 2021: Understanding Out-of-Distribution Detection with Deep Generative Models accepted to RobustML workshop @ ICLR 2021!
Fall 2020: I qualified! Upgrade from Student to Candidate.
Fall 2020: after some time away from Harvard CS, happy to help out Prof. Nada Amin with the Harvard AI/PL seminar.
Fall 2020: the deep learning course I TA'ed in Spring 2020 for Yann LeCun and Alfredo Canziani is now up on Alf's GitHub page. Check out all of Alf's wonderful teaching materials, and thanks to the students for their note-taking!
Fall 2020: X-CAL: Explicit Calibration for Survival Analysis accepted to NeurIPS 2020!
Summer 2019: I’m working in Emtiyaz Khan’s Approximate Bayesian Inference group at RIKEN AIP in Tokyo!
I was a research assistant and teaching fellow in the computer science department at Harvard SEAS, and I am still an on/off TF for the Harvard undergraduate ML course. Between Harvard and NYU, I worked with the CoCoSci group at MIT BCS. Before that, I studied music composition, improvisation, and theory at New England Conservatory with Anthony Coleman, Stratis Minakakis, and Ran Blake. I am still involved with music and rehearse with Gamelan Kusuma Laras, a classical Javanese ensemble that performs the repertoire of the courts of Central Java.