Chase Goddard

PhD student at Princeton University


I’m Chase Goddard, a PhD student in the physics department at Princeton University, advised by David Schwab and Bill Bialek. My research has focused on understanding how modern machine learning methods work, particularly questions related to optimization and generalization. Much of my work relies on controlled experiments that are simple enough to be theoretically well-understood, yet rich enough to shed light on the complex phenomena observed in deep learning. I’ve also taught several courses at the graduate and undergraduate levels.

Before Princeton, I majored in physics and computer science at Cornell University, where I worked with Carl Franck on X-ray spectroscopy and with Julia Thom-Levy on the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider. I also held an internship at Boston Consulting Group’s Henderson Institute, where I worked on a data analysis project that contributed to the Fortune Future 50 ranking.

I am currently working on understanding and improving reasoning in LLMs via reinforcement learning, and on understanding global geometric properties of the loss landscape of overparameterized models. Stay tuned!

For an up-to-date list of my work, see my Google Scholar, or see below.

Publications

2025

July
  1. When can in-context learning generalize out of task distribution?
    Chase Goddard, Lindsay M. Smith, Vudtiwat Ngampruetikorn, and David J. Schwab
    In Forty-second International Conference on Machine Learning, Jul 2025
May
  1. Optimization and variability can coexist
    Marianne Bauer, William Bialek, Chase Goddard*, Caroline M. Holmes, and 5 more authors
    Preprint, May 2025
    *Authors in alphabetical order. To be submitted.

2024

December
  1. Model Recycling: Model component reuse to promote in-context learning
    Lindsay M. Smith, Chase Goddard, Vudtiwat Ngampruetikorn, and David J. Schwab
    In NeurIPS 2024 Workshop on Scientific Methods for Understanding Deep Learning, Dec 2024
  2. Specialization-generalization transition in exemplar-based in-context learning
    Chase Goddard, Lindsay M. Smith, Vudtiwat Ngampruetikorn, and David J. Schwab
    In NeurIPS 2024 Workshop on Scientific Methods for Understanding Deep Learning, Dec 2024

2021

October
  1. Testing for the continuous spectrum of x rays predicted to accompany the photoejection of an atomic inner-shell electron
    Philip Jacobson, Andrija Rasovic, Arthur Campello, Chase Goddard, and 7 more authors
    Phys. Rev. A, Oct 2021