About me

My name is Michael and I am a research engineer on the Polymathic team at the Flatiron Institute. I received my Ph.D. from the University of Colorado Boulder, where I worked under Prof. Jed Brown on machine learning for computational physics, focusing on problems in data assimilation, dynamics modeling, and large-scale deep learning. Prior to beginning my Ph.D., I worked as a data scientist in industry, where I learned to apply machine learning to concrete problems across the healthcare, financial, and energy sectors.

My research interests are broadly in machine learning and optimization. I like digging into ideas that are coming into prominence and finding connections to better-understood tools, then using those connections to learn from new types of data or to develop algorithms that are especially data efficient for certain classes of problems. Right now, I’m especially interested in ML for physics-driven systems where we have prior knowledge of system behavior in the form of PDEs, invariances, or conservation laws.

Outside of work, I do a lot of climbing, running, and reading.

News

  • 2023/12/18 - MPP won best paper at the NeurIPS 2023 Workshop on AI for Science!
  • 2023/12/08 - Our paper on stability of neural operators was accepted to TMLR!
  • 2023/10/09 - Released work on multiple physics pretraining with the PolymathicAI collaboration on arXiv!
  • 2023/06/22 - Released joint work with Peter and Shashank from LBL on stability in autoregressive neural operators on arXiv!
  • 2022/05/10 - Excited to be working with Lawrence Berkeley National Lab on deep learning-based weather models this summer!
  • 2021/09/28 - The updated version of our earlier workshop paper, now titled “Learning to Assimilate in Chaotic Dynamical Systems,” was accepted to NeurIPS 2021.
  • 2021/05/21 - I will be working with Argonne National Lab as a Givens associate this summer!