PhD student | Computer Science | Princeton University
I'm a second-year PhD student at Princeton, where I have the privilege of being advised by Prof. Elad Hazan. I am grateful to be supported by the Gordon Wu Fellowship.
I am interested in machine learning theory, with a focus on online control and learning in games, motivated by challenges in reinforcement learning and LLM alignment.
During my Master’s studies, I had the pleasure of working with Prof. Daniel Soudry and collaborating with Prof. Nathan Srebro on the theory of generalization in deep learning. During my undergraduate studies, I was fortunate to intern with Prof. Michal Irani at the Weizmann Institute, where we worked on memorization in deep learning.
I also help organize the ALG-ML seminar here at Princeton.
Besides research, I enjoy hiking, diving, and jazz music.
My email: first-name dot last-name at gmail dot com
Preprints:
Gon Buzaglo, Noah Golowich and Elad Hazan.
Anand Brahmbhatt, Gon Buzaglo, Sofiia Druchyna and Elad Hazan (authors listed alphabetically).
Conference Papers:
Anand Brahmbhatt, Gon Buzaglo, Sofiia Druchyna and Elad Hazan (authors listed alphabetically). NeurIPS 2025
Master's Work:
Gon Buzaglo*, Itamar Harel*, Mor Shpigel-Nacson*, Alon Brutzkus, Nathan Srebro and Daniel Soudry. ICML 2024
Undergraduate Work:
Gon Buzaglo*, Niv Haim*, Gilad Yehudai, Gal Vardi, Yakir Oz, Yaniv Nikankin and Michal Irani. NeurIPS 2023
Itay Evron, Edward Moroshko, Gon Buzaglo, Maroun Khriesh, Badea Marjieh, Nathan Srebro and Daniel Soudry. ICML 2023
Instructor: Prof. Sanjeev Arora
2024 - 2029 | PhD in Computer Science | Princeton University
Studying Theoretical Machine Learning under the supervision of Prof. Elad Hazan.
2023 - 2024 | M.Sc. in Electrical and Computer Engineering | Technion
Studied Deep Learning theory under the supervision of Prof. Daniel Soudry.
Focused on generalization bounds for neural networks in the setting of random interpolation.
2020 - 2024 | B.Sc. in Computer Science and Physics | Technion
Studied Deep Learning under the supervision of Prof. Michal Irani.
Focused on memorization in neural networks via training data reconstruction.