Lam M. Nguyen | |
---|---|
Education | Lomonosov Moscow State University (BS); McNeese State University (MBA); Lehigh University (PhD) |
Known for | SARAH algorithm; optimization for machine learning |
Scientific career | |
Fields | Optimization, Machine Learning, Applied Mathematics |
Institutions | IBM Research; MIT-IBM Watson AI Lab; Lehigh University |
Doctoral advisor | Katya Scheinberg |
Lam M. Nguyen is a Vietnamese computer scientist and applied mathematician known for his contributions to optimization algorithms for machine learning, most notably for proposing and developing the SARAH stochastic recursive gradient method. [1] He is a Research Scientist at IBM Research, Thomas J. Watson Research Center, New York, USA, where his research focuses on optimization, federated learning, and time-series foundation models. [2] He is a Senior Member of INFORMS and a member of the Beta Gamma Sigma honor society, one of the highest honors awarded to business students. [3] [4]
Nguyen earned a Bachelor of Science degree in Applied Mathematics and Computer Science from Lomonosov Moscow State University (2008), an M.B.A. from McNeese State University (2013), and a Ph.D. in Industrial and Systems Engineering from Lehigh University (2018). His dissertation, A Service System with On-Demand Agents, Stochastic Gradient Algorithms and the SARAH Algorithm, received the university's Elizabeth V. Stout Dissertation Award. [5] His doctoral advisor was Katya Scheinberg.
He joined IBM Research in 2018 as a Research Scientist. [2] Since 2020, Nguyen has been a Principal Investigator at the MIT-IBM Watson AI Lab, leading projects on safe and interpretable learning for time-series data. He was appointed Adjunct Faculty at Lehigh University in 2024. [6]
Nguyen's research centers on optimization methods for machine learning and stochastic optimization. He is recognized as the lead inventor of the SARAH (Stochastic Recursive Gradient) algorithm, introduced at ICML 2017, which has influenced a wide class of variance-reduced optimization methods. [1]
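For context, the SARAH estimator replaces the plain stochastic gradient with a recursive, variance-reduced one: a full gradient is computed at the start of each outer loop, and every inner step updates v_t = ∇f_{i_t}(w_t) − ∇f_{i_t}(w_{t−1}) + v_{t−1}, followed by w_{t+1} = w_t − η v_t. The following is a minimal NumPy sketch of this update on a toy least-squares problem; the data, step size, loop lengths, and function names are illustrative choices, not values from the paper.

```python
# Minimal sketch of the SARAH recursive gradient estimator on a toy
# least-squares problem: min_w (1/n) * sum_i (a_i^T w - b_i)^2.
# Toy data, step size, and loop lengths are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

def grad_i(w, i):
    """Gradient of the i-th component loss f_i(w) = (a_i^T w - b_i)^2."""
    return 2.0 * (A[i] @ w - b[i]) * A[i]

def full_grad(w):
    """Full gradient (1/n) * sum_i grad_i(w)."""
    return 2.0 * A.T @ (A @ w - b) / n

def sarah(w0, eta=0.01, outer=20, inner=100):
    w = w0.copy()
    for _ in range(outer):
        w_prev = w.copy()
        v = full_grad(w)          # v_0: full gradient at the outer-loop start
        w = w - eta * v           # w_1 = w_0 - eta * v_0
        for _ in range(inner):
            i = rng.integers(n)
            # SARAH recursion: v_t = grad_i(w_t) - grad_i(w_{t-1}) + v_{t-1}
            v = grad_i(w, i) - grad_i(w_prev, i) + v
            w_prev = w.copy()
            w = w - eta * v       # w_{t+1} = w_t - eta * v_t
    return w

w_hat = sarah(np.zeros(d))
print("final objective:", np.mean((A @ w_hat - b) ** 2))
```

Unlike SVRG, which re-anchors each stochastic gradient to a fixed snapshot point, the SARAH recursion folds the previous estimate v_{t−1} into the new one, which is the property the paper exploits in its convergence analysis.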
He is the co-editor of the book Federated Learning: Theory and Practice (Elsevier, 2024), which provides a unified treatment of the theoretical and practical aspects of federated learning. [7]
Nguyen serves as an Action Editor for the Journal of Machine Learning Research and Machine Learning, and as an Associate Editor of the Journal of Optimization Theory and Applications. [8] [9] [10] He has served on the Organizing Committee of the Conference on Neural Information Processing Systems (NeurIPS) for 2023–2025, [11] and as a Senior Area Chair for the International Conference on Machine Learning (ICML), the International Conference on Learning Representations (ICLR), NeurIPS, and Artificial Intelligence and Statistics (AISTATS). He has also organized workshops at NeurIPS 2021 and AAAI 2023.
Nguyen has delivered invited lectures at major conferences, including multiple INFORMS Annual Meetings. He is a plenary speaker at the International Conference on Modeling, Computation and Optimization (MCO 2025), held at the University of Lorraine, France, with a talk titled Advances in Non-Convex Optimization: Shuffling Methods and Momentum Techniques for Machine Learning. [12]