Aryan Mokhtari
- Assistant Professor, ECE, UT Austin (Jack Kilby/Texas Instruments Fellow)
- Visiting Faculty Researcher, Google Research
- Core Member, NSF AI Inst. for Foundations of Machine Learning (IFML)
- Core Member, NSF AI Inst. for Future Edge Networks & Distributed Intelligence (AI-EDGE)
- Google Scholar / CV
Research Interests:
- Optimization Theory and Algorithms:
- Convex and Nonconvex Optimization, Minimax Optimization, Bilevel Optimization, Parameter-free and Adaptive Optimization
- Machine Learning Theory:
- Representation Learning, Multi-task Learning, Contrastive Learning, In-Context Learning, Machine Unlearning
News
- Dec. ’24: Plenary talk at the Optimization for ML Workshop at NeurIPS. Slides are available here.
- Sep. ’24: Papers at NeurIPS 2024:
- Non-asymptotic Global Convergence Analysis of BFGS with the Armijo-Wolfe Line Search [pdf] (Spotlight)
- In-Context Learning with Transformers: Softmax Attention Adapts to Function Lipschitzness [pdf] (Spotlight)
- Adaptive and Optimal Second-order Optimistic Methods for Minimax Optimization [pdf]
- An Accelerated Gradient Method for Simple Bilevel Optimization with Convex Lower-level Problem [pdf]
- Stochastic Newton Proximal Extragradient Method [pdf]
- July ’24: Qiujiang (joining Goldman Sachs), Isidoros (joining Meta), Liam (joining Snap Research), and Devyani (joining Goldman Sachs) have graduated! Wishing them all the best as they embark on their new journeys.
- May ’24: Paper at ICML 2024:
- Provable Multi-Task Representation Learning by Two-Layer ReLU Neural Networks [pdf] (Oral Presentation)
- Apr. ’24: Google Research Scholar Award. (Announcement)
- Jan. ’24: NSF CAREER Award. (Announcement)
- Jan. ’24: Paper at AISTATS 2024:
- Krylov Cubic Regularized Newton: A Subspace Second-Order Method with Dimension-Free Convergence Rate [pdf]
Support
My research has been supported by an NSF CAREER Award CCF-2338846, NSF Grants CCF-2007668 and ECCS-2127697, an ARO Early Career Program (ECP) Award, a Google Research Scholar Award, the NSF AI Institute for Foundations of Machine Learning (IFML), the NSF AI Institute for Future Edge Networks and Distributed Intelligence (AI-EDGE), and the Machine Learning Laboratory at UT Austin.