I am an Assistant Professor in the Department of Electrical and Computer Engineering (ECE) at the University of Texas at Austin (UT Austin), where I hold the Texas Instruments/Kilby Fellowship.
I am a core member of the Machine Learning Laboratory (MLL), the NSF AI Institute for Foundations of Machine Learning (IFML), and the Wireless Networking & Communications Group (WNCG) at UT Austin. I am also a member of the NSF AI Institute for Future Edge Networks and Distributed Intelligence (AI-EDGE).
My current research focuses on the theory and applications of convex and non-convex optimization in large-scale machine learning and data science problems.
If you are interested in working with me: Please apply to the ECE graduate program and mention my name in your application. If you are already at UT Austin, please send me an email and we can arrange a time to meet.
Selected Recent News
Jan. 2022: Our paper “Minimax Optimization: The Case of Convex-Submodular” was accepted to AISTATS 2022 for an oral presentation.
Sept. 2021: The following papers of ours were accepted to NeurIPS 2021:
– Exploiting Local Convergence of Quasi-Newton Methods Globally: Adaptive Sample Size Approach [pdf]
– Generalization of Model-Agnostic Meta-Learning Algorithms: Recurring and Unseen Tasks [pdf]
– On the Convergence Theory of Debiased Model-Agnostic Meta-Reinforcement Learning [pdf]
Sept. 2021: Honored to be appointed as a Texas Instruments/Kilby Fellow.
Aug. 2021: I’m grateful to the NSF for their generous support of my work. [Award Link]
May 2021: Honored to receive the ARO Early Career Program (ECP) Award (previously known as ARO YIP). [News release]
(For the full list of news, please check the News tab.)
My research is supported by NSF Grants CCF-2007668, ECCS-2127697, and IIS-2112471, an ARO Early Career Program (ECP) Award, the Machine Learning Laboratory at UT Austin, and the NSF AI Institutes IFML and AI-EDGE.