I am an Assistant Professor in the Department of Electrical and Computer Engineering (ECE) at the University of Texas at Austin (UT Austin), where I am a Texas Instruments/Kilby Fellow.
I am a core member of the Machine Learning Laboratory (MLL), the NSF AI Institute for Foundations of Machine Learning (IFML), and the Wireless Networking & Communications Group (WNCG) at UT Austin. I am also a member of the NSF AI Institute for Future Edge Networks and Distributed Intelligence (AI-EDGE).
My current research focuses on the theory and applications of convex and non-convex optimization in large-scale machine learning and data science problems.
If you are interested in working with me, please apply to the ECE graduate program and mention my name in your application. If you are already at UT Austin, please send me an email and we can arrange a time to meet.
ECE Foundations of Data Science Seminars: Since Fall 2019, I have organized the Foundations of Data Science seminar series in the ECE department, sponsored by an NSF TRIPODS institute. You can find more information about our seminars here.
Selected Recent News
September 2021: The following papers were accepted to NeurIPS 2021:
– Exploiting Local Convergence of Quasi-Newton Methods Globally: Adaptive Sample Size Approach [pdf]
– Generalization of Model-Agnostic Meta-Learning Algorithms: Recurring and Unseen Tasks [pdf]
– On the Convergence Theory of Debiased Model-Agnostic Meta-Reinforcement Learning [pdf]
September 2021: Honored to be appointed as a Texas Instruments/Kilby Fellow.
August 2021: I’m grateful to the NSF for their generous support of my work. [Award Link]
May 2021: Honored to receive the ARO Early Career Program (ECP) Award (previously known as ARO YIP). [News release]
May 2021: Our paper “Exploiting Shared Representations for Personalized Federated Learning” is accepted to ICML 2021.
April 2021: Check out my talk on “Towards Communication-Efficient Personalized Federated Learning via Representation Learning and Meta-Learning” in the NSF Workshop on Communication Efficient Distributed Optimization. [Video]
March 2021: Check out my talk on “Exploiting Fast Local Convergence of Second-Order Methods Globally: Adaptive Sample Size Methods” in the Beyond First-Order Methods in Machine Learning Mini-symposium at the SIAM Conference on Computational Science and Engineering (CSE21).
January 2021: Our paper “Federated Learning with Compression: Unified Analysis and Sharp Guarantees” is accepted to AISTATS 2021.
(For the full list of news, please check the News tab.)