I am an Assistant Professor in the Department of Electrical and Computer Engineering at the University of Texas at Austin (UT Austin). I am also a member of UT-MINDS (Machine INtelligence & Decision Systems) and WNCG (Wireless Networking & Communications Group).
Before joining UT Austin, I was a Postdoctoral Associate in the Laboratory for Information and Decision Systems (LIDS) at MIT. Prior to that, I was a Research Fellow at the Simons Institute for the Theory of Computing at UC Berkeley for the program on Bridging Continuous and Discrete Optimization. I obtained my Ph.D. in Electrical and Systems Engineering from the University of Pennsylvania. For a slightly more formal third-person bio, please see the Bio tab.
My current research focuses on the theory and applications of convex and non-convex optimization in large-scale machine learning and data science problems.
If you are interested in working with me: please apply to the ECE graduate program and mention my name in your application. If you are already at UT Austin, please send me an email and we can arrange a time to meet.
- October 2019: New paper out “One Sample Stochastic Frank-Wolfe”
- October 2019: New paper out “FedPAQ: A Communication-Efficient Federated Learning Method with Periodic Averaging and Quantization”
- October 2019: Our paper “A Primal-Dual Quasi-Newton Method for Exact Consensus Optimization” is accepted for publication in Trans. on Signal Processing.
- September 2019: The following papers are accepted to NeurIPS 2019:
– “Robust and Communication-Efficient Collaborative Learning”
– “Stochastic Continuous Greedy++: When Upper and Lower Bounds Match”
- August 2019: New paper out “On the Convergence Theory of Gradient-Based Model-Agnostic Meta-Learning Algorithms”
- August 2019: Officially started at UT Austin as an assistant professor.
- July 2019: New paper out “Robust and Communication-Efficient Collaborative Learning”
- July 2019: Our paper “An Exact Quantized Decentralized Gradient Descent Algorithm” is accepted for publication in Trans. on Signal Processing.
- May 2019: New paper out “Proximal Point Approximations Achieving a Convergence Rate of O(1/k) for Smooth Convex-Concave Saddle Point Problems: Optimistic Gradient and Extra-gradient Methods”
- February 2019: Delivered the talk “Achieving Acceleration via Direct Discretization of Heavy-Ball Ordinary Differential Equation” at ITA 2019.
- February 2019: New paper out “Stochastic Conditional Gradient++: (Non-)Convex Minimization and Continuous Submodular Maximization”
- February 2019: New paper out “Quantized Frank-Wolfe: Communication-Efficient Distributed Optimization”
- January 2019: New paper out “A Unified Analysis of Extra-gradient and Optimistic Gradient Methods for Saddle Point Problems: Proximal Point Approach”
- December 2018: Our paper “Efficient Nonconvex Empirical Risk Minimization via Adaptive Sample Size Methods” is accepted to AISTATS 2019.
(For the full list of news please check the News tab.)