
Ainesh Bakshi
Email: ainesh (at) mit (dot) edu
Office: 2174

I am currently a postdoc in the Computer Science and Mathematics departments at MIT, where I collaborate closely with Ankur Moitra, Sam Hopkins, and Piotr Indyk. I am broadly interested in Theoretical Computer Science, and my current research interests are robust algorithm design, the sum-of-squares convex programming hierarchy, randomized numerical linear algebra, and high-dimensional probability.
I recently finished my PhD at Carnegie Mellon University, where I was extremely fortunate to be co-advised by Pravesh Kothari and David Woodruff. In summer 2021, I interned with Madhur Tulsiani and Yury Makarychev at the Toyota Technological Institute at Chicago, and in summer 2020, I interned with Ken Clarkson at IBM Almaden.
Even earlier, I was an undergrad at Rutgers, New Brunswick, where I had the pleasure of working with Martin Farach-Colton and Pranjal Awasthi.
Dissertation
Publications
 An Improved Classical Singular Value Transformation for Quantum Machine Learning
Ainesh Bakshi, Ewin Tang
Preprint [arXiv].
 A New Approach to Learning a Linear Dynamical System
Ainesh Bakshi, Allen Liu, Ankur Moitra, Morris Yau
STOC 2023 [arXiv].
 Sub-Quadratic Algorithms for Kernel Matrices via Kernel Density Estimation
Ainesh Bakshi, Piotr Indyk, Praneeth Kacham, Sandeep Silwal and Samson Zhou
ICLR 2023, Selected for Spotlight Presentation [arXiv].
 Low-Rank Approximation with 1/epsilon^(1/3) Matrix-Vector Products
Ainesh Bakshi, Ken Clarkson and David Woodruff
STOC 2022 [arXiv] [Long Talk @ CMU] [Talk @ STOC].
 Robustly Learning Mixtures of k Arbitrary Gaussians
Ainesh Bakshi, Ilias Diakonikolas, He Jia, Daniel Kane, Pravesh Kothari and Santosh Vempala
STOC 2022 [arXiv] [Long Talk @ Northwestern] [Talk @ STOC].
 Robust Linear Regression: Optimal Rates in Polynomial Time
Ainesh Bakshi, Adarsh Prasad
STOC 2021 [arXiv] [Talk @ STOC].
 Learning a Latent Simplex in Input Sparsity Time
Ainesh Bakshi, Chiranjib Bhattacharyya, Ravi Kannan, David Woodruff and Samson Zhou
ICLR 2021, Selected for Spotlight Presentation [arXiv].
 List-Decodable Subspace Recovery: Dimension Independent Error in Polynomial Time
Ainesh Bakshi, Pravesh Kothari
SODA 2021 [arXiv].
 Testing Positive Semi-Definiteness via Random Submatrices
Ainesh Bakshi, Nadiia Chepurko, Rajesh Jayaram
FOCS 2020 [arXiv].
 Outlier-Robust Clustering of Non-Spherical Mixtures
Ainesh Bakshi, Pravesh Kothari
FOCS 2020 [arXiv], [Joint Talk @ FOCS'20], [Long Talk @ TTIC]
Conference version to be merged with this paper.
 Robust and Sample Optimal Algorithms for PSD Low-Rank Approximation
Ainesh Bakshi, Nadiia Chepurko, David Woodruff
FOCS 2020 [arXiv], [Talk @ WOLA'20], [Joint Talk @ FOCS'20].
 Weighted Maximum Independent Set of Geometric Objects in Turnstile Streams
Ainesh Bakshi, Nadiia Chepurko, David Woodruff
APPROX 2020 [arXiv] [Talk @ Approx].
 Robust Communication-Optimal Distributed Clustering Algorithms
Pranjal Awasthi, Ainesh Bakshi, Nina Balcan, Colin White, David Woodruff
ICALP 2019 [arXiv].
 Learning Two Layer Rectified Neural Networks in Polynomial Time
Ainesh Bakshi, Rajesh Jayaram, David Woodruff
COLT 2019 [arXiv].
 Sublinear Time Low-Rank Approximation of Distance Matrices
Ainesh Bakshi, David Woodruff
NeurIPS 2018, Selected for Spotlight Presentation [arXiv], [Talk @ CMU TL].
Undergraduate Work

"How to Fragment Your File System"
A. Conway, A. Bakshi, Y. Jiao, Y. Zhan, M. Bender, W. Jannen, R. Johnson, B. Kuszmaul, D. Porter, J. Yuan, M. Farach-Colton
;login: magazine, vol. 42(2), 2017.

"File Systems Fated for Senescence? Nonsense, Says Science!"
A. Conway, A. Bakshi, Y. Jiao, Y. Zhan, M. Bender, W. Jannen, R. Johnson, B. Kuszmaul, D. Porter, J. Yuan, M. Farach-Colton
Proceedings of the 15th USENIX Conference on File and Storage Technologies (FAST '17), Santa Clara, CA, February 2017.
Teaching