Ainesh Bakshi
Email: ainesh (at) mit (dot) edu
Office: 2-174
I am currently a postdoc in the Computer Science and Mathematics departments at MIT, where I collaborate closely with Ankur Moitra, Sam Hopkins, and Piotr Indyk. I am broadly interested in Theoretical Computer Science; my current research interests include robust algorithm design, the sum-of-squares convex programming hierarchy, randomized numerical linear algebra, and high-dimensional probability.
I recently finished my PhD at Carnegie Mellon University, where I was extremely fortunate to be co-advised by Pravesh Kothari and David Woodruff. In the summer of 2021, I interned with Madhur Tulsiani and Yury Makarychev at the Toyota Technological Institute at Chicago, and in the summer of 2020, I interned with Ken Clarkson at IBM Almaden.
Even earlier, I was an undergrad at Rutgers, New Brunswick, where I had the pleasure of working with Martin Farach-Colton and Pranjal Awasthi.
Dissertation
Publications
- Krylov Methods are (nearly) Optimal for Low-Rank Approximation
Ainesh Bakshi, Shyam Narayanan
Preprint [arXiv].
- An Improved Classical Singular Value Transform for Quantum Machine Learning
Ainesh Bakshi, Ewin Tang
Preprint [arXiv].
- Tensor Decompositions Meet Control Theory: Learning General Mixtures of Linear Dynamical Systems
Ainesh Bakshi, Allen Liu, Ankur Moitra, Morris Yau
ICML 2023.
- A New Approach to Learning a Linear Dynamical System
Ainesh Bakshi, Allen Liu, Ankur Moitra, Morris Yau
STOC 2023 [arXiv].
- Sub-Quadratic Algorithms for Kernel Matrices via Kernel Density Estimation
Ainesh Bakshi, Piotr Indyk, Praneeth Kacham, Sandeep Silwal, Samson Zhou
ICLR 2023, Selected for Spotlight Presentation [arXiv].
- Low-Rank Approximation with 1/epsilon^(1/3) Matrix-Vector Products
Ainesh Bakshi, Ken Clarkson, David Woodruff
STOC 2022 [arXiv] [Long Talk @ CMU] [Talk @ STOC].
- Robustly Learning Mixtures of k Arbitrary Gaussians
Ainesh Bakshi, Ilias Diakonikolas, He Jia, Daniel Kane, Pravesh Kothari, Santosh Vempala
STOC 2022 [arXiv] [Long Talk @ Northwestern] [Talk @ STOC].
- Robust Linear Regression: Optimal Rates in Polynomial Time
Ainesh Bakshi, Adarsh Prasad
STOC 2021 [arXiv] [Talk @ STOC].
- Learning a Latent Simplex in Input Sparsity Time
Ainesh Bakshi, Chiranjib Bhattacharyya, Ravi Kannan, David Woodruff, Samson Zhou
ICLR 2021, Selected for Spotlight Presentation [arXiv].
- List-Decodable Subspace Recovery: Dimension Independent Error in Polynomial Time
Ainesh Bakshi, Pravesh Kothari
SODA 2021 [arXiv].
- Testing Positive Semi-Definiteness via Random Submatrices
Ainesh Bakshi, Nadiia Chepurko, Rajesh Jayaram
FOCS 2020 [arXiv].
- Outlier-Robust Clustering of Non-spherical Mixtures
Ainesh Bakshi, Pravesh Kothari
FOCS 2020 [arXiv], [Joint Talk @ FOCS'20], [Long Talk @ TTI-C]
Conference version to be merged with this paper.
- Robust and Sample Optimal Algorithms for PSD Low Rank Approximation
Ainesh Bakshi, Nadiia Chepurko, David Woodruff
FOCS 2020 [arXiv], [Talk @ WOLA'20], [Joint Talk @ FOCS'20].
- Weighted Maximum Independent Set of Geometric Objects in Turnstile Streams
Ainesh Bakshi, Nadiia Chepurko, David Woodruff
APPROX 2020 [arXiv] [Talk @ Approx].
- Robust Communication-Optimal Distributed Clustering Algorithms
Pranjal Awasthi, Ainesh Bakshi, Nina Balcan, Colin White, David Woodruff
ICALP 2019 [arXiv].
- Learning Two Layer Rectified Neural Networks in Polynomial Time
Ainesh Bakshi, Rajesh Jayaram, David Woodruff
COLT 2019 [arXiv].
- Sublinear Time Low-Rank Approximation of Distance Matrices
Ainesh Bakshi, David Woodruff
NeurIPS 2018, Selected for Spotlight Presentation [arXiv], [Talk @ CMU TL].
Undergraduate Work
- "How to Fragment Your File System"
A. Conway, A. Bakshi, Y. Jiao, Y. Zhan, M. Bender, W. Jannen, R. Johnson, B. Kuszmaul, D. Porter, J. Yuan, M. Farach-Colton
;login: magazine, vol. 42(2), 2017.
- "File Systems Fated for Senescence? Nonsense, Says Science!"
A. Conway, A. Bakshi, Y. Jiao, Y. Zhan, M. Bender, W. Jannen, R. Johnson, B. Kuszmaul, D. Porter, J. Yuan, M. Farach-Colton
Proceedings of the 15th USENIX Conference on File and Storage Technologies (FAST '17), Santa Clara, CA, February 2017.
Teaching