Vignesh Kothapalli

Hi, I’m a first-year CS PhD student at Stanford University. I’m primarily interested in understanding the computational aspects of learning in neural networks, and in using these insights to develop novel modeling techniques.
Previously, I was a senior ML engineer at LinkedIn AI, building recommendation foundation models. I earned my MSc in Computer Science at NYU Courant, where I had an amazing time as part of the Math and Data Group. I have also contributed to TensorFlow and served as a maintainer for TensorFlow-IO during my time at IBM. I hold a B.Tech in Electronics and Communication Engineering from IIT Guwahati.
What’s New
- [2025.09]
Started Ph.D. at Stanford!
- [2025.09]
Our work on the distillation and compression of production-grade LLMs at LinkedIn has been accepted as an Oral presentation at EMNLP (Industry Track). We have also open-sourced our approaches in the “fmchisel” library.
- [2025.08]
“From Spikes to Heavy Tails: Unveiling the Spectral Evolution of Neural Networks” has been accepted to TMLR!
- [2025.06]
Our “Liger Kernel” paper has been accepted to the CODE ML Workshop at ICML 2025!
- [2025.05]
The “CoT-ICL Lab” paper has been accepted to the ACL 2025 main conference!
- [2025.04]
“Can Kernel Methods Explain How the Data Affects Neural Collapse?” has been accepted to TMLR!
- [2025.01]
The 360Brew foundation model tech report is available on arXiv!
- [2024.10]
Our tech report on Liger Kernel is now available on arXiv. (GPU Mode/Lightning AI/AMD+Embedded LLM/Blog)
Academic Service
- Conference Reviewer: NeurIPS 2024 (Top-Reviewer), ICML 2025, ICLR 2026
- Journal Reviewer: IEEE Transactions on Cybernetics, IEEE Access, IEEE Transactions on Industrial Informatics, TMLR