Vignesh Kothapalli


I am a first-year CS PhD student at Stanford University, advised by Prof. Jure Leskovec. I am interested in the computational aspects of learning in neural networks. My research is supported by the Stanford School of Engineering Fellowship.

Previously, I was a senior ML engineer at LinkedIn AI, building foundation models for recommendation. I earned my MSc in Computer Science at NYU Courant, where I was advised by Prof. Joan Bruna in the Math and Data Group. I have also contributed to TensorFlow and maintained TensorFlow-IO at IBM, and I hold a B.Tech in Electronics and Communication Engineering from IIT Guwahati.

Email / Scholar / GitHub / LinkedIn / X

Publications

  1. PluRel: Synthetic Data unlocks Scaling Laws for Relational Foundation Models
    Vignesh Kothapalli, Rishabh Ranjan, Valter Hudovernik, Vijay Prakash Dwivedi, Johannes Hoffart, Carlos Guestrin, and Jure Leskovec
    International Conference on Machine Learning, 2026
  2. Distilling the Essence: Efficient Reasoning Distillation via Sequence Truncation
    Wei-Rui Chen, Vignesh Kothapalli, Ata Fatahibaarzi, Hejian Sang, Shao Tang, Qingquan Song, Zhipeng Wang, and Muhammad Abdul-Mageed
    In Findings of the Association for Computational Linguistics, 2026
  3. To Think or Not to Think: The Hidden Cost of Meta-Training with Excessive CoT Examples
    Vignesh Kothapalli, Ata Fatahibaarzi, Hamed Firooz, and Maziar Sanjabi
    In Proceedings of the Association for Computational Linguistics, 2026
  4. Scaling Down, Serving Fast: Compressing and Deploying Efficient LLMs for Recommendation Systems
    Kayhan Behdin, Ata Fatahibaarzi, Qingquan Song, Yun Dai, Aman Gupta, Zhipeng Wang, Hejian Sang, Shao Tang, Gregory Dexter, Sirou Zhu, Siyu Zhu, Tejas Dharamsi, Vignesh Kothapalli, Zhoutong Fu, Yihan Cao, Pin-Lun Hsu, Fedor Borisyuk, Natesh S. Pillai, Luke Simon, and Rahul Mazumder
    In Conference on Empirical Methods in Natural Language Processing (Industry Track), 2025
  5. CoT-ICL Lab: A Synthetic Framework for Studying Chain-of-Thought Learning from In-Context Demonstrations
    Vignesh Kothapalli, Hamed Firooz, and Maziar Sanjabi
    In Proceedings of the Association for Computational Linguistics, 2025
  6. From Spikes to Heavy Tails: Unveiling the Spectral Evolution of Neural Networks
    Vignesh Kothapalli, Tianyu Pang, Shenyang Deng, Zongmin Liu, and Yaoqing Yang
    Transactions on Machine Learning Research, 2025
  7. Liger-Kernel: Efficient Triton Kernels for LLM Training
    Pin-Lun Hsu, Yun Dai, Vignesh Kothapalli, Qingquan Song, Shao Tang, Siyu Zhu, Steven Shimizu, Shivam Sahni, Haowen Ning, Yanning Chen, and Zhipeng Wang
    Championing Open-source DEvelopment in ML Workshop @ ICML25, 2025
  8. Can Kernel Methods Explain How the Data Affects Neural Collapse?
    Vignesh Kothapalli and Tom Tirer
    Transactions on Machine Learning Research, 2025
  9. A Neural Collapse Perspective on Feature Evolution in Graph Neural Networks
    Vignesh Kothapalli, Tom Tirer, and Joan Bruna
    Advances in Neural Information Processing Systems, 2023
  10. Randomized Schur Complement Views for Graph Contrastive Learning
    Vignesh Kothapalli
    In International Conference on Machine Learning, 2023
  11. Neural Collapse: A Review on Modelling Principles and Generalization
    Vignesh Kothapalli
    Transactions on Machine Learning Research, 2023
  12. Abnormal Event Detection on BMTT-PETS 2017 Surveillance Challenge
    Vignesh Kothapalli, Gaurav Yadav, and Amit Sethi
    In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, 2017