| Ashish Vaswani | |
|---|---|
| Born | 1986 (age 38–39) |
| Alma mater | Birla Institute of Technology, Mesra; University of Southern California |
| Known for | Transformer (deep learning architecture) |
| Scientific career | |
| Fields | Computer science |
| Institutions | Google Brain; Information Sciences Institute, University of Southern California |
| Thesis | Smaller, Faster, and Accurate Models for Statistical Machine Translation (2014) |
| Doctoral advisor | David Chiang |
Ashish Vaswani (born 1986) [1] is an Indian computer scientist. Vaswani conducted research at Google Brain and, earlier in his career, was affiliated with the Information Sciences Institute at the University of Southern California.
Vaswani is a co-author of the 2017 paper "Attention Is All You Need," which introduced the Transformer neural network architecture. [2] The Transformer has been used in the development of subsequent NLP models such as BERT and ChatGPT, and their successors.
Vaswani earned his undergraduate engineering degree in Computer Science from Birla Institute of Technology, Mesra (BIT Mesra) in 2002. In 2004, he enrolled at the University of Southern California for graduate studies. [3] He earned his PhD in Computer Science at the University of Southern California, supervised by David Chiang. [4] During his research career at Google, [5] Vaswani was part of the Google Brain team, where he conducted the work leading to the "Attention Is All You Need" publication. Prior to joining Google, he was affiliated with the Information Sciences Institute at the University of Southern California.
After Google, Vaswani co-founded Adept AI, a machine learning-focused startup that developed AI agents and tools for software automation. He has since left the company. [6] [7] He is currently co-founder and CEO of Essential AI.
Vaswani's most notable paper, "Attention Is All You Need", was published in 2017. [8] The paper introduced the Transformer model, which uses self-attention mechanisms instead of recurrence for sequence-to-sequence tasks.
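The core operation of the paper, scaled dot-product self-attention, can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's full multi-head implementation; the toy dimensions and the use of identity projections for queries, keys, and values are simplifications for brevity.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # pairwise similarity between positions
    # Numerically stable softmax over the key dimension.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output is a weighted average of the values

# Toy example: a sequence of 3 positions with model dimension 4.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
# In self-attention, Q, K, and V are all derived from the same input sequence;
# in the actual Transformer they are separate learned linear projections of X.
out = scaled_dot_product_attention(X, X, X)
print(out.shape)  # (3, 4)
```

Because every position attends to every other position directly, the model captures long-range dependencies without the step-by-step recurrence of earlier sequence models.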
The Transformer architecture has become foundational to modern language models and NLP systems, including BERT (2018), [9] GPT-2 and GPT-3 (2019–2020), and more recent models such as ChatGPT, GPT-4, and GPT-5. The "Attention Is All You Need" paper is among the most cited papers in machine learning.