Attention Is All You Need

An illustration of the main components of the transformer model from the paper

"Attention Is All You Need"[1] is a 2017 landmark[2][3] research paper in machine learning authored by eight scientists working at Google. The paper introduced a new deep learning architecture known as the transformer, based on the attention mechanism proposed in 2014 by Bahdanau et al.[4] It is considered a foundational[5] paper in modern artificial intelligence, as the transformer approach has become the main architecture of large language models such as those based on GPT.[6][7] At the time, the research focused on improving seq2seq techniques for machine translation, but the authors went further, foreseeing the technique's potential for other tasks such as question answering and what is now known as multimodal generative AI.[1]
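The attention mechanism at the core of the architecture is defined in the paper as scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. A minimal NumPy sketch of that formula (the array shapes and random toy data are illustrative, not from the paper):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention as defined in the paper:
    Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output row is a weighted sum of the value rows

# toy example: 2 queries attending over 3 key/value pairs of dimension 4
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4): one output vector per query
```

The 1/√d_k scaling keeps the dot products from growing with the key dimension, which would otherwise push the softmax into regions with vanishing gradients.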

The paper's title is a reference to the song "All You Need Is Love" by the Beatles.[8] The name "Transformer" was picked because Jakob Uszkoreit, one of the paper's authors, liked the sound of that word.[9]

An early design document was titled "Transformers: Iterative Self-Attention and Processing for Various Tasks", and included an illustration of six characters from the Transformers animated show. The team was named Team Transformer.[8]

Some early tasks the team tried the Transformer architecture on included English-to-German translation, generating Wikipedia articles on "The Transformer", and parsing. These convinced the team that the Transformer was a general-purpose language model, not merely one good for translation.[9]

As of 2024, the paper has been cited more than 140,000 times.[10]

  1. ^ a b Vaswani, Ashish; Shazeer, Noam; Parmar, Niki; Uszkoreit, Jakob; Jones, Llion; Gomez, Aidan N; Kaiser, Łukasz; Polosukhin, Illia (2017). "Attention is All you Need". Advances in Neural Information Processing Systems. 30. Curran Associates, Inc. arXiv:1706.03762.
  2. ^ Love, Julia (10 July 2023). "AI Researcher Who Helped Write Landmark Paper Is Leaving Google". Bloomberg News. Retrieved 1 April 2024.
  3. ^ Goldman, Sharon (20 March 2024). "'Attention is All You Need' creators look beyond Transformers for AI at Nvidia GTC: 'The world needs something better'". VentureBeat. Retrieved 1 April 2024.
  4. ^ Bahdanau, Dzmitry; Cho, Kyunghyun; Bengio, Yoshua (19 May 2016). "Neural Machine Translation by Jointly Learning to Align and Translate". arXiv:1409.0473 [cs.CL].
  5. ^ Shinde, Gitanjali; Wasatkar, Namrata; Mahalle, Parikshit (6 June 2024). Data-Centric Artificial Intelligence for Multidisciplinary Applications. CRC Press. p. 75. ISBN 9781040031131.
  6. ^ Cite error: The named reference Forbes was invoked but never defined (see the help page).
  7. ^ Cite error: The named reference Financial Times was invoked but never defined (see the help page).
  8. ^ a b Levy, Steven. "8 Google Employees Invented Modern AI. Here's the Inside Story". Wired. ISSN 1059-1028. Retrieved 20 March 2024.
  9. ^ a b Marche, Stephen (23 August 2024). "Was Linguistic A.I. Created by Accident?". The New Yorker. ISSN 0028-792X. Retrieved 24 August 2024.
  10. ^ "Meet the $4 Billion AI Superstars That Google Lost". Bloomberg News. 13 July 2023.
