Allen Institute for AI

Formation: 2014
Founder: Paul Allen
Type: Non-profit research institute
Tax ID no.: 82-4083177
Location: Seattle, Washington, U.S.
CEO: Ali Farhadi
Key people: Peter Clark, Yejin Choi, Noah Smith, Hannaneh Hajishirzi, Dan Weld, Chris Bretherton, Ani Kembhavi, Jes Lefcourt
Website: allenai.org

The Allen Institute for AI (abbreviated AI2) is a 501(c)(3) non-profit scientific research institute founded in 2014 by the late Microsoft co-founder and philanthropist Paul Allen. The institute seeks to conduct high-impact AI research and engineering in service of the common good.[1] AI2 is based in Seattle and also has an active office in Tel Aviv, Israel.[2]

History

Oren Etzioni was appointed by Paul Allen in September 2013 to direct research at the institute.[3] After leading the organization for nine years, Etzioni stepped down as CEO on September 30, 2022.[4] Peter Clark, the lead researcher on the institute's Aristo project, replaced him on an interim basis.[5] On June 20, 2023, AI2 announced Ali Farhadi as its next CEO, effective July 31, 2023.[5]

Teams

Generative AI

AI2 contributes to open-source artificial intelligence by releasing fully open large language models along with their datasets and model training assets.

OLMo model family

On May 11, 2023, AI2 announced it was developing OLMo, an open language model intended to match the performance of other state-of-the-art language models.[14] In February 2024, 1B- and 7B-parameter variants of the model were open-sourced, including the code, model weights with intermediate snapshots and logs, and the contents of the Dolma training dataset,[15] making OLMo the most open state-of-the-art model available at the time.[16][17] In November 2024, AI2 released the second iteration, OLMo 2, with 7B- and 13B-parameter models in the initial release.[18][19] In March 2025, AI2 released a 32B variant of OLMo 2, claiming it was "the first fully-open model (all data, code, weights, and details are freely available) to outperform GPT3.5-Turbo and GPT-4o mini".[20]
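As a minimal sketch of how such openly released weights can be used (the Hugging Face transformers library usage and the repository ID allenai/OLMo-2-1124-7B are assumptions based on AI2's public releases, not details stated in this article):

    # Minimal sketch: loading an OLMo 2 checkpoint with Hugging Face transformers.
    # The repository ID "allenai/OLMo-2-1124-7B" is an assumption; see
    # huggingface.co/allenai for the currently released model names.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "allenai/OLMo-2-1124-7B"  # assumed OLMo 2 7B checkpoint
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    # Generate a short continuation from a plain-text prompt.
    inputs = tokenizer("Language modeling is", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=50, do_sample=False)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))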

Tulu models and post-training recipes

In addition to the fully open OLMo family of models, AI2 has also developed Tulu, a family of instruction-tuned models and open post-training recipes that build on open-weights base models (e.g., Meta's Llama) to provide fully transparent alternatives to proprietary instruction-tuning methods.[21] AI2 released the first iteration of Tulu in June 2023,[22] with subsequent iterations in November 2023 (Tulu 2[23]) and November 2024 (Tulu 3[24]).
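As an illustrative sketch of using an instruction-tuned model from this line (the repository ID allenai/Llama-3.1-Tulu-3-8B and the chat-template call are assumptions about the released artifacts, not details stated here), a Tulu model can be prompted like any chat-style open-weights model:

    # Sketch: prompting a Tulu instruction-tuned model via transformers.
    # "allenai/Llama-3.1-Tulu-3-8B" is an assumed repository ID for a Tulu 3
    # model post-trained on top of Meta's Llama base weights.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "allenai/Llama-3.1-Tulu-3-8B"  # assumed Tulu 3 8B checkpoint
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    # Instruction-tuned models expect a chat-formatted prompt, built here
    # with the tokenizer's chat template.
    messages = [{"role": "user", "content": "Explain what post-training means."}]
    prompt_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    )
    outputs = model.generate(prompt_ids, max_new_tokens=100)
    # Decode only the newly generated tokens, skipping the prompt.
    print(tokenizer.decode(outputs[0][prompt_ids.shape[-1]:], skip_special_tokens=True))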

References

  1. "About — Allen Institute for AI". allenai.org. Retrieved 2023-05-26.
  2. "AI2 Israel — Allen Institute for AI". allenai.org. Retrieved 2023-05-26.
  3. Cook, John (2013-09-04). "Going beyond Siri and Watson: Microsoft co-founder Paul Allen taps Oren Etzioni to lead new Artificial Intelligence Institute". GeekWire. Retrieved 2023-05-26.
  4. Schlosser, Kurt (2022-06-15). "Oren Etzioni stepping down as CEO of Allen Institute for AI after nine years at research hub". GeekWire. Retrieved 2023-05-26.
  5. Bishop, Todd (2023-06-20). "Apple machine learning leader Ali Farhadi named CEO of Allen Institute for Artificial Intelligence". GeekWire. Archived from the original on 2023-06-20.
  6. "Aristo — Allen Institute for AI". allenai.org. Retrieved 2023-05-26.
  7. "Aristo — Allen Institute for AI". allenai.org. Retrieved 2023-05-26.
  8. "PRIOR". prior.allenai.org. Retrieved 2023-05-26.
  9. "AI2-THOR". Allen Institute for AI. Retrieved 2023-05-26.
  10. Rodriguez, Jesus (2021-07-08). "🔹🔸Edge#104: AllenNLP Makes Cutting-Edge NLP Models Look Easy". TheSequence. Retrieved 2023-05-26.
  11. "Semantic Scholar | Product". www.semanticscholar.org. Retrieved 2023-05-26.
  12. "AllenNLP — Allen Institute for AI". allenai.org. Retrieved 2023-05-26.
  13. Dormehl, Luke (2018-04-13). "Forget Cloning, A.I. is the Real Way to Let Your Family Pooch Live Forever". Digital Trends. Retrieved 2023-05-26.
  14. Groeneveld, Dirk; Beltagy, Iz; Walsh, Pete; Bhagia, Akshita; Kinney, Rodney; Tafjord, Oyvind; Jha, Ananya Harsh; Ivison, Hamish; Magnusson, Ian (2024). "OLMo: Accelerating the Science of Language Models". arXiv:2402.00838 [cs.CL].
  15. Soldaini, Luca; Kinney, Rodney; Bhagia, Akshita; Schwenk, Dustin; Atkinson, David; Authur, Russell; Bogin, Ben; Chandu, Khyathi; Dumas, Jennifer (2024). "Dolma: an Open Corpus of Three Trillion Tokens for Language Model Pretraining Research". arXiv:2402.00159 [cs.CL].
  16. AI2 (2023-05-18). "Announcing AI2 OLMo, an open language model made by scientists, for scientists". Medium. Retrieved 2023-05-26.
  17. Wiggers, Kyle (2024-02-01). "AI2 open sources text-generating AI models – and the data used to train them". TechCrunch. Retrieved 2024-02-03.
  18. "OLMo 2: The best fully open language model to date | Ai2". allenai.org. Retrieved 2025-08-26.
  19. OLMo, Team; Walsh, Pete; Soldaini, Luca; Groeneveld, Dirk; Lo, Kyle; Arora, Shane; Bhagia, Akshita; Gu, Yuling; Huang, Shengyi (2025). "2 OLMo 2 Furious". arXiv:2501.00656 [cs.CL].
  20. "OLMo 2 32B: First fully open model to outperform GPT 3.5 and GPT 4o mini | Ai2". allenai.org. Retrieved 2025-08-26.
  21. Coldewey, Devin (2024-11-21). "Ai2's open source Tülu 3 lets anyone play the AI post-training game". TechCrunch. Retrieved 2025-08-26.
  22. Wang, Yizhong; Ivison, Hamish; Dasigi, Pradeep; Hessel, Jack; Khot, Tushar; Chandu, Khyathi Raghavi; Wadden, David; MacMillan, Kelsey; Smith, Noah A. (2023). "How Far Can Camels Go? Exploring the State of Instruction Tuning on Open Resources". arXiv:2306.04751 [cs.CL].
  23. Ivison, Hamish; Wang, Yizhong; Pyatkin, Valentina; Lambert, Nathan; Peters, Matthew; Dasigi, Pradeep; Jang, Joel; Wadden, David; Smith, Noah A. (2023). "Camels in a Changing Climate: Enhancing LM Adaptation with Tulu 2". arXiv:2311.10702 [cs.CL].
  24. Lambert, Nathan; Morrison, Jacob; Pyatkin, Valentina; Huang, Shengyi; Ivison, Hamish; Brahman, Faeze; Miranda, Lester James V.; Liu, Alisa; Dziri, Nouha (2025-04-14). "Tulu 3: Pushing Frontiers in Open Language Model Post-Training". arXiv:2411.15124 [cs.CL].