Jeff Dean

Jeff Dean
Dean in 2024
Born: July 23, 1968
Nationality: American
Alma mater: University of Minnesota, B.S. Computer Science and Economics (1990); University of Washington, Ph.D. Computer Science (1996)
Known for: MapReduce, Bigtable, Spanner, TensorFlow
Scientific career
Fields: Computer technology
Institutions: Google; Digital Equipment Corporation
Thesis: Whole-program optimization of object-oriented languages (1996)
Doctoral advisor: Craig Chambers

Jeffrey Adgate "Jeff" Dean (born July 23, 1968) is an American computer scientist and software engineer. Since 2018, he has led Google AI. [1] He was appointed Google's chief scientist in 2023 after the merger of DeepMind and Google Brain into Google DeepMind. [2]

Education

Dean received a B.S., summa cum laude, in computer science and economics from the University of Minnesota in 1990. [3] His undergraduate thesis, advised by Vipin Kumar, covered parallel implementations of neural network training in C. [4] [5]

He received a Ph.D. in computer science from the University of Washington in 1996, working under Craig Chambers on compilers [6] and whole-program optimization techniques for object-oriented programming languages. [7] He was elected to the National Academy of Engineering in 2009, which recognized his work on "the science and engineering of large-scale distributed computer systems". [8]

Career

Before joining Google, Dean worked at DEC/Compaq's Western Research Laboratory, [9] where he worked on profiling tools, microprocessor architecture and information retrieval. [10] Much of his work was completed in close collaboration with Sanjay Ghemawat. [11] [6]

Before graduate school, he worked at the World Health Organization's Global Programme on AIDS, developing software for statistical modeling and forecasting of the HIV/AIDS pandemic. [10]

Dean joined Google in mid-1999. In 2011 he joined Google X to investigate deep neural networks, which had just resurged in popularity. This work culminated in the "cat neuron" paper, in which a large neural network trained by unsupervised learning on YouTube videos learned to detect cats. [12] The project grew into Google Brain, also formed in 2011, and Dean became its leader in 2012. In April 2018, he was appointed head of Google's artificial intelligence division after John Giannandrea left to lead Apple's AI projects. [13]

While at Google, he designed and implemented large portions of the company's advertising, crawling, indexing and query serving systems, along with various pieces of the distributed computing infrastructure that underlies most of Google's products. [6] At various times, he has also worked on improving search quality, statistical machine translation and internal software development tools and has had significant involvement in the engineering hiring process.

The projects Dean has worked on include:

- MapReduce, a programming model for large-scale data processing
- Bigtable, a distributed storage system for structured data
- Spanner, a globally distributed database
- LevelDB, an open-source on-disk key-value store
- TensorFlow, an open-source machine-learning library

He was an early member of Google Brain, [6] a team that studies large-scale artificial neural networks, and he has headed artificial intelligence efforts since they were split from Google Search. [16]

In 2020, after Timnit Gebru tried to publish a paper, Dean wrote that an internal review had concluded that the paper "ignored too much relevant research" and did not meet Google's bar for publication, noting also that it had been submitted one day, rather than at least two weeks, before the deadline. Gebru challenged Google's research review process and wrote that if her concerns were not addressed, they could "work on an end date". Google responded that it could not meet her conditions and accepted her resignation immediately. Gebru maintained that she had been fired, leading to a public controversy. Dean later published a memo on Google's approach to the review process. [17] [18]

In 2023, DeepMind was merged with Google Brain to form a unified AI research unit, Google DeepMind. As part of this reorganization, Dean became Google's chief scientist. [2] [15]

Philanthropy

Dean and his wife, Heidi Hopper, started the Hopper-Dean Foundation and began making philanthropic grants in 2011. In 2016, the foundation gave $2 million each to UC Berkeley, Massachusetts Institute of Technology, University of Washington, Stanford University and Carnegie Mellon University to support programs that promote diversity in science, technology, engineering and mathematics (STEM). [19]

Personal life

Dean is married and has two daughters. [6]

He is the subject of an Internet meme, "Jeff Dean facts". Like Chuck Norris facts, they comically exaggerate his programming powers. [20] For example: [21]

Once, in early 2002, when the index servers went down, Jeff Dean answered user queries manually for two hours. Evals showed a quality improvement of 5 points.

Awards and honors

Dean's honors include election to the National Academy of Engineering (2009), [8] the ACM-Infosys Foundation Award, [22] the ACM SIGOPS Mark Weiser Award [23] and election to the American Academy of Arts and Sciences (2016). [24]

Books

Dean was interviewed for the 2018 book Architects of Intelligence: The Truth About AI from the People Building It by the American futurist Martin Ford. [25]

Major publications

See also

Related Research Articles

Hsiang-Tsung Kung is a Taiwanese-born American computer scientist. He is the William H. Gates Professor of Computer Science at Harvard University. His early research in parallel computing produced the systolic array in 1979, which has since become a core computational component of hardware accelerators for artificial intelligence, including Google's Tensor Processing Unit (TPU). Similarly, he proposed optimistic concurrency control in 1981, now a key principle in memory and database transaction systems, including MySQL, Apache CouchDB, Google's App Engine, and Ruby on Rails. He remains an active researcher, with ongoing contributions to computational complexity theory, hardware design, parallel computing, routing, wireless communication, signal processing, and artificial intelligence.

Bigtable is a fully managed wide-column and key-value NoSQL database service for large analytical and operational workloads as part of the Google Cloud portfolio.
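The wide-column data model behind Bigtable (rows keyed by strings, columns grouped into families, cells versioned by timestamp) can be sketched with plain Python dictionaries. The class and method names below are illustrative only and are not part of any Google API:

```python
from collections import defaultdict

class WideColumnTable:
    """Toy sketch of a wide-column store: row key -> "family:qualifier" -> {timestamp: value}."""

    def __init__(self):
        # Rows are created lazily; each cell keeps every timestamped version.
        self.rows = defaultdict(lambda: defaultdict(dict))

    def put(self, row_key, family, qualifier, value, timestamp):
        self.rows[row_key][f"{family}:{qualifier}"][timestamp] = value

    def get(self, row_key, family, qualifier):
        # Return the newest version of the cell, mirroring a default versioned read.
        cell = self.rows[row_key][f"{family}:{qualifier}"]
        return cell[max(cell)] if cell else None

table = WideColumnTable()
table.put("com.example/index.html", "contents", "html", "<html>v1</html>", timestamp=1)
table.put("com.example/index.html", "contents", "html", "<html>v2</html>", timestamp=2)
print(table.get("com.example/index.html", "contents", "html"))  # latest version wins
```

The sketch shows only the data model; the actual service adds distribution, persistence, and garbage collection of old cell versions.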

Martín Abadi is an Argentine computer scientist, working at Google as of 2024. He earned his Doctor of Philosophy (PhD) in computer science from Stanford University in 1987 as a student of Zohar Manna.

Structured storage is computer storage for structured data, often in the form of a distributed database. Examples of structured storage systems include Apache Cassandra, Google's Bigtable and Apache HBase.

Deep learning is a subset of machine learning that focuses on utilizing neural networks to perform tasks such as classification, regression, and representation learning. The field takes inspiration from biological neuroscience and is centered around stacking artificial neurons into layers and "training" them to process data. The adjective "deep" refers to the use of multiple layers in the network. Methods used can be either supervised, semi-supervised or unsupervised.
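The idea of "stacking artificial neurons into layers" can be sketched as a forward pass through two dense layers with a nonlinearity between them. The weights here are random placeholders for illustration; training would adjust them by gradient descent:

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(3)  # layer 1: 4 inputs -> 3 hidden units
W2, b2 = rng.normal(size=(3, 2)), np.zeros(2)  # layer 2: 3 hidden units -> 2 outputs

def forward(x):
    h = np.maximum(0.0, x @ W1 + b1)  # ReLU nonlinearity between the stacked layers
    return h @ W2 + b2                # linear output layer

x = np.array([1.0, 0.5, -0.5, 2.0])
print(forward(x).shape)  # (2,)
```

"Deep" networks simply repeat this layer-then-nonlinearity pattern many times.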

LevelDB is an open-source on-disk key-value store written by Google Fellows Jeffrey Dean and Sanjay Ghemawat. Inspired by Bigtable, LevelDB's source code is hosted on GitHub under the New BSD License and has been ported to a variety of Unix-based systems, macOS, Windows, and Android.

Spanner is a distributed SQL database management and storage service developed by Google. It provides features such as global transactions, strongly consistent reads, and automatic multi-site replication and failover. Spanner is used in Google F1, the database for its advertising business Google Ads, as well as Gmail and Google Photos.

Google Brain was a deep learning artificial intelligence research team that served as the sole AI branch of Google before being incorporated under the newer umbrella of Google AI, a research division at Google dedicated to artificial intelligence. Formed in 2011, it combined open-ended machine learning research with information systems and large-scale computing resources. It created tools such as TensorFlow, which allow neural networks to be used by the public, and multiple internal AI research projects, and aimed to create research opportunities in machine learning and natural language processing. It was merged into former Google sister company DeepMind to form Google DeepMind in April 2023.

TensorFlow is a software library for machine learning and artificial intelligence. It can be used across a range of tasks, but is used mainly for training and inference of neural networks. It is one of the most popular deep learning frameworks, alongside others such as PyTorch and PaddlePaddle. It is free and open-source software released under the Apache License 2.0.

Tensor Processing Unit (TPU) is an AI accelerator application-specific integrated circuit (ASIC) developed by Google for neural network machine learning, using Google's own TensorFlow software. Google began using TPUs internally in 2015, and in 2018 made them available for third-party use, both as part of its cloud infrastructure and by offering a smaller version of the chip for sale.

An AI accelerator, deep learning processor or neural processing unit (NPU) is a class of specialized hardware accelerator or computer system designed to accelerate artificial intelligence and machine learning applications, including artificial neural networks and computer vision. Typical applications include algorithms for robotics, Internet of Things, and other data-intensive or sensor-driven tasks. They are often manycore designs and generally focus on low-precision arithmetic, novel dataflow architectures or in-memory computing capability. As of 2024, a typical AI integrated circuit chip contains tens of billions of MOSFETs.

Google Neural Machine Translation (GNMT) was a neural machine translation (NMT) system developed by Google and introduced in November 2016 that used an artificial neural network to increase fluency and accuracy in Google Translate. The neural network consisted of two main blocks, an encoder and a decoder, both of LSTM architecture with 8 1024-wide layers each and a simple 1-layer 1024-wide feedforward attention mechanism connecting them. The total number of parameters has been variously described as over 160 million, approximately 210 million, 278 million or 380 million. It used a WordPiece tokenizer and a beam search decoding strategy, and ran on Tensor Processing Units.
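The beam search decoding mentioned above can be sketched generically: at each step, every surviving partial sequence is extended by each candidate token, and only the highest-scoring partial sequences are kept. The `beam_search` and `toy_scores` functions below are illustrative; GNMT scores candidates with its neural network rather than a fixed table:

```python
import math

def beam_search(step_scores, beam_width, length):
    """Keep the beam_width best partial sequences at every step.

    step_scores(seq) returns {token: log_probability} for extending seq.
    """
    beams = [((), 0.0)]  # (sequence, cumulative log-probability)
    for _ in range(length):
        candidates = []
        for seq, score in beams:
            for token, logp in step_scores(seq).items():
                candidates.append((seq + (token,), score + logp))
        # Prune to the beam_width highest-scoring candidates.
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    return beams

# Toy model: "b" looks best for the first token, but "a a" wins overall,
# so a greedy decoder would miss the best full sequence while the beam finds it.
def toy_scores(seq):
    if not seq:
        return {"a": math.log(0.4), "b": math.log(0.6)}
    if seq[-1] == "a":
        return {"a": math.log(0.9), "b": math.log(0.1)}
    return {"a": math.log(0.5), "b": math.log(0.5)}

best_seq, best_score = beam_search(toy_scores, beam_width=2, length=2)[0]
print(best_seq)  # ('a', 'a')
```

With beam width 1 this degenerates to greedy decoding; wider beams trade computation for better search.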

Sanjay Ghemawat is an Indian American computer scientist and software engineer. He is currently a Senior Fellow at Google in the Systems Infrastructure Group. Ghemawat's work at Google, much of it in close collaboration with Jeff Dean, has included the big data processing model MapReduce, the Google File System, and the databases Bigtable and Spanner. Wired has described him as one of the "most important software engineers of the internet age".

Animashree (Anima) Anandkumar is the Bren Professor of Computing at California Institute of Technology. Previously, she was a senior director of Machine Learning research at NVIDIA and a principal scientist at Amazon Web Services. Her research considers tensor-algebraic methods, deep learning and non-convex problems.

Timnit Gebru is an Eritrean Ethiopian-born computer scientist who works in the fields of artificial intelligence (AI), algorithmic bias and data mining. She is a co-founder of Black in AI, an advocacy group that promotes the inclusion of Black people in AI development and research. She is the founder of the Distributed Artificial Intelligence Research Institute (DAIR).

Samy Bengio is a Canadian computer scientist, Senior Director of AI and Machine Learning Research at Apple, and a former long-time scientist at Google known for leading a large group of researchers working in machine learning including adversarial settings. Bengio left Google shortly after the company fired his report, Timnit Gebru, without first notifying him. At the time, Bengio said that he had been "stunned" by what happened to Gebru. He is also among the three authors who developed Torch in 2002, the ancestor of PyTorch, one of today's two largest machine learning frameworks.

Inioluwa Deborah Raji is a Nigerian-Canadian computer scientist and activist who works on algorithmic bias, AI accountability, and algorithmic auditing. Raji has previously worked with Joy Buolamwini, Timnit Gebru, and the Algorithmic Justice League on researching gender and racial bias in facial recognition technology. She has also worked with Google’s Ethical AI team and been a research fellow at the Partnership on AI and AI Now Institute at New York University working on how to operationalize ethical considerations in machine learning engineering practice. A current Mozilla fellow, she has been recognized by MIT Technology Review and Forbes as one of the world's top young innovators.

Lê Viết Quốc, or in romanized form Quoc Viet Le, is a Vietnamese-American computer scientist and a machine learning pioneer at Google Brain, which he established with others from Google. He co-invented the doc2vec and seq2seq models in natural language processing. Le also initiated and led the AutoML initiative at Google Brain, including the proposal of neural architecture search.

The Symposium on Operating Systems Design and Implementation (OSDI), organized by USENIX, is one of the two top academic conferences on systems research, along with SOSP. A number of notable systems were first published as OSDI papers, including MapReduce, Bigtable, Spanner, and TensorFlow.

References

  1. Vincent, James (April 3, 2018). "Google veteran Jeff Dean takes over as company's AI chief". The Verge. Retrieved November 29, 2023.
  2. Elias, Jennifer (April 20, 2023). "Read the internal memo Alphabet sent in merging A.I.-focused groups DeepMind and Google Brain". CNBC. Retrieved June 22, 2023.
  3. "Jeff Dean".
  4. Dean, Jeffrey. "Parallel implementations of neural network training: Two back-propagation approaches." senior thesis, University of Minnesota (1990).
  5. https://x.com/jeffdean/status/1033874204548984833
  6. "The Friendship That Made Google Huge". The New Yorker. Retrieved December 3, 2018.
  7. "STANFORD TALKS; Jeff Dean: TensorFlow Overview and Future Directions". Stanford University. January 21, 2016. Archived from the original on August 28, 2016. Retrieved August 15, 2016.
  8. "Jeff Dean elected to National Academy of Engineering". UW CSE News. University of Washington. February 5, 2009. Retrieved August 15, 2016.
    - "Jeffrey A Dean - Award Winner". Association for Computing Machinery. Retrieved August 15, 2018.
  9. Metz, Cade (August 8, 2012). "If Xerox PARC Invented the PC, Google Invented the Internet". Wired. Retrieved August 19, 2016.
  10. "Jeff Dean". Speakerpedia. Retrieved August 19, 2016.
  11. Metz, Cade (August 8, 2012). "If Xerox PARC Invented the PC, Google Invented the Internet". Wired. Retrieved December 16, 2017.
  12. Le, Quoc V. (May 2013). "Building high-level features using large scale unsupervised learning". 2013 IEEE International Conference on Acoustics, Speech and Signal Processing. IEEE. pp. 8595–8598. arXiv:1112.6209. doi:10.1109/icassp.2013.6639343. ISBN 978-1-4799-0356-6.
  13. Simonite, Tom. "Google's New AI Head Is So Smart He Doesn't Need AI". Wired. ISSN 1059-1028. Retrieved November 29, 2024.
  14. Markoff, John (June 25, 2012). "How Many Computers to Identify a Cat? 16,000". The New York Times.
  15. "Jeffrey Dean". Google Research. Retrieved October 25, 2024.
  16. D'Onfro, Jillian (April 2, 2018). "Google is splitting A.I. into its own business unit and shaking up its search leadership". CNBC. Retrieved April 3, 2018.
  17. Ghaffray, Shirin (December 4, 2020). "The controversy behind a star Google AI researcher's departure". Vox. Retrieved December 5, 2020.
  18. Johnson, Khari (December 3, 2020). "Google AI ethics co-lead Timnit Gebru says she was fired over an email". VentureBeat. Retrieved November 2, 2024.
  19. "$1M Hopper-Dean Foundation Gift for Diversity in CS". UC Berkeley. Retrieved January 29, 2019.
    - Williams, Tate (August 10, 2016). "One of Google's Top Programmers Has Made STEM Diversity a Philanthropic Cause". Inside Philanthropy. Retrieved August 15, 2016.
    - "$1 million gift to support diversity in STEM education". Massachusetts Institute of Technology. Retrieved October 25, 2020.
  20. Carlson, Nicholas. "Astounding 'Facts' About Google's Most Badass Engineer, Jeff Dean". Business Insider. Retrieved November 30, 2024.
  21. Ritzdorf, Lucas (October 23, 2024). LRitzdorf/TheJeffDeanFacts. Retrieved November 29, 2024.
  22. ACM-Infosys Foundation Award
  23. "The Mark Weiser Award". ACM SIGOPS. Retrieved July 5, 2019.
  24. Newly Elected Members, American Academy of Arts and Sciences, April 2016, retrieved April 20, 2016
  25. Ford, Martin (2018). Architects of Intelligence: The Truth About AI from the People Building It. Packt Publishing Ltd. ISBN 9781789131260.