John Ball (cognitive scientist)

John Samuel Ball (born 1963) is an American cognitive scientist, an expert in machine intelligence [1] and computer architecture, and the inventor of Patom Theory. [2]


Biography

Born in Iowa, USA, while his Australian father, Samuel Ball, was working on his PhD in Educational Psychology, Ball returned with the family to Australia in 1978 to finish his secondary schooling on the north shore of Sydney. Ball received a Bachelor of Science in 1984 from the University of Sydney, a Master of Cognitive Science from the University of New South Wales in 1989, and a Master of Business Administration from the Macquarie Graduate School of Management (MGSM) in 1997.

From a young age, Ball was fascinated by computers, having been exposed to early mainframes at the Educational Testing Service (ETS) in Princeton in the 1970s.

As an undergraduate, he was challenged by a lecturer to pursue machine intelligence when she announced that computers would never be able to perform human-like functions such as language or visual recognition.

Work

His career began at IBM Australia as a mainframe engineer, leading to a role as country support specialist responsible for supporting and training hardware engineers across Australia and New Zealand on mainframes and I/O devices. His expertise was in the IBM System/370 I/O architecture, which he learned from Kenneth Trowell, a global designer of channel architecture. After leaving IBM in 1996, he worked in other large Australian corporations, managing and defining the commercial terms of complex IT contracts between stakeholders.

Always interested in how machines could better emulate human brain functions, he postulated Patom theory, the name combining "pattern matching" and "atom". It reflected his belief that the brain simply stores, matches and uses hierarchical, bidirectional linkset patterns (sequences and sets), and that this is sufficient to explain human capabilities. He claimed this was the human brain's approach to language and vision; the theory was first publicly aired in 2000 on Robyn Williams' Ockham's Razor. [3]
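The following is a minimal sketch of what storing and matching hierarchical, bidirectional patterns over sequences and sets could look like in code. It is an illustrative interpretation only, not Ball's implementation: the Pattern and PatternStore classes, their method names, and the example patterns are all invented here.

```python
# Toy illustration (not Ball's implementation): patterns are stored as
# sequences (ordered) or sets (unordered) of elements, which may themselves
# name other patterns (hierarchy). Matching is bidirectional: a whole can be
# recognised from its parts (bottom-up) and parts recalled from the whole
# (top-down).

class Pattern:
    """A named pattern whose elements are symbols or names of other patterns."""
    def __init__(self, name, elements, ordered=True):
        self.name = name          # label for the whole pattern
        self.elements = elements  # sequence (ordered) or set (unordered)
        self.ordered = ordered

class PatternStore:
    def __init__(self):
        self.patterns = {}  # name -> Pattern
        self.part_of = {}   # element -> names of patterns containing it

    def store(self, pattern):
        self.patterns[pattern.name] = pattern
        for el in pattern.elements:
            self.part_of.setdefault(el, set()).add(pattern.name)

    def recognise(self, observed):
        """Bottom-up: which stored patterns match the observed elements?"""
        matches = []
        for p in self.patterns.values():
            if p.ordered and list(observed) == list(p.elements):
                matches.append(p.name)
            elif not p.ordered and set(observed) == set(p.elements):
                matches.append(p.name)
        return matches

    def recall(self, name):
        """Top-down: recall the parts of a stored pattern from its name."""
        return self.patterns[name].elements

store = PatternStore()
store.store(Pattern("greeting", ["hello", "world"]))                              # sequence
store.store(Pattern("primary_colours", {"red", "green", "blue"}, ordered=False))  # set

print(store.recognise(["hello", "world"]))  # ['greeting']
print(store.recall("greeting"))             # ['hello', 'world']
print(store.part_of["red"])                 # {'primary_colours'}
```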

Over the years, exchanges with artificial intelligence experts such as Marvin Minsky led him to work on a prototype to demonstrate and prove his theory. [4]

Ball left corporate life to focus full-time on developing and demonstrating a natural language understanding (NLU) system, with samples across diverse languages including Mandarin, Korean, German, Japanese, Spanish, English, French, Italian and Portuguese. Since 2007, Ball has filed two patents. [5]

In 2011, Ball came across a book by Emma L. Pavey [6] while visiting a Barnes & Noble store in Princeton, New Jersey. It included a reference to a linguistic theory developed by Professor Robert Van Valin, Jr. and Professor William A. Foley, called Role and Reference Grammar (RRG). [7] Ball determined that the meaning-based linguistic framework described in Pavey's book was the missing link for implementing his theory. He contacted Van Valin and began integrating RRG into his prototype. Unlike dominant linguistic theories such as Noam Chomsky's Universal Grammar, Ball's approach focused on meaning, providing a way for computers to break down any human language by meaning and enabling communication between human and machine. In his paper "From NLP to NLU", [8] Van Valin describes progressing from natural language processing (NLP) to NLU by introducing meaning through the combination of RRG and Patom theory.
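As a rough illustration of the general idea of a meaning-based interlingua, the sketch below realises one shared predicate-argument "meaning" in several languages. It is loosely in the spirit of RRG's logical structures but is not Pat Inc's system or RRG's actual formalism; the templates, dictionaries, and names are invented for the example.

```python
# Hedged sketch: translating through a shared meaning representation rather
# than word-for-word. The Meaning dataclass, templates, and word lists are
# illustrative assumptions only.

from dataclasses import dataclass

@dataclass(frozen=True)
class Meaning:
    predicate: str   # language-independent event, e.g. "give"
    actor: str       # concept keys, not surface words
    undergoer: str
    recipient: str

# Surface templates per language: same meaning, different word order.
TEMPLATES = {
    "en": "{actor} gives {recipient} {undergoer}",
    "de": "{actor} gibt {recipient} {undergoer}",
    "ja": "{actor} wa {recipient} ni {undergoer} o ageru",
}

# Tiny concept-to-word dictionaries.
WORDS = {
    "en": {"KIM": "Kim", "LEE": "Lee", "BOOK": "a book"},
    "de": {"KIM": "Kim", "LEE": "Lee", "BOOK": "ein Buch"},
    "ja": {"KIM": "Kim", "LEE": "Lee", "BOOK": "hon"},
}

def realise(meaning: Meaning, lang: str) -> str:
    """Generate a surface sentence for `lang` from the shared meaning."""
    w = WORDS[lang]
    return TEMPLATES[lang].format(actor=w[meaning.actor],
                                  recipient=w[meaning.recipient],
                                  undergoer=w[meaning.undergoer])

m = Meaning(predicate="give", actor="KIM", undergoer="BOOK", recipient="LEE")
print(realise(m, "en"))  # Kim gives Lee a book
print(realise(m, "de"))  # Kim gibt Lee ein Buch
print(realise(m, "ja"))  # Kim wa Lee ni hon o ageru
```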

In 2014, the University of Sydney completed an external review analyzing the system's capabilities across word-sense disambiguation (WSD), context tracking, word boundary identification, machine translation and conversation. By 2015, Ball had included samples across nine languages and could demonstrate solutions to open scientific problems in the field of NLU.

In 2015, Ball wrote a seven-part series for Computerworld, [9] "Speaking Artificial Intelligence", in which he traced the dominant approaches of statistical analysis and machine learning from the 1980s to the present.

Applications of this technology and its implications for intelligent machines have been discussed by Dr Hossein Eslambolchi in a World Economic Forum article. [10]

Ball's work to date challenges the commonly held belief that the human brain 'processes' information like a computer. His lab work and NLU prototype demonstrate human-like conversation and accuracy in translation, as described in his papers "The Science of NLU" and "Patom Theory". [11] [12]

In December 2018, his machine intelligence company, Pat Inc, received the 'Best New Algorithm for AI' award from the London-based organization Into.AI in recognition of his novel approach to the AI-hard problem of natural-language understanding. Pat Inc also won Into.AI's 'Best Technical Implementation for AI' award for 2019/2020.

Publications

Using NLU in Context for Question Answering: Improving on Facebook's bAbI Tasks

Machine Intelligence

Can Machines Talk

Series 'Patom Theory'

Speaking Artificial Intelligence

How Brains Work: Patom Theory’s Support from RRG Linguistics

John Ball's Medium account


References

  1. Ball, John (16 April 2016). Machine Intelligence (2nd ed.). Hired Pen Publishing.
  2. Ball, John. "Mr" (PDF). Heinrich Heine University. Retrieved 7 May 2016.
  3. Williams, Robyn (16 January 2000). "Our Brain, the Patom-Matcher". ABC Radio National.
  4. Ball, John (16 April 2016). Machine Intelligence (2nd ed.). Hired Pen Publishing.
  5. Ball, John (2007). "USPTO".
  6. Pavey, Emma L. (31 August 2010). The Structure of Language. Cambridge University Press. Retrieved 8 May 2016.
  7. Van Valin, Robert. "A Summary of Role and Reference Grammar" (PDF). The State University of New York at Buffalo. Retrieved 8 May 2016.
  8. Van Valin, Robert. "From NLP to NLU" (PDF). Heinrich Heine University. Retrieved 8 May 2016.
  9. Ball, John (2015). "Speaking Artificial Intelligence". Computerworld.
  10. Eslambolchi, Hossein (2015). "When will we be able to have a conversation with a computer?". World Economic Forum.
  11. Ball, John. "Patom Theory" (PDF).
  12. Ball, John. "The Science of NLU" (PDF).

https://pat.ai/