John Ball (cognitive scientist)

John Samuel Ball (born 1963) is an American cognitive scientist, an expert in machine intelligence [1] and computer architecture, and the inventor of Patom Theory. [2]

Biography

Born in Iowa, United States, while his Australian father, Samuel Ball, was working on his PhD in educational psychology, Ball returned with his family to Australia in 1978 to finish his secondary schooling on Sydney's north shore. He received a Bachelor of Science from the University of Sydney in 1984, a Master of Cognitive Science from the University of New South Wales in 1989, and a Master of Business Administration from the Macquarie Graduate School of Management (MGSM) in 1997.

From a young age, Ball was fascinated by computers, having been exposed to early mainframes at the Educational Testing Service (ETS) in Princeton in the 1970s.

As an undergraduate he was challenged to pursue machine intelligence when a lecturer announced that computers would never be able to perform human-like functions such as language or visual recognition.

Work

His career began at IBM Australia as a mainframe engineer, and he went on to become a country support specialist responsible for supporting and training hardware engineers across Australia and New Zealand on mainframes and I/O devices. His expertise was in the IBM System/370 I/O architecture, which he learned from Kenneth Trowell, a global designer of channel architecture. After leaving IBM in 1996 he worked in other large Australian corporations, managing and defining the commercial terms of complex IT contracts between stakeholders.

Always interested in how machines could better emulate human brain functions, he postulated Patom theory, the name combining "pattern matching" and "atom". It reflects his belief that the brain simply stores, matches and uses hierarchical, bidirectional linkset patterns (sequences and sets), and that this is sufficient to explain human capabilities. He claimed this is how the human brain handles language and vision. The theory was first publicly aired in 2000 on Robyn Williams' Ockham's Razor. [3]
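
The following is a minimal, illustrative sketch only, not Ball's actual implementation: assuming one simplified reading of "hierarchical, bidirectional linkset patterns", it shows a toy store of named sequence and set patterns that can be traversed both from whole to parts and from parts back to whole. The class and pattern names are invented for illustration.

    # Toy sketch of hierarchical, bidirectional pattern storage (an assumption-laden
    # illustration, not Patom theory's real mechanism). Patterns are named sequences
    # or sets; links run both whole-to-parts and parts-to-whole.
    class PatternStore:
        def __init__(self):
            self.patterns = {}   # name -> (kind, elements)
            self.part_of = {}    # element -> names of patterns containing it

        def add(self, name, elements, ordered=True):
            kind = "seq" if ordered else "set"
            self.patterns[name] = (kind, tuple(elements))
            for e in elements:
                self.part_of.setdefault(e, set()).add(name)  # backward (part -> whole) link

        def expand(self, name):
            """Whole -> parts (top-down link)."""
            return list(self.patterns[name][1])

        def wholes_containing(self, element):
            """Part -> wholes (bottom-up link)."""
            return sorted(self.part_of.get(element, set()))

        def matches(self, name, observed):
            """Match observed input against a stored pattern."""
            kind, elements = self.patterns[name]
            if kind == "seq":
                return tuple(observed) == elements
            return set(observed) == set(elements)

    store = PatternStore()
    store.add("NP", ["the", "dog"])             # a low-level sequence pattern
    store.add("CLAUSE", ["NP", "barks"])        # a higher-level pattern built from it
    print(store.wholes_containing("dog"))       # ['NP']          (bottom-up)
    print(store.expand("CLAUSE"))               # ['NP', 'barks'] (top-down)
    print(store.matches("NP", ["the", "dog"]))  # True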

Over the years, exchanges with artificial intelligence experts such as Marvin Minsky led him to work on a prototype to demonstrate and prove his theory. [4]

Ball left corporate life to focus full-time on building and proving a natural language understanding (NLU) system, with samples across diverse languages including Mandarin, Korean, German, Japanese, Spanish, English, French, Italian and Portuguese. Since 2007, Ball has filed two patents. [5]

In 2011 Ball came across a book by Emma L. Pavey [6] while visiting a Barnes & Noble store in Princeton, New Jersey. It included a reference to Role and Reference Grammar (RRG), [7] a linguistic theory developed by Robert Van Valin, Jr. and William A. Foley. Ball judged the meaning-based linguistic framework described in Pavey's book to be the missing link for implementing his theory. He contacted Van Valin and began integrating RRG into his prototype. Unlike dominant linguistic theories such as Noam Chomsky's Universal Grammar, Ball's approach focused on meaning, providing a way for computers to break down any human language by meaning and so enabling communication between humans and machines. In his paper "From NLP to NLU", [8] Van Valin discusses progressing from natural language processing (NLP) to NLU through the introduction of meaning, achieved by combining RRG and Patom theory.
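
As a rough illustration of what a meaning-based breakdown can look like, the hypothetical toy below (not Ball's or Van Valin's actual formalism; the Meaning class and lexicons are invented for this example) maps an English and a Spanish sentence onto the same simplified predicate-argument structure, so that two surface forms share one representation of meaning.

    # Hypothetical toy: two surface sentences reduced to one shared predicate-argument
    # "meaning" record, loosely inspired by RRG-style logical structures but greatly
    # simplified. It assumes actor-before-undergoer word order and toy lexicons.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Meaning:
        predicate: str   # language-neutral predicate, e.g. "eat"
        actor: str       # who performs the action
        undergoer: str   # what is affected

    ENGLISH = {"ate": ("pred", "eat"), "dog": ("arg", "dog"), "bone": ("arg", "bone")}
    SPANISH = {"comió": ("pred", "eat"), "perro": ("arg", "dog"), "hueso": ("arg", "bone")}

    def parse(tokens, lexicon):
        """Naive meaning extraction: skip unknown/function words, collect predicate and arguments."""
        pred, args = None, []
        for t in tokens:
            entry = lexicon.get(t.lower())
            if entry is None:
                continue                 # skip function words like "the"/"el"
            kind, value = entry
            if kind == "pred":
                pred = value
            else:
                args.append(value)
        return Meaning(predicate=pred, actor=args[0], undergoer=args[1])

    english = parse("The dog ate the bone".split(), ENGLISH)
    spanish = parse("El perro comió el hueso".split(), SPANISH)
    print(english == spanish)            # True: both reduce to the same meaning structure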

In 2014, the University of Sydney completed an external review analyzing the capabilities of Ball's system across word sense disambiguation (WSD), context tracking, word boundary identification, machine translation and conversation. By 2015, Ball had included samples across nine languages and could demonstrate solutions to several open scientific problems in the field of NLU.

In 2015, Ball wrote a seven-part series for Computerworld, [9] Speaking Artificial Intelligence, in which he traced the dominant approaches of statistical analysis and machine learning from the 1980s to the present.

Applications of this technology and its implications for intelligent machines have been discussed by Dr Hossein Eslambolchi in a World Economic Forum article. [10]

Ball's work to date challenges the commonly held belief that the human brain "processes" information like a computer. His lab work and NLU demonstrations show human-like conversation and accurate translation, described in his papers "The Science of NLU" and "Patom Theory". [11] [12]

In December 2018, his machine intelligence company, Pat Inc, received the 'Best New Algorithm for AI' award from the London-based organization Into.AI in recognition of his novel approach to the AI-hard problem of natural-language understanding. Pat Inc also won Into.AI's 'Best Technical Implementation for AI' award for 2019/2020.

Publications

Using NLU in Context for Question Answering: Improving on Facebook's bAbI Tasks

Machine Intelligence

Can Machines Talk

Series 'Patom Theory'

Speaking Artificial Intelligence

How Brains Work: Patom Theory’s Support from RRG Linguistics

John Ball's Medium account

Related Research Articles

Artificial intelligence (AI) is intelligence demonstrated by machines, as opposed to the natural intelligence displayed by animals including humans. AI research has been defined as the field of study of intelligent agents, which refers to any system that perceives its environment and takes actions that maximize its chance of achieving its goals.

Computational linguistics is an interdisciplinary field concerned with the computational modelling of natural language, as well as the study of appropriate computational approaches to linguistic questions. In general, computational linguistics draws upon linguistics, computer science, artificial intelligence, mathematics, logic, philosophy, cognitive science, cognitive psychology, psycholinguistics, anthropology and neuroscience, among others.

Functional linguistics is an approach to the study of language characterized by systematically taking into account the speaker's and the hearer's side, and the communicative needs of the speaker and of the given language community. Linguistic functionalism emerged in the 1920s to 1930s from Ferdinand de Saussure's systematic structuralist approach to language (1916).

Natural language processing (NLP) is a subfield of linguistics, computer science, and artificial intelligence concerned with the interactions between computers and human language, in particular how to program computers to process and analyze large amounts of natural language data. The goal is a computer capable of "understanding" the contents of documents, including the contextual nuances of the language within them. The technology can then accurately extract information and insights contained in the documents as well as categorize and organize the documents themselves.

Cognitive linguistics is an interdisciplinary branch of linguistics, combining knowledge and research from cognitive science, cognitive psychology, neuropsychology and linguistics. Models and theoretical accounts of cognitive linguistics are considered as psychologically real, and research in cognitive linguistics aims to help understand cognition in general and is seen as a road into the human mind.

Role and reference grammar (RRG) is a model of grammar developed by William A. Foley and Robert Van Valin, Jr. in the 1980s, which incorporates many of the points of view of current functional grammar theories.

Natural-language understanding (NLU) or natural-language interpretation (NLI) is a subtopic of natural-language processing in artificial intelligence that deals with machine reading comprehension. Natural-language understanding is considered an AI-hard problem.

Robert D. Van Valin Jr. is an American linguist and the principal researcher behind the development of Role and Reference Grammar, a functional theory of grammar encompassing syntax, semantics and discourse pragmatics. His 1997 book Syntax: structure, meaning and function is an attempt to provide a model for syntactic analysis which is just as relevant for languages like Dyirbal and Lakhota as it is for more commonly studied Indo-European languages.

Natural language generation (NLG) is a software process that produces natural language output. In one of the most widely cited surveys of NLG methods, NLG is characterized as "the subfield of artificial intelligence and computational linguistics that is concerned with the construction of computer systems that can produce understandable texts in English or other human languages from some underlying non-linguistic representation of information".

Artificial general intelligence (AGI) is the ability of an intelligent agent to understand or learn any intellectual task that a human being can. It is a primary goal of some artificial intelligence research and a common topic in science fiction and futures studies. AGI can also be referred to as strong AI, full AI, or general intelligent action, although some academic sources reserve the term "strong AI" for computer programs that experience sentience or consciousness.

An artificial brain is software and hardware with cognitive abilities similar to those of the animal or human brain.

The following outline is provided as an overview of and topical guide to artificial intelligence:

Yorick Wilks FBCS, a British computer scientist, is Emeritus Professor of Artificial Intelligence at the University of Sheffield, Visiting Professor of Artificial Intelligence at Gresham College, Former Senior Research Fellow at the Oxford Internet Institute, Senior Scientist at the Florida Institute for Human and Machine Cognition, and a member of the Epiphany Philosophers.

Language and Communication Technologies is the scientific study of technologies that explore language and communication. It is an interdisciplinary field that encompasses the fields of computer science, linguistics and cognitive science.

The following outline is provided as an overview of and topical guide to natural-language processing:

Cognitive computing (CC) refers to technology platforms that, broadly speaking, are based on the scientific disciplines of artificial intelligence and signal processing. These platforms encompass machine learning, reasoning, natural language processing, speech recognition and vision, human–computer interaction, dialog and narrative generation, among other technologies.

This glossary of artificial intelligence is a list of definitions of terms and concepts relevant to the study of artificial intelligence, its sub-disciplines, and related fields. Related glossaries include Glossary of computer science, Glossary of robotics, and Glossary of machine vision.

Larry R. Harris is an American researcher and businessperson. He is best known for his work in artificial intelligence, and is founder of the companies AICorp and EasyAsk, originally known as Linguistic Technology Corporation.

Meta AI is an artificial intelligence laboratory that belongs to Meta Platforms Inc. Meta AI seeks to develop artificial intelligence in the digital world, enhancing its augmented and artificial reality technologies. Meta AI is an academic research laboratory focused on generating knowledge for the broader AI community. This is in contrast to Facebook's Applied Machine Learning (AML) team, which focuses on practical applications on its products.

References

  1. Ball, John (16 April 2016). Machine Intelligence (2 ed.). Hired Pen Publishing.
  2. Ball, John. "Mr" (PDF). Heinrich Heine University. Retrieved 7 May 2016.
  3. Williams, Robyn (16 January 2000). "Our Brain, the Patom-Matcher". ABC Radio National.
  4. Ball, John (16 April 2016). Machine Intelligence (2 ed.). Hired Pen Publishing.
  5. Ball, John (2007). "USPTO".
  6. Pavey, Emma L (31 August 2010). The Structure of Language. Cambridge University Press. Retrieved 8 May 2016.
  7. Van Valin, Robert. "A Summary of Role and Reference Grammar" (PDF). The State University of New York at Buffalo. Retrieved 8 May 2016.
  8. Van Valin, Robert. "From NLP to NLU" (PDF). Heinrich Heine University. Retrieved 8 May 2016.
  9. Ball, John (2015). "Speaking Artificial Intelligence". Computerworld.
  10. Eslambolchi, Hossein (2015). "When will we be able to have a conversation with a computer?". World Economic Forum.
  11. Ball, John. "Patom Theory" (PDF).
  12. Ball, John. "The Science of NLU" (PDF).

External links

https://pat.ai/