Cognitive computing

Cognitive computing refers to technology platforms that, broadly speaking, are based on the scientific disciplines of artificial intelligence and signal processing. These platforms encompass machine learning, reasoning, natural language processing, speech recognition and vision (object recognition), human–computer interaction, dialog and narrative generation, among other technologies. [1] [2]

Definition

At present, there is no widely agreed upon definition for cognitive computing in either academia or industry. [1] [3] [4]

In general, the term cognitive computing has been used to refer to new hardware and/or software that mimics the functioning of the human brain [5] [6] [7] [8] [9] and helps to improve human decision-making. [10] In this sense, cognitive computing is a new type of computing whose goal is more accurate models of how the human brain/mind senses, reasons, and responds to stimuli. Cognitive computing applications link data analysis with adaptive user interfaces (AUI) to adjust content for a particular type of audience. As such, cognitive computing hardware and applications strive to be more affective and more influential by design.

Basic scheme of a cognitive system. Sensors, such as keyboards, touchscreens, cameras, microphones or temperature sensors, detect signals from the real-world environment. In perception, the system's cognition recognises these signals and converts them into digital information, which can be documented and processed. The result of deliberation, which can also be documented, is used to control and execute an action in the real-world environment with the help of actuators, such as engines, loudspeakers, displays or air conditioners.
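The sense–deliberate–act cycle described in the scheme above can be sketched as a simple loop. The following is a minimal illustration only; the thermostat-style example and all function names in it are hypothetical and not drawn from any particular cognitive computing platform.

```python
# Minimal sketch of the sensor -> perception -> deliberation -> actuator cycle.
# All names here are illustrative, not from any real framework.

def sense(sensor_reading):
    """Perception: convert a raw sensor signal into digital information."""
    return {"temperature_c": float(sensor_reading)}

def deliberate(information, setpoint_c=21.0):
    """Deliberation: decide on an action from the perceived information."""
    if information["temperature_c"] > setpoint_c:
        return "cool"
    return "idle"

def act(decision):
    """Actuation: map the decision onto an actuator command."""
    return {"cool": "air_conditioner_on", "idle": "air_conditioner_off"}[decision]

command = act(deliberate(sense("24.5")))
print(command)  # air_conditioner_on
```

A real cognitive system would replace the hand-written `deliberate` step with learned models, but the overall sense–deliberate–act structure is the same.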

The term "cognitive system" also applies to any artificial construct able to perform a cognitive process, where a cognitive process is the transformation of data, information, knowledge, or wisdom to a new level in the DIKW pyramid. [11] While many cognitive systems employ techniques originating in artificial intelligence research, cognitive systems themselves may not be artificially intelligent. For example, a neural network trained to recognize cancer on an MRI scan may achieve a higher success rate than a human doctor; such a system is certainly a cognitive system but is not artificially intelligent.
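A DIKW-style transformation of the kind described above can be illustrated with a short sketch: raw data is parsed into structured information, from which a higher-level, actionable statement (knowledge) is derived. The medical example and all names below are hypothetical.

```python
# Illustrative sketch of a DIKW transformation (all names hypothetical):
# raw data -> information (structured records) -> knowledge (a generalization).

raw_data = ["2024-01-01,36.6", "2024-01-02,38.9", "2024-01-03,39.2"]

# Data -> information: parse raw readings into structured records.
information = [
    {"date": d, "temp_c": float(t)}
    for d, t in (line.split(",") for line in raw_data)
]

# Information -> knowledge: derive a higher-level statement from the records.
fever_days = [r["date"] for r in information if r["temp_c"] >= 38.0]
knowledge = f"Fever on {len(fever_days)} of {len(information)} days"
print(knowledge)  # Fever on 2 of 3 days
```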

Cognitive systems may be engineered to feed on dynamic data in real-time, or near real-time, [12] and may draw on multiple sources of information, including both structured and unstructured digital information, as well as sensory inputs (visual, gestural, auditory, or sensor-provided). [13]

Cognitive analytics

Cognitive computing-branded technology platforms typically specialize in the processing and analysis of large, unstructured datasets. [14]

Applications

Education
Even if cognitive computing cannot take the place of teachers, it can still be a strong driving force in the education of students. Applied in the classroom, cognitive computing essentially gives each individual student a personalized assistant. This cognitive assistant can relieve some of the stress teachers face while teaching, while also enhancing the student's overall learning experience. [15] Teachers cannot always give each student individual attention, and this is the gap cognitive computers can fill: some students simply need more help with a particular subject. For many students, interaction with a teacher can cause anxiety and be uncomfortable; with the help of cognitive computer tutors, students need not face that uneasiness and can gain the confidence to learn and do well in the classroom. [16] While a student is in class with their personalized assistant, the assistant can develop various techniques, such as creating lesson plans, tailored to the student and their needs.
Healthcare
Numerous technology companies are developing cognitive computing systems for use in the medical field. One of the main goals of these cognitive devices is the ability to classify and identify, [17] a trait that can be very helpful in identifying carcinogens. Such a system could assist an examiner in interpreting a vast number of documents in far less time than would be possible without it. The technology can also evaluate information about a patient, searching every medical record in depth for indications of the source of their problems.
Commerce
Together with artificial intelligence, cognitive computing has been used in warehouse management systems to collect, store, organize and analyze supplier data. These applications aim at improving efficiency, enabling faster decision-making, monitoring inventory, and detecting fraud. [18]
Human Cognitive Augmentation
When humans use or work collaboratively with cognitive systems, an arrangement called a human/cog ensemble, the results achieved by the ensemble are superior to the results obtainable by the human working alone; the human is thereby cognitively augmented. [19] [20] [21] In cases where the human/cog ensemble achieves results at or above the level of a human expert, the ensemble has achieved synthetic expertise. [22] In a human/cog ensemble, the "cog" is a cognitive system employing virtually any kind of cognitive computing technology.
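One simplified way to illustrate cognitive augmentation is to compare task accuracy of the human working alone with that of the human/cog ensemble. The toy ratio below is an assumption for illustration only, not the exact measure proposed in the cited papers.

```python
# Simplified illustration of quantifying cognitive augmentation: compare task
# accuracy of the human alone against the human/cog ensemble. This is a toy
# ratio, not the measure defined in the cited literature.

def augmentation_ratio(human_alone_correct, ensemble_correct, total_tasks):
    human_acc = human_alone_correct / total_tasks
    ensemble_acc = ensemble_correct / total_tasks
    return ensemble_acc / human_acc  # > 1.0 means the ensemble outperforms

ratio = augmentation_ratio(human_alone_correct=70, ensemble_correct=91, total_tasks=100)
print(f"{ratio:.2f}")  # 1.30
```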
Other use cases

Industry work

Cognitive computing, in conjunction with big data and algorithms that comprehend customer needs, can be a major advantage in economic decision making.

The powers of cognitive computing and artificial intelligence hold the potential to affect almost every task that humans are capable of performing. This could negatively affect human employment, since the need for human labor would diminish. It could also increase the inequality of wealth: the people at the head of the cognitive computing industry would grow significantly richer, while workers without ongoing, reliable employment would become less well off. [23]

The more industries start to use cognitive computing, the more difficult it will be for humans to compete. [23] Increased use of the technology will also increase the amount of work that AI-driven robots and machines can perform. Only extraordinarily talented, capable and motivated humans would be able to keep up with the machines. The influence of such competitive individuals, in conjunction with artificial intelligence and cognitive computing, has the potential to change the course of humankind. [24]

Related Research Articles

Cognitive science: Interdisciplinary scientific study of cognitive processes

Cognitive science is the interdisciplinary, scientific study of the mind and its processes with input from linguistics, psychology, neuroscience, philosophy, computer science/artificial intelligence, and anthropology. It examines the nature, the tasks, and the functions of cognition. Cognitive scientists study intelligence and behavior, with a focus on how nervous systems represent, process, and transform information. Mental faculties of concern to cognitive scientists include language, perception, memory, attention, reasoning, and emotion; to understand these faculties, cognitive scientists borrow from fields such as linguistics, psychology, artificial intelligence, philosophy, neuroscience, and anthropology. The typical analysis of cognitive science spans many levels of organization, from learning and decision to logic and planning; from neural circuitry to modular brain organization. One of the fundamental concepts of cognitive science is that "thinking can best be understood in terms of representational structures in the mind and computational procedures that operate on those structures."

Accuracy and precision are two measures of observational error. Accuracy is how close a given set of measurements are to their true value, while precision is how close the measurements are to each other.

Artificial consciousness (AC), also known as machine consciousness (MC), synthetic consciousness or digital consciousness, is the consciousness hypothesized to be possible in artificial intelligence. It is also the corresponding field of study, which draws insights from philosophy of mind, philosophy of artificial intelligence, cognitive science and neuroscience. The same terminology can be used with the term "sentience" instead of "consciousness" when specifically designating phenomenal consciousness.

Bio-inspired computing, short for biologically inspired computing, is a field of study which seeks to solve computer science problems using models of biology. It relates to connectionism, social behavior, and emergence. Within computer science, bio-inspired computing relates to artificial intelligence and machine learning. Bio-inspired computing is a major subset of natural computation.

Neuromorphic computing is an approach to computing that is inspired by the structure and function of the human brain. A neuromorphic computer/chip is any device that uses physical artificial neurons to do computations. In recent times, the term neuromorphic has been used to describe analog, digital, mixed-mode analog/digital VLSI, and software systems that implement models of neural systems. The implementation of neuromorphic computing on the hardware level can be realized by oxide-based memristors, spintronic memories, threshold switches, transistors, among others. Training software-based neuromorphic systems of spiking neural networks can be achieved using error backpropagation, e.g., using Python based frameworks such as snnTorch, or using canonical learning rules from the biological learning literature, e.g., using BindsNet.

Artificial general intelligence: Hypothetical human-level or stronger AI

An artificial general intelligence (AGI) is a hypothetical type of intelligent agent. If realized, an AGI could learn to accomplish any intellectual task that human beings or animals can perform. Alternatively, AGI has been defined as an autonomous system that surpasses human capabilities in the majority of economically valuable tasks. Creating AGI is a primary goal of some artificial intelligence research and of companies such as OpenAI, DeepMind, and Anthropic. AGI is a common topic in science fiction and futures studies.

A superintelligence is a hypothetical agent that possesses intelligence far surpassing that of the brightest and most gifted human minds. "Superintelligence" may also refer to a property of problem-solving systems whether or not these high-level intellectual competencies are embodied in agents that act in the world. A superintelligence may or may not be created by an intelligence explosion and associated with a technological singularity.

A cognitive architecture refers to both a theory about the structure of the human mind and to a computational instantiation of such a theory used in the fields of artificial intelligence (AI) and computational cognitive science. The formalized models can be used to further refine a comprehensive theory of cognition and as a useful artificial intelligence program. Successful cognitive architectures include ACT-R and SOAR. The research on cognitive architectures as software instantiation of cognitive theories was initiated by Allen Newell in 1990.

An artificial brain is software and hardware with cognitive abilities similar to those of the animal or human brain.

Computational cognition is the study of the computational basis of learning and inference by mathematical modeling, computer simulation, and behavioral experiments. In psychology, it is an approach which develops computational models based on experimental results. It seeks to understand the basis behind the human method of processing of information. Early on computational cognitive scientists sought to bring back and create a scientific form of Brentano's psychology.

Neuroinformatics is the field that combines informatics and neuroscience. Neuroinformatics is related to neuroscience data and information processing by artificial neural networks. There are three main directions in which neuroinformatics is applied:

Intelligence amplification: Use of information technology to augment human intelligence

Intelligence amplification (IA) refers to the effective use of information technology in augmenting human intelligence. The idea was first proposed in the 1950s and 1960s by cybernetics and early computer pioneers.

In philosophy of mind, the computational theory of mind (CTM), also known as computationalism, is a family of views that hold that the human mind is an information processing system and that cognition and consciousness together are a form of computation. Warren McCulloch and Walter Pitts (1943) were the first to suggest that neural activity is computational. They argued that neural computations explain cognition. The theory was proposed in its modern form by Hilary Putnam in 1967, and developed by his PhD student, philosopher, and cognitive scientist Jerry Fodor in the 1960s, 1970s, and 1980s. It was vigorously disputed in analytic philosophy in the 1990s due to work by Putnam himself, John Searle, and others.

Affective design describes the design of user interfaces in which emotional information is communicated to the computer from the user in a natural and comfortable way. The computer processes the emotional information and adapts or responds to try to improve the interaction in some way. The notion of affective design emerged from the field of human–computer interaction (HCI), specifically from the developing area of affective computing. Affective design serves an important role in user experience (UX) as it contributes to the improvement of the user's personal condition in relation to the computing system. The goals of affective design focus on providing users with an optimal, proactive experience. Amongst overlap with several fields, applications of affective design include ambient intelligence, human–robot interaction, and video games.

Outline of artificial intelligence: Overview of and topical guide to artificial intelligence

The following outline is provided as an overview of and topical guide to artificial intelligence:

Augmented cognition is an interdisciplinary area of psychology and engineering, attracting researchers from the more traditional fields of human-computer interaction, psychology, ergonomics and neuroscience. Augmented cognition research generally focuses on tasks and environments where human–computer interaction and interfaces already exist. Developers, leveraging the tools and findings of neuroscience, aim to develop applications which capture the human user's cognitive state in order to drive real-time computer systems. In doing so, these systems are able to provide operational data specifically targeted for the user in a given context. Three major areas of research in the field are: Cognitive State Assessment (CSA), Mitigation Strategies (MS), and Robust Controllers (RC). A subfield of the science, Augmented Social Cognition, endeavours to enhance the "ability of a group of people to remember, think, and reason."

Ashwin Ram is an Indian-American computer scientist. He was chief innovation officer at PARC from 2011 to 2016, has published books and scientific articles, and has helped start at least two companies.

A cognitive computer is a computer that hardwires artificial intelligence and machine learning algorithms into an integrated circuit that closely reproduces the behavior of the human brain. It generally adopts a neuromorphic engineering approach. Synonyms include neuromorphic chip and cognitive chip.

Glossary of artificial intelligence: List of definitions of terms and concepts commonly used in the study of artificial intelligence

This glossary of artificial intelligence is a list of definitions of terms and concepts relevant to the study of artificial intelligence, its sub-disciplines, and related fields. Related glossaries include Glossary of computer science, Glossary of robotics, and Glossary of machine vision.

References

  1. Kelly III, John (2015). "Computing, cognition and the future of knowing" (PDF). IBM Research: Cognitive Computing. IBM Corporation. Retrieved February 9, 2016.
  2. Augmented intelligence, helping humans make smarter decisions. Hewlett Packard Enterprise. http://h20195.www2.hpe.com/V2/GetPDF.aspx/4AA6-4478ENW.pdf Archived April 27, 2016, at the Wayback Machine
  3. "Cognitive Computing". April 27, 2014. Archived from the original on July 11, 2019. Retrieved April 18, 2016.
  4. Gutierrez-Garcia, J. Octavio; López-Neri, Emmanuel (November 30, 2015). "Cognitive Computing: A Brief Survey and Open Research Challenges". 2015 3rd International Conference on Applied Computing and Information Technology/2nd International Conference on Computational Science and Intelligence. pp. 328–333. doi:10.1109/ACIT-CSI.2015.64. ISBN   978-1-4673-9642-4. S2CID   15229045.
  5. Terdiman, Daniel (2014). "IBM's TrueNorth processor mimics the human brain". CNET. http://www.cnet.com/news/ibms-truenorth-processor-mimics-the-human-brain/
  6. Knight, Shawn (2011). "IBM unveils cognitive computing chips that mimic human brain". TechSpot, August 18, 2011.
  7. Hamill, Jasper (2013). "Cognitive computing: IBM unveils software for its brain-like SyNAPSE chips". The Register, August 8, 2013.
  8. Denning, P. J. (2014). "Surfing Toward the Future". Communications of the ACM. 57 (3): 26–29. doi:10.1145/2566967. S2CID 20681733.
  9. Ludwig, Lars (2013). Extended Artificial Memory: Toward an integral cognitive theory of memory and technology (PDF) (Thesis). Technical University of Kaiserslautern. Retrieved February 7, 2017.
  10. "Automate Complex Workflows Using Tactical Cognitive Computing: Coseer". thesiliconreview.com. Retrieved July 31, 2017.
  11. Fulbright, Ron (2020). Democratization of Expertise: How Cognitive Systems Will Revolutionize Your Life (1st ed.). Boca Raton, FL: CRC Press. ISBN   978-0367859459.
  12. Ferrucci, David; Brown, Eric; Chu-Carroll, Jennifer; Fan, James; Gondek, David; Kalyanpur, Aditya A.; Lally, Adam; Murdock, J. William; Nyberg, Eric; Prager, John; Schlaefer, Nico; Welty, Chris (July 28, 2010). "Building Watson: An Overview of the DeepQA Project" (PDF). AI Magazine. 31 (3): 59–79. doi:10.1609/aimag.v31i3.2303. S2CID   1831060. Archived from the original (PDF) on February 28, 2020.
  13. Deanfelis, Stephen (2014). "Will 2014 Be the Year You Fall in Love With Cognitive Computing?". Wired, April 21, 2014.
  14. "Cognitive analytics - The three-minute guide" (PDF). 2014. Retrieved August 18, 2017.
  15. Sears, Alec (April 14, 2018). "The Role Of Artificial Intelligence In The Classroom". ElearningIndustry. Retrieved April 11, 2019.
  16. Coccoli, Mauro; Maresca, Paolo; Stanganelli, Lidia (May 21, 2016). "Cognitive computing in education". Journal of e-Learning and Knowledge Society. 12 (2).
  17. Dobrescu, Edith Mihaela; Dobrescu, Emilian M. (2018). "Artificial Intelligence (Ai) - The Technology That Shapes The World" (PDF). Global Economic Observer. 6 (2): 71–81. ProQuest   2176184267.
  18. "Smart Procurement Technologies for the Construction Sector". publication.sipmm.edu.sg. October 25, 2021. Retrieved March 2, 2022.
  19. Fulbright, Ron (2020). Democratization of Expertise: How Cognitive Systems Will Revolutionize Your Life. Boca Raton, FL: CRC Press. ISBN   978-0367859459.
  20. Fulbright, Ron (2019). "Calculating Cognitive Augmentation – A Case Study". Augmented Cognition. Lecture Notes in Computer Science. Vol. 11580. pp. 533–545. arXiv: 2211.06479 . doi:10.1007/978-3-030-22419-6_38. ISBN   978-3-030-22418-9. S2CID   195891648.
  21. Fulbright, Ron (2018). "On Measuring Cognition and Cognitive Augmentation". Human Interface and the Management of Information. Information in Applications and Services. Lecture Notes in Computer Science. Vol. 10905. pp. 494–507. arXiv: 2211.06477 . doi:10.1007/978-3-319-92046-7_41. ISBN   978-3-319-92045-0. S2CID   51603737.
  22. Fulbright, Ron (2020). "Synthetic Expertise". Augmented Cognition. Human Cognition and Behavior. Lecture Notes in Computer Science. Vol. 12197. pp. 27–48. arXiv: 2212.03244 . doi:10.1007/978-3-030-50439-7_3. ISBN   978-3-030-50438-0. S2CID   220519330.
  23. Makridakis, Spyros (June 2017). "The forthcoming Artificial Intelligence (AI) revolution: Its impact on society and firms". Futures. 90: 46–60. doi:10.1016/j.futures.2017.03.006. S2CID 152199271.
  24. West, Darrell M. (2018). The Future of Work: Robots, AI, and Automation. Brookings Institution Press. ISBN   978-0-8157-3293-8. JSTOR   10.7864/j.ctt1vjqp2g.[ page needed ]

Further reading