Nicola Guarino

Nicola Guarino (born 1954 in Messina) is an Italian computer scientist and researcher in the area of Formal Ontology for Information Systems, and the head of the Laboratory for Applied Ontology (LOA), part of the Italian National Research Council (CNR) in Trento.[1]

Work

Guarino's research interests are in the area of Artificial Intelligence, predominantly in Knowledge Representation. He may be best known in the Computer Science community for developing, with his colleague Chris Welty, OntoClean, the first methodology for formal ontological analysis.

Knowledge Representation

He is arguably one of the founders of the field of ontology in computer science, but undoubtedly one of its most outspoken proponents. While most AI and knowledge representation researchers focused on reasoning algorithms and the semantics of representation languages, treating the actual knowledge expressed in those languages, and reasoned over by those algorithms, as unimportant (mere examples), Guarino spearheaded a counter-movement to study how knowledge itself should be expressed. The rallying cry of this movement came from Patrick J. Hayes's well-known paper "The Naive Physics Manifesto".

Knowledge-based systems

Guarino's work in the early 1990s began to take shape as he applied his engineering background to understanding how knowledge-based systems were built and, most importantly, how their knowledge was acquired. He was a familiar face at the early Knowledge Acquisition Workshops, where he was best known for pointing to himself and saying, "I am not a class!" This remark referred to what Guarino considers an important and fundamental distinction between universals and particulars. While some representation systems allow classes themselves to be instances of other classes, and in certain contexts that makes sense, some instances can never be classes: these are particulars.
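
The distinction can be made concrete in a programming language with metaclasses. The following is a minimal, illustrative Python sketch, not drawn from Guarino's work; the class names are invented. A metaclass makes a class an instance of another class, while an ordinary object such as a particular dog is an instance that is not, and cannot be, a class.

```python
# A minimal sketch of the universal/particular distinction, using
# Python's metaclasses. Names here are illustrative, not Guarino's.

class Species(type):
    """A class whose instances are themselves classes (a metaclass).
    In ontological terms: a universal whose instances are universals."""

class Dog(metaclass=Species):
    """Dog is simultaneously a class (of individual dogs) and an
    instance of Species -- a class used as an instance."""

fido = Dog()  # fido is a particular: an instance that is not a class

print(isinstance(Dog, Species))  # True  -- a class as an instance
print(isinstance(fido, Dog))     # True  -- an ordinary instance
print(isinstance(fido, type))    # False -- "I am not a class!"
```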

Formal Ontology in Information Systems conference

His emphasis on formal rigor in specifying the kind of knowledge that computer scientists would eventually call "ontologies" led him to the field of formal ontology in philosophy, where he began to study the metaphysics literature, focusing on the work of such notables as Quine, Strawson, and especially Simons.

Guarino founded the Formal Ontology in Information Systems conference in 1998, a recurring academic conference focused on ontologies themselves, not the languages they are represented in. He has worked tirelessly to promote research in ontology and maintain a level of scientific rigor.

Related Research Articles

<span class="mw-page-title-main">Cyc</span> Artificial intelligence project

Cyc is a long-term artificial intelligence project that aims to assemble a comprehensive ontology and knowledge base that spans the basic concepts and rules about how the world works.

Knowledge representation and reasoning is the field of artificial intelligence (AI) dedicated to representing information about the world in a form that a computer system can use to solve complex tasks such as diagnosing a medical condition or having a dialog in a natural language. Knowledge representation incorporates findings from psychology about how humans solve problems and represent knowledge, in order to design formalisms that will make complex systems easier to design and build. Knowledge representation and reasoning also incorporates findings from logic to automate various kinds of reasoning.
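
As a toy illustration of the reasoning side, not drawn from any particular system, the sketch below applies forward chaining over if-then rules in the spirit of the diagnosis example mentioned above; the facts and rules are invented.

```python
# Toy forward-chaining inference: derive new facts from Horn-style
# rules until a fixed point is reached. Facts and rules are invented.

rules = [
    ({"fever", "cough"}, "flu_suspected"),
    ({"flu_suspected", "short_of_breath"}, "see_doctor"),
]
facts = {"fever", "cough", "short_of_breath"}

changed = True
while changed:
    changed = False
    for premises, conclusion in rules:
        if premises <= facts and conclusion not in facts:
            facts.add(conclusion)   # rule fires, new fact derived
            changed = True

print(facts)  # now includes "flu_suspected" and "see_doctor"
```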

<span class="mw-page-title-main">Semantic Web</span> Extension of the Web to facilitate data exchange

The Semantic Web, sometimes known as Web 3.0, is an extension of the World Wide Web through standards set by the World Wide Web Consortium (W3C). The goal of the Semantic Web is to make Internet data machine-readable.

In information science, an ontology encompasses a representation, formal naming, and definitions of the categories, properties, and relations between the concepts, data, or entities that pertain to one, many, or all domains of discourse. More simply, an ontology is a way of showing the properties of a subject area and how they are related, by defining a set of terms and relational expressions that represent the entities in that subject area. The field which studies ontologies so conceived is sometimes referred to as applied ontology.

A modeling language is any artificial language that can be used to express data, information, knowledge, or systems in a structure that is defined by a consistent set of rules. The rules are used to interpret the meaning of the components in the structure.

In information science, an upper ontology is an ontology that consists of very general terms that are common across all domains. An important function of an upper ontology is to support broad semantic interoperability among a large number of domain-specific ontologies by providing a common starting point for the formulation of definitions. Terms in the domain ontology are ranked under the terms in the upper ontology, e.g., the upper ontology classes are superclasses or supersets of all the classes in the domain ontologies.
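
The ranking of domain terms under upper-level terms can be sketched with ordinary class inheritance, as below. The categories are invented placeholders loosely inspired by common upper-level distinctions, not taken from any specific upper ontology.

```python
# Illustrative sketch: domain classes ranked under generic upper-level
# classes via subclassing. The categories are invented placeholders.

class Entity: ...                  # upper ontology: most general term
class PhysicalObject(Entity): ...  # upper ontology
class Process(Entity): ...         # upper ontology

class Chair(PhysicalObject): ...   # domain ontology: furniture
class Surgery(Process): ...        # domain ontology: medicine

# Every domain class sits under some upper-level class:
assert issubclass(Chair, Entity) and issubclass(Surgery, Entity)
```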

The Process Specification Language (PSL) is a set of logic terms used to describe processes. The logic terms are specified in an ontology that provides a formal description of the components and their relationships that make up a process. The ontology was developed at the National Institute of Standards and Technology (NIST), and has been approved as an international standard in the document ISO 18629.
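
To give the flavor of such logic terms, the following first-order axiom is an illustrative example written in the style of PSL's core theory, not a quotation from ISO 18629; it says that every activity occurrence begins no later than it ends:

$$\forall o\,\forall a\;\bigl(\mathit{occurrence\_of}(o, a)\;\rightarrow\;\mathit{beforeEq}(\mathit{beginof}(o),\ \mathit{endof}(o))\bigr)$$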

Applied ontology is the application of ontology for practical purposes. This can involve applying ontological methods or resources to specific domains, such as management, relationships, biomedicine, information science, or geography. Alternatively, applied ontology can aim more generally at developing improved methodologies for recording and organizing knowledge.

Frames are an artificial intelligence data structure used to divide knowledge into substructures by representing "stereotyped situations". They were proposed by Marvin Minsky in his 1974 article "A Framework for Representing Knowledge". Frames are the primary data structure used in artificial intelligence frame languages; they are stored as ontologies of sets.
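
A frame can be sketched as a record of named slots with default values and frame-to-frame inheritance, as below; the slot names and values are invented for illustration.

```python
# Minimal frame sketch: named slots with defaults, plus inheritance
# from a parent frame. Slot names and values are invented examples.

class Frame:
    def __init__(self, name, parent=None, **slots):
        self.name, self.parent, self.slots = name, parent, slots

    def get(self, slot):
        if slot in self.slots:
            return self.slots[slot]
        if self.parent is not None:      # fall back to parent frame
            return self.parent.get(slot)
        raise KeyError(slot)

room = Frame("Room", walls=4, has_door=True)
kitchen = Frame("Kitchen", parent=room, has_stove=True)

print(kitchen.get("walls"))      # 4, inherited default from Room
print(kitchen.get("has_stove"))  # True, local slot
```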

OntoClean is a methodology, developed by Nicola Guarino and Chris Welty, for analyzing ontologies based on formal, domain-independent properties of classes.
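
A brief illustration of how OntoClean works: one of its meta-properties is rigidity (a rigid class is essential to all its instances), and one of its constraints is that an anti-rigid class cannot subsume a rigid one; for example, Student (anti-rigid) cannot be a superclass of Person (rigid). The Python sketch below checks that single constraint over a toy taxonomy; the tagging scheme is an invented simplification of OntoClean's notation.

```python
# Sketch of one OntoClean constraint: an anti-rigid class must not
# subsume a rigid class. Taxonomy and tags are illustrative only.

rigidity = {"Person": "rigid", "Student": "anti-rigid"}

# (subclass, superclass) pairs; Person under Student is a modeling error
subsumptions = [("Person", "Student")]

for sub, sup in subsumptions:
    if rigidity[sup] == "anti-rigid" and rigidity[sub] == "rigid":
        print(f"OntoClean violation: anti-rigid {sup} subsumes rigid {sub}")
```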

<span class="mw-page-title-main">Chris Welty</span> American computer scientist

Christopher A. Welty is an American computer scientist, who works at Google Research in New York. He is best known for his work on ontologies, in the Semantic Web, and on IBM's Watson. While on sabbatical from Vassar College from 1999 to 2000, he collaborated with Nicola Guarino on OntoClean; he was co-chair of the W3C Rule Interchange Format working group from 2005 to 2009.

The Laboratoire d'Informatique de Grenoble is the largest research laboratory of Informatics in Grenoble, France. It was created 1 January 2007, as the result of a union of the 24 research teams of the previous IMAG Institute and the INRIA Rhône-Alpes.

Thomas Robert Gruber is an American computer scientist, inventor, and entrepreneur with a focus on systems for knowledge sharing and collective intelligence. He did foundational work in ontology engineering and is well known for his definition of ontologies in the context of artificial intelligence.

<span class="mw-page-title-main">Ontology engineering</span> Field that studies the methods and methodologies for building ontologies

In computer science, information science and systems engineering, ontology engineering is a field which studies the methods and methodologies for building ontologies, which encompass a representation, formal naming and definition of the categories, properties and relations between the concepts, data and entities of a given domain of interest. In a broader sense, this field also includes the construction of domain knowledge using formal ontology representations such as OWL/RDF. A large-scale representation of abstract concepts such as actions, time, physical objects and beliefs would be an example of ontological engineering. Ontology engineering is one of the areas of applied ontology, and can be seen as an application of philosophical ontology. Core ideas and objectives of ontology engineering are also central in conceptual modeling.
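
As a small illustration of the OWL/RDF representations mentioned above, the Python sketch below builds a two-class taxonomy with the rdflib library (assuming rdflib is installed; the namespace and classes are invented examples).

```python
# Build a tiny OWL/RDF taxonomy with rdflib and print it as Turtle.
# The example namespace and classes are invented for illustration.
from rdflib import Graph, Namespace
from rdflib.namespace import RDF, RDFS, OWL

EX = Namespace("http://example.org/onto#")
g = Graph()
g.bind("ex", EX)

g.add((EX.Furniture, RDF.type, OWL.Class))
g.add((EX.Chair, RDF.type, OWL.Class))
g.add((EX.Chair, RDFS.subClassOf, EX.Furniture))  # Chair under Furniture

print(g.serialize(format="turtle"))
```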

In philosophy, a process ontology is a universal model of the structure of the world as an ordered wholeness. Such ontologies are fundamental ontologies, in contrast to so-called applied ontologies. Fundamental ontologies do not claim to be empirically testable in themselves, but to be a structural design pattern out of which empirical phenomena can be explained and put together consistently. Throughout Western history, the dominant fundamental ontology has been the so-called substance theory. However, fundamental process ontologies have become more important in recent times, because progress in the foundations of physics has spurred the development of a basic concept able to integrate such boundary notions as "energy" and "object" with the physical dimensions of space and time.

Knowledge extraction is the creation of knowledge from structured and unstructured sources. The resulting knowledge needs to be in a machine-readable and machine-interpretable format and must represent knowledge in a manner that facilitates inferencing. Although it is methodically similar to information extraction (NLP) and ETL, the main criterion is that the extraction result goes beyond the creation of structured information or the transformation into a relational schema. It requires either the reuse of existing formal knowledge or the generation of a schema based on the source data.
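
As a toy illustration, not a real pipeline, the sketch below lifts a relational-style record into subject-predicate-object triples so that the result can be reused as formal knowledge; the table and vocabulary are invented.

```python
# Toy knowledge extraction: lift a relational-style record into
# subject-predicate-object triples. Data and vocabulary are invented.

row = {"id": "p42", "name": "Alice", "employer": "ACME"}

triples = [
    (row["id"], "rdf:type", "ex:Person"),       # typing, not just data
    (row["id"], "ex:name", row["name"]),
    (row["id"], "ex:worksFor", row["employer"]),
]

for s, p, o in triples:
    print(s, p, o)
```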

In information technology a reasoning system is a software system that generates conclusions from available knowledge using logical techniques such as deduction and induction. Reasoning systems play an important role in the implementation of artificial intelligence and knowledge-based systems.

<span class="mw-page-title-main">Conceptualization (information science)</span> Abstract simplified view of selected part(s) of the world

In information science a conceptualization is an abstract, simplified view of some selected part of the world, containing the objects, concepts, and other entities that are presumed of interest for some particular purpose, together with the relationships between them. An explicit specification of a conceptualization is an ontology, and a single conceptualization may be realized by several distinct ontologies. An ontological commitment, when comparing ontologies, refers to the subset of elements of an ontology shared with all the others. "An ontology is language-dependent", its objects and interrelations described within the language it uses, while a conceptualization is always the same and more general, its concepts existing "independently of the language used to describe it".
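
The point that one conceptualization can be realized by several ontologies can be sketched as follows: the same underlying entities and relation are specified twice under different vocabularies. Everything here is an invented toy example.

```python
# Toy sketch: one conceptualization, two ontologies that realize it
# under different vocabularies. All names are invented.

# The conceptualization: entities and a relation, language-independent.
entities = {"alice", "acme"}
works_for = {("alice", "acme")}

# Ontology A: English-flavored vocabulary.
ontology_a = {"Person": {"alice"}, "Company": {"acme"},
              "worksFor": works_for}

# Ontology B: different terms, same underlying conceptualization.
ontology_b = {"Agente": {"alice"}, "Organizzazione": {"acme"},
              "lavoraPer": works_for}

# Both ontologies commit to the same underlying relation:
assert ontology_a["worksFor"] == ontology_b["lavoraPer"]
```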

<span class="mw-page-title-main">Michael Gruninger</span> Canadian computer scientist

Michael Gruninger is a Canadian computer scientist and Professor of Industrial Engineering at the University of Toronto, known for his work on ontologies in information science, particularly with the Process Specification Language, and in enterprise modelling on the TOVE Project with Mark S. Fox.

<span class="mw-page-title-main">Michael Uschold</span> American computer scientist

Michael F. Uschold is an American computer scientist, Artificial Intelligence researcher, and consultant known for his work on knowledge representation and ontology.

References

  1. "Nicola Guarino's homepage". loa.istc.cnr.it. Laboratory for Applied Ontology. Archived from the original on 2018-06-25. Retrieved 2024-07-17.