G. M. Nijssen

Gerardus Maria "Sjir" Nijssen (born 18 October 1938 in Schinnen) is a Dutch computer scientist, former professor of computer science at the University of Queensland,[1] consultant, and author. Nijssen is considered the founder of verbalization in computer science and one of the founders of business modeling and information analysis based on natural language.[2]

Biography

Nijssen finished his studies at the Eindhoven University of Technology in 1965 and started working at Philips in the department of Commercial Efficiency Research. From 1968 to 1970 he was director of the educational institute "The Dutch Centre for Business and IT". In 1970 he moved to Control Data Corporation, a pioneer in the field of computer science, at its European headquarters in Brussels, Belgium. In those years he started fact-based modeling and developed NIAM. During this time he was also associated with several academic institutions and international standards organizations. In 1974 he co-founded the IFIP WG 2.6 Database Experts group and served as its first chairman until 1983.[1] He was also a member of IFIP WG 8.1 on Information Systems and of the ISO TC97/SC5/WG3 working group on Conceptual Schemas.[2]

From 1982 to 1989 Nijssen was a full-time professor of computer science at the University of Queensland in Brisbane, Australia, where he worked with Terry Halpin, amongst others, on the further development of NIAM. After returning to the Netherlands in 1989, he founded PNA Group (short for Professor Nijssen Associates) and accepted a position at the University of Maastricht, Netherlands.[citation needed]

In 2002 Nijssen retired as CEO of PNA Group. He remained active as a member of the OMG SBVR 1.1 Revision Task Force (RTF), the OMG BPMN Revision Task Force (RTF), the OMG Architecture Ecosystem Special Interest Group (AE SIG), and the Fact Based Modeling Task Force.[2]

Work

Nijssen's research interests in the field of computer science have developed over the years. In the 1970s he focused on information systems and database technology.[1]

NIAM

At Control Data in the early 1970s, Nijssen started with fact-based modeling and developed NIAM, a fact-based modeling method and notation for business practice. The acronym NIAM originally stood for "Nijssen's Information Analysis Methodology"; it was later generalised to "Natural language Information Analysis Methodology" (also known as Binary Relationship Modeling), since Nijssen was only one of many people involved in the development of the method.[3]
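
The core idea of the fact-based approach is that every elementary fact can be verbalized as a natural-language sentence built from a fact-type template. The following sketch illustrates that idea only; the fact types, templates, and object names are invented for illustration and are not taken from the NIAM literature.

```python
# Illustrative sketch of fact-based verbalization: elementary facts are
# stored as objects playing roles in a fact type, and each fact type
# carries a natural-language sentence template.

fact_types = {
    "works_for": "Employee {0} works for Department {1}.",
    "located_in": "Department {0} is located in City {1}.",
}

facts = [
    ("works_for", ("E123", "Sales")),
    ("located_in", ("Sales", "Brisbane")),
]

def verbalize(fact_type, objects):
    """Render one elementary fact as a natural-language sentence."""
    return fact_types[fact_type].format(*objects)

for ft, objs in facts:
    print(verbalize(ft, objs))
```

Reading a model back as plain sentences in this way is what allows domain experts, rather than only database specialists, to validate it.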

Conceptual schema and relational database design, 1989

In 1989 Nijssen and Terry Halpin published the book Conceptual Schema and Relational Database Design: A Fact Oriented Approach. Its introduction described the background of this work:

"Prof. G. M. Nijssen, the originator of NIAM design method, had for a long time given a higher priority to working on new aspects of the method and advancing it, than to writing a textbook about it; but at last, here it is. The NIAM method was initiated in the early 1970s, at a time when most researchers in the data base and information system field still were discussing data modeling on the level of record structures. Only a few acknowledged the need for semantic data modeling. Among these few was Prof. Nijssen, who realized its enormous potential for the practice of data base and information system development..."[4]

The introduction further explained that NIAM was developed in cooperation with several other scientists, such as E.D. Falkenberg. Nijssen and Halpin wrote:

"... the numerous fruitful discussions which ... with Prof. E.D. Falkenberg, while he was at the University of Stuttgart, Siemens Research center and at the University of Queensland. Some of these discussions were enjoyed in "high places", such as the Rigi and Saas Fee, in Switzerland. Various ideas contained in the NIAM design method were originated by Prof. Falkenberg, for example, the basic set of concepts and some aspects of the design procedure, including an algorithm for designing subtypes."[5]

Nijssen and Halpin further explained:

"While the "great debate" in 1974 between proponents of the CODASYL Network Model (C. W. Bachman) and of the Relational Model (Dr E. F. Codd) was the focus of attention in database research world, it was Prof. Falkenberg who said that: The debate is irrelevant for semantic data modeling. Now, years later, the debate on semantic data modeling is indeed concerned with issues quite different from those emphasised in the conventional data models."[5]

CogNIAM

Back in the Netherlands in the 1990s, Nijssen developed the Cognition enhanced Natural language Information Analysis Method (CogNIAM). With it he focused entirely on the most productive protocol for developing business requirements and integrated business modeling.

Publications

Nijssen published more than 50 articles and 7 books.[6]

References

  1. Australian Computer Journal, Vol. 19–20, 1987, p. 75.
  2. Prof. dr. ir. G.M. Nijssen, PNA Group, 2009, at archive.org, 2017. (Originally retrieved 17 July 2009.)
  3. Wintraecken, J. J. V. R. (1990) [1987]. The NIAM Information Analysis Method: Theory and Practice. Translation of: Informatie-analyse volgens NIAM. Dordrecht; Boston: Kluwer Academic Publishers. doi:10.1007/978-94-009-0451-4. ISBN 079230263X. OCLC 19554537.
  4. Nijssen and Halpin (1989, p. ix)
  5. Nijssen and Halpin (1989, p. xiii)
  6. G.M. Nijssen, List of publications, archived 30 November 2009 at the Wayback Machine, from the DBLP Bibliography Server.