Systemantics

General Systemantics
[Cover of the 1977 edition]
Author: John Gall
Illustrator: R. O. Blechman
Language: English
Subject: Systems science
Publisher: General Systemantics Press
Publication date: 1975/78, 1986, 2002
Media type: Print

General Systemantics (retitled Systemantics in its second edition and The Systems Bible in its third) is a systems engineering treatise by John Gall in which he offers practical principles of systems design based on experience and anecdotes.


It is offered from the perspective of how not to design systems, based on systems engineering failures. The treatise's primary precept is that large, complex systems are extremely difficult to design correctly despite the best intentions, so care must be taken to design smaller, less complex systems, building functionality incrementally and staying in close, continual touch with user needs and measures of effectiveness.

History

The book was initially self-published after Gall received rejection letters from 30 publishers. After several reviews in academic journals, it was picked up by Quadrangle–The New York Times Book Company, which published it in 1977. [1]

Title origin

The term systemantics is a commentary on Alfred Korzybski's earlier general semantics, which conjectured that all system failures can be traced to a single root cause: a failure to communicate. Gall observes that, instead, failure is an intrinsic feature of systems. He thereby derives the term general systemantics, a nod to the notion of a sweeping theory of system failure, but one that attributes failure to intrinsic features governed by laws of system behavior. As a side note, he observes that "system antics" also playfully captures the idea that systems naturally "act up."

Contents

Background

Premise

  • Systems in general work poorly or not at all. [2]

This is more a universal observation than a law. The author traces the observation back through:

  1. Murphy's law that "if anything can go wrong, it will";
  2. Alfred Korzybski's general semantics notion that the root cause of failure is a communication problem;
  3. humorist Stephen Potter's One-upmanship on ways to "game" a system for personal benefit;
  4. historian C. Northcote Parkinson's principle, Parkinson's law: "Work expands so as to fill the time available for its completion";
  5. educator Laurence J. Peter's widely cited Peter principle: "In a hierarchy every employee tends to rise to his level of incompetence ... in time every post tends to be occupied by an employee who is incompetent to carry out its duties ... Work is accomplished by those employees who have not yet reached their level of incompetence."

Scope

By systems, the author refers to those that "...involve human beings, particularly those very large systems such as national governments, nations themselves, religions, the railway system, the post office..." though the intention is that the principles are general to any system.

Additionally, the author observes:

  1. Everything is a system.
  2. Everything is part of a larger system.
  3. The universe is infinitely systematized, both upward (larger systems) and downward (smaller systems).
  4. All systems are infinitely complex.

First principles

  • New systems mean new problems. [3]

Once a system is set up to solve some problem, the system itself engenders new problems relating to its development, operations and maintenance. The author points out that the additional energy required to support the system can consume the energy it was meant to save. This leads to the next principle:

  • The total amount of anergy in the universe is fixed. [4]

The author defines anergy as the effort required to bring about a change. This is meant as a tongue-in-cheek analog of the law of conservation of energy.

  • Systems tend to expand to fill the known universe.

One of the problems that a system creates is that it becomes an entity unto itself that not only persists but expands and encroaches on areas beyond the original system's purview.

Why systems behave poorly

The author cites a number of spectacularly unexpected behaviors, including:

  1. The Aswan Dam diverted the Nile's fertilizing sediment to Lake Nasser (where it is useless), requiring the dam to operate at full electrical generating capacity to run the artificial fertilizer plants needed to replace the diverted sediment.
  2. The Vehicle Assembly Building at Kennedy Space Center, designed to protect space vehicles from the weather, is so large that it produces its own weather.

Feedback

Not only do systems expand well beyond their original goals, but as they evolve they tend to oppose even those goals. This is a systems-theory analog of Le Chatelier's principle, which holds that chemical and physical processes tend to counteract changed conditions that upset equilibrium until a new equilibrium is established. The same counteracting force can be seen in systems behavior: incentive reward systems set up in business, for example, can have the effect of institutionalizing mediocrity. [5] This leads to the following principle:

  • Systems tend to oppose their own proper functions. [6]

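The counteraction can be given a toy quantitative form. The sketch below (a Python illustration by analogy; the numbers and names are assumptions, not from the book) simulates a state subjected to a constant outside push while a negative-feedback term opposes the displacement; the state settles at a new equilibrium rather than drifting without bound.

    # A toy negative-feedback loop (illustrative analogy, not from the book).
    # A constant outside "push" tries to move the system's state; the feedback
    # term counteracts the displacement, so the state settles at a new
    # equilibrium instead of drifting without bound.

    def step(state: float, target: float, push: float, gain: float = 0.5) -> float:
        """Advance one time step: apply the push, then the counteracting feedback."""
        correction = gain * (target - state)  # feedback opposing the displacement
        return state + push + correction

    state = 0.0
    for _ in range(20):
        state = step(state, target=0.0, push=1.0)

    print(round(state, 2))  # prints 2.0: a new, displaced equilibrium

Without the correction term the state would reach 20 after 20 steps; with it, the push is absorbed at a displaced equilibrium of 2.0, the systems analog of a process re-equilibrating under Le Chatelier's principle.
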
What's in a name

People performing roles in systems often do not perform the role suggested by the name the system gives that person, nor does the system itself perform the role that its name suggests.

Inside systems

  • The real world is what is reported to the system. [7]

In other words, the system has a severely censored and distorted view of reality, fed by biased and filtering sensory organs. This distorted view displaces understanding of the actual real world, which pales and tends to disappear. The displacement creates a kind of sensory deprivation and a hallucinogenic effect on those inside the system, causing them to lose common sense. Besides negatively affecting those inside it, the system attracts people who are optimized for the pathological environment it creates. Thus,

  • Systems attract systems-people. [8]

Elementary systems functions

  1. A complex system cannot be "made" to work. It either works or it does not.
  2. A simple system, designed from scratch, sometimes works.
  3. Some complex systems actually work.
  4. A complex system that works is invariably found to have evolved from a simple system that works.
  5. A complex system designed from scratch never works and cannot be patched up to make it work. One has to start over, beginning with a working simple system.
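
Items 4 and 5 of this list are widely quoted in software engineering as "Gall's law". A minimal Python sketch of the working style they suggest (the example itself is an assumption for illustration, not from the book): start from a simple system that demonstrably works and grow it in small, verified increments rather than designing the complex version from scratch.

    # Gall's law as a working style (illustrative example, not from the book):
    # begin with a simple system that works, then evolve it in small steps,
    # re-verifying behavior after every increment.

    def tokenize(text: str) -> list[str]:
        """The simple system that works: split text on whitespace."""
        return text.split()

    assert tokenize("Systems fail") == ["Systems", "fail"]  # verified before growing

    def tokenize_normalized(text: str) -> list[str]:
        """One small increment on the working system: lowercase each token."""
        return [token.lower() for token in tokenize(text)]

    assert tokenize_normalized("Systems FAIL") == ["systems", "fail"]  # still works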

Advanced systems functions

  1. The Functional Indeterminacy Theorem (F.I.T.): in complex systems, malfunction and even total non-function may not be detectable for long periods, if ever.
  2. The Newtonian Law of Systems Inertia: a system that performs a certain way will continue to operate in that way regardless of the need or of changed conditions.
  3. Systems develop goals of their own the instant they come into being.
  4. Intrasystem goals come first.

System failure

  1. The Fundamental Failure-Mode Theorem (F.F.T.): complex systems usually operate in a failure mode.
  2. A complex system can fail in an infinite number of ways. (If anything can go wrong, it will; see Murphy's law.)
  3. The mode of failure of a complex system cannot ordinarily be predicted from its structure.
  4. The crucial variables are discovered by accident.
  5. The larger the system, the greater the probability of unexpected failure.
  6. "Success" or "function" in any system may be failure in the larger or smaller systems to which the system is connected.
  7. The Fail-Safe Theorem: when a fail-safe system fails, it fails by failing to fail safe.

Practical systems design

  1. The Vector Theory of Systems: systems run better when designed to run downhill.
  2. Loose systems last longer and work better. (Efficient systems are dangerous to themselves and to others.)

Management and other myths

  1. Complex systems tend to produce complex responses (not solutions) to problems.
  2. Great advances are not produced by systems designed to produce great advances.

Other laws of systemantics

  1. As systems grow in size, they tend to lose basic functions.
  2. The larger the system, the less the variety in the product.
  3. Control of a system is exercised by the element with the greatest variety of behavioral responses.
  4. Colossal systems foster colossal errors.
  5. Choose systems with care.

Reception

A 1978 review in Money stated that the author "clearly set out to write another Peter Principle". [9] A 1977 review in Etc: A Review of General Semantics states that the book's aim is unclear, commenting, "As a put-down of institutional practices it works well, as good as anything in print", but "As a slam at systems theory the book is less successful, even ambiguous." [10] A 1977 review in Library Journal comments, "Like some of its predecessors, the book pretends to rebuke people for their manifold stupidities, but is, in fact, an invitation to take pleasure in them. That's not a failing, just a fact. Recommended." [11] A 2004 review in the American Society of Safety Professionals' Professional Safety says, "It is at once deadly serious with all the outrageous contrived irony of Gary Larson's 'Far Side' cartoons" and that "the book is one continuous insight after another." [12] PCMag calls the book "small but insightful". [13]


References

  1. Serrin, Judith (1977-01-05). "Why Things Just Won't Work". Detroit Free Press. pp. 1C, 5C. Retrieved 2023-09-20 via Newspapers.com.
  2. Gall, John (1978). Systemantics. Pocket Books. p. 22. ISBN 9780671819101.
  3. Gall, John (1978). Systemantics. Pocket Books. p. 29. ISBN 9780671819101.
  4. Gall, John (1978). Systemantics. Pocket Books. p. 40. ISBN 9780671819101.
  5. Pink, Daniel (2011). Drive. Penguin. ISBN 978-1594484803.
  6. Gall, John (1978). Systemantics. Pocket Books. p. 48. ISBN 9780671819101.
  7. Gall, John (1978). Systemantics. Pocket Books. p. 58. ISBN 9780671819101.
  8. Gall, John (1978). Systemantics. Pocket Books. p. 65. ISBN 9780671819101.
  9. Harris, Marlys (January 1978). "Why Things Don't Work: Three books on systems". Money. Vol. 7, no. 1.
  10. Quinby, David L. (December 1977). "Review: General Semantics and General Systems: An Irreverent View". Etc: A Review of General Semantics. 34 (4) via JSTOR.
  11. Anderson, A. J. (1977-05-01). "Humor: Gall, John. Systemantics: How systems work and especially how they fail". Library Journal. 102 (9): 1018 via EBSCO.
  12. Metzgar, Carl R. (October 2004). "Writing Worth Reading: Review: The Systems Bible". Professional Safety. American Society of Safety Professionals. 49 (10): 20, 72 via JSTOR.
  13. "Definition of Systemantics". PCMag. Retrieved 2023-09-20.
