| Author | John Gall |
| --- | --- |
| Illustrator | R. O. Blechman |
| Language | English |
| Subject | Systems science |
| Publisher | General Systemantics Press |
| Publication date | 1975/78, 1986, 2002 |
| Media type | |
General Systemantics (retitled Systemantics in its second edition and The Systems Bible in its third) is a systems engineering treatise by John Gall in which he offers practical principles of systems design based on experience and anecdotes.
It is offered from the perspective of how not to design systems, drawing on systems engineering failures. The primary precept of the treatise is that large, complex systems are extremely difficult to design correctly despite the best intentions; designers should therefore favor smaller, less complex systems, built up with incremental functionality and guided by close, continual contact with user needs and measures of effectiveness.
The book was initially self-published after Gall received rejection letters from 30 publishers. After several reviews in academic journals, it was picked up by Quadrangle–The New York Times Book Company, which published it in 1977.[1]
The term systemantics is a commentary on prior work by Alfred Korzybski called general semantics, which conjectured that all system failures could be attributed to a single root cause: a failure to communicate. Gall observes that, instead, system failure is an intrinsic feature of systems. He thus derives the term general systemantics as a nod to the notion of a sweeping theory of system failure, but one that attributes failure to intrinsic laws of system behavior. As a side note, he observes that system antics also playfully captures the concept that systems naturally "act up."
This is more a universal observation than a law. The origin of this observation is traced back via:
By systems, the author refers to those that "...involve human beings, particularly those very large systems such as national governments, nations themselves, religions, the railway system, the post office..." though the intention is that the principles are general to any system.
Additionally, the author observes:
Once a system is set up to solve some problem, the system itself engenders new problems relating to its development, operations and maintenance. The author points out that the additional energy required to support the system can consume the energy it was meant to save. This leads to the next principle:
The author defines anergy as the effort required to bring about a change. This is meant as a tongue-in-cheek analog of the law of conservation of energy.
One of the problems that a system creates is that it becomes an entity unto itself that not only persists but expands and encroaches on areas beyond the original system's purview.
The author cites a number of spectacular unexpected behaviors including:
Not only do systems expand well beyond their original goals, but as they evolve they tend to oppose even those goals. This is seen as a systems-theory analog of Le Chatelier's principle, which holds that chemical and physical processes tend to counteract changed conditions that upset equilibrium until a new equilibrium is established. The same counteracting force can be seen in systems behavior. For example, incentive reward systems set up in business can have the effect of institutionalizing mediocrity.[5] This leads to the following principle:
People performing roles in systems often do not perform the role suggested by the name the system gives that person, nor does the system itself perform the role that its name suggests.
In other words, the system has a severely censored and distorted view of reality, produced by biased and filtering sensory organs. This distorted view displaces understanding of the real world, which pales and tends to disappear. The displacement creates a kind of sensory deprivation and a hallucinogenic effect on those inside the system, causing them to lose common sense. Besides negatively affecting those inside it, the system attracts people who are optimized for the pathological environment it creates. Thus,
Money stated in 1978 that the author "clearly set out to write another Peter Principle".[9] A 1977 review in Etc: A Review of General Semantics states that the book's aim is unclear, commenting, "As a put-down of institutional practices it works well, as good as anything in print", but "As a slam at systems theory the book is less successful, even ambiguous."[10] A Library Journal review from 1977 comments, "Like some of its predecessors, the book pretends to rebuke people for their manifold stupidities, but is, in fact, an invitation to take pleasure in them. That's not a failing, just a fact. Recommended."[11] A 2004 review in the American Society of Safety Professionals' Professional Safety says, "It is at once deadly serious with all the outrageous contrived irony of Gary Larson's 'Far Side' cartoons" and that "the book is one continuous insight after another."[12] PCMag calls the book "small but insightful".[13]