Technical failure

[Image: Software error]

A technical failure is an unwanted error or malfunction in a technology-based system.

Causes

For example, the motor-assisted input selector of a hi-fi amplifier may become faulty and switch rapidly and at random between the inputs.

Causes include fatigue and attenuation distortion.
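A fault like the one above can also be caught in software. As a minimal sketch (the class name, threshold, and timings are illustrative assumptions, not any real amplifier's firmware), a monitor can treat implausibly fast input changes as a hardware fault rather than acting on each transition:

```python
import time

# Hypothetical guard for a motor-assisted input selector: if the selected
# input changes faster than a human plausibly could switch it, flag the
# selector hardware as faulty instead of following every transition.

MAX_SWITCHES_PER_SECOND = 3  # assumed plausibility threshold

class InputSelectorMonitor:
    def __init__(self, window_seconds=1.0):
        self.window = window_seconds
        self.events = []  # timestamps of observed input changes

    def on_input_change(self, now=None):
        """Record an input change; return True if the rate looks faulty."""
        now = time.monotonic() if now is None else now
        self.events.append(now)
        # keep only events inside the sliding window
        self.events = [t for t in self.events if now - t <= self.window]
        return len(self.events) > MAX_SWITCHES_PER_SECOND * self.window

monitor = InputSelectorMonitor()
# simulate the burst of random switching a faulty motor might produce
for i in range(6):
    faulty = monitor.on_input_change(now=i * 0.1)
print("fault detected:", faulty)  # -> True
```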

Related Research Articles

Artificial intelligence: Intelligence demonstrated by machines

Artificial intelligence (AI) is intelligence demonstrated by machines, unlike the natural intelligence displayed by humans and animals, which involves consciousness and emotionality. The distinction between the former and the latter categories is often revealed by the acronym chosen. 'Strong' AI is usually labelled as AGI while attempts to emulate 'natural' intelligence have been called ABI. Leading AI textbooks define the field as the study of "intelligent agents": any device that perceives its environment and takes actions that maximize its chance of successfully achieving its goals. Colloquially, the term "artificial intelligence" is often used to describe machines that mimic "cognitive" functions that humans associate with the human mind, such as "learning" and "problem solving".

Data integrity is the maintenance of, and the assurance of, data accuracy and consistency over its entire life-cycle, and is a critical aspect of the design, implementation, and usage of any system that stores, processes, or retrieves data. The term is broad in scope and may have widely different meanings depending on the specific context – even under the same general umbrella of computing. It is at times used as a proxy term for data quality, while data validation is a prerequisite for data integrity. Data integrity is the opposite of data corruption. The overall intent of any data integrity technique is the same: ensure data is recorded exactly as intended and, upon later retrieval, that the data is the same as when it was originally recorded. In short, data integrity aims to prevent unintentional changes to information. Data integrity is not to be confused with data security, the discipline of protecting data from unauthorized parties.
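The recorded-versus-retrieved check described above is often implemented with a stored checksum. A minimal sketch using Python's standard hashlib (the record contents are made up for illustration):

```python
import hashlib

# Store a checksum alongside the data at write time, then recompute and
# compare it at read time; a mismatch means the bytes are no longer what
# was originally recorded.

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

record = b"account=42;balance=100.00"
stored_checksum = sha256_hex(record)  # saved with the record at write time

# later, on retrieval:
retrieved = b"account=42;balance=100.00"
if sha256_hex(retrieved) != stored_checksum:
    raise ValueError("data integrity violation: record was altered")
print("record verified")
```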

An error is an action which is inaccurate or incorrect. In some usages, an error is synonymous with a mistake.

Business software is any software or set of computer programs used by business users to perform various business functions. These business applications are used to increase productivity, to measure productivity, and to perform other business functions accurately.

An error-tolerant design is one that does not unduly penalize user or human errors. It is the human equivalent of fault-tolerant design, which allows equipment to continue functioning in the presence of hardware faults, such as a "limp-in" mode for an automobile electronics unit that would be employed if something like the oxygen sensor failed.
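The "limp-in" idea can be sketched in a few lines: when a sensor read fails, degrade to a conservative default instead of halting. Everything below (function names, the fallback value, the mixture formula) is a hypothetical illustration, not real engine-control logic:

```python
FALLBACK_MIXTURE = 0.9  # assumed safe, conservative open-loop setting

def read_oxygen_sensor():
    raise IOError("sensor not responding")  # simulate the hardware fault

def mixture_from_reading(reading):
    return 1.0 - 0.5 * reading  # illustrative closed-loop computation

def compute_mixture():
    try:
        return mixture_from_reading(read_oxygen_sensor())
    except IOError:
        # limp-in mode: keep the engine running on a fixed mixture
        return FALLBACK_MIXTURE

print("mixture:", compute_mixture())  # -> 0.9; the system keeps functioning
```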

Retail loss prevention

Retail loss prevention is a set of practices employed by retail companies to preserve profit. Profit preservation is any business activity specifically designed to reduce preventable losses. A preventable loss is any business cost caused by deliberate or inadvertent human actions, colloquially known as "shrinkage". Loss prevention is mainly found within the retail sector but also can be found within other business environments.

Human reliability is related to the field of human factors and ergonomics, and refers to the reliability of humans in fields including manufacturing, medicine and nuclear power. Human performance can be affected by many factors such as age, state of mind, physical health, attitude, emotions, propensity for certain common mistakes, errors and cognitive biases, etc.

Redundancy (engineering): Duplication of critical components to increase reliability of a system

In engineering, redundancy is the duplication of critical components or functions of a system with the intention of increasing reliability of the system, usually in the form of a backup or fail-safe, or to improve actual system performance, such as in the case of GNSS receivers, or multi-threaded computer processing.
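A classic software rendering of this idea is triple modular redundancy: run three redundant units and majority-vote their outputs, so one faulty unit cannot corrupt the result. The sensor functions and readings below are invented for the sketch:

```python
from collections import Counter

def sensor_a(): return 20.1
def sensor_b(): return 20.1
def sensor_c(): return 87.3  # faulty unit returning garbage

def majority_vote(values):
    # accept the value reported by at least two of the three units
    value, count = Counter(values).most_common(1)[0]
    if count < 2:
        raise RuntimeError("no majority: more than one unit disagrees")
    return value

reading = majority_vote([sensor_a(), sensor_b(), sensor_c()])
print(reading)  # -> 20.1, despite one faulty sensor
```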

Poka-yoke is a Japanese term that means "mistake-proofing" or "inadvertent error prevention". A poka-yoke is any mechanism in a process that helps an equipment operator avoid (yokeru) mistakes (poka) and defects by preventing, correcting, or drawing attention to human errors as they occur. The concept was formalized, and the term adopted, by Shigeo Shingo as part of the Toyota Production System.
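In software, the "preventing" flavor of poka-yoke often amounts to making the invalid action impossible to express rather than detecting it afterwards. A purely illustrative sketch, restricting a setting to known-good values with an enum:

```python
from enum import Enum

class Voltage(Enum):
    V110 = 110
    V220 = 220

def configure_power_supply(voltage: Voltage) -> None:
    # only Voltage members are accepted; a typo like 230 cannot slip through
    if not isinstance(voltage, Voltage):
        raise TypeError("voltage must be a Voltage enum member")
    print(f"configured for {voltage.value} V")

configure_power_supply(Voltage.V220)   # ok
# configure_power_supply(230)          # mistake rejected before it acts
```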

User interface design

User interface (UI) design or user interface engineering is the design of user interfaces for machines and software, such as computers, home appliances, mobile devices, and other electronic devices, with the focus on maximizing usability and the user experience. The goal of user interface design is to make the user's interaction as simple and efficient as possible, in terms of accomplishing user goals.

Pilot error: Decision, action or inaction by a pilot of an aircraft

Pilot error generally refers to an accident in which an action or decision made by the pilot was the cause or a contributing factor that led to the accident, but it also includes the pilot's failure to make a correct decision or take proper action. Errors are intentional actions that fail to achieve their intended outcomes. The Chicago Convention defines an accident as "An occurrence associated with the operation of an aircraft [...] in which [...] a person is fatally or seriously injured [...] except when the injuries are [...] inflicted by other persons." Hence the definition of pilot error does not include deliberate crashes.

User error: Term used jokingly by computer technicians for when the error exists between the keyboard and the chair

A user error is an error made by the human user of a complex system, usually a computer system, in interacting with it. Although the term is sometimes used by human–computer interaction practitioners, the more formal human error term is used in the context of human reliability.

The philosophy of artificial intelligence is a branch of the philosophy of technology that explores artificial intelligence and its implications for knowledge and understanding of intelligence, ethics, consciousness, epistemology, and free will. Furthermore, the technology is concerned with the creation of artificial animals or artificial people, so the discipline is of considerable interest to philosophers. These factors contributed to the emergence of the philosophy of artificial intelligence. Some scholars argue that the AI community's dismissal of philosophy is detrimental.


Human error refers to something having been done that was "not intended by the actor; not desired by a set of rules or an external observer; or that led the task or system outside its acceptable limits". Human error has been cited as a primary cause or contributing factor in disasters and accidents in industries as diverse as nuclear power, aviation, space exploration, and medicine. Prevention of human error is generally seen as a major contributor to the reliability and safety of (complex) systems. Human error is one of the many contributing causes of risk events.

Swiss cheese model

The Swiss cheese model of accident causation is a model used in risk analysis and risk management, including aviation safety, engineering, healthcare, emergency service organizations, and as the principle behind layered security, as used in computer security and defense in depth. It likens human systems to multiple slices of Swiss cheese, stacked side by side, in which the risk of a threat becoming a reality is mitigated by the differing layers and types of defenses which are "layered" behind each other. Therefore, in theory, lapses and weaknesses in one defense do not allow a risk to materialize, since other defenses also exist to prevent a single point of failure. The model was originally formally propounded by Dante Orlandella and James T. Reason of the University of Manchester, and has since gained widespread acceptance. It is sometimes called the "cumulative act effect".
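If the layers are assumed independent (an assumption the model itself qualifies, since holes can line up systematically), the layered-defense argument reduces to simple arithmetic: a hazard becomes an accident only when it penetrates every layer, with probability equal to the product of the per-layer failure probabilities. The numbers below are illustrative:

```python
from math import prod

# three imperfect, assumed-independent defensive layers
layer_failure_probs = [0.1, 0.2, 0.05]

p_accident = prod(layer_failure_probs)
print(f"P(all layers fail) = {p_accident:.4f}")  # 0.0010, far below any single layer
```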

Marmes Rockshelter: United States historic place

The Marmes Rockshelter is an archaeological site first excavated in 1962, near Lyons Ferry Park and the confluence of the Snake and Palouse Rivers, in Franklin County, southeastern Washington. This rockshelter is remarkable for the level of preservation of organic materials, the depth of stratified deposits, and the apparent age of the associated Native American human remains. The site was discovered on the property of Roland Marmes and held the oldest human remains found in North America at that time. In 1966, the site became, along with Chinook Point and the American and English Camps on San Juan Island, one of the first National Historic Landmarks listed in Washington. In 1969, the site was submerged when a levee protecting it from the waters rising behind the newly constructed Lower Monumental Dam, 20 miles (32 km) down the Snake River, failed to hold back water that leaked into the protected area through gravel under the soil, creating Lake Herbert G. West.

Sustainability: Process of maintaining change in a balanced fashion

Sustainability is the ability to exist constantly. In the 21st century, it refers generally to the capacity for Earth's biosphere and human civilization to co-exist. It is also defined as the process of people maintaining change in a homeostasis-balanced environment, in which the exploitation of resources, the direction of investments, the orientation of technological development, and institutional change are all in harmony and enhance both current and future potential to meet human needs and aspirations. For many in the field, sustainability is defined through the following interconnected domains or pillars: environmental, economic and social, which, according to Fritjof Capra, is based on the principles of systems thinking. Sub-domains of sustainable development have also been considered: cultural, technological and political. According to Our Common Future, sustainable development is defined as development that "meets the needs of the present without compromising the ability of future generations to meet their own needs." Sustainable development may be the organizing principle of sustainability, yet others may view the two terms as paradoxical.

Human factors are the physical or cognitive properties of individuals, or social behavior specific to humans, that influence the functioning of technological systems as well as human-environment equilibria. The safety of underwater diving operations can be improved by reducing the frequency of human error and the consequences when it does occur. Human error can be defined as an individual's deviation from acceptable or desirable practice which culminates in undesirable or unexpected results.

Dive safety is primarily a function of four factors: the environment, equipment, individual diver performance and dive team performance. The water is a harsh and alien environment which can impose severe physical and psychological stress on a diver. The remaining factors must be controlled and coordinated so the diver can overcome the stresses imposed by the underwater environment and work safely. Diving equipment is crucial because it provides life support to the diver, but the majority of dive accidents are caused by individual diver panic and an associated degradation of the individual diver's performance. - M.A. Blumenberg, 1996

Automation bias: Propensity for humans to favor suggestions from automated decision-making systems

Automation bias is the propensity for humans to favor suggestions from automated decision-making systems and to ignore contradictory information made without automation, even if it is correct. Automation bias stems from the social psychology literature, which found a bias in human-human interaction showing that people assign more positive evaluations to decisions made by humans than to a neutral object. The same type of positivity bias has been found for human-automation interaction, where automated decisions are rated more positively than neutral ones. This has become a growing problem for decision making as intensive care units, nuclear power plants, and aircraft cockpits have increasingly integrated computerized system monitors and decision aids, largely to factor out possible human error. Errors of automation bias tend to occur when decision-making is dependent on computers or other automated aids and the human is in an observer role but able to make decisions. Examples of automation bias range from urgent matters like flying a plane on automatic pilot to such mundane matters as the use of spell-checking programs.