Mitigating control (financial auditing)

A mitigating control is a type of control used in auditing to discover and prevent mistakes that may lead to uncorrected and/or unrecorded misstatements, generally arising from control deficiencies. [1] For example, a company's financial accounting may fail to record a transaction, and the error may go unnoticed for several reporting periods. A mitigating control would be instrumental in finding, and therefore preventing, such mistakes. If a key control fails and a mitigating control is in place, it may prevent the resulting potential financial statement error from becoming material.
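
The sketch below is only a minimal illustration of the idea, not an implementation of any auditing standard: a hypothetical detective check compares entries recorded in the ledger against an independent source (here, a bank statement) and flags anything unrecorded before it can grow into a material misstatement. The data structures, field names, and materiality threshold are assumptions made for the example.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Transaction:
        ref: str        # external reference, e.g. a bank transaction id (illustrative)
        amount: float   # signed amount in the reporting currency

    def unrecorded_transactions(bank_statement, ledger):
        """Return bank transactions that were never recorded in the ledger."""
        recorded_refs = {t.ref for t in ledger}
        return [t for t in bank_statement if t.ref not in recorded_refs]

    def mitigating_control(bank_statement, ledger, materiality=10_000.0):
        """Flag misstatements the key (recording) control missed.

        Returns the missing items and whether their combined size would be
        material under the assumed threshold.
        """
        missing = unrecorded_transactions(bank_statement, ledger)
        total_misstatement = sum(abs(t.amount) for t in missing)
        return missing, total_misstatement >= materiality

    # Example: one deposit was never recorded by the key control.
    bank = [Transaction("BNK-001", 2_500.0), Transaction("BNK-002", 12_000.0)]
    books = [Transaction("BNK-001", 2_500.0)]
    missing, is_material = mitigating_control(bank, books)
    print(missing, is_material)   # [Transaction(ref='BNK-002', amount=12000.0)] True

In practice a check of this kind would run as part of the period-end close, so the error surfaces within one reporting period rather than going unnoticed for several.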

Related Research Articles

Software bug – Error, flaw, failure, or fault in a computer program or system

A software bug is an error, flaw or fault in a computer program or system that causes it to produce an incorrect or unexpected result, or to behave in unintended ways. The process of finding and fixing bugs is termed "debugging" and often uses formal techniques or tools to pinpoint bugs, and since the 1950s, some computer systems have been designed to also deter, detect or auto-correct various computer bugs during operations.

An error is an action which is inaccurate or incorrect. In some usages, an error is synonymous with a mistake.

In engineering, a fail-safe is a design feature or practice that in the event of a specific type of failure, inherently responds in a way that will cause minimal or no harm to other equipment, to the environment or to people. Unlike inherent safety to a particular hazard, a system being "fail-safe" does not mean that failure is impossible or improbable, but rather that the system's design prevents or mitigates unsafe consequences of the system's failure. That is, if and when a "fail-safe" system fails, it remains at least as safe as it was before the failure. Since many types of failure are possible, failure mode and effects analysis is used to examine failure situations and recommend safety design and procedures.

Safety-critical system – System whose failure or malfunction may result in death, injury or damage to equipment or the environment

A safety-critical system (SCS) or life-critical system is a system whose failure or malfunction may result in one of the following outcomes: death or serious injury to people, loss or severe damage to equipment or property, or environmental harm.

Cascading failure – Systemic risk of failure

A cascading failure is a process in a system of interconnected parts in which the failure of one or few parts can trigger the failure of other parts and so on. Such a failure may happen in many types of systems, including power transmission, computer networking, finance, transportation systems, organisms, the human body, and ecosystems.

Official scorer – Person who records the official record of events in a baseball game

In the game of baseball, the official scorer is a person appointed by the league to record the events on the field, and to send the official scoring record of the game back to the league offices. In addition to recording the events on the field such as the outcome of each plate appearance and the circumstances of any baserunner's advance around the bases, the official scorer is also charged with making judgment calls that do not affect the progress or outcome of the game. Judgment calls are primarily made about errors, unearned runs, fielder's choice, the value of hits in certain situations, and wild pitches, all of which are included in the record compiled. This record is used to compile statistics for each player and team. A box score is a summary of the official scorer's game record.

A medical error is a preventable adverse effect of care ("iatrogenesis"), whether or not it is evident or harmful to the patient. This might include an inaccurate or incomplete diagnosis or treatment of a disease, injury, syndrome, behavior, infection, or other ailment. Globally, an estimated 142,000 people died in 2013 from adverse effects of medical treatment, up from 94,000 in 1990. However, a 2016 study of deaths resulting from medical error placed the yearly death rate in the U.S. alone at 251,454, which suggests that the 2013 global estimate may not be accurate. In line with the high importance of the research area, a 2019 study identified 12,415 scientific publications related to medical errors and outlined frequently researched and impactful themes: errors related to drugs/medications, applications of medical information technology, errors in critical/intensive care units, errors involving children, and mental conditions associated with medical errors.

A typographical error, also called a misprint, is a mistake made in the typing of printed material. Historically, this referred to mistakes in manual type-setting (typography). Technically, the term includes errors due to mechanical failure or slips of the hand or finger, but excludes errors of ignorance, such as spelling errors, or the changing and misuse of words such as "than" and "then". Before the arrival of printing, the "copyist's mistake" or "scribal error" was the equivalent for manuscripts. Most typos involve simple duplication, omission, transposition, or substitution of a small number of characters.

An error-tolerant design is one that does not unduly penalize user or human errors. It is the human equivalent of fault-tolerant design, which allows equipment to continue functioning in the presence of hardware faults, such as a "limp-in" mode for an automobile electronics unit that would be employed if something like the oxygen sensor failed.

Poka-yoke is a Japanese term that means "mistake-proofing" or "inadvertent error prevention". A poka-yoke is any mechanism in a process that helps an equipment operator avoid (yokeru) mistakes (poka) and defects by preventing, correcting, or drawing attention to human errors as they occur. The concept was formalized, and the term adopted, by Shigeo Shingo as part of the Toyota Production System.

Fault tolerance is the property that enables a system to continue operating properly in the event of the failure of, or one or more faults within, some of its components. If its operating quality decreases at all, the decrease is proportional to the severity of the failure, as compared to a naively designed system, in which even a small failure can cause total breakdown. Fault tolerance is particularly sought after in high-availability or life-critical systems. The ability to maintain functionality when portions of a system break down is referred to as graceful degradation.
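
As a hedged illustration of graceful degradation in the same vein as the sketch above, the fragment below prefers a primary (simulated) data source and falls back to a cached value when that component fails, so the system keeps operating at reduced quality rather than breaking down entirely; the function names and values are invented for the example.

    def fetch_rate_live():
        """Primary path: query a live data source (here simulated as a component fault)."""
        raise ConnectionError("live feed unavailable")

    def fetch_rate_cached():
        """Degraded path: return a recent cached value instead of failing outright."""
        return 1.0842  # stale but usable

    def rate_with_graceful_degradation():
        """Prefer the live value; degrade to the cache so operation continues."""
        try:
            return fetch_rate_live(), "live"
        except ConnectionError:
            return fetch_rate_cached(), "cached (degraded)"

    rate, quality = rate_with_graceful_degradation()
    print(rate, quality)   # 1.0842 cached (degraded)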

Pilot error – Decision, action or inaction by a pilot of an aircraft

Pilot error generally refers to an accident in which an action or decision made by the pilot was the cause or a contributing factor, but it also includes the pilot's failure to make a correct decision or take proper action. Errors are intentional actions that fail to achieve their intended outcomes. The Chicago Convention defines an accident as "An occurrence associated with the operation of an aircraft [...] in which [...] a person is fatally or seriously injured [...] except when the injuries are [...] inflicted by other persons." Hence the definition of "pilot error" does not include deliberate crashes.

Internal control, as defined by accounting and auditing, is a process for assuring achievement of an organization's objectives in operational effectiveness and efficiency, reliable financial reporting, and compliance with laws, regulations and policies. A broad concept, internal control involves everything that controls risks to an organization.

Information rights management (IRM) is a subset of digital rights management (DRM): technologies that protect sensitive information from unauthorized access. It is sometimes referred to as E-DRM or Enterprise Digital Rights Management. This can cause confusion, because digital rights management (DRM) technologies are typically associated with business-to-consumer systems designed to protect rich media such as music and video. IRM is a technology which allows information to be ‘remote controlled’.

Professional liability insurance (PLI), also called professional indemnity insurance (PII) but more commonly known as errors & omissions (E&O) in the US, is a form of liability insurance which helps protect professional advice- and service-providing individuals and companies from bearing the full cost of defending against a negligence claim made by a client, and damages awarded in such a civil lawsuit. The coverage focuses on alleged failure to perform on the part of, financial loss caused by, and error or omission in the service or product sold by the policyholder. These are causes for legal action that would not be covered by a more general liability insurance policy which addresses more direct forms of harm. Professional liability insurance may take on different forms and names depending on the profession, especially medical and legal, and is sometimes required under contract by other businesses that are the beneficiaries of the advice or service.

An emergency procedure is a plan of actions to be conducted in a certain order or manner, in response to a specific class of reasonably foreseeable emergency, a situation that poses an immediate risk to health, life, property, or the environment. Where a range of emergencies are reasonably foreseeable, an emergency plan may be drawn up to manage each threat. Most emergencies require urgent intervention to prevent a worsening of the situation, although in some situations, mitigation may not be possible and agencies may only be able to offer palliative care for the aftermath. The emergency plan should allow for these possibilities.

Reconciliation (accounting)

In accounting, reconciliation is the process of ensuring that two sets of records are in agreement. Reconciliation is used to ensure that the money leaving an account matches the actual money spent. This is done by making sure the balances match at the end of a particular accounting period.
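
The following is another illustrative Python fragment with invented figures, showing the balance-matching step described above: the recorded movements in an account are totalled, the expected closing balance is compared with the independently reported one for the period, and any unexplained difference is left to investigate.

    def reconcile(opening_balance, recorded_movements, reported_closing_balance):
        """Return the expected closing balance and any unexplained difference."""
        expected_closing = opening_balance + sum(recorded_movements)
        difference = reported_closing_balance - expected_closing
        return expected_closing, difference

    # Invented figures: the bank reports 150.00 more than the books explain.
    expected, diff = reconcile(
        opening_balance=10_000.00,
        recorded_movements=[+2_500.00, -1_200.00],   # deposits and payments in the books
        reported_closing_balance=11_450.00,          # closing balance per bank statement
    )
    print(expected, diff)   # 11300.0 150.0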

The Fat Tail – 2009 book by Ian Bremmer

The Fat Tail: The Power of Political Knowledge for Strategic Investing is a book by political scientists Ian Bremmer and Preston Keat. Bremmer and Keat are President and Research Director, respectively, of Eurasia Group, a global political risk consultancy.

Human factors are the physical or cognitive properties of individuals, or social behaviors specific to humans, that influence the functioning of technological systems as well as human-environment equilibria. The safety of underwater diving operations can be improved by reducing the frequency of human error and the consequences when it does occur. Human error can be defined as an individual's deviation from acceptable or desirable practice which culminates in undesirable or unexpected results.

Dive safety is primarily a function of four factors: the environment, equipment, individual diver performance and dive team performance. The water is a harsh and alien environment which can impose severe physical and psychological stress on a diver. The remaining factors must be controlled and coordinated so the diver can overcome the stresses imposed by the underwater environment and work safely. Diving equipment is crucial because it provides life support to the diver, but the majority of dive accidents are caused by individual diver panic and an associated degradation of the individual diver's performance. - M.A. Blumenberg, 1996

Process risk is considered a sub-component of operational risk. It exists when the process that supports a business activity lacks both efficiency and effectiveness, which may then lead to financial, customer, and reputational loss. This form of risk may be present at any stage of a business transaction. For instance, an error in pricing may show up as a loss in sales revenue, while a disruption in the fulfillment process may cause financial losses in terms of production quality and customer relationships. The majority of operational risk events occur due to losses from ineffective processing of business transactions or process management, and from inadequate relations with trade counterparties and vendors.

References