Dummy data

In informatics, dummy data is benign information that contains no useful content but serves to reserve space where real data would nominally be present. Dummy data can be used as a placeholder for both testing and operational purposes. In testing, dummy data can serve as stubs or padding, avoiding software testing issues by ensuring that all variables and data fields are occupied. In operational use, dummy data may be transmitted for operations security (OPSEC) purposes. Dummy data must be rigorously evaluated and documented to ensure that it does not cause unintended effects.
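
As an illustration, the following Python sketch (the schema, field values, and save_profile routine are all hypothetical) shows dummy records occupying every required field so that validation logic can be exercised without touching real data:

    # Hypothetical dummy record: every required field of a user-profile
    # schema is occupied with obviously fake, reserved values.
    DUMMY_USER = {
        "name": "Test User",
        "email": "nobody@example.com",  # example.com is reserved by RFC 2606
        "phone": "555-0100",            # reserved fictional number range
        "ssn": "000-00-0000",           # deliberately invalid, never a real SSN
    }

    def save_profile(profile: dict) -> None:
        # Stand-in for the real persistence layer under test.
        missing = [k for k in ("name", "email", "phone", "ssn") if not profile.get(k)]
        if missing:
            raise ValueError(f"missing fields: {missing}")

    # Because every field is populated, code paths that require non-empty
    # input can run end to end without any real data being present.
    save_profile(DUMMY_USER)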

Related Research Articles

<span class="mw-page-title-main">Acceptance testing</span> Test to determine if the requirements of a specification or contract are met

In engineering and its various subdisciplines, acceptance testing is a test conducted to determine if the requirements of a specification or contract are met. It may involve chemical tests, physical tests, or performance tests.

<span class="mw-page-title-main">Software testing</span> Checking software against a standard

Software testing is the act of checking whether software satisfies expectations.
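
For example, a minimal check against an expectation can be written with Python's standard unittest framework; the add function here is a stand-in for real code under test:

    import unittest

    def add(a, b):
        # Trivial code under test.
        return a + b

    class TestAdd(unittest.TestCase):
        def test_add_two_positive_numbers(self):
            # The expectation: add(2, 3) should return 5.
            self.assertEqual(add(2, 3), 5)

    if __name__ == "__main__":
        unittest.main()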

<span class="mw-page-title-main">Configuration management</span> Process for maintaining consistency of a product attributes with its design

Configuration management (CM) is a systems engineering process for establishing and maintaining consistency of a product's performance, functional, and physical attributes with its requirements, design, and operational information throughout its life. The CM process is widely used by military engineering organizations to manage changes throughout the system lifecycle of complex systems, such as weapon systems, military vehicles, and information systems. Outside the military, the CM process is also used with IT service management as defined by ITIL, and with other domain models in the civil engineering and other industrial engineering segments such as roads, bridges, canals, dams, and buildings.

<span class="mw-page-title-main">Crash test dummy</span> Full-scale anthropomorphic test devices that simulate human bodies in vehicle crash testing

A crash test dummy, or simply dummy, is a full-scale anthropomorphic test device (ATD) that simulates the dimensions, weight proportions and articulation of the human body during a traffic collision. Dummies are used by researchers and by automobile and aircraft manufacturers to predict the injuries a person might sustain in a crash. Modern dummies are usually instrumented to record data such as velocity of impact, crushing force, bending, folding, or torque of the body, and deceleration rates during a collision.

In computing, stress testing can be applied to either hardware or software. It is used to determine the maximum capability of a computer system and is often used for purposes such as scaling for production use and ensuring reliability and stability. Stress tests typically involve running a large number of resource-intensive processes until the system either crashes or nearly does so.
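
As a toy illustration (not a production stress tool), the following Python sketch saturates every CPU core with busy work for a short, arbitrary interval while the operator watches for instability; real tools such as stress-ng also exercise memory, disk, and I/O:

    import multiprocessing as mp
    import time

    def burn(seconds: float) -> None:
        # Tight arithmetic loop keeps one core near 100% utilization.
        end = time.time() + seconds
        x = 0
        while time.time() < end:
            x = (x * 31 + 7) % 1_000_003

    if __name__ == "__main__":
        duration = 10.0  # short run for illustration; real stress tests run far longer
        workers = [mp.Process(target=burn, args=(duration,)) for _ in range(mp.cpu_count())]
        for w in workers:
            w.start()
        for w in workers:
            w.join()
        print("stress interval completed without a crash")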

<span class="mw-page-title-main">Crash test</span> Form of destructive testing

A crash test is a form of destructive testing usually performed in order to ensure safe design standards in crashworthiness and crash compatibility for various modes of transportation or related systems and components.

Test-driven development (TDD) is a way of writing code that involves writing an automated unit-level test case that fails, then writing just enough code to make the test pass, then refactoring both the test code and the production code, then repeating with another new test case.
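
In practice, that loop might look like the following Python sketch, in which a hypothetical fizzbuzz function is grown one failing test at a time:

    import unittest

    def fizzbuzz(n: int) -> str:
        # Just enough code to make the current tests pass.
        if n % 15 == 0:
            return "FizzBuzz"
        if n % 3 == 0:
            return "Fizz"
        if n % 5 == 0:
            return "Buzz"
        return str(n)

    class TestFizzBuzz(unittest.TestCase):
        def test_multiples_of_three(self):
            self.assertEqual(fizzbuzz(3), "Fizz")

        def test_multiples_of_five(self):
            self.assertEqual(fizzbuzz(5), "Buzz")

        def test_multiples_of_both(self):
            # The "new test case" of a later iteration: it failed first,
            # which drove the n % 15 branch above.
            self.assertEqual(fizzbuzz(15), "FizzBuzz")

    if __name__ == "__main__":
        unittest.main()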

<span class="mw-page-title-main">Systems development life cycle</span> Systems engineering terms

In systems engineering, information systems and software engineering, the systems development life cycle (SDLC), also referred to as the application development life cycle, is a process for planning, creating, testing, and deploying an information system. The SDLC concept applies to a range of hardware and software configurations, as a system can be composed of hardware only, software only, or a combination of both. There are usually six stages in this cycle: requirement analysis, design, development and testing, implementation, documentation, and evaluation.

In software project management, software testing, and software engineering, verification and validation is the process of checking that a software system meets specifications and requirements so that it fulfills its intended purpose. It may also be referred to as software quality control. It is normally the responsibility of software testers as part of the software development lifecycle. In simple terms, software verification asks: "Assuming we should build X, does our software achieve its goals without any bugs or gaps?" Software validation, on the other hand, asks: "Was X what we should have built? Does X meet the high-level requirements?"

<span class="mw-page-title-main">Dummy load</span> Device used to simulate an electrical load

A dummy load is a device used to simulate an electrical load, usually for testing purposes. In radio a dummy antenna is connected to the output of a radio transmitter and electrically simulates an antenna, to allow the transmitter to be adjusted and tested without radiating radio waves. In audio systems, a dummy load is connected to the output of an amplifier to electrically simulate a loudspeaker, allowing the amplifier to be tested without producing sound. Load banks are connected to electrical power supplies to simulate the supply's intended electrical load for testing purposes.

<span class="mw-page-title-main">Department of Defense Architecture Framework</span> Enterprise architecture framework

The Department of Defense Architecture Framework (DoDAF) is an architecture framework for the United States Department of Defense (DoD) that provides visualization infrastructure for specific stakeholders' concerns through viewpoints organized by various views. These views are artifacts for visualizing, understanding, and assimilating the broad scope and complexities of an architecture description through tabular, structural, behavioral, ontological, pictorial, temporal, graphical, probabilistic, or alternative conceptual means. The current release is DoDAF 2.02.

A mock object is an object that imitates a production object in limited ways.
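
As a rough illustration, the following Python sketch hand-rolls a mock for a hypothetical payment gateway: it imitates the production object's interface only enough to record calls and return a canned result, so a test can run without real network traffic (Python's standard unittest.mock module automates this pattern):

    class MockPaymentGateway:
        """Imitates the real gateway in limited ways: no network calls."""
        def __init__(self):
            self.charges = []  # record every call for later assertions

        def charge(self, amount_cents: int) -> bool:
            self.charges.append(amount_cents)
            return True  # canned success; the real object would call out

    def checkout(gateway, amount_cents: int) -> str:
        # Code under test, which only depends on the gateway's interface.
        return "paid" if gateway.charge(amount_cents) else "declined"

    gateway = MockPaymentGateway()
    assert checkout(gateway, 1999) == "paid"
    assert gateway.charges == [1999]  # verify the interaction, not just the result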

Environmental stress screening (ESS) refers to the process of exposing a newly manufactured or repaired product or component to stresses such as thermal cycling and vibration in order to force latent defects to manifest themselves by permanent or catastrophic failure during the screening process. The surviving population, upon completion of screening, can be assumed to have a higher reliability than a similar unscreened population.

<span class="mw-page-title-main">Live fire exercise</span> Military exercise using live munitions

A live fire exercise (LFX) is a military exercise in which live ammunition and ordnance are used, as opposed to blanks or dummies. The term can also be found in non-military usage.

DMAIC (define, measure, analyze, improve, and control) refers to a data-driven improvement cycle used for optimizing and stabilizing business processes and designs. The DMAIC improvement cycle is the core tool used to drive Six Sigma projects. However, DMAIC is not exclusive to Six Sigma and can be used as the framework for other improvement applications.

<span class="mw-page-title-main">Snap cap</span> Firearm accessory device

A snap cap is a firearm accessory device shaped like a standard cartridge or shotshell but containing none of the functional components, namely the primer, propellant (gunpowder), and projectile. It serves the same purpose as a dummy round but differs in that a dummy is usually modified from a real cartridge with its propellant and primer removed, while a snap cap has a monolithic outer shell and is specifically designed to be a fake cartridge from the very beginning.

An integrated test facility (ITF) creates a fictitious entity in a database to process test transactions simultaneously with live input.
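
The following Python sketch illustrates the idea with an in-memory SQLite database; the table, column, and account names are invented for this example:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE accounts (id TEXT PRIMARY KEY, is_test INTEGER, balance INTEGER)")
    conn.execute("INSERT INTO accounts VALUES ('ACME-REAL-01', 0, 1000)")
    conn.execute("INSERT INTO accounts VALUES ('ITF-TEST-01', 1, 0)")  # fictitious entity

    def post_transaction(account_id: str, amount: int) -> None:
        # Live and test transactions flow through the same code path.
        conn.execute("UPDATE accounts SET balance = balance + ? WHERE id = ?",
                     (amount, account_id))

    post_transaction("ACME-REAL-01", 250)  # live input
    post_transaction("ITF-TEST-01", 999)   # simultaneous test transaction

    # Reporting filters out the fictitious entity, so test activity
    # never contaminates live figures.
    total = conn.execute("SELECT SUM(balance) FROM accounts WHERE is_test = 0").fetchone()[0]
    print(total)  # 1250: the test transaction is invisible to live reporting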

Performance engineering encompasses the techniques applied during a systems development life cycle to ensure the non-functional requirements for performance will be met. It may be alternatively referred to as systems performance engineering within systems engineering, and software performance engineering or application performance engineering within software engineering.

Software Quality Management (SQM) is a management process that aims to develop and manage the quality of software so as to best ensure that the product meets the quality standards expected by the customer while also meeting any applicable regulatory and developer requirements. Software quality managers require software to be tested before it is released to the market, and they do this using a cyclical process-based quality assessment in order to reveal and fix bugs before release. Their job is not only to ensure their software is in good shape for the consumer but also to encourage a culture of quality throughout the enterprise.

Test and evaluation master plan (TEMP) is a critical aspect of project management involving complex systems that must satisfy specification requirements. The TEMP is used to support programmatic events called milestone decisions that separate the individual phases of a project. For military systems, the level of funding determines the Acquisition Category and the organization responsible for the milestone decision.