In computer programming, undefined behavior (UB) is the result of executing a program whose behavior is prescribed as unpredictable by the specification of the programming language in which the source code is written. This is distinct from unspecified behavior, for which the language specification does not prescribe a result, and from implementation-defined behavior, which defers to the documentation of another component of the platform (such as the ABI or the translator documentation).
In the C programming community, undefined behavior may be humorously referred to as "nasal demons", after a comp.std.c post that explained undefined behavior as allowing the compiler to do anything it chooses, even "to make demons fly out of your nose". [1]
Some programming languages allow a compiled program to operate differently, or even to follow a different control flow than the source code specifies, as long as it exhibits the same user-visible side effects, provided that undefined behavior never happens during program execution. Undefined behavior is the name given to a list of conditions that the program must never meet.
In the early versions of C, the primary advantage of undefined behavior was that it enabled performant compilers for a wide variety of machines: a specific construct could be mapped directly to a machine-specific feature, and the compiler did not have to emit additional runtime code to adapt the side effects to the semantics imposed by the language. The program source code was written with prior knowledge of the specific compiler and of the platforms that it would support.
However, progressive standardization of platforms has made this less of an advantage, especially in newer versions of C. Now, the cases of undefined behavior typically represent unambiguous bugs in the code, such as indexing an array outside of its bounds. By definition, the runtime can assume that undefined behavior never happens; therefore, some invalid conditions do not need to be checked for. For a compiler, this also means that various program transformations become valid, or their proofs of correctness are simplified; this allows for various kinds of optimizations whose correctness depends on the assumption that the program state never meets any such condition. The compiler can also remove explicit checks that were present in the source code, without notifying the programmer; for example, detecting undefined behavior by testing whether it has happened is, by definition, not guaranteed to work. This makes it hard or impossible to program a portable fail-safe option (non-portable solutions are possible for some constructs).
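For example (a minimal sketch, not taken from the standard or the cited sources, assuming a typical 32-bit int): the post-hoc overflow test below itself relies on signed overflow, which is undefined behavior, so a compiler may legally assume the condition is always false and remove the branch together with the call to abort().

#include <stdlib.h>

int add_100_checked(int x)
{
    if (x + 100 < x)   /* intended overflow check, but x + 100 has already overflowed */
        abort();       /* a compiler may delete this branch entirely */
    return x + 100;
}

A portable version has to test the operands before performing the addition instead.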
Current compiler development usually evaluates and compares compiler performance with benchmarks designed around micro-optimizations, even for platforms that are mostly used in the general-purpose desktop and laptop market (such as amd64). Undefined behavior therefore provides ample room for compiler performance improvement, since a specific source code statement is allowed to be mapped to anything at runtime.
For C and C++, the compiler is allowed to give a compile-time diagnostic in these cases, but is not required to: the implementation is considered correct whatever it does in such cases, analogous to don't-care terms in digital logic. It is the responsibility of the programmer to write code that never invokes undefined behavior, although compiler implementations are allowed to issue diagnostics when this happens. Compilers nowadays have flags that enable such diagnostics; for example, -fsanitize=undefined enables the "undefined behavior sanitizer" (UBSan) in GCC 4.9 [2] and in Clang. However, this flag is not the default, and enabling it is a choice of the person who builds the code.
Under some circumstances there can be specific restrictions on undefined behavior. For example, the instruction set specifications of a CPU might leave the behavior of some forms of an instruction undefined, but if the CPU supports memory protection then the specification will probably include a blanket rule stating that no user-accessible instruction may cause a hole in the operating system's security; so an actual CPU would be permitted to corrupt user registers in response to such an instruction, but would not be allowed to, for example, switch into supervisor mode.
The runtime platform can also provide some restrictions or guarantees on undefined behavior, if the toolchain or the runtime explicitly document that specific constructs found in the source code are mapped to specific well-defined mechanisms available at runtime. For example, an interpreter may document a particular behavior for some operations that are undefined in the language specification, while other interpreters or compilers for the same language may not. A compiler produces executable code for a specific ABI, filling the semantic gap in ways that depend on the compiler version: the documentation for that compiler version and the ABI specification can provide restrictions on undefined behavior. Relying on these implementation details makes the software non-portable, but portability may not be a concern if the software is not supposed to be used outside of a specific runtime.
Undefined behavior can result in a program crash or even in failures that are harder to detect and make the program look like it is working normally, such as silent loss of data and production of incorrect results.
Documenting an operation as undefined behavior allows compilers to assume that this operation will never happen in a conforming program. This gives the compiler more information about the code, which can lead to more optimization opportunities.
An example for the C language:
int foo(unsigned char x)
{
    int value = 2147483600; /* assuming 32-bit int and 8-bit char */
    value += x;
    if (value < 2147483600)
        bar();
    return value;
}
The value of x cannot be negative and, given that signed integer overflow is undefined behavior in C, the compiler can assume that value < 2147483600 will always be false. Thus the if statement, including the call to the function bar, can be ignored by the compiler, since the test expression in the if has no side effects and its condition will never be satisfied. The code is therefore semantically equivalent to:
int foo(unsigned char x)
{
    int value = 2147483600;
    value += x;
    return value;
}
Had the compiler been forced to assume that signed integer overflow has wraparound behavior, then the transformation above would not have been legal.
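For completeness, a sketch (not part of the original example) of a check that does not rely on overflow: testing the operands before the addition keeps the code well defined, so the compiler cannot remove the call to bar().

#include <limits.h>

void bar(void);

int foo_checked(unsigned char x)
{
    int value = 2147483600;
    if (x > INT_MAX - value) {  /* would value + x exceed INT_MAX? */
        bar();                  /* this branch cannot be optimized away */
        return INT_MAX;         /* illustrative: saturate instead of overflowing */
    }
    return value + x;
}

Alternatively, some compilers provide a non-standard mode, such as GCC's and Clang's -fwrapv flag, that defines signed overflow as wrapping; under such a mode the transformation above would likewise be disallowed, at the cost of portability.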
Such optimizations become hard for humans to spot when the code is more complex and other optimizations, like inlining, take place. For example, another function may call the above function:
void run_tasks(unsigned char *ptrx)
{
    int z;
    z = foo(*ptrx);
    while (*ptrx > 60) {
        run_one_task(ptrx, z);
    }
}
The compiler is free to optimize away the while loop here by applying value range analysis: by inspecting foo(), it knows that the initial value pointed to by ptrx cannot possibly exceed 47 (as any more would trigger undefined behavior in foo()); therefore, the initial check of *ptrx > 60 will always be false in a conforming program. Going further, since the result z is now never used and foo() has no side effects, the compiler can optimize run_tasks() to be an empty function that returns immediately. The disappearance of the while loop may be especially surprising if foo() is defined in a separately compiled object file.
Another benefit from allowing signed integer overflow to be undefined is that it makes it possible to store and manipulate a variable's value in a processor register that is larger than the size of the variable in the source code. For example, if the type of a variable as specified in the source code is narrower than the native register width (such as int on a 64-bit machine, a common scenario), then the compiler can safely use a signed 64-bit integer for the variable in the machine code it produces, without changing the defined behavior of the code. If a program depended on the behavior of a 32-bit integer overflow, then a compiler would have to insert additional logic when compiling for a 64-bit machine, because the overflow behavior of most machine instructions depends on the register width. [3]
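A common illustration (a sketch, not taken from the cited reference): because overflow of the int loop counter below is undefined, a compiler targeting a 64-bit machine may keep i in a full-width register and fold it directly into the address arithmetic for a[i], instead of simulating 32-bit wraparound on every iteration.

/* Sums n elements of an array supplied by the caller. */
long long sum(const int *a, int n)
{
    long long total = 0;
    for (int i = 0; i < n; ++i)   /* i may be widened to a 64-bit register */
        total += a[i];
    return total;
}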
Undefined behavior also allows more compile-time checks by both compilers and static program analysis tools. [citation needed]
C and C++ standards have several forms of undefined behavior throughout, which offer increased liberty in compiler implementations and compile-time checks at the expense of undefined run-time behavior if present. In particular, the ISO standard for C has an appendix listing common sources of undefined behavior. [4] Moreover, compilers are not required to diagnose code that relies on undefined behavior. Hence, it is common for programmers, even experienced ones, to rely on undefined behavior either by mistake or simply because they are not well-versed in the rules of the language, which can span hundreds of pages. This can result in bugs that are exposed when a different compiler or different settings are used. Testing or fuzzing with dynamic undefined behavior checks enabled, e.g., the Clang sanitizers, can help to catch undefined behavior not diagnosed by the compiler or static analyzers. [5]
Undefined behavior can lead to security vulnerabilities in software. For example, buffer overflows and other security vulnerabilities in the major web browsers are due to undefined behavior. When GCC's developers changed their compiler in 2008 such that it omitted certain overflow checks that relied on undefined behavior, CERT issued a warning against the newer versions of the compiler. [6] Linux Weekly News pointed out that the same behavior was observed in PathScale C, Microsoft Visual C++ 2005 and several other compilers; [7] the warning was later amended to warn about various compilers. [8]
The major forms of undefined behavior in C can be broadly classified as: [9] spatial memory safety violations, temporal memory safety violations, integer overflow, strict aliasing violations, alignment violations, unsequenced modifications, data races, and loops that neither perform I/O nor terminate.
In C, the use of any automatic variable before it has been initialized yields undefined behavior, as does integer division by zero, signed integer overflow, indexing an array outside of its defined bounds (see buffer overflow), or null pointer dereferencing. In general, any instance of undefined behavior leaves the abstract execution machine in an unknown state and causes the behavior of the entire program to be undefined.
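As a short sketch of the first case (illustrative, not one of the cited examples), reading an automatic variable before it has been initialized:

int count_events(void)
{
    int total;      /* automatic variable, never initialized */
    total += 1;     /* undefined behavior: reads an indeterminate value */
    return total;
}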
Attempting to modify a string literal causes undefined behavior: [10]
char*p="wikipedia";// valid C, deprecated in C++98/C++03, ill-formed as of C++11p[0]='W';// undefined behavior
Integer division by zero results in undefined behavior: [11]
int x = 1;
return x / 0; // undefined behavior
Certain pointer operations may result in undefined behavior: [12]
int arr[4] = {0, 1, 2, 3};
int *p = arr + 5; // undefined behavior for indexing out of bounds
p = NULL;
int a = *p;       // undefined behavior for dereferencing a null pointer
In C and C++, the relational comparison of pointers to objects (for less-than or greater-than comparison) is only strictly defined if the pointers point to members of the same object, or elements of the same array. [13] Example:
int main(void)
{
    int a = 0;
    int b = 0;
    return &a < &b; /* undefined behavior */
}
Reaching the end of a value-returning function (other than main()) without a return statement results in undefined behavior if the value of the function call is used by the caller: [14]
int f() {} /* undefined behavior if the value of the function call is used */
Modifying an object between two sequence points more than once produces undefined behavior. [15] There are considerable changes in what causes undefined behavior in relation to sequence points as of C++11. [16] Modern compilers can emit warnings when they encounter multiple unsequenced modifications to the same object. [17] [18] The following example will cause undefined behavior in both C and C++.
int f(int i)
{
    return i++ + i++; /* undefined behavior: two unsequenced modifications to i */
}
When modifying an object between two sequence points, reading the value of the object for any other purpose than determining the value to be stored is also undefined behavior. [19]
a[i] = i++;                          // undefined behavior
printf("%d %d\n", ++n, power(2, n)); // also undefined behavior
In C/C++, bitwise shifting a value by a number of bits that is either negative or greater than or equal to the total number of bits in this value results in undefined behavior. The safest way (regardless of compiler vendor) is to always keep the number of bits to shift (the right operand of the << and >> bitwise operators) within the range [0, sizeof value * CHAR_BIT - 1] (where value is the left operand).
intnum=-1;unsignedintval=1<<num;// shifting by a negative number - undefined behaviornum=32;// or whatever number greater than 31val=1<<num;// the literal '1' is typed as a 32-bit integer - in this case shifting by more than 31 bits is undefined behaviornum=64;// or whatever number greater than 63unsignedlonglongval2=1ULL<<num;// the literal '1ULL' is typed as a 64-bit integer - in this case shifting by more than 63 bits is undefined behavior
While undefined behavior is never present in safe Rust, it is possible to invoke undefined behavior in unsafe Rust in many ways. [20] For example, creating an invalid reference (a reference which does not refer to a valid value) invokes immediate undefined behavior:
fn main() {
    // The following line invokes immediate undefined behaviour.
    let _null_reference: &i32 = unsafe { std::mem::zeroed() };
}
It is not necessary to use the reference; undefined behavior is invoked merely from the creation of such a reference.