TEMPEST (Telecommunications Electronics Materials Protected from Emanating Spurious Transmissions [1] ) is a U.S. National Security Agency specification and a NATO certification [2] [3] referring to spying on information systems through leaking emanations, including unintentional radio or electrical signals, sounds, and vibrations. [4] [5] TEMPEST covers both methods to spy upon others and how to shield equipment against such spying. The protection efforts are also known as emission security (EMSEC), which is a subset of communications security (COMSEC). [6] The reception methods fall under the umbrella of radiofrequency MASINT.
The NSA methods for spying on computer emissions are classified, but some of the protection standards have been released by either the NSA or the Department of Defense. [7] Protecting equipment from spying is done with distance, shielding, filtering, and masking. [8] The TEMPEST standards mandate elements such as equipment distance from walls, amount of shielding in buildings and equipment, and distance separating wires carrying classified vs. unclassified materials, [7] filters on cables, and even distance and shielding between wires or equipment and building pipes. Noise can also protect information by masking the actual data. [8]
While much of TEMPEST is about leaking electromagnetic emanations, it also encompasses sounds and mechanical vibrations. [7] For example, it is possible to log a user's keystrokes using the motion sensor inside smartphones. [9] Compromising emissions are defined as unintentional intelligence-bearing signals which, if intercepted and analyzed (side-channel attack), may disclose the information transmitted, received, handled, or otherwise processed by any information-processing equipment. [10]
During World War II, the Bell System supplied the U.S. military with the 131-B2 mixer device, which encrypted teleprinter signals by XOR'ing them with key material from one-time tapes (the SIGTOT system) or, earlier, a rotor-based key generator called SIGCUM. It used electromechanical relays in its operation. Later, Bell informed the Signal Corps that they were able to detect electromagnetic spikes at a distance from the mixer and recover the plain text. Meeting skepticism over whether the phenomenon they discovered in the laboratory could really be dangerous, they demonstrated their ability to recover plain text from the Signal Corps' crypto center on Varick Street in Lower Manhattan. Now alarmed, the Signal Corps asked Bell to investigate further. Bell identified three problem areas: radiated signals, signals conducted on wires extending from the facility, and magnetic fields. As possible solutions, they suggested shielding, filtering, and masking.
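The mixing operation at the heart of SIGTOT is easy to illustrate. The sketch below is a simplified, assumption-based example rather than the 131-B2's actual relay implementation (which operated on 5-bit teleprinter code); it only shows how XOR'ing plaintext with one-time key material encrypts, and how applying the same tape again decrypts.

```python
# Minimal sketch of one-time-tape XOR mixing (illustrative only; the real
# 131-B2 mixer used electromechanical relays on 5-bit teleprinter code).
import secrets

def xor_mix(data: bytes, key_tape: bytes) -> bytes:
    """XOR each data byte with the corresponding key-tape byte."""
    if len(key_tape) < len(data):
        raise ValueError("one-time tape must be at least as long as the message")
    return bytes(d ^ k for d, k in zip(data, key_tape))

plaintext = b"ATTACK AT DAWN"
tape = secrets.token_bytes(len(plaintext))        # one-time key material
ciphertext = xor_mix(plaintext, tape)
assert xor_mix(ciphertext, tape) == plaintext     # the same operation decrypts
```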
Bell developed a modified mixer, the 131-A1 with shielding and filtering, but it proved difficult to maintain and too expensive to deploy. Instead, relevant commanders were warned of the problem and advised to control a 100-foot (30 m)-diameter zone around their communications center to prevent covert interception, and things were left at that. Then in 1951, the CIA rediscovered the problem with the 131-B2 mixer and found they could recover plain text off the line carrying the encrypted signal from a quarter mile away. Filters for signal and power lines were developed, and the recommended control-perimeter radius was extended to 200 feet (61 m), based more on what commanders could be expected to accomplish than any technical criteria.
A long process of evaluating systems and developing possible solutions followed. Other compromising effects were discovered, such as fluctuations in the power line as rotors stepped. The question of exploiting the noise of electromechanical encryption systems had been raised in the late 1940s but was re-evaluated now as a possible threat. Acoustical emanations could reveal plain text, but only if the pick-up device was close to the source. Nevertheless, even mediocre microphones would do. Soundproofing the room made the problem worse by removing reflections and providing a cleaner signal to the recorder.
In 1956, the Naval Research Laboratory developed a better mixer that operated at much lower voltages and currents and therefore radiated far less. It was incorporated in newer NSA encryption systems. However, many users needed the higher signal levels to drive teleprinters at greater distances or where multiple teleprinters were connected, so the newer encryption devices included the option to switch the signal back up to the higher strength. The NSA began developing techniques and specifications for isolating sensitive communications pathways through filtering, shielding, grounding, and physical separation: lines that carried sensitive plain text were kept apart from those intended to carry only non-sensitive data, the latter often extending outside of the secure environment. This separation effort became known as the Red/Black Concept. A 1958 joint policy called NAG-1 set radiation standards for equipment and installations based on a 50 ft (15 m) limit of control. It also specified the classification levels of various aspects of the TEMPEST problem. The policy was adopted by Canada and the UK the next year. Six organizations (the Navy, Army, Air Force, NSA, CIA, and the State Department) were to provide the bulk of the effort for its implementation.
Difficulties quickly emerged. Computerization was becoming important to processing intelligence data, and computers and their peripherals had to be evaluated; many of them proved to have vulnerabilities. The Friden Flexowriter, a popular I/O typewriter at the time, proved to be among the strongest emitters, readable at distances up to 3,200 ft (0.98 km) in field tests. The U.S. Communications Security Board (USCSB) produced a Flexowriter Policy that banned its use overseas for classified information and limited its use within the U.S. to the Confidential level, and then only within a 400 ft (120 m) security zone, but users found the policy onerous and impractical. Later, the NSA found similar problems with the introduction of cathode-ray-tube displays (CRTs), which were also powerful radiators.
There was a multiyear process of moving from policy recommendations to more strictly enforced TEMPEST rules. The resulting Directive 5200.19, coordinated with 22 separate agencies, was signed by Secretary of Defense Robert McNamara in December 1964, but still took months to fully implement. The NSA's formal implementation took effect in June 1966.
Meanwhile, the problem of acoustic emanations became more critical with the discovery of some 900 microphones in U.S. installations overseas, most behind the Iron Curtain. The response was to build room-within-a-room enclosures, some transparent, nicknamed "fish bowls". Other units[clarification needed] were fully shielded[clarification needed] to contain electronic emanations, but were unpopular with the personnel who were supposed to work inside; they called the enclosures "meat lockers", and sometimes just left their doors open. Nonetheless, they were installed in critical locations, such as the embassy in Moscow, where two were installed: one for State Department use and one for military attachés. A unit installed at the NSA for its key-generation equipment cost $134,000.
TEMPEST standards continued to evolve in the 1970s and later, with newer testing methods and more nuanced guidelines that took account of the risks in specific locations and situations. [11]: Vol I, Ch. 10  During the 1980s, security needs were often met with resistance. According to NSA's David G. Boak, "Some of what we still hear today in our own circles, when rigorous technical standards are whittled down in the interest of money and time, are frighteningly reminiscent of the arrogant Third Reich with their Enigma cryptomachine." [11]: 19
Many specifics of the TEMPEST standards are classified, but some elements are public. Current United States and NATO Tempest standards define three levels of protection requirements: [12]
Additional standards include:
The NSA and Department of Defense have declassified some TEMPEST elements after Freedom of Information Act requests, but the documents black out many key values and descriptions. The declassified version of the TEMPEST test standard is heavily redacted, with emanation limits and test procedures blacked out.[citation needed] [13] A redacted version of the introductory Tempest handbook NACSIM 5000 was publicly released in December 2000. Additionally, the current NATO standard SDIP-27 (before 2006 known as AMSG 720B, AMSG 788A, and AMSG 784) is still classified.
Despite this, some declassified documents give information on the shielding required by TEMPEST standards. For example, Military Handbook 1195 includes the chart at the right, showing electromagnetic shielding requirements at different frequencies. A declassified NSA specification for shielded enclosures offers similar shielding values, requiring "a minimum of 100 dB insertion loss from 1 kHz to 10 GHz." [14] Since many of the current requirements are still classified, there are no publicly available correlations between this 100 dB shielding requirement and the newer zone-based shielding standards.
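For a sense of scale, insertion loss in decibels maps directly to a power attenuation factor; the short illustration below is a generic decibel conversion, not a calculation taken from the specification itself.

```python
# Converting an insertion-loss figure in decibels to a power attenuation factor.
# 100 dB corresponds to reducing transmitted power by a factor of 10**10.
def db_to_power_ratio(db: float) -> float:
    return 10 ** (db / 10)

print(db_to_power_ratio(100))  # 1e10: only one ten-billionth of the power passes through
```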
In addition, many separation distance requirements and other elements are provided by the declassified NSA red-black installation guidance, NSTISSAM TEMPEST/2-95. [15]
The information-security agencies of several NATO countries publish lists of accredited testing labs and of equipment that has passed these tests:
The United States Army also has a TEMPEST testing facility, as part of the U.S. Army Electronic Proving Ground, at Fort Huachuca, Arizona. Similar lists and facilities exist in other NATO countries.
TEMPEST certification must apply to entire systems, not just to individual components, since connecting a single unshielded component (such as a cable or device) to an otherwise secure system could dramatically alter the system's RF characteristics.
TEMPEST standards require "RED/BLACK separation", i.e., maintaining distance or installing shielding between circuits and equipment used to handle classified or sensitive information in unencrypted plaintext form (RED) and secured circuits and equipment (BLACK), the latter including those carrying encrypted signals. Manufacture of TEMPEST-approved equipment must be done under careful quality control to ensure that additional units are built exactly the same as the units that were tested. Changing even a single wire can invalidate the tests.[citation needed]
One aspect of TEMPEST testing that distinguishes it from limits on spurious emissions (e.g., FCC Part 15) is a requirement for absolutely minimal correlation between radiated energy or detectable emissions and any plaintext data being processed.
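The idea can be illustrated with a simple, hypothetical leakage-correlation estimate; actual TEMPEST test procedures are classified and considerably more sophisticated than this sketch.

```python
# Illustrative only: estimate how strongly a recorded emission trace correlates
# with the plaintext bits being processed. A TEMPEST-style requirement is that
# such correlation be negligible.
import numpy as np

rng = np.random.default_rng(0)
plaintext_bits = rng.integers(0, 2, size=10_000)

# Hypothetical emission trace: mostly noise plus a small data-dependent component.
leakage_strength = 0.05
trace = rng.normal(size=plaintext_bits.size) + leakage_strength * plaintext_bits

corr = np.corrcoef(plaintext_bits, trace)[0, 1]
print(f"correlation between data and emissions: {corr:.3f}")
```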
In 1985, Wim van Eck published the first unclassified technical analysis of the security risks of emanations from computer monitors. This paper caused some consternation in the security community, which had previously believed that such monitoring was a highly sophisticated attack available only to governments; Van Eck successfully eavesdropped on a real system, at a range of hundreds of metres, using just $15 worth of equipment plus a television set.
As a consequence of this research, such emanations are sometimes called "Van Eck radiation", and the eavesdropping technique Van Eck phreaking, although government researchers were already aware of the danger: Bell Labs had noted this vulnerability in secure teleprinter communications during World War II and was able to reproduce 75% of the plaintext being processed in a secure facility from a distance of 80 feet (24 metres). [20] The NSA published Tempest Fundamentals, NSA-82-89, NACSIM 5000, National Security Agency (Classified), on February 1, 1982. In addition, the Van Eck technique had been successfully demonstrated to non-TEMPEST personnel in Korea during the Korean War in the 1950s. [21]
Markus Kuhn has discovered several low-cost techniques for reducing the chances that emanations from computer displays can be monitored remotely. [22] With CRT displays and analog video cables, filtering out high-frequency components from fonts before rendering them on a computer screen attenuates the energy at which text characters are broadcast. [23] [24] With modern flat-panel displays, the high-speed digital serial interface (DVI) cables from the graphics controller are a main source of compromising emanations. Adding random noise to the least significant bits of pixel values may render the emanations from flat-panel displays unintelligible to eavesdroppers, but it is not a secure method. Since DVI uses a bit-encoding scheme that tries to transport a balanced signal of 0 bits and 1 bits, there may not be much difference between two pixel colors that differ greatly in their color or intensity, and the emanations can differ drastically even if only the last bit of a pixel's color is changed. The signal received by the eavesdropper also depends on the frequency at which the emanations are detected; the signal can be received on many frequencies at once, and each frequency's signal differs in contrast and brightness corresponding to a certain color on the screen. Usually, the technique of smothering the RED signal with noise is not effective unless the power of the noise is sufficient to drive the eavesdropper's receiver into saturation, thus overwhelming the receiver input.
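As a rough illustration of the least-significant-bit manipulation mentioned above (and, as noted, not a secure countermeasure), the following sketch randomizes the lowest bit of each channel in a synthetic RGB frame; the frame contents and bit depth are assumptions made for the example.

```python
# Illustrative sketch of randomizing the least-significant bits of pixel values.
# This is NOT a secure countermeasure; it only shows the manipulation on raw
# 8-bit RGB data.
import numpy as np

rng = np.random.default_rng()

def randomize_lsbs(frame: np.ndarray, bits: int = 1) -> np.ndarray:
    """Replace the lowest `bits` of each 8-bit channel with random values."""
    mask = 0xFF ^ ((1 << bits) - 1)                     # e.g. 0b11111110 for bits=1
    noise = rng.integers(0, 1 << bits, size=frame.shape, dtype=np.uint8)
    return (frame & mask) | noise

frame = rng.integers(0, 256, size=(480, 640, 3), dtype=np.uint8)  # dummy frame
noisy = randomize_lsbs(frame)
print(np.abs(frame.astype(int) - noisy.astype(int)).max())        # changes of at most 1
```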
LED indicators on computer equipment can be a source of compromising optical emanations. [25] One such technique involves the monitoring of the lights on a dial-up modem. Almost all modems flash an LED to show activity, and it is common for the flashes to be directly taken from the data line. As such, a fast optical system can easily see the changes in the flickers from the data being transmitted down the wire.
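A minimal sketch of the idea, with entirely hypothetical sample values: if the LED brightness tracks the serial data line, thresholding a sufficiently fast photodiode capture recovers the bit stream.

```python
# Hedged illustration: hypothetical photodiode readings sampled faster than the
# line's baud rate. A simple threshold detector recovers the transmitted bits.
samples = [0.9, 0.91, 0.1, 0.08, 0.88, 0.12, 0.9, 0.11]  # made-up brightness values

bits = [1 if s > 0.5 else 0 for s in samples]
print(bits)  # [1, 1, 0, 0, 1, 0, 1, 0]
```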
Recent research [26] has shown it is possible to detect the radiation corresponding to a keypress event not only from wireless (radio) keyboards, but also from traditional wired keyboards (the PS/2 keyboard, for example, contains a microprocessor that radiates some amount of radio-frequency energy when responding to keypresses), and even from laptop keyboards. From the 1970s onward, Soviet bugging of US Embassy IBM Selectric typewriters allowed the keypress-derived mechanical motion of bails, with attached magnets, to be detected by implanted magnetometers and converted via hidden electronics to a digital radio-frequency signal. Each eight-character transmission provided Soviet access to sensitive documents, as they were being typed, at US facilities in Moscow and Leningrad. [27]
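The encoding scheme described for the Selectric implants can be sketched roughly as follows; the character-to-pattern mapping here is hypothetical, since only the general four-bit-word, eight-character-buffer design is described in the source.

```python
# Rough, assumption-laden sketch of the encoding idea: each typed character maps
# to a unique pattern of bail movements, compressed into a 4-bit word and
# buffered eight characters at a time before transmission.
BAIL_PATTERNS = {                      # hypothetical character-to-4-bit mapping
    "A": 0b0001, "B": 0b0010, "C": 0b0011, "D": 0b0100,
}

def encode_keystrokes(text: str) -> list[list[int]]:
    """Group 4-bit words into bursts of eight characters, as the implant's buffer did."""
    words = [BAIL_PATTERNS[c] for c in text if c in BAIL_PATTERNS]
    return [words[i:i + 8] for i in range(0, len(words), 8)]

print(encode_keystrokes("ABCDABCDAB"))  # one full burst of 8 words, one partial burst
```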
In 2014, researchers introduced "AirHopper", a bifurcated attack pattern showing the feasibility of data exfiltration from an isolated computer to a nearby mobile phone, using FM radio signals. [28]
In 2015, "BitWhisper", a covert signaling channel between air-gapped computers using thermal manipulations, was introduced. "BitWhisper" supports bidirectional communication and requires no additional dedicated peripheral hardware. [29] Later in 2015, researchers introduced GSMem, a method for exfiltrating data from air-gapped computers over cellular frequencies; the transmission, generated by a standard internal bus, turns the computer into a small cellular transmitter antenna. [30] In February 2018, research was published describing how low-frequency magnetic fields can be used to exfiltrate sensitive data from Faraday-caged, air-gapped computers, using malware code-named "ODINI" that controls the low-frequency magnetic fields emitted from infected computers by regulating the load of CPU cores. [31]
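Conceptually, such CPU-load-based covert channels encode bits by alternating busy and idle periods, which modulate the emissions a nearby receiver observes. The sketch below is a generic illustration of that principle, not the published ODINI implementation, and the bit period is an arbitrary choice.

```python
# Conceptual sketch of a CPU-load covert channel (not the actual ODINI code):
# a "1" bit is a busy-loop period and a "0" bit is an idle period.
import time

BIT_PERIOD = 0.5  # seconds per bit (arbitrary value for illustration)

def transmit(bits: str) -> None:
    for bit in bits:
        end = time.monotonic() + BIT_PERIOD
        if bit == "1":
            while time.monotonic() < end:
                pass                     # busy-wait: high CPU load
        else:
            time.sleep(BIT_PERIOD)       # idle: low CPU load

transmit("1011")  # a receiver measuring emissions over time could recover these bits
```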
In 2018, a class of side-channel attacks dubbed "Screaming Channels" was presented at ACM and Black Hat conferences by researchers from Eurecom. [32] This kind of attack targets mixed-signal chips, which combine analog and digital circuitry, including a radio transmitter, on the same silicon die. A consequence of this architecture, often found in connected objects, is that the digital part of the chip leaks some information about its computations into the analog part, so the leakage ends up encoded in the noise of the radio transmission. Using signal-processing techniques, the researchers were able to extract cryptographic keys used during the communication and decrypt the content. The authors suppose that this class of attack has already been known to government intelligence agencies for many years.
Communications security is the discipline of preventing unauthorized interceptors from accessing telecommunications in an intelligible form, while still delivering content to the intended recipients.
Electromagnetic compatibility (EMC) is the ability of electrical equipment and systems to function acceptably in their electromagnetic environment, by limiting the unintentional generation, propagation and reception of electromagnetic energy which may cause unwanted effects such as electromagnetic interference (EMI) or even physical damage to operational equipment. The goal of EMC is the correct operation of different equipment in a common electromagnetic environment. It is also the name given to the associated branch of electrical engineering.
In telecommunications, especially radio communication, spread spectrum refers to techniques by which a signal generated with a particular bandwidth is deliberately spread in the frequency domain over a wider frequency band. Spread-spectrum techniques are used to establish secure communications, increase resistance to natural interference, noise, and jamming, prevent detection, limit power flux density, and enable multiple-access communications.
Computer and network surveillance is the monitoring of computer activity and data stored locally on a computer or data being transferred over computer networks such as the Internet. This monitoring is often carried out covertly and may be completed by governments, corporations, criminal organizations, or individuals. It may or may not be legal and may or may not require authorization from a court or other independent government agencies. Computer and network surveillance programs are widespread today and almost all Internet traffic can be monitored.
A Faraday cage or Faraday shield is an enclosure used to block some electromagnetic fields. A Faraday shield may be formed by a continuous covering of conductive material, or in the case of a Faraday cage, by a mesh of such materials. Faraday cages are named after scientist Michael Faraday, who first constructed one in 1836.
Wireless communication is the transfer of information (telecommunication) between two or more points without the use of an electrical conductor, optical fiber or other continuous guided medium for the transfer. The most common wireless technologies use radio waves. With radio waves, intended distances can be short, such as a few meters for Bluetooth, or as far as millions of kilometers for deep-space radio communications. It encompasses various types of fixed, mobile, and portable applications, including two-way radios, cellular telephones, personal digital assistants (PDAs), and wireless networking. Other examples of applications of radio wireless technology include GPS units, garage door openers, wireless computer mouse, keyboards and headsets, headphones, radio receivers, satellite television, broadcast television and cordless telephones. Somewhat less common methods of achieving wireless communications involve other electromagnetic phenomena, such as light and magnetic or electric fields, or the use of sound.
This is an index of articles relating to electronics and electricity, including natural electricity and devices that run on, use, or conduct electricity.
Van Eck phreaking, also known as Van Eck radiation, is a form of network eavesdropping in which special equipment is used for a side-channel attack on the electromagnetic emissions of electronic devices. While electromagnetic emissions are present in keyboards, printers, and other electronic devices, the most notable use of Van Eck phreaking is in reproducing the contents of a cathode ray tube (CRT) display at a distance.
Broadband over power lines (BPL) is a method of power-line communication (PLC) that allows relatively high-speed digital data transmission over public electric power distribution wiring. BPL uses higher frequencies, a wider frequency range, and different technologies compared to other forms of power-line communications to provide high-rate communication over longer distances. BPL uses frequencies that are part of the radio spectrum allocated to over-the-air communication services; therefore, the prevention of interference to, and from, these services is a very important factor in designing BPL systems.
The National Security Agency took over responsibility for all US government encryption systems when it was formed in 1952. The technical details of most NSA-approved systems are still classified, but much more about its early systems has become known, and its most modern systems share at least some features with commercial products.
The red/black concept, sometimes called the red–black architecture or red/black engineering, refers to the careful segregation in cryptographic systems of signals that contain sensitive or classified plaintext information from those that carry encrypted information, or ciphertext. Therefore, the red side is usually considered the internal side, and the black side the more public side, with often some sort of guard, firewall or data-diode between the two.
An air gap, air wall, air gapping or disconnected network is a network security measure employed on one or more computers to ensure that a secure computer network is physically isolated from unsecured networks, such as the public Internet or an unsecured local area network. It means a computer or network has no network interface controllers connected to other networks, with a physical or conceptual air gap, analogous to the air gap used in plumbing to maintain water quality.
In telecommunication, a measuring receiver or measurement receiver is a calibrated laboratory-grade radio receiver designed to measure the characteristics of radio signals. The parameters of such receivers can be adjusted over a much more comprehensive range of values than other radio receivers. Their circuitry is optimized for stability and enables calibration and reproducible results. Some measurement receivers also have exceptionally robust input circuits that can survive brief impulses of more than 1000 V, as they can occur during measurements of radio signals on power lines and other conductors.
Radiofrequency MASINT is one of the six major disciplines generally accepted to make up the field of Measurement and Signature Intelligence (MASINT), with due regard that the MASINT subdisciplines may overlap, and MASINT, in turn, is complementary to more traditional intelligence collection and analysis disciplines such as SIGINT and IMINT. MASINT encompasses intelligence gathering activities that bring together disparate elements that do not fit within the definitions of Signals Intelligence (SIGINT), Imagery Intelligence (IMINT), or Human Intelligence (HUMINT).
Countersurveillance refers to measures that are usually undertaken by the public to prevent surveillance, including covert surveillance. Countersurveillance may include electronic methods such as technical surveillance counter-measures, the process of detecting surveillance devices such as covert listening devices and visual surveillance devices, as well as countersurveillance software to thwart unwanted cybercrime, such as accessing computing and mobile devices for various nefarious reasons. More often than not, countersurveillance will employ a set of actions (countermeasures) that, when followed, reduce the risk of surveillance. Countersurveillance is different from sousveillance, as the latter does not necessarily aim to prevent or reduce surveillance.
Rohde & Schwarz GmbH & Co KG is an international electronics group specializing in the fields of electronic test equipment, broadcast & media, cybersecurity, radiomonitoring and radiolocation, and radiocommunication. The company also provides products for the wireless communications, electronics, aerospace and defense, homeland security, and critical infrastructure industries.
Computer security compromised by hardware failure is a branch of computer security applied to hardware. The objective of computer security includes protection of information and property from theft, corruption, or natural disaster, while allowing the information and property to remain accessible and productive to its intended users. Secret information can be retrieved in various ways; this topic focuses on the retrieval of data through misused hardware or hardware failure, and collects the main types of attack that can lead to data theft.
Air-gap malware is malware that is designed to defeat the air-gap isolation of secure computer systems using various air-gap covert channels.
In cryptography, electromagnetic attacks are side-channel attacks performed by measuring the electromagnetic radiation emitted from a device and performing signal analysis on it. These attacks are a more specific type of what is sometimes referred to as Van Eck phreaking, with the intention to capture encryption keys. Electromagnetic attacks are typically non-invasive and passive, meaning that these attacks are able to be performed by observing the normal functioning of the target device without causing physical damage. However, an attacker may get a better signal with less noise by depackaging the chip and collecting the signal closer to the source. These attacks are successful against cryptographic implementations that perform different operations based on the data currently being processed, such as the square-and-multiply implementation of RSA. Different operations emit different amounts of radiation and an electromagnetic trace of encryption may show the exact operations being performed, allowing an attacker to retrieve full or partial private keys.
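A plain square-and-multiply implementation shows the kind of data-dependent behaviour involved: the conditional multiply performed only for 1 bits of the exponent is what gives each key bit a distinguishable electromagnetic signature. The code below is a textbook illustration, not a hardened implementation.

```python
# Illustrative square-and-multiply exponentiation: the extra multiply performed
# for each 1 bit of the (private) exponent is exactly the kind of data-dependent
# operation whose electromagnetic trace an attacker can look for.
def square_and_multiply(base: int, exponent: int, modulus: int) -> int:
    result = 1
    for bit in bin(exponent)[2:]:                    # scan exponent bits, MSB first
        result = (result * result) % modulus         # square: always performed
        if bit == "1":
            result = (result * base) % modulus       # multiply: only for 1 bits (leaks!)
    return result

assert square_and_multiply(5, 117, 19) == pow(5, 117, 19)
```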
Yuval Elovici is a computer scientist. He is a professor in the Department of Software and Information Systems Engineering at Ben-Gurion University of the Negev (BGU), where he is the incumbent of the Davide and Irene Sala Chair in Homeland Security Research. He is the director of the Cyber Security Research Center at BGU and the founder and director of the Telekom Innovation Laboratories at Ben-Gurion University. In addition to his roles at BGU, he also serves as the lab director of Singapore University of Technology and Design’s (SUTD) ST Electronics-SUTD Cyber Security Laboratory, as well as the research director of iTrust. In 2014 he co-founded Morphisec, a start-up company, that develops cyber security mechanisms related to moving target defense.