Security through obscurity

[Image: a key hidden on a car tyre. Caption: Security through obscurity should not be used as the only security feature of a system.]

In security engineering, security through obscurity is the practice of concealing the details or mechanisms of a system to enhance its security. This approach relies on the principle of hiding something in plain sight, akin to a magician's sleight of hand or the use of camouflage. It diverges from traditional security methods, such as physical locks, and is more about obscuring information or characteristics to deter potential threats. Examples of this practice include disguising sensitive information within commonplace items, like a piece of paper in a book, or altering digital footprints, such as spoofing a web browser's version number. While not a standalone solution, security through obscurity can complement other security measures in certain scenarios. [1]
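
The idea of altering a digital footprint, such as masking a browser's version number, can be illustrated with a small sketch. The following Python snippet is illustrative only, not drawn from the cited source; the URL and header value are placeholders. It simply sends a generic User-Agent string so the server does not learn the client's real software version:

```python
# Minimal sketch (illustrative placeholders): report a generic User-Agent
# string instead of one revealing the exact browser and version.
import urllib.request

GENERIC_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"  # deliberately vague version info

request = urllib.request.Request(
    "https://example.org/",
    headers={"User-Agent": GENERIC_UA},  # the real client version is not disclosed
)
with urllib.request.urlopen(request) as response:
    body = response.read()
    print(len(body), "bytes received")
```

As with the other examples above, such masking only hides one detail; it does nothing to secure the underlying connection or server.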


Obscurity in the context of security engineering is the notion that information can be protected, to a certain extent, when it is difficult to access or comprehend. This concept hinges on the principle of making the details or workings of a system less visible or understandable, thereby reducing the likelihood of unauthorized access or manipulation. [2]


History

An early opponent of security through obscurity was the locksmith Alfred Charles Hobbs, who in 1851 demonstrated to the public how state-of-the-art locks could be picked. In response to concerns that exposing security flaws in the design of locks could make them more vulnerable to criminals, he said: "Rogues are very keen in their profession, and know already much more than we can teach them." [3]

There is scant formal literature on the issue of security through obscurity. Books on security engineering cite Kerckhoffs' doctrine from 1883 if they cite anything at all. For example, in a discussion about secrecy and openness in nuclear command and control:

[T]he benefits of reducing the likelihood of an accidental war were considered to outweigh the possible benefits of secrecy. This is a modern reincarnation of Kerckhoffs' doctrine, first put forward in the nineteenth century, that the security of a system should depend on its key, not on its design remaining obscure. [4]

Peter Swire has written about the trade-off between the notion that "security through obscurity is an illusion" and the military notion that "loose lips sink ships", [5] as well as on how competition affects the incentives to disclose. [6]

There are conflicting stories about the origin of this term. Fans of MIT's Incompatible Timesharing System (ITS) say it was coined in opposition to Multics users down the hall, for whom security was far more an issue than on ITS. Within the ITS culture, the term referred, self-mockingly, to the poor coverage of the documentation and obscurity of many commands, and to the attitude that by the time a tourist figured out how to make trouble he'd generally got over the urge to make it, because he felt part of the community. One instance of deliberate security through obscurity on ITS has been noted: the command to allow patching the running ITS system (altmode altmode control-R) echoed as $$^D. Typing Alt Alt Control-D set a flag that would prevent patching the system even if the user later got it right. [7]

In January 2020, NPR reported that Democratic Party officials in Iowa declined to share information regarding the security of its caucus app, to "make sure we are not relaying information that could be used against us." Cybersecurity experts replied that "to withhold the technical details of its app doesn't do much to protect the system." [8]

Criticism

Security by obscurity alone is discouraged and not recommended by standards bodies. The National Institute of Standards and Technology (NIST) in the United States recommends against this practice: "System security should not depend on the secrecy of the implementation or its components." [9] The Common Weakness Enumeration project lists "Reliance on Security Through Obscurity" as CWE-656. [10]
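
As a rough illustration of the weakness CWE-656 describes, consider a web endpoint whose only protection is an unpublished URL. The sketch below is hypothetical; the framework choice, route names, and token handling are assumptions, not taken from the MITRE entry or the NIST guide. The first handler relies purely on the secrecy of its path, while the second keeps an explicit access-control check even if the path leaks:

```python
# Illustrative sketch of reliance on obscurity vs. an explicit control.
# Route names and token handling are hypothetical placeholders.
import hmac
from flask import Flask, request, abort

app = Flask(__name__)
EXPECTED_TOKEN = "example-token"  # placeholder; real systems use proper credential management

# Weak: the only "protection" is that the path is unpublished (obscurity).
@app.route("/admin-7f3a9c")
def hidden_admin():
    return "admin panel"

# Stronger: the path may still be obscure, but access depends on an explicit check.
@app.route("/admin")
def admin():
    supplied = request.headers.get("Authorization", "")
    if not hmac.compare_digest(supplied, "Bearer " + EXPECTED_TOKEN):
        abort(403)  # secrecy of the URL is no longer the security mechanism
    return "admin panel"
```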

A large number of telecommunication and digital rights management cryptosystems use security through obscurity, but have ultimately been broken. These include components of GSM, GMR encryption, GPRS encryption, a number of RFID encryption schemes, and most recently Terrestrial Trunked Radio (TETRA). [11]

One of the most prominent applications of security through obscurity seen today is anti-malware software, which depends on keeping its detection signatures secret. What typically results from this single point of failure, however, is an arms race: attackers find novel ways to avoid detection, and defenders respond with increasingly contrived but secret signatures to flag them. [12]
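
The dynamic can be sketched with a toy example; the signature and payload bytes below are invented, and real anti-malware engines are far more sophisticated. A scanner that flags known byte patterns is defeated as soon as the attacker re-encodes the payload, prompting the defender to add yet another secret signature:

```python
# Toy sketch of the signature arms race; the "signature" and sample bytes are invented.
SECRET_SIGNATURES = [b"evil_payload_v1"]  # defenders' secret pattern list

def is_flagged(sample: bytes) -> bool:
    """Naive detection: flag any sample containing a known byte pattern."""
    return any(signature in sample for signature in SECRET_SIGNATURES)

original = b"header... evil_payload_v1 ...footer"
repacked = bytes(b ^ 0x5A for b in original)  # trivially re-encoded variant of the same payload

print(is_flagged(original))  # True: matches the secret signature
print(is_flagged(repacked))  # False: same behaviour, but the signature no longer matches
```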

The technique stands in contrast with security by design and open security, although many real-world projects include elements of all strategies.

Obscurity in architecture vs. technique

Knowledge of how a system is built differs from concealment and camouflage. The effectiveness of obscurity in operations security depends on whether it is layered on top of other good security practices or used alone. [13] When used as an independent layer, obscurity is considered a valid security tool. [14]

In recent years, more advanced forms of security through obscurity have gained support as a methodology in cybersecurity through moving target defense and cyber deception. [15] NIST's cyber resiliency framework, SP 800-160 Volume 2, recommends the use of security through obscurity as a complementary part of a resilient and secure computing environment. [16]
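
A very small sketch can convey the moving-target flavour. This is an assumption-laden illustration, not a method prescribed by SP 800-160 Volume 2 or the DHS programme: a service and its legitimate clients derive the current listening port from a shared secret and a time window, so the surface an outside scanner sees keeps shifting while authorized parties stay synchronized.

```python
# Minimal moving-target sketch; the secret, port range, and rotation interval
# are illustrative placeholders, not values from the cited reports.
import hashlib
import hmac
import time

SHARED_SECRET = b"example-shared-secret"  # would be provisioned out of band in practice
WINDOW_SECONDS = 300                      # port rotates every five minutes

def current_port(secret: bytes = SHARED_SECRET, now: float | None = None) -> int:
    """Map the current time window to a port in the range 20000-39999."""
    window = int((time.time() if now is None else now) // WINDOW_SECONDS)
    digest = hmac.new(secret, str(window).encode(), hashlib.sha256).digest()
    return 20000 + int.from_bytes(digest[:2], "big") % 20000

print("listen on port", current_port())
```

Note that the scheme still rests on other controls (key provisioning, authentication); the shifting port only raises the cost of reconnaissance.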


References

  1. Zwicky, Elizabeth D.; Cooper, Simon; Chapman, D. Brent (2000-06-26). Building Internet Firewalls: Internet and Web Security. O'Reilly Media, Inc. ISBN 978-0-596-55188-9.
  2. Selinger, Evan; Hartzog, Woodrow (May 21, 2014). "Obscurity and Privacy". In Joseph Pitt & Ashley Shew (eds.), Routledge Companion to Philosophy of Technology (2014, forthcoming). Available at SSRN: https://ssrn.com/abstract=2439866
  3. Stross, Randall (17 December 2006). "Theater of the Absurd at the T.S.A." The New York Times. Archived from the original on 8 December 2022. Retrieved 5 May 2015.
  4. Anderson, Ross (2001). Security Engineering: A Guide to Building Dependable Distributed Systems. New York, NY: John Wiley & Sons, Inc. p. 240. ISBN 0-471-38922-6.
  5. Swire, Peter P. (2004). "A Model for When Disclosure Helps Security: What is Different About Computer and Network Security?". Journal on Telecommunications and High Technology Law. 2. SSRN 531782.
  6. Swire, Peter P. (January 2006). "A Theory of Disclosure for Security and Competitive Reasons: Open Source, Proprietary Software, and Government Agencies". Houston Law Review. 42. SSRN 842228.
  7. "security through obscurity". The Jargon File. Archived from the original on 2010-03-29. Retrieved 2010-01-29.
  8. "Despite Election Security Fears, Iowa Caucuses Will Use New Smartphone App". NPR.org. Archived from the original on 2022-12-23. Retrieved 2020-02-06.
  9. "Guide to General Server Security" (PDF). National Institute of Standards and Technology. 2008-07-01. Archived (PDF) from the original on 2017-08-09.
  10. "CWE-656: Reliance on Security Through Obscurity". The MITRE Corporation. 2008-01-18. Archived from the original on 2023-09-28. Retrieved 2023-09-28.
  11. Midnight Blue (August 2023). "All Cops Are Broadcasting: Breaking TETRA After Decades in the Shadows" (slide deck, PDF). Black Hat USA 2023. Archived (PDF) from the original on 2023-08-11. Retrieved 2023-08-11.
      Meijer, Carlo; Bokslag, Wouter; Wetzels, Jos (August 2023). "All Cops Are Broadcasting: TETRA Under Scrutiny" (paper, PDF). USENIX Security 2023. Archived (PDF) from the original on 2023-08-11. Retrieved 2023-08-11.
  12. KPMG (May 2022). "The cat and mouse game of antivirus evasion". Archived from the original on 2023-08-28. Retrieved 2023-08-28.
  13. Miessler, Daniel. "Obscurity is a Valid Security Layer". danielmiessler.com. Archived from the original on 2022-12-08. Retrieved 2018-06-20.
  14. "Cyber Deception". CSIAC. www.csiac.org. Archived from the original on 2021-04-20. Retrieved 2018-06-20.
  15. "CSD-MTD". Department of Homeland Security. 2013-06-25. Archived from the original on 2022-12-08. Retrieved 2018-06-20.
  16. Ross, Ron; Graubart, Richard; Bodeau, Deborah; McQuaid, Rosalie (2018-03-21). Systems Security Engineering: Cyber Resiliency Considerations for the Engineering of Trustworthy Secure Systems (Report). National Institute of Standards and Technology. Archived from the original on 2023-12-06. Retrieved 2024-04-05.