Security Controls for Computer Systems, commonly called the Ware report, [1] [2] is a 1970 text by Willis Ware that was foundational in the field of computer security. [3]
A defense contractor in St. Louis, Missouri, had bought an IBM mainframe computer, which it was using for classified work on a fighter aircraft. [4] To provide additional income, the contractor asked the Department of Defense (DoD) for permission to sell computer time on the mainframe to local businesses via remote terminals, while the classified work continued. [4]
At the time, the DoD did not have a policy to cover this. The DoD's Advanced Research Projects Agency (ARPA) asked Ware, a RAND employee, to chair a committee to examine and report on the feasibility of security controls for computer systems. [4] [5]
The committee's report was a classified document given in January 1970 to the Defense Science Board (DSB), which had taken over the project from ARPA. [4] After declassification, the report was published by RAND in October 1979. [4]
The IEEE Computer Society said the report was widely circulated, [1] and the IEEE Annals of the History of Computing said that it, together with Ware's 1967 Spring Joint Computer Conference session, marked the start of the field of computer security. [3] [6]
The report influenced security certification standards and processes, especially in the banking and defense industries, where the report was instrumental in creating the Orange Book. [2]
Computer security, cybersecurity, or information technology security is the protection of computer systems and networks from information disclosure, theft of or damage to their hardware, software, or electronic data, as well as from the disruption or misdirection of the services they provide.
Security engineering is the process of incorporating security controls into an information system so that the controls become an integral part of the system’s operational capabilities. It is similar to other systems engineering activities in that its primary motivation is to support the delivery of engineering solutions that satisfy pre-defined functional and user requirements, but it has the added dimension of preventing misuse and malicious behavior. Those constraints and restrictions are often asserted as a security policy.
In the security engineering subspecialty of computer science, a trusted system is one that is relied upon to a specified extent to enforce a specified security policy. This is equivalent to saying that a trusted system is one whose failure would break a security policy.
In computer security, a covert channel is a type of attack that transfers information between processes that are not permitted to communicate under the computer security policy. The term, coined in 1973 by Butler Lampson, denotes channels "not intended for information transfer at all, such as the service program's effect on system load," distinguishing them from legitimate channels that are subject to access controls by COMPUSEC.
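Lampson's load example can be illustrated with a deterministic toy model (all names, timings, and thresholds here are illustrative, not drawn from any real system): a sender with no authorized output channel leaks bits by modulating its effect on a shared resource, and a receiver recovers them by observing response times.

```python
# Toy model of a load-based covert channel: the "service program" has no
# authorized channel to the observer, but its effect on system load
# (modeled here as response time in milliseconds) leaks one bit per
# interval. All numbers are illustrative.

def service_response_time(bit, base=10.0):
    # Sender: do extra work (raise the load) to signal a 1; stay idle for a 0.
    return base + (5.0 if bit else 0.0)

def observe(times, threshold=12.5):
    # Receiver: classify each observed response time against a threshold.
    return [1 if t > threshold else 0 for t in times]

secret = [1, 0, 1, 1, 0]
leaked = observe([service_response_time(b) for b in secret])
print(leaked == secret)  # True: the bits flowed despite no authorized channel
```

The point of the sketch is that neither function violates any access-control rule; the information flows entirely through a side effect, which is why covert channels require separate analysis from ordinary access controls.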
The Information Processing Techniques Office (IPTO), originally "Command and Control Research", was part of the Defense Advanced Research Projects Agency of the United States Department of Defense.
This article presents a timeline of events in the history of computer operating systems from 1951 to the current day. For a narrative explaining the overall developments, see the History of operating systems.
The Advanced Research Projects Agency Network (ARPANET) was the first wide-area packet-switched network with distributed control and one of the first networks to implement the TCP/IP protocol suite. Both technologies became the technical foundation of the Internet. The ARPANET was established by the Advanced Research Projects Agency (ARPA) of the United States Department of Defense.
In computing, security-evaluated operating systems have achieved certification from an external security-auditing organization; the most common evaluations are against the Common Criteria (CC) and FIPS 140-2.
In computer security, mandatory access control (MAC) refers to a type of access control by which the operating system or database constrains the ability of a subject or initiator to access or generally perform some sort of operation on an object or target. In the case of operating systems, a subject is usually a process or thread; objects are constructs such as files, directories, TCP/UDP ports, shared memory segments, IO devices, etc. Subjects and objects each have a set of security attributes. Whenever a subject attempts to access an object, an authorization rule enforced by the operating system kernel examines these security attributes and decides whether the access can take place. Any operation by any subject on any object is tested against the set of authorization rules to determine if the operation is allowed. A database management system, in its access control mechanism, can also apply mandatory access control; in this case, the objects are tables, views, procedures, etc.
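The authorization rule described above can be sketched as follows. This is a minimal illustration, not any real kernel's implementation: subjects and objects carry security labels, and a central rule (here, a simple no-read-up / no-write-down check over hypothetical sensitivity levels) decides every access.

```python
# Minimal sketch of a mandatory access control (MAC) check. The labels,
# levels, and policy here are illustrative; real systems (e.g. SELinux)
# use far richer attributes.

from dataclasses import dataclass

LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top_secret": 3}

@dataclass(frozen=True)
class Label:
    level: str  # hierarchical sensitivity level

def can_read(subject: Label, obj: Label) -> bool:
    # No read-up: the subject's level must dominate the object's level.
    return LEVELS[subject.level] >= LEVELS[obj.level]

def can_write(subject: Label, obj: Label) -> bool:
    # No write-down: information may only flow to objects at or above
    # the subject's level.
    return LEVELS[obj.level] >= LEVELS[subject.level]

analyst = Label("secret")
report = Label("confidential")
print(can_read(analyst, report))   # True: secret may read confidential
print(can_write(analyst, report))  # False: secret may not write down
```

The defining property of MAC is visible in the sketch: the rules live in one enforced place, and no subject can change or bypass them for objects it happens to control.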
In computer security, discretionary access control (DAC) is a type of access control defined by the Trusted Computer System Evaluation Criteria "as a means of restricting access to objects based on the identity of subjects and/or groups to which they belong. The controls are discretionary in the sense that a subject with a certain access permission is capable of passing that permission on to any other subject."
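The "discretionary" property, a subject passing its permission on to another subject, can be sketched as follows (the class, subject names, and permission names are hypothetical, chosen only to illustrate the definition):

```python
# Illustrative sketch of discretionary access control: any subject holding
# the "grant" permission on an object may extend access to other subjects,
# at its own discretion. Names are hypothetical.

class DacObject:
    def __init__(self, owner: str):
        # The owner starts with full permissions, including the right to grant.
        self.perms = {owner: {"read", "write", "grant"}}

    def allowed(self, subject: str, action: str) -> bool:
        return action in self.perms.get(subject, set())

    def grant(self, granter: str, grantee: str, action: str) -> None:
        # Discretionary: the check is on the granter's own permissions,
        # not on any system-wide mandatory policy.
        if not self.allowed(granter, "grant"):
            raise PermissionError(f"{granter} may not delegate access")
        self.perms.setdefault(grantee, set()).add(action)

doc = DacObject("alice")
doc.grant("alice", "bob", "read")   # alice passes read access to bob
print(doc.allowed("bob", "read"))   # True
print(doc.allowed("bob", "write"))  # False
```

Contrast this with the MAC model above it in spirit: here access rights propagate at the owner's discretion, which is precisely the behavior a mandatory policy is designed to constrain.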
Multilevel security or multiple levels of security (MLS) is the application of a computer system to process information with incompatible classifications, permit access by users with different security clearances and needs-to-know, and prevent users from obtaining access to information for which they lack authorization. The term is used in two contexts. One refers to a system that is adequate to protect itself from subversion and has robust mechanisms to separate information domains, that is, a trustworthy system. The other refers to an application of a computer that requires the computer to be strong enough to protect itself from subversion and to possess adequate mechanisms to separate information domains, that is, a system we must trust. The distinction is important because systems that need to be trusted are not necessarily trustworthy.
A penetration test, colloquially known as a pen test or ethical hacking, is an authorized simulated cyberattack on a computer system, performed to evaluate the security of the system; this is not to be confused with a vulnerability assessment. The test is performed to identify weaknesses, including the potential for unauthorized parties to gain access to the system's features and data, as well as strengths, enabling a full risk assessment to be completed.
System Development Corporation (SDC) was a computer software company based in Santa Monica, California. Founded in 1955, it is considered the first company of its kind.
Secure Computing Corporation (SCC) was a public company that developed and sold computer security appliances and hosted services to protect users and data. McAfee acquired the company in 2008.
Blacker is a U.S. Department of Defense computer network security project designed to achieve A1 class ratings of the Trusted Computer System Evaluation Criteria (TCSEC).
Trusted Computer System Evaluation Criteria (TCSEC) is a United States Government Department of Defense (DoD) standard that sets basic requirements for assessing the effectiveness of computer security controls built into a computer system. The TCSEC was used to evaluate, classify, and select computer systems being considered for the processing, storage, and retrieval of sensitive or classified information.
Willis Howard Ware was an American computer pioneer who co-developed the IAS machine, which laid down the blueprint of the modern computer in the mid-20th century. He was also a pioneer of privacy rights, a social critic of technology policy, and a founder of the field of computer security.
Stephen Joseph Lukasik was an American physicist who served in multiple high-level defense and scientific positions, advancing technologies and techniques for national defense and for the detection and control of diverse types of weapons of mass destruction, especially nuclear devices. He was the second-longest-serving Director of DARPA, the Defense Advanced Research Projects Agency, during whose tenure numerous new technologies, including packet switching and internet protocols, were developed. He was also the first Chief Scientist of the Federal Communications Commission, where he created its Office of Science and Technology, which facilitated the commercial deployment of new technologies including spread spectrum.
Security and Privacy in Computer Systems is a paper by Willis Ware that was first presented to the public at the 1967 Spring Joint Computer Conference.
Edward G. Amoroso is an American computer security professional, entrepreneur, author, and educator based in the New York City area. His research interests have centered on techniques and criteria for measuring trustworthy software development, the application of these methods to secure software development for critical projects in the defense and aerospace industries, and redefining trust parameters for improved security in the cloud. Early in his career, he was involved with the design of security protections for the Unix operating system in support of the US Government Orange Book security evaluation criteria. This research led to real-time security design and trusted software protections for the United States Ballistic Missile Defense Program, also known as Star Wars. He has also pioneered concepts related to microsegmentation, a design strategy that allows for the creation of secure zones in data centers and cloud deployments.
Security Controls for Computer Systems, tech. report R-609-PR, RAND, Defense Science Board Task Force on Computer Security, 1972. R-609-1-PR was reissued Oct. 1979. This widely circulated report was informally known as 'the Ware report.'
The heritage of most security certification standards in the banking industry can be traced back to ... 'Security Controls for Computer Systems' (commonly known as the Ware Report...), focused on the problem of protecting classified information in multi-access, resource-sharing computer systems, which were at the time being increasingly used by both the government and defense contractors. The report included not only recommendations for what security functionality such systems should have in order to safely process classified information, but also proposed certification procedures for verifying whether a system meets these criteria. These certification procedures formed the basis for the Trusted Computer System Evaluation Criteria (TCSEC). The requirements and assessment criteria for TCSEC are given in 5200.28-STD, colloquially known as the 'Orange Book', but that publication is augmented by others in the 'Rainbow Series', expanding and clarifying various aspects.
The 1967 Spring Joint Computer Conference session organized by Willis Ware and the 1970 Ware Report are widely held by computer security practitioners and historians to have defined the field's origin.
Willis Ware (chair), 1967 Defense Science Board Study. Problem: Significant number of systems being acquired for military use. Charge: Formulate recommendations for hardware and software safeguards to protect classified information in multi-user, resource-sharing computer systems.
The 1970 (Willis H.) Ware Report and the 1967 Spring Joint Computer Conference (SJCC) Ware-led 'Computer Security and Privacy' session are focal points of historians and computer security scientists and are generally considered the beginning of multilevel computer security.