File integrity monitoring (FIM) is an internal control or process that validates the integrity of operating system and application software files by comparing the current file state against a known, good baseline. This comparison often involves calculating a cryptographic checksum of the file's original baseline and comparing it with the checksum of the file's current state. [1] Other file attributes can also be used to monitor integrity. [2]
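A minimal sketch of this comparison method, using Python's standard hashlib module; the monitored path and the reporting are illustrative assumptions, not part of any particular FIM product:

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 65536) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Record the known, good baseline once, at a trusted point in time.
baseline = sha256_of("/etc/hosts")

# Later, recompute the digest of the current file state and compare.
if sha256_of("/etc/hosts") != baseline:
    print("integrity violation: /etc/hosts differs from its baseline")
```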
File integrity monitoring is generally automated using internal controls such as an application or process. Such monitoring can be performed randomly, at a defined polling interval, or in real time.
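The defined-polling-interval mode could be sketched as a loop over a watch list, as below; the paths and interval are illustrative assumptions. Real-time monitoring would instead rely on OS change-notification facilities (for example, inotify on Linux) rather than polling:

```python
import hashlib
import time

WATCHED = ["/etc/hosts", "/etc/passwd"]  # illustrative watch list
INTERVAL_SECONDS = 60                    # the defined polling interval

def digest(path: str) -> str:
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Baselines captured once at a trusted point in time.
baselines = {path: digest(path) for path in WATCHED}

while True:
    time.sleep(INTERVAL_SECONDS)
    for path, known_good in baselines.items():
        if digest(path) != known_good:
            print(f"ALERT: {path} no longer matches its baseline")
```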
Changes to configurations, files, and file attributes across the IT infrastructure are common, but hidden within the large volume of daily changes can be the few that affect file or configuration integrity. Such changes can also reduce security posture and, in some cases, may be leading indicators of a breach in progress. Values monitored for unexpected changes include credentials, privileges and security settings, content, core attributes and size, hash values, and configuration values.
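Attribute-level monitoring, as distinct from content hashing, can be sketched with the standard os and stat modules; the tracked fields and the watched path are illustrative choices, not an exhaustive list of what FIM products track:

```python
import os
import stat

def attribute_snapshot(path: str) -> dict:
    """Capture attributes a FIM tool may track in addition to content."""
    st = os.stat(path)
    return {
        "permissions": stat.filemode(st.st_mode),  # e.g. '-rw-r--r--'
        "owner_uid": st.st_uid,
        "group_gid": st.st_gid,
        "size_bytes": st.st_size,
        "modified_time": st.st_mtime,
    }

before = attribute_snapshot("/etc/hosts")
# ... time passes ...
after = attribute_snapshot("/etc/hosts")
drift = {key: (before[key], after[key])
         for key in before if before[key] != after[key]}
if drift:
    print("unexpected attribute changes:", drift)
```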
Multiple compliance objectives indicate file integrity monitoring as a requirement. Examples of compliance regimes that require or recommend file integrity monitoring include PCI DSS, the Sarbanes–Oxley Act (SOX), NERC CIP, FISMA, HIPAA, and the CIS Controls.
Procedures and algorithms used in file integrity monitoring include checksums, cryptographic hash functions, file verification, and key management.
Applications that make use of FIM include, for example, OSSEC, Samhain, Tripwire, AIDE, Qualys, LogRhythm, and LogLogic.
The Sarbanes–Oxley Act of 2002 is a United States federal law that mandates certain practices in financial record keeping and reporting for corporations. The act, Pub. L. 107–204, 116 Stat. 745, enacted July 30, 2002, also known as the "Public Company Accounting Reform and Investor Protection Act" and the "Corporate and Auditing Accountability, Responsibility, and Transparency Act", and more commonly called Sarbanes–Oxley, SOX, or Sarbox, contains eleven sections that place requirements on all U.S. public company boards of directors and management and public accounting firms. A number of provisions of the act also apply to privately held companies, such as the willful destruction of evidence to impede a federal investigation.
A cryptographic hash function (CHF) is a hash algorithm that has special properties desirable for a cryptographic application, such as preimage resistance, second-preimage resistance, and collision resistance.
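Two of these properties matter most for integrity monitoring: the function is deterministic, and a tiny change in input produces an unrelated digest. A short demonstration with SHA-256 from Python's hashlib; the input strings are arbitrary examples:

```python
import hashlib

# Deterministic: the same input always produces the same digest.
a = hashlib.sha256(b"config-file-contents-v1").hexdigest()
assert a == hashlib.sha256(b"config-file-contents-v1").hexdigest()

# Avalanche effect: changing a single character yields an unrelated
# digest, which is why a hash comparison exposes even one-byte edits.
b = hashlib.sha256(b"config-file-contents-v2").hexdigest()
print(a)
print(b)
```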
File verification is the process of using an algorithm to verify the integrity of a computer file, usually by checksum. This can be done by comparing two files bit by bit, but that requires two copies of the same file and may miss systematic corruption that affects both files. A more popular approach is to generate a hash of the copied file and compare it to the hash of the original file.
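The two approaches can be contrasted in a few lines of Python using the standard filecmp and hashlib modules; the file names are illustrative assumptions:

```python
import filecmp
import hashlib

def file_digest(path: str) -> bytes:
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).digest()

# Bit-by-bit comparison: requires both copies to be present locally.
identical = filecmp.cmp("original.img", "copy.img", shallow=False)

# Hash comparison: only the original's digest is needed, and it can be
# stored or published separately from the file itself.
identical_by_hash = file_digest("original.img") == file_digest("copy.img")
```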
Key management refers to the management of cryptographic keys in a cryptosystem. This includes dealing with the generation, exchange, storage, use, crypto-shredding (destruction) and replacement of keys. It includes cryptographic protocol design, key servers, user procedures, and other relevant protocols.
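Key management connects to file integrity monitoring when baselines are protected with a keyed digest: an attacker who tampers with a file can recompute a plain hash, but cannot forge an HMAC without the key. A sketch using Python's hmac module; the environment-variable key source and the path are illustrative assumptions:

```python
import hashlib
import hmac
import os

# Assumed key provisioning: an environment variable is used here purely
# for illustration; real deployments would obtain the key from a key
# server or hardware-backed store, per key-management practice.
KEY = os.environ.get("FIM_HMAC_KEY", "example-key-do-not-use").encode()

def keyed_digest(path: str) -> str:
    """HMAC-SHA256 of a file: unforgeable without knowledge of KEY."""
    with open(path, "rb") as f:
        return hmac.new(KEY, f.read(), hashlib.sha256).hexdigest()

baseline = keyed_digest("/etc/hosts")
# compare_digest avoids timing side channels when checking digests.
assert hmac.compare_digest(baseline, keyed_digest("/etc/hosts"))
```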
In general, compliance means conforming to a rule, such as a specification, policy, standard or law. Compliance has traditionally been explained by reference to deterrence theory, according to which punishing a behavior will decrease the violations both by the wrongdoer and by others. This view has been supported by economic theory, which has framed punishment in terms of costs and has explained compliance in terms of a cost-benefit equilibrium. However, psychological research on motivation provides an alternative view: granting rewards or imposing fines for a certain behavior is a form of extrinsic motivation that weakens intrinsic motivation and ultimately undermines compliance.
The Federal Information Security Management Act of 2002 is a United States federal law enacted in 2002 as Title III of the E-Government Act of 2002. The act recognized the importance of information security to the economic and national security interests of the United States. The act requires each federal agency to develop, document, and implement an agency-wide program to provide information security for the information and information systems that support the operations and assets of the agency, including those provided or managed by another agency, contractor, or other source.
Information technology controls are specific activities performed by persons or systems to ensure that computer systems operate in a way that minimizes risk. They are a subset of an organization's internal control. IT control objectives typically relate to assuring the confidentiality, integrity, and availability of data and the overall management of the IT function. IT controls are often described in two categories: IT general controls (ITGC) and IT application controls. ITGC includes controls over the hardware, system software, operational processes, access to programs and data, program development, and program changes. IT application controls refer to controls that ensure the integrity of the information processed by the IT environment. Information technology controls have been given increased prominence in corporations listed in the United States by the Sarbanes–Oxley Act. The COBIT framework is a widely used framework promulgated by the IT Governance Institute, which defines a variety of ITGC and application control objectives and recommended evaluation approaches.
Information security standards are techniques generally outlined in published materials that attempt to protect the cyber environment of a user or organization. This environment includes users themselves, networks, devices, all software, processes, information in storage or transit, applications, services, and systems that can be connected directly or indirectly to networks.
In computing, off-site data protection, or vaulting, is the strategy of sending critical data out of the main location as part of a disaster recovery plan. Data is usually transported off-site using removable storage media such as magnetic tape or optical storage. Data can also be sent electronically via a remote backup service, which is known as electronic vaulting or e-vaulting. Sending backups off-site ensures systems and servers can be reloaded with the latest data in the event of a disaster, accidental error, or system crash. Sending backups off-site also ensures that there is a copy of pertinent data that is not stored on-site.
Security controls are safeguards or countermeasures to avoid, detect, counteract, or minimize security risks to physical property, information, computer systems, or other assets. In the field of information security, such controls protect the confidentiality, integrity and availability of information.
Continuous monitoring is the process and technology used to detect compliance and risk issues associated with an organization's financial and operational environment. The financial and operational environment consists of people, processes, and systems working together to support efficient and effective operations. Controls are put in place to address risks within these components. Through continuous monitoring of the operations and controls, weak or poorly designed or implemented controls can be corrected or replaced, thus improving the organization's operational risk profile. Investors, governments, the public, and other stakeholders continue to increase their demands for more effective corporate governance and business transparency.
The Security Content Automation Protocol (SCAP) is a method for using specific standards to enable automated vulnerability management, measurement, and policy compliance evaluation of systems deployed in an organization, including, for example, FISMA compliance. The National Vulnerability Database (NVD) is the U.S. government content repository for SCAP; OpenSCAP is an example implementation. SCAP comprises a suite of open specifications covering areas such as configuration management, compliance requirements, software flaws, and vulnerability patching. Together, these standards provide a means for security data to be communicated efficiently between humans and machines. The objective of the framework is to promote a community-based, non-proprietary approach to implementing automated security mechanisms.
LogLogic is a technology company that specializes in Security Management, Compliance Reporting, and IT Operations products. LogLogic developed the first appliance-based log management platform. LogLogic's Log Management platform collects and correlates user activity and event data. LogLogic's products are used by many of the world's largest enterprises to rapidly identify and alert on compliance violations, policy breaches, cyber attacks, and insider threats.
The Log Management Knowledge Base is a free database of detailed descriptions of over 20,000 event logs generated by Windows systems, syslog devices, and applications. Provided as a free service to the IT community by Prism Microsystems, the Knowledge Base aims to help IT personnel make sense of the large amounts of cryptic and arcane log data generated by network systems and IT infrastructures.
Database activity monitoring (DAM) is a database security technology for monitoring and analyzing database activity. DAM may combine data from network-based monitoring and native audit information to provide a comprehensive picture of database activity. The data gathered by DAM is used to analyze and report on database activity, support breach investigations, and alert on anomalies. DAM is typically performed continuously and in real time.
Managed Trusted Internet Protocol Service (MTIPS) was developed by the US General Services Administration (GSA) to allow US Federal agencies to physically and logically connect to the public Internet and other external connections in compliance with the Office of Management and Budget's (OMB) Trusted Internet Connection (TIC) Initiative.
Security information and event management (SIEM) is a field within computer security that combines security information management (SIM) and security event management (SEM) to enable real-time analysis of security alerts generated by applications and network hardware. SIEM systems are central to the operation of security operations centers (SOCs), where they are employed to detect, investigate, and respond to security incidents. SIEM technology collects and aggregates data from various systems, allowing organizations to meet compliance requirements while safeguarding against threats.
Control system security, or industrial control system (ICS) cybersecurity, is the prevention of interference with the proper operation of industrial automation and control systems. These control systems manage essential services including electricity, petroleum production, water, transportation, manufacturing, and communications. They rely on computers, networks, operating systems, applications, and programmable controllers, each of which could contain security vulnerabilities. The 2010 discovery of the Stuxnet worm demonstrated the vulnerability of these systems to cyber incidents. The United States and other governments have passed cyber-security regulations requiring enhanced protection for control systems operating critical infrastructure.
LogRhythm, Inc. is a global security intelligence company that specializes in Security Information and Event Management (SIEM), log management, network monitoring, user behavior and security analytics. Headquartered in Boulder, Colorado, LogRhythm operates in North and South America, Europe, India, the Middle East, Turkey, Africa, and the Asia Pacific region.
NIST Special Publication 800-92, "Guide to Computer Security Log Management", establishes guidelines and recommendations for securing and managing sensitive log data. The publication was prepared by Karen Kent and Murugiah Souppaya of the National Institute of Standards and Technology and published in the SP 800 series, a repository of best practices for the InfoSec community. Log management is essential to ensuring that computer security records are stored in sufficient detail for an appropriate period of time.