File integrity monitoring

File integrity monitoring (FIM) is an internal control or process that validates the integrity of operating system and application software files by comparing their current state against a known, good baseline. The comparison commonly involves calculating a cryptographic checksum of the file's original, baseline state and comparing it with the checksum of the file's current state. [1] Other file attributes can also be used to monitor integrity. [2]
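As a minimal sketch of that checksum comparison (not tied to any particular FIM product; the file path and baseline digest below are hypothetical), a file's SHA-256 digest can be computed and checked against a recorded baseline value:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 65536) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical baseline captured when the file was known to be good.
BASELINE_DIGESTS = {
    "/etc/ssh/sshd_config": "<digest recorded at baseline time>",
}

def matches_baseline(path_str: str) -> bool:
    """Return True if the file's current digest equals its baseline digest."""
    return sha256_of(Path(path_str)) == BASELINE_DIGESTS.get(path_str)
```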

File integrity monitoring is generally automated using internal controls such as an application or process. Such monitoring can be performed randomly, at a defined polling interval, or in real time.
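A defined polling interval can be implemented, for example, as a loop that periodically re-hashes each monitored file and reports any drift from the baseline. The sketch below is illustrative only (the watch list and interval are assumptions); production FIM agents also track file creation, deletion, and renaming, and typically run as system services or use real-time file-system event APIs.

```python
import hashlib
import time
from pathlib import Path
from typing import Optional

POLL_INTERVAL_SECONDS = 300                      # hypothetical five-minute interval
MONITORED = ["/etc/passwd", "/etc/hosts"]        # hypothetical watch list

def digest(path: str) -> Optional[str]:
    """Return the file's SHA-256 digest, or None if it cannot be read."""
    try:
        return hashlib.sha256(Path(path).read_bytes()).hexdigest()
    except OSError:
        return None

baseline = {p: digest(p) for p in MONITORED}     # capture the known, good state

while True:
    time.sleep(POLL_INTERVAL_SECONDS)
    for p in MONITORED:
        current = digest(p)
        if current != baseline[p]:
            print(f"Integrity change detected: {p}")
            baseline[p] = current                # or alert and keep the old baseline
```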

Security objectives

Changes to configurations, files, and file attributes across the IT infrastructure are common, but hidden within the large volume of daily changes can be the few that affect file or configuration integrity. Such changes can also reduce security posture and, in some cases, may be leading indicators of a breach in progress. Values commonly monitored for unexpected change include credentials, privileges and security settings, file content, core attributes and size, hash values, and configuration values.
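Several of these values can be captured from file metadata alongside a content hash. The following sketch (illustrative only; the field names are this example's own) records attributes that FIM tools commonly watch, such as size, permission bits, ownership, and modification time, and shows how two snapshots can be compared:

```python
import hashlib
import os
import stat
from pathlib import Path

def snapshot(path: str) -> dict:
    """Capture file attributes commonly monitored for unexpected change."""
    st = os.stat(path)
    return {
        "size": st.st_size,                     # core attribute: size
        "mode": stat.filemode(st.st_mode),      # permission bits / security settings
        "uid": st.st_uid,                       # owning user
        "gid": st.st_gid,                       # owning group
        "mtime": st.st_mtime,                   # last modification time
        "sha256": hashlib.sha256(Path(path).read_bytes()).hexdigest(),
    }

def drift(old: dict, new: dict) -> dict:
    """Return only the attributes whose values differ between two snapshots."""
    return {key: (old[key], new[key]) for key in old if old[key] != new[key]}
```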

Compliance objectives

Multiple compliance regimes require file integrity monitoring. Examples include the Payment Card Industry Data Security Standard (PCI DSS), [3] the Sarbanes–Oxley Act (SOX), [4] the NERC Critical Infrastructure Protection standards (NERC CIP), [5] FISMA as implemented through NIST SP 800-53, [6] the HIPAA Security Rule as implemented through NIST SP 800-66, [7] and the SANS/CIS Critical Security Controls. [8]

See also

Procedures and algorithms: checksums, cryptographic hash functions, and file verification.

Applications: examples of software that performs file integrity monitoring include AFICK, [9] CimTrak, [10] Lockpath's Blacklight platform, [11] and tooling from New Net Technologies. [12]

Related Research Articles

Sarbanes–Oxley Act

The Sarbanes–Oxley Act of 2002 is a United States federal law that mandates certain practices in financial record keeping and reporting for corporations.

Health Insurance Portability and Accountability Act

The Health Insurance Portability and Accountability Act of 1996 is a United States Act of Congress enacted by the 104th United States Congress and signed into law by President Bill Clinton on August 21, 1996. It modernized the flow of healthcare information, stipulated how personally identifiable information maintained by the healthcare and healthcare insurance industries should be protected from fraud and theft, and addressed some limitations on healthcare insurance coverage. It generally prohibits healthcare providers and healthcare businesses, called covered entities, from disclosing protected information to anyone other than a patient and the patient's authorized representatives without the patient's consent. With limited exceptions, it does not restrict patients from receiving information about themselves. It does not prohibit patients from voluntarily sharing their health information however they choose, nor does it require confidentiality where a patient discloses medical information to family members, friends, or other individuals who are not part of a covered entity.

Cryptographic hash function

A cryptographic hash function (CHF) is a hash algorithm with properties desirable for cryptographic applications, such as being deterministic and quick to compute while resisting preimage and collision attacks.

File verification is the process of using an algorithm to verify the integrity of a computer file, usually by checksum. This can be done by comparing two files bit by bit, but that requires two copies of the same file and may miss systematic corruption that affects both copies. A more popular approach is to generate a hash of the copied file and compare it with the hash of the original file.
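As a hedged illustration of the two approaches just described (the file names and digest are hypothetical), the sketch below compares two copies bit by bit and then verifies by hash; the hash route needs only the original's published digest, not a second copy of the file:

```python
import filecmp
import hashlib
from pathlib import Path

# Bit-by-bit comparison: requires both copies to be present locally.
identical = filecmp.cmp("original.iso", "copy.iso", shallow=False)

# Hash comparison: only the original's published digest is needed.
published_sha256 = "<digest published by the distributor>"  # hypothetical value
copy_sha256 = hashlib.sha256(Path("copy.iso").read_bytes()).hexdigest()
verified = (copy_sha256 == published_sha256)
```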

In general, compliance means conforming to a rule, such as a specification, policy, standard or law. Compliance has traditionally been explained by reference to the deterrence theory, according to which punishing a behavior will decrease the violations both by the wrongdoer and by others. This view has been supported by economic theory, which has framed punishment in terms of costs and has explained compliance in terms of a cost-benefit equilibrium. However, psychological research on motivation provides an alternative view: granting rewards or imposing fines for a certain behavior is a form of extrinsic motivation that weakens intrinsic motivation and ultimately undermines compliance.

Federal Information Security Management Act of 2002

The Federal Information Security Management Act of 2002 is a United States federal law enacted in 2002 as Title III of the E-Government Act of 2002. The act recognized the importance of information security to the economic and national security interests of the United States. The act requires each federal agency to develop, document, and implement an agency-wide program to provide information security for the information and information systems that support the operations and assets of the agency, including those provided or managed by another agency, contractor, or other source.

In business and accounting, information technology controls are specific activities performed by persons or systems designed to ensure that business objectives are met. They are a subset of an enterprise's internal control. IT control objectives relate to the confidentiality, integrity, and availability of data and the overall management of the IT function of the business enterprise. IT controls are often described in two categories: IT general controls (ITGC) and IT application controls. ITGC include controls over the Information Technology (IT) environment, computer operations, access to programs and data, program development and program changes. IT application controls refer to transaction processing controls, sometimes called "input-processing-output" controls. Information technology controls have been given increased prominence in corporations listed in the United States by the Sarbanes-Oxley Act. The COBIT Framework is a widely used framework promulgated by the IT Governance Institute, which defines a variety of ITGC and application control objectives and recommended evaluation approaches. IT departments in organizations are often led by a chief information officer (CIO), who is responsible for ensuring effective information technology controls are utilized.

IT security standards

IT security standards or cyber security standards are techniques generally outlined in published materials that attempt to protect the cyber environment of a user or organization. This environment includes users themselves, networks, devices, all software, processes, information in storage or transit, applications, services, and systems that can be connected directly or indirectly to networks.

In computing, off-site data protection, or vaulting, is the strategy of sending critical data out of the main location as part of a disaster recovery plan. Data is usually transported off-site using removable storage media such as magnetic tape or optical storage. Data can also be sent electronically via a remote backup service, which is known as electronic vaulting or e-vaulting. Sending backups off-site ensures systems and servers can be reloaded with the latest data in the event of a disaster, accidental error, or system crash. Sending backups off-site also ensures that there is a copy of pertinent data that is not stored on-site.

Data governance is a term used on both a macro and a micro level. The former is a political concept and forms part of international relations and Internet governance; the latter is a data management concept and forms part of corporate data governance.

Internal control, as defined in accounting and auditing, is a process for assuring achievement of an organization's objectives in operational effectiveness and efficiency, reliable financial reporting, and compliance with laws, regulations, and policies. A broad concept, internal control involves everything that controls risks to an organization.

The Security Content Automation Protocol (SCAP) is a method for using specific standards to enable automated vulnerability management, measurement, and policy compliance evaluation of systems deployed in an organization, including, for example, FISMA compliance. The National Vulnerability Database (NVD) is the U.S. government content repository for SCAP. An example of an implementation of SCAP is OpenSCAP.

LogLogic is a technology company that specializes in Security Management, Compliance Reporting, and IT Operations products. LogLogic developed the first appliance-based log management platform. LogLogic's Log Management platform collects and correlates user activity and event data. LogLogic's products are used by many of the world's largest enterprises to rapidly identify and alert on compliance violations, policy breaches, cyber attacks, and insider threats.

The Log Management Knowledge Base is a free database of detailed descriptions on over 20,000 event logs generated by Windows systems, syslog devices and applications. Provided as a free service to the IT community by Prism Microsystems, the aim of the Knowledge Base is to help IT personnel make sense of the large amounts of cryptic and arcane log data generated by network systems and IT infrastructures.

Database activity monitoring (DAM) is a database security technology for monitoring and analyzing database activity. DAM may combine data from network-based monitoring and native audit information to provide a comprehensive picture of database activity. The data gathered by DAM is used to analyze and report on database activity, support breach investigations, and alert on anomalies. DAM is typically performed continuously and in real time.

Managed Trusted Internet Protocol Service (MTIPS) was developed by the US General Services Administration (GSA) to allow US Federal agencies to physically and logically connect to the public Internet and other external connections in compliance with the Office of Management and Budget's (OMB) Trusted Internet Connection (TIC) Initiative.

Security information and event management

Security information and event management (SIEM) is a field within computer security in which software products and services combine security information management (SIM) and security event management (SEM). They provide real-time analysis of security alerts generated by applications and network hardware. Vendors sell SIEM as software, as appliances, or as managed services; these products are also used to log security data and generate reports for compliance purposes. The term and initialism SIEM were coined by Mark Nicolett and Amrit Williams of Gartner in 2005.

Control system security, or industrial control system (ICS) cybersecurity, is the prevention of interference with the proper operation of industrial automation and control systems. These control systems manage essential services including electricity, petroleum production, water, transportation, manufacturing, and communications. They rely on computers, networks, operating systems, applications, and programmable controllers, each of which could contain security vulnerabilities. The 2010 discovery of the Stuxnet worm demonstrated the vulnerability of these systems to cyber incidents. The United States and other governments have passed cyber-security regulations requiring enhanced protection for control systems operating critical infrastructure.

LogRhythm, Inc. is a global security intelligence company that specializes in Security Information and Event Management (SIEM), log management, network monitoring, and user behavior and security analytics. Headquartered in Boulder, Colorado, LogRhythm operates in North and South America; Europe; India, the Middle East, Turkey, and Africa; and the Asia Pacific region.

NIST Special Publication 800-92, "Guide to Computer Security Log Management", establishes guidelines and recommendations for securing and managing sensitive log data. The publication was prepared by Karen Kent and Murugiah Souppaya of the National Institute of Standards and Technology and published in the SP 800 series, a repository of best practices for the information security community. Log management is essential to ensuring that computer security records are stored in sufficient detail for an appropriate period of time.

References

  1. "Verisys - How it Works". Ionx. Retrieved 2012-09-21.
  2. "File Integrity Monitoring". nCircle. Archived from the original on 2012-04-10. Retrieved 2012-04-18.
  3. "Payment Card Industry Data Security Standard" (PDF). PCI Security Council. Retrieved 2011-10-11.
  4. "Sarbanes-Oxley Sections 302 & 404 - A White Paper Proposing Practival, Cost Effective Compliance Strategies" (PDF). Card Decisions, Inc. Retrieved 2011-10-11.
  5. "Standard CIP-010-2 - Security Configuration, Change Management and Vulnerability Assessments". North American Electric Reliability Corporation (NERC). Retrieved 2016-06-06.
  6. "Applying NIST SP 800-53 to Industrial Control Systems" (PDF). National Institute of Standards and Technology (NIST). Retrieved 2011-10-11.
  7. Scholl, M. A.; Stine, K. M.; Hash, J.; Bowen, P.; Johnson, L. A.; Smith, C. D.; Steinberg, D. I. (2008). "An Introductory Resource Guide for Implementing the Health Insurance Portability and Accountability Act (HIPAA) Security Rule" (PDF). National Institute of Standards and Technology. doi:10.6028/NIST.SP.800-66r1. Retrieved 2011-10-11.
  8. "Critical Control 3: Secure Configurations for Hardware and Software on Mobile Devices, Laptops, Workstations, and Servers". SANS Institute. Archived from the original on 2013-01-08. Retrieved 2012-11-19.
  9. "AFICK (Another File Integrity ChecKer)". afick.sourceforge.net/. Retrieved 2020-01-19.
  10. "CimTrak Integrity Suite | Cimcor". www.cimcor.com. Retrieved 2022-07-21.
  11. "Lockpath Announces Significant Updates to Blacklight Platform". finance.yahoo.com. Retrieved 2019-07-16.
  12. "Mark Kerrison of 'New Net Technologies' Explains "SecureOps" and Vulnerability Tracking | TechNadu". www.technadu.com. Retrieved 2020-10-14.