Information quality (IQ) is the quality of the content of information systems. It is often pragmatically defined as "the fitness for use of the information provided". IQ frameworks also provide a tangible approach to assessing and measuring DQ/IQ in a robust and rigorous manner. [1]
Although this pragmatic definition is usable for most everyday purposes, specialists often use more complex models for information quality. Most information system practitioners use the term synonymously with data quality. However, as many academics make a distinction between data and information, [2] some define information quality as the process of guaranteeing confidence that particular information meets some context-specific quality requirements. It has been suggested, however, that the higher the quality, the greater the confidence in meeting more general, less specific contexts. [3]
"Information quality" is a measure of the value which the information provides to the user of that information. [1] "Quality" is often perceived as subjective and the quality of information can then vary among users and among uses of the information. Nevertheless, a high degree of quality increases its objectivity or at least the intersubjectivity. Accuracy can be seen as just one element of IQ but, depending upon how it is defined, can also be seen as encompassing many other dimensions of quality.
If accuracy is not defined this broadly, it is often perceived that there is a trade-off between accuracy and the other dimensions, aspects or elements of the information that determine its suitability for any given task. Richard Wang and Diane Strong propose a list of dimensions or elements used in assessing information quality, grouped into intrinsic, contextual, representational, and accessibility categories. [4]
Other authors propose similar but different lists of dimensions for analysis, and emphasize measurement and reporting as information quality metrics. Larry English prefers the term "characteristics" to dimensions. [6] In fact, a considerable amount of information quality research involves investigating and describing various categories of desirable attributes (or dimensions) of data. Research has recently shown the huge diversity of terms and classification structures used. [7]
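As an illustration of how such dimension-based assessment and measurement might be reported in practice, here is a minimal sketch that aggregates per-dimension ratings into a single weighted score. The dimension names, weights, and ratings are hypothetical and are not taken from any particular framework.

```python
# Minimal sketch: combine hypothetical per-dimension ratings (0.0-1.0)
# into a single weighted information-quality score for reporting.

def iq_score(ratings: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average of the dimension ratings."""
    total_weight = sum(weights[d] for d in ratings)
    return sum(ratings[d] * weights[d] for d in ratings) / total_weight

# Hypothetical dimensions, weights and ratings chosen only for this example.
weights = {"accuracy": 0.4, "completeness": 0.3, "timeliness": 0.2, "consistency": 0.1}
ratings = {"accuracy": 0.9, "completeness": 0.7, "timeliness": 0.5, "consistency": 1.0}

print(f"Overall IQ score: {iq_score(ratings, weights):.2f}")  # 0.77
```

In a real assessment the ratings would come from profiling rules, user surveys, or audits, and the weights would reflect the importance of each dimension for the task at hand.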
While "information" as a distinct term has various and sometimes ambiguous definitions, one of the more general is "a description of events". While the occurrences being described cannot themselves be evaluated for quality, since they are autonomous events in space and time, their description can, because it acquires additional attributes unavoidably attached by the medium that has carried the information from the initial moment of the occurrences being described.
In an attempt to deal with this natural phenomenon, researchers and other qualified professionals have identified particular metrics for information quality. These could also be described as "quality traits" of information, since they are not easily quantified but are instead identified subjectively on an individual basis.
Source: [1]
Authority refers to the expertise or recognized official status of a source. Consider the reputation of the author and publisher. When working with legal or government information, consider whether the source is the official provider of the information. Verifiability refers to the ability of a reader to verify the validity of the information irrespective of how authoritative the source is. Verifying the facts is part of the journalist's duty of care, as is, where possible, providing the sources of the information so that they can be verified.
Scope of coverage refers to the extent to which a source explores a topic. Consider time periods, geography or jurisdiction and coverage of related or narrower topics.
Composition and organization has to do with the ability of the information source to present its particular message in a coherent, logically sequential manner.
Objectivity concerns the bias or opinion expressed when a writer interprets or analyzes facts. Consider the use of persuasive language, the source's presentation of other viewpoints, its reason for providing the information, and advertising.
Validity of some information has to do with the degree of obvious truthfulness which the information carries.
As much as the ‘uniqueness’ of a given piece of information is intuitive in meaning, it significantly implies not only the originating point of the information but also the manner in which it is presented, and thus the perception which it conjures. The essence of any piece of information we process consists to a large extent of those two elements.
Timeliness refers to information that is current at the time of publication. Consider publication, creation and revision dates. Beware of Web site scripting that automatically reflects the current day's date on a page.
Reproducibility means that documented methods are capable of being used on the same data set to achieve a consistent result.
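A minimal sketch of what such a reproducibility check could look like: the same documented method, here a deliberately seeded bootstrap estimate, is applied twice to the same data set and the two results are compared. The method and the data are invented for illustration.

```python
import random

def documented_method(data: list[float], seed: int = 42) -> float:
    """Hypothetical documented method: a seeded bootstrap estimate of the mean."""
    rng = random.Random(seed)  # a fixed seed makes the procedure deterministic
    resamples = [sum(rng.choices(data, k=len(data))) / len(data) for _ in range(1000)]
    return sum(resamples) / len(resamples)

data = [2.0, 3.5, 1.2, 4.8, 3.3]

# Reproducibility: applying the documented method to the same data set
# a second time must yield a consistent result.
assert documented_method(data) == documented_method(data)
print("Reproduced result:", documented_method(data))
```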
A number of major conferences relevant to information quality are held annually, including the International Conference on Information Quality (ICIQ).
An audit is an "independent examination of financial information of any entity, whether profit oriented or not, irrespective of its size or legal form when such an examination is conducted with a view to express an opinion thereon." Auditing also attempts to ensure that the books of accounts are properly maintained by the concern as required by law. Auditors consider the propositions before them, obtain evidence, roll forward prior year working papers, and evaluate the propositions in their auditing report.
In engineering, a requirement is a condition that must be satisfied for the output of a work effort to be acceptable. It is an explicit, objective, clear and often quantitative description of a condition to be satisfied by a material, design, product, or service.
In systems engineering, dependability is a measure of a system's availability, reliability, maintainability, and in some cases, other characteristics such as durability, safety and security. In real-time computing, dependability is the ability to provide services that can be trusted within a time-period. The service guarantees must hold even when the system is subject to attacks or natural failures.
The Logical Framework Approach (LFA) is a methodology mainly used for designing, monitoring, and evaluating international development projects. Variations of this tool are known as Goal Oriented Project Planning (GOPP) or Objectives Oriented Project Planning (OOPP).
Data quality refers to the state of qualitative or quantitative pieces of information. There are many definitions of data quality, but data is generally considered high quality if it is "fit for [its] intended uses in operations, decision making and planning". Moreover, data is deemed of high quality if it correctly represents the real-world construct to which it refers. Furthermore, apart from these definitions, as the number of data sources increases, the question of internal data consistency becomes significant, regardless of fitness for use for any particular external purpose. People's views on data quality can often be in disagreement, even when discussing the same set of data used for the same purpose. When this is the case, data governance is used to form agreed upon definitions and standards for data quality. In such cases, data cleansing, including standardization, may be required in order to ensure data quality.
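A minimal sketch of an internal-consistency check across two hypothetical data sources: records that describe the same entity are compared field by field and disagreements are reported for cleansing. The systems, field names, and values are made up for illustration.

```python
# Hypothetical records for the same customer held in two source systems.
crm_record     = {"customer_id": 1001, "email": "a.smith@example.com", "country": "DE"}
billing_record = {"customer_id": 1001, "email": "a.smith@example.com", "country": "Germany"}

def inconsistent_fields(a: dict, b: dict) -> list[str]:
    """Return the fields present in both records whose values disagree."""
    return sorted(k for k in a.keys() & b.keys() if a[k] != b[k])

# Values such as "DE" versus "Germany" are exactly what standardization
# during data cleansing is meant to reconcile.
print(inconsistent_fields(crm_record, billing_record))  # ['country']
```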
An information technology audit, or information systems audit, is an examination of the management controls within an Information technology (IT) infrastructure and business applications. The evaluation of evidence obtained determines if the information systems are safeguarding assets, maintaining data integrity, and operating effectively to achieve the organization's goals or objectives. These reviews may be performed in conjunction with a financial statement audit, internal audit, or other form of attestation engagement.
Information technology controls are specific activities performed by persons or systems to ensure that computer systems operate in a way that minimises risk. They are a subset of an organisation's internal control. IT control objectives typically relate to assuring the confidentiality, integrity, and availability of data and the overall management of the IT function. IT controls are often described in two categories: IT general controls (ITGC) and IT application controls. ITGC includes controls over the hardware, system software, operational processes, access to programs and data, program development and program changes. IT application controls refer to controls to ensure the integrity of the information processed by the IT environment. Information technology controls have been given increased prominence in corporations listed in the United States by the Sarbanes-Oxley Act. The COBIT Framework is a widely used framework promulgated by the IT Governance Institute, which defines a variety of ITGC and application control objectives and recommended evaluation approaches.
Video quality is a characteristic of a video passed through a video transmission or processing system that describes perceived video degradation. Video processing systems may introduce some amount of distortion or artifacts in the video signal that negatively impact the user's perception of the system. For many stakeholders in video production and distribution, ensuring video quality is an important task.
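One simple, widely used full-reference measure of such degradation is the peak signal-to-noise ratio (PSNR) between a reference frame and the processed frame. The sketch below computes it with NumPy; the frames are synthetic placeholders for real video data.

```python
import numpy as np

def psnr(reference: np.ndarray, distorted: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB between two same-sized 8-bit frames."""
    mse = np.mean((reference.astype(np.float64) - distorted.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical frames
    return 10.0 * np.log10(peak ** 2 / mse)

# Synthetic stand-ins for a reference frame and a mildly degraded frame.
rng = np.random.default_rng(0)
reference = rng.integers(0, 256, size=(480, 640), dtype=np.uint8)
noise = rng.integers(-5, 6, size=reference.shape)
distorted = np.clip(reference.astype(np.int16) + noise, 0, 255).astype(np.uint8)

print(f"PSNR: {psnr(reference, distorted):.2f} dB")
```

Higher PSNR values indicate frames closer to the reference, although PSNR correlates only loosely with perceived quality, which is why perceptual metrics are also used.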
Software visualization or software visualisation refers to the visualization of information of and related to software systems—either the architecture of its source code or metrics of their runtime behavior—and their development process by means of static, interactive or animated 2-D or 3-D visual representations of their structure, execution, behavior, and evolution.
Validation is the process of establishing documentary evidence demonstrating that a procedure, process, or activity carried out in testing and then production maintains the desired level of compliance at all stages. In the pharmaceutical industry, it is very important that, in addition to final testing and compliance of products, it is also assured that the process will consistently produce the expected results. The desired results are established in terms of specifications for the outcome of the process. Qualification of systems and equipment is therefore a part of the process of validation. Validation is a requirement of food, drug and pharmaceutical regulating agencies such as the US FDA and their good manufacturing practices guidelines. Since a wide variety of procedures, processes, and activities need to be validated, the field of validation is divided into a number of subsections.
Internal auditing is an independent, objective assurance and consulting activity designed to add value and improve an organization's operations. It helps an organization accomplish its objectives by bringing a systematic, disciplined approach to evaluate and improve the effectiveness of risk management, control and governance processes. Internal auditing might achieve this goal by providing insight and recommendations based on analyses and assessments of data and business processes. With commitment to integrity and accountability, internal auditing provides value to governing bodies and senior management as an objective source of independent advice. Professionals called internal auditors are employed by organizations to perform the internal auditing activity.
Data governance is a term used on both a macro and a micro level. The former is a political concept and forms part of international relations and Internet governance; the latter is a data management concept and forms part of corporate data governance.
Sentiment analysis is the use of natural language processing, text analysis, computational linguistics, and biometrics to systematically identify, extract, quantify, and study affective states and subjective information. Sentiment analysis is widely applied to voice of the customer materials such as reviews and survey responses, online and social media, and healthcare materials for applications that range from marketing to customer service to clinical medicine. With the rise of deep language models, such as RoBERTa, also more difficult data domains can be analyzed, e.g., news texts where authors typically express their opinion/sentiment less explicitly.
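As a minimal sketch of how such an analysis is commonly run, the example below uses the Hugging Face transformers pipeline API, which downloads a pretrained sentiment classifier (DistilBERT-based by default); a RoBERTa-based checkpoint can be substituted via the model argument. The example sentences are invented.

```python
# Sketch assuming the Hugging Face `transformers` library is installed.
from transformers import pipeline

# The default checkpoint is DistilBERT-based; a RoBERTa-based model can be
# passed via the `model` argument instead.
classifier = pipeline("sentiment-analysis")

reviews = [
    "The battery life is fantastic and the screen is gorgeous.",
    "Support never answered my ticket; I want a refund.",
]
for review, result in zip(reviews, classifier(reviews)):
    print(f"{result['label']:>8}  {result['score']:.3f}  {review}")
```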
Perceptual Evaluation of Audio Quality (PEAQ) is a standardized algorithm for objectively measuring perceived audio quality, developed in 1994–1998 by a joint venture of experts within Task Group 6Q of the International Telecommunication Union's Radiocommunication Sector (ITU-R). It was originally released as ITU-R Recommendation BS.1387 in 1998 and last updated in 2023. It utilizes software to simulate perceptual properties of the human ear and then integrates multiple model output variables into a single metric.
Image quality can refer to the level of accuracy with which different imaging systems capture, process, store, compress, transmit and display the signals that form an image. Another definition refers to image quality as "the weighted combination of all of the visually significant attributes of an image". The difference between the two definitions is that the former focuses on the characteristics of signal processing in different imaging systems and the latter on the perceptual assessments that make an image pleasant for human viewers.
Master data management (MDM) is a discipline in which business and information technology work together to ensure the uniformity, accuracy, stewardship, semantic consistency and accountability of the enterprise's official shared master data assets.
Verification and validation of computer simulation models is conducted during the development of a simulation model with the ultimate goal of producing an accurate and credible model. "Simulation models are increasingly being used to solve problems and to aid in decision-making. The developers and users of these models, the decision makers using information obtained from the results of these models, and the individuals affected by decisions based on such models are all rightly concerned with whether a model and its results are 'correct'." This concern is addressed through verification and validation of the simulation model.
Video Multimethod Assessment Fusion (VMAF) is an objective full-reference video quality metric developed by Netflix in cooperation with the University of Southern California, The IPI/LS2N lab Nantes Université, and the Laboratory for Image and Video Engineering (LIVE) at The University of Texas at Austin. It predicts subjective video quality based on a reference and distorted video sequence. The metric can be used to evaluate the quality of different video codecs, encoders, encoding settings, or transmission variants.
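As an illustration of how the metric is often computed in practice, the sketch below calls FFmpeg's libvmaf filter from Python; it assumes an FFmpeg build compiled with libvmaf support, and the file names are placeholders.

```python
import subprocess

# Assumes an FFmpeg build with libvmaf; file names are placeholders.
# With the libvmaf filter, the first input is the distorted clip and
# the second input is the reference clip.
cmd = [
    "ffmpeg", "-i", "distorted.mp4", "-i", "reference.mp4",
    "-lavfi", "libvmaf=log_path=vmaf.json:log_fmt=json",
    "-f", "null", "-",
]
subprocess.run(cmd, check=True)
print("Per-frame and pooled VMAF scores written to vmaf.json")
```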
Wikipedia's volunteer editor community has the responsibility of fact-checking Wikipedia's content. Their aim is to curb the dissemination of misinformation and disinformation by the website.
Richard Y. Wang is the founder and executive director of the Chief Data Officer and Information Quality (CDOIQ) Program at the Massachusetts Institute of Technology. Wang is widely acknowledged as the "Founder of Information Quality", the scholar who made information quality an established field. For the past three decades, he has advocated that the importance of information quality must be embraced at the highest level of organizations. He championed and led a movement to establish the position of Chief Data Officer in all organizations, and his pioneering work culminated in wide-scale adoption of the Chief Data Officer role worldwide. Notably, in 2019, the U.S. Congress enacted the Foundations for Evidence-Based Policymaking Act of 2018 into law, which statutorily mandated all federal agencies to establish and appoint a CDO.