Analytical quality control

Analytical quality control (AQC) refers to the processes and procedures designed to ensure that the results of laboratory analysis are consistent, comparable, accurate, and within specified limits of precision. [1] Constituents submitted to the analytical laboratory must be accurately described to avoid faulty interpretations, approximations, or incorrect results. [2] The qualitative and quantitative data generated by the laboratory can then be used for decision making. In the chemical sense, quantitative analysis refers to the measurement of the amount or concentration of an element or chemical compound in a matrix that differs from the analyte itself. [3] Fields such as industry, medicine, and law enforcement make use of AQC.

In the laboratory

AQC processes are of particular importance in laboratories analysing environmental samples where the concentration of chemical species present may be extremely low and close to the detection limit of the analytical method. In well managed laboratories, AQC processes are built into the routine operations of the laboratory often by the random introduction of known standards into the sample stream or by the use of spiked samples.
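The spiked-sample check mentioned above can be sketched as a percent-recovery calculation. The concentrations and the 80–120% acceptance window below are illustrative assumptions, not values from any particular method:

```python
def percent_recovery(spiked_result, unspiked_result, spike_added):
    """Percent recovery of a known spike: 100 * (found - native) / added."""
    return 100.0 * (spiked_result - unspiked_result) / spike_added

# Illustrative run: the sample reads 2.1 ug/L alone and 6.9 ug/L after a
# 5.0 ug/L spike, giving a recovery close to 96%.
recovery = percent_recovery(6.9, 2.1, 5.0)

# Hypothetical acceptance window; real limits are method-specific.
within_limits = 80.0 <= recovery <= 120.0
```

A recovery well below or above the window suggests matrix interference or a systematic bias in the method.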

Quality control begins with sample collection and ends with the reporting of data. [4] AQC is achieved through laboratory control of analytical performance. Initial control of the complete system can be achieved through specification of laboratory services, instrumentation, glassware, reagents, solvents, and gases. However, daily performance must be documented to ensure the continual production of valid data. An initial check should verify that the data produced are precise and accurate. Systematic daily checks, such as analysing blanks, calibration standards, quality control check samples, and reference materials, must then be performed to establish the reproducibility of the data. These checks help certify that the methodology is measuring what is actually in the sample.
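The daily calibration-standard and quality control check-sample steps can be sketched as a least-squares calibration fit followed by a read-back check. All concentrations, responses, and the 5% tolerance here are hypothetical:

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for a calibration curve."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

# Hypothetical daily calibration: known concentrations vs instrument response.
conc = [0.0, 1.0, 2.0, 5.0, 10.0]
resp = [0.02, 1.01, 2.05, 4.98, 10.03]
slope, intercept = fit_line(conc, resp)

# A QC check sample of known concentration 4.0 should read back within 5%.
qc_found = (4.01 - intercept) / slope
qc_passes = abs(qc_found - 4.0) / 4.0 <= 0.05
```

A failed read-back would trigger recalibration before any sample results are reported.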

The quality of individual AQC efforts can vary with an analyst's training, professional pride, and the importance of a particular project. The burden on individual analysts to originate AQC efforts can be lessened through the implementation of quality assurance programs. Established, routine quality assurance programs fulfill two primary functions: the determination of quality and the control of quality. By monitoring the accuracy and precision of results, a quality assurance program should increase confidence in the reliability of the reported analytical results, thereby achieving adequate AQC.

Pharmaceutical industry

Validation of analytical procedures is imperative in demonstrating that a drug substance is suitable for a particular purpose. [5] Common validation characteristics include: accuracy, precision (repeatability and intermediate precision), specificity, detection limit, quantitation limit, linearity, range, and robustness. In cases such as changes in synthesis of the drug substance, changes in composition of the finished product, and changes in the analytical procedure, revalidation is necessary to ensure quality control.
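The detection and quantitation limits listed above are often estimated from the calibration curve using the formulas in the ICH validation guideline [5]: LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation of the calibration line and S its slope. A minimal sketch with illustrative numbers:

```python
def detection_limits(residual_sd, slope):
    """ICH-style estimates: LOD = 3.3 * sigma / S, LOQ = 10 * sigma / S."""
    lod = 3.3 * residual_sd / slope
    loq = 10.0 * residual_sd / slope
    return lod, loq

# Illustrative values: sigma = 0.02 response units, slope = 0.5 units per
# ug/mL, giving LOD = 0.132 ug/mL and LOQ = 0.4 ug/mL.
lod, loq = detection_limits(0.02, 0.5)
```

The guideline also allows estimating σ from the standard deviation of blank measurements; the calibration-residual form is shown here for concreteness.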

All analytical procedures should be validated. Identification tests ensure the identity of an analyte in a sample by comparing it to a reference standard through properties such as spectra, chromatographic behavior, and chemical reactivity. [5] Impurity testing can be either a quantitative test or a limit test; both should accurately reflect the purity of the sample. Quantitative tests of the active moiety or of other components of a sample can be conducted through assay procedures. Other analytical procedures, such as dissolution testing or particle size determination, may also need to be validated and are equally important.

Statistics

Because of the complex interrelationship between analytical method, sample concentration, limits of detection, and method precision, the management of analytical quality control is undertaken using a statistical approach to determine whether the results obtained lie within an acceptable statistical envelope.
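One common form of such a statistical envelope is a Shewhart-style control chart, with limits set at the mean plus or minus three standard deviations of historical quality control results. A minimal sketch with illustrative data:

```python
import statistics

def control_limits(historical_results, k=3.0):
    """Lower and upper control limits: mean +/- k standard deviations."""
    mean = statistics.fmean(historical_results)
    sd = statistics.stdev(historical_results)
    return mean - k * sd, mean + k * sd

# Illustrative historical QC results for a check standard.
history = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 9.9, 10.0]
lower, upper = control_limits(history)

# Flag new results falling outside the envelope.
out_of_control = [x for x in [10.05, 11.2] if not (lower <= x <= upper)]
```

Tighter warning limits (for example, mean plus or minus two standard deviations) are often tracked alongside the action limits shown here.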

Inter-laboratory calibration

In circumstances where more than one laboratory is analysing samples and feeding data into a large programme of work such as the Harmonised monitoring scheme in the UK, AQC can also be applied to validate one laboratory against another. In such cases the work may be referred to as inter-laboratory calibration. [6]
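Such inter-laboratory comparisons are commonly scored with z-scores, z = (x − x_assigned)/σ_p. The assigned value, σ_p, and the |z| ≥ 3 action threshold below are illustrative of common proficiency-testing practice and are not taken from the Harmonised monitoring scheme itself:

```python
def z_score(lab_result, assigned_value, sd_for_assessment):
    """Proficiency-testing z-score: (x - assigned) / sigma_p."""
    return (lab_result - assigned_value) / sd_for_assessment

# Hypothetical round: assigned chloride value 25.0 mg/L, sigma_p = 1.0 mg/L.
results = {"lab_A": 24.6, "lab_B": 27.4, "lab_C": 21.5}
scores = {lab: z_score(x, 25.0, 1.0) for lab, x in results.items()}

# A common convention: |z| <= 2 satisfactory, 2 < |z| < 3 questionable,
# |z| >= 3 unsatisfactory.
flagged = sorted(lab for lab, z in scores.items() if abs(z) >= 3.0)
```

A flagged laboratory would typically investigate its method and re-analyse before its data are accepted into the shared programme.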

Related Research Articles

Spectrophotometry

Spectrophotometry is a branch of electromagnetic spectroscopy concerned with the quantitative measurement of the reflection or transmission properties of a material as a function of wavelength. Spectrophotometry uses photometers, known as spectrophotometers, that can measure the intensity of a light beam at different wavelengths. Although spectrophotometry is most commonly applied to ultraviolet, visible, and infrared radiation, modern spectrophotometers can interrogate wide swaths of the electromagnetic spectrum, including x-ray, ultraviolet, visible, infrared, and/or microwave wavelengths.

An assay is an investigative (analytic) procedure in laboratory medicine, mining, pharmacology, environmental biology and molecular biology for qualitatively assessing or quantitatively measuring the presence, amount, or functional activity of a target entity. The measured entity is often called the analyte, the measurand, or the target of the assay. The analyte can be a drug, biochemical substance, chemical element or compound, or cell in an organism or organic sample. An assay usually aims to measure an analyte's intensive property and express it in the relevant measurement unit.

Total organic carbon

Total organic carbon (TOC) is an analytical parameter representing the concentration of organic carbon in a sample. TOC determinations are made in a variety of application areas. For example, TOC may be used as a non-specific indicator of water quality, or the TOC of source rock may be used as one factor in evaluating a petroleum play. For marine surface sediments, average TOC content is 0.5% in the deep ocean and 2% along the eastern margins.

Forensic chemistry

Forensic chemistry is the application of chemistry and its subfield, forensic toxicology, in a legal setting. A forensic chemist can assist in the identification of unknown materials found at a crime scene. Specialists in this field have a wide array of methods and instruments to help identify unknown substances. These include high-performance liquid chromatography, gas chromatography-mass spectrometry, atomic absorption spectroscopy, Fourier transform infrared spectroscopy, and thin layer chromatography. The range of different methods is important due to the destructive nature of some instruments and the number of possible unknown substances that can be found at a scene. Forensic chemists prefer using nondestructive methods first, to preserve evidence and to determine which destructive methods will produce the best results.

The limit of detection is the lowest signal, or the lowest corresponding quantity to be determined from the signal, that can be observed with a sufficient degree of confidence or statistical significance. However, the exact threshold used to decide when a signal significantly emerges above the continuously fluctuating background noise remains arbitrary and is a matter of policy and often of debate among scientists, statisticians and regulators depending on the stakes in different fields.

Data analysis

Data analysis is the process of inspecting, cleansing, transforming, and modeling data with the goal of discovering useful information, informing conclusions, and supporting decision-making. Data analysis has multiple facets and approaches, encompassing diverse techniques under a variety of names, and is used in different business, science, and social science domains. In today's business world, data analysis plays a role in making decisions more scientific and helping businesses operate more effectively.

Validation is the process of establishing documentary evidence demonstrating that a procedure, process, or activity carried out in testing and then production maintains the desired level of compliance at all stages. In the pharmaceutical industry, it is very important that, in addition to final testing and compliance of products, it is also assured that the process will consistently produce the expected results. The desired results are established in terms of specifications for the outcome of the process. Qualification of systems and equipment is therefore a part of the process of validation. Validation is a requirement of food, drug, and pharmaceutical regulating agencies such as the US FDA and their good manufacturing practices guidelines. Since a wide variety of procedures, processes, and activities need to be validated, the field of validation is divided into a number of subsections.

A test method is a method for a test in science or engineering, such as a physical test, chemical test, or statistical test. It is a definitive procedure that produces a test result. In order to ensure accurate and relevant test results, a test method should be "explicit, unambiguous, and experimentally feasible", as well as effective and reproducible.

Data collection

Data collection or data gathering is the process of gathering and measuring information on targeted variables in an established system, which then enables one to answer relevant questions and evaluate outcomes. Data collection is a research component in all study fields, including physical and social sciences, humanities, and business. While methods vary by discipline, the emphasis on ensuring accurate and honest collection remains the same. The goal for all data collection is to capture evidence that allows data analysis to lead to the formulation of credible answers to the questions that have been posed.

Verification and validation are independent procedures that are used together for checking that a product, service, or system meets requirements and specifications and that it fulfills its intended purpose. These are critical components of a quality management system such as ISO 9000. The words "verification" and "validation" are sometimes preceded with "independent", indicating that the verification and validation is to be performed by a disinterested third party. "Independent verification and validation" can be abbreviated as "IV&V".

Food sampling

Food sampling is a process used to check that a food is safe and that it does not contain harmful contaminants, or that it contains only permitted additives at acceptable levels, or that it contains the right levels of key ingredients and its label declarations are correct, or to know the levels of nutrients present.

Laboratory quality control is designed to detect, reduce, and correct deficiencies in a laboratory's internal analytical process prior to the release of patient results, in order to improve the quality of the results reported by the laboratory. Quality control (QC) is a measure of precision, or how well the measurement system reproduces the same result over time and under varying operating conditions. Laboratory quality control material is usually run at the beginning of each shift, after an instrument is serviced, when reagent lots are changed, after equipment calibration, and whenever patient results seem inappropriate. Quality control material should approximate the same matrix as patient specimens, taking into account properties such as viscosity, turbidity, composition, and color. It should be simple to use, with minimal vial-to-vial variability, because variability could be misinterpreted as systematic error in the method or instrument. It should be stable for long periods of time, and available in large enough quantities for a single batch to last at least one year. Liquid controls are more convenient than lyophilized (freeze-dried) controls because they do not have to be reconstituted, minimizing pipetting error. Dried Tube Specimen (DTS) is slightly cumbersome as a QC material, but it is very low-cost, stable over long periods, and efficient, making it especially useful in resource-restricted settings in under-developed and developing countries. DTS can be manufactured in-house by a laboratory or blood bank for its own use.

Metallurgical assay

A metallurgical assay is a compositional analysis of an ore, metal, or alloy, usually performed in order to test for purity or quality.

Radioanalytical chemistry focuses on the analysis of samples for their radionuclide content. Various methods are employed to purify and identify the radioelement of interest through chemical methods and sample measurement techniques.

Colorado River Watch is a statewide volunteer water quality monitoring program operated by the non-profit organization Earth Force, in collaboration with Colorado Parks and Wildlife. Its mission is to work with voluntary stewards to monitor water quality and other indicators of watershed health, and utilize this high quality data to educate citizens and inform decision makers about the condition of Colorado's waters. This data is also used in the Clean Water Act decision-making process. River Watch's motto is "Real people doing real science for a real purpose."

Cochran's test, named after William G. Cochran, is a one-sided upper limit variance outlier statistical test. The C test is used to decide if a single estimate of a variance is significantly larger than a group of variances with which the single estimate is supposed to be comparable. The C test is discussed in many text books and has been recommended by IUPAC and ISO. Cochran's C test should not be confused with Cochran's Q test, which applies to the analysis of two-way randomized block designs.
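As a sketch, the C statistic is the largest of a set of variances divided by their sum; the variances and the critical value below are illustrative, since real critical values depend on the significance level, the number of laboratories, and the number of replicates, and come from published tables:

```python
def cochran_c(variances):
    """Cochran's C statistic: largest variance divided by the sum of all variances."""
    return max(variances) / sum(variances)

# Illustrative replicate variances from six laboratories.
variances = [0.11, 0.09, 0.12, 0.10, 0.42, 0.08]
c_stat = cochran_c(variances)

# Hypothetical tabulated critical value; a C statistic above it flags the
# largest variance as an outlier.
critical_value = 0.45
suspect = c_stat > critical_value
```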

Process validation is the analysis of data gathered throughout the design and manufacturing of a product in order to confirm that the process can reliably output products of a determined standard. Regulatory authorities like EMA and FDA have published guidelines relating to process validation. The purpose of process validation is to ensure varied inputs lead to consistent and high quality outputs. Process validation is an ongoing process that must be frequently adapted as manufacturing feedback is gathered. End-to-end validation of production processes is essential in determining product quality because quality cannot always be determined by finished-product inspection. Process validation can be broken down into 3 steps: process design, process qualification, and continued process verification.

Wireline quality assurance and quality control is a set of requirements and operating procedures which take place before, during, and after a wireline logging job. The main merits of wireline QA/QC are the accuracy and precision of the recorded data and information. Accuracy is a measure of the correctness of the result and generally depends on how well systematic errors are controlled and compensated for. Precision depends on how well random errors are analysed and overcome.

A certificate of analysis (COA) is a formal laboratory-prepared document that details the results of one or more laboratory analyses, signed—manually or electronically—by an authorized representative of the entity conducting the analyses. This document gives assurances to the recipient that the analyzed item is what it is designated to be, or has the features advertised by the producer. The design and content of a COA may be based upon a set of requirements identified by the lab, by regulatory-driven requirements, and/or by standards developed by standard developing organizations. The COA is used in a wide variety of industries, including but not limited to the agriculture, chemical, clinical research, food and beverage, and pharmaceutical industries.

Workplace exposure monitoring is the monitoring of substances in a workplace that are chemical or biological hazards. It is performed in the context of workplace exposure assessment and risk assessment. Exposure monitoring analyzes hazardous substances in the air or on surfaces of a workplace, and is complementary to biomonitoring, which instead analyzes toxicants or their effects within workers.

References

  1. "Analytical Quality Control (AQC) Program to Ensure the Highest Level of Confidence in Reported Data". Archived March 28, 2012, at the Wayback Machine.
  2. US EPA, ORD (March 1979). "Handbook for Analytical Quality Control in Water and Wastewater Laboratories" (PDF). www.epa.gov. Archived from the original on November 14, 2013.
  3. "An IAEA Service: Analytical Quality Control" (PDF). Archived from the original (PDF) on November 14, 2013.
  4. "Technical Report: Guidance to Operation of Water Quality Laboratories".
  5. "Validation of Analytical Procedures: Text and Methodology".
  6. Analytical Quality Control Committee (January 1, 1979). "Accuracy of determination of chloride in river waters: Analytical Quality Control in the Harmonised Monitoring Scheme". Analyst. 104 (1237): 290–298. doi:10.1039/AN9790400290.