Real-time outbreak and disease surveillance

The Real-time outbreak and disease surveillance system (RODS) is a syndromic surveillance system developed by the University of Pittsburgh's Department of Biomedical Informatics. [1] It has been described as a "prototype developed at the University of Pittsburgh where real-time clinical data from emergency departments within a geographic region can be integrated to provide an instantaneous picture of symptom patterns and early detection of epidemic events." [2]

RODS combines four complementary monitoring tools. [3]

  1. The first tool is a moving average with a sliding 120-day phase I window, i.e. the most recent 120 days of history are used to estimate the baseline and control limits.
  2. The second tool is a nonstandard combination of CUSUM and EWMA, in which an EWMA is used to predict next-day counts and a CUSUM monitors the residuals from these predictions.
  3. The third tool is a recursive least squares (RLS) algorithm, which fits an autoregressive model to the counts and updates its estimates continuously by minimizing prediction error. A Shewhart individuals (I) chart is then applied to the residuals, using a threshold of 4 standard deviations.
  4. The fourth tool implements a wavelet approach, which decomposes the time series using Haar wavelets and uses the lowest-resolution component to remove long-term trends from the raw series. The residuals are then monitored with an ordinary Shewhart I-chart, again with a threshold of 4 standard deviations.
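
The second tool above can be sketched as follows. This is a minimal illustration, not the RODS implementation: the smoothing weight, CUSUM allowance, and decision threshold are illustrative choices, and the residual scale is estimated naively from the whole series rather than from a dedicated baseline period.

```python
import statistics

def ewma_cusum_monitor(counts, lam=0.3, k=0.5, h=5.0):
    """Flag days where a one-sided CUSUM of EWMA prediction residuals
    exceeds the decision threshold h.

    lam is the EWMA smoothing weight; k (allowance) and h are in units
    of the residual standard deviation. All values are illustrative.
    """
    ewma = counts[0]  # seed the predictor with the first count
    residuals = []
    for c in counts[1:]:
        residuals.append(c - ewma)           # today's count minus prediction
        ewma = lam * c + (1 - lam) * ewma    # update the next-day prediction
    sigma = statistics.pstdev(residuals) or 1.0
    alarms, s = [], 0.0
    for day, r in enumerate(residuals, start=1):
        s = max(0.0, s + r / sigma - k)      # upper CUSUM of standardized residuals
        if s > h:
            alarms.append(day)               # day indexes into counts
            s = 0.0                          # restart after signalling
    return alarms

# A flat series with a sudden sustained jump should raise an alarm
# shortly after the jump at day 30.
print(ewma_cusum_monitor([10] * 30 + [25] * 10))
```

In RODS the monitored series would be daily syndrome counts from emergency departments; the sketch works on any numeric sequence.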

Related Research Articles

Biodefense refers to measures to counter biological threats, reduce biological risks, and prepare for, respond to, and recover from bioincidents, whether naturally occurring, accidental, or deliberate in origin and whether impacting human, animal, plant, or environmental health. Biodefense measures often aim to improve biosecurity or biosafety. Biodefense is frequently discussed in the context of biological warfare or bioterrorism, and is generally considered a military or emergency response term.

Audio Video Coding Standard (AVS) refers to the digital audio and digital video series compression standard formulated by the Audio and Video coding standard workgroup of China. Work began in 2002, and three generations of standards were published.

Time series: Sequence of data points over time

In mathematics, a time series is a series of data points indexed in time order. Most commonly, a time series is a sequence taken at successive equally spaced points in time. Thus it is a sequence of discrete-time data. Examples of time series are heights of ocean tides, counts of sunspots, and the daily closing value of the Dow Jones Industrial Average.

Control chart: Process control tool to determine if a manufacturing process is in a state of control

Control charts are graphical plots used in production control to determine whether quality and manufacturing processes are operating under stable conditions. Process status is plotted over time, and abnormalities are judged by data that depart from the established trend or cross a control limit line. Common types include the Shewhart individuals control chart and the CUSUM (cumulative sum) control chart (ISO 7870-4).
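
The Shewhart individuals (I) chart mentioned above, which RODS applies to residuals, can be sketched as follows. The 3-sigma limits and the average-moving-range estimate of sigma are the textbook convention; the data and limits here are illustrative only.

```python
def individuals_chart_limits(data):
    """Center line and 3-sigma limits for a Shewhart individuals chart.

    Sigma is estimated from the average moving range of successive
    points; the constant 1.128 is the standard d2 value for
    subgroups of size 2.
    """
    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    center = sum(data) / len(data)
    sigma = mr_bar / 1.128
    return center - 3 * sigma, center, center + 3 * sigma

def out_of_control(data):
    """Indices of points falling outside the control limits."""
    lcl, _, ucl = individuals_chart_limits(data)
    return [i for i, x in enumerate(data) if x < lcl or x > ucl]

# The single spike at index 6 falls above the upper control limit.
print(out_of_control([10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 13.5, 10.0, 9.9, 10.1]))
# → [6]
```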

Statistical process control (SPC) or statistical quality control (SQC) is the application of statistical methods to monitor and control the quality of a production process. This helps to ensure that the process operates efficiently, producing more specification-conforming products with less waste scrap. SPC can be applied to any process where the "conforming product" output can be measured. Key tools used in SPC include run charts, control charts, a focus on continuous improvement, and the design of experiments. An example of a process where SPC is applied is manufacturing lines.

Receiver autonomous integrity monitoring (RAIM) is a technology developed to assess the integrity of individual signals collected and integrated by the receiver units employed in a Global Navigation Satellite System (GNSS). The integrity of received signals and resulting correctness and precision of derived receiver location are of special importance in safety-critical GNSS applications, such as in aviation or marine navigation.

Prognostics is an engineering discipline focused on predicting the time at which a system or a component will no longer perform its intended function. This lack of performance is most often a failure beyond which the system can no longer be used to meet desired performance. The predicted time then becomes the remaining useful life (RUL), an important concept in decision making for contingency mitigation. Prognostics predicts the future performance of a component by assessing the extent of deviation or degradation of a system from its expected normal operating conditions. The science of prognostics is based on the analysis of failure modes, detection of early signs of wear and aging, and fault conditions. An effective prognostics solution can be implemented when there is sound knowledge of the failure mechanisms likely to cause the degradations leading to eventual system failures. It is therefore necessary to have initial information on the possible failures in a product; such knowledge helps identify the system parameters to be monitored. A potential use for prognostics is condition-based maintenance. The discipline that links studies of failure mechanisms to system lifecycle management is often referred to as prognostics and health management (PHM), sometimes also system health management (SHM) or, in transportation applications, vehicle health management (VHM) or engine health management (EHM). Technical approaches to building prognostic models can be categorized broadly into data-driven, model-based, and hybrid approaches.

Nelson rules: Decision rules for interpreting control-chart data

Nelson rules are a method in process control of determining whether some measured variable is out of control. Rules for detecting "out-of-control" or non-random conditions were first postulated by Walter A. Shewhart in the 1920s. The Nelson rules were first published in the October 1984 issue of the Journal of Quality Technology in an article by Lloyd S. Nelson.

Lossless JPEG is a 1993 addition to the JPEG standard by the Joint Photographic Experts Group to enable lossless compression. However, the term may also be used to refer to all lossless compression schemes developed by the group, including JPEG 2000, JPEG-LS, and JPEG XL.

Disease surveillance: Monitoring spread of disease to establish patterns of progression

Disease surveillance is an epidemiological practice by which the spread of disease is monitored in order to establish patterns of progression. The main role of disease surveillance is to predict, observe, and minimize the harm caused by outbreak, epidemic, and pandemic situations, as well as increase knowledge about which factors contribute to such circumstances. A key part of modern disease surveillance is the practice of disease case reporting.

In data analysis, anomaly detection is generally understood to be the identification of rare items, events or observations which deviate significantly from the majority of the data and do not conform to a well defined notion of normal behavior. Such examples may arouse suspicions of being generated by a different mechanism, or appear inconsistent with the remainder of that set of data.

Analyse-it is a statistical analysis add-in for Microsoft Excel. Analyse-it is the successor to Astute, developed in 1992 for Excel 4 and the first statistical analysis add-in for Microsoft Excel. Analyse-it provides a range of standard parametric and non-parametric procedures, including descriptive statistics, ANOVA, ANCOVA, Mann–Whitney, Wilcoxon, chi-square, correlation, linear regression, logistic regression, polynomial regression and advanced model fitting, principal component analysis, and factor analysis.

In industrial statistics, the X-bar chart is a type of Shewhart control chart that is used to monitor the arithmetic means of successive samples of constant size, n. This type of control chart is used for characteristics that can be measured on a continuous scale, such as weight, temperature, thickness etc. For example, one might take a sample of 5 shafts from production every hour, measure the diameter of each, and then plot, for each sample, the average of the five diameter values on the chart.
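
The shaft-diameter example above can be sketched as a short computation. This assumes the process standard deviation sigma is known; in practice it is usually estimated from subgroup ranges or standard deviations, and the numbers below are illustrative.

```python
import math

def xbar_limits(samples, sigma):
    """Center line and 3-sigma control limits for an X-bar chart,
    given a known process sigma and subgroups of equal size n.
    The limits narrow as sqrt(n) because subgroup means vary less
    than individual measurements.
    """
    n = len(samples[0])
    means = [sum(s) / n for s in samples]      # one point per hourly sample
    grand_mean = sum(means) / len(means)       # center line
    margin = 3 * sigma / math.sqrt(n)
    return grand_mean - margin, grand_mean, grand_mean + margin

# Two hourly samples of five shaft diameters each (hypothetical data).
samples = [[5.0, 5.1, 4.9, 5.0, 5.0], [5.2, 4.8, 5.0, 5.1, 4.9]]
print(xbar_limits(samples, sigma=0.1))
```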

In statistical quality control, the CUSUM is a sequential analysis technique developed by E. S. Page of the University of Cambridge. It is typically used for change detection. CUSUM was announced in Biometrika in 1954, a few years after the publication of Wald's sequential probability ratio test (SPRT).
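
Page's scheme can be sketched in a few lines. The allowance k and decision interval h below are illustrative choices, not prescribed values:

```python
def page_cusum(xs, target, k, h):
    """One-sided upper CUSUM in the style of E. S. Page: accumulate
    deviations above the target, less an allowance k, and signal when
    the running sum exceeds the decision interval h.
    """
    s = 0.0
    for i, x in enumerate(xs):
        s = max(0.0, s + (x - target) - k)  # sum resets to 0, never negative
        if s > h:
            return i  # first index at which an upward shift is signalled
    return None       # no shift detected

# Noise around 0 followed by a sustained shift to about +1.
print(page_cusum([0.1, -0.2, 0.0, 0.2, 1.1, 0.9, 1.2, 1.0],
                 target=0.0, k=0.25, h=2.0))
# → 6
```

Because small deviations accumulate, a CUSUM detects modest sustained shifts that a single-point threshold would miss.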

In statistics, regression validation is the process of deciding whether the numerical results quantifying hypothesized relationships between variables, obtained from regression analysis, are acceptable as descriptions of the data. The validation process can involve analyzing the goodness of fit of the regression, analyzing whether the regression residuals are random, and checking whether the model's predictive performance deteriorates substantially when applied to data that were not used in model estimation.

x̅ and s chart

In statistical quality control, the x̅ and s chart is a type of control chart used to monitor variables data when samples are collected at regular intervals from a business or industrial process. It is connected to traditional statistical quality control (SQC) and statistical process control (SPC). However, Woodall noted that "I believe that the use of control charts and other monitoring methods should be referred to as 'statistical process monitoring,' not 'statistical process control (SPC).'"

Laboratory quality control is designed to detect, reduce, and correct deficiencies in a laboratory's internal analytical process prior to the release of patient results, in order to improve the quality of the results reported by the laboratory. Quality control (QC) is a measure of precision, or how well the measurement system reproduces the same result over time and under varying operating conditions. Laboratory quality control material is usually run at the beginning of each shift, after an instrument is serviced, when reagent lots are changed, after equipment calibration, and whenever patient results seem inappropriate. Quality control material should approximate the same matrix as patient specimens, taking into account properties such as viscosity, turbidity, composition, and color. It should be stable for long periods of time, and available in large enough quantities for a single batch to last at least one year. Liquid controls are more convenient than lyophilized (freeze-dried) controls because they do not have to be reconstituted, minimizing pipetting error. Dried Tube Specimen (DTS) is slightly cumbersome as a QC material, but it is very low-cost, stable over long periods, and efficient, making it especially useful for resource-restricted settings in under-developed and developing countries. DTS can be manufactured in-house by a laboratory or blood bank for its own use.

EWMA chart: Type of control chart in statistical quality control

In statistical quality control, the EWMA chart is a type of control chart used to monitor either variables or attributes-type data using the monitored business or industrial process's entire history of output. While other control charts treat rational subgroups of samples individually, the EWMA chart tracks the exponentially-weighted moving average of all prior sample means. EWMA weights samples in geometrically decreasing order so that the most recent samples are weighted most highly while the most distant samples contribute very little.
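
The geometric weighting described above can be sketched as follows. The smoothing weight lam and limit multiplier L are conventional illustrative choices, and the in-control target and sigma are assumed known for simplicity:

```python
import math

def ewma_chart(xs, target, sigma, lam=0.2, L=3.0):
    """EWMA control chart: smooth the series with weight lam and flag
    points where the EWMA statistic leaves its time-varying L-sigma
    limits. The limits widen toward an asymptote as the weighted
    history accumulates.
    """
    z = target  # start the EWMA at the in-control target
    alarms = []
    for i, x in enumerate(xs, start=1):
        z = lam * x + (1 - lam) * z
        var = sigma ** 2 * (lam / (2 - lam)) * (1 - (1 - lam) ** (2 * i))
        if abs(z - target) > L * math.sqrt(var):
            alarms.append(i - 1)  # 0-based index of the flagged point
    return alarms

# A small sustained shift of 1.5 sigma at index 10: individual points
# stay inside 3-sigma Shewhart limits, but the EWMA drifts out.
print(ewma_chart([0.0] * 10 + [1.5] * 10, target=0.0, sigma=1.0))
```

This illustrates why RODS pairs an EWMA with residual monitoring: the chart is sensitive to small, persistent changes at the cost of reacting more slowly than a Shewhart chart to a single large spike.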

COVID-19 surveillance: Measures to monitor the spread of the respiratory disease

COVID-19 surveillance involves monitoring the spread of the coronavirus disease in order to establish the patterns of disease progression. The World Health Organization (WHO) recommends active surveillance, with a focus on case finding, testing, and contact tracing in all transmission scenarios. COVID-19 surveillance is expected to monitor epidemiological trends, rapidly detect new cases, and, based on this information, provide epidemiological information to conduct risk assessment and guide disease preparedness.

Acoustic epidemiology is the study of the determinants and distribution of disease through the analysis of sounds produced by the body, using either a single diagnostic tool or a combination of tools.

References