Superposed epoch analysis

Superposed epoch analysis (SPE or SEA), also called Chree analysis after a paper by Charles Chree [1] that employed the technique, is a statistical tool used in data analysis either to detect periodicities within a time sequence or to reveal a correlation (usually in time) between two data sequences (usually two time series). [2]

When comparing two time series, the essence of the method is to: (1) define each occurrence of an event in one data sequence (series #1) as a key time; (2) extract subsets of data from the other sequence (series #2) within some time range near each key time; (3) superpose all extracted subsets from series #2 (with key times for all subsets synchronized) by adding them. (To effectively superpose data from series #2 that are recorded at different or even irregular times, data binning is often used.) This approach can be used to detect a signal (i.e., related variations in both series) in the presence of noise (i.e., unrelated variations in both series) whenever the noise sums incoherently while the signal is reinforced by the superposition.
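As a concrete illustration, the following is a minimal NumPy sketch of steps (2) and (3), assuming series #2 arrives as paired `times`/`values` arrays and the key times from series #1 have already been identified; all names and the binning scheme are illustrative choices, not taken from any particular library.

```python
import numpy as np

def superposed_epoch(times, values, key_times, window, n_bins):
    """Composite (superposed-epoch) average of `values` around each key time."""
    edges = np.linspace(-window, window, n_bins + 1)   # lag bins relative to key time
    sums = np.zeros(n_bins)
    counts = np.zeros(n_bins)
    for t0 in key_times:
        lag = times - t0                               # sample times relative to this epoch
        mask = (lag >= -window) & (lag < window)
        idx = np.digitize(lag[mask], edges) - 1        # bin index for each in-window sample
        np.add.at(sums, idx, values[mask])             # superpose: accumulate per lag bin
        np.add.at(counts, idx, 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, sums / np.maximum(counts, 1)       # mean composite; empty bins read 0
```

Because unrelated variations in series #2 enter each lag bin with arbitrary sign, they tend to cancel in the accumulated sums, while any response locked to the key times adds coherently.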

To search for periodicities in a single time series, the data sequence can be broken into separate subsets of equal duration, and then all subsets can be superposed. Some hypothesis for the length of the period is required to set the subsets' duration.
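A short sketch of this folding procedure, assuming a regularly sampled series and a trial period expressed in samples (both names are hypothetical):

```python
import numpy as np

def fold_and_superpose(x, period):
    """Cut a regularly sampled series into consecutive segments of `period`
    samples and average them; coherent variation at that period reinforces,
    while unrelated noise averages toward zero."""
    n_segments = len(x) // period
    segments = x[:n_segments * period].reshape(n_segments, period)
    return segments.mean(axis=0)

# Example: a weak 50-sample cycle buried in unit-variance noise.
rng = np.random.default_rng(0)
t = np.arange(5000)
x = 0.2 * np.sin(2 * np.pi * t / 50) + rng.normal(size=t.size)
profile = fold_and_superpose(x, 50)   # the cycle emerges; noise shrinks ~1/sqrt(100)
```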

The approach has been used in signal analysis in several fields, including geophysics (where it has been referred to as compositing) [3] [4] and solar physics. [5]

Related Research Articles

Computer vision is an interdisciplinary scientific field that deals with how computers can gain high-level understanding from digital images or videos. From the perspective of engineering, it seeks to understand and automate tasks that the human visual system can do.

Digital data

Digital data, in information theory and information systems, is information represented as a string of discrete symbols each of which can take on one of only a finite number of values from some alphabet, such as letters or digits. An example is a text document, which consists of a string of alphanumeric characters. The most common form of digital data in modern information systems is binary data, which is represented by a string of binary digits (bits) each of which can have one of two values, either 0 or 1.

Signal processing

Signal processing is an electrical engineering subfield that focuses on analyzing, modifying, and synthesizing signals such as sound, images, and scientific measurements. Signal processing techniques can be used to improve transmission, storage efficiency, and subjective quality, and also to emphasize or detect components of interest in a measured signal.

Hardware random number generator

In computing, a hardware random number generator (HRNG) or true random number generator (TRNG) is a device that generates random numbers from a physical process, rather than by means of an algorithm. Such devices are often based on microscopic phenomena that generate low-level, statistically random "noise" signals, such as thermal noise, the photoelectric effect (involving a beam splitter), and other quantum phenomena. These stochastic processes are, in theory, completely unpredictable as long as an equation governing such phenomena is unknown or uncomputable, and the theory's assertions of unpredictability are subject to experimental test. This is in contrast to the paradigm of pseudo-random number generation commonly implemented in computer programs.

Signal

In signal processing, a signal is a function that conveys information about a phenomenon. In electronics and telecommunications, it refers to any time-varying voltage, current, or electromagnetic wave that carries information. A signal may also be defined as an observable change in a quantity.

Time series

In mathematics, a time series is a series of data points indexed in time order. Most commonly, a time series is a sequence taken at successive equally spaced points in time. Thus it is a sequence of discrete-time data. Examples of time series are heights of ocean tides, counts of sunspots, and the daily closing value of the Dow Jones Industrial Average.

In signal processing, a periodogram is an estimate of the spectral density of a signal. The term was coined by Arthur Schuster in 1898. Today, the periodogram is a component of more sophisticated methods. It is the most common tool for examining the amplitude vs frequency characteristics of FIR filters and window functions. FFT spectrum analyzers are also implemented as a time-sequence of periodograms.
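For instance, SciPy's `scipy.signal.periodogram` computes this estimate directly; the test signal and sampling rate below are purely illustrative:

```python
import numpy as np
from scipy.signal import periodogram

fs = 100.0                                  # sampling rate in Hz (illustrative)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
x = np.sin(2 * np.pi * 7.0 * t) + rng.normal(size=t.size)

f, pxx = periodogram(x, fs=fs)              # frequency grid and power spectral density
peak_hz = f[np.argmax(pxx)]                 # expect a peak near 7 Hz
```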

Steganalysis is the study of detecting messages hidden using steganography; this is analogous to cryptanalysis applied to cryptography.

Power analysis

In cryptography, a side-channel attack is used to extract secret data from some secure device. Side-channel analysis typically attempts to non-invasively extract cryptographic keys and other secret information from the device. A simple analogy is the German tank problem: the serial numbers of captured tanks were used to estimate tank production figures. In physical security, a non-invasive attack would be similar to lock-picking, where a successful attack leaves no trace of the attacker's presence.

VoIP spam or SPIT (spam over Internet telephony) is unsolicited, automatically dialed telephone calls, typically using voice over Internet Protocol (VoIP) technology.

In time series analysis, the Box–Jenkins method, named after the statisticians George Box and Gwilym Jenkins, applies autoregressive moving average (ARMA) or autoregressive integrated moving average (ARIMA) models to find the best fit of a time-series model to past values of a time series.
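As a hedged sketch, one widely used implementation of this workflow is the `ARIMA` class in the `statsmodels` package; the simulated AR(1) series below exists only so the fitted coefficient has a known target:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Simulate an AR(1) process whose true coefficient is 0.7.
rng = np.random.default_rng(2)
x = np.zeros(500)
for i in range(1, len(x)):
    x[i] = 0.7 * x[i - 1] + rng.normal()

fit = ARIMA(x, order=(1, 0, 0)).fit()   # order = (p, d, q): AR, differencing, MA
print(fit.params)                       # AR(1) coefficient should be near 0.7
print(fit.forecast(steps=10))           # out-of-sample forecast of the next 10 values
```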

The decomposition of time series is a statistical task that deconstructs a time series into several components, each representing one of the underlying categories of patterns. The two principal forms are additive and multiplicative decompositions.
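As an illustration, a classical additive decomposition can be computed with `seasonal_decompose` from `statsmodels`; the synthetic trend-plus-cycle series here is an assumption made for the example:

```python
import numpy as np
from statsmodels.tsa.seasonal import seasonal_decompose

# Synthetic series: linear trend + 12-sample seasonal cycle + noise.
rng = np.random.default_rng(3)
t = np.arange(120)
x = 0.05 * t + np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.2, t.size)

result = seasonal_decompose(x, model="additive", period=12)
trend, seasonal, resid = result.trend, result.seasonal, result.resid
```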

In statistical signal processing, the goal of spectral density estimation (SDE) is to estimate the spectral density of a random signal from a sequence of time samples of the signal. Intuitively speaking, the spectral density characterizes the frequency content of the signal. One purpose of estimating the spectral density is to detect any periodicities in the data, by observing peaks at the frequencies corresponding to these periodicities.
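As one example beyond the raw periodogram, Welch's method (available as `scipy.signal.welch`) averages periodograms of overlapping segments to reduce the variance of the estimate; the test signal is illustrative:

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(4)
fs = 100.0
t = np.arange(0, 20, 1 / fs)
x = np.sin(2 * np.pi * 7.0 * t) + rng.normal(size=t.size)

# Welch's method averages periodograms of overlapping segments, trading
# frequency resolution for a lower-variance spectral density estimate.
f, pxx = welch(x, fs=fs, nperseg=256)
```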

The Hilbert–Huang transform (HHT) is a way to decompose a signal into so-called intrinsic mode functions (IMF) along with a trend, and obtain instantaneous frequency data. It is designed to work well for data that is nonstationary and nonlinear. In contrast to other common transforms like the Fourier transform, the HHT is more like an algorithm that can be applied to a data set, rather than a theoretical tool.

Singular spectrum analysis

In time series analysis, singular spectrum analysis (SSA) is a nonparametric spectral estimation method. It combines elements of classical time series analysis, multivariate statistics, multivariate geometry, dynamical systems and signal processing. Its roots lie in the classical Karhunen (1946)–Loève spectral decomposition of time series and random fields and in the Mañé (1981)–Takens (1981) embedding theorem. SSA can be an aid in the decomposition of time series into a sum of components, each having a meaningful interpretation. The name "singular spectrum analysis" relates to the spectrum of eigenvalues in a singular value decomposition of a covariance matrix, and not directly to a frequency domain decomposition.
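A minimal NumPy sketch of basic SSA (embedding, SVD, and diagonal averaging), with the window length `L` and component count `k` left as user choices; this is an illustrative outline, not a full SSA toolkit:

```python
import numpy as np

def ssa_components(x, L, k):
    """Basic SSA: embed `x` in an L-window trajectory matrix, take its SVD,
    and rebuild the first k rank-one components by diagonal averaging."""
    N = len(x)
    K = N - L + 1
    X = np.column_stack([x[i:i + L] for i in range(K)])     # L x K Hankel matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    comps = []
    for i in range(k):
        Xi = s[i] * np.outer(U[:, i], Vt[i])                # rank-one piece of X
        # Average each anti-diagonal of Xi to map it back to a length-N series.
        comps.append(np.array([Xi[::-1].diagonal(j - L + 1).mean() for j in range(N)]))
    return comps   # summing all min(L, K) components reconstructs x exactly
```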

Phase dispersion minimization

Phase dispersion minimization (PDM) is a data analysis technique that searches for periodic components of a time series data set. It is useful for data sets with gaps, non-sinusoidal variations, poor time coverage, or other problems that would make Fourier techniques unusable. It was first developed by Stellingwerf in 1978 and has been widely used for astronomical and other types of periodic data analysis. Source code for PDM analysis is publicly available.
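A minimal sketch of the PDM statistic for a single trial period, assuming unevenly sampled arrays `t` and `x`; the bin count and scan range are illustrative choices rather than Stellingwerf's exact formulation:

```python
import numpy as np

def pdm_theta(t, x, period, n_bins=10):
    """PDM statistic for one trial period: fold the data in phase and compare
    pooled within-bin variance to overall variance (small theta = good period)."""
    phase = (t / period) % 1.0
    bins = np.minimum((phase * n_bins).astype(int), n_bins - 1)
    ss, dof = 0.0, 0
    for b in range(n_bins):
        xb = x[bins == b]
        if xb.size > 1:
            ss += np.sum((xb - xb.mean()) ** 2)
            dof += xb.size - 1
    return (ss / dof) / np.var(x, ddof=1)

# Scan trial periods and keep the one that minimizes the dispersion:
#   best = periods[np.argmin([pdm_theta(t, x, p) for p in periods])]
```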

In time series data, seasonality is the presence of variations that occur at specific regular intervals of less than a year, such as weekly, monthly, or quarterly. Seasonality may be caused by various factors, such as weather, vacations, and holidays, and consists of periodic, repetitive, and generally regular and predictable patterns in the levels of a time series.

Step detection

In statistics and signal processing, step detection is the process of finding abrupt changes in the mean level of a time series or signal. It is usually considered as a special case of the statistical method known as change detection or change point detection. Often, the step is small and the time series is corrupted by some kind of noise, and this makes the problem challenging because the step may be hidden by the noise. Therefore, statistical and/or signal processing algorithms are often required.
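As a simple illustration, a single mean-level step can be located by least squares; this sketch assumes exactly one step and roughly Gaussian noise, and is not a general change-point method:

```python
import numpy as np

def detect_single_step(x):
    """Find the index where one mean-level shift starts, by minimizing the
    summed within-segment squared error over all possible split points."""
    n = len(x)
    csum, csq = np.cumsum(x), np.cumsum(x ** 2)
    best_i, best_cost = 1, np.inf
    for i in range(1, n):                      # candidate split: x[:i] | x[i:]
        left = csq[i - 1] - csum[i - 1] ** 2 / i
        right = (csq[-1] - csq[i - 1]) - (csum[-1] - csum[i - 1]) ** 2 / (n - i)
        if left + right < best_cost:
            best_i, best_cost = i, left + right
    return best_i                              # first index of the new level
```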

Alpha Centauri Bb

Alpha Centauri Bb was a proposed exoplanet orbiting the K-type main-sequence star Alpha Centauri B, located 4.37 light-years from Earth in the southern constellation of Centaurus; the claimed detection was later shown to be an artifact of the data analysis.

In computer vision, rigid motion segmentation is the process of separating regions, features, or trajectories from a video sequence into coherent subsets of space and time. These subsets correspond to independent rigidly moving objects in the scene. The goal of this segmentation is to differentiate and extract the meaningful rigid motion from the background and analyze it. Image segmentation techniques label pixels that share certain characteristics at a particular time; here, pixels are segmented according to their relative movement over the span of the video sequence.

References