Image destriping is the process of removing stripes or streaks from images and videos. These artifacts plague a range of fields in scientific imaging including atomic force microscopy, [2] light sheet fluorescence microscopy, [3] and planetary satellite imaging. [4]
The most common image processing technique for reducing stripe artifacts is Fourier filtering. [5] Unfortunately, filtering methods risk altering or suppressing useful image data. Methods developed for multiple-sensor imaging systems on planetary satellites use statistical methods to match the signal distribution across multiple sensors. [6] More recently, a new class of approaches leverages compressed sensing to regularize an optimization problem and recover stripe-free images. [7] [1] [8] In many cases, these destriped images show few or no artifacts, even at low signal-to-noise ratios. [1]
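As a concrete illustration of the Fourier-filtering approach mentioned above, the sketch below (Python with NumPy; the function and parameter names are illustrative assumptions, not from any cited work) notches out the spectral band where vertical stripes concentrate their energy. It also shows why such filters can suppress legitimate image content: everything in the notched band is removed, striped or not.

```python
import numpy as np

def destripe_fft(image, notch_halfwidth=2, keep_dc=5):
    """Suppress vertical stripes by attenuating the zero-vertical-frequency band.

    Vertical stripes are nearly constant along the column direction, so their
    energy concentrates on the ky = 0 line of the 2D spectrum. Attenuating a
    narrow band around that line, while preserving the low frequencies near DC
    that carry the overall brightness, removes most of the striping.
    """
    spectrum = np.fft.fftshift(np.fft.fft2(image))
    rows, cols = image.shape
    cy, cx = rows // 2, cols // 2

    mask = np.ones((rows, cols))
    # Zero the horizontal band through the centre of the spectrum (ky ~ 0)...
    mask[cy - notch_halfwidth:cy + notch_halfwidth + 1, :] = 0.0
    # ...but keep the region around DC so the image background survives.
    mask[cy - notch_halfwidth:cy + notch_halfwidth + 1,
         cx - keep_dc:cx + keep_dc + 1] = 1.0

    filtered = spectrum * mask
    return np.real(np.fft.ifft2(np.fft.ifftshift(filtered)))

# Usage (hypothetical array): clean = destripe_fft(striped_image)
```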
In physics and geosciences, the term spatial resolution refers to the distance between independent measurements, or the physical dimension represented by a pixel of an image. While in some instruments, such as cameras and telescopes, spatial resolution is directly connected to angular resolution, other instruments, such as synthetic aperture radar or a network of weather stations, produce data whose spatial sampling is tied to positions on the Earth's surface, as in remote sensing and satellite imagery.
Iterative reconstruction refers to iterative algorithms used to reconstruct 2D and 3D images in certain imaging techniques. For example, in computed tomography an image must be reconstructed from projections of an object. Here, iterative reconstruction techniques are usually a better, but computationally more expensive, alternative to the common filtered back projection (FBP) method, which directly calculates the image in a single reconstruction step. Recent research has shown that extremely fast computation and massive parallelism are possible for iterative reconstruction, which makes it practical for commercialization.
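To make the contrast with single-step filtered back projection concrete, here is a minimal sketch of one classical iterative scheme, a Landweber/SIRT-style update; the toy system matrix and all names are illustrative assumptions, not a production CT reconstructor.

```python
import numpy as np

def landweber(A, b, n_iter=200, step=None):
    """Iteratively solve A @ x ~= b via x <- x + step * A.T @ (b - A @ x)."""
    if step is None:
        # A convergent step size: below 2 / (largest singular value)^2.
        step = 1.0 / (np.linalg.norm(A, 2) ** 2)
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        residual = b - A @ x             # mismatch between data and estimate
        x = x + step * (A.T @ residual)  # back-project the residual and update
        x = np.clip(x, 0, None)          # optional non-negativity constraint
    return x

# Toy usage: a 4-pixel "image" observed through 3 ray sums.
A = np.array([[1., 1., 0., 0.],
              [0., 0., 1., 1.],
              [1., 0., 1., 0.]])
x_true = np.array([1., 2., 3., 4.])
b = A @ x_true
print(landweber(A, b))
```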
A total internal reflection fluorescence microscope (TIRFM) is a type of microscope with which a thin region of a specimen, usually less than 200 nanometers thick, can be observed.
Hyperspectral imaging collects and processes information from across the electromagnetic spectrum. The goal of hyperspectral imaging is to obtain the spectrum for each pixel in the image of a scene, with the purpose of finding objects, identifying materials, or detecting processes. There are three general branches of spectral imagers: push broom scanners and the related whisk broom scanners, which read images over time; band sequential scanners, which acquire images of an area at different wavelengths; and snapshot hyperspectral imaging, which uses a staring array to generate an image in an instant.
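A minimal sketch of how the resulting data cube is typically handled, assuming a NumPy array with axes (rows, columns, bands); the shapes and the spectral-angle comparison are illustrative, not tied to any particular instrument.

```python
import numpy as np

rows, cols, bands = 100, 120, 224          # assumed cube dimensions
cube = np.random.rand(rows, cols, bands)   # placeholder for real radiance data

pixel_spectrum = cube[42, 17, :]   # full spectrum of one spatial pixel
band_image = cube[:, :, 100]       # single-wavelength image of the whole scene

# A simple spectral-angle comparison against a reference spectrum, one common
# way to identify materials pixel by pixel.
reference = cube[0, 0, :]
cos_angle = (cube @ reference) / (
    np.linalg.norm(cube, axis=2) * np.linalg.norm(reference))
spectral_angle_map = np.arccos(np.clip(cos_angle, -1.0, 1.0))
```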
The FLuorescence EXplorer (FLEX) is a planned mission by the European Space Agency to launch a satellite to monitor the global steady-state chlorophyll fluorescence in terrestrial vegetation. FLEX was selected for funding on 19 November 2015 and will be launched on a Vega C rocket from Guiana Space Centre in mid-2025.
Radioglaciology is the study of glaciers, ice sheets, ice caps and icy moons using ice penetrating radar. It employs a geophysical method similar to ground-penetrating radar and typically operates at frequencies in the MF, HF, VHF and UHF portions of the radio spectrum. This technique is also commonly referred to as "Ice Penetrating Radar (IPR)" or "Radio Echo Sounding (RES)".
GNSS reflectometry involves making measurements from the reflections from the Earth of navigation signals from Global Navigation Satellite Systems such as GPS. The idea of using reflected GNSS signals for Earth observation gained popularity in the mid-1990s at NASA Langley Research Center; the technique is also known as GPS reflectometry. Research applications of GNSS-R are found in a growing number of areas of Earth observation.
The Special Sensor Microwave Imager / Sounder (SSMIS) is a 24-channel, 21-frequency, linearly polarized passive microwave radiometer system. The instrument is flown on board the United States Air Force Defense Meteorological Satellite Program (DMSP) F-16, F-17, F-18 and F-19 satellites, which were launched in October 2003, November 2006, October 2009, and April 2014, respectively. It is the successor to the Special Sensor Microwave/Imager (SSM/I). The SSMIS on the F-19 satellite stopped producing useful data in February 2016.
Atmospheric correction is the process of removing the scattering and absorption effects of the atmosphere from the reflectance values of images taken by satellite or airborne sensors. Atmospheric effects in optical remote sensing are significant and complex, dramatically altering the spectral nature of the radiation reaching the remote sensor. The atmosphere both absorbs and scatters various wavelengths of the visible spectrum, which must pass through the atmosphere twice: once from the sun to the object, and again as it travels back up to the image sensor. These distortions are corrected using various approaches and techniques.
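As one simple, widely used example of such a correction (by no means the only approach), dark-object subtraction assumes that the darkest pixels in each band should be nearly zero, so any residual signal there is attributed to atmospheric path radiance and removed. The sketch below is a hedged illustration with assumed array shapes and names.

```python
import numpy as np

def dark_object_subtraction(cube, percentile=0.1):
    """Apply per-band dark-object subtraction.

    cube: (rows, cols, bands) array of at-sensor radiance or reflectance.
    The value at a very low percentile of each band is taken as an estimate
    of the atmospheric path-radiance offset and subtracted from that band.
    """
    corrected = np.empty_like(cube, dtype=float)
    for b in range(cube.shape[2]):
        band = cube[:, :, b].astype(float)
        dark_value = np.percentile(band, percentile)  # estimated path radiance
        corrected[:, :, b] = np.clip(band - dark_value, 0, None)
    return corrected
```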
Speckle, speckle pattern, or speckle noise is a granular noise texture that degrades image quality as a consequence of interference among wavefronts in coherent imaging systems, such as radar, synthetic aperture radar (SAR), medical ultrasound and optical coherence tomography. Speckle is not external noise; rather, it is an inherent fluctuation in diffuse reflections, because the scatterers are not identical from one resolution cell to the next, and the coherent illumination is highly sensitive to small phase variations.
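The phasor-sum origin of speckle can be illustrated with a short simulation, assuming many unit-amplitude scatterers with independent, uniformly distributed phases in each resolution cell; in this fully developed regime the intensity contrast (standard deviation divided by mean) approaches one.

```python
import numpy as np

rng = np.random.default_rng(0)
n_cells, n_scatterers = 100_000, 50

# Each resolution cell: coherent sum of unit-amplitude phasors with random phases.
phases = rng.uniform(0, 2 * np.pi, size=(n_cells, n_scatterers))
field = np.exp(1j * phases).sum(axis=1)

intensity = np.abs(field) ** 2
# Fully developed speckle: intensity is approximately exponentially distributed,
# so its standard deviation equals its mean (contrast ~ 1).
print(intensity.std() / intensity.mean())
```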
With increased interest in sea ice and its effects on the global climate, efficient methods are required to monitor both its extent and its exchange processes. Satellite-mounted microwave radiometers, such as SSMI, AMSR and AMSU, are an ideal tool for the task because they can see through cloud cover and provide frequent, global coverage. A passive microwave instrument detects objects through their emitted radiation, since different substances have different emission spectra. To detect sea ice more efficiently, these emission processes need to be modeled. The interaction of sea ice with electromagnetic radiation in the microwave range is still not well understood, and in general the information collected is limited by the large-scale variability of sea-ice emissivity.
Satellite surface salinity refers to measurements of surface salinity made by remote sensing satellites. The radiative properties of the ocean surface are exploited in order to estimate the salinity of the water's surface layer.
JoBea Way Holt is an American planetary scientist who has worked for NASA. Holt studied the carbon cycle in Earth's atmosphere. She is also a member of the Climate Project, and is the author of several books and research papers.
Frank Silvio Marzano was a professor at the Sapienza University of Rome, Italy, who was named a Fellow of the Institute of Electrical and Electronics Engineers (IEEE) in 2016 for contributions to microwave remote sensing in meteorology and volcanology. He had also been a Fellow of the UK Royal Meteorological Society since 2012. In 2020 he was included in Stanford University's World's Top 2% Scientists database.
Dr. Y. S. Rao is a Professor at the Centre of Studies in Resources Engineering, Indian Institute of Technology Bombay, Mumbai, India. He has been working in the field of microwave remote sensing and its land-based applications for more than 34 years. His early research focused on the use of Synthetic Aperture Radar (SAR) interferometry for landslide and land-deformation monitoring, Digital Elevation Model generation, and snow and glacier monitoring. For more than 25 years he has also been actively involved in developing techniques for soil moisture estimation using passive and active microwave remote sensing data. His current research involves SAR polarimetry for crop characterization, classification, and biophysical parameter retrieval using linear and compact-pol SAR data. Apart from applications, he has also contributed to polarimetric SAR system calibration and software tool development.
A copula is a mathematical function that provides a relationship between the marginal distributions of random variables and their joint distribution. Copulas are important because they represent a dependence structure without reference to the marginal distributions. Copulas have been widely used in the field of finance, but their use in signal processing is relatively new. Copulas have been employed in the field of wireless communication for classifying radar signals, for change detection in remote sensing applications, and for EEG signal processing in medicine. In this article, a short introduction to copulas is presented, followed by a mathematical derivation to obtain copula density functions, and then a section listing copula density functions with applications in signal processing.
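For two continuous random variables X and Y with marginals F_X, F_Y, joint distribution F, and joint density f, the relationship described above can be written compactly (a standard statement of Sklar's theorem together with the resulting copula density):

\[
F(x, y) = C\bigl(F_X(x),\, F_Y(y)\bigr), \qquad
c(u, v) = \frac{\partial^2 C(u, v)}{\partial u\, \partial v}, \qquad
f(x, y) = c\bigl(F_X(x),\, F_Y(y)\bigr)\, f_X(x)\, f_Y(y).
\]

The copula C couples the marginals into the joint distribution, and its density c carries the dependence structure separately from the marginal densities.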
Deep learning in photoacoustic imaging combines the hybrid imaging modality of photoacoustic imaging (PA) with the rapidly evolving field of deep learning. Photoacoustic imaging is based on the photoacoustic effect, in which optical absorption causes a rise in temperature, which causes a subsequent rise in pressure via thermo-elastic expansion. This pressure rise propagates through the tissue and is sensed via ultrasonic transducers. Due to the proportionality between the optical absorption, the rise in temperature, and the rise in pressure, the ultrasound pressure wave signal can be used to quantify the original optical energy deposition within the tissue.
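Under the common assumption that the laser pulse is short enough for thermal and stress confinement, the proportionality described above is usually summarized as

\[
p_0 = \Gamma\, \mu_a\, F,
\]

where \(p_0\) is the initial pressure rise, \(\Gamma\) is the dimensionless Grüneisen parameter capturing thermo-elastic conversion, \(\mu_a\) is the optical absorption coefficient, and \(F\) is the local optical fluence.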
Avik Bhattacharya is a professor at the Centre of Studies in Resources Engineering, Indian Institute of Technology Bombay, Mumbai, India. He has been working in the field of radar polarimetry theory and applications for more than a decade. His main focus is on the use of Synthetic Aperture Radar (SAR) data for land use classification, change detection, and qualitative and quantitative estimation of biophysical and geophysical information.
Dorothy K. Hall is a scientific researcher known for her studies on snow and ice, which she studies through a combination of satellite data and direct measurements. She is a fellow of the American Geophysical Union.
Ultrasound-switchable fluorescence (USF) imaging is a deep-tissue optical imaging technique. In the last few decades, fluorescence microscopy has been highly developed for imaging biological samples and live tissues. However, due to light scattering, fluorescence microscopy is limited to shallow tissues. Since fluorescence imaging offers the high contrast, high sensitivity, and low cost that are crucial for investigating deep tissue, developing a fluorescence imaging technique with a high depth-to-resolution ratio is highly desirable. Recently, ultrasound-switchable fluorescence imaging has been developed to achieve high signal-to-noise ratio (SNR) and high spatial resolution imaging without sacrificing image depth.