Microscope image processing is a broad term that covers the use of digital image processing techniques to process, analyze and present images obtained from a microscope. Such processing is now commonplace in a number of diverse fields such as medicine, biological research, cancer research, drug testing and metallurgy. Many microscope manufacturers now design features into their instruments that allow them to interface with an image processing system.
Until the early 1990s, most image acquisition in video microscopy was done with an analog video camera, often a simple closed-circuit TV camera. Although this required a frame grabber to digitize the images, video cameras provided images at full video frame rate (25–30 frames per second), allowing live video recording and processing. While the advent of solid-state detectors yielded several advantages, the real-time video camera remained superior in many respects.
Today, acquisition is usually done using a CCD camera mounted in the optical path of the microscope. The camera may be full colour or monochrome. Very high resolution cameras are often employed to gain as much direct information as possible, and cryogenic cooling is also common, to minimise noise. Digital cameras used for this application often provide pixel intensity data at a depth of 12–16 bits, much higher than is used in consumer imaging products.
Ironically, in recent years, much effort has been put into acquiring data at video rates, or higher (25-30 frames per second or higher). What was once easy with off-the-shelf video cameras now requires special, high speed electronics to handle the vast digital data bandwidth.
Higher speed acquisition allows dynamic processes to be observed in real time, or stored for later playback and analysis. Combined with the high image resolution, this approach can generate vast quantities of raw data, which can be a challenge to deal with, even with a modern computer system.
Although current CCD detectors allow very high image resolution, this often involves a trade-off: for a given chip size, as the pixel count increases, the pixel size decreases. As the pixels get smaller, their well depth decreases, reducing the number of electrons that can be stored and, in turn, worsening the signal-to-noise ratio.
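The well-depth trade-off can be made concrete with a back-of-the-envelope shot-noise calculation. This is a minimal sketch with hypothetical full-well values, not figures from any specific camera: for a shot-noise-limited sensor, the best-case SNR at saturation is the square root of the full-well capacity.

```python
import math

def shot_noise_snr(full_well_electrons):
    # At full exposure: signal = N electrons, shot noise = sqrt(N),
    # so SNR = N / sqrt(N) = sqrt(N).
    return math.sqrt(full_well_electrons)

# Hypothetical values: halving the pixel pitch quarters the pixel area
# and, roughly, the full-well capacity.
snr_large = shot_noise_snr(40000)  # larger pixel, ~40 ke- full well
snr_small = shot_noise_snr(10000)  # smaller pixel, ~10 ke- full well
ratio = snr_large / snr_small      # best-case SNR halves
```

Under these illustrative numbers, the smaller pixel's best-case SNR is half that of the larger one, which is why a lower-pixel-count sensor can win in low-light work.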
For best results, one must select an appropriate sensor for a given application. Because microscope images have an intrinsic limiting resolution, it often makes little sense to use a noisy, high resolution detector for image acquisition. A more modest detector, with larger pixels, can often produce much higher quality images because of reduced noise. This is especially important in low-light applications such as fluorescence microscopy.
Moreover, one must also consider the temporal resolution requirements of the application. A lower resolution detector will often have a significantly higher acquisition rate, permitting the observation of faster events. Conversely, if the observed object is motionless, one may wish to acquire images at the highest possible spatial resolution without regard to the time required to acquire a single image.
Image processing for microscopy applications begins with fundamental techniques intended to most accurately reproduce the information contained in the microscopic sample. This might include adjusting the brightness and contrast of the image, averaging images to reduce image noise and correcting for illumination non-uniformities. Such processing involves only basic arithmetic operations between images (i.e. addition, subtraction, multiplication and division). The vast majority of processing done on microscope images is of this nature.
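Two of these arithmetic operations, frame averaging and flat-field (illumination) correction, can be sketched with simulated data. All arrays below are synthetic stand-ins for acquired frames, not real microscope output:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic scene: a uniform specimen under illumination that falls off
# from right to left, imaged 16 times with additive noise.
truth = np.full((64, 64), 100.0)            # ideal specimen signal
illum = np.linspace(0.5, 1.0, 64)[None, :]  # illumination non-uniformity
frames = [truth * illum + rng.normal(0, 5, truth.shape) for _ in range(16)]

# 1. Averaging N frames reduces noise roughly as 1/sqrt(N).
avg = np.mean(frames, axis=0)
noise_single = np.std(frames[0] - truth * illum)
noise_avg = np.std(avg - truth * illum)     # ~4x smaller for N = 16

# 2. Flat-field correction: divide by a normalized image of a blank,
#    uniformly reflecting field captured under the same illumination.
flat = illum * np.ones_like(truth)
corrected = avg / (flat / flat.mean())      # now nearly uniform
```

The corrected image is flat to within the residual noise, illustrating that division by a normalized blank-field image removes the illumination gradient.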
Another common class of 2D operations, image convolution, is often used to reduce or enhance image details. Such "blurring" and "sharpening" algorithms alter a pixel's value based on a weighted sum of that pixel and its neighbours, with the weights given by a convolution kernel. Equivalently, many of these filters can be applied in the frequency domain using the Fourier transform, which is often faster for large kernels.
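A minimal, pure-NumPy sketch of kernel convolution follows; the kernels are the standard box-blur and a common unsharp-style sharpening kernel, while real pipelines would typically use an optimized library routine such as `scipy.ndimage.convolve`:

```python
import numpy as np

def convolve2d(img, kernel):
    # Direct (spatial-domain) convolution with edge-replicated padding.
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)), mode="edge")
    out = np.zeros_like(img, dtype=float)
    for i in range(kh):
        for j in range(kw):
            out += kernel[i, j] * padded[i:i + img.shape[0], j:j + img.shape[1]]
    return out

blur = np.full((3, 3), 1 / 9)                    # box blur: local average
sharpen = np.array([[ 0, -1,  0],
                    [-1,  5, -1],
                    [ 0, -1,  0]], dtype=float)  # unsharp-style kernel

img = np.zeros((9, 9))
img[4, 4] = 9.0                                  # a single bright pixel
blurred = convolve2d(img, blur)                  # spread over a 3x3 patch
resharpened = convolve2d(blurred, sharpen)       # edges re-emphasized
```

Both kernels sum to 1, so the total intensity of the image is preserved; only its local distribution changes.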
Other basic two dimensional techniques include operations such as image rotation, warping, color balancing etc.
At times, advanced techniques are employed with the goal of "undoing" the distortion of the optical path of the microscope, thus eliminating distortions and blurring caused by the instrumentation. This process is called deconvolution, and a variety of algorithms have been developed, some of great mathematical complexity. The end result is an image far sharper and clearer than could be obtained in the optical domain alone. This is typically a 3-dimensional operation, that analyzes a volumetric image (i.e. images taken at a variety of focal planes through the sample) and uses this data to reconstruct a more accurate 3-dimensional image.
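One of the classic deconvolution algorithms is Richardson–Lucy iteration. The sketch below is a simplified 1D version on simulated data (production work would use a dedicated 2D/3D implementation such as `skimage.restoration.richardson_lucy`); the point-spread function and the two point sources are invented for illustration:

```python
import numpy as np

def richardson_lucy(observed, psf, iterations=200):
    # Iteratively refine an estimate so that estimate * psf -> observed.
    estimate = np.full_like(observed, observed.mean())
    psf_flipped = psf[::-1]
    for _ in range(iterations):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)
        estimate *= np.convolve(ratio, psf_flipped, mode="same")
    return estimate

# Simulated data: two close point sources blurred by a Gaussian-like PSF.
x = np.arange(-3, 4)
psf = np.exp(-x**2 / 2.0)
psf /= psf.sum()
truth = np.zeros(64)
truth[30], truth[33] = 1.0, 0.8
observed = np.convolve(truth, psf, mode="same")   # blended double peak
restored = richardson_lucy(observed, psf)         # peaks separated again
```

After iteration the peaks are taller and the valley between them deeper than in the blurred observation, which is the essence of what deconvolution recovers.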
Another common requirement is to take a series of images at a fixed position, but at different focal depths. Since most microscopic samples are essentially transparent, and the depth of field of the focused sample is exceptionally narrow, it is possible to capture images "through" a three-dimensional object using 2D equipment like confocal microscopes. Software is then able to reconstruct a 3D model of the original sample which may be manipulated appropriately. The processing turns a 2D instrument into a 3D instrument, which would not otherwise exist. In recent times this technique has led to a number of scientific discoveries in cell biology.
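A simple form of this focal-series processing is focus stacking: for each pixel, keep the value from the z-slice where the image is locally sharpest. The sketch below uses a Laplacian response as a crude focus measure on two synthetic slices; both the data and the focus measure are illustrative choices, not a specific published method:

```python
import numpy as np

def local_sharpness(img):
    # Squared 4-neighbour Laplacian as a simple per-pixel focus measure.
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0)
         + np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
    return lap**2

def focus_stack(slices):
    sharp = np.array([local_sharpness(s) for s in slices])
    best = np.argmax(sharp, axis=0)          # index of sharpest slice
    stack = np.array(slices)
    rows, cols = np.indices(best.shape)
    return stack[best, rows, cols], best

# Two synthetic slices: each has one "in focus" checkerboard quadrant.
a = np.zeros((8, 8)); a[:4, :4] = np.indices((4, 4)).sum(0) % 2
b = np.zeros((8, 8)); b[4:, 4:] = np.indices((4, 4)).sum(0) % 2
fused, best = focus_stack([a, b])
```

The `best` map records which focal plane contributed each pixel, which is itself a coarse depth map of the sample, hinting at how a 2D focal series yields 3D information.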
Analysis of images will vary considerably according to application. Typical analysis includes determining where the edges of an object are, counting similar objects, calculating the area, perimeter length and other useful measurements of each object. A common approach is to create an image mask which only includes pixels that match certain criteria, then perform simpler scanning operations on the resulting mask. It is also possible to label objects and track their motion over a series of frames in a video sequence.
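The mask-then-measure approach can be sketched as a threshold followed by connected-component counting. The flood-fill labeller below is a minimal hand-rolled version for illustration; real pipelines would typically use `scipy.ndimage.label` or `skimage.measure.label`:

```python
import numpy as np
from collections import deque

def count_objects(img, threshold):
    # Build a binary mask, then count 4-connected components and their areas.
    mask = img > threshold
    seen = np.zeros_like(mask, dtype=bool)
    count, areas = 0, []
    for r, c in zip(*np.nonzero(mask)):
        if seen[r, c]:
            continue
        count += 1
        area = 0
        queue = deque([(r, c)])
        seen[r, c] = True
        while queue:                       # breadth-first flood fill
            y, x = queue.popleft()
            area += 1
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not seen[ny, nx]):
                    seen[ny, nx] = True
                    queue.append((ny, nx))
        areas.append(area)
    return count, areas

# Synthetic image with two bright "objects" on a dark background.
img = np.zeros((10, 10))
img[1:3, 1:3] = 1.0    # object 1: area 4
img[6:9, 5:9] = 1.0    # object 2: area 12
n, areas = count_objects(img, 0.5)
```

From the per-object pixel lists one could go on to compute perimeter, centroid or shape descriptors, or match labels frame-to-frame to track motion.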
Microscopy is the technical field of using microscopes to view objects and areas of objects that cannot be seen with the naked eye. There are three well-known branches of microscopy: optical, electron, and scanning probe microscopy, along with the emerging field of X-ray microscopy.
A microscope is a laboratory instrument used to examine objects that are too small to be seen by the naked eye. Microscopy is the science of investigating small objects and structures using a microscope. Microscopic means being invisible to the eye unless aided by a microscope.
A scanning electron microscope (SEM) is a type of electron microscope that produces images of a sample by scanning the surface with a focused beam of electrons. The electrons interact with atoms in the sample, producing various signals that contain information about the surface topography and composition of the sample. The electron beam is scanned in a raster scan pattern, and the position of the beam is combined with the intensity of the detected signal to produce an image. In the most common SEM mode, secondary electrons emitted by atoms excited by the electron beam are detected using a secondary electron detector. The number of secondary electrons that can be detected, and thus the signal intensity, depends, among other things, on specimen topography. Some SEMs can achieve resolutions better than 1 nanometer.
The optical microscope, also referred to as a light microscope, is a type of microscope that commonly uses visible light and a system of lenses to generate magnified images of small objects. Optical microscopes are the oldest design of microscope and were possibly invented in their present compound form in the 17th century. Basic optical microscopes can be very simple, although many complex designs aim to improve resolution and sample contrast.
In optics, any optical instrument or system – a microscope, telescope, or camera – has a principal limit to its resolution due to the physics of diffraction. An optical instrument is said to be diffraction-limited if it has reached this limit of resolution performance. Other factors may affect an optical system's performance, such as lens imperfections or aberrations, but these are caused by errors in the manufacture or calculation of a lens, whereas the diffraction limit is the maximum resolution possible for a theoretically perfect, or ideal, optical system.
A total internal reflection fluorescence microscope (TIRFM) is a type of microscope with which a thin region of a specimen, usually less than 200 nanometers thick, can be observed.
Confocal microscopy, most frequently confocal laser scanning microscopy (CLSM) or laser scanning confocal microscopy (LSCM), is an optical imaging technique for increasing optical resolution and contrast of a micrograph by means of using a spatial pinhole to block out-of-focus light in image formation. Capturing multiple two-dimensional images at different depths in a sample enables the reconstruction of three-dimensional structures within an object. This technique is used extensively in the scientific and industrial communities and typical applications are in life sciences, semiconductor inspection and materials science.
Optical resolution describes the ability of an imaging system to resolve detail, in the object that is being imaged. An imaging system may have many individual components, including one or more lenses, and/or recording and display components. Each of these contributes to the optical resolution of the system; the environment in which the imaging is done often is a further important factor.
The optical transfer function (OTF) of an optical system such as a camera, microscope, human eye, or projector specifies how different spatial frequencies are captured or transmitted. It is used by optical engineers to describe how the optics project light from the object or scene onto a photographic film, detector array, retina, screen, or simply the next item in the optical transmission chain. A variant, the modulation transfer function (MTF), neglects phase effects, but is equivalent to the OTF in many situations.
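The relationship between these quantities can be shown numerically: the OTF is the Fourier transform of the system's point spread function (PSF), and the MTF is its magnitude. The Gaussian PSF below is an illustrative stand-in, not the PSF of any particular instrument:

```python
import numpy as np

# Illustrative Gaussian PSF, sigma = 2 pixels, on a 64x64 grid.
x = np.arange(-32, 32)
xx, yy = np.meshgrid(x, x)
psf = np.exp(-(xx**2 + yy**2) / (2 * 2.0**2))
psf /= psf.sum()                          # normalize to unit total energy

otf = np.fft.fft2(np.fft.ifftshift(psf))  # complex OTF of the system
mtf = np.abs(otf)                         # MTF discards the phase

dc = mtf[0, 0]  # response at zero spatial frequency: exactly 1
```

For a normalized PSF the MTF is 1 at zero frequency and falls off with increasing spatial frequency, quantifying how contrast in fine detail is lost through the optics.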
The following are common definitions related to the machine vision field.
Chemical imaging is the analytical capability to create a visual image of the distribution of components from simultaneous measurement of spectral and spatial (and sometimes temporal) information. Hyperspectral imaging measures contiguous spectral bands, as opposed to multispectral imaging, which measures spaced spectral bands.
Vertico spatially modulated illumination (Vertico-SMI) is the fastest light microscope for the 3D analysis of complete cells in the nanometer range. It is based on two technologies developed in 1996, SMI and SPDM. The effective optical resolution of this optical nanoscope reached the vicinity of 5 nm in 2D and 40 nm in 3D in 2010, greatly surpassing the λ/2 resolution limit applying to standard microscopy using transmission or reflection of natural light according to the Abbe resolution limit. That limit was determined by Ernst Abbe in 1873 and governs the achievable resolution of microscopes using conventional techniques.
Super-resolution microscopy is a series of techniques in optical microscopy that allow such images to have resolutions higher than those imposed by the diffraction limit, which is due to the diffraction of light. Super-resolution imaging techniques rely on the near-field or on the far-field. Among techniques that rely on the latter are those that improve the resolution only modestly beyond the diffraction-limit, such as confocal microscopy with closed pinhole or aided by computational methods such as deconvolution or detector-based pixel reassignment, the 4Pi microscope, and structured-illumination microscopy technologies such as SIM and SMI.
Digital holographic microscopy (DHM) is digital holography applied to microscopy. Digital holographic microscopy distinguishes itself from other microscopy methods by not recording the projected image of the object. Instead, the light wave front information originating from the object is digitally recorded as a hologram, from which a computer calculates the object image by using a numerical reconstruction algorithm. The image-forming lens in traditional microscopy is thus replaced by a computer algorithm. Other microscopy methods closely related to digital holographic microscopy are interferometric microscopy, optical coherence tomography and diffraction phase microscopy. Common to all these methods is the use of a reference wave front to obtain amplitude (intensity) and phase information. The information is recorded on a digital image sensor or by a photodetector, from which an image of the object is created (reconstructed) by a computer. In traditional microscopy, which does not use a reference wave front, only intensity information is recorded and essential information about the object is lost.
Time stretch microscopy, also known as serial time-encoded amplified imaging/microscopy or stretched time-encoded amplified imaging/microscopy (STEAM), is a fast real-time optical imaging method that provides MHz frame rate, ~100 ps shutter speed, and ~30 dB optical image gain. Based on the photonic time stretch technique, STEAM holds world records for shutter speed and frame rate in continuous real-time imaging. STEAM employs photonic time stretch with internal Raman amplification to realize optical image amplification, circumventing the fundamental trade-off between sensitivity and speed that affects virtually all optical imaging and sensing systems. This method uses a single-pixel photodetector, eliminating the need for a detector array and its readout time limitations. Avoiding this problem, and featuring optical image amplification for improved sensitivity at high image acquisition rates, STEAM's shutter speed is at least 1000 times faster than the best CCD and CMOS cameras. Its frame rate is 1000 times faster than the fastest CCD cameras and 10–100 times faster than the fastest CMOS cameras.
Multifocal plane microscopy (MUM), also known as multiplane microscopy or multifocus microscopy, is a form of light microscopy that allows the tracking of the 3D dynamics in live cells at high temporal and spatial resolution by simultaneously imaging different focal planes within the specimen. In this methodology, the light collected from the sample by an infinity-corrected objective lens is split into two paths. In each path the split light is focused onto a detector which is placed at a specific calibrated distance from the tube lens. In this way, each detector images a distinct plane within the sample. The first developed MUM setup was capable of imaging two distinct planes within the sample. However, the setup can be modified to image more than two planes by further splitting the light in each light path and focusing it onto detectors placed at specific calibrated distances. It has later been improved for imaging up to four distinct planes. To image a greater number of focal planes, simpler techniques based on image splitting optics have been developed. One example is by using a customized image splitting prism, which is capable of capturing up to 8 focal planes using only two cameras. Better yet, standard off-the-shelf partial beamsplitters can be used to construct a so-called z-splitter prism that allows simultaneous imaging of 9 individual focal planes using a single camera. Another technique called multifocus microscopy (MFM) uses diffractive Fourier optics to image up to 25 focal planes.
Photo-activated localization microscopy and stochastic optical reconstruction microscopy (STORM) are widefield fluorescence microscopy imaging methods that allow obtaining images with a resolution beyond the diffraction limit. The methods were proposed in 2006 in the wake of a general emergence of optical super-resolution microscopy methods, and were featured as Methods of the Year for 2008 by the Nature Methods journal. The development of PALM as a targeted biophysical imaging method was largely prompted by the discovery of new species and the engineering of mutants of fluorescent proteins displaying a controllable photochromism, such as photo-activatible GFP. However, the concomitant development of STORM, sharing the same fundamental principle, originally made use of paired cyanine dyes. One molecule of the pair, when excited near its absorption maximum, serves to reactivate the other molecule to the fluorescent state.
Live-cell imaging is the study of living cells using time-lapse microscopy. It is used by scientists to obtain a better understanding of biological function through the study of cellular dynamics. One of the first time-lapse microcinematographic films of cells was made by Julius Ries, showing the fertilization and development of the sea urchin egg. Since then, several microscopy methods have been developed to study living cells in greater detail with less effort. A newer type of imaging using quantum dots has been adopted, as they are shown to be more photostable. The development of holotomographic microscopy has avoided phototoxicity and other staining-derived disadvantages by implementing digital staining based on cells' refractive index.
Imaging particle analysis is a technique for making particle measurements using digital imaging, one of the techniques defined by the broader term particle size analysis. The measurements that can be made include particle size, particle shape (morphology or shape analysis) and grayscale or color, as well as distributions of statistical population measurements.
4D scanning transmission electron microscopy is a subset of scanning transmission electron microscopy (STEM) which utilizes a pixelated electron detector to capture a convergent beam electron diffraction (CBED) pattern at each scan location. This technique captures a two-dimensional reciprocal space image associated with each scan point as the beam rasters across a two-dimensional region in real space, hence the name 4D STEM. Its development was enabled by evolution in STEM detectors and improvements in computational power. The technique has applications in virtual diffraction imaging, phase orientation and strain mapping, and phase contrast analysis, among others.