Fourier ptychography is a computational imaging technique based on optical microscopy that synthesizes a wider numerical aperture from a set of full-field images acquired at various coherent illumination angles, [1] resulting in increased resolution compared to a conventional microscope.
Each image is acquired under the illumination of a coherent light source at various angles of incidence (typically from an array of LEDs); the acquired image set is then combined using an iterative phase retrieval algorithm into a final high-resolution image that can contain up to a billion pixels (a gigapixel) with diffraction-limited resolution, resulting in a high space-bandwidth product.
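The acquisition geometry described above can be sketched numerically: each tilted plane wave shifts the object's spectrum, and the objective's fixed pupil then low-pass filters a different region of it before the camera records the intensity. The following is a minimal sketch of this forward model; all names and parameter values are illustrative, not from a real instrument.

```python
import numpy as np

# Minimal sketch of the Fourier ptychography forward model.
# All parameter values are illustrative, not from a real instrument.
n = 256                     # object grid size (pixels)
wavelength = 0.5e-6         # illumination wavelength (m)
na_obj = 0.1                # objective numerical aperture
pixel = 1e-6                # object-plane sampling (m)

rng = np.random.default_rng(0)
obj = np.exp(1j * rng.uniform(0, 0.5, (n, n)))   # stand-in complex object

# Spatial-frequency grid and the objective's circular pupil (a low-pass filter).
f = np.fft.fftfreq(n, d=pixel)
fx, fy = np.meshgrid(f, f)
pupil = (fx**2 + fy**2) <= (na_obj / wavelength) ** 2

def low_res_image(obj, illum_fx, illum_fy):
    """Intensity image under a tilted plane wave from one LED.

    Tilting the illumination by spatial frequency (illum_fx, illum_fy)
    shifts the object's spectrum, so the fixed pupil passes a different
    region of it for each LED.
    """
    spectrum = np.fft.fft2(obj)
    dx = int(round(illum_fx * n * pixel))    # shift in frequency samples
    dy = int(round(illum_fy * n * pixel))
    shifted = np.roll(spectrum, (-dy, -dx), axis=(0, 1))
    return np.abs(np.fft.ifft2(shifted * pupil)) ** 2

bright_field = low_res_image(obj, 0.0, 0.0)   # on-axis LED
dark_field = low_res_image(obj, 3e5, 0.0)     # oblique LED beyond the pupil cutoff
```

Collecting such images over a grid of LED angles tiles out a synthetic aperture much wider than the objective's own pupil.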
Fourier ptychography reconstructs the complex image of the object (with quantitative phase information), but unlike holography it is a non-interferometric imaging technique and is thus often easier to implement.
The name "ptychography" comes from the ancient Greek word πτυχή ("fold", also found in the word triptych), because the technique is based on multiple "views" of the object.
The image reconstruction algorithms are based on iterative phase retrieval, [2] either related to the Gerchberg–Saxton algorithm or based on convex relaxation methods. [3] As in real-space ptychography, the solution of the phase problem relies on the same mathematical shift-invariance constraint, except that in Fourier ptychography it is the diffraction pattern in the back focal plane that moves with respect to the back-focal-plane aperture. (In traditional ptychography the illumination moves with respect to the specimen.) Many reconstruction algorithms used in real-space ptychography are therefore also used in Fourier ptychography, most commonly PIE [4] [5] and variants such as ePIE [6] and 3PIE. [7] Variants of these algorithms allow the simultaneous reconstruction of the pupil function of the optical system, [8] which corrects the aberrations of the microscope objective, and diffraction tomography, [9] which permits the 3D reconstruction of thin samples without the angular sample scanning needed for CT scans.
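A single amplitude-replacement update of the kind these algorithms iterate can be sketched as follows. This is a simplified, PIE-style step applied in the Fourier domain; the function name, the binary pupil, and the step size `alpha` are illustrative assumptions, not a specific published implementation.

```python
import numpy as np

# One simplified PIE-style Fourier-domain update (illustrative sketch,
# not a specific published implementation; assumes a binary pupil).
def fp_update(spectrum, pupil, measured_intensity, shift, alpha=1.0):
    """Refine the high-resolution spectrum with one low-resolution image.

    spectrum           : current estimate of the object's Fourier spectrum
    pupil              : pupil function of the objective (here binary)
    measured_intensity : recorded low-resolution intensity image
    shift              : (dy, dx) spectrum shift in samples for this LED
    alpha              : update step size
    """
    # 1. Select the sub-spectrum this illumination angle brings into the pupil.
    sub = np.roll(spectrum, (-shift[0], -shift[1]), axis=(0, 1)) * pupil
    # 2. Propagate to the image plane.
    field = np.fft.ifft2(sub)
    # 3. Enforce the measurement: replace the amplitude, keep the phase.
    field = np.sqrt(measured_intensity) * np.exp(1j * np.angle(field))
    # 4. Propagate back and correct the spectrum inside the pupil support.
    new_sub = np.fft.fft2(field) * pupil
    update = np.roll(alpha * (new_sub - sub), (shift[0], shift[1]), axis=(0, 1))
    return spectrum + update
```

Sweeping this update over all LED images, repeated for several passes, stitches the measured sub-spectra into one wide synthetic spectrum whose inverse transform is the high-resolution complex image.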
Fourier ptychography can be easily implemented on a conventional optical microscope by replacing the illumination source with an array of LEDs, improving the optical resolution by a factor of 2 (with bright-field illumination only) or more (when dark-field images are included in the reconstruction).
A major advantage of Fourier ptychography is the ability to use a microscope objective with a lower numerical aperture without sacrificing resolution. A lower numerical aperture allows a larger field of view, a larger depth of focus, and a longer working distance. Moreover, it enables an effective numerical aperture larger than 1 without resorting to oil immersion. [10]
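The gain can be estimated with simple arithmetic: the synthetic numerical aperture is approximately the sum of the objective NA and the illumination NA, so a steep dark-field LED angle can push the sum past 1 even in air. A back-of-the-envelope example with illustrative values:

```python
# Back-of-the-envelope synthetic NA in Fourier ptychography
# (illustrative values, not from a specific instrument).
wavelength_nm = 500
na_objective = 0.1        # low-NA objective: wide field, long working distance
na_illumination = 0.95    # steepest dark-field LED angle used

# The synthetic NA is approximately the sum of the two.
na_synthetic = na_objective + na_illumination
half_pitch_nm = wavelength_nm / (2 * na_synthetic)

print(round(na_synthetic, 2))   # 1.05 — exceeds 1 without oil immersion
print(round(half_pitch_nm))     # 238 (nm)
```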
In contrast to Fourier ptychography, conventional ptychography swaps the role of the focusing element from objective to condenser and relies on the acquisition of diffraction patterns with illumination-position diversity. However, both techniques determine the angular spectrum of the object through a phase retrieval procedure [11] and inherently reconstruct the same information. Fourier ptychography and conventional ptychography therefore provide a bridge between coherent diffraction imaging and full-field microscopy.
Microscopy is the technical field of using microscopes to view objects and areas of objects that cannot be seen with the naked eye. There are three well-known branches of microscopy: optical, electron, and scanning probe microscopy, along with the emerging field of X-ray microscopy.
The optical microscope, also referred to as a light microscope, is a type of microscope that commonly uses visible light and a system of lenses to generate magnified images of small objects. Optical microscopes are the oldest design of microscope and were possibly invented in their present compound form in the 17th century. Basic optical microscopes can be very simple, although many complex designs aim to improve resolution and sample contrast.
Transmission electron microscopy (TEM) is a microscopy technique in which a beam of electrons is transmitted through a specimen to form an image. The specimen is most often an ultrathin section less than 100 nm thick or a suspension on a grid. An image is formed from the interaction of the electrons with the sample as the beam is transmitted through the specimen. The image is then magnified and focused onto an imaging device, such as a fluorescent screen, a layer of photographic film, or a detector such as a scintillator attached to a charge-coupled device or a direct electron detector.
In optics, any optical instrument or system – a microscope, telescope, or camera – has a principal limit to its resolution due to the physics of diffraction. An optical instrument is said to be diffraction-limited if it has reached this limit of resolution performance. Other factors may affect an optical system's performance, such as lens imperfections or aberrations, but these are caused by errors in the manufacture or calculation of a lens, whereas the diffraction limit is the maximum resolution possible for a theoretically perfect, or ideal, optical system.
The point spread function (PSF) describes the response of a focused optical imaging system to a point source or point object. A more general term for the PSF is the system's impulse response; the PSF is the impulse response function (IRF) of a focused optical imaging system. In many contexts the PSF can be thought of as the extended blob in an image that represents a single point object, considered as a spatial impulse. In functional terms, it is the spatial-domain version of the optical transfer function (OTF) of an imaging system. It is a useful concept in Fourier optics, astronomical imaging, medical imaging, electron microscopy, and other imaging techniques such as 3D microscopy and fluorescence microscopy.
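The PSF–OTF relationship is easy to demonstrate numerically: for a circular pupil, the incoherent PSF is the squared magnitude of the pupil's Fourier transform, and the OTF is in turn the Fourier transform of the PSF. A sketch on an arbitrary unitless grid (pupil radius chosen for illustration only):

```python
import numpy as np

# PSF and OTF from a circular pupil, illustrating that the PSF is the
# spatial-domain counterpart of the OTF (unitless illustrative grid).
n = 256
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
pupil = (x**2 + y**2) <= 20**2            # circular aperture, radius 20 samples

# Incoherent PSF: squared magnitude of the Fourier transform of the pupil.
psf = np.abs(np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))) ** 2
psf /= psf.sum()                          # normalize total energy to 1

# OTF: Fourier transform of the PSF (equivalently, the pupil autocorrelation).
otf = np.fft.fft2(np.fft.ifftshift(psf))
otf /= np.abs(otf[0, 0])                  # normalize DC response to 1
```

The resulting `psf` is the familiar Airy pattern peaked at the center, and `otf` falls off to zero at the aperture's cutoff frequency.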
Super-resolution imaging (SR) is a class of techniques that enhance (increase) the resolution of an imaging system. In optical SR the diffraction limit of systems is transcended, while in geometrical SR the resolution of digital imaging sensors is enhanced.
RESOLFT, an acronym for REversible Saturable OpticaL Fluorescence Transitions, denotes a group of optical fluorescence microscopy techniques with very high resolution. Using standard far-field visible-light optics, a resolution far below the diffraction limit, down to molecular scales, can be obtained.
Dark-field microscopy describes microscopy methods, in both light and electron microscopy, which exclude the unscattered beam from the image. Consequently, the field around the specimen is generally dark.
Digital holography refers to the acquisition and processing of holograms with a digital sensor array, typically a CCD camera or a similar device. Image rendering, or reconstruction of object data, is performed numerically from digitized interferograms. Digital holography offers a means of measuring optical phase data and typically delivers three-dimensional surface or optical thickness images. Several recording and processing schemes have been developed to assess optical wave characteristics such as amplitude, phase, and polarization state, which make digital holography a very powerful method for metrology applications.
Interferometric microscopy or imaging interferometric microscopy is the concept of microscopy which is related to holography, synthetic-aperture imaging, and off-axis dark-field illumination techniques. Interferometric microscopy enhances the resolution of optical microscopy through the interferometric (holographic) registration of several partial images and their numerical combination.
Coherent diffractive imaging (CDI) is a "lensless" technique for 2D or 3D reconstruction of the image of nanoscale structures such as nanotubes, nanocrystals, porous nanocrystalline layers, defects, potentially proteins, and more. In CDI, a highly coherent beam of X-rays, electrons or other wavelike particle or photon is incident on an object.
Ptychography is a computational method of microscopic imaging. It generates images by processing many coherent interference patterns that have been scattered from an object of interest. Its defining characteristic is translational invariance, which means that the interference patterns are generated by one constant function moving laterally by a known amount with respect to another constant function. The interference patterns occur some distance away from these two components, so that the scattered waves spread out and "fold" into one another.
Optical sectioning is the process by which a suitably designed microscope can produce clear images of focal planes deep within a thick sample. This is used to reduce the need for thin sectioning using instruments such as the microtome. Many different techniques for optical sectioning are used and several microscopy techniques are specifically designed to improve the quality of optical sectioning.
Digital holographic microscopy (DHM) is digital holography applied to microscopy. Digital holographic microscopy distinguishes itself from other microscopy methods by not recording the projected image of the object. Instead, the light wave front information originating from the object is digitally recorded as a hologram, from which a computer calculates the object image by using a numerical reconstruction algorithm. The image-forming lens in traditional microscopy is thus replaced by a computer algorithm. Other microscopy methods closely related to digital holographic microscopy are interferometric microscopy, optical coherence tomography, and diffraction phase microscopy. Common to all these methods is the use of a reference wave front to obtain amplitude (intensity) and phase information. The information is recorded on a digital image sensor or by a photodetector, from which an image of the object is created (reconstructed) by a computer. In traditional microscopy, which does not use a reference wave front, only intensity information is recorded and essential information about the object is lost.
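The numerical reconstruction step typically involves propagating the recorded wave front back to the object plane. One common building block for this is angular-spectrum propagation, sketched below; the function name and parameter values are illustrative assumptions, not a specific package's API.

```python
import numpy as np

# Minimal angular-spectrum propagation, the kind of numerical step a
# digital holography reconstruction uses to refocus a recorded wave front
# (illustrative sketch; all parameter values are assumptions).
def propagate(field, distance, wavelength, pixel):
    """Propagate a sampled complex field by `distance` (same units as pixel)."""
    n = field.shape[0]
    f = np.fft.fftfreq(n, d=pixel)
    fx, fy = np.meshgrid(f, f)
    # Longitudinal spatial frequency squared; negative values are evanescent.
    fz2 = 1 / wavelength**2 - fx**2 - fy**2
    kernel = np.exp(2j * np.pi * distance * np.sqrt(np.maximum(fz2, 0)))
    kernel[fz2 < 0] = 0          # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * kernel)
```

Propagating a field forward and then backward by the same distance recovers the original field (for the propagating components), which is what makes numerical refocusing of a hologram possible.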
Multifocal plane microscopy (MUM), also known as multiplane microscopy or multifocus microscopy, is a form of light microscopy that allows the tracking of 3D dynamics in live cells at high temporal and spatial resolution by simultaneously imaging different focal planes within the specimen. In this methodology, the light collected from the sample by an infinity-corrected objective lens is split into two paths. In each path, the split light is focused onto a detector placed at a specific calibrated distance from the tube lens. In this way, each detector images a distinct plane within the sample. The first MUM setup was capable of imaging two distinct planes within the sample. However, the setup can be modified to image more than two planes by further splitting the light in each light path and focusing it onto detectors placed at specific calibrated distances. It was later improved to image up to four distinct planes. To image a greater number of focal planes, simpler techniques based on image-splitting optics have been developed. One example uses a customized image-splitting prism, which can capture up to 8 focal planes using only two cameras. Alternatively, standard off-the-shelf partial beamsplitters can be used to construct a so-called z-splitter prism that allows simultaneous imaging of 9 individual focal planes using a single camera. Another technique, called multifocus microscopy (MFM), uses diffractive Fourier optics to image up to 25 focal planes.
Light sheet fluorescence microscopy (LSFM) is a fluorescence microscopy technique with an intermediate-to-high optical resolution, but good optical sectioning capabilities and high speed. In contrast to epifluorescence microscopy, only a thin slice of the sample is illuminated, perpendicular to the direction of observation. For illumination, a laser light sheet is used, i.e. a laser beam which is focused in only one direction. A second method uses a circular beam scanned in one direction to create the light sheet. As only the section actually being observed is illuminated, this method reduces the photodamage and stress induced on a living sample. The good optical sectioning capability also reduces the background signal and thus creates images with higher contrast, comparable to confocal microscopy. Because light sheet fluorescence microscopy scans samples using a plane of light instead of a point, it can acquire images at speeds 100 to 1,000 times faster than point-scanning methods.
Wide-field multiphoton microscopy refers to an optical non-linear imaging technique tailored for ultrafast imaging, in which a large area of the object is illuminated and imaged without the need for scanning. High intensities are required to induce non-linear optical processes such as two-photon fluorescence or second-harmonic generation. In scanning multiphoton microscopes the high intensities are achieved by tightly focusing the light, and the image is obtained by beam scanning. In wide-field multiphoton microscopy the high intensities are best achieved using an optically amplified pulsed laser source to attain a large field of view (~100 µm). The image in this case is obtained as a single frame with a CCD, without the need for scanning, making the technique particularly useful for visualizing dynamic processes simultaneously across the object of interest. With wide-field multiphoton microscopy the frame rate can be increased up to 1000-fold compared to multiphoton scanning microscopy. Wide-field multiphoton microscopes are not yet commercially available, but working prototypes exist in several optics laboratories.
Computational imaging is the process of indirectly forming images from measurements using algorithms that rely on a significant amount of computing. In contrast to traditional imaging, computational imaging systems involve a tight integration of the sensing system and the computation in order to form the images of interest. The ubiquitous availability of fast computing platforms, advances in algorithms, and modern sensing hardware are resulting in imaging systems with significantly enhanced capabilities. Computational imaging systems cover a broad range of applications, including computational microscopy, tomographic imaging, MRI, ultrasound imaging, computational photography, synthetic aperture radar (SAR), and seismic imaging. The integration of the sensing and the computation in computational imaging systems allows access to information which was otherwise not possible.
John Marius Rodenburg is Professor in the Department of Electronic and Electrical Engineering at the University of Sheffield.
Computational microscopy is a subfield of computational imaging, which combines algorithmic reconstruction with sensing to capture microscopic images of objects. The algorithms used in computational microscopy often combine the information of several images captured using various illuminations or measurements to form an aggregated 2D or 3D image using iterative techniques or machine learning. Notable forms of computational microscopy include super-resolution fluorescence microscopy, quantitative phase imaging, and Fourier ptychography. Computational microscopy is at the intersection of computer science and optics.