Phase-contrast imaging is a family of imaging methods with a wide range of applications. It exploits differences in the refractive index of different materials to differentiate between the structures under analysis. In conventional light microscopy, phase contrast can be employed to distinguish between structures of similar transparency, and to examine crystals on the basis of their double refraction. This has uses in the biological, medical and geological sciences. In X-ray tomography, the same physical principles can be used to increase image contrast by highlighting small details of differing refractive index within structures that are otherwise uniform. In transmission electron microscopy (TEM), phase contrast enables very high resolution (HR) imaging, making it possible to distinguish features a few ångströms apart (at this point the highest resolution achieved is 40 pm [1]).
Phase-contrast imaging is commonly used in atomic physics to describe a range of techniques for dispersively imaging ultracold atoms. Dispersion describes the propagation of electromagnetic fields (light) in matter: in general, the refractive index of a material, which alters the phase velocity and refraction of the field, depends on the wavelength or frequency of the light. This is what gives rise to the familiar behavior of prisms, which split light into its constituent wavelengths. Microscopically, we may think of this behavior as arising from the interaction of the electromagnetic wave with the atomic dipoles. The oscillating electric field drives the dipoles to oscillate and, in doing so, reradiate light with the same polarization and frequency, albeit delayed or phase-shifted relative to the incident wave. These waves interfere to produce the altered wave that propagates through the medium. If the light is monochromatic (that is, an electromagnetic wave of a single frequency or wavelength), with a frequency close to an atomic transition, the atom will also absorb photons from the light field, reducing the amplitude of the incident wave. Mathematically, these two interaction mechanisms (dispersive and absorptive) are commonly written as the real and imaginary parts, respectively, of a complex refractive index.
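The split into dispersive and absorptive parts can be made concrete with a short numerical sketch (Python/NumPy, with hypothetical values for the wavelength, refractive index, and path length): the real part of $n - 1$ phase-shifts the wave, while the imaginary part attenuates it.

```python
import numpy as np

# Minimal illustration of the complex refractive index n = 1 + delta + i*beta,
# with hypothetical values: delta phase-shifts the wave, beta attenuates it.
k = 2 * np.pi / 780e-9            # vacuum wavenumber of a 780 nm probe
n = 1 + 1e-5 + 1j * 1e-7          # hypothetical medium: weak dispersion, weaker absorption
z = 100e-6                        # 100 um of medium

E = np.exp(1j * n * k * z)        # plane wave after traversing the medium
print(np.angle(E * np.exp(-1j * k * z)))  # extra phase from Re(n) - 1 (radians)
print(abs(E) ** 2)                # intensity attenuation from Im(n)
```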
Dispersive imaging refers strictly to the measurement of the real part of the refractive index. In phase-contrast imaging, a monochromatic probe field is detuned far away from any atomic transition to minimize absorption and is shone onto an atomic medium (such as a Bose-condensed gas). Since absorption is minimized, the only effect of the gas on the light is to alter the phase of various points along its wavefront. If we write the incident electromagnetic field as

$$E(x, y, z) = E_0 e^{i(kz - \omega t)},$$
then the effect of the medium is to phase shift the wave by some amount $\phi(x, y)$, which is in general a function of position $(x, y)$ in the plane of the object (unless the object is of homogeneous density, i.e. of constant index of refraction), where we assume the phase shift to be small, such that we can neglect refractive effects:

$$E(x, y, z) = E_0 e^{i(kz - \omega t)} e^{i\phi(x, y)}.$$
We may think of this wave as a superposition of smaller bundles of waves, each with a corresponding phase shift $\phi'$:

$$E = \frac{E_0}{A}\, e^{i(kz - \omega t)} \int e^{i\phi'}\, dA,$$
where $A$ is a normalization constant and the integral is over the area of the object plane. Since $\phi'$ is assumed to be small, we may expand that part of the exponential to first order, $e^{i\phi'} \approx 1 + i\phi'$, such that

$$E \approx E_0 e^{i(kz - \omega t)} \left(1 + i\bar{\phi}\right),$$
where $\bar{\phi} = \frac{1}{A}\int \phi'\, dA$ represents the integral over all small changes in phase to the wavefront due to each point in the area of the object. Looking at the real part of this expression, we find the sum of a wave with the original unshifted phase, $\cos(kz - \omega t)$, and a wave that is $\pi/2$ out of phase with it and has very small amplitude $\bar{\phi}$. As written, this is simply another complex wave with phase

$$\delta = \tan^{-1}\bar{\phi} \approx \bar{\phi}.$$
Since imaging systems see only changes in the intensity of the electromagnetic waves, which is proportional to the square of the electric field, we have $I \propto E_0^2\left(1 + \bar{\phi}^2\right) \approx E_0^2$ for small $\bar{\phi}$. We see that both the incident wave and the phase-shifted wave are equivalent in this respect. Such objects, which only impart phase changes to the light passing through them, are commonly referred to as phase objects, and are for this reason invisible to any imaging system. However, if we look more closely at the real part of our phase-shifted wave,

$$\operatorname{Re}(E) \approx E_0\left[\cos(kz - \omega t) - \bar{\phi}\sin(kz - \omega t)\right],$$
and suppose we could shift the term unaltered by the phase object (the cosine term) by $\pi/2$, such that $\cos(kz - \omega t) \to -\sin(kz - \omega t)$, then we have

$$\operatorname{Re}(E) \approx -E_0\left(1 + \bar{\phi}\right)\sin(kz - \omega t).$$
The phase shifts due to the phase object are effectively converted into amplitude fluctuations of a single wave. These would be detectable by an imaging system, since the intensity is now $I \propto E_0^2\left(1 + 2\bar{\phi}\right)$ to first order in $\bar{\phi}$. This is the basis of the idea of phase-contrast imaging. [2]
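The derivation can be checked numerically. The following sketch (hypothetical values) verifies that a small phase shift leaves the intensity unchanged, while delaying the unscattered part of the wave by $\pi/2$ produces an intensity signal linear in the phase:

```python
import numpy as np

phi = 0.05                        # small phase shift from the phase object (radians)
E0 = 1.0                          # incident field amplitude

# Without a phase plate: E = E0 * exp(i*phi) has unchanged intensity.
E_plain = E0 * np.exp(1j * phi)
print(abs(E_plain) ** 2)          # 1.0 -- the phase object is invisible

# With a phase plate: split the field into the unscattered wave (shifted
# by pi/2) and the part diffracted by the object, then recombine.
E_unscattered = E0 * np.exp(1j * np.pi / 2)
E_diffracted = E0 * (np.exp(1j * phi) - 1)
I = abs(E_unscattered + E_diffracted) ** 2
print(I)                          # ~ 1 + 2*phi -- the phase is now visible
```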
As an example, consider the setup shown in the figure. A probe laser is incident on a phase object, which could be an atomic medium such as a Bose–Einstein condensate. [3] The laser light is detuned far from any atomic resonance, such that the object only alters the phase of the portion of the wavefront that passes through it. The rays that pass through the phase object diffract as a function of the index of refraction of the medium and diverge, as shown by the dotted lines in the figure. The objective lens collimates this light while focusing the so-called 0-order light, that is, the portion of the beam unaltered by the phase object (solid lines). This light comes to a focus in the focal plane of the objective lens, where a phase plate can be positioned to delay only the phase of the 0-order beam, bringing it back into phase with the diffracted beam and converting the phase alterations in the diffracted beam into intensity fluctuations at the imaging plane. The phase plate is usually a piece of glass with a raised center encircled by a shallower etch, such that light passing through the center is delayed in phase relative to that passing through the edges.
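The action of the phase plate in this setup can be sketched with a one-dimensional Fourier-optics model, in which the focal plane of the objective is represented by a Fourier transform and the plate delays only the 0-order (DC) component. The phase profile below is hypothetical:

```python
import numpy as np

N = 512
x = np.linspace(-1, 1, N)
phi = 0.1 * np.exp(-x**2 / 0.05)        # hypothetical phase profile of the object

field = np.exp(1j * phi)                # pure phase object: |field|^2 == 1 everywhere
spectrum = np.fft.fft(field)            # focal plane of the objective lens
spectrum[0] *= np.exp(1j * np.pi / 2)   # phase plate: delay only the 0-order beam
image = np.fft.ifft(spectrum)           # imaging plane

intensity = np.abs(image) ** 2          # ~ 1 + 2*(phi - mean(phi)): object visible
print(intensity.max() - intensity.min())
```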
In polarization-contrast imaging, the Faraday effect of the light–matter interaction is leveraged to image the cloud using a standard absorption-imaging setup altered with a far-detuned probe beam and an extra polarizer. The Faraday effect rotates the linear polarization of a probe beam as it passes through a cloud polarized by a strong magnetic field along the propagation direction of the probe beam.
Classically, a linearly polarized probe beam may be thought of as a superposition of two oppositely handed, circularly polarized beams. The rotating magnetic field of each circular component interacts with the magnetic dipoles of the atoms in the sample. If the sample is magnetically polarized in a direction with non-zero projection onto the light field's k-vector, the two circularly polarized beams interact with the magnetic dipoles of the sample with different strengths, corresponding to a relative phase shift between the two beams. This phase shift in turn maps to a rotation of the input beam's linear polarization.
The quantum physics of the Faraday interaction may be described by the interaction of the second-quantized Stokes parameters, which describe the polarization of the probe light field, with the total angular momentum state of the atoms. Thus, if a BEC or other cold, dense sample of atoms is prepared in a particular spin (hyperfine) state polarized parallel to the imaging light's propagation direction, both the density and changes in the spin state may be monitored by feeding the transmitted probe beam through a beam splitter before imaging it onto a camera sensor. By adjusting the polarizer's optic axis relative to the input linear polarization, one can switch between a dark-field scheme (zero light in the absence of atoms) and variable phase-contrast imaging. [4] [5] [6]
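A minimal Jones-calculus sketch of the two schemes, with a hypothetical rotation angle standing in for the effect of the atomic cloud:

```python
import numpy as np

def faraday_rotated(theta):
    """Jones vector of x-polarized light after Faraday rotation by theta."""
    return np.array([np.cos(theta), np.sin(theta)])

def through_polarizer(E, alpha):
    """Field transmitted by a linear polarizer at angle alpha from x."""
    axis = np.array([np.cos(alpha), np.sin(alpha)])
    return np.dot(axis, E)

theta = 0.02                      # hypothetical small rotation from the atoms
E = faraday_rotated(theta)

# Dark-field scheme: polarizer crossed with the input polarization.
I_dark = abs(through_polarizer(E, np.pi / 2)) ** 2    # ~ theta^2; zero without atoms

# Variable phase-contrast scheme: polarizer offset slightly from crossed.
I_offset = abs(through_polarizer(E, np.pi / 2 - 0.1)) ** 2  # signal linear in theta
print(I_dark, I_offset)
```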
In addition to phase contrast, there are a number of other similar dispersive imaging methods. In the dark-field method, [7] the aforementioned phase plate is made completely opaque, such that the 0-order contribution to the beam is totally removed; in the absence of any imaging object, the image plane would be dark. This amounts to removing the factor of 1 in the equation

$$E \approx E_0 e^{i(kz - \omega t)} \left(1 + i\bar{\phi}\right)$$

from above, leaving $E \approx i E_0 \bar{\phi}\, e^{i(kz - \omega t)}$. Comparing the squares of the two expressions, one finds that in the dark-field case the intensity goes as $\bar{\phi}^2$, so the range of contrast (or dynamic range of the intensity signal) is actually reduced. For this reason the method has fallen out of use.
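A quick numerical comparison of the two signals, following the expressions above:

```python
import numpy as np

phi = np.array([0.01, 0.02, 0.05, 0.1])   # small phase shifts (radians)
signal_phase_contrast = 2 * phi            # intensity change on the bright background
signal_dark_field = phi ** 2               # total dark-field intensity
print(signal_dark_field / signal_phase_contrast)  # = phi/2, i.e. much less than 1
```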
In the defocus-contrast method, [8] [9] the phase plate is replaced by a defocusing of the objective lens. Doing so breaks the equivalence of parallel ray path lengths, such that a relative phase is acquired between parallel rays. By controlling the amount of defocusing, one can thus achieve an effect similar to that of the phase plate in standard phase contrast. In this case, however, the defocusing scrambles the phase and amplitude modulation of the rays diffracted from the object in a way that does not capture the exact phase information of the object, but instead produces an intensity signal proportional to the amount of phase noise in the object.
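In one dimension, the defocus can be sketched as Fresnel propagation of the object's exit wave over a distance $\Delta z$, which multiplies each spatial frequency $k$ by the phase factor $e^{-i\pi\lambda\Delta z k^2}$. All values below are hypothetical:

```python
import numpy as np

N, dx = 1024, 1e-6                         # grid points, pixel size (m)
x = (np.arange(N) - N // 2) * dx
phi = 0.1 * np.exp(-x**2 / (20e-6) ** 2)   # weak phase object
psi = np.exp(1j * phi)                     # exit wave, uniform intensity

lam, dz = 780e-9, 5e-3                     # wavelength and defocus distance
k = np.fft.fftfreq(N, dx)                  # spatial frequencies (1/m)
H = np.exp(-1j * np.pi * lam * dz * k**2)  # Fresnel (defocus) transfer function

image = np.fft.ifft(np.fft.fft(psi) * H)
intensity = np.abs(image) ** 2             # phase partly converted to intensity
print(intensity.max() - intensity.min())   # nonzero: contrast from defocus alone
```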
Another method, the bright-field balanced (BBD) method, leverages the complementary intensity changes of transmitted disks at different scattering angles. This provides straightforward, dose-efficient, and noise-robust phase imaging from atomic resolution to intermediate length scales, covering both light and heavy atomic columns as well as nanoscale magnetic phases in FeGe samples. [10]
Phase contrast takes advantage of the fact that different structures have different refractive indices and bend, refract, or delay light passing through the sample by different amounts. The changes in the light's passage result in some waves being 'out of phase' with others. Phase-contrast microscopes transform this effect into amplitude differences that are observable in the eyepieces, appearing as darker or brighter areas of the resultant image.
Phase contrast is used extensively in optical microscopy, in both the biological and geological sciences. In biology, it is employed in viewing unstained biological samples, making it possible to distinguish between structures of similar transparency or refractive index.
In geology, phase contrast is exploited to highlight differences between mineral crystals cut to a standardised thin section (usually 30 μm) and mounted under a light microscope. Crystalline materials are capable of exhibiting double refraction, in which light rays entering a crystal are split into two beams that may exhibit different refractive indices, depending on the angle at which they enter the crystal. The phase contrast between the two rays can be detected with the human eye using particular optical filters. As the exact nature of the double refraction varies for different crystal structures, phase contrast aids in the identification of minerals.
There are four main techniques for X-ray phase-contrast imaging, which use different principles to convert phase variations in the X-rays emerging from the object into intensity variations at an X-ray detector. [11] [12] Propagation-based phase contrast [13] uses free-space propagation to obtain edge enhancement; Talbot and polychromatic far-field interferometry [12] [14] [15] use a set of diffraction gratings to measure the derivative of the phase; refraction-enhanced imaging [16] uses an analyzer crystal, also for a differential measurement; and X-ray interferometry [17] uses a crystal interferometer to measure the phase directly. The advantages of these methods compared to normal absorption-contrast X-ray imaging are higher contrast for low-absorbing materials (because phase shift is a different mechanism than absorption) and a contrast-to-noise relationship that increases with spatial frequency (because many phase-contrast techniques detect the first or second derivative of the phase shift), which makes it possible to see smaller details. [15] One disadvantage is that these methods require more sophisticated equipment, such as synchrotron or microfocus X-ray sources, X-ray optics, and high-resolution X-ray detectors. This sophisticated equipment provides the sensitivity required to differentiate between small variations in the refractive index of X-rays passing through different media: for X-rays the refractive index is normally smaller than 1, with a deviation from 1 of between 10⁻⁷ and 10⁻⁶.
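A back-of-the-envelope estimate (hypothetical but representative values) shows why such tiny refractive-index deviations still produce measurable phase shifts:

```python
import numpy as np

wavelength = 0.5e-10         # 0.5 angstrom, a typical hard X-ray wavelength
delta = 1e-6                 # refractive index decrement: n = 1 - delta
thickness = 50e-6            # 50 um of material

phase_shift = 2 * np.pi * delta * thickness / wavelength
print(phase_shift)           # ~ 2*pi radians: easily detectable
```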
All of these methods produce images that can be used to calculate the projections (integrals) of the refractive index in the imaging direction. For propagation-based phase contrast there are phase-retrieval algorithms; for Talbot interferometry and refraction-enhanced imaging the image is integrated in the appropriate direction; and for X-ray interferometry phase unwrapping is performed. For this reason these methods are well suited for tomography, i.e. reconstruction of a 3D map of the refractive index of the object from many images taken at slightly different angles. For X-ray radiation, the deviation of the refractive index from 1 is essentially proportional to the density of the material.
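For the differential methods, the integration step amounts to a cumulative sum of the measured phase gradient along the sensitivity direction. A sketch with synthetic data:

```python
import numpy as np

dx = 1e-6                                        # pixel size (m)
x = np.arange(1000) * dx
true_phase = np.exp(-((x - 5e-4) / 1e-4) ** 2)   # synthetic phase projection

# A grating interferometer measures (approximately) the phase derivative.
measured_gradient = np.gradient(true_phase, dx)

# Recover the phase projection by integrating along the grating direction.
recovered = np.cumsum(measured_gradient) * dx
print(np.max(np.abs(recovered - true_phase)))    # small reconstruction error
```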
Synchrotron X-ray tomography can employ phase contrast imaging to enable imaging of the interior surfaces of objects. In this context, phase contrast imaging is used to enhance the contrast that would normally be possible from conventional radiographic imaging. A difference in the refractive index between a detail and its surroundings causes a phase shift between the light wave that travels through the detail and that which travels outside the detail. An interference pattern results, marking out the detail. [18]
This method has been used to image Precambrian metazoan embryos from the Doushantuo Formation in China, allowing the internal structure of delicate microfossils to be imaged without destroying the original specimen. [19]
In the field of transmission electron microscopy, phase-contrast imaging may be employed to image columns of individual atoms; a more common name for this is high-resolution transmission electron microscopy (HRTEM). It is the highest-resolution imaging technique ever developed and can allow for resolutions of less than one ångström (less than 0.1 nanometres), enabling the direct viewing of columns of atoms in a crystalline material. [20] [21]
The interpretation of these images is not a straightforward task. Computer simulations are used to determine what sort of contrast different structures may produce in a phase-contrast image. These simulations commonly use the multislice method of Cowley and Moodie [22] and include the phase changes due to the lens aberrations. [23] They require a reasonable amount of information about the sample and the imaging conditions, such as the crystal structure of the material, before the image can be properly interpreted.
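A heavily simplified sketch of the multislice idea, not of any production HRTEM code: the specimen is divided into thin slices along the beam, each slice multiplies the electron wave by a phase grating, and Fresnel propagation carries the wave to the next slice. The specimen and all parameters below are hypothetical:

```python
import numpy as np

N, px = 256, 0.2e-10                 # grid size and pixel size (m)
wavelength = 2.51e-12                # electron wavelength at ~200 kV

k = np.fft.fftfreq(N, px)            # spatial frequencies (1/m)
KX, KY = np.meshgrid(k, k)
K2 = KX**2 + KY**2

def propagate(psi, dz):
    """Fresnel propagation over a slice thickness dz (paraxial)."""
    H = np.exp(-1j * np.pi * wavelength * dz * K2)
    return np.fft.ifft2(np.fft.fft2(psi) * H)

def multislice(phase_slices, dz):
    """Transmit the wave through each phase-grating slice, then propagate."""
    psi = np.ones((N, N), dtype=complex)      # incident plane wave
    for phase in phase_slices:
        psi = propagate(psi * np.exp(1j * phase), dz)
    return psi

# Hypothetical weak-phase specimen: ten identical random slices.
rng = np.random.default_rng(0)
slices = [0.05 * rng.random((N, N))] * 10
exit_wave = multislice(slices, dz=2e-10)
print(np.abs(exit_wave).mean())
```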
The images are formed by removing the objective aperture entirely or by using a very large objective aperture. This ensures that not only the transmitted beam, but also the diffracted ones are allowed to contribute to the image. Instruments that are specifically designed for phase-contrast imaging are called HRTEMs (high resolution transmission electron microscopes), and differ from analytical TEMs mainly in the design of the electron beam column. Advances in spherical aberration (Cs) correction have enabled a new generation of HRTEMs to reach significantly better resolutions. [24]