High-resolution transmission electron microscopy is an imaging mode of specialized transmission electron microscopes that allows for direct imaging of the atomic structure of samples. [1] [2] It is a powerful tool to study properties of materials on the atomic scale, such as semiconductors, metals, nanoparticles and sp²-bonded carbon (e.g., graphene, carbon nanotubes). While this term is often also used to refer to high resolution scanning transmission electron microscopy, mostly in high angle annular dark field mode, this article mainly describes the imaging of an object by recording the two-dimensional spatial wave amplitude distribution in the image plane, similar to a "classic" light microscope. For disambiguation, the technique is also often referred to as phase contrast transmission electron microscopy, although this term is less appropriate. At present, the highest point resolution realised in high resolution transmission electron microscopy is around 0.5 ångströms (0.050 nm). [3] At these small scales, individual atoms of a crystal and defects can be resolved. For 3-dimensional crystals, it is necessary to combine several views, taken from different angles, into a 3D map. This technique is called electron tomography.
One of the difficulties with high resolution transmission electron microscopy is that image formation relies on phase contrast. In phase-contrast imaging, contrast is not intuitively interpretable, as the image is influenced by aberrations of the imaging lenses in the microscope. The largest contributions for uncorrected instruments typically come from defocus and astigmatism. The latter can be estimated from the so-called Thon ring pattern appearing in the Fourier transform modulus of an image of a thin amorphous film.
The contrast of a high resolution transmission electron microscopy image arises from the interference in the image plane of the electron wave with itself. Due to our inability to record the phase of an electron wave, only the amplitude in the image plane is recorded. However, a large part of the structure information of the sample is contained in the phase of the electron wave. In order to detect it, the aberrations of the microscope (like defocus) have to be tuned in a way that converts the phase of the wave at the specimen exit plane into amplitudes in the image plane.
The interaction of the electron wave with the crystallographic structure of the sample is complex, but a qualitative idea of the interaction can readily be obtained. Each imaging electron interacts independently with the sample. Above the sample, the wave of an electron can be approximated as a plane wave incident on the sample surface. As it penetrates the sample, it is attracted by the positive atomic potentials of the atom cores, and channels along the atom columns of the crystallographic lattice (s-state model [4] ). At the same time, the interaction between the electron wave in different atom columns leads to Bragg diffraction. The exact description of dynamical scattering of electrons in a sample not satisfying the weak phase object approximation, which is almost all real samples, still remains the holy grail of electron microscopy. However, the physics of electron scattering and electron microscope image formation are sufficiently well known to allow accurate simulation of electron microscope images. [5]
As a result of the interaction with a crystalline sample, the electron exit wave right below the sample φe(x,u) as a function of the spatial coordinate x is a superposition of a plane wave and a multitude of diffracted beams with different in-plane spatial frequencies u (spatial frequencies correspond to scattering angles, or distances of rays from the optical axis in a diffraction plane). The phase change φe(x,u) relative to the incident wave peaks at the location of the atom columns. The exit wave now passes through the imaging system of the microscope where it undergoes further phase change and interferes as the image wave in the imaging plane (mostly a digital pixel detector like a CCD camera). The recorded image is not a direct representation of the sample's crystallographic structure. For instance, high intensity might or might not indicate the presence of an atom column in that precise location (see simulation). The relationship between the exit wave and the image wave is a highly nonlinear one and is a function of the aberrations of the microscope. It is described by the contrast transfer function.
The phase contrast transfer function is a function of limiting apertures and aberrations in the imaging lenses of a microscope. It describes their effect on the phase of the exit wave φe(x,u) and propagates it to the image wave. Following Williams and Carter, [6] if one assumes the weak phase object approximation (thin sample), the contrast transfer function becomes

CTF(u) = A(u)·E(u)·2 sin(χ(u))
where A(u) is the aperture function, and E(u), also called the envelope function, describes the attenuation of the wave for higher spatial frequencies u. χ(u) is a function of the aberrations of the electron optical system.
The last, sinusoidal term of the contrast transfer function determines the sign with which components of frequency u enter contrast in the final image. If one takes into account only spherical aberration to third order and defocus, χ is rotationally symmetric about the optical axis of the microscope and thus only depends on the modulus u = |u|, given by

χ(u) = (π/2)·Cs·λ³·u⁴ + π·Δf·λ·u²
where Cs is the spherical aberration coefficient, λ is the electron wavelength, and Δf is the defocus. In transmission electron microscopy, defocus can easily be controlled and measured to high precision. Thus one can easily alter the shape of the contrast transfer function by defocusing the sample. Contrary to optical applications, defocusing can increase the precision and interpretability of the micrographs.
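The effect of defocus on the sinusoidal term can be sketched numerically. The snippet below is an illustrative sketch, not instrument software: the 300 kV wavelength and Cs value are assumed example parameters, and the aperture and envelope functions are omitted.

```python
import numpy as np

# Illustrative sketch of the phase contrast transfer function sin(chi(u))
# for an uncorrected microscope. Parameter values are assumed examples
# (300 kV electrons, Cs = 0.6 mm); apertures and envelopes are omitted.
wavelength = 1.97e-12   # m, electron wavelength at ~300 keV
Cs = 0.6e-3             # m, spherical aberration coefficient

def chi(u, defocus):
    """Aberration phase shift: (pi/2)*Cs*lambda^3*u^4 + pi*defocus*lambda*u^2."""
    return ((np.pi / 2) * Cs * wavelength**3 * u**4
            + np.pi * defocus * wavelength * u**2)

def ctf(u, defocus):
    """Phase contrast transfer function (sinusoidal term only)."""
    return np.sin(chi(u, defocus))

u = np.linspace(0.0, 8e9, 801)      # spatial frequencies, 1/m
in_focus = ctf(u, 0.0)              # Gaussian focus: rapid Cs*u^4 oscillation
underfocus = ctf(u, -41.25e-9)      # negative defocus flattens chi at low u
```

Sampling both curves shows the point made in the text: at zero defocus the sign of the transfer flips rapidly with u, while a suitable negative defocus produces a broad band of similar phase.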
The aperture function cuts off beams scattered above a certain critical angle (set, for example, by the objective pole piece), thus effectively limiting the attainable resolution. However, it is the envelope function E(u) which usually dampens the signal of beams scattered at high angles, and imposes a maximum to the transmitted spatial frequency. This maximum determines the highest resolution attainable with a microscope and is known as the information limit. E(u) can be described as a product of single envelopes:

E(u) = Es(u)·Ec(u)·Ed(u)·Ev(u)·ED(u)

due to

Es(u): angular spread of the source
Ec(u): chromatic aberration
Ed(u): specimen drift
Ev(u): specimen vibration
ED(u): the detector
Specimen drift and vibration can be minimized in a stable environment. It is usually the spherical aberration Cs that limits spatial coherency and defines Es(u), and the chromatic aberration Cc, together with current and voltage instabilities, that defines the temporal coherency in Ec(u). These two envelopes determine the information limit by damping the signal transfer in Fourier space with increasing spatial frequency u:

Es(u) = exp[−(π·α/λ)²·(Cs·λ³·u³ + Δf·λ·u)²]
where α is the semiangle of the pencil of rays illuminating the sample. Clearly, if the wave aberration (here represented by Cs and Δf) vanished, this envelope function would be a constant one. In case of an uncorrected transmission electron microscope with fixed Cs, the damping due to this envelope function can be minimized by optimizing the defocus at which the image is recorded (Lichte defocus).
The temporal envelope function can be expressed as

Ec(u) = exp[−½·(π·λ·δ)²·u⁴]
Here, δ is the focal spread with the chromatic aberration Cc as the parameter:

δ = Cc·√(4·(ΔI/I)² + (ΔV/V)² + (ΔE/V)²)
The terms ΔI/I and ΔV/V represent instabilities of the total current in the magnetic lenses and of the acceleration voltage V. ΔE is the energy spread of electrons emitted by the source.
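The two coherence envelopes can be sketched numerically. In the snippet below, the convergence semiangle, chromatic aberration coefficient and instability values are assumed, illustrative numbers, not data for any particular instrument.

```python
import numpy as np

# Sketch of the spatial and temporal coherence envelopes.
# All parameter values are assumed, illustrative numbers.
wavelength = 1.97e-12   # m, electron wavelength at ~300 keV
Cs = 0.6e-3             # m, spherical aberration
Cc = 1.2e-3             # m, chromatic aberration (assumed)
alpha = 0.1e-3          # rad, illumination semiangle (assumed)
defocus = -41.25e-9     # m

def spatial_envelope(u):
    """Es(u) = exp[-(pi*alpha/lambda)^2 * (Cs*lambda^3*u^3 + defocus*lambda*u)^2]."""
    grad = Cs * wavelength**3 * u**3 + defocus * wavelength * u
    return np.exp(-(np.pi * alpha / wavelength)**2 * grad**2)

# Focal spread from lens-current, voltage and energy-spread instabilities
# (assumed relative values).
dI_I, dV_V, dE_V = 1e-6, 1e-6, 2e-6
delta = Cc * np.sqrt(4 * dI_I**2 + dV_V**2 + dE_V**2)

def temporal_envelope(u):
    """Ec(u) = exp[-(1/2) * (pi*lambda*delta)^2 * u^4]."""
    return np.exp(-0.5 * (np.pi * wavelength * delta)**2 * u**4)
```

Evaluating the temporal envelope over increasing u shows how the focal spread gradually suppresses high spatial frequencies and thereby sets the information limit.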
The information limit of current state-of-the-art transmission electron microscopes is well below 1 Å. The TEAM project at Lawrence Berkeley National Laboratory resulted in the first transmission electron microscope to reach an information limit of <0.5 Å in 2009 [7] by the use of a highly stable mechanical and electrical environment, an ultra-bright, monochromated electron source and double-hexapole aberration correctors.
Choosing the optimum defocus is crucial to fully exploit the capabilities of an electron microscope in high resolution transmission electron microscopy mode. However, there is no simple answer as to which one is the best.
In Gaussian focus, one sets the defocus to zero and the sample is in focus. As a consequence, contrast in the image plane gets its image components from the minimal area of the sample; the contrast is localized (no blurring and information overlap from other parts of the sample). The contrast transfer function becomes a function that oscillates quickly with (π/2)·Cs·λ³·u⁴. This means that for certain diffracted beams with a spatial frequency u the contribution to contrast in the recorded image will be reversed, thus making interpretation of the image difficult.
In Scherzer defocus, one aims to counter the u⁴ term with the parabolic term π·Δf·λ·u² of χ(u). Thus by choosing the right defocus value Δf one flattens χ(u) and creates a wide band where low spatial frequencies u are transferred into image intensity with a similar phase. In 1949, Scherzer found that the optimum defocus depends on microscope properties like the spherical aberration Cs and the accelerating voltage (through λ) in the following way:

ΔfScherzer = −1.2·√(Cs·λ)
where the factor 1.2 defines the extended Scherzer defocus. For the CM300 at NCEM, Cs = 0.6 mm and an accelerating voltage of 300 keV (λ = 1.97 pm) result in ΔfScherzer = −41.25 nm.
The point resolution of a microscope is defined as the spatial frequency ures where the contrast transfer function crosses the abscissa for the first time. At Scherzer defocus this value is maximized:

ures(Scherzer) = (6/(Cs·λ³))^(1/4) ≈ 1.56·Cs^(−1/4)·λ^(−3/4)
which corresponds to about 6.1 nm−1 on the CM300. Contributions with a spatial frequency higher than the point resolution can be filtered out with an appropriate aperture, leading to easily interpretable images at the cost of discarding a great deal of information.
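These two numbers can be checked directly. The short sketch below uses the CM300 values quoted above together with the standard closed-form expressions Δf = −1.2·√(Cs·λ) and ures = (6/(Cs·λ³))^(1/4); the small difference from the quoted 6.1 nm−1 comes from rounding the wavelength.

```python
import numpy as np

# Numeric check of the extended Scherzer defocus and the point resolution
# for CM300-like parameters (Cs = 0.6 mm, lambda = 1.97 pm at 300 keV).
wavelength = 1.97e-12   # m
Cs = 0.6e-3             # m

scherzer_defocus = -1.2 * np.sqrt(Cs * wavelength)     # m
point_resolution = (6.0 / (Cs * wavelength**3))**0.25  # 1/m

print(scherzer_defocus * 1e9)    # about -41.3 nm
print(point_resolution / 1e9)    # about 6.0 nm^-1
```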
Gabor defocus is used in electron holography where both amplitude and phase of the image wave are recorded. One thus wants to minimize crosstalk between the two. The Gabor defocus can be expressed as a function of the Scherzer defocus as

ΔfGabor = 0.56·ΔfScherzer
To exploit all beams transmitted through the microscope up to the information limit, one relies on a complex method called exit wave reconstruction which consists in mathematically reversing the effect of the contrast transfer function to recover the original exit wave φe(x,u). To maximize the information throughput, Hannes Lichte proposed in 1991 a defocus of a fundamentally different nature than the Scherzer defocus: because the dampening of the envelope function scales with the first derivative of χ(u), Lichte proposed a focus minimizing the modulus of dχ(u)/du: [8]

ΔfLichte = −(3/4)·Cs·λ²·u²max
where umax is the maximum transmitted spatial frequency. For the CM300 with an information limit of 0.8 Å, the Lichte defocus lies at −272 nm.
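A quick numeric check of this value with the expression Δf = −(3/4)·Cs·λ²·u²max and the CM300 parameters quoted above:

```python
# Numeric check of the Lichte defocus for the CM300 values quoted above
# (Cs = 0.6 mm, lambda = 1.97 pm, information limit 0.8 angstrom).
Cs = 0.6e-3                  # m
wavelength = 1.97e-12        # m
u_max = 1.0 / 0.8e-10        # 1/m, information limit of 0.8 angstrom

lichte_defocus = -0.75 * Cs * wavelength**2 * u_max**2
print(lichte_defocus * 1e9)  # close to the -272 nm quoted above
```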
To calculate back to φe(x,u), the wave in the image plane is numerically back-propagated to the sample. If all properties of the microscope are well known, it is possible to recover the real exit wave with very high accuracy.
First, however, both phase and amplitude of the electron wave in the image plane must be measured. As our instruments only record amplitudes, an alternative method to recover the phase has to be used. There are two methods in use today: off-axis electron holography, which uses an electron biprism to superimpose a reference wave on the object wave and recovers phase and amplitude from the resulting interference pattern; and focal-series reconstruction, which records a series of images at different defocus values and, together with exact knowledge of the contrast transfer function, computes the exit wave numerically.
Both methods extend the point resolution of the microscope to the information limit, which is the highest possible resolution achievable on a given machine. The ideal defocus value for this type of imaging is known as the Lichte defocus and is usually several hundred nanometers negative.
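As a toy illustration of the inversion step, the sketch below uses a simplified, assumed 1-D linear-imaging model under the weak phase object approximation, with a made-up aberration function: a known specimen phase is pushed through a contrast transfer function and then recovered with a Wiener-type inverse filter. Real reconstructions work on focal series or holograms rather than a single image.

```python
import numpy as np

# Toy 1-D sketch: a known specimen "phase" is multiplied by a CTF in Fourier
# space and then recovered with a Wiener-type inverse filter. The aberration
# function and all numbers are assumed toy values, not a real microscope model.
n = 256
x = np.arange(n)
phase = np.cos(2 * np.pi * 5 * x / n) + 0.5 * np.cos(2 * np.pi * 12 * x / n)

u = np.fft.fftfreq(n)                  # dimensionless spatial frequency
chi = 2000.0 * u**4 - 300.0 * u**2     # toy aberration function
ctf = np.sin(chi)

observed = np.fft.fft(phase) * ctf     # linear imaging: spectrum times CTF
eps = 1e-4                             # regularization against CTF zeros
recovered = np.real(np.fft.ifft(observed * ctf / (ctf**2 + eps)))

error = np.max(np.abs(recovered - phase))
print(error)                           # small residual from regularization
```

The regularization constant eps controls the trade-off near the zero crossings of the CTF, where the forward model transfers no information and a naive division would blow up.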
Microscopy is the technical field of using microscopes to view objects and areas of objects that cannot be seen with the naked eye. There are three well-known branches of microscopy: optical, electron, and scanning probe microscopy, along with the emerging field of X-ray microscopy.
In physics and mathematics, wavelength or spatial period of a wave or periodic function is the distance over which the wave's shape repeats. In other words, it is the distance between consecutive corresponding points of the same phase on the wave, such as two adjacent crests, troughs, or zero crossings. Wavelength is a characteristic of both traveling waves and standing waves, as well as other spatial wave patterns. The inverse of the wavelength is called the spatial frequency. Wavelength is commonly designated by the Greek letter lambda (λ). The term "wavelength" is also sometimes applied to modulated waves, and to the sinusoidal envelopes of modulated waves or waves formed by interference of several sinusoids.
Transmission electron microscopy (TEM) is a microscopy technique in which a beam of electrons is transmitted through a specimen to form an image. The specimen is most often an ultrathin section less than 100 nm thick or a suspension on a grid. An image is formed from the interaction of the electrons with the sample as the beam is transmitted through the specimen. The image is then magnified and focused onto an imaging device, such as a fluorescent screen, a layer of photographic film, or a detector such as a scintillator attached to a charge-coupled device or a direct electron detector.
Angular resolution describes the ability of any image-forming device such as an optical or radio telescope, a microscope, a camera, or an eye, to distinguish small details of an object, thereby making it a major determinant of image resolution. It is used in optics applied to light waves, in antenna theory applied to radio waves, and in acoustics applied to sound waves. The colloquial use of the term "resolution" sometimes causes confusion; when an optical system is said to have a high resolution or high angular resolution, it means that the perceived distance, or actual angular distance, between resolved neighboring objects is small. The value that quantifies this property, θ, which is given by the Rayleigh criterion, is low for a system with a high resolution. The closely related term spatial resolution refers to the precision of a measurement with respect to space, which is directly connected to angular resolution in imaging instruments. The Rayleigh criterion shows that the minimum angular spread that can be resolved by an image-forming system is limited by diffraction to the ratio of the wavelength of the waves to the aperture width. For this reason, high-resolution imaging systems such as astronomical telescopes, long distance telephoto camera lenses and radio telescopes have large apertures.
Fourier optics is the study of classical optics using Fourier transforms (FTs), in which the waveform being considered is regarded as made up of a combination, or superposition, of plane waves. It has some parallels to the Huygens–Fresnel principle, in which the wavefront is regarded as being made up of a combination of spherical wavefronts whose sum is the wavefront being studied. A key difference is that Fourier optics considers the plane waves to be natural modes of the propagation medium, as opposed to Huygens–Fresnel, where the spherical waves originate in the physical medium.
In optics, any optical instrument or system – a microscope, telescope, or camera – has a principal limit to its resolution due to the physics of diffraction. An optical instrument is said to be diffraction-limited if it has reached this limit of resolution performance. Other factors may affect an optical system's performance, such as lens imperfections or aberrations, but these are caused by errors in the manufacture or calculation of a lens, whereas the diffraction limit is the maximum resolution possible for a theoretically perfect, or ideal, optical system.
Photoemission electron microscopy is a type of electron microscopy that utilizes local variations in electron emission to generate image contrast. The excitation is usually produced by ultraviolet light, synchrotron radiation or X-ray sources. PEEM measures the X-ray absorption coefficient indirectly by collecting the emitted secondary electrons generated in the electron cascade that follows the creation of the primary core hole in the absorption process. PEEM is a surface sensitive technique because the emitted electrons originate from a shallow layer. In physics, this technique is referred to as PEEM, which goes together naturally with low-energy electron diffraction (LEED), and low-energy electron microscopy (LEEM). In biology, it is called photoelectron microscopy (PEM), which fits with photoelectron spectroscopy (PES), transmission electron microscopy (TEM), and scanning electron microscopy (SEM).
The point spread function (PSF) describes the response of a focused optical imaging system to a point source or point object. A more general term for the PSF is the system's impulse response; the PSF is the impulse response or impulse response function (IRF) of a focused optical imaging system. The PSF in many contexts can be thought of as the extended blob in an image that represents a single point object, that is considered as a spatial impulse. In functional terms, it is the spatial domain version of the optical transfer function (OTF) of an imaging system. It is a useful concept in Fourier optics, astronomical imaging, medical imaging, electron microscopy and other imaging techniques such as 3D microscopy and fluorescence microscopy.
A scanning transmission electron microscope (STEM) is a type of transmission electron microscope (TEM). Pronunciation is [stɛm] or [ɛsti:i:ɛm]. As with a conventional transmission electron microscope (CTEM), images are formed by electrons passing through a sufficiently thin specimen. However, unlike CTEM, in STEM the electron beam is focused to a fine spot which is then scanned over the sample in a raster illumination system constructed so that the sample is illuminated at each point with the beam parallel to the optical axis. The rastering of the beam across the sample makes STEM suitable for analytical techniques such as Z-contrast annular dark-field imaging, and spectroscopic mapping by energy dispersive X-ray (EDX) spectroscopy, or electron energy loss spectroscopy (EELS). These signals can be obtained simultaneously, allowing direct correlation of images and spectroscopic data.
Optical resolution describes the ability of an imaging system to resolve detail, in the object that is being imaged. An imaging system may have many individual components, including one or more lenses, and/or recording and display components. Each of these contributes to the optical resolution of the system; the environment in which the imaging is done often is a further important factor.
Phase-contrast imaging is a method of imaging that has a range of different applications. It measures differences in the refractive index of different materials to differentiate between structures under analysis. In conventional light microscopy, phase contrast can be employed to distinguish between structures of similar transparency, and to examine crystals on the basis of their double refraction. This has uses in biological, medical and geological science. In X-ray tomography, the same physical principles can be used to increase image contrast by highlighting small details of differing refractive index within structures that are otherwise uniform. In transmission electron microscopy (TEM), phase contrast enables very high resolution (HR) imaging, making it possible to distinguish features a few Angstrom apart.
The Strehl ratio is a measure of the quality of optical image formation, originally proposed by Karl Strehl, after whom the term is named. Used variously in situations where optical resolution is compromised due to lens aberrations or due to imaging through the turbulent atmosphere, the Strehl ratio has a value between 0 and 1, with a hypothetical, perfectly unaberrated optical system having a Strehl ratio of 1.
The multislice algorithm is a method for the simulation of the elastic scattering of an electron beam with matter, including all multiple scattering effects. The method is reviewed in the book by John M. Cowley, and also the work by Ishizuka. The algorithm is used in the simulation of high resolution transmission electron microscopy (HREM) micrographs, and serves as a useful tool for analyzing experimental images. This article describes some relevant background information, the theoretical basis of the technique, approximations used, and several software packages that implement this technique. Some of the advantages and limitations of the technique and important considerations that need to be taken into account are described.
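The core loop of the method can be sketched as follows. This is a 1-D toy with assumed parameters, not a production simulation: the specimen is cut into thin slices, the wave is multiplied by a transmission function at each slice, and then Fresnel-propagated to the next slice in Fourier space. Real multislice codes are 2-D and derive the transmission function from tabulated atomic potentials.

```python
import numpy as np

# Minimal 1-D multislice sketch with assumed toy parameters.
n = 512
dx = 0.2e-10                  # m, real-space sampling (assumed)
wavelength = 1.97e-12         # m, 300 kV electrons
dz = 2e-10                    # m, slice thickness
sigma = 1.0                   # interaction constant (toy value)

x = (np.arange(n) - n // 2) * dx
u = np.fft.fftfreq(n, d=dx)   # spatial frequencies, 1/m
# Fresnel propagator between slices, applied in Fourier space.
propagator = np.exp(-1j * np.pi * wavelength * dz * u**2)

# Toy projected potential per slice: a row of Gaussian "atom columns".
v_slice = sum(np.exp(-((x - c) / 0.3e-10)**2) for c in (-2e-10, 0.0, 2e-10))
transmission = np.exp(1j * sigma * v_slice)   # phase-grating transmission

wave = np.ones(n, dtype=complex)   # incident plane wave
for _ in range(20):                # 20 slices, ~4 nm of crystal
    wave = np.fft.ifft(np.fft.fft(wave * transmission) * propagator)

print(np.abs(wave).max())          # intensity concentrates on the columns
```

Because both the phase grating and the propagator have unit modulus, each step is unitary and the total intensity is conserved, which is a useful sanity check in any multislice implementation.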
Optical sectioning is the process by which a suitably designed microscope can produce clear images of focal planes deep within a thick sample. This is used to reduce the need for thin sectioning using instruments such as the microtome. Many different techniques for optical sectioning are used and several microscopy techniques are specifically designed to improve the quality of optical sectioning.
The contrast transfer function (CTF) mathematically describes how aberrations in a transmission electron microscope (TEM) modify the image of a sample. The CTF sets the resolution of high-resolution transmission electron microscopy (HRTEM), also known as phase contrast TEM.
Crystallographic image processing (CIP) is traditionally understood as being a set of key steps in the determination of the atomic structure of crystalline matter from high-resolution electron microscopy (HREM) images obtained in a transmission electron microscope (TEM) that is run in the parallel illumination mode. The term was created in the research group of Sven Hovmöller at Stockholm University during the early 1980s and rapidly became a label for the "3D crystal structure from 2D transmission/projection images" approach. Since the late 1990s, analogous and complementary image processing techniques directed towards goals that are either complementary to or entirely beyond the scope of the original inception of CIP have been developed independently by members of the computational symmetry/geometry, scanning transmission electron microscopy, scanning probe microscopy, and applied crystallography communities.
White light interferometry is a non-contact optical method for surface height measurement on 3D structures with surface profiles varying between tens of nanometers and a few centimeters. The term is often used as an alternative name for coherence scanning interferometry in the context of areal surface topography instrumentation that relies on spectrally-broadband, visible-wavelength light.
The scanning helium microscope (SHeM) is a form of microscopy that uses low-energy (5–100 meV) neutral helium atoms to image the surface of a sample without any damage to the sample caused by the imaging process. Since helium is inert and neutral, it can be used to study delicate and insulating surfaces. Images are formed by rastering a sample underneath an atom beam and monitoring the flux of atoms that are scattered into a detector at each point.
The transport-of-intensity equation (TIE) is a computational approach to reconstruct the phase of a complex wave in optical and electron microscopy. It describes the internal relationship between the intensity and phase distribution of a wave.
Convergent beam electron diffraction (CBED) is an electron diffraction technique where a convergent or divergent beam of electrons is used to study materials.