In physics, a wavefront of a time-varying field is the set (locus) of all points where the wave has the same phase of the sinusoid. The term is generally meaningful only for fields that, at each point, vary sinusoidally in time with a single temporal frequency (otherwise the phase is not well defined).
Physics is the natural science that studies matter, its motion and behavior through space and time, and that studies the related entities of energy and force. Physics is one of the most fundamental scientific disciplines, and its main goal is to understand how the universe behaves.
In geometry, a locus is the set of all points whose location satisfies or is determined by one or more specified conditions.
In modern mathematics, a point usually refers to an element of some set called a space.
Wavefronts usually move with time. For waves propagating in a one-dimensional medium, the wavefronts are usually single points; they are curves in a two-dimensional medium, and surfaces in a three-dimensional one.
In mathematics, a curve is, generally speaking, an object similar to a line but that need not be straight. Thus, a curve is a generalization of a line, in that it may be curved.
In mathematics, a surface is a generalization of a plane which doesn't need to be flat – that is, the curvature is not necessarily zero. This is analogous to a curve generalizing a straight line. There are several more precise definitions, depending on the context and the mathematical tools that are used for the study.
For a sinusoidal plane wave, the wavefronts are planes perpendicular to the direction of propagation, that move in that direction together with the wave. For a sinusoidal spherical wave, the wavefronts are spherical surfaces that expand with it. If the speed of propagation is different at different points of a wavefront, the shape and/or orientation of the wavefronts may change by refraction. In particular, lenses can change the shape of optical wavefronts from planar to spherical, or vice versa.
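As a minimal sketch (pure Python, with illustrative parameters), the constant-phase condition that defines a plane wavefront can be written out directly: a wavefront is a locus where the argument of the sinusoid is fixed, so it advances by one wavelength per temporal period.

```python
import math

def plane_wave(x, t, amplitude=1.0, wavelength=0.5e-6, speed=3.0e8, phase0=0.0):
    """Field value of a sinusoidal plane wave at distance x along the
    propagation direction.  A wavefront is a locus of constant phase
    k*x - omega*t + phase0."""
    k = 2 * math.pi / wavelength          # angular wavenumber
    omega = k * speed                     # angular temporal frequency
    return amplitude * math.cos(k * x - omega * t + phase0)

# A wavefront (constant phase) advances by one wavelength per period:
wavelength, speed = 0.5e-6, 3.0e8
period = wavelength / speed
v0 = plane_wave(0.0, 0.0)
v1 = plane_wave(wavelength, period)  # same phase one period later
```

Because the phase depends only on `k*x - omega*t`, evaluating the field one wavelength further along after one period returns the same value, which is exactly the statement that the wavefront moves with the wave.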
In physics, a sinusoidal plane wave is a special case of a plane wave: a field whose value varies as a sinusoidal function of time and of the distance from some fixed plane.
In physics, refraction is the change in direction of a wave passing from one medium to another or from a gradual change in the medium. Refraction of light is the most commonly observed phenomenon, but other waves such as sound waves and water waves also experience refraction. How much a wave is refracted is determined by the change in wave speed and the initial direction of wave propagation relative to the direction of change in speed.
A lens is a transmissive optical device that focuses or disperses a light beam by means of refraction. A simple lens consists of a single piece of transparent material, while a compound lens consists of several simple lenses (elements), usually arranged along a common axis. Lenses are made from materials such as glass or plastic, and are ground and polished or molded to a desired shape. A lens can focus light to form an image, unlike a prism, which refracts light without focusing. Devices that similarly focus or disperse waves and radiation other than visible light are also called lenses, such as microwave lenses, electron lenses, acoustic lenses, or explosive lenses.
Optical systems can be described with Maxwell's equations, and linear propagating waves such as sound or electron beams have similar wave equations. However, given the above simplifications, Huygens' principle provides a quick method to predict the propagation of a wavefront through, for example, free space. The construction is as follows: Let every point on the wavefront be considered a new point source. By calculating the total effect from every point source, the resulting field at new points can be computed. Computational algorithms are often based on this approach. Specific cases for simple wavefronts can be computed directly. For example, a spherical wavefront will remain spherical as the energy of the wave is carried away equally in all directions. Such directions of energy flow, which are always perpendicular to the wavefront, are called rays.
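Huygens' construction can be sketched numerically: treat each point on a wavefront as a secondary source and sum the spherical wavelets at a new point. The sketch below is a scalar, unnormalised illustration of the idea, not a full diffraction integral (it omits the obliquity factor and constant prefactors), with arbitrary example dimensions:

```python
import cmath
import math

def huygens_field(target, sources, wavelength):
    """Sum spherical wavelets from secondary point sources on a wavefront
    (scalar sketch; each wavelet's amplitude falls off as 1/r)."""
    k = 2 * math.pi / wavelength
    total = 0j
    for sx, sy in sources:
        r = math.hypot(target[0] - sx, target[1] - sy)
        total += cmath.exp(1j * k * r) / r
    return total

# Secondary sources spaced along a flat wavefront segment at x = 0,
# spanning y in [-10, 10] (arbitrary units, wavelength 0.5):
wavelength = 0.5
sources = [(0.0, y * 0.05) for y in range(-200, 201)]
on_axis = abs(huygens_field((100.0, 0.0), sources, wavelength))
off_axis = abs(huygens_field((100.0, 60.0), sources, wavelength))
```

The wavelets add nearly in phase straight ahead of the wavefront segment and largely cancel far off to the side, which is how the construction reproduces forward propagation.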
Maxwell's equations are a set of coupled partial differential equations that, together with the Lorentz force law, form the foundation of classical electromagnetism, classical optics, and electric circuits. The equations provide a mathematical model for electric, optical, and radio technologies, such as power generation, electric motors, wireless communication, lenses, radar, etc. Maxwell's equations describe how electric and magnetic fields are generated by charges, currents, and changes of the fields. One important consequence of the equations is that they demonstrate how fluctuating electric and magnetic fields propagate at a constant speed (c) in a vacuum, the "speed of light". Known as electromagnetic radiation, these waves may occur at various wavelengths to produce a spectrum from radio waves to γ-rays. The equations are named after the physicist and mathematician James Clerk Maxwell, who between 1861 and 1862 published an early form of the equations that included the Lorentz force law. He also first used the equations to propose that light is an electromagnetic phenomenon.
A point source is a single identifiable localised source of something. A point source has negligible extent, distinguishing it from other source geometries. Sources are called point sources because in mathematical modeling, these sources can usually be approximated as a mathematical point to simplify analysis.
The simplest form of a wavefront is the plane wave, where the rays are parallel to one another. The light from this type of wave is referred to as collimated light. The plane wavefront is a good model for a surface-section of a very large spherical wavefront; for instance, sunlight strikes the earth with a spherical wavefront that has a radius of about 150 million kilometers (1 AU). For many purposes, such a wavefront can be considered planar over distances of the diameter of Earth.
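The planar approximation can be checked with a short calculation. The sagitta formula s ≈ d²/(8R) gives the bulge of a circular arc of radius R above a chord of length d; applying it to a 1 AU spherical wavefront over a chord the size of Earth's diameter (values rounded) shows how small the deviation from flatness is:

```python
# How flat is a spherical wavefront of radius 1 AU over Earth's diameter?
R = 1.496e11          # 1 AU in metres
d = 1.274e7           # Earth's diameter in metres

# Sagitta: bulge of the arc above its chord, approximately d^2/(8R) for d << R.
sagitta = d**2 / (8 * R)

# Maximum ray tilt across the chord, in radians (rays are radial):
tilt = d / (2 * R)
```

The sagitta works out to roughly 140 metres over a 12,700 km chord, and the ray directions across that chord differ by only a few tens of microradians, so for most purposes the wavefront is effectively planar.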
In physics, a plane wave is a special case of wave or field: a physical quantity whose value, at any moment, is constant over any plane that is perpendicular to a fixed direction in space.
In geometry, parallel lines are lines in a plane which do not meet; that is, two lines in a plane that do not intersect or touch each other at any point are said to be parallel. By extension, a line and a plane, or two planes, in three-dimensional Euclidean space that do not share a point are said to be parallel. However, two lines in three-dimensional space which do not meet must be in a common plane to be considered parallel; otherwise they are called skew lines. Parallel planes are planes in the same three-dimensional space that never meet.
Wavefronts travel with the speed of light in all directions in an isotropic medium.
Methods utilizing wavefront measurements or predictions can be considered an advanced approach to lens optics, where a single focal distance may not exist due to lens thickness or imperfections. Note also that, for manufacturing reasons, lenses are usually given a spherical (or toroidal) surface shape even though, theoretically, the ideal surface would be aspheric. Shortcomings such as these in an optical system cause what are called optical aberrations. The best-known aberrations include spherical aberration and coma.
An aspheric lens or asphere is a lens whose surface profiles are not portions of a sphere or cylinder. In photography, a lens assembly that includes an aspheric element is often called an aspherical lens.
Spherical aberration is a type of aberration found in optical systems that use elements with spherical surfaces. Lenses and curved mirrors are most often made with surfaces that are spherical, because this shape is easier to form than non-spherical curved surfaces. Light rays that strike a spherical surface off-centre are refracted or reflected more or less than those that strike close to the centre. This deviation reduces the quality of images produced by optical systems.
In optics, coma, or comatic aberration, is an aberration inherent to certain optical designs, or caused by imperfections in the lens or other components, that results in off-axis point sources such as stars appearing distorted, with a tail (coma) like that of a comet. Specifically, coma is defined as a variation in magnification over the entrance pupil. In refractive or diffractive optical systems, especially those imaging a wide spectral range, coma can be a function of wavelength, in which case it is a form of chromatic aberration.
However, there may be more complex sources of aberration, such as in a large telescope, due to spatial variations in the index of refraction of the atmosphere. The deviation of a wavefront in an optical system from a desired perfect planar wavefront is called the wavefront aberration. Wavefront aberrations are usually described as either a sampled image or a collection of two-dimensional polynomial terms. Minimization of these aberrations is considered desirable for many applications in optical systems.
A wavefront sensor is a device which measures the wavefront aberration in a coherent signal to describe the optical quality or lack thereof in an optical system. A very common method is to use a Shack–Hartmann lenslet array. There are many applications that include adaptive optics, optical metrology and even the measurement of the aberrations in the eye itself. In this approach, a weak laser source is directed into the eye and the reflection off the retina is sampled and processed.
Alternative wavefront sensing techniques to the Shack–Hartmann system are emerging. Mathematical techniques like phase imaging or curvature sensing are also capable of providing wavefront estimations. These algorithms compute wavefront images from conventional brightfield images at different focal planes without the need for specialised wavefront optics. While Shack–Hartmann lenslet arrays are limited in lateral resolution to the size of the lenslet array, techniques such as these are only limited by the resolution of digital images used to compute the wavefront measurements.
Another application of software reconstruction of the phase is the control of telescopes through the use of adaptive optics. A common method is the Roddier test, also called wavefront curvature sensing. It yields good correction, but needs an already good system as a starting point.
In optics, aberration is a property of optical systems such as lenses that causes light to be spread out over some region of space rather than focused to a point. Aberrations cause the image formed by a lens to be blurred or distorted, with the nature of the distortion depending on the type of aberration. Aberration can be defined as a departure of the performance of an optical system from the predictions of paraxial optics. In an imaging system, it occurs when light from one point of an object does not converge into a single point after transmission through the system. Aberrations occur because the simple paraxial theory is not a completely accurate model of the effect of an optical system on light; they need not be caused by flaws in the optical elements.
Optics is the branch of physics that studies the behaviour and properties of light, including its interactions with matter and the construction of instruments that use or detect it. Optics usually describes the behaviour of visible, ultraviolet, and infrared light. Because light is an electromagnetic wave, other forms of electromagnetic radiation such as X-rays, microwaves, and radio waves exhibit similar properties.
In optics, chromatic aberration is a failure of a lens to focus all colors to the same point. It is caused by dispersion: the refractive index of the lens elements varies with the wavelength of light. The refractive index of most transparent materials decreases with increasing wavelength. Since the focal length of a lens depends on the refractive index, this variation in refractive index affects focusing. Chromatic aberration manifests itself as "fringes" of color along boundaries that separate dark and bright parts of the image.
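The dependence of focal length on refractive index can be made concrete with the thin-lens lensmaker's equation, 1/f = (n − 1)(1/R₁ − 1/R₂), combined with a Cauchy dispersion model n(λ) = A + B/λ². The coefficients below are illustrative assumptions (roughly crown-glass-like), not data for a specific glass:

```python
def cauchy_index(wavelength_um, a=1.5046, b=0.00420):
    """Cauchy approximation n(lambda) = A + B/lambda^2.
    Coefficients are illustrative, roughly crown-glass-like."""
    return a + b / wavelength_um**2

def thin_lens_focal_length(n, r1, r2):
    """Lensmaker's equation for a thin lens in air:
    1/f = (n - 1) * (1/R1 - 1/R2)."""
    return 1.0 / ((n - 1.0) * (1.0 / r1 - 1.0 / r2))

# Symmetric biconvex lens, radii of curvature +/- 10 cm:
r1, r2 = 0.10, -0.10
f_blue = thin_lens_focal_length(cauchy_index(0.486), r1, r2)  # F line, 486 nm
f_red = thin_lens_focal_length(cauchy_index(0.656), r1, r2)   # C line, 656 nm
```

Because the index is higher at shorter wavelengths, blue light focuses closer to the lens than red light; the difference `f_red - f_blue` (a millimetre or so here) is the longitudinal chromatic aberration.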
Adaptive optics (AO) is a technology used to improve the performance of optical systems by reducing the effect of incoming wavefront distortions by deforming a mirror in order to compensate for the distortion. It is used in astronomical telescopes and laser communication systems to remove the effects of atmospheric distortion, in microscopy, optical fabrication and in retinal imaging systems to reduce optical aberrations. Adaptive optics works by measuring the distortions in a wavefront and compensating for them with a device that corrects those errors such as a deformable mirror or a liquid crystal array.
Fourier optics is the study of classical optics using Fourier transforms (FTs), in which the waveform being considered is regarded as made up of a combination, or superposition, of plane waves. It has some parallels to the Huygens–Fresnel principle, in which the wavefront is regarded as being made up of a combination of spherical wavefronts whose sum is the wavefront being studied. A key difference is that Fourier optics considers the plane waves to be natural modes of the propagation medium, as opposed to Huygens–Fresnel, where the spherical waves originate in the physical medium.
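The decomposition of a waveform into plane waves is, for sampled data, exactly a discrete Fourier transform: each DFT component is a complex exponential, i.e. a sampled plane wave. A minimal stdlib-only sketch (a real implementation would use an FFT library):

```python
import cmath
import math

def dft(field):
    """Discrete Fourier transform: expresses a sampled field as a
    superposition of complex-exponential (plane-wave) components."""
    n = len(field)
    return [sum(field[k] * cmath.exp(-2j * math.pi * j * k / n)
                for k in range(n))
            for j in range(n)]

# A single plane wave sampled over one period puts all of its energy
# into exactly one Fourier component:
n = 8
field = [cmath.exp(2j * math.pi * k / n) for k in range(n)]
spectrum = [abs(c) for c in dft(field)]
```

This single-component spectrum is the discrete analogue of the statement that plane waves are the natural modes of Fourier optics.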
Gradient-index (GRIN) optics is the branch of optics covering optical effects produced by a gradient of the refractive index of a material. Such gradual variation can be used to produce lenses with flat surfaces, or lenses that do not have the aberrations typical of traditional spherical lenses. Gradient-index lenses may have a refraction gradient that is spherical, axial, or radial.
Geometrical optics, or ray optics, is a model of optics that describes light propagation in terms of rays. The ray in geometric optics is an abstraction useful for approximating the paths along which light propagates under certain circumstances.
The point spread function (PSF) describes the response of an imaging system to a point source or point object. A more general term for the PSF is a system's impulse response, the PSF being the impulse response of a focused optical system. The PSF in many contexts can be thought of as the extended blob in an image that represents a single point object. In functional terms it is the spatial domain version of the optical transfer function of the imaging system. It is a useful concept in Fourier optics, astronomical imaging, medical imaging, electron microscopy and other imaging techniques such as 3D microscopy and fluorescence microscopy. The degree of spreading (blurring) of the point object is a measure of the quality of an imaging system. In non-coherent imaging systems such as fluorescent microscopes, telescopes or optical microscopes, the image formation process is linear in the image intensity and described by linear system theory. This means that when two objects A and B are imaged simultaneously, the resulting image is equal to the sum of the independently imaged objects. In other words: the imaging of A is unaffected by the imaging of B and vice versa, owing to the non-interacting property of photons. In a space-invariant system, i.e. one in which the PSF is the same everywhere in the imaging space, the image of a complex object is then the convolution of the true object and the PSF. However, when the detected light is coherent, image formation is linear in the complex field. The recorded intensity image then can show cancellations or other non-linear effects.
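The incoherent, space-invariant case described above reduces to a convolution, which a short sketch makes concrete (one-dimensional and pure Python for clarity; real imaging is two-dimensional):

```python
def convolve(signal, psf):
    """Image of an incoherent, space-invariant system: the object's
    intensity distribution convolved with the point spread function."""
    n, m = len(signal), len(psf)
    out = [0.0] * (n + m - 1)
    for i, s in enumerate(signal):
        for j, p in enumerate(psf):
            out[i + j] += s * p
    return out

# Two point objects blurred by a small normalised PSF; by linearity in
# intensity, the image is the sum of the independently blurred points:
obj = [0, 0, 1, 0, 0, 1, 0]
psf = [0.25, 0.5, 0.25]
image = convolve(obj, psf)
```

Because the PSF here sums to one, the blurring spreads each point's energy without creating or destroying it, so total intensity is conserved.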
A catadioptric optical system is one where refraction and reflection are combined in an optical system, usually via lenses (dioptrics) and curved mirrors (catoptrics). Catadioptric combinations are used in focusing systems such as searchlights, headlamps, early lighthouse focusing systems, optical telescopes, microscopes, and telephoto lenses. Other optical systems that use lenses and mirrors are also referred to as "catadioptric" such as surveillance catadioptric sensors.
In optics a ray is an idealized model of light, obtained by choosing a line that is perpendicular to the wavefronts of the actual light, and that points in the direction of energy flow. Rays are used to model the propagation of light through an optical system, by dividing the real light field up into discrete rays that can be computationally propagated through the system by the techniques of ray tracing. This allows even very complex optical systems to be analyzed mathematically or simulated by computer. Ray tracing uses approximate solutions to Maxwell's equations that are valid as long as the light waves propagate through and around objects whose dimensions are much greater than the light's wavelength. Ray theory does not describe phenomena such as interference and diffraction, which require wave theory.
A spatial filter is an optical device which uses the principles of Fourier optics to alter the structure of a beam of light or other electromagnetic radiation, typically coherent laser light. Spatial filtering is commonly used to "clean up" the output of lasers, removing aberrations in the beam due to imperfect, dirty, or damaged optics, or due to variations in the laser gain medium itself. This filtering can be applied to transmit a pure transverse mode from a multimode laser while blocking other modes emitted from the optical resonator. The term "filtering" indicates that the desirable structural features of the original source pass through the filter, while the undesirable features are blocked. Apparatus which follows the filter effectively sees a higher-quality but lower-powered image of the source, instead of the actual source directly. An example of the use of a spatial filter can be seen in advanced setups of micro-Raman spectroscopy.
In optics, defocus is the aberration in which an image is simply out of focus. This aberration is familiar to anyone who has used a camera, videocamera, microscope, telescope, or binoculars. Optically, defocus refers to a translation of the focus along the optical axis away from the detection surface. In general, defocus reduces the sharpness and contrast of the image. What should be sharp, high-contrast edges in a scene become gradual transitions. Fine detail in the scene is blurred or even becomes invisible. Nearly all image-forming optical devices incorporate some form of focus adjustment to minimize defocus and maximize image quality.
A wavefront sensor is a device for measuring the aberrations of an optical wavefront. Although an amplitude splitting interferometer such as the Michelson interferometer could be called a wavefront sensor, the term is normally applied to instruments that do not require an unaberrated reference beam to interfere with. They are commonly used in adaptive optics systems, lens testing and increasingly in ophthalmology.
A Shack–Hartmann wavefront sensor (SHWFS) is an optical instrument used for characterizing an imaging system. It is a wavefront sensor commonly used in adaptive optics systems. It consists of an array of lenses of the same focal length. Each is focused onto a photon sensor. The local tilt of the wavefront across each lens then maps to the position of the focal spot on the sensor. Any phase aberration can be approximated by a set of discrete tilts. By sampling the wavefront with an array of lenslets, all of these local tilts can be measured and the whole wavefront reconstructed.
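The first step of the reconstruction, recovering local slopes from spot displacements, follows from small-angle geometry: a wavefront tilted by angle θ across a lenslet of focal length f shifts that lenslet's focal spot by approximately f·θ. A minimal sketch of that step (hypothetical measurement values; full reconstruction, which integrates the slope field, is omitted):

```python
def local_slopes(spot_displacements, focal_length):
    """Convert each lenslet's measured focal-spot displacement (metres)
    into a local wavefront slope (radians): slope = displacement / f."""
    return [(dx / focal_length, dy / focal_length)
            for dx, dy in spot_displacements]

# A uniform 5-micrometre spot shift on lenslets with f = 5 mm
# corresponds to a pure tip/tilt of 1 milliradian across the pupil:
slopes = local_slopes([(5e-6, 0.0)] * 4, focal_length=5e-3)
```

Identical slopes on every lenslet indicate a flat but tilted wavefront; spatially varying slopes encode higher-order aberrations.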
A wavefront curvature sensor is a device for measuring the aberrations of an optical wavefront. Like a Shack–Hartmann wavefront sensor it uses an array of small lenses to focus the wavefront into an array of spots. Unlike the Shack–Hartmann, which measures the position of the spots, the curvature sensor measures the intensity on either side of the focal plane. If a wavefront has a phase curvature, it will alter the position of the focal spot along the axis of the beam; thus, by measuring the relative intensities in two places the curvature can be deduced.
In physics, ray tracing is a method for calculating the path of waves or particles through a system with regions of varying propagation velocity, absorption characteristics, and reflecting surfaces. Under these circumstances, wavefronts may bend, change direction, or reflect off surfaces, complicating analysis. Ray tracing solves the problem by repeatedly advancing idealized narrow beams called rays through the medium by discrete amounts. Simple problems can be analyzed by propagating a few rays using simple mathematics. More detailed analysis can be performed by using a computer to propagate many rays.
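A basic instance of advancing rays through regions of varying propagation velocity is a stack of homogeneous media with planar interfaces, where each step applies Snell's law, n₁ sin θ₁ = n₂ sin θ₂. A short sketch (illustrative indices):

```python
import math

def trace_through_layers(theta0_deg, indices):
    """Advance a ray through stacked homogeneous media by applying
    Snell's law n1*sin(t1) = n2*sin(t2) at each planar interface.
    Angles are measured from the interface normal."""
    theta = math.radians(theta0_deg)
    angles = [theta]
    for n1, n2 in zip(indices, indices[1:]):
        s = n1 * math.sin(theta) / n2
        if abs(s) > 1.0:
            raise ValueError("total internal reflection")
        theta = math.asin(s)
        angles.append(theta)
    return [math.degrees(a) for a in angles]

# Air -> glass -> water: the ray bends toward the normal entering the
# denser glass, then away from it entering the less dense water:
path = trace_through_layers(30.0, [1.0, 1.5, 1.33])
```

A medium with continuously varying index can be handled the same way by slicing it into many thin layers, which is the discrete advancement the paragraph above describes.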
Petzval field curvature, named for Joseph Petzval, describes the optical aberration in which a flat object normal to the optical axis cannot be brought properly into focus on a flat image plane.
The eye, like any other optical system, suffers from a number of specific optical aberrations. The optical quality of the eye is limited by optical aberrations, diffraction and scatter. Correction of spherocylindrical refractive errors has been possible for nearly two centuries following Airy's development of methods to measure and correct ocular astigmatism. It has only recently become possible to measure the aberrations of the eye and with the advent of refractive surgery it might be possible to correct certain types of irregular astigmatism.
The pupil function or aperture function describes how a light wave is affected upon transmission through an optical imaging system such as a camera, microscope, or the human eye. More specifically, it is a complex function of the position in the pupil or aperture that indicates the relative change in amplitude and phase of the light wave. Sometimes this function is referred to as the generalized pupil function, in which case pupil function only indicates whether light is transmitted or not. Imperfections in the optics typically have a direct effect on the pupil function; it is therefore an important tool for studying optical imaging systems and their performance.
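A generalized pupil function can be sketched as a complex-valued function that is zero outside the aperture and carries an amplitude and phase inside it. The quadratic phase term below is an illustrative assumption standing in for a defocus-like aberration, not a specific system's data:

```python
import cmath
import math

def pupil_function(x, y, radius, defocus=0.0):
    """Generalized pupil function: zero outside a circular aperture;
    unit amplitude with a quadratic (defocus-like) phase inside it.
    The defocus coefficient is purely illustrative."""
    r2 = x * x + y * y
    if r2 > radius * radius:
        return 0j           # light blocked outside the aperture
    return cmath.exp(1j * defocus * r2 / (radius * radius))

center = pupil_function(0.0, 0.0, 1.0, defocus=math.pi)  # unaberrated centre
edge = pupil_function(2.0, 0.0, 1.0, defocus=math.pi)    # outside the pupil
```

With `defocus=0` this reduces to the plain (binary) pupil function that only records whether light is transmitted; a nonzero phase term encodes the wavefront aberration across the pupil.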