In physics, ray tracing is a method for calculating the path of waves or particles through a system with regions of varying propagation velocity, absorption characteristics, and reflecting surfaces. Under these circumstances, wavefronts may bend, change direction, or reflect off surfaces, complicating analysis. Strictly speaking, ray tracing refers to solving for ray trajectories analytically; it is often conflated with ray marching, which solves such problems numerically by repeatedly advancing idealized narrow beams called rays through the medium in discrete steps. Simple problems can be analyzed by propagating a few rays using simple mathematics. More detailed analysis can be performed by using a computer to propagate many rays.
When applied to problems of electromagnetic radiation, ray tracing often relies on approximate solutions to Maxwell's equations that are valid as long as the light waves propagate through and around objects whose dimensions are much greater than the light's wavelength. Ray theory does not describe phenomena such as interference and diffraction, which require wave theory (involving the phase of the wave).
Ray tracing works by assuming that the particle or wave can be modeled as a large number of very narrow beams (rays), and that there exists some distance, possibly very small, over which such a ray is locally straight. The ray tracer will advance the ray over this distance, and then use a local derivative of the medium to calculate the ray's new direction. From this location, a new ray is sent out and the process is repeated until a complete path is generated. If the simulation includes solid objects, the ray may be tested for intersection with them at each step, making adjustments to the ray's direction if a collision is found. Other properties of the ray may be altered as the simulation advances as well, such as intensity, wavelength, or polarization. This process is repeated with as many rays as are necessary to understand the behavior of the system.
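The marching loop described above can be sketched as follows. This is a minimal illustration, not any particular ray tracer: the Euler integration, the step size, and the caller-supplied functions `n` and `grad_n` (refractive index and its gradient) are all assumptions.

```python
import numpy as np

def march_ray(pos, direction, grad_n, n, step=0.01, n_steps=1000):
    """Advance a ray through a medium with a varying refractive index.

    Integrates the ray equation d/ds(n dr/ds) = grad(n) with simple Euler
    steps; `n` and `grad_n` are caller-supplied functions returning the
    refractive index and its gradient at a point (illustrative interface).
    """
    path = [np.asarray(pos, dtype=float)]
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    for _ in range(n_steps):
        p = path[-1]
        d = d + step * grad_n(p) / n(p)   # bend toward higher index
        d = d / np.linalg.norm(d)         # keep the direction a unit vector
        path.append(p + step * d)
    return np.array(path)
```

In a homogeneous medium the gradient is zero and the ray stays straight, which makes a convenient sanity check for the stepper.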
Ray tracing is increasingly used in astronomy to simulate realistic images of the sky. Unlike conventional simulations, ray tracing does not use the expected or calculated point spread function (PSF) of a telescope; instead it traces the journey of each photon from its entry into the upper atmosphere to its collision with the detector. [1] Most of the dispersion and distortion, arising mainly from the atmosphere, optics, and detector, are taken into account. While this method of simulating images is inherently slow, advances in CPU and GPU capabilities have somewhat mitigated this problem. It can also be used in designing telescopes. Notable examples include the Large Synoptic Survey Telescope, where this kind of ray tracing was first used with PhoSim [2] to create simulated images. [3]
One particular form of ray tracing is radio signal ray tracing, which traces radio signals, modeled as rays, through the ionosphere where they are refracted and/or reflected back to the Earth. This form of ray tracing involves the integration of differential equations that describe the propagation of electromagnetic waves through dispersive and anisotropic media such as the ionosphere. An example of physics-based radio signal ray tracing is shown to the right. Radio communicators use ray tracing to help determine the precise behavior of radio signals as they propagate through the ionosphere.
The image at the right illustrates the complexity of the situation. Unlike optical ray tracing where the medium between objects typically has a constant refractive index, signal ray tracing must deal with the complexities of a spatially varying refractive index, where changes in ionospheric electron densities influence the refractive index and hence, ray trajectories. Two sets of signals are broadcast at two different elevation angles. When the main signal penetrates into the ionosphere, the magnetic field splits the signal into two component waves which are separately ray traced through the ionosphere. The ordinary wave (red) component follows a path completely independent of the extraordinary wave (green) component.
Sound velocity in the ocean varies with depth due to changes in density and temperature, reaching a local minimum near a depth of 800–1000 meters. This local minimum, called the SOFAR channel, acts as a waveguide, as sound tends to bend towards it. Ray tracing may be used to calculate the path of sound through the ocean up to very large distances, incorporating the effects of the SOFAR channel, as well as reflections and refractions off the ocean surface and bottom. From this, locations of high and low signal intensity may be computed, which are useful in the fields of ocean acoustics, underwater acoustic communication, and acoustic thermometry.
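As an illustrative sketch (not a production ocean-acoustics model), a ray can be stepped through a horizontally stratified sound-speed profile using the fact that Snell's law conserves cos(θ)/c(z) along the ray. The profile function `c` and the crude turning-point handling below are assumptions for the sketch.

```python
import numpy as np

def trace_ocean_ray(z0, theta0, c, ds=10.0, n_steps=2000):
    """Step a sound ray through a depth-dependent sound-speed profile c(z).

    Snell's law for a horizontally stratified ocean conserves
    a = cos(theta)/c(z) along the ray (theta measured from the horizontal),
    so a sound-speed minimum such as the SOFAR channel bends rays back
    toward the channel axis.
    """
    a = np.cos(theta0) / c(z0)
    x, z, theta = 0.0, z0, theta0
    xs, zs = [x], [z]
    for _ in range(n_steps):
        x += ds * np.cos(theta)
        z += ds * np.sin(theta)
        cos_t = a * c(z)
        if abs(cos_t) >= 1.0:
            theta = -theta                # turning point: reverse in depth
        else:
            theta = np.copysign(np.arccos(cos_t), theta)
        xs.append(x)
        zs.append(z)
    return np.array(xs), np.array(zs)
```

With a constant profile the ray is a straight line; with a profile that has a minimum near 1000 m, rays launched near the axis oscillate about it, which is the waveguide behavior described above.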
Ray tracing may be used in the design of lenses and optical systems, such as in cameras, microscopes, telescopes, and binoculars, and its application in this field dates back to the 1900s. Geometric ray tracing is used to describe the propagation of light rays through a lens system or optical instrument, allowing the image-forming properties of the system to be modeled. The following effects can be integrated into a ray tracer in a straightforward fashion:
For the application of lens design, two special cases of wave interference are important to account for. In a focal point, rays from a point light source meet again and may constructively or destructively interfere with each other. Within a very small region near this point, incoming light may be approximated by plane waves which inherit their direction from the rays. The optical path length from the light source is used to compute the phase. The derivative of the ray's position in the focal region with respect to the source position is used to obtain the width of the ray, and from that the amplitude of the plane wave. The result is the point spread function, whose Fourier transform is the optical transfer function. From this, the Strehl ratio can also be calculated.
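The PSF-to-Strehl chain described above can be sketched numerically: the PSF is the squared magnitude of the Fourier transform of the complex pupil function, and comparing its peak with the unaberrated peak gives an approximate Strehl ratio. Using the peak rather than the strict on-axis intensity is a simplification, and the array conventions are illustrative.

```python
import numpy as np

def psf_and_strehl(pupil_phase, aperture):
    """Point spread function and approximate Strehl ratio from pupil data.

    The PSF is the squared magnitude of the Fourier transform of the
    complex pupil function; the Strehl ratio compares the aberrated peak
    intensity with the diffraction-limited (zero-phase) peak intensity.
    """
    pupil = aperture * np.exp(1j * pupil_phase)
    psf = np.abs(np.fft.fftshift(np.fft.fft2(pupil))) ** 2
    ideal = np.abs(np.fft.fftshift(np.fft.fft2(aperture))) ** 2
    return psf, psf.max() / ideal.max()
```

A zero-phase pupil reproduces the diffraction-limited PSF, so its Strehl ratio is exactly 1; any phase aberration lowers it.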
The other special case to consider is the interference of wavefronts, which are approximated as planes. However, when the rays come close together or even cross, the wavefront approximation breaks down. Interference of spherical waves is usually not combined with ray tracing, so diffraction at an aperture cannot be calculated. These limitations can be addressed by an advanced modeling technique called field tracing, which combines geometric optics with physical optics and makes it possible to account for interference and diffraction in optical design.
Ray tracing techniques are used to optimize the design of instruments by minimizing aberrations; they are applied in photography, in longer-wavelength applications such as the design of microwave and radio systems, and at shorter wavelengths, such as ultraviolet and X-ray optics.
Before the advent of the computer, ray tracing calculations were performed by hand using trigonometry and logarithmic tables. The optical formulas of many classic photographic lenses were optimized by roomfuls of people, each of whom handled a small part of the large calculation. Now they are worked out in optical design software. A simple version of ray tracing known as ray transfer matrix analysis is often used in the design of optical resonators used in lasers. The basic principles of the most frequently used algorithm can be found in Spencer and Murty's fundamental paper, "General Ray-Tracing Procedure". [4]
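A minimal sketch of ray transfer matrix analysis, with illustrative helper names: a paraxial ray is represented by its height and angle, and each optical element acts on it as a 2×2 ABCD matrix. A ray entering parallel to the axis should cross the axis one focal length behind a thin lens.

```python
import numpy as np

def free_space(d):
    """ABCD matrix for propagation over a distance d."""
    return np.array([[1.0, d], [0.0, 1.0]])

def thin_lens(f):
    """ABCD matrix for a thin lens of focal length f."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

# A ray at height 1, parallel to the axis, through a lens of focal
# length 100 and then 100 units of free space: it meets the axis.
ray = free_space(100.0) @ thin_lens(100.0) @ np.array([1.0, 0.0])
```

Matrices compose right to left in the order the ray meets the elements, which is what makes the method convenient for cascaded systems such as laser resonators.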
There is a ray tracing technique called focal-plane ray tracing, in which the direction of an optical ray after a lens is determined from the lens's focal plane and the point at which the ray crosses that plane. [5] This method uses the fact that rays from a point on the front focal plane of a positive lens emerge parallel after the lens, and that rays directed toward a point on the back (rear) focal plane of a negative lens likewise emerge parallel after the lens. In each case, the direction of the parallel rays after the lens is determined by a ray appearing to cross the lens nodal points (or the lens center, for a thin lens).
In seismology, geophysicists use ray tracing to aid in earthquake location and tomographic reconstruction of the Earth's interior. [6] [7] Seismic wave velocity varies within and beneath Earth's crust, causing these waves to bend and reflect. Ray tracing may be used to compute paths through a geophysical model, following them back to their source, such as an earthquake, or deducing the properties of the intervening material. [8] In particular, the discovery of the seismic shadow zone (illustrated at right) allowed scientists to deduce the presence of Earth's molten core.
In general relativity, where gravitational lensing can occur, the geodesics of the light rays arriving at the observer are integrated backwards in time until they reach the region of interest. Image synthesis under this technique can be viewed as an extension of the usual ray tracing in computer graphics. [9] [10] An example of such synthesis is found in the 2014 film Interstellar. [11]
In laser-plasma physics, ray tracing can be used to simplify the calculation of laser propagation inside a plasma. Analytic solutions for ray trajectories in simple plasma density profiles are well established; [12] however, researchers in laser-plasma physics often rely on ray-marching techniques because of the complexity of the plasma density, temperature, and flow profiles, which are often obtained from computational fluid dynamics simulations.
In optics, aberration is a property of optical systems, such as lenses, that causes light to be spread out over some region of space rather than focused to a point. Aberrations cause the image formed by a lens to be blurred or distorted, with the nature of the distortion depending on the type of aberration. Aberration can be defined as a departure of the performance of an optical system from the predictions of paraxial optics. In an imaging system, it occurs when light from one point of an object does not converge into a single point after transmission through the system. Aberrations occur because the simple paraxial theory is not a completely accurate model of the effect of an optical system on light, not because of flaws in the optical elements.
Diffraction is the interference or bending of waves around the corners of an obstacle or through an aperture into the region of geometrical shadow of the obstacle/aperture. The diffracting object or aperture effectively becomes a secondary source of the propagating wave. Italian scientist Francesco Maria Grimaldi coined the word diffraction and was the first to record accurate observations of the phenomenon in 1660.
Optics is the branch of physics that studies the behaviour and properties of light, including its interactions with matter and the construction of instruments that use or detect it. Optics usually describes the behaviour of visible, ultraviolet, and infrared light. Light is a type of electromagnetic radiation, and other forms of electromagnetic radiation such as X-rays, microwaves, and radio waves exhibit similar properties.
In optics, the refractive index of an optical medium is a dimensionless number that indicates the light-bending ability of that medium.
In physics, refraction is the redirection of a wave as it passes from one medium to another. The redirection can be caused by the wave's change in speed or by a change in the medium. Refraction of light is the most commonly observed phenomenon, but other waves such as sound waves and water waves also experience refraction. How much a wave is refracted is determined by the change in wave speed and the initial direction of wave propagation relative to the direction of change in speed.
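This dependence on the change in wave speed is captured by Snell's law, sketched below in its speed form (the helper name is illustrative):

```python
import numpy as np

def refract_angle(theta_i, v1, v2):
    """Angle of the transmitted wave when crossing between two media.

    Snell's law in speed form: sin(theta_t)/v2 = sin(theta_i)/v1, with
    angles in radians measured from the surface normal. Returns None when
    the wave is totally internally reflected.
    """
    s = np.sin(theta_i) * v2 / v1
    if abs(s) > 1.0:
        return None      # total internal reflection
    return float(np.arcsin(s))
```

A wave entering a slower medium bends toward the normal; entering a faster medium it bends away, and beyond the critical angle no transmitted wave exists.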
In optics, a diffraction grating is an optical grating with a periodic structure that diffracts light, or another type of electromagnetic radiation, into several beams traveling in different directions. The emerging coloration is a form of structural coloration. The directions or diffraction angles of these beams depend on the wave (light) incident angle to the diffraction grating, the spacing or periodic distance between adjacent diffracting elements on the grating, and the wavelength of the incident light. The grating acts as a dispersive element. Because of this, diffraction gratings are commonly used in monochromators and spectrometers, but other applications are also possible such as optical encoders for high-precision motion control and wavefront measurement.
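The dependence of the diffraction angles on incidence angle, grating period, and wavelength is given by the grating equation, evaluated directly in the sketch below (one common sign convention; the helper name is illustrative):

```python
import numpy as np

def grating_orders(d, wavelength, theta_i=0.0):
    """Propagating diffraction orders of a grating with period d.

    Grating equation (one common sign convention):
    sin(theta_m) = sin(theta_i) + m * wavelength / d.
    Only orders with |sin(theta_m)| <= 1 propagate; d and wavelength
    must share the same length unit.
    """
    orders = {}
    m_max = int(2 * d / wavelength) + 1
    for m in range(-m_max, m_max + 1):
        s = np.sin(theta_i) + m * wavelength / d
        if abs(s) <= 1.0:
            orders[m] = float(np.arcsin(s))
    return orders
```

Because the angles depend on wavelength, each propagating order disperses white light into a spectrum, which is what monochromators and spectrometers exploit.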
Interferometry is a technique which uses the interference of superimposed waves to extract information. Interferometry typically uses electromagnetic waves and is an important investigative technique in the fields of astronomy, fiber optics, engineering metrology, optical metrology, oceanography, seismology, spectroscopy, quantum mechanics, nuclear and particle physics, plasma physics, biomolecular interactions, surface profiling, microfluidics, mechanical stress/strain measurement, velocimetry, optometry, and making holograms.
Adaptive optics (AO) is a technique of precisely deforming a mirror in order to compensate for light distortion. It is used in astronomical telescopes and laser communication systems to remove the effects of atmospheric distortion, in microscopy, optical fabrication and in retinal imaging systems to reduce optical aberrations. Adaptive optics works by measuring the distortions in a wavefront and compensating for them with a device that corrects those errors such as a deformable mirror or a liquid crystal array.
Angular resolution describes the ability of any image-forming device such as an optical or radio telescope, a microscope, a camera, or an eye, to distinguish small details of an object, thereby making it a major determinant of image resolution. It is used in optics applied to light waves, in antenna theory applied to radio waves, and in acoustics applied to sound waves. The colloquial use of the term "resolution" sometimes causes confusion; when an optical system is said to have a high resolution or high angular resolution, it means that the perceived distance, or actual angular distance, between resolved neighboring objects is small. The value that quantifies this property, θ, which is given by the Rayleigh criterion, is low for a system with a high resolution. The closely related term spatial resolution refers to the precision of a measurement with respect to space, which is directly connected to angular resolution in imaging instruments. The Rayleigh criterion shows that the minimum angular spread that can be resolved by an image-forming system is limited by diffraction to the ratio of the wavelength of the waves to the aperture width. For this reason, high-resolution imaging systems such as astronomical telescopes, long distance telephoto camera lenses and radio telescopes have large apertures.
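The Rayleigh criterion for a circular aperture can be evaluated directly; the 1.22 factor comes from the first zero of the Airy diffraction pattern, and the helper name is illustrative:

```python
def rayleigh_limit(wavelength, aperture):
    """Minimum resolvable angle (radians) for a circular aperture.

    Rayleigh criterion: theta = 1.22 * wavelength / D, with wavelength
    and aperture diameter D in the same length unit.
    """
    return 1.22 * wavelength / aperture
```

The formula makes the design trade-off explicit: at a fixed wavelength, only a larger aperture lowers the diffraction limit, which is why high-resolution telescopes and telephoto lenses have large apertures.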
Reflection is the change in direction of a wavefront at an interface between two different media so that the wavefront returns into the medium from which it originated. Common examples include the reflection of light, sound and water waves. The law of reflection says that for specular reflection the angle at which the wave is incident on the surface equals the angle at which it is reflected.
In optics, the Fraunhofer diffraction equation is used to model the diffraction of waves when plane waves are incident on a diffracting object and the diffraction pattern is viewed at a sufficiently long distance from the object, or at the focal plane of an imaging lens. In contrast, the diffraction pattern created near the diffracting object is given by the Fresnel diffraction equation.
In physics, the wavefront of a time-varying wave field is the set (locus) of all points having the same phase. The term is generally meaningful only for fields that, at each point, vary sinusoidally in time with a single temporal frequency.
Electron optics is a mathematical framework for the calculation of electron trajectories in the presence of electromagnetic fields. The term optics is used because magnetic and electrostatic lenses act upon a charged particle beam similarly to optical lenses upon a light beam.
In optics, a ray is an idealized geometrical model of light or other electromagnetic radiation, obtained by choosing a curve that is perpendicular to the wavefronts of the actual light, and that points in the direction of energy flow. Rays are used to model the propagation of light through an optical system, by dividing the real light field up into discrete rays that can be computationally propagated through the system by the techniques of ray tracing. This allows even very complex optical systems to be analyzed mathematically or simulated by computer. Ray tracing uses approximate solutions to Maxwell's equations that are valid as long as the light waves propagate through and around objects whose dimensions are much greater than the light's wavelength. Ray optics or geometrical optics does not describe phenomena such as diffraction, which require wave optics theory. Some wave phenomena such as interference can be modeled in limited circumstances by adding phase to the ray model.
X-ray optics is the branch of optics that manipulates X-rays instead of visible light. It deals with focusing and other ways of manipulating the X-ray beams for research techniques such as X-ray diffraction, X-ray crystallography, X-ray fluorescence, small-angle X-ray scattering, X-ray microscopy, X-ray phase-contrast imaging, and X-ray astronomy.
A Shack–Hartmann wavefront sensor (SHWFS) is an optical instrument used for characterizing an imaging system. It is a wavefront sensor commonly used in adaptive optics systems. It consists of an array of lenses of the same focal length, each focused onto a photon sensor. If the sensor is placed at the geometric focal plane of the lenslet and is uniformly illuminated, then the integrated gradient of the wavefront across the lenslet is proportional to the displacement of the centroid. Consequently, any phase aberration can be approximated by a set of discrete tilts. By sampling the wavefront with an array of lenslets, all of these local tilts can be measured and the whole wavefront reconstructed. Since only tilts are measured, the Shack–Hartmann cannot detect discontinuous steps in the wavefront.
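The tilt-measurement step can be sketched in the small-angle approximation, where each lenslet's average wavefront slope is roughly the spot-centroid displacement divided by the lenslet focal length (helper name and array conventions are illustrative):

```python
import numpy as np

def local_tilts(centroids, ref_centroids, focal_length):
    """Per-lenslet wavefront tilts from Shack-Hartmann spot displacements.

    In the small-angle approximation, each lenslet's average wavefront
    slope equals the spot-centroid displacement divided by the lenslet
    focal length; a reconstructor would then stitch these local tilts
    into a full wavefront.
    """
    shift = np.asarray(centroids) - np.asarray(ref_centroids)
    return shift / focal_length
```

A flat wavefront leaves every spot at its reference position and yields zero tilt everywhere, which is the usual calibration state of the sensor.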
In optics, vergence is the angle formed by rays of light that are not perfectly parallel to one another. Rays that move closer to the optical axis as they propagate are said to be converging, while rays that move away from the axis are diverging. These imaginary rays are always perpendicular to the wavefront of the light, thus the vergence of the light is directly related to the radii of curvature of the wavefronts. A convex lens or concave mirror will cause parallel rays to focus, converging toward a point. Beyond that focal point, the rays diverge. Conversely, a concave lens or convex mirror will cause parallel rays to diverge.
A METATOY is a sheet, formed by a two-dimensional array of small, telescopic optical components, that switches the path of transmitted light rays. METATOY is an acronym for "metamaterial for rays", representing a number of analogies with metamaterials; METATOYs even satisfy a few definitions of metamaterials, but are certainly not metamaterials in the usual sense. When seen from a distance, the view through each individual telescopic optical component acts as one pixel of the view through the METATOY as a whole. In the simplest case, the individual optical components are all identical; the METATOY then behaves like a homogeneous, but pixellated, window that can have very unusual optical properties.
A flat lens is a lens whose flat shape allows it to provide distortion-free imaging, potentially with arbitrarily-large apertures. The term is also used to refer to other lenses that provide a negative index of refraction. Flat lenses require a refractive index close to −1 over a broad angular range. In recent years, flat lenses based on metasurfaces were also demonstrated.