A lens flare occurs when light is scattered or reflected within a lens system, often in response to a bright light, producing a sometimes undesirable artifact in the image. This happens through light scattered by the imaging mechanism itself, for example through internal reflection and forward scatter from material imperfections in the lens. Lenses with large numbers of elements, such as zooms, tend to exhibit more lens flare, as they contain a relatively large number of interfaces at which internal scattering may occur. These mechanisms differ from the focused image-formation mechanism, which depends on the refraction of light coming from the subject itself.
There are two types of flare: visible artifacts and glare across the image. The glare makes the image look "washed out" by reducing contrast and color saturation (adding light to dark image regions, and adding white to saturated regions, reducing their saturation). Visible artifacts, usually in the shape of the aperture made by the iris diaphragm, are formed when light follows a pathway through the lens that contains one or more reflections from the lens surfaces.
Flare is particularly associated with very bright light sources. Most commonly it occurs when shooting toward the Sun (whether the Sun is in the frame or the lens is merely pointed in its direction), and it is reduced by using a lens hood or other shade. For good-quality optical systems, and for most images (which do not have a bright light shining into the lens), flare is a secondary effect that is widely distributed across the image and thus not visible, although it does reduce contrast.
The spatial distribution of the lens flare typically manifests as several starbursts, rings, or circles in a row across the image or view. Lens flare patterns typically spread widely across the scene and change location with the camera's movement relative to light sources, tracking with the light position and fading as the camera points away from the bright light until it causes no flare at all. The specific spatial distribution of the flare depends on the shape of the aperture of the image formation elements. For example, if the lens has a 6-bladed aperture, the flare may have a hexagonal pattern.
Such internal scattering is also present in the human eye, and manifests in an unwanted veiling glare most obvious when viewing very bright lights or highly reflective surfaces. In some situations, eyelashes can also create flare-like irregularities, although these are technically diffraction artifacts.
When a bright light source is shining on the lens but not in its field of view, lens flare appears as a haze that washes out the image and reduces contrast. This can be avoided by shading the lens with a lens hood. In a studio, a gobo or set of barn doors can be attached to the lighting to keep it from shining on the camera. Filters can also be attached to the camera lens to minimise lens flare, which is especially useful for outdoor photographers.
When using an anamorphic lens, as is common in analog cinematography, lens flare can manifest itself as horizontal lines. This is most commonly seen in car headlights in a dark scene, and may be desired as part of the "film look".
A lens flare is often deliberately used to evoke a sense of drama. A lens flare is also useful when added to an artificial or modified image composition, because it adds a sense of realism, implying that the image is an unedited original photograph of a "real life" scene.
For both of these reasons (implying realism and/or drama), artificial lens flare is a common effect in various graphics editing programs, although its use can be a point of contention among professional graphic designers.[1] Lens flare was one of the first special effects developed for computer graphics because it can be imitated using relatively simple means. Basic flare-like effects, for instance in video games, can be obtained by drawing starburst, ring, and disc textures over the image and moving them as the location of the light source changes.[2] More sophisticated rendering techniques have been developed based on ray tracing[3] or photon mapping.[4]
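A minimal sketch of that simple screen-space approach is given below in Python. The draw_sprite callback and the flare_textures list are placeholders rather than any real engine's API, and the spacing and fade factors are arbitrary illustrative values.

```python
# Hypothetical sketch of the simple screen-space flare technique described
# above; draw_sprite(texture, position, scale) stands in for whatever 2D
# drawing call the rendering engine actually provides.

def draw_lens_flare(light_pos, screen_center, flare_textures, draw_sprite):
    """Place starburst/ring/disc textures in a row along the line running
    from the on-screen light position through the screen center."""
    # Axis from the light source through the center of the screen.
    axis = (screen_center[0] - light_pos[0], screen_center[1] - light_pos[1])

    # Fade the whole effect as the light moves toward the edge of the frame,
    # so the flare disappears once the camera points away from the light.
    dist = (axis[0] ** 2 + axis[1] ** 2) ** 0.5
    brightness = max(0.0, 1.0 - dist / (2.0 * max(screen_center)))

    for i, (texture, scale) in enumerate(flare_textures):
        # Spread the elements evenly along the axis; they track the light
        # automatically because the axis is recomputed every frame.
        t = i / max(1, len(flare_textures) - 1)
        pos = (light_pos[0] + 2.0 * t * axis[0],
               light_pos[1] + 2.0 * t * axis[1])
        draw_sprite(texture, pos, scale * brightness)
```

Real implementations typically also test whether the light source is actually visible (for example with an occlusion or depth query) before drawing, so the flare vanishes when the light leaves the frame or is blocked.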
Lens flare was typically avoided by Hollywood cinematographers, but when filming Easy Rider (1969), László Kovács was forced to modify a camera car for his Arriflex, which resulted in numerous lens flares as he shot motorcycle footage against landscapes of the Southwestern United States.[5]
Director J. J. Abrams added numerous lens flares to his films Star Trek (2009) and Super 8 (2011) by aiming powerful off-camera light sources at the lens. He explained in an interview about Star Trek: "I wanted a visual system that felt unique. I know there are certain shots where even I watch and think, 'Oh that's ridiculous, that was too many.' But I love the idea that the future was so bright it couldn't be contained in the frame." Many complained of the frequent use; Abrams conceded it was "overdone, in some places."[6]
David Boyd, the director of photography of the sci-fi series Firefly, desired this style so much (harkening back to 1970s television) that he sent back cutting-edge lenses that reduced lens flare in exchange for cheaper ones.[7]
The use of photographic filters can cause flare, particularly ghost images of bright lights reflected through the center of the image (central inversion).[8] This can be eliminated by not using a filter, and reduced by using higher-quality filters or a narrower aperture.
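As a rough geometric illustration of that central inversion (an idealised description that ignores small offsets introduced by the filter itself), a light imaged at coordinates measured from the center of the frame produces its ghost near the diametrically opposite point:

```latex
% Idealised filter-ghost geometry under central inversion:
% a light imaged at (x, y), measured from the image center,
% produces a ghost near the point reflected through the center.
(x, y) \;\longmapsto\; (-x, -y)
```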
One form of flare is specific to digital cameras. With the Sun shining on an unprotected lens, a group of small rainbows appears. This artifact is formed by internal diffraction on the image sensor, which acts like a diffraction grating. Unlike true lens flare, this artifact is not visible in the eyepiece of a digital SLR camera, making it more difficult to avoid.
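The rainbow colouring can be illustrated with the standard diffraction-grating relation; treating the sensor's regular pixel pitch as the grating period is a simplification used here only for illustration:

```latex
% Grating equation for (assumed) pixel pitch d and wavelength \lambda:
d \sin\theta_m = m\lambda, \qquad m = 1, 2, \ldots
% Different wavelengths diffract to different angles \theta_m, so white
% light is spread into the small rainbow-like patterns described above.
```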
Optics is the branch of physics that studies the behaviour and properties of light, including its interactions with matter and the construction of instruments that use or detect it. Optics usually describes the behaviour of visible, ultraviolet, and infrared light. Light is a type of electromagnetic radiation, and other forms of electromagnetic radiation such as X-rays, microwaves, and radio waves exhibit similar properties.
A camera lens is an optical lens or assembly of lenses used in conjunction with a camera body and mechanism to make images of objects either on photographic film or on other media capable of storing an image chemically or electronically.
In optics, any optical instrument or system – a microscope, telescope, or camera – has a principal limit to its resolution due to the physics of diffraction. An optical instrument is said to be diffraction-limited if it has reached this limit of resolution performance. Other factors may affect an optical system's performance, such as lens imperfections or aberrations, but these are caused by errors in the design or manufacture of a lens, whereas the diffraction limit is the maximum resolution possible for a theoretically perfect, or ideal, optical system.
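For a circular aperture this limit is commonly stated via the Rayleigh criterion (quoted here in its usual small-angle form):

```latex
% Rayleigh criterion: smallest resolvable angular separation \theta for
% light of wavelength \lambda and an aperture of diameter D.
\theta \approx 1.22\,\frac{\lambda}{D}
% Even an aberration-free system cannot resolve detail finer than this.
```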
In photography and cinematography, a filter is a camera accessory consisting of an optical filter that can be inserted into the optical path. The filter can be of a square or oblong shape and mounted in a holder accessory, or, more commonly, a glass or plastic disk in a metal or plastic ring frame, which can be screwed into the front of or clipped onto the camera lens.
A coronagraph is a telescopic attachment designed to block out the direct light from a star or other bright object so that nearby objects – which otherwise would be hidden in the object's bright glare – can be resolved. Most coronagraphs are intended to view the corona of the Sun, but a new class of conceptually similar instruments is being used to find extrasolar planets and circumstellar disks around nearby stars, as well as host galaxies in quasars and other similar objects with active galactic nuclei (AGN).
In infrared photography, the photographic film or image sensor used is sensitive to infrared light. The part of the spectrum used is referred to as near-infrared to distinguish it from far-infrared, which is the domain of thermal imaging. Wavelengths used for photography range from about 700 nm to about 900 nm. Film is usually sensitive to visible light too, so an infrared-passing filter is used; this lets infrared (IR) light pass through to the camera, but blocks all or most of the visible light spectrum.
Day for night is a set of cinematic techniques used to simulate a night scene while filming in daylight. It is often employed when it is too difficult or expensive to actually shoot during nighttime. Because both film stocks and digital image sensors lack the sensitivity of the human eye in low light conditions, night scenes recorded in natural light, with or without moonlight, may be underexposed to the point where little or nothing is visible. This problem can be avoided by using daylight to substitute for darkness. When shooting day for night, the scene is typically underexposed in-camera or darkened during post-production, with a blue tint added. Additional effects are often used to heighten the impression of night.
In optics, a diaphragm is a thin opaque structure with an opening (aperture) at its center. The role of the diaphragm is to stop the passage of light, except for the light passing through the aperture. Thus it is also called a stop. The diaphragm is placed in the light path of a lens or objective, and the size of the aperture regulates the amount of light that passes through the lens. The center of the diaphragm's aperture coincides with the optical axis of the lens system.
In photography and optics, vignetting is a reduction of an image's brightness or saturation toward the periphery compared to the image center. The word vignette, from the same root as vine, originally referred to a decorative border in a book. Later, the word came to be used for a photographic portrait that is clear at the center and fades off toward the edges. A similar effect is visible in photographs of projected images or videos off a projection screen, resulting in a so-called "hotspot" effect.
The science of photography is the use of chemistry and physics in all aspects of photography. This applies to the camera, its lenses, physical operation of the camera, electronic camera internals, and the process of developing film in order to take and develop pictures properly.
In photography, a lens hood or lens shade is a device used on the front end of a lens to block the Sun or other light source(s) to prevent glare and lens flare. Lens hoods may also be used to protect the lens from scratches and the elements without having to put on a lens cover.
The geometry of a lens hood is dependent on three parameters: the focal length of the lens, the size of the front lens element and the dimensions of the image sensor or film in the camera.
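A simplified way to see how those parameters interact, using a thin-lens approximation and ignoring the physical depth of the lens barrel (the symbols below are introduced only for this sketch):

```latex
% Half-angle of view \alpha for focal length f and sensor dimension s:
\alpha = \arctan\!\left(\frac{s}{2f}\right)
% A hood must stay outside the cone of half-angle \alpha in front of the
% lens, or it will vignette the image corners; longer focal lengths
% (smaller \alpha) therefore allow longer, more effective hoods.
```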
In photography and optics, a neutral-density filter, or ND filter, is a filter that reduces or modifies the intensity of all wavelengths, or colors, of light equally, giving no change in hue or color rendition. It can be a colorless (clear) or grey filter, and is denoted by Wratten number 96. The purpose of a standard photographic neutral-density filter is to reduce the amount of light entering the lens. Doing so allows the photographer to select combinations of aperture, exposure time and sensor sensitivity that would otherwise produce overexposed pictures. This is done to achieve effects such as a shallower depth of field or motion blur of a subject in a wider range of situations and atmospheric conditions.
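The strength of an ND filter is commonly specified as an optical density; as a quick worked example of how density relates to transmitted light and exposure stops:

```latex
% Transmittance T of an ND filter with optical density d:
T = 10^{-d}
% Equivalent exposure reduction in stops (factors of two):
\text{stops} = \log_2\frac{1}{T} = \frac{d}{\log_{10}2} \approx 3.32\,d
% e.g. an ND 0.3 filter transmits about 50% of the light (one stop),
% and an ND 3.0 filter transmits about 0.1% (ten stops).
```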
In photography, purple fringing is the term for an unfocused purple or magenta "ghost" image on a photograph. This optical aberration is generally most visible as a coloring and lightening of dark edges adjacent to bright areas of broad-spectrum illumination, such as daylight or various types of gas-discharge lamps.
In signal processing, apodization is the modification of the shape of a mathematical function. The function may represent an electrical signal, an optical transmission, or a mechanical structure. In optics, it is primarily used to suppress the diffraction rings of the Airy pattern that form around an intensity peak, improving contrast near the peak.
A polarizing filter or polarising filter is a filter that is often placed in front of a camera lens in photography in order to darken skies, manage reflections, or suppress glare from the surface of lakes or the sea. Since reflections tend to be at least partially linearly-polarized, a linear polarizer can be used to change the balance of the light in the photograph. The rotational orientation of the filter is adjusted for the preferred artistic effect.
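The effect of the filter's rotational orientation on linearly polarized light, such as many reflections, is described by Malus's law for an ideal linear polarizer:

```latex
% Malus's law: transmitted intensity I for incident linearly polarized
% light of intensity I_0 at angle \theta to the polarizer's axis.
I = I_0 \cos^2\theta
% Rotating the filter so that \theta approaches 90 degrees for the
% polarized component of a reflection suppresses that reflection.
```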
Bloom is a computer graphics effect used in video games, demos, and high-dynamic-range rendering (HDRR) to reproduce an imaging artifact of real-world cameras. The effect produces fringes of light extending from the borders of bright areas in an image, contributing to the illusion of an extremely bright light overwhelming the camera or eye capturing the scene. It became widely used in video games after an article on the technique was published by the authors of Tron 2.0 in 2004.
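A naive single-pass version of the technique can be sketched as follows (Python with NumPy; the threshold, blur radius, and strength are arbitrary illustrative values, and real-time renderers typically use multi-pass GPU blurs instead):

```python
import numpy as np

def bloom(image, threshold=0.8, radius=8, strength=0.6):
    """Naive bloom sketch: isolate bright pixels, blur them, and add the
    blurred glow back onto the image. `image` is a float array in [0, 1]
    with shape (H, W) or (H, W, 3)."""
    # Keep only the parts of the image brighter than the threshold.
    bright = np.clip(image - threshold, 0.0, None)

    # Cheap separable box blur (a stand-in for the Gaussian or multi-pass
    # blurs used in practice).
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    blurred = np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="same"), 1, bright)
    blurred = np.apply_along_axis(
        lambda col: np.convolve(col, kernel, mode="same"), 0, blurred)

    # Composite the glow over the original, producing fringes of light
    # that extend past the borders of the bright areas.
    return np.clip(image + strength * blurred, 0.0, 1.0)
```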
Diffraction spikes are lines radiating from bright light sources, causing what is known as the starburst effect or sunstars in photographs and in vision. They are artifacts caused by light diffracting around the support vanes of the secondary mirror in reflecting telescopes, or edges of non-circular camera apertures, and around eyelashes and eyelids in the eye.
Dark-field microscopy describes microscopy methods, in both light and electron microscopy, which exclude the unscattered beam from the image. Consequently, the field around the specimen is generally dark.
Köhler illumination is a method of specimen illumination used for transmitted and reflected light optical microscopy. Köhler illumination acts to generate an even illumination of the sample and ensures that an image of the illumination source is not visible in the resulting image. Köhler illumination is the predominant technique for sample illumination in modern scientific light microscopy. It requires additional optical elements which are more expensive and may not be present in more basic light microscopes.
Image quality can refer to the level of accuracy with which different imaging systems capture, process, store, compress, transmit and display the signals that form an image. Another definition refers to image quality as "the weighted combination of all of the visually significant attributes of an image". The difference between the two definitions is that the former focuses on the characteristics of signal processing in different imaging systems, while the latter concerns the perceptual assessments that make an image pleasant for human viewers.