Monochrome astrophotography techniques

Monochrome photography is one of the earliest styles of photography, dating back to the 1800s. [1] It is also a popular technique among astrophotographers, because monochrome cameras omit the Bayer filter: the colour filter array that sits in front of a CMOS or CCD sensor and allows a single sensor to produce a colour image.

Sensor design

Colour cameras produce colour images using a Bayer matrix, a colour filter array that sits in front of the sensor. Each element of the matrix passes one of the primary colours, red, green or blue, to the pixel beneath it. A typical arrangement devotes 25% of the sensor area to red, 25% to blue and 50% to green. The Bayer matrix is what allows a single-chip sensor to produce a colour image. [2]

The Bayer arrangement of colour filters on the pixel array of an image sensor

Many objects in deep space are composed largely of hydrogen, oxygen and sulphur. When ionised, these elements emit light at characteristic wavelengths: hydrogen-alpha in the red, doubly ionised oxygen in the blue-green, and singly ionised sulphur in the deep red part of the spectrum. [3]

When imaging an object rich in hydrogen, the object will emit primarily at the hydrogen-alpha (red) wavelength. In this scenario the Bayer matrix allows only about 25% of the incoming light from the nebula to reach the sensor, as only 25% of the matrix area passes red light. [2]
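
As a rough illustration (not from the source), the sketch below builds an RGGB Bayer mask with NumPy and confirms that only about a quarter of the photosites, the red-filtered ones, record a signal emitted purely at the hydrogen-alpha wavelength. The array sizes and the uniform flux are assumptions chosen for demonstration.

```python
# Illustrative sketch (not from the source): an RGGB Bayer mosaic and the
# fraction of a pure hydrogen-alpha (red) signal it records.
import numpy as np

def rggb_masks(height, width):
    """Boolean masks for the red, green and blue photosites of an RGGB mosaic."""
    rows, cols = np.indices((height, width))
    red = (rows % 2 == 0) & (cols % 2 == 0)
    blue = (rows % 2 == 1) & (cols % 2 == 1)
    green = ~red & ~blue
    return red, green, blue

red, green, blue = rggb_masks(1024, 1024)
print(red.mean(), green.mean(), blue.mean())  # -> 0.25, 0.5, 0.25

# A nebula emitting only at hydrogen-alpha illuminates every photosite,
# but only the red-filtered sites record the signal:
h_alpha = np.ones((1024, 1024))               # assumed uniform flux
print((h_alpha * red).sum() / h_alpha.sum())  # -> 0.25
```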

Profile/cross-section of sensor

A monochrome sensor does not have a Bayer matrix, so the entire sensor area can be used to capture specific wavelengths through specialised filters known as narrowband filters. [4] As noted above, emission nebulae rich in hydrogen, oxygen and sulphur radiate at discrete wavelengths, so a narrowband filter matched to each emission line produces a discrete monochrome image. The three images can then be combined to produce a colour image.
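
A minimal sketch of the combination step, assuming the three calibrated narrowband frames are already available as NumPy arrays (the random data below merely stands in for real exposures):

```python
# Minimal sketch: three monochrome narrowband frames stacked into the red,
# green and blue channels of a colour image. The random arrays stand in for
# real calibrated exposures and are purely illustrative.
import numpy as np

height, width = 480, 640
sii = np.random.rand(height, width)    # sulphur-II frame
ha = np.random.rand(height, width)     # hydrogen-alpha frame
oiii = np.random.rand(height, width)   # oxygen-III frame

# One monochrome frame per colour channel; the mapping itself is a choice
# (see the palette discussion under "Monochrome image processing").
colour = np.dstack([sii, ha, oiii])    # shape (480, 640, 3)
```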

Advantages

Monochrome astrophotography has gained popularity as a way of combating the effects of modern light pollution. For an emission-line object, the Bayer matrix of a traditional sensor limits the sensor area that collects the object's light to approximately 25%. The remaining 75% of the pixels still collect light, much of it surrounding light pollution, which can adversely affect the signal-to-noise ratio. [5]

With the Bayer matrix removed, a narrowband filter can be used to allow only specific wavelengths of light to reach the sensor. The entire sensor area is then used to collect the target's light while external light pollution is rejected, greatly improving the signal-to-noise ratio. [6]
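
The signal-to-noise benefit can be illustrated with a simplified shot-noise model. The electron rates, exposure time and read noise below are hypothetical numbers chosen for demonstration, not measurements from the source:

```python
# Simplified SNR sketch (illustrative numbers): a narrowband filter passes
# the emission-line signal but rejects most broadband light pollution, so
# the sky term under the square root shrinks.
import math

def snr(signal_rate, sky_rate, exposure_s, read_noise=5.0):
    """Shot-noise-limited SNR for one pixel, ignoring dark current."""
    signal = signal_rate * exposure_s
    sky = sky_rate * exposure_s
    return signal / math.sqrt(signal + sky + read_noise**2)

# Hypothetical electron rates per pixel per second under urban skies:
print(snr(signal_rate=0.5, sky_rate=20.0, exposure_s=300))  # broadband: ~1.9
print(snr(signal_rate=0.5, sky_rate=0.5, exposure_s=300))   # narrowband: ~8.3
```

With these assumed numbers the narrowband exposure reaches several times the SNR of the broadband one, purely because far less sky background contributes to the noise.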

Monochrome image processing

Colour images in typical cameras are made by combining data from red, green and blue pixels. [7] To produce a colour image from a monochrome sensor, three monochrome images must be captured and mapped to the red, green and blue channels. In astrophotography the channel assignment can vary to some degree, but a common choice is the Hubble palette, often known as "SHO": sulphur is mapped to the red channel, hydrogen-alpha to green, and oxygen to blue. [8]
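
A sketch of this channel mapping, assuming the three narrowband frames are held in a dictionary keyed by filter name (the structure and the random data are illustrative assumptions, not from the source):

```python
# Sketch of the "SHO" channel mapping described above.
import numpy as np

def apply_palette(frames, palette=("SII", "Ha", "OIII")):
    """Stack narrowband frames into the R, G, B channels in the given order."""
    r, g, b = (frames[name] for name in palette)
    return np.dstack([r, g, b])

frames = {
    "SII": np.random.rand(480, 640),
    "Ha": np.random.rand(480, 640),
    "OIII": np.random.rand(480, 640),
}

hubble = apply_palette(frames)                            # SHO: S->R, Ha->G, O->B
bicolour = apply_palette(frames, ("Ha", "OIII", "OIII"))  # a common "HOO" bi-colour palette
```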

Monochrome astrophotography also requires a greater number of calibration frames. Calibration frames are used to capture artefacts such as dust on the image sensor and filters, as well as light gradients caused by internal reflections in the optical train, so that they can be removed from the final image. Because monochrome imaging uses three individual filters to produce a colour image, three sets of calibration frames must be generated and applied during the image processing stage. This increases the number of images that need to be stored, requiring more storage space. [9]
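
A simplified per-filter calibration sketch follows, assuming one master dark plus one master flat per filter, with the flats already bias/dark corrected; the file names are hypothetical:

```python
# Simplified per-filter calibration: dark subtraction and flat-field division.
import numpy as np
from astropy.io import fits  # assumes the astropy package is available

def calibrate(light, dark, flat):
    """Dark-subtract a light frame and divide by the normalised flat field."""
    flat_norm = flat / np.median(flat)
    return (light - dark) / flat_norm

dark = fits.getdata("master_dark.fits").astype(np.float64)  # darks do not depend on the filter

calibrated = {}
for name in ("SII", "Ha", "OIII"):  # flats differ per filter (dust, reflections)
    light = fits.getdata(f"{name}_light.fits").astype(np.float64)
    flat = fits.getdata(f"master_flat_{name}.fits").astype(np.float64)
    calibrated[name] = calibrate(light, dark, flat)
```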

Monochrome astrophotography also requires additional equipment. Because multiple filters are needed, amateur astrophotographers often use an electronic filter wheel, which holds several filters and can be driven by a computer to change filters throughout the night. [10]
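
A hypothetical automation sketch is shown below; the FilterWheel and Camera objects stand in for whatever driver interface the capture software actually exposes (for example an INDI or ASCOM wrapper), so their method names are assumptions rather than a real API:

```python
# Hypothetical capture loop: cycle through the narrowband filters, taking a
# batch of exposures with each. FilterWheel/Camera are stand-in objects.
import time

FILTERS = ["SII", "Ha", "OIII"]
EXPOSURE_S = 300
FRAMES_PER_FILTER = 12

def run_session(wheel, camera):
    """Cycle through the filters, exposing a batch of frames with each."""
    for name in FILTERS:
        wheel.move_to(name)   # hypothetical method on the stand-in wheel object
        time.sleep(2)         # allow the wheel to settle before exposing
        for i in range(FRAMES_PER_FILTER):
            camera.expose(EXPOSURE_S, filename=f"{name}_{i:03d}.fits")  # hypothetical method
```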

References

  1. Hirsch, Robert (2000). Seizing the Light: A History of Photography. McGraw-Hill. ISBN 9780697143617.
  2. Wang, Peng; Menon, Rajesh (2015-11-20). "Ultra-high-sensitivity color imaging via a transparent diffractive-filter array and computational optics". Optica. 2 (11): 933–939. Bibcode:2015Optic...2..933W. doi:10.1364/OPTICA.2.000933. ISSN 2334-2536.
  3. "Hubble Captures a Perfect Storm of Turbulent Gases". HubbleSite.org. Retrieved 2022-03-09.
  4. Bull, David (2014). Digital Picture Formats and Representations. Elsevier Science. Section 4.5.3. ISBN 9780124059061.
  5. "Deep Sky Astrophotography in City Light Pollution | Results with DSLR Camera". AstroBackyard. 2018-10-12. Retrieved 2022-01-20.
  6. "Narrowband Imaging Primer | Beginners Guide to the Hubble Palette & More". AstroBackyard. Retrieved 2022-03-15.
  7. US 3971065, Bayer, Bryce E., "Color imaging array", published 1976-07-20, assigned to Eastman Kodak Co.
  8. "The Truth About Hubble, JWST, and False Color". NASA Blueshift. Retrieved 2022-01-20.
  9. "Demystifying Flat-Frame Calibration". Sky & Telescope. 2021-06-17. Retrieved 2022-03-15.
  10. "Do You Need a Filter Wheel for Astrophotography? (LRGB Imaging)". AstroBackyard. Retrieved 2022-03-15.