Imaging

Comparison of two imaging modalities, optical tomography (A, C) and computed tomography (B, D), as applied to a Lego minifigure: Kitchen-Based Light Tomography (KBLT) versus X-ray microtomography (X-ray µCT)

Imaging is the representation or reproduction of an object's form, especially as a visual representation (i.e., the formation of an image).

Imaging technology is the application of materials and methods to create, preserve, or duplicate images.

Imaging science is a multidisciplinary field concerned with the generation, collection, duplication, analysis, modification, and visualization of images,[1] including imaging things that the human eye cannot detect. As an evolving field, it includes research and researchers from physics, mathematics, electrical engineering, computer vision, computer science, and perceptual psychology.

Imagers are imaging sensors.

Imaging chain

The foundation of imaging science as a discipline is the "imaging chain" – a conceptual model describing all of the factors which must be considered when developing a system for creating visual renderings (images). In general, the links of the imaging chain include:

  1. The human visual system. Designers must also consider the psychophysical processes which take place in human beings as they make sense of information received through the visual system.
  2. The subject of the image. When developing an imaging system, designers must consider the observables associated with the subjects which will be imaged. These observables generally take the form of emitted or reflected energy, such as electromagnetic energy or mechanical energy.
  3. The capture device. Once the observables associated with the subject are characterized, designers can then identify and integrate the technologies needed to capture those observables. For example, in the case of consumer digital cameras, those technologies include optics for collecting energy in the visible portion of the electromagnetic spectrum, and electronic detectors for converting the electromagnetic energy into an electronic signal.
  4. The processor. For all digital imaging systems, the electronic signals produced by the capture device must be manipulated by an algorithm which formats the signals so they can be displayed as an image. In practice, there are often multiple processors involved in the creation of a digital image.
  5. The display. The display takes the electronic signals which have been manipulated by the processor and renders them on some visual medium. Examples include paper (for printed, or "hard copy" images), television, computer monitor, or projector.

Note that some imaging scientists will include additional "links" in their description of the imaging chain. For example, some will include the "source" of the energy which "illuminates" or interacts with the subject of the image. Others will include storage and/or transmission systems.
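A hypothetical end-to-end sketch of this chain in Python (the function names, gain, and gamma value are illustrative, not drawn from any particular system):

```python
# Minimal sketch of an imaging chain: capture -> process -> display.
# A detector converts incident energy into raw signal values, the
# processor normalizes and gamma-encodes them, and the display step
# quantizes them to 8-bit code values for a visual medium.

def capture(scene_radiance, gain=100.0):
    """Detector: convert radiance to raw electronic signal values."""
    return [gain * r for r in scene_radiance]

def process(raw, gamma=1 / 2.2):
    """Processor: normalize to [0, 1] and apply display gamma."""
    peak = max(raw) or 1.0  # avoid dividing by zero for an all-dark frame
    return [(v / peak) ** gamma for v in raw]

def display(processed):
    """Display: quantize to 8-bit code values."""
    return [round(255 * v) for v in processed]

image = display(process(capture([0.0, 0.2, 0.5, 1.0])))
```

Each function stands in for one link of the chain; a real system would add the "source" and storage/transmission links mentioned above.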

Subfields

Subfields within imaging science include: image processing, computer vision, 3D computer graphics, animations, atmospheric optics, astronomical imaging, biological imaging, digital image restoration, digital imaging, color science, digital photography, holography, magnetic resonance imaging, medical imaging, microdensitometry, optics, photography, remote sensing, radar imaging, radiometry, silver halide, ultrasound imaging, photoacoustic imaging, thermal imaging, visual perception, and various printing technologies.

Methodologies

Examples

False-color image from a thermographic camera

Imaging technology materials and methods include:

Related Research Articles

Computer vision tasks include methods for acquiring, processing, analyzing and understanding digital images, and extraction of high-dimensional data from the real world in order to produce numerical or symbolic information, e.g. in the forms of decisions. Understanding in this context means the transformation of visual images into descriptions of the world that make sense to thought processes and can elicit appropriate action. This image understanding can be seen as the disentangling of symbolic information from image data using models constructed with the aid of geometry, physics, statistics, and learning theory.

Microscopy

Microscopy is the technical field of using microscopes to view objects and areas of objects that cannot be seen with the naked eye. There are three well-known branches of microscopy: optical, electron, and scanning probe microscopy, along with the emerging field of X-ray microscopy.

Microscope

A microscope is a laboratory instrument used to examine objects that are too small to be seen by the naked eye. Microscopy is the science of investigating small objects and structures using a microscope. Microscopic means being invisible to the eye unless aided by a microscope.

Optics

Optics is the branch of physics that studies the behaviour and properties of light, including its interactions with matter and the construction of instruments that use or detect it. Optics usually describes the behaviour of visible, ultraviolet, and infrared light. Light is a type of electromagnetic radiation, and other forms of electromagnetic radiation such as X-rays, microwaves, and radio waves exhibit similar properties.

Photonics

Photonics is a branch of optics involving the generation, detection, and manipulation of light in the form of photons through emission, transmission, modulation, signal processing, switching, amplification, and sensing. Photonics is closely related to quantum electronics: quantum electronics deals with the theory, while photonics deals with its engineering applications. Though light's technical applications span the whole spectrum, most photonic applications are in the range of visible and near-infrared light. The term photonics developed as an outgrowth of the first practical semiconductor light emitters invented in the early 1960s and optical fibers developed in the 1970s.

Backscatter

In physics, backscatter is the reflection of waves, particles, or signals back to the direction from which they came. It is usually a diffuse reflection due to scattering, as opposed to specular reflection as from a mirror, although specular backscattering can occur at normal incidence with a surface. Backscattering has important applications in astronomy, photography, and medical ultrasonography. The opposite effect is forward scatter, e.g. when a translucent material like a cloud diffuses sunlight, giving soft light.

Computational photography

Computational photography refers to digital image capture and processing techniques that use digital computation instead of optical processes. Computational photography can improve the capabilities of a camera, or introduce features that were not possible at all with film-based photography, or reduce the cost or size of camera elements. Examples of computational photography include in-camera computation of digital panoramas, high-dynamic-range images, and light field cameras. Light field cameras use novel optical elements to capture three dimensional scene information which can then be used to produce 3D images, enhanced depth-of-field, and selective de-focusing. Enhanced depth-of-field reduces the need for mechanical focusing systems. All of these features use computational imaging techniques.
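A minimal sketch of one such technique, assuming a simple HDR-style merge of two hypothetical exposures (the pixel values, exposure times, and rejection thresholds below are illustrative):

```python
# Sketch of high-dynamic-range merging: each pixel's radiance is
# estimated as value / exposure_time, averaged over exposures where
# the pixel is neither clipped nor too dark to trust.

def merge_exposures(frames, times, lo=10, hi=245):
    """frames: equal-length lists of 8-bit pixel values; times: seconds."""
    merged = []
    for px in zip(*frames):
        estimates = [v / t for v, t in zip(px, times) if lo <= v <= hi]
        # Fall back to the longest exposure if every sample is rejected.
        merged.append(sum(estimates) / len(estimates) if estimates
                      else px[-1] / times[-1])
    return merged

short = [200, 128, 5]    # 1/100 s exposure: dark pixel is unreliable
long_ = [255, 255, 100]  # 1/10 s exposure: bright pixels are clipped
radiance = merge_exposures([short, long_], [0.01, 0.1])
```

The merged result recovers usable radiance estimates for both the bright pixels (from the short exposure) and the dark pixel (from the long one), which neither single frame could provide alone.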

Super-resolution imaging (SR) is a class of techniques that enhance (increase) the resolution of an imaging system. In optical SR the diffraction limit of systems is transcended, while in geometrical SR the resolution of digital imaging sensors is enhanced.
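A toy illustration of geometrical SR, assuming two frames of the same scene whose sampling grids are offset by exactly half a pixel (a known, idealized shift):

```python
# Sketch of geometrical super-resolution by shift-and-add: two
# low-resolution samplings of the same signal, offset by half a pixel,
# are interleaved onto a grid with twice the sampling density.

def interleave(frame_a, frame_b):
    """frame_b is sampled +0.5 pixel relative to frame_a."""
    out = []
    for a, b in zip(frame_a, frame_b):
        out.extend([a, b])
    return out

# A fine signal, and two coarse frames sampled at half-pixel offsets:
fine = [0, 1, 4, 9, 16, 25, 36, 49]
frame_a = fine[0::2]  # samples at integer pixel centres
frame_b = fine[1::2]  # samples offset by half a pixel
restored = interleave(frame_a, frame_b)
```

Real systems must first estimate the subpixel shifts and cope with noise and blur; this sketch only shows why multiple shifted frames carry more resolution than any single one.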

Applied physics

Applied physics is the application of physics to solve scientific or engineering problems. It is usually considered a bridge or a connection between physics and engineering. "Applied" is distinguished from "pure" by a subtle combination of factors, such as the motivation and attitude of researchers and the nature of the relationship to the technology or science that may be affected by the work. Applied physics is rooted in the fundamental truths and basic concepts of the physical sciences but is concerned with the utilization of scientific principles in practical devices and systems and with the application of physics in other areas of science and high technology.

Optical computing or photonic computing uses light waves produced by lasers or incoherent sources for data processing, data storage or data communication for computing. For decades, photons have shown promise to enable a higher bandwidth than the electrons used in conventional computers.

Image sensor

An image sensor or imager is a sensor that detects and conveys information used to form an image. It does so by converting the variable attenuation of light waves into signals, small bursts of current that convey the information. The waves can be light or other electromagnetic radiation. Image sensors are used in electronic imaging devices of both analog and digital types, which include digital cameras, camera modules, camera phones, optical mouse devices, medical imaging equipment, night vision equipment such as thermal imaging devices, radar, sonar, and others. As technology changes, electronic and digital imaging tends to replace chemical and analog imaging.
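A sketch of this conversion for a single hypothetical pixel, assuming a linear sensor that saturates at full scale and an 8-bit analog-to-digital converter (all constants are illustrative):

```python
# Sketch of how an imager turns light into numbers: each pixel
# integrates incident light into a signal proportional to exposure,
# which an analog-to-digital converter quantizes to a code value.

def digitize(intensity, exposure_s, full_scale=1.0, bits=8):
    """Map light intensity (arbitrary units) to an n-bit code value."""
    signal = min(intensity * exposure_s, full_scale)  # sensor saturates
    levels = (1 << bits) - 1                          # 255 for 8 bits
    return round(signal / full_scale * levels)

codes = [digitize(i, exposure_s=0.01) for i in (0.0, 25.0, 50.0, 200.0)]
```

The brightest input saturates at the maximum code value, which is the clipping behavior the exposure-merging idea in computational photography works around.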

The following are common definitions related to the machine vision field.

The following outline is provided as an overview of and topical guide to photography:

Electro-optical MASINT is a subdiscipline of Measurement and Signature Intelligence, (MASINT) and refers to intelligence gathering activities which bring together disparate elements that do not fit within the definitions of Signals Intelligence (SIGINT), Imagery Intelligence (IMINT), or Human Intelligence (HUMINT).

Optical heterodyne detection is a method of extracting information encoded as modulation of the phase, frequency or both of electromagnetic radiation in the wavelength band of visible or infrared light. The light signal is compared with standard or reference light from a "local oscillator" (LO) that would have a fixed offset in frequency and phase from the signal if the latter carried null information. "Heterodyne" signifies more than one frequency, in contrast to the single frequency employed in homodyne detection.
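A numerical sketch of the principle, with frequencies scaled down to illustrative values: multiplying the signal with the local oscillator produces a measurable component at the difference ("beat") frequency:

```python
import math

# Heterodyne detection sketch: mixing (multiplying) a signal at f_sig
# with a local oscillator at f_lo yields a component at f_sig - f_lo,
# which slow electronics can measure even when both original
# frequencies are far too fast to follow directly.

f_sig, f_lo, rate, n = 1200.0, 1000.0, 100_000, 100_000
mixed = [math.cos(2 * math.pi * f_sig * t / rate) *
         math.cos(2 * math.pi * f_lo * t / rate) for t in range(n)]

def power_at(samples, f, rate):
    """Magnitude of the single-frequency DFT component at f (naive)."""
    re = sum(s * math.cos(2 * math.pi * f * t / rate)
             for t, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * f * t / rate)
             for t, s in enumerate(samples))
    return math.hypot(re, im) / len(samples)

beat = power_at(mixed, f_sig - f_lo, rate)  # strong 200 Hz component
off  = power_at(mixed, 555.0, rate)         # little energy elsewhere
```

The product cos(a)·cos(b) = ½[cos(a−b) + cos(a+b)], so the mixed signal concentrates measurable energy at the 200 Hz difference frequency while other frequencies stay near zero.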

Digital microscope

A digital microscope is a variation of a traditional optical microscope that uses optics and a digital camera to output an image to a monitor, sometimes by means of software running on a computer. A digital microscope often has its own in-built LED light source, and differs from an optical microscope in that there is no provision to observe the sample directly through an eyepiece. Since the image is focused on the digital circuit, the entire system is designed for the monitor image. The optics for the human eye are omitted.

Speckle, speckle pattern, or speckle noise designates the granular structure observed in coherent light, resulting from random interference. Speckle patterns are used in a wide range of metrology techniques, as they generally allow high sensitivity and simple setups. They can also be a limiting factor in imaging systems, such as radar, synthetic aperture radar (SAR), medical ultrasound and optical coherence tomography. Speckle is not external noise; rather, it is an inherent fluctuation in diffuse reflections, because the scatterers are not identical for each cell, and the coherent illumination wave is highly sensitive to small variations in phase changes.
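A seeded simulation of this phasor picture (scatterer and cell counts are arbitrary): summing many unit-amplitude contributions with random phases yields intensities whose standard deviation is comparable to their mean, i.e. a speckle contrast near 1:

```python
import cmath
import math
import random

# Why speckle arises: the coherent field at one detector cell is the
# sum of many scattered contributions with random phases, so the
# resulting intensity fluctuates strongly from cell to cell.

random.seed(0)  # seeded so the simulation is reproducible

def cell_intensity(n_scatterers=200):
    """Intensity of a sum of unit-amplitude, random-phase phasors."""
    field = sum(cmath.exp(1j * random.uniform(0, 2 * math.pi))
                for _ in range(n_scatterers))
    return abs(field) ** 2

intensities = [cell_intensity() for _ in range(2000)]
mean = sum(intensities) / len(intensities)
var = sum((i - mean) ** 2 for i in intensities) / len(intensities)
contrast = math.sqrt(var) / mean  # ~1 for fully developed speckle
```

A contrast near 1 is the signature of fully developed speckle, which is why it can dominate coherent imaging systems such as SAR and optical coherence tomography.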

References

  1. Joseph P. Hornak, Encyclopedia of Imaging Science and Technology (John Wiley & Sons, 2002). ISBN 9780471332763.
  2. Kaboutari, Keivan; Önder Tetik, Ahmet; Ghalichi, Elyar; Soner Gözü, Mehmet; Zengin, Reyhan; Güneri Gençer, Nevzat (2019). "Data acquisition system for MAET with magnetic field measurements". Physics in Medicine & Biology. 64 (11): 115016. Bibcode:2019PMB....64k5016K. doi:10.1088/1361-6560/ab1809. PMID 30970342. S2CID 108294047.
