Nanophotonic coherent imager

Nanophotonic coherent imagers (NCIs) are image sensors that determine both the appearance and the distance of an imaged scene at each pixel. They use an array of LIDARs (scanning laser beams) to gather this information about appearance and distance, exploiting an optical property called coherence, whereby waves of the same frequency maintain a fixed phase relationship. [1]

NCIs can capture 3D images of objects with sufficient accuracy to permit the creation of high resolution replicas using 3D printing technology. [1]

The detection of both intensity and relative delay enables applications such as high-resolution 3D reflective and transmissive imaging as well as index contrast imaging. [1]

Prototype

An NCI prototype using a 4×4 pixel grid of 16 grating couplers [2] operates on a modified time-domain frequency-modulated continuous-wave (FMCW) ranging scheme, in which concurrent time-domain measurements of both the period and the zero-crossing time of each electrical output of the nanophotonic chip allow the NCI to overcome the resolution limits of frequency-domain detection. [3] Each pixel on the chip is an independent interferometer that detects the phase and frequency of the signal in addition to its intensity. Each LIDAR pixel spans only a few hundred microns, so the entire array fits within an area of 300 μm × 300 μm. [2]
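As a rough illustration of the FMCW ranging idea described above, the sketch below simulates a beat tone, estimates its period from zero-crossing spacing (a time-domain measurement, as in the NCI scheme), and converts the result back to distance. The bandwidth, chirp time, sampling rate, and target range are illustrative placeholders, not the prototype's actual parameters.

```python
import numpy as np

C = 3.0e8            # speed of light, m/s
BANDWIDTH = 1.0e12   # chirp bandwidth B (Hz), illustrative
CHIRP_TIME = 1.0e-3  # chirp duration T (s), illustrative

def beat_frequency(distance_m: float) -> float:
    """Beat frequency from mixing the outgoing chirp with the echo
    delayed by the round trip 2*d/c: f_b = (B/T) * (2*d/c)."""
    round_trip = 2.0 * distance_m / C
    return (BANDWIDTH / CHIRP_TIME) * round_trip

def distance_from_zero_crossings(signal: np.ndarray, fs: float) -> float:
    """Estimate the beat period from rising zero crossings and invert
    the FMCW relation to recover the target distance."""
    signs = np.sign(signal)
    crossings = np.where(np.diff(signs) > 0)[0]    # rising crossings
    period = np.mean(np.diff(crossings)) / fs      # seconds per cycle
    f_b = 1.0 / period
    return f_b * C * CHIRP_TIME / (2.0 * BANDWIDTH)

# Simulate the beat tone for a target 0.3 m away and recover the range.
fs = 50e6
t = np.arange(0, CHIRP_TIME, 1.0 / fs)
tone = np.sin(2 * np.pi * beat_frequency(0.3) * t)
print(distance_from_zero_crossings(tone, fs))  # ~0.3
```

In a real FMCW system the beat tone comes from photodetecting the interference of the transmitted and received chirps; here it is synthesized directly for clarity.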

The prototype achieved 15 μm depth resolution and 50 μm lateral resolution (limited by the pixel spacing) at ranges of up to 0.5 m. It could detect an equivalent refractive-index contrast of 1% at 1 mm thickness. [3]
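For context, a conventional frequency-domain FMCW system has a depth resolution of c/(2B), where B is the chirp bandwidth. The back-of-the-envelope calculation below (not taken from the paper) shows that reaching 15 μm this way would require roughly 10 THz of optical bandwidth, which is why the time-domain zero-crossing measurement matters.

```python
# Conventional frequency-domain FMCW depth resolution: dR = c / (2B).
# Solve for the bandwidth B needed to resolve 15 micrometres.
C = 3.0e8                  # speed of light, m/s
depth_resolution = 15e-6   # 15 um, the prototype's reported figure
required_bandwidth = C / (2 * depth_resolution)
print(f"{required_bandwidth / 1e12:.0f} THz")  # prints "10 THz"
```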

References

  1. "New chip could turn phone cameras into high-res 3D scanners". www.gizmag.com. 7 April 2015. Retrieved 2017-12-21.
  2. "Miniaturized camera chip provides superfine depth resolution for 3D printing". KurzweilAI. www.kurzweilai.net. Retrieved 2017-12-21.
  3. Aflatouni, Firooz; Abiri, Behrooz; Rekhi, Angad; Hajimiri, Ali (April 2015). "Nanophotonic coherent imager". Optics Express. 23 (4): 5117–5125. Bibcode:2015OExpr..23.5117A. doi:10.1364/oe.23.005117. ISSN 1094-4087. PMID 25836545.