Active pixel sensor

An active-pixel sensor (APS) is an image sensor in which each picture element ("pixel") has a photodetector and an active amplifier. There are many types of integrated-circuit active-pixel sensors, including the complementary metal–oxide–semiconductor (CMOS) APS used most commonly in cell phone cameras, web cameras, most digital pocket cameras since 2010, most digital single-lens reflex cameras (DSLRs), and mirrorless interchangeable-lens cameras (MILCs). Such an image sensor is produced using CMOS technology (and is hence also known as a CMOS sensor), and has emerged as an alternative to charge-coupled device (CCD) image sensors.

CMOS image sensor

The term 'active pixel sensor' is also used to refer to the individual pixel sensor itself, as opposed to the image sensor;[1] in that case the image sensor is sometimes called an active pixel sensor imager,[2] or active-pixel image sensor.[3]

History

The term active pixel sensor was coined in 1985 by Tsutomu Nakamura, who worked on the Charge Modulation Device active pixel sensor at Olympus,[4] and was more broadly defined by Eric Fossum in a 1993 paper.[5]

Image sensor elements with in-pixel amplifiers were described by Noble in 1968,[6] by Chamberlain in 1969,[7] and by Weimer et al. in 1969,[8] at a time when passive-pixel sensors – that is, pixel sensors without their own amplifiers or active noise-cancelling circuitry – were being investigated as a solid-state alternative to vacuum-tube imaging devices.[citation needed] The MOS passive-pixel sensor used just a simple switch in the pixel to read out the photodiode's integrated charge.[9] Pixels were arrayed in a two-dimensional structure, with an access-enable wire shared by pixels in the same row and an output wire shared by pixels in the same column. At the end of each column was an amplifier. Passive-pixel sensors suffered from many limitations, such as high noise, slow readout, and lack of scalability. The addition of an amplifier to each pixel addressed these problems, and resulted in the creation of the active-pixel sensor. Noble in 1968 and Chamberlain in 1969 created sensor arrays with active MOS readout amplifiers per pixel, in essentially the modern three-transistor configuration.

The CCD was invented in October 1969 at Bell Labs. Because the MOS process was highly variable and MOS transistors had characteristics that changed over time (threshold-voltage instability), the CCD's charge-domain operation was more manufacturable, and CCDs quickly eclipsed MOS passive- and active-pixel sensors. A low-resolution "mostly digital" N-channel MOSFET imager with intra-pixel amplification, for an optical mouse application, was demonstrated in 1981.[10]

Another type of active pixel sensor is the hybrid infrared focal plane array (IRFPA) designed to operate at cryogenic temperatures in the infrared spectrum. The devices are two chips that are put together like a sandwich: one chip contains detector elements made in InGaAs or HgCdTe, and the other chip is typically made of silicon and is used to read out the photodetectors. The exact date of origin of these devices is classified, but by the mid-1980s they were in widespread use.

By the late 1980s and early 1990s, the CMOS process was well established as a well-controlled, stable process and was the baseline process for almost all logic and microprocessors. There was a resurgence in the use of passive-pixel sensors for low-end imaging applications,[11] and of active-pixel sensors for low-resolution, high-function applications such as retina simulation[12] and high-energy particle detectors. However, CCDs continued to have much lower temporal noise and fixed-pattern noise, and were the dominant technology for consumer applications such as camcorders as well as for broadcast cameras, where they were displacing video camera tubes.

Fossum, then working at the NASA Jet Propulsion Laboratory, and his colleagues invented an image sensor that used intra-pixel charge transfer along with an in-pixel amplifier to achieve true correlated double sampling (CDS) and low-temporal-noise operation, together with on-chip circuits for fixed-pattern noise reduction, and he published the first extensive article[5] predicting the emergence of APS imagers as the commercial successor of CCDs. Between 1993 and 1995, the Jet Propulsion Laboratory developed a number of prototype devices, which validated the key features of the technology. Though primitive, these devices demonstrated good image performance with high readout speed and low power consumption.

In 1995, frustrated by the slow pace of the technology's adoption, Fossum and his then-wife Dr. Sabrina Kemeny co-founded Photobit Corporation to commercialize the technology.[13] It continued to develop and commercialize APS technology for a number of applications, such as web cams, high-speed and motion-capture cameras, digital radiography, endoscopy (pill) cameras, DSLRs and camera phones. Many other small image sensor companies sprang to life shortly thereafter, owing to the accessibility of the CMOS process, and all quickly adopted the active-pixel sensor approach. Most recently, CMOS sensor technology has spread to medium-format photography, with Phase One being the first to launch a medium-format digital back with a Sony-built CMOS sensor.

Fossum now performs research on Quanta Image Sensor (QIS) technology.[14] The QIS, under development at Dartmouth, represents a fundamental change in the way cameras collect images. In the QIS, the goal is to count every photon that strikes the image sensor, to provide a resolution of 1 billion or more specialized photoelements (called jots) per sensor, and to read out jot bit planes hundreds or thousands of times per second, resulting in terabits per second of data.[15]

Comparison to CCDs

APS pixels solve the speed and scalability issues of the passive-pixel sensor. They generally consume less power than CCDs, have less image lag, and require less specialized manufacturing facilities. Unlike CCDs, APS sensors can combine the image sensor function and image processing functions within the same integrated circuit. APS sensors have found markets in many consumer applications, especially camera phones. They have also been used in other fields including digital radiography, military ultra-high-speed image acquisition, security cameras, and optical mice. Manufacturers include Aptina Imaging (an independent spinout from Micron Technology, which purchased Photobit in 2001), Canon, Samsung, STMicroelectronics, Toshiba, OmniVision Technologies, Sony, and Foveon, among others. CMOS-type APS sensors are typically suited to applications in which packaging, power management, and on-chip processing are important. CMOS-type sensors are widely used, from high-end digital photography down to mobile-phone cameras.

Advantages of CMOS compared with CCD

Blooming in a CCD image

A big advantage of a CMOS sensor is that it is typically less expensive than a CCD sensor.

A CMOS sensor also typically has better control of blooming (that is, of bleeding of photo-charge from an over-exposed pixel into other nearby pixels).

In three-sensor camera systems that use separate sensors to resolve the red, green, and blue components of the image in conjunction with beam-splitter prisms, the three CMOS sensors can be identical, whereas most splitter prisms require that one of the CCD sensors be a mirror image of the other two in order to read out the image in a compatible order. Unlike CCD sensors, CMOS sensors have the ability to reverse the addressing of the sensor elements.

Disadvantages of CMOS compared with CCD

Distortion caused by a rolling shutter

Since a CMOS sensor typically captures one row at a time over a period of approximately 1/60th or 1/50th of a second (depending on the frame rate), the result may be a "rolling shutter" effect, in which the image is skewed (tilted to the left or right, depending on the direction of camera or subject movement). For example, when tracking a car moving at high speed, the car will not be distorted but the background will appear tilted. A frame-transfer CCD sensor or a "global shutter" CMOS sensor does not have this problem; instead, it captures the entire image at once into a frame store.
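The magnitude of the skew can be estimated from the row read-out time and the subject's apparent speed on the sensor. A minimal sketch follows; the frame time, row count and speed are assumed example values, not figures from this article:

    # Rolling-shutter skew: each row is sampled slightly later than the row above it,
    # so a vertical edge moving horizontally is recorded as a slanted line.

    FRAME_TIME_S = 1 / 50        # assumed: full frame read out in 1/50 s
    ROWS = 1080                  # assumed: number of sensor rows
    SPEED_PX_PER_S = 2000.0      # assumed: horizontal speed of the edge, pixels per second

    ROW_TIME_S = FRAME_TIME_S / ROWS   # delay between the starts of successive rows

    def edge_position(row: int, start_x: float = 0.0) -> float:
        """Horizontal position at which a moving vertical edge is recorded on a given row."""
        return start_x + SPEED_PX_PER_S * ROW_TIME_S * row

    skew_px = edge_position(ROWS - 1) - edge_position(0)
    print(f"Edge is displaced by {skew_px:.1f} px between the top and bottom rows")
    # A global-shutter or frame-transfer sensor samples all rows at one instant, so the skew is zero.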

The active circuitry in CMOS pixels occupies some surface area that is not light-sensitive, reducing the photon-detection efficiency of the device (back-illuminated sensors can mitigate this problem). But a frame-transfer CCD also devotes roughly half its area to the non-sensitive frame-store nodes, so the relative advantages depend on which types of sensors are being compared.

Architecture

Pixel

A three-transistor active pixel sensor.

The standard CMOS APS pixel today consists of a photodetector (a pinned photodiode[16]), a floating diffusion, a transfer gate, a reset gate, a selection gate and a source-follower readout transistor, the so-called 4T cell.[17] The pinned photodiode was originally used in interline-transfer CCDs due to its low dark current and good blue response; when coupled with the transfer gate, it allows complete charge transfer from the pinned photodiode to the floating diffusion (which is further connected to the gate of the read-out transistor), eliminating lag. The use of intra-pixel charge transfer can offer lower noise by enabling the use of correlated double sampling (CDS).

The Noble 3T pixel is still sometimes used, since its fabrication requirements are less complex. The 3T pixel comprises the same elements as the 4T pixel except the transfer gate, and it uses a conventional photodiode rather than a pinned one. The reset transistor, Mrst, acts as a switch to reset the floating diffusion to VRST; in this case the floating diffusion is simply the node connected to the gate of the Msf transistor. When the reset transistor is turned on, the photodiode is effectively connected to the power supply, VRST, clearing all integrated charge. Since the reset transistor is n-type, the pixel operates in soft reset. The read-out transistor, Msf, acts as a buffer (specifically, a source follower), an amplifier which allows the pixel voltage to be observed without removing the accumulated charge. Its power supply, VDD, is typically tied to the power supply of the reset transistor, VRST. The select transistor, Msel, allows a single row of the pixel array to be read by the read-out electronics.

Pixel variants with more transistors, such as 5T and 6T pixels, also exist. By adding extra transistors, functions such as global shutter, as opposed to the more common rolling shutter, become possible. To increase pixel density, shared-row, four-way and eight-way shared read-out, and other architectures can be employed.

A variant of the 3T active pixel is the Foveon X3 sensor invented by Dick Merrill. In this device, three photodiodes are stacked on top of each other using planar fabrication techniques, each photodiode having its own 3T circuit. Each successive layer acts as a filter for the layer below it, shifting the spectrum of absorbed light in successive layers. By deconvolving the response of each layered detector, red, green, and blue signals can be reconstructed.
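To illustrate why intra-pixel charge transfer enables correlated double sampling, the following sketch is a minimal numerical model (not this article's circuit; the temperature, capacitance and signal values are assumptions): the floating-diffusion reset level is sampled first, the photo-charge is then transferred onto the same node, the signal level is sampled, and the difference of the two samples cancels the reset (kTC) offset common to both.

    import math
    import random

    K_B = 1.380649e-23   # Boltzmann constant, J/K
    Q = 1.602176634e-19  # electron charge, C
    T = 300.0            # assumed temperature, K
    C_FD = 2e-15         # assumed floating-diffusion capacitance, F

    def read_pixel_with_cds(signal_electrons: float) -> float:
        """Model one 4T-pixel read: reset sample, charge transfer, signal sample, subtraction."""
        # Resetting the floating diffusion leaves a random kTC-noise offset on the node.
        reset_offset_v = random.gauss(0.0, math.sqrt(K_B * T / C_FD))
        reset_sample = reset_offset_v
        # Transfer the photo-generated electrons from the pinned photodiode onto the node;
        # they pull the node voltage down by Q/C_FD per electron.
        signal_sample = reset_offset_v - signal_electrons * Q / C_FD
        # Correlated double sampling: the difference removes the common reset offset.
        return (reset_sample - signal_sample) * C_FD / Q  # convert back to electrons

    print(read_pixel_with_cds(1000.0))  # ~1000.0 regardless of the random reset offset

In a 3T pixel the reset level belonging to a given integration period is no longer available at read-out time, which is why true CDS requires the transfer gate and floating diffusion.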

APS using thin-film transistors

A two-transistor active/passive pixel sensor

For applications such as large-area digital X-ray imaging, thin-film transistors (TFTs) can also be used in APS architecture. However, because of the larger size and lower transconductance gain of TFTs compared with CMOS transistors, it is necessary to have fewer on-pixel TFTs to maintain image resolution and quality at an acceptable level. A two-transistor APS/PPS architecture has been shown to be promising for APS using amorphous silicon TFTs. In the two-transistor APS architecture on the right, TAMP is used as a switched-amplifier integrating functions of both Msf and Msel in the three-transistor APS. This results in reduced transistor counts per pixel, as well as increased pixel transconductance gain. [18] Here, Cpix is the pixel storage capacitance, and it is also used to capacitively couple the addressing pulse of the "Read" to the gate of TAMP for ON-OFF switching. Such pixel readout circuits work best with low capacitance photoconductor detectors such as amorphous selenium.

Array

A typical two-dimensional array of pixels is organized into rows and columns. Pixels in a given row share reset lines, so that a whole row is reset at a time. The row select lines of each pixel in a row are tied together as well. The outputs of each pixel in any given column are tied together. Since only one row is selected at a given time, no competition for the output line occurs. Further amplifier circuitry is typically on a column basis.
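A minimal sketch of this read-out order is shown below (hypothetical helper names, not a real sensor API); selecting one row at a time is what keeps the shared column lines free of contention.

    from typing import List

    def column_amplify(row_samples: List[float], gain: float = 1.0) -> List[float]:
        """Stand-in for the per-column amplifier stage at the foot of each column."""
        return [gain * s for s in row_samples]

    def read_frame(pixel_values: List[List[float]]) -> List[List[float]]:
        """Read a 2-D pixel array one row at a time.

        Asserting a single row-select line connects only that row's pixels to the
        shared column output lines, so the column buses never see contention.
        """
        frame = []
        for row in pixel_values:                 # one row selected at a time
            frame.append(column_amplify(row))    # all columns sampled in parallel
        return frame

    print(read_frame([[0.0, 1.0, 2.0], [3.0, 4.0, 5.0]]))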

Size

The size of the pixel sensor is often given as height and width, but also in terms of the optical format.

Design variants

Many different pixel designs have been proposed and fabricated. The standard pixel is the most common because it uses the fewest wires and the fewest, most tightly packed transistors possible for an active pixel. It is important that the active circuitry in a pixel take up as little space as possible, to leave more room for the photodetector. A high transistor count hurts fill factor, that is, the percentage of the pixel area that is sensitive to light. Pixel size can be traded for desirable qualities such as noise reduction or reduced image lag. Noise is a measure of the accuracy with which the incident light can be measured. Lag occurs when traces of a previous frame remain in future frames, i.e. the pixel is not fully reset. The voltage noise variance in a soft-reset (gate-voltage regulated) pixel is kT/2C, but image lag and fixed-pattern noise may be problematic. In rms electrons, the noise is √(kTC/2)/q.

Hard reset

Operating the pixel via hard reset results in Johnson–Nyquist noise on the photodiode of kT/C (or √(kTC)/q in rms electrons), but prevents image lag, sometimes a desirable tradeoff. One way to use hard reset is to replace Mrst with a p-type transistor and invert the polarity of the RST signal. The presence of the p-type device reduces fill factor, as extra space is required between p- and n-devices; it also removes the possibility of using the reset transistor as an overflow anti-blooming drain, which is a commonly exploited benefit of the n-type reset FET. Another way to achieve hard reset, with the n-type FET, is to lower the voltage of VRST relative to the on-voltage of RST. This reduction may reduce headroom, or full-well charge capacity, but does not affect fill factor, unless VDD is then routed on a separate wire with its original voltage.
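As a back-of-the-envelope illustration of the two reset-noise expressions above (the 300 K temperature and 2 fF capacitance are assumed example values, not figures from this article):

    import math

    K_B = 1.380649e-23   # Boltzmann constant, J/K
    Q = 1.602176634e-19  # electron charge, C
    T = 300.0            # assumed temperature, K
    C = 2e-15            # assumed photodiode/floating-diffusion capacitance, F

    hard_reset_e = math.sqrt(K_B * T * C) / Q        # kTC noise, rms electrons
    soft_reset_e = math.sqrt(K_B * T * C / 2) / Q    # half the variance of hard reset

    print(f"hard reset: {hard_reset_e:.1f} e- rms")  # ~18 e- rms for these values
    print(f"soft reset: {soft_reset_e:.1f} e- rms")  # ~13 e- rms for these values

Halving the variance in soft reset lowers the rms noise by a factor of √2, at the cost of the lag and fixed-pattern-noise issues noted above.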

Combinations of hard and soft reset

Techniques such as flushed reset, pseudo-flash reset, and hard-to-soft reset combine soft and hard reset. The details of these methods differ, but the basic idea is the same. First, a hard reset is done, eliminating image lag. Next, a soft reset is done, causing a low noise reset without adding any lag. [19] Pseudo-flash reset requires separating VRST from VDD, while the other two techniques add more complicated column circuitry. Specifically, pseudo-flash reset and hard-to-soft reset both add transistors between the pixel power supplies and the actual VDD. The result is lower headroom, without affecting fill factor.

Active reset

A more radical pixel design is the active-reset pixel. Active reset can result in much lower noise levels. The tradeoff is a complicated reset scheme, as well as either a much larger pixel or extra column-level circuitry.

See also

References

  1. Alexander G. Dickinson et al., "Active pixel sensor and imaging system having differential mode", US 5631704
  2. Zimmermann, Horst (2000). Integrated Silicon Optoelectronics. Springer. ISBN   3-540-66662-1.
  3. Lawrence T. Clark, Mark A. Beiley, Eric J. Hoffman, "Sensor cell having a soft saturation circuit" US 6133563
  4. Matsumoto, Kazuya; et al. (1985). "A new MOS phototransistor operating in a non-destructive readout mode". Japanese Journal of Applied Physics. 24 (5A): L323.
  5. Eric R. Fossum (1993), "Active Pixel Sensors: Are CCD's Dinosaurs?", Proc. SPIE vol. 1900, pp. 2–14, Charge-Coupled Devices and Solid State Optical Sensors III, Morley M. Blouke, ed.
  6. Peter J. W. Noble (April 1968). "Self-Scanned Silicon Image Detector Arrays". IEEE Transactions on Electron Devices. ED-15 (4): 202–209. (Presented with an award for 'Seminal contributions to the early years of image sensors' by the International Image Sensor Society in 2015.)
  7. Savvas G. Chamberlain (December 1969). "Photosensitivity and Scanning of Silicon Image Detector Arrays". IEEE Journal of Solid-State Circuits. SC-4 (6): 333–342.
  8. P. K. Weimer; W. S. Pike; G. Sadasiv; F. V. Shallcross; L. Meray-Horvath (March 1969). "Multielement Self-Scanned Mosaic Sensors". IEEE Spectrum. 6 (3): 52–65. doi:10.1109/MSPEC.1969.5214004.
  9. R. Dyck; G. Weckler (1968). "Integrated arrays of silicon photodetectors for image sensing". IEEE Trans. Electron Devices. ED-15 (4): 196–201.
  10. Richard F. Lyon (1981). "The Optical Mouse, and an Architectural Methodology for Smart Digital Sensors". In H. T. Kung; R. Sproull; G. Steele. CMU Conference on VLSI Structures and Computations. Pittsburgh: Computer Science Press.
  11. D. Renshaw; P. B. Denyer; G. Wang; M. Lu (1990). "ASIC image sensors". IEEE International Symposium on Circuits and Systems 1990.
  12. M. A. Mahowald; C. Mead (12 May 1989). "The Silicon Retina". Scientific American. 264 (5): 76–82. Bibcode:1991SciAm.264e..76M. doi:10.1038/scientificamerican0591-76. PMID   2052936.
  13. Fossum, Eric R. (18 December 2013). "CAMERA-ON-A-CHIP: TECHNOLOGY TRANSFER FROM SATURN TO YOUR CELL PHONE". Technology & Innovation. 15 (3): 197–209. doi:10.3727/194982413X13790020921744 via IngentaConnect.
  14. Fossum, E. R. (1 September 2013). "Modeling the Performance of Single-Bit and Multi-Bit Quanta Image Sensors". IEEE Journal of the Electron Devices Society. 1 (9): 166–174. doi:10.1109/JEDS.2013.2284054 via IEEE Xplore.
  15. Research on advanced image sensors and camera systems
  16. "A review of the pinned photodiode for CCD and CMOS image sensors", IEEE Journal of the Electron Devices Society, vol. 2, no. 3, pp. 33–43, May 2014 (open access). Archived from the original on 2015-10-27. Retrieved 2014-08-17.
  17. H. Lin; C.H Lai; Y. C. Ling (2004). "A four transistor CMOS active pixel sensor with high dynamic range operation". IEEE Advanced System Integrated Circuits: 124–127.
  18. F. Taghibakhsh; k. S. Karim (2007). "Two-Transistor Active Pixel Sensor for High Resolution Large Area Digital X-Ray Imaging". IEEE International Electron Devices Meeting: 1011–1014.
  19. IEEE Transactions on Electron Devices, vol. 50, no. 1, January 2003.

Further reading