Lab Notes

PHOTON DETECTORS
Ultrasensitive Two-Dimensional Photodetection
Posted October 2012
What information can be associated with the
detection of a single photon?

Where and, more importantly, when did you see that flash of light? Specifically, when did that single photon arrive at the detector? And how quickly can the detector recover to see the next one? These questions and others are the bailiwick of Richard Marino and his associates in the Active Optical Systems Group. Their solution for answering them is a focal-plane array of Geiger-mode avalanche photodiodes (a GM APD FPA).

Back in 1991, Marino recalls, the Department of Defense expressed a need for a very smart missile that could quickly and automatically distinguish between a true target and a decoy. The missile sensors had to be very capable, and yet small and light so that the missile could be quick enough to engage the target. The sooner the missile could identify the target, the sooner it could divert its path to hit it. The sensor system could use a laser to probe the targets and decoys, but the performance of the laser radar (or ladar) was crucial. The question came down to, "From how far away can the sensor identify the target?" and led to the fundamental question, "How much information can be associated with the fewest photons detected?"

Illustration of how the GM APD array is set up. Each pixel of the array comprises a stack of components. Photons arriving from the right pass through an array of lenses and hit the GM APD array. Each pixel of the array has its own timing circuit directly behind it, so digital counting signals are generated at the pixel, thus allowing the pixel to be quickly reset. In this figure, CMOS stands for complementary metal-oxide semiconductor, ROIC is readout integrated circuit, and InP and InGaAsP are indium phosphide and indium gallium arsenide phosphide.

The concept of a Geiger-mode photon-counting detector array that could measure the three-dimensional shape (resolved in angle, angle, and range) and orientation of the target and decoys was proposed. Such a sensor could reduce the requirements for size, weight, and power by maximizing the efficiency of the optical signal receiver. A compact, intelligent, integrated detector array was needed. "There's skepticism even today about the GM APD technology," Marino says. The nonlinearity of the GM APD and the fact that it isn’t counting enough photons for traditionally "good" statistics are typical concerns in ladar sensor engineering. "Engineers like linear systems," Marino states. However, the technology behind GM APDs is almost perfectly nonlinear. "Timing is everything," Marino continues. The arrival time of a single photon at a detector determines the distance to the corresponding spot on the object.

FPA wafer. The wafer above contains twenty 256 × 64 FPA chips.

Brian Aull and his associates in the Advanced Imaging Technology Group developed GM APD arrays that can detect and time-stamp single photons by using independent, digital time-of-flight counting circuits at each pixel and the detector's extremely high internal gain. The independent time-of-flight measurement for each pixel has a timing quantization of 500 picoseconds (equivalent to a 2 GHz effective clock rate). Using the detector in this binary response mode, where time of arrival matters more than signal intensity, simplifies the signal processing by eliminating the need for analog-to-digital converters and for corrections for varying response to input intensity. Simplifying the detection process in this way, and maximizing single-photon detection efficiency, helps reduce the sensor's size, weight, and power requirements. Marino is also concerned with possible false alarms from such sensitive devices. To reduce unwanted detections from random background light, GM APDs are not held in their wait state indefinitely; they are armed only during the expected detection time (the range window to the objects of interest).
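
As a rough illustration of what the 500-picosecond timing quantization implies, and of how a range gate rejects background light, consider the following minimal Python sketch; the gate width and target range are made-up example values, not parameters of the actual sensor.

    # Illustrative numbers: the 500 ps quantization comes from the article,
    # but the target range and gate width below are made-up example values.
    C = 299_792_458.0          # speed of light, m/s
    T_QUANT = 500e-12          # per-pixel timing quantization, s

    # One timing bin corresponds to a round-trip range bin of c * dt / 2.
    range_bin = C * T_QUANT / 2
    print(f"range quantization ≈ {range_bin * 100:.1f} cm")   # ≈ 7.5 cm

    def in_range_gate(arrival_time_s, gate_open_s, gate_close_s):
        """Accept a photon event only while the detector is armed.

        Arming the GM APD only around the expected round-trip time rejects
        detections of random background photons outside the gate.
        """
        return gate_open_s <= arrival_time_s <= gate_close_s

    # Example: a target near 1.5 km corresponds to a ~10 microsecond round trip.
    expected_range_m = 1500.0                    # illustrative target range
    t_expected = 2 * expected_range_m / C
    print(in_range_gate(t_expected, t_expected - 0.5e-6, t_expected + 0.5e-6))  # True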

A single GM APD can be thought of as a photon-to-digital converter that produces a digital-logic-compatible voltage transition in response to a single incident photon. In this way, GM APDs eliminate many of the traditional noise sources (e.g., read noise, amplification noise) involved in photon detection with analog receivers. In the GM APD arrays developed at Lincoln Laboratory, each pixel is mated to a digital CMOS (complementary metal-oxide semiconductor) timing circuit that measures the arrival time of the photons.
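
To make the photon-to-digital idea concrete, the sketch below models an idealized pixel that either fires once and latches a quantized timestamp or records nothing; the detection efficiency is an assumed example value, and dark counts, afterpulsing, and reset dead time are ignored.

    import random

    T_QUANT = 500e-12   # timing quantization from the article, s

    def simulate_pixel(photon_arrivals_s, detection_efficiency=0.3):
        """Idealized GM APD pixel: return the quantized timestamp (in 500 ps
        counts) of the first detected photon, or None if nothing fired.

        detection_efficiency is an illustrative assumption, not a measured
        value; dark counts, afterpulsing, and dead time are ignored here.
        """
        for t in sorted(photon_arrivals_s):
            if random.random() < detection_efficiency:
                return round(t / T_QUANT)   # digital count latched at the pixel
        return None

    # With certain detection, a photon arriving 10 microseconds after the laser
    # fires latches a count of about 20,000.
    print(simulate_pixel([10.0e-6], detection_efficiency=1.0))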

One primary application of the GM APD imager is as the detector array in a three-dimensional imaging laser radar (ladar) camera. A ladar camera uses a very-short-pulse laser (a typical laser pulse width is 1 ns) to illuminate an object and a GM APD optical receiver to simultaneously image the reflected light and measure the time of flight of each detected photon for each pixel in the image. The resulting three-dimensional data (x- and y-coordinates corresponding to the pixel position in the array and a z-coordinate corresponding to its range) can be mapped, with a color spectrum representing range, to produce a three-dimensional "point cloud" depiction of the object. Such imagery is useful for looking behind partially opaque material (e.g., foliage) as well as for target identification and feature extraction.
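
The step from per-pixel times of flight to a point cloud can be sketched roughly as follows; the pinhole geometry, pixel pitch, and focal length here are illustrative assumptions standing in for the real camera optics.

    import numpy as np

    C = 299_792_458.0      # speed of light, m/s
    T_QUANT = 500e-12      # seconds per timing count

    def counts_to_point_cloud(tof_counts, pixel_pitch_m=50e-6, focal_length_m=0.1):
        """Convert per-pixel time-of-flight counts to (x, y, z) points.

        tof_counts: 2-D array of timing counts; NaN marks pixels with no
        detection. A simple small-angle pinhole model is assumed, and the
        pixel pitch and focal length are placeholder values.
        """
        rows, cols = tof_counts.shape
        jj, ii = np.meshgrid(np.arange(cols), np.arange(rows))
        rng = C * (tof_counts * T_QUANT) / 2            # one-way range per pixel, m
        ax = (jj - cols / 2) * pixel_pitch_m / focal_length_m   # angular offsets
        ay = (ii - rows / 2) * pixel_pitch_m / focal_length_m
        x, y, z = rng * ax, rng * ay, rng
        valid = ~np.isnan(rng)
        return np.column_stack([x[valid], y[valid], z[valid]])

    # Tiny 2 x 2 example frame (counts are illustrative; NaN = no detection).
    frame = np.array([[20000.0, np.nan],
                      [20010.0, 20005.0]])
    print(counts_to_point_cloud(frame))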

Example of imagery achieved by the GM APD photodetector. Foliage penetration is possible by combining multiple images from various angles. Even though an object is completely hidden from the eye (e.g., by tree foliage above a vehicle), laser radar returns can reveal information at any level (distance from the detector). This sequence of images shows the effect of cropping the three-dimensional data by successively eliminating pixels above a certain height. The first image (upper left) contains treetops that are eliminated by the fourth image (lower right) to reveal a hidden object below the tree canopy.
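
The cropping described in the caption amounts to a simple threshold on the height coordinate of the point cloud, along the lines of the sketch below (the coordinates are made up).

    import numpy as np

    def crop_above(points_xyz, max_height_m):
        """Keep only points at or below max_height_m.

        points_xyz is an (N, 3) array whose third column is height above
        ground. Lowering max_height_m successively strips away the canopy,
        as in the image sequence above.
        """
        return points_xyz[points_xyz[:, 2] <= max_height_m]

    # Illustrative cloud: a canopy point at 12 m and a vehicle roof at 2 m.
    cloud = np.array([[0.0, 0.0, 12.0],
                      [1.0, 1.0,  2.0]])
    print(crop_above(cloud, 5.0))   # only the 2 m point remains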

The second innovation that made photon-counting ladar sensors so successful was the microchip laser. These lasers proved effective in reducing camera size and weight while still providing "enough energy to get the information you want," according to Marino. Ordinarily, producing sufficient statistics requires either high intensity or, if the intensity is low, integration over long time periods. With current microchip lasers, even a relatively small transportable or space-qualified camera has plenty of intensity for photon detection. Flash ladar, which uses a single large pulse, certainly has sufficient intensity to acquire adequate data, but such lasers tend to have low pulse rates (10 to 100 pulses/s). If the sensor receiver requires a strong signal to make a detection, then speckle interference adds noise to the intensity data, and multiple pulses are usually averaged to reduce its effects, according to Marino. "Instead, we typically use lasers that pulse at a very high rate (10,000 pulses/s or greater) and operate with less than one photon per pixel per pulse on average."
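
The trade-off Marino describes (well under one photon per pixel per pulse, but thousands of pulses per second) can be illustrated with elementary Poisson statistics; the mean photon number and detection efficiency below are assumed example values, not measured ones.

    import math

    def detection_probability(mean_photons, efficiency, n_pulses):
        """Probability that a pixel fires at least once over n_pulses.

        Signal photons per pixel per pulse are modeled as Poisson with mean
        'mean_photons'; 'efficiency' is the single-photon detection
        efficiency. Both are illustrative assumptions.
        """
        p_single = 1.0 - math.exp(-efficiency * mean_photons)   # per pulse
        return 1.0 - (1.0 - p_single) ** n_pulses

    # With ~0.5 photons/pixel/pulse and 30% efficiency, one pulse rarely fires,
    # but a 10 ms dwell at 10,000 pulses/s (100 pulses) almost always does.
    print(detection_probability(0.5, 0.3, 1))     # ≈ 0.14
    print(detection_probability(0.5, 0.3, 100))   # ≈ 1.0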

"Signal-to-noise ratio (SNR) is commonly used to determine the performance or quality of a measurement, but SNR usually refers to intensity," Marino says. The SNR figure of merit isn't an obvious concept in the GM APD data. "With a GM APD, a photon is either detected or not—a zero or a one." Marino does consider that the error in range is a measurable value that can be applied as a figure of merit for the GM APD.

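One simple way to attach a number to that range-error figure of merit is the spread of repeated range measurements of the same surface, as in this sketch with made-up measurements.

    import statistics

    def range_error(range_measurements_m):
        """Standard deviation of repeated range measurements of one surface,
        used here as a stand-in for a range-error figure of merit."""
        return statistics.stdev(range_measurements_m)

    # Illustrative returns from a flat target near 1500 m, spread over a few
    # 7.5 cm timing bins.
    samples = [1500.000, 1500.075, 1499.925, 1500.150, 1500.000]
    print(f"range error ≈ {range_error(samples) * 100:.1f} cm")
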
One application that has been successfully demonstrated and utilized is an airborne three-dimensional ladar camera. This device collects multiple three-dimensional data sets from a collection of viewing angles and then registers and combines the images, as in a jigsaw puzzle. By adjusting the "threshold" of the photons' return times, the color coding of the images can be selected to remove the obscuring materials (e.g., foliage) from the image to reveal the objects of interest beneath. An illustration of this process uses data obtained from a helicopter. Multiple three-dimensional images are collected from different viewing angles, spatially registered to form a three-dimensional point cloud, and displayed with color representing relative height.
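
Conceptually, the registration step brings each view into a common frame, using that view's sensor pose or an estimated alignment, before merging; here is a minimal sketch that assumes each view's rotation and translation are already known.

    import numpy as np

    def register_and_merge(views):
        """Merge point clouds collected from different viewing angles.

        views: list of (points, R, t), where points is an (N, 3) array in the
        sensor frame and R (3 x 3), t (3,) map that frame into a common ground
        frame. In practice R and t come from navigation data or a registration
        algorithm; in this sketch they are simply given.
        """
        merged = [pts @ R.T + t for pts, R, t in views]
        return np.vstack(merged)

    # Two illustrative single-point "views" of the same spot on the ground.
    R_identity = np.eye(3)
    view_a = (np.array([[0.0, 0.0, 100.0]]), R_identity, np.array([0.0, 0.0, -100.0]))
    view_b = (np.array([[5.0, 0.0, 100.0]]), R_identity, np.array([-5.0, 0.0, -100.0]))
    print(register_and_merge([view_a, view_b]))   # both map to the origin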

Images of a van created by GM APD photodetection. These images of a Chevrolet van were obtained from a prototype three-dimensional laser sensor. In the upper left is a three-dimensional model rendered from the angle-angle-range data. The other three renditions are point clouds viewed from different aspects. Rotation of the color-coded image better reveals shapes, sizes, and relative positions of different parts of the van.

The GM APD array cameras have been used for imaging wide areas and urban environments, as well as for detecting change within an image. Future applications are wide open. With these cameras, it is possible to rapidly capture true three-dimensional data of an entire construction area, perform accurate land surveying, and create precise three-dimensional surface models of solid objects. This sensor technology in its current form could be used for detection and tracking of moving objects for border patrol, for robotic vision, and for navigation of fully autonomous air, land, sea, and underwater vehicles.

 

