Missile defense includes detecting, tracking, and neutralizing missiles. Missile defense may be especially useful for surface ships and aircraft. When surface ships operate near the shore, missile launches may pose dangers to them. Aircraft may carry sensor systems to detect objects or the movement of objects. For example, an aircraft may perform reconnaissance missions using optical systems that include telescopes to detect missile launches. These telescopes are movably mounted and frequently operated by human operators.
Additionally, ships may scan the horizon using telescopes to look for incoming missiles that may skim the surface of the water. These types of telescopes have a limited field of view. As a result, the number of telescopes needed to provide the desired coverage, and the number of operators needed to operate those telescopes, may be greater than desired. Because of their narrow fields of view, telescopes may miss launches at the moment they occur, reducing the time available to respond to a launch.
Other sensor systems may include forward-looking infrared (FLIR) sensors. These types of sensors detect heat to generate an image. These sensors may include cameras that detect infrared light, along with components such as filters, cryogenic cooling, and complex arrays of detectors. These sensors are also movably mounted because of their limited fields of view. With this increased complexity and the presence of moving parts, the maintenance needed for these sensor systems may be greater than desired.
Conventional optical imaging sensors typically operate as bulk frequency absorption devices. Many conventional optical imaging sensors employ bulk semiconductors, which absorb electromagnetic radiation across large frequency ranges and have no means to discriminate against electromagnetic radiation in specific frequency ranges. Conventional optical imaging sensors typically include a focal plane array of detectors, in which each detector is a semiconductor pixel. These pixels absorb incident light at all frequencies and convert it into equivalent electrical signals. However, the pixels have no means to maintain the wavelength selectivity in the image, and thus that information is lost.
In order for these optical imaging sensors to determine the colors they are sensing (i.e., to achieve color discrimination), their pixels are typically grouped into squares of four, and a Bayer mask is placed in front of each grouping of four pixels. A Bayer mask includes four color filters arranged in the form of a square: one red filter, one blue filter, and two green filters. Two green filters are used because the human eye is more sensitive to green than to red or blue. Each grouping of four pixels sends its sensed information to a processor, which determines the specific color detected. As such, a substantial amount of computation is required for these conventional optical imaging systems to achieve color discrimination. In addition, it should be noted that the use of the four color filters results in reduced sensitivity of the sensor, lower image resolution, and increased noise.
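The per-grouping computation described above can be sketched as follows. This is a minimal illustration only: the function name is hypothetical, and the simple averaging of the two green samples stands in for the more elaborate demosaicing algorithms that conventional processors actually apply to Bayer-masked sensor data.

```python
# Illustrative sketch: recovering one RGB color value from a 2x2 RGGB
# Bayer grouping. Assumed pixel layout: [[R, G], [G, B]].

def bayer_group_to_rgb(group):
    """Convert a 2x2 RGGB grouping of raw pixel intensities to one (R, G, B) value.

    group: [[r, g1], [g2, b]] -- raw intensities from the four filtered pixels.
    """
    (r, g1), (g2, b) = group
    # The two green samples are averaged; green is sampled twice because
    # the human eye is most sensitive to green.
    g = (g1 + g2) / 2
    return (r, g, b)

# Example: a grouping dominated by green light
print(bayer_group_to_rgb([[10, 200], [220, 15]]))  # (10, 210.0, 15)
```

Even this simplified form shows why resolution suffers: four physical pixels collapse into a single color sample, and every output value requires arithmetic in the processor.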
Currently, many of the imagers that operate in the single-photon regime are limited to image-intensifier tubes. These devices do not have wavelength selectivity. The tubes from which image intensifiers are constructed are very delicate and can easily be damaged. In addition, image-intensifier tubes require separate power supply devices.
As can be observed from the discussion above, optical imaging that can detect single photons while maintaining wavelength selectivity is very challenging to achieve. Therefore, it would be advantageous to have a method and apparatus that takes into account at least some of the issues discussed above, as well as possibly other issues.