Detection of infrared radiation emitted by warm bodies provides an important method for night vision (perception without visible illumination). Infrared detectors may be classified along several lines: scanning versus staring arrays, cryogenic (typically liquid-nitrogen temperature) versus uncooled detectors, 3-5 micron versus 8-12 micron spectral sensitivity band, and photon versus thermal detection mechanism.
Cryogenic infrared detectors are typically made of small-bandgap (about 0.1-0.2 eV) semiconductors such as HgCdTe and operate as photodiodes or photocapacitors, absorbing photons to produce electron-hole pairs. Uncooled infrared detectors, on the other hand, cannot make use of such small-bandgap semiconductors: at room temperature the bandgap is only about four times the thermal energy (4 kT), and thermally generated dark current swamps any photo-signal. Consequently, uncooled infrared detectors rely on other physical phenomena; they are less sensitive than cryogenic detectors but dispense with the cooling apparatus and its energy consumption. The preferred choice for an uncooled detector is typically a thermal detector of one of three types: (1) a pyroelectric detector, (2) a thermocouple, or (3) a bolometer.
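The 4 kT argument above can be checked with a back-of-envelope calculation. The sketch below (not part of the original disclosure; the 0.12 eV bandgap and the exp(-Eg/2kT) intrinsic-generation scaling are illustrative assumptions) shows that kT at room temperature is about 0.026 eV, so a 0.1-0.2 eV gap is indeed only a few kT, and that cooling to liquid-nitrogen temperature suppresses thermal carrier generation by orders of magnitude:

```python
import math

# Boltzmann constant in eV/K.
K_B = 8.617e-5

def thermal_energy_ev(temp_k: float) -> float:
    """kT in eV at the given absolute temperature."""
    return K_B * temp_k

kt_room = thermal_energy_ev(300.0)  # ~0.026 eV at room temperature
kt_ln2 = thermal_energy_ev(77.0)    # liquid-nitrogen temperature

print(f"kT at 300 K: {kt_room:.4f} eV; 4 kT: {4 * kt_room:.3f} eV")
print(f"kT at  77 K: {kt_ln2:.4f} eV")

# Intrinsic carrier generation scales roughly as exp(-Eg / 2kT).
# Eg = 0.12 eV is an assumed, representative value for an 8-12 micron
# HgCdTe composition, not a figure from the source.
Eg = 0.12
ratio = math.exp(-Eg / (2 * kt_room)) / math.exp(-Eg / (2 * kt_ln2))
print(f"Relative dark generation, 300 K vs 77 K: {ratio:.1e}")
```

Since 4 kT at 300 K is about 0.10 eV, the smallest practical uncooled bandgaps sit right at the quoted 0.1-0.2 eV range, which is why such materials demand cryogenic operation.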
A very good pyroelectric detector uses a ferroelectric ceramic material (such as BaSrTiO₃) at operating temperatures typically between about 0° C. and 150° C. The preferred ferroelectric materials exhibit a large change in spontaneous dielectric polarization over the operating temperature range, and heating of the ferroelectric is detected by sensing the voltage induced by the charge generated across a capacitor that uses the ferroelectric as its insulator.
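The charge-to-voltage readout described above amounts to dQ = p·A·dT across a capacitance C, giving dV = dQ/C. The sketch below illustrates this chain; every numerical value (pyroelectric coefficient, pixel area, capacitance, temperature swing) is an assumed order-of-magnitude example, not a figure from the source:

```python
def pyro_voltage(p_coeff: float, area_m2: float, d_temp_k: float,
                 cap_farads: float) -> float:
    """Open-circuit voltage induced by a temperature change d_temp_k.

    A temperature change dT shifts the spontaneous polarization by
    dP = p * dT (p = pyroelectric coefficient), appearing as charge
    dQ = p * A * dT on a capacitor of plate area A, read out as
    dV = dQ / C.
    """
    d_charge = p_coeff * area_m2 * d_temp_k  # coulombs
    return d_charge / cap_farads             # volts

# Assumed example values for a single detector element:
p = 3e-4              # C/(m^2*K), order of magnitude for BST-class ceramics
area = (50e-6) ** 2   # 50 micron x 50 micron pixel
cap = 10e-12          # 10 pF element capacitance
dT = 0.01             # 10 mK scene-induced temperature swing

v = pyro_voltage(p, area, dT, cap)
print(f"Induced voltage: {v * 1e6:.1f} uV")
```

Even a millikelvin-scale temperature swing yields a signal in the microvolt-to-millivolt range under these assumptions, which is why readout electronics rather than signal magnitude dominate pyroelectric array design.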
Prior art approaches, however, suffer from processing difficulties that limit the resolution of the arrays and the quality of the devices fabricated. Accordingly, improvements which overcome any or all of these problems are presently desirable.