Classical methods for evaluation of fingerprints, toeprints, palmprints and like skin patterns entail location, categorization and tabulation of minutiae. Efforts to adapt these classical techniques for automated print verification have received great attention and elaboration, but are fundamentally limited by their sensitivity to measurement noise at the locations of the minutiae.
Automated analysis based on minutiae is also inherently dependent on image enhancement--which often breaks down when initial data quality is marginal. For these reasons some workers have explored other methodologies.
Some seemingly promising efforts employ holograms--either direct three-dimensional images of prints, or holographic Fourier transforms (which have the advantage of being position invariant). Some of these techniques, for best results, impose costly demands on special memory devices for storing the holograms. These holographic correlators are in essence modern refinements of much earlier two-dimensional direct-optical-overlay correlators such as that described by Green and Halasz in U.S. Pat. No. 3,928,842.
An intermediate ground is represented by a few relatively sophisticated patents that use digital computers to (1) automatically select one or more distinctive small regions--not necessarily minutiae--in a master print or "template", and then (2) automatically look for one or more of these selected small regions in a print provided by a person who purports to be the maker of the template. These earlier patents particularly include U.S. Pat. Nos. 5,067,162 of Driscoll, 5,040,223 of Kamiya, 4,982,439 of Castelaz, 4,805,223 of Denyer, and 4,803,734 of Onishi.
All of these latter patents describe making final verification decisions based upon such comparisons of small regions. In this they are unavoidably flawed by their excessive dependence upon isolated, small amounts of data--more specifically, very small fractions of the available information in a candidate user's print.
Some of the patents in the above list do describe sound techniques for one or another part of their respective processes. Some workers, such as Driscoll and Kamiya, use correlation methods (electronic-data correlation methods, not optical correlation methods) to choose the small reference sections in the enrollment process--i.e., in forming the template--and also in comparing those regions with features in a candidate user's print. Denyer similarly uses an approximation to such a correlation technique.
These patents do generally allow for the possibility that the authorized user's template may be shifted, or in other words translated, in placement of the print image on the sensor. Some (particularly Driscoll and Denyer) allow for the possibility that the template may be rotated too.
Driscoll discusses finding a least-squares fit between plural reference regions and a potentially corresponding plurality of test regions in the candidate print. He suggests that departures from an ideal rotated pattern of the reference regions are to be accounted for by distortion of the fingertip in the course of placement on a sensor, but by his least-squares approach also suggests that such distortion is inherently "random" in the sense of lacking internal correlation. While distortions of flesh-and-skin structures are indeed random in the sense of being modeled or modelable statistically, proper efforts at such modeling must take into account that neighboring portions of the structure exert influences upon one another, resulting in physical correlations. In short, neighbors are softly constrained.
Driscoll's approach, in using a least-squares fit--to accommodate departures from a rigid rotation that underlies the distortion--in essence disregards such correlations; at best, he only considers a small part of the operative statistics. Denyer, too, briefly mentions (though in a much more generalized and tangential way) the possibility of somehow accounting for distortion.
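The kind of fit at issue may be sketched as follows. This is a minimal illustration of a least-squares rigid (rotation-plus-translation) fit of test-region centers to reference-region centers, not Driscoll's actual algorithm; the function name and four-point layout are hypothetical. Note that the residual sums squared errors as if each region's displacement were independent--precisely the assumption criticized above.

```python
import math

def rigid_fit(ref, test):
    """Least-squares rigid fit (rotation + translation) of test-region
    centers to reference-region centers.  The residual treats each
    region's displacement as independent noise, ignoring the spatial
    correlation of real skin distortion."""
    n = len(ref)
    cx_r = sum(x for x, y in ref) / n
    cy_r = sum(y for x, y in ref) / n
    cx_t = sum(x for x, y in test) / n
    cy_t = sum(y for x, y in test) / n
    s_cos = s_sin = 0.0
    for (xr, yr), (xt, yt) in zip(ref, test):
        xr -= cx_r; yr -= cy_r; xt -= cx_t; yt -= cy_t
        s_cos += xr * xt + yr * yt          # dot products
        s_sin += xr * yt - yr * xt          # cross products
    theta = math.atan2(s_sin, s_cos)        # best-fit rotation angle
    c, s = math.cos(theta), math.sin(theta)
    resid = 0.0
    for (xr, yr), (xt, yt) in zip(ref, test):
        xr -= cx_r; yr -= cy_r
        px, py = c * xr - s * yr, s * xr + c * yr
        resid += (px - (xt - cx_t)) ** 2 + (py - (yt - cy_t)) ** 2
    return theta, resid
```

A correlation-aware treatment would instead weight the joint residual by the inverse covariance of neighboring displacements, rather than summing independent squared errors.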
All of these patents, however, fail to take account of dilations (or, to put it more completely, dilations or contractions) which an authorized user's fingertip may undergo--relative to the same user's established template. Such dilations may arise from variations in the pressure with which the finger is applied to an optical or other sensor (capacitive, variable-resistance, etc.).
Such dilations may be expected to have at least a component which is invariant across the entire image, in other words a dilation without change of fingerprint shape--an isomorphic dilation. Furthermore all the above-mentioned patents fail to make systematic, controlled allowance for dilations and other forms of distortion that are differential--which is to say, nonisomorphic.
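The distinction between the two forms of distortion can be made concrete. In this hypothetical sketch, an isomorphic dilation is a uniform coordinate scaling, while a differential (nonisomorphic) distortion applies a position-dependent scale factor supplied by the caller; both function names are illustrative only.

```python
def isomorphic_dilation(points, k):
    """Uniform, shape-preserving dilation about the origin by factor k."""
    return [(k * x, k * y) for x, y in points]

def differential_dilation(points, k_fn):
    """Nonisomorphic distortion: the scale factor varies with position,
    given by a caller-supplied function k_fn(x, y) (hypothetical model)."""
    return [(k_fn(x, y) * x, k_fn(x, y) * y) for x, y in points]
```

A verification scheme that allows only for rigid shifts and rotations can model neither case; one that allows a single global scale handles the first but not the second.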
Correlation methods, matched-filter methods, and (loosely speaking) related overlay-style techniques of comparison all fail totally in any area where a reference print is mismatched to a candidate print by as little as a quarter of the spacing between ridges. I have found that dilations and other distortions can and commonly do produce spurious mismatches locally--over sizable areas--exceeding twice the spacing between ridges, that is, many times the minimum disruption which destroys correlation and thereby recognition.
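The collapse of correlation at a quarter-ridge mismatch follows directly if the ridge profile is modeled, purely for illustration, as a sinusoid whose period equals the ridge spacing: the normalized correlation of the profile with a shifted copy of itself is the cosine of the phase shift, and falls to zero at a quarter period.

```python
import math

def ridge_correlation(shift, period, n=1000):
    """Normalized correlation between a sinusoidal ridge profile and a
    copy of itself shifted by `shift`, sampled over four full ridge
    periods (illustrative model; real ridge profiles are not pure
    sinusoids, but the quarter-period collapse is the same)."""
    xs = [i * period / n for i in range(4 * n)]
    a = [math.sin(2.0 * math.pi * x / period) for x in xs]
    b = [math.sin(2.0 * math.pi * (x + shift) / period) for x in xs]
    num = sum(p * q for p, q in zip(a, b))
    den = math.sqrt(sum(p * p for p in a) * sum(q * q for q in b))
    return num / den
```

At zero shift the correlation is unity; at a quarter of the ridge spacing it vanishes; at half the spacing it is fully inverted--so a two-ridge-spacing local mismatch, as described above, sweeps through total decorrelation many times over.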
Therefore, failure to account properly for either dilation (isomorphic distortion) or differential (nonisomorphic) distortion results in unacceptably high rates of failure to verify or recognize an authorized user--i.e., high rates of the so-called "false rejection" or "type 1 error". Artificial measures aimed at reducing this failure rate lead inevitably to the converse: unacceptably high rates of failure to reject unauthorized users, or impostors--i.e., high rates of the so-called "false acceptance" or "type 2 error".
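The tradeoff between the two error types can be seen numerically under a simple assumed model: genuine and impostor match scores each Gaussian, with acceptance when the score clears a threshold. The means and variance here are arbitrary illustrative values, not measured fingerprint statistics.

```python
import math

def error_rates(threshold, genuine_mean=3.0, impostor_mean=0.0, sigma=1.0):
    """Type 1 (false-rejection) and type 2 (false-acceptance) rates for
    a hypothetical match-score model: genuine and impostor scores are
    each Gaussian with unit variance; accept when score >= threshold."""
    def cdf(x, mu):
        # Gaussian cumulative distribution via the error function
        return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))
    frr = cdf(threshold, genuine_mean)          # genuine scores rejected
    far = 1.0 - cdf(threshold, impostor_mean)   # impostor scores accepted
    return frr, far
```

Lowering the threshold (the "artificial measure" mentioned above) necessarily reduces the false-rejection rate while raising the false-acceptance rate; only a better score model moves both down together.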
Merely allowing for some distortion, in a statistically uncontrolled way, can never cure this fundamental failing. Skin and flesh distortion does not affect prints in an uncorrelated way, but rather in partially systematic ways that arise from the physical character of skin and flesh. I believe that failure to account properly for distortion is the single greatest contributor to poor performance of fingerprint verifying systems heretofore.
Furthermore variations in habits of placement of a fingertip on a sensor tend to be somewhat systematic. These systematic properties of the print-forming process have their own statistically characteristic patterns--their own statistics.
In the context of any given comparison method, these special statistics exert particular characteristic effects on the results. All the patents mentioned above appear to ignore these statistics, in the process discarding very important information that bears strongly on verification decisions.
In addition, the patents listed above fail to make use of modern principles of decision theory and signal processing that have been used to great advantage in other fields. Driscoll, for instance, while discussing the final stages of his analysis in terms reminiscent of the established Neyman-Pearson analysis, does not appear to properly apply the principles of that analysis. Such principles have been importantly applied in industrial, military, and scientific pattern-recognition problems, but workers in the practical fingerprint field do not appear to be aware of these principles or in any event are not evidently using them.
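The Neyman-Pearson principle referred to here prescribes a specific decision form: accept when the likelihood ratio of the observed data under the "genuine" hypothesis versus the "impostor" hypothesis exceeds a threshold set by the tolerable false-acceptance rate. The following is an illustrative sketch only; the function and its arguments are hypothetical, not any patent's method.

```python
import math

def likelihood_ratio_accept(scores, genuine_pdf, impostor_pdf, threshold):
    """Neyman-Pearson style decision rule: accept iff
    P(data | genuine) / P(data | impostor) > threshold.
    Works in log space for numerical stability; assumes the per-region
    scores are conditionally independent (a simplifying assumption)."""
    log_lr = sum(math.log(genuine_pdf(s)) - math.log(impostor_pdf(s))
                 for s in scores)
    return log_lr > math.log(threshold)
```

The essential point is that the decision uses the full likelihood of all the data under both hypotheses, rather than an ad hoc count of matched regions.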
Similarly none of the patents noted makes use of decisional downweighting of data from areas that are less certain or noisier; rather, to the extent that any consideration at all is given to such matters, noisy data are simply discarded--a very undesirable way to treat expensive data. Bandpassing of test data is not seen in these references, although certain other forms of filtering are used by Driscoll and others. Normalizing is likewise absent--except for trivial forms implicit in binarization or trinarization, used in many print analyzers. None of the noted patents teaches expression of test and template data, or comparison of such data with one another, in terms of local sinusoids.
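The downweighting referred to above, as opposed to outright discarding of noisy data, can be illustrated by inverse-variance weighting of per-region match scores. This is a minimal sketch under assumed inputs, not a prescription for any particular system.

```python
def weighted_score(region_scores, noise_vars):
    """Decisional downweighting sketch: rather than discarding noisy
    regions, weight each region's match score by the inverse of its
    estimated noise variance, so uncertain data contribute less but
    are not thrown away."""
    weights = [1.0 / v for v in noise_vars]
    total = sum(weights)
    return sum(w * s for w, s in zip(weights, region_scores)) / total
```

With equal variances this reduces to a plain average; a region a hundred times noisier than its neighbor contributes about one percent of the neighbor's weight instead of being dropped entirely.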
Thus the skin-pattern verification field has failed to make good use of all available data, to take effective account of dilations or distortions, to make suitable allowance for known statistics of placement variation, and to apply modern decisional and signal-processing tools. As can now be seen, the prior art in this field remains subject to significant problems, and the efforts outlined above--although praiseworthy--have left room for considerable improvement.