Image calibration with white referencing is an important step in scientific imaging. The goal of the calibration is to eliminate the effect of uneven lighting conditions. Typically, shortly before or after the raw image of the target object is taken, a white tile is placed at the same position as the object and imaged as well. For each pixel, the value in the raw image is divided by the corresponding value in the white reference image:

Corrected Image from the camera = Image of sample / Image of white tile
Sometimes a dark reading image is also taken, by closing the aperture or simply putting on the lens cover, in which case:

Corrected Image from the camera = (Image of sample − Dark reading image) / (Image of white tile − Dark reading image)
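As a minimal sketch, the flat-field correction above (with an optional dark frame) can be expressed with NumPy. The function name `calibrate` and the small epsilon guard against zero denominators are illustrative choices, not part of the original method:

```python
import numpy as np

def calibrate(sample, white, dark=None, eps=1e-12):
    """Flat-field calibration with an optional dark frame.

    sample, white, dark: arrays of identical shape (raw pixel values).
    Returns (sample - dark) / (white - dark), elementwise per pixel.
    """
    sample = np.asarray(sample, dtype=float)
    white = np.asarray(white, dtype=float)
    dark = np.zeros_like(sample) if dark is None else np.asarray(dark, dtype=float)
    denom = white - dark
    # Guard against division by zero where the white reading equals the dark reading.
    denom = np.where(np.abs(denom) < eps, eps, denom)
    return (sample - dark) / denom
```

With no dark frame supplied, this reduces to the simple ratio of sample to white tile; a pixel reading 4.0 against a white reading of 8.0 yields a corrected value of 0.5.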
The above method works well for flat objects, but when the object has a complicated 3D shape it has serious problems because: 1) the object surface may lie at different depths from the camera, where the lighting intensity can differ substantially from that at the location of the flat white tile; and 2) the object surface may be tilted at many different angles, which severely changes the reflectance, not only in intensity but also in color. Take plant leaf reflectance for example: the PROSAIL model (http://teledetection.ipgp.jussieu.fr/prosail/) shows that different leaf angles completely change the reflectance spectra, which can cause up to a 300% change in the calculation of color indices such as NDVI, as shown in FIG. 1.
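To make the sensitivity concrete, NDVI is defined as (NIR − Red) / (NIR + Red), so even a moderate angle-induced shift in band reflectance moves the index considerably. The reflectance values below are hypothetical illustrations, not PROSAIL outputs:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red)

# Hypothetical reflectances: the same leaf material, with the red-band
# reflectance shifted by viewing/illumination angle.
flat_leaf = ndvi(0.50, 0.05)    # approx. 0.82
tilted_leaf = ndvi(0.50, 0.20)  # approx. 0.43
```

Here the tilted case roughly halves the computed NDVI, which is the kind of distortion the flat-tile calibration cannot correct.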
The problem may be solved by replacing the flat white tile with a 3D white reference. The 3D white reference should have exactly the same size and 3D shape as the target object. Since each target object is different, one solution is to place a 3D scanner and a 3D printer on site. Every time a new object arrives, it is scanned by the 3D scanner, and the scan is sent immediately to the 3D printer to produce the 3D white reference. This 3D white reference is then imaged to obtain the white reference image, which is used to calibrate the raw image of the object. Preliminary experimental data confirm the improved calibration quality of the 3D reference compared with the 2D flat reference. FIG. 2 shows the spectra of multiple points on a 3D object calibrated with the 3D and flat references separately. The object is made of uniform material and color, so any difference in spectra between the points can only come from lighting and angle variation. As the figure shows, the 3D reference performed far better than the flat reference. In addition, plant leaves rotated to different angles were also imaged. The results in FIG. 3 show a similar improvement in calibration quality for the 3D (sloped) reference compared with the flat reference.
However, producing a 3D white reference for each object is expensive and impractical, incurring increased processing time and resources. Therefore, improvements are needed in the field.