Reducing the use of pesticides for weed and plant control has become an issue of national importance. Ground water is vitally important, and the use of herbicides to prevent weeds from growing in homeowner and commercial lawns adversely impacts its quality. Most herbicides are persistent and soluble in water, and ingestion at toxic levels can be carcinogenic, can affect the human nervous system, and can cause endocrine disruption.
Ninety-five percent of the fresh water on earth is ground water. Ground water is found in natural rock formations called aquifers and is a vital natural resource with many uses. Over 50% of the U.S. population relies on ground water as a source of drinking water, especially in rural areas.
Recent decades have seen the rise of intelligent systems, especially large agricultural systems run by controllers or computers that detect weeds, surface and soil defects, pests and the like. These systems provide for selectively targeted, judicious use of pesticides in managing weeds and other undesirable or predatory plants on fields, lawns and grasslands. This helps avoid runoff from rain or melting snow, and the poisoning of animals through direct exposure.
Known commonly used spray treatments include Spectracide® (Spectrum Brands™, Middleton, Wis., USA), a non-selective, fast-acting liquid pesticide that kills all plants, ideal for use with a selective plant treatment system as illustratively shown in this disclosure. Another known spray treatment is 2,4-Dichlorophenoxyacetic acid, an organic compound which kills most broadleaf weeds by inducing uncontrolled growth in them but spares most grasses on lawn, turf and crop fields.
There is also a tremendous economic incentive to reduce the use of pesticides. Large land areas under treatment consume large volumes of liquid or other treatments, a major expense and a labor-intensive distribution task.
Now referring to FIG. 1, a schematic representation of a general electromagnetic spectrum is shown for wavelengths of radiation of significance that are potentially incident upon a plant, with wavelengths ranging from 1 mm to less than 100 nm. In the infrared, or heat radiation, portion of the electromagnetic spectrum, the near-infrared, or near-IR as it is commonly known, ranges in wavelength from 700 nm to 3 microns. Visible light is generally taken to range approximately from 700 nm to 400 nm. Ultraviolet radiation is generally taken to be of wavelength less than 400 nm, with near-ultraviolet further divided into known portions UV-A (400-320 nm), UV-B (320-280 nm) and UV-C (280-100 nm). UV-C radiation is extremely dangerous for humans and is often used as a germicidal radiation to purify water and to kill bacteria, viruses, and other organisms.
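By way of illustration only, the band boundaries recited above may be expressed as a simple classification routine. The following Python sketch is not part of the invention; the function name and the treatment of boundary wavelengths are illustrative assumptions.

```python
# Illustrative sketch: classify a wavelength (nm) into the spectral
# bands described above with reference to FIG. 1. Band edges follow
# the ranges given in the text; boundary handling is an assumption.

def classify_band(wavelength_nm: float) -> str:
    """Return the spectral band for a wavelength given in nanometers."""
    if 700 < wavelength_nm <= 3000:      # 700 nm to 3 microns
        return "near-IR"
    if 400 <= wavelength_nm <= 700:      # visible light
        return "visible"
    if 320 <= wavelength_nm < 400:       # near-ultraviolet, UV-A
        return "UV-A"
    if 280 <= wavelength_nm < 320:       # UV-B
        return "UV-B"
    if 100 <= wavelength_nm < 280:       # UV-C (germicidal)
        return "UV-C"
    return "out of range"
```

For example, `classify_band(550)` returns `"visible"`, while `classify_band(254)`, a common germicidal wavelength, returns `"UV-C"`.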
Photosynthesis in plants makes use of visible light, especially blue and red visible light, and ultraviolet light, to varying degrees, depending on a host of factors including plant species and type, radiation exposure history and other factors. Approximately seven percent of the electromagnetic radiation emitted from the sun is in a UV range of about 200-400 nm wavelengths. Nearly all plants have low absorbance of the green portion of the visible spectrum and exhibit strong characteristic green reflectance. Although some plants can be characterized via spectral analysis of light reflected off leaves, stems, etc., this type of detection requires special arrangements, calibrations, processing capacity and software, and is prone to error and non-operation due to local conditions, plant variations, moisture and soil on leaves, and recent sun exposure history, which can change the tone or color of leaves and plant parts as they adapt to irradiation, often for protection against internal scalding.
Specifically, known leaf reflectance and transmittance spectra depend on light absorption by leaf pigments and on reflectance/transmittance from multiple scattering of light within leaves as a function of refractive index and leaf anatomical structure. As known in the art, leaf reflectance varies with four basic biophysical properties: internal leaf anatomy, chlorophyll concentration, water content and dry matter concentration. Reflectance for plant leaves across the UV through IR range (330-2500 nm) demonstrates four different reflectance patterns: 1) low reflectance at 330-450 nm and at 680 nm, with a small peak at 550 nm (green edge); 2) a peak between 680-750 nm (red edge); 3) a plateau at 780-1300 nm (the near-infrared plateau); and 4) decreased reflectance at 1300-2500 nm. Reflectance patterns of plant pigments show peaks for chlorophyll at 550 and 700 nm; for anthocyanins at 550 nm; and for carotenoids at 510-520 nm.
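The pigment peaks recited above may, purely for illustration, be organized as a lookup against which an observed reflectance peak is matched. The following Python sketch is not part of the invention; the tolerance value and function names are assumptions for demonstration only.

```python
# Illustrative sketch: match an observed reflectance peak (nm) to the
# pigment peaks listed above. The tolerance is an assumed value.

PIGMENT_PEAKS_NM = {
    "chlorophyll": (550, 700),
    "anthocyanin": (550,),
    "carotenoid": (510, 520),
}

def candidate_pigments(peak_nm: float, tolerance_nm: float = 15.0):
    """Return pigments with a characteristic peak within tolerance."""
    matches = []
    for pigment, peaks in PIGMENT_PEAKS_NM.items():
        if any(abs(peak_nm - p) <= tolerance_nm for p in peaks):
            matches.append(pigment)
    return matches
```

For example, an observed peak near 700 nm matches only chlorophyll, while a peak at 550 nm is ambiguous between chlorophyll and anthocyanins, illustrating why pigment peaks alone do not uniquely identify a plant.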
Also, field leaf reflectance may vary with environmental parameters such as soil type, light conditions, irregular terrain, and maintenance inputs (fertilizer, watering, etc.), as well as plant variables such as irregular/dense sowing patterns, different plant species, growth stages, leaf moisture, and similar coloring of crop and weeds.
Now referring to FIG. 2, a target plant and grass on a field are shown illustratively. Shown are a field on which the instant invention operates (FIELD, 01) and features arising from this field: grass Gr, and an aberration, undesired entity, pest, defect or weed, shown merely for illustrative purposes as a target plant TP.
In the use of field images to detect target plants (as defined below), the use of color is complex and critical. The present teachings allow selection of target plants from among a large number of plants in a field using analysis of image data.
The analysis and processing of color information, or of any information gathered from a color camera, must operate in a color space, almost always a three-dimensional, mathematically defined color space in which each available color, of each hue, saturation and brightness, is specified using three coordinates x, y, and z. Machine vision necessarily uses what is called a rendered color space, one formed or rendered for a specific device or devices, as explained further below.
Human vision itself can also be characterized using a color space which is not rendered for a specific device, examples being the known unrendered opponent color spaces such as the CIE L*u*v* (CIELUV) and CIE L*a*b* (CIELAB) systems. In 1931, the CIE (International Commission on Illumination) established a foundation for all color management and reproduction, the result being a chromaticity diagram that uses three coordinates, x, y, and z. A plot of this three-dimensional system at maximum luminosity is universally used to describe color in terms of x and y, and this plot, famously called the 1931 chromaticity diagram, is believed to be able to describe all color perceived by humans.
FIG. 3 shows chromaticity coordinates on the standard, well-known prior art cartesian 1931 CIE x-y chromaticity diagram or color map. This unrendered color map displays all known or perceivable colors at maximum luminosity as a function of chromaticity coordinates x and y, annotated with light wavelengths in nanometers, with regions corresponding to red (R), green (G) and blue (B). A selected illustrative color gamut, or allowable range of colors, for a color model is shown as a hatched triangle in the figure.
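Purely as an illustrative sketch, and not as part of the invention, a camera pixel may be placed on the 1931 chromaticity diagram and tested against a triangular gamut such as the hatched triangle of FIG. 3. The example assumes the published sRGB primaries and the standard linear sRGB-to-XYZ matrix (D65 white point) of IEC 61966-2-1.

```python
# Illustrative sketch: linear RGB -> CIE 1931 (x, y) chromaticity, and
# a point-in-triangle gamut test. Matrix coefficients and primaries
# are the published sRGB (IEC 61966-2-1, D65) values.

def rgb_to_xy(r: float, g: float, b: float):
    """Linear RGB (0..1) -> CIE 1931 chromaticity coordinates (x, y)."""
    X = 0.4124 * r + 0.3576 * g + 0.1805 * b
    Y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    Z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    s = X + Y + Z
    return (X / s, Y / s)

def in_gamut(p, a, b, c):
    """True if chromaticity point p lies inside triangle a-b-c."""
    def cross(o, u, v):
        return (u[0] - o[0]) * (v[1] - o[1]) - (u[1] - o[1]) * (v[0] - o[0])
    d1, d2, d3 = cross(a, b, p), cross(b, c, p), cross(c, a, p)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)

# sRGB gamut triangle vertices on the 1931 chromaticity diagram
SRGB_R, SRGB_G, SRGB_B = (0.64, 0.33), (0.30, 0.60), (0.15, 0.06)
```

For example, equal-energy white, rgb_to_xy(1.0, 1.0, 1.0), lands near the D65 white point (0.3127, 0.3290) and inside the sRGB triangle, while a highly saturated chromaticity such as (0.7, 0.3) falls outside it.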
Color models draw very specifically from human vision, which uses an enormously complex sensory and neural apparatus to produce sensations of color and light effects, and to allow distinguishing perhaps 10 million distinct colors. In the human eye, for color-receiving or photopic vision, there are three sets of approximately 2 million sensory bodies called cones, whose absorption distributions peak at 445, 535, and 565 nm light wavelengths with a great deal of overlap. These three cone types form what is called a tristimulus system and are called B (blue), G (green), and R (red) for historical reasons; the peaks do not necessarily correspond with those of any primary colors used in a display, e.g., commonly used RGB phosphors, or in any color model. The human retina also contains bodies called rods, responsible for scotopic or so-called night vision, which under many conditions also influence color vision.
The sets of cones in the human retina having peak absorptions and sensitivities at 535 and 565 nm coincide well with the characteristic green reflectivity of nearly all plants, and the luminosity function of human vision, which gives the human sensitivity and perceived brightness for a given radiometric input of light, peaks at 555 nm.
This means that green is special spectral territory that must be managed intelligently if any target plant detection and treatment system is to operate successfully while avoiding a high processing load or overhead.
Much of the prior art utilizes radiometric, physical methods in image analysis, which introduces difficulties because detection of target plants is subtle, and happens to be something humans perform well and easily compared to physical methods such as spectral analysis.
Concerning rendered spaces, the provisions for compatibility with device color models inherent to a tristimulus color model provide for fast processing of rendered field images taken from a field, even though video or display reproduction is not specifically required or sought in practicing the instant invention. It is worth noting that color reproduction influences color models, and that color reproduction can take many forms depending on the main objectives sought and the solution devised to meet those objectives.
Specifically, reproduction paradigms include, for example, colorimetric color reproduction, which provides a useful alternative where tristimulus values are proportional to those in the original scene. Chromaticity coordinates are reproduced exactly, but with proportionally reduced luminances. Colorimetric color reproduction is considered a reference standard for video systems.
Most video reproduction in practice attempts to achieve corresponding color reproduction, where colors reproduced have the same appearance that colors in the original would have had if they had been illuminated to produce the same average luminance level and the same reference white chromaticity as that of the reproduction.
Historically, most color reproduction encoding uses standard RGB color spaces, such as sRGB, ROMM RGB, Adobe RGB 98, Apple RGB, and video RGB spaces such as that used in the NTSC standard. Typically, an image is captured into a sensor or source device space, which is device and image specific. It may be transformed into an unrendered image space, which is a standard color space describing the original colorimetry.
However, video images are nearly always directly transformed from a source device space into a rendered image space (see Definitions section), which describes the color space of some real or virtual output device such as a video display. Most existing standard RGB color spaces are rendered image spaces. For example, source and output spaces created by cameras and scanners are not CIE-based color spaces (such as shown in the 1931 CIE chromaticity diagram) but spectral spaces defined by spectral sensitivities and other characteristics of the camera or scanner.
Rendered image spaces are device-specific color spaces based on the colorimetry of real or virtual device characteristics. In most reproduction applications, images are converted into a rendered color space for archiving, data transfer, or analysis, including video image signals.
By using data generated and formed using a rendered tristimulus color model, human sensory experience is already factored into the detection of target plants as defined below, and as described in this specification.
Helpful information about video and television engineering, compression technologies, data transfer and encoding, human vision, certain information about color science and perception, color spaces, colorimetry and image rendering, including video reproduction, can be found in the following references, which are hereby incorporated herein in their entirety: ref[1] Color Perception, Alan R. Robertson, Physics Today, December 1992, Vol. 45, No. 12, pp. 24-29; ref[2] The Physics and Chemistry of Color, 2nd ed., Kurt Nassau, John Wiley & Sons, Inc., New York, ©2001; ref[3] Principles of Color Technology, 3rd ed., Roy S. Berns, John Wiley & Sons, Inc., New York, ©2000; ref[4] Standard Handbook of Video and Television Engineering, 4th ed., Jerry Whitaker and K. Blair Benson, McGraw-Hill, New York, ©2003.
In the prior art, detecting target plants, as opposed to merely detecting a plant, is considered difficult without using very high processing power that analyzes on a sophisticated level, such as by comparing plant images to images stored in a database.
In identifying species of plants in an agricultural field, US Patent Application Publication 2005/0122513 to Masten teaches a low-cost, high-speed sensing method that gathers strictly radiometric spectral data using a portable spectrometer, obtains a spectral distribution of reflected wavelengths, and wirelessly transmits that information to a remotely located analyzer. This is cumbersome, involves expensive components to form a spectrometer, and increases processing requirements. The reliance on spectral data makes the method vulnerable to local plant reflectance characteristics and local conditions such as rain drops, invites high error rates, and requires extensive calibrations.
The Weed Seeker® automatic spot spray system of Trimble, Inc. (Sunnyvale, Calif., USA) seeks green plants for reductive or eradication spraying, and uses expensive, sophisticated processing. It does not perform analysis of green plants of the same approximate size to determine if a plant is a target plant.
US Patent Publication 20150309496 to KAH III of K-RAIN Manufacturing Corporation teaches use of a database of reference data for comparison, which introduces a high computational overhead, as image comparison algorithms must perform complex calculations and pass through logic trees after what are often pixel-by-pixel comparisons.
U.S. Pat. No. 6,795,568 to Christensen et al. of TORSANA LASER TECHNOLOGIES teaches using “means for analyzing the image data and for determining whether the plant is one of a number of predetermined plants or types of plants, and, from that determination, whether the plant is to be severed or damaged,” and this involves complex comparison with plant attributes contained in a storage unit (see, for example, Christensen '568, column 5, line 40), with attendant large computational overhead and the need for calibrations to ensure low false positives and low false negatives.
US Patent Publication 20040034459 to Hoelscher et al. of Syngenta® Corporation requires a computing means capable of reading and processing data from a computer-readable medium having stored thereon data relating to a standardized functional relationship between the amounts of individual pesticides and the biological effects achievable with said amounts on said plants. This also introduces large processing overhead and the need for uninterrupted communication with a database.
U.S. Pat. No. 5,924,239 to Rees performs elementary spectral analysis, comparing aggregates of field image pixels and looking for “green.” It does not determine which plant, among other plants, is a target plant such as a weed.
Plants have varied reflectances and absorptions across this spectrum and many complex attributes. The instant invention seeks to use a rendered tristimulus color model and to perform processing steps on data originating from the middle of the visible spectrum, using low processing power to emulate larger, more complex systems.