This application relates to a high-definition radio frequency radar system for an autonomous vehicle. This application also relates to a method of data association for object detection for use in controlling autonomous vehicles.
Autonomous vehicles must provide solutions to three fundamental problems: “Where am I? What is around me? What do I do next?” The first question, also known as localization, requires the autonomous vehicle to determine its location with respect to a known map. Localization ranges from coarse determination of the autonomous vehicle's street address to a more accurate self-position relative to a road's centerline, travel lane boundaries, crosswalks, sidewalks, road signs, and other road features. The second question concerns situational awareness: determining which nearby objects, such as other vehicles, humans, and animals, may interact or collide with the autonomous vehicle. The last question is how the autonomous vehicle should navigate, given a pre-planned destination, in light of the answers to localization and situational awareness.
Localization typically starts with coarse position estimates from GPS/INS systems. Position accuracy is improved by simultaneous localization and mapping (SLAM) using road and other features obtained from optical sensors such as cameras and Lidar. The 1D, 2D, and 3D optical sensors derive localization cues from paint marks, travel lanes, road signs, curb edges, lamp posts, building boundaries, etc., cross-correlated against pre-stored maps. The state-of-the-art integrates GPS/INS, 3D optical data from Lidar, and camera-derived road features with a Kalman tracking filter to improve localization accuracy and reduce position variance.
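The Kalman-filter integration described above can be illustrated with a minimal one-dimensional sketch. This is not the system claimed here; it is a generic constant-velocity Kalman filter with illustrative noise values, showing how noisy position fixes (e.g. GPS) are smoothed into a lower-variance position and velocity estimate:

```python
import numpy as np

def kalman_1d(zs, dt=1.0, q=0.1, r=4.0):
    """Track [position, velocity] from noisy position fixes zs.
    q, r are assumed process/measurement noise levels (illustrative)."""
    x = np.array([zs[0], 0.0])               # state: [position, velocity]
    P = np.eye(2) * 10.0                     # initial state covariance
    F = np.array([[1.0, dt], [0.0, 1.0]])    # constant-velocity transition
    H = np.array([[1.0, 0.0]])               # we observe position only
    Q = q * np.eye(2)                        # process noise
    R = np.array([[r]])                      # measurement noise
    estimates = []
    for z in zs[1:]:
        # predict step
        x = F @ x
        P = F @ P @ F.T + Q
        # update step with the new position fix
        y = z - H @ x                        # innovation
        S = H @ P @ H.T + R                  # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
        x = x + (K @ y).ravel()
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x.copy())
    return np.array(estimates)

# Hypothetical scenario: vehicle moving at 2 m/s, fixes with 2 m noise.
rng = np.random.default_rng(0)
true_pos = 2.0 * np.arange(50)
fixes = true_pos + rng.normal(0.0, 2.0, size=50)
est = kalman_1d(fixes)
print(est[-1])  # final [position, velocity] estimate, near [98, 2]
```

A production localizer fuses many more states (3D pose, sensor biases) and measurement sources, but the predict/update structure is the same.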
Situational awareness requires position and velocity estimation of objects, including other vehicles and people, in the vicinity of the autonomous vehicle's planned path. Conventionally, velocity estimates are obtained from coarse-angle-resolution radar and from multiple observations of a slowly moving object by stereoscopic cameras, Lidars, and/or ultrasonic sensors.
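The multiple-observation approach mentioned above can be sketched simply. Assuming (hypothetically) an object whose position is reported at successive sensor frames, a least-squares line fit of position against time yields a constant-velocity estimate:

```python
import numpy as np

def estimate_velocity(times, positions):
    """Least-squares slope of position vs. time = velocity estimate (m/s),
    under a constant-velocity assumption."""
    t = np.asarray(times, dtype=float)
    p = np.asarray(positions, dtype=float)
    A = np.vstack([t, np.ones_like(t)]).T    # fit p = v*t + p0
    (v, p0), *_ = np.linalg.lstsq(A, p, rcond=None)
    return v

# Hypothetical object observed at 10 Hz, moving from 5.0 m to 6.2 m:
v = estimate_velocity([0.0, 0.1, 0.2, 0.3, 0.4],
                      [5.0, 5.3, 5.6, 5.9, 6.2])
print(v)  # 3.0 m/s
```

This illustrates why such estimates are slow to obtain for optical sensors: several frames must elapse before a reliable slope emerges, whereas a Doppler-capable radar measures radial velocity in a single observation.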
The state-of-the-art localization and lane keeping functions fundamentally rely on road attributes and other features obtained from optical sensors. Optical sensors are compromised, and can become useless, in foul weather and other adverse conditions. The loss of optical sensor data leaves autonomous vehicle systems unable to navigate safely under those conditions.
Generally, there are two classes of adverse conditions that plague optical sensors: air-borne and surface visual obscurants. Air-borne visual obscurants include snow and rain precipitation, fog, smoke, smog, pollen, dust, and/or other air-borne obscurants. Further, even when the air path is visually clear, localization can be compromised when only surface visual obscurants are present, including wet or flooded road surfaces and road and ground covers of accumulated snow, sand, dust, leaves, pollen, and/or other surface obscurants.
Another problem with state-of-the-art autonomous vehicle localization occurs under GPS-compromised conditions combined with foul weather and/or other adverse conditions. Practical low-cost GPS/INS systems incorporating 3D accelerometers and wheel encoders are able to coast only through brief GPS outages before requiring GPS to reset accumulated biases. Urban canyons, with their long GPS shadows and multipath opportunities, result in mean and variance of position errors frequently exceeding 10 feet. One approach to overcome GPS outages is to integrate GPS/INS state information with a priori 3D visual maps and optical sensors such as cameras and Lidar to maintain localization.
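The reason a low-cost INS can coast only briefly is that even a small uncorrected accelerometer bias integrates twice into position error. A back-of-the-envelope sketch, with assumed illustrative numbers:

```python
def drift_after(bias_mps2, seconds):
    """Position error from a constant accelerometer bias b over time t.
    Double integration of constant acceleration gives 0.5 * b * t**2."""
    return 0.5 * bias_mps2 * seconds ** 2

# A modest (assumed) 0.05 m/s^2 residual bias over a 60-second GPS outage:
err = drift_after(0.05, 60.0)
print(err)  # 90.0 m of accumulated position error
```

The quadratic growth in time is the key point: an error that is negligible over a few seconds becomes tens of meters within a minute, which is why accumulated biases must be reset by GPS or another absolute position reference.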
Unfortunately, the latter approach for recovery of self-position in the presence of GPS multipath and outages is viable only under benign weather conditions. The presence of visual air-borne and/or surface obscurants increases the position errors derived from passive and active optical sensors such as cameras and Lidar.
Accordingly, accurate position information for an autonomous vehicle is needed under foul weather and other adverse conditions to navigate with or without GPS. There is a need in the art for a high-definition radar system and method for an autonomous vehicle that addresses the shortcomings of the prior art discussed above.