(1) Field of the Invention
The present invention relates to airborne imaging and navigation systems and, more particularly, to a wide field airborne imaging system capable of providing airspace imaging and sense and avoid capabilities over a large field of view at high resolution and range of detection using a single camera.
(2) Description of Prior Art
The Federal Aviation Administration promulgates both Visual Flight Rules (VFR) and Instrument Flight Rules (IFR) for all manned aircraft. VFR regulations allow a pilot to operate an aircraft in clear weather conditions, and they incorporate the “see and avoid” principle, i.e., the pilot must be able to see outside the cockpit to control the aircraft's attitude, navigate, and avoid obstacles and other aircraft. Pilots flying under VFR assume responsibility for their separation from other aircraft and are generally not assigned routes or altitudes by air traffic controllers, in contrast to IFR flights.
Unmanned Aircraft Systems (UASs) have no onboard pilot to perform the see and avoid function. In the past this was not a large issue because UASs were predominantly flown in foreign or military restricted airspace and war zones, and in these situations UASs do not typically come into conflict with manned civilian aircraft, nor are they required to comply with FAA Regulations. Currently, UASs can only fly domestically in the National Airspace System (NAS) with special permission from the Federal Aviation Administration (FAA), given in the form of Certificates of Approval (COAs) issued to public entities for flight activities that have a public purpose, or alternatively under an Experimental Airworthiness Certificate issued to commercial entities for development, demonstration and training. Even then, only qualified ground observers or qualified personnel in manned chase aircraft are considered acceptable by the FAA to provide the See-And-Avoid (S&A) function.
Now, however, the demand for UASs is proliferating among the military, civil government, and private sectors due to growing awareness of their value and significant improvements in capabilities and performance. For example, over the last four years the U.S. Customs and Border Protection agency has been operating the Predator B Unmanned Aerial System (UAS) under the established rules of the National Airspace System.
The FAA has not yet established Federal Aviation Regulations (FARs) for UASs to fly routinely in the National Airspace System, and the potential of UASs is suppressed by an inability to comply with FAA rules. Not surprisingly, the industry has lobbied hard for clear and simple rules, and this has resulted in a recently introduced bill called the FAA Reauthorization Act of 2009, which calls for the FAA to provide, within nine months after the date of enactment, a comprehensive plan with detailed recommendations to safely integrate UASs into the NAS by 2013.
It is reasonable to assume that any new FAA rules will impose requirements similar to manned S&A rules, i.e., a requirement to detect and avoid both cooperative aircraft (aircraft with radios and navigation aids such as transponders and ADS-B) and, importantly, non-cooperative aircraft such as parachutists, balloons, and manned aircraft without radios or navigation aids. Indeed, proposed FAR rules have been discussed. For example, the ASTM F-38 Committee has published a recommended standard for collision avoidance (F2411-04 DSA Collision Avoidance) that proposes requiring a UAS operating in the NAS to be able to detect and avoid another airborne object within a range of ±15 degrees in elevation and ±110 degrees in azimuth and to be able to respond so that a collision is avoided by at least 500 ft. The ASTM standard may be incorporated in whole or in part into eventual FAA certification requirements. Complying with existing S&A rules would severely limit the range and conditions under which UASs can operate, in large part due to the lack of onboard S&A capabilities. Developing technical capabilities to comply with the proposed ASTM and other proposed rules is the subject of significant research but as yet has only resulted in proposed technical solutions that require substantial weight, volume and power relative to the capacity of many UASs. Still, the publishing of UAS FARs will be a first major step toward routine operation of UASs in the National Airspace System.
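For illustration only (this sketch is not part of the cited standard, and the function name and parameter names are hypothetical), the ASTM F2411-04 surveillance envelope described above reduces to a simple angular test on the bearing of a detected object relative to the aircraft's flight axis:

```python
def within_detection_envelope(azimuth_deg, elevation_deg,
                              az_limit_deg=110.0, el_limit_deg=15.0):
    """Return True if a detected object's bearing falls inside the
    proposed surveillance envelope: within +/-110 degrees azimuth and
    +/-15 degrees elevation of the aircraft's flight axis."""
    return (abs(azimuth_deg) <= az_limit_deg
            and abs(elevation_deg) <= el_limit_deg)
```

An object dead ahead (0°, 0°) or at the envelope edge (−110°, −15°) would require detection, while one at 120° azimuth or 20° elevation falls outside the proposed requirement.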
Since UASs do not have onboard pilot visual contact with the vehicle's surroundings, effective, onboard, autonomous S&A capabilities are necessary to facilitate operations of UASs in the NAS to avoid collisions with other aircraft or with terrain objects (e.g., buildings, power lines, trees and so on). In addition, UASs must have wide-field detection capabilities (by radar, synthetic vision, etc.) in order to detect the range and altitude of nearby aircraft and perform “see and avoid” maneuvers.
Quite a number of alternative approaches to detecting other aircraft are being investigated at present, including optical, acoustic, radar, etc. To the best of the present inventor's knowledge, the prior art S&A systems are all very heavy compared to the weight of the UAS, especially with regard to small UASs (sUASs). If the S&A detection device(s) are too heavy, too large, or require too much power, they can exceed the payload capacity of the UAS, or even exceed the weight of the entire UAS, frustrating its very purpose.
Against this backdrop, an effective S&A technology for UAS is critical to the future of the industry. What is needed is a system combining a wide field airborne imager capable of providing airspace imaging over a large field of view at high resolution and range of detection with low weight, volume and power using a single camera, and an automated trajectory-based control system to avoid collisions with other aircraft or with terrain objects (e.g., buildings, power lines, trees and so on).
There are a few enabling technologies that must be combined in order for such systems to be feasible, including: 1) wide field imaging; 2) digital image feature detection/motion analysis; and 3) an avoidance/alarm system.
With regard to prior art imaging, there are various types of imagers used in other contexts. For example, F. Rafi et al., “Autonomous Target Following by Unmanned Aerial Vehicles”, SPIE Defense and Security Symposium 2006, Orlando, Fla., describes an algorithm for the autonomous navigation of an unmanned aerial system (UAS) in which the aircraft visually tracks a target using a mounted camera. The camera is controlled by the algorithm according to the position and orientation of the aircraft and the position of the target. This application tracks a moving target through changes of direction, turns, varying speed and even stops, and does not rely on an ESRI Shapefile. A target-tracking camera is not suitable for UAS S&A, which requires wide-field detection capabilities.
U.S. Pat. No. 6,804,607 to Wood issued Oct. 12, 2004 shows a collision avoidance system using multiple sensors that establishes a 3D surveillance envelope surrounding the craft.
U.S. Pat. No. 7,061,401 to Voos et al. (Bodenseewerk Geratetechnik GmbH) issued Jun. 13, 2006 shows a method and apparatus for detecting a flight obstacle using four cameras for recording an overall image.
European Application No. EP 1296213 discloses a method of monitoring the airspace surrounding an unmanned aircraft using a number of cameras having different viewing angles, with the images displayed superimposed to a ground pilot.
U.S. Pat. No. 6,909,381 to Kahn issued Jun. 21, 2005 shows an aircraft collision avoidance system utilizing video signals of the air space surrounding the aircraft for alerting pilots that an aircraft is too close.
U.S. Pat. No. 7,376,314 to Reininger (Spectral Imaging Laboratory) issued May 20, 2008 shows a fiber coupled artificial compound eye that channels light from hundreds of adjacent channels to a common point on the convex surface of a fiber optic imaging taper. The superposed light from all the channels forms a curved, high-intensity image on a detector array. Multiple such systems are required to detect over a wide field of view.
U.S. Pat. No. 5,625,409 to Rosier et al. (Matra Cap Systems) issued Apr. 29, 1997 shows a high resolution long-range camera for an airborne platform using two imagers, a first detector and a second detector with a larger field of view, covering the field of the first detector and extending beyond it.
With regard to feature detection/motion analysis software, there are commercial programs for applying this analysis to successive frames of video images. For example, Simi Motion at www.simi.com sells a 2D/3D motion analysis system using digital video and high speed cameras, and there appear to be a few other rudimentary programs. This has been applied in the UAS navigation context, as shown in the F. Rafi et al. article “Autonomous Target Following by Unmanned Aerial Vehicles”, which teaches an attempt to use such analysis for automatic target tracking by a UAS. However, that application tracks a moving target in different directions but does not monitor airspace. The '607 patent to Wood also determines speed and motion vectors for surrounding objects.
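At its simplest, the motion analysis applied to successive video frames above amounts to comparing corresponding pixels between consecutive frames and flagging those that change. The following is a minimal frame-differencing sketch (illustrative only; real systems such as those cited use far more sophisticated feature tracking), with frames represented as nested lists of grayscale intensities:

```python
def detect_motion(prev_frame, curr_frame, threshold=25):
    """Compare two successive grayscale frames (lists of pixel rows)
    and return (row, col) coordinates of pixels whose intensity
    changed by more than `threshold` between frames."""
    changed = []
    for r, (prev_row, curr_row) in enumerate(zip(prev_frame, curr_frame)):
        for c, (p, q) in enumerate(zip(prev_row, curr_row)):
            if abs(q - p) > threshold:
                changed.append((r, c))
    return changed
```

Clusters of changed pixels that persist and move across frames would then be candidate airborne objects for the S&A function; a single-frame difference alone cannot distinguish an aircraft from noise or camera motion.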
Finally, with regard to scenario-based avoidance capabilities, the '232 patent to Bodin et al. (IBM), issued Jun. 5, 2007, shows a UAS control system that identifies obstacles in the flight path and then selects a particular avoidance algorithm. An array of avoidance algorithms is taught.
It would be greatly advantageous in light of this cluttered prior art background to consolidate hardware/software into a functional and compact UAS S&A system combining a wide field airborne imager capable of providing airspace imaging over a large field of view at high resolution and range of detection using a single camera, and a trajectory-based control system that is reliable and capable of autonomous or even semiautonomous operation to avoid collisions with other aircraft or with terrain objects.