Depth sensing and obstacle avoidance are key steps toward building a fully autonomous aerial robot. Existing drones have demonstrated depth sensing in one particular direction (forward or downward) using sonar, proximity sensors, laser scanning range finders, time-of-flight sensors, structured-light sensors, or stereovision camera pairs, or have attempted obstacle avoidance in more directions by mounting multiple depth sensors (such as stereovision camera pairs) facing different directions. The fundamental flaw of this approach is that a rotary-wing drone (such as a quadcopter), unlike a car or a ship, can travel in any direction, while a stereovision pair covers only a limited range of travel angles. Stacking multiple stereovision pairs by brute force is inefficient and can still fail to avoid obstacles because it does not provide full 360-degree horizontal coverage for depth sensing. Another existing attempt is to capture omnidirectional video with a wide-angle or catadioptric lens and then apply techniques such as structure from motion (SFM) or visual inertial odometry (VIO) to achieve single-camera depth sensing. This approach requires extremely accurate SFM/VIO to obtain usable absolute depth information, and it is also vulnerable to the vibration and angular movement that occur almost constantly on a drone.
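The contrast between stereo and SFM/VIO depth comes down to scale: a stereo pair has a known, fixed baseline, so the standard pinhole relation yields metric depth directly, whereas a single moving camera must estimate its own baseline from motion, making absolute depth only as good as the SFM/VIO estimate. A minimal sketch of the standard stereo relation (the parameter values below are illustrative, not taken from any particular system):

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole stereo depth: Z = f * B / d.

    focal_px     -- focal length in pixels (from camera calibration)
    baseline_m   -- distance between the two camera centers, in meters;
                    known and fixed for a stereo rig, which is what makes
                    the recovered depth metric (absolute)
    disparity_px -- horizontal pixel offset of the same scene point
                    between the left and right images
    """
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: 700 px focal length, 10 cm baseline, 14 px disparity
print(depth_from_disparity(700.0, 0.10, 14.0))  # -> 5.0 meters
```

For monocular SFM/VIO the same formula applies with the baseline replaced by the estimated camera translation between frames, so any drift or vibration-induced error in that estimate scales the depth directly.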
In addition, it is desirable to achieve omnidirectional obstacle avoidance for an unmanned aerial vehicle. The current technology is to use a fixed binocular camera, so that the unmanned aerial vehicle can sense obstacles only in the direction the binocular camera faces. For example, the DJI Mavic Air has front, rear, and bottom binocular cameras, so it can avoid obstacles when flying forward or backward but not when flying leftward or rightward. The Skydio R1 achieves omnidirectional obstacle avoidance, but it requires four pairs of binocular cameras to do so.
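The cost of the brute-force approach can be seen with simple coverage arithmetic: tiling 360 degrees of horizontal field of view requires at least ceil(360 / HFOV) stereo pairs, where HFOV is the usable horizontal field of view of one pair. A short sketch (the HFOV values are assumed for illustration; real usable HFOV is further reduced by the overlap needed for stereo matching):

```python
import math

def stereo_pairs_for_full_coverage(usable_hfov_deg: float) -> int:
    """Minimum number of stereo pairs to tile a full 360-degree
    horizontal field, assuming pairs are spaced evenly and each
    contributes the given usable HFOV with no gaps."""
    return math.ceil(360.0 / usable_hfov_deg)

# Assumed usable HFOVs per pair, for illustration only:
for hfov in (60.0, 90.0, 120.0):
    print(f"{hfov:.0f} deg per pair -> {stereo_pairs_for_full_coverage(hfov)} pairs")
```

With a usable HFOV around 90 degrees per pair, four pairs are the minimum, which is consistent with the four-pair configuration mentioned above; each additional pair adds cameras, calibration effort, and processing load.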