1. Field of the Disclosure
This invention relates to the field of modularized hardware and software for the delivery and monitoring of a personalized augmented reality drone event (PARDE) mission-plan-based augmented reality experience in piloting Unmanned Vehicles (UVs), and to the semi-autonomous, direct, and swarm or flock control of the one or more UVs being piloted during the configured mission. Specifically, it relates to a system of modular hardware and software that is configurable to support operation of a diverse range of UV platforms (i.e., unmanned aerial vehicles (UAVs), unmanned ground vehicles (UGVs), unmanned surface water vehicles (USVs), and unmanned underwater vehicles (UUVs)) operated by multiple pilots within a controlled and configured augmented reality PARDE mission plan that includes vision-integrated geo-fence management of UV location parameters and delivery of augmented visual, audio, motion, and tactile content to the end-user UV pilots, or remote drone pilots (RDPs).
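The geo-fence management of UV location parameters mentioned above may be illustrated by a minimal sketch. The class name, field names, and the use of a simple axis-aligned latitude/longitude/altitude bounding box are assumptions for illustration only; the disclosed system may use any fence geometry.

```python
from dataclasses import dataclass

@dataclass
class GeoFence:
    """Hypothetical axis-aligned geo-fence: lat/lon in degrees, altitude in meters."""
    min_lat: float
    max_lat: float
    min_lon: float
    max_lon: float
    max_alt: float

    def contains(self, lat: float, lon: float, alt: float) -> bool:
        """Return True if the reported UV position lies inside the fence volume."""
        return (self.min_lat <= lat <= self.max_lat
                and self.min_lon <= lon <= self.max_lon
                and 0.0 <= alt <= self.max_alt)

# Example fence and position checks
fence = GeoFence(40.0, 40.01, -105.01, -105.0, 120.0)
print(fence.contains(40.005, -105.005, 50.0))   # inside the fence -> True
print(fence.contains(40.005, -105.005, 200.0))  # altitude exceeds ceiling -> False
```

A mission controller could evaluate such a check against each UV position update and trigger a hold or return command when a position falls outside the configured fence.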
2. Description of Related Art
Unmanned vehicles have existed with increasing sophistication since the early 1900s, first as simple mechanical devices such as anti-aircraft mine-blimps and later advancing into computerized platforms. A major limiting factor for autonomous and semi-autonomous UV operations has been the need for a robust and accurate method of position estimation, without which UVs are very likely to cause damage to themselves and their surroundings. As capabilities in these areas have increased, UVs have become increasingly important in civilian search and rescue and in military situations, owing to advances in platform and payload hardware, software sophistication, and overall UV capability. Both military and civilian organizations now use UVs for reconnaissance, search and rescue, and commercial use cases to reduce human risk and to increase the efficiency and effectiveness of mission execution. Much of this increase in capability derives from, and depends on, more accurate means of localizing vehicles within an environment, e.g., GPS for outdoor operation. While GPS functionality has advanced, it is often limited in precision and robustness. For indoor applications, the precision issue has been addressed with motion capture systems, but such systems are impractical for large-scale outdoor use.
A UV typically includes a device such as an engine for powered, controlled motion, a navigation system, sensors for internal and external conditions, and an optional payload system. The onboard sensors often provide a remote user or observer with information such as vehicle pose, velocity, battery level, external levels of noise or a physical agent, and video or laser data of the surroundings, which can be used for navigation or for locating an individual or item. This is a small sample of available sensors, which are constantly increasing in functionality and sensitivity. UVs can be operated in autonomous, semi-autonomous, or direct (i.e., RDP-controlled) control modes, giving a pilot the flexibility to configure a PARDE mission plan in which the UV may be operated in any one of the indicated modes, and to adjust the control mode at any time.
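The three control modes described above can be sketched as a simple command-selection routine. The enum values, the tuple command representation, and the equal-weight blending used for the semi-autonomous case are illustrative assumptions, not the disclosed arbitration scheme.

```python
from enum import Enum

class ControlMode(Enum):
    AUTONOMOUS = "autonomous"       # mission plan drives the UV
    SEMI_AUTONOMOUS = "semi"        # pilot input blended with the mission plan
    DIRECT = "direct"               # RDP commands pass straight through

def select_command(mode, pilot_cmd, plan_cmd):
    """Choose the effective command for the current control mode.

    Commands are hypothetical (roll, pitch) tuples; the semi-autonomous
    case uses a simple equal-weight average as a placeholder blend.
    """
    if mode is ControlMode.DIRECT:
        return pilot_cmd
    if mode is ControlMode.AUTONOMOUS:
        return plan_cmd
    return tuple((p + q) / 2 for p, q in zip(pilot_cmd, plan_cmd))

# Example: the same inputs resolve differently per mode
pilot, plan = (1.0, 0.0), (0.0, 1.0)
print(select_command(ControlMode.DIRECT, pilot, plan))           # (1.0, 0.0)
print(select_command(ControlMode.AUTONOMOUS, pilot, plan))       # (0.0, 1.0)
print(select_command(ControlMode.SEMI_AUTONOMOUS, pilot, plan))  # (0.5, 0.5)
```

Because the mode is an explicit parameter rather than fixed state, a PARDE mission plan could switch a UV among the three modes at any time during the mission, as described above.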
With the growing use and applicability of UVs across multiple industries and use cases, and the advancement of UV, virtual reality, and augmented reality technologies, a system that supports individual piloting and overall flight control of one UV or a fleet of UVs is required. Furthermore, this system should enable piloting of UVs by individuals of varying piloting or driving experience and should deliver a PARDE mission plan with augmented, interactive visual and audio content to support successful completion of the specified mission and to provide an enhanced mission experience that increases the probability of mission success and improves the pilot's overall experience.