Training pilots and aviators can be very costly. Pilots go through hundreds of hours in classrooms, in ground-based flight simulators, and in flight training in training aircraft, in order to learn, enhance, and maintain their flight skills. Needless to say, each training flight incurs hefty costs, for example, for the maintenance of the training aircraft and facilities (e.g., airports), for the required training and maintenance crews, and the like. Therefore, it is imperative that the training value of each training flight be fully exploited, so as to maximize its utility.
It is noted that the term “pilot” refers to a person who directly controls the aircraft, while the term “aviator” is a more general term referring to all crew members involved in operating the aircraft, such as navigators, electronic warfare officers, and the like. Herein below, the terms pilot and aviator may be used interchangeably, and both relate to pilots as well as to other aircraft operating crew members.
Training a pilot involves not only teaching the pilot how to control the aircraft but also relates to, for example, teaching the pilot to handle the aircraft in operational activities, to cooperate with fellow pilots (e.g., formation flight) and with other units, and to face opposing pilots and other hostile units. For such training, the trainee pilot should preferably experience flying in formation and flying in the presence of foe pilots (i.e., friendly pilots simulating hostile ones).
As mentioned above, the costs associated with flying aircraft are considerable. To save some of these costs, systems of the prior art produce virtual RADAR targets for training a trainee pilot. Other methods described in the prior art for reducing the costs of training a pilot involve modifying the systems of a relatively inexpensive aircraft (e.g., a training aircraft) to simulate those of a costlier operational aircraft. Thereby, the trainee pilot is trained by simulating, at lower cost, the experience of operating the systems of the operational aircraft.
U.S. Pat. No. 5,807,109 issued to Tzidon et al., and entitled “Airborne Avionics Simulator System”, is directed to an airborne avionics simulation system installed in a host aircraft, for simulating the avionics of a high performance aircraft. The system includes an input interface, a multifunctional display, an Inertial Navigation System (INS) module, a processor, and a data link module. The input interface reads input from the controllers of the host aircraft. The INS module reads INS data from the INS of the host aircraft. The input interface and the INS module transmit data from the host aircraft to the processor, which generates a simulation of the aircraft's sensors. The processor can further receive GPS data from a GPS sensor, and receive mission/scenario data from a data storage. The data link module communicates with other systems installed on other host aircraft and with ground-based stations.
The simulation system simulates the avionics of a high performance aircraft within a low cost host aircraft, such that the trainee pilot flying the host aircraft is provided with the simulated flight experience of the high performance aircraft. In particular, the simulation system simulates an avionics system which is lacking in the host aircraft, such as RADAR, as if it were the avionics system of the simulated high performance aircraft. For example, the simulation system can simulate virtual hostile targets, missiles, chaff, flares, and the like. It is noted, however, that the pilot interfaces of the simulated avionics systems are not necessarily the same as those of real avionics systems. For example, the host aircraft does not include the same screens and input interfaces (e.g., buttons, switches, and the like) as the operational aircraft whose avionics systems are simulated. The virtual targets are controlled according to a set of if-then rules, based on inner variables such as velocity, position, status of armaments, and the like.
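The if-then control of virtual targets described above can be sketched as follows. The rule set, field names, and numeric thresholds here are illustrative assumptions for the sake of example, not taken from the patent:

```python
# Hypothetical sketch of rule-based control for a virtual target.
# All field names and thresholds are illustrative assumptions.
def update_virtual_target(target, trainee):
    """Select the virtual target's next maneuver from simple if-then
    rules over its inner variables (position, velocity, armaments)."""
    distance = abs(target["position"] - trainee["position"])
    if target["missiles"] > 0 and distance < 10.0:
        return "fire_missile"          # armed and within firing range
    if distance < 5.0 and target["velocity"] < trainee["velocity"]:
        return "evade"                 # close and slower: break away
    return "pursue"                    # default behavior
```

Each simulation tick, such a rule table maps the target's inner variables to one discrete maneuver, which is the essential structure of an if-then controlled virtual target.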
U.S. Pat. No. 7,852,260 issued to Sarafian, and entitled “Methods and Systems for Generating Virtual RADAR Targets”, is directed at a system for generating virtual RADAR targets. The system includes a transceiver and a controller coupled thereto. The transceiver receives a signal transmitted from a radar antenna and stores signal information representative of the received signal. The controller receives the received signal and determines an output transmission signal representative of a virtual target. The controller further determines the timing of the transmission of the output signal in response to a virtual distance between the virtual target and the radar antenna, a required virtual target direction, and direction information representative of a direction of the radar antenna. The transceiver transmits the output signal such that at least a fraction of the output signal is received by the radar antenna.
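The timing aspect rests on basic radar ranging: a radar infers range from the round-trip time of an echo, so replying after a delay of twice the virtual range divided by the speed of light places the virtual target at that range. A minimal sketch of this calculation (function name is an illustrative assumption):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def virtual_echo_delay(virtual_range_m):
    """Delay, in seconds, before transmitting the output signal so the
    radar measures the virtual target at virtual_range_m (round trip:
    out to the target and back, hence the factor of 2)."""
    return 2.0 * virtual_range_m / C
```

For example, simulating a target 15 km away requires replying roughly 100 microseconds after the radar pulse is received; the required direction is handled separately, from the radar antenna's direction information.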
U.S. Pat. No. 5,421,728 issued to Milden, and entitled “In-Flight Radar Warning Receiver Training System”, is directed at an in-flight Radar Warning Receiver (RWR) training system for use with an RWR. The RWR training system includes a formatter, a generator, and a merge operator. The formatter formats real RWR track file reports. The generator generates threat/RWR simulated threat track file data. The merge operator merges the formatted real threat data with the threat/RWR simulated threat track file data. A track file lists the parameters that describe a threat, including physical data (e.g., radar frequency, pulse width, pulse repetition interval) and derived data, determined through analysis of the physical data (e.g., threat type, threat mode). The simulated threat track file data is generated by a threat activity simulator, which simulates threat encounters according to the range of the target aircraft from the trainee aircraft; line-of-sight calculations based on terrain data and the radar cross section of the target aircraft; and threat modeling for each specific threat type.
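The track file described above can be pictured as a record combining measured physical parameters with data derived from them, which the merge operator then combines across real and simulated sources. The following sketch is illustrative; the field set and the `merge` helper are assumptions, not the patent's actual data layout:

```python
from dataclasses import dataclass

@dataclass
class TrackFile:
    """Parameters describing one threat (illustrative field set)."""
    radar_freq_hz: float   # physical: radar frequency
    pulse_width_s: float   # physical: pulse width
    pri_s: float           # physical: pulse repetition interval
    threat_type: str       # derived from the physical data, e.g. "SAM"
    threat_mode: str       # derived, e.g. "search" or "track"

def merge(real_tracks, simulated_tracks):
    """Merge formatted real RWR track reports with simulated threat
    track data into one combined track list."""
    return list(real_tracks) + list(simulated_tracks)
```

The trainee's RWR display then presents the combined list, so real emitters and simulated threats appear side by side during the flight.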
U.S. Pat. No. 7,612,710 issued to Lamendola et al., and entitled “Processing Virtual and Live Tracks to Form a Virtual-Over-Live Environment”, is directed at a method for merging virtual and live tracks. The method includes the following steps: merging live radar data with virtual radar data, tagging the merged data, and determining whether virtual radar data is present and tagging the data accordingly. The merged data is tagged with a virtual tag indicator to indicate a presence of the virtual radar data, or tagged with a live tag otherwise. The virtual tag indicator is employed for determining whether the virtual radar data is present. The virtual radar data is provided by a pre-determined virtual tagged beam, whose origin and manner of production exceed the scope of this publication.
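The tagging steps above amount to inspecting the merged data and attaching a virtual or live indicator. A minimal sketch, assuming a simple record format with an illustrative `virtual` flag (the patent does not specify the data layout):

```python
def tag_merged_data(merged_records):
    """Tag merged radar data: attach a virtual tag indicator if any
    virtual radar data is present, a live tag otherwise."""
    has_virtual = any(rec.get("virtual", False) for rec in merged_records)
    return {"tag": "virtual" if has_virtual else "live",
            "data": merged_records}
```

Downstream processing can then check the single tag indicator instead of re-scanning the records to decide whether virtual data is present.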
U.S. Pat. No. 5,431,568 issued to Fey et al., and entitled “RADAR Target Generator”, is directed at a RADAR target generator. The radar target generator is employed with a radar system including a radar transmitter, a load, a radar receiver, a first transmission line connecting the transmitter and the load, and a second transmission line connecting the load and the receiver. The radar target generator includes a central processor and a tap. The tap samples a portion of a radar signal traveling through the first transmission line and redirects the sampled portion to the central processor. The processor applies a target signature component to the sampled radar signal portion. The tap returns the modified radar signal portion to a return radar signal traveling through the second transmission line. The target signature component relates to a pre-determined specific synthesized target type, which is selected by an operator, depending upon the purpose of the operation being conducted.
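Applying a target signature to the sampled signal can be sketched, in highly simplified form, as scaling the echo's amplitude (reflecting the synthesized target's radar cross section) and adding a round-trip delay for its range. The field names and the two-parameter signature are illustrative assumptions, not the patent's actual signal processing:

```python
C = 299_792_458.0  # speed of light, m/s

def apply_target_signature(sample, signature):
    """Modify a sampled radar signal portion so it appears to be an echo
    from the synthesized target (simplified: amplitude scaling for cross
    section, round-trip delay for range)."""
    return {
        "amplitude": sample["amplitude"] * signature["rcs_gain"],
        "delay_s": sample["delay_s"] + 2.0 * signature["range_m"] / C,
    }
```

The modified sample, injected into the second transmission line, then reaches the receiver as if it had been reflected by the selected synthesized target type.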
U.S. Pat. No. 6,166,744 issued to Jaszlics et al., and entitled “System for Combining Virtual Images with Real-World Scenes”, is directed at a system for combining virtual targets with real scenes. The system includes a range scanner, a computer model, a virtual objects generator, a virtual objects combiner, and a display. The range scanner scans the field of interest and generates range data indicating the distance of real-world objects within the field of interest. The computer model simulates a virtual entity and produces a virtual image at a location within the field of interest. The generator generates virtual objects. The combiner combines the virtual objects and a real-world image of the field of interest to create a combined image. The display displays the combined image to an observer.
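The role of the range data in such a combiner can be illustrated with a per-pixel occlusion test: a virtual object is drawn only where it is nearer than the real-world surface at the same position. This sketch is an assumption about how range data is typically used for compositing, not the patent's specific algorithm:

```python
def combine_images(real_pixels, real_ranges, virtual_pixels, virtual_ranges):
    """Overlay virtual objects on a real-world image, using range data to
    resolve occlusion per pixel: the nearer surface wins."""
    combined = []
    for rp, rr, vp, vr in zip(real_pixels, real_ranges,
                              virtual_pixels, virtual_ranges):
        if vp is not None and vr < rr:
            combined.append(vp)   # virtual object is closer: draw it
        else:
            combined.append(rp)   # real surface is closer, or no virtual pixel
    return combined
```

This is why the range scanner matters: without per-pixel distances, a virtual target would always be painted on top of the scene, even when a real object (e.g., terrain) should hide it.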