In many applications, an operator controls a remote image sensor via a communication link. Examples are traffic control, border control, search and rescue operations, land surveys, police surveillance, military applications, etc. Operators may additionally request measurements of the remote tracked object, such as motion parameter measurements and the like.
Reference is now made to FIG. 1, which illustrates a prior art surveillance/tracking system described in WO 00/46985, entitled “Delayed Video Tracking”. Note that the following description of FIG. 1 is taken from the '985 publication and, apart from a few minor modifications, has not been amended to correct inaccuracies.
Thus, as disclosed in FIG. 1, system 10 comprises a remote image-sensing unit 20 and a control center 22, which are connected via a communication link 25.
Sensing unit 20 comprises a sensor communication unit 28, a remote tracker 30, an image sensor 32, and a pointer control unit 34. The methods of operation for these elements are well known in the art. Sensing unit 20 locates and tracks a sighted object, and transfers sensor data 24, such as image pictures, track location, pointing data and the like, to control center 22. Sensor data 24 travels in the direction represented by arrow A.
Control center 22 comprises a control communication unit 36, a display 38 and a control stick 40. The methods of operation for these elements are also well known in the art. Control center 22 provides control data 26, such as pointing instructions, centering instructions, track commands, track corrections and the like, to sensing unit 20. Control data 26 travels in the direction represented by arrow B.
Communication link 25 is connectable to sensing unit 20 and control center 22 via sensor communication unit 28 and control communication unit 36, respectively. Furthermore, communication link 25 transfers sensor data 24 and control data 26, via sensor communication unit 28 and control communication unit 36, respectively.
Generally, image sensor 32 surveys an object, and relays image pictures (sensor data 24) to display 38, which displays the pictures for viewing by an operator 42.
If operator 42 decides that it is desirable to track the object, he sends, via stick 40, manual coarse pointing instructions (control data 26), such as “move up”, “move right”, “zoom” and the like, to pointer control unit 34. Pointer control unit 34 acts upon these instructions, and directs image sensor 32 in the instructed direction.
Operator 42 then sends, via stick 40, centering instructions to pointer control unit 34. Pointer control unit 34 directs image sensor 32 in the instructed direction, thus centering the object in the center of the Field of View (FOV) of display 38. Once the object as sensed by image sensor 32 is centered in the FOV, operator 42 electronically sends, via stick 40, locking instructions to remote tracker 30. Remote tracker 30 receives the instructions and attempts to lock onto the object in the center of the FOV of display 38.
Once the object has been locked, remote tracker 30 takes over command of the tracking operation. Pointer control unit 34 ceases to receive commands via stick 40 and instead commences to receive instructions from tracker 30. Upon receipt of the instructions, pointer control unit 34 relays them to image sensor 32. Image sensor 32 tracks the moving object and keeps the object in the center of the FOV of display 38, even while the object moves relative to sensing unit 20.
In many applications, there is a considerable time delay from the time when sensing unit 20 acquires an image picture of an object, to when the image is displayed on display 38, and finally, to the receipt of the responding instructions by sensing unit 20. Generally, the main factors contributing to the delay are signal processing, image compression/decompression, communication duration, and/or link bandwidth limitations. Consequently, when the delayed reaction time of the operator is also taken into account, the accumulated delay can range from fractions of a second to several seconds.
Due to the time delay, the location of the object as displayed on display 38 is generally not its current location. The location displayed on the screen is the location of the object before the transfer of the sensor data 24 (e.g. A seconds ago). Additionally, by the time pointer control unit 34 receives the instructions (control data 26), additional time has elapsed (e.g. an additional B seconds). Consequently, by the time image sensor 32 is instructed to locate the object, the object may no longer be in the location it occupied when the image picture was taken, A+B seconds earlier.
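To make the effect of the accumulated delay concrete, the following minimal sketch (not from the '985 publication; the speed and delay figures are hypothetical assumptions) computes how far an object moves during the A+B second round trip:

```python
# Illustrative sketch: the pointing error caused by the round-trip
# delay A + B for an object moving at constant speed.
# All numeric values below are hypothetical assumptions.

def pointing_error(object_speed_mps: float, delay_a_s: float, delay_b_s: float) -> float:
    """Distance the object travels between image capture (A seconds
    downlink) and receipt of the operator's instructions (B seconds
    uplink)."""
    return object_speed_mps * (delay_a_s + delay_b_s)

# A vehicle at 15 m/s, with A = 1.5 s and B = 0.5 s, has moved
# 30 m by the time the instructions reach the sensing unit.
print(pointing_error(15.0, 1.5, 0.5))  # 30.0
```

Even a modest object speed therefore displaces the object well outside a narrow field of view before the lock command arrives.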
Clearly, this time delay complicates the efforts to lock remote tracker 30 onto the object. Operator 42 has to accurately calculate and estimate the expected location of the object at the future time when the tracking instructions are to arrive at sensing unit 20. Only then is pointer control unit 34 pointed to the calculated estimated location, and remote tracker 30 instructed to lock on and initiate tracking.
If the estimated location calculation is not accurate enough, remote tracker 30 will lock onto some other background object, and the entire estimate-calculate-lock process has to be repeated. The effect is a continuous feedback control loop with delay, a situation which is liable to suffer from overshoots and instability.
The locking process is complicated even further by the human input in the tracking loop. Human reactions and directions are less precise than, for example, computer- or processor-generated instructions. Humans do not function well in feedback loops with time delay; a typical daily example is adjusting the temperature of hot water from a faucet with a slow reaction time.
WO 00/46985 attempts to cope with the delay problem by offering an approach described with reference to FIG. 2. Note that the following description of FIG. 2 is taken from the '985 publication and, apart from a few minor modifications, has not been amended to correct inaccuracies.
As shown in FIG. 2, tracking system 50 provides a reduced time delay by supplying the lock instructions directly from stick 40 to a local control tracker 60, in contrast to the prior art system (described with reference to FIG. 1), which supplied the lock instructions to remote tracker 30. System 50 additionally provides improved tracking ability by supplying tracking instructions directly from control tracker 60 to sensor remote tracker 30, thus providing more exact tracking/locking instructions than those provided by the previously described tracking system.
As shown, system 50 comprises an image sensing unit 70 and a control station 52, which are connected via a communication link 55.
Sensing unit 70 locates a desired object, and sends sensor data 24 to control station 52. Control station 52 sends control data 56 such as pointing and tracking instructions to image sensing unit 70. Control data 56 travels in the direction represented by arrow B.
Communication link 55 is connectable to sensing unit 70 and control station 52 via sensor communication unit 28 and control communication unit 36, respectively.
As opposed to the system of FIG. 1, control station 52 additionally comprises a control tracker 60. Stick 40 transfers coarse tracking instructions to pointer control unit 34, and control tracker 60 transfers direct tracking and locking instructions to sensor remote tracker 30.
Furthermore, sensing unit 70 comprises a processor 62 and a memory 64. Additionally, image sensor 32 transfers generally identical image data to remote tracker 30 and to control tracker 60. Hence, control tracker 60 and remote tracker 30 operate on the same data, and tracking instructions from control tracker 60 are more direct and precise than prior art tracking instructions.
The provision of direct instructions from control tracker 60 to remote tracker 30 is a useful advantage over the previously described system 10, in which the locking instructions were determined by the operator from an image seen on display 38, and then estimated and transferred from stick 40 to tracker 30. Tracking measurements may additionally be fed into a mathematical predictor, which can accurately predict the object location. In this manner, system 50 is free of operator-induced errors related to coarse eye/hand coordination.
A sequence of operation in accordance with this prior art solution is described with reference to FIG. 3. Note that the following description of FIG. 3 is taken from FIG. 3A of the '985 publication and, apart from a few minor modifications, has not been amended to correct inaccuracies.
Thus, FIG. 3 illustrates the flow of data images, generally designated I (sensor data 24), to control station 52, and specifically to control tracker 60. It is noted that image I1 is taken at time T1 (not shown in FIG. 3) and image In is taken at time Tn, which is later than time T1. FIG. 3 additionally illustrates the flow of a tracking/locking command image, generally designated CI (control data 56), to sensing unit 70, and specifically to remote tracker 30.
As previously noted, the transfer of sensor data 24 takes an extended amount of time. Hence, although sensing unit 70 acquires a data image I1 at time T1, the data image I1 is not received at station 52 until time Tn, which is approximately the same time at which sensing unit 70 is acquiring data image In. As illustrated in FIG. 3, while sensing unit 70 is acquiring and sending data image In, control station 52 is only just receiving data image I1.
It is noted that data images I include a sensed object 66. It is additionally noted that the location of object 66, as sensed by sensing unit 70, moves from data image I1 to data image In.
Therefore, once a decision has been made to lock onto or track object 66, control tracker 60 transfers tracking/locking command image CI1 to remote tracker 30. A command image CIn is an associated data image In with the appropriate command attached; e.g., command image CI1 is associated data image I1 with a lock command attached. Thus, control tracker 60 establishes a coordinated reference image with remote tracker 30. Accordingly, both remote tracker 30 and control tracker 60 refer to the same image (data image I1), and the location of object 66 is known and referenced from data image I1.
Thus, although at time Tn sensing unit 70 is currently sensing image In, upon receipt of command image CI1, remote tracker 30 refers to image I1 and is able to positively identify object 66.
Sensing unit 70 stores the past records of data images I in memory 64. Upon receipt of command image CI1, referring to object 66 in data image I1, tracker 30 searches through the records in memory 64 for the appropriate image, and positively identifies object 66 from that record. Remote tracker 30 identifies object 66 as designated in data image I1 via methods known in the art. Remote tracker 30 then scans from data image I1 through to data image In, tracking object 66 from data image to data image. The delays A and B specified above are thus accommodated, since remote tracker 30 is eventually capable of locating the object in the currently acquired image.
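The store-and-scan mechanism described above can be sketched as follows. This is a simplified illustration, not the '985 implementation: frames are reduced to lists of candidate positions, and the frame-to-frame tracking step is modeled as nearest-neighbor association rather than image correlation.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    t: int         # acquisition time of the data image
    blobs: list    # candidate object positions detected in the image

def nearest(blobs, pos):
    """Stand-in for an image-correlation tracker: pick the candidate
    closest to the previous known position."""
    return min(blobs, key=lambda b: (b[0] - pos[0]) ** 2 + (b[1] - pos[1]) ** 2)

def reacquire(memory, cmd_t, cmd_pos):
    """Locate the stored reference image the command refers to, then
    track the designated object forward through the stored images to
    the most recently acquired one."""
    start = next(i for i, f in enumerate(memory) if f.t == cmd_t)
    pos = nearest(memory[start].blobs, cmd_pos)   # identify object in I1
    for frame in memory[start + 1:]:              # scan I1 ... In
        pos = nearest(frame.blobs, pos)           # frame-to-frame tracking
    return pos                                    # location in current image

# The object drifts from (0, 0) to (4, 1) while a stationary decoy
# sits at (50, 50); the lock command refers to the object in the
# frame taken at t = 0.
memory = [Frame(0, [(0, 0), (50, 50)]),
          Frame(1, [(2, 0), (50, 50)]),
          Frame(2, [(4, 1), (50, 50)])]
print(reacquire(memory, 0, (0, 0)))  # (4, 1)
```

Because each hop between consecutive stored frames is small, the tracker follows the moving object rather than locking onto the decoy, even though the commanded position is A+B seconds stale.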
While the solution described with reference to FIG. 2 indeed provides a certain improvement over solutions such as the one described with reference to FIG. 1, it falls short in coping with certain typical scenarios. For instance, in the sequence of operation described with reference to FIG. 3, the object resides in all images (I1 to In), thus facilitating simple tracking when running through the history images. This, however, is not always the case: there may be situations in which the object escapes from the field of view, while it is still required to track the object and locate its position in real time.
There is, thus, a need in the art to provide for a new method and system for tracking moving objects.