In current avionics systems (navigation, piloting, mission management, communications management, etc.), information is displayed on several separate physical screens positioned in front of the pilots' seats over an extensive area of the cockpit. These screens are connected to on-board computers which continuously analyse data from various sensors distributed throughout the aircraft, as well as command data provided by the crew members, and which produce complex information for the crew. Each cockpit screen may include several windows, also called formats, which may contain widgets sending data to separate applications. The pilots must manipulate one or more cursors on the different screens in order to select an interactive element or to designate one or more widgets. To this end, the crew members use one or more devices for indirect interaction with the different screens, such as a keyboard, a trackball or a touchpad.
In addition to the multiple indirect interaction devices, the pilots may have to operate several devices of a given type in different forms (several keyboards, for example) in order to interact with different windows displayed on the screens, each requiring a different form of control. This large number of interaction device types creates an additional workload, since the pilot must identify the interaction device appropriate for a given window.
In addition, depending on the operational context (flight phase), certain indirect interaction devices cannot be kept in the preferred interaction area facing the pilot, where control is optimal and interaction conditions are comfortable, because this area must be kept free of all obstacles. For this reason, certain interaction devices are positioned on retractable shelves which slide under the instrument panel. When these specific devices are stowed, the display formats designed for them must instead be manipulated using generic devices, which reduces interaction efficiency.
Furthermore, these devices are suitable mainly for graphical user interfaces (GUI) rather than natural user interfaces (NUI). This prevents these interfaces from evolving into natural graphical interfaces, which would improve their simplicity, user efficacy and ease of learning. A known solution for providing direct and natural interaction with the displayed information consists in using touchscreens. However, the display windows extend over the surface of the cockpit's main instrument panel and may be too far from the pilot's position, often obliging the pilot to change posture in order to reach the different displayed windows. This impairs the ergonomics and efficiency of the pilots' actions.
The aim of the invention is to overcome the above-described disadvantages of the prior art by providing a new device for shared interaction with a display system, which can easily be adapted to the many display windows, so as to replace the multiple interaction devices used in the prior art and to improve the ergonomics and efficiency of interactions with the display system.