Mission systems allow one or a plurality of operators, via one or a plurality of display devices, to view information relating to a mission, for example a terrain map or information relating to vehicles, devices or people present on the terrain, this information possibly being provided by a variety of sensors such as radars, cameras, detectors, etc. Such systems also allow the operator to input commands, for example to configure the display presented to him, or else to address instructions to devices or individuals.
For example, for so-called tactical missions, the mission system provides the operators with a synthetic representation of the tactical situation, that is to say of the objects detected by the sensors, on a cartographic background for example. The man-machine interface (MMI) of a mission system provides the operators of this system with the means for monitoring the sensors and the effectors through the synthetic representation provided by the system. Such an MMI generally consists of an application executed on a physical device, for example formed by a console having one or more screens, some of which may be touchscreens, and one or more devices for inputting instructions.
Mission systems may be deployed on the ground in dedicated buildings, for monitoring drones for example, or else on platforms aboard carriers, for example aircraft, terrestrial vehicles such as tanks or other armored motor vehicles, or else ships. The carriers may thus undergo movements, these movements possibly imposing diverse constraints on the system and on its operator: notably, abrupt movements, vibrations, loud noise, variable brightness, etc., defining an environment referred to hereinafter as a "constrained environment". Using the mission system via its man-machine interface (MMI) in a constrained environment may thus prove tricky for the operator; for example, the movements of the carrier may make it difficult to point to objects on a display or to select menus. It is thus necessary for such systems to be designed so as to minimize the influence of a constrained environment on their use.
It is for example known to resort to an input device of trackball type, with which the operator rolls the protruding part of the ball, for example with a finger or the palm of the hand. Various buttons may be disposed in proximity to the ball, for example at sites accessible to the fingertips, making it possible to trigger diverse actions. The trackball is deemed to be relatively insensitive to the movements of the carrier. However, a drawback of the trackball is its high inertia, which requires non-negligible effort on the part of the operator, notably to move a pointer across the display device, for example with the aim of selecting objects so as to engage actions.
It is also for example known to resort to mission systems using one or a plurality of touchscreens. The applications deployed on this type of device must meet graphical requirements allowing their use in situations of sizable movements and vibrations, and for example in the event that the operator is wearing gloves. Thus, the interaction objects displayed must be defined with sizable dimensions, and their mode of operation must be limited to a single touch or "simple click", excluding functions such as "drag-and-drop", which consists in moving an object on the screen while keeping a finger pressed on it, etc.
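The graphical requirements above could, for instance, be enforced by a validation step over the interaction objects of a screen layout. The following sketch is purely illustrative and not part of the described system: the 15 mm minimum target size and the object names are assumptions, and the only retained gesture is the single "tap" (simple click).

```python
# Illustrative sketch (assumptions, not from the source): checking that the
# interaction objects of a touchscreen MMI layout satisfy "constrained
# environment" rules, i.e. a minimum on-screen dimension (usable with gloves,
# 15 mm is an assumed value) and single-tap-only interaction modes
# (drag-and-drop and similar gestures are forbidden).
from dataclasses import dataclass, field

MIN_TARGET_MM = 15.0        # assumed minimum touch-target side length
ALLOWED_GESTURES = {"tap"}  # "simple click" only

@dataclass
class InteractionObject:
    name: str
    width_mm: float
    height_mm: float
    gestures: set = field(default_factory=lambda: {"tap"})

def violations(objects):
    """Return human-readable rule violations for a list of objects."""
    problems = []
    for obj in objects:
        if min(obj.width_mm, obj.height_mm) < MIN_TARGET_MM:
            problems.append(f"{obj.name}: target smaller than {MIN_TARGET_MM} mm")
        forbidden = obj.gestures - ALLOWED_GESTURES
        if forbidden:
            problems.append(f"{obj.name}: forbidden gestures {sorted(forbidden)}")
    return problems

# Hypothetical layout: one compliant button, one non-compliant track symbol.
layout = [
    InteractionObject("zoom_button", 20.0, 20.0, {"tap"}),
    InteractionObject("track_item", 8.0, 8.0, {"tap", "drag"}),
]
print(violations(layout))
```

A layout that passes such a check remains usable with a single finger press even under vibrations or when gloves reduce pointing precision.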
Furthermore, in extremely constrained environments, for example when the carrier is an aircraft flying at very low altitude, possibly in unfavorable weather, or else a terrestrial vehicle traveling over rugged terrain, or else a ship sailing in rough seas, the operator must hold on to the structure of the carrier in order to avoid falling or impacts, and thus loses the use of one of his hands. Actions can then only be performed through a single interaction means, for example the trackball, a keyboard, a joystick or a touchscreen, thereby appreciably limiting the operator's field of action and requiring an MMI specifically designed in this regard.