The viewing direction of a user can be captured by means of an eye tracker unit. It can thus be recognized, for example, which element on a screen the user is focusing on or looking at. Such an eye tracker unit is known, for example, from US 2015/0278599 A1. It is also described therein that the eye tracker unit must be calibrated so that the coordinates of the point focused on or viewed on the screen can be calculated correctly from the viewing direction information of the eye tracker unit.
A method for calibrating an eye tracker unit of an operating device of a motor vehicle is known, for example, from DE 10 2014 008 852 A1. Accordingly, a user must view a screen while their eyes are filmed with an eye tracking system. Meanwhile, a graphical object, for example a cross, is displayed on the screen. It is assumed that the user focuses their eyes on this graphical object, so that the coordinates of the focused area are known. The viewing direction information identified in the meantime by means of the eye tracking system can thus be mapped or transformed to the known coordinates.
It is known from WO 2014/020323 A1 to determine the point on a screen that a user is currently focusing on by showing a mouse cursor on the screen, which the user must operate with a computer mouse. When the user clicks a button on the computer mouse, it is assumed that, at that moment, they are looking at the mouse cursor on the screen. It is further assumed that the viewing direction information provided by the eye tracker unit at that moment corresponds to the position of the mouse cursor.
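The click-based scheme described above can be sketched as follows. This is a minimal illustration, not the implementation of the cited document; all function and variable names are hypothetical. At each mouse click, the current raw gaze estimate is paired with the cursor position, and the systematic offset between gaze and cursor is averaged over the collected events.

```python
def estimate_offset(pairs):
    """Average gaze-to-cursor offset over collected click events.

    pairs: list of ((gaze_x, gaze_y), (cursor_x, cursor_y)) tuples,
    one per mouse click, assuming the user looks at the cursor
    at the moment of each click (hypothetical helper, for illustration).
    """
    n = len(pairs)
    dx = sum(c[0] - g[0] for g, c in pairs) / n
    dy = sum(c[1] - g[1] for g, c in pairs) / n
    return dx, dy

# Example: the raw gaze estimate is systematically shifted by (+12, -8) px,
# so the correction to add to the gaze is (-12, +8).
clicks = [((112, 192), (100, 200)),
          ((512, 292), (500, 300)),
          ((812, 392), (800, 400))]
print(estimate_offset(clicks))  # -> (-12.0, 8.0)
```

The averaging over several clicks reduces the influence of individual clicks at which the user happened not to be looking at the cursor.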
A technical paper by Perra et al. (David Perra, Rohit Kumar Gupta, Jan-Michael Frahm, "Adaptive Eye-Camera Calibration for Head-Worn Devices," CVPR, IEEE Conference on Computer Vision and Pattern Recognition, 2015) describes, for a head-mounted camera, how optically salient elements on which a user is focusing with high probability can be recognized in a camera image of the surroundings.
The user calibration of an eye tracker unit is normally performed by having the user view a series of defined calibration points in sequence, each for a short period of time. During this time, the eye tracker unit detects the viewing direction of the user and generates corresponding viewing direction information. This information can be compared with the known coordinates of the calibration points in order to generate calibration data that compensates for any difference.
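The comparison between the detected viewing direction information and the known calibration point coordinates can be sketched as a least-squares fit. This is a simplified illustration under the assumption of a per-axis linear error model (gain and offset); real eye tracker calibrations typically use richer models, and all names here are hypothetical.

```python
def fit_axis(raw, true):
    """Least-squares fit of true ~ gain * raw + offset for one axis."""
    n = len(raw)
    mean_r = sum(raw) / n
    mean_t = sum(true) / n
    cov = sum((r - mean_r) * (t - mean_t) for r, t in zip(raw, true))
    var = sum((r - mean_r) ** 2 for r in raw)
    gain = cov / var
    offset = mean_t - gain * mean_r
    return gain, offset

def calibrate(raw_gaze, screen_points):
    """Fit independent x/y corrections from paired calibration samples
    and return a function mapping raw gaze to screen coordinates."""
    gx, ox = fit_axis([p[0] for p in raw_gaze], [q[0] for q in screen_points])
    gy, oy = fit_axis([p[1] for p in raw_gaze], [q[1] for q in screen_points])
    return lambda g: (gx * g[0] + ox, gy * g[1] + oy)

# Example: raw gaze is shifted +10 px in x and -5 px in y relative to
# the known coordinates of three calibration points.
raw = [(110, 95), (510, 295), (910, 495)]
true = [(100, 100), (500, 300), (900, 500)]
correct = calibrate(raw, true)
print(correct((510, 295)))  # -> (500.0, 300.0)
```

The fitted correction function represents the calibration data: applied to subsequent viewing direction information, it compensates the systematic difference observed at the calibration points.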
This solution of an active calibration by displaying defined calibration points takes time and prevents the user from immediately beginning the actual system interaction, i.e., using the operating device to operate the at least one apparatus.