User equipment of today, such as mobile phones and other portable devices communicating with other user equipment, often includes a touch sensitive display. Typically, so called smart phones include touch sensitive displays. The touch sensitive display is arranged for displaying e.g. key buttons, icons, pictures and/or information, and also for receiving input from a user of the user equipment. The input can be in the form of pressing/pushing/touching/tapping an area on the display corresponding to such key buttons, icons, pictures and/or information by use of one or more fingers and/or pointing devices. The key buttons displayed on a touch sensitive display can be seen as virtual buttons.
The touch sensitive display registers one or more areas being pressed/pushed/touched/tapped on the touch sensitive display and provides this input/information to a processor included in the user equipment. The processor can interpret the input/information and take appropriate actions in response, e.g. activate a program and/or a function corresponding to a pressed icon.
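The interpretation step described above can be sketched as a mapping from a registered touch coordinate to the virtual button covering that area. This is a minimal illustration only; the button areas, labels and function names below are hypothetical and not taken from any particular device:

```python
# Hypothetical virtual buttons, each defined by a rectangular area on the
# display: (x_min, y_min, x_max, y_max) -> button label.
BUTTONS = {
    (0, 0, 100, 100): "1",
    (100, 0, 200, 100): "2",
    (200, 0, 300, 100): "3",
}

def interpret_touch(x, y):
    """Return the label of the virtual button covering point (x, y), if any."""
    for (x0, y0, x1, y1), label in BUTTONS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return label
    return None  # touch landed outside every button area

print(interpret_touch(150, 50))   # -> "2"
print(interpret_touch(350, 50))   # -> None
```

The essential point is that the display itself only reports coordinates; it is the processor's mapping from coordinates to virtual buttons that turns a physical contact, intentional or not, into input.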
For example, if the user of the user equipment wants to dial a phone number in order to establish a connection with another user equipment, the user presses key buttons of a keypad being displayed on the touch sensitive display, wherein the combination of pressed key buttons corresponds to the telephone number of the other user equipment. The other user equipment mentioned in this document may generally include essentially any type of communication device, such as e.g. a portable/wireless telephone, a wired telephone and/or any other machine arranged for responding to calls.
The function of the user equipment is thus dependent on a well-functioning touch sensitive display, which registers when the user touches the display in order to convey input to the processor of the user equipment. However, in some situations, the touch sensitivity of the display can cause problems. For example, when a user is using the user equipment, e.g. a smart phone, for talking to another user, the user usually holds the user equipment such that a loudspeaker of the user equipment is pressed against his/her ear and talks into a microphone of the user equipment. In other words, the user holds the user equipment against the side of his/her face when using the user equipment as a phone. Hereby, there is a risk that some part of the face and/or body of the user, or some other object, will come in contact with the touch sensitive display, which the touch sensitive display and the processor in the user equipment will interpret as input of information.
Often, the user has used a keypad on the touch sensitive display when inputting the information needed to establish the call, e.g. the user has dialled a telephone number on the keypad or has searched for a contact/number in a list of contacts/numbers using the keypad. Therefore, the keypad is often displayed when the user is talking on the user equipment. When the user comes in contact with the display during the talk, e.g. by his/her ear, cheek and/or hair pressing against some key button of the keypad, a tone corresponding to that key button will be outputted by a loudspeaker of the other user equipment used by the other user during the talk. Thus, the loudspeaker of the other user equipment will, on top of the voice being outputted, also output one or more very annoying tones resulting from undesired input on the touch sensitive display. The user making the undesired input is normally unaware that he/she is making the undesired input, and is thus also unaware of the stress and irritation this is causing to the other user during a talk.
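The tones heard by the other user are DTMF (dual-tone multi-frequency) tones: each key press is signalled as the sum of two sinusoids whose frequency pairs are fixed by the DTMF signalling standard (ITU-T Q.23). A minimal sketch of how such a tone is generated from a key press, with hypothetical duration and sample-rate parameters:

```python
import math

# Standard DTMF frequency pairs (low Hz, high Hz) per keypad key.
DTMF = {
    "1": (697, 1209), "2": (697, 1336), "3": (697, 1477),
    "4": (770, 1209), "5": (770, 1336), "6": (770, 1477),
    "7": (852, 1209), "8": (852, 1336), "9": (852, 1477),
    "*": (941, 1209), "0": (941, 1336), "#": (941, 1477),
}

def dtmf_samples(key, duration=0.1, rate=8000):
    """Generate audio samples for the DTMF tone of one key press."""
    low, high = DTMF[key]
    n = int(duration * rate)
    return [
        0.5 * math.sin(2 * math.pi * low * i / rate)
        + 0.5 * math.sin(2 * math.pi * high * i / rate)
        for i in range(n)
    ]

samples = dtmf_samples("5")  # 0.1 s of the "5" tone at 8 kHz
```

Because the tone is generated for every registered key press, an accidental press by the ear or cheek produces exactly the same audible tone at the far end as a deliberate one.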
Touch sensitive displays consume battery power when they emit light. Therefore, prior art methods have been developed for completely deactivating the displays if a proximity detector, such as an infrared sensor built into the user equipment in order to determine distances from the user equipment to surrounding objects, indicates that the user equipment is held against the ear of the user. There have also been developed methods for determining if a user equipment is held closely to another object, such as an ear, where these methods include complex signal processing of audio signals being picked up by a microphone of the user equipment in order to determine the position of the user equipment in relation to other objects.
However, these prior art methods, which have been developed for power saving reasons and not for taking care of problems related to undesired input to touch sensitive displays, suffer from a number of disadvantages. The solutions using a proximity detector are costly to implement, since the user equipment obviously needs to include such a proximity detector. Nor can these solutions be implemented at all in a large number of smart phones today, since many smart phones are not provided with a proximity detector. The known audio signal processing methods add to the computational complexity of the user equipment, and also often do not work reliably. Further, the known prior art methods do not provide a function adapted to the way a user of the user equipment normally handles the user equipment. For example, if a user of a user equipment holds e.g. a hand and/or a finger close to or on the proximity detector or the microphone when dialling a number or otherwise handling the user equipment, this would be interpreted as the user equipment being held against the ear, whereby the touch sensitive display would be completely inactivated. Thus, by simply holding the user equipment in an unfortunate grip, the user would cause these prior art methods to deactivate the touch sensitive display, which would be very annoying. Also, the proximity sensor and/or the loudspeaker of smart phones of today are often designed and/or located such that the fingers of a user are likely to come close to and/or cover the proximity sensor and/or the loudspeaker when holding and/or handling the smart phone.
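The prior-art behaviour criticised above can be sketched as a rule that looks only at the proximity reading, ignoring what the user is actually doing. The threshold value and names below are hypothetical, chosen only to illustrate the misfire:

```python
NEAR_THRESHOLD_MM = 30  # hypothetical cut-off for "held against the ear"

def display_active(proximity_mm):
    # The prior-art rule considers nothing but the measured distance, so
    # ANY nearby object (ear, finger, hand) blanks the display.
    return proximity_mm > NEAR_THRESHOLD_MM

print(display_active(5))    # phone at the ear -> False, display off (intended)
print(display_active(5))    # finger over the sensor while dialling -> False,
                            # display off (the misfire described above)
print(display_active(200))  # nothing nearby -> True, display on
```

Because the rule cannot distinguish an ear from a finger over the sensor, it deactivates the display in both cases, which is exactly the unwanted behaviour described above.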
Most users are not aware of these functions of the smart phone, and specifically do not know where the proximity detector is located. Most users would therefore not understand that unintentionally covering the proximity sensor or the microphone completely inactivates the display. There is thus a risk that most users would cause the display to be turned into a completely black state without knowing the reason for it, which would of course be very annoying for the user.
There is therefore a need for a method, a user equipment, a computer program and a computer program product comprising the computer program to provide a touch sensitive display control that is well adapted to the way user equipment is commonly used by its users today.