Electronic systems, including computer systems, that interact with human beings use what is commonly known as a user interface to control one or more aspects of machine/human interaction. User interfaces have presented formidable challenges to user interface developers, who normally seek to provide an interface which is both easy to use and efficient from a user's perspective. That is, a user interface should allow a user to accomplish a desired objective, e.g., receive a desired set of information or make a desired selection, without unnecessary user operations and without having menu options repeated, e.g., because of a lack of understanding upon initial presentation. It has been recognized that to provide an effective and efficient user interface, the interface should be flexible and able to adapt to the individual user.
The decision process of modifying a user interface to adapt to an individual user can be extremely difficult because the adaptation process attempts to customize the interface for a particular user, often without specific knowledge of the user's current condition. A user's current condition may include such things as the user's intentions, goals, and/or informational needs, e.g., how experienced the user is with the system and/or what the user's knowledge base is.
Known user interfaces have focused on attempting to determine a user's experience level and/or goals and then modify the user interface accordingly. For example, users who are determined to be experts based on their amount of previous experience with a system may be presented less help when traversing menus and options than users who are determined to be novices at using an apparatus. Similarly, when a user's goal is determined to be a particular operation or one of a set of operations, e.g., based on previous menu selections, a user may be presented with specific menus tailored or arranged to facilitate the user's goal.
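The experience-based adaptation described above can be sketched very simply. In this illustrative Python sketch, all names and thresholds (e.g., `help_level`, `EXPERT_THRESHOLD`) are hypothetical and are not taken from any known system; the sketch merely shows a policy that presents less help as a user's recorded experience with the system grows.

```python
# Illustrative sketch of experience-based interface adaptation.
# All names and thresholds here are assumptions, not from the original text.

EXPERT_THRESHOLD = 50  # assumed: sessions after which a user counts as an expert

def help_level(session_count: int) -> str:
    """Return how much guidance to present, based on prior experience."""
    if session_count >= EXPERT_THRESHOLD:
        return "minimal"       # experts: terse menus, no repeated prompts
    elif session_count >= 10:
        return "intermediate"  # some guidance for moderately experienced users
    return "full"              # novices: detailed help at every step
```

A real system would track `session_count` (or a richer experience measure) per user and consult such a policy each time a menu is rendered.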
Such known systems, which focus on modifying a user interface, fail to consider the impact of a user's emotional state at the time of using an apparatus. The inventor of the present application recognized that a user's emotional state can affect a user's ability to interpret information provided to the user, make menu selections, and perform other tasks commonly performed through use of a user interface. For example, when under extreme stress, e.g., when angry, grieving, etc., a system user may find it more difficult to interpret and understand machine-generated speech, menu options, etc., than under less stressful conditions.
Thus, it is very possible for a user to be fairly knowledgeable about a user interface on a particular system, but still be impaired in his or her capacity to interact with the user interface due to the presence of emotions, e.g., anxiety, grief, anger, etc. Such impairment can be either long or short term in nature.
In view of the above discussion, it becomes apparent that it would be desirable for a user interface to adapt in response to a user's emotional state. Accordingly, there is a need for methods and apparatus that allow a user interface for any product or service to detect the emotional state of a user and to be modified based upon a user's current emotional state (e.g., stress level). Furthermore, it would be desirable that at least some new user interfaces be able to make user interface modifications in response to a user's emotional state while also making modifications based on other known factors such as experience level and knowledge.
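The combined adaptation called for above can be sketched as follows. This is purely illustrative (the section specifies no algorithm): the function name, parameters, and the 0.7 stress threshold are all assumptions. The idea shown is that a detected stress score acts as an override, shifting the interface toward simpler and slower presentation even for experienced users, while low-stress behavior falls back to the experience-based adaptation of known systems.

```python
# Hypothetical sketch: combining emotional state with experience level.
# All names and thresholds are illustrative assumptions.

def select_presentation(experience: str, stress_level: float) -> dict:
    """Choose interface parameters from an experience level
    ("novice"/"expert") and a detected stress score in [0, 1]."""
    # Baseline from experience, as in known systems.
    verbose_menus = experience == "novice"
    # Emotional-state adjustment: high stress simplifies the interface
    # regardless of experience.
    if stress_level > 0.7:
        return {"verbose_menus": True, "speech_rate": "slow",
                "offer_repeat": True}
    return {"verbose_menus": verbose_menus, "speech_rate": "normal",
            "offer_repeat": False}
```

For example, an expert user detected as highly stressed would still receive verbose menus, slowed machine speech, and an offer to repeat options.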