The present invention relates to a user interface apparatus and to a mobile information apparatus that employs such a user interface apparatus. More specifically, the present invention relates to data input techniques and interface techniques applicable to a mobile apparatus or other apparatus having neither a keyboard nor a mouse.
The present invention relates to a portable information apparatus that provides a predetermined information service in response to input operations performed by a user. Specifically, the present invention relates to a portable information apparatus that operates in response to user input obtained via an input device provided as standard equipment of the apparatus.
More specifically, the present invention relates to a portable information apparatus that allows the user to perform a complex input operation with relatively simple user actions, and particularly to a portable information apparatus provided with a physical user interface that accepts physical gestures of the user, thereby simplifying the input operation.
In one aspect of the conventional input techniques, a significant problem with new mobile and handheld devices is the difficulty of interacting effectively with the device. The input capabilities of mobile devices are usually limited to pen input via touch screens, buttons, and jog-dial type controllers.
The problem with these input techniques is that touch screens often require the use of a pen, which occludes the screen, and accurate pen input is often difficult because of the limited resolution of the touch sensors. Interaction with touch screens also promotes an interaction style based on direct sequential manipulation of GUI objects: for example, to zoom into a map, a user has to repeat scrolling, pointing, and zooming operations in sequence.
Alternative devices have been proposed to help create apparatuses that are small, mobile, and simple to use. For example, devices that accept a user's physical interactions with the body of the device are disclosed in Japanese Patent Applications JP 11-143606 and JP 07-64754. Such physical interactions include changing the shape of a deformable part of the device or tilting the device.
Although such devices have been presented, little attempt has been made to develop a graphical user interface that takes advantage of them, and there has been little exploration of how such an interface would be useful in basic interface tasks such as data scrolling, navigation, and browsing.
Most portable devices currently use conventional data input and user interface techniques that have either been replicated from desktop graphical user interfaces or attempt to extend them. One example of the conventional data input and interface techniques is disclosed in Japanese Patent Application JP 2000-207088. These techniques are usually based on the use of a pen or a mouse and are mostly inappropriate for small handheld devices such as mobile or portable apparatuses.
For example, conventional interaction with GUI interfaces, such as those used on desktop computers and PDAs, is based on the concept of a cursor or pointer. The cursor and pointer are graphical representations of the current position on a display screen. To change the current position, e.g. to select a different actionable item, the user has to use an input device such as a mouse or keyboard to directly specify one of the GUI elements on the screen, as shown in FIG. 1(A). In the present specification, such a task is referred to as pointing or selecting.
In some cases, a desired element may not be visible on the screen, requiring the user to scroll the content to find it. Such scrolling of content is a task separate from the pointing (selecting) described above. Scrolling usually requires either
a) special user interface elements, such as a scroll bar, or
b) switching of interface modes, e.g. in current GUIs, when the cursor reaches the limit of the visible content area, the content starts scrolling. Typically, in the conventional interface method, these operations are triggered by moving the pointer and pressing the mouse button, as shown by numerals 1 and 2 of FIG. 1(B). These pointing (selecting) and scrolling methods are particularly inefficient and difficult to use on a device with a small display screen.
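The mode switching described in b) can be sketched as follows. This is a minimal illustrative sketch, not taken from the patent; the constants and function names are hypothetical. Pointer motion normally moves the cursor, but once the cursor reaches the edge of the visible area the same motion is reinterpreted as a scrolling command.

```python
# Hypothetical sketch of the conventional pointer/scroll mode switch:
# moving the pointer selects items, but at the view limits the motion
# is reinterpreted as scrolling of the underlying content.

CONTENT_HEIGHT = 100   # total scrollable lines (assumed value)
VIEW_HEIGHT = 10       # lines visible at once (assumed value)

def handle_pointer_move(cursor_y, scroll_top, dy):
    """Move the cursor by dy; switch to scrolling at the view limits."""
    cursor_y += dy
    if cursor_y < 0:                       # hit the top edge: scroll up
        scroll_top = max(0, scroll_top + cursor_y)
        cursor_y = 0
    elif cursor_y >= VIEW_HEIGHT:          # hit the bottom edge: scroll down
        scroll_top = min(CONTENT_HEIGHT - VIEW_HEIGHT,
                         scroll_top + cursor_y - (VIEW_HEIGHT - 1))
        cursor_y = VIEW_HEIGHT - 1
    return cursor_y, scroll_top

cursor, top = handle_pointer_move(8, 0, 4)   # motion past the bottom edge
print(cursor, top)                            # cursor pinned, content scrolled
```

The sketch makes the inefficiency visible: pointing and scrolling share one input channel and must be performed sequentially, never at the same time.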
Another aspect of the conventional input techniques is described below.
Human hands are excellent tools. Various complex operations can be accomplished while effectively controlling the many degrees of freedom that the hands and fingers possess. For example, a musician such as a violinist can apply two different types of tension to the strings at the same time by moving the bow in two different directions (for example, along the strings and across the strings).
Similarly, a position and a force may be input simultaneously to a computer screen by using an input device such as a mouse or a pen. For example, a button may be pressed while the mouse points at a particular position on the screen, or a pen may be pressed down on a tablet.
In a paper presented by S. Zhai and P. Milgram ("Human performance evaluation of manipulation schemes in virtual environments", Proceedings of VRAIS'93, 1993, IEEE, pp. 155-61), it is suggested that position control (isotonic control) and force control (isometric control) are manual controls with different psychological and physiological mechanisms, and that, in a basic sense, the directions of these controls are orthogonal to each other for human beings.
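The orthogonality suggested above can be illustrated with a small sketch. This is a hypothetical example (the function and parameter names are not from the cited paper): an isotonic channel (position) and an isometric channel (force, e.g. pen pressure) drive two independent interface parameters at once.

```python
# Illustrative sketch: position (isotonic) and force (isometric) inputs
# control two independent parameters simultaneously. All names and the
# zoom mapping are hypothetical, chosen only to show the independence.

def combine_input(position, pressure, max_pressure=1.0):
    """Map position directly to the cursor and force to a zoom level."""
    cursor = position                              # isotonic: position -> position
    zoom = 1.0 + 3.0 * (pressure / max_pressure)   # isometric: force -> level
    return cursor, zoom

cursor, zoom = combine_input(position=(120, 45), pressure=0.5)
print(cursor, zoom)   # the two outputs vary independently of each other
```

Because the two channels map to different human control mechanisms, neither input interferes with the other, which is the basis for combining them in a single gesture.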
The difficulty of data input and the lack of effective interaction in portable or hand-held apparatuses have been known for a long time and are very important issues. Typically, input functions in these portable apparatuses are limited to, for example, pen input via a touch screen, buttons, or jog-dial type controllers. When the touch screen is used, there are difficulties such that the pen may occlude display contents of the screen, the pen may be required too often for the interaction, or accurate pen input may become impossible because of resolution limitations of the touch screen.
Interaction via the touch screen is suited only to instructions in which successive direct operations are performed on GUI objects, such as successively applying scrolling and zooming functions for viewing and zooming a map. If such an input operation could be realized in a single gesture, the operation would be drastically simplified and the burden on the user eased.
There have been some suggestions regarding physical user interactions for realizing various tasks, such as performing all computer operations by applying the user's physical gestures to a portable computer. For example, the above-mentioned JP 11-143606 discloses a portable apparatus that includes a feedback module for displaying information regarding the data structure being processed and a detector for detecting a user's manual operation, and that modifies the display form of the data structure in response to the manual operation. The above-mentioned JP 07-64754 discloses a small information processing apparatus that can be held in a single hand and scrolls the display in accordance with the inclination of the apparatus. In these apparatuses, the interfaces are controlled by detecting the user's actions applied to the apparatus with sensors provided therein.
Rekimoto, J. ("Tilting operations for small screen interfaces", Proceedings of UIST'96, 1996, ACM, pp. 167-168) discloses a small display interface that scrolls an information display by detecting the inclination of the apparatus with a tilt sensor. However, these interfaces are focused on realizing only certain functionalities. In other words, physical interactions with the computer are treated as asynchronous: one action is performed, and then another user action follows. No research has been made on an apparatus that can transparently combine and use a plurality of gestures.
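A tilt-scrolling scheme of the kind mentioned above can be sketched as follows. This is a minimal sketch under stated assumptions: the degree units, dead zone, and gain are hypothetical values, not taken from the cited work. Tilt beyond a small dead zone is mapped to a scroll velocity, so a larger tilt scrolls faster.

```python
# Hypothetical sketch of tilt-to-scroll rate control: tilt angle beyond
# a dead zone is mapped to a signed scroll velocity. Constants are
# illustrative assumptions, not values from the cited interface.

DEAD_ZONE_DEG = 5.0    # ignore small, unintentional tilts
GAIN = 2.0             # lines per second per degree of tilt

def tilt_to_scroll_velocity(tilt_deg):
    """Rate control: larger tilt scrolls faster; the sign gives direction."""
    if abs(tilt_deg) <= DEAD_ZONE_DEG:
        return 0.0
    sign = 1.0 if tilt_deg > 0 else -1.0
    return sign * GAIN * (abs(tilt_deg) - DEAD_ZONE_DEG)

print(tilt_to_scroll_velocity(3.0))    # inside the dead zone: no scrolling
print(tilt_to_scroll_velocity(15.0))   # tilt forward: scroll down
```

As the sketch shows, such an interface realizes exactly one function (scrolling) from one sensor channel, which is the limitation noted above: nothing in it combines several gestures transparently.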
Balakrishnan, R., G. Fitzmaurice, G. Kurtenbach, and K. Singh ("Exploring interactive curve and surface manipulation using a bend and twist sensitive input strip", Proceedings of Symposium on Interactive 3D Graphics, 1999, ACM, pp. 111-118) and U.S. Pat. No. 5,396,265 disclose a flexible interface in which rotation sensors detect the bending of sensing portions that are mechanically connected to each other. However, the disclosed interface focuses only on the work of creating forms and does not suggest a general-purpose interface that can be applied to portable apparatuses or other general apparatuses.
Further, physical interactions are available for desktop computers using a force detection device such as a space ball. However, applications of such physical interactions are generally limited to navigation in three-dimensional space.
With regard to the first aspect of the conventional input techniques, there have been several attempts to investigate new types of interfaces and data input techniques suitable for mobile apparatuses. However, most of these investigations have focused on a single task, such as 3D data control (Balakrishnan, R., Fitzmaurice, G., Kurtenbach, G., Singh, K., "Exploring interactive curve and surface manipulation using a bend and twist sensitive input strip", Proceedings of Symposium on Interactive 3D Graphics, 1999, ACM, pp. 111-118), data scrolling (Rekimoto, J., "Tilting operations for small screen interfaces", Proceedings of UIST'96, 1996, ACM, pp. 167-168), or others (Fishkin, K., et al., "Embodied user interfaces for really direct manipulation", Communications of the ACM, 2000, 43(9), pp. 74-80).
Furthermore, text input using conventional mobile or handheld devices is rather problematic in the following respect. The input capabilities of mobile devices are usually limited to pen input via touch screens, buttons, and jog-dial type controllers. For text input, there are currently three widely used techniques: the keyboard (on screen or as a physical array of buttons), number-pad input on mobile phones, and gesture-based systems such as Palm Computing's Graffiti (product name). Of these, the number-pad text input technique may be the most widely used. However, this technique has the disadvantage of requiring multiple button presses for the input of each character.
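The multiple-presses-per-character disadvantage can be made concrete with a sketch of multi-tap number-pad decoding. The key map follows the common telephone keypad layout; the function name is hypothetical, and the inter-character timeout is simplified away by passing explicit press counts.

```python
# Sketch of multi-tap number-pad text input: each press of a key cycles
# through that key's letters, so most characters cost several presses.
# The key-to-letter map is the common telephone keypad assignment.

KEYPAD = {"2": "abc", "3": "def", "4": "ghi", "5": "jkl",
          "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz"}

def multitap_decode(presses):
    """presses: list of (key, count); count presses select the count-th letter."""
    text = []
    for key, count in presses:
        letters = KEYPAD[key]
        text.append(letters[(count - 1) % len(letters)])
    return "".join(text)

# Typing "cab" costs 3 + 1 + 2 = 6 presses on a single key, illustrating
# why this technique becomes slow and tedious for longer text.
print(multitap_decode([("2", 3), ("2", 1), ("2", 2)]))
```

The press counts in the example show the overhead directly: six button presses produce only three characters.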
All of the above conventional techniques become more difficult to use as the size of the device becomes smaller. The need for physical buttons limits the miniaturization of portable computing devices. Touch screens are problematic on small devices because of their limited sensing resolution and because the user may occlude the screen during the input operation.
Furthermore, existing computer input devices such as mouse and pen interfaces allow various input operations, e.g. pressing a button on the mouse or pressing the pen on the tablet. However, they require external input devices, which may be problematic in very small devices.