A user interface typically provides a number of interactive items which a user must navigate through in order to locate a target interactive item. The number of items available in a user interface may be such that it is not possible to display all of the items in a suitable manner on a display associated with the user interface at once. In order to locate or navigate to the target interactive item, it may be necessary to perform certain actions, for example to scroll, pan, zoom, page or the like. When a specific target interactive item has been located, and a user wishes to locate a next target interactive item, the user may again need to perform certain actions in order to locate the next target interactive item. The act of manipulating the display of the user interface in order to alter the displayed targets therein may be referred to as “view pointing”. A pointer in the user interface is moved in a particular manner so that the view is suitably altered until the target becomes visible in the display.
In the remainder of the specification, the term “view pointing” should be construed to mean any action which alters a display of a user interface, including, but not limited to, panning, zooming, paging, changing the orientation of an item, changing the location of an item on the display, changing the transparency level of an item, making an item appear or disappear, or making an item fade in or out. It should be noted that the action of zooming may include or involve other actions such as increasing a size, changing an orientation or location of an item, making an item appear or disappear, or the like.
When a target interactive item is visible in the display, a user typically navigates the pointer to the target in order to select it. This may be referred to as “focus pointing”. A pointer in the display is moved in a particular manner so that the target may be selected, viewed or discovered. In this specification, focus pointing refers to any action which affects a specific focal point or area on a user interface, including, but not limited to, moving an interactive item, moving a cursor, or interacting with a specific target interactive item.
View pointing and focus pointing may be referred to as “target-directed movements”. A target interactive item may be any interactive item, where an interactive item could be, for example, a document, content item, event, file, folder, container, media item, music item, photo, video, title, clip, message, entry, text, tag, data point, character, emoji, icon, link, button, application, program, executable file, digital file, menu item, hypertext link, or any other target that may be positioned in a display region of a user interface as part of a number of such interactive items. Throughout this specification, the term “interactive item” should be broadly interpreted and is used to refer to any target provided by a user interface and which a user may wish to select, whether or not it is visible to the user at a given point in time.
A pointer may typically be a cursor provided by a user interface and controlled by an input device such as a mouse. The pointer may also be the position, movement and/or touch of any other pointing tool or the body of the user in a control region provided by the user interface. In cases where touch-sensitive displays are employed, the pointer is typically a finger or fingers of the user. Implementations have further been developed wherein electronic devices are equipped with z-axis tracking. In such a case, a device such as a touch-sensitive electronic device may, for example, be capable of tracking the position and/or movement of a pointer in a region above its display plane.
Known user interfaces generally provide different target-directed movements as distinct modes. For example, only one form of view pointing may be activated at a given point in time, and a particular signal may be required from the user in order to enter a desired mode or switch between modes. For example, in conventional “point and click” user interfaces, scrolling may be achieved by operating a scroll wheel located on a mouse, while zooming may be achieved by pressing a certain key on a keyboard while operating the scroll wheel. In touch-sensitive devices, certain discrete gestures such as pinching the screen may achieve zooming, while other gestures such as swiping may achieve scrolling.
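The conventional arrangement described above, in which each input signal selects exactly one target-directed movement, can be illustrated by the following minimal sketch. The function name and the string labels are illustrative only and do not appear in any cited reference.

```python
# Illustrative sketch (not from any cited reference): a conventional input
# handler in which target-directed movements are distinct modes, so a single
# input signal maps to exactly one movement at a time.

def handle_input(event_type: str, modifier_held: bool = False) -> str:
    """Map a discrete input signal to a single target-directed movement."""
    if event_type == "scroll_wheel":
        # The same wheel input zooms when a modifier key is held,
        # and scrolls otherwise; the two cannot occur simultaneously.
        return "zoom" if modifier_held else "scroll"
    if event_type == "pinch":
        return "zoom"
    if event_type == "swipe":
        return "scroll"
    return "none"
```

Note that because the mapping returns a single movement per input, zooming and scrolling cannot be combined in one action, which is the limitation discussed below.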
U.S. Pat. No. 8,654,076 discloses a method of switching between three distinct modes when using an electronic device equipped with z-axis tracking. In a normal mode, a number of interactive items are displayed on a touch screen. In this mode, the pointer, which is typically a finger of a user, is out of a tracking range of the device. Once the pointer hovers over the touch screen within the tracking range but without touching it, a zooming and panning mode is entered. Once a touch input is provided, a pointing mode is entered. In this mode, panning is prevented and the user is capable of moving the pointer to select a target interactive item.
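The three-mode scheme described above may be sketched as a simple mode-selection function driven by the pointer's z-axis state. This is an illustrative paraphrase of the disclosed behaviour, not the patented implementation; the function and mode names are assumptions.

```python
# Illustrative sketch of a three-mode scheme of the kind disclosed in
# U.S. Pat. No. 8,654,076, with the active mode selected from the
# pointer's z-axis state. Names are hypothetical.

def select_mode(in_tracking_range: bool, touching: bool) -> str:
    """Return the active mode for a device with z-axis (hover) tracking."""
    if touching:
        # Touch input provided: panning is prevented and the user may
        # move the pointer to select a target interactive item.
        return "pointing"
    if in_tracking_range:
        # Pointer hovers over the touch screen within tracking range
        # without touching it.
        return "zoom_and_pan"
    # Pointer is out of the tracking range of the device.
    return "normal"
```

Each combination of z-axis states thus yields exactly one mode, so the device must exit one mode before another target-directed movement becomes available.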
A problem associated with methods such as those described above is that at least some target-directed movements, including scrolling, zooming, panning and moving the pointer to select a target interactive item, may be implemented as distinct modes. A user may thus not be able to simultaneously perform, for example, zooming and scrolling. Furthermore, the efficiency of such user interfaces may be unsatisfactory, as the user may be required to alternate between different modes, potentially resulting in repetitions of “point and click” to ultimately select a target interactive item.
The present invention aims to alleviate this and other problems, at least to some extent.