Electronic devices are in many cases provided with one or more displays for providing visual information to users of the devices. The electronic devices can be provided with user interfaces for display on the display of the device for facilitating user interaction with, and operation of, the device via one or more user inputs. The user interfaces comprise visual elements that can be arranged in various manners on the screen and can represent, for example, executable software programs, menu items indicating selectable functionality or operations available to the user within programs, a state of some aspect of a program, data, or other function of the device, etc. User inputs such as trackpads, trackballs, mice, touch screens and multitouch screens can provide pointer-type controls usable to adjust the position of a pointer in multiple dimensions to allow interaction with the user interface by, for example, enabling navigation through menu systems, options, file systems, program shortcuts, etc., and enabling selection and manipulation of visual elements and the items they represent.
Electronic devices, in particular small format portable electronic devices such as smart phones and tablet computers, having a touchscreen or multitouch screen as the primary means for receiving user input for controlling the user interface, are now commonly available. These devices are typically limited in the on-screen real estate they have available, and the more tactile and direct nature of the touchscreen-driven user interfaces of these devices lends itself to user interactions that are immediate and intuitive. However, the inability of users to immediately and intuitively understand the user inputs and gestures that would be required to achieve relatively complex interactions and operations can have a limiting effect on the usability of these interfaces. That is, these interfaces typically currently implement recognition of only relatively simple, immediately and intuitively understandable touch gestures that perform simple interactions, and instead rely on the user to perform multiple simple gestures (e.g. selection/deselection of buttons and manipulation of other UI elements, such as touch keyboards) to perform these more complex interactions. This is to the detriment of their intended quickness and immediacy of use. The additional steps and UI elements required to perform complex interactions take up a user's time and the available space on the display, which can be frustrating to a user, who may as a result become dissatisfied with the device.
There is a need for touchscreen-driven user interfaces that are simple enough to be intuitive to new users, while still allowing a user to perform complex actions quickly. Furthermore, with a finite amount of screen real estate available on displays for electronic devices, there is a need for user interfaces that can perform their function while minimising the amount of screen space they occupy, such that the remaining space can be utilised for displaying content.