The use of touch-sensitive surfaces as input devices for computers and other electronic computing devices has increased significantly in recent years. Exemplary touch-sensitive surfaces include touchpads and touch-screen displays. Such surfaces are widely used to manipulate user interface objects on a display.
Some touch-sensitive surfaces also include sensors that detect inputs provided by an object (e.g., a stylus) that is not in direct contact with the touch-sensitive surface but is in close proximity to it. Such proximity-based inputs provide an additional avenue for manipulating user interface objects on a display. However, contact-based inputs and proximity-based inputs often do not work together seamlessly; they may interfere with each other, causing confusion and frustration for the user.
Exemplary manipulations of user interface objects include adjusting the position and/or size of one or more user interface objects, activating buttons, opening files or applications represented by user interface objects, associating metadata with one or more user interface objects, and otherwise manipulating user interfaces. Exemplary user interface objects include digital images, video, text, icons, and control elements such as buttons and other graphics. A user will, in some circumstances, need to perform such manipulations on user interface objects in, for example, a note-taking application, a file management program, an image management application, a digital content management application, a drawing application, a presentation application, a word processing application, or a spreadsheet application.
But existing methods for performing these manipulations are cumbersome and inefficient. In addition, these methods take longer than necessary, thereby wasting user time and device energy. This latter consideration is particularly important in battery-operated devices.