The present invention provides methods and systems for the control of electronic devices. More particularly, embodiments of the present invention provide methods and systems for the control of electronic devices by means of graphs comprising nodes and edges.
A touchscreen is a visual display screen for an electronic device which displays a user interface with which a user interacts by touching the screen, for example with a finger or with a stylus. Touchscreens based on several different touch sensing technologies have been developed. Two common types in use today are resistive touchscreens and capacitive touchscreens. Resistive and capacitive touchscreens comprise a touch-sensitive overlay to the display screen which communicates touch position to touchscreen control logic.
Interaction with a touchscreen by a user may take the form of a simple interaction, such as a touch of a finger at the point on the screen where a selectable object is displayed, which is interpreted by the touchscreen control logic as a selection of that object. More complex interaction may also be provided for; for example, a user may move a touch point in a gesture, the pattern of which is recognized by the touchscreen control logic and interpreted as indicating the entry of a command associated with that gesture.
Touchscreens may recognize a single touch point, but most touchscreens in use on current devices are multi-touch touchscreens which are capable of recognizing the position and movement of more than one touch at a time, for example when two or more fingers are moved in relation to each other on the touchscreen. In such multi-touch touchscreens, for example, a gesture comprising the movement apart of two finger touches, or finger and thumb touches, might be recognized as a command to enlarge the display of the object displayed at the position of the gesture.
Devices equipped with touchscreens may comprise touchscreen displays of varying sizes. A particular current growth area is in handheld and other portable mobile electronic devices with integral touchscreens, for example mobile phones (also called cell phones) and tablet computers, in which the size of the display is necessarily limited by the requirement for portability. A mobile phone or cell phone may have a touchscreen display of diagonal size of about 90 mm to 120 mm, for example. A small tablet computer may have a touchscreen display of diagonal size of about 180 mm, for example.
Such portable electronic devices may be used to display control interface software for the control of the interaction of multiple objects represented on the display. For example, a control interface may be used to control the interaction of external electronic devices depicted as nodes in a graph, each node being displayed as an on-screen icon or similar object, the graph representing the interaction of a network of nodes. The nodes may be linked by edges, that is, straight or curved lines, each edge representing an interaction between the devices represented by the nodes.
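By way of illustration only, such a graph of nodes and edges might be modeled as in the following sketch; the class name, node identifiers and coordinates are hypothetical assumptions for illustration, not drawn from any embodiment described herein:

```python
# Illustrative sketch: nodes represent external devices placed at
# screen positions; each edge represents an interaction between the
# devices represented by the two nodes it links.
class Graph:
    def __init__(self):
        self.nodes = {}     # node id -> (x, y) screen position
        self.edges = set()  # each edge stored as frozenset({id_a, id_b})

    def add_node(self, node_id, x, y):
        self.nodes[node_id] = (x, y)

    def add_edge(self, a, b):
        # Only link nodes that exist in the graph.
        if a in self.nodes and b in self.nodes:
            self.edges.add(frozenset((a, b)))

    def remove_edge(self, a, b):
        self.edges.discard(frozenset((a, b)))


graph = Graph()
graph.add_node("phone", 40, 60)
graph.add_node("speaker", 200, 60)
graph.add_edge("phone", "speaker")
```

An edge is stored as an unordered pair, reflecting that the interaction it represents links two nodes without implying a direction.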
Touchscreen interaction control software provides a user with the ability to add, move and remove nodes, and to add, move and remove edges to control the interaction of the objects represented by the nodes.
United States published patent application number US 2013/0335339 provides methods, computing devices, and computer-readable media for interpreting gestures and triggering actions on a graph when the gestures are detected. The triggered actions may include the addition or deletion of nodes, connections between nodes, or connections between node ports; the expansion or collapse of a set of nodes; or the copying of nodes. The input may describe an action of selecting, dragging, holding, flicking, shaking, pinching, unpinching, or spinning a graphical object such as a node or a connection. Gesture interpretation logic determines whether the input matches a stored or known gesture. If the input matches a gesture, then the gesture interpretation logic may perform an action mapped to the gesture instead of or in addition to the normal action that would otherwise be caused by each individual input.
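Gesture interpretation logic of the kind described may be understood, purely as an illustrative sketch, as a lookup from a recognized gesture to a mapped action; the gesture names and action names below are assumptions for illustration and are not drawn from the cited application:

```python
# Illustrative sketch of gesture interpretation logic: input that
# matches a stored or known gesture triggers the action mapped to it;
# unmatched input triggers no mapped action.
def interpret(gesture_name, gesture_actions):
    """Return the action mapped to the gesture, or None if no match."""
    return gesture_actions.get(gesture_name)


gesture_actions = {
    "pinch": "collapse_nodes",
    "unpinch": "expand_nodes",
    "flick": "delete_node",
}
```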
United States published patent application number US 2013/0246958 describes a link curvature processing module which provides a user with the ability to control the curvature of links in a node-link diagram. As a node-link diagram is displayed to a user, the user may interact with the diagram and adjust the curvature of one or more links in the diagram to improve the readability of the diagram. The user's modification to the curvature of a link alters the shape of the link so that the position of the nodes connected to the link does not change. By providing the user with such control, the user is able to tailor the visual display of the links to the user's preference.
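Adjustable link curvature of this kind may, for example, be realized with a quadratic Bézier curve whose endpoints remain fixed at the node positions while only an intermediate control point moves; this particular representation is an assumption made for illustration and is not stated in the cited application:

```python
# Illustrative sketch: a quadratic Bezier link whose endpoints (the
# node positions) are fixed. Changing the curvature offset moves only
# the control point, so the linked nodes do not move.
def bezier_point(p0, p1, p2, t):
    """Point on a quadratic Bezier curve at parameter t in [0, 1]."""
    x = (1 - t) ** 2 * p0[0] + 2 * (1 - t) * t * p1[0] + t ** 2 * p2[0]
    y = (1 - t) ** 2 * p0[1] + 2 * (1 - t) * t * p1[1] + t ** 2 * p2[1]
    return (x, y)


def control_point(a, b, curvature):
    """Offset the midpoint of segment a-b perpendicular to the segment."""
    mx, my = (a[0] + b[0]) / 2, (a[1] + b[1]) / 2
    dx, dy = b[0] - a[0], b[1] - a[1]
    length = (dx * dx + dy * dy) ** 0.5 or 1.0
    return (mx - dy / length * curvature, my + dx / length * curvature)
```

Because the curve passes through its two endpoints at t = 0 and t = 1 for any control point, the user's curvature adjustment alters the shape of the link without changing the positions of the connected nodes.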
Touchscreen interaction by a user, using a finger for example, is much less precise than interaction with a user interface using a pointing device such as a computer mouse. A mouse-controlled graphical cursor may be used to define an interaction point as small as a single display pixel, whereas the contact area of a finger touch may span many hundreds of pixels, so that the desired point of contact is much more difficult to define. This is a problem when interacting with a graph in which selectable objects appear close together on the display.
For example, if a user wishes to remove a single edge from the graph by first using a finger to select the edge, it may be difficult to select the correct edge, or to avoid selecting more than one edge, because of the size of the contact location of the finger on the touchscreen. It would be desirable to provide an improved interaction method for the selection and removal of a single edge from a graph.
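The selection difficulty described above may be illustrated with a simple hit test in which the finger contact is modeled as a disc of a given radius: every edge lying within that radius of the touch point is hit, so two closely spaced edges may be selected together. The edge coordinates and contact radii below are illustrative assumptions only:

```python
# Illustrative sketch: distance from a touch point to a straight edge
# (a line segment). A finger contact hits every edge within its
# contact radius, which is why a single edge among closely spaced
# edges is difficult to select with a finger.
def point_segment_distance(p, a, b):
    px, py = p
    ax, ay = a
    bx, by = b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    # Parameter of the closest point on the segment, clamped to [0, 1].
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    cx, cy = ax + t * dx, ay + t * dy
    return ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5


def edges_hit(touch, radius, edges):
    """Return every edge within the contact radius of the touch point."""
    return [e for e in edges if point_segment_distance(touch, *e) <= radius]
```

With two parallel edges 10 pixels apart, a finger-sized contact radius hits both edges at once, whereas a pixel-precise cursor position hits only the intended edge.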