In recent years, both the number and variety of electronic devices and appliances in use have increased dramatically. Lighting systems, personal computers (PCs), video tape recorders (VTRs), compact disc (CD) players, stereo receivers, and televisions are but a few of the most common devices found in both residential and commercial settings. Even appliances which rely on non-electric power sources, such as gas burning furnaces, are usually controlled electronically.
Reflecting the variety of devices now available, many different systems and devices are used to provide more centralized or automated control of the appliances found in the home and the office. These can be as simple as a remote-control entertainment system comprising a CD player, stereo and television, or as complex as a building environmental control system for regulating heating, ventilation and air conditioning (HVAC) and security functions.
A central feature of any automation system is the interface by which the user interacts with the automation system. User interfaces permit one to program future operation or to control different devices from a centralized location. Reflecting the variety of appliances or devices that are to be controlled, user interfaces may range widely in complexity. In some settings, a more complicated interface is necessary to provide the broad range of functionality required. For example, a building control center may be sufficiently complicated to require training for the operator. On the other hand, in many environments, particularly residential settings, it is essential that interfaces be easy to use and understand so that the entire range of functionality may be utilized.
Unfortunately, the standard approach now commonly used is for each device or system in a given environment to be controlled according to a particular methodology which might differ dramatically from other systems. For example, a home might include a security system, an entertainment system, an environmental control system, and so forth, each with its own unique interface. Thus, a user may be required to set a thermostat in a first manner, program a VTR in a different manner, and program the security system in yet a different manner. By requiring the user to learn several methods of operating each system or set of devices in the environment, it is more difficult for the user to become familiar with the various systems and to take full advantage of all their features.
Another drawback associated with this standard approach is that the use of different interfaces may result in an increase in the amount of space taken up in the setting. For example, two or more different keypad controllers may be mounted on a wall to separately control individual systems. As a result, there may be a decrease in available wall space and a negative impact on the aesthetic quality of the setting.
Some automation systems attempt to address these limitations through the use of menu driven interfaces which are connected to a single, dedicated control processor. With such a system, a user may control various systems, such as lighting, HVAC, and security, from a single type of interface which uses a common methodology for interacting with the user. In general, such interfaces incorporate a display, typically a cathode ray tube (CRT) or liquid crystal display (LCD), which provides the user with several options for controlling one or more systems in the setting.
Different types of interfaces are used in menu driven systems. In some systems, a touch screen is utilized wherein the user presses a portion of the display screen to make a selection. The user's finger touch is detected and the display indicates which area of the display has been selected. Alternatively, conventional mechanical switches may be provided in proximity to the display screen. A graphical image on the display device directs the user to the appropriate push-button flanking the display. This approach is similar to that adopted in connection with many automated teller machines (ATMs).
Touchscreens are advantageous in that the area in which the user is prompted to make a selection is usually identical to the region which is actually acted upon by the user. For example, virtual buttons may be displayed which are selected on the same display area. On the other hand, touchscreens are relatively expensive and require a fair amount of support by processor and memory elements in order to function properly. Moreover, their use may not prove to be as convenient to the average user as simpler mechanical buttons which provide tactile feedback to the user, usually by a button which clicks when fully depressed.
The other above-mentioned approach of providing mechanical switches in proximity to a display screen overcomes these drawbacks, at least in part, in that the mechanical switches may provide tactile feedback. However, since the switches are offset from the display, this approach may require more care in use, particularly in comparison with a touchscreen. In particular, the offset between the switches and the display may make it difficult for a user to determine which button corresponds to a given option. As a result, the user may become disoriented and there may be a delay in the selection of a desired option as the user verifies which switch corresponds to the desired option. Such a delay is clearly unacceptable where the user is attempting to address certain high priority control systems, such as lighting systems or home security. Further, this problem may be exacerbated where the display screen contains more than a few simple commands. Thus, the combination display/push-button interface is limited by its nature in the amount of information that may be displayed and the command options that may be offered.
Accordingly, there is a need for an economical, menu-driven interface that permits the user to rapidly and efficiently select a desired function and enter a desired command. There is a related need for the interface to provide tactile feedback to signal the user that a command has in fact been entered.