Computer science researchers have been exploring ideas associated with “smart spaces,” or ubiquitous computing, for years. One of the most important challenges has been creating new, intuitive user interfaces that are suitable for ubiquitous computing applications but are not based on traditional personal computing platforms; these new devices and applications must be simple and intuitive to use. Donald Norman describes some of the challenges of designing such user interfaces in his book, THE INVISIBLE COMPUTER.
Many conventional computing systems, such as desktop and laptop computing systems, are capable of presenting a graphical user interface (GUI) that can be navigated using a pointing device, such as a mouse, trackball, touchpad, or touch screen. In this regard, the pointing device can be utilized to point to an object presented on a display screen, and to select one or more actions to perform based upon that object. For example, many pointing devices include two buttons, a left button and a right button, for selecting action(s) to take based upon an object pointed to by the pointing device. In Microsoft Windows operating systems, for example, the left button can be clicked once to “select” an object and provide an indication that the object has been selected. A subsequent click (or two clicks in quick succession) on the object then causes an action to be performed based upon the selected object; in Windows operating systems, for example, the next click can cause the computing system to initiate an application associated with the object. The right button, on the other hand, typically provides one or more less-frequently utilized capabilities. For example, after selecting an object, the right button can be clicked to, among other things, view the properties (e.g., of the application) associated with the object.
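The click semantics described above can be sketched in code. The following is a minimal, hypothetical model (all class and method names are illustrative, not any actual windowing API): a first left click selects an object, a subsequent left click activates it, and a right click exposes its properties.

```python
# Hypothetical sketch of conventional left/right-button semantics.
# Names are illustrative only.

class DesktopObject:
    def __init__(self, name, properties=None):
        self.name = name
        self.properties = properties or {}

class PointerInterface:
    """Dispatches left- and right-button clicks in the manner of a
    conventional desktop GUI."""

    def __init__(self):
        self.selection = None  # currently selected object, if any

    def left_click(self, obj):
        if self.selection is obj:
            # Object already selected: the next click performs the
            # default action, e.g. launching the associated application.
            return f"launch {obj.name}"
        # A first click merely selects the object.
        self.selection = obj
        return f"select {obj.name}"

    def right_click(self, obj):
        # The right button exposes less-frequently used capabilities,
        # such as viewing the object's properties.
        return f"properties of {obj.name}: {obj.properties}"

ui = PointerInterface()
doc = DesktopObject("report.txt", {"opens_with": "editor"})
print(ui.left_click(doc))   # → select report.txt
print(ui.left_click(doc))   # → launch report.txt
print(ui.right_click(doc))
```

The point of the sketch is that the same button carries different meanings depending on interaction state, which is the kind of convention a ubiquitous-computing interface cannot assume its users will carry over from the desktop.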
In an attempt to address the design challenges of user interfaces suitable for ubiquitous computing applications, there has been significant work in the area of point-and-click user interfaces and the application of point-and-click ideas to ubiquitous computing research. Traditional infrared (IR) remote controls are an obvious example; IR remote controls are now being developed to allow users to control a number of different entities from a single controller. Another example of the application of point-and-click ideas to ubiquitous computing research is the “Cooltown” research project within Hewlett-Packard Laboratories. Generally, in accordance with the Cooltown research project, users interact with entities by using wireless communication devices to retrieve uniform resource locators (URLs) identifying web resources, with which the wireless communication devices can thereafter interact.
Although previous real-world point-and-click techniques allow users to interact with entities in a predetermined manner, such as controlling the entity or linking to web resources, these techniques are inflexible. In this regard, most systems employing such techniques are single-purpose devices. For example, Cooltown uses IR to achieve limited, multi-application, point-and-click functionality. Cooltown protocols are used to send and receive URLs, and these URLs are assumed to point to HTML documents; Cooltown can thus be thought of as a sort of real-world web-browsing system. The flexibility of this approach is limited because it is based on URLs that point only to web resources, and it is very rare that a single reader can be used to read different tags associated with different applications. In some cases, such as laptop computers, IR can be used for multiple purposes (e.g., controlling peripheral devices, receiving data, etc.), but such applications typically do not involve point-and-click techniques to select the entities with which the laptop computers interact.
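The inflexibility described above can be illustrated with a hypothetical sketch (the function names and tag schemes are illustrative, not the actual Cooltown protocol): a reader that assumes every tag resolves to a web URL cannot serve tags belonging to other applications, whereas a reader that dispatches on the tag's scheme can route a single click to any registered application.

```python
# Hypothetical contrast between a URL-only resolver (as in the
# web-browsing-style approach described above) and a multi-application
# tag dispatcher. All names and schemes are illustrative.

def url_only_resolver(tag):
    """A reader that assumes every tag carries a URL to a web page."""
    if tag.startswith("http://") or tag.startswith("https://"):
        return f"browse {tag}"
    raise ValueError(f"unsupported tag: {tag!r}")

def multi_application_dispatcher(tag, handlers):
    """A single reader that routes a tag to the application registered
    for the tag's scheme."""
    scheme = tag.split(":", 1)[0]
    handler = handlers.get(scheme)
    if handler is None:
        raise ValueError(f"no application registered for {scheme!r}")
    return handler(tag)

handlers = {
    "http": lambda t: f"browse {t}",       # web browsing
    "light": lambda t: f"toggle {t}",      # e.g. control a lamp
    "print": lambda t: f"send job to {t}", # e.g. a printer service
}

print(multi_application_dispatcher("light:living-room", handlers))
# → toggle light:living-room
# url_only_resolver("light:living-room") would raise ValueError,
# because the URL-only reader serves exactly one application.
```

The dispatcher sketch shows what the single-purpose readers criticized above lack: a way for one reader to serve tags from many different applications.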