Bibliography
In the description that follows, reference will be made to selected publications and other materials by bracketed citations consisting of an author's name and the year of publication (e.g. [Ullmer 2000]). The full citations for these references are listed below:
[Auld 1990] Auld, B. A. (1990). Acoustic Fields and Waves in Solids, Vol. I & II. 2nd Edition, Krieger Publishing Company, February 1990.
[Devige 2003] Devige, F., Nikolovski, J-P. (2003) Accurate Interactive Acoustic Plate. US Patent Application No. US2003/0066692 A1, Apr. 10, 2003.
[Dietz 2001] Dietz, P., Leigh, D. (2001) “DiamondTouch: A Multi-User Touch Technology” in Proceedings of the 14th Annual Symposium on User Interface Software and Technology (UIST '01), ACM Press, pp. 219-226.
[Fitzmaurice 1995] Fitzmaurice, G., Ishii, H., Buxton, W. (1995) “Bricks: Laying the Foundations for Graspable User Interfaces” in Proceedings of Conference on Human Factors in Computing Systems (CHI '95), ACM Press, pp 442-449.
[Grant 2002] Grant, K. D., Winograd, T. (2002) “Flexible, Collaborative Organization on a Tabletop” in CSCW 2002 Workshop on Co-located Tabletop Collaboration: Technologies and Directions. New Orleans, La., USA, November 2002.
[Ing 1998] Ing, R. K., Fink, M. (1998) “Time-Reversed Lamb Waves” in IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, Vol. 45, No. 4, pp. 1032-1043.
[Jordà 2003] Jordà, S. (2003) “Sonigraphical Instruments: From FMOL to the reacTable*” in Proceedings of the 3rd Conference on New Interfaces for Musical Expression (NIME 03), Montreal, Canada, pp. 70-76.
[Kobayashi 1990] Kobayashi, K., Taniishi, S., Kamono, T., Kaneko, K., Yoshimura, Y., Yanagisawa, R. (1990) Coordinate Input Apparatus. U.S. Pat. No. 4,980,518.
[Omojola 2000] Omojola, O., Post, E. R., Hancher, M. D., Maguire, Y., Pappu, R., Schoner, B., Russo, P. R., Fletcher, R., Gershenfeld, N. (2000) “An Installation of Interactive Furniture” in IBM Systems Journal, Vol. 39, Nos. 3&4, pp. 861-879.
[Paradiso 2002] Paradiso, J., Leo, C. K., Checka, N., Hsiao, K. (2002) “Passive Acoustic Sensing for Tracking Knocks Atop Large Interactive Displays” in Proceedings of the 2002 IEEE International Conference on Sensors, Vol. 1, pp. 521-527.
[Paradiso 2005] Paradiso, J., Leo, C-K. (2005) “Tracking and Characterizing Knocks Atop Large Interactive Displays” in Sensor Review, Vol. 25, No. 2, pp. 134-143.
[Patten 2002] Patten, J., Recht, B., Ishii, H. (2002) “Audiopad: A Tag-based Interface for Musical Performance” in Proceedings of Conference on New Interface for Musical Expression (NIME '02), Dublin, Ireland, 2002.
[Reynolds 2001] Reynolds, M., Schoner, B., Richards, J., Dobson, K., Gershenfeld, N. (2001) “An Immersive, Multi-User, Musical Stage Environment” in Proceedings of SIGGRAPH '01, ACM Press, pp. 553-560.
[Roberts 1972] Roberts, L. (1972) “Aloha Packet System With and Without Slots and Capture” in Stanford Research Institute, Advanced Research Projects Agency, Network Information Center, Tech. Rep. ASS Note 8, 1972.
[Rosenfeld 2003] Rosenfeld, D., Perlin, K., Zawadzki, M. (2003) “Planar Manipulator Display” in SIGGRAPH 2003 Emerging Technologies, San Diego, Calif., July 2003.
[Ullmer 2002] Ullmer, B. (2002) Tangible Interfaces for Manipulating Aggregates of Digital Information. Ph.D. Dissertation, Massachusetts Institute of Technology, 2002.
[Underkoffler 1999] Underkoffler, J., Ishii, H. (1999) “Urp: A Luminous-Tangible Workbench for Urban Planning and Design” in Proceedings of Conference on Human Factors in Computing Systems (CHI '99), ACM Press, pp. 386-393.
[Walczak 2003] Walczak, M., McAllister, M., Segen, J., Kennard, P. (2003) “Dialog Table” in Strangely Familiar: Design and Everyday Life exhibition, Walker Art Center, Minneapolis, June 2003.
[Yoshimura 1992] Yoshimura, Y., Tanaka, A., Kaneko, K. (1992). Coordinates Input Apparatus. U.S. Pat. No. 5,097,415.
Media Tables
A media table, occasionally referred to as an interactive workbench within work-related contexts, is a horizontal surface upon which the spatial configuration of tagged objects is computationally interpreted and then augmented with coincident visual output. The visuals are usually provided by rear or front projection. Object tracking provides the ability to identify and associate unique physical artifacts to different elements or functions. In this way, some of the information within the interactive environment can be off-loaded from a purely graphical form to a physical-world representation. This important characteristic of tangible interfaces was noted by Ullmer and Ishii in [Ullmer 2000]. The graspable artifacts of tangible platforms are seen as physically embodying digital information, and they act as both representations and controls within the interactive environment.
Tables provide an ideal space for people to engage in shared and sociable interactions. For media tables, this implies a broad range of physical contexts that would be well-suited for their use, including the home, school classrooms, and conference rooms. One could also imagine placing media tables in public spaces and social environments, such as cafes, shops, or museums. As digital entertainment applications evolve, there is an increasing need to develop a general purpose interactive tabletop display surface that can support a diverse range of applications for these types of spaces, including media asset management, story construction, digital game play, and multimedia learning.
The shared living space found in most homes provides a natural arena for sociable interactions around tables. Media tables in the home may have uses within the domain of arts and entertainment, including specific applications such as game play, storytelling, and media browsing and organization. A media table for the living room needs to provide at least the size and scale of a typical game-board, and should support multiple points of control and multiple applications while ergonomically accommodating several users engaged in sociable interactions from different sides. In particular, a media table for the home might take the form of the low table that is often a living room's centerpiece, serving as a surface for depositing reading materials, playing games, assembling puzzles, sorting photographs, and other shared activities. While the need for tabletop display surfaces seems to have been recognized by a number of researchers who have explored various interactive applications on tabletops, none of the early attempts point to a general purpose, economically viable tabletop display and interaction platform. In particular, existing media tables do not provide an extensible architecture that can support the diverse range of applications that would be required for everyday use by many people.
The functional criteria required for a media table can be grouped into the following three categories: (1) object sensing, (2) object management and identification, and (3) table setup and display. Each of these criteria is briefly summarized below.
Object Sensing
While technologies such as GPS that track objects on a global scale are increasingly accurate and reliable, precisely locating objects on the surface of a table remains a difficult technical problem. A variety of different approaches have been tried, using techniques ranging across optical, acoustic, and radio-frequency sensing. It should be noted that many optical or acoustic approaches that operate through the air can pose problems of occlusion if one or more objects block the view of a receiver or transmitter. In these cases, it becomes difficult to support multiple continuously interactive objects. Another problem that often arises is the scalability of the interactive surface in terms of size or shape. Approaches that use antenna grids rather than a small set of fixed receivers or transmitters generally require tiling in order to scale in size, which can be costly and generally adds an extra level of complexity to the design of the electronics.
The important considerations for object sensing on an interactive media surface that provides multiple points of control are: the ability to track multiple objects at once, the ability to avoid interference between the tracked objects and between these objects and their surroundings, and the ability to scale the sensing surface to different sizes and aspect ratios.
Object Management & Identification
Interactive tables face the problem of dealing with large numbers of physical objects across many different applications and platforms. While unique identifiers and an extensible namespace are common within the digital realm, getting computers to uniquely identify objects in the physical world is a difficult problem. Technical solutions such as barcodes and Radio Frequency Identification (RFID) tags give physical objects a unique identity that can be understood by a computer. This unique identity is retained regardless of where the object is moved in physical space.
On a media table, different interactive objects might have different physical properties or shapes depending on their application and use. For this reason, the objects need to be uniquely and digitally identifiable, and the means of object identification must be able to function together with the table's position sensing technology. Moreover, users might want to move their interactive objects from one table to another, meaning that all tables need to have a shared understanding of how objects are identified and need to provide an application manager that can organize the diverse applications that run on the platform. The system should include an application programming interface (API) to allow programmers to develop different applications for the table and associate them with particular (potentially customized) sets of objects. Finally, if different types of objects are to serve unique purposes within different application environments, an interactive table should ideally provide a means for application designers to customize or tailor interactive objects to their particular applications in terms of physical form or functionality.
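As one illustration of what such an object-management layer might look like, the following sketch pairs a namespaced unique identifier with an application manager that routes sensed object positions to the application that claimed each object. All class and method names here are hypothetical, invented for illustration; they are not part of any actual table's API.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class ObjectID:
    """A unique identifier drawn from an extensible namespace."""
    namespace: str  # e.g. a reverse-DNS style application family
    serial: int     # unique within the namespace

@dataclass
class TableApplication:
    name: str
    objects: set = field(default_factory=set)  # the ObjectIDs this app owns

class ApplicationManager:
    """Routes object position updates to whichever application claimed the object."""

    def __init__(self):
        self._claims: dict = {}  # ObjectID -> TableApplication

    def register(self, app: TableApplication, ids):
        """Associate a (potentially customized) set of objects with an application."""
        for oid in ids:
            app.objects.add(oid)
            self._claims[oid] = app

    def on_object_moved(self, oid: ObjectID, x: float, y: float):
        """Dispatch a position update; returns the handling app's name, if any."""
        app = self._claims.get(oid)
        return app.name if app else None
```

Because the identifier carries its namespace with it, an object registered on one table keeps the same identity when placed on another, which is the portability property described above.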
Past approaches to interactive tables have not been able to provide adequate systems for object identification and management. Optical-based systems have generally not been able to provide an easily extensible set of interactive objects, and customizations are difficult to manage or scale. Approaches that use actuated RFID tags typically provide only a small set of interactive objects.
The important considerations for object management and identification are: each object should be designated by a unique identifier that can be used across an extensible namespace, each object should be movable from one interactive surface to another and retain its unique identity, and it should be possible for objects to have new capabilities, shapes and forms.
Table Setup & Display
Many of the technical approaches to interactive tables have required a large amount of external infrastructure, such as sensors placed around the room or overhead projection systems to provide a visual display on the table's surface. While these systems work reasonably well for prototyping or demonstration purposes within a research laboratory, they are not viable options for a media table that is designed for use within a home setting. For the latter case, both the sensing and display technologies need to be encased within the interactive table itself. Ideally, the table should be a single integrated unit that is not encumbered by too many wires or external connections (aside from power) and can be easily moved around a typical living room.
The important considerations for table setup and display are: the sensing mechanism should be fully contained within the table, requiring no external infrastructure such as cameras or separate antennas; a display mechanism should be integrated inside the table with the interaction surface, requiring no external infrastructure such as an overhead projector.
Prior Interactive Table Systems
Early work on interactive tables began with the ActiveDesk at the University of Toronto, which allowed two-handed input through physical artifacts called “bricks” and was used with a drawing application called GraspDraw [Fitzmaurice 1995]. The next major steps took place at the MIT Media Lab, with the metaDesk and I/O Bulb projects [Ullmer 1997, Underkoffler 1999]. These systems differed from the ActiveDesk in that they were designed for collaborative use. Applications included geographical visualization, simulation of holographic setups, and urban planning. In both systems, the tracking of multiple objects was achieved through computer vision, and the display was accomplished with projected graphics from the rear (metaDesk) or the front (I/O Bulb).
Since then, interactive tables have been explored by an increasing number of researchers for different applications in a variety of physical contexts. The Sensetable project at the M.I.T. Media Lab has been used for applications in a number of areas, including supply-chain visualization and musical performance [Patten 2001, 2002]. The DiamondTouch table from MERL allows multiple users to interact simultaneously via touch, and has been used for applications such as a multi-user map viewer [Dietz 2001]. Other tables being developed at university research labs include the iTable from Stanford, which has been used for information organization [Grant 2002], the reacTable* from Pompeu Fabra University, which is used as a musical instrument [Jordà 2003], and the Planar Manipulator Display from NYU, which has been used for simulating furniture layouts in an interior space [Rosenfeld 2003]. In the last of these, the interactive objects are equipped with motors, so their movements can be not only sensed but also controlled by the computer. Interactive tables have also been used within interactive art projects, such as the sensor table developed by Post and collaborators from the MIT Media Lab for an installation at the Museum of Modern Art in New York. The sensing surface in this case was a pixelated capacitive matrix that could detect and track bare hands through capacitive loading [Omojola 2000]. Another example is the Dialog Table, which was commissioned by the Walker Art Center as a permanent installation that promotes social interactions among visitors and provides access to the museum's multidisciplinary collections [Walczak 2003].
It is worth noting here that none of the tables which use multiple object tracking have been realized in a manner that can scale beyond a research prototype or single-platform project in order to gain widespread and possibly commercial use. Nevertheless, the idea of an interactive table is gradually making its way into the consumer market, and Hitachi has announced the development of a touch-sensitive tabletop display which it plans to commercialize within the coming year, starting at a price of $20,000 US [Hitachi 2004].
Acoustic tracking is believed to be the most promising technology for determining the position of multiple objects on a media table's interactive surface. In acoustic tracking systems, objects can be located by embedding ultrasonic transmitters inside the tracked objects. Ultrasonic receivers can then be placed around the sensing area and used to pick up the short acoustic signals emitted by the objects. An object's location can be triangulated based on the time-of-flight of the signal from the transmitter to each receiver. The acoustic signal is typically transmitted through the air, which can result in errors if objects obstruct the path between a transmitter and a receiver. This approach has been used to locate performers on a stage [Reynolds 2001] as well as in a number of electronic whiteboard systems such as Mimio by Virtual Ink, in which an ultrasonic tracking array is positioned along the upper left edge of the board and the acoustic signals are transmitted from special sleeves that hold the whiteboard markers. In another system from the MIT Media Lab, ultrasonic positioning has been used to track knocks or taps on large glass surfaces [Paradiso 2002, 2005]. In this case, the receivers are affixed to the back of the glass panel and the acoustic signal travels through the glass rather than through the air, which eliminates potential problems of occlusion. This method provided the inspiration for the approach used in TViews, the embodiment of the invention described below, but differs in that the TViews system places the receivers in the objects in order to allow the system to scale to larger numbers of tracked objects without sacrificing the position update rate. In the commercial realm, two companies from France, Intelligent Vibrations [Devige 2003] and Sensitive Object [Ing 1998], have both developed systems for tracking taps on large flat surfaces such as windows and tables, and their approaches are similar to the work by Paradiso described above.
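The time-of-flight triangulation described above can be sketched numerically. The following is a minimal least-squares position solver, assuming four hypothetical receivers at the corners of a table and an illustrative through-air sound speed (propagation speeds through glass differ, and a real system must also handle synchronization and measurement noise):

```python
import numpy as np

# Hypothetical receiver positions (meters) at the corners of a 1.0 x 0.6 m table
receivers = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 0.6], [1.0, 0.6]])
SPEED = 343.0  # illustrative speed of sound in air (m/s)

def locate(tofs):
    """Estimate (x, y) from times-of-flight to each receiver.

    Each receiver i gives a circle (x-xi)^2 + (y-yi)^2 = di^2 with
    di = SPEED * ti.  Subtracting the first circle's equation from the
    others cancels the quadratic terms, leaving a linear system:
        2(xi-x0)x + 2(yi-y0)y = d0^2 - di^2 + |ri|^2 - |r0|^2
    which is solved in the least-squares sense.
    """
    d = SPEED * np.asarray(tofs)            # distances to each receiver
    A = 2.0 * (receivers[1:] - receivers[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(receivers[1:] ** 2, axis=1)
         - np.sum(receivers[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos
```

With more receivers than unknowns the system is overdetermined, so the least-squares solution also damps small timing errors rather than failing outright.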
Intelligent Vibrations has designed an “Internet Table” with an embedded display that is designed for bars and internet cafes. Users interact by tapping on the table surface with their fingers or other objects such as pens and cutlery. Canon has developed a similar approach for tracking a vibrating stylus on the surface of a display tablet using the acoustic propagation time of the signal through a glass plate placed above the display surface [Kobayashi 1990, Yoshimura 1992].