Touch systems are well known in the art and typically include a touch screen having a touch surface on which contacts are made using a pointer. Pointer contacts with the touch surface, or pointer proximity to it (e.g., hovering), are detected and used to generate corresponding output pointer position data representing the areas of the touch surface where the pointer is located, whether by contact or by proximity. Touch systems generally fall into two broad categories: “active” touch systems and “passive” touch systems.
Active touch systems allow a user to generate pointer position data by contacting the touch surface with a special pointer that usually requires some form of on-board power source, typically batteries. The special pointer emits signals such as infrared light, visible light, ultrasonic frequencies, electromagnetic frequencies, etc. that activate the touch surface.
Passive touch systems allow a user to generate pointer position data by contacting the touch surface with a passive pointer and do not require the use of special pointers in order to activate the touch surface. A passive pointer can be a finger, a cylinder of some material, or any other suitable object that can be used to contact some predetermined area of interest on the touch surface. Because special active pointers are not necessary in passive touch systems, users need not be concerned with battery power levels or with pointer damage, theft, or misplacement.
Regardless of whether active or passive touch systems are utilized, each touch system may be adapted for use with displays of varying sizes, which may depend on, for example, the size of the event and the location (e.g., room, hall, etc.) at which it is held. When such events are held at presentation locations that do not have resident touch systems, it is necessary to transport the touch system to the location, deploy it, pack the touch system up at the end of the event, and then remove the touch system from the presentation location. This should all be done in a manner that facilitates both the safe transportation and the efficient deployment/removal of the touch system, while at the same time exhibiting reliable operability.
For example, U.S. Patent Application Publication No. 2007/0109278 to Moon describes an input apparatus and method in a portable terminal. The apparatus includes a display unit having pixels, an optical sensing unit for forming a grid corresponding to the pixels and producing location information of a pointer in the grid, a coordinate converter for converting the location information into coordinate information and computing a location of a cursor based on the coordinate information, and a controller for displaying the cursor at the computed location. The method includes identifying an operation mode of the input apparatus, and if the mode is a key input mode, operating optical sensors for key recognition, displaying a soft keypad, identifying an input location in response to a user input, finding a key value corresponding to the identified input location, and processing the found key value. If the mode is a cursor input mode, the method operates optical sensors for cursor recognition, displays a cursor wait screen, identifies an input location in response to a user input, computes coordinate values corresponding to the identified input location and finds a pixel corresponding to the coordinate values, and changes a color of the found pixel.
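The two operation modes Moon describes amount to a dispatch on how a sensed grid location is interpreted: as a key value in key input mode, or as cursor coordinates in cursor input mode. A minimal sketch of that dispatch follows; the function and parameter names are illustrative assumptions, not taken from the publication.

```python
def handle_input(mode: str, location: tuple, keymap: dict) -> str:
    """Interpret one optically sensed input location according to mode.

    mode     -- "key" for key input mode, "cursor" for cursor input mode
    location -- (column, row) cell of the optical sensing grid
    keymap   -- mapping from grid cells to key values (soft keypad layout)
    All names here are illustrative, not from the publication.
    """
    if mode == "key":
        # Key input mode: find the key value mapped to the touched cell.
        return keymap.get(location, "")
    elif mode == "cursor":
        # Cursor input mode: the sensed cell itself yields the cursor
        # coordinates at which the cursor is then displayed.
        col, row = location
        return "cursor@{},{}".format(col, row)
    raise ValueError("unknown mode: " + mode)
```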
U.S. Patent Application Publication No. 2004/0246229 to Yamada describes an information display system comprising an information processing apparatus, an information display apparatus which displays information held in the information processing apparatus on a display surface, and a pointing apparatus which points at an arbitrary position on a display image displayed by the information display apparatus.
U.S. Pat. No. 7,202,860 to Ogawa describes a coordinate input device that includes a pair of cameras positioned at the upper left and upper right of the display screen of a monitor. The cameras lie close to the plane extending from the display screen and view both a side face of an object in contact with a position on the display screen and a predetermined desktop coordinate detection area, capturing the image of the object within the field of view. The coordinate input device also includes a control circuit which, based on video signals output from the pair of cameras, calculates the coordinate value of a pointing tool that points to a position within the coordinate detection field, and transfers the coordinate value to a program of a computer.
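A two-camera arrangement of this kind locates the pointer by triangulation: each camera reports the angle of its sight line to the pointer, and the intersection of the two sight lines gives the contact coordinates. The sketch below is a generic illustration of that computation, not the patent's algorithm; the camera placement and angle conventions are assumptions.

```python
import math

def triangulate(baseline: float, angle_left: float, angle_right: float):
    """Locate a pointer seen by two corner cameras separated by `baseline`.

    angle_left  -- angle (radians) of the left camera's sight line,
                   measured from the baseline toward the touch area
    angle_right -- angle (radians) of the right camera's sight line,
                   measured the same way
    Returns (x, y) with the left camera at the origin and the right
    camera at (baseline, 0). Conventions are illustrative assumptions.
    """
    # Sight lines: y = x*tan(angle_left) and y = (baseline - x)*tan(angle_right);
    # solving the pair for their intersection yields the pointer position.
    tl, tr = math.tan(angle_left), math.tan(angle_right)
    x = baseline * tr / (tl + tr)
    y = x * tl
    return x, y
```

In the symmetric case (both cameras seeing the pointer at 45 degrees across a baseline of 2), the intersection lies at the midpoint, one unit into the touch area.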
U.S. Pat. No. 6,947,032 to Morrison et al. describes a system and method for determining pointer contacts on a touch surface including a touch surface to be contacted by a pointer. At least one imaging device having a field of view looks generally along the touch surface. At least one processor communicates with the at least one imaging device and analyzes images acquired by the at least one imaging device to detect the relative positions of a pointer and a reflection of the pointer therein and thereby determine if a pointer contact with the touch surface has been made.
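The contact test Morrison et al. describe exploits the fact that a camera looking along the touch surface sees both the pointer and its mirror reflection in the surface: when the pointer hovers, the two are separated, and on contact they converge. A minimal sketch of such a test, assuming image-row measurements in pixels and an arbitrary threshold (both illustrative, not from the patent):

```python
def is_contact(pointer_tip_y: float, reflection_tip_y: float,
               touch_threshold: float = 2.0) -> bool:
    """Decide contact vs. hover from an image looking along the surface.

    pointer_tip_y    -- image row (pixels) of the pointer tip
    reflection_tip_y -- image row of the tip of the pointer's reflection
    A hovering pointer and its reflection are separated in the image by
    roughly twice the hover height; on contact the two tips converge.
    The names and pixel threshold here are illustrative assumptions.
    """
    return abs(pointer_tip_y - reflection_tip_y) <= touch_threshold
```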
U.S. Pat. No. 6,828,959 to Takekawa et al. describes a coordinate-position input device that has a rectangular frame with a reflecting member, provided on the inner side of its four edges, for retro-reflecting light. Two optical units irradiate light toward the reflecting members and receive the reflected light. The frame can be detachably attached to a whiteboard via one or more mounting members. The two optical units are located at the two ends of one of the frame edges, and at the same time the two optical units and the frame body are integrated with each other.
U.S. Pat. No. 7,232,986 to Worthington et al. describes an apparatus for detecting a pointer within a region of interest that includes at least one pair of imaging devices. The imaging devices have overlapping fields of view encompassing the region of interest. At least one light source provides illumination across the region of interest and is within the field of view of at least one of the imaging devices. A filter is associated with the at least one imaging device whose field of view sees the light source. The filter blocks light projected by the light source to inhibit the imaging device from being blinded by the projected light.
U.S. Pat. No. 6,128,585 to Greer describes a sensor array that is positioned at a vantage point to detect and calibrate its reference frame to the external reference frame demarcated by light-emitting reference indicia. The sensor array encompasses a wide-view calibration field and provides data indicating the spatial position of light sources placed within the calibration field. A tetrahedron framework with light-emitting diodes at the vertices serves as a portable reference target that is placed in front of the feature sensor to be calibrated. The sensor array reads and calibrates the position of the light-emitting diodes at the vertices while the structured light of the feature sensor is projected onto the framework of the reference target. The structured light intersects with and reflects from the reference target, providing the feature sensor with positional and orientation data. This data is correlated to map the coordinate system of the feature sensor to the coordinate system of the external reference frame. A computer-generated virtual image display compares desired and actual sensor positions through a real-time feedback system, allowing the user to properly position the feature sensor.
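The mapping step Greer describes amounts to recovering the rigid transform (rotation plus translation) that carries the sensor's coordinate system onto the external reference frame, given reference points measured in both frames. A minimal two-dimensional sketch of such a recovery from two corresponding reference points follows; it is a generic illustration of the idea, not the patent's algorithm, and the names are assumptions.

```python
import math

def map_frames(src_a, src_b, dst_a, dst_b):
    """Recover the 2-D rigid mapping carrying the sensor frame onto the
    external frame from two reference points seen in both frames.

    src_a, src_b -- the two reference points in sensor-frame coordinates
    dst_a, dst_b -- the same points in external-frame coordinates
    Returns (theta, (tx, ty), apply), where theta is the rotation angle,
    (tx, ty) the translation, and apply maps a sensor-frame point into
    the external frame. Names are illustrative, not from the patent.
    """
    # The rotation is the difference in bearing between the two frames'
    # views of the segment joining the reference points.
    ang_src = math.atan2(src_b[1] - src_a[1], src_b[0] - src_a[0])
    ang_dst = math.atan2(dst_b[1] - dst_a[1], dst_b[0] - dst_a[0])
    theta = ang_dst - ang_src
    c, s = math.cos(theta), math.sin(theta)
    # The translation is whatever carries the rotated first point onto
    # its external-frame counterpart.
    tx = dst_a[0] - (c * src_a[0] - s * src_a[1])
    ty = dst_a[1] - (s * src_a[0] + c * src_a[1])

    def apply(p):
        return (c * p[0] - s * p[1] + tx, s * p[0] + c * p[1] + ty)

    return theta, (tx, ty), apply
```

Three dimensions and more than two reference points (as with the tetrahedron target's vertex LEDs) call for a least-squares fit over all correspondences, but the structure of the result, a rotation composed with a translation, is the same.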
It is therefore at least one object of the present invention to provide a novel portable interactive media presentation system.