As technology rapidly advances, more and more sources and types of information become available to consumers. For example, after TVs progressed from simple black-and-white sets to color sets, cable and satellite TV became available. Most recently, internet-enabled TVs became available to provide both TV and internet content. In a fast-paced society such as ours, consumers are looking for ways to enhance the overall viewing experience and to absorb more of the available information from various sources in less time.
One of the simplest methods of obtaining more information is to use multiple independent displays, such as having two TVs tuned to two separate channels, or two closed circuit cameras (e.g., security cameras) focused on separate areas. The problem with this method is that the person viewing the information may have to use various non-cooperating controllers for controlling the different devices, and may even have to physically change locations after viewing one display in order to be able to view another display. Additionally, this type of method usually involves fixed displays, as opposed to at least one portable display. And though it is possible to use a single universal remote control to control both TVs, additional problems exist, including the cost of multiple TVs or monitors, the physical space required for the TVs, and the availability of other electronics hardware such as cable receivers, closed circuit wiring, etc.
Other multiple-display systems have displays in close physical proximity to each other and may display related data, but if integration of the data is to occur, it must be accomplished manually, as the underlying systems are not capable of doing so. That is because the physical display devices are each dedicated to providing a limited amount of information related to a limited topic of interest. Examples of these types of systems include radar screens and controls in a cockpit of an airplane, and medical instruments monitoring various vital signs of a patient in a hospital.
One method of enhancing the viewing experience involves using multiple displays to simultaneously display multiple duplicate data streams. This is common in electronics stores where many TVs are all tuned to the same channel, or in exercise classes where many TVs are all playing the same exercise video tape. Though perception of the data stream (i.e., TV program or video tape) might be enhanced in such a case, the viewer is only viewing multiple copies of the same data stream and does not benefit from access to data streams of differing content.
Other multiple-display systems display separate parts of a single image, and the displays taken as a whole are designed to represent the desired image. The displays are arranged in a predetermined pattern such as side-by-side, in a matrix, or even to form a cylindrical "circle vision" theater. Again, though the viewing experience is altered and perhaps enhanced, the viewer is only viewing what is designed to be a single image from multiple data streams. In fact, these systems may even detract from the desired image perception due to the gaps between the several display screens and/or the imperfect placement of the various parts of the overall image within the various display screens.
Another attempt to provide consumers with a method of viewing multiple data streams simultaneously to obtain more information is through the use of windows-type operating systems used with PCs. The user may swap between multiple windows to view various data streams. However, these systems are designed under the assumption that the viewer concentrates on a single application at a time. Thus, the same physical display is used for the multiple windows that represent various applications. A particular window may be enlarged to cover the entire display, but then the benefit of being able to view multiple data streams substantially simultaneously is lost. Furthermore, the individual applications are typically independent of each other and do not communicate with each other, except perhaps during data transfer. Even where multiple windows are viewed simultaneously, such as during video conferencing where a person may view multiple conference "attendees" in various windows, the images of the attendees all appear on the same physical display, and the more attendees there are, the less room there is for each to be displayed.
Various systems also exist that attempt to solve the problem of viewing multiple data streams substantially simultaneously with two or more cooperating displays. A common system is a PIP (picture-in-picture) system, which allows two TV programs to be viewed at once, for example. Though the viewer may "channel surf" a first data stream without affecting the overall image produced by the second data stream, both data streams are displayed on the same physical display, and therefore the viewing experience of each is somewhat degraded. For example, a primary program may be playing on the entire physical TV display while a secondary program (the PIP) is playing in a small area in the corner of the display. The primary program is degraded due to the screen real estate used up by the secondary program, and the secondary program is degraded because it is not able to occupy the full size of the physical display.
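The PIP behavior described above can be illustrated with a minimal sketch. The following is not any particular system's implementation; the function name, the use of 2-D pixel arrays for frames, and the fixed corner placement are all assumptions made purely for illustration. It shows the trade-off noted above: the secondary frame is confined to a small region, and the pixels it covers are lost from the primary frame.

```python
def composite_pip(primary, secondary, margin=10):
    """Overlay the 'secondary' frame in the bottom-right corner of 'primary'.

    Frames are 2-D lists of pixel values. The result illustrates why both
    programs are degraded: the primary loses the covered region, and the
    secondary occupies only a fraction of the physical display.
    """
    # Work on a copy so the caller's primary frame is untouched.
    frame = [row[:] for row in primary]
    prim_h, prim_w = len(primary), len(primary[0])
    sec_h, sec_w = len(secondary), len(secondary[0])
    # Anchor the inset in the lower-right corner, inset by 'margin' pixels.
    top = prim_h - sec_h - margin
    left = prim_w - sec_w - margin
    for y in range(sec_h):
        for x in range(sec_w):
            frame[top + y][left + x] = secondary[y][x]
    return frame
```

Note that both data streams still end up in the single returned frame, which is the limitation of sharing one physical display.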
Similarly, existing TV interfaces provide a wide variety of available functions such as data display (e.g., to adjust attributes of the TV and related devices), advertisements, internet browsing, alternate channel viewing, etc. But the additional functions are all managed by assigning different amounts of the same physical screen to the additional function or functions. Because all operations performed by the device share the same physical screen real estate, combined activities restrict the display potential of each other, unless each activity occurs serially, which would defeat the entire purpose of allowing simultaneous viewing of multiple data streams in the first place. Moreover, the resolution of standard TV display formats such as NTSC (the American standard) and PAL (the European standard) is not well suited to displaying static text.
Another type of system that involves data integration to some extent is a computer network, or even simply a remote PC communicating with a host. Software synchronizes the data files stored on each device, and files can then be downloaded from the host to the remote PC, edited remotely on the PC, and then transferred back to the host. However, this setup is designed primarily for one or the other device to be used independently. The simultaneous use of both devices occurs only during file transfer, or in some cases to achieve the same type of extended screen real estate for a single-image display that occurs with two or more displays side-by-side.
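The host/remote synchronization workflow described above can be sketched as a simple last-writer-wins exchange. This is a hypothetical illustration only, not the method of any particular product: the function name, the dictionary representation of a file store, and the version counters used to decide which copy is newer are all assumed for the example. It makes concrete the point that the two devices cooperate only at the moment of transfer, and otherwise operate independently on their own copies.

```python
def sync(host, remote):
    """Two-way, last-writer-wins synchronization between two file stores.

    Each store is a dict mapping filename -> (version, content). Files
    missing from one side are copied over; where both sides have a file,
    the copy with the higher version counter wins. Outside this call,
    each device edits its own copy independently.
    """
    for name in set(host) | set(remote):
        h = host.get(name)
        r = remote.get(name)
        if h is None or (r is not None and r[0] > h[0]):
            host[name] = r      # remote copy is newer or host lacks it
        elif r is None or h[0] > r[0]:
            remote[name] = h    # host copy is newer or remote lacks it
```

After the call, both stores hold identical contents, but during normal use neither device's display contributes anything to the other's.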
Thus it would be desirable to provide systems and methods for using two or more cooperating but physically independent displays for achieving enhanced viewing and/or browsing of data on each display, without affecting the viewing and/or browsing of data on the other displays.