With the advent of more powerful portable computing devices and the growth of electronic distribution of information, there is a continued desire to present such information in a convenient and familiar format that is portable and similar to traditional media such as paper-based books. For example, news articles, technical publications, essays, books, and other textual and graphical information are widely circulated on the Internet in various digital formats. Devices capable of displaying such information have included desktop computers, laptop computers, handheld personal digital assistants (PDAs), and electronic tablets. Within such devices, display systems have included cathode ray tube (CRT) displays or flat-panel matrix displays, such as liquid crystal displays (LCDs) and plasma displays, coupled to a video driver device. Some of these display systems have been touch-sensitive, portable, and capable of displaying various image types (e.g., text, static graphics, video, etc.). However, these devices do not lend themselves to the familiar look, feel, and comfort that many people associate with a book or similar non-electronic device, partly because only a single display screen is generally used.
Multiple display systems have also existed. One example is airport flight information systems, which repeat the same image on multiple, geographically dispersed display devices. Another is desktop computers employing multiple displays to expand the available screen space. However, these systems are not portable, do not lend themselves to the familiar look, feel, and comfort associated with a book or similar device, and do not provide flexibility in the range of possible uses of multiple display devices for displaying images.
Portable computing devices have been developed, such as PDAs, which employ a single flat-panel display, and electronic books, which employ two flat-panel displays situated next to each other. While these devices provided some of the functionality, look, and feel associated with a book, they did not provide flexibility in the range of possible uses for such a portable, microprocessor-based device. For instance, such devices were generally embedded systems that did not permit customization or versatility, particularly with regard to the display of images on the displays. One aspect of this inflexibility is that the orientation of an image on a display could not be easily changed or altered.
Devices with single or multiple displays have used a distinct display controller for each display. Generally, a display controller included an input/output interface, random access memory, a processor, logic circuitry, and a display driver. The input/output interface managed communications between the display controller and the larger computing device. The random access memory stored pixel data, comprising individual pixel addresses and pixel color data, for the image to be displayed. A processor (e.g., a display processor) relieved the main processor (e.g., a central processing unit) of the larger computing device of computing changes to the pixel data. Logic circuitry retrieved pixel data from the random access memory and forwarded it to the display driver. The display driver, in turn, controlled the display device, which displayed the visible image. The display driver, or refresh output, generally updated the display in a line-by-line technique referred to as a “scan mode”. To accommodate multiple displays showing distinct images, it was common to employ a separate display controller for each display, each with its own memory storage, display processor, input/output interfaces, etc. In this configuration, the operating software and main processor were burdened with determining which display among the multiple displays should display each pixel of an image.
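The per-display controller arrangement described above can be sketched as follows. This is a minimal illustrative model, not an implementation from any actual device: the class and function names, the display dimensions, and the side-by-side routing scheme are all assumptions introduced for illustration.

```python
class DisplayController:
    """Illustrative model of one display controller: its own random
    access memory holding pixel data (one color value per pixel
    address), plus a line-by-line "scan mode" read-out that would
    feed a display driver."""

    def __init__(self, width, height):
        self.width = width
        self.height = height
        # Framebuffer in the controller's own RAM, indexed by
        # pixel address (y * width + x); 0 represents black.
        self.framebuffer = [0] * (width * height)

    def write_pixel(self, x, y, color):
        # The display processor updates pixel color data in memory.
        self.framebuffer[y * self.width + x] = color

    def scan_line(self, y):
        # Logic circuitry retrieves one line of pixel data for the
        # display driver's line-by-line "scan mode" refresh.
        start = y * self.width
        return self.framebuffer[start:start + self.width]


# With a separate controller per display, the main processor must
# decide which display receives each pixel -- the burden noted above.
# Hypothetical setup: two 4x2 displays placed side by side.
controllers = [DisplayController(4, 2), DisplayController(4, 2)]

def route_pixel(global_x, y, color):
    # The main processor maps a global x coordinate to a controller
    # and a local coordinate on that controller's display.
    idx, local_x = divmod(global_x, 4)
    controllers[idx].write_pixel(local_x, y, color)

route_pixel(5, 1, color=0xFF0000)  # lands on the second display
```

Because each controller owns its memory and refresh logic, every pixel write must pass through this software routing step, which is the overhead the separate-controller configuration imposes on the operating software and main processor.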