Most modern stadiums and live entertainment facilities or arenas (herein also collectively referred to as “venues”), which feature sporting events and concerts, typically employ large television screens that receive video images from a plurality of television cameras positioned to capture video at diverse locations within the stadium. The audience at a typical sporting event, for example, can generally view advertisements, instant replays, and other sports-related data on the large television screens within the sports stadium itself. Feeds are additionally generally provided from the cameras to announcers in a broadcast booth, permitting certain plays from the event to be replayed so that the announcers can comment on them; finally, a telecast is transmitted to the viewing audience, including some aspects of the captured video and data shown to the stadium audience.
Despite the availability of such large screen television monitors, venue event audience members still lack enhanced viewing options or perspectives within the stadium itself. To compensate for the lack of viewing options, sports and concert promoters often rent binoculars to audience members prior to or during the event. Such binoculars can permit the typical audience member to obtain a somewhat better, but limited, view of an event such as a baseball, basketball, football, or hockey game, but even these views are often obstructed by other audience members and are tied to only one perspective.
The large television screens placed in the stadium are typically linked to cameras that are either fixed or mobile, and the placement of the cameras about the stadium or venue is generally tied to an enterprise system. The movement of the game ball in a baseball or football game, for example, along with the players on the field, is dynamic and unpredictable and may not always be caught by the active camera having the best perspective. Thus, during a game, the large television screens typically provide only one view, which can be further obstructed by other players or officials, often destroying a critical angular view.
In addition, such large screens are often utilized to bombard audience members with advertisements, thereby cutting into data such as instant replays at a time when an audience member might otherwise wish to view instant replays, a current play or other event data. The audience members, therefore, essentially view the large screen at the behest of the camera operator and cannot select their own views or camera angles.
Based on the foregoing, the present inventors have found that such problems in venue environments can be solved through the use of hand held devices, such as PDAs, data/video-enabled cellular telephones, and other hand held wireless video-enabled devices. For example, the recent shift in the consumer electronics industry from an emphasis on analog technology to a preference for digital technology is largely based on the fact that the former generally limits the user to a role of a passive recipient of information, while the latter is interactive and allows the user to control what, when, and how he or she receives and manipulates certain information. This shift in focus has resulted in the development and increasingly widespread use of a digital device generically referred to as a “personal digital assistant” (PDA).
These devices are hand held computing devices (i.e., hereinafter referred to as “hand held devices” or “handheld devices”) that are becoming increasingly popular for storing and maintaining information. Although PDAs may be connected to a desktop personal computer or other PDAs via infrared, direct wire, or wireless communication links, PDAs and similar hand held devices, can be linked to remote networks, such as the internet, or local wireless resources, through available wireless communications techniques.
The most advanced data- and video-enabled wireless communication devices currently available in the marketplace take the form of a PDA (such as the Palm OS, Handspring OS, and Windows CE compatible hand held computers). Unlike personal computers, which are general-purpose devices geared toward refining and processing information, PDAs are designed to capture, store and display information originating from various sources. Additionally, while a certain level of skill is required to use a personal computer effectively, PDAs are designed with the novice and non-computer user in mind.
A typical PDA includes a microprocessor, memory unit, a display, associated encoder circuitry, and selector buttons. It may optionally contain a clock and infrared emitter and receiver. A graphical user interface permits a user to store, retrieve and manipulate data via an interactive display. A PDA may also include a calendar, datebook, and one or more directories. The calendar shows a month of dates organized as rows and columns in the usual form. The datebook shows one day at a time and contains alphanumeric text entered in free format (typically, with a time of day and an event and/or name). Each directory contains entries consisting of a name field and a free form alphanumeric text field that can contain company names, addresses, telephone and fax numbers, email addresses, etc.
Entries may be organized alphabetically according to the name field and can be scanned or searched for by specifying a specific sequence of characters in the name field. A menu displayed via the graphical user interface permits a user to choose particular functions and directories. Most PDAs come equipped with a stylus, which is a plastic-tipped pen that a user utilizes to write in, for example, a “graffiti area” of the display and tap particular graphically displayed icons. Each icon is indicative of a particular activity or function. Touch screen interfaces, however, are also increasingly being implemented with PDAs to permit a user to activate software modules in the form of routines and subroutines therein.
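The directory organization described above (entries with a name field and free-form text, kept in alphabetical order and searchable by a sequence of characters in the name field) can be illustrated with a brief sketch. This is a hypothetical example for clarity only; the class and field names are assumptions and do not correspond to any actual PDA software:

```python
from dataclasses import dataclass

@dataclass
class DirectoryEntry:
    """One directory entry: a name field plus free-form alphanumeric text."""
    name: str
    text: str = ""  # may hold company names, addresses, phone/fax, email, etc.

class Directory:
    """Keeps entries sorted alphabetically by name field, searchable by characters."""
    def __init__(self):
        self.entries = []

    def add(self, entry):
        # Insert and re-sort so entries stay organized alphabetically by name.
        self.entries.append(entry)
        self.entries.sort(key=lambda e: e.name.lower())

    def search(self, chars):
        # Return entries whose name field contains the given character sequence.
        chars = chars.lower()
        return [e for e in self.entries if chars in e.name.lower()]

d = Directory()
d.add(DirectoryEntry("Smith, Jane", "Acme Corp, 555-0100"))
d.add(DirectoryEntry("Adams, Bob", "bob@example.com"))
print([e.name for e in d.search("smi")])  # finds the "Smith, Jane" entry
```

The sketch mirrors the two operations the text names: alphabetical scanning (the sorted entry list) and searching by a specific character sequence in the name field.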
Attempts have been made to provide venue-based, interactive entertainment to enhance the fan experience at live events. Such attempts utilize touch-screen technology integrated directly into seats at outdoor or indoor arenas. Such devices, however, due to their integration with the viewer seat, can be easily damaged by audience members. Systems that incorporate such devices are also expensive because they literally require miles of cable.
Some recently constructed arenas, for example, that implement such seat-integrated technology require hundreds of miles of electronic cabling, including audiovisual, broadcast, and multiband lines. Such a plethora of large cables is expensive and requires extra space, which often cannot be found in older stadiums, or would require a greater expense to integrate into newly built stadiums. The cost of retrofitting an older stadium with such technology can be staggering. Additionally, many fans who attend games or concerts with such technology integrated directly into the seats may find such a feature distracting.
Another problem faced by venue promoters and arena owners who integrate fixed technology directly into the seat is that such technology can quickly become obsolete. If a new facility is fitted with such electronic/data intensive technology, the technology may become quickly outdated, requiring an expensive update and/or retrofit.
The present inventors thus realize that a solution to these problems lies in the use of wireless hand held devices. By utilizing modern technology integrated with hand held devices, on-demand live action, instant replays from multiple camera angles, and real-time team and venue information may be readily provided to fans without the expense and problems associated with present in-seat integrated technical environments. Additionally, it is anticipated that the deployment of venue-based systems facilitating the use of such devices would be relatively inexpensive, at least in comparison to seat integrated systems. Finally, such systems will provide the venue attendee with increased mobility and freedom of use within and throughout the venue environment.