(1) Technical Field
The present invention relates to the fields of computer vision, isochronous networking, parallel computing, augmented reality, media synchronization architecture, and computer game architecture. More specifically, the present invention pertains to a method and system for real-time, group interactive, augmented-reality area monitoring, which is suitable for enhancing the enjoyment of entertainment events performed in an area being monitored. The system is configured to allow a user to interactively augment, in real-time, the display of an entertainment event with user-inputted doodles, user requested information and statistics, and user selected viewing angles. In addition, the system allows a user to interact in real-time with other users viewing the entertainment event.
(2) Discussion
Currently, watching a sporting event from an arena or stadium luxury box is a passive experience. There is no way for the sports fan to interact with information about the game or with other similar sports fans. Therefore, there exists a need in the art for a real-time, group interactive, augmented-reality area monitoring system, which is suitable for enhancing the enjoyment of entertainment events performed in an area being monitored, by allowing the users to interact remotely with each other and with information related to the entertainment events.
In recent years, many systems have been proposed to allow fans to choose their own camera angles at entertainment events and to compose their own highlight videos of the events. However, none of the proposed systems is both interactive and real-time, and none is designed to be used in the arena. As discussed in "Sports Use Technology and T.L.C. to Hold On to Fans," The New York Times, Jul. 31, 2001, by J. Longman, the News Corporation's British Sky Broadcasting (BSkyB) service SkySports in the U.K. currently provides the capability for fans to choose their own camera angles for soccer broadcasts. However, there is no indication that this system can augment the video with geolocated information interactively requested by the fans, as the present invention does. In the 2001 championship series, the National Basketball Association positioned robotic cameras around the arena and allowed fans to compose their own highlight videos ten minutes after each quarter ended, as discussed in the same article by Longman. However, even in this case, there is no indication that fans were provided the capability to augment the video with requested information, as the present invention does.
Sportvision has several innovative augmented-reality products, such as 1st and Ten Line, Virtual Caddy, and NASCAR-in-Car, but there is no indication that any of these products is interactive with the fans. Furthermore, according to R. Cavallaro in "The FoxTrak Hockey Puck Tracking System," IEEE Computer Graphics and Applications, March-April 1997, pp. 6-12, there is no indication that Fox Sports Productions' FoxTrak is interactive with the fans either. In addition, the Graphical Video System of Sharir et al., Orad Hi-Tec Systems Limited, Apr. 30, 2002 (U.S. Pat. No. 6,380,933), is not a real-time system.
Several companies provide information-intensive services, such as Quokka Sports, CBS SportsLine, and SkySports, which deliver in-depth sports information to subscribers. Quokka goes even further than the others and provides interesting graphical representations of its content. However, none of these companies provides these capabilities for live action.
In addition, several companies, such as SkySports and many others, provide electronic community services for sports discussion, but none of these companies integrates this discussion with the arena experience, live video, or augmented displays.
Other prior art techniques use augmented-reality technology in sports broadcasting, such as the art discussed by Welsh et al. in "Computerized video imaging system for creating a realistic depiction of a simulated object in an actual environment" (U.S. Pat. No. 4,970,666), granted Nov. 13, 1990. Here, Welsh et al. make broad claims about merging virtual images into the real world. However, the patent contains no mention of tracking, does not integrate the system with an electronic community service, and does not provide wireless access to such a community.
Furthermore, Princeton Video Image has several patents on virtual billboards, including the one discussed by Rosser et al. in "System and method for downstream application and control electronic billboard system," Princeton Video Image, Inc., Aug. 6, 1996 (U.S. Pat. No. 5,543,856). However, these patents deal with geolocation of images on stationary objects, and the patented systems are not interactive with the viewer. Also, the FoxTrak hockey puck system patented by Honey et al. (U.S. Pat. No. 6,154,250) tracks objects in real-time and in real video, and places graphical objects in real-time video; however, it does not allow fan interaction with the tracked objects.
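The georegistered graphics insertion performed by these prior art systems rests on standard pinhole-camera projection: a tracked 3D world point is mapped into video pixel coordinates through a 3x4 camera matrix. The following sketch is purely illustrative and is not taken from any of the cited patents; the matrix values and world point are invented for the example.

```python
def project(P, X):
    """Project a 3D world point X into image coordinates using a
    3x4 camera projection matrix P (pinhole model, homogeneous math)."""
    Xh = list(X) + [1.0]                                        # homogeneous 3D point
    x = [sum(P[i][j] * Xh[j] for j in range(4)) for i in range(3)]
    return (x[0] / x[2], x[1] / x[2])                           # perspective divide

# Hypothetical camera: focal length of 2 normalized units, no principal-point offset.
P = [[2.0, 0.0, 0.0, 0.0],
     [0.0, 2.0, 0.0, 0.0],
     [0.0, 0.0, 1.0, 0.0]]

# A tracked object at world coordinates (1, 2, 4) projects to image point (0.5, 1.0).
print(project(P, (1.0, 2.0, 4.0)))
```

A broadcast system repeats this mapping every frame with the current estimated camera pose, which is what keeps an inserted graphic "attached" to a moving tracked object.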
Sportvision's 1st and Ten Line (U.S. Pat. No. 6,266,100) is a real-time system that tracks a stationary object (the field) and handles occlusion by players well; however, it is not interactive with fans. In addition, Sportvision's K Zone computer-generated strike zone works on replay video and inserts georegistered graphics into the video, but it is not interactive with fans, as discussed by A. Guéziec in "Tracking Pitches for Broadcast Television," Computer, March 2002, pp. 38-43.
The system of Sharir et al., discussed in the patent "Graphical Video System," Orad Hi-Tec Systems Limited, Apr. 30, 2002 (U.S. Pat. No. 6,380,933), supports generation and insertion of virtual players; however, it works only from recorded video and is not a real-time system.
Other prior art techniques exist in multi-user, real-time augmented reality, such as the system discussed by Daily et al. in "Multi-User Real-Time Augmented Reality System and Method," Hughes Electronics Corporation, Nov. 12, 2001 (U.S. Pat. No. 6,317,127). However, none of these techniques claims to provide interaction with the video content.
In addition, geometry extraction from still images using multiple-view geometry has been shown in "Self-Calibration and Metric 3D Reconstruction from Uncalibrated Image Sequences," Ph.D. dissertation, Katholieke Universiteit Leuven, Heverlee, Belgium, May 1999, by M. Pollefeys; in "Multiple View Geometry in Computer Vision," Cambridge University Press, Cambridge, UK, 2000, by R. Hartley and A. Zisserman; and in "The Geometry of Multiple Images," The MIT Press, Cambridge, Mass., 2001, by O. Faugeras and Q-T. Luong. However, these methods have used only still images, not video, and they were not designed to be interactive.
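As a toy illustration of the metric reconstruction that these cited works generalize, consider the simplest fully calibrated case: two rectified cameras with known focal length and baseline, where the depth of a matched feature follows directly from its disparity (Z = f * b / d). The numbers below are invented for illustration; the cited methods solve the much harder uncalibrated, multi-view version of this problem.

```python
def stereo_depth(f_px, baseline_m, x_left, x_right):
    """Depth of a matched point from a rectified stereo pair:
    Z = f * b / disparity, with focal length in pixels and baseline in meters."""
    disparity = x_left - x_right              # pixels
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    return f_px * baseline_m / disparity      # meters

# Hypothetical rig: 700 px focal length, 12 cm baseline. A feature seen at
# x=350 in the left image and x=340 in the right image lies about 8.4 m away.
print(stereo_depth(700.0, 0.12, 350.0, 340.0))
```

Uncalibrated approaches such as Pollefeys' recover the equivalent of f and the baseline from the image correspondences themselves, which is why they can reconstruct metric 3D structure without a pre-surveyed camera rig.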
Prior art also exists for inserting images into live video broadcasts, as discussed by Astle et al. in "System and method of real time insertions into video using adaptive occlusion with a synthetic reference image," Princeton Video Image, Inc., Sep. 14, 1999 (U.S. Pat. No. 5,953,076). However, the inserted images are static in the scene, and the patent makes no interactivity claims.
For the foregoing reasons, there is a great need for a system that performs real-time, group interactive, augmented-reality monitoring of an area, suitable for enhancing the enjoyment of entertainment events, where the system uses user-requested queries to interactively augment, in real-time, the user's local display of events with user-inputted doodles, user-requested information and statistics, and user-selected viewing angles that provide high-quality views of an event. In addition, there is a need for a system that allows users to interactively share their locally augmented-reality display with users of other display centers in real-time, and a system that provides an electronic community for the users before, during, and after the game, so that the users can chat with and razz their friends via WAP-enabled phones on the way to the game, continue the same thread during the game, and finish up on the way home and at home from desktop machines. Moreover, there is a further need for a system that allows users to switch their display device into a video game mode, in which the user continues the entertainment event (e.g., a sports game) as the user would like to see it play out, until the user decides to switch back to live action.
The following references are presented for further background information:
[1] National Standard for Information Technology—Fibre Channel—Physical and Signaling Interface (FC-PH), American National Standards Institute, Inc., Nov. 14, 1994.
[2] "Prices Are Set for GameCube, Xbox, PlayStation 2 Video Game Boxes," Associated Press, reported in The Wall Street Journal, Dow Jones & Company, May 22, 2001.
[3] Connexion by Boeing, "Frequently Asked Questions," (http://active.boeing.com/connexion/Body.cfm?left=2A).
[4] Cavallaro, R., "The FoxTrak Hockey Puck Tracking System," IEEE Computer Graphics and Applications, March-April 1997, pp. 6-12.
[5] SEC Form 10-K for Clear Channel Communications, Inc., Dec. 31, 2001.
[6] de Bonvoisin, S., "EU Clears General Motors Takeover of Daewoo Motor," The Wall Street Journal, Dow Jones & Company, Jul. 23, 2002.
[7] Faugeras, O. and Luong, Q-T., The Geometry of Multiple Images, The MIT Press, Cambridge, Mass., 2001.
[8] Glaskowsky, P. N., "Microsoft Weighs in With X-Box," Microprocessor Report, Cahners Publications, April 2000.
[9] Grenier, M. P., "Broadband User Growth Slows as Higher Costs Prove Obstacle," The Wall Street Journal, Dow Jones & Company, Oct. 18, 2001.
[10] Guéziec, A., "Tracking Pitches for Broadcast Television," Computer, March 2002, pp. 38-43.
[11] Hartley, R. and Zisserman, A., Multiple View Geometry in Computer Vision, Cambridge University Press, Cambridge, UK, 2000.
[12] Heinzl, M., "Telecoms Hunt for Broadband 'Killer Apps' to Use Nearly Empty Data Delivery Pipes," The Wall Street Journal, Jun. 14, 2001.
[13] Hindus, L. A., "CMOS: The Image of the Future," Advanced Imaging Magazine, May 2001, pp. 33-35.
[14] Data sheet, "Kodak Digital Science KAC-1310 CMOS Image Sensor," Eastman Kodak Company, Feb. 8, 2001, (http://www.kodak.com/US/plugins/acrobat/en/digital/ccd/kac1310Long.pdf).
[15] Kanade, T., et al., "Virtualized Reality: Digitizing a 3D Time-Varying Event As Is and in Real Time," Proceedings of the International Symposium on Mixed Reality (ISMR 99), Springer-Verlag, Secaucus, N.J., 1999, pp. 41-57.
[16] Longman, J., "Sports Use Technology and T.L.C. to Hold On to Fans," The New York Times, Jul. 31, 2001.
[17] Web page, Xbox Video Game System, Microsoft Corporation, (http://www.xbox.com/hardware/consoles/xbox.htm).
[18] Web page, High Definition AV Pack, Microsoft Corporation, (http://www.xbox.com/hardware/adapters/highdefinitionavpack.htm).
[19] Pollefeys, M., Self-Calibration and Metric 3D Reconstruction from Uncalibrated Image Sequences, Ph.D. dissertation, Katholieke Universiteit Leuven, Heverlee, Belgium, May 1999.
[20] Mills, D. L., Network Time Protocol (Version 3), Network Working Group, Internet Engineering Task Force, March 1992, (http://www.ietf.org/rfc/rfc1305.txt).
[21] Sweet, D., "Professional Sports Teams Use New Technology to Sell Tickets," The Wall Street Journal, Jan. 10, 2000.
[22] Sweet, D., "Where Vince, and Even Luc, Become Human Highlight Reels," The Wall Street Journal, Dec. 6, 2000.
[23] Tran, K. T. L. and Angwin, J., "Sony, AOL Announce Agreement to Link PlayStation 2 to Internet," The Wall Street Journal, May 15, 2001.
[24] Welsh et al., Computerized video imaging system for creating a realistic depiction of a simulated object in an actual environment, U.S. Pat. No. 4,970,666, granted Nov. 13, 1990.
[25] Rosser et al., System and method for downstream application and control electronic billboard system, Princeton Video Image, Inc., U.S. Pat. No. 5,543,856, Aug. 6, 1996.
[26] Karlton et al., System and method for using cover bundles to provide immediate feedback to a user in an interactive television environment, Silicon Graphics, Inc., U.S. Pat. No. 5,802,284, Sep. 1, 1998.
[27] Astle et al., System and method of real time insertions into video using adaptive occlusion with a synthetic reference image, Princeton Video Image, Inc., U.S. Pat. No. 5,953,076, Sep. 14, 1999.
[28] Honey et al., System for enhancing the television presentation of an object at a sporting event, Fox Sports Productions, Inc., U.S. Pat. No. 6,154,250, Nov. 28, 2000.
[29] Gloudemans et al., System for enhancing a video presentation of a live event, Sportvision, Inc., U.S. Pat. No. 6,266,100, Jul. 24, 2001.
[30] Daily et al., Multi-User Real-Time Augmented Reality System and Method, Hughes Electronics Corporation, U.S. Pat. No. 6,317,127, Nov. 12, 2001.
[31] Sharir et al., Graphical Video System, Orad Hi-Tec Systems Limited, U.S. Pat. No. 6,380,933, Apr. 30, 2002.
[32] Yang, S-j., "SKT Launches New Technology in Incheon," The Korea Herald, Jan. 28, 2002.
[33] "Phone Giants Set Standards for Broadband Construction," Associated Press, reported in The Wall Street Journal, May 29, 2003.
[34] Berman, D. K., "Local Bells Look to Fiber to Stem Some Losses," The Wall Street Journal, Jun. 19, 2003.
[35] Dockstader, S. L. and Tekalp, A. M., "Tracking Multiple Objects in the Presence of Articulated and Occluded Motion," Proceedings of the Workshop on Human Motion, Austin, Tex., 7-8 Dec. 2000, pp. 88-95.
[36] Dockstader, S. L. and Tekalp, A. M., "Multiple camera tracking of interacting and occluded human motion," Proceedings of the IEEE, October 2001, pp. 1441-55.
[37] Web site, "EA Sports NBA Live 2003," Electronic Arts Inc., (http://www.easports.com/platforms/games/nbalive2003/home.jsp).
[38] Gavrila, D. M. and Davis, L. S., "3-D model based tracking of humans in action: a multi-view approach," Proceedings CVPR '96: 1996 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, San Francisco, 18-20 Jun. 1996.
[39] Li, Z. and Wang, H., "Real-Time 3D Motion Tracking with Known Geometric Models," Real-Time Imaging 5, Academic Press, pp. 167-87.
[40] Web page, "Nokia N-Gage," Nokia, (http://www.n-gage.com/n-gage/home.html).
[41] Rehg, J. M. and Kanade, T., "Model-based tracking of self-occluding articulated objects," Proceedings, Fifth International Conference on Computer Vision, IEEE Computer Society Press, Cambridge, Mass., 20-23 Jun. 1995, pp. 612-7.
[42] Mills, D. L., Network Time Protocol (Version 3), Network Working Group, Internet Engineering Task Force, March 1992, (http://www.ietf.org/rfc/rfc1305.txt).
[43] Press release, "Yao Ming Basketball by Sorrent to be Distributed Exclusively via QUALCOMM's BREW Solution," QUALCOMM Incorporated, Apr. 29, 2003.
[44] Press release, "Sony Ericsson unveils the T606, a CDMA phone with state-of-the-art mobile entertainment and imaging features," Sony Ericsson, Mar. 4, 2003.
[45] Web page, "Sony Ericsson—Specifications for the T310," Sony Ericsson, Jun. 20, 2003.
[46] Yang, S-j., "SKT Launches New Technology in Incheon," The Korea Herald, Jan. 28, 2002.