A tremendous amount of video content is produced professionally every year, with applications in television broadcast, movie production, and webcasts, to name a few. The vast majority of this content is two dimensional (NTSC, PAL, or High Definition video, among many other formats, resolutions, sampling rates, and intended uses). Another very large content production source is 3D modeling and character animation, with applications in video games, movies and television, virtual worlds, simulators, etc. This content is produced and stored in three dimensional format, though in applications such as movies and television the final product is a two dimensional projection of the three dimensional asset, whether a character, prop, scene, etc.
For some applications, it would be beneficial to capture live video from commercially available video systems currently in use, such as TV or movie cameras, and to project that video stream into the three dimensional virtual coordinate system that underpins any three dimensional modeling application, such as a virtual set or a virtual world. However, extant approaches share an extremely limiting condition. The novel systems and methods presented herein overcome these limitations and address the need in modern video production practice to move studio cameras and to change their zoom and focus dynamically while a performance is under production, including possibly being broadcast live. This novel approach is distinct from traditional approaches of fixing studio camera locations, setting the optics at one particular zoom and focus, and acquiring both calibration and production video at that one set of internal and external camera parameters.
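The distinction between internal (zoom- and focus-dependent) and external (pose-dependent) camera parameters can be illustrated with the standard pinhole camera model, which maps a point in the three dimensional virtual coordinate system to a pixel in the camera image. The sketch below is illustrative only; the specific names and values are assumptions, not part of the method described herein.

```python
import numpy as np

def project_point(X_world, K, R, t):
    """Project a 3D point in world (virtual-set) coordinates to 2D pixel
    coordinates using the standard pinhole model x ~ K [R | t] X."""
    X_cam = R @ X_world + t   # external parameters: world -> camera frame
    x = K @ X_cam             # internal parameters: camera frame -> image
    return x[:2] / x[2]       # perspective divide -> pixel coordinates

# Internal parameters (these change whenever zoom or focus changes):
# fx, fy = focal lengths in pixels; cx, cy = principal point.
fx = fy = 1000.0
cx, cy = 960.0, 540.0
K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])

# External parameters (these change whenever the camera moves):
# here, identity rotation and the camera 5 m back from the origin.
R = np.eye(3)
t = np.array([0.0, 0.0, 5.0])

# A point at the virtual-set origin projects to the principal point.
print(project_point(np.array([0.0, 0.0, 0.0]), K, R, t))  # -> [960. 540.]
```

A traditional fixed-camera workflow estimates K, R, and t once; the practice described herein requires them to be tracked continuously as the operator moves the camera and adjusts its optics.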
Others deal with this issue only partially, and in a way that neither allows nor suggests the unrestricted production practice of a freely moving, zooming, and focusing studio camera. Reynolds, U.S. Pat. No. 5,889,550, teaches a method to deal with a moving camera but requires, and is dependent on, acquiring video with fixed optics. Alexander, U.S. Patent Publication No. 20070076096, teaches a method for calibrating moving, dynamic-optics imagers that is distinct from the novel method herein and is limited in two respects: the devices described are 3D imagers that include an illumination system, and, more importantly, the method requires a very significant pre-calibration device (Alexander, U.S. Publication No. 20070104361) comprising a linear motorized track and a large planar checkerboard target. In contrast, the novel methods described herein provide for complete freedom of movement of the studio camera within the volume of interest and allow unrestricted control of the studio camera optics (zoom and focus), with the complete calibration performed in situ. The only calibration devices needed are a simple ruled rigid stick with two or more visible features (often retro-reflective markers) and a rigid ruled right angle with three or more ruled visible features (often retro-reflective markers).
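The role of the ruled rigid stick can be sketched as follows: a marker-based reconstruction initially recovers positions only up to an unknown scale, and the known physical separation of the stick's markers fixes that scale. This is a minimal illustrative sketch under assumed names and values, not the full calibration procedure described herein.

```python
import numpy as np

def recover_scale(p1, p2, known_length_m):
    """Given two marker positions reconstructed in an arbitrarily scaled
    coordinate system, and the known physical separation of those markers
    on the ruled stick, return the factor converting reconstruction units
    to metres."""
    return known_length_m / np.linalg.norm(p2 - p1)

# Markers reconstructed 2.0 units apart; the ruled stick says the true
# separation is 0.5 m, so one reconstruction unit equals 0.25 m.
p1 = np.array([0.0, 0.0, 0.0])
p2 = np.array([2.0, 0.0, 0.0])
s = recover_scale(p1, p2, 0.5)
print(s)  # -> 0.25
```

The rigid right angle plays the complementary role of fixing the origin and axis directions of the coordinate system, so that scale and orientation together are determined entirely in situ.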