The invention relates generally to methods and apparatus for providing video overlays and more particularly to methods and apparatus for providing video overlay using a plurality of scalers.
Video overlays are typically provided by graphics accelerators or other processing devices to place a video overlay within a graphic image on a computer screen, Internet appliance, television, or any other suitable display device. The refresh rate of a display device is typically 50 Hz or more, although the rate at which images change may be as low as 24 Hz. Consequently, there is little time to scale and display a video overlay. The problem is compounded when the destination window of the video overlay is vertically small, because the vertical active time of the destination window is then a smaller fraction of the total refresh time; the scaling operation must be performed even more rapidly while still preserving suitable image quality.
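The timing constraint described above can be illustrated with a simplified model (the function name and the figures below are illustrative, not taken from any actual display standard): the time available to deliver the destination window is roughly the fraction of one refresh period during which its scan lines are active, ignoring blanking intervals.

```python
def overlay_time_budget_us(refresh_hz, total_lines, dst_lines):
    """Approximate time (in microseconds) available to produce the
    destination window per frame: the fraction of one refresh period
    occupied by the destination window's scan lines.
    Simplified model that ignores horizontal/vertical blanking."""
    frame_us = 1e6 / refresh_hz          # one refresh period in microseconds
    return frame_us * dst_lines / total_lines

# A 120-line overlay on a 60 Hz, 1200-line display gets roughly 1.67 ms
# per frame, a tenth of the ~16.7 ms frame period.
budget = overlay_time_budget_us(60, 1200, 120)
assert abs(budget - 1666.67) < 1
```

The smaller the destination window is vertically, the smaller this budget becomes, which is why vertically small overlays stress the scaler the most.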
As is known, video overlay circuits and methods typically apply a scaling operation that requires the interpolation of pixels from a source window in both x and y directions to get high quality images at a different resolution. Ideally, a graphics processor will analyze more source pixels for each destination pixel when the destination window is smaller than the source window. Therefore, the processor typically needs more memory bandwidth and does more signal processing as the downscaling ratio increases.
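The relationship between downscaling ratio and per-pixel workload can be sketched as follows (a hypothetical area-averaging model; the function name is illustrative): the number of source pixels that contribute to each destination pixel grows with the product of the horizontal and vertical downscale ratios, and memory bandwidth grows with it.

```python
def source_pixels_per_dest_pixel(src_w, src_h, dst_w, dst_h):
    """Approximate number of source pixels an area-averaging downscaler
    must read for each destination pixel (hypothetical model)."""
    sx = src_w / dst_w   # horizontal downscale ratio
    sy = src_h / dst_h   # vertical downscale ratio
    return sx * sy

# Halving each dimension: four source pixels feed every destination pixel.
assert source_pixels_per_dest_pixel(640, 480, 320, 240) == 4.0

# Quartering each dimension: sixteen source pixels per destination pixel.
assert source_pixels_per_dest_pixel(640, 480, 160, 120) == 16.0
```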
One solution to reducing the memory bandwidth and signal processing requirements has been to have the scaling process drop lines. For example, a scaling algorithm may read only every other line of a source window. However, the resulting video overlay can be of lower quality because valuable image information is discarded to meet timing requirements.
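The quality trade-off of line dropping can be shown with a minimal sketch (both functions are illustrative 2:1 vertical downscalers, not any particular hardware's algorithm): dropping lines halves the reads but discards whole scan lines, while averaging uses every source line at the cost of twice the bandwidth.

```python
def downscale_drop(lines):
    """2:1 vertical downscale by reading only every other source line."""
    return lines[::2]

def downscale_average(lines):
    """2:1 vertical downscale by averaging pairs of adjacent lines.
    Uses all source data (higher quality) but reads twice as many lines."""
    return [[(a + b) // 2 for a, b in zip(l0, l1)]
            for l0, l1 in zip(lines[::2], lines[1::2])]

# Alternating dark/bright scan lines: dropping loses the bright lines
# entirely, while averaging preserves their contribution.
src = [[0, 0], [100, 100], [0, 0], [100, 100]]
assert downscale_drop(src) == [[0, 0], [0, 0]]
assert downscale_average(src) == [[50, 50], [50, 50]]
```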
One example of a graphics processor is shown in FIG. 1. Typically, an application 100, such as a DVD driver, television display application or any other suitable application, provides overlay control information to an overlay control driver associated with a graphics processing device or other suitable device. The overlay control information may include, for example, the size of a new video overlay window, the video source base address, and any other suitable information. The overlay control driver 102 sets up hardware of a graphics controller or other suitable hardware to scale the video image stored in frame buffer 104 for display on display device 106. The overlay control driver 102 provides scaler control information 108 to a display engine 110. The scaler control information 108 can include, for example, source surface descriptor information such as the base address of the source image, the required pitch, a color format for the video overlay, a source window size, and a destination window size and location, such as where on the display the destination surface (i.e., the overlay) should appear.
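The scaler control information described above might be grouped as in the following sketch. The structure and all field names are hypothetical, chosen only to mirror the fields listed in the text; they do not correspond to any actual driver interface.

```python
from dataclasses import dataclass

@dataclass
class ScalerControlInfo:
    """Hypothetical grouping of the scaler control information an overlay
    control driver could pass to a display engine (illustrative only)."""
    src_base_address: int   # base address of the source image in the frame buffer
    src_pitch: int          # bytes per source scan line (the "pitch")
    color_format: str       # color format of the video overlay, e.g. a YUV format
    src_width: int          # source window size
    src_height: int
    dst_x: int              # destination window location on the display
    dst_y: int
    dst_width: int          # destination window size
    dst_height: int

# A 1280x720 YUY2 source downscaled 2:1 into a 640x360 window at (100, 80).
info = ScalerControlInfo(0x100000, 2560, "YUY2", 1280, 720, 100, 80, 640, 360)
assert info.src_width / info.dst_width == 2.0
assert info.src_height / info.dst_height == 2.0
```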
The display engine 110 includes a scaler 112, often referred to as a back end scaler, that performs some type of spatial scaling in hardware. The scaler 112 in the display engine 110 scales data from the frame buffer and sends the scaled data directly to the monitor through display driver 114. The display engine 110 also includes the display driver 114, such as a CRT controller, and a primary surface reader 116. The primary surface reader 116, as known in the art, displays the primary surface 118 stored in frame buffer 104. The primary surface 118 is typically a graphics window or other suitable surface into which the video overlay is placed. The video overlay is the destination surface 120 which is a scaled version of the source window 122.
A 2D/3D engine 126 may also be present in the graphics controller to provide two-dimensional or three-dimensional rendering as required for video games or other suitable applications. The 2D/3D engine 126 includes a front end scaler 128. Unlike the back end scaler 112, the 2D/3D engine 126 scales data from the frame buffer 104 and returns the scaled information to the frame buffer 104, since it is typically not constrained by the timing requirements associated with the refresh rate of the display device. Accordingly, the 2D/3D engine 126 is typically used to render images from a non-real time source, such as a game or any other suitable information. In contrast, the scaler 112 in the display engine is constrained by the timing requirements associated with the display device 106.

The pixel format of the primary surface 118 can have a lower color resolution than the pixel format of the source surface 130 for the display engine scaler. For example, the primary surface may use only 8 or 16 bits to represent each pixel. Even if it uses 32 bits per pixel, it will typically use an RGB color space, which cannot represent all of the colors of the YUV color space typically used for video. Therefore, if the 2D/3D engine 126 were used to color convert and draw the source surface 130 directly into the primary surface 118, there would be a loss of image quality, and artifacts may sometimes be displayed.

The 2D/3D engine 126 can perform a type of backup scaling and store the downscaled data in the primary surface 118, but this approach has the limitations mentioned above. For displaying two videos, such as a picture within a picture, the display engine scaler may be used as a primary scaler when each scaler is used to scale one of the video streams. However, a problem still occurs with downscaling using the 3D engine scaler 128, since the image is downscaled and stored to the primary surface 118 at low quality.
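The color resolution loss described above can be demonstrated with a small sketch (the function is an illustrative model of channel quantization, not any particular hardware's conversion): a 16 bpp RGB565 primary surface stores red and blue in only 5 bits, so the 256 possible 8-bit channel values collapse to 32 distinct levels, which is one source of visible artifacts such as banding.

```python
def quantize_channel(v8, bits):
    """Quantize an 8-bit color channel to the given bit depth (e.g. 5 bits
    for red/blue in a 16 bpp RGB565 primary surface), then expand back to
    8 bits by standard bit replication."""
    q = v8 >> (8 - bits)                     # keep only the top `bits` bits
    return (q << (8 - bits)) | (q >> (2 * bits - 8))

# All 256 input values map to only 32 distinct output levels at 5 bits,
# so drawing video into a 16 bpp primary surface loses color resolution.
levels_5bit = {quantize_channel(v, 5) for v in range(256)}
assert len(levels_5bit) == 32
```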
Consequently, a need exists for an improved method for generating a high quality downscaled video overlay for display on a display device.