The present invention relates to image data editing apparatus configured to produce a sequence of output image frames. The present invention also relates to a method of editing image data frames and a computer-readable medium having computer-readable instructions executable by a computer such that said computer performs steps for editing image data frames.
A computer-based image data editing system is disclosed in United Kingdom patent application number 2 311 394, in the name of the present Assignee. Sequences are formed of collated segments representing audio or video scenes and the architecture of these scenes may be displayed as sequence tracks for ordering the segments of the multi-media sequence. In addition, process tracks are included for applying transformations and/or animations to some of these segments. The output of the process tracks may be supplied to other process tracks and/or to a sequence track. Thus, the system facilitates a method of recording image data in combination with the data representing processes to be performed upon the image data.
A known type of process for application to image data is that of a wipe. An image data editing apparatus may be of the type configured to produce a sequence of output image frames and may include image storage means for storing a first input sequence of image frames and a second input sequence of image frames. Processing means may be configured to generate a wipe animation over a sequence of output frames such that an increasing proportion of the output image is derived from the second sequence and a decreasing proportion of the output image is derived from the first sequence.
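The wipe process described above can be sketched in code. The following is a hypothetical minimal illustration, assuming a circular boundary expanding from a starting position and normalised frame coordinates; the boundary shape, coordinate convention and function names are assumptions, not details from the disclosure:

```python
import numpy as np

def wipe_frame(old_frame, new_frame, t, origin=(0.5, 0.5)):
    """Composite one output frame of a circular wipe.

    old_frame, new_frame: (H, W, 3) arrays from the first and second
    input sequences; t: animation parameter in [0, 1]; origin: wipe
    starting position in normalised frame coordinates.
    """
    h, w = old_frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Normalised distance of each pixel from the wipe origin.
    dist = np.hypot(xs / w - origin[0], ys / h - origin[1])
    # Pixels inside the expanding boundary come from the new sequence;
    # the proportion taken from the new sequence grows as t increases.
    inside = (dist <= t)[..., None]
    return np.where(inside, new_frame, old_frame)
```

Over a sequence of output frames, t is stepped from 0 to 1, so an increasing proportion of each output image is derived from the second sequence and a decreasing proportion from the first.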
A system for generating computer-based wipes between scenes is disclosed in U.S. Pat. No. 5,515,110. When implemented in this way, it is possible to generate wipe shapes with reference to a stored bitmap in computer memory, thereby allowing an editor to design wipes as required. However, in an on-line system, it is likely that a plurality of standard wipe shapes would be available and the wiping process may be driven with reference to vector-based boundary animations.
Boundary animations may be controlled with reference to an animation parameter. However, animation parameters suitable for implementing a wipe originating from the centre of a frame may be inappropriate when the wipe is initiated from other locations within the frame. After a wipe operation has completed, the whole of the image frame should be taken up by image data derived from the new frame source, with none of the old frame source image data remaining. However, if the initiating position of the wipe is moved from its central location, it is possible that the wipe boundary may not have fully extended beyond the extent of the image frame at the end of the wipe operation, resulting in a sudden transition to a full image of the new data source. This problem may be overcome by scaling the wipe animation parameter or by applying a bias to said parameter. However, if the scaling of the animation parameter is too great, the wipe will complete early, thereby introducing a variability in terms of the duration of the wipe, to the extent that it becomes difficult for an operator to accurately define a wipe length. Consequently, when modifying an animation parameter, care must be taken in terms of controlling the extent of modification in order to obtain the required result.
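One way to obtain a scaling that completes the wipe exactly at the end of the animation, neither early nor late, is to scale the parameter by the distance from the starting position to the farthest corner of the frame. This is a sketch under assumed normalised coordinates; the disclosure does not prescribe this particular computation:

```python
import math

def parameter_scale(origin):
    """Scale factor applied to the wipe animation parameter so that
    the boundary just reaches the farthest frame corner at the end
    of the wipe, for any starting location in normalised coordinates.
    """
    ox, oy = origin
    corners = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
    # Distance from the wipe origin to the farthest corner.
    return max(math.hypot(cx - ox, cy - oy) for cx, cy in corners)
```

With the effective boundary radius taken as t multiplied by this scale factor, the boundary covers the whole frame precisely when t reaches 1, so the wipe duration remains exactly as defined by the operator.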
According to a first aspect of the present invention, there is provided image data editing apparatus configured to produce a sequence of output image frames, including image storage means for storing a first input sequence of image frames and a second input sequence of image frames; data storage means for storing a plurality of position parameters in which each of said position parameters relates to a specific position within an image frame; processing means configured to generate a wipe animation over a sequence of output frames such that an increasing proportion of the output image is derived from said second sequence and a decreasing proportion of the output image is derived from said first sequence, under the control of a moving boundary defined by a wipe animation, wherein the rate of expansion of said boundary is controlled with reference to an animation parameter; and said animation parameter is derived by processing at least one of said stored position parameters with reference to a selected animation starting location.
In a preferred embodiment, the data storage means is configured to store a plurality of position parameters relating to specific positions within an image frame, wherein said specific positions are predefined such that similar positions are adopted for each of a plurality of wipe animations. Preferably, the data storage means is configured such that position parameters are stored for nine locations, which may be at the top-left, top-center, top-right, left-center, central-center, right-center, bottom-left, bottom-center and bottom-right, with reference to a viewing screen.
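The nine-location embodiment can be illustrated with stored position parameters and a simple derivation of the animation parameter. The encoding below, taking each stored parameter to be the farthest-corner distance for that location in normalised screen coordinates, is a hypothetical assumption for illustration only:

```python
import math

# Stored position parameters for the nine predefined locations
# (values are the farthest-corner distance from each location,
# an assumed encoding of the stored data).
POSITION_PARAMETERS = {
    "top-left":       math.sqrt(2.0),   # corner locations
    "top-center":     math.sqrt(1.25),  # edge-midpoint locations
    "top-right":      math.sqrt(2.0),
    "left-center":    math.sqrt(1.25),
    "central-center": math.sqrt(0.5),   # frame centre
    "right-center":   math.sqrt(1.25),
    "bottom-left":    math.sqrt(2.0),
    "bottom-center":  math.sqrt(1.25),
    "bottom-right":   math.sqrt(2.0),
}

def animation_parameter(t, location):
    """Derive the animation parameter for animation fraction t in
    [0, 1] from the stored position parameter for the selected
    starting location."""
    return t * POSITION_PARAMETERS[location]
```

Because the stored parameters are precomputed for the nine predefined locations, deriving the animation parameter at run time reduces to a lookup and a multiplication.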
According to a second aspect of the present invention, there is provided a method of editing image data to produce a sequence of output image frames, in which a first input sequence of image frames and a second input sequence of image frames are stored in image storage means, a plurality of position parameters relating to specific locations within an image frame are stored in data storage means, and processing means are configured to perform the steps of: generating a wipe animation over a sequence of output frames such that an increasing proportion of the output image is derived from said second sequence and a decreasing proportion of the output image is derived from said first sequence, under the control of a moving boundary defined by a wipe animation; controlling the rate of expansion of said boundary with reference to an animation parameter; and deriving said animation parameter by processing at least one of said stored position parameters with reference to a selected animation starting location.