U.S. Pat. Nos. 4,100,569, 4,344,085, 4,589,013 and 4,625,231 describe a unique linear method for compositing a foreground scene, comprising a subject disposed between a colored backing and the camera, with a background scene to form a composite image. This method reproduces at full level everything in the foreground scene that is visible to the camera (except the colored backing), without attenuation and without altering the subject in any way, even when the subject is a puff of steam, a wisp of hair, or a reflection in a window. The method is linear because it also reproduces the background scene, seen behind and through the transparent portions of the foreground scene, as a linear function of the luminance and visibility of the colored backing.
The background scene video level is regulated by the control signal E_c, which is proportional to the luminance and visibility of the backing. The blue backing is removed, not by switching it off, but by subtracting from each of the blue, green and red (B, G, R) components a portion of E_c equal to that component of the backing. With its video signal reduced to zero, the backing is ready to accept the background video signal by simple addition. The listed patents explain these functions in full detail.
In this invention, the E_c control signal is improved by removing some of its limitations. These limitations will become apparent in the following description. In its simplest form:

E_c = B - max(G, R)    (Eq. 1)
where "max" designates the larger of green and red.
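As a minimal sketch of Eq. 1 (using the illustrative 0-to-100 video levels of the examples given later; the scale itself is an assumption here):

```python
def ec_eq1(b, g, r):
    """E_c = B - max(G, R)  (Eq. 1): the basic control signal,
    proportional to the luminance and visibility of the backing."""
    return b - max(g, r)

# A blue backing (B=80, G=20, R=20) yields a strong positive E_c;
# a grey-scale object (B = G = R) yields exactly zero.
print(ec_eq1(80, 20, 20))  # 60
print(ec_eq1(50, 50, 50))  # 0
```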
As the scanning spot leaves the blue backing and begins to enter the subject, the E_c control signal begins to drop and, ideally, should just reach zero as the scanning spot clears the backing and is fully within the subject. For a sharply focused subject this transition occurs within the width of the scanning spot. For an out-of-focus subject the transition occurs over the width of the semi-transparent blurred edge.

It is essential that E_c go fully to zero when the foreground subject is fully opaque; otherwise the background scene will become visible through the opaque foreground subject. Such print-through is unacceptable in production-quality composites.
In Equation 1, E_c just reaches zero for grey-scale objects because B, G and R are equal to each other.
E_c also just reaches zero for cyan (B=G) and for magenta (B=R).
If a blue subject is examined, however, such as blue eyes (typical B, G, R values are B=80, G=70, R=60), then E_c in Equation 1 will not reach zero, and the background scene will print through even though the subject is opaque. For this reason Equation 1 is modified to add a constant K_2 as follows:

E_c = B - K_2 max(G, R)    (Eq. 2)
Raising K_2 from 1.0 to 1.14 (an increase of 14%) raises the subtracted green term from 70 to 80, and E_c then becomes zero. However, increasing green (and red) by 14% causes E_c to reach zero for grey-scale objects before the scanning spot clears the backing. Being zero, E_c shuts off the background scene before the scanning spot has fully entered the subject. The lack of background video at the edge of the foreground subject leaves a discernible dark line around it.
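The trade-off just described can be sketched numerically (the 80/70 ratio comes from the blue-eyes example; the grey level of 50 is illustrative):

```python
def ec_eq2(b, g, r, k2=1.0):
    """E_c = B - K_2 max(G, R)  (Eq. 2)."""
    return b - k2 * max(g, r)

K2 = 80 / 70  # ~1.14, chosen so blue eyes (B=80, G=70) reach zero

# Blue eyes now matte correctly (E_c is essentially zero) ...
assert abs(ec_eq2(80, 70, 60, K2)) < 1e-9
# ... but a grey object (B=G=R=50) is driven negative, so the
# background shuts off early and a dark edge line appears.
assert ec_eq2(50, 50, 50, K2) < 0
print("blue eyes reach zero; grey objects go negative")
```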
In Eq. 2, E_c also fails to reach zero when a black glossy subject produces a low-level reflection of the blue backing. Since there is little green or red in the blue backing, K_2 is not effective in reducing E_c to zero for black glossy objects reflecting the backing. The constant K_1 is therefore added for this purpose. The E_c equation now reads:

E_c = [(B - K_1) - K_2 max(G, R)]^+    (Eq. 3)
(The meaning of the superscript "+" is described below.) This equation may be rewritten as:

K E_c = [K_1 B - K_2 max(G, R) - K_3]^+    (Eq. 4)
which achieves exactly the same result as equation 3.
An alternate method for reducing E_c to zero is as follows:

K E_c - K_1 = [B - K_2 max(G, R)]^+    (Eq. 5)
Equations 3, 4 and 5 are simply variations of the same equation and produce the same result.
In some scenes, particularly close-ups, a person will be the subject of primary interest, and it is important to eliminate the dark line around flesh tones. This was done in the referenced patents by adding the constants K_3 and K_4 to produce the following equation:

E_c = [(B - K_1) - K_2 max(K_3 G, K_4 R)]^+    (Eq. 6)
Since the red content of flesh tones is at least twice as great as the blue or green content, E_c goes to zero much too soon, leaving a dark edge. This problem was solved in the referenced patents by reducing K_4 R in Eq. 6 to approximate the level of blue in flesh tones, so that E_c reaches zero just as the scanning spot fully clears the blue backing.
K_3 similarly permits reducing the green term K_3 G when a green subject is the center of interest.
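A sketch of Eq. 6, using the flesh-tone levels from Table 1 (B=20, G=25, R=70); the specific K_4 value is an illustrative choice that brings K_4 R down to the blue level:

```python
def ec_eq6(b, g, r, k1=0.0, k2=1.0, k3=1.0, k4=1.0, clip=True):
    """E_c = [(B - K_1) - K_2 max(K_3 G, K_4 R)]^+  (Eq. 6).
    clip=False returns the value before the zero clip, to show
    where in the edge transition E_c crosses zero."""
    ec = (b - k1) - k2 * max(k3 * g, k4 * r)
    return max(ec, 0.0) if clip else ec

# Flesh tones (B=20, G=25, R=70): with K_4 = 1 the unclipped E_c
# is -50, so the background shuts off about halfway through the
# edge transition, leaving a dark line.
print(ec_eq6(20, 25, 70, clip=False))  # -50.0
# Reducing K_4 so that K_4*R approximates the blue level
# (K_4 = 20/70) brings the unclipped E_c much closer to zero.
print(ec_eq6(20, 25, 70, k4=20/70, clip=False))  # -5.0
```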
There are times, although rare, when the blue backing is actually cyanish rather than blue. In this event, the E_c equations above simply cannot develop a full-level E_c in the backing area. This was corrected in the later referenced patents by adding an additional constant K_5 as follows:

E_c = {(B - K_1) - K_2 [K_5 max(K_3 G, K_4 R) + (1 - K_5) min(K_3 G, K_4 R)]}^+    (Eq. 7)
where "min" designates the lesser of the terms.
In the case of the cyanish backing, K_5 would be set to zero, reducing Equation 7 to:

E_c = [(B - K_1) - K_2 min(K_3 G, K_4 R)]^+    (Eq. 7.1)
Normally, assuming a good blue backing, K_5 (in Eq. 7) is set to 1.0, reducing Equation 7 to Equation 6.
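The effect of K_5 can be sketched as follows. The cyanish-backing levels (B=80, G=60, R=20) are a hypothetical example, not taken from the patents:

```python
def ec_eq7(b, g, r, k1=0.0, k2=1.0, k3=1.0, k4=1.0, k5=1.0):
    """E_c = {(B - K_1) - K_2 [K_5 max(K_3 G, K_4 R)
             + (1 - K_5) min(K_3 G, K_4 R)]}^+  (Eq. 7)."""
    hi = max(k3 * g, k4 * r)
    lo = min(k3 * g, k4 * r)
    return max((b - k1) - k2 * (k5 * hi + (1 - k5) * lo), 0.0)

# Hypothetical cyanish backing (B=80, G=60, R=20). With K_5 = 1.0
# (Eq. 6 behaviour) the green content depresses E_c; with K_5 = 0.0
# (Eq. 7.1) only min(G, R) is subtracted and full level is restored.
print(ec_eq7(80, 60, 20, k5=1.0))  # 20.0
print(ec_eq7(80, 60, 20, k5=0.0))  # 60.0
```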
It should be noted that setting "K" values to 1.0, zero or some other value, does not create a new equation.
In the equations above:

K_1 = black gloss control
K_2 = matte density control
K_3 = green matte density control
K_4 = red matte density control
K_5 = backing purity control

and, in the flare suppression (E_k) equation given later:

K_20 = white balance
K_22 = gate 1/3 [gate 1 or gate 3]
K_23 = gate 2

The following procedures apply to the E_k constants:

1. K_20 is nominally 1.0 for a white balance on white foreground objects.
2. K_22 is normally 1.0 for most scenes, so that blue foreground objects preserve their color. Where the foreground object is green, blue flare will cause the color to turn cyan; setting K_22 to 0.0 restores the green color of the foreground object.
3. K_23 is normally set to 0.0, which prevents foreground flesh colors from taking on a magenta tint from the blue spill light. If it is essential to reproduce pink or magenta colors, K_23 is set to a value between 0.0 and 1.0, just sufficient to produce a subjectively satisfying pink or magenta, thereby minimizing the degree to which flesh tones take on a magenta tint.
All of the preceding explanations and equations are found in the referenced patents. Table 1 lists typical B, G, R video levels for several colors, including pale blue (eyes) and a vibrant blue. The E_c values were calculated using Equation 1. E_c must be 100 for full turn-on of the background (BG) scene, and zero (or below) for full shut-off of the background scene when it is occluded by opaque subjects.
In the referenced patents, E_c is prevented from going negative by the use of a zero clip. The zero clip is designated in the equations by a plus (+) superscript on the terms generating E_c, as in E_c = [- - -]^+. The zero clip is achieved using an "OR" gate with one of its inputs set to zero.
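In software terms, an "OR" gate with one input tied to zero is simply a maximum against zero; a minimal sketch:

```python
def zero_clip(x):
    """The [x]^+ operation: an 'OR' gate (maximum selector) with
    one input tied to zero, so negative values are clamped to 0."""
    return max(x, 0)

print(zero_clip(-50))  # 0  (negative E_c is clipped)
print(zero_clip(60))   # 60 (positive E_c passes unchanged)
```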
There are other ways of generating a zero clip that are not described in the referenced patents. For example, in the equation:

E_c = max(B, G) - max(G, R)    (Eq. 8)
the term max(B,G) effectively introduces a zero clip for certain colors.
A zero clip for all colors is achieved by the following equation:

E_c = K_1 [max(K_b B, K_g G, K_r R) - K_2 max(K_g G, K_r R)]    (Eq. 9)
While Equation 9 appears to be different from Equation 6, it produces exactly the same result. Table 1 lists the values of E_c without the zero clip, so as to show the negative E_c before it is clipped. The size of the negative number also indicates the point in the transition from backing to subject at which E_c reaches zero. If E_c is 100 for the blue backing and -100 for a reddish subject, then E_c becomes zero when the scanning spot is halfway through the transition. Equation 9, when used to calculate E_c for Table 1, will show an E_c of zero for all colors except the blue colors, because each of its two terms includes the expression "max", which is an "OR" gate and therefore a zero clip. In the transition from backing to subject, E_c reaches zero for a given color at exactly the same point for Equations 1 through 9. These equations are all variations of the same basic equation, best expressed as Equation 7, and all produce the same zero crossing for the colors in Table 1.
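The built-in clip of Eq. 9 can be checked numerically. With all constants at 1.0, Eq. 9 reduces to max(B, G, R) - max(G, R), which agrees with the zero-clipped Eq. 1 for every color in Table 1 (a sketch, not the patents' hardware implementation):

```python
def ec_eq1_clipped(b, g, r):
    return max(b - max(g, r), 0)     # Eq. 1 with an explicit zero clip

def ec_eq9(b, g, r):
    return max(b, g, r) - max(g, r)  # Eq. 9 with all constants = 1.0

# B, G, R levels for the Table 1 colors
table1 = [(80, 20, 20), (80, 70, 60), (80, 50, 20), (80, 80, 80),
          (20, 20, 20), (5, 5, 5), (80, 80, 20), (80, 20, 80),
          (20, 80, 80), (10, 80, 80), (10, 10, 80), (20, 25, 70)]
assert all(ec_eq1_clipped(*c) == ec_eq9(*c) for c in table1)
print("Eq. 9 matches the clipped Eq. 1 for all Table 1 colors")
```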
While Equations 4, 5, 8 and 9 are not specifically shown in the referenced patents, they perform the same function and produce the same result. They are therefore not considered to be new, different, or more useful.
TABLE 1

COLOR          B    G    R    E_c   E_c x 1.67
Blue backing   80   20   20    60      100
Blue eyes      80   70   60    10       17
Vibrant blue   80   50   20    30       50
White          80   80   80     0        0
Grey           20   20   20     0        0
Black           5    5    5     0        0
Cyan           80   80   20     0        0
Magenta        80   20   80     0        0
Green          20   80   80   -60     -100
Yellow         10   80   80   -70     -117
Red            10   10   80   -70     -117
Flesh          20   25   70   -50      -83
In the last column of Table 1, E_c is scaled by a factor of 1.67 so as to provide 100% turn-on of the background scene in the blue backing area. Note that for blue eyes and vibrant blue, E_c remains above zero. The background scene will show through blue eyes as though they were 17% transparent, and through the vibrant blue as though it were 50% transparent. E_c for green, yellow, red and flesh tones (with Caucasian makeup) is negative. This means that the background scene is shut off (zero E_c) when the scanning spot is about halfway onto a green subject, and less than halfway onto a red or yellow subject. Shutting off the background scene before the scanning spot is fully onto the subject leaves a gap of reduced video (i.e., a dark line). When flesh tones are dominant, dropping K_4 to reduce red to the level of green in Equation 6 eliminates the dark line around red and flesh tones. When green is predominant, K_3 is used to drop green to the level of red, eliminating the dark line around a brilliant green.
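The Table 1 numbers can be reproduced directly from Eq. 1 with the 100/60 (approximately 1.67) scale factor:

```python
def ec_eq1(b, g, r):
    return b - max(g, r)  # Eq. 1

# A few representative Table 1 colors (B, G, R)
colors = {"Blue backing": (80, 20, 20), "Blue eyes": (80, 70, 60),
          "Vibrant blue": (80, 50, 20), "Green": (20, 80, 80),
          "Flesh": (20, 25, 70)}
for name, (b, g, r) in colors.items():
    ec = ec_eq1(b, g, r)
    # Scale so the blue backing reads 100 (60 * 100/60 = 100)
    print(f"{name:12s} E_c = {ec:4d}  scaled = {round(ec * 100 / 60)}")
```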
Nothing in the E_c equations listed in the referenced patents permits the use of blue subject colors except increasing K_2 in Equation 6. In doing so, all grey-scale subjects, as well as cyan and magenta, will exhibit a negative E_c, as shown in Table 2 below, together with an increased noise level.
Table 2 shows E_c when K_2 is raised to 1.14 in Equation 6. E_c is multiplied by 1.75 to bring it up to 100 for full turn-on of the background scene.
E_c for blue eyes is now zero, which is what is needed. However, E_c is 40 for the vibrant blue, which means it will print through. The other colors are acceptable, depending upon adjustments to K_3 or K_4, except for yellow.
Equation 6 is thus able to reproduce pale shades of blue against a blue backing by increasing K_2, but a limit on how blue a subject can be is soon reached, in the form of increased noise and visible dark edges on other colors. Raising K_2 pushes E_c below zero for all grey-scale subjects, as well as cyan and magenta, which is undesirable.
TABLE 2

COLOR          B    1.14G  1.14R  E_c (Eq. 6)  E_c x 1.75
Blue backing   80     23     23        57          100
Blue eyes      80     80     68         0            0
Vibrant blue   80     57     23        23           40
White          80     91     91       -11          -19
Grey           20     23     23        -3           -5
Black           5      6      6        -1           -2
Cyan           80     91     23       -11          -19
Magenta        80     23     91       -11          -19
Green          20     91     91       -71         -125
Yellow         10     91     91       -81         -142
Red            10     11     91       -81         -142
Flesh          20     29     80       -60         -105
The above-referenced patents also describe techniques for eliminating secondary illumination from the backing and lens flare from the foreground (FG) object. In the case of a blue backing, an object placed in close proximity to the backing receives secondary blue illumination from the backing, which gives the object a pronounced blue tint. A camera lens whose field is filled with blue light will also cast a blue veil over the foreground object, due to multiple internal reflections within the lens.
The method described in the above patents subjects the blue channel to a dynamic clamp, after which all evidence of the blue backing is eliminated from the foreground object. The blue clamp equation is:

B ≤ K_20 G + K_22 (G - R)^+ + K_23 (R - G)^+ + (1 - K_22) min(G, R)
This blue clamp equation, also known as the flare suppression equation E_k, can be written as follows:

E_k = {K_20 B - [K_22 G + max[K_22 (G - R), K_23 (R - G)] + (1 - K_22) min(G, R)]}^+
where the constants K_20, K_22 and K_23, and the procedures governing their settings, are as defined in the list given earlier.
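A per-pixel sketch of the clamp inequality; the default gate settings (K_20 = 1.0, K_22 = 1.0, K_23 = 0.0) follow the procedures listed earlier, while the sample pixel values are illustrative assumptions:

```python
def clamp_blue(b, g, r, k20=1.0, k22=1.0, k23=0.0):
    """Limit B to K_20 G + K_22 (G-R)^+ + K_23 (R-G)^+
    + (1 - K_22) min(G, R): the blue clamp inequality."""
    pos = lambda x: max(x, 0)  # the [.]^+ zero clip
    limit = (k20 * g + k22 * pos(g - r)
             + k23 * pos(r - g) + (1 - k22) * min(g, r))
    return min(b, limit)

# Grey object near the backing, tinted by blue spill (B=60, G=R=50):
# the excess blue is clamped off at the green level.
print(clamp_blue(60, 50, 50))  # 50.0
# Legitimately low blue passes through unchanged.
print(clamp_blue(40, 50, 50))  # 40
```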
When dealing with blond or brown hair in front of a blue screen, K_20 [white balance] is adjusted to suppress blue flare. In doing so, white foreground objects take on a warmer hue. In this invention, a new control is added to the E_k equation to deal with this problem. This new control, BLACK BALANCE, acts as a "negative gain" on the blue channel in the E_k equation, as opposed to the "positive gain" of the WHITE BALANCE control.
In many blue screen productions, it is necessary to electronically eliminate screen markings, imperfections, suspension wires, unwanted shadows and other undesired screen elements from the final composite. The above-mentioned patents describe means of eliminating all of these undesired screen elements with the use of CLEAN-UP and CLEAN-UP BALANCE controls. Unfortunately, the use of these controls carries an undesired penalty: loss of fine detail from the foreground object. Later patents by the inventor of the above-mentioned patents, dealing with screen correction, describe alternate techniques for handling variations in screen brightness and color uniformity without any loss of detail from the foreground object. Screen correction processing eliminates all screen imperfections that are common to the foreground frame and the reference frame, while preserving all detail and shadow information of the foreground object. There are instances, however, where it is desirable to preserve foreground object detail while simultaneously eliminating the foreground shadow. Neither clean-up in its existing form nor screen correction will yield a satisfactory result, as elimination of foreground shadows will result in loss of foreground detail. This invention is directed to a method and apparatus for achieving that result.