Recent years have witnessed the establishment of a standard for transmission of high definition television (HDTV) signals, over both cable and terrestrial broadcast modes, throughout the United States. Although it offers significantly enhanced picture resolution, terrestrial broadcast of HDTV signals is somewhat problematic due to the nearly universal installed base of conventional NTSC broadcast and, more particularly, reception equipment. The present system provides for simultaneous transmission (simulcast broadcasting) of HDTV signals and conventional NTSC analog television signals in order to provide high definition television services without rendering obsolete the installed base of NTSC receivers. Conceptually, program material is encoded into the two different formats (NTSC and HDTV) and simultaneously broadcast over respective 6 MHz transmission channels. Viewers having conventional NTSC equipment would be able to receive and view NTSC programs by tuning to the appropriate NTSC channel, while viewers equipped with HDTV equipment would be able to receive an HDTV program by tuning their receivers to the appropriate HDTV channel. While conceptually simple, simultaneous broadcast of NTSC and HDTV signals often results in characteristic portions of an NTSC signal interfering with adjacent-channel or co-channel HDTV signals, thereby degrading the HDTV signal.
The cause of this form of signal degradation is well understood by those familiar with high definition television transmission systems and is conventionally termed NTSC co-channel interference. Various means have been proposed in the art to reduce NTSC co-channel interference in current HDTV transmission methodologies, particularly with respect to vestigial sideband (VSB) HDTV transmissions, which form the basis of the HDTV standard in the United States. Certain of these conventional NTSC interference rejection means are summarized in the ATSC Digital Television Standard, ATSC Standard A/53 (1995). Briefly, the interference rejection properties of a conventional HDTV system are based on the frequency location of the principal components of the NTSC co-channel interfering signal within the 6 MHz television channel.
FIG. 1 depicts a typical 6 MHz channel spectrum, represented at baseband in the frequency domain (i.e., symmetric about DC) and illustrated in its characteristic raised cosine form 10 with root Nyquist band edges. NTSC co-channel interference is generally recognized as being caused by the three principal carrier components of an NTSC signal: the video carrier (also termed the luma or luminance carrier), the color subcarrier (also termed the chroma or chrominance subcarrier), and the audio carrier (also termed the aural carrier). In the illustrative channel spectrum diagram of FIG. 1, the locations and approximate magnitudes of the three principal NTSC components are depicted, with the video carrier, indicated at V, located approximately 1.25 MHz above the lower channel band edge. The color subcarrier, C, is located approximately 3.58 MHz above the video carrier frequency, and the audio carrier, A, is located approximately 4.5 MHz above the video carrier frequency (i.e., approximately 0.25 MHz below the upper channel band edge). As depicted in the Figure, and as well understood in the art, NTSC carrier component interference is of particular concern due to the relatively large amplitudes of the video carrier V and color subcarrier C which characterize an NTSC transmission. Although the audio carrier A is present at a relatively smaller amplitude, it nevertheless contributes a significant interference characteristic. Thus, it will be understood that NTSC co-channel interference rejection is an important consideration in the design of HDTV reception equipment: the carrier and subcarrier components of an interfering NTSC signal must be removed from an HDTV channel in order to preserve the enhanced quality of an HDTV signal.
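The carrier locations recited above follow from simple arithmetic. The following sketch is illustrative only; it uses the nominal NTSC figures, with the precise 3.579545 MHz chroma offset standing in for the approximate 3.58 MHz value above:

```python
# Illustrative sketch: positions of the three principal NTSC carriers
# within a 6 MHz channel, measured from the lower channel band edge.
# Values are the nominal NTSC figures recited in the text.

CHANNEL_BW = 6e6                 # channel bandwidth, Hz
video = 1.25e6                   # video (luma) carrier V
chroma = video + 3.579545e6      # color subcarrier C (~3.58 MHz above V)
audio = video + 4.5e6            # audio (aural) carrier A

for name, f in (("V", video), ("C", chroma), ("A", audio)):
    print(f"{name}: {f / 1e6:.6f} MHz above lower band edge")

# The audio carrier sits ~0.25 MHz below the upper band edge:
print(f"A is {(CHANNEL_BW - audio) / 1e6:.2f} MHz below upper edge")
```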
A conventional approach to NTSC co-channel interference rejection is based on the frequency location of the principal components of the NTSC co-channel interfering signal within the 6 MHz HDTV channel and on the periodic nulls of a conventional twelve-symbol, feed-forward, subtractive, baseband comb filter, conventionally disposed in the demodulation path of a typical prior-art VSB receiver.
Such a conventional baseband comb filter is depicted in semi-schematic block diagram form in FIG. 2 and suitably comprises a one-tap linear feed-forward filter, indicated generally at 12, which can be represented in terms of a feed-forward delay stage 13 providing an inverted, delayed input component to a composite adder 14. Such comb filters are well understood by those having skill in the art, and their component parts and principles of operation require no further explanation herein. It will suffice to state that the delay stage 13 is constructed such that the filter produces an output spectrum having periodic spectral nulls spaced approximately 57×fH (896.85 kHz) apart, where fH is the NTSC horizontal line rate. Thus, as shown in FIG. 3, seven periodic nulls occur within the 6 MHz channel band, with the NTSC video carrier frequency V falling approximately 2.1 kHz below the second null of the comb filter, the color subcarrier C falling near the sixth null, and the audio carrier A falling approximately 13.6 kHz above the seventh null.
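The null spacing and null count recited above follow directly from the twelve-symbol delay and the ATSC symbol rate. The following sketch is a numeric illustration, not part of the standard; it assumes the symbol rate equals 684×fH (i.e., 12×57×fH), so that the zeros of H(z) = 1 − z⁻¹² fall every 57×fH:

```python
# Illustrative sketch: periodic nulls of the twelve-symbol feed-forward
# subtractive comb, H(z) = 1 - z^-12, operating at the ATSC symbol rate.
# Assumes the symbol rate is 684 x fH (= 12 x 57 x fH).

F_H = 4.5e6 / 286            # NTSC horizontal line rate, ~15.734 kHz
FS = 684 * F_H               # ATSC symbol rate, ~10.762 MHz

null_spacing = FS / 12       # zeros of 1 - z^-12 fall every fs/12 = 57 x fH
nulls = [k * null_spacing for k in range(12) if k * null_spacing < 6e6]

print(f"null spacing: {null_spacing / 1e3:.2f} kHz")   # ~896.85 kHz
print(f"nulls within the 6 MHz band: {len(nulls)}")    # 7 nulls
for k, f in enumerate(nulls, start=1):
    print(f"  null {k}: {f / 1e6:.4f} MHz")
```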
Although the comb filter (12 of FIG. 2) has been generally adopted by the television transmission and reception industry, it suffers from certain significant disadvantages that make its universal use problematic. While the filter rejects steady-state signals located at its null frequencies, only the NTSC color subcarrier C is placed at the center of the filter's sixth null frequency. The video and audio carriers V and A occur at frequencies offset from their respective filter null positions, which prevents the NTSC video and audio carrier signals from being completely canceled by the filter. In addition to incomplete rejection of the NTSC interference components, the filter also modifies data signals occurring at the locations of the periodic nulls throughout the 6 MHz HDTV channel. Although the modified data signal can be recovered and properly decoded by a trellis decoder, the complexity of such a decoder is substantially increased, particularly when it is recognized that the number of slicing levels in the decision loop must be increased from 8 to 15 (a consequence of the partial response process characterizing the system).
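The incomplete cancellation of the offset carriers can be quantified from the comb's magnitude response, |H(f)| = 2|sin(12πf/fs)|, which rises roughly linearly near each null. The following sketch is illustrative only; it assumes the 2.1 kHz (video) and 13.6 kHz (audio) offsets stated above and estimates each carrier's residual relative to the filter's passband peak:

```python
import math

# Illustrative sketch: residual response of the comb H(z) = 1 - z^-12
# at small offsets from a null.  |H(f)| = 2|sin(12*pi*f/fs)|, so near
# a null the residual grows roughly linearly with the frequency offset.

FS = 684 * 4.5e6 / 286       # ATSC symbol rate, ~10.762 MHz

def comb_mag(offset_hz):
    """Magnitude of 1 - z^-12 at a given offset from any null."""
    return 2.0 * abs(math.sin(12 * math.pi * offset_hz / FS))

for name, delta in (("video carrier (2.1 kHz off null)", 2.1e3),
                    ("audio carrier (13.6 kHz off null)", 13.6e3)):
    residual_db = 20 * math.log10(comb_mag(delta) / 2.0)  # vs. passband peak
    print(f"{name}: residual {residual_db:.1f} dB below passband peak")
```

The offset video carrier is thus attenuated but not eliminated, and the larger audio-carrier offset leaves a proportionally larger residual.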
Moreover, the effects of channel band noise may be significantly increased by the filter. This results, in part, from the reproduction of noise appearing on the input line in the filter's delay stage 13, such that the filter output contains an accumulation of the noise component passed through the delay stage 13 and the noise component contained in the original signal. As mentioned above, the conventional comb filter is generally effective in rejecting steady-state signal components. Most forms of noise, however, are random in frequency, phase, and amplitude. Situations will necessarily occur in which the noise components are additive, and the resulting noise product may significantly interfere with desired signals, thereby substantially degrading the quality of an HDTV signal.
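The noise accumulation described above can be illustrated with a simple simulation: for white input noise, the subtractive comb output n[k] − n[k−12] combines two effectively independent noise samples, roughly doubling the noise power (about a 3 dB penalty). The following sketch is a Monte Carlo illustration, not a measurement:

```python
import random
import statistics

# Illustrative sketch: noise accumulation through the subtractive comb.
# For white input noise, the output n[k] - n[k-12] sums two effectively
# independent noise samples, roughly doubling the noise power (~3 dB).

random.seed(1)
noise = [random.gauss(0.0, 1.0) for _ in range(200_000)]
combed = [noise[k] - noise[k - 12] for k in range(12, len(noise))]

gain = statistics.pvariance(combed) / statistics.pvariance(noise)
print(f"noise power gain through the comb: {gain:.2f}x")  # close to 2.0
```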
Accordingly, there remains a need in the art of HDTV transmission and reception system design for a more effective system and method of reducing the effects of NTSC co-channel interference. Such a system should be able to selectively and precisely remove interfering NTSC carrier component signals without substantially affecting the remainder of the channel spectrum (i.e., the user-significant data). Further, the system should be able to process input channel data and remove unwanted interference components without introducing extraneous noise and without skewing the channel, thereby maintaining the original simplicity of the demodulator block.