The development of DSL technology is oriented towards ever-increasing signal bandwidths. The VDSL2 (“Very high-speed DSL”) standardisation process provides for a maximum transmission bandwidth of 30 MHz. The development of analogue front ends for such bandwidths is in general a considerable challenge, because the performance demands for VDSL2 are based on the same line-noise model of −140 dBm/Hz as is used for narrower bandwidths such as ADSL (1.1 MHz), ADSL2+ (2.2 MHz), SHDSL (“Symmetric High-bitrate DSL”, 600 kHz) or VDSL1 (12 MHz). Particular emphasis therefore falls on line drivers, which are used in communications technology in, for example, xDSL transceivers, and there in particular in the analogue front end.
In line driver development for a broadband application such as VDSL2, two aims play a substantial part: improving the linearity and reducing what is referred to as the PAR (“Peak-to-Average Ratio”) or crest factor (“Crest Factor”, CF). Non-linear circuit theory shows that for a non-linear component with the same power spectral density (PSD), the signal-to-noise ratio (SNR) decreases as the signal bandwidth rises, due to non-linear distortion. A large bandwidth and good linearity therefore represent a conflict of aims. A similar problem arises with the PAR property of a line driver. The statistical distribution of the input signal of a line driver for DMT (“Discrete Multi Tone”) modulated signals, such as pertain with ADSL and VDSL2, follows a Gaussian distribution, as a result of which the PAR for a given clipping probability is constant. If the bandwidth is enlarged, the average power also increases, which leads to a higher signal peak level.
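The Gaussian amplitude statistics of DMT signals, and the resulting PAR, can be illustrated with a minimal sketch (the tone count, sample count and helper names below are purely illustrative assumptions): a sum of many equal-amplitude tones with independent random phases approaches a Gaussian distribution by the central limit theorem.

```python
import math
import random

def dmt_signal(num_tones, num_samples, seed=0):
    """Sum of equal-amplitude tones with independent random phases.

    By the central limit theorem, the amplitude distribution of such a
    DMT-like multicarrier signal approaches a Gaussian distribution."""
    rng = random.Random(seed)
    phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(num_tones)]
    return [
        sum(math.cos(2.0 * math.pi * (k + 1) * n / num_samples + phases[k])
            for k in range(num_tones))
        for n in range(num_samples)
    ]

def par_db(samples):
    """Peak-to-average ratio (crest factor) in dB."""
    peak_power = max(s * s for s in samples)
    avg_power = sum(s * s for s in samples) / len(samples)
    return 10.0 * math.log10(peak_power / avg_power)
```

For 256 tones observed over 4096 samples, `par_db(dmt_signal(256, 4096))` lies far above the roughly 3 dB PAR of a single sine wave, in keeping with the Gaussian statistics described above.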
A number of different arrangements for reducing the signal peak level are known from the prior art and are described hereinafter; for reasons of economy and power consumption, these have hitherto been based on the use of one line driver per DSL channel.
A first arrangement consists of increasing the supply voltage, which at constant transmission power results in a lower output current at the line driver and consequently in lower distortion. The increased supply voltage, however, brings with it the disadvantage of an increased PAR.
A second arrangement exploits the very large gain-bandwidth product of CMOS technology for the line driver. Stronger negative feedback in this case improves the linearity of the amplification.
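The linearising effect of negative feedback can be illustrated numerically with a minimal sketch (the amplifier model, gain values and cubic distortion coefficient are illustrative assumptions, not taken from any particular design): an amplifier with open-loop gain 1000, fed back to a net gain of 10, exhibits far less cubic distortion than an amplifier with open-loop gain 10.

```python
def open_loop(x, gain, k3):
    """Weakly non-linear amplifier model: y = gain * (x + k3 * x**3)."""
    return gain * (x + k3 * x ** 3)

def closed_loop(x, gain=1000.0, k3=0.1, beta=0.099):
    """The same amplifier inside a negative-feedback loop with feedback
    factor beta, i.e. the implicit equation y = open_loop(x - beta*y).

    The equation is solved by bisection (the amplifier model is strictly
    monotonic, so the root is unique).  With a loop gain of
    gain * beta = 99, the net gain is gain / (1 + gain * beta) = 10 and
    the cubic distortion is suppressed by roughly the loop gain."""
    lo, hi = -1.0e6, 1.0e6
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if open_loop(x - beta * mid, gain, k3) - mid > 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

At an input of 1.0, the open-loop amplifier with gain 10 delivers 11.0 instead of the ideal 10.0 (10 % cubic distortion), whereas the fed-back amplifier delivers approximately 10.000001: the distortion is reduced by roughly the loop gain.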
In addition, other known arrangements are based on the use of analogue filters of higher order, in particular bandpass filters, driven directly by the line drivers in order to reduce out-of-band distortion. Despite substantially improved performance features, however, these entail a series of substantial disadvantages. For example, for VDSL2 with at least three frequency bands in both upstream and downstream, this would require a filter of the order of twenty. Furthermore, filters of higher order exhibit a not insubstantial passband attenuation and are sensitive to inductance and capacitance tolerances, which leads to a deterioration of the performance features. Filters of higher order also cause greater inter-symbol interference. In this situation, higher costs are also incurred through the BOM (“Bill Of Material”). In general, analogue filters are characterised by lesser flexibility, because, by contrast with digital filters, they are programmable only to a very restricted degree. This is particularly disadvantageous in connection with VDSL2, since a large number of different frequency band plans must be supported. In addition, the analogue filter must cover the entire voltage range of the line driver, which imposes very high demands on its linearity behaviour. In this situation, the linearity of the inductances used is comparable with the linearity of the line transformer used for coupling each transceiver to the transmission line.
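The rapid growth of analogue filter order with the required selectivity can be illustrated with the standard Butterworth order formula (the band edges and attenuation figures below are illustrative assumptions, not VDSL2 band-plan values; a bandpass realisation also needs roughly twice the low-pass prototype order):

```python
import math

def butterworth_order(wp, ws, ap_db, as_db):
    """Minimum order of a Butterworth low-pass prototype with passband
    edge wp, stopband edge ws, at most ap_db of passband attenuation and
    at least as_db of stopband attenuation (standard design formula):
        n >= log10((10**(as/10) - 1) / (10**(ap/10) - 1))
             / (2 * log10(ws / wp))"""
    num = (10.0 ** (as_db / 10.0) - 1.0) / (10.0 ** (ap_db / 10.0) - 1.0)
    return math.ceil(math.log10(num) / (2.0 * math.log10(ws / wp)))
```

With 1 dB passband ripple and 40 dB stopband attenuation, a transition from wp = 1.0 to ws = 2.0 needs an order of 8, but a sharper transition to ws = 1.2, as imposed by closely spaced frequency bands, already needs an order of 29 for the low-pass prototype alone.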
Various algorithms for the reduction of the PAR, also designated as PAR reduction (PARR) algorithms, are known from the prior art, and are usually associated with the following disadvantages. For example, a reduction of the PAR can only be obtained at the expense of signal bandwidth, part of which is consumed by the correction signals. In addition, the development of PARR algorithms is restricted by the interference emissions caused by the correction signals and by the restricted power spectral density (PSD). A further disadvantage is the increased complexity of the digital sub-systems, due largely to the large number of data memories required. Some algorithms also cause substantial signal fluctuations at the remote ends, which can lead to undesirable interference components at remote receivers.
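The trade-off described above can be illustrated with the simplest conceivable PARR scheme, hard clipping (a generic textbook technique, not one of the specific algorithms referred to above): the peak level is reduced, but the difference between input and output forms a correction signal whose energy appears as distortion.

```python
def clip_parr(samples, clip_level):
    """Hard-clip the time-domain signal at +/- clip_level.

    The clipped signal has a reduced peak level; the difference between
    input and output is the correction signal, whose energy appears as
    distortion in the transmitted spectrum."""
    clipped = [max(-clip_level, min(clip_level, s)) for s in samples]
    correction = [s - c for s, c in zip(samples, clipped)]
    return clipped, correction
```

For example, `clip_parr([0.5, -2.0, 1.5, 3.0], 1.0)` reduces the peak from 3.0 to 1.0 but generates the correction signal `[0.0, -1.0, 0.5, 2.0]`, whose energy is the price paid for the lower PAR.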