Generating the radiofrequency signals required for this imaging technique demands strict linearity in both amplitude and phase. It must also be possible to adjust the output power while continuing to comply with these linearity criteria. A typical linearity criterion is to maintain a differential gain between the output and the input of ±0.15 dB and a differential phase of ±1.25°.
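The tolerance check implied by such a criterion can be sketched as follows. This is an illustrative example only: the function name, the reference point (the first, small-signal measurement), and the sample data are assumptions, not taken from the source.

```python
# Hypothetical sketch: verify a measured gain/phase sweep against the
# linearity criteria quoted above (+/-0.15 dB differential gain,
# +/-1.25 degrees differential phase). Data and names are illustrative.
GAIN_TOL_DB = 0.15      # allowed differential gain, dB
PHASE_TOL_DEG = 1.25    # allowed differential phase, degrees

def meets_linearity_spec(gains_db, phases_deg):
    """Return True if gain and phase stay within tolerance of their
    small-signal (first-point) reference across the power sweep."""
    ref_gain, ref_phase = gains_db[0], phases_deg[0]
    gain_ok = all(abs(g - ref_gain) <= GAIN_TOL_DB for g in gains_db)
    phase_ok = all(abs(p - ref_phase) <= PHASE_TOL_DEG for p in phases_deg)
    return gain_ok and phase_ok

# A sweep whose gain droops 0.1 dB and whose phase advances 1.0 degree passes:
print(meets_linearity_spec([60.0, 59.95, 59.9], [10.0, 10.5, 11.0]))  # True
# A 0.2 dB gain droop exceeds the +/-0.15 dB criterion:
print(meets_linearity_spec([60.0, 59.8], [10.0, 10.0]))               # False
```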
Currently, the amplifiers used for magnetic resonance imaging applications are linear amplifiers operating in class AB. Even so, the use of such amplifiers requires putting in place a linearity correction of the “pre-distortion” type. For such a correction to be usable, the class AB amplifiers must operate with less than one dB of compression. This type of correction is applied in the RF band, which may pose many problems, such as the temperature stability of the power components and the variation of the gain as a function of temperature. The technology of the electronic components used may also be a limiting factor. MOS and LDMOS technologies exhibit a phase rotation in advance of the compression, unlike bipolar technology, which has now been abandoned in the frequency bands used in magnetic resonance imaging. Note that the technology called MOS, from “metal-oxide semiconductor”, refers to transistors whose gate is insulated from the drain-source channel by an oxide layer. LDMOS (“laterally diffused metal-oxide semiconductor”) technology is used for high-frequency power transistors and is obtained by a particular mode of diffusing doping elements into the substrate.
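The memoryless pre-distortion mentioned above can be sketched as inverting the amplifier's measured AM/AM (gain compression) and AM/PM (phase advance) characteristics and applying the inverse to the drive signal. This is a minimal illustrative sketch: the measured curves, the function names, and the piecewise-linear inversion are all assumptions, not the method of the source.

```python
# Hypothetical sketch of memoryless pre-distortion: given measured AM/AM and
# AM/PM curves of a class AB stage, map each desired (linear) output sample
# to the drive sample that produces it. All data and names are illustrative.
import bisect
import cmath
import math

# Assumed measured characteristic near compression:
# input amplitude -> output amplitude (AM/AM) and phase shift in degrees (AM/PM).
AM_IN  = [0.0, 0.25, 0.5, 0.75, 1.0]
AM_OUT = [0.0, 0.24, 0.46, 0.64, 0.75]   # gain compresses at high drive
PM_DEG = [0.0, 0.2, 0.8, 2.0, 4.0]       # phase advances with drive level

def _interp(x, xs, ys):
    """Piecewise-linear interpolation of y(x) over sorted xs."""
    i = min(max(bisect.bisect_right(xs, x) - 1, 0), len(xs) - 2)
    t = (x - xs[i]) / (xs[i + 1] - xs[i])
    return ys[i] + t * (ys[i + 1] - ys[i])

def predistort(sample):
    """Return the complex drive sample that yields the desired output
    'sample', compensating both gain compression and phase advance."""
    amp, phase = abs(sample), cmath.phase(sample)
    drive_amp = _interp(amp, AM_OUT, AM_IN)          # invert the AM/AM curve
    phase_corr = _interp(drive_amp, AM_IN, PM_DEG)   # cancel the AM/PM shift
    return drive_amp * cmath.exp(1j * (phase - math.radians(phase_corr)))
```

For instance, a desired output amplitude of 0.64 maps back to a drive amplitude of 0.75 with a −2.0° phase pre-rotation, so that the stage's own +2.0° advance cancels out. A real implementation would also have to update these tables as the curves drift with temperature, which is precisely the difficulty the paragraph above points out.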