Conventionally, a degeneration technique may be adopted to improve the linearity of an amplifier. For example, in an amplifier using a source-grounded (common-source) transistor, linearity can be improved by inserting a degeneration resistance. The degeneration technique, however, improves linearity at the cost of reduced gain. Therefore, to widen the variable gain range of the amplifier, the degree to which nonlinear distortion is reduced (the amount of degeneration) must be made adjustable.
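The gain/linearity trade-off described above can be sketched with the standard small-signal relation for a source-degenerated common-source stage, Av = -gm·RL / (1 + gm·Rs). The device values below (gm, RL) are illustrative assumptions, not values from this document:

```python
# Small-signal gain of a common-source stage with source degeneration.
# gm and RL are assumed illustrative values, not from the source text.
gm = 10e-3   # transistor transconductance, 10 mS (assumed)
RL = 1e3     # load resistance, 1 kOhm (assumed)

def voltage_gain(Rs):
    """Av = -gm*RL / (1 + gm*Rs).

    Rs is the degeneration resistance at the source. A larger Rs
    linearizes the stage (more local feedback) but lowers |Av|,
    which is the trade-off the text describes.
    """
    return -gm * RL / (1.0 + gm * Rs)

for Rs in (0.0, 50.0, 100.0, 200.0):
    print(f"Rs = {Rs:6.1f} ohm -> Av = {voltage_gain(Rs):6.2f}")
```

With these numbers the gain magnitude falls from 10 at Rs = 0 to 5 at Rs = 100 ohm, showing why the amount of degeneration directly sets the achievable gain.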
However, when the resistance value of the degeneration resistance is changed, the DC operating point may also change. Further, the amount of degeneration cannot be reduced to zero, so the variable gain range of the amplifier cannot be made sufficiently wide. Further, when the on-resistance of a transistor is used as the degeneration resistance, a transistor of large size is required to sufficiently widen the variable gain range, which is problematic from the viewpoint of chip size and from the viewpoint of characteristics degraded by parasitic capacitance.
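The size problem with using on-resistance as the degeneration element can be illustrated with the rough triode-region approximation Ron ≈ 1 / (k'·(W/L)·Vov): halving the residual on-resistance requires doubling the device width. The process constants below are assumed illustrative values, not from this document:

```python
# Rough triode-region on-resistance model, Ron ~ 1 / (k' * (W/L) * Vov),
# used only to illustrate the size vs. on-resistance trade-off.
# k_prime and Vov are assumed illustrative process values.
k_prime = 200e-6   # process transconductance k' = un*Cox, A/V^2 (assumed)
Vov = 0.3          # gate overdrive voltage Vgs - Vth, volts (assumed)

def width_ratio_for_ron(Ron):
    """W/L needed to reach a target on-resistance Ron (ohms)."""
    return 1.0 / (k_prime * Vov * Ron)

# Each halving of the minimum degeneration resistance doubles W/L,
# so a wide gain range implies a large device (area and parasitics).
for Ron in (100.0, 50.0, 25.0):
    print(f"Ron = {Ron:5.1f} ohm -> W/L = {width_ratio_for_ron(Ron):7.1f}")
```

Because the device must be sized for the smallest on-resistance in the range, the parasitic gate and junction capacitances scale with that worst case, which is the characteristic penalty noted above.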