The 3GPP Long-Term Evolution (LTE) of UTRAN is a system using orthogonal frequency division multiplexing (OFDM) with frequency-localized allocations.
In the LTE system, one main difference from earlier 3GPP releases is the use of wide channels that are shared among users in the frequency domain (i.e., frequency division multiplexing). The allocation for one user can vary from one resource block (RB) to the maximum number of resource blocks in the channel (e.g., 50 RB for a 10 MHz channel). A resource block is the smallest allocable frequency range of the uplink or downlink frequency band lasting a predefined time. For example, in LTE, a resource block is 180 kHz wide and lasts for one 0.5 ms time slot.
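The resource-block arithmetic above can be sketched in a few lines. This is an illustrative helper, not part of any standard API; the channel-bandwidth-to-RB mapping follows the standard LTE transmission bandwidth configurations (e.g., 50 RB for 10 MHz), and one RB is 12 subcarriers of 15 kHz, i.e. 180 kHz, per 0.5 ms slot.

```python
# Illustrative LTE resource-block (RB) arithmetic.
RB_BANDWIDTH_HZ = 180_000   # 12 subcarriers x 15 kHz subcarrier spacing
SLOT_DURATION_S = 0.0005    # one time slot = 0.5 ms

# Standard LTE channel bandwidths and their maximum number of RBs.
CHANNEL_RBS = {1.4e6: 6, 3e6: 15, 5e6: 25, 10e6: 50, 15e6: 75, 20e6: 100}

def allocation_bandwidth_hz(num_rbs: int) -> int:
    """Occupied bandwidth of an allocation of num_rbs resource blocks."""
    return num_rbs * RB_BANDWIDTH_HZ

if __name__ == "__main__":
    max_rbs = CHANNEL_RBS[10e6]
    print(max_rbs)                           # 50
    print(allocation_bandwidth_hz(max_rbs))  # 9000000, i.e. 9 MHz of the 10 MHz channel
```

Note that the maximum allocation (9 MHz of occupied bandwidth in a 10 MHz channel) leaves the remainder as guard band at the channel edges.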
In such a system an effect called UE self-interference may occur, in which emissions of the uplink (from the user equipment (UE) to the base station) extend into the downlink (from the base station to the user equipment). UE self-interference may exist both for intra-technology cases (e.g., LTE transmission to LTE reception) and for inter-technology cases. Interference may also occur where the LTE UE transmitter interferes with another UE's receiver in the adjacent channel. This, again, may exist for intra-technology cases (e.g., LTE-TDD next to LTE-FDD) or inter-technology cases.
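The self-interference condition described above reduces to a band-overlap question: do the uplink emissions, widened by some out-of-band leakage, reach into the downlink receive band? A minimal sketch follows; all band edges and the leakage margin are hypothetical placeholders for illustration, not 3GPP-specified quantities.

```python
# Illustrative check for UE self-interference: do the uplink emissions
# (the allocation plus an assumed out-of-band leakage margin) overlap
# the downlink receive band? All frequencies are in Hz.

def bands_overlap(lo1: float, hi1: float, lo2: float, hi2: float) -> bool:
    """True if the open intervals (lo1, hi1) and (lo2, hi2) overlap."""
    return lo1 < hi2 and lo2 < hi1

def self_interference_risk(ul_lo: float, ul_hi: float,
                           dl_lo: float, dl_hi: float,
                           leakage_margin_hz: float) -> bool:
    """True if uplink emissions, widened by the leakage margin, reach
    the downlink band, i.e. the duplex gap fails to isolate them."""
    return bands_overlap(ul_lo - leakage_margin_hz, ul_hi + leakage_margin_hz,
                         dl_lo, dl_hi)

if __name__ == "__main__":
    # Hypothetical FDD arrangement: 10 MHz uplink, 5 MHz duplex gap,
    # 10 MHz downlink.
    ul_lo, ul_hi = 1.000e9, 1.010e9
    dl_lo, dl_hi = 1.015e9, 1.025e9
    print(self_interference_risk(ul_lo, ul_hi, dl_lo, dl_hi, 10e6))  # True
    print(self_interference_risk(ul_lo, ul_hi, dl_lo, dl_hi, 1e6))   # False
```

The example shows why a large duplex gap (or a sharp duplex filter, which effectively shrinks the leakage margin) avoids the problem.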
Conventional techniques for avoiding both UE-to-UE interference and UE self-interference are:
- uncritical frequency band organization (e.g., large duplex gaps);
- large guard bands; and
- high-quality (duplex) filters.
If conventional techniques are excluded, the two straightforward ways to solve the UE self-interference issue are:
- limiting the UE power; or
- simply accepting the lower sensitivity.
The latter alternative means that the evolved Node B (eNodeB) must increase the downlink power to compensate for the reduced sensitivity. The current sensitivity specification already allows some sensitivity degradation for wide allocations (3GPP TS 36.101, sub-clause 7.3).
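To first order, the compensation described above is a simple dB budget: a sensitivity degradation of X dB at the UE receiver is offset by raising the downlink transmit power by the same X dB. The sketch below uses hypothetical numbers (the 43 dBm nominal power and 3 dB degradation are placeholders, not values from TS 36.101).

```python
# Illustrative downlink power compensation for degraded UE sensitivity.
# All numeric values are hypothetical placeholders.

def required_dl_power_dbm(nominal_dl_power_dbm: float,
                          sensitivity_degradation_db: float) -> float:
    """eNodeB downlink power needed to offset a UE sensitivity loss.

    To first order, X dB of receiver desensitization is compensated by
    X dB more downlink transmit power, restoring the same link margin.
    """
    return nominal_dl_power_dbm + sensitivity_degradation_db

if __name__ == "__main__":
    # Example: 3 dB of self-interference-induced desensitization.
    print(required_dl_power_dbm(43.0, 3.0))  # 46.0
```

The obvious cost is that this extra downlink power is spent per affected UE, which is why the conventional techniques (duplex gaps, guard bands, filters) are preferred where available.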