In monolithic radio frequency (“RF”) circuits, heat generated by semiconductor devices (e.g., RF switches or power amplifiers) integrated in semiconductor wafers may have deleterious effects on device performance. For example, heat generated by logic devices, such as complementary metal-oxide-semiconductor (CMOS) transistors, integrated in semiconductor wafers can result in degraded linearity and voltage imbalance across large branches of stacked transistors.
In conventional monolithic RF circuits integrated on semiconductor wafers, the spacing between neighboring semiconductor devices is kept at a predetermined minimum distance (e.g., a minimum pitch) to prevent non-linear behavior of the semiconductor devices due to overheating. As a result, this limitation can adversely affect the cell density of a semiconductor wafer. In the case of a conductive substrate, increasing the spacing between neighboring semiconductor devices can worsen RF linearity; moreover, most conventional thermal conductors (e.g., metals) are also electrical conductors and can interfere with RF signals.
Thus, there is a need in the art for integration of a thermally conductive but electrically insulating layer with a semiconductor wafer to increase cell density and improve RF linearity.