Revising 5G RF Calibration Procedures For RF IC Production Testing

At mmWave frequencies, signals are more susceptible to impairments, requiring extra consideration in the selection of test solutions, cables and connectors. System-level calibration is also essential to achieve accurate measurements.


Modern radio frequency (RF) components introduce many challenges for outsourced semiconductor assembly and test (OSAT) suppliers, whose objective is to ensure products are assembled and tested to meet the product test specifications. Demand for RF products for cellphones, navigational instruments, global positioning systems, Wi-Fi, receiver/transmitter (Rx/Tx) components and more continues to grow, driving the need for increasingly advanced 5G cellphone and Wi-Fi components.

In any RF test system, the ability to achieve instrument-port accuracy at the device under test (DUT) enhances measurement accuracy and repeatability. Unfortunately, the non-ideal nature of the cables, components, traces, switches and other elements in the paths between the instruments and the DUT can degrade measurement accuracy.
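A common way to recover instrument-port accuracy at the DUT is to characterize the loss of each signal path during calibration and apply it as a correction to raw readings. The sketch below is illustrative only and is not from the article; the loss values, frequencies and function names are assumptions.

```python
# Minimal sketch: referring a raw power reading back to the DUT port by
# adding back the signal-path loss measured during system calibration.
# The frequencies and loss values below are hypothetical examples.

path_loss_db = {
    28.0e9: 6.2,  # measured path loss at 28 GHz, in dB
    39.0e9: 9.8,  # measured path loss at 39 GHz, in dB
}

def corrected_power_dbm(raw_dbm: float, freq_hz: float) -> float:
    """Compensate a raw instrument reading (dBm) for the calibrated path loss,
    so the result refers to the power at the DUT port."""
    return raw_dbm + path_loss_db[freq_hz]

# A -15.0 dBm raw reading at 28 GHz corresponds to about -8.8 dBm at the DUT.
print(corrected_power_dbm(-15.0, 28.0e9))
```

In practice the loss table would be populated per path, per frequency point (and often per temperature) during the system calibration step, rather than hard-coded.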

While current calibration methodologies may have sufficed in the past, the advance of RF technologies into mmWave demands that calibration procedures be revised to cover the extended specifications. It is important to consider the key aspects of signal path calibration, namely system calibration, cable calibration, load board trace de-embedding and golden unit calibration, as well as how to use these aspects as a unique advantage in developing calibration standards.
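Of the aspects listed above, load board trace de-embedding is the most mechanical to express in code: if the input and output fixture traces can be modeled (or measured), their cascade matrices can be inverted out of the overall measurement to isolate the DUT. The following is a minimal sketch using ABCD (cascade) matrices with an idealized lossless-line fixture model; the model and values are assumptions for illustration, not the article's procedure.

```python
import numpy as np

# Hedged sketch of load board trace de-embedding with ABCD (cascade) matrices.
# The fixture model (an ideal lossless 50-ohm line section) is illustrative;
# real fixtures would be characterized with a VNA or EM simulation.

Z0 = 50.0  # reference impedance, ohms

def line_abcd(beta_l: float) -> np.ndarray:
    """ABCD matrix of an ideal lossless transmission line of electrical
    length beta_l (radians)."""
    return np.array([[np.cos(beta_l), 1j * Z0 * np.sin(beta_l)],
                     [1j * np.sin(beta_l) / Z0, np.cos(beta_l)]])

def de_embed(measured: np.ndarray, trace_in: np.ndarray,
             trace_out: np.ndarray) -> np.ndarray:
    """Strip the input/output trace sections from the measured cascade.
    measured = trace_in @ dut @ trace_out, so
    dut = inv(trace_in) @ measured @ inv(trace_out)."""
    return np.linalg.inv(trace_in) @ measured @ np.linalg.inv(trace_out)

# Synthetic self-check: embed a known DUT matrix, then recover it.
dut = np.array([[1.0, 5.0j], [0.02j, 1.0]])
t_in, t_out = line_abcd(0.3), line_abcd(0.5)
measured = t_in @ dut @ t_out
recovered = de_embed(measured, t_in, t_out)
print(np.allclose(recovered, dut))  # True
```

The same inversion idea applies when working in S-parameters (via conversion to T-parameters); ABCD matrices are used here only because the cascade relationship is most direct in that form.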


