Insertion Loss Deviation

This is not a field test parameter.
 
Impedance uniformity is an increasingly important parameter to understand, measure, and quantify for high-speed, full-duplex transmission systems. The most common way to specify cable roughness, or impedance uniformity, has been to measure return loss. Because return loss is a reflection measurement, the impedance variation it can detect at high frequencies is effectively restricted to the first few meters of cabling. There is therefore interest in characterizing impedance uniformity over an entire 100-meter segment in such a way that the high-frequency components of the roughness are not masked or attenuated by distance.
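
To see why distance masks high-frequency roughness in a reflection measurement, note that a reflection originating d meters into the link must travel a 2·d round trip back to the instrument and is attenuated the whole way. The short Python sketch below illustrates this with an assumed attenuation of roughly 22 dB per 100 m at 100 MHz; that figure is an illustrative order of magnitude, not a value taken from this text or any standard.

# Round-trip attenuation of a reflection originating "distance_m" meters into
# the link, assuming ~22 dB per 100 m of cable attenuation at 100 MHz
# (an illustrative assumption, not a specified limit).
atten_db_per_100m = 22.0

for distance_m in (2, 10, 50):
    round_trip_db = 2 * distance_m * atten_db_per_100m / 100.0
    print(f"Reflection from {distance_m:3d} m is attenuated ~{round_trip_db:.1f} dB "
          "before it reaches the test instrument")

A reflection from deep in the link arrives tens of dB weaker than one from the near end, which is why return loss at high frequencies mostly reflects the first few meters of cable.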

One way to accomplish these objectives is to make a through measurement rather than a reflection measurement. When insertion loss is measured on links exhibiting structural impedance variations, a ripple appears in the insertion loss results at high frequencies (typically above 75 MHz). This ripple increases in magnitude with frequency and with the amount of structure in the cable. Insertion loss deviation is a measure of the worst-case difference in magnitude between the expected insertion loss and the actual measured insertion loss. It is determined by first measuring the insertion loss, then computing the maximum difference, across the specified frequency range, between the measured insertion loss and a least-squares curve fitted to the insertion loss data.
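
The Python sketch below illustrates that computation on simulated data: it fits a smooth least-squares curve to an insertion loss sweep and reports the worst-case deviation from that fit. The a*sqrt(f) + b*f + c/sqrt(f) fit model, the frequency band, and all numeric values are assumptions made for illustration only; they are not taken from this text or from any cabling standard.

import numpy as np

# Frequencies in MHz across an assumed measurement band (illustrative).
f = np.linspace(1.0, 100.0, 500)

# Simulated measured insertion loss in dB: a smooth attenuation curve plus a
# high-frequency ripple standing in for structural impedance variations.
smooth = 1.9 * np.sqrt(f) + 0.01 * f + 0.2 / np.sqrt(f)
ripple = 0.3 * (f / 100.0) ** 2 * np.sin(2 * np.pi * f / 7.0)
measured_il = smooth + ripple

# Least-squares fit of the measured data to an assumed
# a*sqrt(f) + b*f + c/sqrt(f) smooth-attenuation model.
basis = np.column_stack([np.sqrt(f), f, 1.0 / np.sqrt(f)])
coeffs, *_ = np.linalg.lstsq(basis, measured_il, rcond=None)
fitted_il = basis @ coeffs

# Insertion loss deviation: worst-case magnitude of the difference between the
# measured insertion loss and the least-squares fit over the band.
ild = np.max(np.abs(measured_il - fitted_il))
print(f"Insertion loss deviation: {ild:.3f} dB")

The ripple term grows with frequency, so the maximum deviation is typically found near the top of the band, consistent with the behavior described above.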

The term "insertion loss" is used instead of attenuation because attenuation assumes matching impedance between the system under test and the test device. For insertion loss measurements the test device is set at 100 ohms and the system under test may have an input impedance between 85 and 115 ohms.

Experiments show that return loss is not necessarily correlated with insertion loss deviation.
 
Interpretation of results
All that can be said is that the minimum possible insertion loss deviation is desirable.