Thank you, drkirkby, your answer really helped me. Is there a way to prove the above statement mathematically, i.e. that when the antenna is matched and placed next to a lossy dielectric, S11 could drop by 20 dB, whereas when it is not matched it will not drop by that much? Although it makes sense conceptually, it would be nice to confirm it with the math behind it.
Thank you.
The return loss could drop substantially if the antenna is placed next to a lossy dielectric. Ultimately, putting it next to a lossy dielectric will cause it to have some impedance R + jX. If your matching network matches your system impedance (probably 50 Ohms) to R + jX, then the return loss can be theoretically infinite. So the lossy dielectric could cause the return loss to change from 3 dB to 30 dB. However, that is unlikely to happen with random placement of the lossy dielectric. If the antenna is well matched without the lossy dielectric, a randomly placed lossy dielectric is far more likely to cause S11 to change for the worse.
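As a quick numeric illustration (a sketch only, assuming a 50 Ohm system and treating the dielectric's effect as nothing more than a lumped shift in the antenna's impedance):

```python
import math

def return_loss_db(Z, Z0=50.0):
    """Return loss in dB: RL = -20*log10(|Gamma|), with Gamma = (Z - Z0)/(Z + Z0)."""
    gamma = abs((Z - Z0) / (Z + Z0))
    return float('inf') if gamma == 0 else -20 * math.log10(gamma)

print(return_loss_db(50 + 0j))   # perfect match: infinite return loss
print(return_loss_db(30 - 25j))  # mismatched antenna: only a few dB
print(return_loss_db(48 + 2j))   # small shift away from a perfect match: ~30 dB
```

The last line is the point of the argument: an impedance shift that would barely matter for a mismatched antenna takes a theoretically infinite return loss down to about 30 dB.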
I'm not convinced you will be able to find any mathematical proof of this, but I suspect you can show it to be so statistically with modelling software.
One argument might be that if the antenna is perfectly matched, then
any introduction of a dielectric
must make S11 worse. It can't possibly improve S11, since the antenna was previously perfectly matched. However, if the antenna is poorly matched, then introducing a dielectric can make S11 better or it can make it worse. In fact, if you had the impedances of an antenna with 1001 different locations of a lossy dielectric, there must be some starting impedance for which 500 locations make S11 worse and 500 make it better. Of course, for a different starting impedance, it might happen that 900 make S11 better and 100 make it worse, or the other way around. But if the antenna is perfectly matched to start with, any introduction can only make S11 worse.
I think based on the above, it is reasonable to say that the likelihood of S11 improving with random placement of a dielectric depends on how close to matched it was originally. Clearly if the antenna was perfectly matched, any introduction of a dielectric will make matters worse. If the antenna was very well matched (say return loss 40 dB), then it is highly unlikely that random placement of a dielectric will improve upon the very good 40 dB return loss. It seems far more likely that placing a dielectric near the antenna will make S11 worse.
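The statistical side of this could be sketched in software. Below is a toy model (my assumption, not a real simulation: it treats each random dielectric placement as a random Gaussian perturbation added to the antenna impedance; a real study would need an EM solver):

```python
import math
import random

def return_loss_db(Z, Z0=50.0):
    """Return loss in dB for impedance Z in a Z0 system."""
    gamma = abs((Z - Z0) / (Z + Z0))
    return float('inf') if gamma == 0 else -20 * math.log10(gamma)

def fraction_improved(Z_start, trials=10000, spread=10.0, seed=1):
    """Fraction of random impedance perturbations that improve the return loss.
    Each 'dielectric placement' is modelled as a complex Gaussian shift dZ."""
    rng = random.Random(seed)
    base = return_loss_db(Z_start)
    improved = 0
    for _ in range(trials):
        dZ = complex(rng.gauss(0, spread), rng.gauss(0, spread))
        if return_loss_db(Z_start + dZ) > base:
            improved += 1
    return improved / trials

print(fraction_improved(50 + 0j))   # perfectly matched: 0.0, nothing can improve it
print(fraction_improved(30 - 25j))  # poorly matched: a substantial fraction improve
```

Under this toy model the limiting case falls out immediately (a perfect match can never be improved), while a mismatched starting impedance sees a mix of improvements and degradations, as argued above.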
It looks to me it's easy to prove the limiting case (when the antenna is perfectly matched, introducing a dielectric can only make matters worse): at a perfect match |Γ| = 0, and since |Γ| ≥ 0 always, any change to the impedance can only increase |Γ|. I can't see how you will get a mathematical proof when not in that limit, but it seems logical. I think the best you could do is some sort of statistical analysis.
Dave