A dual-frequency antenna: at the first resonance the return loss is high, and at the second resonance the return loss is very low.
What do you think is the reason behind this?
Have you checked the gain (dB) vs. frequency plot? What is the gain at the second frequency? Is it positive or negative in dB? Are you sure it is a new mode? If the gain there is also low, the second frequency may just be a harmonic.
Just curious: high return loss means a big mismatch and low return loss means a good match, right? So at different resonant frequencies the input impedance is different as well. But if the source impedance is fixed, say 50 or 75 ohm, then the return loss should differ at each resonant frequency, am I correct? Please help me understand.
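For what it's worth, the quantity behind all of this is the reflection coefficient Γ = (Z_in − Z_0)/(Z_in + Z_0), with return loss RL = −20·log10|Γ|. So yes: with the source impedance fixed at 50 or 75 ohm, a different input impedance at each resonance gives a different return loss. One caveat on conventions: defined this way, a good match gives a numerically *large* RL in dB, even though people often say "return loss is high" to mean a badly matched antenna. Here is a minimal Python sketch; the two impedance values are made-up examples for illustration, not measured data:

```python
import math

# Minimal sketch: return loss of a load Z_in against a fixed reference
# impedance Z0, using RL = -20*log10(|Gamma|).

def return_loss_db(z_in: complex, z0: float = 50.0) -> float:
    """Return loss in dB of z_in referenced to z0 (larger dB = better match)."""
    gamma = (z_in - z0) / (z_in + z0)  # reflection coefficient
    return -20.0 * math.log10(abs(gamma))

# Hypothetical input impedances at the two resonances (illustrative only).
z_first = 48 + 5j      # close to 50 ohm -> small reflection
z_second = 120 - 60j   # far from 50 ohm -> large reflection

print(f"first resonance:  RL = {return_loss_db(z_first):.1f} dB")   # ~25 dB
print(f"second resonance: RL = {return_loss_db(z_second):.1f} dB")  # ~6 dB
```

Swap in the actual input impedances your simulator reports at the two resonances to see which one is the better match.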
Why does the return loss of an antenna decrease at higher frequencies?
Is there any mathematical proof for this?
...