Why does the return loss of an antenna decrease at higher frequencies?

arxlan

Why does the return loss of an antenna decrease at higher frequencies?
Is there any mathematical proof for this?
 

Re: Return Loss Reduction

Could you provide us with some more info (frequency range, size and type of antenna, etc.)?
 

@WimRFP

It is a dual-frequency antenna: at the first resonance the return loss is high, and at the second resonance the return loss is very low.
What do you think is the reason behind this?
 

Have you checked the gain (dB) vs. frequency plot? What is the gain at the second frequency? Is it positive or negative in dB? Are you sure that it is a new mode? If the gain is also low, the second frequency may be a harmonic.
 

Just curious: a high return loss means a big mismatch and a low return loss means a good match, right? So at different resonant frequencies the input impedance is different as well. However, if the source impedance is fixed, say 50 or 75 ohm, then the return loss should be different at different resonant frequencies, am I correct? Please help me understand.
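
For reference, here is a minimal sketch (not from any poster in this thread) of how the return loss at any frequency follows from the mismatch between the antenna input impedance and a fixed source impedance: Γ = (Z_in − Z_0)/(Z_in + Z_0) and RL = −20·log10|Γ|. The 50-ohm source and the series-RLC antenna model with the values below are purely illustrative assumptions, not a model of the original poster's antenna.

```python
import numpy as np

Z0 = 50.0  # assumed source/feed-line impedance in ohms

def return_loss_db(z_in, z0=Z0):
    """Return loss in dB (positive number) from the reflection coefficient."""
    gamma = (z_in - z0) / (z_in + z0)
    return -20.0 * np.log10(np.abs(gamma))

# Illustrative series-RLC antenna model (hypothetical values, not the OP's design).
R, L, C = 45.0, 20e-9, 1e-12          # ohms, henries, farads
f = np.linspace(0.5e9, 2.5e9, 1001)   # frequency sweep in Hz
w = 2 * np.pi * f
z_in = R + 1j * (w * L - 1.0 / (w * C))

rl = return_loss_db(z_in)
f_best = f[np.argmax(rl)]
print(f"Best match near {f_best / 1e9:.2f} GHz, return loss {rl.max():.1f} dB")
```

At each resonance the reactance cancels, so the return loss there is set by how close the remaining resistance is to Z_0; a dual-band antenna can easily present different resistances at its two resonances, which is why the two dips need not be equally deep.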
 

Why does the return loss of an antenna decrease at higher frequencies?
Is there any mathematical proof for this?
...
It is a dual-frequency antenna: at the first resonance the return loss is high, and at the second resonance the return loss is very low.
This is not a general rule. That can happen in your particular design, but it could be otherwise.
Regards

Z
 
Thank you all for your suggestions. Actually, I'm trying to devise a mathematical proof for this condition in my design.
 

@Zorro: I agree with you. That was the reason for asking for more info, but based on the info provided I can't give any useful feedback.
 
