return loss at 0 frequency?


rameshrai

Hello,

My inset-fed patch antenna simulation shows a return loss of -12.5 dB at 0 frequency; the attached picture shows the problem. I have changed almost all the parameters (gap, feed width, feed length, etc.) to bring it up to 0 dB, but it is not happening.

How can I bring it up?

Looking forward to your advice.

thanks,
ramesh
 

Attachments

  • s11db.png

FYI, your attachment is flagged by the forum admin and we can't open it. You might want to fix it.
 

Why does the return loss at 0 Hz matter at all?

You didn't tell us how you obtained the results; we can only assume that the loss curve is plausible for your antenna design and/or simulation setup.
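
For reference, a minimal argument for why roughly 0 dB would be expected at DC, assuming the patch has no DC path to ground (no shorting pins or vias): at 0 Hz the antenna presents an open circuit to the feed, so

$$\Gamma = \frac{Z_L - Z_0}{Z_L + Z_0}, \qquad |Z_L| \to \infty \;\Rightarrow\; |\Gamma| \to 1 \;\Rightarrow\; |S_{11}| \to 0~\text{dB}.$$

A value of -12.5 dB at 0 frequency therefore points to a plotting or extrapolation artifact rather than a physical result.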
 

Every EM simulator has a lowest frequency limit; DC is not included in the simulation range. It's normal: the simulator extrapolates the curve down to DC even though the extrapolated value is wrong.
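
One quick way to check this is to look at where the solved sweep actually starts. A minimal sketch, assuming the solver can export the result as a Touchstone .s1p file and that scikit-rf is available; the file name patch_s11.s1p is only a placeholder:

Code:
import skrf as rf

# Load the exported one-port S-parameters (placeholder file name).
ntw = rf.Network("patch_s11.s1p")

f_start = ntw.f[0]                # lowest frequency actually solved, in Hz
s11_start_db = ntw.s_db[0, 0, 0]  # |S11| in dB at that first swept point

print(f"Sweep starts at {f_start / 1e9:.3f} GHz with |S11| = {s11_start_db:.1f} dB")
print("Anything plotted below this frequency is extrapolated, not simulated.")

Any point the plot shows below that starting frequency comes from the plotting tool, not from the field solution.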
 

Initially, I thought the problem was the gap between the microstrip line and the patch, because I had used a guessed value for the gap and couldn't find any good supporting theory for it. But as all of you pointed out, especially bigboss, the curve seen at DC is not much of a concern; I understand that now.

I would still like to confirm the gap: is there a best gap value, if any?

Thanks
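
There is no closed-form optimum for the gap (notch) width itself; a common starting point is to make it comparable to the feed-line width and then tune it with a parameter sweep. The inset depth, on the other hand, has a standard textbook approximation (Balanis), where the input resistance falls off as cos^2 of the inset position. A minimal sketch of that calculation, assuming you already know the edge resistance of your patch (for example from a simulation fed at the patch edge); the numbers below are placeholders only:

Code:
import math

def inset_depth(edge_resistance_ohm, target_ohm, patch_length_m):
    """Inset distance y0 that transforms the patch edge resistance down to
    the target feed impedance, using R_in(y0) = R_edge * cos^2(pi*y0/L)."""
    return (patch_length_m / math.pi) * math.acos(
        math.sqrt(target_ohm / edge_resistance_ohm))

# Placeholder values: 240-ohm edge resistance, 50-ohm line, 38 mm patch length.
y0 = inset_depth(edge_resistance_ohm=240.0, target_ohm=50.0, patch_length_m=38e-3)
print(f"Inset depth for a 50-ohm match: {y0 * 1e3:.2f} mm")

Even with this starting point, both the gap width and the inset depth usually need a small sweep in the simulator to fine-tune the match at the resonant frequency.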
 
