Weird phase change with distance


gary1943

Hi, all. Recently I found some weird behavior with my antenna. The antenna covers a frequency range from 850 MHz to 4 GHz. When transmitting with carriers from 850 MHz to 2 GHz, I can see the phase of the received signal rotate as I decrease the distance between the transmitting and receiving antennas. However, when doing the same thing with carriers from 2 GHz to 4 GHz, I did not see dramatic phase rotation; there were only trivial changes in amplitude and phase as the distance decreased. That confused me. Any ideas? Thanks so much.
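For reference, in free space the received phase advances by 2πd/λ, so moving the antennas closer by Δd should rotate the phase by Δφ = 2πfΔd/c; if anything, the rotation per centimeter should be larger at the higher carriers, not smaller, which is why this confuses me. A quick sketch of that expectation (plain Python, free-space only, ignoring the feed line and any reflections):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def phase_change_deg(freq_hz: float, delta_d_m: float) -> float:
    """Free-space phase rotation (degrees) when the TX-RX
    separation changes by delta_d_m at carrier freq_hz."""
    return math.degrees(2 * math.pi * freq_hz * delta_d_m / C)

# Example: move the antennas 5 cm closer at several carriers.
for f in (850e6, 2e9, 4e9):
    print(f"{f / 1e9:.2f} GHz: {phase_change_deg(f, 0.05):6.1f} deg per 5 cm")
```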
 

Not sure precisely what you are measuring, but in general the phase will increase as the frequency increases, if for no other reason than that the transmission line has more phase shift at higher frequencies.
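For example, the phase accumulated along the feed line alone scales linearly with frequency. A rough sketch, assuming a hypothetical 1 m line with velocity factor 0.66 (your actual cable will differ):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def electrical_length_deg(freq_hz: float, length_m: float, vf: float = 0.66) -> float:
    """Phase shift (degrees) accumulated along a transmission line
    of physical length length_m and velocity factor vf."""
    wavelength_m = C * vf / freq_hz
    return 360.0 * length_m / wavelength_m

for f in (850e6, 2e9, 4e9):
    print(f"{f / 1e9:.2f} GHz: {electrical_length_deg(f, 1.0):8.1f} deg of line phase")
```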

But if the antenna is matched over only a small frequency band, or if there are other mismatches in the system, standing waves can be set up on the transmission lines, which correspond to significant phase ripple.
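A quick way to see that ripple: model the line as a single "bounce" between two mismatched ends, H(f) = e^(-jβL) / (1 - ΓsΓl e^(-j2βL)), and compare the transmission phase with and without the mismatch. The reflection coefficients and line parameters below are made-up illustration values:

```python
import cmath
import math

C = 299_792_458.0  # speed of light, m/s

def line_phase_deg(freq_hz: float, length_m: float,
                   gamma_s: float, gamma_l: float, vf: float = 0.66) -> float:
    """Transmission phase (degrees) through a line bounded by source/load
    reflection coefficients, multi-bounce model:
    H = e^{-j*beta*L} / (1 - Gs*Gl*e^{-j*2*beta*L})."""
    beta = 2 * math.pi * freq_hz / (C * vf)
    h = cmath.exp(-1j * beta * length_m) / (
        1 - gamma_s * gamma_l * cmath.exp(-2j * beta * length_m))
    return math.degrees(cmath.phase(h))

# Compare a matched line to one with |Gamma| = 0.5 at both ends.
for f in (0.85e9, 1.0e9, 1.15e9):
    matched = line_phase_deg(f, 1.0, 0.0, 0.0)
    mismatched = line_phase_deg(f, 1.0, 0.5, 0.5)
    print(f"{f / 1e9:.2f} GHz: matched {matched:7.1f} deg, "
          f"mismatched {mismatched:7.1f} deg")
```

The difference between the two columns is the phase ripple; it grows with the mismatch.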

Also, of course, you could have wireless multipath that would make the received signal's phase look non-linear with frequency.
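Same idea in a sketch: with even one reflected ray added to the direct ray, the received phase stops being a linear function of distance (or frequency). A toy two-ray model, with the path lengths and reflection amplitude invented purely for illustration:

```python
import cmath
import math

C = 299_792_458.0  # speed of light, m/s

def two_ray_phase_deg(freq_hz: float, d_direct_m: float,
                      d_reflect_m: float, refl_amp: float = 0.4) -> float:
    """Received phase (degrees) for a direct ray plus one reflected
    ray of relative amplitude refl_amp and longer path d_reflect_m."""
    k = 2 * math.pi * freq_hz / C  # free-space wavenumber, rad/m
    field = (cmath.exp(-1j * k * d_direct_m)
             + refl_amp * cmath.exp(-1j * k * d_reflect_m))
    return math.degrees(cmath.phase(field))

# Phase vs distance at two carriers; note the non-uniform steps
# even though the distance changes in equal 5 cm increments.
for d in (1.00, 0.95, 0.90, 0.85):
    print(f"d = {d:.2f} m: "
          f"2 GHz {two_ray_phase_deg(2e9, d, d + 0.3):7.1f} deg, "
          f"4 GHz {two_ray_phase_deg(4e9, d, d + 0.3):7.1f} deg")
```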
 
