
Problem in measuring distance with time-of-flight of radio waves

Status: Not open for further replies.

amirahmadian

Hi dear friends,
I've decided to build a simple system that measures the distance between an RF transmitter and a receiver (range finding). My idea is based on measuring the time-of-flight of the radio wave with a digital circuit (a microcontroller may also be used). I need an accuracy of about 10 meters. I know that in theory it's not very difficult, but I have an important problem: my circuit works at a frequency of around 20 MHz. Since the radio wave travels at a very high speed (300,000 km/s), the system must be able to measure time periods as small as 33 ns, which is not possible with my hardware.
Are there any methods to overcome this problem? (Note that I cannot increase the frequency of my circuit.) Do you have any ideas?
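The numbers in the question follow directly from the speed of light; a quick back-of-the-envelope sketch (plain Python, purely illustrative, no hardware assumed):

```python
# Back-of-the-envelope check of the timing requirement (illustrative
# sketch only; the constants come from the post above).
C = 299_792_458.0  # speed of light, m/s

def time_of_flight(distance_m):
    """One-way propagation time of a radio wave over distance_m metres."""
    return distance_m / C

# A 10 m accuracy target means resolving roughly 33 ns one-way:
print(f"{time_of_flight(10.0) * 1e9:.1f} ns")        # ~33.4 ns

# A counter clocked at 20 MHz has a 50 ns tick, so one count already
# corresponds to about 15 m of one-way range:
tick = 1 / 20e6
print(f"{tick * C:.1f} m per clock tick")            # ~15.0 m
```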
 

You could measure phase. Something like the AD8302 will do that with no problem. You haven't explained much of your proposed setup, so it is difficult to comment further, other than to say that a "simple system that measures the distance between an RF transmitter and a receiver" probably doesn't exist!

Keith
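To put rough numbers on the phase idea (an illustrative sketch only, not the AD8302's actual interface; `distance_from_phase` is a hypothetical helper): at 20 MHz the wavelength is about 15 m, so a single-frequency phase measurement gives distance only modulo 15 m.

```python
import math

C = 299_792_458.0        # speed of light, m/s
F = 20e6                 # the poster's 20 MHz operating frequency

wavelength = C / F       # ~15 m: the phase reading repeats every wavelength

def distance_from_phase(phi_rad):
    """One-way distance implied by a measured carrier phase.

    Ambiguous: adding any multiple of `wavelength` to the result gives
    the same phase reading.
    """
    return (phi_rad % (2 * math.pi)) / (2 * math.pi) * wavelength

# A receiver 4 m away would see this phase, and the inverse recovers 4 m:
phi = 2 * math.pi * F * 4.0 / C
print(f"{distance_from_phase(phi):.2f} m")
```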
 

Thanks for your reply.
You could measure phase.
I know there are other methods like measuring the phase or signal strength, but I'm really interested in using time of flight. The basis of my project: the transmitter sends a signal (a pulse, for example) and, after a predefined constant delay, it sends another signal (another pulse). The receiver gets these signals and uses a counter to measure the time between them. Since we know the constant delay (and the possible processing time), we can calculate the time it takes the radio wave to travel from transmitter to receiver. I think my only problem is that I don't have such fast counters.
 

There are very simple circuits using equivalent-time sampling that can easily measure 33 ns! Look around. Look at tank-level microwave sensors, for example. They use the same type of circuitry that Tektronix used back in 1970 to measure 5 GHz signals with their sampling oscilloscopes.
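The equivalent-time sampling trick relies on the signal being repetitive: take one sample per repetition, nudging the sampling instant by a tiny step each time, and a slow sampler reconstructs a fast waveform. A minimal numeric sketch of the principle (not a circuit; the frequencies are made up for illustration):

```python
import math

F_SIG = 50e6      # repetitive signal under test (20 ns period)
STEP = 1e-9       # delay added per repetition -> 1 ns effective resolution
N = 40            # repetitions, i.e. reconstructed samples

def signal(t):
    """The repetitive waveform being sampled."""
    return math.sin(2 * math.pi * F_SIG * t)

# Acquisition n looks n*STEP into the (repeating) waveform, so the
# effective sample rate is 1/STEP even though only one real sample is
# taken per repetition of the signal.
reconstructed = [signal(n * STEP) for n in range(N)]
print(f"effective rate: {1 / STEP / 1e9:.0f} GS/s")
```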
 

The basis of my project: the transmitter sends a signal (a pulse, for example) and, after a predefined constant delay, it sends another signal (another pulse). The receiver gets these signals and uses a counter to measure the time between them. Since we know the constant delay (and the possible processing time), we can calculate the time it takes the radio wave to travel from transmitter to receiver. I think my only problem is that I don't have such fast counters.

Sorry, but maybe I missed something. If you send a pulse and, after a known period (call it "Td"), another pulse, then at the receiver side the two pulses will always arrive spaced by "Td", regardless of the distance between TX and RX.
To perform the distance measurement, the clocks at the TX side and the RX side must instead be synchronized with each other, so that you know the instant the pulse was sent and the instant it was received.
Another method: the signal can be sent from TX to RX and reflected back to the TX, which receives it (by means of an additional receiver) and calculates the elapsed time (e.g. radar).
 
If you send a pulse and, after a known period (call it "Td"), another pulse, then at the receiver side the two pulses will always arrive spaced by "Td", regardless of the distance between TX and RX.
Yes, you are right. Sorry, I made a mistake. Actually I had an idea like yours in mind, but I couldn't explain it correctly! I just wanted to emphasize time of flight.
the signal can be sent from TX to RX and reflected back to the TX, which receives it (by means of an additional receiver) and calculates the elapsed time
This method is OK. But as I said, my problem is with the hardware. Since the clock frequency is not high enough, how can I measure the elapsed time? (I need an accuracy of about 10 meters.) Is there a way to improve this method so it can be used with slower clocks?
 

If the signal travels a round trip, all distances are doubled (because the signal runs forward and then back), so 10 meters will be equivalent to about 66 ns.
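The round-trip arithmetic as a small sketch (the 50 ns figure assumes the poster's 20 MHz counter clock; `round_trip_distance` is an illustrative helper, not any existing API):

```python
C = 299_792_458.0  # speed of light, m/s

def round_trip_distance(elapsed_s, transponder_delay_s=0.0):
    """Distance from a two-way (radar-style) time measurement.

    The wave covers the path twice, so halve the time after removing
    any known fixed delay in the responder.
    """
    return (elapsed_s - transponder_delay_s) * C / 2

# 10 m of range corresponds to about 66.7 ns of round-trip delay:
print(f"{2 * 10.0 / C * 1e9:.1f} ns")

# One 50 ns tick of a 20 MHz counter resolves about 7.5 m two-way:
print(f"{round_trip_distance(50e-9):.1f} m per tick")
```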
 

albbg brought up the main problem: measuring a one-way delay requires good synchronization at both ends, and it probably isn't feasible to run a reference clock between them. So you can either do a call-and-response approach or a two-tone approach. For example, if you transmit 20 MHz and 25 MHz simultaneously, with a known phase offset, and the receiver can recover both carriers, then it should be possible to recover distance based on the phase difference, without a phase-locked reference. Your effective frequency would be 5 MHz in this case, and you could only know the distance unambiguously if it's less than one wavelength at 5 MHz (unless you use more sophisticated modulation techniques).

And it is possible to detect a "delay" of less than one fundamental period; you just need a good phase-sensitive detector.
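The two-tone idea can be put into numbers; a sketch under the assumption that both tones leave the transmitter with zero relative phase (`distance_from_phases` is illustrative, not an existing API):

```python
import math

C = 299_792_458.0
F1, F2 = 20e6, 25e6        # the two tones suggested above
F_BEAT = F2 - F1           # effective 5 MHz measurement frequency

def distance_from_phases(phi1_rad, phi2_rad):
    """One-way distance from the received phase of each tone.

    Unambiguous only within one wavelength at the 5 MHz difference
    frequency (about 60 m).
    """
    dphi = (phi2_rad - phi1_rad) % (2 * math.pi)
    return dphi * C / (2 * math.pi * F_BEAT)

# Unambiguous range = one wavelength at 5 MHz:
print(f"{C / F_BEAT:.1f} m")

# A receiver 30 m away sees these phases; the difference recovers 30 m:
d = 30.0
phi1 = 2 * math.pi * F1 * d / C
phi2 = 2 * math.pi * F2 * d / C
print(f"{distance_from_phases(phi1, phi2):.2f} m")
```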
 

the signal can be sent from TX to RX and reflected back to the TX, which receives it (by means of an additional receiver) and calculates the elapsed time (e.g. radar).
I thought about it again and came up with a question. Your suggested system works like this: A transmits a signal to B, B responds to A, and finally A calculates the elapsed time between sending the signal and getting the response (the round-trip time). In practice, when B receives the signal it cannot transmit the response immediately; B will have some delay (maybe short) before sending the signal back to A (for example, a microcontroller needs some time to do the processing). So this extra time is added to the round-trip time of the radio wave. I think this makes the calculation invalid, doesn't it?
 

Attachment: final.rar (53.4 KB)

1. You finally discovered that the distance must be measured as a round-trip delay.
2. Referring to the altitude measurement link by Bob60, a radar (FMCW or pulsed) with a passive reflector can be a solution. I guess a reflector may be infeasible for your project.
3. Presuming an active transponder can be made with sufficient timing accuracy, I simply can't imagine that you have the knowledge to design it.
 
A microwave device for this application was called a "tellurometer"; it has now been replaced by laser distance meters.
The time-of-flight measurement needs a reference. As one is not available, you must use a retransmitter and evaluate the doubled delay at the first transmitter. Each unit must be calibrated for internal delay to determine the exact time of flight.
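One way to handle the fixed internal delay is a one-off calibration at a known short range, then subtracting that delay from every later reading. A sketch with made-up numbers (the 2.5 µs figure is purely illustrative; both helpers are hypothetical):

```python
C = 299_792_458.0  # speed of light, m/s

def calibrate_delay(measured_rt_s, known_distance_m):
    """Fixed internal delay of the TX/transponder pair, found once
    with the units placed at a known distance."""
    return measured_rt_s - 2 * known_distance_m / C

def ranged_distance(measured_rt_s, internal_delay_s):
    """Distance after subtracting the calibrated internal delay."""
    return (measured_rt_s - internal_delay_s) * C / 2

# Hypothetical: calibrate with the units 1 m apart, where the raw
# round-trip reading is 2.5 us (almost all of it transponder delay).
delay = calibrate_delay(2.5e-6, 1.0)

# A later raw reading of 2.9 us then corresponds to roughly 61 m:
print(f"{ranged_distance(2.9e-6, delay):.1f} m")
```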
 

Two other questions:
1. Assuming that I have found the exact delay in the receiver and the transmitter (delay of the demodulator, stages, etc.), can I assume this delay always remains the same (over time)?
2. Even if I ignore the problem of delay, what about the accuracy of the clock frequency? Every oscillator has finite frequency stability. Doesn't clock frequency drift affect the measurements?
 

Two other questions:
1. Assuming that I have found the exact delay in the receiver and the transmitter (delay of the demodulator, stages, etc.), can I assume this delay always remains the same (over time)?
2. Even if I ignore the problem of delay, what about the accuracy of the clock frequency? Every oscillator has finite frequency stability. Doesn't clock frequency drift affect the measurements?

1. The instrument delay must be characterized. By measurement you must also know whether, e.g., gain adjustment affects the delay and how. Cables used in the equipment have the most important effect on delay.
2. Clock frequency is important, and there are precise oscillators that are very stable. For instance, 10 MHz OCXOs are adjusted to exactly 10 MHz, can have a precision better than 10^-8, and have a specified aging.
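To see why oscillator stability is the lesser worry here: a fractional frequency error scales the measured time, and hence the distance, by the same fraction. A quick sketch (the 50 ppm and 1 km figures are illustrative):

```python
C = 299_792_458.0  # speed of light, m/s

def distance_error(distance_m, fractional_freq_error):
    """Range error caused by a clock running off-frequency by the
    given fractional amount (e.g. 50e-6 for a 50 ppm crystal)."""
    return distance_m * fractional_freq_error

# A cheap 50 ppm crystal over a 1 km range errs by only 5 cm:
print(f"{distance_error(1000.0, 50e-6) * 100:.1f} cm")

# An OCXO at 1e-8 is negligible; the counter's tick size dominates:
print(f"{distance_error(1000.0, 1e-8) * 1e6:.2f} um")
```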
 