1553 bit rate tolerance

Hi!
I'm designing a MIL-STD-1553 decoder.
The 1553 bit rate is specified as follows (see https://en.wikipedia.org/wiki/MIL-STD-1553):

The bit rate is 1.0 megabit per second (1 bit per µs). The combined accuracy and long-term stability of the bit rate is only specified to be within ±0.1%; the short-term clock stability must be within ±0.01%.

I am trying to find a way to handle this accuracy in RTL, but I don't understand the requirement well.

Here is how I see it:
1 Mb/s ±0.1% (worst case) => one bit every 1 µs ±1 ns

Here is what I did:
I chose to sample the data with an 8 MHz clock and tolerated ±1 clock cycle (125 ns).

This means I will consider a bit period of 1 µs ±125 ns as valid.
For example, 1 µs - 125 ns = 875 ns, but this is outside the ±0.1% (±1 ns) tolerance required by the standard.
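To make the numbers concrete, here is a quick Python sketch of the comparison I mean (the 8 MHz clock and the ±1-cycle tolerance are my own design choices, not something the standard specifies):

```python
NOMINAL_BIT_NS = 1000                        # 1 Mb/s -> 1 us nominal bit period
SAMPLE_CLK_HZ = 8_000_000                    # my chosen oversampling clock
SAMPLE_NS = 1_000_000_000 // SAMPLE_CLK_HZ   # 125 ns per sample

# The window my decoder accepts: nominal +/- 1 sample clock cycle
my_lo = NOMINAL_BIT_NS - SAMPLE_NS           # 875 ns
my_hi = NOMINAL_BIT_NS + SAMPLE_NS           # 1125 ns

# The window I read from the standard's +/-0.1% figure
spec_lo = NOMINAL_BIT_NS * (1 - 0.001)       # 999 ns
spec_hi = NOMINAL_BIT_NS * (1 + 0.001)       # 1001 ns

print(f"my window:   {my_lo}..{my_hi} ns")
print(f"spec window: {spec_lo:.0f}..{spec_hi:.0f} ns")
```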

I am confused.


Any idea, suggestion, or link would be helpful.
 

1553 modulation is biphase (Manchester), so clock recovery to 0.1% long-term is normal; short-term stability depends on noise and the method used, e.g. a PLL or a one-shot at 3/4T.

I have no idea why they specify frequency error for biphase, when for the Rx only phase noise and phase margin count.
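
To illustrate, here is a rough software model (Python, my own sketch) of biphase decoding with a one-shot edge mask of 3/4T; it ignores the 1553 sync pattern, parity, and word framing, and the sample rate and test pattern are arbitrary assumptions:

```python
# Rough model of bi-phase (Manchester II) decoding with a 3/4T one-shot.
SAMPLES_PER_BIT = 8                      # e.g. 8 MHz sampling of a 1 Mb/s stream
ONE_SHOT = (3 * SAMPLES_PER_BIT) // 4    # edge-masking window, 3/4T

def encode(bits):
    """Manchester II: logic 1 = high then low, logic 0 = low then high."""
    half = SAMPLES_PER_BIT // 2
    wave = []
    for b in bits:
        first, second = (1, 0) if b else (0, 1)
        wave += [first] * half + [second] * half
    return wave

def decode(wave):
    """Trigger on mid-bit edges; ignore any edge that falls inside the
    3/4T one-shot window, since it can only be a bit-boundary edge."""
    edges = [(i, wave[i - 1], wave[i])
             for i in range(1, len(wave)) if wave[i] != wave[i - 1]]
    bits, last_mid = [], None
    for pos, prev, cur in edges:
        if last_mid is not None and pos - last_mid < ONE_SHOT:
            continue                         # boundary edge: masked
        bits.append(1 if prev > cur else 0)  # falling edge = 1, rising = 0
        last_mid = pos
    return bits

data = [1, 0, 1, 1, 0, 0, 1]
assert decode(encode(data)) == data
print("decoded:", decode(encode(data)))
```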
 

1553 modulation is biphase (Manchester), so clock recovery to 0.1% long-term is normal; short-term stability depends on noise and the method used, e.g. a PLL or a one-shot at 3/4T.

Thank you.

Forgive me; this concerns the analog part of the system, but I am only designing the RTL code.
Does that requirement have an impact on the digital decoder?

Another difficulty I am having is finding a good paper on 1553 transceivers, in order to see how the analog signal is transformed into the digital format and what tolerance is applied to it.

Any ideas?
Thanks in advance.
 

I would interpret that as the accuracy required when transmitting a signal. I haven't studied the standard closely enough to determine whether there is a required phase relationship between the transmit clock and the recovered clock (in your case, an oversampling clock).
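
As a rough sanity check on that interpretation (my own back-of-the-envelope arithmetic, not taken from the standard): if the ±0.1% budget belongs to the transmitter's clock, what the decoder actually has to absorb is the drift accumulated over a word, which is small compared with one 8 MHz sample:

```python
bit_period_ns = 1000.0    # 1 Mb/s nominal
tx_tolerance = 0.001      # +/-0.1% long-term transmitter accuracy
word_bits = 20            # one 1553 word: 3-bit sync + 16 data + parity

drift_per_word_ns = bit_period_ns * tx_tolerance * word_bits   # 20 ns
sample_ns = 1000.0 / 8                                         # 125 ns @ 8 MHz

print(f"worst-case drift over one word: {drift_per_word_ns:.0f} ns")
print(f"one 8 MHz sample period:        {sample_ns:.0f} ns")
# The receiver does not have to measure each bit to +/-1 ns; it only has
# to tolerate ~20 ns of accumulated transmitter drift per word, so a
# +/-1-sample (125 ns) acceptance window is not in conflict with the spec.
```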
 

A software-based RTL decoder with 8x oversampling would compromise the error rate under noisy conditions. Phase noise can also arise from transmission-line group delay distortion of the eye pattern.

Sorry, but I can't help you with the RTL code, except to say that you would be better off with an analog discriminator and a FIFO if you cannot run synchronously with the incoming synchronous data.
 
