dr pepper
I'm writing code for a PIC micro to decode the MSF time-code signal, which has moved from Rugby to Anthorn.
The datasheet for the code says that every second the carrier is switched off for 0.1, 0.2 or 0.3 seconds, depending on whether the data is 0, 1 or parity, with the exception that second 0 has a 0.5 s off time to mark the start of the minute.
All very well, but when I connected an LED to the output of my MSF receiver there are clearly 2 or 3 places where the carrier switches off and on for about 0.1 s. These occur around second 10, which is the point where the difference between UTC and GMT is broadcast.
This contradicts my datasheet, which says there is a minimum of 0.5 s of carrier every second.
I looked for other data on the encoding but couldn't find anything different. Can anyone shed any light on this?
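For reference, here is a minimal sketch of how the off-period classification described in the post might look in C on a PIC. It assumes the simplified scheme quoted from the datasheet (100/200/300 ms data symbols, 500 ms minute marker) and that the carrier-off duration has already been measured in milliseconds by a timer started on the falling edge of the receiver output and stopped on the rising edge; the function and enum names, the timer source, and the +/-50 ms tolerances are all assumptions, not part of the original post. Note that the short off/on/off pattern observed around second 10 would land in the "unknown" bucket of a classifier like this, which is exactly why the question arises.

```c
#include <stdint.h>

/* One decoded MSF symbol, following the simplified scheme in the post. */
typedef enum {
    MSF_SYM_00,          /* ~100 ms off                                */
    MSF_SYM_10,          /* ~200 ms off                                */
    MSF_SYM_11,          /* ~300 ms off                                */
    MSF_SYM_MINUTE_MARK, /* ~500 ms off: second 0, start of the minute */
    MSF_SYM_UNKNOWN      /* anything that does not fit the above       */
} msf_symbol_t;

/*
 * Classify one carrier-off period (in milliseconds) into a symbol.
 * Tolerance of roughly +/-50 ms around each nominal value is assumed.
 */
static msf_symbol_t msf_classify(uint16_t off_ms)
{
    if (off_ms >=  50 && off_ms < 150) return MSF_SYM_00;          /* ~100 ms */
    if (off_ms >= 150 && off_ms < 250) return MSF_SYM_10;          /* ~200 ms */
    if (off_ms >= 250 && off_ms < 350) return MSF_SYM_11;          /* ~300 ms */
    if (off_ms >= 450 && off_ms < 550) return MSF_SYM_MINUTE_MARK; /* ~500 ms */
    return MSF_SYM_UNKNOWN;
}
```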