Reliability of HT12E/D pair.


Pjdd

The Holtek HT12E/D pair is widely used in simple remote control systems, although I've never used them in my own designs. As black-box devices they are obviously easy to use. I've read the datasheets, and their data-integrity checking seems good enough for applications like direction control of RC models.

What I'd like to ask of experienced users is whether these devices are suitable for more critical applications. In a simple RC toy, a glitch or errant pulse at the output may be hardly noticeable, but in other applications it could have undesirable effects.

For example, suppose each press of the same key at the encoder is used to produce a slow train of pulses at the decoder output to manually control a sequence of events. For clarity, suppose further that the encoder and decoder are linked by ASK/OOK wireless at long distance and/or in a noisy environment. A missed signal might be tolerated but spurious output from the decoder could produce undesirable results. In such a scenario, how reliable would an HT12E/D-based system be? (This is for reference, not about an actual project).
 

I think the reliability question concerns the RF stage more than the codec pair itself. Nothing prevents you from adding further measures to minimise the likelihood of losing data, such as command redundancy, acknowledge requests, different carriers, appended error-correcting codes, etc.
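
As an illustration of the command-redundancy idea, here is a minimal transmit-side sketch: the 4-bit command is sent twice, the second time bit-inverted, so the receiver acts only when the second word is the exact complement of the first. The helper functions, pin handling and timing below are assumptions for illustration, not part of any Holtek or vendor API.

Code:
/* Simple redundancy layered on the HT12E: send each 4-bit command twice,
 * the second time bit-inverted. The receiver accepts the command only if
 * the second word is the exact complement of the first.
 */
#include <stdint.h>

void ht12e_set_data(uint8_t nibble);   /* drive AD8..AD11 of the HT12E       */
void ht12e_pulse_te(void);             /* pulse /TE so one word is sent      */
void delay_ms(unsigned ms);            /* crude delay, board-specific        */

void send_command(uint8_t cmd)
{
    cmd &= 0x0F;                             /* HT12E carries only 4 data bits */

    ht12e_set_data(cmd);                     /* first copy: the command itself */
    ht12e_pulse_te();
    delay_ms(50);                            /* let the encoder finish the word */

    ht12e_set_data(((uint8_t)~cmd) & 0x0F);  /* second copy: its complement     */
    ht12e_pulse_te();
    delay_ms(50);
}

The receive side would latch the first word, wait for the second, and act only when the second equals the bitwise complement of the first; on a mismatch it simply discards both.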
 

There could be a problem with oscillator stability unless stable components are used. To improve the security of the link, repeat each control command on transmit, and have the receiving end refuse to actuate unless two identical commands arrive in succession (see the sketch below).
Frank
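
A minimal sketch of that two-identical-commands rule at the receiving end, assuming a small MCU watches the HT12D's VT (valid transmission) pin and data outputs D8..D11; the helper functions are placeholders, not a real driver.

Code:
#include <stdbool.h>
#include <stdint.h>

bool    ht12d_vt_high(void);     /* true while VT flags a valid word      */
uint8_t ht12d_read_data(void);   /* current state of D8..D11 (0..15)      */
void    actuate(uint8_t cmd);    /* whatever the output stage does        */

void receive_loop(void)
{
    int last_cmd = -1;           /* -1 means "no previous command yet"    */

    for (;;) {
        if (!ht12d_vt_high())
            continue;            /* wait until the decoder accepts a word */

        uint8_t cmd = ht12d_read_data();

        if ((int)cmd == last_cmd) {
            actuate(cmd);        /* two identical words in a row: act     */
            last_cmd = -1;       /* demand a fresh pair for the next one  */
        } else {
            last_cmd = cmd;      /* first copy seen; wait for the repeat  */
        }

        while (ht12d_vt_high())
            ;                    /* let VT drop before sampling again     */
    }
}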
 

What I had in mind for this hypothetical project is to use ready-made crystal-controlled Tx/Rx modules. The HT12 pair uses 8 address bits and 4 data lines. All 12 bits are sent serially, and the decoder checks them for consistency three times before releasing an output. The encoder also sends header bits and automatically completes a word if the send command is terminated in mid-transmission.

Looks pretty good for the type of application I outlined in my opening post. But then I'm no expert in this field, which is why I asked for your opinions. My main concern is about the possibility of spurious output from the decoder as a result of noise and RFI. Seems highly unlikely to me. What do you say?
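
To make the three-consecutive-words check concrete, here is a rough software equivalent of what the datasheet describes the HT12D doing in silicon; the word layout, receive_word() and the fixed address are illustrative assumptions only.

Code:
#include <stdint.h>

#define LOCAL_ADDRESS 0xA5u            /* example A0..A7 setting (assumed)    */

uint16_t receive_word(void);           /* returns one raw 12-bit word         */
void     latch_outputs(uint8_t data);  /* drive the D8..D11 equivalents       */

void decode_loop(void)
{
    uint16_t history[3];
    int      count = 0;

    for (;;) {
        uint16_t word = receive_word() & 0x0FFF;
        uint8_t  addr = (uint8_t)(word >> 4);      /* upper 8 bits: address    */

        if (addr != LOCAL_ADDRESS) {               /* wrong address: start over */
            count = 0;
            continue;
        }

        history[count++] = word;

        if (count == 3) {
            if (history[0] == history[1] && history[1] == history[2])
                latch_outputs((uint8_t)(word & 0x0F));  /* release 4 data bits  */
            count = 0;                             /* begin a fresh check       */
        }
    }
}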
 

My main concern is about the possibility of spurious output from the decoder as a result of noise and RFI. Seems highly unlikely to me. What do you say?

In general, the overall SNR is what dictates the suitable approach. As mentioned above, the greater the reliability requirement, the more complex the communication protocol needed to overcome these problems.
 
