How do I reduce the noise of the digital control signals?


ahgu

I have several digital signals going from a microchip into a CCD chip.
The signals coming out of the microchip are very noisy. What is the best way to reduce the noise/glitches? The noise is small compared to the logic levels, but it might add some noise to the CCD output.

If the best way is to use a buffer, is there any difference between a common bus buffer/driver and a Schmitt-trigger buffer?

thanks
Ahgu
 

Hi,
Schmitt-trigger inputs have some hysteresis at the switching level, so the noise on your signal gets "cleaned up" to some extent. You must still check that the noise on your digital signal is not so high that the worst-case logic levels of the next stage are violated...
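
A rough way to sanity-check that, as a minimal sketch (the threshold and noise numbers below are made-up placeholders; take the real ones from the buffer's datasheet and a scope measurement):

# Rough noise-margin check for a Schmitt-trigger buffer input.
# All voltages are hypothetical placeholders; substitute datasheet values.
V_HIGH    = 5.0   # driver high level (V)
V_LOW     = 0.0   # driver low level (V)
V_T_PLUS  = 2.9   # Schmitt positive-going threshold (V)
V_T_MINUS = 2.1   # Schmitt negative-going threshold (V)
noise_pk  = 0.4   # peak noise riding on a settled signal (V)

# A settled high level only re-crosses the lower threshold if the noise
# dips below V_T_MINUS; a settled low level only trips if it exceeds V_T_PLUS.
high_margin = (V_HIGH - V_T_MINUS) - noise_pk
low_margin  = (V_T_PLUS - V_LOW) - noise_pk

print(f"high-level margin: {high_margin:.2f} V")
print(f"low-level margin:  {low_margin:.2f} V")
if high_margin > 0 and low_margin > 0:
    print("noise stays inside the hysteresis window -> no false switching")
else:
    print("noise can re-cross a threshold -> expect glitches")
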

You should also check the bypassing of your circuits: do you have some 10...100 nF capacitors on the board, on all the CMOS ICs?
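
To see why the local bypass cap matters, here is a back-of-the-envelope sketch using ΔV = I·Δt/C (the current and edge-time numbers are assumptions for illustration, not taken from your design):

# Estimate local supply droop during one output edge, dV = I * dt / C.
# Numbers are illustrative assumptions, not from any datasheet.
i_peak = 0.05    # peak transient current drawn during the edge (A)
t_edge = 5e-9    # edge duration (s)
for c_bypass in (10e-9, 100e-9):   # 10 nF vs. 100 nF local bypass cap
    droop = i_peak * t_edge / c_bypass
    print(f"C = {c_bypass*1e9:.0f} nF -> supply droop ~ {droop*1e3:.1f} mV")
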
K.
 

If the noise is present on the outputs away from their transitions, it must be supply noise "shining through" the railed output. That calls for more decoupling, or better-quality decoupling.

If it is only at transitions, a bit of series resistance (such as source-terminating the digital drivers to minimize trace overshoot) might do the trick.

Limiting the peak output currents is a good way to soften supply noise as well. A current spike from one output contaminates all of them.
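
One way to see why limiting the peak current helps (a rough sketch; the inductance and current numbers are assumptions): the supply/ground bounce scales with L·di/dt, so slowing the edge or adding series resistance shrinks the spike that every other output shares.

# Ground/supply bounce estimate: V_bounce ~ L_lead * di/dt.
# L_lead and the current step are illustrative assumptions.
l_lead  = 8e-9    # shared package/ground lead inductance (H)
delta_i = 0.05    # current step when an output switches (A)

for t_rise in (2e-9, 10e-9):   # fast edge vs. edge slowed by series R
    v_bounce = l_lead * delta_i / t_rise
    print(f"edge {t_rise*1e9:.0f} ns -> bounce ~ {v_bounce*1e3:.0f} mV")
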
 
