I was wondering what the explanation for the following observation is:
When looking with an oscilloscope at the clock signal that I generate with a microcontroller I do _not_ see the expected amplitude range of 0V to 5V but a large overshoot below 0V and above 5V. See attached image. Are these amplitudes real, or just some sort of artefact due to the interaction of the input stage of the oscilloscope with the steep transients? If they are real, can they damage other components that are driven by this clock?
It's probably not the oscilloscope input stage but the probe, typically the ground clip. It may be real as well, depending on the circuit. A CMOS IC, e.g. a µC, will usually clamp the signal to about 0.7 V beyond the supply rails. Seeing no clipping at all suggests that it's a measurement artefact.
Overshoots can damage sensitive components, particularly ones that don't have clamping diodes on their inputs.
At a ~20 MHz ringing frequency, this might be about local decoupling "tanks". But ground-clip positioning does matter. For higher-speed (faster edge rate) signals you now see differential probes with very tight ground-prong spacing to the tip.
As it turned out, the origin of the observed behaviour was the probe I used. It was a basic BNC cable with clips attached. Using the (matching) probes that came with the oscilloscope showed the expected waveform.
For sure I learned something ... but why does it always have to be the hard way ... ;-)
By the way, I followed your earlier advice on using a differential amplifier with great success ... Thanks again!
https://www.edaboard.com/threads/240653/?#post1030233