I am working with a standard 0.35 µm, 3.3 V CMOS technology.
Consider a simple digital inverter as an example. I want to apply an external input voltage of 5 V while the inverter is powered from 3.3 V. Will that cause damage, and what is the relevant parameter for this voltage that I can read from the technology data sheet?
You care about long-term BVox (gate oxide breakdown voltage) and any
enhanced hot-carrier effects (you'd be outside any qualification testing,
as far as HCE goes). A decent test-to-fail series ought to deliver
BVox / lifetime data on single devices out to the instant-fail voltage.
Now in my experience the BVox is going to be at least 2X, probably 3X
the supply voltage rating (depending on what the first fail mode is, for
the sum of all sensitivities).
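A rough sanity check on that 2X-3X rule of thumb, using assumed numbers (a ~7.5 nm gate oxide for a 0.35 µm / 3.3 V process and a ~10 MV/cm intrinsic breakdown field are typical ballpark figures, not from any specific foundry datasheet):

```python
# Rough oxide-field check for the 2X-3X BVox rule of thumb.
# Assumed numbers (NOT from any specific foundry datasheet):
#   t_ox ~ 7.5 nm for a 0.35 um / 3.3 V process
#   intrinsic oxide breakdown field ~ 10 MV/cm
T_OX_CM = 7.5e-7   # 7.5 nm expressed in cm (assumption)
E_BD = 10e6        # ~10 MV/cm intrinsic breakdown field (assumption)

def oxide_field(v_gate):
    """Worst-case oxide field if the full voltage appears across the gate."""
    return v_gate / T_OX_CM   # V/cm

for v in (3.3, 5.0):
    e = oxide_field(v)
    print(f"{v:.1f} V -> {e/1e6:.2f} MV/cm ({e/E_BD:.0%} of breakdown)")

# Implied instantaneous breakdown voltage for these assumed numbers:
print(f"V_bd ~ {E_BD * T_OX_CM:.1f} V")  # ~7.5 V, roughly 2.3x the 3.3 V rail
```

So even at 5 V the instantaneous field is well below intrinsic breakdown; the real question is long-term TDDB lifetime at that elevated field, which is exactly what the test-to-fail data would tell you.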
But good luck getting "blessed" if nobody's plowed that road and got
waivers on file.
On the plus side, 0.35u is so mature that there's a chance this has happened.
An additional concern is: just where does this "5 V" come from?
If from a pin / pad, then you also need ESD protection, and that may be
your much bigger challenge if all you have is 3.3 V MOSFETs.
Look for "overvoltage-tolerant CMOS input buffer" and you'll see
schemes for blocking overvoltage and, if you're lucky, hints
about over-rail pin protection.
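One common trick you'll find under that search term is a series NMOS pass transistor with its gate tied to VDD, so the internal node behind it never rises above VDD minus a threshold. A minimal sketch of the idea, with illustrative numbers (the 0.7 V threshold is an assumption and ignores body effect, which would clamp even lower):

```python
# Minimal model of one common overvoltage-tolerant input scheme: a series
# NMOS pass device with its gate tied to VDD. The internal node behind it
# is clamped to at most VDD - Vtn, so the 3.3 V input buffer never sees
# the full 5 V pad voltage. Numbers are illustrative assumptions.
VDD = 3.3   # supply rail (V)
VTN = 0.7   # NMOS threshold, ignoring body effect (assumption)

def clamped_node(v_pad):
    """Internal-node voltage behind a VDD-gated NMOS pass transistor."""
    return min(v_pad, VDD - VTN)

for v_pad in (0.0, 3.3, 5.0):
    print(f"pad {v_pad:.1f} V -> internal {clamped_node(v_pad):.2f} V")
```

The cost is that a full-rail 3.3 V input also gets clamped to about VDD - Vtn, so the receiving buffer has to be designed with that reduced high level in mind.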