Some theoretical thoughts. Right now I am looking at
https://www.minicircuits.com/pages/pdfs/dg03-110.pdf.
For example,
0 dBm -> 0.225 V
+20 dBm -> 2.25 V
I am not sure what these volts are, AC amplitude or something else. Anyway, the more power we have, the bigger the sine amplitude is.
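For what it's worth, the datasheet numbers look like RMS voltage into a 50 Ω load: P = V²/R gives V = sqrt(P·R), and 1 mW into 50 Ω is about 0.224 V rms. A quick sketch of the conversion (my own helper, not from the datasheet):

```python
import math

def dbm_to_vrms(p_dbm, r_ohms=50.0):
    """Convert power in dBm to RMS voltage across a load (default 50 ohm)."""
    p_watts = 10 ** (p_dbm / 10.0) / 1000.0  # dBm -> watts
    return math.sqrt(p_watts * r_ohms)       # P = V^2/R  ->  V = sqrt(P*R)

for p in (0, 20):
    vrms = dbm_to_vrms(p)
    vpeak = vrms * math.sqrt(2)  # peak amplitude of a sine with that RMS value
    print(f"{p:+d} dBm -> {vrms:.3f} V rms ({vpeak:.3f} V peak)")
```

This prints roughly 0.224 V rms for 0 dBm and 2.236 V rms for +20 dBm, which matches the table's 0.225 V and 2.25 V, so the table is apparently RMS, not peak amplitude.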
It looks like tuning a multiplier with 0 dBm input would be more difficult, because the voltage swing will be too small:
A small error in gate voltage will lead to a big error in choosing the point where the sine must be cut off. Temperature changes, etc., will impact the amplifier tuning heavily. For example, an error of 0.2 V would make the multiplier useless.
When using +20 dBm or so, the voltage swing itself is big, so even a huge error of 0.2 V will not push the multiplier out of its working range.
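To put rough numbers on that argument: the same 0.2 V drift is a very different fraction of the swing at the two drive levels (a quick sketch using the datasheet's voltage figures; the 0.2 V error value is the one assumed above):

```python
error_v = 0.2  # assumed drift in the cut-off / bias point, in volts

# (input power in dBm, corresponding voltage from the dBm-to-V table)
for p_dbm, v in ((0, 0.225), (20, 2.25)):
    frac = error_v / v * 100  # drift as a percentage of the swing
    print(f"{p_dbm:+d} dBm: 0.2 V error is {frac:.0f}% of {v} V")
```

At 0 dBm the 0.2 V error is about 89% of the swing (hopeless), while at +20 dBm it is only about 9%, which is the intuition behind the question below.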
So is it true that when we have more input power, it is easier to tune the multiplier to a desired Nth harmonic for a certain cut-off time?