In the Cadence environment the output noise can be simulated. For a bandgap simulation, the output noise integrated over 500 kHz is about 10 uV at DC, and the reported input noise is 0.5 V. The specified input source is the supply with 1 V AC amplitude.
Is there some relation to the offset voltage of the bandgap input stage? Is the output noise at DC the amplified input offset voltage?
What does the 0.5 V input-noise result mean? Input noise of what?
It is the noise voltage at the specified input source.
It should always be chosen much greater than any actual input noise, so that the noise figure of the circuit can be calculated:

NF = 20 * log10(output_noise_voltage / (input_noise_voltage * gain)) [dB]
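As a quick numeric check, the formula above can be evaluated in a short Python sketch. The function name `nf_db` and the example gain of 1 are illustrative assumptions, not part of the original simulation setup:

```python
import math

def nf_db(v_out_noise, v_in_noise, gain):
    """Noise figure in dB from integrated output noise, the injected
    input noise amplitude, and the gain from input source to output.

    Implements NF = 20 * log10(v_out_noise / (v_in_noise * gain)).
    """
    return 20.0 * math.log10(v_out_noise / (v_in_noise * gain))

# Using the numbers from the question (gain = 1 is an assumption):
# integrated output noise ~10 uV, injected input noise 0.5 V.
print(nf_db(10e-6, 0.5, 1.0))
```

Note that with equal output and scaled input noise the result is 0 dB, and every factor of 10 between them shifts the result by 20 dB.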