I have an ATM90E26AS energy metering IC, and this is the application note that describes the calibration process:
(https://ww1.microchip.com/downloads...tes/Atmel-46102-SE-M90E26-ApplicationNote.pdf)
In this application note, on page 14, they give the steps to calibrate the voltage, using the formula:

Ugain = (24600 x Un) / Vol_mea

where Un is the reference voltage applied to the meter and Vol_mea is the voltage the meter actually measures.
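To make that concrete with made-up numbers (only the 220.024 V figure comes from the app note; the measured value here is invented for illustration): suppose the reference meter reads Un = 220.024 V while the chip's Urms reading works out to Vol_mea = 218.50 V. Then

Ugain = (24600 x 220.024) / 218.50 ≈ 24772 = 0x60C4

and 0x60C4 would be the value written into the Ugain register.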
So, as far as I understand, Ugain is one register and Urms is another. When the ATM90E26 is initially powered, the Urms register holds some default value; once an input voltage is applied (220.024 V, as given as an example in the application note), Urms reports a different value corresponding to that input.
I then read this hex value from the Urms register, convert it to volts (as described in the app note), plug it into the formula above, and convert the result back into hex. That final hex value is what gets written into the Ugain register for calibration, as in the sketch below. Please correct me if I am wrong.
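To show what I think the procedure looks like, here is a minimal C sketch. The m90e26_read/m90e26_write functions are placeholders for whatever SPI/UART transport is used, and the register addresses (Ugain = 0x31, Urms = 0x49) and the 0.01 V Urms resolution are from my reading of the datasheet, so please double-check them:

```c
#include <stdint.h>
#include <stdio.h>

/* Placeholder transport functions -- wire these up to your SPI/UART driver. */
uint16_t m90e26_read(uint8_t addr);
void     m90e26_write(uint8_t addr, uint16_t value);

/* Register addresses per my reading of the M90E26 datasheet -- verify! */
#define REG_UGAIN 0x31  /* voltage gain (calibration register)  */
#define REG_URMS  0x49  /* measured RMS voltage, LSB = 0.01 V   */

/* One-point voltage calibration: 'un_volts' is the voltage reported by a
 * trusted external reference meter (220.024 V in the app-note example). */
void calibrate_ugain(double un_volts)
{
    /* Read what the chip currently thinks the line voltage is. */
    double vol_mea = m90e26_read(REG_URMS) / 100.0;

    /* App-note formula: Ugain = (24600 x Un) / Vol_mea, rounded. */
    uint16_t ugain = (uint16_t)((24600.0 * un_volts) / vol_mea + 0.5);

    m90e26_write(REG_UGAIN, ugain);
    printf("Vol_mea = %.2f V -> Ugain = 0x%04X\n", vol_mea, ugain);
}
```

(I've left out the AdjStart unlock and the CS2 checksum update that I believe the adjustment-register group requires; the app note treats those as separate steps.)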
My question is:
1. How can this approach be called calibration? I am given some voltage (220.024 V) and I am simply writing it into a register. I am not comparing or validating this 220.024 V value against any standard reference, and I am also not conveying to the IC that the final hex value written into the Ugain register corresponds to 220 V. In that case, how can this process be considered a correct calibration? Am I missing something?