Hi aamear,
If you want to build a voltmeter with just the MCU:
You must scale your maximum input voltage down to 5 V so it fits the ADC range. (I think this isn't very accurate, because the smallest step you can resolve is fairly coarse; you can sense, for example, 12.15 V but not much finer.)
If high accuracy isn't important to you, this way is good.
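The first method can be sketched in a few lines. All the numbers below are assumptions for illustration: a 10-bit ADC, a 5 V reference, and a resistor divider that brings 15 V down to 5 V (ratio 3). With these values one ADC step is about 15 mV at the input, which is why readings like 12.15 V are about the finest you can expect.

```c
#include <stdio.h>

/* Hypothetical example values -- adjust for your own hardware. */
#define ADC_MAX       1023u  /* full-scale count of a 10-bit ADC       */
#define VREF_MV       5000u  /* ADC reference voltage in millivolts    */
#define DIVIDER_RATIO 3u     /* e.g. 20k/10k divider: 15 V -> 5 V      */

/* Convert a raw ADC reading back to the input voltage in millivolts. */
unsigned long adc_to_input_mv(unsigned int reading)
{
    return (unsigned long)reading * VREF_MV / ADC_MAX * DIVIDER_RATIO;
}
```

For example, a reading of 829 maps back to roughly 12.15 V; the next reading up jumps by about 15 mV, so millivolt-level detail is lost.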
But let me tell you about a way that is better when you want more accuracy!
In way number 2 you have one MCU, one D/A converter, and one op-amp (used as a comparator).
The other advantage of this way is that you can sense millivolts, because you can have more bits for the A/D!
At first you make this circuit :
voltmeter.jpg
Vref is the maximum input voltage. The whole circuit acts as an A/D converter, so the number of MCU port pins defines the bit width of the A/D, and therefore its accuracy.
In fact the MCU works as an n-bit binary counter. You increase the value on the port; the D/A converts it to an analog voltage, and the op-amp compares that with your input voltage. While your voltage is still higher than the D/A output, the op-amp output is 0 V, so the port value keeps increasing. As soon as the D/A output reaches your voltage, the op-amp output goes to 1. If you program the MCU so that, when this happens, it puts the decimal equivalent of the last port value on the LCD, you can read off the value of your voltage!
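The counting loop above can be sketched as follows. This is a host-side simulation, not real MCU firmware: `dac_mv()` stands in for the external D/A plus comparator (on hardware you would write the counter to the port and read the op-amp output on an input pin), and the 10-bit width and 5000 mV full scale are assumptions.

```c
#include <stdint.h>

#define N_BITS  10                          /* assumed D/A resolution      */
#define STEPS   ((1u << N_BITS) - 1u)       /* maximum counter value       */
#define VREF_MV 5000u                       /* assumed D/A full scale, mV  */

/* Simulated D/A: convert the counter value to an output in millivolts. */
static unsigned long dac_mv(unsigned int count)
{
    return (unsigned long)count * VREF_MV / STEPS;
}

/* Counting conversion: increment the counter until the D/A output
 * reaches the input voltage; the final count is the digital reading. */
unsigned int convert(unsigned long vin_mv)
{
    unsigned int count = 0;
    /* comparator output still 0: D/A below the input, keep counting */
    while (count < STEPS && dac_mv(count) < vin_mv)
        count++;
    return count;  /* comparator went to 1: show this count on the LCD */
}
```

Note that the conversion time grows with the input voltage (up to 2^n - 1 steps), which is the usual trade-off of a simple counting A/D compared with successive approximation.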