I found that the voltage changes by about 1.4 mV/°C. Based on this I tried to calculate the voltage from the divider and then convert it to temperature by dividing by 1.4 mV. Unfortunately the result is not accurate.
Pt100 is the industry standard solution for accurate measurements in the 0 to 400 °C range. It can be implemented as a simple ratiometric measurement with a voltage divider, similar to post #19. But you can't achieve sufficient resolution and accuracy with a µC 10-bit ADC. You should have noticed that the said 1.4 mV/°C (or a similar number) is only a fraction of the ADC LSB. Some options to get better resolution:
- use an amplifier, e.g. x10
- use a higher resolution ADC
- use Pt1000 instead of Pt100
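To see why the raw divider output is too coarse, compare the quoted sensitivity with the ADC step size. A minimal sketch, assuming a hypothetical 5 V reference and the 1.4 mV/°C figure from the question:

```python
# Resolution check: compare the divider sensitivity with the ADC LSB.
# V_REF = 5.0 is an assumption; adjust to your actual reference voltage.

V_REF = 5.0           # ADC reference voltage in volts (assumed)
ADC_BITS = 10         # µC internal ADC resolution
SENSITIVITY = 1.4e-3  # divider output change per degC, from the question

lsb = V_REF / 2**ADC_BITS            # one ADC step in volts, ~4.88 mV
degrees_per_lsb = lsb / SENSITIVITY  # temperature change per ADC count

print(f"LSB = {lsb*1e3:.2f} mV -> one count per {degrees_per_lsb:.1f} degC")
```

With these numbers one ADC count corresponds to roughly 3.5 °C, which is why the three options above (amplifier, better ADC, Pt1000) are needed.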
Reducing the reference resistor value isn't a solution because it would cause excessive self-heating of the Pt100 sensor. Only in some applications with good thermal coupling can the sensor current exceptionally be increased to 10 or even 20 mA.
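The self-heating point can be quantified with a quick power estimate. The temperature error then follows from the sensor's self-heating coefficient (a datasheet value, typically given in °C/mW), which depends on the thermal coupling:

```python
# Self-heating estimate: power dissipated in the Pt100 at various currents.
# P = I^2 * R; the resulting temperature rise depends on thermal coupling.

R_PT100 = 100.0  # Pt100 resistance at 0 degC, in ohms

for current_ma in (1.0, 10.0, 20.0):
    i = current_ma * 1e-3
    power_mw = i**2 * R_PT100 * 1e3  # dissipated power in milliwatts
    print(f"{current_ma:5.1f} mA -> {power_mw:7.2f} mW")
```

At 1 mA the dissipation is a harmless 0.1 mW; at 20 mA it is 40 mW, which is only acceptable with very good thermal coupling.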
In addition, a measurement with a µC internal ADC usually needs analog and software filtering to utilize the theoretical 10-bit resolution. Slow ADCs, e.g. sigma-delta or dual-slope types, typically need less filtering.
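A common software filter is simply averaging many conversions; with enough noise on the input to dither the LSB, averaging also recovers some sub-LSB resolution. A minimal sketch, where `read_adc` is a hypothetical callback returning one raw conversion:

```python
import random

def filtered_reading(read_adc, n_samples=64):
    """Average n_samples raw ADC conversions (read_adc is a user callback)."""
    total = sum(read_adc() for _ in range(n_samples))
    return total / n_samples

# Demo with a fake noisy 10-bit ADC centered on code 512:
random.seed(0)
fake_adc = lambda: 512 + random.randint(-2, 2)
avg = filtered_reading(fake_adc)
print(avg)  # close to 512, with sub-LSB resolution from the averaging
```

On a real µC the same idea is usually done with an integer accumulator and a right shift instead of floating-point division.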
A disadvantage of the voltage divider circuit is the nonlinear characteristic of the R2/(R1+R2) function. It should be eliminated in software before the result is finally scaled. This is worth doing because the overall accuracy then depends only on the Pt100 sensor, the reference resistor and the ADC, and on no other components.
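The software linearization can be sketched as follows, assuming the reference resistor R1 sits on top, the Pt100 (R2) goes to ground, and the ADC runs ratiometrically from the same supply so the reference voltage cancels. The temperature conversion below uses the linear IEC 60751 approximation (0.385 Ω/°C); R_REF = 1 kΩ and the raw code are placeholder values:

```python
ADC_MAX = 1023   # 10-bit ADC full scale
R_REF = 1000.0   # reference resistor R1 in ohms (assumed value)

def code_to_resistance(code):
    """Invert the nonlinear R2/(R1+R2) divider function."""
    ratio = code / ADC_MAX                # = R2 / (R1 + R2), ratiometric
    return R_REF * ratio / (1.0 - ratio)  # solve for R2

def resistance_to_temperature(r):
    """Linear IEC 60751 approximation: R(T) = 100 * (1 + 3.85e-3 * T)."""
    return (r / 100.0 - 1.0) / 3.85e-3

r = code_to_resistance(120)  # example raw ADC reading (hypothetical)
print(f"R = {r:.1f} ohm -> T = {resistance_to_temperature(r):.1f} degC")
```

For better accuracy over the full 0 to 400 °C span, the linear approximation would be replaced by the Callendar-Van Dusen equation or a lookup table, but the divider inversion step stays the same.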