I am looking for a circuit that can measure AC voltages up to 1000 V. The input will be an AC voltage that varies from 0-1000 V, and the output from the circuit should be 0-100 V; i.e., for an actual 0 V input the circuit should give an output of 0 V, and for 1000 V the output should be 100 V. In other words, a fixed 10:1 ratio.
You'll need power resistors, rated for a watt or two. That's roughly what the upper resistor would dissipate if you use, say, a megohm in the upper branch of a divider placed across 1000 V.
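As a rough sanity check on that figure, here's a minimal sketch; the 1 MΩ / 111 kΩ values are just assumed examples that give a 10:1 ratio, not a recommended design:

```python
# Rough sizing check for a 10:1 resistive divider across 1000 V (RMS).
# The 1 Mohm / 111 kohm values are assumptions, chosen only to give ~10:1.
V_IN = 1000.0        # worst-case input, volts RMS
R_UPPER = 1.0e6      # ohms, upper branch
R_LOWER = 111.0e3    # ohms, ~R_UPPER / 9 for a 10:1 tap

i = V_IN / (R_UPPER + R_LOWER)        # divider current
p_upper = i ** 2 * R_UPPER            # dissipation in the upper resistor
p_lower = i ** 2 * R_LOWER            # dissipation in the lower resistor
ratio = R_LOWER / (R_UPPER + R_LOWER)

print(f"divider current  : {i * 1e3:.2f} mA")      # ~0.90 mA
print(f"upper dissipation: {p_upper:.2f} W")        # ~0.81 W -> use a 2 W part
print(f"lower dissipation: {p_lower:.2f} W")        # ~0.09 W
print(f"output at 1000 V : {V_IN * ratio:.1f} V")   # ~99.9 V
```

Note the upper resistor also has to stand off roughly 900 V continuously, so its voltage rating matters as much as its wattage; several resistors in series are often used for that reason.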
Now the question becomes: what load resistance does your measuring device present?
A typical multimeter input might be around a megohm. If the divider resistances are comparable to that, the meter will load the tap and pull the reading down noticeably, by roughly half in the worst case.
However, your measuring device might have a very high input impedance. Then you'll have an easier time: you could even use 10 megohms total in your resistor network, so the resistors can have a lower watt rating, which makes them cheaper.
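To see how much that loading matters, here's a sketch of the calculation; the 9 MΩ / 1 MΩ split is just an assumed example of a 10 MΩ network. The meter's input resistance simply appears in parallel with the lower resistor:

```python
# How a meter's input resistance loads the divider (example values assumed).
def loaded_ratio(r_upper, r_lower, r_meter):
    """Divider ratio with the meter's input resistance across the lower leg."""
    r_eff = r_lower * r_meter / (r_lower + r_meter)   # parallel combination
    return r_eff / (r_upper + r_eff)

# 10 Mohm network (9 M + 1 M) read by a 1 Mohm meter:
print(loaded_ratio(9e6, 1e6, 1e6))     # ~0.053 instead of 0.100 -- reads about half
# Same network read by a 10 Mohm DMM input:
print(loaded_ratio(9e6, 1e6, 10e6))    # ~0.092 -- still about 8% low
# Total dissipation of the 10 Mohm network at 1000 V:
print(1000.0 ** 2 / 10e6)              # 0.1 W -- ordinary wattage ratings are fine
```

So either keep the divider resistance well below the meter's input resistance (and accept the wattage), or measure the loading and fold it into your ratio.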
Another way to drop the AC voltage is to use a capacitive divider, similar to what's used to power an LED from house AC. It might be tricky to adjust for accuracy, however, and it can be fatal to your meter if the series capacitor were to short out.
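For reference, the no-load ratio of a capacitive divider is set by the capacitances, Vout/Vin = Cseries / (Cseries + Cshunt), with the output taken across the shunt capacitor. A minimal sketch with assumed example values:

```python
# No-load ratio of a capacitive AC divider; output taken across the shunt cap.
# The 10 nF / 90 nF values are assumptions for illustration only.
def cap_divider_ratio(c_series, c_shunt):
    """Vout/Vin = Cseries / (Cseries + Cshunt); frequency cancels out."""
    return c_series / (c_series + c_shunt)

print(cap_divider_ratio(10e-9, 90e-9))   # 0.1 -> 1000 V in gives ~100 V out
```

A resistive meter input across the lower capacitor shifts that ratio and makes it frequency-dependent, which is part of why this approach is hard to keep accurate; the series capacitor also has to be rated well above the peak line voltage.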