Back to basics....
Let's suppose the clock came from a switch and you controlled it manually. If you switch the input to ground, any charge on the capacitor discharges through both resistors and the voltage across it eventually falls to 0V.
Now change the switch so it sends 10V to the circuit. The two resistors form a potential divider, so they limit the maximum voltage to 5V. Some extra current flows through the input resistor and charges the capacitor, but after a short time the capacitor reaches 5V and no more charging current flows.
So at slow switching speeds, the voltage swings between 0V and 5V.
Now let's switch it faster. Up to a certain speed, the charging and discharging still let the voltage rise to 5V and fall to 0V just like before, but there comes a point where the switching is too fast for the capacitor to fully charge and discharge. The voltage no longer has time to reach 5V before the input changes state, so it peaks at something less than 5V. Similarly, if the input changes back to 10V before the capacitor has fully discharged, the voltage will not get down to zero.
As the switch or clock signal gets faster and faster, the capacitor has even less time to charge and discharge, and at some point the voltage across it looks almost constant. There will still be a small ripple, but at higher clock frequencies the ripple gets smaller and smaller. If the input clock is a square wave, the voltage will settle at around 2.5V.
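You can see this effect with a quick simulation. This is just a sketch with made-up example values (10k resistors, 100nF capacitor, 10V supply - none of these come from the original circuit): the capacitor sees the Thevenin equivalent of the divider (5V source behind the two resistors in parallel), and we step the well-known exponential charge/discharge curve through many clock cycles.

```python
import math

def simulate_ripple(freq_hz, r1=10e3, r2=10e3, c=100e-9, vin=10.0,
                    cycles=200, steps_per_cycle=1000):
    """Drive the divider-plus-capacitor circuit with a square wave and
    return (min, max) capacitor voltage over the final cycle.
    Component values are example assumptions, not from the original post."""
    # Thevenin equivalent seen by the capacitor:
    v_hi = vin * r2 / (r1 + r2)   # 5V when the switch sends 10V
    v_lo = 0.0                    # 0V when the switch grounds the input
    r_th = r1 * r2 / (r1 + r2)    # the two resistors in parallel
    tau = r_th * c                # time constant

    dt = (1.0 / freq_hz) / steps_per_cycle
    v = 0.0
    vmin = vmax = None
    total_steps = cycles * steps_per_cycle
    for i in range(total_steps):
        # square wave: high for the first half of each cycle, low for the rest
        target = v_hi if (i % steps_per_cycle) < steps_per_cycle // 2 else v_lo
        # exact exponential step toward the current target voltage
        v = target + (v - target) * math.exp(-dt / tau)
        if i >= total_steps - steps_per_cycle:   # record the last cycle only
            vmin = v if vmin is None else min(vmin, v)
            vmax = v if vmax is None else max(vmax, v)
    return vmin, vmax
```

With these values, at 100Hz the capacitor has time to swing almost the full 0V to 5V, while at 10kHz it hovers close to 2.5V with only a fraction of a volt of ripple.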
If you do some research on "time constant" you will see there is a relationship between the value of a capacitor and how quickly it charges and discharges. No matter what value capacitor you use, there will be a clock frequency that lets the voltage average out without much ripple. At higher frequencies, the capacitor doesn't need to be so large.
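The time constant relationship is easy to play with numerically. The sketch below uses the standard RC charging equation; the resistor and capacitor values are arbitrary examples chosen only for illustration.

```python
import math

def cap_voltage(t, v_supply, r_ohms, c_farads, v_start=0.0):
    """Voltage across a capacitor after t seconds, charging toward
    v_supply through r_ohms from an initial voltage v_start.
    Standard RC charging equation: v(t) = V + (V0 - V) * exp(-t / RC)."""
    tau = r_ohms * c_farads
    return v_supply + (v_start - v_supply) * math.exp(-t / tau)

# Example: 5k effective resistance, 100nF capacitor -> tau = 0.5ms.
tau = 5e3 * 100e-9
after_1_tau = cap_voltage(tau, 5.0, 5e3, 100e-9)       # ~63% of 5V
after_5_tau = cap_voltage(5 * tau, 5.0, 5e3, 100e-9)   # ~99% of 5V
```

After one time constant the capacitor reaches about 63% of the supply, and after five it is essentially fully charged, which is why halving the capacitor (or doubling the clock frequency) has the same effect on the ripple.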
Brian.