I'm confused (again).
It seems to be a simple bridge rectifier charging the 1mF (1,000uF) capacitor when the switch is to the left. When the switch is moved to the right, the capacitor discharges through the resistor into the LED.
Charging time depends on several factors: the source impedance, the resistance of the rectifiers, the ESR of the capacitor and the wiring resistance. For most practical purposes the capacitor will charge almost instantly to (VAC peak - (2 * rectifier Vf)) volts; in other words, its voltage will follow the AC waveform minus the drop across the two conducting diodes. As Klaus points out, with nothing to limit the current, the inrush could be significant until full charge is reached. Ignoring series resistances, with a 4V RMS input the capacitor should reach about 4.25V within one half cycle.
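If you want to sanity-check that figure, here's the arithmetic as a quick Python sketch (my assumptions: silicon rectifiers at roughly 0.7V each, and all series resistance ignored, as above):

```python
import math

v_rms = 4.0       # AC input, volts RMS
v_f_diode = 0.7   # assumed forward drop per silicon rectifier

v_peak = v_rms * math.sqrt(2)     # ~5.66 V
v_cap = v_peak - 2 * v_f_diode    # two diodes conduct at any instant
print(f"Peak AC: {v_peak:.2f} V, capacitor charges to about {v_cap:.2f} V")
# -> about 4.26 V, matching the ~4.25 V figure above
```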
Discharge also depends on several factors. The T = CR formula applies, but an LED doesn't present a constant 'R'; in fact it goes virtually open circuit once the voltage drops below the LED's Vf. The first thing you need to know is what Vf actually is: some LEDs have a Vf higher than 4.25V, so they would never light up at all. The actual load current, and hence the time the LED stays lit, is set by the combination of the fixed 170 Ohm resistor and the non-linear resistance of the LED. For example, if the LED's Vf is say 1.6V (typical for a small red LED), the capacitor voltage will fall from 4.25V at a decreasing rate until about 1.6V remains across it, after which the discharge becomes much slower.
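As a first approximation you can pretend the LED is a constant 1.6V drop while it conducts; then the discharge is a plain exponential toward Vf and you can put rough numbers on it (the 170 Ohm and 1,000uF come from the schematic, the Vf is my assumption):

```python
import math

C = 1000e-6     # 1,000 uF
R = 170.0       # series resistor, ohms
v0 = 4.25       # initial capacitor voltage
v_f_led = 1.6   # assumed red-LED forward voltage

tau = R * C                       # time constant, 0.17 s
i0 = (v0 - v_f_led) / R           # initial LED current, ~15.6 mA
t_10pct = tau * math.log(10)      # time for the current to fall to 10%
print(f"tau = {tau:.2f} s, initial LED current = {i0*1e3:.1f} mA")
print(f"LED current down to 10% after about {t_10pct:.2f} s")
```

So even in this crude model the LED is only brightly lit for a few tenths of a second; the real curve tails off more slowly because Vf itself sags as the current drops.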
As you can see, it isn't a straightforward calculation, and a simulation that takes real-life parameters into account is a good option for finding out what really happens.
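For example, here's a toy time-step simulation of the discharge side. It's only a sketch, not a substitute for a proper SPICE run: the diode-equation parameters Is and n below are guesses chosen to put Vf near 1.6V at about 15mA for a generic red LED.

```python
import math

C, R = 1000e-6, 170.0          # values from the schematic
Is, n, Vt = 2e-16, 2.0, 0.025  # assumed LED model parameters (guesses)

def led_current(vc):
    # Solve the resistor+LED node by bisection: find the LED voltage
    # where the diode current Is*(exp(v/(n*Vt)) - 1) equals the
    # resistor current (vc - v)/R
    lo, hi = 0.0, vc
    for _ in range(60):
        v = 0.5 * (lo + hi)
        if Is * (math.exp(v / (n * Vt)) - 1.0) < (vc - v) / R:
            lo = v   # diode passes less than the resistor supplies -> raise v
        else:
            hi = v
    return (vc - 0.5 * (lo + hi)) / R

vc, t, dt = 4.25, 0.0, 1e-3
while (i := led_current(vc)) > 1e-4:   # ~0.1 mA: LED effectively dark
    vc -= i * dt / C                   # capacitor gives up charge i*dt
    t += dt
print(f"LED visibly lit for roughly {t:.2f} s, Vc left at {vc:.2f} V")
```

Running it shows exactly the behaviour described above: a fast initial discharge while the LED conducts hard, then a long tail as the capacitor voltage approaches Vf and the LED starves.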
Brian.