# A digital voltmeter charged a 1-uF capacitor to the level of an unknown voltage using a 1-mA constant current. The charging required 141 cycles of a 100-kHz reference frequency. How do I determine...

A digital voltmeter charged a 1-uF capacitor to the level of an unknown voltage using a 1-mA constant current. The charging required 141 cycles of a 100-kHz reference frequency. How do I determine the charging time and the measured voltage?


Expert Answers

valentin68 | Certified Educator

The period T (the time for one cycle) is, by definition, the reciprocal of the frequency F:

`T = 1/F = 1/(100*10^3) = 10^-5 seconds = 10 "microseconds"`

The total time taken by 141 cycles is:

`t = 141*T = 141*10^-5 = 1.41*10^-3 seconds = 1.41 "milliseconds"`

This is the total charging time, so the total charge delivered to the capacitor at constant current is:

`Q = I*t = 10^-3*1.41*10^-3 = 1.41*10^-6 C`

Now, again by definition, the capacitance is:

`C = Q/U`

Therefore the voltage on the capacitor is:

`U = Q/C = (1.41*10^-6)/10^-6 = 1.41 "volts"`
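The arithmetic above can be double-checked with a short Python sketch (the variable names here are my own, not from the problem statement):

```python
# Check of the dual-slope-style calculation above:
# a 1-mA constant current charges a 1-uF capacitor
# for 141 cycles of a 100-kHz reference clock.

F = 100e3   # reference frequency, Hz
N = 141     # number of clock cycles counted
I = 1e-3    # charging current, A
C = 1e-6    # capacitance, F

T = 1 / F   # period of one clock cycle, s
t = N * T   # total charging time, s
Q = I * t   # charge delivered at constant current, C
U = Q / C   # resulting capacitor voltage, V

print(f"t = {t*1e3:.2f} ms")  # charging time in milliseconds
print(f"U = {U:.2f} V")       # measured voltage in volts
```

Running this reproduces the answer worked out by hand: a charging time of 1.41 ms and a voltage of 1.41 V.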
