A 220-V, 25-A motor is to be directly connected to a 230-V (+-5%) supply. Assume the motor has a voltage tolerance rating of (+-10%). Would the minimum and maximum voltage at which the motor may be operated be (Min=198V and Max=242V). And the minimum and maximum possible supply voltages be (Min=115 and Max=345). And how do I figure out if the motor will be operated within its acceptable voltage range?
First a bit of history.
In Europe, about 20-30 years ago, the standard household AC supply voltage was 220 V. About 10-15 years ago it was changed to 240 V, and since 2008 the EU standard defines it as a compromise between these two values: 230 V (+-5%).
The maximum deviation from the standard value of 230 V is
`Delta(V) = 5% * 230 = 5/100 * 230 = 11.5 "Volts"`
which means that the minimum and maximum possible supply voltages are
`V_min = 230 - 11.5 = 218.5 "Volts"`
`V_max = 230 + 11.5 = 241.5 "Volts"`
For the motor, the voltage tolerance is
`Delta(V) = 10% * 220 = 10/100 * 220 = 22 "Volts"`
which means that the acceptable operating voltages for the motor lie in the range
`V_min = 220 - 22 = 198 "Volts"`
`V_max = 220 + 22 = 242 "Volts"`
Since the supply voltage range (218.5 V to 241.5 V) lies entirely inside the motor's acceptable operating range (198 V to 242 V), the motor will always be operating within its acceptable voltage range. (Note that the supply extremes are 218.5 V and 241.5 V, not 115 V and 345 V, so your motor range of 198-242 V is correct but your supply range is not.)
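To see the whole check in one place, here is a small sketch of the comparison; the helper function `voltage_range` is my own name, not anything standard:

```python
def voltage_range(nominal, tol):
    """Return (min, max) voltage for a nominal value and a fractional tolerance."""
    return (round(nominal * (1 - tol), 1), round(nominal * (1 + tol), 1))

supply_min, supply_max = voltage_range(230, 0.05)  # supply: 230 V +-5%
motor_min, motor_max = voltage_range(220, 0.10)    # motor:  220 V +-10%

# The motor is safe if every possible supply voltage falls inside its range
within_range = motor_min <= supply_min and supply_max <= motor_max
print(within_range)
```

The condition checks containment of one interval in the other, which is exactly the reasoning above: the worst-case low supply (218.5 V) must stay above the motor's minimum (198 V), and the worst-case high supply (241.5 V) must stay below the motor's maximum (242 V).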