There are documented cases of motors that have given 20 or 30 years' sterling service only to fail within hours of being fitted with a variable speed drive that was meant to save energy and protect the motor from high load on start-up. It is a fairly unusual occurrence, but production engineers planning an upgrade to their facilities should be aware of the possibility. Here, Jerry Hodek, Managing Director of Rotor (UK) Limited, explores the problem and looks at solutions and alternatives.
Today, inverters are common throughout most industrial sectors and are well understood; they are becoming ever more popular as people come to recognise their energy-saving potential and the advantages that speed control can bring. Indeed, many national governments have incentive schemes in place to encourage their adoption as a way of reducing carbon emissions.
However, it was not always like this. Modern inverter technology emerged in the 1980s and its adoption developed from there. At the time, it was noticed that some motors failed when fed via an inverter, and it was realised that the voltage spikes caused by the inverter could make micro-holes in the enamelled wire insulation. When many micro-holes form close together they merge into a larger breach in the insulation, and this is when the motor burns out.
Several solutions to this problem were identified, including fitting harmonic filters, using oversized motors and improving the insulation within the motor. It was soon realised that a modified breed of ‘inverter-rated’ motor was the best option, and electric motor manufacturers were keen to include such a range in their catalogues.
Interestingly, motor design and quality improved on many fronts steadily through the 1990s and beyond, improving energy efficiency, reliability, etc., and for many manufacturers the inverter-related improvements became part of this overall enhancement. Therefore, the situation today is that many standard motors come ‘inverter-rated’ by default; the problem, if you are buying a branded motor of reasonable quality, has largely resolved itself. So, nowadays, the issue mainly arises in retrofit, modernisation and upgrade projects, when an existing motor is being reused.
It is worth taking a deeper look at the subject and understanding the physical principles in play. Rotor UK supplies standard motors and heavier-duty inverter-rated motors, as well as being a distributor for the AC Tech range of frequency inverters, so is well placed to comment.
Inverters change motors’ speed by changing the nature of the supply voltage from a smooth sinusoidal waveform to a rapidly switching one. Unfortunately, this inevitably causes voltage spikes, which can erode the insulation and lead to motor failure. There are other adverse side effects too, including inadequate ventilation at low speed, dielectric stresses on motor windings and magnetic noise.
These arise because of an impedance mismatch: the motor's surge impedance is typically much higher than the characteristic impedance of the supply cable, so the fast-rising voltage pulses reflect at the motor terminals and standing waves can develop, in the worst case nearly doubling the voltage the insulation has to withstand. (The longer the cable, the worse the potential problem.)
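The cable-length effect can be put in rough numbers. A common rule of thumb (not stated in this article, and using typical textbook figures) is that reflected-wave voltage doubling becomes possible once the cable is long enough that the reflection arrives back while the pulse is still rising, roughly l_c = v · t_r / 2, where v is the wave propagation speed in the cable and t_r the pulse rise time. A minimal sketch, assuming illustrative values:

```python
# Illustrative estimate of the critical cable length beyond which
# reflected-wave voltage doubling can occur at the motor terminals.
# The formula and figures are typical rule-of-thumb values, not
# specific to any particular drive or cable.

def critical_cable_length_m(rise_time_s, wave_velocity_m_per_s=1.5e8):
    """Cable length at which the reflection returns just as the pulse
    finishes rising: l_c = v * t_r / 2. Velocity defaults to roughly
    half the speed of light, typical for power cable."""
    return wave_velocity_m_per_s * rise_time_s / 2.0

# A fast-switching IGBT inverter might have a rise time near 0.1 microseconds
t_rise = 0.1e-6
print(f"Critical cable length: {critical_cable_length_m(t_rise):.1f} m")  # 7.5 m
```

The point of the estimate is that with fast modern switching devices, even a cable run of a few metres can be long enough for reflections to matter, which is why short cables and filters are among the standard remedies.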
Solutions include the use of filters or load reactors, and keeping cable runs short. Some people will also use an oversized motor (compromising energy efficiency), while others specify high-temperature Class H insulation.
Today, most motors are made to be highly efficient and include ‘inverter-friendly’ insulation. These are suitable for use in a wide variety of applications, so are generally the default choice in most projects.
However, for arduous situations specific 'inverter-duty' motors can be a better option. These use extra-thick insulation to cope with voltage spikes and, in some cases, inverter-grade magnet wire to counteract the adverse effects of severe waveforms. Inverter-rated motors are also designed for wider performance ranges than a general-purpose motor can provide, including full rated torque at zero speed, i.e. holding a load still using the motor rather than a mechanical brake.
They are also designed with better thermal management, so that operating temperatures remain lower, improving resistance to voltage stresses. In fact, large inverter-duty motors often have a blower fan to ensure adequate cooling during low-speed operation, while water-cooling jackets have proved effective in larger-capacity or particularly demanding applications.
The conclusion we must draw is that in many cases a modern motor is perfectly suitable for use with an inverter; however, engineers should be attuned to potential problems. These are likely to arise when:
- There is a high ambient temperature
- The duty cycle includes a lot of low-speed operation
- Loads are held at full torque, zero speed
- Safety issues relate directly to motor failure
- Replacement of a failed motor would be difficult and/or expensive
- Burnout of a motor could shut down an entire system or production line
- An existing motor is of unknown quality
- A retrofitted motor is more than 10 years old
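For engineers auditing several drives at once, the checklist above lends itself to a simple screening script. A minimal sketch, assuming hypothetical factor names (none of these identifiers come from the article itself):

```python
# Hypothetical screening helper built from the article's checklist.
# The factor names are illustrative labels for the eight risk
# conditions listed above; the "any factor present" rule reflects
# the article's conclusion that each condition alone is a warning sign.

RISK_FACTORS = {
    "high_ambient_temperature",
    "frequent_low_speed_operation",
    "full_torque_at_zero_speed",
    "safety_critical",
    "replacement_difficult_or_expensive",
    "failure_stops_production_line",
    "motor_quality_unknown",
    "motor_older_than_10_years",
}

def recommend_new_inverter_rated_motor(present_factors):
    """Return True if any checklist risk factor applies to the retrofit."""
    unknown = set(present_factors) - RISK_FACTORS
    if unknown:
        raise ValueError(f"Unrecognised factors: {sorted(unknown)}")
    return len(set(present_factors)) > 0

print(recommend_new_inverter_rated_motor({"motor_older_than_10_years"}))  # True
print(recommend_new_inverter_rated_motor(set()))  # False
```

In practice the judgement is rarely this binary, but recording the factors explicitly makes it easier to justify fitting a new inverter-rated motor before a breakdown forces the decision.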
In short, it may be better to fit a new inverter-rated motor by choice rather than risk an expensive breakdown in the (possibly quite near) future. To learn more, please visit www.rotor.co.uk.