For the past few decades, the transformer has been the workhorse of power electronics. It has been a critical, enabling, dependable ingredient in safely converting raw AC (alternating current) power from the grid into the usable DC (direct current) that powers virtually all modern electronics. At the same time, however, the transformer has also been the anchor of power electronics … literally weighing down power supplies, accounting for as much as 50% of a system’s weight. Transformers are the primary reason that AC-DC power systems have seen little improvement in power density over the past three decades, stagnant at around 5-10 watts per cubic inch (W/in³) for most systems under 1,000 W.

When you open up a power adapter, it looks like it was designed in the ‘70s … because it virtually was! The transformer typically accounts for over 50% of the power loss and is often the most expensive component. To make matters worse, traditional transformer manufacturing is almost impossible to automate, with labor-intensive windings, messy tape and plastic barriers to meet voltage-safety requirements, difficult shielding techniques to reduce electrical noise (EMI), and hand-inserted, through-hole assembly.

There must be a better way!

For power systems to make significant advances and catch up with the developments of modern-day digital electronics, something has to be done about the transformer. It IS the elephant in the room (figuratively, and somewhat literally). But how do we fix it? Let’s start with physics. Transformers operate on the principles defined by Faraday’s Law. To understand what drives the size of a transformer, that law can be rewritten as shown below:
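One common transformer-sizing form of Faraday’s Law (the exact constant k depends on the drive waveform — roughly 4.44 for sinusoidal excitation, about 4.0 for square-wave drive) is:

    N x A_core = V_out / (k x f_sw x B_peak)

where N is the number of turns, A_core is the core cross-sectional area, V_out is the winding (output) voltage, f_sw is the switching frequency and B_peak is the peak flux density.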

Put more simply, the size of the transformer (# of turns x area of the transformer core) for a given output voltage is inversely proportional to the flux density and the switching frequency. So why not dramatically increase the flux density or the switching frequency to shrink the size (and, in turn, the cost and weight) of the transformer? If only it were that easy! There are three challenges to overcome:

  1. Flux Density – to create the desired output voltage, current flows through ‘primary’ wires wound around a magnetic core and, in so doing, generates a magnetic field (called flux). This flux couples into the secondary winding, where it is converted back into a current and an output voltage. With higher switching frequency, less flux density is needed to maintain the targeted output voltage, but if we operate at that lower flux density we give back part of the size benefit that the higher frequency was supposed to deliver. Not good!
  2. Power losses – as mentioned earlier, the transformer often contributes up to 50% of the power losses in a power system. What happens to power loss if we increase the switching frequency? Transformer losses are, unfortunately, proportional to switching frequency: for a given core, higher frequencies mean higher losses (see the short sketch after this list). Uh oh…
  3. EMI – power supplies must meet EMI (electromagnetic interference) regulations, which demand that negligible electrical noise is generated in order to minimize interference back into the power grid or into downstream electronic devices. Common (engineering) sense suggests that faster switching, with its high dV/dt and di/dt, will generate more electrical noise. This will take a little uncommon sense…
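
To make the frequency trade-off in items 1 and 2 concrete, here is a rough, illustrative sketch in Python (my own example, using made-up, ferrite-like Steinmetz coefficients rather than data for any specific core): it sweeps the switching frequency, computes the N x A_core product required by Faraday’s Law, and estimates relative core-loss density with the generic Steinmetz equation P_v = k x f^alpha x B^beta.

    # Illustrative only: how Faraday's Law and Steinmetz core losses pull in
    # opposite directions as switching frequency rises. Coefficients are
    # hypothetical, ballpark ferrite-like values, not from a datasheet.

    def required_turns_area(v_out, freq_hz, b_peak_t, k_waveform=4.0):
        """Faraday's Law rearranged: N * A_core needed for a given voltage.
        k_waveform is ~4.0 for square-wave drive, ~4.44 for sinusoidal."""
        return v_out / (k_waveform * freq_hz * b_peak_t)

    def core_loss_density(freq_hz, b_peak_t, k=1.5e-3, alpha=1.4, beta=2.5):
        """Steinmetz equation P_v = k * f^alpha * B^beta (relative units)."""
        return k * (freq_hz ** alpha) * (b_peak_t ** beta)

    if __name__ == "__main__":
        v_out = 12.0   # target winding voltage, volts
        b_peak = 0.2   # peak flux density, tesla, held constant for the sweep
        for f_sw in (50e3, 100e3, 200e3, 400e3):
            na = required_turns_area(v_out, f_sw, b_peak)   # turn * m^2
            pv = core_loss_density(f_sw, b_peak)            # relative
            print(f"f = {f_sw/1e3:5.0f} kHz | N*A = {na*1e6:7.1f} turn*mm^2 "
                  f"| relative core-loss density = {pv:8.1f}")
        # The N*A column shrinks with frequency (smaller transformer), but the
        # loss column climbs -- unless B is reduced, which gives back size.

Sweeping frequency with flux density held constant shows the tension: the required core size falls while the loss density climbs, which is exactly why “just switch faster” is not a free lunch.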

These are not easy challenges, but they are solvable! In my next blog, we will discuss solutions for each and see if we can slay this magnetic dragon once and for all. Stay tuned!

This article was originally published at: https://www.planetanalog.com/author.asp?section_id=3359&doc_id=564982&