In the heyday of the linear power supply, efficiency wasn’t an essential parameter—and it hasn’t changed much to this day. Power supplies were (and are) heavy and large, but the focus was (and is) on low noise, accurate output voltage, tight regulation, and low ripple. It’s often hard even to find efficiency, which is still in the range of 45% to 60%, listed in the specifications.
Before the introduction of the brick, switching power supplies were called box switchers because they were about the size of a shoebox. They improved efficiency to above 80%, which was a big advantage from a thermal standpoint and an improvement in size and weight over the linear supplies.
The pulse-width modulation (PWM) control chip introduced in the mid-1970s gave a boost to switching power supplies, which typically had an efficiency peak at 50% load. Zero current switching (ZCS) modules followed about 10 years later with a flat efficiency curve near the peak of the PWM from 20% load to full load, which was a valuable improvement.
The next break in efficiency occurred when brick designers began using synchronous rectification, which took hold in the late 1990s, achieving efficiencies in the mid- to upper-80% range. This was a major improvement in the power dissipation of the output rectifiers, which traditionally used diodes. Using a MOSFET instead greatly reduces the forward voltage drop.
Synchronous rectification had its biggest impact on the low output voltages—5 V and less—because the output rectifier’s forward voltage drop was a significant percentage of the output voltage. The ability to reduce power dissipation at the low output voltages, which traditionally offered the lowest efficiency, was a particularly timely advance in the industry.
That’s because output voltage requirements were getting lower and lower. Around the same time, higher-voltage converters (e.g., 300 VIN, 48 VOUT), still using traditional rectification, were also reaching the high-80% range.
Today, the drive for energy efficiency is pervasive. Energy Star recognizes efficient products such as computers, appliances, lighting, and even new homes. Personal computers with internal power supplies, for example, must achieve a power factor greater than 0.9 to meet Energy Star guidelines.
At lower power factors, the apparent power is greater than the real power consumed by the load, increasing losses and cost to carry the higher current. The European Union EN61000-3-2 standard places upper limits on the amount of harmonic currents certain products may generate. Though harmonic distortion is not efficiency per se, limits on electronic equipment help to more efficiently pull energy from the grid.
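The cost of a low power factor can be made concrete with a short calculation. The sketch below (illustrative numbers only, not from the article) shows how the line current needed to deliver the same real power grows as the power factor falls:

```python
# Hedged sketch: apparent vs. real power at different power factors.
# The 300 W / 120 V figures are assumptions for illustration.

def line_current(real_power_w, voltage_v, power_factor):
    """RMS line current needed to deliver a given real power."""
    apparent_power_va = real_power_w / power_factor  # S = P / PF
    return apparent_power_va / voltage_v

# A 300 W load on a 120 V line:
i_good = line_current(300, 120, 0.9)  # meets the Energy Star PF > 0.9 guideline
i_poor = line_current(300, 120, 0.6)  # a poor power factor

print(f"PF 0.9: {i_good:.2f} A, PF 0.6: {i_poor:.2f} A")
```

The same 300 W load draws roughly 50% more current at a power factor of 0.6 than at 0.9, and the distribution losses scale with the square of that current.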
Meanwhile, as the number of onboard voltages proliferated and output voltages dropped, the intermediate bus architecture (IBA) emerged to address some of the challenges (e.g., high cost) of using bricks in a distributed power architecture. IBA generates several low voltages by using non-isolated buck step-down regulators at the point of load (niPOLs).
An intermediate bus converter (IBC) transforms the input bus voltage to provide a common voltage source called the “intermediate bus voltage” from which several niPOL buck regulators may be powered to regulate their respective loads.
The intermediate voltage level, such as 12 V, is chosen to bridge the gap between a typical input bus voltage of 48 V and a typical load of 3 V. Through its output inductor, the buck regulator delivers a voltage to the load equal to the average voltage at the common node between its top and bottom switches. This is equal to the duty cycle of the top switch times the intermediate bus voltage.
Although the niPOL converters are smaller and less expensive than bricks, the IBA (with two power conversions) pays an intrinsic penalty in efficiency. In addition, especially in high-power conversions, the typical 12-V intermediate bus is too low for efficient power distribution.
A 12-V intermediate bus and niPOL output voltages around 1 V force duty cycles to around 10% with currents often in the neighborhood of 100 A. A low duty cycle adversely affects niPOL efficiency, and I2R losses become significant.
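To see why those conduction losses matter, consider a rough worked example. The 1-mΩ path resistance below is an assumption for illustration; the 12-V bus, ~1-V output, and 100-A current come from the text:

```python
# Illustrative numbers for the 12 V bus / ~1 V load case described above.
# The 1 milliohm path resistance is an assumed value, not from the article.

def conduction_loss(current_a, resistance_ohm):
    """I^2 * R conduction loss in a resistive path."""
    return current_a ** 2 * resistance_ohm

duty = 1.0 / 12.0                      # ~8% duty cycle for a 1 V output
loss_w = conduction_loss(100, 0.001)   # 100 A through 1 milliohm

print(f"duty ≈ {duty:.0%}, I²R loss ≈ {loss_w:.0f} W")
```

Even a single milliohm in the high-current path dissipates about 10 W at 100 A, which is ten times the power actually delivered to a 1-V, 100-W load per milliohm of resistance avoided.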
Consider the Application
Efficiency improvement has to be seen in terms of the application itself, rather than simply by specifying power components with high efficiencies. In high-end computing and telecom applications, the ac-to-12-V dc silver box followed by a 12-V to 1.x-V synchronous buck converter was undone by the continually falling load voltages that had elevated it to success in the first place.
Advanced power train technology can eliminate the step-down stages and enable direct 48-V to load conversion. Building-block modules, running at multi-megahertz frequencies, boost power-conversion efficiency. For the 48-V-to-processor stage, which operates at essentially 100% duty cycle, factorized regulation and voltage transformation provide a compact, efficient solution.
Individually, the building block modules can achieve as high as 97% efficiency. Overall efficiency for a power system, including a regulator module and a transformation module, operating from an unregulated dc source and supplying a low-voltage dc output typically ranges from 90% to 95%.
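Cascaded stages multiply their efficiencies, which is why two high-efficiency modules can still land in the stated 90% to 95% band. A quick sketch, using the 97% figure from the text as an assumed value for both stages:

```python
# Sketch: overall efficiency of cascaded conversion stages is the
# product of the individual stage efficiencies.

def cascade_efficiency(*stage_efficiencies):
    """Multiply the efficiencies of series-connected power stages."""
    overall = 1.0
    for eff in stage_efficiencies:
        overall *= eff
    return overall

# Two 97%-efficient modules in series (assumed values):
print(f"{cascade_efficiency(0.97, 0.97):.1%}")  # → 94.1%
```

This also illustrates the intrinsic penalty of the two-conversion IBA mentioned earlier: every added stage multiplies in another loss factor.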
The most important gain, however, is that of system efficiency and savings in power drawn from the ac line. High-end computer applications, such as data centers, are expected to account for about 3% of total U.S. electricity consumption by 2011. Lower energy losses also mean associated systems such as air conditioning incur savings in power and cost.
Tom Curatolo is the director of applications engineering at Vicor Corp.