Portable devices often need the ability to vary the supply voltage of their applications processor to enable special power-savings modes and to control output voltage ramps. A simple but effective way to implement dynamic voltage control is to pair a voltage regulator with a current-output digital-to-analog converter (DAC).
Voltage regulator IC1 in this design is a high-efficiency, synchronous 5-A stepdown converter with a 65-µA supply current that is well suited to portable device designs (see the figure). Strapping the SET pin to VCC, leaving it open, connecting it to the REF pin, or connecting it to ground sets the converter to produce output voltages of 1.8, 1.5, 1.1, or 0.75 V, respectively.
With SET connected to ground, a resistive feedback divider (R3 and R4) connected to VOUT provides an opportunity to alter the output voltage by injecting current into the resistor junction, allowing outputs anywhere from 0.75 to 2.7 V.
The converter’s output voltage in this configuration is given by VOUT = Vfb (R3/R4 + 1). With SET grounded, the converter will hold Vfb = 0.75 V. The values shown for R3 and R4, then, yield a value of 1 V for VOUT.
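As a quick numerical check of the divider equation, the short Python sketch below reproduces the 1-V starting point. R3 = 6.34 kΩ is taken from the resolution calculation later in the article; R4 = 19.1 kΩ is an assumed standard value consistent with the 1-V target (the actual values appear in the figure).

```python
# Check VOUT = Vfb * (R3/R4 + 1) with SET grounded.
# R3 comes from later in the article; R4 is an assumed standard value
# chosen to hit the 1-V starting point (see the figure for actual values).
VFB = 0.75       # feedback regulation voltage, volts
R3 = 6340.0      # ohms
R4 = 19100.0     # ohms (assumed)

vout = VFB * (R3 / R4 + 1)
print(f"VOUT = {vout:.3f} V")   # ~0.999 V, i.e. the 1-V starting value
```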
The 7-bit current DAC IC2 (one half of a dual DAC), when connected to the feedback node of the divider network, allows you to source or sink as much as 200 µA into the network. Because the converter keeps Vfb at 0.75 V, the current through R4 is fixed at IR4 = 0.75/R4, and we see that any current from IDAC adds to or subtracts from the current through R3.
This additional current causes VOUT to rise when the DAC sinks current or fall when the DAC sources current. By setting the DAC output, then, you can control VOUT, with VOUT = Vfb + (IR4 – IDAC) × R3, where IDAC is positive when the DAC sources current into the node. At power-up the DAC outputs assume a high-impedance state, so only the divider resistors determine the starting value for VOUT.
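Applying Kirchhoff's current law at the feedback node makes the control relationship concrete. The sketch below uses the same assumed R4 = 19.1 kΩ as above (R3 and Vfb are from the article); the sign convention is that positive IDAC sources current into the node, so sourcing lowers VOUT and sinking raises it.

```python
# VOUT as a function of DAC current at the feedback node (KCL at the node).
# idac > 0 means the DAC sources current into the node.
# R4 = 19.1 k is an assumed standard value; R3 and Vfb are from the article.
VFB, R3, R4 = 0.75, 6340.0, 19100.0

def vout(idac):
    ir4 = VFB / R4                   # fixed, since the converter holds Vfb
    return VFB + (ir4 - idac) * R3   # whatever is left over flows through R3

print(vout(0.0))        # ~1.00 V: DAC high-impedance, divider alone
print(vout(-100e-6))    # DAC sinks 100 uA -> VOUT rises (~1.63 V)
print(vout(+39.3e-6))   # sourcing ~IR4 cancels IR4 -> VOUT falls to ~Vfb
```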
The DAC’s I2C control interface allows selection from 127 sink and 127 source current values for its output. The command is a single byte, sent MSB first, with the most significant bit serving as a sign bit (1 = source, 0 = sink) and the remaining seven bits providing the relative output magnitude. Resistor R2 at the FS0 pin determines the actual full-scale output current (IFS) the DAC will generate, with IFS = (0.997/R2) × (127/16).
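The sign-magnitude command byte can be sketched as below. The function name and range check are illustrative; the real I2C transaction (device addressing, any register-selection bytes) follows the DAC's datasheet and is not shown here.

```python
def dac_code(steps: int) -> int:
    """Build the 8-bit sign-magnitude command byte described in the text.

    steps: -127..+127; positive = source (MSB set), negative = sink
    (MSB clear). Illustrative only -- the full I2C sequence (addressing,
    register selection) comes from the device datasheet.
    """
    if not -127 <= steps <= 127:
        raise ValueError("DAC code out of range")
    sign = 0x80 if steps >= 0 else 0x00   # MSB: 1 = source, 0 = sink
    return sign | abs(steps)

print(hex(dac_code(127)))   # 0xff: full-scale source
print(hex(dac_code(-1)))    # 0x01: one LSB of sink current
```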
The resistor values in the design were chosen so VOUT has a starting value of 1 V and the DAC can adjust the output voltage with a resolution of 5 mV. To achieve this resolution, then, one least significant bit (LSB) of DAC output current must equal 5 mV/R3.
The full-scale output IFS must therefore be 127 × (0.005/6340), or 100.16 µA. Achieving that value requires R2 = (0.997 V × 127)/(16 × 100.16 µA), or 79,012 Ω. The nearest standard resistor value, 78.7 kΩ, gives an actual IFS of 100.55 µA.
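The resistor-selection arithmetic above is easy to verify; the following sketch reproduces it using only values stated in the text.

```python
# Reproduce the R2 selection arithmetic from the text.
R3 = 6340.0                       # ohms
LSB_V = 0.005                     # desired 5-mV output resolution
ifs_target = 127 * (LSB_V / R3)   # required full-scale DAC current, amps
r2_ideal = (0.997 * 127) / (16 * ifs_target)
r2_std = 78.7e3                   # nearest standard resistor value
ifs_actual = (0.997 / r2_std) * (127 / 16)

print(f"{ifs_target * 1e6:.2f} uA")   # 100.16 uA
print(f"{r2_ideal:.0f} ohms")         # ~79012 ohms
print(f"{ifs_actual * 1e6:.3f} uA")   # ~100.555 uA, quoted as 100.55 uA
```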
The converter must see a load current at least as great as the DAC’s maximum source current, even if no real load is attached, to operate properly. This restriction arises because, to improve efficiency, the converter automatically switches to a pulse-skipping mode when its output current is below 100 mA.
In this mode, the converter turns off its low-side MOSFET and thus cannot sink current. If there is no load current and IC2 tries to set VOUT below 0.765 V (the datasheet’s no-load value for Vfb), the DAC’s source current flows through R3 into the converter’s output, which IC1 would have to sink; the output could then become unstable and trigger the converter’s overvoltage or undervoltage detectors.
For example, to set VOUT to 0.6 V, the sink current into the converter must be (0.765 V – 0.6 V)/R3, or 26 µA. To ensure that IC1 operates properly at low output voltages even when no real load is present, a dummy load RL on the output must draw this 26 µA so that IC1 always sources current.
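The worst-case sink current and a corresponding dummy-load value follow directly from the numbers above. The RL value below is illustrative only (the article does not specify one); any load at or below this resistance guarantees the required current at the lowest output voltage.

```python
# Worst-case sink current and an illustrative dummy-load value.
VFB_NOLOAD = 0.765   # datasheet no-load feedback voltage, volts
VOUT_MIN = 0.6       # lowest commanded output, volts
R3 = 6340.0          # ohms

i_sink = (VFB_NOLOAD - VOUT_MIN) / R3
rl_max = VOUT_MIN / i_sink   # largest RL that still draws i_sink at 0.6 V

print(f"{i_sink * 1e6:.0f} uA")     # ~26 uA
print(f"{rl_max / 1e3:.1f} kohm")   # ~23.1 kohm; choose RL at or below this
```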