The last five to seven years have seen more circuit designers integrate mixed-signal and analogue functions onto digital chips such as microcontrollers, DSPs, and ASICs.
Many technological and financial reasons are fuelling this trend. One of the most important is that CMOS processes have become more capable and better characterised for analogue functions. Analogue-to-digital converters (ADCs) can now be implemented on submicron processes, up to certain speeds and resolutions.
For general functions, such as monitoring a power-supply voltage, currents, or temperatures, most design engineers use ADCs that are integrated onto the microcontrollers or digital signal processors sitting at the heart of a system.
Integrated ADCs are fine for these functions, but their quality and dynamic performance are generally not good enough for signal-path applications, such as the front end of a digital receive channel. On these large chips, the converter function comes almost for free. On MCUs, resolutions range from 8 to 12 bits and speeds reach a few hundred kilosamples per second (ksps). DSPs can provide more resolution, up to 14 bits, and higher sampling rates of up to 2.5Msps. The limitation in speed is due to the noise that the digital blocks inherently inject into the silicon substrate.
With mainstream CMOS processes, it is not possible to isolate the high-speed sample-and-hold circuitry of the digital core’s ADCs. The majority of integrated ADCs have no missing codes (INL/DNL is acceptable), at least with a 5V supply. When the supply is reduced to 3V, however, the datasheets’ full-scale error of 2LSB INL/DNL and offset/gain variation of 4LSB become a real issue.
Reasons for disintegration
We all realise that increasing performance needs and added functionality at no extra cost are key drivers of shrinking silicon transistor geometries. Today’s MCUs still reside on 0.35µm processes with 5V supply levels. Soon manufacturers will be forced to move to smaller feature sizes such as 0.18µm, which will drop supply voltages to 3V. On the DSP side, the speed/power ratio has already forced the analogue supply down to 3V and the digital supply to 2.5V.
Looking at the ADC blocks, the drop in supply voltage means the maximum acceptable input voltage is reduced. As a result, the least significant bit (LSB) gets smaller.
Unfortunately, the RMS level of noise on the digital side will stay constant—or even go up. That means the signal-to-noise ratio will shrink, resulting in a loss of dynamic range. The performance of such an integrated ADC may no longer be sufficient for certain applications. In fact, it may only find use in voltage/current monitoring situations.
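The loss can be put in numbers. As an illustrative sketch, assume a 12-bit converter whose full-scale input range tracks the supply, and an injected noise floor that stays fixed in absolute terms (the 0.5mV RMS figure below is an assumption for illustration, not a datasheet value):

```python
import math

NOISE_RMS = 0.5e-3  # assumed constant RMS substrate noise, in volts (illustrative)

def lsb_volts(full_scale_v, bits):
    """Voltage step of one LSB for a given full-scale range."""
    return full_scale_v / 2**bits

def snr_db(full_scale_v, noise_rms_v):
    """SNR of a full-scale sine wave against a fixed RMS noise floor."""
    signal_rms = full_scale_v / (2 * math.sqrt(2))  # RMS of a full-scale sine
    return 20 * math.log10(signal_rms / noise_rms_v)

for supply in (5.0, 3.0):
    print(f"{supply:.0f} V supply: 1 LSB (12 bit) = {lsb_volts(supply, 12)*1e3:.2f} mV, "
          f"SNR = {snr_db(supply, NOISE_RMS):.1f} dB")
# Dropping the full scale from 5 V to 3 V against a constant noise floor
# costs 20*log10(5/3), roughly 4.4 dB of dynamic range.
```

Whatever noise figure is assumed, the ratio is what matters: a 5V-to-3V supply drop with a constant noise floor always costs about 4.4dB.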
As a rule of thumb, today’s integrated ADCs already deliver an effective number of bits (ENOB) that is two bits less than advertised. In other words, a 12-bit converter has only 10-bit performance. One way to compensate for the shrinking dynamic range is to use a higher-resolution converter, even at the expense of losing some bits. Unfortunately, this option is rather limited, because the noise that the DSP/MCU digital block injects into the common substrate plays havoc with small LSBs.
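The “two bits less than advertised” rule of thumb maps directly onto the standard ENOB formula, ENOB = (SINAD − 1.76)/6.02, where SINAD is the measured signal-to-noise-and-distortion ratio in dB. A minimal sketch (the 12dB degradation figure is chosen to match the rule of thumb, not taken from any datasheet):

```python
def ideal_sinad_db(bits):
    """Theoretical SINAD of an ideal N-bit converter (quantisation noise only)."""
    return 6.02 * bits + 1.76

def enob(sinad_db):
    """Effective number of bits recovered from a measured SINAD figure."""
    return (sinad_db - 1.76) / 6.02

# An ideal 12-bit ADC would show SINAD of about 74.0 dB. An integrated ADC
# measuring ~12 dB worse (two bits' worth of injected noise) delivers
# only ~10 effective bits.
measured = ideal_sinad_db(12) - 2 * 6.02
print(f"ideal SINAD: {ideal_sinad_db(12):.1f} dB")
print(f"measured SINAD: {measured:.1f} dB -> ENOB = {enob(measured):.1f}")
```

Each 6.02dB of SINAD lost to substrate noise costs exactly one effective bit, which is why the advertised resolution and the usable resolution diverge as supplies drop.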
A similar situation occurs with op-amp integration—another analogue function that’s been integrated into digital chips (e.g., ASICs). Here, noise versus performance issues forced disintegration for high-performance amplifying functions.
Therefore, as digital process technologies continually evolve and transistor dimensions decrease, we will see an increased need for general-purpose ADCs with true 8- to 14-bit resolution and speeds reaching 3Msps. For now, the demise of integrated ADCs is not in sight. Nonetheless, requirements for multiplexed input channels, single-ended and differential inputs, external and internal references, and so on, will further increase the need for standalone, general-purpose ADCs.