Factor In Memory Tradeoffs When Using Today's Processes
Today's ever-diminishing process geometries offer system-on-a-chip (SoC) designers unprecedented opportunities to increase their target applications' performance and functionality. But the use of modern processes also poses new challenges that previously were of little concern. One such issue is the selection and application of the appropriate memory technology.
Due to the ever-increasing amounts of memory being utilized, embedded SoC memory has now taken center stage. According to the International Technology Roadmap for Semiconductors (ITRS), the percentage of an SoC die devoted to memory is growing steadily: from 20% in 2000, to over 50% in 2003, to a projected 90% or more by 2011.
The need for memory is clearly evident in the mobile-phone market. As that market transitions from 2G to 3G, products require increased performance and expanded functionality. While a basic 2G handset only requires enough memory for a menu system and protocol stack (typically 1 to 2 Mb of SRAM), a 3G handset with basic video display/capture could require up to 16 Mb of SRAM. Add in features like web browsers, audio players, GPS, and MPEG-4 video, and the memory requirements easily surpass 32 Mb of SRAM. In short, performance and applications are driving the requirement for increased memory.
In the past, external SRAM or DRAM often satisfied the need for large memory. Unfortunately, external memory is no longer economically feasible. Standalone SRAMs in the 8-to-32-Mb range have been, and will continue to be, relatively costly items. They aren't suitable for high-volume applications, such as consumer wireless.
The PC market has always driven DRAM pricing. By taking advantage of the economies of scale, designers could employ these same DRAMs in their designs. But recent evolution in the PC-DRAM design space introduced the very large double-data-rate (DDR) DRAM. This DRAM isn't suited for embedded applications, thereby denying the designer the low-price option. The economies of fine-geometry silicon processes imply that it's now cheaper to embed the memory than to use an external solution.
It's not enough to know that one can achieve tremendous cost, performance, and functionality advantages by embedding memory. Designers must be aware of the tradeoffs and pitfalls of low-power design using large embedded memories. Fine-line processes (0.13 µm and smaller) are especially susceptible to sub-threshold leakage: an uncontrolled or parasitic current flowing across regions of the semiconductor structure in which no current should be flowing. Although this effect is present in all MOS transistors, regardless of geometry, it dominates in today's processes and promises to be an even greater concern in future ones. Designers can no longer ignore the leakage currents that flow through the memory even when it isn't being accessed.
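To see why finer geometries make leakage worse, consider the standard first-order model of subthreshold conduction, where the "off" current of a transistor scales exponentially with the negative of its threshold voltage. The sketch below uses illustrative constants (the prefactor current and ideality factor are assumptions, not figures for any specific process), but it shows how the modest threshold-voltage reductions that accompany process scaling can multiply per-transistor leakage by orders of magnitude:

```python
import math

def subthreshold_leakage(vth, i0=1e-7, n=1.5, vt=0.026):
    """Rough per-transistor 'off' current in amps.

    First-order subthreshold model: I ~ I0 * exp(-Vth / (n * VT)).
    i0 (prefactor, amps) and n (ideality factor) are illustrative
    values chosen for this sketch, not data for a real process.
    vt is the thermal voltage (~26 mV at room temperature).
    """
    return i0 * math.exp(-vth / (n * vt))

# Dropping Vth from ~0.5 V to ~0.3 V, as finer geometries tend to do,
# raises leakage by a factor of exp(0.2 / (1.5 * 0.026)) -- over 100x:
ratio = subthreshold_leakage(0.3) / subthreshold_leakage(0.5)
```

Multiply that per-transistor current by the tens of millions of transistors in a 32-Mb embedded SRAM, and the idle-mode power budget becomes impossible to ignore.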
Low power is a concern for both the wired and portable-wireless design communities. In the wired world, power translates directly into heat. Today's designer is always trying to stuff more into a smaller space—be it an 18-in. rack-mount board or a new ergonomic package. The challenge of maintaining thermal control in these enclosures can be exacerbated by the heat contributions of ever-expanding memory. Designers must therefore choose a memory technology that consumes the lowest power in both active and idle mode.
In the wireless arena, this issue is even more critical, because every designer must maximize battery life. In the past, this goal was accomplished by simply not accessing portions of the memory. But "sleeping" the memory isn't a viable solution anymore, as leakage current still flows through the unaccessed cells. Memory architectures and technologies must now be chosen for their ability to control both active and idle currents.
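A back-of-the-envelope battery-life calculation makes the point. The figures below (cell capacity, currents, duty cycle) are illustrative assumptions, not measurements of any real handset, but they show how an idle leakage current of a few milliamps can dominate the average draw of a memory that is active only a small fraction of the time:

```python
def battery_life_hours(capacity_mah, i_active_ma, i_idle_ma, duty_active):
    """Battery life from the duty-cycle-weighted average current.

    All parameters are hypothetical figures for illustration.
    """
    i_avg = duty_active * i_active_ma + (1 - duty_active) * i_idle_ma
    return capacity_mah / i_avg

# 900-mAh cell; memory active 5% of the time, drawing 30 mA when active.
# Compare near-zero idle current against 2 mA of idle leakage:
life_low_leak  = battery_life_hours(900, 30, 0.05, 0.05)  # ~580 h
life_high_leak = battery_life_hours(900, 30, 2.0,  0.05)  # ~260 h
```

Even though the memory is idle 95% of the time, the leakier technology cuts battery life by more than half, which is why idle-current specifications deserve as much scrutiny as active-current ones.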
Today's SoC designer also must ensure the quality of the data that's contained in the embedded memory. Memories are susceptible to random bit failures known as soft errors. Many soft errors result from interactions of high-energy particles with the memory-cell structure. While progress has been made in eliminating the source of some particles—specifically alpha particles in packaging and lead materials—the contribution of soft errors by cosmic rays remains significant. The fine-line geometries of today's processes are more susceptible to cosmic-ray interaction than previous silicon processes. As a result, designers must familiarize themselves with memory technologies that will ensure the highest data quality. Some memory architectures, for example, employ internal error-correcting circuitry that repairs bad data to ensure high reliability.
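The article doesn't specify which error-correcting scheme such circuitry uses; a common building block for single-bit correction is a Hamming code. As a sketch of the principle (not a model of any particular vendor's implementation), the classic Hamming(7,4) code stores three parity bits alongside every four data bits, and the recomputed parity "syndrome" pinpoints and repairs any single flipped bit:

```python
def hamming74_encode(d):
    """Encode 4 data bits (list of 0/1) into a 7-bit codeword.

    Bit positions (1-indexed): p1, p2, d1, p3, d2, d3, d4.
    Each parity bit covers the positions whose index has that
    parity bit's power of two set.
    """
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(codeword):
    """Recompute parity, flip any single erroneous bit, return the data."""
    c = list(codeword)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 0 means no error detected
    if syndrome:
        c[syndrome - 1] ^= 1          # syndrome is the 1-indexed bad position
    return [c[2], c[4], c[5], c[6]]
```

In hardware, the same XOR trees run on every read: a soft error flips one stored bit, the syndrome locates it, and the corrected data is returned transparently to the rest of the SoC.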
The ever-increasing amounts of embedded SoC memory contribute to the performance and functionality offered by today's processes. To make the best use of this embedded memory, however, designers must keep in mind memory tradeoffs like cost, power usage, and reliability.