Specialized processors entertain and keep us connected
Ever since the transistor radio came on the scene in 1954, semiconductor technology has played an important role in keeping consumers entertained and in touch with one another. From the '50s through the '60s, most semiconductor usage in consumer products was in the form of discrete transistors, diodes, and some low-integration analog circuits for audio. In the mid-'60s, digital tuning started to appear in citizens-band and industrial two-way radios. However, digital technology didn't really make it big in the consumer market until moderately priced digital calculators and wristwatches started to appear in late 1969 through the early '70s.
The year 1972 marked a pivotal development: the release of the first handheld scientific calculator from Hewlett-Packard (www.hp.com), the HP-35. This calculator was the harbinger of death for the engineer's most loved and hated tool, the slide rule. Over the next few years, the role of digital technology exploded as system designers learned how to leverage it to create a wide array of products, ranging from standard ICs, such as microprocessors and embedded controllers, to custom-crafted chips that perform dedicated functions.
Before long, digital technology gave rise to the PC (the MOS Technology 6502 used by Apple Computer Inc., www.apple.com, in its first family of computers, and the Intel 8080 selected by MITS for a home-brew, minicomputer-like PC called the Altair). Around that time, the Pong video game made its debut. In some ways, the consumer-grade video cassette recorder was made possible by a relatively simple control interface implemented with 4-bit microcontrollers.
The wave of innovation, though, was just starting. With the advent of the digital-signal processor, analog calibration tasks that required time-consuming hand tweaking could be reduced to simple algorithms. DSP technology applied to audio allowed companies to build stereo systems that provided well-filtered audio, or to add special sound effects such as echoes and vibrato, just by processing the audio signals in the digital rather than the analog domain.
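The digital echo effect mentioned above reduces to a one-line difference equation: delay the signal, attenuate it, and mix it back in. A minimal sketch in Python (the function name and parameters are illustrative, not drawn from any particular product):

```python
def add_echo(samples, delay, gain=0.5):
    """Mix a delayed, attenuated copy of the signal back in:
    y[n] = x[n] + gain * x[n - delay]."""
    out = list(samples)
    for n in range(delay, len(samples)):
        out[n] += gain * samples[n - delay]
    return out

# A unit impulse reappears, attenuated, two samples later:
dry = [1.0, 0.0, 0.0, 0.0, 0.0]
wet = add_echo(dry, delay=2, gain=0.5)  # [1.0, 0.0, 0.5, 0.0, 0.0]
```

Vibrato works on the same principle, except that the delay length itself is modulated over time rather than held fixed.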
New audio technologies that employed digitized audio (16- to 24-bit digitization) were on the horizon in the late '70s. With the release of the compact-disc (CD) specification by Philips and Sony in 1979, designers' imaginations took flight, bringing digital-audio CD players to market in the early '80s. Other formats, such as digital-audio tape (DAT), were also in vogue. Due to cost and copying fears, though, DAT never became a major factor.
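The 16- to 24-bit digitization figures translate directly into fidelity: each additional bit buys roughly 6 dB of theoretical signal-to-noise ratio. A rough sketch of the arithmetic (the helper names are illustrative, and the SNR formula is the standard ideal-quantizer approximation for a full-scale sine):

```python
def quantize(x, bits):
    """Quantize a sample in [-1.0, 1.0) to a signed integer code."""
    levels = 1 << (bits - 1)                    # e.g. 32768 for 16-bit
    return max(-levels, min(levels - 1, round(x * levels)))

def dynamic_range_db(bits):
    """Theoretical SNR of an ideal b-bit quantizer: 6.02b + 1.76 dB."""
    return 6.02 * bits + 1.76

# 16-bit audio spans codes -32768..32767, ~98 dB theoretical SNR;
# 24-bit pushes that to ~146 dB.
```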
The birth of the audio CD in the mid-'80s opened the market's eyes to digital content and the reproduction quality possible. Although audio purists still wanted to keep audio signals in the analog domain, digital won out and swept through the recording industry. Today, the move from CD audio to MP3 and other digitally encoded audio schemes has created yet another layer of silicon support: dedicated ASICs. These form the basis for portable MP3 music players and memory cards with large amounts of flash memory that replace audio tapes and CDs.
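The appeal of flash cards over CDs comes down to simple bitrate arithmetic: uncompressed CD audio runs at about 1411.2 kbps (2 channels x 16 bits x 44,100 samples/s), while typical MP3 encodings run near 128 kbps. A back-of-the-envelope sketch, assuming decimal megabytes and ignoring file-system overhead:

```python
def minutes_of_audio(capacity_mb, bitrate_kbps):
    """Minutes of encoded audio that fit in a given capacity.
    Capacity in megabytes (10^6 bytes), bitrate in kilobits/s (10^3)."""
    bits = capacity_mb * 1_000_000 * 8
    return bits / (bitrate_kbps * 1000) / 60

# A 64-MB flash card holds roughly 6 minutes of raw CD audio
# (1411.2 kbps) but over an hour of 128-kbps MP3.
```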
As systems such as PCs were introduced, the number of custom-crafted chips increased while designers looked for ways to reduce the component count inside the boxes they created. Thus was born Chips and Technologies Inc. (www.chips.com), a company that designed chip sets for the PC motherboard. These sets, typically 3 to 5 chips, replaced over 50 less complex chips and drastically simplified and lowered the cost of designing and manufacturing motherboards.
Additionally, PC graphics capabilities were increasing by leaps and bounds, moving quickly from the monochrome character displays employed in basic video-display terminals to multicolor, graphics-capable 2D displays. Dedicated video-display controller chips that performed only digital timing and memory control grew into more highly integrated solutions that combined the control functions with the color-palette memory and the high-speed digital-to-analog converters (DACs) required to drive the color-gun amplifiers for the CRT display.
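The color-palette stage in those controllers is conceptually just a lookup table: the frame buffer holds small indices, and the palette RAM expands each index into a full RGB triple on its way to the DACs. A minimal sketch of that lookup (the four-entry palette is purely illustrative):

```python
# Hypothetical 2-bit palette: index -> (red, green, blue), 8 bits each.
PALETTE = {0: (0, 0, 0), 1: (255, 0, 0), 2: (0, 255, 0), 3: (0, 0, 255)}

def scanline_to_rgb(indices, palette):
    """Expand one scanline of frame-buffer indices into RGB triples,
    as the palette RAM does on every pixel clock."""
    return [palette[i] for i in indices]

# A 3-pixel scanline of indices becomes three full-color pixels:
pixels = scanline_to_rgb([0, 1, 3], PALETTE)
```

The payoff of the scheme is memory savings: the frame buffer stores only a few bits per pixel, while the display still receives full-depth color.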
Aside from CPU clock speed, graphics performance became one way for manufacturers to distinguish their machines. Not only was there a major effort to increase the speed at which the controller could move images on the screen, but also a move to display more lifelike images. Graphics engines developed in the late '80s and '90s focused heavily on high-resolution 3D imaging. Leading that charge was 3D Labs (www.3Dlabs.com), a company that introduced some of the first 3D graphics engines used in PCs and workstations.
At the same time, graphics-workstation suppliers such as Silicon Graphics Inc. (www.sgi.com) and Evans & Sutherland Inc. (www.es.com) were developing custom chips for graphics engines that could execute billions of operations per second. By the early '90s, custom graphics chips were tag-teamed with high-performance floating-point DSP chips that took over some of the complex math operations needed for functions such as ray tracing, shading, reflections, other lighting effects, and image rotation.
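Of the lighting effects listed, diffuse shading is the simplest to illustrate: a surface's brightness is the dot product of its unit normal with the unit direction toward the light, clamped at zero. A sketch of that Lambertian rule (an illustration of the math, not any vendor's actual pipeline):

```python
def lambert(normal, light_dir):
    """Diffuse (Lambertian) shading: intensity = max(0, n . l),
    with both vectors assumed unit-length."""
    dot = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, dot)

# A surface facing the light is fully lit; one facing away is dark:
# lambert((0, 0, 1), (0, 0, 1)) -> 1.0
# lambert((0, 0, 1), (0, 0, -1)) -> 0.0
```

Repeating this per pixel, per light, across millions of pixels per frame is exactly the kind of regular floating-point workload those DSP coprocessors absorbed.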
As the industry progressed and continually enhanced graphics capabilities, some of the latest graphics engines introduced by companies like ATI Ltd. (www.ati.com) and NVidia Inc. (www.nvidia.com) were actually more complex and required more transistors than the main CPU in the PC. Today's graphics engines let developers craft games and images that are so lifelike that they draw the player into the action.
Plenty of room remains for improvement as designers move from today's XGA (1024- by 768-pixel) images to 1280- by 1024-pixel and higher resolutions, combined with 24-bit true color, shading, textures, and many other image effects. To deliver these features, graphics-chip manufacturers will have to develop faster and more highly parallel engines to pump data to the display. Game manufacturers have followed the same trend, striving to create environments that immerse the player in the action.
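The resolution jump is easy to quantify: raw display bandwidth scales with pixel count, color depth, and refresh rate. A back-of-the-envelope sketch (the 75-Hz refresh rate is an assumed figure for illustration):

```python
def display_bandwidth(width, height, bits_per_pixel, refresh_hz):
    """Raw pixel data the graphics engine must pump per second, in MB/s
    (decimal megabytes), before any overdraw or texture traffic."""
    bytes_per_frame = width * height * bits_per_pixel / 8
    return bytes_per_frame * refresh_hz / 1_000_000

# Moving from 1024x768 to 1280x1024 at 24-bit color and 75 Hz:
xga = display_bandwidth(1024, 768, 24, 75)     # ~177 MB/s
sxga = display_bandwidth(1280, 1024, 24, 75)   # ~295 MB/s
```

And that is only the final scan-out; 3D rendering multiplies the figure with overdraw, texture fetches, and z-buffer traffic.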
Ever since the first bouncing white Pong ball moved across the screen, game developers have demanded that hardware platforms deliver lifelike real-time images. The latest game boxes from Sony, Microsoft, and Nintendo are perfect examples of this trend. Sony is developing its own custom processor, the Emotion Engine, and Microsoft is committing to a version of NVidia's GeForce graphics chip. Future systems will incorporate even more complex graphics processors that integrate well over 100 million transistors on one chip.
Improved imaging isn't limited to the computer screen. Advances in television technology have led to projection-TV systems with five-foot diagonal screens, large-area plasma and LCD direct-view monitors, high-definition television systems that provide movie-theater-quality images, digital versatile disc (DVD) technology that packs 6 or more Gbytes on a CD-size platter, and much more. Still-image photography has also advanced from analog film to high-resolution digital cameras that drop into a shirt pocket or are no bulkier than high-quality 35-mm cameras. Moreover, video capture has progressed from the analog VHS-tape days to the latest digital camcorders that store an hour or more of digitized video on a small tape cassette.
Making all of these systems possible are the ASICs needed to convert, process, compress, expand, and manipulate the data. Image sensors have progressed from simple photocells to multimegapixel imaging arrays that can capture close to 35-mm quality images. On the display side, CRTs have given way to liquid-crystal flat-panel displays in many applications. New digital light valve and micromirror technologies based on micromachined silicon promise brighter, higher-resolution images in future-generation display systems.