1980s: And The Walls Came Tumbling Down

Oct. 21, 2002

It's difficult to think of the eighties as anything but a time of mind-bending change. As the seventies ended, America was in the grip of a frustrating standoff with Islamic militants holding U.S. citizens hostage, underscoring the nation's vulnerability to hostility from without. As if the oil embargoes of the seventies hadn't already made that point, the Iran hostage crisis made for an edgy beginning to a decade whose hallmark would be sweeping social, political, and technological change.

With Ronald Reagan ascending to the presidency, an era of massive defense spending began. Pledging to make America stronger than ever, Reagan embarked on an ambitious defense agenda that included the Strategic Defense Initiative, a futuristic space-based shield against intercontinental ballistic missiles quickly dubbed "Star Wars." Eager to reclaim America's dominance in the peaceful exploration of space as well, NASA regained the spotlight as the first reusable spacecraft, the space shuttle, took flight in 1981.

While America was strengthening itself, its rivals were coming apart at the seams. When President Reagan challenged Soviet leader Mikhail Gorbachev to "tear down this wall," it signaled the beginning of the end of decades of conflict, hot and cold, with a rival whose social and political system had failed. A new era of cooperation began as America took on the responsibility of ushering the former Soviet Union into the painful, but necessary, process of becoming a free and open society.

But in the eighties, the Berlin Wall wasn't the only barrier to be demolished. Others fell not with the shouts of rebellion and the sound of crumbling brick and concrete, but with the beeps and boops of computer modems reaching out to each other across telephone lines. The eighties were the decade of the personal computer, when computing conquered the world finally and completely. If there were any doubts, Time magazine's naming of the computer as its "Machine of the Year" for 1982 shattered them utterly.

The PC had begun as a niche product, aimed at hobbyists and others interested in playing with a new toy that hadn't yet found much in the way of practical application. But it was just a matter of time before the giants of the well-established mainframe computing industry noticed its potential.

IBM had dabbled with personal computing as early as 1975, but in 1980, Big Blue turned its attention to the matter more fully. A top-secret development effort code-named "Acorn" produced the open-architecture IBM PC. Based on Intel's 8088 microprocessor, the machine sported 16 kbytes of RAM and one or two 5.25-in. floppy disk drives. Users could choose between two operating systems: CP/M-86 or IBM PC-DOS, which had been developed by Bill Gates' and Paul Allen's fledgling Microsoft.

The open architecture was the cornerstone of a well-thought-out marketing and development program. Before the machine's launch in 1981, IBM had been canny enough to offer prototypes to would-be developers of software and peripherals. By the time the PC hit the street, software and add-ons were ready and waiting. And if business users had been skeptical of desktop computing in the past, IBM's backing was all the validation they needed.

Before long, "clones" arrived on the market, and soon more than 100 companies were plying the PC waters. Among them was Osborne Computer Corporation, with the first "luggable" portable computer. IBM's most formidable rival, though, was Apple. By 1980, Apple had already captured 50% of the PC market and had the advantage of a wealth of available software for the Apple II.

Having a PC on one's desk was great. Still, it took a peripheral device not only to cement the PC's utility, but also to begin the marriage of computing and telephony. The modulator/demodulator, or modem, had been invented in 1960 at AT&T's Bell Labs as a means for mainframe and minicomputers to transmit data back and forth. Reintroduced commercially for the PC in 1981 by Hayes, modems were the vital link that allowed PC owners to dial into online services like CompuServe and The Source. Though painfully slow at 300 baud, they gave PCs the means to communicate with each other and started the world's movement toward what William Gibson would dub "cyberspace" in his 1984 novel, Neuromancer.

Indeed, by 1982, the underpinnings of the Internet were in place when TCP/IP (Transmission Control Protocol/Internet Protocol) was adopted as the standard for ARPAnet; the network completed its cutover to the new protocols on January 1, 1983. By 1987, the number of network hosts would exceed 10,000. Just two years later, in 1989, it topped 100,000.

The rapid growth of the PC industry called for equally rapid development of enhanced processing power. Faster CPUs, larger memories, and improved disk storage fueled new generations of machines. IBM's PC XT raised the bar in 1983 by providing a 10-Mbyte hard drive, three more expansion slots, 128 kbytes of RAM, and a 360-kbyte floppy drive. Apple answered with its Lisa, a $10,000 machine that fell on its face.

Yet Lisa pointed computing in a direction that would stick. It was notable for a graphical user interface (GUI) that saved users from the dreaded "C:" prompt of a DOS command line.

Apple's next shot across IBM's bow would be more dramatic. Launched with a famous television commercial aired during the 1984 Super Bowl, the Macintosh marched personal computing into the future. It broke with the PC in that it was based not on an Intel processor but on Motorola's 8-MHz 68000, a 32-bit architecture with a 16-bit data bus. Not only that, the machine ran a proprietary operating system that was incompatible with the IBM PC's DOS.

The Macintosh was an instant success. It made computing more intuitive and user-friendly. The IBM PC camp was forced to respond, and the response came from Microsoft in the form of Windows. It took some time for the Windows concept to catch fire with PC users, but eventually most abandoned the DOS prompt for the icons and pull-down menus of Windows.

Microprocessors, microcontrollers, and memories grew faster, larger, and more powerful throughout the eighties. In 1984 alone, 256-kbit and 1-Mbit DRAMs arrived to satisfy the insatiable demand for memory. Motorola's 68020, a fully 32-bit CPU, pushed clock rates to 16 MHz and beyond. Intel's 16-bit 80286 anchored IBM's PC AT, expanding the desktop machine's capabilities. The 80386 arrived in the fall of 1985, bringing with it 32-bit processing and on-chip memory management. Finally, the 1.2 million-transistor 80486 was introduced in April of 1989.

In other corners of the semiconductor industry, advances were coming rapidly as well. Single-chip digital signal processors (DSPs) arrived in the late seventies and improved dramatically in the eighties, working their way into communications and process control applications. Fixed-point programmable DSPs were followed by floating-point types, enhancing their scope of applications.

DSPs filled an important niche, addressing many applications for which general-purpose microprocessors weren't well suited. For other applications, particularly in the consumer arena, analog and mixed-signal circuitry was required. The eighties saw a great deal of development in digital-to-analog and analog-to-digital conversion technology. Mixed-signal technology proved central to digital audio equipment. Compact disc players went portable by 1984, as did many other consumer products. Miniaturized TV sets with color LCD screens, digital audio tape recorders, and high-fidelity stereo VCRs all appeared around mid-decade. By 1988, sales of compact discs had topped those of long-playing phonograph records. Despite the protests of analog-loving audiophiles, there was no denying that the future of audio was digital.

The continued drive toward portability and miniaturization of electronic systems, which by the latter part of the decade included laptop PCs, depended on improvements in battery technology and power management. Operating voltages for commodity logic circuits began dropping, first from 5 V to 3.3 V and then, by the end of the eighties, to 2.5 V. For a given power level, lower supply voltages meant proportionally higher currents, and new challenges for designers of power-management circuitry. Nickel-cadmium batteries became the option of choice for portable consumer electronics.
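
A rough illustration of the arithmetic, assuming a fixed power budget rather than any particular device:

$$I = \frac{P}{V} \quad\Rightarrow\quad \frac{I_{3.3\,\mathrm{V}}}{I_{5\,\mathrm{V}}} = \frac{5}{3.3} \approx 1.5$$

In other words, delivering the same power at 3.3 V pushes roughly 50% more current through regulators, connectors, and board traces than it does at 5 V.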

Design requirements were growing in complexity, and the design infrastructure was again finding itself stressed. IC designs were so large and unwieldy that despite the gains in computer and graphic display technology, designers were unable in many cases to see the "big picture" as they pieced together massive circuits. Design work had to be automated to achieve the productivity that shorter design cycles demanded.

A big step came around 1982, when workstations from Apollo were adapted for use in turnkey CAD systems by startups like Mentor Graphics and Daisy. But a turning point was reached in 1984 with the creation of the Verilog hardware description language (HDL). HDLs like Verilog and VHDL represented a higher level of abstraction than gates, allowing designers to see more of a design on screen at once and restoring some of the "big picture" that had been lost at the gate level. The move up to register-transfer level, though, left a gap: how would designs created at this higher level of abstraction be translated back down to gates?

The answer was found in logic synthesis, a technology pioneered in 1986 by Synopsys. Logic synthesis, though inefficient at first, was a new paradigm in large-scale chip design. It enabled IC designers to think bigger than ever and spawned tremendous growth in what is now known as the electronic design automation (EDA) industry. Later, logic synthesis was accompanied by advances in place-and-route technology as well as design verification.
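
To make the idea concrete, here is a minimal sketch, written in Python rather than an HDL, of what logic synthesis does at its core: flattening a register-transfer-level Boolean expression into a netlist of primitive gates. The function and net names here are hypothetical, and the sketch bears no relation to any commercial tool's algorithms.

```python
# Toy illustration of logic synthesis: flatten a Boolean expression
# such as "(a & b) | ~c" into a gate-level netlist, naming each
# intermediate net. Hypothetical example code, not any vendor's tool.
import ast
import itertools

# Map Python's bitwise operators onto gate primitives for this toy.
GATE_FOR_OP = {ast.BitAnd: "AND", ast.BitOr: "OR", ast.BitXor: "XOR"}

def synthesize(expr: str) -> list[tuple[str, str, list[str]]]:
    """Return a netlist as a list of (gate_type, output_net, input_nets)."""
    counter = itertools.count()
    netlist: list[tuple[str, str, list[str]]] = []

    def emit(gate: str, inputs: list[str]) -> str:
        net = f"n{next(counter)}"          # fresh name for an intermediate net
        netlist.append((gate, net, inputs))
        return net

    def walk(node: ast.expr) -> str:
        if isinstance(node, ast.Name):     # primary input, already a net
            return node.id
        if isinstance(node, ast.UnaryOp) and isinstance(node.op, ast.Invert):
            return emit("NOT", [walk(node.operand)])
        if isinstance(node, ast.BinOp) and type(node.op) in GATE_FOR_OP:
            return emit(GATE_FOR_OP[type(node.op)],
                        [walk(node.left), walk(node.right)])
        raise ValueError(f"unsupported construct: {ast.dump(node)}")

    walk(ast.parse(expr, mode="eval").body)
    return netlist

if __name__ == "__main__":
    for gate, out, ins in synthesize("(a & b) | ~c"):
        print(f"{gate:4} {out} <- {', '.join(ins)}")
    # Expected output:
    #   AND  n0 <- a, b
    #   NOT  n1 <- c
    #   OR   n2 <- n0, n1
```

Real synthesis tools go far beyond this simple translation, optimizing the resulting network for area and timing and mapping it onto the cells of a target library.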

The widespread acceptance of personal computing in the eighties, along with the rapid growth of online activity, set communications technology on a course of change. Networking, fueled by the proliferation of Ethernet local-area networks, brought huge improvements in productivity to offices and manufacturing facilities. Now that computers were everywhere, they needed to communicate efficiently, and networks were expanding at a phenomenal rate.

Seeds for even greater change were sown in 1987 when Bellcore introduced the concept of asymmetric digital subscriber line (ADSL), which brought with it the potential for multimedia transmission over the nation's copper loops. Fiber optics also came more heavily into play as the first transatlantic fiber link was completed in 1988. The foundations for the World Wide Web had been laid, and not a moment too soon.

A revolution in communications was under way, but that revolution would have a second front. The emergence of broadband technology would feed the ravenous appetite for bandwidth that the Web would soon develop. Before long, an ever-restless society would tire of being tethered to the desktop. This time, the revolution would not only be televised but also beamed through wireless links to cell phones, personal digital assistants, and other portables. Computers, telephones, and televisions joined together in a symphony that brought down the remaining walls and left us at the doorstep of a mobile future.
