Ever stand next to a speeding train as it roars past? Its sound and power envelop you, pull you in, and make it difficult for you to think of anything else while it's thundering by. For some, it's a heady, exciting experience. And like that train gathering speed, ubiquitous connectivity thundered into our lives in the nineties. It started quietly, with relatively little fanfare, but by the time the decade was out, the roar was deafening. We were all connected, awash in data. We had become addicted to technology and wanted to be enveloped by it, as by the sound and fury of the speeding train.
So we pushed technology everywhere: the home, the office, the car, and the supermarket. It even invaded the comic strips with Dilbert. Engineers may still have been geeky, but boy, were they churning out good stuff. The nineties was a time when "Intel Inside" became a powerful brand. Internet companies advertised on TV in prime time. We had come to depend on the Internet, e-mail, PDAs, cell phones, and PCs. In fact, it was hard to remember life without them.
Technology provided the background music—digitally, of course—for a decade of strength, growth, and dominance for the U.S. As the nineties began, technology captured the news yet again. Who can forget the images from the cameras borne by cruise missiles, gliding silently, yet unerringly, into their targets during the Gulf War? The nation stood as the world's only superpower, having vanquished a distant enemy almost by remote control. Meanwhile, back at home, the economy was revving up its engines for the longest bull market in history. And what drove that bull market to frenzied heights? Technology.
During a time when one of the biggest decisions was whether to buy the kids a Nintendo or Sega video game console, one thing was certain: You could do your research on the World Wide Web. Developed by Tim Berners-Lee at CERN, the European particle-physics laboratory, the Web took off rapidly and quickly assumed Next Big Thing status. It ran atop the Internet, which had grown out of ARPAnet (itself decommissioned in 1990). That year, Berners-Lee put together a prototype for the Web based on his work on URLs, HTML, and HTTP.
Before long, commercial Internet concerns like Yahoo began sprouting up, driving popular interest in the Web and giving people a reason to go online other than to use e-mail. The Web also got a big push from government; even the White House went online. By 1992, more than 1 million hosts existed. Web traffic exploded by over 341,000% in 1993, and from there, the sky was the limit.
By melding computers and telephony, the Internet made its potential to reshape daily life apparent almost immediately. The phrase "information superhighway" entered the lexicon. So did the term "convergence," as technologists began envisioning a massive network that would connect not just PCs and servers, but virtually anything and everything that conveyed text, images, sound, and video.
The network and its utility grew fast. The Gopher document retrieval system, introduced in 1991 by the University of Minnesota, showed how the Internet could function as an on-demand information repository. By 1992, audio and video multicasts were being distributed over the Internet. In 1995, RealAudio technology let "Netizens" listen to audio in near-real time, and the first 24-hour, Internet-only radio station, Radio HK, got its start that same year.
An important underlying technology got off the ground in 1993, when students and staff at the University of Illinois' National Center for Supercomputing Applications put together a graphical user interface for Internet navigation. Called NCSA Mosaic, it was the first Web browser to achieve wide use, and its lead developers went on to create Netscape Navigator.
With a means to get around the vast network in place, the Internet became big business. Hundreds of Internet service providers began competing for users, who numbered over 150 million by the end of the decade. The possibilities for convergence seemed endless, as media giants, pure technology players, and infrastructure builders sought to leverage the three-pronged combination of the microprocessor in the computer, the broadband content delivery capability of television, and the global, networked, two-way interconnections of telephony. Everything could be digitized and delivered on demand: music, movies, television, and more.
To make the dream of ubiquitous network connectivity real, we needed technologies to facilitate extremely high-speed delivery of enormous amounts of data. Work on such broadband connections to the network backbone centered initially on asymmetrical digital subscriber line (ADSL) technology, which uses the ordinary twisted-pair wiring already in place for telephone service to homes and businesses to carry video signals and high-speed data, Internet traffic, and many other interactive services.
A pitched battle ensued between the telephone companies, which wanted to bring digital services to homes and businesses over their own copper wires using ADSL, and the nation's cable companies, which likewise wanted to deliver the Internet, video-on-demand, and other services over their own networks. The struggle for dominance continued throughout the latter part of the nineties and remains unresolved today.
The Internet had applications beyond entertainment and e-commerce. Embedded systems began using the Internet for their own communication purposes. For example, during the nineties, it became possible for utility companies to remotely read power, gas, and water meters using Internet links. Vending machines could automatically inform operators of their inventory status. Concepts for Internet-enabled appliances abounded. They ranged from the useful, like refrigerators that could automatically inventory food and send grocery orders to the market, to the silly, such as the oft-cited "Internet toaster." The bottom line was that the Internet was working its way into the fabric of everyday life.
Electronic gadgetry became so pervasive in the nineties that we came to depend on it no matter where we were. Growing up in parallel with the Internet infrastructure was a far-reaching wireless infrastructure that enabled us to "go mobile" in every way possible. Digital cellular telephone service was launched in the U.S. in 1992, and an epidemic of poor driving followed as cell phones proliferated.
PCs themselves became so ingrained in our world that the information technology profession began looking to a "post-PC era." Thanks to continued improvements in microprocessor power, starting in 1993 with Intel's Pentium and its 3.1 million transistors, PCs had become incredibly powerful. Gains in battery technology, power management techniques, and display technology brought a procession of smaller, lighter portables.
But despite the fact that PCs could—and did—go with us just about everywhere, the search for alternatives that would launch that "post-PC era" was on. PDAs sprouted from jacket pockets, some of which began offering wireless links to e-mail and the Web. As time went by, some PDAs looked more like phones, while phones often crossed over to PDA-like functionality.
Throughout the decade, portable devices gained more functionality while simultaneously shrinking and benefiting from longer battery life. Power-management ICs advanced through myriad variations on pulse-width modulation (PWM) in switching regulators, complemented by low-dropout linear regulators; together, they contributed greatly to the runtime of portables.
Cellular telephones likewise benefited from improvements in MOSFET efficiency and gain as a new generation of power transistors took over RF applications. Equally critical in the development of cellular handsets and other portable systems were improvements in packaging technology, with ball-grid array and chip-scale IC packages coming into prominence.
Alongside the evolution of microprocessors during the nineties, other highly integrated ICs evolved as well. Thanks to the development of logic synthesis technology, the ASIC business model developed in which circuits could be built for highly specific applications. By 1993, IBM had moved past the million-gate chip level in density. Three years later, gallium-arsenide (GaAs) gate arrays hit the 100-kgate mark.
On the programmable logic front, progress also came quickly. Xilinx had invented and pioneered FPGA technology in the mid-eighties. Then, Actel launched antifuse technology in 1991. Programmable devices soon became an important means of prototyping logic circuits before committing them to the long and costly design cycle associated with ASICs. FPGAs with dedicated blocks of RAM appeared in 1995, while embedded processors made their debut on programmable devices in 1997. By the end of the nineties, FPGAs had reached mega-gate densities.
As deep-submicron geometries shrank to 0.5 microns and smaller, silicon integration enabled designers to begin working toward entire systems—processors, cache memories, and all associated peripherals and interface logic—on a single chip. The system-on-a-chip (SoC) concept took hold, particularly with the rise of fabless semiconductor vendors such as ARM, which developed embedded processors that were marketed as intellectual property rather than as fabricated silicon devices. Embedded DSP cores debuted as well. Underlying the development of SoCs was an improving design automation infrastructure, which became increasingly capable of pulling together diverse IP blocks and verifying that they functioned as they should.
Analog circuitry enjoyed gains during the nineties too. Delta-sigma analog-to-digital converters rose to prominence among conversion technologies for digital audio applications. Digital-to-analog conversion rose to the challenge of cellular basestations, PC graphics, set-top boxes, and other circuits requiring high accuracy.
Memory technology came along for the ride. DRAM densities ballooned from 64 Mbits to 256 Mbits. At the same time, prices fell sharply, letting system integrators pack in more memory at minimal cost. Storage technology also evolved, with compact discs becoming first writable and then rewritable.
By its close, the nineties had shaped up as a decade of digitization. The music industry had fully embraced digital technology, both for creation and distribution of content. Digital cameras, both still and video, were all the rage. Phones had gone digital, as had some television broadcasting.
As the new millennium approached, however, a uniquely digital problem cropped up. Back in 1968, an obscure federal standard had codified a six-digit date format (YYMMDD) for information interchange, sowing the seeds of the Y2K crisis. Millions held their breath on New Year's Eve as 1999 rolled over into 2000, wondering if all the traffic lights would stay stuck on red because the computers that controlled them had crashed.
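The flaw is easy to reproduce. With only two digits for the year, "00" reads as earlier than "99," so any date arithmetic spanning the century boundary comes out wrong. A minimal sketch in Python (the function names here are illustrative, not from any particular remediation tool) shows both the bug and the "windowing" fix many remediation projects adopted:

```python
# A minimal sketch of the Y2K bug: years stored as two digits, per the
# old YYMMDD convention, compare incorrectly once the century rolls over.

def years_elapsed_2digit(start_yy: int, end_yy: int) -> int:
    """Naive difference of two-digit years, as much legacy code computed it."""
    return end_yy - start_yy

# A record written in 1999 ("99") and checked in 2000 ("00"):
print(years_elapsed_2digit(99, 0))   # -99 instead of 1

# One common fix was "windowing": mapping two-digit years onto a pivot.
# Here, 00-49 is interpreted as the 2000s and 50-99 as the 1900s.
def window_year(yy: int, pivot: int = 50) -> int:
    return 2000 + yy if yy < pivot else 1900 + yy

print(window_year(0) - window_year(99))  # 1, as expected
```

Windowing only postponed the ambiguity, of course; the durable fix was storing four-digit years.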
But they didn't. The dike held, the flood was turned back, and the technology proved more robust than the doubters thought. The digital revolution was complete and had its vindication to boot.