Flush with the prosperity of the late fifties, the United States entered the 1960s with more of a sense of manifest destiny than ever before. Having established itself as the world's leading superpower, the U.S. stood at the threshold of a period of economic expansion and global sociopolitical influence. Nothing buoyed that sense of destiny more than the nation's technological superiority. We had Ed Sullivan on our color TVs. We had a telephone network that actually worked. We had surfboards, Barbie dolls, little red Corvettes, and jukeboxes full of 45s. Life was pretty good.
Yet growing civil unrest, escalating political tension with the Eastern Bloc, and the specter of military conflict with Communism in the jungles of Southeast Asia posed withering challenges to the very fabric of society. The country soon found itself in turmoil as traditional values were challenged by a generation whose creed was to question authority and everything else its elders held sacred.
The tone for the decade was set by the youngest elected president in U.S. history. Recognizing that the Soviet Union's early lead in the conquest of space posed a serious threat to American technical and military superiority, John F. Kennedy wasted little time in seizing the technological high ground. His brash declaration that the U.S. would win the space race by landing men on the moon and returning them safely to Earth before the decade was out amounted to a bold challenge to America's engineering community.
When Neil Armstrong called his step onto the lunar surface "one giant leap for mankind," he wasn't wrong. But the electronics industry made its own giant leaps during the tumultuous sixties, ones that would lay the foundation for a worldwide revolution in mass communication and computing. Integrated circuits (ICs), digital logic, and improvements in linear devices combined to create a communications infrastructure that would result in Marshall McLuhan's "global village," a world made vastly smaller by the changes wrought through microelectronics.
As the 1950s were the decade of the discrete transistor, the sixties were dominated by the IC. After being brought to market in 1961 by Fairchild and Texas Instruments, ICs quickly became the backbone of a broad range of military systems, consumer products, communication gear, and just about everything else. Their proliferation was aided in no small part by a government eager to foster technology development by the private sector.
By mid-decade, the domination of ICs was so apparent, and the progress in integration so stunning, that Gordon Moore, who would go on to co-found Intel, was moved to boldly predict that the number of components on an IC would double every year. An oft-forgotten caveat to what became known as Moore's Law is that Moore himself only expected this rate of growth in IC complexity to hold out for 10 years. The caveat, it would turn out, was the only thing Moore got wrong.
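The power of that prediction lies in compounding, which a back-of-the-envelope sketch makes concrete (Python is used here purely for the arithmetic; the starting count of 64 components is an illustrative assumption, not a figure from Moore's paper):

```python
# Illustrative arithmetic only. Moore's 1965 article projected that the
# component count of ICs would keep doubling at a steady rate, with the
# hedge that the trend might hold for about 10 more years.
def projected_components(start_count, years, doubling_period=1.0):
    """Component count after `years` of doubling every `doubling_period` years."""
    return start_count * 2 ** (years / doubling_period)

# Hypothetical starting point of 64 components: ten annual doublings
# multiply the count by 2**10 = 1024.
print(projected_components(64, 10))  # 65536.0
```

Even at this toy scale, the exponential is the whole story: the doubling period matters far more than the starting count.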
Early IC families emphasized digital logic. A host of logic types sprang up from various semiconductor manufacturers, each seeking to establish market dominance. Resistor-transistor logic (RTL) had strong support, as did diode-transistor logic (DTL). Into the mix came transistor-transistor logic (TTL) as well as emitter-coupled logic (ECL). Each had its strengths and weaknesses, some being faster and others more tolerant of noise. The upshot was that competition was good for the industry and even better for designers, who reveled in the broad choices available to them as they fashioned increasingly complex and clever digital circuits.
A latecomer to the logic world was complementary metal-oxide semiconductor (CMOS) technology, introduced by RCA in 1968. Once some of the fabrication difficulties were overcome, CMOS devices delivered much lower power consumption. As a result, they helped pave the way for later generations of high-density memories as well as the first microprocessors.
The prevalence of digital logic called for a new wave in test equipment. First introduced with vacuum tubes in the late fifties, the function generator caught on when it was transistorized in the early sixties. Yet another advancement came with frequency synthesizers, which brought the accuracy demanded by the faster signal transitions of evolving ICs.
Long before America had the moon clearly in its sights, NASA's space program was a major contributor to, and partner of, the electronics industry's efforts to literally "go global" with its technology. Early efforts ushered in a new era in communications with the use of artificial satellites as relay stations. In 1960, an orbiting 100-ft sphere of aluminized Mylar plastic called Echo 1 facilitated the first transcontinental telephone call via satellite by bouncing a call from New Jersey to California.
Just a few months after John Glenn became the first American to orbit the Earth, another orbiting vehicle made history of its own. Launched on July 10, 1962, the Telstar communications satellite took its place in the firmament as the first privately owned satellite (AT&T financed its construction and launch). That very night, the orbiting satellite handled its first telephone call, television program, and photo facsimile transmission.
Although electronics were making their presence felt dramatically in the communications realm, it's interesting to note that the crews of manned spaceflights during the sixties had little or no aid from on-board computers. Glenn's Mercury spacecraft, for example, flew without an on-board computer of any kind. The two-man Gemini orbital missions carried computers with about 4 kwords of memory. The Apollo lunar flights carried guidance computers with roughly 36 kwords of fixed memory and a mere 2 kwords of erasable memory in both the command and lunar modules. This all stands as testimony to both the "Right Stuff" of NASA's astronauts and the engineering acumen of its technical staffs.
But while the space program's reliance on computing may have been slow to take hold, the developments in computing technology came fast and furious. Digital computers, born just over a decade earlier, would go through their awkward adolescent stages during the sixties. Inelegant inputs, displays, and packaging would later mature into sleek and sophisticated desktop machines.
In any important technology field, standards are a critical element in achieving success. A key early computing standard came in 1963, when the American Standards Association (renamed the American National Standards Institute in 1969) approved ASCII, a 7-bit code for information interchange. The first universal character code for computers, ASCII permitted machines from different manufacturers to exchange data.
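What the standard bought is easy to show: the same 7-bit numeric codes mean the same characters on any conforming machine. A minimal sketch in Python:

```python
# ASCII maps each character to a 7-bit code (0-127), identical on every
# conforming machine -- which is what let computers from different
# manufacturers exchange text.
message = "IBM 360"
codes = [ord(ch) for ch in message]        # character -> numeric code
print(codes)                                # [73, 66, 77, 32, 51, 54, 48]
assert all(c < 128 for c in codes)          # every code fits in 7 bits
restored = "".join(chr(c) for c in codes)   # code -> character, losslessly
print(restored)                             # IBM 360
```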
The arrival of ASCII set the stage for IBM's 1964 release of its System/360 computer family and the accompanying OS/360, the first mass-produced computer operating system. Under OS/360, every computer in the 360 family could run the same software. The System/360 also established the 8-bit byte as the de facto worldwide standard, one carried forward by later 16- and 32-bit machines.
IBM's System/360 helped establish two conventions that stand to this day: backward compatibility with earlier generations of computing equipment, and a flexible operating system that could support multiple applications.
While IBM was breaking new ground in the mainframe market, competitors had other goals in mind. The trend toward miniaturization and modularization that had its start with military programs in the fifties made its way into the computing world with Digital Equipment's 1965 launch of its PDP-8 minicomputer. Built from compact discrete-transistor logic modules (ICs would arrive in later models), the PDP-8 was roughly the size of a two-drawer filing cabinet, small enough for a desktop. It represented a primitive step toward distributed computing, as opposed to the centralized model based on mainframes. Eventually, minicomputers would find broad use in manufacturing, scientific labs, and offices.
Earlier minicomputers went even further in portending the future. A predecessor to the PDP-8 had already broken ground as the first commercial minicomputer with a monitor and keyboard input. Meanwhile, other innovators were thinking about alternate means of data input. The computer mouse, conceived by Douglas Engelbart at the Stanford Research Institute, was demonstrated publicly in 1968 as an "X-Y position indicator for a display system." Computer graphics were also being added to text-based display systems. By mid-decade, input and display systems had grown sophisticated enough to support primitive computer-aided design systems.
The sixties saw computer technology take the world by storm. Corporate users, universities, and government agencies made sure that computers proliferated by the thousands worldwide. As telephone communications had already done, it was time for computer communications to go global.
The Advanced Research Projects Agency (ARPA) of the U.S. Department of Defense laid the groundwork for a worldwide computer network. In 1962, J.C.R. Licklider, ARPA's head of computer research, conceptualized an ambitious "Intergalactic Network" in which every computer user on the globe would be interconnected and would have access to programs and data at any site from any other site. Packet switching, in which information is parceled into packets of data that are sent independently along a network, offered a promising model for such a communication system.
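A toy model of packet switching, not ARPAnet's actual protocol, fits in a few lines: the sender splits a message into numbered packets, the network may deliver them out of order, and the receiver reassembles them by sequence number:

```python
import random

def packetize(message, size=4):
    """Split a message into (sequence_number, chunk) packets."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets):
    """Reorder packets by sequence number and rejoin the message."""
    return "".join(chunk for _, chunk in sorted(packets))

packets = packetize("PACKETS TRAVEL INDEPENDENTLY")
random.shuffle(packets)      # packets may take different routes and arrive out of order
print(reassemble(packets))   # PACKETS TRAVEL INDEPENDENTLY
```

The sequence numbers are what free each packet to take its own route; real protocols add addressing, checksums, and retransmission on top of this skeleton.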
By the end of the decade, the "Intergalactic Network" had been redubbed ARPAnet. Links between the network's first four nodes at the University of California at Los Angeles (UCLA), the University of California at Santa Barbara, the Stanford Research Institute (SRI), and the University of Utah became operational in 1969. Originally conceived as a means for hard-core computer users to share resources and pass messages, the ARPAnet would eventually outgrow its humble beginnings, becoming what we know today as the Internet.
Although computing technology advanced greatly in the sixties, it would be another 20 years before computing made its way into the consumer realm. Meanwhile, the electronics age was impacting consumers' daily lives in other ways. By 1968, there were some 200 million television sets worldwide, forming a mass medium that made deities of pop musicians. That same mass medium served to polarize opinions of the Vietnam War. Millions of Americans tuned in nightly as the war was brought into their living rooms in full color, with the names of the dead scrolling down the screen.
Electronics was becoming pervasive in other ways, too. By 1967, Texas Instruments had demonstrated a prototype of the handheld, IC-based four-function calculator that would soon be at John Q. Public's disposal. Stereo FM broadcasts, authorized by the FCC in 1961, brought audiophiles a new way to enjoy music. Philips introduced the ubiquitous compact cassette in 1963, the same year touch-tone phones made the scene. It all added up to electronics invading the home, office, and every other aspect of daily life in a process that only accelerated with time.
A decade that had opened with such great promise for America closed in social and political turmoil. But for the electronics industry, the future was extremely bright. It found itself poised to build on what it had started with a new explosion of innovation and integration. Transistors begat digital logic and digital logic begat the beating heart, if not the soul, of the new machine: the microprocessor. The fun was just beginning.