1950s: Transistors Fill The Vacuum: The Digital Age Begins

Oct. 21, 2002
A decade of contradictions: infinite hope for the future coupled with fear of powerful enemies. That was the 1950s, a time of war and then of post-war prosperity. Rock and roll was evolving from rhythm and blues, soon to be heard blaring from transistor radios from Spokane to Baltimore. Crew-cut kids watched over TV dinners as tales of space travel and futuristic dreams flickered across the screens of RCA consoles. Gas was cheap, tailfins were large, and Americans were consummating their love affair with the open road. The future seemed limitless, and as the decade dawned, few even realized why. For as the 1940s drew to a close, a handful of engineers had made a breakthrough that ultimately would change the world.

The infant born in 1947 to Bardeen, Brattain, and Shockley—the point-contact transistor—came of age in the 1950s. It matured into Shockley's junction transistor, which found a home in countless military and consumer applications. The solid-state age had begun, pushing the electronics industry toward modern digital computers and communications. Transistors ran cooler and demanded far less power than the vacuum tubes they would begin replacing, making for smaller, faster, and more powerful electronics. Though initially costly to produce, transistors in the fifties began the trend the electronics industry has followed ever since: ever-lower cost coupled with greater functionality and integration. Transistor process technology was refined throughout the decade, culminating in the development of the first integrated circuit.

Necessity is indeed the mother of invention, and never was necessity greater than during the wars of the 1940s and early fifties. The massive efforts stemming from World War II, the Korean conflict, and the ensuing Cold War mobilized America's greatest scientific minds. The tense geopolitical faceoff between East and West thrust the electronics industry to the battle's front lines, where it fervently applied the new solid-state technology to increasingly sophisticated defense and weapons systems. Out of urgent military need came technological marvels that kept the United States at the forefront of science, chief among them the digital computer.

The digital computer advances of the 1950s laid the groundwork for the successful commercial mainframes and minicomputers that would emerge in the 1960s, and that would later evolve into the personal computers of the 1970s and 1980s. Digital computer technology that had begun as part of the war effort in the forties was refined and then marketed as a commercial product.

In 1957, J.M. Bridges, the Defense Department's Director of Electronics, told a group of computer manufacturers that digital computers were destined to replace less reliable analog machines in complex military weapons systems of the future. A hallmark of the development effort, and a sure means of achieving reliability, would be to manufacture standard, modular digital functional blocks that could be combined with little or no change to build complex computing systems.

Indeed, even as the Korean War began at the outset of the decade, the concept of the plug-in circuit module had taken form and begun to speed the production of electronic equipment. It permitted circuits to be assembled in different areas and then simply connected together. Modularization, combined with a growing reliance on printed circuits, also brought enhanced reliability, repeatability, and easier servicing when things did go awry.

The earliest digital computers, beginning in the forties and continuing into the late 1950s, were based on vacuum tubes. They were unreliable and difficult to program, used lots of power, required very large rooms, and were constantly in need of maintenance. Storing information was difficult, and the machines could only solve one problem at a time.

A breakthrough in digital computing came in 1951, when the Eckert-Mauchly Computer Corporation of Philadelphia sold the first commercial computer, the UNIVAC I, to the U.S. Census Bureau. The massive machine retrieved data from memory by transmitting sonic pulses through tubes of mercury. An additional 45 UNIVAC I machines would eventually be sold.
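How such serial storage behaved is easy to model. The sketch below is a purely illustrative Python toy (the class name, bit pattern, and unit of time are invented for this example, not drawn from UNIVAC documentation) that treats a delay line as a recirculating queue: a bit can be read only when its pulse emerges at the receiving end, so access time depends on where the data happens to sit in the line.

```python
from collections import deque

class DelayLineMemory:
    """Toy model of a mercury acoustic delay line: bits circulate as
    sonic pulses and are re-injected at the transmitter after each
    pass, so reading a bit means waiting for it to come around."""

    def __init__(self, bits):
        self.line = deque(bits)  # pulses currently "in the mercury"

    def tick(self):
        """One pulse reaches the receiver and is recirculated."""
        bit = self.line.popleft()
        self.line.append(bit)
        return bit

    def read(self, position):
        """Wait until the requested bit emerges; the wait (in ticks)
        equals the bit's distance down the line -- the serial-access
        penalty that random-access memories would later eliminate."""
        for _ in range(position):
            self.tick()
        bit = self.line[0]
        for _ in range(len(self.line) - position):
            self.tick()  # realign the line to its original state
        return bit, position

mem = DelayLineMemory([1, 0, 1, 1, 0, 0, 1, 0])
print(mem.read(5))  # -> (0, 5): five ticks of waiting before the bit arrives
```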

Other computer advancements came in memory. For example, ferrite-core memories, developed at the Massachusetts Institute of Technology, gave computers true random-access memory (RAM), making the retrieval of information quick and efficient. And IBM's RAMAC (Random Access Method of Accounting and Control), introduced in 1957, was the first data-processing system to store digital data on record-like discs; each disc held about 100,000 characters and could be accessed randomly.
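The practical difference from the delay line is easy to see in miniature. In the sketch below (the addresses, record names, and disc/track/sector scheme are invented for illustration, not taken from RAMAC documentation), fetching from random-access storage costs the same no matter where the data lives:

```python
# Core memory: fetching a word costs one step regardless of its address,
# unlike the delay line above, where the wait grows with position.
core = [1, 0, 1, 1, 0, 0, 1, 0]   # toy model of one row of a core plane
print(core[5])                    # -> 0, with no waiting for circulation

# RAMAC-style disc addressing in miniature: any record is reachable by
# its (disc, track, sector) address without scanning the records before it.
ramac = {(0, 3, 7): "ACCT-00431", (24, 11, 2): "ACCT-18265"}
print(ramac[(24, 11, 2)])         # -> "ACCT-18265"
```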

Although development of the transistor gave electronic design engineers an important building block for the future, it also spurred development of the tools and infrastructure the industry would require for growth. The 1950s saw a broad array of test and measurement equipment emerge as engineers clamored for ways to quantify the performance of their circuits. As the need for improved test equipment grew, advances in cathode-ray tube (CRT) technology were essential for the equipment to evolve. Improved CRTs were put to good use both in test gear and in television sets.

The 1953 development of the digital voltmeter (DVM) by Non-Linear Systems was a key innovation in test equipment. Previously built into analog computers rather than offered as a standalone instrument, the DVM improved the accuracy of typical laboratory measurements by more than an order of magnitude over the vacuum-tube analog voltmeters that had preceded it.

The industry also began looking ahead to how it would produce components in the future. To that end, Bell Labs developed an experimental transistor-making robot that presaged later advances in automated semiconductor manufacturing. The machine, dubbed "Mr. Meticulous," accurately positioned and welded gold collector and emitter wires to their respective device layers and then automatically tested the finished devices.


Automated semiconductor production was pivotal as the technology infiltrated more than military and computer applications. Space represented a new frontier to be conquered, and there was great political and ideological capital at stake for the victor. In 1957, the Soviet Union startled the world when it launched Sputnik, the first orbital satellite. The United States was caught short with its space program barely off the ground. But the U.S. made a comeback, launching its own first orbital satellite, Explorer I, in 1958. The space race was on, fueled by an explosion in engineering ingenuity from the electronics world. More ambitious probing of the moon and solar system quickly followed the early orbital satellites. The combined efforts of many companies to make the necessary equipment lighter, more precise, and more efficient made these probes possible.

Meanwhile, the newborn that was solid-state electronics was making its presence felt on terra firma in a big way. The transistor radio, introduced in 1954, became the fastest-selling consumer product of its time. By 1955, car-crazed Americans, fueled by post-war prosperity and cheap gas, were more than ready for a little driving music. RCA's Sarnoff Research Center answered the call with an experimental nine-transistor car radio. Transistor radios came onto the market in droves, giving America its first taste of its own future: electronics everywhere, packaged to travel and ready for use on the street, in the car, or at home.

One of the earliest commercial applications of junction transistors was in hearing aids. As early as January 1953, NPN transistors saw use in a hearing-aid circuit that ran for six months on a tiny B battery. Tellingly, it foretold the advances to come in transistorized circuits: it was 25% to 30% smaller than earlier models, with twice the output power.

Through the 1940s, radio was king and the theater of the mind still ruled in post-war America. But television's golden age was just beginning in the early fifties and again, electronics led the way. RCA's experimental solid-state (except for the picture tube) TV receiver of 1952 used 37 semiconductor devices, had a five-inch screen, and weighed about 27 pounds. An inauspicious start, perhaps. But as with everything else the electronics industry touched in ensuing decades, the challenge of miniaturizing and improving television sets was ultimately met.

Throughout the industry's history, many technologies found their genesis in the military before trickling down into consumer products. Modularization and miniaturization were examples of this trend. In the beginning, miniaturization was driven by the Soviets' success with the Sputnik program and was a major objective of government-funded electronics research programs. But through work like RCA's solid-state TV, the drive to miniaturize quickly made its mark on consumer products.

As always, the basic building block of the transistor formed the underpinnings for all of the advances in consumer goods. For most of the 1950s, germanium was used to make transistors. But after Texas Instruments introduced the first silicon transistors in 1954, silicon began to replace germanium as a semiconductor material, thereby extending operating temperatures to military ranges.

As the decade wound down, the pieces were in place for the crucial next step along the path of integration. Engineers at both Fairchild Semiconductor and Texas Instruments sought to produce a single substrate of silicon carrying not only transistors and diodes, but also resistors and capacitors—and then join all the components to form a complete circuit.

In 1958, Jack Kilby of Texas Instruments built just such a device: the first monolithic integrated circuit (IC), a phase-shift oscillator constructed on a single bar of germanium. Robert Noyce of Fairchild Semiconductor independently arrived at a monolithic silicon version early the following year. Because the components were formed in one piece of semiconductor, the electrical path ran through the material itself rather than through separate interconnections. TI was also the first company to announce a product line of ICs.
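The circuit Kilby chose for his demonstration has a well-known textbook form: a three-section RC ladder feeding an inverting amplifier oscillates at the frequency where the ladder contributes 180° of phase shift, f = 1/(2πRC√6). The short sketch below simply evaluates that formula; the component values are illustrative, not Kilby's actual parts.

```python
import math

def phase_shift_oscillator_freq(r_ohms, c_farads):
    """Oscillation frequency of the classic three-section RC phase-shift
    oscillator: f = 1 / (2 * pi * R * C * sqrt(6)).  At this frequency
    the RC ladder supplies the 180 degrees of phase shift that, added to
    the inverting amplifier's 180 degrees, sustains oscillation (the
    amplifier needs a gain of at least 29 to overcome ladder losses)."""
    return 1.0 / (2 * math.pi * r_ohms * c_farads * math.sqrt(6))

# Illustrative values only (not Kilby's components): R = 10 kΩ, C = 10 nF
print(f"{phase_shift_oscillator_freq(10e3, 10e-9):.0f} Hz")  # ≈ 650 Hz
```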

By the end of the decade, the transistor had rightfully earned its place in the forefront of technology and, in fact, had begun to fulfill its promise to take the industry much further. Government got into the act in 1958, when President Dwight D. Eisenhower created the Advanced Research Projects Agency (ARPA) to keep the U.S. at the forefront of technology. ARPA would, in the next decade, plant the seeds for one of the greatest technology advances of the century—the Internet.

A decade of contradictions, social and political, had by its close coalesced into one of great technological advance. The 1950s set the stage for the computer breakthroughs that would follow in the sixties and seventies. And it marked the beginning of rapid changes in electronics, changes that would profoundly affect the way Americans lived their everyday lives.

