Lightning Strikes Twice: First ENIAC and Soon the Personal Computer
What you'll learn:
- AI buildouts have claimed the supply of silicon memory and magnetic storage for the next two or three years.
- Severe shortages of critical components will disrupt a wide swath of products, from cellphones to TVs to PCs to automobiles, driving bankruptcies and pivots toward AI for many legacy OEMs.
- ENIAC, the first programmable, general-purpose electronic digital computer, was turned on 80 years ago and ran until lightning hit it nearly a decade later.
First they bought the RAM
Then they bought up the hard drives
Hype does not compute
There’s a place in the heart of every hardcore EE for early computers, and none embodies this more than ENIAC, the first programmable, general-purpose electronic digital computer, which just turned 80 this past week.
In 1941, John Mauchly, head of physics at Ursinus College, presented a paper suggesting an electronic computer to accomplish that feat [weather forecasting]. A year later, he joined the [University of Pennsylvania’s] Moore School of Electrical Engineering and, with electrical engineer Presper Eckert, drafted a proposal for ENIAC [Electronic Numerical Integrator and Computer].
To find funding [$400,000... $6.7M in 2026 dollars], Mauchly had to shift gears, switching from weather forecasting to ballistics to gain Army backing. Once the project was underway, Eckert, 24, became ENIAC’s head engineer. Mauchly was an idea-generator and booster. — Computer History Museum
Calculations that previously took 12 hours would be completed in 30 seconds with ENIAC.
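That works out to a speedup of roughly 1,400×: 12 hours is 43,200 seconds, and 43,200/30 = 1,440.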
With more than 17,000 vacuum tubes, 70,000 resistors, 10,000 capacitors, 6,000 switches, and 1,500 relays, it was easily the most complex electronic system theretofore built. ENIAC ran continuously (in part to extend tube life), generating 174 kilowatts of heat and thus requiring its own air conditioning system. It could execute up to 5,000 additions per second, several orders of magnitude faster than its electromechanical predecessors. It and subsequent computers employing vacuum tubes are known as first-generation computers. (With 1,500 mechanical relays, ENIAC was still transitional to later, fully electronic computers.) — Britannica
ENIAC had 20 accumulators, each capable of storing a 10-digit decimal number, and it performed addition, subtraction, multiplication, and division through dedicated circuits. The accumulators were the full extent of its memory resources; input and output went through relay-driven punched-card equipment, and datapaths were defined by switches, plugboards, and cables. It was used for ballistics, weather prediction, atomic-energy calculations, and even early simulations of nuclear fission for Los Alamos in 1948, considered one of the first modern computer programs.
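To get a feel for what one of those accumulators did, here's a minimal Python sketch of a ten-digit decimal register with add and subtract. The class name and the simple modulo wraparound are illustrative assumptions on my part, not a faithful emulation of ENIAC's decade ring counters or its handling of signed numbers.

```python
# Hypothetical sketch of an ENIAC-style accumulator: a ten-digit decimal
# register with add/subtract. Overflow simply wraps modulo 10^10 here;
# the real machine used decade ring counters and complement arithmetic.

class DecimalAccumulator:
    LIMIT = 10**10  # ten decimal digits, as in each of ENIAC's 20 accumulators

    def __init__(self, value=0):
        self.value = value % self.LIMIT

    def add(self, n):
        """Add n into the register, keeping only the low ten digits."""
        self.value = (self.value + n) % self.LIMIT
        return self.value

    def subtract(self, n):
        """Subtract n from the register, wrapping modulo 10^10."""
        self.value = (self.value - n) % self.LIMIT
        return self.value


# Chain a few operations the way a plugboard-wired program would sequence them.
acc = DecimalAccumulator()
acc.add(9_999_999_999)
acc.add(1)          # wraps: only ten digits are retained
print(acc.value)    # -> 0
```

Twenty such registers, wired together through the plugboards, were the whole of ENIAC's read/write storage.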
ENIAC ran nearly continuously for almost a decade until it was struck by lightning on October 2, 1955; its power was shut down just before midnight that evening.
ENIAC Begets...
The creators of ENIAC went on to produce the UNIVAC I, the first commercially produced computer in the U.S., delivered in 1951 and used for census and business applications.
IBM’s 701, the first commercially successful scientific computer, was designed for large-scale calculations in defense and research applications. It marked IBM’s major entry into electronic computing in 1952 (yes, all that “nothing but slide rules” aerospace-calcs stuff for America’s moon shot was total nonsense).
IBM’s first mass-produced business computer, the 1401 of 1959, had transistor-based circuits and magnetic-core memory, revolutionizing data processing for corporations. Digital Equipment Corporation (DEC) produced the first minicomputer, the PDP-8, in 1965, followed by the PDP-11 in 1970, a 16-bit architecture that became an early home for UNIX.
The DEC machines were known as “department” machines. We used them at Nortel in the early 1980s for circuit board layout. Mainframes like the IBM 360 and 370 were the workhorses on university campuses back when we stood in line at the punch-card readers with our Fortran card decks in school.
In the late 1980s, we used an IBM 390 “vector” mainframe at Nortel for circuit board layout, autoplacing and autorouting a 26-layer board overnight, if I remember it correctly. We were proud, young techno-insurrectionists who brought the best IBM had to offer to its knees.
Computing Gets Personal
The first personal computers emerged in 1975 with the Intel 8080-based Altair 8800. IBM’s entry in 1981 was an Intel 8088-based machine that put Microsoft on the map. The rest, as they say, is history. Now we have multicore, GHz-clock-rate machines, and nobody bats an eye at 32 GB of RAM in a PC, even among those of us who had 8KB machines back in 1981, a time when, supposedly, nobody would ever need more than 640KB of RAM.
Media in the early machines ranged from punched cards, paper tape, front-panel switch bootloaders, cassette tape, 9-track reel-to-reel behemoths, tape cartridges, and 8-, 5.25-, and 3.5-in. floppy disks (the “save icon” for the younger generation) to hard-disk drives spanning dishwasher-sized cabinets to modern-day 2.5-in. drives and, more recently, solid-state drives. A personal computer these days comprises a processor (usually multicore, >1-GHz clock speed), DRAM (32 GB or more), a hard disk and/or multi-TB SSD, and a graphics processor (GPU) with its own high-speed memory, along with a hefty power supply and an exotic cooling system.
Back in the mainframe days, we usually had a terminal connected over the LAN to the mainframe computer. Everyone’s data was stored on the mainframe’s hard drives, and programs ran in “core” memory, with each user typically allocated a capped amount of it, often as a virtual machine (VM).
The terminals had no computing resources of their own and served primarily as a user interface, with a keyboard and, usually, an ASCII/EBCDIC character display (we used Tektronix storage-tube display terminals with the PDP-11 for board layout in the early ’80s).
The PC and networking began to break up the central (“cloud”) computing model in the late 1980s. Computing resources (storage, compute, display) became user-owned and mainframes were relegated to large business and monster-database tasks.
The emergence of Amazon Web Services in 2006, Google App Engine in 2008, and Microsoft’s Azure in 2010 began a movement to recentralize large, compute-intensive, storage-hungry applications. By running apps in the cloud, software providers could control user access and create annualized, “subscription”-based revenue streams. However, many users were reluctant to bleed cash every year on something they didn’t own, or to deal with the security concerns of moving data on and off a “public” machine vs. “air-gapping.”
The 2020s saw the integration of AI into the cloud, with Amazon SageMaker, Google Vertex AI, and Microsoft’s Azure OpenAI furthering the argument for herding sheep into the centralized, subscriber-forever, own-nothing revenue model for Big Cloud. With justifiable security and cost concerns, many companies and individuals have continued to resist moving their computing to a centralized model.
Lightning Strikes Computing: Lights Out Again?
Recent news, for those of us sheep who pay attention, is extremely grim for personal-computer aficionados, and it seemed unlikely a few months ago: An AI lightning strike is about to annihilate a bunch of industries. About two months ago, Jason England at Tom’s Guide coined the term “RAMageddon” after his analysis of DRAM prices revealed:
“The average price of consumer RAM sticks on Amazon has climbed by over 240% in my check of over 100 listings.
But there are levels to it, and next year [2026], you could see the prices of laptops, games consoles, phones, tablets and more be inflated. It's also affecting Nvidia's next generation GPU plans! And the cause of it? AI....
This drought is going to worsen. In the immediate future, industry analysts and manufacturers predict that prices for DRAM and NAND chips (RAM and SSD) will continue to rise throughout the first half of 2026.
This is because of the long-term contracts these fabrication plants are continuing to fulfil — to the point where in early next year, stockpiles for consumers to buy could run completely dry.”
PC Gamer magazine recently translated a Twitter (“X” if you must) posting by the CEO of NAND controller and SSD maker Phison:
"Consumer electronics will see a large number of failures. From the end of this year to 2026, many system vendors will go bankrupt or exit product lines due to a lack of memory. Mobile phone production will be reduced by 200-250 million units, and PC and TV production will be significantly reduced." — Pua Khein-Seng
and
Pua Khein-Seng is further said to have highlighted that memory manufacturers are now "demanding three years' worth of prepayment (unprecedented in the electronics industry)" and that those same manufacturers "internally estimate the shortage will last until 2030, or even for another 10 years." — PC Gamer
The hit on PC, mobile-phone, and TV production isn’t only in silicon storage. Tom’s Hardware quoted Western Digital CEO Irving Tan in an excerpt from the company’s Q2 2026 earnings call:
“As we highlighted, we’re pretty much sold out for calendar 2026. We have firm POs with our top seven customers. And we’ve also established LTAs with two of them for calendar 2027 and one of them for calendar 2028. Obviously, these LTAs have a combination of volume of exabytes and price.” — Irving Tan
That announcement tracks with reports from late last year that hard drives are on backorder for two years due to massive data-center demand.
But wait. There’s more.
That thing in the driveway with four wheels? It’s chock full of “computers,” memory, and even ruggedized hard drives to support features like Nav and ADAS.
“‘We are already seeing signs of panic buying [memory chips] within the auto sector,’ one Counterpoint Research analyst was quoted by Bloomberg in a story published Monday [last week].” — thedrive.com
Supply Chain Pain Redux?
With the AI sector having sewn up procurement of components critical to a wide swath of products, from cellphones to TVs to PCs to automobiles, those industries look to me to be headed for Covid 2.0 supply-chain trouble, unable to keep production supplied. That will lead to layoffs of production workers, a downturn in consumer spending, and components diverted to an industry that likely can’t get the electricity it needs, let alone pay its component invoices.
I think we’ll see massive semiconductor and storage-device inventories sitting on shelves inside two years, destined for a bankrupt AI industry that couldn’t make the leap across the chasm beyond LLMs (which leaves their value-add near zero for consumers and businesses). Consumers won’t be able to get new PCs with the compute, memory, and storage needed to run AI agents at the edge, forcing those reluctant sheep into cloud services of zero added value.
The collapse of LLM-based AI in a year or two means a glut of storage and memory with nobody left to buy the chips; the would-be buyers will have gone bankrupt or pivoted to supplying widgets to AI data centers. And nobody will have a cellphone to call in an order for the surplus devices from a liquidator, even if one of them still had a factory running, building the iPhone 8 again because of the RAM shortage.
Silicon Valley has always run on hype and, traditionally, 9 out of 10 of those hypesters have failed to pass the steaming pile on to the next bagholder, keeping the mess within the confines of the Santa Clara Valley.
This time, though, the lunatics have escaped the asylum. Wall Street has thrown trillions of dollars of capital at the latest SiValley hype, LLM-based AI, with that money being “spent” by four or five key players in AI, primarily centered around NVIDIA and a couple of others.
And regulators have been bought off to the point where nobody is stepping in to say AI only gets a 10% allocation of components and electricity, exposing the world economy to these snake-oil peddlers. That means taxpayer bailouts so that the bagholder Wall Street guys are made whole, while the inevitable death of the company(ies) occurs months later.
ALL of the bet, and the entire worldwide economy, has been placed on a one-trick pony that’s not delivering the goods: LLM-based AI. Worse, the U.S. government has bought into LLM-based AI completely, and that technology now seems to be calling the shots in many agencies that don’t have the technical chops to call “BS” on the PowerPoint being presented to them. Point a cellphone camera running an app at someone’s face and let the AI decide what happens, based on non-reasoning LLM-based algos and flawed databases.
With a post-AImageddon dystopian internet landscape coming inside a year or two, maybe I can talk Electronic Design’s Bill Wong into hosting a BBS on his basement compute farm. At least that way, our readers will be able to dial in to read the latest human-generated content from Electronic Design without needing to go out and score a TB of DRAM or a GPU, Mad Max style.
Unjoy,
-andyT
Andy's Nonlinearities blog arrives the first and third Tuesday of every month. To make sure you don't miss the latest edition, new articles, or breaking news coverage, please subscribe to our Electronic Design Today newsletter. Please also subscribe to Andy’s bi-weekly Automotive Electronics newsletter.
About the Author
Andy Turudic
Technology Editor, Electronic Design
Andy Turudic is a Technology Editor for Electronic Design Magazine, primarily covering Analog and Mixed-Signal circuits and devices. He is also Editor of ED's bi-weekly Automotive Electronics newsletter.
He holds a Bachelor's in EE from the University of Windsor (Ontario, Canada) and has been involved in electronics, semiconductors, and gearhead stuff for a bit over a half century. Andy also enjoys teaching his engineerlings at Portland Community College as a part-time professor in their EET program.
"AndyT" brings his multidisciplinary engineering experience from companies that include National Semiconductor (now Texas Instruments), Altera (Intel), Agere, Zarlink, TriQuint,(now Qorvo), SW Bell (managing a research team at Bellcore, Bell Labs and Rockwell Science Center), Bell-Northern Research, and Northern Telecom.
After hours, when he's not working on the latest invention to add to his portfolio of 16 issued US patents, or on his DARPA Challenge drone entry, he's lending advice and experience to the electric vehicle conversion community from his mountain lair in the Pacific Northwet[sic].
AndyT's engineering blog, "Nonlinearities," publishes the 1st and 3rd Tuesday of each month. Andy's OpEd may appear at other times, with fair warning given by the Vu meter pic.

