The past 50 years of electronics history has seen its share of product and technology mistakes. Ideas that didn't quite hit the mark. Some fizzled out, some eventually gave way to better, more successful products, and others found dazzling success in applications for which they weren't originally intended. But it's safe to say that something was learned from every mistake. Failed products don't necessarily mean failed companies. Rather, they often bespeak courage, inventiveness, and a willingness to take risks, forging a path for future achievements.
After looking high and low, searching the Internet, conferring with colleagues, and much deep thought, the editors of Electronic Design assembled the following list of ten notable failures, presented here in no particular order. The list is not meant to be comprehensive. We trust that you will find it informative and fun to read... and that our picks will agree with your own choices and recollections.
Apple Lisa personal computer
Even as IBM was preparing the launch of the DOS-based PC, Steve Jobs was steering Apple in a more creative direction. The result: the Lisa, launched in 1983. It was one of the first personal computers to use a graphical user interface (GUI). Other innovations included a hierarchical file system and drop-down menu bars. Three years in the making, Lisa was supposed to be the Next Big Thing, but it became one of Apple's greatest flops, done in by an inflated $10,000 price tag, poor performance, and competition from Apple's own successful Macintosh. However, the Lisa had a noteworthy measure of success: its GUI legacy lives on in today's PCs and Macs.
IBM PCjr personal computer
After conquering business computing, IBM set its sights on the home computing market in the early 1980s. In 1984, the standard IBM PC was reworked as the PCjr, with enhanced graphics and sound and a wireless keyboard with rubber "chiclet" keys. The keyboard proved to be its biggest flaw: the wireless link worked poorly, and the chiclet keys stymied touch typists. Expandability was limited as well, and users were saddled with nonstandard interface connectors. IBM discontinued the product less than two years after its release.
Magnetic bubble memory
In the early 1980s, semiconductor memory was largely volatile and expensive, leading the industry to search for alternatives. Bubble memory was one option. The devices stored data as cylindrical magnetic domains within a thin layer of magnetic material, such as yttrium iron garnet. Bubble memory looked to be an emerging technology in 1981, but by the end of the decade it had been superseded by other nonvolatile memory types, like EEPROMs, which were both faster and less expensive.
Intel iAPX 432 microprocessor
Debuting in late 1982, the iAPX 432 was a 32-bit, four-chip set that delivered 2 MIPS. The object-oriented processor included high-level operating-system support in hardware, such as process scheduling and interprocess messaging. These advanced features weren't enough to overcome the fact that the 432 ran ten times slower than Motorola's 68000 or Intel's own 80286.
Texas Instruments TMS9900 microprocessor
The TMS9900, rolled out in 1976, was one of the first 16-bit CPUs, with capabilities that many 8-bit CPUs lacked. TI used it as the basis for its own 990/4 and 990/5 minicomputers, but the chip's high cost deterred software developers. TI also crafted a consumer personal computer based on the TMS9900, the 99/4A. The home computer was not compatible with the PC, so it ran up against stiff competition from other machines in the early 1980s. Although a few manufacturers used the TMS9900, TI's exit from the PC market was the death knell for the processor. Besides, others were on their way to making faster 16- and 32-bit chips.
Microsoft Bob interface
In 1995 Microsoft released "Bob," an interface designed to replace the existing Windows desktop with one aimed at novice users. The interface featured a big yellow smiley face with glasses and various virtual rooms with objects. There's no debate: Bob was a complete flop. Many computers weren't up to the minimum processing requirements, the overly cute approach was too simple for the average PC user, and the product was far too expensive for the functionality that it provided. Bob disappeared soon after Windows 95 was released later that year.
Apple Newton PDA
Touted as the future of computing upon its 1993 release, Apple's Newton was one of the early entries in the personal digital assistant (PDA) market. Its success hinged on handwriting recognition technology. The company boasted in a prerelease brochure that "the Newton can read your handwriting." Who can forget the jokes about the Newton's comical misinterpretations of handwritten entries? Still, it paved the way for the more successful PDAs that followed.
IBM OS/2 operating system
IBM tried to establish its OS/2 operating system as a serious contender to Microsoft's Windows in the 1990s. But OS/2 turned out to be a good idea with weak marketing. The company was working toward an entirely new hardware/OS duo that would combine IBM's PowerPC chip with its OS/2 operating system to become the next-generation desktop platform. Big Blue, however, didn't fully recognize the opportunity to push OS/2 as a software product in its own right. In addition, the company was put in a difficult position by trying to work with Microsoft to support the IBM PC family while promoting the OS/2 software as an alternative to Microsoft Windows. Internal politics and poor business strategies had all but killed OS/2 by the latter half of the 1990s.
EDA frameworks
Electronic design automation (EDA) frameworks were the talk of the town in the late 1980s and early 1990s. Frameworks were aimed at easing the design tool and data integration problems facing users of software from different vendors. EDA companies threw money and manpower into developing frameworks. A new organization, the CAD Framework Initiative (CFI), set about creating the requisite standards. But creating such a product proved to be much more difficult than initially thought, and the framework idea was not widely accepted. By the mid-1990s, frameworks were petering out and the CFI had evolved into the Silicon Integration Initiative.
Charge-coupled devices as computer memory
The charge-coupled device (CCD), a light-sensitive semiconductor device, was invented at Bell Labs around 1970. It was originally intended to store computer data. Although CCDs were too slow to cut it as memories, they found their true calling in imaging applications. By 1975, they saw use in TV cameras and flatbed scanners. In the 1980s, they appeared in the first digital cameras. CCDs now enjoy success in a broad array of digital imaging applications.