History lesson: chips and oil don’t mix

Remember Exxon’s foray into the semiconductor business? How about Schlumberger’s incursion into the semiconductor test arena? As an EE-Evaluation Engineering reader, you’re more likely to remember the latter, and it was only ten years ago that Schlumberger exited the business by spinning off NPTest.

An item by Alexis C. Madrigal in The Atlantic titled “The Time Exxon Went Into the Semiconductor Business (and Failed)” prompted this trip down memory lane. While reporting on the history of Intel, Madrigal discovered that Exxon acquired Zilog in 1981. A billion dollars later, Madrigal reports, Exxon sold Zilog back to some of its employees and an investment firm in 1989.

Schlumberger’s entry into the semiconductor test business also represented an effort to get into the semiconductor chip business, and to that end the oil-field-technology giant purchased Fairchild Camera and Instrument in 1979 for $425 million. It just so happened that, like some other chip makers of the time, Fairchild made semiconductor ATE as well as chips, as my colleague Tom Lecklider points out in last year’s “50 Years of Test Technology” series, which celebrated EE-Evaluation Engineering’s 50th anniversary.

Schlumberger sold the Fairchild semiconductor operation to National Semiconductor in 1987 for $225 million less than it had paid, but it kept the test operations, at least for a while. In 2003, it spun off the test operation, creating NPTest, which Credence (now LTX/Credence) acquired in 2004.

“There was something about the way these [oil-related] companies managed their businesses that seemed destined to run highly innovative chip companies into the ground,” Madrigal writes. He quotes a former Zilog employee as saying, “Exxon essentially choked us with money.” Concludes Madrigal, “Too much money too fast breeds too little focus and too much complexity.”

That may have been the case for the core semiconductor businesses of organizations like Zilog and Fairchild. But I think the situation was more complicated in the ATE business. My impression was that the ATE world was choked not with money but with vendors: too many companies fielding too many very expensive platforms, with too little money spread among them to fund the R&D needed to evolve their ATE offerings.

A look at some articles over the years shows how the industry has evolved. In the May 1984 edition of Design and Test of Computers, Rudy Garcia wrote, “Traditional ATE systems cannot accommodate the newly designed, technologically advanced devices with large pin counts and high data rates. Fairchild’s Digital Test Systems Division has developed a new-generation tester—the Sentry 50—with a high-speed subsystem that preserves signal integrity and timing accuracy.” The system operated at 50 MHz and offered ±600-ps overall system-timing accuracy. It could accommodate 256 pins per test head, or 512 pins by means of a split I/O option.

ATE costs were rising, and in May 1996 EE-Evaluation Engineering carried this new product announcement under the headline “IC Test System Combines Low Cost, High Performance”:

“The [Schlumberger] SX 100 VLSI Logic Test System, priced below $5,000 per pin, delivers a 100/200-MHz test data rate, up to 448 I/O channels, and an edge-placement accuracy of 225 ps. The tester footprint is only 11 ft². Using a modular design, it features Schlumberger’s Sequencer Per Pin architecture to enable test-program generation without timing or pattern compromise.

“The SX 100 offers local memory depths up to 8M vectors, and [it supports] looping and complex subroutines with a subroutine memory of 4k. The system can be configured with two test heads. It provides at-speed algorithmic pattern generation for embedded memory applications. Existing test programs can be ported to the SX 100 using existing conversion tools. Prices start at under $1 million.”

With the “low cost” model carrying a $1 million price tag, the push was on for lower cost of test.

In November 2001, Garcia was making the case for structural test as embodied in the Schlumberger DeFT Structural Test System. In an EE-Evaluation Engineering article, he posited a reader asking these questions:

  • Do I invest in structural test, with the added overhead of comprehensive DFT, new EDA tools, rules, and a well-aligned manufacturing process?
  • Do I continue to purchase faster and higher pin-count functional testers and add staffing to generate test programs while still struggling to find ways to catch the tail-end defects?

His answer: “Given a legacy of successful functional test generation, this decision will not be easy for an organization to make. However, the only real choice may be when to convert to structural test. Make it work; use it to your advantage to beat your competition in time to market and time to quality.”

Then, writing in June 2003 under a byline citing his NPTest affiliation, Garcia delved into cost-of-test modeling for SoC devices. He cited factors including cost of test equipment, required facilities improvements (for example, provision of chilled water) to accommodate the equipment, test time, test-floor space requirements, test-equipment utilization, yield, and spares and maintenance costs. He also addressed DFT and structural test as instrumental in achieving faster time-to-volume (through faster test-program generation), improved test quality, and lower cost of test.
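The flavor of such a cost-of-test model is easy to sketch. Below is a minimal illustration in Python; the formula and every figure in it (tester price, depreciation period, overhead, utilization, test time, yield) are hypothetical assumptions for demonstration, not numbers from Garcia’s article.

```python
# Illustrative cost-of-test sketch; all figures are hypothetical
# assumptions for demonstration, not numbers from Garcia's article.

SECONDS_PER_YEAR = 365 * 24 * 3600

def cost_per_good_device(
    tester_price=1_000_000.0,   # capital cost of the tester, $
    depreciation_years=5.0,     # straight-line depreciation period
    annual_overhead=150_000.0,  # facilities (e.g., chilled water), floor
                                # space, spares, and maintenance, $/year
    utilization=0.80,           # fraction of the year the tester is busy
    test_time_s=2.0,            # test time per device, seconds
    good_yield=0.95,            # fraction of tested devices that pass
):
    """Approximate test cost per good (passing) device."""
    annual_cost = tester_price / depreciation_years + annual_overhead
    cost_per_second = annual_cost / (SECONDS_PER_YEAR * utilization)
    # Every device consumes tester time, but only the good ones ship,
    # so spread the cost over the passing devices.
    return cost_per_second * test_time_s / good_yield

print(f"Cost of test: ${cost_per_good_device():.4f} per good device")
```

The sketch makes the levers visible: shorter test times, cheaper testers, and higher utilization all drive the per-device figure down, which is the case Garcia was making for DFT and structural test.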

The issues Garcia cited in 2003 are still relevant today. EE-Evaluation Engineering will elaborate on them in the upcoming June and July editions.
