Leveraging Test Data as a Strategic Management Tool

Intelligence—strategically applied data—is more than knowledge, more than information or facts. Whether you're managing products, processes, people, competitive position, a test-floor operation, or a global manufacturing enterprise, comprehensible, actionable data is the key to success.

But for many reasons, our organizations haven't focused on obtaining the most trustworthy and timely data. And when they have obtained it, they often haven't thought to use it on a strategic scale. This has been true particularly in the semiconductor industry and in semiconductor test, my area of expertise for more than two decades.

But now there are many urgent and practical reasons to think about data more strategically: identifying it, accessing it, analyzing it, authenticating it, and acting on it in a big way, on a global scale. Among these practical reasons is a hidden gem: we should consider acting on data because it is now possible. The capability to unearth data of high integrity; use it in automatically optimized processes, controls, and management; and leverage it across multiple enterprises is a reality today.

Managing with strategic intelligence improves control, clarity, communications, and decisions, but its effects go well beyond the day-to-day activities of on-the-fly corrections and tactical improvements to operations, processes, and products. At a mission-critical level, profitability is improved, costs are wrung out, and a company's reputation as a responsive, responsible, and dependable business partner is burnished.

Leveraging data as a strategic management tool also contributes to being a consistently high-quality producer, a must-have since semiconductor-based end products are increasingly part of managing and operating every detail of life.

Clear Visibility to Vital Data Is Needed

More than ever, operational transparency and clear visibility to vital data are at the heart of competitive advantage and maybe of survival itself. Today's turbulent, fast-paced global business environment presents special challenges to the semiconductor industry, which is itself extremely complex.

Requirements for many kinds of engineering and technological expertise, manufacturing and production complexity, end-product proliferation, consumer-driven demand cycles, capacity fluctuations, and increasingly outsourced and compartmentalized functions make data particularly mission-critical. The most pressing of these challenges are navigating an unusually dynamic macro-business environment and managing ever-increasing complexity.

Market cycles continue to be more competitive and compressed. We all know too well the challenges of shrinking geometries and smaller, more portable products even as demand for more sophisticated features increases. But that was in the good old days. For the immediate future, there will also likely be unpredictable fluctuations in macroeconomic cycles that further cloud forecasting and decision-making.

While larger economic cycles always have played a role, it usually has been as a more slowly unfolding backdrop. The global economic events of the last 18 months demand even greater flexibility, shorter time horizons to make changes, and the capability to turn on a dime. To achieve this degree of nimbleness, companies need every advantage and every bit of relevant data, obtained quickly and shared among all stakeholders for rapid, adaptive decisions.

For example, as I write this, the San Francisco Federal Reserve Bank has issued a report saying that we cannot expect consumer demand to rebound to previous levels for an extended period. If this is true, will it affect the consumer demand that has driven the semiconductor market for decades? Will a new, dominant semiconductor application such as energy and water control, medical imaging and devices, or massive infrastructure rebuilding emerge to replace it as a major driver of semiconductor production? If there are systemic sea changes ahead, we need to be able to control what is in our power and do it intelligently and strategically to gain every possible advantage.

To better manage complexity, semiconductor companies have increasingly focused on core competencies. In this way, they have gained competitive differentiation and been better able to focus R&D investment to keep ahead of fast-moving technology shifts. And they can use other resources in the most competitive and sustainable ways. But the evolution away from a vertically integrated semiconductor model to a horizontal, geographically dispersed one results in new supply-chain problems that our industry has not sufficiently addressed: control and management of outsourced partners and lack of visibility and operational transparency.

The control and risk issues of the progressively fragmented supply chain hit the front page of the May 18, 2009, Wall Street Journal in its article, “Clarity is Missing Link in Supply Chain,” a look at outsourcing in our industry. Although the article focused on materials and forecasting, the message is a wake-up call made even more relevant when considered in light of a 2007 Global Semiconductor Alliance publication.

Flow of Information as the Lifeline

The Global Semiconductor Alliance took the concept of supply-chain division of labor a step further, beyond issues of managing outsourced materials and processes to include management of critical intellectual property and related information.

In 2007, the Global Semiconductor Alliance, then named the Fabless Semiconductor Association, jointly with Industry Directions, published a research report entitled Building a Better Supply Chain: Successful Collaboration in the Fabless Semiconductor Market. It directly speaks to the test industry:

“Fabless designs are getting more complex, and handoffs to foundries have become more challenging. Further, the sensitivity of IP must be taken into account. IP is not only focused on the areas of product design. Arguments for proprietary advantage can be assigned to manufacturing techniques or testing processes. The complexity and value of these processes could make an IDM or fab-lite company reconsider whether or not to outsource proprietary processes if it affects time-to-market or economic advantage. This situation may already be underway and is worthy of further exploration.”

On page 4, the 2007 GSA study backs up the 2009 Wall Street Journal article: “While fabless companies focus more on core competency, this interdependency drives supply chain risk. The number one risk factor is lack of visibility…”

For all these reasons, the semiconductor supply chain must acquire the capability to manage the flow of knowledge-based information, including information about test engineering. What's more, the morphology of the supply chain itself is highly dynamic, continuing to shift as markets and technology change rapidly.

The supply chain we know now may not be the supply chain of tomorrow. But the information flow will be the lifeline that links all elements of the chain, regardless of its shape (Figure 1). Leading fabless semiconductor companies clearly understand this as evidenced by Qualcomm's vision of the Integrated Fabless Manufacturing model.

Figure 1. The Role of Information Flow in Integrating a Value-Added Supply Chain. Source: Douglas M. Lambert, Editor, Supply Chain Management: Processes, Partnerships, Performance, Third Edition, Sarasota, FL: Supply Chain Management Institute, 2008, p. 3.

In our business of test and evaluation engineering, the test data that travels across geographically scattered multi-enterprise partners represents opportunities for optimizing processes, products, and operations along the information flow. Every piece of information about the device under test (DUT), in wafer sort and in final test, is critical and demands swift corrective action or accurate field-return tracking across a global enterprise. The benefits of early detection and early correction of test issues include the following:
• Improved quality, health control, and data integrity to minimize test escapes.
• Maximized yield by reclaiming devices wrongly labeled bad and speeding time to entitled yield of new products at introduction.
• Increased efficiency and productivity.

This data can be accessed in several ways: in some cases it is retrieved remotely, in others via test-floor hardware, using controllers, proxies, or Standard Test Data Format (STDF) files.
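As a rough illustration of what tapping that data programmatically can look like, the following sketch scans an STDF datalog and tallies the record types it contains. It is a minimal, hypothetical example rather than a production reader: the file name is invented, only a handful of the record types defined in the STDF V4 specification are labeled, and little-endian byte order is assumed instead of being read from the File Attributes Record.

```python
"""Minimal sketch: tally record types in an STDF V4 datalog.

Illustrative only. Assumes a little-endian file, the common case;
a production reader would honor CPU_TYPE from the File Attributes
Record and decode full record bodies.
"""
import struct
from collections import Counter

# A few well-known (REC_TYP, REC_SUB) pairs from the STDF V4 specification.
KNOWN = {
    (0, 10): "FAR (file attributes)",
    (1, 10): "MIR (master information)",
    (5, 10): "PIR (part information)",
    (5, 20): "PRR (part results)",
    (15, 10): "PTR (parametric test)",
    (15, 20): "FTR (functional test)",
}

def tally_records(path):
    counts = Counter()
    with open(path, "rb") as f:
        while True:
            header = f.read(4)
            if len(header) < 4:
                break  # end of file
            rec_len, rec_typ, rec_sub = struct.unpack("<HBB", header)
            counts[(rec_typ, rec_sub)] += 1
            f.seek(rec_len, 1)  # skip the record body
    return counts

if __name__ == "__main__":
    # "lot_datalog.stdf" is a placeholder file name for illustration.
    for key, n in tally_records("lot_datalog.stdf").most_common():
        print(f"{KNOWN.get(key, key)}: {n}")
```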

Data: Essential to the Lifeline

Data integrity is key to quality and end-customer satisfaction, but it also is key to sound management decisions. How can you be secure in making the right decision if your data is corrupt? Data of poor integrity degrades operational visibility and drives up the cost of test.

In semiconductors, forecasting and improving semiconductor yields are critical parts of supply-chain planning. Yield improvements via yield reclamation in steady-state manufacturing for existing devices contribute to keeping costs low as products mature. Yield learning for new devices to enable high-volume production ramp-ups is key to keeping new products in the hands of consumers when demand is high. Yield improvements of 3, 5, or 10% can send waves of efficiencies throughout the value-added supply chain.

Optimized retest, setup, correlations, and management of probe cards and load boards can mean efficiency and productivity gains of 15% or more. These benefits, along with accelerated learning, can be amplified across a global supply chain and global test operations, magnifying the leverage of data into a powerful competitive advantage.
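To make that arithmetic concrete, the short calculation below shows how a modest reclaimed-yield gain translates into recovered units and revenue. Every figure is a hypothetical placeholder, chosen only to illustrate how small percentage improvements compound at semiconductor volumes.

```python
"""Illustrative back-of-the-envelope: value of reclaiming misbinned die.

All figures are hypothetical; the point is only that a small yield
gain compounds quickly in a high-volume, multi-site operation.
"""
monthly_good_die_target = 10_000_000   # units shipped per month (assumed)
baseline_yield = 0.88                  # assumed baseline final-test yield
yield_gain = 0.03                      # e.g., 3% reclaimed via retest and outlier review
asp_per_unit = 1.50                    # assumed average selling price, USD

starts = monthly_good_die_target / baseline_yield
extra_good_units = starts * yield_gain
print(f"Additional good units/month: {extra_good_units:,.0f}")
print(f"Recovered revenue/month:    ${extra_good_units * asp_per_unit:,.0f}")
```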

Today, advances in software technology make it possible for companies to optimize, manage, and strategically leverage test data across the dynamic outsourced business model. If you and your company are not actively investigating and evaluating these latest solutions, you are overlooking modern techniques that can help you perform better, faster, and more successfully.

Integrated, Automated Engineering Solutions

Engineering solutions for better data analysis and data integrity, vastly improved yields, outlier detection, and other critical performance measurements are based on state-of-the-art technology including:
• Second-generation, advanced adaptive test.
• Automated processes across the IC lifecycle, including analysis, simulation, and reporting.
• A robust IT backbone and integrated IT infrastructure.
• A centralized data repository designed for test engineering data and rapid data retrieval.
• New approaches to methodologies, such as establishing baseline reference die and flexible, sophisticated algorithm engines.

Advanced adaptive test, automated across the IC lifecycle from design and wafer sort to final test, combined with expertly applied robust algorithms that ensure data analysis and data integrity, delivers new standards of reliability. These highly automated processes, applied to data drawn from local or distributed operations, can deliver very early detection of issues, providing an unbeatable higher ground from which to conduct business. Combined with an integrated IT infrastructure across geographies and partners, this approach allows companies to gather and organize data from dispersed supply-chain partners in near time.

These technologies process the data and automatically scan for all kinds of issues affecting yield, quality, test time, and other critical dashboard measurements. Exceptions can be automatically flagged with alerts and reports generated almost immediately for quick corrective action.
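The following is a deliberately simplified sketch of such an automated exception check. It flags a lot whose yield or average test time drifts beyond robust limits computed from recent history; the rule, the thresholds, and the sample data are hypothetical stand-ins for the far richer, per-test rules a production algorithm engine would apply.

```python
"""Sketch of an automated exception check on incoming lot summaries.

A simplified, hypothetical rule: flag any lot whose yield or mean test
time drifts beyond robust limits derived from recent history
(median +/- k * MAD).
"""
from statistics import median

def robust_limits(history, k=4.0):
    """Return (low, high) limits from median +/- k * MAD of past values."""
    med = median(history)
    mad = median(abs(x - med) for x in history) or 1e-9
    return med - k * mad, med + k * mad

def flag_exceptions(lot_id, yield_pct, test_time_s, yield_hist, time_hist):
    alerts = []
    lo, _ = robust_limits(yield_hist)
    if yield_pct < lo:
        alerts.append(f"{lot_id}: yield {yield_pct:.1f}% below baseline ({lo:.1f}%)")
    _, hi = robust_limits(time_hist)
    if test_time_s > hi:
        alerts.append(f"{lot_id}: mean test time {test_time_s:.2f}s above baseline ({hi:.2f}s)")
    return alerts

# Example: recent lot history vs. a new arrival from a remote test floor.
yield_hist = [91.2, 90.8, 91.5, 90.9, 91.1, 91.3]
time_hist = [1.42, 1.45, 1.43, 1.44, 1.41, 1.46]
for alert in flag_exceptions("LOT-7F21", 84.6, 1.44, yield_hist, time_hist):
    print("ALERT:", alert)
```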

The Impact for Test Operations Across Global Partners

From a regional point of view in a multi-enterprise, value-added supply chain, data logs can be transferred to all local operations, enabling near-time early notifications about quality, yield loss, test time control, and data integrity checks. Offline, data logs can be sent from regional operations to headquarters. Once data is captured in a test-oriented database, sophisticated analyses of test time, yield, and statistical process control can be done offline to implement solutions across a worldwide operation.
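As one illustration of such an offline analysis, the sketch below applies a textbook p-chart check to lot yields gathered from regional sites. The data and limits are invented for illustration; in practice, the lot summaries would be pulled from the centralized test-data repository rather than hard-coded.

```python
"""Sketch: offline SPC check on lot yields gathered from regional sites.

Hypothetical data and a textbook p-chart rule (p-bar +/- 3 sigma).
"""
from math import sqrt

# (site, lot, devices tested, devices passed) -- illustrative values only
lots = [
    ("Site-A", "L001", 24000, 22080),
    ("Site-A", "L002", 23800, 21896),
    ("Site-B", "L003", 25100, 22590),
    ("Site-B", "L004", 24500, 21315),   # suspiciously low pass count
]

tested = sum(n for _, _, n, _ in lots)
passed = sum(g for _, _, _, g in lots)
p_bar = passed / tested                 # overall pass proportion

for site, lot, n, good in lots:
    sigma = sqrt(p_bar * (1 - p_bar) / n)
    lcl = p_bar - 3 * sigma             # lower control limit for this lot size
    p = good / n
    status = "OUT OF CONTROL" if p < lcl else "ok"
    print(f"{site} {lot}: yield {p:.3f} (LCL {lcl:.3f}) -> {status}")
```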

On a local test floor, real-time efficiencies that round out global operational management can be realized through other, complementary solutions that perform same-time control and monitoring. These achieve additional benefits in test time reduction, outlier detection, prober controls, and other local-site issues.

Leveraging data, processes, and operations in this way allows supply-chain management in near time, compressing days into hours or minutes, and better strategic management of business units, divisions, and partners.

Strategic alignment with trusted partners is the overriding management imperative for an increasingly complex business environment. Consider the role you can play using test data as the lever.

About the Author

Debbora Ahlgren is vice president, sales and marketing at OptimalTest. She has more than 20 years of experience in the semiconductor industry in business development, operations, marketing, and sales. Prior to joining OptimalTest, Ms. Ahlgren was vice president and chief marketing officer for Verigy, vice president and general manager for field operations in the Americas and Korea for Agilent Technologies, and vice president of marketing and business development for LTX, and she held management positions at KLA-Tencor and Schlumberger. Ms. Ahlgren received a B.S. from the State University of New York at Stony Brook and earned the equivalent of an M.S. in managing technology from the SUNY Stony Brook Harriman School. 408-978-8402, e-mail: [email protected]
