ITC keynoter addresses nonlinear validation challenge

Anaheim, CA. Verification and validation are key aspects of supporting a brand and providing customers with a better user experience, according to John D. Barton, vice president of the platform engineering group at Intel. Delivering a Wednesday afternoon International Test Conference keynote speech titled “Compute Continuum and the Nonlinear Validation Challenge,” Barton said validation approaches need to change to deal with higher levels of integration; the need for improved efficiency, lower cost, and shorter time to market; and what he called hyper-segmentation.

When he began working at Intel in 1982, he said, chips had 200,000 transistors and no floating-point or pipelining capabilities. Today's chips can have 8 billion transistors, providing a broad canvas on which engineers can paint, and they serve a continuum of applications ranging from servers to embedded systems. The validation continuum needed to serve such varied applications, he said, is nonlinear, and the various development stages, from concept through production, exist in silos.

One effort, he said, is to “shift left” by converting serial development steps into parallel ones, which requires a re-examination of end-to-end development tools such as software models, FPGA prototypes, and emulators. These tools are noncongruent, he said, and their silos need to be collapsed to support the product life cycle (PLC).

Barton cited the shipping container as an example: it provides a standardized order-delivery function across multiple modes of transportation, whether by truck, railroad, or ship. Similarly, he said, development tools could be standardized.

At the conclusion of the talk, a questioner asked whether there might be some benefit to tool orthogonality in uncovering potential design problems. Barton said there could be value in redundancy, but the economics come out in favor of commonality. He then mentioned bug rates, explaining that more than half of bugs are caused by the tools, not the product. Another questioner asked about tool providers' potential resistance to moving away from proprietary tools. Barton cited the disruptive forces of capitalism: if incumbent tool providers don't adapt, others will arise to take their place. Finally, a questioner asked about the validation of analog designs. “It's harder,” Barton said, adding that one approach is to design with margins, but for products pushing the bleeding edge, it may be necessary to build in observability.

In conclusion, he said he sees an opportunity to work as a community to change the way development is done.
