Logic Analyzers, Then and Now

June 10, 2002

When ICs first became available to electronics engineers 30 years ago, oscilloscopes could no longer do the job. Asynchronous logic designs with multiple pins gave rise to multipath race conditions; synchronous algorithmic state designs, by contrast, were register-based, with the binary combinations of data representing the real signal of interest.
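
To see why a scope alone struggled here, consider a toy model of such a race. The following Python sketch is purely my own illustration (the gate, the delay value, and the waveform are assumptions, not from the column): two copies of one signal travel paths of unequal delay, and their AND, which should be constantly 0, glitches high during the delay window.

    # Hypothetical illustration (signal names, delay, and waveform are
    # assumptions): signal A feeds an AND gate directly, while the other
    # input is A inverted through a slower path. The AND output should
    # always be 0, but it glitches high while the slow path catches up.
    DELAY = 2  # the inverter path lags the direct path by 2 sample ticks

    a = [0, 0, 0, 1, 1, 1, 1, 1]  # direct path: input edge rises at t = 3
    b = [1 - a[max(t - DELAY, 0)] for t in range(len(a))]  # delayed NOT A
    y = [ai & bi for ai, bi in zip(a, b)]  # AND of the two paths

    for t, (ai, bi, yi) in enumerate(zip(a, b, y)):
        print(f"t={t}  A={ai}  notA={bi}  AND={yi}")
    # AND pulses to 1 at t = 3..4: the race-induced glitch a timing
    # analyzer is built to catch, and a state display would never show.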

To meet designers' needs, two separate classes of test equipment emerged. Logic timing analyzers dealt with the asynchronous issues, using sampled data and displays of pseudovoltage versus time; today, we call them digital oscilloscopes. Logic-state analyzers handled the second set of issues, providing binary-, octal-, or hex-number displays of data registers. These displays quickly evolved to show assembly-level instruction sets, ASCII character sets, and compiled statements.
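
As a rough sketch of what those state displays conveyed, the Python fragment below (my own illustration; the sampled bus values are invented) lists captured bytes in binary, hex, and ASCII, the way an early state analyzer rendered register traffic:

    # Hypothetical illustration of a logic-state listing; the captured
    # 8-bit bus samples below are invented for the example.
    samples = [0x48, 0x49, 0x21, 0x0A, 0xC3, 0x00, 0x10]

    print(f"{'cycle':>5}  {'binary':>8}  hex  ascii")
    for cycle, value in enumerate(samples):
        ch = chr(value) if 0x20 <= value < 0x7F else "."  # printable only
        print(f"{cycle:>5}  {value:08b}  {value:02X}   {ch}")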

Since then, many variants have appeared. Most logic analyzers offer both logic timing and state displays. Network bus analyzers and microprocessor emulators fall into the second category. Yet the test equipment industry has ultimately failed our field. No major test equipment or emulator vendors have dealt with the obvious fact that software has replaced hardware as the quality variable. The result is a shocking lack of appropriate and analogous tools for software developers and testers.

This curious lack is excused on the grounds that it's a different business, one the vendors don't understand, or one without a sizable market. That is merely self-serving: the same statements applied to logic analysis in its infancy. Emulator companies were trying to sell chips, not development systems, but the test companies dedicated themselves to testing the most difficult and important scientific and engineering problems of the day. Logic analyzers are sold to hardware developers on those same projects, and they're taught in electronic-engineering curricula.

How did this happen, and why hasn't it happened in software? Several vendors spent 10 years and $200 million educating and equipping universities to teach logic analysis. But no company has invested even one-twentieth of that in the analogous software measurement problem, a problem that's 10 times larger. Moreover, major software vendors have assiduously avoided dealing with the quality issue. Users are the test beds, and users lose in that scenario, because the vendors devote their resources to getting code developed and sold rather than validated and verified.

Why haven't the major test equipment companies seized this opportunity? Mostly, it may come down to maturation and the consequent sclerosis of innovation, an endemic problem in corporations as they grow and age. Unfortunately, it's nearly impossible for small startups to change a collective design paradigm when the large test corporations, users, and software vendors are all happy with the status quo.

Software errors have been responsible for numerous major failures: banking and stock-exchange shutdowns, telecommunications outages, elevator malfunctions resulting in deaths, automobile "seizures" and collisions, and industrial-plant explosions. Legislation, already prominent in health-care systems and avionics, trails each major catastrophe. Still, the eventual result will be a heavy burden on tool vendors to satisfy those regulations with industry-specific adaptations of their tools. In the long run, that will cost far more than emphasizing the design process in general and testing for and verifying safety-critical capabilities up front. Where is the software logic analyzer, and where is the test-industry leadership to define, design, and establish it?
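
What might that instrument look like? As one hedged sketch (entirely my own illustration, not anything the column specifies), a software analog of a logic analyzer could keep a rolling capture buffer of program events and freeze it when a trigger pattern fires, just as a hardware unit triggers on a bus condition:

    # Hypothetical sketch of a "software logic analyzer" (class name and
    # trigger condition are assumptions for illustration): a rolling
    # buffer of program events, frozen when a trigger predicate fires.
    from collections import deque

    class SoftwareLogicAnalyzer:
        def __init__(self, depth=8, trigger=None):
            self.buffer = deque(maxlen=depth)  # pre-trigger capture depth
            self.trigger = trigger             # predicate over one event
            self.captured = None               # frozen trace, once triggered

        def record(self, event):
            self.buffer.append(event)
            if self.captured is None and self.trigger and self.trigger(event):
                self.captured = list(self.buffer)  # freeze trace at trigger

    # Usage: trigger when a (hypothetical) account balance goes negative.
    sla = SoftwareLogicAnalyzer(trigger=lambda e: e["balance"] < 0)
    balance = 0
    for step, delta in enumerate([50, -20, -40, 10]):
        balance += delta
        sla.record({"step": step, "delta": delta, "balance": balance})
    print(sla.captured)  # the events leading up to and including the fault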
