Software-Driven Test Highlighted as Vector Signal Transceiver Debuts
More than 3,400 professionals—up 12% from 2011—attended NIWeek 2012 Aug. 7-9 in Austin. Tuesday morning’s keynote by Dr. James Truchard made clear that the event’s emphasis would be on system design—specifically, graphical system design. Dr. Truchard said that, following the transistor and integrated-circuit eras, we have entered the software stage of ecosystem evolution. The goal, in his words, is to do for embedded design what the PC did for the desktop. This idea is embodied in a V diagram that places system design activities on the left arm of the V with corresponding test activities on the right.
Various types of software-driven test activities, according to Dr. Truchard, can be grouped under a cyber-physical test heading. These include hardware-in-the-loop (HIL) testing and protocol-aware test. Abstraction is the key to these and other system-level test strategies. For NI, synchronous dataflow has been a key concept, supporting deterministic timing within LabVIEW routines and enabling flexible FPGA designs.
LabVIEW is the central layer of NI’s platform-based approach, Dr. Truchard explained, with applications being funneled through appropriate LabVIEW routines to the actual hardware upon which they are deployed. This architecture also applies to the company’s work with the reconfigurable I/O (RIO) hardware platform, still evolving after 15 years.
To sum up his message, Dr. T. recounted the company’s even-handed 1989 offer to both cold-fusion proponents and opponents: Each would receive a free copy of LabVIEW with which the various theories could be proven or disproven. Few took advantage of the offer, although it was a good example of one of Dr. Truchard’s closing remarks, “We don’t judge. We measure.”
Eric Starkloff, vice president of product marketing, followed Dr. Truchard and introduced the demos that made up the remainder of the session. Throughout, the theme was very much one of high-level, abstracted system design. Virtually all products with electronic content have become far too complex to test in traditional ways; a luxury car, for example, may have more than 100 million lines of code running on a large number of processors and controllers. Being able to leverage design IP in test activities is key.
NIWeek Product Debuts
National Instruments chose the NIWeek venue to introduce several hardware and software products, including LabVIEW 2012, a new version with productivity and quality-improvement features. Sample projects and templates are part of this version: a user can start with a common application and run it immediately, or easily modify it to better match the actual requirements. Sample programs are fully annotated.
Associated with these capabilities is the opt-in-only User Experience Improvement Program. If a user selects this option, NI is better able to understand how to further improve the sample programs and templates.
In addition, the CompactDAQ product line now has a four-slot Ethernet chassis and a timing/sync module. There also is a new stand-alone product based on an Intel dual-core processor. It is aimed at applications such as in-vehicle monitoring but also is appropriate for many DAQ jobs in which a PC is not wanted. Once data has been collected, it can be reviewed via DIAdem. Another demo showed the capability of CompactRIO to control engines; in this case, LabVIEW code implements closed-loop control on an FPGA.
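To give a flavor of the closed-loop control in the CompactRIO demo, here is a minimal Python sketch of a proportional-integral loop driving a simple first-order plant. The gains, plant model, and setpoint are illustrative assumptions, not NI’s engine-control code, which runs as compiled LabVIEW logic on the FPGA.

```python
def pi_step(error, integral, kp=2.0, ki=0.5, dt=0.01):
    """One fixed-timestep PI update; returns (actuator command, new integral)."""
    integral += error * dt
    return kp * error + ki * integral, integral

def run_loop(setpoint=1000.0, steps=5000, dt=0.01):
    """Drive a simple first-order plant (standing in for engine speed)
    toward the setpoint; all parameters are illustrative."""
    speed, integral = 0.0, 0.0
    for _ in range(steps):
        u, integral = pi_step(setpoint - speed, integral, dt=dt)
        # First-order plant: speed relaxes toward the actuator command.
        speed += (u - speed) * dt * 5.0
    return speed

print(f"final speed: {run_loop():.1f}")
```

On an FPGA, each iteration of such a loop executes in fixed hardware time, which is what makes the deterministic closed-loop rates in the demo possible.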
A key product debuting at NIWeek was the NI PXIe-5644R Vector Signal Transceiver, which Charles Schroeder, director of test marketing, described as the first example of what he called software-designed instrumentation. Although the instrument includes hardware mixers and oscillators, the actual data conversion and all subsequent stages of digital signal processing are described by LabVIEW code to which users have access. They can drill down into the triggering, acquisition, generation, and analysis areas to implement the detailed custom algorithms their applications need. In fact, they can redesign the instrument so that the measurement they need to perform happens natively.
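To illustrate the kind of user-modifiable processing chain such an instrument exposes, here is a minimal Python sketch of digital downconversion followed by an amplitude measurement. The sample rate, tone frequency, and boxcar filter are assumptions for the example, not the 5644R’s actual LabVIEW code.

```python
import cmath
import math

FS = 1_000_000      # sample rate, Hz (assumed)
F_RF = 100_000      # tone frequency, Hz (assumed)
N = 10_000          # number of samples

# Acquired waveform: a 0.5 V tone stands in for digitized RF samples.
samples = [0.5 * math.cos(2 * math.pi * F_RF * n / FS) for n in range(N)]

# Digital downconversion: multiply by a complex local oscillator at F_RF...
baseband = [s * cmath.exp(-2j * math.pi * F_RF * n / FS)
            for n, s in enumerate(samples)]

# ...then low-pass filter (here, a simple mean) to reject the 2*F_RF image.
dc = sum(baseband) / N

# The complex mean has magnitude A/2 for a real tone of amplitude A.
amplitude = 2 * abs(dc)
print(f"measured amplitude: {amplitude:.3f} V")
```

In a software-designed instrument, each of these stages—mixing, filtering, measurement—is a block the user can open and replace with a custom algorithm.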
Big Physics
Sessions at the event included the Big Physics Summit organized by Mike Dunne, director of the National Ignition Facility, part of the Lawrence Livermore National Lab. In one project he described, individual light beams from a group of massive lasers—each 300 feet long—are focused onto a tiny sample of deuterium or tritium. As Dunne often commented, “If we get this right…” one pulse of the several-MW lasers will heat the sample to the point that it starts a fusion reaction, releasing up to 10 MW of power. Today, they have reached the 3-kW level and estimate that the lasers need to be better controlled by a factor of about 10 to achieve the anticipated level of fusion.
A session within the energy summit highlighted low-energy nuclear reactions (LENR), the generic name currently used for cold fusion and related effects. One speaker represented Brillouin Energy, one of eight or nine companies that, according to the speaker, have made significant progress in this field. The original cold-fusion experiment, reported in 1989 by Pons and Fleischmann, claimed that excess heat was generated when palladium was heated in the presence of deuterium. Brillouin Energy claims to have duplicated the result using only palladium and distilled water.
As the speaker explained in response to an audience question, palladium is the only element with a completely filled d subshell and an empty s subshell, the two lying close in energy. His explanation for the anomalous heat effect is electron capture by the s shell. Regardless of the exact cause, it is difficult to believe that excess heat is not being generated: almost twice as much energy is measured as is applied to heat the materials. With such a large difference between heat input and output, a measurement error is unlikely. NI equipment is used by many of the experimenters.
Education
Ray Almgren, vice president for core platform marketing, introduced the closing keynote session. Several of Almgren’s previous positions within the company were directly involved with academia, and engineering education was a featured topic. According to Almgren and the reports he cited, engineering education all too often has been a math education, devoid of hands-on, project-oriented experience. Without that direct involvement, many students enrolled in an engineering curriculum don’t pursue engineering as a career.
A team led by Dr. David Keeling and Ali Alazmani from the U.K.’s Leeds University described its work using robotic systems to assist cerebral palsy patients. They were followed by University of Manchester faculty, who described the significant improvement in student performance and attitude after experiment-based learning was added to the engineering curriculum. Students work on real projects of relevance to, and suggested by, specific industries.
And, a team from Olin College in Needham, MA, described the autonomous sailboat they designed, built, and campaigned in a robotics competition. Important parts of the learning experience were organizing the parallel operation of the various groups making up the team and maintaining cooperation and communications among them.
The keynote speaker was Dr. Tom Kurfess, assistant director for advanced manufacturing in the White House Office of Science and Technology Policy. Dr. Kurfess described several interrelated government initiatives directed toward greater manufacturing capabilities. Manufacturing matters because it creates wealth as well as jobs. He said the longer term goal of the National Robotic Initiative is to bring about a closer relationship between robots and people to achieve greater efficiencies.
Education is key, and Kurfess emphasized the need to continue learning throughout one’s career to stay on the leading edge. As an example, he cited 3-D printing, which has become an accepted manufacturing process yet is affordable for schools and small companies.
Because of his manufacturing background, Kurfess evaluates innovations by their potential to be scaled. For example, if a new material has been developed, when can it be made available by the ton or by the hundreds of tons? Bridging this gap between innovation and production is the goal of the National Network for Manufacturing Innovation, a collaborative effort between industry and academia that engages in applied research and provides training.