Big Applications Demand Innovative Testing

It isn’t just during the night that things go bump, or splash, or bang. Physical testing can present many interesting technical challenges, especially when the test is destructive or simply too expensive to repeat. In these cases, reliability takes the top spot on the list of required data acquisition system features.

Launching Lifeboats

One example is the testing of free-fall lifeboats. The overall project is expensive, involving building the lifeboat, installing the boat and its launch system on a ship or at a special test facility, and instrumenting it. In addition, the tests must be witnessed by the relevant regulatory authorities, also at the manufacturer’s expense. And volunteers must be found who are willing to be launched into the sea from heights of up to 100 ft.

Free-fall lifeboats have only been used for the last 20 years, primarily on cargo ships, tankers, and oil-drilling and production platforms. They are considered much safer than traditional lifeboats because there are no ropes to become tangled during launch, launch is possible even if the ship is listing severely, and the energy generated by the falling boat propels it away from danger (Figure 1, at right).

Dr. Jim Nelson, professor and chair of the civil engineering department at Clemson University, said, “Current regulations of the International Maritime Organization require that acceleration forces be measured in prototype free-fall lifeboats to infer the potential for injury of the occupants during the launch. This requires that the triaxial acceleration field in the lifeboat be measured at least at two locations in the boat.

“Often, the forces are measured at three locations concurrently to determine the worst exposure,” he continued. “Given the rise-time of the signal, the sampling rate on each channel generally is in the range of 2 kHz. Because the boat falls through the air before it enters the water, and because it moves a significant distance away from the point of release, the entire data acquisition system has to be self-contained and ride in the lifeboat.”

During water entry, the acceleration forces can be as high as 15g. Dr. Nelson explained that this ordinarily would be a very large acceleration for humans to endure, except that the impulse caused as the boat hits the water lasts for only a short time (Figure 2).
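To make the numbers concrete, here is a minimal sketch of how a triaxial record might be reduced to a resultant acceleration and its peak found. The 50-ms half-sine water-entry pulse is an assumption for illustration, not Dr. Nelson’s data; only the 2-kHz channel rate and the roughly 15-g peak come from the article.

```python
import numpy as np

# One triaxial record (in g), sampled at 2 kHz per channel as in the
# lifeboat tests. The pulse shape below is invented for illustration.
fs = 2000.0                       # samples per second per channel
t = np.arange(0, 1.0, 1.0 / fs)

# Synthetic water-entry impulse: a 50-ms half-sine spike on top of 1 g.
impulse = 15.0 * np.sin(np.pi * np.clip((t - 0.5) / 0.05, 0.0, 1.0))
ax = np.zeros_like(t)
ay = np.zeros_like(t)
az = 1.0 + impulse

# Resultant acceleration magnitude at each sample.
a_res = np.sqrt(ax**2 + ay**2 + az**2)
peak_g = a_res.max()              # about 16 g for this synthetic pulse
```

Because the pulse is brief, the peak value alone understates what matters physiologically; the duration of the impulse is what makes 15 g survivable.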

Testing Ammunition

Another application that exhibits high transient forces is the testing of M1A1 tank ammunition. B & B Technologies in Albuquerque, NM, recently instrumented a firing range with a modern PC-based measurement system. Lance Butler, a senior systems integrator and a partner in B & B, discussed some of the more difficult aspects of the job.

He said that B & B was confident the company could provide the high-speed digitizing and storage required for the 150,000-psi peak breech pressure and could use radar to measure the high shell velocity. The problem that took time to solve involved the 14 downrange weather stations that provided wind and temperature data over a distance of several kilometers. Some form of telemetry was needed, but all the options seemed unworkable.

The wiring that was in place was not suitable because of its high cost of maintenance and low reliability. The weather stations were located on top of towers and could be hit by lightning, damaging the wiring. Also, some of the wires had been chewed by cattle and, although subsequently spliced, they were subject to further deterioration from the weather and ground water.

Fiber-optic cables would cost too much to run. Wireless modems created an RF hazard because the test cannon was fired electronically. The last thing anyone wanted was a spurious RF signal that might trigger the cannon unexpectedly.

The ultimate solution was, in fact, a wireless modem, but one that operated on less than 1 W. The very low-amplitude RF signal was acceptable. Four groups of three weather stations were hard-wired to modems. The other two stations were close enough to the control bunker to be hard-wired directly.

Figure 3 (right) shows the 10-m fireball that accompanies this test. The sensors and their cabling are designed to withstand the exceptionally unfriendly ambient conditions. Observation personnel and the test instruments are in a bunker at a relatively safe distance.

Remodeling Theaters

A very different kind of data acquisition system is required for opera houses. We’re accustomed to associating audio recording with concert halls, so a data acquisition application may not be immediately obvious. When a new hall is designed or an old one remodeled, the effect of the architectural features on the building’s acoustics is of paramount importance. Andrew Dawson, an applications engineer at Gage Applied Sciences, described how one of Gage’s customers, Kirkegaard & Associates, approached the application.

“A sound stimulus, S(t), is generated by a loudspeaker, and the response R(t) is received by a microphone where t is time. The impulse response, I(t), is derived from the stimulus and response functions as:

I(t) = F⁻¹{F[R(t)]/F[S(t)]}

where: F = the Fourier transform
       F⁻¹ = the inverse Fourier transform

“The impulse response function is used to evaluate the acoustic response of the room. It is equivalent to the pattern of reflections that would be received by a microphone if an infinitely sharp spike of sound were generated by the loudspeaker,” he continued. “In practice, the stimulus is a swept wave, called a chirp.

“A well-engineered venue has an impulse response that decays over about 2 s and has no sharp features that correspond to dominant echoes. For example, the focusing effect of a domed roof leads to spikes in the impulse response which correspond to undesirable loud echoes.”
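The deconvolution Mr. Dawson describes can be sketched in a few lines. The sample rate, chirp band, and the toy “room” below (a direct path plus one echo) are assumptions for illustration; the small regularizer keeps the nearly empty frequency bins outside the chirp band from blowing up the division.

```python
import numpy as np

fs = 48000                        # sample rate (assumed)
N = fs                            # one second of data
t = np.arange(N) / fs

# Swept-sine ("chirp") stimulus S(t), 100 Hz to 10 kHz (band assumed).
f0, f1, T = 100.0, 10000.0, N / fs
s = np.sin(2 * np.pi * (f0 * t + (f1 - f0) * t**2 / (2 * T)))

# Hypothetical room: direct sound plus one echo 10 ms later at half amplitude.
h = np.zeros(N)
h[0] = 1.0
h[int(0.010 * fs)] = 0.5

# Simulated microphone response R(t) (circular convolution for simplicity).
r = np.fft.irfft(np.fft.rfft(s) * np.fft.rfft(h), n=N)

# I(t) = F⁻¹{F[R(t)]/F[S(t)]}, with a tiny regularizer in the division.
S, R = np.fft.rfft(s), np.fft.rfft(r)
I = np.fft.irfft(R * np.conj(S) / (np.abs(S)**2 + 1e-9), n=N)
```

The recovered I(t) shows the direct-sound spike and the echo spike, just as a domed roof would show up as an undesirable spike in a real hall’s impulse response.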

Typically, the effect of a venue renovation is simulated in a scale model of about 1/20 size (Figure 4, right). The acoustic wavelengths also must be reduced by the same factor of 20, so the test frequencies are 20 times higher than those in the full-size hall. Even audience members are modeled in wood and fabric since they absorb a significant portion of the sound. The model can be validated by comparing the shape of its impulse response with that measured in the actual venue. Renovations then are optimized in the model to provide the best possible impulse response before work on the actual venue begins.

A data acquisition system for acoustic engineering work of this type includes at least one digital-to-analog channel to generate the stimulus and four analog-to-digital (A/D) channels to accommodate multiple microphones. Each A/D channel must be sampled simultaneously at a 1-MS/s rate with at least 400 kS of memory per channel. By using an arbitrary waveform generator, such a system can provide identical, repetitive chirps. The responses to the chirps can be averaged to improve the signal-to-noise ratio.
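A quick numerical sketch of why averaging repeated chirps helps: with M records containing the same response plus uncorrelated noise, the noise in the average falls by roughly √M. The signal and noise levels below are arbitrary stand-ins, not measured values.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 4000, 64                  # samples per record, repeated chirps

# Stand-in for the repeatable microphone response.
clean = np.sin(2 * np.pi * 5 * np.arange(N) / N)

# Each repetition sees the same response plus fresh measurement noise.
records = clean + rng.normal(0.0, 0.5, size=(M, N))
avg = records.mean(axis=0)

noise_single = np.std(records[0] - clean)   # about 0.5
noise_avg = np.std(avg - clean)             # about 0.5 / sqrt(64) = 0.0625
```

This is why an arbitrary waveform generator matters: the chirps must be identical from shot to shot, or the “signal” itself varies and averaging no longer cleanly separates it from the noise.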

Modeling the Bay

Even larger than a concert hall or a tank firing range is the 1,600 sq mi area of San Francisco Bay and the adjoining Sacramento/San Joaquin Delta. The Army Corps of Engineers built a detailed hydraulic model of the entire estuary to test the 1950s Reber plan that proposed damming part of San Francisco Bay for use as a fresh-water reservoir. The model, although scaled 1,000:1 in the horizontal plane and 100:1 in the vertical direction, still covers more than 13,000 sq ft and is housed in a warehouse in Sausalito near San Francisco (Figure 5, right).

Cal-Bay Systems had the job of replacing a 1980s vintage computerized system that provided only partial control because of memory limitations. The company designed a PC-based data acquisition and control system to meet present-day requirements for the model:

  • High channel count to accommodate more than 150 sensors.
  • Implementation of fully automated tide control.
  • Data presentation to the entire Army Corps of Engineers via the Internet.

Dave Weisberg, a project manager with Cal-Bay Systems, said, “The data acquisition portion of the application uses eight 32-channel multiplexers, allowing a total of 256 channels of analog input. This leaves plenty of room for future expansion. All 256 channels are scanned every two seconds. Data is scaled using third-order equations generated by calibration software.
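The third-order scaling Mr. Weisberg mentions can be sketched as a polynomial fit of raw readings against reference values. The calibration points below are invented for illustration; only the cubic fit itself comes from his description.

```python
import numpy as np

# Hypothetical calibration data: raw sensor voltages vs. reference levels.
raw_volts = np.array([0.10, 0.55, 1.02, 1.48, 1.95, 2.40, 2.88])
ref_level = np.array([0.00, 0.51, 1.04, 1.49, 2.02, 2.47, 3.01])  # ft, assumed

# Fit the third-order scaling equation for this channel.
coeffs = np.polyfit(raw_volts, ref_level, deg=3)

def scale(v):
    """Convert a raw reading to engineering units via the calibration fit."""
    return np.polyval(coeffs, v)
```

In the model, the laptops would perform a fit like this at each sensor and send the coefficients back to the central server over the wireless Ethernet link.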

“Laptop computers calibrate the sensors, feeding back data to the central server via wireless Ethernet,” he added. “A local area network integrates the entire model and allows each component to communicate its status and information.”

Controlling the tide was more difficult. A 75-hp pump provided saltwater at a constant rate. Three servo-controlled valves in the Pacific Ocean part of the model opened more to simulate low tide and closed partially to simulate high tide. Because the model may be the only physical hydraulic system in the United States used to simulate precise tidal conditions—the tolerance for tide level at the Golden Gate Bridge is 0.01″—there was a lot to learn.

The outputs from three ultrasonic level sensors were averaged to determine water level. Frequency analysis of the wave motion in the model determined the harmonics of the oscillations. This information was used to help tune the proportional integral-differential (PID) tide control algorithm.
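A PID loop of the kind used for tide control can be sketched as follows. The gains and the first-order “tide basin” plant are illustrative, not the values tuned for the Bay model.

```python
# Minimal discrete PID controller (gains are illustrative assumptions).
def make_pid(kp, ki, kd, dt):
    state = {"integral": 0.0, "prev_err": None}
    def step(setpoint, measured):
        err = setpoint - measured
        state["integral"] += err * dt
        deriv = 0.0 if state["prev_err"] is None else (err - state["prev_err"]) / dt
        state["prev_err"] = err
        return kp * err + ki * state["integral"] + kd * deriv
    return step

# Toy first-order plant: valve opening raises the level, which also drains.
level = 0.0
pid = make_pid(kp=2.0, ki=0.5, kd=0.1, dt=1.0)
for _ in range(200):
    valve = pid(setpoint=1.0, measured=level)
    level += 0.2 * valve - 0.05 * level   # crude basin response
```

Averaging the three level sensors before feeding the loop, as the model does, keeps single-sensor noise from appearing in the derivative term.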

Figure 6 (right) shows an example of an early test display provided by the PID Toolkit from National Instruments (NI), here presenting a desired sinusoidal variation in water level, the position of the servo valves (high for open and low for closed), and the resulting measured water level. Mr. Weisberg confirmed that neither the maximum filling nor the maximum emptying flow rate was sufficient to allow tight dynamic control of the level, which is why the valve-control signal was limiting. Although the error was within acceptable limits, the hydraulic system subsequently was modified to improve performance.

Because of the model’s history, a manual valve, half open, was shunting the 75-hp pump to divert part of its output. Cal-Bay replaced it with a servo-controllable valve. This meant that twice the flow rate was available for filling. Similarly, because all of the pump’s output now could be returned to its sump, the three draining valves could empty the model much more quickly.

A recent use of the model, which is open to the public, was to determine the need for rebuilding the Sacramento levees destroyed by El Niño.

Summary

Although you probably won’t have the opportunity to work with physically large data acquisition and control systems such as the firing range or the San Francisco Bay model, it’s useful to discuss the tools their designers found helpful.

Both of these systems used NI’s LabVIEW. Cal-Bay used NI’s FieldPoint modules to control river flows in the Sacramento delta area of the model and the PID Toolkit for tide control. The firing-range project had no control element associated with it, since data acquisition was the only requirement. Both applications used wireless links: the firing range for weather-station telemetry and the Bay model for communicating calibration data from portable PCs to the central server.

In the free-fall lifeboat tests, the data acquisition system itself had to be rugged. Dr. Nelson said that he either cushions the PC with a large piece of foam and tapes the assembly to a boat seat or he holds the PC on his lap if he is taking part in the test.

A Data Translation DT7102 board is used for the lifeboat test along with three sets of three accelerometers with ±25g working ranges. The aggregate sample rate is about 18 kHz, or 2 kHz for each of the nine sensors.

Of course, you may not be part of the experiment yourself like Dr. Nelson, and you may not be a systems integrator like Cal-Bay or B & B Technologies. If, in addition, you cannot determine your own data acquisition needs as Kirkegaard & Associates did, discuss your application with a number of data acquisition system vendors. Application engineers, application notes, and relevant articles are good sources of information.

In the four applications, data acquisition speed and capacity, control capability, and report-generation flexibility were required. In addition, signal noise, sample rates, crosstalk, and many other technical aspects had to be addressed. Big applications, such as the ones presented here, also may need to access the Internet, accept inputs from and be able to calibrate hundreds of sensors, and incorporate wireless communications links because of the large distances involved or the need for portability.

Consider all aspects of your data acquisition and control application when determining how best to approach its solution. Plan for anticipated changes, address known problems if you are upgrading an existing installation, and don’t neglect the project’s physical size. Try to determine the effect of a consistent software development environment on subsequent system maintenance. The approach you ultimately adopt will be a compromise, but by planning for all the foreseeable difficulties, few major surprises will be lurking at later stages of the project.


Published by EE-Evaluation Engineering
All contents © 1999 Nelson Publishing Inc.
No reprint, distribution, or reuse in any medium is permitted
without the express written consent of the publisher.

November 1999
