Never Enough Data and How to Handle More

Today, all automotive companies must generate, analyze, and archive vast amounts of test data. The data is used to justify critical decisions made at the design stage, particularly those involving safety-related issues.

Federal law requires that occupant safety (OS) testing be conducted successfully before a vehicle can be sold in the United States. Other countries such as Japan and Canada also require confirmation that the design meets the current standards.

The Federal Motor Vehicle Safety Standards are set by the Department of Transportation and administered by the National Highway Traffic Safety Administration. The tests range from those involving small, individual components such as air bags or inflators to one or more complete vehicles. Components for fuel systems, brakes, and safety are among the items tested.

The OS-related tests deal with seatbelts, instrument panels, seats, and other interior trim parts that may interact with the air bags or occupants. Major vehicle tests include barrier crash, sled, rollover, and car-to-car impacts.

All of these tests can cost many thousands of dollars and require significant setup time and effort. Companies especially do not want to repeat costly testing because of lost data, particularly when the equipment under test is a one-of-a-kind engineering prototype. In addition, corporate legal departments now require test data to remain available for many years should it be needed as evidence in litigation.

An Example in Progress

A large automotive company recently decided to upgrade a significant portion of an OS test facility and laboratory. This included retiring the older, but rugged, film-based cameras and replacing them with state-of-the-art high-speed digital imagers.

A major requirement driving the decision was the need to make test results available more quickly to design engineers. The company wanted to shorten the delay after completion of a test from the several days associated with film-based cameras to minutes.

The pertinent visual and sensor data could be distributed electronically to personnel who needed to see it, no matter where in the world they were located. This also would permit the test engineer to make immediate changes to the next test in what could be a series of tests to correct problems or to make the test more appropriate.

Data integrity is paramount. The test data must be safely and securely archived, and the raw data cannot be modified, only copied and analyzed.

One key concern was management of the large amounts of image and sensor data collected, a situation frequently encountered in today's complex test environment. Photographic data alone can run to many megabytes per camera per test. In a typical OS test facility, there could be anywhere from 1 to 20 cameras or more, each recording from a few hundred to a thousand or more images or frames.

To get a better idea of the volume of imaging data that must be managed, consider a barrier crash case where 15 imagers are used. Each imager may record as many as 400 frames during a test. An example of a popular digital imager is the high-speed Roper MASD HG2000 Camera, which records images at up to 2,000 frames per second.

At the end of a test, the images are downloaded to a PC either in a proprietary compressed format as a file of about 192 kB or in a standard TIFF format as a 576-kB file. The compressed-format images are not directly viewable, but they require just one-third the storage of the same image saved as a viewable color TIFF.

The amount of data generated is summarized in Table 1 (see June 2001 issue of Evaluation Engineering). A typical test could generate more than 1 GB of image data. In addition, another 8 MB or more comprise general test information and sensor data generated and captured by a multichannel, high-speed data acquisition system. General information includes items such as test-header or description data, test run number, test series ID, car model year, vehicle model and options, vehicle serial number, type of test, and comments.

Performing just one such test per workday will generate around 255 GB of data a year that must be securely archived. In addition, design engineers will want to create annotated analyses of specific images and, perhaps, audio video interleave (AVI) formatted movie loops for distribution. Because raw data cannot be modified, all the processed data also must be saved separately, further increasing data storage.
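As a rough check on these figures, the arithmetic can be reproduced in a few lines of Python. The camera count, frame count, per-image file sizes, and per-test sensor-data allowance are the values quoted above; the 250-workday year is an approximation, so the yearly total lands in the same range as, rather than exactly on, the 255-GB estimate.

```python
# Back-of-the-envelope check of the storage figures quoted in the article.
# All inputs are the article's numbers; nothing here is measured data.

cameras_per_test = 15        # imagers in the barrier-crash example
frames_per_camera = 400      # frames recorded by each imager during a test
compressed_frame_kb = 192    # proprietary compressed format, per image
tiff_frame_kb = 576          # viewable color TIFF, per image
sensor_data_mb = 8           # general test information plus sensor data
tests_per_year = 250         # roughly one test per workday (approximation)

compressed_gb = cameras_per_test * frames_per_camera * compressed_frame_kb / 1024 / 1024
tiff_gb = cameras_per_test * frames_per_camera * tiff_frame_kb / 1024 / 1024
per_test_gb = compressed_gb + sensor_data_mb / 1024
yearly_gb = per_test_gb * tests_per_year

print(f"Image data per test (compressed): {compressed_gb:.2f} GB")  # about 1.1 GB
print(f"Image data per test (TIFF):       {tiff_gb:.2f} GB")        # about 3.3 GB
print(f"Total data per test:              {per_test_gb:.2f} GB")
print(f"Data per year:                    {yearly_gb:.0f} GB")      # roughly 275 GB
```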

System Specification

Based on experience with its current system, the car company determined that a custom test system was needed to handle the new requirements. The proposal developed by Microsys specified a configurable but tightly integrated system including an easy-to-use operator interface that simplified the control, use, and location of multiple high-speed cameras.

Many configuration issues, operational in nature and specific to this car company, were incorporated into the specification. For example, one key requirement was the need to see results quickly at the conclusion of a test. Much concern was expressed about correct and secure handling of these images, which would be costly, if not impossible, to recreate.

The resulting specification called for a web-accessible database of camera profiles to track camera-configuration data. The database would include details such as serial number, location, status, current lens information, and usage.

In addition, the specification called for a separate web-accessible database to store existing or newly created profiles for various tests or applications. This allows for quick and easy imager setups by less skilled staff and ensures that the right camera is used in the right spot with the right settings.
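To make the profile concept concrete, the sketch below shows one way a camera profile and a test profile could be represented. The field names follow the details listed above; the record layout and example values are illustrative assumptions, not the facility's actual database design.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CameraProfile:
    serial_number: str     # unique imager identifier
    location: str          # where the camera is mounted in the facility
    status: str            # e.g., "available", "in use", "out for service"
    lens: str              # current lens fitted to the imager
    usage_hours: float     # cumulative usage, useful for maintenance tracking

@dataclass
class TestProfile:
    profile_id: str                  # name of a saved test or application setup
    test_type: str                   # e.g., "barrier crash", "sled", "air-bag burst"
    cameras: List[CameraProfile] = field(default_factory=list)
    frame_rate_fps: int = 1000       # requested recording speed
    frames_to_record: int = 400      # frames captured per camera

# A saved profile lets less experienced staff recreate a previous setup exactly:
sled_profile = TestProfile(
    profile_id="SLED-STD-01",        # hypothetical profile name
    test_type="sled",
    cameras=[CameraProfile("HG2000-0042", "sled track, driver side",
                           "available", "25 mm", 126.5)],
)
```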

The capability to easily recreate the setup for a previous test was very exciting to the setup staff and meant that photographic experts weren't required as often. The new system could capture high-speed video from the many imagers situated throughout the facility, even from cameras mounted on board the test vehicle itself.

To simplify data handling, Microsys developed a plan to track all test data using pointers, stored in an Oracle database, that identify where the data is physically stored or archived. The car company's Information System/Information Technology (IS/IT) group still would be responsible for system maintenance and security.
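The pointer idea can be sketched as follows. The article says only that pointers held in an Oracle database track where each test's data physically resides; the record layout, field names, and example paths below are assumptions made for illustration.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TestDataPointer:
    test_id: str                          # unique test run number
    lab_server_path: Optional[str]        # temporary copy for the test engineer
    corporate_server_path: Optional[str]  # password-protected global copy
    archive_volume: Optional[str] = None  # offline media ID once the data is archived

# Applications never guess where files live; they ask the tracking table.
registry = {
    "T-014732": TestDataPointer("T-014732",
                                "/lab1/tests/T-014732",
                                "//corp-srv/osdata/T-014732"),
}

def locate(test_id: str) -> TestDataPointer:
    return registry[test_id]
```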

Software would control the setup and acquisition of all imager and sensor data and determine where to send the data. The data was to be stored locally on a lab server for direct access by the test engineer and as a temporary backup. The data also would be sent to a corporate server for global, password-protected access.

Without any operator interaction, the digital images, perhaps exceeding 1 GB in total, are downloaded automatically and sent to a secure corporate database server. As soon as this download is complete, the design engineer is notified by e-mail that the data is available for viewing and analysis. Within minutes of the end of a test, the test engineer can view results to ensure that all the cameras caught the right detail and were not hampered by poor light or obstructions. If necessary, the test engineer then can make changes for the next test in the series.
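The sequence just described, keep a local copy, push the full data set to the corporate server, then notify the engineer, might look like the sketch below. The function names, paths, and notification step are placeholders for illustration; the actual acquisition software is not described in the article.

```python
import shutil
from pathlib import Path

def notify_design_engineer(address: str, test_id: str) -> None:
    # Placeholder for the e-mail notification step.
    print(f"Mail to {address}: data for test {test_id} is ready for viewing.")

def publish_test_data(test_id: str, acquisition_dir: Path,
                      lab_server: Path, corporate_server: Path,
                      engineer_email: str) -> None:
    # 1. Keep a copy on the lab server for direct access and as a temporary backup.
    shutil.copytree(acquisition_dir, lab_server / test_id, dirs_exist_ok=True)
    # 2. Send the full data set (possibly more than 1 GB) to the corporate server.
    shutil.copytree(acquisition_dir, corporate_server / test_id, dirs_exist_ok=True)
    # 3. Only after the upload completes, tell the design engineer it is available.
    notify_design_engineer(engineer_email, test_id)
```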

A key to solving the storage problem was a well-defined file structure that could be searched. For this system, the directory structure was keyed to test ID numbers, and data could be searched across multiple fields such as model year and test type.
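The article does not give the exact layout, but a directory scheme keyed to the test ID yet still searchable by fields such as model year and test type might look like this hypothetical example.

```python
from pathlib import Path

def test_data_dir(root: str, model_year: int, test_type: str, test_id: str) -> Path:
    # Hypothetical naming scheme, e.g., /osdata/2002/barrier_crash/T-014732
    return Path(root) / str(model_year) / test_type.replace(" ", "_") / test_id

print(test_data_dir("/osdata", 2002, "barrier crash", "T-014732"))
```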

To minimize storage needs, only the last 12 months of data would be kept online. Older data still would be available upon request.

Once test data was archived, the IS/IT group would update a simple table to ensure that the data still could be tracked, located, and accessed. Another novel approach was to make results available on the web via the intranet so that internal personnel or external password-enabled customers could search for and see the data from their test as soon as reports were available.

For engineering convenience, a post-acquisition analysis and viewing tool would be installed in the labs and on multiple or networked workstations. Once images have been viewed and analyzed, the design engineer could easily create and distribute movie loops in the form of AVI files that show the visual results on which management or the legal staff may base their decisions. The system specification also was written to permit design engineers to make simple linear measurements directly from the displayed video images.

The system proposal was targeted at critical OS applications to accommodate testing of most automotive OS interior components. These tests included barrier crash, pendulum, and sled (crash simulation) tests as well as specific tests of static air-bag deployment and burst, gas tanks, head impact, and bumpers.

The proposed system could be easily adapted for other types of tests by making the appropriate hardware changes and loading the corresponding test profile. For example, a lab might use the system to perform a static air-bag deployment test and then use the same system to perform a pressure-vessel test. Reusing hardware and software would make the system very cost-effective.

Summary

At this point, the program is underway to install all the cameras and software. The car company’s design and test staffs look forward to seeing results immediately. Wide distribution of AVI files will support sound decisions and allow the company’s OS test program to move forward with confidence.

About the Author

Bryan Webb is the marketing manager at Microsys Technologies. Since the 1970s, he has worked in the test and measurement markets for companies such as National Instruments, Fluke, Gould, and Wavetek. Mr. Webb earned a degree in electrical engineering from the University of Waterloo and is a graduate of the Canadian Institute of Management program at York University. Microsys Technologies, 3710 Nashua Dr., Unit 1, Mississauga, ON L4V 1M5 Canada, e-mail: [email protected].


Published by EE-Evaluation Engineering
All contents © 2001 Nelson Publishing Inc.
No reprint, distribution, or reuse in any medium is permitted
without the express written consent of the publisher.

June 2001
