The data acquisition business is rife with age discrimination. No, not between companies and employees, but among users of dataloggers and recorders. Depending on a user’s age and recent test-equipment experience, the perception of instrument capabilities may be very wrong.
Oscillographic recorders traditionally acquired analog waveforms by writing them directly to a paper chart. For many years, the choice of ink or thermal writing systems was a continuing debate. Before that, rectilinear writing was a major advance because it eliminated the need for specially printed curved-line chart paper.
More recently, fast sampling with semiconductor memory and the inclusion of hard disk drives offering massive amounts of storage have changed the role of recorders. Some of these products are called data recorders, although this term also can apply to magnetic tape-based recorders. The high sampling rate combined with solid-state memory is key to transient recorder operation.
Similarly, dataloggers no longer only print long lists of measurement values on a strip chart. They still can do this, but more often a datalogger includes a temporary storage capability that retains measurement values until they can be transferred to a PC. Low-power memory chips have made possible very small form factors and unattended, remote operation for weeks or months.
Today, the availability of low-cost, powerful microprocessors and large memory chips has further confused recorder and datalogger identities.
- When does a recorder become a data acquisition system with a display device? There is a class of data acquisition systems with displays called paperless recorders by the industries that use them.
- If a data acquisition system uses software that displays signals as scrolling waveforms, is it a recorder? Many data acquisition systems display waveforms as oscillographs, but that capability doesn’t make them oscilloscopes.
- How does a real-time or post-acquisition analysis capability affect product definition?
Data sharing and online custom analysis are two major trends that Ryan Wynn, a data acquisition product manager at National Instruments, currently sees as defining the role of a datalogger. “Data sharing allows users to view and distribute logged data via communications methods such as Ethernet or RS-485. PC-based dataloggers take this technology a step further by providing the capability to immediately view data as it’s acquired as well as online custom analysis tools.
“Online analysis saves users time and money because decisions can be made based on immediate analysis of data as it’s acquired,” he explained. “This capability is very important for monitoring and test applications that last for extended periods of time, where decisions must be made prior to completion of the application.”
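The idea behind online analysis can be sketched in a few lines: statistics are updated and limits are checked as each sample arrives, so a decision can be made long before the run completes. The sample values and the alarm limits below are illustrative assumptions, not values from any particular system.

```python
# Hypothetical sketch of online analysis during acquisition: each incoming
# sample updates a running mean and is checked against limits immediately,
# instead of waiting for the test to finish. Limits here are assumptions.

def online_monitor(samples, low=4.0, high=20.0):
    """Yield (index, running_mean, alarm) as each sample arrives."""
    total = 0.0
    for i, x in enumerate(samples, start=1):
        total += x
        alarm = not (low <= x <= high)   # decide now, not after the run
        yield i, total / i, alarm

# The out-of-limit reading of 25.0 raises an alarm the moment it is seen.
events = list(online_monitor([10.0, 12.0, 25.0, 11.0]))
```

In a long-duration monitoring run, the alarm flag is what lets an operator act before the application completes, which is the benefit Mr. Wynn describes.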
Another trend includes sufficient analysis capability so the datalogger can determine which information to retain and which to discard. The very low cost of large amounts of semiconductor memory has facilitated long-time, high-resolution data recording. “But,” as Mark Albert, the sales and marketing manager at Logic Beach, said, “the cost per byte stored is insignificant when compared to the cost of collecting and analyzing unnecessary data that only was stored because it could be. The need is to collect the right data for a specific application.
“This is when a degree of intelligence in the datalogging instrument is important,” he continued. “Conditional and intelligent datalogging scenarios executing within the instrument can reduce the amount of data recorded while delivering critical information for evaluation.”
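One common form of the conditional logging Mr. Albert describes is triggered capture: the instrument keeps a short pre-trigger history in a ring buffer and commits samples to storage only around an event. The sketch below is an assumed illustration of the technique, not any vendor's firmware; the threshold and buffer depth are arbitrary.

```python
from collections import deque

# Illustrative triggered-logging sketch: record only an event plus a few
# samples of context before it, discarding the steady-state data that would
# otherwise fill storage. Threshold and pretrigger depth are assumptions.

def triggered_log(samples, threshold, pretrigger=3):
    history = deque(maxlen=pretrigger)   # ring buffer of recent samples
    stored = []
    for x in samples:
        if abs(x) > threshold:
            stored.extend(history)       # keep the context before the event
            history.clear()
            stored.append(x)             # and the event itself
        else:
            history.append(x)
    return stored

data = [0.1, 0.2, 0.1, 0.3, 5.0, 0.2]
captured = triggered_log(data, threshold=1.0)
```

Of six samples, only the 5.0 transient and its three preceding values are stored, which is exactly the data-reduction-without-information-loss trade-off being described.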
Another view of the PC-to-datalogger relationship was provided by Tom DeSantis, president of IOtech. Until recently, dataloggers downloaded data to a PC via slow RS-232 or IEEE 488 interfaces. It made sense for dataloggers to be stand-alone instruments because of the data transfer speed and physical proximity restrictions imposed by these buses.
“New measurement hardware having Ethernet as the indigenous interface overcomes both speed and distance restrictions,” Mr. DeSantis said. “[If the datalogger is tethered to a PC via Ethernet,] the PC has immediate access to acquired data and, in the event of an alarm condition, can initiate some action. Also, it is inherently less expensive to leverage the data storage capabilities already available in the PC vs. replicating them in an external box. In the near future, except for field applications, all dataloggers will have an Ethernet link to a network or PC.”
This theme was elaborated on by Bruce Fuller, a product marketing manager at Fluke Precision. “We see Ethernet and TCP/IP as the de facto standards for transferring information from data acquisition systems to virtually anywhere. For example, when sending data to servers, PC desktops, or the web, these methods and protocols can achieve the universal data access that data acquisition system technicians always have desired.”
The adoption of Ethernet communications is only one aspect of a broader shift from proprietary platforms to commercial-off-the-shelf (COTS) hardware, according to Richard Renck, vice president of engineering at R.C. Electronics. Hard disk drives as well as Ethernet protocols and hardware provide opportunities for recorders and dataloggers even though they were developed for the computer and communications markets.
During the period of change from paper to various types of electronic storage, the capabilities of widespread data sharing, custom analysis, and real-time data reduction have developed. However, some of the advantages of paper have been difficult to replicate in all-electronic systems.
For example, data security isn’t a high priority for elevator companies that use rugged, portable, paper-based recorders when commissioning new or refurbished installations. In pharmaceutical companies, however, for many years original paper recordings of drug test results were required as evidence should problems such as dangerous side effects arise in the future.
Changing from a paper-based system to an all-electronic one requires an assessment of the importance of data protection. Some means of ensuring that original data files cannot be altered are mandatory in many businesses. Koji Komatsu, a recorder and data acquisition product marketing manager at Yokogawa, commented that “electronic data entails a concern for data tampering, which is not the case for data recorded on paper.
“In relation to this risk, protective provisions can enhance data security,” he explained. “These include restricting the level of operator access by requiring a log-in name and password, encrypting the data into an unchangeable format, and changing the color of the modified electronic data so that tampering is readily detectable.”
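One widely used way to make electronic records tamper-evident, sketched below purely for illustration (it is not necessarily Yokogawa's mechanism), is to chain each record to a cryptographic hash of everything before it. Altering any earlier record then changes every later digest, so tampering is detectable on verification.

```python
import hashlib

# Illustrative hash-chain sketch for tamper-evident logging: each record's
# digest covers the previous digest, so editing any record breaks the chain.

def chain_records(records):
    digest = b""
    chained = []
    for rec in records:
        digest = hashlib.sha256(digest + rec.encode()).digest()
        chained.append((rec, digest.hex()))
    return chained

def verify(chained):
    digest = b""
    for rec, h in chained:
        digest = hashlib.sha256(digest + rec.encode()).digest()
        if digest.hex() != h:
            return False
    return True

log = chain_records(["2002-05-01 21.4C", "2002-05-01 21.6C"])
```

This gives electronic storage a property paper had by default: an altered original is visibly different from the record as first written.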
Working with electronically stored data also requires different methods of data retrieval and analysis. Retrieval is easy enough: simply select the proper data file. Having the results of many tests readily available in a form that can be shared is a major advantage of electronic storage.
On the other hand, having selected the correct data file does not mean that a specific event is easily accessible. For example, trained operators using paper-based recording could quickly pick out abnormal EKG signals. In this case, expert software has been developed to mimic the screening action performed by a cardiologist. Abnormal waveforms can be highlighted automatically, and the reviewing doctor need only confirm the program's findings.
Transferring and Searching
In many other recording application areas, however, it is much more difficult to describe what constitutes a faulty waveform. Automatic data mask or template-matching functions can compare events to their expected parameters very quickly, but only if suspected artifacts can be described mathematically or graphically.
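The simplest form of such a mask test can be sketched as follows. This is an assumed, minimal implementation of the general technique, not any vendor's algorithm: an event passes if every sample lies within a tolerance band around the expected template.

```python
# Minimal template-matching sketch (assumed implementation): an event matches
# if each of its samples is within `tolerance` of the expected template value.

def matches_template(event, template, tolerance):
    return len(event) == len(template) and all(
        abs(e - t) <= tolerance for e, t in zip(event, template)
    )

template = [0.0, 1.0, 0.0, -1.0]            # expected waveform shape
good = [0.05, 0.95, -0.02, -1.04]           # within tolerance everywhere
bad = [0.05, 0.40, -0.02, -1.04]            # distorted peak fails the mask
```

The hard part in practice is not this comparison but, as the text notes, producing the template and tolerances that describe a fault in the first place.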
Part of the problem of data review has been reduced by faster microprocessors. A few years ago, it could easily take hours to search through hundreds of megabytes of data to find matching, or nearly matching, events. As a result of ongoing microprocessor speed improvements, searching based on what-if scenarios has become more practical.
“As a byproduct of network provisioning, we have seen a sharp reduction in the time required to transfer a large volume of stored data because of Ethernet’s high-speed capability,” added Yokogawa’s Mr. Komatsu. “A data transfer that took several hours with conventional GPIB or RS-232 now can be achieved in several minutes of processing.”
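A back-of-the-envelope calculation supports the hours-to-minutes speedup Mr. Komatsu describes. Using nominal line rates (an assumption; real throughput is lower once protocol overhead is counted), transferring 1 GB over a 115.2-kb/s serial link takes most of a day, while 100-Mb/s Ethernet needs on the order of minutes.

```python
# Rough transfer-time comparison at nominal line rates (assumptions; actual
# sustained throughput is lower for both links).

BITS = 1e9 * 8                        # 1 GB of data, in bits
rs232_hours = BITS / 115_200 / 3600   # 115.2-kb/s serial link
enet_minutes = BITS / 100e6 / 60      # 100-Mb/s Ethernet
```

At these rates the serial transfer takes roughly 19 hours versus under two minutes for Ethernet, consistent with the "several hours" to "several minutes" comparison in the quote.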
The falling price of data storage emphasizes the need for fast file transfer and data search capabilities because it now is feasible to acquire truly huge amounts of data from multichannel tests. “The cost per unit of commercial data storage drops by an order of magnitude every three to five years,” said R.C. Electronics’ Mr. Renck.
“The cost of a 4-GB SCSI data drive in 1998 is on a parity with the current cost of an ultra SCSI 72-GB drive that offers 20× capacity at an 8× speed improvement,” he continued. “A data storage capacity of 35 GB available in Sony’s AIT-1 8-mm tape format in 1997 has been replaced by the AIT-3 tape with 100-GB capacity and 12-MB/s sustained transfer rate.”
How the acquired data is presented also affects the rate at which a user can understand the information. Jay Roberts, a product manager at Gould Nicolet Technologies, commented on the operation of the company’s new ViewGraf recorder. “Lower resolution display data based on max-min pairs is displayed when you open a file. As you zoom in to finer details, the software opens only the portion of the raw data required to fit the display. This technique lets gigabyte-sized recording files be opened in only seconds, plus the user enjoys much faster and more responsive zooming/scrolling cursor movement and measurements.”
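The max-min display technique Mr. Roberts describes can be sketched generically (this is the general method, not ViewGraf's internals): each screen column is drawn from the extremes of its bucket of raw samples, so narrow peaks remain visible even when millions of samples are compressed into a few hundred pixels.

```python
# Generic min-max decimation sketch: reduce a long sample record to one
# (min, max) pair per display column so no peak is lost at low zoom.

def minmax_decimate(samples, columns):
    size = max(1, len(samples) // columns)
    pairs = []
    for i in range(0, len(samples), size):
        bucket = samples[i:i + size]
        pairs.append((min(bucket), max(bucket)))
    return pairs

wave = [0, 3, -2, 1, 9, -1, 0, 2]
display = minmax_decimate(wave, 4)       # four columns for eight samples
```

Because only the decimated pairs, plus the raw data for the currently zoomed window, ever need to be read from disk, very large files open quickly.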
Configurations and Capabilities
An example that demonstrates many of the opportunities afforded by today’s instruments is the application of networked dataloggers to power-plant exhaust-gas monitoring. Regulatory laws require the opacity of the exhaust gas to be logged. In addition, for control purposes, it is necessary to measure the water flow through the stack scrubbers and the temperature and fuel flow to the boiler.
A distributed data acquisition system was chosen rather than a centralized system with distributed sensors. “Programmable logic controllers (PLCs) often are networked to a supervisory control and data acquisition system for routine control but maintain the capability to run critical system controls if communications are lost. In the same way, a distributed, networked data acquisition system can assure loss-free data logging,” said National Instruments’ Mr. Wynn.
“In this system, the data acquisition and logging portions are combined in each node of the network. These nodes still may be connected to a central computer for integration of multiple signals and to provide a human machine interface,” he continued, “but if communications are lost, each node can log data independently. Once the network connection is restored, the separate nodes can pass information back to the central computer.”
Datalogging is an important part of this application, but the central PC also changes the fuel-air mixture and the scrubber water flow to control the opacity of the exhaust gas. Although data must be logged to satisfy the regulatory-agency requirements, only data values outside an acceptable band are stored with the PC performing the real-time filtering.
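The real-time filtering step above amounts to a deadband test: only values outside the acceptable band are logged, along with their position in the record. The sketch below is an assumed illustration; the band limits and opacity values are invented for the example.

```python
# Illustrative deadband filter: log only samples outside the acceptable band,
# keeping each sample's index so every excursion is on the record.
# Band limits and data are assumptions, not regulatory values.

def deadband_filter(samples, low, high):
    return [(i, x) for i, x in enumerate(samples) if not (low <= x <= high)]

opacity = [12, 14, 31, 13, 35, 15]            # percent opacity, illustrative
excursions = deadband_filter(opacity, 0, 30)  # only the violations are stored
```

The regulatory record then captures every excursion without the steady-state readings that would otherwise dominate storage.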
In another application, the battery operation and small size of a temperature/humidity datalogger helped workers determine the cause of excess humidity in the Westfield, IN, public library. Kathy Donovan, a marketing analyst at Dickson, related details of the story as reported in the Indianapolis Star.
In September 2000, the library closed because toxic mold was discovered in the library walls. Consultants used Dickson’s portable TP120 Logger to pinpoint the areas of high moisture that typically would lead to mold growth. In this case, the problem resulted from incorrect construction methods used during a 1995 building expansion. According to the consultants, had the library monitored the humidity level, perhaps using a small instrument like the TP120, the mold outbreak likely could have been prevented.
Like other small, stand-alone dataloggers, the TP120 is intended to download data to a PC for archiving, display, and further analysis. However, many stand-alone instruments also have integral displays so the data can be reviewed as it is being logged.
Whether stand-alone or requiring a PC, there probably is a recorder/datalogger to suit your needs. When making your selection, don’t forget to fully define your instrument interfacing requirements. For example, in a fuel-cell development project, measuring the voltages developed by each cell is important. In this application, the test instrument’s many channels must accommodate large common-mode voltages while providing accurate measurements of the differential-mode voltage.
The output format also must be considered. Should the data be provided in a comma-separated variable form for entry into an Excel or Lotus 1-2-3 spreadsheet? Are you using an analysis program that requires input in some other format? In short, completely define your application, not just the amount of storage required and the sample rate.
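Producing the comma-separated form mentioned above is straightforward; the sketch below uses Python's standard csv module, with invented column names and readings.

```python
import csv
import io

# Minimal sketch of exporting logged readings as comma-separated values for
# import into a spreadsheet. Header names and data are illustrative.

def to_csv(rows, header=("time", "value")):
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(header)
    writer.writerows(rows)
    return buf.getvalue()

text = to_csv([("00:00", 21.4), ("00:01", 21.6)])
```

The point of the paragraph stands, though: pick the export format to match the analysis tool you already use, not the other way around.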
“It should go without saying that as a result of cheaper/faster/better microprocessors, more and more functionality can be crammed into a smaller package,” commented Rick Coleman, a Eurotherm product manager. “This has produced some great stand-alone, remote data acquisition/datalogger systems.”
Recorder and Datalogger Products
The stand-alone, four-channel OM-CP-QUADTEMP Temperature Datalogger takes and time stamps up to 24,575 thermocouple readings per channel, a total of 122,000 readings, at a rate selectable from 12 S/s to 1 S/day. External thermocouple accuracy is 0.1°C for types J, K, T, and E and 0.5°C for types R, S, and B. In addition, an internal sensor measures the ambient temperature to ±0.5°C from 0 to 50°C. The 3.2” × 4.5” × 1.1” battery-powered instrument features automatic cold-junction compensation and linearization. Data output is via an RS-232 connection to a PC COM port. $599. Omega Engineering
Tiny Temperature Logger
The six-channel Model TCChart Temperature Chart Recorder is the size of an extended type II PCMCIA card and handles thermocouple types B, E, J, K, N, R, S, T, Omega OS-88000-k-1200, and Exergen IRT/C_k-80. Sampling rates from 2 S/s to 0.5 S/s are user selectable. Features include a 0°C to 55°C operating temperature range, cold junction compensation, floating or ground referencing, open channel detection, and channel calibration. Included are Windows-compatible logging and graphic software and an API for user-developed applications. The 10-mA typical load current is drawn from the host computer. $895. Nomadics
Acquisition + Analysis
The no-mainframe EX Series Data Acquisition System is assembled by combining I.LINK interface modules and input, function, and recording modules to suit your requirements. From four to 192 channels can be provided with microphone, charge, DC strain, pulse, and thermocouple inputs; analog output, signal generator, and analysis DSP functions; and AIT tape drive and PC card hard disk drive recording modules. Features include an 8-Hz to 65,536-Hz sampling rate, 16-b and 24-b digitization, and the capability to physically separate groups of modules. PCscan IV software controls the system and supports analysis. Contact company for price. Sony
Published by EE-Evaluation Engineering
All contents © 2002 Nelson Publishing Inc.
No reprint, distribution, or reuse in any medium is permitted
without the express written consent of the publisher.