It takes more than a smart sensor

Robust, accurate, and low-cost sensors deployed in countless applications continue to improve our quality of life. If your car dates from 2007 or later, you don’t have to guess about tire pressure. The pressure sensor in each wheel automatically notifies the central tire pressure monitoring system (TPMS) if the pressure drops more than 25% below the recommended inflation level. 

Of at least equal concern is the possibility that your local pub may run out of your favorite brew: How would you know before setting out for an evening of imbibing? As described in a New York Times blog by Michael Roston, the SteadyServ company has come to the rescue. A wireless load cell under each keg keeps the barkeeper informed of the state of the container. The company suggests that this knowledge can be used to advantage in the form of tweets encouraging customers to hurry up and get what’s left.

These are only two examples of systems that rely on sensors and turn their raw data into actionable information via specialized processing. And, to be most useful, that information must be communicated in terms familiar to the user. The TPMS doesn’t list numerical pressures but instead shows the driver an intuitive tire-related symbol. Similarly, the beer keg system displays the remaining servings, not the actual keg weight.
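
As a rough illustration of that final step, the short sketch below turns raw readings into the kind of user-facing answers just described. The tare weight, serving size, and pressure values are assumptions for illustration only, not SteadyServ’s or any automaker’s actual parameters.

```python
def servings_remaining(gross_weight_kg, empty_keg_kg=13.5, serving_kg=0.47):
    """Convert a raw load-cell reading into remaining servings.

    The 13.5-kg tare weight and ~16-oz (0.47-kg) serving size are
    illustrative assumptions, not SteadyServ figures.
    """
    beer_kg = max(0.0, gross_weight_kg - empty_keg_kg)
    return int(beer_kg / serving_kg)

def tpms_warning(measured_kpa, recommended_kpa):
    """Flag a tire when pressure falls more than 25% below the recommended
    inflation level, the threshold cited above."""
    return measured_kpa < 0.75 * recommended_kpa

print(servings_remaining(42.0))    # a 42-kg keg -> about 60 servings left
print(tpms_warning(165.0, 230.0))  # True: more than 25% below recommended
```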

The SteadyServ website description of the company’s system shows how the raw sensor data is just the first step in a comprehensive application solution: “The SteadyServ iKeg system includes a mobile app that provides real-time information about your beer supply. You’ll know exactly how much you have, when it will run out, and how much you need to reorder. Plus, it provides crucial information about how quickly you are going through all of your beers, which are most popular or profitable, and which ones aren’t.

“The keg sensor collects and sends a host of information to… SteadyServ’s cloud-based software, where it is paired with other information, such as how much safety stock remains, the next delivery date, previous order information, past consumption trends, event information, beer consumption trends nearby, and even local weather forecasts which may impact patronage trends.”

Further sensor-based systems

Bed sores

If you had attended Dr. Pete Salgo’s presentation titled Smart Fabrics Changing How We Feel the World: Insight into Medical Application at the recent Sensors Expo, you would appreciate just how serious bed sores can be. According to Salgo, a medical doctor and professor at Columbia University, one million patients in the United States develop bed sores each year, and 30,000 die from related illnesses.

Bed-ridden patients who cannot reposition themselves are subject to prolonged pressure between parts of their anatomy and the bed’s mattress. Eventually, the affected skin breaks down, allowing infection to enter. To avoid this problem, the standard nursing approach has been to periodically turn patients to a different position. However, there are only so many positions, and bed sores still often develop.

Alleviating the high pressure could solve the problem. Robert Golden, CEO of PatienTech/Vista Medical, also participated in the presentation and described BodiTrak, the stretchable, fabric-based, large-area pressure sensor that his company developed together with Salgo.

Basically, it consists of top and bottom orthogonal grids of conductors separated by a semiconducting rubber layer. That’s a lot easier to say than to make, according to the speakers. The top and bottom grids are made of a stretchable fabric that comprises alternating conducting and insulating strips. When asked, Salgo described the rubber separator as semiconducting, although he may have meant resistive. In any event, when pressure is applied to the fabric/rubber sandwich, the location and magnitude of the pressure can be determined by scanning the X-Y conductors.
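
Neither speaker detailed the readout electronics, but the general scanning approach can be sketched as follows. The drive_row and read_column calls are hypothetical stand-ins for whatever driver hardware BodiTrak actually uses; the point is simply that energizing one grid axis at a time while reading the other yields a pressure map.

```python
def scan_pressure_map(drive_row, read_column, n_rows=16, n_cols=16):
    """Scan an X-Y grid of conductors separated by a pressure-sensitive layer.

    drive_row(i):   energize top-grid conductor i (hypothetical driver call).
    read_column(j): read the signal coupled into bottom-grid conductor j;
                    more pressure lowers the resistance of the rubber layer
                    at the (i, j) crossing, raising the reading.
    Returns an n_rows x n_cols map of raw readings.
    """
    pressure_map = []
    for i in range(n_rows):
        drive_row(i)
        pressure_map.append([read_column(j) for j in range(n_cols)])
    return pressure_map

def peak_pressure(pressure_map):
    """Locate the highest-pressure crossing, the input to any repositioning logic."""
    value, row, col = max((v, i, j) for i, r in enumerate(pressure_map)
                          for j, v in enumerate(r))
    return row, col, value
```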

Two of the more important applications using a distributed fabric sensor are medical mattresses and pillows. PatienTech’s PREVAIL responsive sleep technology (ReST) bed has numerous air-filled support cells and a top layer of the smart fabric. When high pressure is sensed in a location, the bed automatically adjusts the air pressure to redistribute the patient’s weight, as shown in Figure 1. The pillow uses a similar strategy but is intended for use by sleep apnea sufferers. In this case, changing the pressure in the air cells gently shifts the user’s head position, alleviating breathing restrictions. Another application similar to the mattress is a smart truck seat for long-distance drivers.

Figure 1. ReST smart bed performs pressure redistribution

Courtesy of PatienTech

For these applications, the actual sensor technology had to be developed—there were no suitable long-wearing, flexible, large-area pressure sensors. Of course, suitable algorithms also had to be designed to optimize pressure distribution across the mattress and seat and to effect subtle changes in head position for the sleep apnea pillow. Now that such a sensor exists, a number of additional uses are being explored. A golf training mat was mentioned as an example in which a golfer’s weight transfer could be measured as he stood on a pressure-sensing mat while swinging a club. Video and weight data could be correlated to identify problems with different parts of the swing.
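
The presenters did not disclose the control algorithm itself. A minimal sketch of the idea, using an assumed contact-pressure threshold near the commonly cited 32-mmHg (about 4.3-kPa) capillary-closure figure rather than any PatienTech specification, might soften the air cell under a sustained pressure peak and slightly stiffen its neighbors so the load spreads:

```python
def redistribute(cell_pressures_kpa, contact_peaks, threshold_kpa=4.3, step=0.5):
    """Adjust air-cell pressures when a sustained contact-pressure peak appears.

    cell_pressures_kpa: current air pressure in each support cell.
    contact_peaks:      peak skin-contact pressure sensed over each cell.
    threshold_kpa:      assumed risk threshold (~32 mmHg), not a product figure.
    Cells under a high peak are softened; their neighbors are stiffened a
    little so the patient's weight shifts onto them.
    """
    adjusted = list(cell_pressures_kpa)
    for i, peak in enumerate(contact_peaks):
        if peak > threshold_kpa:
            adjusted[i] = max(0.5, adjusted[i] - step)   # soften the hot spot
            for j in (i - 1, i + 1):                     # stiffen the neighbors
                if 0 <= j < len(adjusted):
                    adjusted[j] += step / 2.0
    return adjusted

print(redistribute([3.0, 3.0, 3.0, 3.0], [2.0, 6.1, 2.5, 1.8]))
```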

Helicopters

For those of us who only experience flying from the passenger compartment, it may seem obvious that a better radar system could enhance a pilot’s view while landing. To a degree, that’s correct, but for helicopters operating in extremely dusty environments, LiDAR is a much better bet for distances less than 200 meters.

The term LiDAR combines light and radar to describe a ranging system that operates using reflected laser light instead of radio waves. Typically, the laser wavelength is between 0.5 µm and 1.5 µm, and the obscuring dust particles are of a similar size, ranging up to about 10 µm. Because of its very narrow beam width, LiDAR has much better spatial resolution than radar. As explained during Dr. Philip Church’s 3D LiDAR Imaging in Obscurants presentation at the Sensors Expo, LiDAR makes a truck look like a truck while radar displays a blob. Dr. Church is vice president of sensors at Neptec Technologies.

On the other hand, except for a few frequency bands with high atmospheric signal absorption, radar generally performs well with obscurants such as dust, smoke, and snow. The initial concern was whether the obscurants would cause significant light scattering and hence low SNR in a LiDAR system. 

Radar has a very low scattering efficiency because the signal wavelength is so much bigger than the size of the obscuring particles. A 94-GHz radar (3.2-mm wavelength) used with 10-µm obscurant particle size results in a scattering efficiency of 1E-9. Even 100-µm particles increase scattering efficiency to only 1E-5. In contrast, 1.57-µm LiDAR has a scattering efficiency of about 2.0 with 10-µm obscurants.
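
Those figures follow from the ratio of particle size to wavelength. A rough calculation using the Mie size parameter, with a Rayleigh approximation for particles much smaller than the wavelength and the geometric-optics limit for much larger ones, reproduces the trend; the 1.55 refractive index assumed here for dust is an illustrative value, not a figure from the presentation.

```python
import math

def size_parameter(diameter_m, wavelength_m):
    """Mie size parameter x = pi * d / lambda."""
    return math.pi * diameter_m / wavelength_m

def scattering_efficiency(diameter_m, wavelength_m, m=1.55):
    """Rough scattering efficiency Q_s.

    Uses the Rayleigh approximation Q_s = (8/3) * x^4 * |(m^2-1)/(m^2+2)|^2
    when x << 1 and the large-particle (geometric-optics) limit Q_s ~ 2 when
    x >> 1; m = 1.55 is an assumed refractive index for mineral dust.
    """
    x = size_parameter(diameter_m, wavelength_m)
    if x < 0.3:                                   # wavelength >> particle size
        k = abs((m**2 - 1) / (m**2 + 2))**2
        return (8.0 / 3.0) * x**4 * k
    return 2.0                                    # particle >> wavelength

# 94-GHz radar (3.2-mm wavelength) vs. 1.57-um LiDAR, 10-um dust
print(scattering_efficiency(10e-6, 3.2e-3))   # on the order of 1E-9
print(scattering_efficiency(10e-6, 1.57e-6))  # about 2.0, as cited above
```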

Neptec designed a range of obscurant penetrating auto-synchronous LiDAR (OPAL) systems to deal with the problem. As described in a 2012 online article, “It’s a combination of a special time-of-flight laser scanner that has a clear mode and an obscurant mode and some special software that switches between these two modes,” explained Philip Church…. “As well as looking at the returns that we receive and what is an obscurant return and what’s the real target… it’s essentially waveform technology, but instead of solving for the multiple returns in the post processing, you’re doing it in real time.”1

Distinguishing between obscurant returns and real target returns is the job of “advanced real-time spatial and temporal filtering,” according to information presented at the Sensors Expo. In addition, Church said, the OPAL systems include “sensitive detection electronics with detector saturation mitigation.”
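
Neptec’s filtering algorithms are proprietary, so the sketch below is only a loose illustration of the temporal side of the idea: returns that persist in the same spatial cell across successive scans are treated as solid targets, while returns that drift, as airborne dust does, are discarded. The cell size and hit count are arbitrary assumptions.

```python
from collections import defaultdict

def persistent_points(scans, cell=0.25, min_hits=3):
    """Keep points that recur in the same spatial cell across several scans.

    scans:    list of scans, each a list of (x, y, z) returns in meters.
    cell:     voxel size used to bucket nearby returns.
    min_hits: scans that must hit a cell before it counts as a solid target.
    A loose illustration of temporal filtering, not Neptec's OPAL algorithm.
    """
    hits = defaultdict(int)
    for scan in scans:
        seen = set()
        for x, y, z in scan:
            key = (round(x / cell), round(y / cell), round(z / cell))
            if key not in seen:          # count each cell once per scan
                hits[key] += 1
                seen.add(key)
    keep = {k for k, n in hits.items() if n >= min_hits}
    latest = scans[-1]
    return [(x, y, z) for x, y, z in latest
            if (round(x / cell), round(y / cell), round(z / cell)) in keep]
```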

Extensive testing in well-controlled but artificial environments confirmed performance of the system in 10-µm fog and separately in 20-µm dust. In both cases, the OPAL system provided about twice the penetration range compared to naked-eye visibility when the dust or fog density was high. 

Field tests were performed using two approaches. In one test, the OPAL-360 panoramic LiDAR was at a static position on the ground, and a helicopter created an obscurant cloud by hovering over a heap of Arizona road dust positioned between the instrument and a group of targets. In the other scenario, the LiDAR was installed in a helicopter, and several containers of Arizona road dust were positioned along its landing path. As in the first test, targets were located beyond the dust containers. Figure 2a and Figure 2b show the before and after views of the targets, respectively.

Figure 2a. Raw data with dust points colored yellow

Figure 2b. Dust penetration and removal

Courtesy of Neptec Technologies

Smart storage bins

Bins are used to collect many different materials, ranging from waste oil and yellow grease to textiles in the form of used clothing deposited at local charity locations. SmartBin, a Dublin, Ireland-based intelligent remote monitoring solutions company, has combined the rugged UBi wireless ultrasonic sensor with the full-featured supporting SmartBin Live software platform. The result has been greatly improved efficiency for businesses with fixed or patterned operations, such as companies that manage large fleets of collection bins.

The UBi is mounted inside the bin under the lid and reports the container fill level (to a depth of 3 meters), temperature, geo-position, and tilt to the SmartBin Live platform. The sensor can be remotely configured to report these measurements once per day or more frequently as required.
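
The fill-level measurement itself is straightforward ultrasonic ranging: the sensor times an echo from the material surface and subtracts the resulting air gap from the bin depth. The sketch below shows the arithmetic; the speed-of-sound constant and the 5-cm sensor offset are illustrative assumptions, not UBi specifications.

```python
SPEED_OF_SOUND_M_S = 343.0   # at roughly 20 degrees C; varies with temperature

def fill_percent(echo_round_trip_s, bin_depth_m, sensor_offset_m=0.05):
    """Estimate fill level from an ultrasonic echo time.

    Distance to the material surface = speed * time / 2; the fill level is
    whatever portion of the bin depth is not air.  The offset and constant
    are illustrative, not UBi specifications.
    """
    distance_to_surface = SPEED_OF_SOUND_M_S * echo_round_trip_s / 2.0
    air_gap = max(0.0, distance_to_surface - sensor_offset_m)
    fill = max(0.0, min(1.0, 1.0 - air_gap / bin_depth_m))
    return 100.0 * fill

# Example: a 6-ms round trip in a 3-m-deep bin -> roughly two-thirds full
print(round(fill_percent(0.006, 3.0)))
```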

As explained by the company’s Brendan Walsh, sales and marketing director, “It is predominantly the fill-level data that is offering companies huge collection cost savings by doing away with the traditional ‘milk run’ way of servicing containers. Thanks to the SmartBin integrated route optimization capability, clients in 25 countries have optimized servicing of their container assets by only dealing with those containers that are full.”

Figure 3 is a typical SmartBin Live display with consolidated, color-coded bar graphs on the left showing the status of container groups at various sites. Using the Dublin North and Dublin South rows as examples, about half of the 25 containers at the north site are full and the remainder empty. However, of the 11 at the south site, two are part full, about half completely full, and the rest empty. The map on the right indicates locations with color-coded markers corresponding to sites that must be visited for collection, those that could be, and those that do not need to be.

Figure 3. SmartBin Live status display

Courtesy of SmartBin

Users can drill down to examine sensor performance data as well as greater detail about the statistics related to a particular site or bin. Walsh said, “Clients log in to the SmartBin Live online interface to view the fill-level of all container assets, generate their drivers’ routes for the day, and manage all container assets with a war-room style overview.” SmartBin Live includes a Plan My Route tab that calculates the most efficient collection route.
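
SmartBin has not published how Plan My Route works, but even a simple nearest-neighbor pass over the bins that have crossed a fill threshold conveys the idea; the coordinates, threshold, and bin names below are hypothetical.

```python
import math

def plan_route(depot, bins, fill_threshold=75.0):
    """Greedy nearest-neighbor route over bins that need collection.

    depot: (x, y) start location; bins: list of (name, x, y, fill_percent).
    Only bins at or above fill_threshold are visited.  A minimal illustration
    of route planning, not SmartBin's Plan My Route algorithm.
    """
    todo = [(name, x, y) for name, x, y, fill in bins if fill >= fill_threshold]
    route, here = [], depot
    while todo:
        nxt = min(todo, key=lambda b: math.dist(here, (b[1], b[2])))
        route.append(nxt[0])
        here = (nxt[1], nxt[2])
        todo.remove(nxt)
    return route

bins = [("A", 1.0, 2.0, 90.0), ("B", 4.0, 0.5, 40.0), ("C", 2.5, 3.0, 80.0)]
print(plan_route((0.0, 0.0), bins))   # -> ['A', 'C']; B is skipped at 40% full
```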

The UBi itself is a wireless cellular sensor powered by a 3.6-V C-cell lithium thionyl chloride battery giving up to 10 years of reporting life. Although only about 0.5 lb in weight, the UBi is protected by an IP66 weatherproof, noncorrosive HDPE shell that ensures it operates in the harshest of container environments. Fitted with an internal antenna, the UBi uses GPRS to report to the SmartBin Live platform over both 2G and 3G GSM networks. The data payload is delivered to the SmartBin cloud via cellular towers in the vicinity of the container.

Walsh concluded, “SmartBin is not simply a sensor but rather a comprehensive solution that has been built around the capabilities of the UBi sensor. It is the SmartBin Live web portal that processes and presents the measurement data clearly for SmartBin clients, using precise algorithms. A dashboard providing an at-a-glance real-time view of how a client’s sensor portfolio is performing, a detailed collection route planning, KPI data, and both SMS [text] and email alert notifications are all part of a complete remote-monitoring solution.” 

Conclusion

Basic sensor technology exists for most physical phenomena. But by itself, a sensor can only determine a change in value. Appropriate compensation generally is required to ensure that an output variation was caused by the parameter being measured rather than by drift or noise in the sensor itself. Nevertheless, even if the sensor and its associated signal processing provide high accuracy, the information isn’t necessarily in a useful form.

A complete application solution begins with the user’s requirements expressed in his terms: Does he need to know how many servings of beer remain? Or, does he need to know when a bin is about to overflow? Or, is an innovative approach needed to avoid bed-sore formation? The sensor is a fundamental but small part of the overall system developed to address the application. The improved performance the solution provides is directly linked to how well the raw sensor output has been converted to actionable information.

Reference

1. Pfeifle, S., “New player in laser scanning hardware, software,” 3D Scanning Technologies, Dec. 5, 2012.
