Understanding Virtual Sensors: From Sensor Fusion to Context-Aware Applications

July 10, 2012

Sensors are increasingly important in mobile devices. Combining data from cameras, microphones, inertial sensors, beacons (e.g., GPS) and proximity-based sensors can greatly improve the accuracy and reliability of sensor data. This sensor fusion allows designers to create virtual sensors that bridge what can be measured to what developers want to detect. New and highly sophisticated applications become possible.

Table of Contents

  1. Problems With Mobile Sensors
  2. The Fusion Solution
  3. Virtual Sensors in Android and Win8
  4. Derived Virtual Sensors for Context Awareness
  5. References

Problems With Mobile Sensors

Smart phones and tablets are far from ideal sensing platforms. Their manufacturers need to keep the product compact and inexpensive, which can compromise sensor reliability and increase coupling to the surrounding electronic environment (RF signals, electrical noise). In addition, portable devices are subjected to non-ideal conditions such as magnetic anomalies, temperature variations, and shock and vibration, all of which add noise to the sensor measurements.

Further analysis of sensor data from mobile platforms reveals other problems. For one, mobile operating systems including Android, iOS, and Windows Phone are not designed to handle real-time tasks such as on-demand sensor sampling, which leads to unreliable time stamps on sensor samples. Another problem is that a common method of masking sensor inaccuracies is to apply low-pass filtering or dead-banding to the underlying data, as sketched below. Such crude data manipulation can discard information that would be useful if carefully analyzed, making the sensing less reliable and less responsive than it could be.
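
To make the criticism concrete, here is a minimal sketch of the dead-banding technique described above; the threshold is illustrative, not drawn from any particular device. Any change smaller than the threshold is silently suppressed, which is exactly the information loss at issue.

    // Minimal sketch of dead-banding: changes smaller than the threshold
    // are suppressed entirely, hiding sensor noise but also discarding
    // genuine low-amplitude motion. The threshold is illustrative.
    float deadBand(float sample, float previous, float threshold) {
        return (Math.abs(sample - previous) < threshold) ? previous : sample;
    }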

In all, developers are left with the impression that they cannot trust sensors on mobile platforms. Consequently, they often decide not to write applications that take advantage of sensor data.

The Fusion Solution

Sensor fusion processes multiple sensor measurements of the same event to make it easier to separate the “real” data from noise and sensor errors. Done correctly, there is no loss of responsiveness. Sensor fusion can overcome the shortcomings of individual sensors and provide useful, reliable results.

For example, a gyroscope with a 100°/second bias appears to be spinning wildly even when the device is at rest. Sensor fusion can zero out this bias through comparison with the accelerometer and magnetometer data. Sensor fusion thus allows designers to use such a gyroscope, gaining greater flexibility in component sourcing.
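
To make the idea concrete, here is a minimal one-axis sketch, not any vendor's actual algorithm, of a complementary filter that learns the gyro bias from the accelerometer's gravity reference; the gains are illustrative.

    // Minimal sketch: a one-axis complementary filter that estimates and
    // removes a gyroscope bias using the tilt angle implied by gravity.
    // Gains KP and KI are illustrative, not tuned values.
    public class GyroBiasFilter {
        private static final double KP = 2.0;  // pulls the angle toward gravity
        private static final double KI = 0.5;  // slowly absorbs the gyro bias
        private double angleDeg = 0.0;         // fused tilt estimate
        private double biasDps = 0.0;          // estimated gyro bias, deg/s

        /**
         * gyroRateDps: raw gyro rate, possibly with a large constant bias;
         * accelAngleDeg: tilt from gravity, e.g. Math.toDegrees(Math.atan2(ay, az));
         * dtSec: seconds since the previous sample.
         */
        public double update(double gyroRateDps, double accelAngleDeg, double dtSec) {
            double error = accelAngleDeg - angleDeg;
            biasDps -= KI * error * dtSec;   // persistent error reveals the bias
            angleDeg += (gyroRateDps - biasDps + KP * error) * dtSec;
            return angleDeg;
        }
    }

With a constant 100°/second bias, the integral term converges to that bias and the reported angle settles to the accelerometer's long-term reference, so the device no longer appears to spin at rest.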

Virtual Sensors in Android and Win8

Sensor fusion produces richer data than a single sensor can provide. The main use case is the fusion of accelerometer, magnetometer, and gyroscope measurements to determine the device’s absolute orientation (the 3D rotation angle). For convenience, “interpreted” events such as these can be represented in the same form as sensor events, and treated as outputs of “virtual” sensors.

Virtual sensors form a bridge between what can be measured and what developers would like to measure.1 They convert data from multiple sensors into useful information that cannot be obtained from any single sensor. Android offers four principal virtual sensors (a minimal registration sketch follows the list):

  • TYPE_GRAVITY: The direction of gravity as measured in the device’s frame of reference
  • TYPE_LINEAR_ACCELERATION: The dynamic acceleration of the device itself, found by subtracting the influence of gravity from the accelerometer measurement (note this still includes angular acceleration, so the name “linear” acceleration is a misnomer)
  • TYPE_ROTATION_VECTOR: The vector component of a quaternion representing the three-dimensional orientation of the device
  • TYPE_ORIENTATION: The quaternion interpreted as Euler angles (yaw, pitch, roll); though this virtual sensor has been deprecated, developers continue to use it, and it will most likely continue to be supported
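
As a minimal sketch of how these virtual sensors are consumed, the following subscribes to TYPE_ROTATION_VECTOR and converts each event to Euler angles with Android’s standard SensorManager helpers. It is assumed to run inside an Activity (e.g., in onCreate()), and the sampling rate is illustrative.

    // Minimal sketch: subscribing to Android's rotation-vector virtual
    // sensor and converting each event to azimuth (yaw), pitch, and roll.
    SensorManager mgr = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
    Sensor rotation = mgr.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);

    SensorEventListener listener = new SensorEventListener() {
        private final float[] rotMatrix = new float[9];
        private final float[] angles = new float[3]; // azimuth, pitch, roll (radians)

        @Override
        public void onSensorChanged(SensorEvent event) {
            // Convert the quaternion-based rotation vector to a rotation
            // matrix, then extract the Euler angles from that matrix.
            SensorManager.getRotationMatrixFromVector(rotMatrix, event.values);
            SensorManager.getOrientation(rotMatrix, angles);
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    };

    mgr.registerListener(listener, rotation, SensorManager.SENSOR_DELAY_GAME);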

Similarly, Windows 8 has four virtual sensors:

  • Orientation sensor: The quaternion representing the device’s three-dimensional orientation (confusingly, this is not equivalent to the Android TYPE_ORIENTATION; rather, it corresponds to TYPE_ROTATION_VECTOR)
  • Inclinometer: The Euler angles (yaw, pitch, roll) representing the device’s three-dimensional orientation
  • Tilt-Compensated Compass: The device’s heading in the plane perpendicular to gravity
  • Shake: The discrete event that occurs when the device is shaken (in any direction)

Both the Android and Windows 8 interfaces provide a continuous representation of 3D orientation as a quaternion (TYPE_ROTATION_VECTOR and the Orientation sensor) and as derived Euler angles (TYPE_ORIENTATION and the Inclinometer share the same conventions up to a sign change). The decomposition of acceleration into two virtual sensors (linear acceleration and gravity) is exposed only in Android; Shake and the Tilt-Compensated Compass are exposed only in Win8.
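
To clarify the relationship between the two representations, the sketch below reconstructs the full unit quaternion from the three-element rotation vector (which stores only the vector part) and derives yaw from it. The ZYX (yaw-pitch-roll) angle convention used here is an assumption, since conventions differ across platforms.

    // Sketch: recovering the scalar part w of the unit quaternion from the
    // rotation vector (x, y, z), then computing yaw under the standard ZYX
    // convention. Platform axis conventions may differ.
    double[] quaternionFromRotationVector(float[] v) {
        double x = v[0], y = v[1], z = v[2];
        double w = Math.sqrt(Math.max(0.0, 1.0 - x * x - y * y - z * z));
        return new double[] { w, x, y, z };
    }

    double yawRadians(double[] q) {
        double w = q[0], x = q[1], y = q[2], z = q[3];
        return Math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z));
    }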

These virtual sensors are enabled through sensor fusion and are useful for developers creating motion-based applications. They also can provide further information about the user or their current activities. Deriving such information falls under the general category of context awareness.

Derived Virtual Sensors for Context Awareness

Context awareness goes beyond presenting the raw motions of the device. It includes determining how the user is carrying the device, what the user is doing, and in what environment. Table 1 presents four virtual sensors that can describe these contexts. They are mutually independent: each can, at the same time, take a value independent of the others. For example, a device can be in the user’s hand (1), while the hand is still (2), as the user is standing (3) on a train (4). Such independence among contexts is highly desirable. It makes the virtual sensors truly primitive and eliminates interactions among contexts.
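
Since the actual value sets belong to Table 1, the enums below are purely hypothetical illustrations of the independence property: a full context snapshot is simply one value from each sensor, and no combination is ruled out.

    // Hypothetical sketch: four independent context dimensions. These enums
    // do not exist in any shipping API; the value sets are illustrative.
    enum Carry { IN_HAND, IN_POCKET, IN_BAG, ON_TABLE }
    enum HandState { STILL, GESTURING }
    enum Posture { SITTING, STANDING, WALKING }
    enum Environment { OFFICE, OUTDOORS, IN_CAR, ON_TRAIN }

    // A snapshot holds one value per dimension; any combination is legal.
    class ContextSnapshot {
        final Carry carry;
        final HandState hand;
        final Posture posture;
        final Environment env;

        ContextSnapshot(Carry c, HandState h, Posture p, Environment e) {
            carry = c; hand = h; posture = p; env = e;
        }
    }

    // The example from the text: in hand, still, standing, on a train.
    ContextSnapshot onTrain = new ContextSnapshot(
            Carry.IN_HAND, HandState.STILL, Posture.STANDING, Environment.ON_TRAIN);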

Using context to improve interactivity is not new.2 The objective in detecting user context is to allow electronic devices to adapt to their users, instead of requiring users to adapt to the device interface. Once context is available to the developer, applications can become much more intelligent.

Take the simple case of adjusting ring volume.3 If the phone is in a pocket, the ring tone could be replaced with vibration. If the phone is in a hand, the ring tone could be set just loud enough to ensure it is heard. If the user is outdoors, the ring tone could be turned all the way up to offset ambient noise. A CARRY virtual sensor provided by the operating system could produce this level of information for an intelligent application to use, as the sketch below illustrates.
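
This is a hypothetical sketch only, since no shipping OS exposes such a sensor: the CARRY codes are invented for illustration, while the AudioManager calls are real Android API.

    // Hypothetical sketch: adjusting ring behavior from a CARRY virtual
    // sensor. The CARRY_* codes below do not exist in any shipping API.
    static final int CARRY_IN_POCKET = 0;
    static final int CARRY_IN_HAND = 1;

    void onCarryChanged(int carryContext, AudioManager audio) {
        switch (carryContext) {
            case CARRY_IN_POCKET:
                // Replace the ring tone with vibration.
                audio.setRingerMode(AudioManager.RINGER_MODE_VIBRATE);
                break;
            case CARRY_IN_HAND:
                // Just loud enough to be heard.
                audio.setRingerMode(AudioManager.RINGER_MODE_NORMAL);
                audio.setStreamVolume(AudioManager.STREAM_RING,
                        audio.getStreamMaxVolume(AudioManager.STREAM_RING) / 2, 0);
                break;
        }
    }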

Posture determination using inertial sensors has also been analyzed, with reasonable accuracy for standing and sitting.4 A POSTURE virtual sensor provided by the operating system could also generate callbacks for each pedestrian step for use by activity-based applications. This relieves developers of the burden of understanding and processing every change in the raw sensor data, which arrives on the order of milliseconds; instead, they simply register for the event of interest, which occurs on the order of seconds.
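
The following sketch hints at the millisecond-scale work such a virtual sensor would hide: a naive threshold-and-debounce step detector over raw accelerometer magnitudes. The threshold and debounce interval are illustrative, and a production detector would be considerably more robust.

    // Minimal sketch of the raw-data processing a step callback hides:
    // a naive threshold-and-debounce detector over accelerometer samples.
    static final double STEP_THRESHOLD = 11.0;             // m/s^2, just above 1 g
    static final long MIN_STEP_INTERVAL_NS = 250_000_000L; // at most 4 steps/s
    long lastStepNs = 0;

    boolean isStep(float ax, float ay, float az, long timestampNs) {
        double magnitude = Math.sqrt(ax * ax + ay * ay + az * az);
        if (magnitude > STEP_THRESHOLD
                && timestampNs - lastStepNs > MIN_STEP_INTERVAL_NS) {
            lastStepNs = timestampNs;
            return true; // the virtual sensor would fire its callback here
        }
        return false;
    }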

A strong point of virtual sensors is that they present a general interface whose underlying implementation can be enhanced. For example, if a camera is available, it may also be able to detect posture and contribute to the POSTURE virtual sensor, increasing the output accuracy without requiring applications to change their sensor application programming interface (API).

Some smart phones already provide a basic level of context awareness. For example, Samsung’s Smart Stay keeps the display at high brightness as long as a person is looking at the screen. (Every 15 seconds it turns on the camera and looks for a pair of eyes.) Motorola’s Smart Actions starts car mode when you enter the car and silences the ringer when you are at work. (It detects user-defined beacons, such as Bluetooth and Wi-Fi, that are unique to an environment.)

However, these applications rely on power-hungry sensors (such as a camera), and they do not always work as expected. A phone left in front of a picture of a face will stay on and drain the battery too rapidly. A phone might detect the car’s Bluetooth even when it is outside the car. Furthermore, it would be convenient if such features did not have to be configured by the user.

Table 2 gives examples of enhanced context virtual sensors that reduce or eliminate these problems.

Adopting the additional virtual sensors introduced here provides an extensible, easy-to-use framework for combining data from a multitude of sensors to advance to the next level of context awareness.

References

  1. Steele, J., “Configuring mobile device sensors to ensure optimal performance,” EE Times, Feb. 2012.
  2. Schilit, B.N., N. Adams, and R. Want, “Context-Aware Computing Applications,” IEEE Workshop on Mobile Computing Systems and Applications, Dec. 1994.
  3. Schmidt, A. and K. van Laerhoven, “How to Build Smart Appliances?” IEEE Personal Communications, Aug. 2001.
  4. Kern, N., et al., “Wearable sensing to annotate meeting recordings,” Pers Ubiquit Comput, Sept. 2003.
