The human body possesses an amazing array of senses. Musicians can hear the difference between a C and a C#, even though the two tones are only about 15 Hz apart. Olympic rifle shooters can detect when their sights are misaligned by as little as two-thousandths of an inch. Sightless persons can read Braille—dots raised above the paper just two-hundredths of an inch and spaced one-tenth of an inch apart—at 400 words per minute using touch sensors in their fingertips.
When it comes to the design of human interface devices for today’s technology, these senses are important factors. Device manufacturers face the ongoing challenge of accommodating a conditioned expectation: that buttons or keys will move when pressed (perhaps as much as one-fourth to one-half of an inch), an expectation that leads the brain to decrease the velocity of the finger to zero at the moment of contact.
The growing use of rigid flat screens, such as LCDs, under glass or plastic protective overlays has increased the need for improved touch solutions. The wrong sensory feedback can have sweeping negative effects: difficulty of use, inaccurate data entry, and, most ominous, physical impairments such as repetitive stress injury (RSI) caused by the failure of buttons or keys to respond as the brain expects. In addition, manufacturers of mobile consumer electronic (CE) devices in particular must weigh other technical factors, including the need to reduce latency, reduce dependency on baseband processors, lower cost, and improve power consumption.
Recent trends in the integration of components have helped advance the CE industry toward capacitive-touch capabilities that address the conditioned expectations of users and the industry’s trend toward lower-cost, higher-performance solutions. This article will discuss the need for improved touch solutions, the alternatives available, and the benefits of an integrated approach to capacitive touch.
As technology evolves, so do consumer visions of what is “cutting edge.” A mobile handset or MP3 player with mechanical keys isn’t the trend of the future, and one with bubbled membrane keys would be an oddity at best. One human-machine interface (HMI) gaining considerable traction today in consumer and commercial/industrial segments is touch-centric input, whether display-based (e.g., touch screens), or non-display-based (e.g., buttons, sliders, and scroll wheels).
Research on membrane-type keys yields significant design findings for this latest generation of touch-based HMI, because the fundamental challenges are the same. When touch inputs are implemented on rigid flat screens, such as LCDs, or under glass or plastic protective overlays, such as in white goods or mobile handsets, there’s no “travel” to provide feedback to the user that a valid touch event has occurred. However, unlike membrane keys, it’s not practical to add travel through a bubble or other physical technique. New methods of feedback are required. With these touch inputs controlling more complex and essential product functions, this sensory feedback becomes even more critical in determining whether the new input methods enhance the user’s experience or detract from it.
A VARIETY OF FEEDBACK METHODS
Early on, membrane keys incorporated tactile feedback. Soon after, other types of feedback, such as visual and aural, were added. Today, multiple sensory feedback methods are commonplace; in fact, they’re often so subtle that they practically go unnoticed. However subtle, sensory feedback is important for many reasons.
The three senses involved in product sensory feedback—visual, aural, and touch—are used individually and in combination. Depending on the application, one of these methods may be more effective than another. Often, it’s most effective to involve multiple senses, or all three. Studies conclusively show that sensory feedback improves user accuracy, makes using complicated products easier and quicker, and provides a better “emotional” response for the user.1
• Visual feedback: Human beings’ most developed sense is sight, so incorporating visual feedback is desirable and tends to produce positive results. Visual feedback appears in many forms, from simply illuminating an LED when a button press is detected to the more complex presentation of a scrolling phonebook display on a cellular phone in response to a touch slider.
The availability of high-resolution LCDs incorporating touch sensors has led to a plethora of visual responses to button presses and slider applications. Future technology applications have been announced by Microsoft and others that allow touches to perform heretofore complicated actions, such as sizing an image by simply “stretching” the image on the touch-sensitive display.
• Aural feedback: Another highly developed human sensory perception is hearing. Sounds have been used as a primary communication tool throughout history. Alarms virtually always incorporate a shrill sound to quickly grab everyone’s attention. Bells have been used to announce the current time to entire villages. And, of course, spoken language is the king of communication.
It follows that audible stimuli are important sensory feedback methods. In the first membrane keypads, the bubbles (by design or by accident) emitted a clicking sound that significantly augmented the positive impact of the bubble’s physical travel. That insight evolved into today’s systems, some of which incorporate sophisticated sound-producing devices that not only provide feedback (and entertainment), but also can mimic other sounds. Sometimes this is simply for enjoyment, but certain sounds—such as those made by a mechanical button or dial—enhance a product’s usability as well.
• Tactile feedback: Sight- and sound-directed sensory feedback can be extremely important, but no design should overlook the importance of tactile feedback. In addition to resolving situations in which visual and aural alerts are inappropriate, such as the need for vibration when cell-phone ringers are silenced, tactile feedback can also reduce the risk of repetitive stress injury.
Touch sensing has come to be viewed as a means to expand product capability and enable differentiation through sleek and attractive industrial designs. As new touch inputs are implemented, most often on rigid surfaces, the need for effective tactile feedback becomes more visible and critical.
The most sophisticated touch response mechanism is haptics. Haptic effects—supplying forces, motions, or vibrations to the user via touch—can be configured to provide not only simple sensory feedback in the form of basic vibration, but also enhanced effects. The latter are possible through shaping and timing of vibration effects to realistically simulate the touch sensations associated with mechanical inputs, such as buttons and sliders.
Haptics feedback can be utilized for far more than simple validation of the touch itself. A button press or slider sweep also can incorporate different haptics feedback effects, depending on the results of the key press. For example, a positive response could have one type of touch feedback and an error response could have a different feedback. Haptics feedback may also be tailored to convey other messages. Sensations can be presented to the user that represent such things as the amount of pressure on a touch-button or to allow third-dimension (z-axis) input.
In general, higher input tactility can also vastly improve the experience and accuracy of using the device compared to products with no or low tactility. One 2003 study by the Nokia Research Center examined the effect of tactility on numerical entry tasks in mobile handsets.2 Tactility was varied by using protruding, separated keys with less than 1 mm travel (high tactility) and flat, horizontally connected keys with only about 0.5 mm travel (low tactility). The resulting error rate in the test case when subjects weren’t allowed to view the phone (“no visual feedback”) shows a greater-than-sixfold improvement for the high-tactility condition (see the figure). Even when subjects were allowed to view the phone during numeric entry (“visual feedback”), the error rate was almost three times lower for the high-tactility phone.
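As a rough illustration of how shaping and timing distinguish one haptic effect from another, the following Python sketch generates drive-sample streams for a crisp “click” versus a longer two-pulse “error” buzz. All parameters here (frequency, durations, envelope shapes) are hypothetical; a real design would target a specific LRA or ERM actuator’s resonant frequency and drive electronics.

```python
import math

def haptic_waveform(freq_hz, duration_ms, envelope, sample_rate=8000):
    """Generate drive samples for a vibration actuator.

    envelope: function mapping progress t in [0, 1] to amplitude [0, 1].
    Returns a list of floats in [-1, 1] (a carrier sine shaped by the envelope).
    """
    n = int(sample_rate * duration_ms / 1000)
    return [envelope(i / n) * math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n)]

# A crisp "button click": a short burst with a fast exponential decay.
click = haptic_waveform(175, 20, lambda t: math.exp(-6 * t))

# A distinct "error" effect: two longer pulses separated by a gap.
def two_pulses(t):
    return 1.0 if (t < 0.35 or 0.55 < t < 0.9) else 0.0

error_buzz = haptic_waveform(175, 120, two_pulses)
```

The same carrier frequency produces two clearly different sensations purely through envelope shaping, which is the essence of simulating, say, a mechanical button press versus an error alert.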
TOUCH TECHNOLOGY CHOICES
Several different touch technologies are available today. Each technology, of course, has advantages and disadvantages, which define the appropriate applications. Some of the most popular technologies include resistive touch for portable navigation device touch screens, surface acoustic wave (SAW) or infrared for kiosks and ATMs, and capacitive touch for mobile handsets and portable music players.
For most high-volume consumer applications, resistive and capacitive touch technologies tend to dominate. Resistive technology has been implemented for some time in touch-screen displays of smart phones and portable navigation units. However, recent product introductions, such as the Apple iPhone and LG Chocolate, suggest that capacitive touch is the technology of choice for devices where differentiation and attractive industrial design are important. Two of the primary reasons for the popularity of capacitive technology are implementation flexibility and multi-touch capability.
Capacitive touch inputs, such as buttons and sliders, may be implemented using a variety of materials, including printed-circuit boards, flexible printed circuits, and indium tin oxide (ITO) on film or glass. Furthermore, the sensor array may be covered by various thicknesses of glass, plastic (clear or opaque), or other protective or decorative overlays without affecting touch performance. This enables a vast array of choices in industrial design to differentiate product appearance and functionality in an end market often crowded with an overabundance of options.
Multi-touch capability is an enabler in creating unique, differentiating user interface designs. The iPhone introduced multi-touch gestures to the masses, replacing the traditional “nested” menu system with intuitive multi-touch gestures for commonly used commands, such as zoom or advancing to the next screen or picture. This revolutionary user interface design, enabled by capacitive touch technology, has been a primary reason for the iPhone product line selling over 20 million units since its introduction.
Another advantage of capacitive technology is durability. Consumer products aren’t handled like lab instruments, so durability is a chief concern. With capacitive technology, there are no films to wear out or be damaged, as is the case with resistive touch.
CAPACITIVE TOUCH AND INTEGRATION
Capacitive buttons and sliders use a fairly simple architecture. However, because the capacitances being measured are very small, it’s important that the touch detection circuitry be situated close to the sensor itself. As a result, there’s greater interest in capacitive-touch controllers that integrate multiple architectural elements and are optimized to provide a rich selection of integrated sensory feedback options.
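To get a feel for how small these capacitances are, a first-order parallel-plate estimate puts finger-to-electrode coupling in the picofarad range or below. The values below (pad size, overlay thicknesses, dielectric constants) are illustrative assumptions only; real sensor coupling also involves fringing fields and a more detailed finger model.

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def coupling_capacitance(area_mm2, overlay_mm, eps_r):
    """Approximate finger-to-electrode capacitance through an overlay,
    modeled as a simple parallel-plate capacitor (first-order estimate)."""
    area_m2 = area_mm2 * 1e-6
    d_m = overlay_mm * 1e-3
    return EPS0 * eps_r * area_m2 / d_m

# An 8 x 8 mm pad under 1 mm of glass (eps_r ~ 7)
# versus 2 mm of plastic (eps_r ~ 3).
c_glass = coupling_capacitance(64, 1.0, 7.0)    # roughly 4 pF
c_plastic = coupling_capacitance(64, 2.0, 3.0)  # under 1 pF
```

Signals this small are easily swamped by parasitics from long routing, which is why the measurement circuitry belongs next to the sensor.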
While the long-standing principle in electronics is that integration is good, there have been instances of applications being “de-integrated.” Sometimes this is related to compromised performance that results from excessive integration in systems on a chip (SoCs) that already contain dozens of integrated functions. Other times, de-integration occurs simply because new features and requirements emerge faster than suppliers can integrate this functionality. What has become obvious is the importance of determining what to integrate, and why, rather than integrating as much as possible onto a single chip.
When using capacitive-touch controls, it’s important to have the driving and sensing circuits in the controller close to the touch sensor. Because the measured capacitance is quite small, lengthy leads or PCB traces may increase susceptibility to noise to the point where valid touches can become indistinguishable from the noise. Therefore, incorporating capacitive sensing into the baseband or application processor, either of which is often located away from the touch sensor array, may not result in reliable touch performance.
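One common way controllers separate valid touches from noise, not tied to any particular vendor’s device, is to track a slowly drifting baseline and require several consecutive over-threshold samples before declaring a touch. The Python sketch below illustrates the idea; the threshold, filter coefficient, and debounce count are hypothetical values.

```python
def detect_touches(raw_counts, threshold=40, baseline_alpha=0.01, debounce=3):
    """Classify a stream of raw capacitance counts into touch states.

    Tracks a slowly drifting baseline with an IIR filter and flags a touch
    only after `debounce` consecutive samples exceed baseline + threshold,
    which rejects single-sample noise spikes.
    Returns a list of booleans, one per input sample.
    """
    baseline = raw_counts[0]
    consecutive = 0
    states = []
    for c in raw_counts:
        if c - baseline > threshold:
            consecutive += 1
        else:
            consecutive = 0
            # Only let the baseline drift while the key is untouched,
            # so a held finger isn't absorbed into the baseline.
            baseline += baseline_alpha * (c - baseline)
        states.append(consecutive >= debounce)
    return states

# A steady signal, a sustained touch, then release:
states = detect_touches([1000] * 10 + [1060] * 10 + [1000] * 5)

# A single-sample noise spike never registers as a touch:
spike = detect_touches([1000] * 5 + [1100] + [1000] * 5)
```

The wider the noise margin the front end can preserve (by keeping traces short), the smaller the threshold and debounce penalty, and the lower the added latency.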
For this reason, a capacitive-touch controller separate from these very highly integrated SoCs is the most robust and practical approach today. Though separate, a capacitive-touch controller needn’t be a single-function device. Given the importance of sensory feedback to augment the touch experience, integration of feedback functionality into this touch controller often makes functional sense and provides clear advantages in system design.
While some of this sensory feedback logic and control could certainly be implemented in software on the baseband or application processor, doing so places an unnecessary burden on these already highly utilized processors. Depending on the priority level of the different processor interrupts, latencies to these feedback functions may also be introduced. This will degrade the user’s experience by creating a jumble of mistimed feedback mechanisms and producing the dreaded scenario of conflicting sensory inputs.
Therefore, incorporating every feedback control function into the baseband processor in all but the simplest devices isn’t the best solution. In addition, to achieve enhanced sensory feedback, such as flexible dimming options for LEDs and realistic haptics effects, external components, such as LED drivers and haptics drivers, are also often required.
THE MULTI-SENSORY FUTURE
The more research relating to the HMI is conducted, the more we realize that involving multiple human senses contributes greatly to the ease of use and efficiency of a product. This feedback not only provides an enhanced experience for the user, but also gives the brain the information it needs to use the product properly. Indeed, the absence of sensory feedback has even been shown to result in injuries. But beyond ease of use and injury prevention, sensory feedback—especially tactile feedback—has been proven to improve user accuracy.
With these findings, the thought process for user interfaces is turning from concentrating on visual and aural feedback with tactile feedback as an option, to realizing that touch can be the most important user feedback in many applications. Such thinking is leading to “way outside the envelope” solutions, including using touch sensations to identify particular callers, inform the user of a correct or erroneous input, or even influence the user’s mood. Touch is also quiet and unobtrusive, which affords the user privacy as well. Research is being conducted on how to use touch for private, quiet messages within a non-private environment.
Integrating the touch control and sensory feedback circuitry onto one chip not only frees up the main processor from the mundane tasks of driving LEDs or haptics devices, but also provides significant cost and manufacturing advantages. An integrated touch controller can be located optimally close to the touch-sensor array, while the main processor can be located wherever appropriate without constraining it to being in close proximity to the sensor. Further, these fully integrated touch chips consume less power and use fewer external components than would be required of a discrete solution or using the baseband processor for control.
Without question, touch control and user feedback will be a significant area for change and growth now and in the near future of consumer and commercial/industrial products.
1. MacLean, Karon, “Designing with Haptic Feedback,” Proceedings of IEEE Robotics and Automation, 2000.
2. Silfverberg, Miika, “Using Mobile Keypads with Limited Visual Feedback: Implications to Handheld and Wearable Devices,” Mobile HCI 2003, LNCS 2795, pp. 76-90.