
Turn Your Car into an Office

Aug. 9, 2017
The next big things in auto electronics: integrating VPAs like Alexa into infotainment systems, and combining gesture recognition with haptic touch feedback.

Introducing a game-changing advance in technology is a bit like a golfer getting a hole in one. Although such advances are relatively rare, everyone knows another one is coming (if not exactly when or from whom).

And so it is with speech-activated virtual personal assistants (VPAs)—Amazon’s Alexa, Microsoft’s Cortana, Google’s Assistant, and the newly minted Samsung Bixby. Amazon’s Echo and Dot, speakers powered by the Alexa voice service, surprised even veteran industry watchers with how quickly they evolved from novelty to necessity in the home; more than 10 million devices have been sold since the Echo’s launch in late 2014.

Consider these statistics, and note that a wise man once said stats should be used in an argument much the same way a drunkard uses a lamppost: for support instead of illumination.

  • An estimated 35.6 million Americans will use voice-controlled speakers at least once a month this year, a 128.9% increase over 2016, according to the research firm eMarketer.
  • VoiceLabs, a company that develops voice applications, predicts that 24.5 million voice-first devices will ship this year, bringing the total footprint to 33 million devices in circulation.
  • According to the research firm Gartner, consumer spending on VPA-enabled wireless speakers will reach $2.1 billion worldwide in 2020, up from $360 million in 2015. Gartner also expects 3.3% of global households to have adopted a VPA-enabled wireless speaker by 2020.
  • The investment bank RBC Capital Markets predicts there will be 60 million Alexa devices sold in 2020, bringing the total installed base to around 128 million.

Impressive. But what’s next for VPAs?  As Billy Ocean put it in his hit song “Get Outta My Dreams, Get Into My Car”: “I said hey (hey), you (you)...get into my car!”

VPAs seem to be doing just that. Automakers including Hyundai, Ford, Volkswagen, and BMW are now integrating, or soon will integrate, the Alexa voice assistant into their vehicles. By 2022, according to the research firm IHS Markit, nearly 90% of new cars will have some type of speech-recognition capability, and 75% of those cars will also have cloud-based voice control provided by companies such as Microsoft, Amazon, and Google.

Hyundai was first to connect its cars to Alexa via the company’s Blue Link app, enabling owners to issue commands (start the car, lock the doors, etc.) from home. “Our customers increasingly want more ways to interact with their vehicles, especially when they are hustling to get out the door,” said Barry Ratzlaff, executive director, digital business planning and connected operations, Hyundai Motor America. “Linking smart devices like the Amazon Echo and Apple Watch to vehicles via Blue Link continues to fill that desire. Allowing consumers to send commands to their car is just the beginning.”

Hyundai has also teamed with Google to make its cars voice-operable via Assistant, the search giant’s virtual helper. For security, the integration requires that a Hyundai owner provide Google Assistant with his or her Blue Link PIN, confirming that the person actually has the authority to control the vehicle.

The 2017 Ford Fusion Energi is the first car to use Alexa for car-to-home and car-to-internet purposes (Fig. 1). Amazon’s Echo is now part of Ford’s Sync 3 platform, taking advantage of Sync’s 4G LTE data connection. Sync 3 owners can ask Alexa for weather reports and news, and can even ask for directions and transfer the guidance to Sync 3’s built-in navigation system. Sync 3 not only plays personalized music stations via voice command; it also lets drivers handle chores like adding items to an existing grocery list with just a few words.

Volkswagen is combining the Alexa voice service with its Car-Net functions. Controlling Car-Net from home or the office will make it easier for users to schedule appointments or plan errands while staying connected and up-to-date in their vehicles. If users decide to buy flowers on their way home, the navigation system can be programmed with a voice command like “Alexa, ask Volkswagen the way to the nearest flower shop.” Similarly, asking “How much gas is left in the tank?” can determine whether they’ll also need to fuel up along the way.

At BMW, the newest version of the company’s iDrive supports integration with Amazon Alexa. It is now possible for owners to use voice commands to check the vehicle’s battery charge and fuel level while at home, as well as lock the vehicle remotely (to give two examples). The BMW Connected Alexa skill also allows users to learn about their next scheduled trip, determine what time to leave, and send the destination to their BMW. The skill works by using the activation word “Alexa,” followed by the invocation name “BMW.” (For instance: “Alexa, ask BMW when I should leave for my next appointment.”)
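
For developers, a custom Alexa skill of this kind boils down to a small request/response handler behind the invocation name. The sketch below is a minimal, hypothetical example of how such an intent might be serviced; the intent name and the departure-time lookup are placeholders, since BMW’s actual Connected skill code is not public.

```python
# Minimal sketch of an Alexa custom-skill handler (e.g., an AWS Lambda entry point).
# The intent name and backend lookup are hypothetical; this only illustrates the
# flow behind "Alexa, ask BMW when I should leave for my next appointment."

def lambda_handler(event, context):
    request = event.get("request", {})

    if request.get("type") == "IntentRequest":
        intent_name = request["intent"]["name"]
        if intent_name == "NextTripDepartureIntent":   # hypothetical intent
            speech = f"You should leave at {lookup_departure_time()}."
        else:
            speech = "Sorry, I can't help with that yet."
    else:  # LaunchRequest, SessionEndedRequest, etc.
        speech = "Ask me when to leave for your next appointment."

    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }


def lookup_departure_time():
    """Placeholder for a call to the connected-car backend (hypothetical)."""
    return "7:45 AM"
```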

Google and Microsoft are not about to sit idly by and watch Alexa steal a march on Google Assistant and Cortana. At its I/O developer conference in May, Google demonstrated two new automobiles with Android integrated directly into the car, so that voice could be used to adjust a seat, control air-conditioning settings, pick music tracks, make a phone call, or access apps like Google Maps. Called Android in the Car, the system was demoed in a specially outfitted Volvo V90; Volvo expects to debut Android in the Car within two years.

The second Android demonstration car was an Audi Q8 sport concept technology platform (Fig. 2). New Android in the Car applications included the streaming service Spotify, Google Play Music, and Google Assistant, all running on a large touch display in the center of the dashboard. The information is also visible in the Audi virtual cockpit—a fully digital instrument cluster in the driver’s direct field of view. While the navigation map continues to use the HERE maps database that Audi and its partners acquired from Nokia, the driver can also choose to navigate with Google Maps. A message center for incoming messages and calls rounded out the new services in the show car.

Similarly, Nissan and Renault have partnered with Microsoft for their connected-vehicle platform and have announced plans to integrate the Cortana voice assistant into their vehicles. The Microsoft Connected Vehicle Platform consists of a set of Microsoft services and products that span the physical (car) and digital (cloud) realms. The platform is also architected to integrate with productivity tools like Microsoft Office 365 and Skype for Business.

In July, at BMW’s annual “Innovation Days” event in Chicago, the company announced it is putting Microsoft’s productivity suite into every BMW Connected car. As a result, Microsoft Exchange customers will be able to integrate their calendars, to-do tasks, and contacts with vehicle voice and navigation services. Exchange email and calendar accounts can be accessed on the go; drivers can have emails read aloud or compose messages; and starting this autumn, Skype for Business (Fig. 3) can be used to join Skype conference calls from the driver’s seat. Skype will let users join online meetings directly, receive alerts about upcoming and changed meetings, and start meetings automatically without any manual dialing in.

As Dieter May, BMW senior vice president, Digital Products and Services put it: “We seek to continuously extend and enhance the customer experience with regard to all aspects of their mobility. The integration of productivity features in-car (like Exchange and Skype for Business) and personalized and contextual services will help customers with their day-to-day work and planning while they are on the road.”

For owners of other vehicles, aftermarket supplier Logitech has developed ZeroTouch mounts and a companion app to bring the hands-free Alexa experience to more drivers. You simply place your hand near the smartphone’s screen (Fig. 4) and command Alexa as you would at home. The ZeroTouch/Alexa combination will read and send text messages and emails and review calendars; it depends on the smartphone’s internet connection. Mounts are available for the dashboard and air vents.

Phone-Mirroring Solutions

Alexa and Google’s Android in the Car complement a car’s infotainment system rather than replace it, as Android Auto and CarPlay do. Phone-mirroring solutions such as Apple CarPlay and Android Auto reconfigure an Apple or Google phone’s home-screen interface and selected apps to appear on, and be controlled via, the car’s built-in display.

First introduced in 2014, Apple CarPlay provides iPhone users with a way to connect their device to their car's infotainment system via its Lightning Connector plugged into a USB port in the car. Apple introduced wireless support for CarPlay in iOS 9 back in 2015, but so far only the 2017 BMW 5 Series offers this functionality. CarPlay connects a supported vehicle to a compatible iPhone, so you can get directions, make calls, send and receive messages, and listen to music, among other things.  Every major automobile manufacturer has partnered with Apple in supporting CarPlay. There are more than 200 models to choose from, with more on the way.

Apple’s next software build, designated iOS 11, will have a Do Not Disturb mode (Fig. 5). The Do Not Disturb While Driving feature can be set to come on automatically, when the phone connects to a car’s Bluetooth, or manually. Your iPhone will know when you are driving (whenever it detects the acceleration of a vehicle), and it will shield you from call, text, and notification distractions until you stop.

The feature can be turned off, but when on, those who try to contact you will be automatically notified that you are driving. In an emergency, a person who is attempting to contact you via text while you're driving can break through Do Not Disturb by sending a second “urgent” message. CarPlay functionality still works, but when the car is in motion, anyone else who texts will get an automated response that reads: “I’m driving with Do Not Disturb turned on.”
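
As a rough illustration of the behavior just described, the sketch below models the auto-reply and “urgent” break-through logic. Apple’s actual implementation is not public, so the class, the acceleration threshold, and the method names here are purely hypothetical.

```python
# Hypothetical sketch of the Do Not Disturb While Driving behavior described above.

AUTO_REPLY = "I'm driving with Do Not Disturb turned on."

class DoNotDisturbWhileDriving:
    def __init__(self):
        self.active = False
        self.senders_notified = set()

    def update_motion(self, accel_m_s2, connected_to_car_bluetooth):
        # Activate on car-Bluetooth pairing or vehicle-like acceleration
        # (both triggers are described in the article; threshold is invented).
        self.active = connected_to_car_bluetooth or accel_m_s2 > 2.0

    def on_incoming_text(self, sender, is_flagged_urgent):
        if not self.active:
            return "deliver"
        if is_flagged_urgent and sender in self.senders_notified:
            return "deliver"  # a second, "urgent" message breaks through
        self.senders_notified.add(sender)
        return f"suppress + auto-reply: {AUTO_REPLY}"
```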

Beyond phone connectivity, owners of a Samsung Gear S2 or S3 smartwatch will soon be able to integrate their device with a Ford SYNC-equipped vehicle for convenient parking reminders and alerts that help drivers remain attentive in the car. “As wearable technology advances, connecting your vehicle to these devices holds immense potential to help make life more convenient and more personalized,” noted Dave Hatton, manager of Ford’s mobile applications for connected vehicles, adding, “The power of Ford SYNC is its flexibility to connect to new devices and services.” Developers can make their smartphone apps compatible with SYNC by downloading and integrating the AppLink software from the Ford Developer Program website.

Is Voice the Answer?

The AAA Foundation found in a 2015 study that technology allowing consumers to interact with their phones and their cars by issuing voice commands, rather than by pushing buttons on the dashboard or phone, may have unintended consequences that adversely affect traffic safety. The research discovered that the most complicated voice-activated systems can take a motorist’s mind off the road for as long as 27 seconds after he or she stops interacting with the system.

Given that U.S. Department of Transportation statistics show 10% of fatal crashes, 15% of injury crashes, and 14% of all police-reported motor vehicle traffic crashes were attributed to distracted driving, voice may not be the ultimate answer. Recognizing this, automakers are conducting parallel technology development programs centered on gesture recognition. Gesture recognition determines whether the driver has performed recognizable hand or finger gestures within an allotted space without contacting a touchscreen.

The automotive segment is already adopting gesture control both for infotainment systems and as a human-machine interface (HMI) for subsystems such as climate control. BMW, for example, has gesture control in production on its 7 and 5 Series cars. The driver can change the audio volume with a circling motion of the hand, and answer or dismiss a phone call with a left or right swipe (Fig. 6). In the 5 Series, gesture control works via a small sensor located in the roof-module control panel that houses the cabin lights, which constantly monitors the area in front of the infotainment screen.

A camera placed in the steering wheel or on the dashboard also can be programmed to watch for certain gestures. When it sees them, it sends a signal to the processor that handles the connected infotainment hardware. The data is analyzed to determine what the driver is doing, ascertain which display controls the driver wants to adjust, and then activate the appropriate features.
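
In practice, the last step of that pipeline is little more than a lookup that maps a classified gesture to an infotainment command. The sketch below illustrates the idea; the gesture labels, confidence threshold, and controller interface are illustrative assumptions, not any automaker’s actual API.

```python
# Illustrative mapping from a recognized gesture to an infotainment action.
# Gesture labels and the controller interface are hypothetical; real systems
# classify gestures from camera or ultrasonic sensor data upstream of this step.

GESTURE_ACTIONS = {
    "circle_clockwise":        ("audio", "volume_up"),
    "circle_counterclockwise": ("audio", "volume_down"),
    "swipe_right":             ("phone", "accept_call"),
    "swipe_left":              ("phone", "dismiss_call"),
}

def dispatch_gesture(gesture_label, confidence, infotainment):
    """Forward a classified gesture to the infotainment controller."""
    if confidence < 0.8:                  # ignore uncertain detections
        return False
    action = GESTURE_ACTIONS.get(gesture_label)
    if action is None:                    # unrecognized gesture: do nothing
        return False
    subsystem, command = action
    infotainment.send(subsystem, command)  # hypothetical controller API
    return True
```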

For engineers developing and integrating new HMIs, one of the challenges is letting the driver manipulate gesture-controlled equipment without losing sight of the road for precious seconds while working the infotainment controls. Ultrahaptics, a startup founded in 2013 on technology originally developed at the University of Bristol, U.K., uses ultrasonic waves from an array of tiny speakers that work together to create sensations on the user’s skin.

Sound waves are simply pressure waves moving through the air, so the pressure from many different waves can be made to add together, creating a very localized region of high pressure. The resulting force is enough to push on your skin, and that vibration can then be modulated to create different textures and sensations (Fig. 7).
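
The focusing trick is the same one used in any phased array: each emitter is delayed just enough that its wavefront reaches the chosen focal point at the same instant as all the others, so the pressures add constructively there. Below is a minimal sketch of that delay calculation, using illustrative geometry rather than Ultrahaptics’ actual array parameters.

```python
import math

# Sketch of the focusing principle described above: delay each emitter so that
# all pressure waves arrive at the focal point in phase and add constructively.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def emitter_delays(emitter_positions, focal_point):
    """Per-emitter trigger delays (seconds) giving a common arrival time."""
    distances = [math.dist(p, focal_point) for p in emitter_positions]
    longest = max(distances)
    # Emitters closer to the focus wait longer, so every wavefront
    # arrives at the same instant and the pressures sum.
    return [(longest - d) / SPEED_OF_SOUND for d in distances]

# Example: a 5-element line array (1-cm pitch) focusing 20 cm above its center.
array = [(x * 0.01, 0.0, 0.0) for x in range(-2, 3)]
print(emitter_delays(array, (0.0, 0.0, 0.20)))
```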

These sensations can be used to create a virtual button, slider, or other control, meaning the user feels an object that isn’t actually there. The Ultrahaptics technology works with motion-sensing cameras that track where the hand is at any moment, so the system can judge what sensation the speakers need to send.

According to the Research and Markets report “Gesture Recognition and Touchless Sensing Market - Global Forecast to 2022,” the gesture recognition market is expected to be worth $18.98 billion by 2022, growing at a CAGR of 29.63% between 2017 and 2022.
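
As a quick sanity check on that forecast (assuming the 29.63% CAGR compounds over the five years from 2017 to 2022), the implied 2017 baseline works out to roughly $5.2 billion:

```python
# Back-of-the-envelope check of the forecast above: $18.98B in 2022 at a
# 29.63% CAGR implies a 2017 market of about 18.98 / 1.2963**5 = ~$5.2B
# (assuming five full compounding periods).

future_value = 18.98e9
cagr = 0.2963
years = 2022 - 2017

implied_2017 = future_value / (1 + cagr) ** years
print(f"Implied 2017 market size: ${implied_2017 / 1e9:.2f}B")
```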
