A laser scanner has been developed that can project a virtual touch screen onto any surface, turning hand gestures into the equivalent of mouse clicks. It is one of the latest components that promises to change how people interact with machines, from computers and toys to automotive displays and factory robots.
The sensor, announced by Bosch Sensortec, operates on a principle similar to that of bar code scanners. It contains a pair of tiny quivering mirrors that project colored lasers onto a surface. An optical diode measures the intensity of the light reflected when someone’s hand hovers over the virtual display.
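The basic idea, detecting a hand by the change it causes in reflected light, can be sketched in a few lines. This is a hypothetical illustration, not Bosch's actual firmware or API; the threshold value and the baseline-comparison approach are assumptions.

```python
# Hypothetical sketch of a reflected-intensity touch detector.
# The threshold and the baseline-comparison scheme are illustrative
# assumptions, not Bosch Sensortec's actual algorithm.

def detect_touch(samples, baseline, threshold=0.3):
    """Flag scan positions where the reflected intensity deviates
    from the empty-surface baseline by more than `threshold`,
    suggesting a hand hovering over the projected image."""
    touches = []
    for i, (sample, base) in enumerate(zip(samples, baseline)):
        if abs(sample - base) / base > threshold:
            touches.append(i)
    return touches

# Example: positions 2 and 5 reflect far more light than the baseline,
# as a hovering hand would.
baseline = [1.0, 1.0, 1.0, 1.0, 1.0, 1.0]
samples  = [1.0, 1.05, 1.6, 0.98, 1.02, 1.7]
print(detect_touch(samples, baseline))  # → [2, 5]
```

In a real device the comparison would run per scan position as the mirrors sweep the lasers across the surface, with the baseline recalibrated as lighting conditions change.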
When combined with specialized software, the device would let kids, for instance, control toy robots without needing a touch screen. Bosch also pointed to applications in augmented reality and automotive displays.
The product release shows that engineers are thinking about how smaller sensors and more powerful computers can transform our working relationship with machines. Television remotes and keyboards are giving way to voice controls, motion control for toys, and brain implants for moving robotic limbs.
The smartphone has also evolved into an interface, remotely controlling other devices, such as thermostats or manufacturing equipment, over the cloud. The changing face of how people interact with computers is one of the topics at Mobile World Congress in Barcelona this week.
The device, called the BML050, represents more of an incremental step in the evolution of the interface than the voice control popularized by Amazon's Alexa and Google's Home devices. Bosch Sensortec is offering software support for major operating systems and a reference design to help engineers build projectors around the sensor.
Other technologies are going for a clean break from tradition. Google is developing a gesture control system for computers using a radar chip called Soli. It has been designed to track slight finger movements, and it has also been shown to recognize variations in the surface and composition of objects placed against it.
(Image courtesy of Bosch Sensortec).
An early commercial success for gesture control came out of PrimeSense, a company founded by two Israeli engineers, Aviad Maizels and Alexander Shpunt, in 2005. The company built three-dimensional cameras for systems like Microsoft’s Kinect before it was sold to Apple for $360 million in 2013.
Their technology casts an invisible grid of infrared light in front of a computer or television, measuring how hand gestures distort the light. Algorithms process those distortions to make out how a person is moving. The process has been emulated by companies like Intel trying to light a fire under the personal computer industry.
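The geometry behind this structured-light approach can be sketched as a simple triangulation: the farther a projected infrared dot appears to shift between the reference pattern and the camera image, the closer the surface it landed on. The sketch below is illustrative only; the focal length, baseline, and dot positions are made-up numbers, not PrimeSense's actual parameters or algorithm.

```python
# Illustrative structured-light depth sketch, not PrimeSense's actual
# algorithm. The focal length (in pixels), projector-camera baseline
# (in metres), and dot positions are hypothetical values.

def depth_from_shift(ref_x, obs_x, focal_px=580.0, baseline_m=0.075):
    """Triangulate depth from how far a projected IR dot shifts
    between the reference pattern (ref_x) and the observed image
    (obs_x), both in pixels. A larger shift means a closer surface."""
    disparity = obs_x - ref_x
    if disparity <= 0:
        return None  # dot not displaced; surface out of range
    return focal_px * baseline_m / disparity

# A dot shifted by 10 px maps to a depth of 580 * 0.075 / 10 metres.
print(depth_from_shift(ref_x=100.0, obs_x=110.0))
```

Repeating this calculation for thousands of dots yields the depth map that gesture-recognition algorithms then segment into hands and bodies.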
Others are thinking further outside the box. Thalmic Labs, a startup based in Waterloo, Canada, has designed an armband studded with sensor modules that measures the electrical pulses that occur when people move their muscles. It can translate those signals into computer commands. Myo, as the device is called, has left the startup flush with cash from powerful investors.
In September, Thalmic Labs raised $120 million from Intel Capital and Amazon’s Alexa Fund, which provides venture funding for voice control technologies like those found in its Alexa platform. The fund has invested in smart intercom systems and microphones that better identify voices in noisy environments.
These technologies are all targeting interfaces that can be taken anywhere, adding new layers of connectivity to the Internet of Things. “It is not just about how devices communicate or sense their surrounding environments, but increasingly about how technology interacts with human beings,” said Stefan Finkbeiner, chief executive of Bosch Sensortec, in a statement.