While traditional forms of device input—keyboards and mice—remain commonplace, perceptual computing enables “computers to work around us, rather than us continuing to work around them” by understanding our intentions in a more natural way (see “Interview: Barry Solomon Discusses Intel's Perceptual Computing SDK”). Now, the adoption of tiny microelectromechanical systems (MEMS) mirrors could spur further developments in human-computer interaction. The mirrors use the same electrostatic principle that causes our hair to stand on end to sense motion.
The micro-mirrors, provided by STMicroelectronics as part of Intel's perceptual computing initiatives, move thousands of times per second to scan infrared light beams, painting an invisible grid on objects in front of the device. The light reflected back from each object is then captured and analyzed for 3D imaging and gesture applications.
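The article does not detail how depth is recovered from the reflected grid, but scanned-beam systems of this kind typically rely on projector-camera triangulation: knowing the mirror's beam angle, the camera's viewing angle toward the reflected spot, and the baseline distance between projector and camera, the depth of the surface point follows from simple geometry. A minimal sketch of that calculation (an illustrative generic formula, not the actual Intel/STMicroelectronics implementation; all names and values here are assumptions):

```python
import math

def depth_from_triangulation(baseline_m, projector_angle_rad, camera_angle_rad):
    """Estimate the depth of a surface point hit by a scanned beam.

    Generic structured-light triangulation, not the vendors' algorithm:
    the projector and camera sit on a shared baseline; both angles are
    measured from that baseline toward the illuminated point. By the law
    of sines, depth Z = b * sin(p) * sin(c) / sin(p + c).
    """
    p, c = projector_angle_rad, camera_angle_rad
    return baseline_m * math.sin(p) * math.sin(c) / math.sin(p + c)

# Hypothetical example: 10 cm baseline, both angles 60 degrees.
# The point then sits directly above the baseline midpoint.
z = depth_from_triangulation(0.10, math.radians(60), math.radians(60))
print(f"depth: {z:.4f} m")  # about 0.0866 m
```

Sweeping the mirror through thousands of beam angles per second yields one such depth sample per grid point, which is how a full 3D map of the scene can be built up.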
MEMS technology fosters smaller, more robust systems for a variety of consumer devices, ultimately delivering more immersive experiences. It can also sense other environmental factors and actuate or move liquids, allowing sensing and actuation to be integrated more tightly into a single device.
Previously, STMicroelectronics helped develop an extremely thin projection engine that fits into the screen of a laptop or tablet computer. It offers an ultra-wide field of view of almost 90°.