There's a scene in the first Iron Man film (2008)—as well as many in its sequels—where Robert Downey Jr.'s character, Tony Stark, manipulates a digital 3D model of a robot suit simply by moving his hands, as if he were physically touching it. Essentially, he's using hand gestures to control a virtual object. Take a look here:
I had seen similar examples in other films before then, but Iron Man was perhaps the first that really made me think, 'man, if this technology were ever real, it would work wonders for engineering and product design,' or something to that effect.
Sure enough, I recently learned that engineers have been working on making this technology a reality for at least a handful of years. One of the engineering teams on the case is out of Google's Advanced Technology and Projects group (Google ATAP), in the form of its Project Soli. Soli is a new sensing technology that uses miniature radar to detect touchless gesture interactions. Here's how Google ATAP describes the endeavor on Project Soli's website:
"Soli is a purpose-built interaction sensor that uses radar for motion tracking of the human hand. The sensor tracks sub-millimeter motion at high speeds with great accuracy. We're creating a ubiquitous gesture interaction language that will allow people to control devices with a simple, universal set of gestures. We envision a future in which the human hand becomes a universal input device for interacting with technology."
"The Soli chip incorporates the entire sensor and antenna array into an ultra-compact 8mm x 10mm package. The concept of Virtual Tools is key to Soli interactions: Virtual Tools are gestures that mimic familiar interactions with physical tools. This metaphor makes it easier to communicate, learn, and remember Soli interactions."
Interesting Engineering posted a 1-minute video overview of Project Soli in this Tweet from June 10:
Project Soli has actually been in the works since it was first announced in 2015, with a more formal introduction at the Google I/O 2016 conference. In March of 2018, Google asked the FCC for permission to operate its Soli radar in the 57- to 64-GHz frequency band at power levels consistent with European Telecommunications Standards Institute standards—up from the +10dB level previously allowed. Google said field testing showed that without the higher power levels, blind spots could occur close to the sensor location.
Google finally got its wish at the end of that year, when the FCC stated on Dec. 31 that it had granted Google a waiver to operate the Soli sensors at a higher radiated power of +13dB, and that the sensors could also be operated aboard aircraft. The FCC said its decision “will serve the public interest by providing for innovative device control features using touchless hand gesture technology.” The decision allowed Google ATAP to take the next step in engineering and testing the technology.
Google ATAP posted this 4-minute overview of Project Soli back in May 2015: