Waving your hands around to control an Augmented Reality interface might get a major update in the near future. Google, always at the forefront of our imagination, has gotten a green light from the FCC to run with its Project Soli hand-tracking tech. If pursued relentlessly, the project will be a direct competitor to Leap Motion. Rather than being image-based, Soli's tracking runs on radar.

It was not really until Leap Motion filed its, for the time, groundbreaking hand-tracking technology that AR interaction took full swing. Just as we had all but gotten used to Leap Motion as the default controller, the Federal Communications Commission gave Google clearance to change how we imagine AR interaction. Project Soli busies itself with devising sensors that are driven by hand gestures. The current developmental stage comes with a twist: a waiver that also covers operating the sensors aboard aircraft.

At its core, the Project Soli hand-tracking system is an extension of another Google hopeful by the name of Jacquard. While Jacquard is a more design-oriented clothing venture, Soli is a potential game-changer in the AR interaction space. It would give Google its very own tracking tool and ensure much-needed competition in the industry. Hand-tracked control itself is nothing we haven't seen, but Google's take on it is what counts. Instead of relying on image sensors like Leap Motion does, Project Soli recognizes hand gestures with radar sensors. It is also significantly lighter: the technology is designed to fit on a microchip, which might cut down the bulk and weight of current-gen Augmented Reality devices.

The idea is not new to Google. The hand-tracking waiver request was submitted for approval back in March, and there is a three-year-old video out there showcasing the potential use cases. The parties called upon to comment on it were Facebook and Qualcomm, which resulted in Google lowering the operating power levels. The approved levels are still above what current regulations allow, but the FCC agreed that the public interest is better served by letting Google carry on with the tests. 'We find', the FCC writes in its decision, 'that grant of the waiver will serve the public interest by providing for innovative device control features using touchless hand gesture technology'.

A word on Project Soli's hand-tracking potential. The technology uses radar to catch the infinitely fine-tuned gestures of the human hand. Gestures become signals that the hand-tracking tech recognizes and puts to good use. What this means is that Google is harnessing the breadth and finesse of the human hand and essentially turning it into a controller. Using radar rather than cameras, even the tiniest motion can become a signal. The aforementioned video gives gesture examples that correspond perfectly to the haptic sensations we would feel were we to execute them in the real world. Pressing a button or sliding a volume rocker, it turns out, are perfectly intuitive actions with the hand as the device.
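Soli's developer API is not public in this context, but the basic idea, radar-detected micro-gestures becoming signals that drive UI actions, can be sketched. The gesture names, the GestureEvent type, and the dispatch function below are purely illustrative assumptions, not Soli's actual interface.

```python
# Hypothetical sketch: routing radar-detected micro-gestures to control signals.
# Gesture names, confidence values, and the GestureEvent type are assumptions
# made for illustration; they are not Project Soli's real API.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class GestureEvent:
    name: str          # e.g. "button_press", "volume_slide"
    magnitude: float   # how far the virtual control moved
    confidence: float  # classifier confidence from the radar pipeline

def press_play(event: GestureEvent) -> None:
    print("play/pause toggled")

def adjust_volume(event: GestureEvent) -> None:
    # Scale the slide distance into a volume delta.
    print(f"volume changed by {event.magnitude * 10:.1f}%")

# Each recognized micro-gesture becomes a signal mapped to a UI action.
HANDLERS: Dict[str, Callable[[GestureEvent], None]] = {
    "button_press": press_play,
    "volume_slide": adjust_volume,
}

def dispatch(event: GestureEvent, threshold: float = 0.8) -> None:
    """Ignore low-confidence detections, otherwise fire the mapped action."""
    if event.confidence >= threshold and event.name in HANDLERS:
        HANDLERS[event.name](event)

# Example: a confident "volume_slide" detection from the radar front end.
dispatch(GestureEvent(name="volume_slide", magnitude=0.4, confidence=0.93))
```

The point of the sketch is only that a tiny, familiar hand motion ends up as an ordinary software event, which is what makes the hand itself the controller.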

One could push the argument that what Project Soli's hand-tracking technology set out to do is invent a new haptic language for interaction. When we will see it in action is another matter. Our guess is that Augmented Reality is the logical first step.