Google’s Advanced Projects Division Debuts ‘Hand Motion’ Tech

Given the lukewarm response received so far to wearable technology, it makes sense that Google would want to expand the definition of the term. So the company has introduced Project Soli, an attempt by its Advanced Technology and Projects (ATAP) division to harness the power of hand and finger manipulation to make it easier to interact with ever-smaller devices and screens.

The idea behind the technology, introduced at Google's I/O 2015 developers' conference this week, is that instead of using tools, users should employ what ATAP calls a hand motion vocabulary to control devices, even devices they aren't holding at the moment.

With Soli, the gestures users already make to control their phones -- touching, tapping and swiping -- can be applied to other things. The system responds to the slightest of signals and interprets them to build a subtle, multidimensional picture of exactly what the user's hands are doing.

"Your hand can be a complete, self-contained interface control," Ivan Poupyrev, who leads the projects for ATAP, told the I/O audience. "It's always with you."

ATAP's debut of Soli came along with the announcement of Project Jacquard, which proposes to make interactive textiles not just a novelty, but something that the global fashion industry could adopt. Project Jacquard makes it possible to weave touch and gesture interactivity into any textile using standard industrial looms, according to Google. Everyday objects such as clothes and furniture can be transformed into interactive surfaces.

Let Your Fingers Do It

In Soli, haptic feedback is included, since a user's hand naturally provides its own version when fingertips create friction by touching. Soli is designed to reimagine the user's hand as its own user interface.

Soli lets users control devices using natural hand motions, detecting fine motions accurately and precisely -- one could install a sensor under a table or...
