Project Soli, under development at Google ATAP, is a new sensing technology that tracks sub-millimeter motion with high accuracy. Unlike traditional sensing technology that relies on cameras, lenses, and moving parts, Soli uses radar that can be built into a single microchip.
Soli is built to track minute hand motions so users can interact with wearables, IoT devices, and other hardware. It works by emitting a broad beam of electromagnetic waves (in the 60 GHz ISM band) toward the hand and measuring the energy reflected back to the radar sensor. Soli then interprets those hand gestures with great accuracy and turns them into commands, operating at frame rates between 100 and 10,000 fps.
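As a back-of-the-envelope illustration (not part of the Soli SDK), the 60 GHz carrier corresponds to a wavelength of about 5 mm, which is why radar at this frequency is sensitive to sub-millimeter motion: even a half-millimeter displacement of a fingertip produces a large, measurable phase shift in the reflected signal. A minimal sketch of the arithmetic:

```python
# Back-of-the-envelope numbers for a 60 GHz radar; illustrative physics only,
# not Soli-specific signal processing.
C = 299_792_458.0  # speed of light, m/s

def wavelength(freq_hz: float) -> float:
    """Carrier wavelength in metres for a given frequency."""
    return C / freq_hz

def phase_shift_deg(displacement_m: float, freq_hz: float) -> float:
    """Round-trip phase change (degrees) when the target moves by displacement_m.

    The reflected path changes by twice the displacement (out and back),
    so the phase shift is 2 * d / wavelength, expressed in degrees.
    """
    lam = wavelength(freq_hz)
    return (2 * displacement_m / lam) * 360.0

lam_mm = wavelength(60e9) * 1e3          # ~5.0 mm carrier wavelength
shift = phase_shift_deg(0.0005, 60e9)    # 0.5 mm finger motion -> ~72 degrees
```

This is why a chip-scale radar can resolve fine finger movement without any optics: the motion is read out as phase, not as pixels.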
Building on this, the developers are creating a set of virtual tool gestures, including button pressing, dial rotation, and slider interaction. Because users feel a haptic sensation when their fingers touch each other, these virtual gestures help make hand motions more precise.
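One way to picture how recognized gestures become commands is a simple dispatch layer that routes gesture labels to device actions. The gesture names and handler interface below are illustrative assumptions, not the actual Soli SDK API:

```python
# Hypothetical sketch: routing recognized virtual-tool gestures to device
# commands. Gesture labels and handlers are illustrative, not the Soli SDK.
from typing import Callable, Dict

Handler = Callable[[float], str]

class GestureDispatcher:
    """Maps gesture labels (e.g. from a recognizer) to command handlers."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Handler] = {}

    def register(self, gesture: str, handler: Handler) -> None:
        self._handlers[gesture] = handler

    def dispatch(self, gesture: str, value: float) -> str:
        """Invoke the handler for a gesture; unknown gestures are ignored."""
        handler = self._handlers.get(gesture)
        if handler is None:
            return "ignored"
        return handler(value)

dispatcher = GestureDispatcher()
dispatcher.register("dial_rotation", lambda deg: f"volume {deg:+.0f} deg")
dispatcher.register("button_press", lambda _: "toggle power")

result = dispatcher.dispatch("dial_rotation", 15.0)  # "volume +15 deg"
```

The value parameter lets continuous gestures (a dial turn, a slider drag) carry magnitude, while discrete gestures like a button press can ignore it.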
The Soli SDK is hardware-neutral and lets developers easily build on the virtual gesture library. Developers using the Soli Alpha Dev Kit have built applications for object recognition, 3D imaging, predictive drawing, in-car remote control, security, and visualization.
To learn more: