Demo in a meeting room
"How do you interact with invisible computers?" If that question is raised in a Google meeting room, one might expect a discussion of futuristic concepts rather than a physical demonstration.
At a Google meeting, a smartwatch sat on the table. Snap a finger a few inches from the watch face, and the dial would "respond."
Project Soli and ATAP
Ivan Poupyrev leads Project Soli at Google's Advanced Technologies and Projects (ATAP) lab. Project Soli was developed to show that consumer electronics can integrate miniature radar chips so users can control devices with very small gestures. Integrating radar into a smartwatch was intended to demonstrate how users could interact with an otherwise invisible computer.
ATAP is a unit within Google. Its previous director was Regina Dugan, formerly of the Defense Advanced Research Projects Agency (DARPA). Projects that have been associated with ATAP include modular smartphones (Project Ara), real-time 3D mapping (Tango), and virtual-reality storytelling (Spotlight Stories). Dugan left for Facebook earlier this year, leaving uncertainty about the future of some projects. Tango has moved from ATAP into Google, while Project Ara has faced difficulties.
Building a radar-based consumer platform
Some ATAP efforts, such as the Jacquard textile project and Soli, remain active at the lab. Soli has a broader goal: to establish an industry and design language for consumer electronics equipped with radar technology. This objective explains why Poupyrev's team pursued not only experiments but also concrete demonstrations of integrating radar into a smartwatch.
Poupyrev has said that if a technology can be integrated into a smartwatch, it can be integrated into other products as well. ATAP redesigned the Soli chip to reduce its size, lower its power consumption, and optimize its performance. According to Soli lead and hardware product engineer Hakim Raja, the team achieved significant miniaturization: the chip is very thin, and its four antennas transmit and receive radar signals simultaneously, operating in full duplex. The first-generation Soli chip in the development kit consumed 1.2 W; the latest chip consumes 0.054 W, a roughly 22-fold reduction.
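The quoted power figures can be sanity-checked with a line of arithmetic (the wattages come from the text above; the snippet is only an illustration):

```python
# Soli chip power consumption, per the figures quoted in the text.
first_gen_watts = 1.2   # first-generation chip in the development kit
latest_watts = 0.054    # latest-generation chip

reduction = first_gen_watts / latest_watts
print(f"power reduced by a factor of {reduction:.1f}")  # → 22.2
```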
Challenges of scaling radar down
Making the chip so small brings trade-offs. Traditional radar systems are designed to detect metal objects miles away, not millimeter-scale finger motions a few inches from the sensor. Until recently, radar designers never had to contend with the power constraints and signal-processing challenges that a chip this small imposes.
Signal processing and machine learning
Jaime Lien, Soli's lead researcher, focuses on optimizing the machine learning algorithms integrated into the chip. She emphasized the need to convert the spatial information provided by radar into time-domain signals that a computer can process. Noise at this scale is exceptionally difficult to deal with: the algorithms must extract the desired signals from a large amount of noise, and because beamforming is not feasible, every signal that reaches the chip must be captured and sorted through. In short, it is a complex problem.
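Soli's actual processing pipeline is not spelled out here, but the general idea of pulling a weak radar return out of noise can be sketched with standard techniques. The minimal example below simulates a beat tone such as an FMCW radar might produce for a nearby target, buries it in noise, and recovers it with an FFT; every parameter (sample rate, tone frequency, noise level) is invented for illustration and has nothing to do with Soli's real chip:

```python
import numpy as np

# Toy illustration: recover a radar beat tone buried in noise.
# In FMCW radar, a target's range maps to a beat frequency, so the
# magnitude spectrum of one chirp acts as a range profile.
fs = 1_000_000                      # receiver sample rate, Hz (made up)
n = 1024                            # samples per chirp
t = np.arange(n) / fs

beat_freq = 50_000.0                # beat tone from a hypothetical target, Hz
rng = np.random.default_rng(0)
noisy = np.cos(2 * np.pi * beat_freq * t) + 2.0 * rng.standard_normal(n)

# Windowed FFT: the tone stacks up coherently, the noise does not.
spectrum = np.abs(np.fft.rfft(noisy * np.hanning(n)))
freqs = np.fft.rfftfreq(n, d=1 / fs)

peak = freqs[np.argmax(spectrum)]
print(f"strongest return at {peak / 1000:.1f} kHz")  # ≈ 50 kHz despite the noise
```

Tracking how such a profile changes from chirp to chirp is what turns spatial radar information into the time-domain signals gesture recognition can work with.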
Compared with the electronics engineering, the machine learning layer that maps recognized gestures to device actions is relatively straightforward, though far from trivial. Touchscreen devices present buttons and sliders on the display, and devices with physical switches offer tactile detents. With no visual or tactile affordance, however, the challenge becomes how to guide users to perform the correct gestures.
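The mapping layer described above can be pictured as a thin dispatch step on top of the recognizer. The sketch below uses a nearest-centroid classifier over made-up feature vectors; the gesture names, features, and actions are all hypothetical, not Soli's actual vocabulary:

```python
import numpy as np

# Hypothetical mean feature vectors per gesture
# (e.g. energy, radial velocity, duration) -- invented for illustration.
GESTURE_TEMPLATES = {
    "finger_snap": np.array([0.9, 0.8, 0.1]),
    "dial_turn":   np.array([0.3, 0.5, 0.9]),
    "tap":         np.array([0.7, 0.2, 0.2]),
}

# The mapping layer: recognized gesture -> device action.
ACTIONS = {
    "finger_snap": lambda: "wake screen",
    "dial_turn":   lambda: "scroll list",
    "tap":         lambda: "select item",
}

def classify(features: np.ndarray) -> str:
    """Nearest-centroid classification against the gesture templates."""
    return min(GESTURE_TEMPLATES,
               key=lambda g: np.linalg.norm(features - GESTURE_TEMPLATES[g]))

def dispatch(features: np.ndarray) -> str:
    """Run the recognized gesture's device action."""
    return ACTIONS[classify(features)]()

print(dispatch(np.array([0.85, 0.75, 0.15])))  # → wake screen
```

The hard part, as the text notes, is not this dispatch table but ensuring users produce gestures close enough to the templates in the first place when the device offers no visual or tactile cues.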