Google bets that smart fabrics, gesture interfaces will replace fiddling with a tiny wearable touchscreen

29.05.2015
Instead of making you scroll through a contacts list on a tiny smartwatch, Google thinks a touch on your sleeve should be enough to initiate a phone call.

That's just one example of what developers might be able to do with Project Jacquard, unveiled at the I/O developer conference on Friday. The project centers on fabric that incorporates touch sensors, so clothing can offer some of the functionality of a touchscreen.

Project Jacquard is one way that Google is attempting to develop new user interfaces for mobile devices that rely less on touchscreens and more on gestures. As the screen sizes on mobile devices shrink, people are reaching the limit of what they're able to accomplish, especially with wearables, said Regina Dugan, a Google senior executive who works in the company's Advanced Technology and Projects group.

During a demonstration of a suit jacket made with the smart fabric, a person placed a call on his smartphone by touching the garment's sleeve. The jacket needed additional electronics to work, and it's unclear how the technology would hold up to getting wet, for example during cleaning.

While tactile fabric isn't new, Google wants to make the technology more than a novelty and incorporate it into everyday fashion, said Ivan Poupyrev, technical program lead at Google. To that end, the company is working with Levi's to create clothes made with Project Jacquard fabric, though it gave no details on when these garments would go on sale or how much they would cost.

Google, though, realizes that it must work with the fashion and textile industry if it wants to see its smart fabric end up in clothing, Poupyrev said. The fabric can withstand manufacturing processes and is available in a range of colors, he said. Google is building the entire system for smart fabrics, from the yarns to the apps, technologies and services that would work with them.

I/O attendees were able to try out the smart fabric in a demonstration area, where sliding a finger across or up and down the fabric controlled nearby electronics. In one demo, people used the fabric to manipulate a 3D image shown on a display; in others, a swatch of fabric controlled lighting and changed the song playing on a smartphone.
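Google has not published developer interfaces for Jacquard, so the following is only a sketch of the interaction model the demos suggest: a panel of touch-sensitive fabric reports swipe gestures, and an app maps them to actions. The FabricPanel, SwipeListener and Direction names below are hypothetical, invented here purely for illustration.

```java
// Illustrative sketch only: Google has not published Project Jacquard's API.
// FabricPanel, SwipeListener and Direction are hypothetical types, modeled on
// the I/O demos where swipes on a fabric swatch drove nearby devices.
public class FabricDemo {

    enum Direction { LEFT, RIGHT, UP, DOWN }

    /** Hypothetical callback for gestures sensed by the fabric's touch grid. */
    interface SwipeListener {
        void onSwipe(Direction direction);
    }

    static class FabricPanel {
        private SwipeListener listener;

        void setSwipeListener(SwipeListener l) { this.listener = l; }

        // In a real garment this would be driven by the fabric's electronics.
        void simulateSwipe(Direction d) {
            if (listener != null) listener.onSwipe(d);
        }
    }

    public static void main(String[] args) {
        FabricPanel sleeve = new FabricPanel();

        // Map fabric gestures to the actions shown at I/O: horizontal swipes
        // skip tracks, vertical swipes adjust the lights.
        sleeve.setSwipeListener(direction -> {
            switch (direction) {
                case LEFT:  System.out.println("music: previous track"); break;
                case RIGHT: System.out.println("music: next track");     break;
                case UP:    System.out.println("lights: brighter");      break;
                case DOWN:  System.out.println("lights: dimmer");        break;
            }
        });

        sleeve.simulateSwipe(Direction.RIGHT); // prints "music: next track"
    }
}
```

The shape deliberately mirrors familiar touchscreen event listeners, which appears to be the point of the project: to a developer, a sleeve would look like just another input surface.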

Google also showed a gesture radar, called Project Soli, that can capture and interpret a person's hand movements. Instead of scrolling through a contact list on a smartwatch screen, for example, a person could perform that task by making a scrolling motion in the air above a wearable equipped with the sensor. To change the volume on a device, one could make a gesture that resembles turning a dial.

The aim is to replace all of the physical controls needed to operate a wearable with hand gestures, Poupyrev said. Radar can capture subtle 3D movements and works in daylight or darkness, among other qualities Google wanted in a sensor, he said. For Project Soli, Google and its partners shrank the radar functionality down to a chip that can fit inside a wearable.

The gesture radar can distinguish distance as well as gestures. During a demonstration, Poupyrev changed the time on a digital watch face by holding his hand at different heights: gesturing as if turning a dial close to the chip changed the hours, while making the same gesture higher up changed the minutes.

Google is building an API (application programming interface) for the gesture radar and will release it later this year.
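Since the API hasn't shipped, its actual shape is unknown. As a purely hypothetical sketch of the watch demo's logic, the RadarEvent type and onRadarEvent callback below are invented stand-ins: a recognized dial gesture adjusts hours or minutes depending on how far the hand is from the chip.

```java
// Illustrative sketch only: the Project Soli API had not been released when
// this was written, so RadarEvent and its fields are hypothetical stand-ins.
// The logic mirrors the I/O watch demo: a dial gesture near the chip changes
// hours, and the same gesture higher up changes minutes.
public class SoliWatchDemo {

    /** Hypothetical reading from the radar chip. */
    static class RadarEvent {
        final boolean dialGesture;  // true if a turn-the-dial motion was recognized
        final double distanceCm;    // estimated hand height above the chip
        final int ticks;            // signed number of "dial" increments

        RadarEvent(boolean dialGesture, double distanceCm, int ticks) {
            this.dialGesture = dialGesture;
            this.distanceCm = distanceCm;
            this.ticks = ticks;
        }
    }

    private int hours = 12;
    private int minutes = 0;

    /** Assumed boundary between the "hours" band and the "minutes" band. */
    private static final double BAND_BOUNDARY_CM = 8.0;

    void onRadarEvent(RadarEvent e) {
        if (!e.dialGesture) return;
        if (e.distanceCm < BAND_BOUNDARY_CM) {
            hours = Math.floorMod(hours + e.ticks, 24);   // near the chip
        } else {
            minutes = Math.floorMod(minutes + e.ticks, 60); // higher up
        }
        System.out.printf("watch face: %02d:%02d%n", hours, minutes);
    }

    public static void main(String[] args) {
        SoliWatchDemo watch = new SoliWatchDemo();
        watch.onRadarEvent(new RadarEvent(true, 4.0, 1));   // prints "watch face: 13:00"
        watch.onRadarEvent(new RadarEvent(true, 12.0, 30)); // prints "watch face: 13:30"
    }
}
```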

Fred O'Connor writes about IT careers and health IT for The IDG News Service. Follow Fred on Twitter at @fredjoconnor. Fred's e-mail address is fred_o'connor@idg.com
