Inside Google's Advanced Technology and Projects group

30.06.2015
"Epic" was the word Regina Dugan used to describe her team's research and development projects that included enlisting "Fast and Furious" movie franchise director Justin Lin to help create the next-generation movie experience. Dugan, vice president of Google's Advanced Technology and Projects (ATAP) group, delighted thousands of developers in May at Google's annual I/O conference as she orchestrated demonstrations of applied technologies that seemed to originate from just over the horizon of most humans' imaginations.

The 36 pages of results returned when you Google her name confirm Dugan's fame, making kudos redundant, but recounting where she came from is important because it explains ATAP's role within Google.

Dugan's previous job was head of the Defense Advanced Research Projects Agency (DARPA), a Department of Defense research agency founded in 1958 with a budget of about $3 billion that's charged with preventing strategic surprises from America's adversaries and creating strategic surprises for them. DARPA earned a reputation for producing high-impact results quickly. A few of DARPA's innovations include the internet, the Global Positioning System (GPS), drones and micro-electro-mechanical systems (MEMS).

DARPA isn't a monolithic government agency. It doesn't have its own labs or a large R&D staff. Instead, the agency recruits teams of highly accomplished technical leaders, usually PhDs and experts in their fields, to work on short three-to-five-year projects with university and industry partners. The goals are ambitious ... such as building a hypersonic test vehicle to fly at Mach 20 (about 15,200 mph). DARPA doesn't build products; it proves or disproves the feasibility of building a solution that could later be developed into a product. This is an important distinction that explains DARPA's role: success in building a prototype gives tangible evidence that a strategic end-product could be built.

Dugan joined Google to create the ATAP group, where she could apply the DARPA model to speed Google's strategic research projects. She traded the Department of Defense's deep pockets for Google's, and exchanged fighting America's adversaries for fighting perplexing product development challenges.


Dugan's first demonstrations addressed the unique user interface (UI) problems posed by the small size of wearables such as smartwatches. Their tiny, or in some cases nonexistent, screens essentially call for new UIs.

ATAP applied proven radar technology to a touchless UI that controls another device by interpreting fine finger movements and hand gestures made in the air. The prototype device shown at I/O 15 captured the radar reflection of a movement and applied machine learning to recognize it accurately. Any movement interpreted by the radar UI can serve as a metaphor for any input, such as up, down, make a call or take a photo. One such metaphor was demonstrated with a thumb-and-forefinger twirling motion in the air that reset the time on a digital clock display. The concept is explained in a few words and a roughly minute-and-a-half video.
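To make the gesture-as-metaphor idea concrete, here is a minimal sketch of such a pipeline. It is an assumption-laden illustration, not ATAP's actual system: the feature extraction, the nearest-centroid classifier and the gesture-to-action table are all placeholders standing in for whatever the real machine-learned recognizer does.

```python
# Illustrative sketch only: gesture labels, features and classifier are assumptions.
import numpy as np

# Hypothetical mapping from a recognized gesture to an arbitrary input action.
ACTIONS = {
    "twirl": lambda: print("advancing clock time"),
    "tap":   lambda: print("answering call"),
    "swipe": lambda: print("dismissing notification"),
}

def extract_features(radar_frames):
    """Reduce a burst of radar reflections to a tiny feature vector
    (mean energy, frame-to-frame change, spread)."""
    energy = radar_frames.sum(axis=1)  # reflected energy per frame
    return np.array([energy.mean(), np.abs(np.diff(energy)).mean(), energy.std()])

def classify(features, centroids):
    """Nearest-centroid stand-in for the machine-learned recognizer."""
    return min(centroids, key=lambda g: np.linalg.norm(features - centroids[g]))

# Toy "trained" centroids and a fake burst of 20 radar frames x 64 range bins.
centroids = {"twirl": np.array([5.0, 0.8, 1.2]),
             "tap":   np.array([2.0, 0.2, 0.3]),
             "swipe": np.array([3.5, 1.5, 0.9])}
frames = np.random.default_rng(0).random((20, 64)) * 4
ACTIONS[classify(extract_features(frames), centroids)]()
```

The point of the dispatch table is the same one Dugan's team made: once a motion is recognized, it can stand for any input the device designer chooses.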

Cut from whole cloth

A second demonstration, of a wearable product code-named Project Jacquard, showed how to weave a multi-touch input panel, akin to a mouse pad, into regular cloth using the textile industry's existing processes. Multi-touch itself isn't new; researchers have experimented with touch input since the 1960s, and multi-touch is now used in every touchpad and touch screen. No need for ATAP's high-powered R&D for that. Redesigning multi-touch so that it could be produced by the textile industry at scale, however, is right in ATAP's wheelhouse.

ATAP's Ivan Poupyrev described the route that began with hand weaving conductive yarns into cloth to make a prototype multi-touch panel. He stepped through the collaboration with textile industry partners to redesign the hand-made prototype so that it could be made in textile production plants using unmodified (legacy) spinning and weaving equipment.
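As a way to picture what the woven panel does electrically, here is a minimal sketch, entirely my own assumption rather than ATAP's published design, of scanning a small grid of conductive-thread crossings for a touch:

```python
# Illustrative sketch only: the thread layout, readings and threshold are assumptions.
import numpy as np

def locate_touch(capacitance, threshold=0.5):
    """Scan a grid of capacitance readings (one per warp/weft thread crossing)
    and return the (row, col) of the strongest touch, or None if nothing is pressed."""
    if capacitance.max() < threshold:
        return None
    row, col = np.unravel_index(capacitance.argmax(), capacitance.shape)
    return int(row), int(col)

# Fake readings from a 4x4 patch of conductive threads; a finger near
# row 2, column 1 raises the local capacitance.
readings = np.full((4, 4), 0.1)
readings[2, 1] = 0.9
print(locate_touch(readings))  # prints (2, 1)
```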

The path to proving the feasibility of Project Jacquard reached its destination when a multi-touch panel was woven into cloth in a textile factory, shipped to a Savile Row tailor in London and sewn into a jacket. As conclusive proof, a telephone call was placed from a smartphone with a swipe of the jacket sleeve.

Poupyrev made an important distinction: Project Jacquard demonstrated feasibility, not specific applications. Those, he hoped, would be engineered by software developers and tailored into fashion by designers creating new applications for soft e-textile computing. Poupyrev proved that, if Google wanted to, it could turn ATAP's design over to manufacturing at scale, much as the U.S. Department of Defense could take a DARPA design into production.

Passé passwords

Dugan's notorious ire over password authentication has also become the object of ATAP's attention, in a project called Abacus. ATAP recruited 25 experts from 16 institutions in 10 countries to collaborate on signing in, or authenticating, without passwords during a 90-day design session at Google's Mountain View facility. Applying machine learning, the collaborators built an app that vouches for the identity of the user based on a multimodal assessment of his or her behavior.

By enlisting smartphone sensors such as the camera and accelerometer, ATAP replaces passwords by collecting data about the user's unique patterns of behavior while typing, talking, changing facial expressions and walking. Combined, this data can uniquely identify the user carrying and interacting with a smartphone, almost the way a baby identifies its mother by how she carries and interacts with her infant.

Individually, these patterns may be a weaker security defense than a simple four-digit PIN code, but combined they're stronger than the best fingerprint reader. Dugan declared success in her war against the password, saying: "The result: proof of the hypothesis, a new method of authentication that may prove to be 10-fold more secure than the best fingerprint sensors. The hope is that with only a software update we can provide this level of security to millions of Android devices."
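One way to picture that fusion of weak signals into a strong one is the toy sketch below. The modalities, weights and threshold are assumptions made for illustration, not the actual Abacus model:

```python
# Illustrative sketch only: modalities, weights and threshold are assumptions.
def continuous_auth_score(modality_scores, weights):
    """Fuse per-modality match scores (0 = stranger, 1 = device owner) into one
    confidence value, weighting each signal by how discriminative it is assumed to be."""
    total = sum(weights.values())
    return sum(weights[m] * modality_scores.get(m, 0.0) for m in weights) / total

# Each signal alone is weak, but together they push confidence past the bar.
scores  = {"typing": 0.7, "gait": 0.6, "voice": 0.8, "face": 0.75}
weights = {"typing": 1.0, "gait": 0.5, "voice": 1.5, "face": 2.0}
confidence = continuous_auth_score(scores, weights)
print(f"confidence={confidence:.2f}",
      "unlock" if confidence > 0.7 else "ask for PIN")  # confidence=0.74 -> unlock
```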

Fast and furious technology breakthroughs

Google also delivered its fourth Spotlight Story, filmed as a 360-degree immersive movie that can be watched from a spectator's vantage point in the middle of the movie set using a smartphone and earbuds. Google partnered with Justin Lin, director of the "Fast and Furious" movie franchise and the upcoming "Star Trek III," to film a 360-degree action movie. Titled "Help," the film is about an extraterrestrial reptile that lands on Earth and grows from baby to velociraptor to a Godzilla-sized monster terrorizing a Gotham-like city amid special effects.

The moviegoer can look in any direction, up, down, right, left, front and back, seeing the movie through the window of his or her smartphone as if standing in the middle of the scene. Mere words can't fully convey the 360-degree experience; downloading the Spotlight player and movies from Google Play or iTunes gives a richer, first-hand understanding of ATAP and Lin's movie.

The creation of a 360-degree action movie demanded a partnership between a creative movie director, who could articulate the camera equipment he needed, and ATAP, which could swiftly build that equipment. ATAP technical project lead Rachid El Guerrab's account of director Lin's needs and the camera rig designed to meet them gives insight into just one dimension of creating a 360-degree movie. The solution ATAP built for Lin is much bigger and more complex, including a directional audio system that follows the observer's position, management tools to help the director visualize and plan the video shoot, and integration with digital-effects authoring tools so the virtual velociraptor can be merged into scenes with the human actors.
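To give a flavor of just one piece of that system, the directional audio, here is a toy sketch of equal-power stereo panning driven by the viewer's orientation. It is an illustrative assumption, not the audio engine ATAP actually built:

```python
# Illustrative sketch only: a toy stereo-panning model for audio that follows
# where the viewer is looking; the real system is far more elaborate.
import math

def stereo_gains(source_bearing_deg, viewer_yaw_deg):
    """Pan a sound source left/right based on where it sits relative to the
    direction the viewer is currently facing (equal-power panning)."""
    relative = math.radians(source_bearing_deg - viewer_yaw_deg)
    pan = math.sin(relative)                     # -1 = hard left, +1 = hard right
    left = math.cos((pan + 1) * math.pi / 4)
    right = math.sin((pan + 1) * math.pi / 4)
    return left, right

# The monster roars 90 degrees to the viewer's right, then the viewer turns toward it.
print(stereo_gains(90, 0))    # mostly in the right ear
print(stereo_gains(90, 90))   # centered once the viewer faces the monster
```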

ATAP boldly goes where no man has gone before

ATAP's developments all address strategic opportunities or risks for Google. Wearable technologies will become a large market, one that IDC estimates will reach 126 million units in 2019, but obstacles to how users interact with wearables limit their usefulness. Commercial development of ATAP's radar motion detector or woven multi-touch interface could give Google a strategic wearable advantage in the same way Android does, even though Google gives the technology away.

Passwords are another strategic issue for Google. Android mobile devices need strong authentication, and Project Abacus could be a strategic response to Apple's fingerprint reader. Authentication is also a key component of all of Google's apps: with increasing frequency, people use Google, Facebook and Twitter credentials to register and log in to websites. If Project Abacus proves Google's authentication is stronger and safer, consumers will choose it over the alternatives.

Google's interest in 3D and virtual reality anticipates new ways of interacting with internet content. Google led an investment of over $500 million in virtual reality startup Magic Leap, a company that aims to merge virtual and real worlds. Virtual reality has reached mainstream attention and appears ready to grow and generate serious revenue. The engineering behind 360-degree Spotlight video is a precursor to creating virtual reality content on a massive scale and consuming it en masse. It gives Google first-hand experience that it can use to lead its strategy and investment in this emerging market.

And it'll be just one more feasibility-to-reality success for Dugan's ATAP wizards.

(www.cio.com)

Steven Max Patterson