The cutest demo was Google’s Vision Bot, a little robot on wheels equipped with a camera that scoots around and identifies the objects it sees. The bot used a photo recognition database stored in the cloud and displayed what it was thinking on a screen. For example, when I held up a daisy, Vision Bot responded with words like “flower” and “plant.” It also recognized basic facial expressions, so when I smiled at it, it raced towards me; when I gave it a menacing look, it scurried away from me as fast as it could.
Next up was a 180-degree photo booth, where a semi-circle of Nexus phones photographed me simultaneously and pieced the images together into a GIF. The results looked like something out of The Matrix.
And of course, you can’t have a conference about tech and not feature Star Wars in at least one demo space. I played a Google Chrome-based game called Lightsaber Escape, using a Nexus phone to control my moves. The premise of the game was that I was a member of the Rebel Alliance, fighting my way through a mob of Stormtroopers.
So, what did I learn at the Google Cloud Studio? That cloud-based technology threads experiences together in simple (and complex) ways that may not be obvious at first—and that I can get some pretty solid airtime with an explosive jump.