Nervana's cloud platform makes deep learning more widely available

February 29, 2016
Deep learning has traditionally been accessible only to the largest organizations, but that's starting to change. On Monday, an AI startup called Nervana launched a cloud offering for what it calls deep learning on demand.

Nervana Cloud is a hosted platform designed to give organizations of all sizes the ability to quickly build and deploy deep-learning tools without having to invest in hardware or a large team of experts. Based on neon, Nervana's open-source deep-learning framework, the full-stack offering is optimized to handle complex machine-learning problems at scale.
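For context, neon exposes a compact Python API for defining and training networks. The snippet below is a minimal sketch in the spirit of neon's published examples, training a small multilayer perceptron on placeholder data; exact module paths and constructor signatures (such as ArrayIterator and Callbacks) have varied across neon releases, so treat it as illustrative rather than canonical.

```python
import numpy as np

from neon.backends import gen_backend
from neon.data import ArrayIterator
from neon.initializers import Gaussian
from neon.layers import Affine, GeneralizedCost
from neon.models import Model
from neon.optimizers import GradientDescentMomentum
from neon.transforms import Rectlin, Softmax, CrossEntropyMulti
from neon.callbacks.callbacks import Callbacks

# Set up a compute backend (CPU here; 'gpu' if a supported GPU is available).
be = gen_backend(backend='cpu', batch_size=128)

# Toy data standing in for a real dataset: 1024 samples, 784 features, 10 classes.
X = np.random.rand(1024, 784).astype(np.float32)
y = np.random.randint(0, 10, size=1024)
train_set = ArrayIterator(X, y, nclass=10)

# A small two-layer network: hidden ReLU layer, softmax output.
init = Gaussian(loc=0.0, scale=0.01)
layers = [Affine(nout=100, init=init, activation=Rectlin()),
          Affine(nout=10, init=init, activation=Softmax())]
mlp = Model(layers=layers)

# Cross-entropy cost and stochastic gradient descent with momentum.
cost = GeneralizedCost(costfunc=CrossEntropyMulti())
opt = GradientDescentMomentum(learning_rate=0.1, momentum_coef=0.9)

# Train for a few epochs (Callbacks arguments differ slightly across neon versions).
callbacks = Callbacks(mlp, train_set)
mlp.fit(train_set, optimizer=opt, num_epochs=5, cost=cost, callbacks=callbacks)
```

The same model definition runs against a CPU or GPU backend; Nervana Cloud's pitch is that this training step is offloaded to its hosted, optimized infrastructure rather than to hardware the customer owns.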

Nervana claims to be more than 10 times faster than any other cloud AI platform. The service allows data scientists to quickly build, train and deploy deep-learning models on their own data. Potential applications include reducing credit card fraud, increasing the accuracy of medical diagnoses, building intelligent cars and making energy exploration more efficient.

"Deep-learning networks can take 10 to 20 weeks to train," Naveen Rao, cofounder and CEO at Nervana, said in an interview. "We want to bring it down to hours."

Nervana Cloud is available as a public cloud service or in a hybrid model. Current users are putting it to work to improve crop yields and to find more efficient ways to explore for oil.

Deep learning is a popular emerging area. Last month, Microsoft released a toolkit on GitHub that it uses internally for deep learning, and Google has partnered with Movidius to bring machine intelligence to mobile devices.

California-based Nervana was founded in 2014 to make scalable AI more widely accessible. Among its 40 or so employees are 13 PhDs from universities including Harvard, Stanford, MIT and Caltech.

"The idea was to think about what's the biggest computational problem of our time," Rao said. "Today it's really the ability to find useful inferences in data."

Also in the works at Nervana is a new processor designed specifically for deep neural networks, Rao said. He and the company's other cofounder previously worked at the chip maker Qualcomm.

"Traditionally, processors have been somewhat general-purpose, but we wanted to make one that does deep neural networks very well," he explained.

The chip will offer a further 10x performance boost over current GPUs, he said. Nervana expects to have prototype chips ready in December, with full availability to users of its cloud service in the first quarter of next year.

"If you're talking about going from 10 milliseconds to one millisecond, no one cares," Rao said. "But if you're going from 10 weeks to one week or one day, you have a very big gain in what you can do."

Katherine Noyes
