AtScale looks to make BI on Hadoop easy

October 19, 2015
During his tenure at Yahoo! as vice president of Engineering in the Advertiser Analytics unit, and later as vice president of Development in the User Data & Analytics unit, David Mariani led the team responsible for running the data pipelines that connected business users with the company's ever-growing store of data.

"We had this whole notion of never throwing data away," he says. "That was a golden rule at Yahoo!. We just about broke the bank with NetApp and EMC. That's why we invented Hadoop."

"The benefit of OLAP was our users [at Yahoo!] could self-serve," Mariani adds. "We generated $50 million in lift from the display advertising business every year because of Analysis Services. We wanted an OLAP interface without the OLAP baggage. Give me Hadoop, with scale-out architecture but an OLAP interface."

[ Related: Hadoop in trouble? Only in Gartner-land ]

With his new company, AtScale, which came out of stealth earlier this year, Mariani aims to ease some of the pains he experienced building what may have been the world's largest OLAP cube for business intelligence (BI) at Yahoo!. AtScale lets business users work with data in Hadoop from any BI tool by creating a virtual cube that essentially turns Hadoop into a high-performance OLAP server.
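AtScale has not published the internals of its virtual cube, but the general pattern is well understood: a metadata layer maps OLAP dimensions and measures onto tables that stay in Hadoop, and each BI request is translated into SQL for an engine such as Hive or Impala rather than served from a pre-built physical cube. The following minimal sketch in Python illustrates that idea; the table names, column names and function are invented for illustration, not AtScale's actual design.

```python
# Hypothetical illustration only: AtScale's cube format is not public.
# A "virtual cube" is metadata that maps OLAP dimensions and measures
# onto a fact table that stays in Hadoop; BI requests become SQL.

CUBE = {
    "fact_table": "web_logs",  # raw data never leaves the cluster
    "dimensions": {"region": "geo_region", "day": "event_date"},
    "measures": {"clicks": "SUM(click_count)"},
}

def to_sql(cube, dims, measures):
    """Translate a BI-tool request (dims + measures) into GROUP BY SQL."""
    dim_cols = [cube["dimensions"][d] for d in dims]
    meas_cols = [f'{cube["measures"][m]} AS {m}' for m in measures]
    return (
        f'SELECT {", ".join(dim_cols + meas_cols)} '
        f'FROM {cube["fact_table"]} '
        f'GROUP BY {", ".join(dim_cols)}'
    )

print(to_sql(CUBE, ["region"], ["clicks"]))
# SELECT geo_region, SUM(click_count) AS clicks FROM web_logs GROUP BY geo_region
```

The point of the pattern is that no data moves: the "cube" is just a query-rewriting contract between the BI tool and the Hadoop cluster.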

Today at the Tableau Conference in Las Vegas, AtScale took the wraps off the newest version of its platform, the AtScale Intelligence Platform 3.0. The new version adds enterprise security and scalability features, along with integration with Tableau Server and Tableau Online. Perhaps most importantly, it introduces AtScale's new patent-pending technology: Adaptive Cache.

"Adaptive Cache is possibly one of the most meaningful breakthroughs in this space," Richard Langlois, director of Enterprise Data Management at Yellow Pages (Canada), said in a statement Monday morning. "We put the AtScale Adaptive Cache technology through a test on 38 billion rows of data. The results were beyond expectations."

Hadoop is hugely flexible and scalable, but it wasn't built to support the kind of interactive query performance that business users expect from their BI tools. To work around this, organizations have relied on indexing, transforming or moving the data, approaches that are typically complex and time-intensive.

[ Related: Self-service BI review: Tableau vs. Qlik Sense vs. Power BI ]

Adaptive Cache is an engine built on progressive learning: it watches query patterns as users access the AtScale virtual cube and caches the data needed to make those queries run faster the next time. Data change detection recognizes when new data arrives and generates the appropriate cache components, adding a new slice to the existing cache.
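AtScale has not detailed how Adaptive Cache is implemented, but the behavior described above, counting query patterns, materializing aggregates for the hot ones, and folding newly arrived data into existing cache entries, can be sketched in a few lines of Python. Everything below (the class, its method names, the hot_threshold parameter) is hypothetical, a sketch of the general technique rather than AtScale's code.

```python
from collections import Counter

class AdaptiveCache:
    """Toy sketch of a query-pattern-driven aggregate cache.

    Hypothetical, not AtScale's implementation: track which
    (dimensions, measure) groupings users request, and once a grouping
    is "hot", serve a materialized aggregate instead of scanning rows.
    """

    def __init__(self, fact_rows, hot_threshold=3):
        self.fact_rows = fact_rows          # raw fact table: list of dicts
        self.hot_threshold = hot_threshold  # queries before materializing
        self.pattern_counts = Counter()     # progressive-learning state
        self.aggregates = {}                # pattern -> materialized slice

    def query(self, dims, measure):
        pattern = (tuple(sorted(dims)), measure)
        self.pattern_counts[pattern] += 1
        if pattern in self.aggregates:
            return self.aggregates[pattern]      # cache hit: no raw scan
        result = self._aggregate(dims, measure)  # cold path: full scan
        if self.pattern_counts[pattern] >= self.hot_threshold:
            self.aggregates[pattern] = result    # materialize hot pattern
        return result

    def _aggregate(self, dims, measure):
        out = {}
        for row in self.fact_rows:
            key = tuple(row[d] for d in sorted(dims))
            out[key] = out.get(key, 0) + row[measure]
        return out

    def on_new_data(self, new_rows):
        """Change detection: fold a newly arrived slice into every
        materialized aggregate instead of rebuilding from scratch."""
        self.fact_rows.extend(new_rows)
        for (dims, measure), agg in self.aggregates.items():
            for row in new_rows:
                key = tuple(row[d] for d in dims)
                agg[key] = agg.get(key, 0) + row[measure]
```

The design choice worth noting is in on_new_data: rather than invalidating and rebuilding, new rows are folded into each materialized aggregate as an incremental slice, which matches the article's description of change detection adding a new slice to the existing cache.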

"As the cube is used and accessed, we get smarter and smarter about delivering interactive components," says Josh Klahr, vice president of Product at AtScale.

The end result, Klahr says, is the sort of interactive experience BI users expect, without extensive preprocessing or data movement off the cluster.

The Adaptive Cache technology uses a training mode that can warm up the cache before going into production. Klahr says extensive use of the training mode can get the cube up to speed within the first few days of implementation.
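Continuing the hypothetical AdaptiveCache sketch above, a training mode can be modeled as replaying a representative query log against the cache before production traffic arrives, so recurring patterns cross the materialization threshold on day one. The warm_cache helper and the sample log here are invented for illustration.

```python
# Assumes the AdaptiveCache sketch defined above.

def warm_cache(cache, query_log):
    """Hypothetical warm-up: replay representative queries so recurring
    patterns cross the materialization threshold before launch."""
    for dims, measure in query_log:
        cache.query(dims, measure)

rows = [{"region": "east", "clicks": 10}, {"region": "west", "clicks": 7}]
cache = AdaptiveCache(rows, hot_threshold=2)
warm_cache(cache, [(["region"], "clicks")] * 2)   # replayed twice: now hot
assert (("region",), "clicks") in cache.aggregates  # served from cache
```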

"The industry is going through a massive transformation and enterprises need a solution that allows business users to access any data from anywhere," Mariani says. "Enabling this level of accessibility requires sophistication and a solid and secure platform. With this release, AtScale becomes the BI on Hadoop vendor with the most enterprise-ready capabilities."


(www.cio.com)

Thor Olavsrud
