Blazent brings big data to IT operations management

24.06.2015
IT management software and services provider Blazent is putting much-ballyhooed big data analysis to work on behalf of its enterprise customers.

Through the use of Spark, Cassandra and other big data technologies, the company has developed a service that can organize all the data an organization maintains about its IT systems, so it can be easily analyzed to improve performance, meet auditing requirements, and pinpoint problem areas, said Gary Oliver, Blazent CEO.

The Blazent Data Intelligence Platform can work with more than 230 common sources of IT operational and development data, including configuration management databases (CMDB) as well as performance management, antivirus, server management, and development lifecycle management packages.

Blazent has a lot of expertise in aggregating operational data on behalf of large customers. Founded in 2002, the privately held Blazent originally offered software for better managing IT resources.

Over the years, it developed technologies to synthesize data from multiple sources, even unusual ones such as human resource management systems. Large, IT-heavy organizations such as IBM, Visa, Motorola, and CSC have all used the company's services.

Analysis of operational data can have many benefits for organizations with large swaths of IT equipment and software. One service provider found that an entire floor of its data center was being used by a third party that was not paying rent for the space. A financial services company discovered that 300 of its servers were not being backed up.

With the launch of the Blazent Data Intelligence Platform, the company has moved to a big data distributed architecture for handling its customer information.

Data is fed into a hosted service, which runs either on Amazon Web Services or Verizon's Terremark service, and stored in a Cassandra NoSQL database. It can then be queried using the Spark data processing platform.
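To make that architecture concrete, here is a minimal sketch of how a Cassandra-backed store of IT operations records might be queried from Spark via the open source spark-cassandra-connector. The keyspace, table, and column names are hypothetical illustrations, not Blazent's actual schema.

```python
# Hypothetical example of querying IT operations data held in Cassandra
# from Spark. Requires the spark-cassandra-connector package to be supplied
# at submit time (e.g. via spark-submit --packages).
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("it-ops-query")
    # Cassandra host is deployment-specific; placeholder value here.
    .config("spark.cassandra.connection.host", "cassandra.internal.example.com")
    .getOrCreate()
)

# Load a hypothetical table of server inventory records.
servers = (
    spark.read
    .format("org.apache.spark.sql.cassandra")
    .options(keyspace="it_ops", table="server_inventory")
    .load()
)

# Example analysis: list servers with no recorded backup agent,
# the kind of gap described in the customer anecdotes above.
servers.filter(servers.backup_agent.isNull()) \
       .select("hostname", "data_center", "owner") \
       .show()
```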

For customers, the new big data architecture is beneficial in two ways, said Michael Ludwig, Blazent's chief product architect.

For one, it allows them to get near real-time views of what is going on behind their firewalls. The former approach relied on an extract, transform and load (ETL) operation, which was typically executed in periodic batch mode.

Secondly, the cloud-based big data approach allows the client to build up a rich history of operational data, which can be useful for activities such as predictive analysis. Later this year, Blazent will release a number of new machine learning modules to take advantage of the historical data.

The new data engine provides a pipeline to prepare data for analysis, borrowing a variety of techniques from the disciplines of master data management, data governance, and service management to refine the information. The software can link data about a single resource that arrives from multiple systems.
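As an illustration of that kind of record linkage, the sketch below merges server records from a CMDB, a monitoring tool, and an antivirus console by a normalized hostname. It is a simplified, hypothetical example; the field names and matching rule are assumptions, not Blazent's implementation.

```python
# Minimal, illustrative record-linkage sketch: consolidate records that
# describe the same server but arrive from different source systems.
from collections import defaultdict

def normalize_host(name: str) -> str:
    """Lower-case and strip the domain suffix so 'WEB01.corp.example.com'
    and 'web01' resolve to the same key."""
    return name.strip().lower().split(".")[0]

def link_records(*sources):
    """Merge per-source records into one consolidated record per host.
    Later sources only fill in fields the earlier ones did not supply."""
    merged = defaultdict(dict)
    for source in sources:
        for record in source:
            key = normalize_host(record["hostname"])
            for field, value in record.items():
                merged[key].setdefault(field, value)
    return dict(merged)

# Hypothetical records from three separate systems.
cmdb = [{"hostname": "web01.corp.example.com", "owner": "ecommerce"}]
monitoring = [{"hostname": "WEB01", "cpu_avg": 0.42}]
antivirus = [{"hostname": "web01", "av_agent": "installed"}]

consolidated = link_records(cmdb, monitoring, antivirus)
print(consolidated["web01"])
# {'hostname': 'web01.corp.example.com', 'owner': 'ecommerce',
#  'cpu_avg': 0.42, 'av_agent': 'installed'}
```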

In addition to launching the big data engine, Blazent has also released two new analytic modules. The new Data Explorer provides an interface for inspecting hundreds of data sources. Another analytic module, called GLOVE (governance, lifecycle operational validation, expenditure), was designed specifically for large online service providers to audit customer use of their systems.

A typical implementation of Blazent Data Intelligence Platform for a large enterprise may start at about $100,000, according to the company.

Joab Jackson covers enterprise software and general technology breaking news for The IDG News Service. Follow Joab on Twitter at @Joab_Jackson. Joab's e-mail address is Joab_Jackson@idg.com

