Can Big Data help you manage your IT infrastructure?

Posted by Alan Earls on Jan 12, 2015 4:20:52 PM

Big data has garnered a lot of attention in recent years for its supposed ability to magically turn mountains of data into actionable business information. Hadoop, NoSQL stores such as Cassandra, and a host of other emerging tools and techniques have made sifting through data an ever more manageable task.

Now, some are wondering if all the streams of operational data produced by IT operations could be similarly parsed and mined.

It is both the expansion in scale and the heterogeneity of IT-generated data that is encouraging the big data approach. With thousands of virtual machines to monitor, it can be difficult to predict just what a new application added to the environment will mean in terms of security, performance, or compliance.

Taking a New Big Data Approach

Several companies have been participating in this trend. In particular, Splunk sees an opportunity in the mountains of ephemera that, in toto, provide a complete record of user behavior as well as indicators of service levels and security threats. To be sure, most organizations already have a good amount of monitoring and analytics going on, but the focus is limited. By comparison, the Splunk approach is essentially open, letting analytics follow where the data leads.
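To make that open-ended approach concrete, here is a minimal sketch in Python (not Splunk's actual query language; the log lines, hosts, and field names are invented for illustration) of how heterogeneous operational logs can be pooled and then queried in whatever direction the data suggests:

```python
import re
from collections import Counter

# Hypothetical sample of heterogeneous operational log lines --
# web server, auth, and hypervisor events mixed together.
LOG_LINES = [
    '2015-01-12 09:14:02 web01 GET /login 200 user=alice',
    '2015-01-12 09:14:05 auth01 FAILED login attempt user=bob src=10.0.0.7',
    '2015-01-12 09:14:09 esx03 vm-migrate vm=db-replica-2 latency_ms=840',
    '2015-01-12 09:14:11 auth01 FAILED login attempt user=bob src=10.0.0.7',
]

# Rather than a fixed dashboard, keep everything and let the queries
# follow the data: here, a count of events per host and a search for
# repeated failed logins that might indicate a security threat.
events_per_host = Counter(line.split()[2] for line in LOG_LINES)
failed_logins = Counter(
    re.search(r'user=(\w+)', line).group(1)
    for line in LOG_LINES if 'FAILED' in line
)

print(events_per_host)               # Counter({'auth01': 2, ...})
print(failed_logins.most_common(1))  # [('bob', 2)] -- worth a closer look
```

The point of the sketch is the workflow, not the code: neither question (events per host, repeated failed logins) had to be decided before the data was collected.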

How might big data analytics be useful? At present, most tools provide only a rough means of assessing how many virtual machines ought to run on a single physical machine. A big data approach to IT analytics can go deeper: by correlating all available data with performance attributes, the impact of different kinds of applications and loads can be better understood, reducing both over-provisioning and under-provisioning.
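As a rough illustration of that idea, the sketch below (again Python, with invented per-host samples standing in for real monitoring data) correlates VM density with observed latency and derives a simple placement cap from an assumed 100 ms SLA target:

```python
from statistics import mean

# Hypothetical per-host samples: (number of VMs placed, observed
# 95th-percentile application latency in ms). Real figures would
# come from the monitoring pipeline described above.
samples = [(8, 40), (12, 55), (16, 90), (20, 170), (24, 320)]

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, standard library only."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

vms, latency = zip(*samples)
print(f"correlation of VM density and latency: {pearson(vms, latency):.2f}")

# A placement rule derived from the data: cap VM density at the last
# point where latency stayed under the assumed 100 ms SLA target.
cap = max(v for v, lat in samples if lat <= 100)
print(f"suggested VMs per host: {cap}")
```

A real system would correlate many more attributes across thousands of hosts, but the principle is the same: let the measured relationship between load and performance, rather than a rule of thumb, set the consolidation ratio.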

Analysis and Insight

CloudPhysics, another company playing in this space, claims to have been inspired by the hyperscale IT operations of organizations such as Google and Facebook. Those organizations have made it a point to “instrument” their infrastructure and study the resulting data so they can achieve optimal performance.

At this point, the whole idea is no longer bleeding edge but is becoming mainstream. For instance, Netuitive, a third company to harness big data analytics, has licensed some of its technology to Microgaming, the world’s largest provider of online gaming platforms (casino, poker, sports betting, and bingo).

The technology will give Microgaming insight into the performance of the gaming platforms it develops, hosts, and maintains on behalf of its licensees. That means monitoring key applications for more than 120 casino operators and over 600 software titles.

Larger Implications

With growing interest in the software-defined data center (SDDC) – which aims to extend the virtualization paradigm to everything – big data analytics may become the crucial enabler. After all, if server virtualization alone has sometimes proven challenging to manage, imagine how much harder an entirely software-defined environment could be.

A recent article in Virtualization Practice made exactly this point, predicting, “Once all of the configuration of the data center is moved into software, and some of the execution of the work is moved into software, SDDC Data Center Analytics will play a critical role in keeping your SDDC up and running with acceptable performance.”

So, don’t just think big, think “big data” when mapping out your operational future.

Photo Credit: Tom Raftery and 1E on Flickr. Used under CC 2.0 license


