Splunk Guarantees Uptime for Data Analytics Service

When it comes to data sources, the most critical thing for most developers is being sure that the APIs they call will always be available. Rising to that challenge, Splunk has announced a 100% uptime service-level agreement for Splunk Cloud.

Accessible via RESTful APIs, Splunk Cloud gives developers access to a cloud platform for analyzing machine data. Rather than requiring organizations to store massive amounts of data locally, Splunk Cloud leverages the resources of Amazon Web Services to store that data inexpensively.
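To give a sense of what that RESTful access looks like, the sketch below builds a request for submitting a search job to Splunk's REST API (`POST /services/search/jobs`). The host name and time window are hypothetical placeholders; a real deployment would also require authentication, which is omitted here.

```python
from urllib.parse import urlencode

# Hypothetical host for illustration; Splunk deployments conventionally
# expose the management/REST API on port 8089.
SPLUNK_HOST = "https://example.splunkcloud.com:8089"

def build_search_request(spl_query: str, earliest: str = "-24h"):
    """Build the endpoint URL and form body for submitting a search
    job via Splunk's REST API (POST /services/search/jobs)."""
    if not spl_query.lstrip().startswith("search"):
        # The search jobs endpoint expects an explicit generating command.
        spl_query = "search " + spl_query
    endpoint = f"{SPLUNK_HOST}/services/search/jobs"
    body = urlencode({
        "search": spl_query,
        "earliest_time": earliest,
        "output_mode": "json",
    })
    return endpoint, body

endpoint, body = build_search_request("index=main error | stats count by host")
```

The returned `endpoint` and `body` would then be sent with any HTTP client; the actual network call is left out so the sketch stays self-contained.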

Praveen Rangnath, director of product marketing for Splunk Cloud, says the Splunk Cloud machine data analytics service is a software-as-a-service plan that allows developers to access up to 5 TB of data per day, along with support for 10x bursting to handle unexpected surges in data.

In addition, Splunk has announced a 33% price reduction made possible by the increased operational efficiency of its cloud environment. Rangnath attributes much of that efficiency to AWS price cuts that Splunk is passing on to its customers.

With the rise of the Internet of Things (IoT), the value of machine data has risen considerably. Splunk made a name for itself by giving IT organizations access to machine data generated by IT systems. But with millions of devices soon to be connected to the Internet, massive amounts of machine data will need to be analyzed. That creates an opportunity for developers to build any number of applications capable of correlating that data with any number of other data sources in or out of the enterprise.

Splunk itself combines a search engine with an indexing capability that makes it possible to analyze all that data. Splunk Cloud features that developers can invoke include monitoring and alerting, role-based access controls, data model/pivot, knowledge mapping, anomaly detection, pattern matching, and report acceleration.
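As a rough illustration of the monitoring-and-alerting pattern those features support, the sketch below composes an alert-style SPL search string by appending a threshold condition to a report query. The index, sourcetype, and field names are hypothetical placeholders, not part of any real deployment.

```python
def make_alert_query(base_query: str, field: str, threshold: int) -> str:
    """Turn a stats-style SPL report query into an alert condition by
    filtering on a threshold, as a monitoring/alerting search would."""
    return f"{base_query} | where {field} > {threshold}"

# A report-style query: server error counts per host over the search window.
base = "search index=web sourcetype=access_combined status>=500 | stats count by host"

# Alert only when a host exceeds 100 errors.
alert = make_alert_query(base, "count", 100)
```

A saved search built from a string like `alert` could then be scheduled to fire a notification whenever it returns results.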

To foster the development of those applications, Splunk also makes available a Splunk Online Sandbox through which developers can store up to 5 GB of data, up to a total of 28 GB, for free. Pricing for Splunk Cloud beyond that starts at $675 per month for up to 5 GB per day of indexed data.

Given the rise of IoT, the race is on to provide access to data in a way that doesn’t require organizations to invest millions of dollars to store that data locally. International Data Corp., for example, estimates that by 2020 there will be 44 zettabytes of data in the digital universe, but only 1.5% of that data will be considered “high value.”

Whether it’s through AWS or some other cloud service provider, there is going to be more data available in the cloud than developers know what to do with. The challenge will be figuring out what portion of that data is valuable enough to do anything with.
