TURNING BIG DATA INTO BIG VALUE
Many enterprises have already invested heavily in Big Data. But how are those investments panning out? Not so well, according to Gartner, which estimates a failure rate of close to 60 percent.
As I pointed out in Solving the Big Data Abandonment Problem, organizations face two significant stumbling blocks to Big Data ROI. The first is a lack of alignment between business and IT stakeholders on objectives, roles, and resources. Ideally, the business side provides the context; they know the right questions to ask. IT, in turn, delivers the technical infrastructure: the data and analytical tools.
Agile Data Discovery
But even if business and IT fully align resources and expectations, finding and accessing all the relevant data for analysis can prove even more problematic. After all, large enterprises have thousands of data sources that might contribute to a successful analytic outcome.
In fact, most organizations aren’t set up to tap into the data sources they have, much less add new ones. Forrester estimates that “companies spend around 64% (as in 80% times 80%) of any BI initiative on identifying and profiling data sources.” And our customers tell us that before using the Attivio semantic data catalog, it could take a year to add 10 new data sources to their BI tool. Likewise, data lakes often become another silo of information that can’t be properly identified or accessed.
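To make the "identifying and profiling" step concrete, here is a minimal sketch of what profiling a single tabular source involves: discovering column names, inferring rough types, and counting missing values. The function and sample data are illustrative only, not part of any specific product; real catalogs automate this across thousands of sources.

```python
import csv
import io

def profile_source(csv_text):
    """Profile a tabular source: column names, inferred types, null counts.

    A minimal, illustrative sketch of the source-profiling step that
    consumes so much of a BI initiative when done by hand.
    """
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    profile = {}
    for col in rows[0].keys():
        values = [r[col] for r in rows]
        non_null = [v for v in values if v not in ("", None)]
        # Crude type inference: integer if every non-null value is numeric.
        inferred = "int" if non_null and all(
            v.lstrip("-").isdigit() for v in non_null
        ) else "str"
        profile[col] = {
            "type": inferred,
            "nulls": len(values) - len(non_null),
            "rows": len(values),
        }
    return profile

# Hypothetical sample source with two missing values.
sample = "id,region,revenue\n1,EMEA,1200\n2,,950\n3,APAC,\n"
print(profile_source(sample))
```

Multiply this hand-rolled effort by thousands of sources and the appeal of an automated catalog becomes obvious.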
Big Data ROI
A key component of operationalizing a Big Data project is identifying and deploying critical technology to streamline the data supply chain—getting data from the source to the BI tool. That’s the bottleneck that prevents agile analytics and derails many a Big Data project. You can find more keys to operationalizing Big Data here.
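The data supply chain described above can be sketched as a tiny pipeline: cataloged sources are extracted, lightly normalized, and merged into one BI-ready table. Every name here (the catalog structure, the extract and normalize steps) is an illustrative assumption, not the API of any real tool; the point is only to show where the bottleneck sits when each new source must be wired in by hand.

```python
def extract(source):
    """Pull raw records from a registered source (stubbed with inline data)."""
    return source["records"]

def normalize(records):
    """Apply light cleanup so downstream BI tools see consistent field names."""
    return [{key.lower(): value for key, value in record.items()}
            for record in records]

def supply(catalog):
    """Run every cataloged source through the chain into one BI-ready list."""
    table = []
    for source in catalog:
        table.extend(normalize(extract(source)))
    return table

# Hypothetical catalog of two sources feeding one BI tool.
catalog = [
    {"name": "crm", "records": [{"Account": "Acme", "ARR": 50}]},
    {"name": "erp", "records": [{"Account": "Beta", "ARR": 75}]},
]
print(supply(catalog))
```

Streamlining the supply chain means making the "register a new source" step cheap, so adding source number 1,001 takes minutes rather than months.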
Virtually every day a new use case emerges that confirms the value of Big Data. If the path to success seems too complex to navigate, your enterprise may be tempted to abandon Big Data. Don't. You'll be glad you stayed the course.