Some Attivio folks flew to the annual Gartner BI event last week to take the pulse of Business Intelligence, data discovery, and data democratization. We wanted to hear the latest from Gartner thought leaders and the several thousand data practitioners in attendance. The opening keynote offered five key takeaways, and I'd like to zero in on numbers 2 and 3.
Number 2: Accept that the world will get more distributed. No surprise here. The shift began when new ways of consuming data emerged and spread like wildfire.
Two or three years ago, when Big Data had started to gain serious traction in large enterprises, there was a rush to hire data scientists. Of course, disagreement reigned about what credentials made a true data scientist. Wonky math geeks were a good place to start. The rush to hire data scientists echoed the trend some decades earlier when investment firms hired quants right out of college and put them in the basement to create trading algorithms.
One thing was clear though. Data scientists were scarce and expensive.
Most chief executives have dreamed of leadership that permeates their organizations at every level. Yet to have that, every decision maker from the C-suite down would need data. And they fear that such data democratization may invite anarchy.
Despite that fear, their dream persists, and for good reason. Organizations with democratized data tend to have more buy-in, people at all levels take more responsibility, and their ideas are more relevant, with a greater chance of real impact. They feel stronger loyalty. Overall, there’s just plain smarter decision making.
The evidence comes from a variety of contexts. Military battles have been won because low-level officers were free to think for themselves. Organizations have thrived on products that emerged when a mid-level manager pursued an idea. Workforces have toughed out hard times when data let them believe in fairness and a bright future.
Another fiscal year has come to a close, and we at Attivio have never been more excited about what’s to come. Our entry into the Big Data market is filling a void that enterprises struggle with every day.
We understand that data is a strategic asset that, when made accessible to everyone in the organization, can lead to smarter, faster decisions, pushing past the competition in ways that didn’t seem possible.
We’ve built a platform that accelerates data discovery and the resulting analysis, and in the process, have helped numerous enterprises do more with their data than ever before.
Today we announced an expanded relationship with PerkinElmer, a global leader focused on improving the health and safety of people and the environment. The agreement makes PerkinElmer the exclusive OEM of Attivio licenses in the life sciences vertical. With this new partnership, PerkinElmer's deep expertise across life science fields will be coupled with Attivio's data discovery solutions, resulting in accelerated big data analysis projects.
Big Data Analytics for Life Sciences
Like companies in other industries, life sciences organizations grapple with finding the best ways to derive value from the massive volumes of data available to them.
The short answer: “To get them out of their hair.” But seriously…
At Attivio, we work with decision-makers at various companies who own and administer the BI infrastructure. We call them “BI tech owners.” A BI tech owner’s team governs data and delivers it to business users for analysis.
BI tech owners are not in an enviable position at the moment. The proliferation of self-service analytical tools for Big Data and BI has generated an order-of-magnitude increase in requests for data. And those requests all come with an ASAP attached.
Before joining Attivio, I worked for several years at Tibco Spotfire. It was a great experience. I was on the front lines as the worlds of Big Data and Business Intelligence (BI) collided.
Traditionally, companies relied on canned BI reports to help them understand historical data. Such reporting solutions have been around for decades. It was, therefore, very exciting to see that massive market disrupted by data discovery and analytic solutions such as Tableau, QlikTech, and Tibco Spotfire. These new, easy-to-use, data visualization tools helped analysts, researchers, and data scientists quickly self-serve insights from massive data volumes.
Instead of relying on a combination of static reports and massive spreadsheets to manually comb through huge data sets—an unreliable process that could take months—the new data discovery and analytic solutions revealed trends, patterns, and outliers in mere moments, with just a few clicks.
To some extent, virtually all business decision makers rely on business intelligence (BI) analytics and reporting. And, according to an online survey conducted by Forrester Research in 2014, more than 40 percent of organizations using BI achieve double-digit ROI on their investments within two years. Not only that, but top-performing companies tend to spend a greater percentage of their IT budgets on BI. So it's all good in the world of BI, right? Not exactly.
There was a lot of talk at the Tableau Conference last month about the challenge of getting the right data into Tableau. There’s a process bottleneck between IT and the business that prevents the easy flow of data in the organization. Data democratization is a powerful concept that ultimately enables companies to compete on analytics.
From EDW to Big Data
It's hardly a new problem. The business intelligence industry has dreamed of the enterprise data warehouse, and now it dreams of big data and the gold it may become. Ultimately, it's all in pursuit of the same goal: to use data as the strategic asset it should be. But something has always stood in the way.
Recently I was thinking about the data bottleneck between BI analysts and IT that can add months to analytics initiatives before they produce any meaningful insight. It's not just that most enterprises are saddled with legacy infrastructure for handling data. Tools like Tableau, Qlik, and Spotfire have created an order-of-magnitude increase in the number of data consumers in business. There just aren't enough people on the IT side with the technical skills to handle the demand for data the old-fashioned way, using code or ETL and MDM tools.