A central challenge in transforming a ponderous Big Data headache into an agile Big Analysis win is finding a way to scale (often spelled S-U-P-P-O-R-T) Tableau users' access to the data they seek to visualize. Like most coins, this one has two faces: one seen from the perspective of the data analyst visualizing in Tableau, and one from that of the data manager who structures and provisions the data tables worth visualizing. My last post took the former perspective; today, I'll look at the latter.
Data security, governance, and management are essential, non-negotiable elements of both traditional data warehousing and contemporary Big Data initiatives. And, as table stakes, they can effectively limit the degrees of freedom available for deploying new solutions or tools.
3 days, 10,000 attendees, multiple TED speakers… but this wasn’t a motivational sales conference. It was a conference for data geeks – the Tableau Conference. In this case, what happened in Vegas will NOT stay in Vegas. It needs to be shared.
What a testament to the evolution of the data-driven mindset. We've known for a while that democratization of data could drive business results, and that the future belongs to data-driven organizations. But the sheer number of people at the Tableau Conference confirms that the theory of the data-driven enterprise has been embraced by the masses.
Work in or with technology much and you're likely to know what the forces of democratization can accomplish. Broadening the effective population that can perform an essential task, without resorting to intermediaries or hand-holding, has been at the heart of most business 'disruptions' of the Internet era. A similar dynamic is emerging in the convergence of Big Data and democratized access to data analytics tools like Tableau.
Pundits who focus on BI and Big Data consistently estimate that roughly two-thirds of the effort in any BI initiative goes to profiling and identifying the data that will be used for analysis. As a recovering data analyst ("Hi, my name is Lee. I'm a statistician"), I appreciate their focus on the most time-consuming element of living the quantitative dream.
But the statistic reminded me of another situation where advances in technology are changing the way things traditionally happen. In educational circles, the latest technology-induced change is termed "flipping the classroom". Flipping reverses the focus of classroom and out-of-classroom activity: students use technology to master core material before each class, and (precious) classroom time is spent applying and discussing what they've learned.
If you're currently waiting for data to analyze, or working to find data for a colleague, you're familiar with one of the productivity challenges of getting BI from Big Data. Finding the right information and provisioning it for analysis and decision-making constitutes a real bottleneck in many organizations.
Friction = Pain
Investments in data lakes and data warehouses, absent tools that let analysts and data managers identify and connect to specific, contextually appropriate data elements, create data friction. And that friction causes real pain. In a nutshell, it slows analysis, inhibits complete insight, and produces bottlenecks between BI and IT teams.
A lot of enterprises have already invested heavily in Big Data. But how are those investments panning out? Not so well, according to Gartner, which estimates a failure rate of close to 60 percent.
As I pointed out in Solving the Big Data Abandonment Problem, organizations face two significant stumbling blocks to Big Data ROI. The first is a lack of alignment between business and IT stakeholders on objectives, roles, and resources. Ideally, the business provides the context; it knows the right questions to ask. And IT delivers the technical infrastructure: the data and the analytical tools.
I followed Inc.’s lead and interviewed Stephen, but instead of discussing management, I asked about what’s happening in Attivio’s market space and the role we are playing.