How Data Virtualization Drives Secure, Agile, Enterprise BI
If you believe that the convergence of Big Data and Big Analysis is a ‘glass half full’ opportunity, chances are you’re already familiar with data virtualization and engaged in implementing it to accelerate the move to data democracy in your organization. New tools extend the franchise for secure, well-governed data provisioning, offering the ease of self-service, dramatic gains in IT productivity, or both. By eliminating many of the costs and risks associated with traditional data integration methods, virtual data marts bring that promise within reach.
There was a lot of talk at the Tableau Conference last month about the challenge of getting the right data into Tableau. There’s a process bottleneck between IT and the business that prevents the easy flow of data in the organization. Data democratization is a powerful concept that ultimately enables companies to compete on analytics.
Recently I was thinking about the data bottleneck between BI analysts and IT that can add months to analytics initiatives before they produce any meaningful insight. It’s not just that most enterprises are saddled with legacy data infrastructure. Tools like Tableau, Qlik, and Spotfire have driven an order-of-magnitude increase in the number of data consumers in the business. There simply aren’t enough people on the IT side with the technical skills to meet that demand the old-fashioned way, with hand-written code or ETL and MDM tools.
A central challenge in transforming a ponderous Big Data headache into an agile Big Analysis win is finding a way to scale (often spelled S-U-P-P-O-R-T) Tableau users with the data they seek to visualize. Like most coins, this one has two sides: one seen from the perspective of the data analyst visualizing in Tableau, the other from that of the data manager who structures and provisions data tables worth visualizing.