We talk a lot about the amount of data organizations capture and how that data comes in many sizes and formats, from structured to semi-structured and even unstructured. We also know that much of that data isn’t used for decision-making because it’s hidden in silos across the organization. All of this makes it difficult to build a unified view of your data.
But the challenge with building a unified view is only partly due to data silos.
As Dan Woods points out in a recent article for Forbes, technology marketplaces cycle through predictable stages as they mature. He applies this insight to the component versus platform decision that organizations face when adopting new technologies.
In the month since FINRA announced a total of $17 million in fines against financial adviser Raymond James and its financial services affiliate, two threads have circulated through social media. The first noted that broker-dealers were now squarely in the enforcement sights of regulators. The second focused on the piercing of the corporate veil and the penalties administered to Raymond James’s former AML compliance officer.
If you're a CDO, how would you describe your most important role: as gatekeeper or innovator? Or are you walking a tightrope between the two? Those questions figured prominently at the 10th annual MIT Chief Data Officer & Information Quality Symposium held in July.
Data democratization is about giving a larger group of people in the company access to self-service tools to find and work with the data they need for analysis. This self-service capability is essential to enabling a data-driven organization. But there is a fine line between enabling self-service and ensuring data is accessed by the right people and used in the right situations.
As any developer knows, perfect software doesn’t just happen; it, pardon the pun, “develops” over time. Developers engage in a seemingly everlasting iterative process involving bug fixes and changes that can last for the lifetime of an application. But writing the software is only half the battle; it must then be deployed.
For big data companies like ours that run software across distributed networks, this is no small task: a developer makes changes, runs tests, identifies errors or processing improvements to address, and then makes more changes.
I recently attended Hadoop Summit 2016, where, not surprisingly, there was a lot of conversation about topics other than Hadoop itself, such as the importance of ecosystem partners to any Big Data solution.
Attivio is excited to be a part of EMC’s new Big Data Solution. It’s not generally available yet, so we thought we’d have a chat with Ted Bardasz from EMC to give you a look at what this new platform offering is and how Attivio fits in.