As Dan Woods points out in a recent article for Forbes, technology marketplaces cycle through predictable stages as they mature. He applies this insight to the component versus platform decision that organizations face when adopting new technologies.
In the month since FINRA announced a total of $17 million in fines against financial adviser Raymond James and its financial services affiliate, two threads have circulated through social media. The first noted that broker-dealers were now squarely in the enforcement sights of regulators. The second focused on the piercing of the corporate veil and the penalties administered to Raymond James's former AML compliance officer.
If you're a CDO, how would you describe your most important role: gatekeeper or innovator? Or are you walking a tightrope between the two? Those questions figured prominently at the 10th annual MIT Chief Data Officer & Information Quality Symposium held in July.
Data democratization is about giving a larger group of people in the company access to self-service tools to find and work with the data they need for analysis. This self-service capability is essential to building a data-driven organization. But there is a fine line between enabling self-service and ensuring data is accessed by the right people and used in the right situations.
As any developer knows, perfect software doesn't just happen; it, pardon the pun, "develops" over time. Developers engage in a seemingly everlasting iterative process of bug fixes and changes that can last for the lifetime of an application. But writing the software is only half the battle; it must then be deployed.
For big data companies like ours that run software across distributed networks, this is no small task. The cycle repeats continuously: a developer makes changes, runs tests, identifies errors or processing improvements to address, and then makes more changes.