Ludwig Bemelmans' classic picture book Madeline has been enjoyed by generations of children — my own daughters included — who are perennially attracted to the story of Madeline, the smallest yet most adventurous of twelve little girls in a Paris boarding school.
The story also has an important lesson on analyzing — and questioning — information.
In Madeline, Miss Clavel, the girls' teacher and caregiver, suddenly awoke one night sensing trouble:
In the middle of the night
Miss Clavel turned on her light
and said, "Something is not right!"
Sure enough, she found Madeline in her bed, in pain from appendicitis. Of course, all turns out well, thanks to Miss Clavel listening to her personal sense that something was not right.
That scene from Madeline somehow came to mind while recently reading Know What You Don't Know: How Great Leaders Prevent Problems Before They Happen (2009) by best-selling author and business professor Michael Roberto. One of the most troubling causes of unseen problems mushrooming into catastrophes, Roberto writes, is an organizational culture that dismisses intuition in favor of hard data:
Some organizations exhibit an intensely analytical culture. They apply quantitative analysis and structured frameworks to solve problems and make decisions. "Data rule the day; without a wealth of statistics and information, one does not persuade others to adopt his or her proposals."
Roberto convincingly drives this point home with a serious real-world medical problem: many hospitals experience high levels of cardiac arrest among admitted patients. According to one study, hospital personnel who did observe advance warning signs of cardiac arrest alerted a doctor only 25% of the time. Why? Nurses and other staff members might have noticed a change in patient monitoring data, but observed in isolation, the data did not clearly indicate an urgent problem. Other warning signs, such as the patient's mental condition and level of fatigue or discomfort, are not based on quantifiable data at all.
Often nurses and other staff felt a Miss Clavel-like sense that "Something is not right" with a patient who was indeed approaching cardiac arrest; but with little or no hard data to support their concern, they did not feel comfortable alerting a doctor. The consequences of hospital cultures that unwittingly compel caregivers to ignore their intuition are severe: once the window of opportunity to avert cardiac arrest closes, a "Code Blue" crisis is at hand — with a survival rate of less than 15%.
Many hospitals nationwide have since implemented a highly successful program to sharply reduce Code Blue incidents. Nurses and staff are actively encouraged to report observed warning signs, as well as concerns not yet supported by observed data, to a new Rapid Response Team. The team arrives at the affected patient's bedside within minutes and assesses whether further testing or treatment is warranted to head off cardiac arrest. Unlike a Code Blue team that "fights the fire" of a full-on heart attack, Roberto writes, a Rapid Response Team "detects the smoke" of a potential heart attack.
Traditional data warehousing and data analytics vendors often present their solutions as a way to make decisions "based on objective facts" rather than relying on "emotional gut feel." The problem, however, is that the known "objective facts" may not provide a complete — or even accurate — picture of what's really going on.
As Roberto's hospital case study illustrates, a gnawing sense that "Something is not right" should be interpreted as an alert that you probably do not have "all the facts," but rather just some facts. That is, you don't know what you don't know. On this key point, Roberto writes:
In highly analytical cultures, my research suggests that employees also may self-censor their concerns... In one case, a manager told me, "I was trained to rely on data [which] pointed in the opposite direction of my [correct] hunch that we had a problem. I relied on the data and ignored that nagging feeling in my gut."
So, listen to your gut, your intuition, as a signal that you need to dig deeper into the matter at hand. Actively seek out further information beyond the hard data available to you. Compare that information with your hard data and "connect the dots" for a far more complete picture, which may well yield surprising new insights.
What I find very exciting is that unified information access (UIA) is playing a vital role in empowering managers and leaders to connect those dots between data and other silos of information to realize those critical new insights.
UIA integrates, joins, and presents all related information, both structured data and unstructured content, to complete the informational picture and significantly expand what organizations "know," enabling them to determine with confidence whether "Something is not right."
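To make the idea concrete, here is a minimal sketch in Python of what "connecting the dots" between structured data and unstructured content might look like. Everything in it is invented for illustration — the patient records, the note text, the thresholds, and the keyword list are all assumptions, not part of any real UIA product or the hospital study Roberto describes:

```python
# Hypothetical sketch: join structured vitals with unstructured nurse notes
# to flag patients whose combined picture says "something is not right,"
# even though neither signal alone would trigger an alert.

patients = [
    {"id": 101, "heart_rate": 88, "resp_rate": 18,
     "note": "Patient resting comfortably, ate full dinner."},
    {"id": 102, "heart_rate": 96, "resp_rate": 24,
     "note": "Hard to rouse, seems confused and unusually fatigued."},
]

# Invented vocabulary of concern terms that might appear in free-text notes.
CONCERN_TERMS = {"confused", "fatigued", "lethargic", "restless"}

def needs_rapid_response(p):
    # Borderline vitals alone are not a crisis...
    borderline_vitals = p["heart_rate"] > 90 or p["resp_rate"] > 20
    # ...and a worried note alone is "just a feeling"...
    note_concern = any(term in p["note"].lower() for term in CONCERN_TERMS)
    # ...but together they justify sending someone to the bedside.
    return borderline_vitals and note_concern

flagged = [p["id"] for p in patients if needs_rapid_response(p)]
print(flagged)  # patient 102 is flagged; patient 101 is not
```

The point of the sketch is the join itself: neither the monitoring data nor the free-text observation crosses a threshold on its own, but combining the two sources surfaces the patient who warrants a closer look.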
Analyzing just a single data type or source of information would not only fail to surface those new insights; even worse, such a limited, incomplete analysis may well point leaders in the wrong direction.
Stay tuned here on the Attivio blog for more insights on how UIA can help you get that complete informational picture, and, as Roberto writes, how to go beyond being a "problem solver" to becoming a "problem finder."