Are your big data initiatives solving your most challenging problems, or would a more focused effort produce better results?
“Imagine what we could do with all of this data in one place!”
“Distributing high-value analytics to end users will never be easier!”
If you’ve ever been involved in discussions leading up to an investment in big data, you’ve probably heard predictions like these. But what’s the reality?
Earlier this year, a few colleagues and I were doing some research for a presentation on how GE applies big data and analytics to improve performance in healthcare. We were looking for statistics on the success rate of big data strategies across the industry. What we found was shocking!
Useless (adjective):
- Of no use; not serving the purpose or any purpose; unavailing or futile
- Without useful qualities; of no practical good
That’s right. IT research and advisory firm Gartner has predicted that, through 2018, 90% of deployed “data lakes” – enterprise-wide repositories of raw data – will be useless due to overload of information captured for uncertain purposes¹.
A big part of the problem is that leaders aren’t asking themselves the right questions from the start.
What exactly will you do with all this data in one place? What problems will you solve? What are some real-world use cases? What resources will this require? Specifically, what departments or service lines will you support first?
When I ask these questions, people often give many different answers at once, or fail to form a good response at all. Too often, these critical questions aren’t asked in the first place, and months later organizations find themselves in a real mess: they’re having a hard time getting accurate data upstream, they underestimated the resource investment they’d have to make up front, and departments are still starved for data, often waiting months for a report from a centralized team.
More data, more problems!
I recently had the pleasure of participating in a panel discussion at H&HN’s Executive Forum in Chicago². The topic was how organizations are using data and analytics to enhance clinical quality and reduce cost. Many of the panelists spoke of how much data they’re amassing across their institutions … but to what end?
“We get data by the truckload,” said Tim Putnam, president and CEO of Margaret Mary Community Hospital, but added, “a big eye-opener for us is that the data is often incorrect. What do we do with that?”
Others echoed this problem with their own data and organizations. But don’t just take their word for it: IDC, a global market intelligence firm, projected in 2012 that the amount of healthcare data generated worldwide would grow 50x through 2020. Unfortunately, the same study also found that, in practice, only 3% of potentially useful data is tagged and even less is analyzed³.
So, how can you avoid the pitfalls of overwhelming, inaccurate, or useless data?
Little Data for the Win!
One solution is to start small. Instead of the much-hyped big data, try little data – the low-hanging fruit, the departmental or service-line-specific issues that matter daily. This information is far cheaper to access, problems with data integrity can be quickly addressed, and you can often realize greater advantages in terms of return on investment and total cost of ownership.
Defining a clear problem that you’re trying to solve and obtaining the minimum data set possible to take action are key steps to success. In fact, taking this approach we have helped one of our customers realize over $500,000 in cost savings in one year alone!
The next time someone starts talking about big data, ask yourself: What type of solution do I really need? What are the KPIs and data points I need to ensure I’m achieving the clinical, financial, and operational outcomes I’m seeking? You might be surprised what wonderful, little improvements you’ll make.