Are You Making These Data Mistakes? (ALMOST EVERY ORG DOES!)
My overarching goal is to help anyone who wants to use data do exactly that. As part of these efforts, I’ve worked in a variety of industries and organizations, and over and over I see the same themes holding people and organizations back.
Common Data and Reporting Mistakes
- Inaccurate aggregations (like averaging an average)
- Not fully exploring data you are making decisions on
- Reports overload
- Not actively measuring metrics that tie to business objectives
- Confirmation bias clouding your analytics
Look. I get it. We’re all trying to do the best we can with what we have. That said, mistakes are costly. Whether the costs are direct or indirect, it hurts. Check out this video if video is more your thing, but I’ve got some tidbits in this post that you won’t find in the video, since updating my thoughts here is wayyyyy easier.
Inaccurate Aggregations/Math
I mention the average of an average as data mistake number 1 because I see this one a lot. However, little nuggets of inaccuracy tend to crop up fairly regularly, too.
Put simply, averaging an average ignores how many units (widgets/calls/products/whatever) are behind each of the values you are averaging. Check out this post on averaging if you need more on this.
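To make it concrete, here’s a minimal sketch of the difference. The team names and call-center numbers are made up purely for illustration:

```python
# Why averaging an average goes wrong: the naive version ignores
# how many calls sit behind each team's average.

# (team, calls handled, average handle time in minutes)
teams = [
    ("Team A", 1000, 5.0),   # high volume, fast
    ("Team B", 10, 20.0),    # tiny volume, slow
]

# Wrong: a plain average of the two team averages
naive_avg = sum(avg for _, _, avg in teams) / len(teams)

# Right: weight each team's average by its call volume
total_calls = sum(calls for _, calls, _ in teams)
weighted_avg = sum(calls * avg for _, calls, avg in teams) / total_calls

print(f"Naive average of averages: {naive_avg:.2f} minutes")    # 12.50
print(f"Volume-weighted average:   {weighted_avg:.2f} minutes")  # ~5.15
```

Ten slow calls shouldn’t count as much as a thousand fast ones, but the naive average treats both teams equally and more than doubles the real number.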
This specific aggregation error is very common, but smaller errors that aren’t quite as universal are also worth watching for. The beautiful thing about our tools is that they insulate us a bit from issues, but stay alert. Here are some other things tools won’t insulate your reports from:
- Percentage aggregations (these should be recalculated from the underlying data or weighted, just like the average-of-averages situation; see the sketch after this list)
- Spreadsheets with hard-coded references pulling in bad info
- Chart scales adjusted in a way that exaggerates results
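The percentage pitfall from that list works just like the averages one. Here’s a hedged sketch with invented regional conversion numbers:

```python
# Averaging percentages directly vs. recalculating from underlying data.
# Regions and counts are invented for illustration.

regions = [
    # (region, conversions, visits)
    ("East", 5, 10),      # 50% conversion on tiny traffic
    ("West", 100, 1000),  # 10% conversion on real traffic
]

# Wrong: average the two percentages directly
naive_pct = sum(conv / visits for _, conv, visits in regions) / len(regions)

# Right: recompute the percentage from the underlying totals
total_conv = sum(conv for _, conv, _ in regions)
total_visits = sum(visits for _, _, visits in regions)
true_pct = total_conv / total_visits

print(f"Average of percentages:  {naive_pct:.1%}")  # 30.0%
print(f"Recalculated overall:    {true_pct:.1%}")   # ~10.4%
```

If your tool (or spreadsheet) only hands you the pre-computed percentages, that’s your cue to go find the underlying counts before you aggregate.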
Explore Your Data Before Basing Decisions On It!
This one is fun. I am always happy to hear that an organization is eager to adopt a data analytics strategy in an effort to be a data-driven organization. That said, you can’t just ‘trust the data’ blindly. It is absolutely critical that the data is explored. Check out this post if you need more on why to explore and how much exploration to do.
Now, whoever is creating the reports (you, or someone else in your organization) should be doing this exploration to some degree. Sometimes it comes in the form of checking against other systems for some level of accuracy. Either way, YOU should be seeing some exploration into the metrics you are basing decisions on.
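What does “some exploration” look like? Here’s a minimal sketch of the kind of sanity checks worth asking to see. The data is made up for illustration; in practice you’d run this against your own extract:

```python
# Quick sanity checks before trusting a headline metric.
import pandas as pd

# Hypothetical extract; note the missing region and the negative revenue
df = pd.DataFrame({
    "region":  ["East", "East", "West", "West", "West", None],
    "revenue": [1200.0, 950.0, 15000.0, -40.0, 14200.0, 800.0],
})

# How big is it, and are key fields missing?
print(df.shape)
print(df.isna().sum())

# Do the ranges even make sense? (Negative revenue? Wild outliers?)
print(df["revenue"].describe())

# Does the headline number survive a different cut of the data?
print(df.groupby("region")["revenue"].agg(["count", "sum", "mean"]))
```

Five minutes of this kind of poking tends to surface the baked-in assumptions before they reach the boardroom.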
Time and time again, I’ve seen business leaders make decisions on numbers put in front of them that turned out to be built on assumptions no one in the room knew were baked in.
Asking questions about reports can help. At a minimum, if you are making a decision, questions force whoever is presenting the information to start explaining. How many times have you heard that listening is the most powerful thing you can do? This is one of those times. Leave dead air, and let whoever shared the info keep talking. That said, seeing some graphs that support those metrics is even better.
I can’t tell you how many times, as an analyst, my analytics supervisor had me adjust reports to sell their story. Yup. An analytics supervisor. Not operations. A team leader of analysts with no direct skin in the game was having me alter PowerPoint presentations to sell an agenda. The changes were not unethical, but bias is everywhere. Insisting on seeing visuals that support the metrics you’re making decisions on at least gives you the opportunity to uncover this type of activity.
Reports Overload
If you are maintaining (or having staff maintain) daily reports that you’re only looking at monthly, you’re wasting resources. If you are maintaining reports that have been in production for the past 5 years, you could be wasting resources. This one is that straightforward.
Having too many reports is distracting and wasteful.
Not Joining Metrics to Business Objectives
This is highly related to reports overload, but there is a nuance here. Strategic business leaders hold goal-setting sessions. Before you leave that mental space, you need a data and reporting strategy brainstorm, too. I have no idea where the saying ‘What gets measured gets done’ came from, but it’s true. You have to tie business objectives to metrics.
I get that improving customer experience will likely be measured by a customer experience metric like a survey result, but I’m encouraging you to think deeper. What touchpoints do you have with your customer before that survey? Brainstorm those touchpoints so they end up as line items of measurable things you can pursue to make strides toward your objectives.
Tie your business objectives to your reports as part of your goal setting strategy – not as a sidebar to your business’ overarching strategy.
Confirmation Bias Clouding Your Analytics
Using your gut in business is not an outright problem. There are anecdotal events that are tough to measure. However, we have data to help manage the risk from gut decisions, right? We do, as long as you don’t let confirmation bias creep into your analytics space. Confirmation bias is essentially the tendency to seek out and interpret information in a way that confirms what you already believe (or want to believe).
Confirmation bias in data can rear its ugly head in the way that you ask for a report, the way you (or your staff) create the report, and the way that you perceive the results. We’re human, after all!
Processes can help you avoid this pitfall that feels inevitable. When asking for or thinking through a report, keep the problem that you are looking to solve front of mind. If you’re passing it on for someone else to create, make sure you’re framing it that way. What is the problem? Remember the scientific method you likely learned in elementary school? You can hypothesize, but your educated guess may or may not pan out. That’s the point. Test that theory, but ultimately, you want to solve the problem. Even if it proves your theory wrong!!
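If you want “test that theory” to be more than a figure of speech, a basic statistical test is one way to do it. Here’s a hedged sketch using a two-sample t-test; the handle-time samples are made up, and in practice you’d pull them from your own data:

```python
# Testing a theory with data instead of gut feel: did a process
# change actually lower handle time, or is the difference noise?
from scipy import stats

# Hypothetical handle times (minutes) before and after the change
before = [5.1, 4.8, 5.5, 5.0, 4.9, 5.3, 5.2, 4.7]
after  = [4.6, 4.9, 4.5, 4.8, 4.4, 4.7, 4.6, 4.8]

# Two-sample t-test comparing the group means
t_stat, p_value = stats.ttest_ind(before, after)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# A small p-value suggests a real change; a large one says your
# theory didn't pan out, and that's a useful answer, too.
```

The test you pick matters less than the habit: state the hypothesis before you look, then let the result stand even when it disagrees with you.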