Numbers Can Be Scary Sometimes

BOO! Did I scare you? No? Well, try this on for size…

PRODUCTIVITY IS DOWN 12%! Yeah, I’m sure that did the trick.

Numbers can be scary—even for data geeks like us. Lots of numbers can be cumbersome to understand, especially when you toss in decimals, negative signs, percentage “increases” and “decreases,” ratios, and… dare I say it?

I dare…

MISLABELED GRAPHS!!

Doesn’t that just sound gross, like something sticky you can’t get off your fingers?

Well, keep that mug of cocoa hot and that weighted blanket pulled tight. We have a tool to help numbers feel a little less… scary.

As in S.C.A.R.Y., an acronym. It might sound spooky, but it will embolden you to leap over intimidating numbers instead of falling into a pit of data despair. (Are the puns catching?)

S is for “surprised.” Were you surprised when you saw a particular number in a presentation or report? It happens to everyone. Reflect on why you were shocked, and consider what systems could be put in place so the numbers don’t catch you off guard next time. Use the surprise to know your business better, not to assign blame or judgment.

Could a quarterly report be better informed by a monthly one? Did a recent customer survey leave you stunned? Consider tighter feedback loops so you learn of problems (and successes) sooner.

C stands for “credible data.” You can trust your colleagues and still hold a number accountable. Could a negative reading of the data reflect on whoever entered or pulled it? If so, would the threat of repercussions shake your confidence in the information? Even without malicious intent, errors can slip in on both the input and the output side, so it’s wise to check the process from time to time.

If there’s a mistake, correct it. Then publicly share with the relevant parties that an error was made. Finally, see how future mistakes can be prevented or mitigated.

A is for “action(able) plan.” Data is most actionable when it is statistically significant, is received without judgment, and informs a clear, agreed-upon plan moving forward.

When you receive (correct) information, first consider whether it is statistically significant. An increase in income of $5,000 this quarter would be impressive against a $10,000 quarterly projection but might generate less enthusiasm in a $10,000,000 budget, where it amounts to a 0.05% blip.

Statistical significance applies to percentages, too. The former example was a 50% increase, which would sound hefty had each quarter not typically seen 50% increases. On Wall Street, a 5% change could mean billions of dollars or mere tens of thousands, depending on what is being measured. Even a 0.5% change could be statistically significant in certain contexts if it is unusual compared to other similar changes.

“Significant” on its own is arbitrary and subjective. “Statistically significant” creates a common agreement and offers a more objective way to evaluate information.
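If you like to poke at ideas with code, here’s a minimal sketch of that last point (every number below is invented for illustration): compare the latest change against the typical quarter-to-quarter swing, and flag it when it strays far from the pattern.

```python
from statistics import mean, stdev

# Hypothetical quarterly income figures, oldest to newest (all invented).
quarterly_income = [10_000, 15_000, 22_200, 33_500, 38_000]

# Percent change between each pair of consecutive quarters.
pct_changes = [
    (new - old) / old * 100
    for old, new in zip(quarterly_income, quarterly_income[1:])
]

latest = pct_changes[-1]
typical = mean(pct_changes[:-1])
spread = stdev(pct_changes[:-1])

# Crude rule of thumb: a change more than two standard deviations from
# the historical average is "unusual compared to other similar changes."
z = (latest - typical) / spread
print(f"Latest change: {latest:+.1f}% (typical: {typical:+.1f}%)")
print("Unusual -- worth a closer look!" if abs(z) > 2 else "Business as usual.")
```

Here, a “mere” 13% quarter flags as unusual precisely because every prior quarter grew about 50%. Context, not the raw number, does the scaring.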

“Fix the store, not the score,” as Dr. Jack likes to say. You may be tempted to challenge the numbers’ credibility before committing to any changes. That’s okay! But once you’ve validated the numbers enough times, you might need to look inward. Adjust your processes to meet the business needs the data highlight. Even favorable results might warrant an action plan!

R brings us to Retest and Reevaluate. Oh, you thought that was it? The night is still young!

Once you breathe life into your action plan (and it has had enough time to take effect), prepare to rerun your measurements. How do they compare to your first test? To previous numbers? Did your action plan work? If not, rinse and repeat: act, remeasure (validating again if necessary), reevaluate, and act again until your “score” shows that your “store” is where you want it to be.
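Here’s a toy sketch of that loop. The measurement function and its scores are entirely made up; in real life, “measuring” means rerunning your survey, report, or metric of choice.

```python
import itertools

# Canned, invented scores standing in for real remeasurements.
fake_scores = itertools.chain([0.72, 0.79, 0.87], itertools.repeat(0.87))

def measure_satisfaction() -> float:
    """Stand-in for rerunning your real measurement (e.g., a survey)."""
    return next(fake_scores)

target = 0.85      # where we want the "score" to be
max_rounds = 4     # don't loop forever; rethink the plan instead

for round_number in range(1, max_rounds + 1):
    score = measure_satisfaction()
    if score >= target:
        print(f"Round {round_number}: {score:.0%}. Store fixed!")
        break
    # Here you would adjust the action plan before remeasuring.
    print(f"Round {round_number}: {score:.0%}. Adjust and retest.")
else:
    print("The plan isn't moving the score; reevaluate the plan itself.")
```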

When you have good information showing your business is where you want it to be, that’s how you get data to work for you.

Finally, Y gets us to why. Why? WHYYYY?????? *…End dramatic scene…*

Why would these numbers run this way? Who ran and delivered them? What motivation brought these numbers to life, or at least into being presented to you? Could the data be used to spotlight one number as more important than another, to distract attention from less favorable numbers, or to inflate (or deflate) someone’s or something’s image? Could the numbers simply be presented in a way that makes them look better (or worse) than they truly are?

When it’s intentional, we call this “Look at the bunny.”

“Yeah, sure, some numbers may not look so good. But wait! Look over there at that bunny [the impressive but not as important data]! Isn’t it so shiny? Isn’t it so fluffy and lovely and great??”

The presenter of the information may not intend, or even be aware of, the misrepresentation. Sometimes a graph designed simply to be more readable makes the data look better or worse than it should. Validity in data refers to both the method and the person (or people) carrying it out. Good data brings good decision-making.
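Since mislabeled (and misleading) graphs kicked off this whole scare, here’s a small illustration with invented revenue numbers. Both panels plot the same data; the only difference is where the y-axis starts.

```python
import matplotlib.pyplot as plt

quarters = ["Q1", "Q2", "Q3", "Q4"]
revenue = [100, 101, 102, 103]  # a modest ~1% drift, invented for illustration

fig, (honest, dramatic) = plt.subplots(1, 2, figsize=(8, 3))

honest.bar(quarters, revenue)
honest.set_ylim(0, 120)     # baseline at zero: growth looks modest
honest.set_title("Axis from zero")

dramatic.bar(quarters, revenue)
dramatic.set_ylim(99, 104)  # truncated axis: same data looks explosive
dramatic.set_title("Truncated axis")

fig.suptitle("Same numbers, different bunnies")
plt.tight_layout()
plt.show()
```

Neither chart is “wrong,” exactly. But one of them is holding a bunny.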

See? Data isn’t so S.C.A.R.Y. after all. We even threw in a bunny! I hope that helps you all sleep better at night… or in the middle of reading your next annual report. We won’t tell you how to manage your time.

P.S.: If you’re a Monty Python fan, then the bunny we threw in might have made S.C.A.R.Y. a little spookier. Don’t worry. You’ll still get out of that operations meeting with your neck intact.

Probably.