November 10, 2012 is approaching. Tomorrow, in fact.
So? (No, it’s not my birthday.)
In Canada, where dates are commonly written day/month/year, tomorrow's date is 10/11/12.
That’s too good a coincidence to ignore.
Therefore, with no authority to do so whatsoever, I hereby proclaim November 10, 2012 Numeracy Day in Canada.
Behold the power of numbers.
Behold the power of bad statistics to lead us astray. Behold the awesome grip upon us created by throwing around numbers, even when those numbers are cut from whole cloth and do not add up.
Behold the power of metrics. He or she who can wave metrics around truly holds the Talking Stick in a corporate organization. The fact that over 90% of the corporate metrics I've seen in my career were significantly flawed is no matter at all.
Behold the power of saying 90%. I made that number up. But it actually sounds reasonable. I once analyzed a 30-category scorecard and noted a statistical problem with every single metric on it. These problems didn't render the metrics useless, but they did skew what the people in charge thought they were measuring – sometimes slightly, sometimes dramatically.
So here's to Numeracy. Let's be smart(er) about numbers, starting on 10/11/12. And not stopping the day after.
An extended and anticlimactic footnote: What can we do to minimize our innumeracy?
- Question all metrics. Look behind them, both at how they were measured and what they measure. For example, what do client satisfaction (CSat) numbers mean? Do they tell you whether the client is likely to use you again? What if another firm has even higher CSat? Is there evidence that high CSat correlates with repurchase intent? Clearly DSat – client dissatisfaction – correlates with clients seeking a change. However, the opposite of CSat is not DSat but indifference. And how was CSat measured? Is an email or online survey reliable, or do you get responses only from those who aren’t very busy or whom you disappointed? If you call or visit, do they tell you the truth (absent major problems)? Finally, do you understand what drives CSat? If you don’t know for sure how to make it go up, what good does this metric do? In truth, CSat is very complex, and different clients have different needs and triggers.
- Don’t accept numbers in a vacuum. If a client says, “Cut rates by 5%,” what do they really mean? Is it okay to cut rates but bill more? To change the mix of professionals with which you staff their matters? Do they even care about rates directly, or do they really mean “cut our total bill”?
- Follow up. If a colleague says, “Task X will take 10 hours” and later says “I’ve spent 15 hours on this matter,” follow up. What else did they do besides Task X? Is that what you asked and expected? If their estimate was wrong (which is neither unusual nor a crime – estimating is hard!), remind them next time to let you know sooner rather than later, and learn what was different so that the next estimate is better.
- Add it up. If you commit to a client that your practice will spend 40 hours on a matter, how did you come up with that number? Was it “bottom-up” (adding up the estimates of each task) or “top-down” (saying this matter is worth 40 hours and then shaping the tasks to fit)? Both are legitimate and appropriate for different circumstances. On a bottom-up estimate, ensure that you’ve captured all tasks (and added correctly). On a top-down estimate, make sure everyone on the project knows what it’s worth. And in both cases, confirm that everyone does their best not to exceed the allotted hours – and that they let you know quickly if they believe they’re going to run over by a nontrivial amount. (In other words, don’t sweat a 20-hour task coming in at 21; if it’s 30 or even 25, it’s a problem you need to resolve.)
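  The “nontrivial overrun” idea above can be sketched in a few lines of code. This is only an illustration, not anything from the original post: the task names, hours, and the 10% tolerance are invented to match the spirit of the example (a 20-hour task at 21 is fine; at 25 it’s a problem).

  ```python
  def flag_overruns(tasks, tolerance=0.10):
      """Return tasks whose actual hours exceed budget by more than the tolerance.

      `tasks` is a list of (name, budgeted_hours, actual_hours) tuples.
      The 10% default tolerance is an illustrative assumption, not a rule.
      """
      return [
          (name, budgeted, actual)
          for name, budgeted, actual in tasks
          if actual > budgeted * (1 + tolerance)
      ]

  matter = [
      ("Research", 20, 21),   # 5% over: don't sweat it
      ("Drafting", 20, 25),   # 25% over: needs a conversation
  ]
  print(flag_overruns(matter))  # → [('Drafting', 20, 25)]
  ```

  The point of automating a check like this isn’t precision – it’s that the follow-up conversation happens early, while there’s still time to adjust.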
- Don’t trust spreadsheets without reviewing them. They do what people tell them to do, and they do it correctly – but that’s no guarantee the spreadsheet’s creators told them to do the correct thing. Even if you lack the inclination, skills, or time to examine the formulas behind the numbers, scan the sheet with a critical eye. Look for values and sums that don’t “feel right.” I’ve found numerous spreadsheet errors – in other people’s work and in my own – by looking for numbers that didn’t square with my expectations or looked odd in context. Expectations aren’t always accurate, of course, but they’re a good place to start.
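  One concrete version of that critical scan is simply recomputing a reported total from its line items. The categories and figures below are invented for illustration (the “reported” total contains a classic transposed-digit error):

  ```python
  # Hypothetical invoice line items and the total a spreadsheet's SUM cell claims.
  line_items = {"Fees": 12500, "Disbursements": 830, "Taxes": 1733}
  reported_total = 15603   # transposed digits: should be 15063

  computed_total = sum(line_items.values())
  if computed_total != reported_total:
      print(f"Mismatch: computed {computed_total}, reported {reported_total}")
  ```

  A check this simple catches a surprising share of spreadsheet errors – hard-coded totals that were never updated, ranges that stopped one row short, and the like.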
- And it should go without saying, but I’ll say it anyway: Don’t trust numbers in newspapers or magazines. They generally report what people say; few people either know or speak the whole truth; fewer harried reporters understand it; and even fewer publications have room for that level of information.