In last week’s post I talked about the Legal Trends Report, a data-driven benchmarking report based on actual billing data.
This approach is an industry first, and as such the Legal Trends Report uncovers a number of interesting insights that I’ll be digging into over the next few weeks.
However, I personally found the most surprising finding of the Legal Trends Report to be the vast disparity between self-reported data and “real” data derived from real-world usage. Take, for example, utilization rate: the percentage of a lawyer’s workday that ends up as billable time. Based on actual billing data from over 40,000 lawyers, the Legal Trends Report found the average utilization rate to be just 28%. Contrast this with the self-reported data in the 2012 LexisNexis Billable Hours Survey, which put utilization rates at over 60% across all firm sizes. How do we reconcile a more-than-2X disparity between actual, real-world data and self-reported data?
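To put those percentages in concrete terms, here is a minimal sketch of the utilization-rate arithmetic. The 8-hour workday is an illustrative assumption on my part, not a figure from either report:

```python
# Utilization rate = billable hours / total working hours.
# NOTE: the 8-hour workday below is an assumption for illustration only.

def utilization_rate(billable_hours: float, workday_hours: float = 8.0) -> float:
    """Fraction of the working day that ends up billed."""
    return billable_hours / workday_hours

# At the Legal Trends Report's 28% average, an 8-hour day implies:
billable_actual = 0.28 * 8.0        # 2.24 hours actually billed per day
print(round(utilization_rate(billable_actual), 2))   # 0.28

# The self-reported 60%+ figure would instead imply:
billable_reported = 0.60 * 8.0      # 4.8 hours billed per day
print(round(billable_reported, 1))
```

In other words, on an 8-hour day the gap between the two figures is the difference between roughly 2.2 and 4.8 billable hours, which makes the size of the disparity easier to appreciate than the raw percentages alone.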
As it turns out, this disparity can be explained by a well-understood phenomenon in the social sciences called the social desirability bias:
Social desirability bias is a social science research term that describes a type of response bias: the tendency of survey respondents to answer questions in a manner that will be viewed favorably by others. It can take the form of over-reporting “good” behavior or under-reporting “bad” or undesirable behavior.
The social desirability bias can have a pernicious effect on surveys that rely on self-reported data, and legal is no exception. As we move toward becoming a more data-driven profession, it’s clear that if we want the ground truth, we need to depend on actual data, not self-reported survey responses.