Is there one right way to research the law?
Do most of us know the best? The most? Or even a handful of useful search strategies? Almost certainly not, according to a few recent studies. As one of those studies highlights, even those who do probably aren't sharing their strategies in any event. These studies paint the picture of a profession that plops a few words into a single search engine, relies heavily on the machine to sort the returned results, and then stops looking within a few minutes, having grabbed a few documents that look useful.
There are valuable methods, various tricks and tips that can be applied with any search tool, and innumerable ways to define an issue before entering the first keystroke. Yet for reasons of economics or habit, most of us limit our efforts to applying a few comfortable strategies or engaging with one or two tools.
If there were but one right way to do it, then legal research technology would be either a natural monopoly (no more than one platform is necessary, and once that platform accrues dominant market share, there's no market opportunity for a second) or a pure commodity (if there's only one way to research, then everyone can build to that standard). But imperfect information, different tool sets, and a variety of user types and needs mean that, at best, an individual might find alignment of all the elements, with no assurance that the next user will get equivalent value – or any value – from following in lockstep.
We’re not the researchers we need to be
Courtesy of a tweet from LSO's David Whelan, I learned of a fascinating comparative study out of the UK on the information retrieval habits of different professionals, including legal professionals. The legal group comprised 108 researchers who responded to a survey posted in the LexisNexis-sponsored LexTalk legal community forum (and who, through self-identification, included 51 associates, 27 partners and 16 librarians). Now, whether we should expect legal researchers to possess a diligence and commitment to the task on par with medical or patent researchers (or recruitment professionals) is an open question (spoiler alert: we don't), but I was still quite surprised by results that suggest a fairly lackadaisical approach to research.
Prior to the advent of AI-driven or even natural-language-query-augmented research tools, it was accepted that digital legal research required, at minimum, an understanding of your database contents, the time and cranial capacity to review and understand your findings, and either a wizard-like skill at crafting sophisticated Boolean query strings or a Sisyphean patience for the drudgery of trying again, and again, and again… and again, with different keywords and approaches to ensure you exhausted all possibilities. Whereas the patent analysts surveyed in this UK study fit that description, reporting a median search session time of 12 hours and a median of 15 queries, our colleagues in the legal profession barely warmed their terminals with a 15-minute median search session and a median of 3 (three!!!!) search queries. With respect to evaluating the results, legal researchers reported engaging with a median of 21 results for a median of 5 minutes per result.
Imagined as the behavior of a single researcher, this paints a picture of a lawyer who enters a search, scans the first page of results for a few minutes, maybe reads a few documents, does this a couple more times, and then wraps up the effort.
This may be entirely appropriate to the circumstances of a given lawyer for a given matter. After all, just as there isn’t one way to research, there isn’t one reason to research. And someone who knows his or her area shouldn’t have to spend more time than necessary to grab the insight or find the document needed.
We look where the light shines instead of shining more light
The UK study also showed that legal researchers have a much greater propensity than the other groups to rely on the relevance rankings of their tools and to prioritize currency/recency of content over precision relative to the query. Another way of describing this is that legal researchers are more likely to go in with an idea of what they want and stop looking when they find it, whereas health care or patent researchers are more inclined to go in with a theory and stop when that theory is proved and alternate theories are disproved.
All digital legal research tools shine light into the darkness of their records. Page 1 of the results to an initial query shines light in one spot. It's up to the researcher to call on the tool(s) to shine more light elsewhere, yet we don't seem to do that, nor do we appear inclined to start.
In another study of lawyer research habits, this one out of the U.S. and based on a survey of 100 judges, we learn that lawyers frequently fail to cite relevant authorities. 27% of judges report that this occurs “most of the time” and 83% say they see this problem “at least some of the time.”
This should come as no surprise. We've already accepted that in other high-volume, needle-in-a-haystack parts of legal practice, like litigation discovery or M&A due diligence, information gets missed, and augmenting human limitations with technology has effectively become necessary to ensure a minimum standard of competence. Yet compared to other professional research groups, we've collectively shown remarkably little interest in improving the ways in which our research tools can help us. As shown in the image below, when asked about the ideal search engine, the feature legal researchers valued most highly in a search system – recency of retrieved results – was essentially table stakes for other professions and in other applications.
It's time we start thinking differently about how we research the law. Of equal importance, it's time we start talking about how we do it. That UK study? It highlights how we keep our bad habits to ourselves and how there aren't enough people out there sharing the good ones, with over 50% of legal researchers saying they don't share their strategies with others and fewer than 5% indicating they share strategies beyond their organization.
There isn’t one right way to research the law. The Boolean string is but one approach, and an approach we aren’t even coming close to using to its fullest potential. Yet even as we re-learn our original training, we need to look beyond Boolean, and we need to open up discussions about alternate methods.
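To make concrete what "fullest potential" might look like, here is a sketch of a more deliberate Boolean query than the bare keywords most of us type. The connectors shown (AND, OR, AND NOT, the /p and /s proximity connectors, and the ! root expander) follow conventions common to the major legal research platforms, though exact operator syntax varies by tool, and the fact pattern below is purely hypothetical:

```
(terminat! OR dismiss! OR discharg!) /p (whistleblow! OR "protected disclosure")
  AND (retaliat! /s (employ! OR worker))
  AND NOT (arbitrat! OR "collective bargaining")
```

A researcher working at full Boolean strength would run several variants of a string like this – swapping in synonyms, loosening the proximity connectors, dropping the exclusions – rather than stopping after the three queries the UK study reports as the median.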
So much more is possible. What is it going to take to get us going?