
Quantitative Assessment of Access to Justice Initiatives

Quantitative methods are at once well established and novel when speaking about access to justice. We’ve been reporting on our activities to funders, boards, and communities for decades, but we’ve also occasionally been complacent about what message we are conveying. When I think about data on the law and how we can approach using it better, I often think about John Snow and his search for the source of a cholera outbreak in London in 1854. The map he created allowed him to identify the source as the water pump on Broad Street, and he built it through research he carried out by knocking on doors and asking how many people in each house had died (there is much more information available on John Snow’s cholera map).

This stage, identifying patterns through simple tracking to find initial information about the causes of problems, is where we are when it comes to access to justice and data in Canada.

Data generally exists either as a byproduct of other processes or because it has been actively collected for a particular purpose. In the first case it may tell us little of value; in the second, gathering it may be more resource intensive than is warranted. This matters in the context of data and the law, especially when we compare outcomes for different groups or communities, because we have no way to know what a correct outcome should look like or what an appropriate variation across society would be. In their paper “Big Data, Machine Learning, and the Credibility Revolution in Empirical Legal Studies,” Ryan Copus, Ryan Hübert, and Hannah Laqueur wrote: “There often is no better indicator for the right decision than the decision that a judge actually made.” (preprint on SSRN).

So, given our inability to count “justice”, what can we do when we need to evaluate impact? Here is a short list of approaches to this question, with some discussion of their benefits and drawbacks, from easiest to most difficult.

The first way is to count interactions. This can be a simple count of the number of people who walk through the door, or the number of people who visit a website or use a service. The statistics are generally easy to collect, easy to explain, and easy to convey. They also allow comparisons over time to track impact year over year. However, they are frequently overly simplistic and don’t convey much meaning in terms of what people are actually doing or experiencing. This means that, though they are one of the most common ways of measuring impact, they are also frequently one of the least meaningful.
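To make this concrete, here is a minimal sketch, in Python, of what counting interactions can amount to; the records, years, and channels are invented for illustration, and a real service would draw these figures from its own intake logs or web statistics.

    from collections import Counter

    # Hypothetical interaction log: one record per person served,
    # noting the year and the channel of contact.
    interactions = [
        {"year": 2023, "channel": "walk-in"},
        {"year": 2023, "channel": "website"},
        {"year": 2024, "channel": "website"},
        {"year": 2024, "channel": "phone"},
        {"year": 2024, "channel": "walk-in"},
    ]

    # Tally interactions per year for a simple year-over-year comparison.
    by_year = Counter(record["year"] for record in interactions)

    for year in sorted(by_year):
        print(f"{year}: {by_year[year]} interactions")

The output is exactly the kind of figure that is easy to report and compare over time, and exactly the kind that says nothing about what the people behind the numbers experienced.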

The next way to track impact in a community is to follow metrics that are easily countable but carry a more significant element of impact. These may be things like the amount of time spent, a count of the materials shared, and other easily trackable ways of measuring whether someone was reached by a particular initiative. This often takes more work and may require more training for the people collecting it. However, it gives better insight into the work being done and how communities are experiencing it than simple numerical tracking.
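As a hypothetical sketch of what this richer tracking can look like, the snippet below aggregates invented session records by time spent and materials shared rather than simply counting heads; the field names and figures are made up for the example.

    # Hypothetical session records collected by staff at the point of service.
    sessions = [
        {"minutes": 45, "materials_shared": 2},
        {"minutes": 20, "materials_shared": 1},
        {"minutes": 90, "materials_shared": 4},
    ]

    total_minutes = sum(s["minutes"] for s in sessions)
    total_materials = sum(s["materials_shared"] for s in sessions)

    print(f"Sessions held: {len(sessions)}")
    print(f"Total time spent: {total_minutes} minutes")
    print(f"Average time per session: {total_minutes / len(sessions):.0f} minutes")
    print(f"Materials shared: {total_materials}")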

The third method of tracking impacts is attempting to count outcomes. This could be something like work product: divorces filed, contracts written, or negotiations conducted. It can be taken even further, trying to assess whether individuals had positive or negative outcomes. This is the most difficult approach: it frequently requires significant effort to collect the data, and because it often involves reaching people who are no longer connected with an organization, it tends to have a significant number of missing data points.
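To show why the missing data points matter, here is a small, hypothetical sketch that reports how many closed files have any recorded outcome before calculating a rate of positive outcomes, so the gap in follow-up stays visible instead of disappearing into an average; the files and categories are invented.

    # Hypothetical closed files: outcome is None when the person could not
    # be reached after their matter ended.
    files = [
        {"matter": "divorce", "outcome": "positive"},
        {"matter": "tenancy", "outcome": None},
        {"matter": "contract", "outcome": "negative"},
        {"matter": "tenancy", "outcome": "positive"},
        {"matter": "divorce", "outcome": None},
    ]

    reported = [f for f in files if f["outcome"] is not None]
    missing = len(files) - len(reported)
    positive = sum(1 for f in reported if f["outcome"] == "positive")

    print(f"Files closed: {len(files)}")
    print(f"Outcomes unknown (no follow-up reached): {missing}")
    if reported:
        print(f"Positive outcome rate among reported: {positive / len(reported):.0%}")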

The last way of conveying initiatives’ impacts I will mention is storytelling. The lived experiences of people dealing with legal problems and other stressful events can be among the most meaningful ways of communicating the reality of what they face. The problem is that stories can easily be little more than anecdotes, which cannot be extrapolated beyond the individuals discussed.

When we talk about analyzing the impacts of our work, we are working across disciplines. We draw on insights developed in the fields of business, statistics, data science, and storytelling, all of which require analysis of the underlying conditions and exploration of the impact on communities.

One of the easiest ways to find data for analysis is through tools that automate its collection, such as web statistics applications. This can be expanded to include the manual collection of data at the point of action and, potentially, more intensive research methods. Collection alone, however, does not tell us how to interpret what we gather. Telling the story about what data means requires a significant amount of interpretation and subject matter expertise to put it in context.

We have to decide what’s important. We can engage in the ongoing collection of user statistics, and we can expand our efforts into full research projects designed to identify underlying patterns. Neither of these initiatives is simple, and both raise significant issues about how they can be deployed within organizations.

Another possibility is to develop key performance indicators: specially selected metrics that give a sense, at a glance, of how things are going in an organization. They are often used crudely, and when they are poorly selected and managed they fail to communicate what they were chosen to convey and can incentivize or punish the wrong things. But it is not possible to assess every conceivable metric in real time, and well-chosen indicators provide insight into current operations, so they can be used to measure whether an organization or an initiative is working in the desired way.
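For illustration only, here is one way a small set of indicators might be computed from figures an organization already tracks and checked against targets at a glance; the indicators, numbers, and targets are all invented, and real ones would have to be chosen with the incentive problems above in mind.

    # Hypothetical monthly figures an organization might already track.
    monthly = {
        "clients_served": 340,
        "files_closed": 120,
        "median_days_to_first_appointment": 9,
    }

    # Invented KPI targets; the direction marks whether higher or lower is better.
    targets = {
        "clients_served": (300, "higher"),
        "files_closed": (100, "higher"),
        "median_days_to_first_appointment": (14, "lower"),
    }

    for name, (target, direction) in targets.items():
        value = monthly[name]
        on_track = value >= target if direction == "higher" else value <= target
        status = "on track" if on_track else "needs attention"
        print(f"{name}: {value} (target {target}, {status})")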

We need to decide who we are collecting data for, what we want to communicate, and for what purpose. Without these initial answers, we cannot move forward with data-driven projects in a sophisticated and meaningful way.

This column is based on the talk I gave at the 2025 People-Centred Justice Workshop in Vancouver in May on a panel titled “Measurement, Methods & Change in People-Centred Justice.” Thank you to the organizers for letting me share my thoughts on the subject and for hosting such a thought-provoking event.
