Column

A Digital Wolf in Sheep’s Clothing: How Artificial Intelligence Is Set to Worsen the Access to Justice Crisis

The increased presence of artificial intelligence (AI) in the legal sphere has been a controversial topic in discussions about access to justice. While some claim that AI will act as a great equalizer — ushering in a new age of AI-powered legal assistance for those who most need it — its rise to prominence in the legal field instead seems to be catalyzing the opposite reality.[1] As private AI companies increasingly dominate the practice of law, we are beginning to enter an era of unchecked digital capitalism, one in which the legal field’s most underserved participants are increasingly falling behind the technological curve.

Unequal Access to AI Services

The average person’s exposure to AI is not an accurate representation of the kind of technology being used in professional spheres. While a lawyer using a popular AI chatbot is not unheard of (one need look no further than the B.C. lawyer who cited fake cases generated by ChatGPT in a 2023 divorce case), the kind of legal AI largely used by firms is much more sophisticated, much more efficient, and much more expensive than any publicly available service.[2]

Private legal AI tools that increase speed, efficiency, and accuracy in areas such as e-discovery, legal research, and document automation are increasingly being used by big firms.[3] These tools can perform repetitive, time-consuming tasks accurately and in a fraction of the time taken by humans.[4] As this technology continues to advance, AI tools that can do everything from answering routine legal inquiries to generating entire legal arguments are said to be the future standard of legal practice, and lawyers have already reported using tools offering some of these capabilities.[5] By effectively replacing many of the tasks traditionally performed by junior lawyers and expediting timelines, legal AI allows professionals to focus their expertise on making the strongest arguments possible — giving them an undeniable edge over those who spend human labour on now-automatable tasks.[6] A catchphrase is reportedly making the rounds in the professional community: “AI won’t replace lawyers, but lawyers who use AI will replace lawyers who don’t.”[7]

For big firms and those with the money to hire these firms’ attorneys, these advancements are nothing short of revolutionary. AI’s positioning in the private sector, however, makes it a money game — those who can’t afford legal representation from these firms are unable to benefit from this innovation. While certain public sector organizations and nonprofits offer simple AI tools to increase access to legal information, these tools are nowhere near as sophisticated as those seen in the professional world.[8] The digital divide is widening at an unprecedented rate, with concerning implications for people who self-represent and for access to justice as a whole.

The Digital Divide and Implications for Self-Represented Litigants

The tech companies at the forefront of legal AI development are not creating services with the aim of increasing access to justice. Rather, they are interested in making a profit by selling expensive products to the wealthy actors willing to pay for them.[9] As a result, those who can’t pay to play are largely excluded from the fruits of these technological advancements. This inherently problematic structuring of access to technology is an example of digital capitalism, which has been defined as “the introduction of profit-driven digital processes of outsourcing, automatisation, dispersion, and commodification in the practice of law.”[10]

Big firms’ adoption of new AI tools might initially appear innocuous, but the steadily increasing presence of private tech companies in the public legal sphere will have systemic impacts.[11] As technology advances at breakneck speed, and firms become desperate to edge out their competitors with the latest tools, the ability to offer the best legal services will no longer be a simple matter of how good a firm’s attorneys are: it will instead become a game of who can access the best and brightest technology.

This commodification might cause lawyers to move away from being defenders of the public interest and instead become “one-dimensional businessmen of law” who value the protection and expansion of their markets above all else.[12] As private competition increases, the legal profession threatens to become increasingly fixated on optimizing profit rather than upholding the traditionally valued principles of justice and fairness, which could have worrying implications for the preservation of law as a public good.[13]

The threat of the legal field tipping further into a profit-driven structure is of particular relevance to Canadians who self-represent. The self-represented litigant (SRL) phenomenon is largely attributed to an inability to access justice due to the significant costs of obtaining consistent legal representation. Neither SRLs themselves nor the legal aid clinics for which they sometimes qualify are generally able to access the expensive AI tools increasingly used by big firms.[14]

As AI continues to infiltrate the legal field, SRLs will increasingly face opposing counsel with access to highly advanced technology, without being able to use resources anywhere near as sophisticated. Lawyers, who even without AI have access to a digital resource base that far surpasses what is available to SRLs, will now have an immense leg up on an already struggling demographic. The class division that already pervades the Canadian legal system is being sharply exacerbated, and the most disenfranchised members of society will suffer for it.

While a digital divide in the legal field is not a novel issue, the sharp jump in the sophistication and efficiency of private digital services credited to AI technology should be a wake-up call about its urgency. It may have been possible to ignore the clear inequalities in digital access to justice a few years ago, but as the era of AI makes the difference in quality between the services accessible to wealthy actors and those available to everyday people increasingly stark, it will be much more difficult to turn a blind eye to the fact that any semblance of fairness is collapsing before us.

What To Do?

The inaccessibility of legal AI services can ultimately be attributed to our tendency toward profit-driven technology and innovation. As long as AI remains largely unregulated in the private sphere, the risk of continued commodification of legal practice will remain. To promote equitable outcomes for SRLs and commit to true access to justice, public regulation of AI will likely be necessary in some capacity.

Attempting to bring AI under full public purview would be a deeply political undertaking that would (perhaps rightly) raise concerns about censorship and surveillance. The possibility of strict public regulation is, however, a conversation worth having if we wish to meaningfully protect the rule of law.[15] This may be the only solution that targets the root issue of AI currently existing within a free-market capitalist structure, although specific recommendations on how such a shift might be achieved are beyond the scope of this article.

Solutions that work within the current organization of the private sector might be more practical in the short-term. Providing government subsidies, grants, or other financial incentives could encourage tech companies to partner with community legal organizations to develop tools for underserved litigants.[16]

Beyond policy suggestions, there must be a collaborative effort from the broader legal community to develop a public, open-source legal AI tool specifically intended to increase access to justice. Queen’s OpenJustice program currently shows promise as an open-source tool accessible to lawyers across the country, but further training appears to be required to make it beneficial for non-lawyers as well.[17] More recently, Harvard University’s Library Innovation Lab announced the launch of the Open Legal AI Workbench (OLAW): a common AI framework on which researchers can collaborate to prototype AI tools intended to increase access to justice for low-income people and underrepresented litigants.[18]

In general, keeping the impacts of AI at the forefront of access to justice work must become a top priority for the legal community. Those who stay on the sidelines because they feel they do not know enough about AI, however understandable that apprehension may be, will ultimately hamper the large-scale movement required to ensure that SRLs are given a fighting chance.

There must be a broad commitment to increase advocacy about this issue, alongside a recognition that the time to act is right now. Like it or not, AI is transforming the world in ways we are just beginning to understand. The choice is ours: ignore it and risk underserved communities being left behind, or meet it head-on with all that we have.

______________________

[1] Mary-Frances Murphy and Chris Owen. “Virtual Justice? Exploring AI’s Impact on Legal Accessibility.” Norton Rose Fulbright, November 2023; Zena Olijnyk. “ChatGPT May Improve Access to Justice, but Won’t Replace Lawyers: Law Commission of Ontario Webinar.” Law Times, March 15, 2023.

[2] Jason Proctor. “B.C. Lawyer Reprimanded for Citing Fake Cases Invented by ChatGPT.” CBC, February 26, 2024.

[3] Amy Salyzyn. “AI and Legal Ethics” in Florian Martin-Bariteau & Teresa Scassa, eds., Artificial Intelligence and the Law in Canada (Toronto: LexisNexis Canada, 2021), ch. 12, 6-8; “What Is AI and How Can Law Firms Use It?” Clio, n.d.

[4] Ibid.

[5] “Generative AI: A Guide for Corporate Legal.” Deloitte Legal, June 2023; Amy Salyzyn. “AI and Legal Ethics 2.0: Continuing the Conversation in a Post-ChatGPT World,” September 28, 2023.

[6] Jodie Cook. “Will AI Replace Lawyers? Entrepreneurs Share Their Predictions.” Forbes, February 15, 2024.

[7] Susan McGee. “Generative AI and the Law,” 2023.

[8] See the B.C. nonprofit tool “Beagle+,” n.d.; Robert Lim. “OpenJustice Takes Legal AI to the Next Level: Now Available to North American Academic Institutions,” October 16, 2023.

[9] Hassan Kanu. “Artificial Intelligence Poised to Hinder, Not Help, Access to Justice.” Reuters, April 25, 2023.

[10] Salvatore Caserta and Mikael Rask Madsen. “The Legal Profession in the Era of Digital Capitalism: Disruption or New Dawn?,” Laws 8, no. 1 (March 1, 2019): 1; Sebastian Rosengrün. “Why AI Is a Threat to the Rule of Law,” Digital Society 1, no. 2 (September 1, 2022).

[11] Caserta and Madsen, “The Legal Profession in the Era of Digital Capitalism,” 5.

[12] Ibid., 23.

[13] Ibid., 12.

[14] “If You Still Think That SRLs Are Excited to Be Trying out as ‘Lawyers’, Watch This New VLOG.” National Self-Represented Litigants Project, April 21, 2015; Julie Sobowale. “How Law Students Are Learning about AI.” National Magazine, The Canadian Bar Association, March 11, 2024.

[15] Rosengrün. “Why AI Is a Threat to the Rule of Law,” 2022.

[16] “Bridging the Gap: Unraveling the Digital Divide.” Government of Canada, October 4, 2023; “Closing the Digital Divide: How to Maximize Public-Private Partnerships Between Communities and ISPs to Get It Done.” Vetro, July 18, 2023.

[17] Samuel Dahan, Rohan Bhambhoria, David Liang, and Xiaodan Zhu. “OpenJustice.Ai: A Global Open-Source Legal Language Model,” SSRN Electronic Journal, 2023, 3.

[18] Matteo Cargnelutti and Jack Cushman. “Cracking the Justice Barrier: Announcing the Open Legal AI Workbench,” March 8, 2024.
