Ontario’s justice system is fast approaching a digital crossroads.
New technologies, including algorithms, automated decision-making and artificial intelligence (AI), are set to challenge our long-standing assumptions and practices regarding human rights, due process and access to justice.
How well do justice system professionals understand these technologies? What are the broad legal implications of adopting AI in the justice system? How can or should the justice system regulate these challenges?
These questions, and more, will be addressed in a free half-day educational event on Wednesday, May 15th, presented by the Law Commission of Ontario, Osgoode Hall Law School and Element AI, a leading Canadian AI developer.
Think of this event as “AI 101 for Lawyers.” The session will be a quick and easy primer on AI and its potential impact on the justice system. Topics will include:
- How does automated decision-making work?
- What could automated decision-making mean for legal decision-making?
- What issues should legal professionals be aware of?
- What are the emerging ideas and strategies for addressing automated decision-making in the justice system, including “ethically aligned design”, litigation, contracts and statutes?
We can’t wait to think about these questions. Across the world, there has been significant growth in the use of automated decision-making in a surprisingly broad range of legal contexts, including criminal law, access to government benefits, education, child welfare, taxation, small claims, immigration and refugee determinations and elsewhere.
For example, dozens of jurisdictions in the United States are using “predictive policing” systems and/or automated decision-making tools to assist decision-making in bail, sentencing and parole. Similar systems are being developed by police services and governments across Canada.
Early experience with automated decision-making in the justice system is decidedly mixed. Automated decision-making in some contexts has improved access to justice, reduced costs, and promoted speed, efficiency and consistency in decision-making. Unfortunately, experience also demonstrates these technologies can be opaque, inexplicable, and discriminatory.
The legal and technology communities are increasingly focussed on how these systems can be developed and governed in a manner consistent with access to justice, respect for human rights and due process. These are challenging questions, particularly in light of rapidly changing technology. Consider the following questions:
- Disclosure: Litigants or parties should have a right to know if a legal decision was made by, or aided by, an automated system. But what needs to be disclosed? Simply the existence of the system? Or does disclosure include the data at the heart of algorithmic decision-making? What about the policies used to design the system, the system’s software or source code?
- Discrimination: Experience demonstrates that these systems have the potential to perpetuate or worsen biased decision-making in the justice system. How can we ensure automated decision-making complies with human rights law and principles? How can we ensure that AI systems prevent “data discrimination”? Can systems be tailored to comply with local, provincial or even national laws?
- Due Process: Automated decision-making systems must respect due process at both a systemic and individual decision-making level. How do we ensure respect for basic due process rights, including fairness, notice, evidentiary rules, a right to a hearing, reasons and appeals?
This event will help legal professionals understand automated decision-making and the related issues and implications for our justice system. We will also highlight several of the many Canadian and international initiatives, strategies, voluntary guidelines, and law reform efforts dedicated to addressing these issues in whole or in part.
The event will be held on Wednesday, May 15, 2019 at Osgoode Hall Law School, York University from 1pm to 4pm. The event is free and will be webcast simultaneously. Registration is through the Law Commission of Ontario website.
This initiative is part of the Law Commission’s multi-year Digital Rights Project, funded in part by the Law Foundation of Ontario. Readers interested in learning more about AI and criminal justice issues can link to the Law Commission’s recent Roundtable on Algorithms in the Criminal Justice System.
— Nye Thomas, Executive Director,
Law Commission of Ontario