The Rise of Predictive Programming in the Law

As technology advances, new programs are being created to predict the outcomes of legal cases. At their core, these programs are algorithms.

These algorithms generate predictions by comparing the facts entered by a user against the facts of previously decided cases. The quality of a prediction is therefore only as good as the data supplied by both the user and the programmer.
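To make the dependence on input quality concrete, here is a minimal, hypothetical sketch of how such a program might work: a toy classifier that predicts an outcome by majority vote among the most similar past cases. All case data, fact features, and outcome labels below are invented for illustration and do not reflect any real product.

```python
# Hypothetical sketch of a "predictive legal program": a new case,
# encoded as yes/no fact features, is compared against previously
# decided cases, and the majority outcome of the closest matches
# becomes the prediction. All data here is invented.

def shared_facts(a, b):
    """Count how many fact-features two cases have in common."""
    return sum(1 for x, y in zip(a, b) if x == y)

def predict(past_cases, new_facts, k=3):
    """Return the majority outcome among the k most similar past cases."""
    ranked = sorted(past_cases,
                    key=lambda c: shared_facts(c["facts"], new_facts),
                    reverse=True)
    top = ranked[:k]
    plaintiff_wins = sum(1 for c in top if c["outcome"] == "plaintiff")
    return "plaintiff" if plaintiff_wins > k / 2 else "defendant"

# Invented precedents: facts = (written contract?, breach admitted?, damages proven?)
past = [
    {"facts": (1, 1, 1), "outcome": "plaintiff"},
    {"facts": (1, 1, 0), "outcome": "plaintiff"},
    {"facts": (1, 0, 1), "outcome": "plaintiff"},
    {"facts": (0, 0, 0), "outcome": "defendant"},
    {"facts": (0, 0, 1), "outcome": "defendant"},
    {"facts": (0, 1, 0), "outcome": "defendant"},
]

print(predict(past, (1, 1, 1)))  # prints "plaintiff"
# A user who misreads the questions and enters the opposite facts
# gets the opposite prediction from the same program: garbage in,
# garbage out.
print(predict(past, (0, 0, 0)))  # prints "defendant"
```

The point of the sketch is that the program never checks whether the entered facts are accurate or legally relevant; it only measures similarity to whatever past cases the programmer chose to include.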

These programs should, however, come with a warning. In “Weaponized Lies: How to Think Critically in the Post-Truth Era,” Professor Daniel Levitin writes:

GIGO is a famous saying coined by early computer scientists: garbage in, garbage out. At the time, people would blindly put their trust into anything a computer output indicated because the output had the illusion of precision and certainty. If a statistic is composed of a series of poorly defined measures, guesses, misunderstandings, oversimplifications, mismeasurements, or flawed estimates, the resulting conclusion will be flawed.

Although I am hopeful about the potential of predictive legal programs, I am concerned about their unchecked proliferation.

Several questions arise. Given that the technology industry is male-dominated, will these programs also be designed primarily by men, and how will that affect their accuracy? How accessible will these programs be, both financially and to people with disabilities? How will lay people know which facts are relevant to enter? How will programmers know which facts are relevant to encode? What role will law societies play in regulating these programs and ensuring their quality? And what role will their predictions play in submissions to the court?

(Views are my own and do not represent the views of any organization.)


  1. For more detail on these issues in criminal justice, see my Slaw note from earlier in 2017: