Thankfully I can begin by reporting that the statement above is not true. Sam Glover over at the Lawyerist (a blog he created in 2007 so he could “rant about bad legal software”) had a wonderful conversation with Ed Walters. Walters, in addition to being the CEO of Fastcase, is an adjunct professor at Georgetown Law where he’s recently been teaching a seminar called the Law of Robots. Glover chats with Walters about “Robot Lawyers and the Law of Robots” and “technology’s influence on the future of law.”*
Although Isaac Asimov’s short story series “I, Robot” does come up, this is no longer just the stuff of speculative fiction. As Walters notes, we are surrounded by “robots and autonomous systems every day.” Robotic arms have built our cars, and autonomous, unmanned vehicles will soon be picking us up and driving us around. “Computers fly airplanes, play Jeopardy, … trade stocks, and fight in wars.” And, as mentioned many times on this screen, technology, specifically technology that employs machine learning algorithms, is changing the way we do everything, including the practice of law.
But some feel that lawyering is somehow different. And as Glover mentions, you’ll still find “lawyers in Star Trek.” Walters sees this as a faulty and naïve way to view the legal profession. He goes so far as to suggest that if the work you do is primarily form-based, falling toward the commoditized end of Richard Susskind’s legal practice spectrum, then it is “probably time to start finding a new speciality.”
It’s been predicted that by 2026 the average desktop computer (if we still have such a thing) will have the computing power of the human brain. Walters says the current cognitive computing star, IBM Watson, is operating at a level equivalent to a rat’s brain. And neither Glover nor Walters is particularly happy with the term “artificial intelligence.” They prefer to think of these advanced technologies in terms of raw computing power, which results, if anything, in systems that currently possess a “clumsy” form of intelligence.
Walters goes on to ask: who is drafting the law of robotics? Who makes or monitors the algorithmic decisions embedded in these autonomous systems? The answer? Well, almost nobody: “there’s a handful of lawyers who deal with this and a handful of law professors who are thinking about it, it’s a pretty small club.” As it stands, the law in this area is essentially reactive, responding to “terrible events” or to regulations proposed by lobbyists with “some sort of axe to grind.”
At one point the conversation takes a very interesting shift. They discuss the effect of laws and regulations embedded directly into programming code, where the machine won’t let you do something wrong. But then, what if it’s the machine that does something wrong? Can you hold a machine criminally responsible? If you listen to nothing else, I encourage you to review the section that begins at about 32:15 in the podcast.
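To make the idea of law embedded directly in code a little more concrete, here is a minimal sketch. Everything in it is invented for illustration (the function name, the speed limit, the very notion of a speed-limited controller): the point is simply that the rule is enforced by the program itself, so the prohibited action cannot happen, rather than being punished after the fact.

```python
# Hypothetical illustration of "law as code": the constraint lives inside
# the controller, so the machine "won't let you do something wrong."

SPEED_LIMIT_MPH = 65  # an invented statutory limit, for this sketch only

def set_target_speed(requested_mph: float) -> float:
    """Clamp a requested speed to the legal maximum.

    Instead of ticketing a violation after the fact (reactive law),
    the rule is embedded in the software itself (proactive law).
    """
    return min(requested_mph, SPEED_LIMIT_MPH)

print(set_target_speed(80))  # request to speed is silently capped -> 65
print(set_target_speed(55))  # a legal request passes through -> 55
```

The hard questions in the podcast start exactly where this toy example ends: who writes that constant, and who answers for it when the clamped behavior itself causes harm?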
Who, for example, is responsible when there’s a fatal accident involving two autonomous vehicles and a pedestrian? Glover asks: “How do the cars and the [pedestrian’s] smart phone decide who dies? Because, that’s essentially the decision they’ll be making, right?” That’s “some pretty sophisticated embedding of morality and law.” Will these decisions be made by a software engineer at Google? Or will these decisions be written into statute, so that these cars, by default, react in a way that saves the most lives possible?
But Walters also wonders whether you might begin by buying the inexpensive default car and then later upgrade, purchasing a “freemium add-on” that changes the default to prefer saving your own life over others. Or maybe a software company, in an effort to maximize shareholder value, will decide to save the person who has clicked on the most ads for that particular company … What?!
In terms of law practice, Walters doesn’t think anyone is going to miss the kind of legal work that will eventually be displaced. If anything, these changes will “empower lawyers” and improve access to the law by reducing the costs of legal services and creating a “bigger pie.”
It’s a thought-provoking discussion and a great way to begin this necessary conversation.
* The interview with Ed Walters starts at 11:45.