Could Mediators Be Replaced? Examining Fears About the Role of AI in Collaboration & the Future of the Mediation Profession
My mother instilled in me the mentality of not going along with what everyone else is doing. It is a fairly common parental lesson, really, typically presented in the context of all of one’s friends jumping off a cliff, with the question posed of whether you would, too. While I wonder why everyone is jumping – there might be a good reason, and that information is needed to determine the best course of action – Wikipedia explains the lesson as “a rhetorical question used to discourage the interlocutor from blind obedience”.
There are no fewer than 20 articles already on SLAW about artificial intelligence (AI). I am not contributing to the topic simply because everyone else has. Actually, I was hesitant to, in light of the volume of other contributions. Then I was encouraged to, following a session I recently presented with Susan Guthrie and Colm Brannigan at the ADR Institute of Canada’s 49th Annual Conference, where we focused on the opportunities and challenges of AI in mediation. Upon being asked to share my talk, I decided to buck my mother’s advice…
In October 2016, Drake released the song Fake Love. It was the lead single off the 2017 album More Life. On the track, Drizzy samples and sings part of Back Stabbers, a 1972 song by the O’Jays. The well-known lines in the song nicely sum up many of the fears expressed about the emergence of ChatGPT and the prospect of AI getting involved in addressing conflict. Consider the following as a reflection on our robot friends:
They smile in my face
Whole time they wanna take my place
Whole time they wanna take my place
Whole time they wanna take my place
I joke that the friendly computers we have come to rely upon to keep our refrigerators stocked and get us to our destination are actually out to get us. Want an example? Google news stories about people following their GPS directions and driving into bodies of water. Michael Scott does just that in a classic scene from The Office, and it is funny because it is real.
My cousin, Radhika Bhalla, is an organizational psychologist in South Delhi. In her work, Radhika has explored both the significance of human contact in interactions (such as in offering empathy) and the tendency some have to place unjustified trust in the machines that support us. We mentally check out and thoughtlessly obey. Some of us have even developed a cognitive bias that is the opposite of reactive devaluation: we grant unwarranted authority to what computers tell us. As Michael Scott says as he drives into the lake: “The machine knows!”
When it comes to mediation, there are three areas of practical implementation of AI to consider. They offer a spectrum of concern for today’s mediation practitioners, ranging from the comfort of added efficiency to full-fledged freaking out about what Drake warned us about.
Administrative Support
Many mediators already use online availability calendars and intake forms. It is not hard to envision AI supporting mediators further. Automated conflict searches, agreement preparation and invoice processing are all conceivable and would appear helpful. Reducing the administrative functions mediators must spend their time on would free them up to focus more on their substantive work. The threat level here is directed more toward administrative staff – the human beings who assist mediators with scheduling, formal appointments and bookkeeping. While I worry these administrative jobs may disappear, prominent Nova Scotia lawyer Devon Cassidy views AI as having the potential to better support legal assistants in the future, automating the more basic functions so they can spend their time interacting with clients and applying themselves to more mindful tasks.
Evaluation
In the course of a mediation, AI could help a mediator maintain their neutrality when offering evaluations. Consider a mediator tasked with suggesting likely outcomes if the matter were to proceed to court or arbitration. Particularly if a bubble needs to be burst in the course of a reality check, a party may feel the mediator is working against them if the mediator offers a bleak outlook.
Provided the data source is transparent, accurate and up-to-date, a computerized evaluation could separate the predictive analysis from the mediator. This could be seen as assisting mediators as they do their work. I cannot overstate how critical the integrity of the data underlying the analysis is, but it is not difficult to envision how a reliable set of source data could offer very useful guidance at mediation. Related concerns surround mediators being valued less if their subject matter expertise is no longer needed or, worse, if they are seen as not being needed at all.
Empathy
Now, this is where we really get to freak mediators out!
Whether online or in-person, in joint session or caucus, many mediators see their irreplaceable value as the human factor they bring to their work. The real skills (formerly known as soft skills) they possess. The mediator offers empathy, gets to the underlying interests and offers those engaged in conflict the chance to feel heard. Can a computer really do all that?
The premise of the 1980s family sitcom Small Wonder involved a robot portrayed as a family’s adopted daughter. Hilarity ensues through the family’s attempts to present the robot girl as human, including a never-empty well of humour derived from literal interpretations of common sayings. What is not funny to many mediators is the suggestion that a machine could ever pick up on the emotional cues of their clients. Yet developments in software include progress in AI detecting emotion. Morphcast is an available Zoom plug-in. Consider how heart rate and blood pressure monitors can identify emotional responses that are not visibly obvious. Writing aids can identify inflammatory language and suggest alternate ways to make a point. To suggest AI offers no empathy is mistaken.
When we fully consider how AI can aid mediation, and how the technology continues to improve, the fear that mediators could one day be out of a job feels warranted. I prefer the view that human beings will always be integral to the mediation process – that the cultural competency required to understand the human dynamic will never be fully replicated by an algorithm or an android. To me, it is more realistic to anticipate both mediators and their clients being better supported by the technology. The question that remains, then, is… will you join us in the jump to embracing AI? All your friends have already!