Column

Make Lawyers Smarter, Not Dumber, or Worse, With AI

This might be one of the more important things you read. Its purpose is to direct you to a submission to the Law Society of NSW Future of Law & Innovation in the Profession (FLIP) Commission of Inquiry. Robyn Bradey is a mental health consultant to the Law Society, NSW Legal Aid and other organisations. Her submission is among the many excellent video submissions to the Inquiry. This one should be compulsory viewing for every lawyer and their management teams.

Robyn is a testament to the benefits of a culture of diversity, and to the foresight of those driving FLIP, which also saw speakers from a Magic Circle firm, management consultants and corporate Australia present on the themes of “Diversity, New Processes & Managing Change”.

First impressions would suggest that Robyn might not be the typical recruit to a corporate-focussed #OldLaw firm. However, her submission to the Inquiry showcased the benefits of a culture of diverse thinking and skills.

What got her involved with lawyers was the fact that the suicide rate among lawyers is higher than in any other profession. The suicide rate for law students is now higher than that for medical students. One in three lawyers experiences mental health problems, compared with one in five in the general population.

Robyn suggested several possible causes. One is the growth in vicarious trauma, most obvious among those handling the recent deluge of child abuse, domestic violence and war crimes prosecutions. Less obvious, but also toxic, is the adversarialism of the legal system itself. From family law to high-end corporate litigation, increased client pressures and the breakdown of traditional professional courtesies could contribute to a downward spiral in not just the well-being, but also the reputation, of lawyers as a whole.

She observed that some of the poor press lawyers get over client relations could be attributed to their response to vicarious trauma: the need to permanently turn down the empathetic side of their nature just so they can cope.

Hence Adelaide tax law specialist Adrian Cartland, who developed AI software called Ailira for research in his own practice, is enhancing its chatbot capabilities to act as “trauma insulation” between clients and the lawyer. Adrian was awarded a government grant to explore Ailira’s use by clients giving instructions involving domestic violence. Apart from saving lawyers’ time, it is intended to protect lawyers from too many distressing client conversations. Adrian points out that it also works the other way: from the client’s perspective, Ailira will assist them without the client feeling that their personal behaviour or worth is being judged. AI and chatbots listen dispassionately.
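
To make the “trauma insulation” idea concrete, here is a minimal, hypothetical sketch of a scripted intake chatbot that gathers a client’s instructions neutrally and hands the lawyer a structured brief instead of the raw conversation. It is not Ailira’s actual design; every name in it (the script, fields and functions) is invented for illustration.

```python
# Hypothetical "trauma insulation" intake: a scripted chatbot collects answers
# neutrally, then produces a factual brief for the lawyer to review, so the
# lawyer does not sit through the distressing exchange itself.
# (Not Ailira's actual design; all names below are invented.)

from dataclasses import dataclass, field

@dataclass
class IntakeSummary:
    matter_type: str
    answers: dict = field(default_factory=dict)

# Scripted, non-judgemental questions for one matter type.
INTAKE_SCRIPT = {
    "domestic_violence": [
        ("safety", "Are you currently in a safe place? (yes/no)"),
        ("children", "Are any children involved? (yes/no)"),
        ("orders", "Do any court orders already exist? (yes/no/unsure)"),
        ("outcome", "Briefly, what outcome are you hoping for?"),
    ]
}

def run_intake(matter_type: str, ask=input) -> IntakeSummary:
    """Walk the client through the script and record each answer verbatim."""
    summary = IntakeSummary(matter_type=matter_type)
    for key, question in INTAKE_SCRIPT[matter_type]:
        summary.answers[key] = ask(question + " ").strip()
    return summary

def brief_for_lawyer(summary: IntakeSummary) -> str:
    """Condense the intake into the factual brief the lawyer actually reviews."""
    lines = [f"Matter type: {summary.matter_type}"]
    lines += [f"- {k}: {v}" for k, v in summary.answers.items()]
    return "\n".join(lines)

if __name__ == "__main__":
    # Canned answers stand in for a live chat session.
    canned = iter(["yes", "yes", "unsure", "safe housing and a protection order"])
    summary = run_intake("domestic_violence", ask=lambda q: next(canned))
    print(brief_for_lawyer(summary))
```

The lawyer then works from the brief and decides how to engage, rather than absorbing every distressing detail first-hand.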

But there are other, more subtle causes of stress in lawyers that could also be helped by smart IT. Legal service visionary Noric Dilanchian, who made an earlier submission to FLIP, pointed out to me that even routine legal practice is stressful because of the gap between what one expected from a career in law as a law student and the reality. Good software and new methods can make legal practice more enjoyable.

Depression can start as boredom or anxiety, which means the brain is not developing, says Robyn. The result is that many lawyers are getting “dumber”. This is another positive argument for lawyers to broaden their skills. It is, however, a balancing act, as more change is not so good for those already overburdened by it, such as the change brought on by government policy shifts and budget cuts in Legal Aid.

So, as the brain evidently likes to learn, there may be a role for IT here not just as an enabler, but as a stimulating challenge in its own right.

The year 2016 is likely to be recognised as the beginning of the beginning when it comes to real change in the legal services business, with IT as an essential enabler. Acceptance of the inevitability of change is now spreading quickly. It has been reported that 25 of the Am Law 100 firms have embraced AI (whatever that means). The downside is that this risks stressing lawyers who see it as a threat.

Firms could be repeating the “mistake” we made in introducing sophisticated courtroom evidence display systems in the mid-1990s.

I was reminded of our experience from that time when Karl Chapman of Riverview recently observed:

“Like all technology the key is not to be taken in and be excited by the hype, but to ask: ‘What problem are we trying to solve?’ Having answered this question it’s possible to have a sensible conversation about the role that tech generally and AI specifically can play in the solution. We advise people to get the foundations right first – data layer, workflow and processes, reporting requirements … before trying AI solutions. There is a danger that jumping straight to AI can be like building a roof having not put the foundations, walls and joists in first. Getting the foundations right ensures that a function is AI-enabled.”
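
Chapman’s ordering can be made concrete with a toy sketch: a structured data layer and a basic reporting function exist first, and only then is there clean input for an “AI” step to consume. This is my own illustration of that assumption, not any firm’s actual stack, and all names in it are invented.

```python
# Toy illustration of "foundations first": structured matter records and a
# reporting function exist before any AI layer is bolted on.
# (My own sketch, not Riverview's or any firm's actual system; names invented.)

from dataclasses import dataclass

@dataclass
class Matter:
    matter_id: str
    practice_area: str
    stage: str          # workflow state, e.g. "intake", "advice", "closed"
    hours_logged: float

# Data layer: structured records rather than documents scattered on a file share.
MATTERS = [
    Matter("M-001", "family", "advice", 12.5),
    Matter("M-002", "property", "closed", 3.0),
    Matter("M-003", "family", "intake", 1.0),
]

# Reporting requirement: answerable only because the data layer is structured.
def hours_by_practice_area(matters):
    totals = {}
    for m in matters:
        totals[m.practice_area] = totals.get(m.practice_area, 0.0) + m.hours_logged
    return totals

# Only once the above exists does a smarter step have clean input to work with,
# e.g. flagging matters stuck at intake for triage.
def flag_stalled_intakes(matters):
    return [m.matter_id for m in matters if m.stage == "intake"]

if __name__ == "__main__":
    print(hours_by_practice_area(MATTERS))   # {'family': 13.5, 'property': 3.0}
    print(flag_stalled_intakes(MATTERS))     # ['M-003']
```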

My own experience in the 1990s with knowledge management, evidence and court display systems led me to conclude that, while it was not essential to understand databases, visuals and images to use the systems, users could achieve more if they did. The same applies to AI. While it is tempting to introduce the big green AI button, lawyers need to understand why it is being introduced and what is involved. Human foundations underpin all long-term successful IT projects.

In response to the question of what problem IT solves, I would say that in some instances the need for AI arises because lawyers did not understand, and routinely use, databases. Later, to their detriment, the data becomes a ‘big data’ management problem. An example is the need for data extraction software because no one had the foresight to put contracts in a database at the time of creation.
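
As a hedged illustration of that point: if a few key fields are captured in a database when each contract is created, later questions become simple queries rather than a text-extraction project over thousands of documents. The schema and fields below are invented for the example.

```python
# Hypothetical sketch: capture key contract fields at creation time, so later
# questions are database queries rather than a data-extraction exercise over PDFs.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE contracts (
        id INTEGER PRIMARY KEY,
        counterparty TEXT,
        expiry_date TEXT,            -- ISO date, e.g. '2017-06-30'
        change_of_control INTEGER    -- 1 if the clause is present
    )
""")
conn.executemany(
    "INSERT INTO contracts (counterparty, expiry_date, change_of_control) VALUES (?, ?, ?)",
    [
        ("Acme Pty Ltd", "2017-03-31", 1),
        ("Globex Ltd",   "2018-01-15", 0),
        ("Initech Inc",  "2017-02-28", 1),
    ],
)

# "Which contracts expire before mid-2017 and carry a change-of-control clause?"
rows = conn.execute(
    "SELECT counterparty, expiry_date FROM contracts "
    "WHERE expiry_date < '2017-07-01' AND change_of_control = 1 "
    "ORDER BY expiry_date"
).fetchall()
print(rows)  # the contracts needing attention, with no text extraction required
```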

Future AI-powered firms will be able to break through to the elusive fourth dimension, beyond “Better, Faster, Cheaper”, to add “Profitable”. AI will give lawyers the luxury of tapping into the Latent Legal Market, a huge buffer of unmet legal needs. While AI decimates employment numbers in other industries, it could be very good news for lawyers – provided they are given the resources to adjust to the big changes ahead.

If handled properly, the process of learning about IT can be a positive experience in itself for lawyers. Pairing new lawyers with older lawyers in a mutual mentoring approach to training, trading IT skills for legal experience, has been shown to be a real win/win, with both career-starting and career-extending benefits. Positive change can be helped along with short, focussed meetings and greater use of Skype, or Apple’s appropriately named equivalent, FaceTime.

Maybe we must also look beyond #NewLaw and add new targets to #NextLaw’s emphasis on prevention. Mantras such as Ron Friedmann’s #DoLessLaw, and those that #SimplifyLaw, need to be taken more seriously. The short-term financial benefits of ignoring them are outweighed by the long-term financial and health benefits for lawyers, as well as huge societal benefits.

The legal system needs a new architecture: one that is, among other things, less adversarial and less wasteful. Legal service providers built from the ground up with AI and other advanced tech in mind will fare better than those trying to retrofit it onto legacy systems and attitudes; you can’t just add in a feature called “AI”. The same point has been made about self-driving cars:

“If you talk to the automakers, they all think that autonomy is a feature they’re going to add to their cars. The Silicon Valley companies think it’s a brand new architecture. It’s a bottom-up reinvention of the fundamental assumptions about how these things work.”

The legacy of AI could be that it not only augments the intelligence of lawyers by adding extra knowledge but, along with workflow management and other software, frees them from the shackles of some of the mind-shrinking, boring work. It would be even better if AI software were used to protect them from much of the distressing work that leads to vicarious trauma. A perceived weakness of empathy-free robots can be turned into a strength, allowing more lawyers to once again be seen as human.

Just as the future will see military robots on battlefields, software should be protecting lawyers from what is proving to be hazardous legal work. The best applications of AI in law will be preventive in nature. Solutions should not be limited to helping clients avoid or deal with legal problems, but should also be extended to reduce the collateral damage among legal advisers and others.

Comments

  1. David Collier-Brown

    In a related discussion, people automating things (installing computers, in this case) often try to make the automation replace humans. That then leads to programs no one understands, while the humans have forgotten how to do the job without the program.

    http://queue.acm.org/detail.cfm?id=2841313

    The authors suggest we should try very hard not to create an “Ultron”, and instead make programs more like Tony Stark’s flying suit.

    –dave