A Risky Appetite for Apps: Can Best Practices Help?
Apps are everywhere. A 2014 study found that there are roughly 18 million app users in Canada and that Canada's app enterprises generate $1.7 billion annually. These numbers have presumably only increased in the last two years.
The market for legal apps, in particular, is significant and growing. Research that I’ve done along with colleagues at the University of Ottawa estimates that there are now several dozen apps available in Canada that purport to help with law-related issues, and that number continues to climb. In the United States, the numbers are far larger: hundreds of legal apps are available.
The developers of such apps are diverse, ranging from large, well-established companies or government institutions to individual members of the public. The functions that such apps perform also vary widely: apps can help individuals find a lawyer, create their own legal documents, gather evidence against authorities (for example, record police encounters or track heating violations by landlords), do their own legal research or better understand court or tribunal processes.
There is palpable potential for such apps to help reduce some of the financial, psychological, informational and physical barriers that many people face in accessing justice. There are also, however, significant risks. One threshold concern is that issues of digital literacy and access to technology mean that the people who need innovative solutions the most may not be able to access them due to educational, financial or other access constraints.
For members of the public who do use legal apps, there are concerns relating to quality as well as privacy and security. For example, serious issues of informed consent arise when it comes to users understanding how the data collected by apps will be protected or used. Terms of service are purportedly one tool that can address informed consent, but there is a risk that such terms will not be clear or fully forthcoming. Research into the privacy policies and terms of service of fitness wearables, published by Open Effect in 2016, revealed that wearable companies have variable policies relating to access to, correction of, and deletion of wearable-related data, do not disclose how long they retain data, and fail to disclose with whom data is shared. There are also real questions, of course, about whether users actually read terms of service and know what they are agreeing to in the first place. This reality was captured dramatically in a study which found that 98% of participants agreed to fictional terms and conditions in order to access a fictional website, even though those terms included a clause requiring users to give up their first-born child as a form of payment.
There are equally significant reasons to believe that many legal app users won’t be in a position to sufficiently vet the quality of the information they are receiving through apps. In the health care context, research has shown that “most people fail to apply any criteria to assess the quality of Web-based information, and instead they trust that [the] source is credible” but that “most users are more influenced by the design and appeal of a website when determining its trustworthiness.”
Poor-quality information and insufficient protection of data can have serious impacts. As in the health context, there can be significant consequences in the legal context if wrong information is provided or if personal data is sold or maliciously compromised by hackers.
In the legal context, specifically, there are also important issues relating to whether users understand when they are receiving only “legal information” as opposed to “legal advice” and the consequences of this distinction. If only legal information is given and a lawyer-client relationship is not established, users will not benefit from the confidentiality protections and quality regulation that come with receiving a legal service.
To a significant extent, legal apps are unregulated. Exceptions include, as noted above, when legal apps deliver legal services and are thus regulated by law societies, or when apps fall under the umbrella of privacy legislation. But should there be more regulation to address risk? Perhaps not.
The American Bar Association’s recent Report on the Future of Legal Services in the United States observes that unnecessary regulation of entities that use new technologies and internet-based platforms to provide legal services directly to the public “could chill additional innovation, because potential entrants into the market may be less inclined to develop a new service if the regulatory regime is unduly restrictive or requires unnecessarily expensive forms of compliance.” The diversity of legal apps also creates challenges to regulation. It is not clear, for example, what jurisdiction law societies would have over apps that do not deliver legal services or otherwise engage in the practice of law.
In view of the above, it is worth considering whether best practices could be a useful route to addressing risk. Indeed, there is already some general guidance available for app developers in relation to privacy, security, and accessibility. There does not appear, however, to be any guidance specifically tailored to the development of legal apps for use by the public. This would seem to be a significant gap. In the health environment, the American Medical Association passed a resolution in 2015 calling for “the development and dissemination of best practices to guide the development and use of mobile medical apps.” Perhaps the Canadian Bar Association might consider the same? A more ambitious strategy could involve the development of best practices backed by a voluntary certification program that would allow app developers, where applicable, to signal their commitment to, and compliance with, best practices.
To be sure, there are outstanding questions about who is best placed to develop best practice guidance and to operate a certification program. Optimally, public and private developers would partner with relevant government institutions, non-profits and academic researchers to tackle the problem effectively, with the benefit of a variety of viewpoints.
In the meantime, the risks associated with legal apps are still out there. The question isn’t whether legal technology, like apps, will enter the legal marketplace and create change; it’s already happening. The operative questions now relate to how we might harness such technology to best serve the needs of the public. Facilitative measures such as best practice guidance and voluntary certification programs are worth considering as a potential means to contribute to this end.
(Our research team is in the midst of preparing our final report on our apps research. Any comments or feedback about the above or related issues are welcome. We will be presenting some of our research as part of the Cavanagh LLP Professionalism Speakers Series at the University of Ottawa on October 13, 2016, 11:30am-1:00pm. Everyone is welcome. 1.5 hours of CPD credit!)