Many factors impede the adoption of technology and hinder innovation in the legal sector — the reliance on billable hours, the complex structure of legal documents and processes, and the chronic underfunding of courts, among others. Part of what makes it challenging to address these issues is that there is no clear governing body capable of effecting systemic change. However, regulatory action could foster innovation in the legal industry by drawing a better distinction between what constitutes legal advice and legal information.
Advances in computing enable the presentation of legal information to researchers in more targeted ways. This brings us closer to automating the delivery of customized legal advice. Tools such as search algorithms and book indices have long helped users put information in context. Now, systems deliver information in the form of answers to questions or drafts of documents.
But the rules governing law practice across Canada impose restrictions on providing legal advice. Though they do not apply to the dissemination of legal information, it isn’t clear where the line between legal advice and information lies. That hampers the development of tools and services that could help alleviate strains on the justice system and create new ways for people to access legal information.
Meanwhile, widely available technologies like ChatGPT, which make it easier to automate functions once performed exclusively by humans, are forcing us to confront a new reality. The design of these tools and the global nature of the internet make it nearly impossible to enforce these restrictions.
Legal advice is broadly defined in legislation, granting regulators the discretion to determine its scope. It could cover a wide range of applications, from form-completion apps to fully tailored services. Katie Sykes, a law professor at Thompson Rivers University, observes that the distinctions between customized legal services and the functionality of automated systems are a matter of degree, not of kind. When a lawyer drafts a memo, they offer a specific and comprehensive response; content generated by an app tends to be less so, and a book may present the same information in a still more generalized context.
Matthew Oleynik, the CEO of rangefindr.ca, a legal information provider on sentencing, says that without well-defined boundaries, a broad interpretation would mean that a police officer warning drivers about potential speeding tickets is effectively offering legal advice. At the same time, other standard legal products, like will kits, are tolerated. The question is where we will draw the line for emerging applications and services.
The rationale for restricting legal advice is to safeguard the public. Until recently, developing legal tech tools primarily for professional users was a way to sidestep this concern. However, recent court practice directives on the use of AI in proceedings are designed to provide oversight of tools like ChatGPT, since those tools might affect lawyers' professional competence given the difficulty of verifying the accuracy of their output.
These tools are developed using established legal information and other content as training data, and they present that information in innovative and sometimes novel ways. While they are not flawless, we must acknowledge that other sources of information may also be incorrect. Restricting their use too heavily will hinder the deployment of some of the most promising technical developments before they can be refined. Oleynik argues that many individuals within the legal community want to innovate, but technical tools lack the user protections that lawyers are required to carry, such as professional insurance.
For now, many applications still clearly distinguish between the provision of information and the provision of advice. But the emergence of new applications and business models is blurring the line between them.
When asked about his views on what should be considered legal advice, Oleynik emphasizes the importance of ensuring that innovative systems don’t harm or disadvantage users. Once this threshold is met, he believes regulation should be minimal, with more stringent measures reserved for unreliable or unproven technology.
Arguably, the people most affected by this lack of clarity will be those who work in the legal industry but aren’t lawyers. This group includes legal technology start-ups as well as professionals such as librarians and court clerks, who operate at the intersection of the law and public service.
The risk of being off-side in offering legal advice may deter them from assisting people with their questions and issues, which they might otherwise be inclined to do.
Finding effective ways to navigate these challenges is imperative if the public and the legal sector are ever to benefit fully from emerging technologies, which have the potential to create considerable social and business value. In this regard, regulatory sandboxes hold a great deal of promise: they allow regulators to gather information while encouraging the development of innovative solutions to legal problems and safeguarding the public interest.