Thursday Thinkpiece: What Makes Court Forms Complex?

Periodically on Thursdays, we present a significant excerpt, usually from a recently published book or journal article. In every case the proper permissions have been obtained. If you are a publisher who would like to participate in this feature, please let us know via the site’s contact form.

What Makes Court Forms Complex? Studying Empirical Support for a Functional Literacy Approach
15:1 Journal of Law & Equality 31 (2019)

Amy Salyzyn, Associate Professor, Faculty of Common Law, University of Ottawa
Jacquelyn Burkell, Associate Professor, Faculty of Information and Media Studies, Western University
Emma Costain, Associate Lawyer, Nelligan O’Brien Payne
Brandon Piva, Associate Lawyer, Twining, Short and Haakonson, Barristers

Excerpt: Abstract, Part III: The Study (Methodology, Scenarios, Study Observations – Part 1), Part IV: Study Conclusions & Recommendations [Footnotes omitted. They can be found in the original via the link above]


Court-form complexity is a critical facet of the access-to-justice crisis in Canada. The issue is particularly pronounced when it comes to self-represented litigants (SRLs) without any formal legal training. Given that the problem of court-form complexity is widely acknowledged, it is surprising that there is very little research into exactly what makes court forms difficult for members of the public to use. This article describes the second of two empirical studies examining court-form complexity. The first study assessed the complexity of court forms using a quantitative rating tool grounded in a functional literacy approach. In the second study, which is discussed here, we examine complexity using direct feedback from individuals, similar to SRLs in their lack of legal training, who were attempting to complete a standard court form. This second study substantially affirms the results of the first study, confirming that an instrument grounded in a functional literacy approach such as the rating tool is a promising way to rigorously study court-form complexity. We also learned, however, that testing using human subjects can flag additional issues of complexity. We conclude that using a functional literacy approach, coupled with robust human testing, generates useful insights about court-form complexity that will facilitate the redesign of court forms and supporting materials to better meet the needs of SRLs. The broad recommendations generated from both studies were similar and include form design changes (limiting the use of acronyms, developing a more comprehensive glossary of terms), the use of online forms with just-in-time information delivery, and the deployment of innovative service models that offer form-completion support to SRLs.


A. Methodology

A total of twenty individuals participated in the study, recruited through posters placed on the campus of a large Canadian university. Participants were required to be undergraduate students studying in any faculty other than law. Those with significant legal training—such as lawyers and paralegals—were excluded. Additionally, participants had to be fluent in English. Though the educational level of our participants could limit the generalizability of our results, we do not consider this to be a serious concern. According to Macfarlane and colleagues, two-thirds of SRLs in Canada have university, professional, or college training, and, thus, the majority of Canadian SRLs are similar to our participants in terms of educational attainment. The other one-third of Canadian SRLs have a lower level of educational attainment and are likely to have even greater levels of difficulty with the forms. We also recognize that our participants, who are not “real” litigants, have no real stake in the hypothetical legal situation and, thus, may take less care and attention in completing the forms than would an actual SRL. Evidence from decision-making literature, which typically examines hypothetical decisions, suggests that there are surprisingly few differences between “real” and “hypothetical” decisions. Moreover, our goal is not to assess the quality or accuracy of their responses but, rather, to establish their perception of the difficulty of the court forms. We believe that their position as hypothetical applicants will have relatively little impact on the latter.

Before beginning the study, participants were provided with a letter of information and a consent form. Once they had read the letter and any questions were answered to their satisfaction, they signed the consent form and began the study. Each participant was read one of the following four fictional scenarios that described the particular situation or complaint for which they were completing the form.


1. Roughly three weeks ago, at 10 pm on a Thursday night you were in your apartment watching TV and relaxing. All of a sudden, you heard someone fiddling with keys at your door and the next thing you know, the caretaker for your building was inside your apartment. He began swearing at you loudly and demanding that you turn down your TV. You were shocked by his language and his request as the TV was playing at normal volume. In response, you turned off your TV and calmly asked him to leave. The caretaker did leave, but he ended up doing the very same thing a week later. The second time he entered your apartment, he was even angrier. He threatened to kick you out of your apartment. Once again, you were shocked by his behaviour and his complaint about the TV as you always play it at normal volume.

2. You return to your apartment one day to find that you can’t get in because the locks have been changed. You call your landlord, and he says that he is evicting you. It has been 15 days since this happened, and you have not been able to get back into your apartment. All of your things—your furniture, your clothes, your books, etc.—are still there. Because you have been locked out of your apartment, you have had to find a new place to sleep for the last two weeks. You have stayed at hotels on some nights and also crashed on friends’ sofas.

3. For the last month, your landlord has been carrying out renovations in your apartment building. At first, you did not really notice that these renovations were going on as they were taking place during the day when you were at work and were being done on a different floor of your apartment building than the one that you live on. For the last two weeks, however, the renovations have been taking place on the same floor that you live on. The work has run from roughly 8 am to 11 pm. It is very loud and you are not able to sleep properly. You have asked the superintendent if the work could stop at 6 pm so you could enjoy your evenings and sleep properly. In response, she told you that this is not possible as the work is behind schedule.

4. You moved into a new apartment unit on September 1, 2016. Everything was going well with your new living space until the weather started to get a bit cold in mid-October. When your apartment unit started to feel cold, you adjusted the thermostat in your unit to increase the heat. This didn’t seem to make any difference. Your landlord told you that he doesn’t turn on the heat in your building until December 1 each year. You have been able to stay in your apartment but it isn’t very comfortable due to the temperature. You have had to buy a space heater to put in your room in order to sleep at night.

Participants were provided with a copy of the form entitled “Application about Tenant Rights (Form T2) (Landlord and Tenant Board)” (T2 form) and the associated information guide entitled “Application about Tenant Rights Instructions” and instructed to complete the form.

Participants were asked to “talk aloud” while completing the form, vocalizing what they were doing, the decisions they were making, any difficulties they were experiencing, and anything they found confusing. Afterwards, participants engaged in a debriefing interview. During the interview, they were asked to reflect on their experiences completing the form. They identified aspects of the task that were particularly challenging and discussed parts of the form and accompanying explanatory materials that were helpful in overcoming these difficulties.

Both the think-aloud protocols and the debriefing interviews were transcribed for analysis. All identifying information was removed from the transcripts, and participants were assigned a numerical identifier. These materials were analyzed to identify common themes related to the challenges or difficulties experienced when completing the form. Analysis was guided by, but not limited to, the potential challenges identified in the task analysis results for the T2 form reported in our 2016 study.

C. Study Observations

The T2 form consists of four parts. Below, we detail each of these parts in turn to show precisely how the perceptions of the participants in the second study compare to the predictions made in the 2016 study about which tasks in the form users would find challenging. By proceeding part by part, essentially task by task, we aim to give the reader a clear outline of where both studies identified tasks as difficult or, conversely, where the results differed between the studies.

1. Part 1

Part 1 of the T2 form asked for basic information about the application, such as the address of the rental unit involved and the addresses of the tenant and the landlord. In the 2016 study, the tasks contained in this part were all classified as low complexity. The reports of the participants in the current study generally aligned with this analysis—overall, there was no significant difficulty expressed or observed in relation to the tasks in Part 1. There were, however, certain terms that the participants found vague or difficult to understand but were not so identified in the 2016 study. Multiple participants, for example, expressed confusion about a question as to whether the form user was applying not only against their landlord but also “against your superintendent or landlord’s agent because they caused the problem.” The participants were unsure what the terms “superintendent” and “landlord’s agent” meant. For example, the following are comments made in the debriefing about these terms by four different participants:

I didn’t really understand what a “landlord’s agent” or “superintendent” was. If I had a landlord agent or superintendent, I wouldn’t be able to identify who that person was. Maybe it’s because I have not come across those terms before; maybe other people would know. (Participant 3)

I wasn’t sure if that meant landlord or agent. … I didn’t understand what it meant. I thought it meant the landlord, so I put yes. My landlord caused the problem and so that’s what I meant by that [answer]. I guess I might have been confused. I am still kind of confused about that question. (Participant 7)

Looking at this, with my experience in renting, the superintendent is kind of acting on behalf of the landlord, but their agent—I don’t know if it’s a rental agency or if it’s a legal agent acting on their behalf. But, I’m applying directly against the landlord and superintendent, so I don’t think that that would be applicable, and if I wasn’t applying directly against them, then I would figure out what agent meant. But if it wasn’t against the superintendent or landlord, I think I would have some questions about what specifically “agent” was referring to, like a rental agency—cause what’s the difference between that and a landlord. Or legal agent. If it was a legal agent, I don’t know if this form would be applicable. (Participant 18)

I don’t know. I am not even sure what the difference is [between the landlord and superintendent]. I wasn’t sure of their roles. It would have been helpful in the form to say that “the landlord is the person who is doing this, and the superintendent is this.” At my current apartment [in real life], I don’t even know who the landlord versus superintendents are. (Participant 9)

When filling in the form, one participant wondered aloud “if there’s a difference between superintendent and landlord” and chose to look up the definition of superintendent on their phone but found that the distinction “still seemed a little bit confusing.” Some participants who experienced difficulty with these terms looked to the guide for definitional help. The guide, however, does not define or provide examples of these terms. An expanded glossary of terms would clearly have been helpful in resolving this difficulty.

Participants also found puzzling the request for street “direction” when filling in the address of the rental unit. Five participants flagged this prompt as generating an issue for them:

I have never seen [a form] that asks for your street type before, that’s very strange. I have also never seen a form that asks for the street direction before. That is really strange. (Participant 7)

“Direction”—I don’t know the direction. I feel like I never remember those sorts of things. I feel like offhand, I probably wouldn’t know what the direction of my street is, so maybe if [my street name] doesn’t specify direction, I don’t need to put it there. I would probably just skip it and take my chances. (Participant 9)

Just reading the first page … opening it up. I’m going to start with the address part. I guess it’s a little bit weird that they have “direction” on here. I will leave that blank on mine. (Participant 2)

“Direction”? How would you know the direction of your apartment? Can I just leave that blank if I don’t know? If I just live on a street and I don’t know what way my street is going? … I am just looking at the guide because I don’t understand [reading] … okay “if the street names include a direction.” (Participant 4)

“Direction”—that also seems pretty vague. Maybe I shouldn’t write anything if there isn’t a direction included in the street. … I guess I probably wouldn’t include anything. Let’s leave that then. (Participant 6)

One participant noted confusion about who should be listed as a “tenant” as she was filling out the form:

Okay “Tenant Names and Address. … If there are more than two tenants, complete scheduled parties”—so does “tenant” mean my whole family, if I live there with my family? Or does it mean just people who are paying? [Checking guide] okay so I am assuming I would put my whole family in? Say I’m like a single mother or something. So that’s what I am going to do—I am going to put my kid’s name down too. (Participant 4)

During her debrief, this participant elaborated:

I guess it didn’t really explain in the guide [what to write] if there was kids or anything like that. I am kind of realizing now that maybe [how I filled it out was incorrect], because what if you have more than two tenants? I guess it wasn’t clear to me if tenant refers to someone who is over eighteen or someone who is younger then eighteen, or whatever it may be. So I just put both people living there, which are me and my son. Whether that’s right or wrong, I don’t know. (Participant 4)

This too is a definitional issue that could be resolved by an expanded glossary of terms.

Other difficulties related to whether the alternatives explicitly or implicitly offered in the form covered all of the possibilities. Some participants expressed some confusion over the intended meaning of the term “living in” with respect to the rental unit. This issue arose with participants who were given Scenario 2, which involved their landlord changing the locks to their apartment and telling them that they had been evicted. All of the participants who received this scenario expressed difficulties with a question on the form that asked if they were still living in the rental unit. Here are two examples of the comments made:

For “questions about your tenancy,” it asks: “do you still live in the rental unit,” and then if you answer no, it asks: “when did you move out.” I find that question a bit ambiguous. If you were locked out, you weren’t willingly moving out. So I think it would be helpful if they had a comments line where you could add something like that—to say that you were locked out on that particular date. (Participant 4)

Okay, “do you still live in the rental unit?” It doesn’t ask you if you were evicted or not. I don’t live there. [Long pause]. … I just have no idea how to answer this. I am just going to say no? This is really difficult. … I am just going to say no, because I don’t live there, and that is why I am doing this. (Participant 7)

It is notable that their difficulties do not turn on the question of whether, in the hypothetical scenario, they are actually resident in the unit at the time they are filling out the form. Instead, they are expressing confusion over the specific meaning of the term “living in,” coupled with the question of when they “moved out”; some participants pointed out that these two alternatives do not effectively cover the range of possible situations and experiences of a tenant locked out of their unit. This particular issue requires more than an expanded glossary since, in this case, it appears that the form does not appropriately anticipate (and, thus, account for) the varied experiences of those who might be completing it.



In general, the tasks that were identified as complex on the basis of the 2016 analysis also posed problems for the participants in this study. Consistent with our earlier analysis, the form’s use of technical and legal concepts (for example, “abatement” and “substantial interference with reasonable enjoyment of the rental unit”) was a source of confusion for some participants. Multiple participants also expressed frustration with having to move back and forth between the form and the government-published guide on how to complete the form, an activity that the 2016 study had identified as leading to increased complexity. In addition, some of our participants found that, in some cases, the guide was incomplete insofar as it did not provide the definitional or explanatory support they were hoping to find. Finally, as predicted, several terms identified as distractors in the 2016 study (for example, the use of the term “representative”) did confuse some of the participants.

In addition to those difficulties predicted in the 2016 study, participants identified some other challenges with the court form. In particular, there were several technical and legal terms (for example, “superintendent” and “remedy”) that multiple study participants in this study found challenging but that were not identified as potential sources of complexity in the 2016 study. Additionally, for a large proportion of participants, the prompt to include “direction” when filling out their street address operated as a distractor. Many thought that this prompt was asking for an “actual” direction of their apartment or street as opposed to what was intended by the prompt—namely, any official directional information that is included in their address (for example, Wellington Street West). Finally, multiple participants in this study found that the “remedies” section of the form was confusing since it was spread over many pages and did not have a summary of all of the available remedies at the beginning. This type of structural issue is not something that the rating tool used in the 2016 study was designed to identify.

Some of these differences may be attributed to the fact that the individuals applying the rating tool in the 2016 study were law students and, therefore, had some familiarity with the terms and concepts used in the form. The term “remedy”—a legal concept that law students would be familiar with, for example—was not initially highlighted as complex, but several participants in the current study found the term confusing. In other instances, however, the difference in results cannot be easily attributed to the greater legal training of the researchers conducting the quantitative document analysis. As noted above, the participants found the prompt to include a street “direction” (a non-legal term) to be confusing, despite the fact that this task was not highlighted as potentially causing difficulty in the first study. It is possible that this discrepancy resulted because participants were required to attempt to fill in the form (and, thus, give direct and practical attention to what responses were appropriate), whereas the researchers in the 2016 study contemplated the complexity of the request in more abstract terms. Alternatively, this difference might be the result of the larger number of perspectives (twenty participants as compared to two raters) represented in the current study.

The participants suggested some relatively straightforward ways that court forms could be improved. To be sure, it would be important to verify any design changes with human participants in order to properly evaluate their impact, but their recommendations provide clear suggestions for the design of court forms and supporting documentation. Among their recommendations were simple document design changes. One suggestion was not to use acronyms in the government-published guide for technical terms (that is, “RTA” and “LTB”) even if these acronyms are spelled out at first use. Another suggestion was to provide a summary list of all of the remedies at the beginning of Part 3 of the form so that users would have a better idea at the outset of how many remedies there are and what they address.

Second, multiple participants expressed the view that easier access to definitions of terms and information contained in the guide would have been helpful as they were filling in the form. In our view, this supports our previous recommendation to consider the use of automated forms that support tailored and “just-in-time” information delivery. As noted in our 2016 study, in adopting automated forms, it is necessary to ensure that the new form does not simply transplant problems from a paper-based environment to an electronic environment. Shannon Salter and Darin Thompson’s case study of Canada’s first online tribunal—British Columbia’s Civil Resolution Tribunal (CRT)—provides some important insights about best practices in this regard:

In the absence of a clear determination of what types of processes and procedures strike the right balance, we must instead encourage a constant tension between the seemingly incompatible forces of user-needs and the demands of justice. Using the example of procedural rules, one way to balance this tension would be to create a relatively complete set of rules for reference, but build processes whereby the rules are given to users only when and where they need them, rather than all at once. Taken a step further, this approach would see the rules embedded directly into the processes, forms, and interactions themselves, saving users from the burden of having to read, interpret, or decide how to adhere to them.

This tension also arises when we attempt to aggressively “plain language” or “clear language” rules, procedures, or text-based materials used during the process. Where it is possible and appropriate, technical words and phrases must give way to simpler ones. Defined terms and other cross-references must be kept to a minimum. Fewer words must be used even when a longer and more technically accurate provision would seem more complete. In our experience, even these modest-seeming recommendations can be challenging to put into practice. But their value cannot be overstated when it comes to accessibility for the public.

Multiple study participants indicated that they found the form to be intimidating. They reported difficulties in determining what information was relevant to include or indicated that they would have liked the help of an expert to fill in the form. These observations reinforce our suggestion in the 2016 study that, in some cases, optimal solutions for increasing access to justice for SRLs may involve innovative legal service delivery models whereby individuals are able to access some legal advice and expertise but can avoid the expense of retaining a lawyer to handle all aspects of their case from start to finish. Commonly discussed examples of such innovative legal service models include unbundled legal services and coaching. Additionally, the Law Foundation of Ontario has recently published a significant study on the important role that “trusted intermediaries” (that is, front-line workers in non-legal community organizations) play in assisting the public with their legal problems. A large proportion of front-line workers (63 percent) that were surveyed for this study indicated that they “helped people complete legal forms and documentation.” However, only approximately one-third of those surveyed indicated that they were either “comfortable” or “very comfortable” with providing this specific type of help. The Law Foundation of Ontario’s report noted that one major reason that front-line workers have concerns about providing this type of help is that they are worried about crossing the line between providing “legal information” (something non-lawyers can do) and providing “legal advice” (an act that is restricted by law to lawyers). In view of this tension, the Law Foundation observed the potential in using automated forms that can provide useful real-time information to “make the role of trusted intermediaries in helping people to fill out forms more feasible and effective.”

Our results confirm that the standardized quantitative complexity analysis used in the 2016 study provides valuable insights into the challenges faced by SRLs attempting to complete court forms. At the same time, we have extended the findings of the quantitative analysis, identifying additional challenges that were not flagged by the 2016 study. The existence of these discrepancies reinforces the need for assessments of court-form complexity to engage with the experiences of actual or intended users. Indeed, one of the participants made this specific suggestion:

I feel like it would also be helpful, before they publish these forms, to get someone without any legal knowledge to fill out the form and see if they struggle with it. People who write these forms have probably filled them out or seen them before. Get a layman person’s perspective. (Participant 9)

Stated in more technical terms, the observations here reflect the value of what has come to be known as a “human-centred design” approach. As explained by Margaret Hagan,

[t]his approach stands in contrast to the default approach to creating interventions, which is to take the point of view of the professionals or of the leadership in charge of the system. Their preferences, metrics, and hypotheses typically control how a system, like the court system, is created and run, as well as how new improvements are made. The human-centered design approach argues that to improve the functionality and experience of a given system, the needs and preferences of the user should be the guide. It judges a product, service, or system by what the experience of its audience is. Can they use this thing easily? Does it give them value? And is it engaging of their time and attention?

However, as Hagan also notes, “[h]uman-centered design does not, in itself, propose what type of intervention will best solve a problem … [but rather] is a methodology that could result in various types of specific design work, like the design of new graphics, information layouts, technological products, service flows, organizations, rules, or systems.” This observation helps to demonstrate the value of using a hybrid approach that involves the use of an instrument like the rating tool to identify which tasks are complex, alongside an approach that systematically examines the experience of non-expert human participants attempting to complete the form. In our view, the advantage in proceeding in this fashion is that form designers can not only obtain specific insights as to why certain tasks might be complex by using a detailed evaluation tool (and, thus, understand what the potential routes to reducing complexity might be) but also correct for limitations inherent in this approach by engaging with actual or intended users. An evaluation tool like the rating tool, which is based on a particular literacy framework, is unlikely to capture all of the potential forms of complexity because of constraints in its design. An example of a limitation of this type revealed in this study was the fact that multiple participants complained that the remedies in Part 3 of the form were spread over too many pages without a summary at the beginning. The lack of a summary page is not something that the rating tool was designed to identify as potentially leading to increased complexity. Moreover, as highlighted above, the use of robust human testing can help, first, to correct for possible tunnel vision on the part of researchers and form designers who may have pre-existing familiarity with technical language and, second, to introduce a broader diversity of perspectives by having a larger set of individuals engage with the form.

Helpfully, if researchers and court-form designers in Canada wish to engage in robust human testing but are looking for practical guidance and best practices, the CRT now has extensive experience with such a process. With respect to court forms, and as noted in the CRT case study authored by Salter and Thompson,

[f]ully 45% of British Columbians aged 16–65 have difficulty filling in forms or following instructions, due to literacy problems. Yet filling out complex forms and following detailed instructions is exactly what our public justice processes demand from people at every turn. Court and tribunal forms and documents are seldom tested, validated, and revised in collaboration with the people who use them. Rather, they tend to be designed for an audience of judges, lawyers, or court staff.

One way the CRT is addressing this issue is by engaging in intensive user testing at the early stages of design. We started by testing conceptual designs of intake forms and processes with community advocates, who serve clients with various barriers, and then began testing them with individuals with real disputes. This feedback has allowed us to frontload changes and refinements at an early (inexpensive) development stage.

In our view, using a functional literacy approach, coupled with robust human testing, will generate useful insights about court-form complexity that will allow the redesign of court forms and supporting materials to better meet the needs of SRLs completing these forms.
