Technology as a Barrier to Justice: Cautioning Legal Tech Designers

Harjote Sumbal

Technology alone is not a complete solution to Canada’s access to justice problems. Technological measures can encounter resistance, may ultimately prove unsuccessful, and can even create new barriers to access. Professors Roger Smith and Alan Paterson identify “digital exclusion”, with its three “digital divides”, as a useful starting point for assessing the challenges of technological reform: (1) physical access to the relevant technology, (2) the technical ability to use the relevant technology, and (3) the cultural inclination to use the relevant technology.[1] Designers of legal tech would do well to anticipate the barriers to justice their applications may create so that they can address them before they manifest. Addressing the second divide – technology itself as a barrier for those who struggle to use it – should drive legal app design to ensure that the implementation of technology does not widen the access to justice gap further.

A successful application is driven by user demand, which in turn requires trust. Technology risks indirectly undercutting the administration of justice and compromising user trust. Mistrust of the legal system is a noted barrier to access,[2] so the security of technological processes is essential to making user adoption possible. For example, Abedi, Zeleznikow, and Brien have identified three core “facets of security” that Online Dispute Resolution (ODR) systems must ensure: (1) information security and confidentiality, (2) privacy of the parties involved, and (3) authentication of parties in transactions and communications.[3] If technological reforms are implemented without due consideration of these security issues, legal tech may serve as an additional barrier for wary users rather than increasing access to justice.

The digital divides themselves can be significant barriers to access. While cultural resistance speaks to the acceptance of technology by existing legal industry structures, physical access and technical ability are barriers for potential users who actually want to engage with legal technology. Although these digital divides can affect any given individual, they are likely to fall hardest on vulnerable groups such as lower socioeconomic communities, elderly people, Indigenous peoples, and those facing language barriers, whether they are refugees, immigrants, or citizens.[4]

In some cases, it is possible to address these barriers within the technological tool’s infrastructure. For example, the Civil Resolution Tribunal (CRT) attempts to address potential language barriers by providing information on the CRT, its process, limitation periods, and available help resources in multiple languages.[5] It also provides additional resources for Indigenous users and directs those without computers to ServiceBC locations or paper forms.[6] The CRT’s recognition of potential technological barriers is an important start. However, not all technology platforms are constructed in the same manner, nor are they as comprehensive as ODR platforms tend to be. Justice apps are more individualized in their scope and user design. For example, the MyLawBC website is designed, amongst other capabilities, to allow users to construct their wills, but it offers none of the language, Indigenous, or general help resources of the CRT described above.

The need for accessibility tailored to vulnerable populations is apparent. A 2008 Law Foundation of Ontario report stated that those in vulnerable populations “need to receive direct services rather than rely on self-help”, as legal trouble often piles on to the barriers they already face.[7] Since self-service – the very feature that spares users from paying legal fees – is central to user-targeted legal technology, tools that are too daunting to use are essentially useless. Without specific consideration of vulnerable populations and the usability of tools designed for them, technological reforms risk creating a further divide between users and access to justice.

Technological tools like ODR and justice applications have great potential. However, the design and conception of these tools must consider the specific needs of vulnerable populations, or they risk exacerbating the access to justice problem. To successfully facilitate greater access to justice, legal tech designers must empathize with their target populations when conceptualizing solutions.

[1] Roger Smith & Alan Paterson, “Face to Face Legal Services and their Alternatives: Global Lessons from the Digital Revolution” (2014), online (pdf): Strathprints <https://strathprints.strath.ac.uk/56496/1/Smith_Paterson_CPLS_Face_to_face_legal_services_and_their_alternatives.pdf> at 19.

[2] Tania Sourdin et al, Digital Technology and Justice: Justice Apps (Milton: Routledge, 2020) at 23.

[3] Fahimeh Abedi, John Zeleznikow & Chris Brien, “Developing Regulatory Standards for the Concept of Security in Online Dispute Resolution Systems” (2019) 35 Computer Law & Security Review 1.

[4] Sourdin, supra note 2 at 66.

[5] Civil Resolution Tribunal, “Resources” (2021), online: <https://civilresolutionbc.ca/resources/>.

[6] Ibid.

[7] Sourdin, supra note 2 at 68; See Karen Cohl & George Thomson, “Connecting Across Language and Distance: Linguistic and Rural Access to Legal Information and Services” (December 2008), online: The Law Foundation of Ontario <https://lawfoundation.on.ca/download/connecting-across-language-and-distance-2008/>.
