Kate Robertson, Cynthia Khoo, and Yolanda Song, To Surveil and Predict: A Human Rights Analysis of Algorithmic Policing in Canada, Citizen Lab and International Human Rights Program, University of Toronto (2020).

To Surveil and Predict is longer than the usual Jotwell suggestion. The authors carefully document and then explore the rights implications of the use of algorithmic and predictive tools by police forces in Canada. They conclude with a series of recommendations focused on public policy. My recommendation here focuses on the method and the equality-related parts of the report, although I like it all, a lot.

First, method. The Report works to expose and explore something that is only just starting up, so classic doctrinal methods (where are the cases?) are not going to work well. But some of the analysis is quite legal, running things through Canadian human rights and Charter provisions. At the same time, and contrary to much (also very good) early work in this sector, the authors do not spend much time speculating about potential future technologies. Instead, Robertson, Khoo, and Song pursued information about what was happening “on the ground” through, inter alia, freedom of information (FOI) requests. One of the many aspects of their work that I like: they provide information about how these requests were received and negotiated. (P. 13; Appendix A.) FOI is a critically important tool for researching the administrative state. How the process plays out is usually connected to the quality, volume, and nature of the information obtained, yet the process of making requests (and receiving replies, or not) is rarely described in articles. In my view, discussion of how the FOI requests worked in context is a good reason for adding length to research reports and analysis.

Other methodologies beyond doctrinal/theoretical analysis include convening a conference held under the Chatham House Rule (“participants are free to use the information received, but neither the identity nor the affiliation of the speaker(s), nor that of any other participant, may be revealed”), such that “insights from the symposium have informed some of the analyses in this report.” (P. 12.) In my experience, this is not a methodology expressly used, or often described, in legal research published in legal journals. Finally, the authors conducted a small number of interviews with key informants from the legal profession and law enforcement. (P. 13.)

Through these methods, Robertson, Khoo, and Song establish three categories of “algorithmic policing technologies,” the first two of which are also “predictive policing technologies”: location-focused algorithmic policing technologies, person-focused algorithmic policing technologies, and algorithmic surveillance policing technologies (“sophisticated, but general, monitoring and surveillance technologies”). All three have been procured and/or deployed in Canada. (P. 2, Pp. 38-69.)

For purposes of this Jotwell section, the meat of the report is in section 5.4, which considers the Right to Equality and Freedom from Discrimination.1 The authors have wisely taken an international human rights approach, so they are not limited to recognized rights under the Canadian Charter of Rights and Freedoms or federal/provincial human rights codes. This ensures that, along with racial discrimination (an early and continuing focus of the document), the authors take up concerns about socio-economic disadvantage as a salient ground of discrimination when these technologies are considered. (Pp. 113-119.)

Within the sections dealing specifically with equality and discrimination, they focus on three things. First, the operation of Virginia Eubanks’s “feedback loops of injustice” (in which data taken from a discriminatory system is used as the training material for an AI system). Second, the problem of hypervisibility for many low-income people resulting from significant engagement with government systems. Finally, they turn to the ways that algorithmic approaches may build in discrimination while making it difficult to establish the cause, burying it in sophisticated techniques of mathematics and science. (“Inequality by Design and “Math-washing” Injustice” at P. 122.)

In Canadian law, many criminal justice instances of racial discrimination have been dealt with not through the application of section 15 (the equality protection section of the Charter of Rights and Freedoms) but instead through other legal rights.2 This is at least in part because for some time the equality section has placed high barriers in front of claimants. (U.S. readers might be interested to know, for instance, that collecting data disaggregated by race has not been common practice in Canada and remains controversial, which can make discharging the claimant’s burden of proof in section 15 equality cases difficult.)

To Surveil and Predict, however, makes recommendations at the policy level, not tactical suggestions for constitutional litigation. The recommendations are thus both broader and more preventive, and they highlight the inability of courts to do this kind of work when faced with constitutional challenges. The three recommendations specifically aimed at equality and discrimination (Pp. 159-160) include an immediate moratorium on the use of past data sets to inform predictive policing, a federal judicial inquiry into any and all such repurposing of past police datasets, and a requirement that all use of predictive policing and other algorithmic surveillance policing technologies be subject to tracking to “monitor potential emergence of bias.” (P. 160.)

The report relies on some of the excellent work on AI, policing technologies, and algorithmic predictive technologies produced in the U.S., as we might expect: the development and operationalization of predictive policing technologies have been prominent in that country. But in a “small” jurisdiction like ours, with our own unique constitutional protections and human rights laws in place, the implications, as well as the doctrinal and policy questions, have to be considered anew. It is also possible that the different contexts will reveal new effects of predictive and other forms of algorithmic policing technologies. For example, to the extent that hypervisibility depends on the existence of state services and information sharing between them, this concern may be heightened in a state that provides, in general, more services. And the problem might be even greater in unitary systems, where information sharing would be intragovernmental rather than “intergovernmental,” as it often is in federal states like Canada (where there remains surprisingly little intergovernmental sharing) and the U.S.

It is clear that particular features of the relevant doctrines will be key to policy recommendations, and that doctrinal requirements of proof will be major barriers. In particular, Canadian constitutional law does not require intent to discriminate, but uses a substantive definition of discrimination. Furthermore, Canadian constitutional law does not limit the grounds of discrimination, which might be particularly important given the fine-grained targeting based on multiple characteristics that these technologies are designed to produce. Finally, the political and legal conventions that define and differentiate the private and the public might also be relevant here, since the majority of these technologies are developed in the private sector and used by the public sector under a variety of arrangements. As the authors note, those arrangements might make it quite difficult to access the information required to understand what is actually happening in the operation of the technology.

This report is perhaps a bit long for me to urge everyone to read all of it here (I would if I could, though!). But for scholars of equality, whether already interested in, or only vaguely aware of, these new tools available to the state, and whether interested in Canada or not, I highly recommend at least those portions of To Surveil and Predict focused on method and on the right to equality and freedom from discrimination.

  1. There is, of course, much more, including attention to privacy rights, expressive freedoms, and other rights that are raised mainly in the context of legal proceedings and criminal justice.
  2. See, for instance, R. v. Williams, 2003 SCC 41, [2003] 2 S.C.R. 134, or R. v. Grant, 2009 SCC 32, [2009] 2 S.C.R. 353.
Cite as: Sonia Lawrence, Beyond Predictions About Predictive Policing, JOTWELL (October 13, 2020) (reviewing Kate Robertson, Cynthia Khoo, and Yolanda Song, To Surveil and Predict: A Human Rights Analysis of Algorithmic Policing in Canada, Citizen Lab and International Human Rights Program, University of Toronto (2020)), https://equality.jotwell.com/beyond-predictions-about-predictive-policing/.