December 20, 2021 – The Women’s Legal Education and Action Fund (LEAF), in collaboration with several Citizen Lab Fellows and Senior Research Associate Christopher Parsons, has made a joint submission to the Toronto Police Services Board’s (TPSB) public consultation on its draft policy concerning police use of artificial intelligence (AI) technologies.
The submission, written by Suzie Dunn, Kristen Thomasen, and Kate Robertson, urges the Board to centre precaution, substantive equality, human rights, privacy protections, transparency, and accountability in its policy.
When developing and implementing this policy and making decisions about the use of AI by the Toronto Police Service today and into the future, the authors and LEAF implore the TPSB to continue to seek out the guidance and expertise of:
- AI and technology scholars and advocates;
- equality and human rights experts;
- affected communities and their members, including historically marginalized communities; and
- other relevant stakeholders.
The submission calls for proactive and ongoing identification and mitigation of risk arising from the use of AI systems by police. The submission also recommends that the Board place an immediate moratorium on law enforcement use of algorithmic policing technologies that do not meet minimum prerequisite conditions of reliability, necessity, and proportionality.
This joint submission was written and reviewed by a group of experts in the legal regulation of AI, technology-facilitated violence, equality, and the use of AI systems by law enforcement in Canada. The recommendations and comments in the submission focus on the following key observations:
- Police use of AI technologies must not be seen as inevitable
- A commitment to protecting equality and human rights must be integrated more thoroughly throughout the Board policy and its AI analysis procedures
- Inequality is embedded in AI as a system, in ways that cannot be mitigated through a policy that deals only with use
- More accurate AI systems do not, on their own, mitigate inequality
- TPS must not engage in unnecessary or disproportionate mass collection and analysis of data
- The Board’s AI policy should provide concrete guidance on the proactive identification and classification of risk
- The Board’s AI policy must ensure expertise in independent vetting, risk analysis, and human rights impact analysis
- The Board should be aware of assessment challenges that can arise when an AI system is developed by a private enterprise
- The Board must apply the draft policy to all existing AI technologies that are used by, or presently accessible to, the Toronto Police Service
LEAF’s work on this submission is part of its Technology-Facilitated Violence Project, bringing together feminist lawyers and scholars to advance equality-enhancing responses to technology-facilitated violence against women and gender-diverse people.
LEAF would like to thank Kristen Thomasen, Suzie Dunn, Kate Robertson, Cynthia Khoo, Ngozi Okidegbe, and Christopher Parsons for their collaboration on this submission. Read the full submission here.
About the Women’s Legal Education and Action Fund (LEAF)
The Women’s Legal Education and Action Fund (LEAF) is a national not-for-profit that works to advance gender equality in Canada through litigation, law reform, and public legal education.
Since 1985, LEAF has intervened in more than 100 cases that have helped shape the Canadian Charter of Rights and Freedoms, responded to violence against women and gender-diverse people, pushed back against discrimination in the workplace, secured access to reproductive freedoms, and advanced improved maternity benefits, spousal support, and the right to pay equity.
LEAF understands that women and gender-diverse individuals in Canada experience discrimination in different ways, and builds partnerships across communities to inform its understanding of how race, gender identity, sexual orientation, (dis)ability, class, and other intersecting identities underlie legal structures that perpetuate inequality, discrimination, and harm.