Report | To Surveil and Predict: A Human Rights Analysis of Algorithmic Policing in Canada
This report, written by Kate Robertson (criminal defence lawyer and Citizen Lab research fellow), Cynthia Khoo (Citizen Lab research fellow and technology and human rights lawyer), and Yolanda Song (lawyer at Stevenson Whelton LLP and pro bono research associate at the IHRP), examines algorithmic technologies designed for use in criminal law enforcement systems and the human rights implications of their use.
Algorithmic policing is an area of technological development that, in theory, enables law enforcement agencies either to automate surveillance or to draw inferences through mass data processing in the hope of predicting potential criminal activity. The latter type of technology, and the policing methods built upon it, is often referred to as predictive policing. Algorithmic policing methods often rely on the aggregation and analysis of massive volumes of data, including personal information, communications data, biometric data, geolocation data, images, social media content, and policing data (such as statistics based on police arrests or criminal records).
This content was last updated on 26 November 2020 at 9:04 a.m.