Despite the absence of any governing policy at the federal, provincial, or municipal level, the Winnipeg Police Service (WPS) is purchasing an AI-based technology system to sift through “hours of video evidence for specific figures and items of interest, such as people and vehicles.” Calling it a “video analysis technology”, the WPS has stopped short of labeling it a Facial Recognition Technology (FRT) system – but its own description states it is capable of “identifying people, vehicles, lighting, and other parameters using metadata.” A rose by any other name, as they say, is an FRT system. The purchase of the software is being funded through proceeds of the WPS criminal forfeitures fund, which has come under criticism for essentially extracting value from communities to spend on policing.
Notably, the Winnipeg Police Service has no public policy on the use of AI or FRT. Staff Sgt. Josh Ewatski states that the technology can be used “especially when it pertains to serious crimes”, but what exactly does that mean?
It’s relevant to consider that in July of 2023, the WPS escalated an investigation into vandalism of police headquarters to the major crimes unit – a unit generally reserved for investigating serious crimes like homicide. The vandalism was tied to protests against the Province of Manitoba’s refusal to search the landfill for three murdered Indigenous women, and included anti-police slogans. The slogans contained no threats or other qualifiers that would otherwise warrant escalating vandalism to a major crime.
Given that cities like Buenos Aires are implementing FRT systems that target “serious crime”, it’s important that Canadian frameworks do not leave it to police forces themselves to decide what is and isn’t a serious crime warranting the use of these invasive technologies.