How is facial recognition technology used in Canada’s private sector?

Facial recognition technology (FRT) is increasingly being deployed across Canada’s private sector in spaces that are essential to individuals’ daily lives, such as malls, retail locations, airports, and many digital environments. Investigations conducted by the Office of the Privacy Commissioner of Canada (OPC), its provincial counterparts, and journalists have revealed that many of these private actors are neither obtaining appropriate consent nor adequately disclosing their use of this invasive technology to the individuals subjected to it.

The widespread adoption of FRT across many physical spaces and its incorporation into more services is fuelling the normalisation of individuals’ unbounded disclosure of personal information. Without a robust regulatory framework in place, Canada lacks the necessary legislative tools to promote socially beneficial and responsible uses of this technology. In the absence of such safeguards, FRT gives rise to the exploitation of individuals and their personal information to the detriment of human rights and democracy.

Case Studies

Cadillac Fairview Malls

In 2020, the Privacy Commissioner of Canada confirmed that Cadillac Fairview had contravened federal privacy legislation by subjecting shoppers in retail locations across the country to FRT analysis without properly informing them or obtaining valid consent.

In its joint investigation with provincial counterparts, the OPC found that Cadillac Fairview had installed covert cameras in digital information kiosks across 12 malls in Canada, including the Toronto Eaton Centre, capturing and storing over 5 million shoppers’ images. These images were then analyzed to generate additional personal information about each shopper, including their age and gender, which was used to monitor foot traffic patterns and to predict demographic information about shoppers for advertising and marketing purposes. Additionally, the investigation found that Cadillac Fairview had not adequately informed shoppers that they were being recorded and subjected to FRT analysis.

Retail Stores

Other retail locations across Canada have been reported to be using FRT analysis, or are considering deploying the technology, to covertly gather information about shoppers. For example, Rexall, the second-largest retail pharmacy company in Canada, was reported to have used a trial of Clearview AI’s software to search for and identify individuals who were suspected of shoplifting. In 2021, federal and provincial privacy commissioners found that Clearview AI violated federal and provincial private-sector privacy laws by illegally scraping images of individuals from the internet without their consent. Additionally, grocery store chain Foody Mart was reported to have plans to deploy machines for a biometric-driven payment system supported by facial recognition technology.

Canadian Tire has admitted to using FRT software in Manitoba and in other stores across the country. In 2022, FRT may have contributed to the misidentification of an Indigenous man at a Canadian Tire location in Winnipeg when he was incorrectly flagged by a store’s camera as a shoplifter. In 2023, British Columbia’s privacy commissioner concluded an investigation into Canadian Tire’s use of FRT in the province and found that the retail chain contravened the province’s privacy laws through its undisclosed use of FRT, as it neither notified customers of, nor obtained their consent to, the collection and storage of their personal information.

Airlines

In 2023, Air Canada launched a digital identification system via the Air Canada app that uses facial recognition technology to verify travellers’ identities and grant access to Air Canada lounges. While the FRT-backed identification system is currently optional, it is marketed as faster than traditional boarding passes and manual identity checks. In the future, the program is expected to be expanded to allow travellers to board flights without showing a boarding pass or government identification.

Personal Devices & Applications

FRT is commonly used in applications on personal devices, such as smartphones and laptops. For example, many smartphones use FRT to verify individuals’ identity, comparing live and stored images of their faces to unlock the device. Popular photo storage applications such as Google Photos and Apple Photos use FRT to organise photos by faces and identify contacts. Social media applications including Facebook Messenger, Instagram, Snapchat, and TikTok have incorporated augmented reality features supported by facial recognition systems that allow users to digitally manipulate their faces through animated “skins” and “filters.”

FRT is also being used in mobile applications that have practical value. For example, in London, Ontario, FRT is being used to screen drivers and validate their identity in an alternative ride-sharing application that seeks to create a safe riding space for women and 2SLGBTQ+ individuals. The technology has also been used to assist blind or face-blind individuals with identifying others.

Political Parties

Leading up to the 2021 federal election, the Liberal Party of Canada used a one-to-one matching facial recognition tool from American technology company Jumio in its candidate nomination process. Jumio’s digital identity verification tool functions by comparing an individual’s identity photos (such as a passport or driver’s licence photo) against a live captured image to validate their identity, as well as their live presence. The British Columbia Information and Privacy Commissioner then launched an investigation to determine whether the use of the technology complied with the province’s privacy legislation. The investigation was later discontinued once the Liberal Party confirmed that it had stopped using FRT.

Why should we be concerned about the use of facial recognition technology in the private sector?

The use of FRT in Canada’s private sector can contribute to individual, collective, and social harm through mass surveillance and monitoring. While some of the uses of FRT detailed above may have socially beneficial impacts, a comprehensive regulatory framework is nonetheless required to ensure that this technology is used responsibly, that its use is properly disclosed to individuals, and that consent is obtained beforehand. Moreover, rules governing the use and storage of information gathered and developed in relation to FRT are necessary so that individuals’ privacy and personal information are protected.

“[I]f we don’t take care to adequately assess the application, development and governance of AI, it can have adverse effects on end-users, perpetuate and even amplify discrimination and bias towards racialized communities and women, and lead to unethical usage of data and breaches of privacy rights.”

Alex LaPlante, Senior Director, Borealis AI (ETHI Committee)

The adoption of facial recognition technology in Canada’s private sector, as detailed above, raises concerns including the following:

Data Profiles

The use of FRT involves the collection of personal and sensitive information, including biometric data and information about an individual’s age, ethnicity, gender, and preferences. While companies have claimed that their use of FRT does not constitute a collection of personal information because they do not retain photographs of individuals, such data is used to construct expansive data profiles that are particularly revealing when combined with other data points such as cell phone information. This information is usually gathered automatically and without adequate consent, allowing corporations and private entities to build profiles of individuals and broader consumer habits whose effects extend beyond the invasion of privacy. Such practices contribute to mass surveillance and manipulation. For example, comprehensive data profiles can be used to influence consumer behaviour through targeted advertising.

Chilling Effects

Digital surveillance facilitated through invasive emerging technologies, including artificial intelligence, machine learning, and facial recognition analysis, gives rise to “chilling effects.” Chilling effects occur when individuals are deterred or discouraged from speaking and acting freely, and instead adapt their behaviours to meet social norms or engage in self-censorship. In turn, this negatively impacts individuals’ civil liberties, including their right to privacy, freedom of speech, and other fundamental rights and freedoms. For example, increased surveillance of protesters, particularly those from racialized communities such as Black Lives Matter protesters, can affect individuals’ willingness to take part in such causes.

Racial Bias & Equity

While FRT has been proposed as a means to detect fraud and prevent theft, its identification capabilities are imperfect and often perform poorly when tasked with identifying women and individuals from racialized minorities. When used for premises monitoring, such as to detect shoplifting, this can result in individuals being denied access to services, being incorrectly accused of crimes they did not commit, and suffering increased psychological harm. Moreover, even if these misidentification issues were solved, FRT use would still pose serious harm because it consolidates and perfects surveillance, leading to even greater privacy risks and inequity.

Surveillance Normalisation

The widespread use of FRT in social environments and as a requirement for service, such as in social media filters or to gain quicker access to airport lounges, normalises the use of surveillance technologies. By making the use of FRT entertaining and by granting access to increased benefits in exchange for personal information, these applications minimize the apparent dangers of surveillance. In effect, they habitualise the unfettered sharing of personal information, making such practices routine, particularly for young people.

CCLA acknowledges the support of Microsoft Canada which enables us to provide administrative support for the Coalition.