Authors
Partner, Disputes, Montréal
Partner, Disputes, Toronto
Partner, Disputes, Calgary
Partner, Disputes, Montréal
Partner, Disputes, Toronto
Partner, Disputes | Insolvency and Restructuring, Montréal
Table of Contents
- Cleaver v. The Cadillac Fairview Corporation Limited, 2025 BCSC 910
- Doan c. Clearview AI inc., 2024 QCCS 3968
- Imprimeries Transcontinental inc., Re, Commission d’accès à l’information du Québec, 1024350-S
- Granger v. Ontario, 2024 ONSC 6503
- Clearview AI Inc. v. Alberta (Information and Privacy Commissioner), 2025 ABKB 287
Privacy Jurisprudence Review
Cleaver v. The Cadillac Fairview Corporation Limited, 2025 BCSC 910
Facts
In 2018, Cadillac Fairview Corporation Limited (Cadillac Fairview) installed cameras equipped with Anonymous Video Analytics technology (the Software) supplied by MappedIn Inc. (MappedIn) into wayfinding directories (Directories) at its shopping malls located in several provinces across Canada (the shopping malls).
Cadillac Fairview ran an eight-week pilot project, the purpose of which was to obtain an estimate of the number of visitors to each property and their rudimentary age and gender demographics. It disabled the Software in response to misinformation circulating online suggesting that the Software was “facial recognition” technology. The data obtained from the project was securely held by MappedIn on a decommissioned server. None of the defendants received, or made use of, the data, and the images taken were not retained.
The Privacy Commissioner of Canada, the Information and Privacy Commissioner of Alberta, and the Information and Privacy Commissioner of British Columbia (collectively, the Commissioners) launched a joint investigation to determine whether Cadillac Fairview was collecting and using the personal information of visitors to its malls.
On October 28, 2020, they released their report concluding that the Software created a unique numerical representation of a particular face, constituting a collection of biometric information. Since these numerical representations were created from images captured by the cameras, the Commissioners found that the creation of the biometric information from those images constituted an additional collection of personal information. The report also concluded that the complaint was resolved because the Software had been disabled and all data deleted.
The plaintiffs, Joshua Cleaver and Curtis Kieres (the Plaintiffs), sought to certify a national class action pursuant to the Class Proceedings Act, R.S.B.C. 1996, c. 50 (CPA) on behalf of all “persons who viewed a wayfinding directory at one or more of the shopping malls during the relevant periods and any persons including minors, who accompanied them.” They alleged that Cadillac Fairview secretly mined biometric data from unsuspecting visitors to its shopping malls, and that the defendants breached the proposed class members’ privacy rights by collecting their personal data, namely their facial images, and converting them into numerical data.
Decision
The Court declined to certify the claims for (i) certain alleged statutory breaches; (ii) intrusion upon seclusion in British Columbia and Alberta; (iii) negligence; and (iv) alleged breaches of the Québec Charter.
The Court found that the Plaintiffs satisfied the requirements of paragraph 4(1)(a) of the CPA with respect to some of the causes of action set out in the pleadings, even though it found there was no “basis in fact” for the allegation that the defendants captured or stored any biometric or personal data.
However, it concluded that the Plaintiffs had failed to establish that there was an identifiable class of two or more persons, a requirement to certify a class action pursuant to paragraph 4(1)(b) of the CPA. Indeed, there was no factual basis to demonstrate that the class members could self-identify and no rational relationship between the proposed class definition (which had been amended three times) and the fundamental common issues, being that a facial image of an individual was recorded and used to create biometric and personal information about that individual.
The Court also held that the claims of the class members did not raise common issues, another certification requirement pursuant to paragraph 4(1)(c) of the CPA. It notably found no basis in fact that any facial images were recorded by the cameras located at the Directories or that biometric and personal information about class members was created. It also found there was no basis in fact for the allegation that the data contained personal information within the meaning of the relevant statutes, as no individual could be identified from the data.
Finally, the Court was not persuaded that a class action was the preferable procedure for the fair and efficient resolution of the common issues pursuant to paragraph 4(1)(d) of the CPA. Importantly, the Court’s analysis considered the lack of evidence of demonstrable harm, the conclusion of the pilot program, and destruction of the data.
Key takeaway
The dismissal of this certification application highlights the challenges plaintiffs face in privacy-related class proceedings. This decision demonstrates that plaintiffs cannot rely solely on regulatory findings and conclusions to substantiate their claims in court, particularly where no individual can be identified from the data (such that it does not constitute personal information), and where the alleged collection of data has ended and the data has been deleted.
Doan c. Clearview AI inc., 2024 QCCS 3968
Facts
The plaintiff, Ha Vi Doan (Doan), sought the authorization to institute a class action on behalf of all Québec residents whose facial images and personal information were collected, used or disclosed without their consent by the defendant, Clearview AI (Clearview).
Clearview developed a facial recognition algorithm that creates a facial imprint from biometric data extracted from a photograph. Its search engine scours the Internet, locates photographs of faces and classifies them in its database according to their respective facial imprints. This software enables Clearview to offer its customers a service that assembles, in a single search report, all the facial images available on the Internet whose imprints match a given image.
Doan alleged that Clearview has violated certain fundamental rights of the class members, including their right to privacy, the right to preserve their dignity and the right to control over use of their image. Clearview was also alleged to have breached its obligations under laws applicable to the collection of personal information. Clearview contested the apparent merits of only some of the causes of action raised, but also argued that the Québec courts lacked jurisdiction.
Decision
The Court held that it has jurisdiction over the proposed class action on behalf of Québec residents. It further concluded that the claim that collecting photographs of the class members, using them to create a facial imprint and compiling a file on each of them without their consent constitutes an attack on the safeguard of their dignity pursuant to section 4 of the Charter of Human Rights and Freedoms, C.Q.L.R. c. C-12 (Québec Charter) is not a frivolous one. Holding that this question deserved to be analyzed on its merits, the Court authorized the class action.
The Court highlighted that the scope of the right to control the use of one’s image has not yet been examined by the courts in such circumstances. It concluded that it is not far-fetched to raise the issue of whether the use of an image to, among other things, create a facial imprint may violate the class members’ right to control over the use of their image.
Key takeaway
This decision highlights the growing legal concerns surrounding the use of biometric data and facial recognition technology. As the case proceeds to the merits stage, it will be closely watched for its potential to shape the legal landscape in this rapidly evolving area, notably with regard to human rights and the right to one’s image.
Imprimeries Transcontinental inc., Re, Commission d’accès à l’information du Québec, 1024350-S
Facts
In October 2020, the Québec privacy commissioner, the Commission d’accès à l’information (CAI), received from Imprimeries Transcontinental inc. (the Company) a declaration informing it of the creation of a database of biometric characteristics or measurements (the Declaration).
The Declaration concerned the implementation, in the context of the COVID-19 pandemic, of an authentication system (the System) with two functionalities to control access to the Company’s premises: a facial recognition functionality and a body temperature measurement functionality. At the time of the Declaration, the Company’s objective for the System was to ensure the safety of its employees and of its premises.
Considering that the temperature-taking functionality of the System had not been used since October 2022 and that the data generated by it had been destroyed, the CAI’s decision only concerns the facial recognition functionality of the System.
Decision
The CAI ordered the Company to cease collecting biometric information allowing facial recognition, to cease using a facial recognition system using biometric measurements to control access to its premises, and to destroy the templates created and hash codes obtained by converting the facial photos collected.
After concluding that the Company is subject to the Act respecting the protection of personal information in the private sector, C.Q.L.R. c. P-39.1 (the Québec Private Sector Act), the CAI held that a photograph of a person’s face and its codification into a mathematical representation — both being part of the System’s process — are sensitive personal information.
The Québec Private Sector Act provides that personal information may only be collected if there is a serious and legitimate interest. In addition, only personal information that is necessary for the purposes identified prior to the collection may be collected. In order to justify the need to collect such data, a company must demonstrate the legitimate, important and real objective pursued by this collection, as well as the proportionality of the invasion of privacy in relation to the objectives pursued. A company may not depart from these requirements, even with the consent of the person concerned.
First, the CAI considered that the Company had a legitimate interest in ensuring the security of its facilities and in taking measures to control access to its premises. However, the CAI concluded that the Company had not demonstrated any particular security issues justifying such collection of personal information.
The CAI held that the Company did not demonstrate the importance of the objective pursued. Controlling access to a company’s premises is a usual and common objective. A company’s activities or a particular situation might justify a higher level of security that biometric data can provide, but there was nothing to indicate that this was the case here.
Second, the CAI concluded that the collection carried out by the Company was not proportional to the underlying objective considering the biometric and sensitive nature of the personal information in question. Indeed, the invasion of privacy resulting from the collection of the personal information was not minimized. The Company also did not establish how the collection of personal information required for the operation of the System provided benefits that outweighed the harm caused by such collection.
Key takeaway
This decision highlights the rigorous standards that apply to the collection and use of biometric data under Québec’s privacy laws. Organizations must demonstrate a serious and legitimate interest and can only collect personal information that is necessary for the purposes identified prior to the collection. They cannot rely on consent alone to justify their practices.
This case reinforces the importance of prioritizing less privacy-intrusive alternatives wherever possible.
Granger v. Ontario, 2024 ONSC 6503
Facts
Micky Granger, a migrant farm worker, was subjected to DNA collection by the Ontario Provincial Police (OPP) in 2013 during an investigation of a violent sexual assault. Granger and 95 other workers provided DNA samples under what they believed was informed consent. However, the police did not provide copies of the consent forms, and the Centre of Forensic Sciences (CFS) retained the DNA profiles despite the Criminal Code’s requirement to permanently remove such data if the samples did not match any crime scene DNA. Granger alleged that this retention of his DNA violated his rights under section 8 of the Canadian Charter of Rights and Freedoms (Canadian Charter) and gave rise to a tort claim for intrusion upon seclusion.
Decision
The Court found that the CFS had failed to comply with the statutory requirement to permanently remove the electronic results of DNA analyses once it was established that the samples did not match. This failure constituted a breach of the plaintiffs’ reasonable expectation of privacy, as they had consented to the collection of their DNA under the belief that their profiles would be destroyed if they were excluded as matches. The Court ordered aggregate damages of $1,000 per class member, totalling approximately $7,267,000, to be awarded for the breaches of the Canadian Charter rights. The Court emphasized that the breaches were serious and warranted damages for vindication and deterrence, despite the absence of evidence showing actual harm from the retention of the DNA profiles. However, the Court declined to award punitive damages, concluding that the CFS acted in good faith and did not engage in conduct that was malicious or reckless.
Key takeaway
This decision underscores the importance of strict compliance with statutory obligations when handling sensitive personal information, such as DNA. The CFS’ failure to delete the electronic results as required by law was a key factor in the Court’s decision. This highlights the legal risks organizations face when failing to adhere to statutory privacy safeguards, even in the absence of evidence showing actual harm.
Clearview AI Inc. v. Alberta (Information and Privacy Commissioner), 2025 ABKB 287
Facts
Clearview AI Inc. (Clearview), a U.S.-based company, scraped billions of images from the Internet, including social media, to build a facial recognition database marketed to law enforcement. This decision arose following a joint investigation by Canadian privacy regulators (Alberta, British Columbia, Québec, and federal) into Clearview’s facial recognition practices. It was released after the British Columbia Supreme Court’s related decision in Clearview AI Inc. v. Information and Privacy Commissioner for British Columbia, 2024 BCSC 2311, and is largely consistent with it where there are common issues.
On February 2, 2021, the regulators issued a Joint Report finding that Clearview had violated privacy laws by scraping billions of images without consent, creating biometric profiles, and marketing its services to Canadian law enforcement. It recommended that Clearview cease offering its facial recognition tool in Canada, stop collection and use of Canadians’ data, and delete any data in its possession.
On December 7, 2021, after Clearview refused to accept the recommendations, Alberta’s Information and Privacy Commissioner (the Commissioner) issued a binding order (the Order) requiring Clearview to adopt them. Clearview sought judicial review, challenging (i) Alberta’s jurisdiction over it as a foreign corporation; (ii) the interpretation of “publicly available” information under the Personal Information Protection Act, S.A. 2003, c. P-6.5 (PIPA), which exempts such information from the consent requirement; and (iii) the constitutionality of the Order under subsection 2(b) of the Canadian Charter of Rights and Freedoms (Canadian Charter), which guarantees the right to freedom of expression.
Clearview argued that: (i) its scraping of publicly accessible images was lawful and comparable to the practices of search engines like Google; (ii) PIPA’s consent requirement was overly broad, chilling legitimate uses of public data (e.g., search engines); and (iii) the Order was unenforceable because it could not distinguish Albertans’ data within its database.
The Commissioner: (i) maintained that Clearview violated Albertans’ privacy rights under PIPA; (ii) defended its interpretation of the term “publicly available” information to exclude social media; and (iii) argued that any infringement of Canadian Charter rights was justified under section 1, given the low value of Clearview’s commercial expression as compared to the significant privacy harms.
Decision
The Court declared that sections 12, 17, and 20 of PIPA, and subsection 7(e) of the Personal Information Protection Act Regulation, Alta. Reg. 366/2003 (PIPA Regulation), unjustifiably infringed subsection 2(b) of the Canadian Charter (freedom of expression). As a remedy, the Court struck the words “including, but not limited to, a magazine, book or newspaper” from subsection 7(e) of the PIPA Regulation, thereby vastly broadening the meaning of the term “publication” to include personal information and images posted to the internet without privacy settings, with the result that the use of such information is not subject to a consent requirement.
This constitutional ruling did not invalidate the Order because the determination that Clearview’s purpose for collecting and using personal information was unreasonable remained valid.
Applicability of PIPA to Clearview (jurisdiction)
The Court held that Alberta had jurisdiction over Clearview under the “real and substantial connection” test. Clearview had marketed its services to Alberta law enforcement and scraped images from servers located in Alberta, thereby establishing sufficient ties to the province. Its withdrawal from Canada during the investigation did not negate jurisdiction, as the Order addressed both past conduct and prospective compliance.
Interpretation of “publicly available”
The Court held that it was reasonable to interpret the “publicly available” exception in PIPA and the PIPA Regulation to exclude social media. The Court accepted that privacy legislation warrants narrow exceptions to consent requirements. More generally, in considering the Canadian Charter arguments, the Court commented on an important principle of statutory interpretation of PIPA: the section 3 purpose statement indicates that the legislature sought to achieve balance, not to create a regime where privacy rights prevail over all others. As a result, the purpose statement mandates neither an expansive nor a restrictive approach to interpretation.
Charter subsection 2(b) infringement
However, the Court ruled that PIPA’s consent requirement unjustifiably limited freedom of expression, based on three main considerations:
- Expressive activity: The Court found that Clearview’s scraping facilitated expression (e.g., search results), thereby engaging Canadian Charter protection. It specifically rejected Alberta’s argument that the expression was not protected because it is commercial or motivated by profit.
- Overbreadth: The Court determined that the law’s blanket consent rule captured benign activities (e.g., search engines indexing public data), disproportionately restricting lawful expression.
- Minimal impairment: While protecting privacy was recognized as a pressing and substantial objective, the Court held that the law’s means were not minimally impairing. To address this, the Court tailored the remedy by broadening the “publicly available” exception to include public internet postings with no inherent privacy protections.
Reasonableness of Clearview’s purpose
The Commissioner’s finding that Clearview’s use of the images it collected lacked a “reasonable purpose” under PIPA was upheld. The Court agreed that indiscriminate scraping, biometric profiling, and the commercial resale of data did not meet PIPA’s reasonableness test. Clearview’s argument that its practices aligned with Canadian Charter values was rejected as untimely.
Enforceability of the Order
The Order was enforceable despite Clearview’s claim that it could not identify Alberta-specific data. The Court dismissed this argument as a “scrambled egg defense,” noting that Clearview could comply by adopting measures similar to its Illinois settlement. The Commissioner’s iterative compliance process was also found to be lawful.
Key takeaway
The Court struck a balance between privacy and freedom of expression: while PIPA’s overly broad consent requirement on the collection and use of publicly available information was deemed unconstitutional, Clearview’s specific practices were found to be unreasonable under PIPA. The decision clarifies that privacy laws must not stifle legitimate internet searches and indexing but can appropriately target unreasonable uses of personal data, such as using that information to create a facial recognition database.