
Privacy and biometrics/AI

Author(s): Kristian Brabander, Robert Carson, Tommy Gelbman, Jessica Harding, Craig Lockwood, Julien Morissette

Aug 9, 2023


Read the full edition: Privacy Jurisprudence Review


Enquête concernant le Centre de services scolaire du Val-des-Cerfs (anciennement Commission scolaire du Val-des-Cerfs), 2022-11-09, 1020040-S


Facts

This decision emanates from the oversight division of the Commission d’accès à l’information (CAI). The CAI launched an investigation into the Val-des-Cerfs school board, which had developed an algorithm, in partnership with a consulting firm, to identify Grade 6 students at significant risk of dropping out. The school board had developed a machine learning methodology that would analyze more than 300 types of raw data taken from a database of the students’ personal information and generate a set of predictive indicators of dropout risk (the Tool). The CAI’s decision following the investigation ruled on whether the organization had met its obligations under the Act respecting Access to documents held by public bodies and the Protection of personal information (the Access Act) in the collection and use of personal information during the development phase of the project.

Decision

First, the CAI determined that while the personal information was depersonalized to prevent the direct identification of the students and their parents, it was not anonymized as it was not irreversibly depersonalized, and therefore still allowed for identification of the students.

Second, the CAI found that, in the development of the Tool, the school board had used the personal information for a new purpose, contrary to section 65.1 of the Access Act. When the information was first collected, the students and their parents had not been informed of, and therefore had not consented to, the use of the information to generate predictive indicators of dropout risk. However, the CAI determined that the purpose for which the information was used was compatible with the school board’s objective of ensuring academic success.

Third, the CAI concluded that the Tool constituted artificial intelligence, as it was “a system whose purpose was to augment human work, capable of predictive analysis by a technological system involving algorithms.” Importantly, as a result of the analysis it performed, the Tool produced new personal information, namely, predictive indicators of the risk of dropping out, which the CAI determined amounts to a collection of personal information within the meaning of the Access Act.

In light of the determination that the school board had collected personal information in its development of the Tool, the CAI found that it had not abided by its obligations to inform the parents of the students about the ways in which the data was used. The CAI called on the school board to adopt security measures to ensure the protection of personal information collected, including procedures for its destruction, and to destroy the Tool’s existing output. It also called on the school board to proceed with a privacy impact assessment prior to the deployment of the Tool.

Key Takeaway

This decision of the oversight division of the CAI is the first in Québec to provide guidance on the CAI’s interpretation of several issues surrounding artificial intelligence. Specifically, the CAI’s definition of artificial intelligence and its determination that the production of predictive indicators constitutes a “collection” of personal information are novel. The latter determination in particular could subject organizations that use artificial intelligence to generate insights to the privacy law provisions applicable to the collection (and not merely the use) of personal information, including, notably, obtaining the consent of the individuals concerned. In certain circumstances, this decision may need to be taken into consideration in the interpretation of new section 65.2 of the Access Act and its private sector equivalent, section 12.1 of the Act respecting the protection of personal information in the private sector, which will enter into force in September 2023 and set forth new transparency rules for automated decision-making.

 

Situmorang v. Google LLC, 2022 BCSC 2052


Facts

The plaintiff sought certification of a class action against Google LLC for Google’s use of face grouping technology. The plaintiff alleged that Google did not obtain informed consent from the class members for use of the face grouping technology and used the facial biometric data of the class members for its own competitive advantage. The plaintiff advanced claims under both the B.C. Privacy Act (the Statutory Claim) and the common law tort of intrusion on seclusion (the Common Law Claim).

Decision

The Supreme Court of British Columbia refused to certify the action, finding that it was plain and obvious that both the Statutory Claim and the Common Law Claim could not succeed.

The Court found that it was plain and obvious the Statutory Claim could not succeed because it could not be established that Google’s conduct was a wilful violation of privacy or that Google lacked a claim of right to engage in the face grouping conduct.

In assessing the Common Law Claim, the Court was required to consider whether Google had invaded the plaintiff’s “private affairs or concerns.” The Court held that it was an open question whether a retained collection of facial biometric data may be information capable of implicating one’s “private affairs or concerns.” Despite this, the Court held that it was plain and obvious the Common Law Claim would fail because the plaintiff could not establish that an intrusion arising from Google’s use of face grouping would be considered highly offensive by a reasonable person.

Key Takeaway

The question of whether a retained collection of facial biometric data may be information capable of implicating one’s “private affairs or concerns” remains open. There is therefore a risk that organizations that collect and retain facial biometric data may be vulnerable to claims of common law intrusion on seclusion.

 
