Canada’s new Online Harms Act (C-63): what you need to know

Mar 1, 2024

Joining the growing global trend towards social media regulation, the Government of Canada introduced its long-anticipated online harms legislation, Bill C-63, on February 26. The bill proposes to create a new statute, the Online Harms Act (the OHA or the Act), and to amend other existing statutes, including the Criminal Code and the Canadian Human Rights Act.

Online Harms Act

The OHA would impose three broad duties on social media platforms, namely:

  • a duty to act responsibly by implementing measures to adequately mitigate the risk that users will be exposed to harmful content
  • a duty to protect children by integrating design features respecting the protection of children
  • a duty to make non-consensually distributed intimate images (NCDII) and child sexual abuse material (CSAM) inaccessible within 24 hours

The new framework would create a Digital Safety Commission to administer and enforce the Act, a Digital Safety Ombudsperson to support social media users and advocate for the public interest in relation to online safety, and a Digital Safety Office to support the Commission and the Ombudsperson.

Social media, adult content and live streaming services

Regulated services include social media services, adult content services and live streaming services. The Government of Canada has indicated that the Act would apply to platforms such as Facebook, Pornhub and Twitch. However, the precise application of the OHA will not be known until regulations are promulgated. The regulations would take into account factors such as the number of users on a platform or whether there is a “significant risk that harmful content is accessible on the service.”

Notably, a service is not a social media service under the Act if it does not enable a user to communicate content to the public (i.e., a potentially unlimited number of users not determined by the user). Further, the duties under the Act do not apply to private messaging features on platforms.

Duty to act responsibly

The OHA would require platforms to implement measures to mitigate the risk that users will be exposed to seven categories of harmful content: content that induces a child to harm themselves, content that is used to bully a child, content that foments hatred, content that incites violence, content that incites violent extremism or terrorism, NCDII and CSAM. 

As part of the duty to act responsibly, platforms would be required to take steps such as

  • publishing accessible and easy-to-use user guidelines that include user standards of conduct and descriptions of compliance measures vis-à-vis harmful content
  • providing users with tools to block other users from finding or communicating with them on the service
  • implementing tools and processes to flag harmful content, including notices to users who flagged content and users who communicated the content that was flagged
  • labeling certain harmful content artificially amplified through third-party automated means such as bots or bot networks
  • making a resource person available to support users with concerns about harmful content and OHA compliance measures, as well as ensuring that the resource person is easily identifiable and that their contact information is easily accessible
  • preparing a digital safety plan that meets prescribed disclosure requirements (e.g., how the platform complies with the Act, statistics on the moderation of harmful content and an inventory of electronic data), submitting the plan to the Digital Safety Commissioner and making it publicly available in an accessible and easy-to-read format

Duty to protect children

The OHA would require platforms to protect children by integrating age-appropriate design features into their services. The specific requirements would be set out in regulations respecting child protection design features and could include account options for children, parental controls, privacy settings for children and other age-appropriate design features.

Duty to make NCDII and CSAM inaccessible

The Act contemplates 24-hour take-down obligations triggered by platforms identifying or receiving flags about CSAM or NCDII, which are paired with due process requirements (i.e., notice and appeal processes).

Platforms would not generally be required to proactively search for harmful content. However, regulations made under the OHA may require platforms to use technological means to prevent CSAM and NCDII from being uploaded.

Duty to keep records

Platforms would be required to keep all records, including information and data, that are necessary to determine whether they are complying with their duties under the Act.

Access to inventories and electronic data

Under the Act, the Digital Safety Commission would be able to give accredited persons (i.e., persons who conduct research or engage in education, advocacy or awareness activities related to the purposes of the Act) access to the inventories of electronic data included in digital safety plans submitted by platforms to the Commission, and could order platforms to give access to such data.

Remedies

The OHA would empower individuals to

  • make submissions to the Digital Safety Commission respecting harmful content or a platform’s compliance with the Act
  • make complaints to the Commission relating to NCDII or CSAM

Administration and enforcement

The OHA would grant the Digital Safety Commission extensive enforcement powers to

  • investigate complaints by, among other things, summoning persons to give evidence on oath and produce records, and receiving evidence without regard to whether it would be admissible in a court of law
  • hold hearings in connection with complaints relating to NCDII or CSAM, as well as any other matters relating to a platform’s compliance with the Act
  • verify or prevent non-compliance with the Act by authorizing designated inspectors to, among other things and subject to statutory restrictions, enter any place in which they have reasonable grounds to believe that there is any document, information or other thing relevant to verifying or preventing non-compliance
  • make compliance orders requiring platforms to take, or refrain from taking, any measure to ensure compliance with the OHA, where the Digital Safety Commission has reasonable grounds to believe that an operator is contravening or has contravened the Act

Under the OHA, compliance orders made by the Commission could be enforced by the Federal Court.

AMPs and offences

The OHA would also introduce significant monetary penalties for non-compliance (subject to a due-diligence defence):

  • administrative monetary penalties for operators of up to the greater of 6% of gross global revenue or $10 million
  • fines for offences of up to the greater of 8% of gross global revenue or $25 million

The Digital Safety Commission could also publish notices of violations and undertakings, naming the applicable parties.

Cost recovery 

The OHA contemplates the recovery of costs incurred by the Digital Safety Commission, Digital Safety Ombudsperson or Digital Safety Office from regulated social media platforms. Future regulations would provide details on the fees platforms will be required to pay and any exemptions.

Amendments to existing statutes

Criminal Code and Canadian Human Rights Act

Bill C-63 introduces significant changes to the Criminal Code, including a new definition of “hatred,” a new hate crime of “offence motivated by hatred” carrying a maximum sentence of life imprisonment, and stronger sentences for existing hate propaganda offences (including a maximum sentence of life imprisonment for advocating or promoting genocide).

It also amends the Canadian Human Rights Act to add the “communication of hate speech” by means of the Internet or any other means of telecommunication as a discriminatory practice. The amendments would give individuals the right to bring a related complaint before the Canadian Human Rights Commission and would authorize the Commission to assess penalties of up to $50,000. These amendments focus on public communications by users and do not apply to private communications (e.g., direct messages), nor to operators of social media services, broadcast undertakings, telecommunication service providers, or intermediaries hosting, caching or indicating the existence or location of hate speech.

An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service

Bill C-63 introduces amendments clarifying the definition of an “Internet service” to include services providing Internet access, providing Internet content hosting, and facilitating interpersonal communication over the Internet, such as email services. It also makes changes to the mandatory notification process by requiring any person providing an Internet service to the public to send all notifications to a law enforcement body designated by regulations and by extending the preservation period for data related to an offence.

Osler’s Online Harms Series

Stay informed about Bill C-63 and gain insights into the OHA’s impact by following the upcoming installments of Osler’s Online Harms Series.