Big changes are coming to social media. No, we are not referring to the rise of “super apps” or Facebook’s retreat from facial recognition. This time, the changes are coming from the top down in Canada, and businesses would be wise to take note.
Canada’s Minister of Heritage has completed the first round of consultations on a new social media regulatory regime. The proposed rules and regulatory structures aim to eliminate “Harmful Content” posted on social media. To achieve this, “online communication service providers,” or “OCSPs,” will be subject to mandatory reporting and response obligations mirroring Germany’s “NetzDG” law. In addition to the rules, three new regulatory bodies will be created to develop policy and enforce the regime. For businesses, this could mean increased compliance costs, potential fines, and a new set of regulators overseeing the previously unregulated social media space.
Who will be affected?
The new regulations will target businesses that provide an “online communication service”. This means services whose primary purpose is to enable users to communicate with one another. Facebook, Twitter, and other household names in social media are examples of companies that may be regulated should the legislation be adopted.
Less certain is the treatment of smaller websites, some of which derive a significant portion of their traffic from user-created content and communication. The current guidance states that services such as “travel review websites” will not be affected, but it is unclear where the regulations will draw the line.
Reporting, response, and preservation obligations
Under the new rules, all OCSPs will have a positive obligation to take reasonable measures to identify Harmful Content and make that content inaccessible to Canadians. Harmful Content refers to:
- terrorist content;
- content that incites violence;
- hate speech;
- non-consensual sharing of intimate images; and
- child sexual exploitation content.
In addition to monitoring for and preventing access to Harmful Content, OCSPs will need to respond to flagged content within 24 hours of a report. A 12-month preservation period and a mandatory reporting obligation are also contemplated.
When an OCSP decides to remove content, it will need a specific process through which users and content creators can appeal the decision. Small businesses and start-ups will likely feel this burden the most, particularly as they begin to scale up. Without ample resources to dedicate to content review, they may unwittingly find themselves offside the new regulatory requirements.
The new regulatory bodies and penalties
The proposed legislation would create the following three new regulatory bodies:
- “Digital Safety Commissioner” (the “Commissioner”)
- “Digital Recourse Council of Canada” (the “Recourse Council”)
- “Advisory Committee”
The Commissioner and the Advisory Committee will develop policy and enforce the new legislation. The Recourse Council will give users and content creators an independent avenue to appeal content decisions made by OCSPs, and will make binding determinations as to whether content is harmful under the new legislation.
The Commissioner will have the power to impose significant penalties. The current proposals contemplate administrative penalties of up to $10 million or 3% of an entity’s gross global revenue. Quasi-criminal prosecution is also on the table, with potential fines of up to $25 million or 5% of gross global revenue. In extreme circumstances, the Federal Court would further be empowered to order telecommunications providers, such as Internet Service Providers (ISPs), to block part or all of an offending website, rather than merely ordering the removal of the offending content.
Takeaway
Established and emerging businesses may want to consider building compliance systems into their operations early on. Speaking with counsel can help clarify best practices and prepare your organization for future regulatory developments.