Social media platforms have revolutionized the way people receive and deliver news and information. Industry players, legislators, and consumers of social media have all had to adapt to this new medium of speech. While social media posts share the permanence and public nature of traditional forms of news, such as newspapers, they are not subject to the same kinds of editorial review and control. The sheer volume and pace of social media posts have made it impractical for social media companies to exercise the same degree of content review as newspapers or television broadcasts.

Although this new environment has provided a robust avenue for free speech, it also creates legal risks because it is difficult to guard against illicit forms of speech. Social media companies face both business and legal risks in Canada where consumers use their platforms to spread illicit speech. To mitigate this risk, social media companies may need to consider monitoring and removing illicit content without taking on undue expense or undermining the benefits of the platform.

The Limits of Free Speech

As explained by the Supreme Court of Canada in R v. Sharpe, while free speech is integral to a democratic society and the self-fulfillment of individuals, it is not an absolute right and can be limited in certain circumstances. Laws limiting free speech are constitutional only if they are considered a reasonable limit, "as can be demonstrably justified in a free and democratic society," under section 1 of the Canadian Charter of Rights and Freedoms. For example, courts have found that free speech can be restricted where it constitutes defamation, obscene materials (e.g., child pornography), hate speech, fraud, or threats or harassment. Although newspapers and television programs can more easily review and prevent the publication of these forms of illicit speech, doing so is substantially more difficult for social media companies, which publish a great volume of speech at a much faster pace.

Consequences for Social Media Platforms

Although it may be difficult for social media companies to monitor the vast content on their platforms, they can potentially face legal liability in Canada for publishing illicit speech. Furthermore, providers of integral services that want to minimize their own legal risk may incorporate restrictions on illicit speech as conditions in their service agreements. As a result, a social media platform that is unable to monitor and remove illicit speech could lose vital service agreements in addition to potentially facing costly litigation. Examples of such integral services include web hosting, cloud-based services, and listings on application stores such as the Apple and Google app stores.

The attraction of social media platforms may be their capacity to allow almost anyone to express themselves freely at any time; however, legal restrictions on free speech still apply. Social media companies should be aware of the contractual terms of their key service agreements that relate to illicit speech, and should ensure that their monitoring and removal policies comply with those terms.