In April 2021, our post “Is social media regulation on the way?” discussed the accelerating global proposals to hold social media giants accountable for the potentially dangerous content that appears on their sites. Specifically, in the US there is renewed momentum to reform Section 230 of the 1996 Communications Decency Act (47 U.S.C. § 230), which has long been used to shield social media platforms from liability for content posted by third parties.

Recent developments

On October 3, 2021, Facebook whistleblower Frances Haugen further ignited this call for reform. Previously a Facebook product manager, Haugen disclosed internal Facebook research concluding that Facebook’s algorithms and business practices are dangerous to public health—particularly with respect to youths and teens. Among the issues Haugen disclosed were findings that Facebook contributed to body-image issues among teenage girls, enabled human trafficking, and exacerbated COVID-19 vaccine misinformation.

Subsequently, Haugen testified before a Senate subcommittee to urge reform of Section 230. Because Section 230 applies only to third-party content, Haugen opined that holding Facebook liable solely for its users’ content would not be enough to encourage Facebook to change its current practices. She instead suggested that lawmakers revise Section 230 to hold Facebook accountable for its algorithms, which determine how user content is ranked on its website. Generally, she said, this would motivate Facebook to eliminate algorithms that repeatedly show users harmful and inflammatory content. Facebook (now Meta) disagreed.

Section 230(d) states:

(d) Obligations of interactive computer service

A provider of interactive computer service shall, at the time of entering into an agreement with a customer for the provision of interactive computer service and in a manner deemed appropriate by the provider, notify such customer that parental control protections (such as computer hardware, software, or filtering services) are commercially available that may assist the customer in limiting access to material that is harmful to minors.

Social media platforms, like Facebook, are thus already required to notify customers that parental control protections are commercially available to limit minors’ access to harmful material. Proponents argue that a regulation requiring ranking algorithms not to display harmful material to any user would be analogous.

Legislative proposals

Initiatives along the lines Haugen suggested are not new. In February 2021, the Safeguarding Against Fraud, Exploitation, Threats, Extremism and Consumer Harms (SAFE TECH) Act was introduced in the US Senate (S. 299), along with numerous other legislative proposals. Some of these proposals would eliminate protections for social media platforms entirely, while others, like the SAFE TECH Act, aim merely to limit or refine those protections. See “Is social media regulation on the way?” for more details.

Looking ahead

The fate of the February 2021 legislative proposals, as well as any changes to Section 230 resulting from Haugen’s testimony, remains to be seen. Section 230 reform has attracted rare bipartisan support, which is encouraging for those who favor change. President Biden has also expressed displeasure with the current state of social media liability standards.

In anticipation of potential changes to Section 230, all internet platform providers should review their current policies for monitoring third-party content. Section 230 currently protects providers from liability for actions taken in good faith to restrict access to or availability of material the provider considers objectionable. How would your policies and procedures have to change if any of these proposals become law?

Special thanks to Lindsay Weinstein for assisting with the drafting of this post.