US Section 230 shields website platforms from liability for third-party content. Though effective at its inception, backlash against the law and the protection it gives Big Tech has been growing. As public opinion and political support for changing Section 230 increase, the question is not if, but when, Section 230 will become a relic of the past. The case of Hepp v. Facebook might be the landmark decision that turns the tide.
Daniella Vanova, 30 May 2022
Social-media platforms provide services and benefits to their users. Peaceful – and not so peaceful – protests throughout the world are organized on these platforms. Whether it be the BLM movement, anti-COVID-restriction demonstrations or Fridays for Future, social media has proven to be a powerful tool for spreading messages and garnering support for a cause. COVID-19 has amplified this trend toward digitalization.
In the US, social media giants like Facebook, Twitter and Instagram are legally shielded with respect to third-party content by Section 230 of the Communications Act of 1934, enacted as part of the Communications Decency Act of 1996 (CDA). At the time, Section 230 was deemed “the most important law protecting internet speech”. Its objective was to foster a friendly online environment while allowing providers to remove harmful content.
Since the inception of Section 230, social media has evolved drastically. Today it is also used to spread fake news and incite violence. To what degree should Instagram be held responsible for shaping our ideals of the male and female body? Should platforms such as Facebook and Twitter be liable when they are used for illicit trafficking in humans, weapons, and endangered wildlife? Section 230 did not foresee these developments. It is time to reform Section 230 in order to guarantee the freedom of the internet, protect users, and hold Big Tech responsible for what happens on their platforms.
As a governing piece of legislation, Section 230 has served as the guiding principle for online intermediary liability in the United States. It comprises two key subsections. According to the Congressional Research Service, Section 230(c)(1) specifies that service providers and users may not “be treated as the publisher or speaker of any information provided by another information content provider.” Section 230(c)(2) states that service providers and users may not be held liable for voluntarily acting in good faith to restrict access to “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable material.” When a claim is brought, US courts ask three questions to determine whether Section 230(c)(1) applies:
- Is the defendant a provider or user of an interactive computer service?
- Does the plaintiff seek to hold the defendant liable as a publisher or speaker?
- Does the plaintiff’s claim arise from information provided by another information content provider?
If the answer to all three questions is “yes,” Section 230(c)(1) bars the claim and the defendant faces no liability. In other words, Big Tech companies are shielded from liability for third-party content even as they control and moderate posts as they see fit.
The case of Karen Hepp v. Facebook could mark a turning point against Section 230. Karen Hepp, mother of three, professional newscaster and host of FOX 29’s Good Day Philadelphia, filed a lawsuit in 2019 against Facebook because her photograph was used without her consent in an advertisement for the dating app FirstMet. Although a judge dismissed her claim in 2020, siding with Facebook, which successfully claimed immunity under Section 230, she won her appeal: the Third Circuit held that her state-law right-of-publicity claim fell within Section 230’s exception for intellectual property law. In March 2022 Facebook lost its motion to dismiss. The case is currently in the discovery phase, and if the plaintiff prevails, it could become a landmark decision expanding the liability of Big Tech.
Lawmakers from both sides of the aisle are proposing amendments to clarify the law and strengthen the protections it provides. Former US President Donald Trump and current President Joe Biden have both expressed interest in amending or revoking Section 230, though for different reasons: Republicans want to combat censorship of conservative voices, while Democrats favor making Big Tech liable for disinformation and harmful content. In 2018, President Trump signed the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA), which had passed with overwhelming support in both chambers of Congress. FOSTA carved out an exception to Section 230’s liability shield for content that violates federal sex-trafficking laws.
In February 2021, the Information Technology and Innovation Foundation (ITIF), a nonprofit, nonpartisan research and educational institute, published a report on proposals to reform Section 230. Suggestions outlined in the report include establishing carve-outs from Section 230 based on company size or for certain types of content or activity; requiring online services to comply with a notice-and-takedown regime; expanding federal criminal and civil enforcement; and establishing a “good faith” requirement to prevent bad actors from taking advantage of Section 230(c)(1)’s liability shield.
Frances Haugen, whistleblower and former Facebook employee, proposed reforming Section 230 so that platforms would remain shielded for user-generated content but become responsible for the choices their ranking algorithms make, rather than being judged on individual posts. Under this approach, the question of liability would shift from third-party content to Big Tech’s own algorithmic decisions. In response to Haugen’s documentation of Facebook’s algorithms, a group of Democratic members of Congress introduced the Justice Against Malicious Algorithms Act in October 2021. The bill would open platforms to lawsuits over certain content promoted by their recommendation algorithms. Democratic Representative Mike Doyle of Pennsylvania said: “Under this bill … Section 230 would no longer fully protect social media platforms from all responsibility for the harm they do to our society.”
As public opinion and political support for changing Section 230 increase, the question is not if, but when, Section 230 will become a relic of the past. The case of Hepp v. Facebook might be the landmark decision that turns the tide.