Supreme Court Ruling Could Prompt Safer Digital Spaces
At Promly, we are focused on harnessing technology’s power to serve humanity’s best interests rather than simply creating profits for Big Tech. I’m particularly proud of our efforts to partner with Congress to address harmful content, such as YouTube videos that promote or assist suicide.
There is important news on this front: next week the Supreme Court will hear oral arguments in a case that could transform the internet as we know it, ultimately creating safer digital spaces for us to grow and connect.
The case was filed by the family of a woman killed in a 2015 terrorist attack in Paris. The family claims that YouTube and its parent company, Google, knowingly hosted and “recommended” thousands of radicalizing ISIS videos to users. Google argued that its actions were protected by Section 230, a sweeping 1996 law that shields web and social media companies from legal liability for content posted by their users.
In prior proceedings, Google’s position was upheld by both a federal district court and a U.S. Court of Appeals. By agreeing to hear the case, the Supreme Court signaled the justices’ interest in this landmark law, which remains a vital shield protecting Big Tech from countless lawsuits. It also gives powerful companies free rein to host and promote dangerous content, including self-harm videos, without any liability.
When Section 230 was enacted, social media and the web in general were brand new. Early platforms did not widely track, target, or manipulate their users’ activity. Today, however, their entire business models are built around doing exactly that, and tech giants like Google, Facebook, YouTube, and TikTok seem to abuse the privileges of Section 230 to increase their profits. So how can we, as a society, come together and harness the power of technology for the good of all? I believe the best path forward is compromise.
Rather than repealing Section 230 or making broad changes that could stifle the free expression of ideas, Congress could reform the law to preserve free expression while holding companies accountable. Its liability shield should protect content that a provider has no role in promoting, but not without limits. Liability protection could be removed for content that violates a site’s own rules for posting — for example, rules that prohibit targeted harassment, bullying, self-harm, incitement of violence, or doxxing.
Additionally, Section 230 could be amended to require sites to be transparent about their content moderation activities and to give users proper notice of changes to those policies. Free speech must be protected from moderation driven by the politically motivated views of a site’s management team or employees.
I also think it’s important to spell out which content companies won’t be liable for. For example, what happens if a social media company recommends a post about longboarding, and a teenager who views the post is seriously injured riding a longboard? Can her family sue the platform? A good outcome would clearly define which types of promoted content create liability — for example, libel or incitement to violence and self-harm.
All of us at Promly will be closely watching this case. If the Supreme Court upholds Section 230, then Congress must do the important work of reforming the law to balance free speech and personal safety by holding tech companies accountable for specific, narrowly defined content they censor or promote.