TechBuzz welcomes the contributions of writers from Utah's tech sector. Today we feature Peter Christianson of Clearlink who shares his insights on the Third Circuit decision on the TikTok case and its implications for online platforms.
Amid its higher-profile litigation and cringeworthy congressional hearings, TikTok was stripped of one of its most important legal protections—and that could have major implications for the biggest sites on the Internet.
In August 2024, an appeals court overturned a lower court’s ruling that TikTok was protected by Section 230 and thus couldn’t be sued over content posted by its users. The loss of these protections will have a huge impact on internet users everywhere and will change the way that companies like Google, Facebook, and Amazon operate.
What is Section 230?
Section 230 of Title 47 of the United States Code, often referred to as just “Section 230,” is a 1996 law that protects online expression by shielding online platforms from liability for their users’ speech. In other words, if someone posts something harmful on Facebook, you can’t sue Facebook. The person who posts content is held responsible, not the service that hosts the content.
The internet as we know it today was built on Section 230. YouTube would not exist if it had to manually review millions of videos every day to avoid being sued. The very idea of Twitter would have given any lawyer a heart attack. Even Google would have to ensure that the content it displays was pre-approved before anyone else saw it.
What we would have been left with would have been a centralized, highly-curated medium more like broadcast television than the internet of today. If you wanted to start a blog, you’d have to either host your own web server or pitch it to a bigger site like a writer pitching a sitcom to a television network.
There’s a reason Section 230 has been called “the twenty-six words that created the internet.” If it were overturned, not only would video platforms like TikTok and YouTube be sued out of existence, but so would any site that allows users to upload text or images. Everything from eBay auctions to dating apps would be too risky for any tech company to touch.
That’s why it’s always concerning when there are legislative or legal challenges to Section 230.
The TikTok Case
The lawsuit in question came about after some TikTok users began posting videos in which they choked themselves until they lost consciousness, dubbing it the “Blackout Challenge.” The videos showed up in children’s feeds, and several children died trying to participate in the trend. The parents sued TikTok in 2022, but the district court found that TikTok wasn’t liable for the videos its users posted.
The plaintiffs appealed this decision to the U.S. Third Circuit Court of Appeals. Writing for the panel, Judge Patty Shwartz reversed the lower court’s decision, finding that the TikTok algorithm is an expressive product and therefore not protected by Section 230, which only shields platforms from liability for content posted by third parties, not for a platform’s own expressive conduct.
Judge Shwartz’s decision builds on the Supreme Court’s ruling in July in Moody v. NetChoice, a challenge to Florida and Texas laws that aimed to control the moderation policies of social media platforms, which held that a platform’s content moderation and curation are expressive activity protected by the First Amendment. If a platform’s algorithm is protected speech, the reasoning goes, then the platform is also responsible for that speech.
Live by the algorithm, die by the algorithm
This ruling could be earth-shattering for the various tech companies that rely on recommendation algorithms to drive engagement and profits. The most severely impacted, of course, is TikTok. Not only does this mean it can be sued for damages in this case, but TikTok’s value as a platform comes in no small part from its algorithm. YouTube would still work without recommendations, but it’s pretty hard to search for specific videos on TikTok because they don’t even have names. Most people find everything through their For You Page, which has been wildly successful.
Many other platforms have tried to copy TikTok’s success with products like YouTube Shorts and Instagram Reels, but even more conventional platforms rely heavily on their recommendation algorithms. At one point, Netflix reported that 80% of the content its members watch comes from recommendations. Other media empires like Spotify and Amazon also rely on recommendations to keep you on their sites.
And in this sense, I think that the Third Circuit’s decision has merit. Upholding Section 230 is still foundational to preserving the free and open internet, but we shouldn’t let companies try to hide their bad behavior beneath the same umbrella.
Users deserve basic consumer protections
Social media platforms are notorious for building deceptive patterns into their software to keep you from leaving. For example, pressing the “back” button when using TikTok doesn’t take you out of the app as you might expect; it just shows you another video. These design patterns aren’t some third-party creations the site is hosting—they were carefully and intentionally designed by the platforms’ developers to achieve a desired outcome.
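To make the pattern concrete, here is a minimal sketch of the “back button just plays another video” behavior. Everything here is hypothetical—the class and method names are illustrative, not TikTok’s actual code—but the logic is the point: the back action is wired to the same code path that advances the feed, so the user can never navigate away.

```python
# Hypothetical sketch of the "back never exits" dark pattern.
# Names and structure are assumptions for illustration only.
from collections import deque

class EndlessFeed:
    """Simulates a feed where pressing back advances instead of exiting."""

    def __init__(self, videos):
        self._queue = deque(videos)
        self.current = None
        self.next()  # immediately start playing the first video

    def next(self):
        # Serve the next video, or nothing if the queue is empty.
        self.current = self._queue.popleft() if self._queue else None
        return self.current

    def on_back_pressed(self):
        # A neutral design would leave the feed here; this pattern
        # instead treats "back" as one more way to serve content.
        return self.next()

feed = EndlessFeed(["dance", "cooking", "cats"])
# The user presses back expecting to exit -- another video plays instead.
feed.on_back_pressed()
```

The design choice being criticized is exactly this wiring: an interface control that users understand to mean “leave” is silently repurposed to mean “stay.”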
As consumers, we shouldn’t have to put up with platforms actively trying to make our online experiences worse. A researcher at the University of Massachusetts Amherst is currently suing Meta to force the company to allow its users to opt out of its algorithm. Meanwhile, Utah lawmakers are looking to ban algorithmic feeds for minors altogether, following similar legislation in New York.
While questions of software design and algorithms might seem like niche technical topics, they’re a huge part of our daily lives. To paraphrase Langdon Winner, the design of a system provides a convenient means of establishing patterns of power and authority. If we want to live in a society where our rights and values are protected, we need to be able to hold people accountable for the algorithms they use to interact with us.