Appeals Court Raises Questions Over Section 230 Law Giving Social-Media Companies Legal Immunity
The Philadelphia-based Third U.S. Circuit Court of Appeals ruled that a mother's lawsuit against TikTok over the "blackout challenge" could proceed, raising questions about Section 230 of the Communications Decency Act of 1996, the federal law that gives broad protections to tech companies that host user-generated content. The appeals court said TikTok could potentially be held responsible because it exercised editorial judgment, rather than merely acting as a content host, when its algorithm recommended blackout-challenge content on the girl's "For You Page."

Section 230 generally shields social-media companies from lawsuits over content posted by their users. The Communications Decency Act was originally written to keep children from accessing sexually explicit material online, and Section 230 was intended as a "Good Samaritan" provision, encouraging internet companies to proactively curate online activity and granting them immunity from lawsuits when they chose to block content.

The law has withstood repeated challenges. The Supreme Court essentially upheld Section 230 last month when it ruled that social-media companies are allowed to moderate content under the First Amendment. Changes to Section 230 could narrow the protections social-media platforms currently have.