Facebook is coming off a rough two months, and deservedly so. The company earned a special place of distrust in the hearts of many: A CNN poll found that 3 out of 4 U.S. adults say Facebook is making American society worse.
In an October Senate hearing, former Facebook employee Frances Haugen made explosive allegations that the company’s own research documented the harms its site inflicts upon users. In other words, Facebook itself allegedly knew that its business harmed others in concrete and preventable ways, like promoting photo sharing that damages the mental health of young people, especially girls.
How has Facebook gotten away with it?
Part of the answer lies with Section 230 of the Communications Decency Act, the controversial federal law that essentially gives websites broad protection against liability for content posted by others. The law shields Facebook from the responsibility and liability of a traditional publisher.
Though a newspaper might be sued for libel over a defamatory article, Section 230 protects online platforms from liability for the content they distribute as long as they did not create it. In effect, Facebook has received a federal subsidy in the form of Section 230, which largely protects it from an important form of societal regulation: lawsuits.
Lawsuits bring issues into a public forum for scrutiny and discussion. In the absence of adequate regulation, the public depends upon private citizens to assert their rights and redress wrongs in court. When companies deploy new technology and business models, legislators and regulators are often slow to react. As a result, the legality of these new practices is often litigated — meaning they get debated by attorneys, reported by the news media and discussed by the public.
Social media companies have escaped these lawsuits mostly unscathed. For example, Facebook was sued by a victim of sex trafficking who had connected with her abuser through the site. In June the Texas Supreme Court dismissed most of her claims based on Section 230 immunity. In a different case, family members of victims killed by terrorist attacks sued Twitter, Facebook and Google, alleging that these companies provided material support to terrorist organizations. The 9th Circuit ruled (also in June) that most of the claims were barred by Section 230.
But there are grounds for civil liability lawsuits against Facebook outside the scope of Section 230. While Section 230 lets social media companies off the hook for harmful content posted by users, Facebook’s internal documents and Haugen’s Senate testimony suggest its business model and products are themselves harmful and addictive.
The “like” button and the endless scrolling feature may have negative consequences for mental and physical health by keeping users glued to their screens, as noted by tech insiders such as Tristan Harris and former Facebook executive Chamath Palihapitiya. The company’s product design also rewards misinformation. When Facebook overhauled its algorithm to increase user engagement, it boosted amplification of divisive and provocative content.
Facebook should further be held liable for misleading public statements about the nature of its products. For example, the company’s statements about the mental health benefits of social apps for young people glaringly omit its own internal research showing that Instagram use makes body image issues worse for 1 in 3 teenage girls.
Facebook’s products and what the company says about them should be fair game for product liability lawsuits.
Certainly Section 230 needs to be modified. Courts currently interpret it too broadly, treating it as blanket immunity even when the claims against a company are not based on publisher or speaker liability.
Legislative reform won’t happen fast, and accountability for Facebook shouldn’t have to wait. In addition to compensating injured victims, lawsuits serve another purpose: they can compel the famously evasive company to disclose more of what it knows about its own products.