But the case adds to a growing wave of lawsuits challenging tech companies to take more responsibility for their users’ safety – and arguing that past precedents should no longer apply.
The companies have traditionally argued in court that one law, Section 230 of the Communications Decency Act, should shield them from legal liability for the content their users post. But lawyers have increasingly argued that the protection should not shield companies from punishment for design choices that promote harmful use.
In one case filed in 2019, the parents of two boys killed when their car smashed into a tree at 113 mph while recording a Snapchat video sued the company, saying its “negligent design” decision to allow users to imprint real-time speedometers on their videos had encouraged reckless driving.
A California judge dismissed the suit, citing Section 230, but a federal appeals court revived the case last year, saying it centered on the “predictable consequences of designing Snapchat in such a way that it allegedly encouraged dangerous behavior.” Snap has since removed the “Speed Filter.” The case is ongoing.
Congress has voiced some interest in passing more-robust regulation, with a bipartisan group of senators writing a letter to Snap and dozens of other tech companies in 2019 asking what proactive steps they had taken to detect and stop online abuse.