Scroll into the Abyss
Google and Meta scrolled a bit too close to the sun.
The concept of the infinite scrolling feed in apps is so commonplace that it's easy to forget it's a relatively recent invention. It seems innocuous: just auto-load the next page of information instead of asking the user to click "next page."
Unfortunately for the average user, it is the perfect example of a dark pattern: a way to design apps such that people are more likely to exhibit the behaviors the designer wants. In this case, staying in the app and continuing to scroll.
More scrolling means more chances to show ads, more chances to collect user data, and less time users spend giving other apps the same treatment. This battle for user focus has been at the center of a now decades-long tug-of-war between social media companies and tech regulators, one that has seen almost all accountability simply slide off the backs of these tech giants.
'til now!
Several recent court cases in the US have slapped some meaningful outcomes on social media giants, including Meta and Google. Until recently, a lot of the lawsuits lobbed at social media companies have been grounded in the ongoing debate about "Section 230," a provision that essentially gives social media platforms cover from liability for things posted by users of the platform. The mixture of fear and uncertainty around changing Section 230 has turned it into a bit of an aegis for the tech giants to throw around whenever regulation is discussed.
But now, two big court cases have landed: one finding Instagram and YouTube to be addictive and harmful by design, and another finding that Meta's app design includes intentionally dangerous decisions that inadequately prevent child exploitation on the platform.
Here's a quote from the New Mexico Department of Justice, which brought the second case I mentioned:
The evidence presented at trial–which included internal Meta documents and testimony from former Meta employees, law enforcement officials, and New Mexico educators–established that Meta's design features enabled pedophiles and predators to engage in child sexual exploitation on Meta's platforms. Evidence from those witnesses and other industry experts also demonstrated that Meta intentionally designs its platforms to addict young people and, contrary to Meta's public commitments, expose them to dangerous content related to eating disorders and self harm.
Now, does this mean we're about to see the end of social media as we know it? No. And the fines handed down by the courts are a tiny drop in the bucket to these companies. The bigger headline here is that giants can bleed.
These were civil cases with direct penalties paid out to plaintiffs. That sets a precedent that opens up the road to dozens, if not thousands, of similar lawsuits. And it avoids the "Section 230" issue entirely, because it's not about what content users have posted on these platforms, it's about how the apps themselves are designed. Ever-so-tiny yet ever-so-meaningful distinction.
For now, just take a deep breath and enjoy the fact that somewhere, Mark Zuckerberg is a little bit miffed.