Rage Against the Feed: Grassroots and Litigious Methods of Accountability for Social Media Platforms

Article by: Agha Sadaaf

All memorandums are for informational purposes only and do not constitute legal advice. Additionally, the memorandum does not create or intend to create a solicitor-client relationship between the reader and the initio Technology and Innovation Law Clinic.


In June 2023, the social media platform Reddit sparked outrage among its userbase when the company announced a restructuring of its Application Programming Interface (API) pricing model for third-party users. Prior to this change, Reddit allowed public use of its API to develop third-party applications and tools built on the website’s software and infrastructure. The new billing model, however, would require developers of third-party Reddit applications to pay exorbitant costs to maintain their services. The developer of the Apollo Reddit application, for example, shut down the highly popular third-party app within weeks of the announcement, as continued operation under the new pricing would have cost him $20 million annually.

In response to these changes, almost 7,000 subreddits protested by making their public communities private or restricting access for 48 hours. After this period, many communities extended their protest over the following weeks by maintaining their restrictions or by allowing Not Safe for Work (NSFW) content on their subreddits in order to damage Reddit’s advertising revenue. Because Reddit is highly dependent on ad revenue, has reportedly been considering an IPO, and cannot serve advertising on subreddits tagged NSFW for adult content, this incendiary method of protest led Reddit to intervene directly by removing protesting moderators, who work on a purely volunteer basis, from their positions. Before the end of the month, the Reddit Blackout of 2023 was quashed.

This raises the question: how might social media platforms be held accountable for unpopular and sometimes even harmful decisions? In post-Blackout debrief threads on Reddit, users flagged the time-limited nature of the protests, the fragmented nature of subreddit collective action, and Reddit’s control over its own systems (enough to remove protesting moderators from their positions outright) as the reasons why the Blackout ultimately failed. As the number of protesting subreddits shrank after the initial 48 hours, and different subreddits employed different levels of protest, it was simply a matter of the company waiting out the dissent and actively intervening against the remaining protestors who were hurting ad revenue. Reflecting on the impacts and failures of the Reddit Blackout, an unprecedented instance of user-generated protest against an online social media platform, future online resistance movements might learn a thing or two about how online collective action can play out.

Bottom-up, user-generated protests are not the only form of accountability for social media giants, however. Since the inception of social media in the mid-2000s, the law has engaged with the social impacts of these platforms and the harms they often cause. Since the 2010s, several iterations of legislation have been passed in Canada and in the US to regulate online harms such as defamation, cybercrime, sexual violence online, and the non-consensual distribution of intimate images. Litigation, too, appears to be taking an increasing role in holding social media platforms accountable.

While some methods of accountability, such as the Reddit Blackout, are unprecedented in their user-driven tactics, others are unprecedented in their scale. In October 2023, the State of California and 32 other states (well over half the Union) filed a joint claim against Meta Platforms Inc. (the corporate owner of Facebook, Instagram, and WhatsApp) for violating the federal Children’s Online Privacy Protection Act (COPPA) Rule, as well as a host of state consumer protection laws, by actively engaging in misrepresentative practices to encourage addictive social media use in children for profit. As of this writing, the claim is still before the courts and a decision has yet to be rendered, so the efficacy of this unprecedentedly large-scale litigation remains to be seen.

That is not to say that litigation has not been a useful accountability tool for platform users in the past. If decisions like the Ontario Labour Relations Board’s ruling in CUPW v. Foodora Inc. are any indication, litigation has been very impactful in regulating harmful platform activity; in that case, the company’s classification of its gig workers as independent contractors was dismantled in favour of recognizing them as dependent contractors.

As massive social media platforms become increasingly integrated into our lives and our social institutions, a close eye must be kept on the different methods of accountability emerging in response to unpopular and harmful activities undertaken by powerful corporate entities, and on the role the law can play in these measures.
