The internet has been an incredibly powerful tool for enabling users to share a wide variety of content. Some have argued that large online platforms like Facebook and Twitter have too much power over what speech can be heard. This argument often leads to calls to change Section 230, a critical legal protection that enables content moderation and shields platforms from liability for their users’ content, or is cited as evidence of these platforms’ “monopoly” power. Using antitrust enforcement to address concerns about content moderation is unlikely to produce the desired policy changes and could set a dangerous precedent of abusing antitrust enforcement for purposes unrelated to competition. Policymakers should not ignore the important role Section 230 plays in enabling a dynamic market for services that host user-generated content, well beyond the social media context. History has shown that it is often hard to predict which small company will prove to be an innovative and disruptive success, but a framework including Section 230 that allows new entrants to start with minimal regulation is the most likely to yield a competitive marketplace and benefit consumers.

By Jennifer Huddleston1

 

I. INTRODUCTION

In January 2021, several social media platforms banned then-President Donald Trump’s accounts. Similarly, Amazon Web Services’ cloud hosting and Apple’s and Google’s app stores removed the social media app Parler for violating their terms of service.