A Pre-Forum Primer for “Who Should Make the Rules? The Question of Platform Regulation”

Since the fall from techno-utopian grace following the 2016 US Presidential Election, platform companies have been trying to figure out how to tackle hard questions around content moderation, governance, and fair rule-making. These challenges are sometimes addressed issue by issue, topic by topic, or case by case, but fundamental questions remain about who should be making the rules governing these platforms in the first place. The Forum @ Civic Hall, “Who Should Make the Rules? The Question of Platform Regulation,” on October 24 at 6 PM addresses some of these questions at the 30,000-foot level as well as up close, using specific examples. Crystal Patterson, Facebook’s Head of Global Civic Partnerships, will represent the company, and Harold Feld, Senior Vice President of Public Knowledge, will also speak. Jessica Dheere of Ranking Digital Rights at New America will speak to best practices for setting global standards and incentives for companies to respect users’ rights, free expression, and safety.

This primer is designed to get you up to speed on some of the main intellectual currents around platform regulation that will be discussed at our Forum. While this piece focuses on some of Harold Feld’s work and Facebook’s efforts, it also offers a few core themes.

A “Digital Platform Act?”

Harold Feld offers a comprehensive, market-based approach to platform regulation that bridges arguments around competition, content moderation, consumer protection, and public safety. You can read his e-book, The Case for the Digital Platform Act: Breakups, Starfish Problems, & Tech Regulation, as well as the executive summary. If you prefer video, watch his Personal Democracy Forum 2019 talk on the subject. Feld’s book offers a new definition of platforms: internet-based companies where the public can act as both consumers and creators, allowing the platform to reap powerful network-effect benefits. He measures a platform’s dominance by its digital “cost of exclusion,” the harm users suffer when they are shut out of the platform. When market players become too dominant, they can engage in anti-competitive and even anti-consumer behaviors. By creating this measure, Feld offers Congress a market-based way to regulate in a variety of areas, focused on curbing bad structural behavior rather than relying on antitrust alone.

He suggests the creation of a new government agency, the “Digital Commerce Commission,” to carry out his proposed “Digital Platform Act,” which considers the unique qualities of data as both a valued asset and a form of speech and information.

Examples of Platform Governance by Platforms: The Facebook Oversight Board

Facebook recently released a post on its Oversight Board’s structure. Company CEO Mark Zuckerberg likens the board to a “Supreme Court”: it makes decisions about tricky content moderation scenarios that cannot easily be adjudicated by Facebook officials alone. The Board is an attempt to democratize Facebook’s governance, but it has received significant criticism. Kate Klonick, a professor at St. John’s University Law School, has written extensively about how this Oversight Board could work. In a New York Times op-ed, she points out that while the Oversight Board has promise, it lacks the guiding principles that a constitution typically provides to other “supreme court” bodies. It could also simply disperse and mitigate risk to the company by having outsiders make hard decisions.

Facebook’s Oversight Board is one of a variety of programs designed to tackle the challenge of fair platform governance, and Facebook is not alone. In its Combating Hate and Extremism post, Facebook describes actions taken in response to New Zealand’s Christchurch Call to Action, including a nine-point industry plan co-developed with other tech giants on how to respond to terrorist and violent extremist content on their platforms. Yet as Farzaneh Badiei from Yale Law School points out, one of the challenges is the lack of a globally agreed-upon definition of terrorism, which makes implementing the plan’s “User Reporting of Terrorist and Violent Extremist Content” provision difficult.

Various Proposals for Regulation

There are other approaches and concerns when it comes to platform regulation. This is not an exhaustive list by any means, but it does cover some of the core ideas:

    • Studies and critiques of platform governance: Tarleton Gillespie of Microsoft Research has written an excellent book, “Custodians of the Internet,” that outlines some of the main issues facing platform governance. (You can also check out this excellent reading list on content moderation compiled by Microsoft Research.) University of Virginia professor Siva Vaidhyanathan has written “Anti-Social Media,” a “history of the last ten minutes” of Facebook and its impact on democracy.
    • The human impact of content moderation: “Behind the Screen” by Sarah T. Roberts, a professor at the University of California, Los Angeles, goes through the emotional and psychological toll on contract content moderators as they try to carry out the speech policies of larger platforms.
    • Homogeneity of the tech workforce and the impact on platform functionality: There are arguments that diversifying the tech workforce can help mitigate some of the structural problems of platforms. The biases inscribed in policies, in the coders of algorithms, and in moderators themselves can yield prejudiced platforms, designed for sensational click-bait, that sometimes privilege racist, xenophobic, and extremist content. These ideas are outlined in University of Southern California Professor Safiya Noble’s “Algorithms of Oppression.”
    • Censorship and Section 230 critiques: Senator Josh Hawley (R-Missouri) recently released his “Ending Support for Internet Censorship Act,” which, in order to end perceived censorship of conservatives on platforms, would revoke Section 230 immunity from tech platforms. Some have criticized this policy for 1) exaggerating the problem of conservative censorship, 2) giving the government too much power to decide what is and is not biased political speech, and 3) misunderstanding Section 230 of the Communications Decency Act. Section 230 is sometimes mistaken to require political “neutrality” from platforms (it doesn’t) and to define the difference between a platform and a publisher (it doesn’t). What it actually does is grant platforms immunity from responsibility for what their users publish on their sites, as opposed to publishers of newspapers or blogs, who can be held liable for things like hate speech.
    • Free speech on private platforms: American constitutional instincts for free speech shape the expectations of users who do not want their speech abridged by powerful interests. Yet the First Amendment does not apply to private companies, despite the cultural desire of many who view platforms as a “public sphere.” Legal thinkers like Danielle Citron, a law professor at Boston University Law School, have discussed tricky issues relating to when one’s free speech might infringe on others’ safety and privacy. The UN has been getting involved in coordinating policy around free speech and safety online (particularly in regimes where critical political speech can mean death). In his book, “Speech Police,” UN Special Rapporteur on Free Expression David Kaye writes about the closed-door discussions among governments, platforms, journalists, and dissidents around speech, privacy, security, and expression.
    • Antitrust approaches: Some consider antitrust law a tool to limit the consolidated power of platform companies. This includes Columbia Law professor Tim Wu (The Curse of Bigness: Antitrust in the New Gilded Age). Lina Khan, a fellow at Columbia Law, has written a now-classic work, “Amazon’s Antitrust Paradox,” which reinterprets antitrust so that it can be used for more than just guaranteeing short-term consumer welfare, taking market health and market dominance into account as well. Some, like Harold Feld, consider antitrust a “starfish” problem: breaking up big platforms is like cutting the limbs off a starfish; you aren’t fixing the marketplace, just regenerating more big starfish.
    • Multi-stakeholderism as a regulatory mechanism: There have been calls for multi-stakeholder governance, inspired by ICANN and other Internet governance bodies. These include proposals by David Morar, a visiting scholar at the Elliott School of International Affairs, and Forums Director Danielle Tomson.