Social media platforms like Facebook and Twitter are private entities even though we may think of them as similar to public “marketplaces of ideas.” A panel of three experts on law and media discussed this and other social media issues in a Chicago Humanities Festival conversation this week.
This was the second in the CHF’s three-part Deep Dive: Speech, curated and hosted by Geoffrey R. Stone, University of Chicago law professor. The first event in late March observed the 50th anniversary of the Pentagon Papers with a discussion on national security and press freedom.
Joining Stone for this week’s lively and well-informed conversation were lawyer/journalist Emily Bazelon (New York Times Magazine and Slate Political Gabfest) and Monika Bickert, head of global policy management for Facebook. Bickert manages a global team that oversees the policies for types of content that can be shared on Facebook and how advertisers and developers can interact with the site.
Stone noted that while the First Amendment allows a lot of what some might call “bad speech,” it has “self-correcting mechanisms” in the classic free marketplace of ideas. (More speech can counteract bad speech.) Newspapers and broadcast TV stations are liable for their content, even the opinions of readers stated in letters to the editor. Social media, on the other hand, are shielded from such lawsuits by Section 230 of the 1996 Communications Decency Act, which immunizes platforms from liability for any content placed on their sites by users.
Taking his first stab at the regulation question, Stone reminded us of the early days of radio when there were limited frequencies; the Federal Communications Commission was created in 1934 to grant licenses for the use of the public airwaves and to regulate broadcasting. The same situation applied in the early days of television when there were limited frequencies, and the FCC followed the radio pattern by granting licenses and regulating broadcast TV stations. When cable TV came online in the 1980s, there were no frequency limits and no FCC regulation.
Social media platforms have been immune from regulation because they are private business entities and because “First Amendment jurisprudence is very speech-protective,” Stone said. Thus any possible regulation of speech is a complex question.
Bickert said that only 6 percent of content on Facebook is political and only 4 percent of the content in an average user’s newsfeed is political. She pointed out that 90 percent of the Facebook audience is outside the U.S., many in countries that have very different laws and policing of speech. Bazelon mentioned that this week the government of India asked Twitter to censor criticism of Prime Minister Modi’s Covid policies; Twitter complied, taking down dozens of tweets.
Bickert described the types of speech that her team monitors on Facebook. (She said that Facebook is the only company that works with independent fact checkers.) Hate speech is the easiest to manage, she said, because they have a list of words that are banned in that category. See the Facebook Community Standards page for their policies. (A number of my friends have landed in “Facebook jail” for using such terms or quoting someone else who used them.) Misinformation is harder, she said, because fact checkers may have a difficult time determining the difference between truth and falsehood. Government guidance would be helpful here, she said. The third category—protected speech—can be polarizing speech but must be handled carefully. She also said that evolving situations, such as the Covid crisis, require flexibility as data changes.
Bazelon emphasized that even though social media platforms may seem like public spaces, they are more like private shopping malls where the owners control any protests or free speech events. (Also, as an aside, many activists don’t understand that “my First Amendment rights” only apply to speech regulation by government agencies—federal, state or local—not by a private business. If Twitter bans you, don’t complain that your First Amendment rights were violated.) There’s anxiety about any regulation of speech, Bazelon said, and it’s important to ask the tough questions about why speech might be regulated. In terms of social media policies, she suggested it would make sense to treat public figures with millions of followers differently from the guy next door with 300.
Stone posed the question about why we should trust Facebook, Twitter and YouTube to make decisions about speech dangers and risks. “Is there a way to put a check on disinformation on these private platforms? I don’t trust the government to do it either.” Perhaps we need an agency like the FCC to do this, he said. (My question: Would such an agency be made up of private citizens, appointed by the government and industry organizations, or would it be more of a quasi-government agency like the FCC, FDA, SEC, etc.?)
Editor’s note: The iconic term “free marketplace of ideas” derives from John Milton’s 1644 Areopagitica: A Speech of Mr. John Milton for the Liberty of Unlicenc’d Printing, to the Parlament of England and from John Stuart Mill’s 1859 book On Liberty.
The CHF program on social media and democracy runs about one hour. You can view the session here. Part 3 of Deep Dive: Speech will address hate speech at 7pm on Thursday, May 27. The event is free; registration is required.
Did you enjoy this post? Please consider supporting Third Coast Review’s arts and culture coverage by becoming a patron or making a donation. Choose the amount that works best for you, and know how much we appreciate your support!