When social media companies suspended President Donald Trump’s accounts in the wake of the attack on the Capitol last week, many who believe Trump incited the violence cheered. And when Amazon, Google and Apple all cut off Parler, the social platform where many Trump supporters had connected before the riot, the company’s CEO and others claimed their free-speech rights had been abridged. Are tech companies obligated to provide a platform to all users who want one, and do the companies themselves have rights when it comes to choosing who does and doesn’t have access?
Daphne Keller is director of the program on platform regulation at Stanford’s Cyber Policy Center, and teaches at Stanford Law School. She told Texas Standard that while Parler, which is currently offline because Amazon’s AWS service will no longer provide server space, is unlikely to win a legal case against Amazon, it makes sense for advocates of broad access to social platforms to be concerned about what “deplatforming” could mean for other social networks or websites.
“We really do have a relatively small number of companies with the potential to act as gatekeepers to the public discourse that most people see,” Keller said.
Keller said that in more than 30 other cases in which sites sued after being denied access to internet infrastructure like web hosting or servers, the sites lost to the big tech companies.
Platforms like Twitter and Facebook, which have suspended Trump and many other accounts spreading conspiracy theories, have rights when it comes to what they allow their users to say, Keller said.
“The platforms themselves have their own free-expression rights to set editorial policy, and that’s one of the reasons that they win – and one of the reasons that even if Congress changed the legislation, there would still be a powerful argument that platforms have the right to set their own speech policies and kick off the speech that they don’t want to carry,” she said.
Keller says that because there are currently many places online where people can speak, the legal case for requiring specific platforms to carry speech they don’t want is weaker than it would be if one company held a monopoly on access.
“There might be one answer to that for Facebook and Twitter and YouTube, and a different answer for an infrastructure provider that’s deeper in the technological stack of the internet, and maybe more essential for offering services like your [internet service provider] or possibly Amazon Web Services,” Keller said.
Many conservatives who are angry about big tech’s influence, and the companies’ ability to remove or regulate user activity on their platforms, advocate repealing Section 230 of the 1996 Communications Decency Act. Section 230 protects tech platforms against being sued for certain kinds of speech published by their users, though it doesn’t protect them against claims over illegal speech. Keller says many misunderstand how Section 230 works.
“Congress’ goal in setting this up, which is spelled out in the statute, is to give platforms freedom to moderate content and to set rules – to take down inappropriate or offensive content even if that content is legal,” she said. “Congress wanted to create a world where there are a lot of places you can go to speak online, and they all have different rules set by their owners.”
Keller says that for those who are concerned about social platforms with monopoly power, there is potentially a greater risk that the telecommunications companies that provide internet access could become monopolistic gatekeepers.
“But all of it works by Congress passing big, complicated, careful laws, and then things being litigated to the Supreme Court for years on First Amendment questions,” she said.
Keller expects a Democratic-majority government will probably focus its regulatory efforts on taking down “bad stuff,” rather than forcing platforms to carry speech they find offensive or harmful.