This post is co-written with Akshaya Kamalnath[*]
One of us (Akshaya) recently visited the Postal Museum in Washington DC. Looking at the historical development and role of the postal services in the US brought to mind our modern forms of communication—social media platforms—and their value, especially in terms of free speech. We often associate free speech with the press but, as a quote by Nat Hentoff in the Postal Museum informs visitors, it was the post that brought news to the press and then brought newspapers to the public.
Today, social media platforms like Facebook, Twitter and Google (YouTube) play a role similar to that of the postal service by acting as intermediaries for communication. They are, in a sense, the high-tech descendants of the postal services. The post physically transports letters and parcels from one person to another, while Facebook electronically transmits speech that one person wants to convey to others. The tech platforms have just made it easier to convey messages to a number of people at once. Tech platforms also help transmit news content—just like the post delivers newspapers. In fact, the use of postal services to deliver newspapers was considered the most important information technology in the late 1700s.
As lawmakers talk about regulating speech on social media platforms, a comparison with postal services is instructive. The postal service is not required, or even allowed, to scrutinize people’s mail and decide whether or not to deliver it. So why should its technologically more advanced relatives have to identify and remove misinformation or statements alleged to be “hate speech”? Of course, social media can be used to commit crimes, including hate speech as defined in the criminal law of some countries, such as Canada. The post has collaborated with law enforcement where necessary to investigate fraud and other criminal activities, and social media companies should do the same. They should obviously comply with court orders if someone is found to have committed a crime. The issue is whether they should be expected to engage in preventive enforcement.
The further question of whether we should require these tech platforms to serve all users equally, as the postal service is expected to, is more complicated. This is because the dominant postal service is usually run by the state, while tech platforms like Facebook are run by corporations in the private sector. While we can ask a state-run enterprise to provide services to all equally, more thought needs to be given before private enterprises are held to the same standard. Yet government regulation is being considered because, among other things, there are complaints about the spread of what activists deem to be “hate speech”, and also complaints about the silencing of conservative voices on social media.
Overall, we have to tread carefully with government regulation. In addition to interfering with the platforms’ freedom of expression and association, heavy-handed regulation would make it harder for new and, at least initially, smaller players to enter the social media market, even though such market entry ought to be the real solution to the concerns about the existing platforms’ behaviour. If regulation becomes burdensome, it is highly unlikely that university students will create the next Facebook or Google from their dorm rooms or garages.
The history of the postal services also serves as a warning against government-backed monopolies, which Facebook and the few other social media giants could in effect become under such regulation. It is telling that the Postal Museum makes no mention of Lysander Spooner, who tried to set up a private postal service in 1844.
Spooner argued against state monopoly over the postal service, saying:
The present expensive, dilatory and exclusive system of mails, is a great national nuisance—commercially, morally and socially. Its immense patronage and power, used, as they always will be, corruptly, make it also a very great political evil.
He added (referring to the US Constitution’s First Amendment protection for free expression) that
any law, which compels a man to pay a certain sum of money to the government, for the privilege of speaking to a distant individual, or which debars him of the right of employing such a messenger as he prefers to entrust with his communications, “abridges” his “freedom of speech”.
Although Spooner’s business was eventually forced to close by a tightening of legislative protections for the government post’s monopoly, it had the temporary impact of bringing down the cost of postal services.
Government regulation requiring Facebook and other social media platforms to set up a complex decision-making system to enforce restrictions on what messages they can be used to convey will increase the cost of operating such platforms. Any new platform will be required to spend heavily on human moderators, artificial intelligence systems capable of assisting them, or, likely, a combination of the two. While established platforms like Facebook will not find it difficult to invest in complying with such regulations, the cost will be prohibitive for outsiders who want to set up competing social media platforms. This may explain why Mark Zuckerberg, Facebook’s CEO, is in favour of government regulation.
Heavy regulation of speech on social media also runs the risk of governments using social media to their political advantage—a modern version of the political abuses of the power over the transmission of ideas that Spooner denounced, which we are already seeing in some countries. In France, it emerged that the President’s office circulated a doctored video on social media, despite the President himself being committed to censorship of “fake news”. In Austria, a politician asked Facebook to take down a post calling her a “lousy traitor of the people”, a “corrupt oaf” and a member of a “fascist party”, none of which amounted to hate speech under Austrian law.
The converse possibility, regulation requiring Facebook and other platforms to host all users irrespective of their opinions, would also be problematic, because it would infringe the platforms’ ability to hold and act on their own views, as well as to provide an environment in which they think their customers will be happiest. Just as restaurants may ask misbehaving patrons to leave so others can enjoy their dinner, social media platforms should be able to decide where to draw the line so that a large majority of their users are able to enjoy the platform. Or, to return to the postal analogy, suppose a private delivery company insisted on reading the letters or examining the content of the packages we wanted it to deliver for us, and refused to deliver those it deemed morally objectionable. The appropriate response for a person who did not want his or her letters read, or who submitted to the exercise and had a letter rejected, would be to go to a competitor—or to establish one, as Spooner did—rather than to force his message on a party unwilling to deliver it. Similarly, when Facebook or other online platforms set out standards for the type of content and members they will allow, they make specific choices as private actors, and should be free from government interference.
All this is not to say that the large social media platforms should do nothing to address the problems associated with their use. Companies like Facebook are under pressure from their shareholders and consumers. Facebook’s shareholders recently demanded a change in management because the current management had not dealt with misinformation and hate speech. Even though Mark Zuckerberg holds the majority voting power in the company, the shareholder proposals convey a message. Facebook’s management is aware of the market pressures and has taken a number of measures, including releasing its public content-moderation rules and proposing an independent body to hear appeals of Facebook’s content-moderation decisions. (That said, presumably, the independent body would still be working under guidelines that Facebook has drafted or at least agrees with.)
While not perfect, these are voluntary responses to market sentiment against problems of misinformation and censorship that big social media companies have chosen to invest in. Facebook’s taking such measures does not preclude a new company from starting a modest platform without having to invest in these systems at the outset. As they grow, new competitors could devise their own solutions on different principles, rather than having to follow a pattern imposed by legislation that was not only enacted at Facebook’s suggestion but, quite possibly, drafted on the basis of its proposals. Just as new courier companies have differentiated themselves from postal services through GPS tracking, expedited delivery or convenient package pick-up options, new social media companies may exploit gaps, especially if the big social media companies exclude certain views from their platforms.
[*] Dr Akshaya Kamalnath is a corporate and insolvency law scholar. She is currently teaching at Deakin University, but will be joining the Auckland University of Technology Law School shortly. You can read her papers here, and follow her on Twitter.