Should private platforms engage in censorship?
January 30, 2023 on Drew DeVault's blog

Private service providers are entitled to do business with whom they please, or not to. Occasionally, a platform will take advantage of this to deny service to a particular entity on any number of grounds, often igniting a flood of debate online regarding whether or not censorship in this form is just. Recently, CloudFlare pulled the plug on a certain forum devoted to the coordinated harassment of its victims. Earlier examples include the same service blocking a far-right imageboard, or Namecheap cancelling service for a neo-Nazi news site.

In each of these cases, a private company providing some kind of infrastructure for the Internet elected, voluntarily and without a court order, to unilaterally terminate service for a customer or class of customers. Absent from these events was any democratic or judicial oversight.

When private companies choose with whom they do or do not do business, this is an exercise of an important freedom: freedom of association. Some companies have this right limited by regulation — for instance, utility companies are often required to provide power to everyone who wants it within their service area. Public entities are required to provide their services to everyone — for instance, the US postal service cannot unilaterally choose not to deliver your mail. However, by default, private companies are generally allowed to deny their services to whomever they please.1

Are they right to?

An argument is often made that, when a platform reaches a given size (e.g. Facebook), or takes on certain ambitions (e.g. CloudFlare), it may become large and entrenched enough in our society that it should self-impose a role more analogous to a public utility than a private company. Under such constraints, such a platform would choose to host any content which is not explicitly illegal, and defer questions over what content is appropriate to the democratic process. There are a number of angles from which we can examine this argument.

For a start, how might we implement the scenario called for by this argument? Consider one option: regulation. Power companies are subject to regulations regarding how and with whom they do business; they must provide service to everyone and they are not generally allowed to shut off your heat in the cold depths of winter. Similarly, we could regulate digital platforms to require them to provide a soapbox for all legally expressible viewpoints, then utilize the democratic process to narrow this soapbox per society’s mutually-agreed-upon views regarding matters such as neo-Nazi propaganda.2

It’s important when making this argument to note that regulation of this sort imposes obligations on private businesses which erode their own right to free association; radical free speech for individuals requires radical curtailing of free association for businesses. Private businesses are owned and staffed by individuals, and requiring them to allow all legal forms of content on their platform is itself a limitation on their freedom. The staff of a newspaper may not appreciate being required by law to provide space in the editorials for KKK members to espouse their racist philosophy, but would nevertheless be required to typeset such articles under such an arrangement.

Another approach to addressing this argument is not to question the rights of a private business, but instead to question whether or not they should be allowed to grow to a size such that their discretion in censorship constitutes a disruption to society due to their scale and entrenched market position. Under this lens, we can suggest another government intervention that does not take the form of regulation, but of an application of antitrust law. With more platforms to choose from, we can explore more approaches to moderation and censorship, and depend on the market’s invisible hand to lead us true.

The free speech absolutist who makes similar arguments may find themselves in a contradiction: expanding free speech for some people (platform users) requires, in this scenario, curtailing freedoms for others (platform owners and staff). Someone in this position may concede that, while they support the rights of individuals, they might not offer the same rights to businesses who resemble utilities. The tools for implementing this worldview, however, introduce further contradictions when combined with the broader political profile of a typical free speech absolutist: calling for regulation isn’t very consistent with any “small government” philosophy; and those who describe themselves as Libertarian and make either of these arguments provide me with no small amount of amusement.

There is another flaw in this line of thinking which I want to highlight: the presumption that the democratic process can address these problems in the first place. Much of the legitimacy of this argument rests on the assumption that the ability for maligned users to litigate their grievances is not only more just, but also equal to the threat posed by hate speech and other concerns which are often the target of censorship on private platforms. I don’t think that this is true.

The democratic and judicial processes are often corrupt and inefficient. It is still the case that the tone of your skin has an outsized effect on the outcome of your court case; why shouldn’t similar patterns emerge when de-platformed racists are given their day before a judge? Furthermore, the pace of government intervention is generally insufficient. Could Facebook petition a court for the right to remove the Proud Boys from its platform faster than they could organize an attack on the US Capitol building? And can lawmakers keep up with innovation at a pace sufficient to address new forms and mediums for communicating harmful content before they’re a problem?

We should also question if the democratic process will lead to moral outcomes. Minorities are, by definition, in the minority, and a purely democratic process will only favor their needs subject to the will of the majority. Should the rights of trans people to live free of harassment be subject to the pleasure of the cisgendered majority?

These systems, when implemented, will perform as they always have: they will provide disproportionately unfavorable outcomes for disadvantaged members of society. I am a leftist: if asked to imagine a political system which addresses these problems, I will first imagine sweeping reforms to our existing system, point out that the free market isn’t, lean in favor of regulation and nationalization of important industries, and seek to empower the powerless against the powerful. It will require a lot of difficult, ongoing work to get there, and I imagine most of this work will be done in spite of the protests of the typical free speech absolutist.

I am in favor of these reforms, but they are decades away from completion, and many will disagree on the goals and their implementation. But I am also a pragmatic person, and when faced with the system in which we find ourselves today, I seek a pragmatic solution to this problem; ideally one which is not predicated on revolution. When faced with the question, “should private platforms engage in censorship?”, what is the pragmatic answer?

To provide such an answer, we must de-emphasize idealism in favor of an honest examination of the practical context within which our decision-making is done. Consider again the status quo: private companies are generally permitted to exercise their right to free association by kicking people off of their platforms. A pragmatic framework for making these decisions examines the context in which they are made. In the current political climate, this context should consider the threats faced by many different groups of marginalized people today: racism is still alive and strong, what few LGBT rights exist are being dismantled, and many other civil liberties are under attack.

When someone (or some entity, such as a business) enjoys a particular freedom, the way they exercise it is meaningful. Inaction is a form of complicity; allowing hate to remain on your platform is an acknowledgement that you favor the lofty principles outlined in the arguments above over the problems enumerated here and the realities faced by marginalized people today. A purely moral consideration thus suggests that exercising your right to free association in your role as a decision-maker at a business is a just response to this status quo.

I expect the people around me (given a definition of “around me” that extends to the staff at businesses I patronize) to possess a moral compass which is compatible with my own, and to act in accordance with it; in the absence of this I will express my discontent by voting with my feet. However, businesses in the current liberal economic regime often disregard morals in favor of profit-oriented decision making. Therefore, in order for the typical business to behave morally, their decision-making must exist within a context where the moral outcomes align with the profitable outcomes.

We are seeing increasing applications of private censorship because this alignment is present. Businesses depend on two economic factors which are related to this issue: access to a pool of profitable users, and access to a labor pool with which to develop and maintain their profits. Businesses which platform bigots are increasingly finding public opinion turning against them; marginalized people and moderates flee to less toxic spaces, and staff members look to greener pastures. The free market currently rewards private censorship; therefore, in a system wherein the free market reigns supreme, we observe private censorship.

I reject the idea that it is appropriate for businesses to sideline morality in favor of profit, and I don’t have much faith in the free market to produce moral outcomes. For example, the market is responding poorly to the threat of climate change. However, in the case of private censorship, the incentives are aligned such that the outcomes we’re observing match the outcomes I would expect.

This is a complex topic which we have examined from many angles. In my view, freedom of association is just as important as freedom of speech, and its application to private censorship is not clearly wrong. If you view private censorship as an infringement of the principle of free speech, but agree that freedom of association is nevertheless important, we must resolve this contradiction. The democratic and judicial processes are an enticing and idealistic answer, but they are flawed processes that may not produce just outcomes. And if asked to use these tools to address this question, I would present solutions from a socialist perspective, which may or may not jibe with your sensibilities.

Nevertheless, the system as it exists today produces outcomes which approximate both rationality and justice, and I do not stand in opposition to the increased application of private censorship under the current system, flawed though it may be.


  1. There are some nuances omitted here, such as the implications of the DMCA “safe harbor” provisions. ↩︎

  2. Arguments on other issues also call for regulating digital platforms, such as addressing the impact that being binned by Google without recourse can have on the quality of life of users who depend on Google’s email services. Some nuance is called for; I will elaborate on this in future posts. ↩︎
