The film Demolition Man follows a tough-as-nails police officer (Sylvester Stallone) brought out of suspended animation to pursue a violent criminal (Wesley Snipes), an old nemesis who is unleashing chaos in the non-violent society of San Angeles.
The virtually crime-less San Angeles is maintained through a tech panopticon of surveillance, cultural conformity and social control. There is even a literal “speech police,” an AI that automatically gives people tickets for using foul language. Most people have acclimated to this tech-utopia (or dystopia, depending on your point of view), but those who don’t want to be controlled live underground in order to maintain their freedom of speech and choice.
The leading rebel Edgar Friendly (Denis Leary) delivers a passionate monologue defending personal liberty. He says, “I want high cholesterol. I want to eat bacon, butter and buckets of cheese, okay? I want to smoke a Cuban cigar the size of Cincinnati in a non-smoking section. I wanna run through the streets naked with green Jello all over my body reading Playboy magazine. Why? Because I suddenly might feel the need to.”
Many of the most popular dystopias in science fiction are concerned with the tyranny of control, restriction, and censorship. From 1984 to Brave New World to The Matrix to Minority Report, and even to Star Wars, creators imagine that dystopia will result from powerful aristocracies restricting freedoms of speech, thought and association.
In America, there is a libertarian impulse toward unfettered free speech and choice. Discourse about censorship is longstanding, but modern technologies have lately amplified it. Arguments have flared up over topics like cancel culture, Google’s alleged bias against conservatives, and whether Twitter should have temporarily restricted Donald Trump Jr.’s account after he posted a video claiming masks are unnecessary. These issues are muddy, but people across the political spectrum tend to assert the merits of an unrestricted marketplace of ideas.
But what if dystopia could also come from openness?
It’s a counterintuitive (blasphemous, for some) notion, since freedom is foundational in maintaining a liberal society. The reflex of many Americans is to defend it as an a priori value. But if Americans don’t wrestle with dystopias of openness, we may unwittingly create something more akin to a marketplace of disinformation. By this I mean a culture of perpetual discussion where it becomes less and less likely that bad ideas can ever successfully be debated out of society. In such a cacophony, disinformation not only spreads, but is incentivized.
One argument for the marketplace of ideas is that good ideas will eventually win out. The best evidence will ostensibly settle debates and become facts. In a way, the unstated power in the marketplace of ideas is that it “cancels” bad information. But in a truly open society, there’s also a risk that bad ideas — conspiracy theories, pseudoscience, eugenics, and more — are amplified and adopted by a growing number of people.
Every new communication technology has sparked similar conversations. Whether it’s television, radio, or Twitter, each technology carries both gifts and curses. To quote the cultural theorist Paul Virilio, “When you invent the ship, you also invent the shipwreck.”
What’s different about today is that through digital technologies like the internet and social media, bad ideas now have global reach. Though tech creates the potential for the spread of good ideas as well, the online commodification of ideas allows the worst ideas not only to survive in the marketplace but to thrive in it.
The more an idea is worth, the harder it is to push it out of the marketplace. Metrics such as views and shares give ideas worth, regardless of their veracity. And since the content that gets the most engagement is that which triggers strong emotions, social media algorithms amplify arguments that provoke rage and sow division. This creates competing interests within tech companies, torn between penalizing free speech that hurts society and rewarding free speech that engages users.
For example, anti-vaccination content shared on Facebook could make it harder for public health officials to contain the spread of COVID-19. Health misinformation has received almost four times as many Facebook views as information from reliable sources. The site’s algorithms are funneling users towards disinformation.
The possibility for bad actors to profit from the spread of bad ideas is expanding. Since users can retain anonymity on large, free platforms, the entry costs (both economic and social) of dealing in bad ideas are decreasing. Aided by echo-chamber effects, bad actors (who can deflect criticism by crying “censorship”) not only prosper, but can find each other and form coalitions that have real effects on society.
There are many examples of such effects: a gunman fired a rifle inside a Washington, D.C. pizzeria after reading false claims of children trapped there in a sex-slave ring; US political candidates openly support the QAnon conspiracy theory; and military officials in Myanmar used Facebook to promote hate speech against the Rohingya Muslim minority. Those who argue that there have always been conspiracy theorists or that these instances are the regrettable costs of freedom are not taking the problem and its complexity seriously.
Writer James Baldwin once said, “a complex thing cannot be made simple.” By having a more nuanced discussion, we can consider the concerns of both free speech advocates and tech critics who call for more content moderation. When extolling the virtues of the marketplace of ideas, we should consider how markets actually work—particularly in the context of social media. And though it goes against the libertarian impulse of Silicon Valley, those engaged in discourse about open debate need to do the hard, uncomfortable, but more foundational work of negotiating where the boundaries are.
But in order to get there, we need to imagine that we could one day live in (and maybe do live in) a dystopia where nothing can be cancelled. It could be a science fiction dystopia just as bad as the ones we tend to imagine.
This piece contains edits from Amy Nordrum of MIT Technology Review.
Previously Published on Medium