An interesting article in MIT Tech Review by Sahar Massachi, a former member of Facebook’s integrity engineering team, “How to save our social media by treating it like a city”, makes some very good points about the now-familiar problems of social networks, problems that derive fundamentally from a design that allows, or even encourages, abuse.
Massachi asks what would happen to our cities if we could adopt or steal identities or create zombie-like armies of non-existent people. Sure, in the real world we can change our identity, but doing so entails a long and tedious administrative process and is a one-off move. On social networks, by contrast, it is possible to generate thousands of fake accounts based on stolen identities that can be used for all kinds of fraudulent or even criminal activities.
There has been flagrant abuse of the rules social networks initially established; what’s more, it has benefitted the companies that created those social networks, and has been driven by them. Facebook, despite its theoretical real-name policy, never did anything to counter the activities of vast bot factories typically located in low-wage countries, because their activity made Facebook look even bigger. Similarly, Twitter never did anything to prevent its users from generating and using multiple accounts, arguing their right to anonymity, and rarely acting to eliminate fake accounts and bot factories; and when it spent more than two months suspending more than a million fake accounts a day, it was punished by the markets for its declining user numbers.
If unethical behavior by users drives companies’ profits, there is little incentive to tackle it. Content moderation doesn’t fix the problem either, because the fault lies in a design that allows misuse.
The solution, according to the article, is to introduce friction into the system, as happens in city life. It is possible to create a second identity, but it is not easy. Forcing spammers, bot factories and other wrongdoers to jump through hoops would work as a deterrent, or at least increase their operational costs to a prohibitive level. Spam is cheap and simple to create because it is frictionless: there are no physical barriers, as there were with the old mail shots sent out through the post.
While the article makes many good points, it doesn’t address the question of how to make companies that do not design their social networks properly responsible for the problems they create. There are no penalties for companies whose networks are misused, meaning they have an incentive to design them in ways that allow misuse. What’s more, this allows companies to simulate faster growth, even if that growth is based on false premises. We need to work out how best to sanction companies that effectively encourage misuse of their networks. Ethics in the ecosystems in which millions of people interact cannot be left to voluntarism.
Comparing cities to social networks is an interesting approach, even if we are still a long way from changing how companies design their business models. But at least it allows us to better understand how social networks have developed into what they now are.
This post was previously published on Enrique Dans’ blog.