The Most Undervalued Skill of the 21st-Century Economy
The internet depends on people willing to do this work. Are we willing to pay for it?
This is the penultimate installment of Strange New Work, a special series exploring the future of work through the lens of speculative fiction. Find the whole series here.
Exactly one year ago, The Verge posted an article simply titled: "Welcome to hell, Elon."
In it, editor-in-chief Nilay Patel outlines the mess that Elon Musk bought when he bought Twitter. Patel also argues that Musk's personal relationship with the platform he now owns might make it a bigger mess than it already was.
The piece is incisive and cutting. The perspective is perhaps not an unexpected one, but Patel offers a piercing analysis of the many contradictions in Musk's stated goals, vision, and guiding principles for the platform. One part of that analysis really stood out to me, though:
"The essential truth of every social network is that the product is content moderation, and everyone hates the people who decide how content moderation works. Content moderation is what Twitter makes — it is the thing that defines the user experience. It’s what YouTube makes, it’s what Instagram makes, it’s what TikTok makes. They all try to incentivize good stuff, disincentivize bad stuff, and delete the really bad stuff."
I've thought a lot about social media from the supply side over the last few years.
I've considered what standards I want to set for myself in the way I create and share on social platforms. I've dug into the weird labor relations that exist between creators and platforms. And I've thought about the emotional and hermeneutic labor that's required to manage niche platforms, like private communities.
I've spent less time thinking about the demand side of social media—largely because I think there is a lot of agreement that what we see and experience sucks. I also wonder how much of the current social media sphere is really what's "in demand" and how much of it is just an addictive way to hoover up data that can be reconfigured and resold. In other words, how do we even think about "demand" in a market that is consciously manipulating our attention for profit?
For all that thinking, I hadn't considered what Patel argues is the "essential truth"—that the product of every social media company is content moderation. On the surface, we might think of their product as the code that builds the user interface. Or, we might think of it as the advertising system. We might, if we're trying really hard, think of the product as the complex web of property rights that creates the vector these companies profit from.
But I think Patel is right. Any way you slice it, the product depends on people paying attention. And to get people to pay attention, their feeds have to be full of stuff that, on some level, they want to see. Or, at the very least, their feeds have to be free of the kind of stuff that makes people close the app.
For this and many other reasons, I believe content and community moderation is one of the most vital and most undervalued skills of the 21st-century economy.
In this penultimate installment of Strange New Work, I want to take a close look at the skill of moderation, its role in our evolving tech futures, and the politics that complicate this essential work.
Let's start with a fiction that feels a bit more real than reality.
Keep reading, or listen on your favorite podcast app.
A House Divided in Content Cannot Stand
Neal Stephenson is a futurist and speculative fiction writer who crafts prescient tales about how technology impacts our lives. In his 2019 book Fall; or, Dodge in Hell, he imagines an uncanny future America in which the nation is functionally divided in two. In the parts of the map that light up blue on election night, the United States continues on more or less as we know it today. One reviewer referred to these areas as "truth-based communities."
But rural areas—and most of the middle of the country—are known as "Ameristan." In Ameristan, the rule of law breaks down such that social order is directed by militaristic reactionary Christian groups. The culprit? Misinformation.
In Stephenson's imagining, the internet becomes a fully immersive information environment. It's content, specifically who sees what, that divides society. And who sees what is largely a product of class. Honestly, it's not that different from our current reality—just turned up to eleven. Reading back through the passages that describe the divided American reality, I couldn't help but see the specter of inevitability out of the corner of my eye.
In Stephenson’s future America, the way people deal with the flood of misinformation, disinformation, obscenity, and violence that gushes through networks is with the help of professional editors. Instead of relying on a feed provided by an individual platform, people buy into different sorts of moderation schemes to get access to the "flume" of content they want and to avoid the "flume" of content they don't want:
Direct, unfiltered exposure to said flumes—the torrent of porn, propaganda, and death threats, 99.9 percent of which were algorithmically generated and never actually seen by human eyes—was relegated to a combination of AIs and Third World eyeball farms, which was to say huge warehouses in hot places where people sat on benches or milled around gazing at stuff that the AIs had been unable to classify. They were the informational equivalent of the wretches who clambered around mountainous garbage dumps in Delhi or Manila looking for rags. Anything that made it past them—any rag that they pulled out of the garbage pile—began working its way up the editorial hierarchy and, in rare cases, actually got looked at by the kinds of editors—or more likely their junior associates—who worked for people like Sophia. Consequently, Sophia almost never had to look at outright garbage.
That editorial hierarchy? Well, buying in towards the top is expensive. The very wealthy hire their own editors who not only sift through the flume of incoming content but also curate the outgoing data that their clients create simply by going about their days. The vast majority of people either chip in to hire an editor for a small group or resort to buying a subscription to a feed that's curated to a particular set of interests, politics, and worldview.
The editor you hire or the feed you subscribe to has a huge influence over what you believe is true. One of Stephenson's characters mentions how his family suffered because of bad editing:
There’s a whole subtree of cousins who went off the rails because they went in together on a bad editor who ended up mainlining Byelorussian propaganda into their feeds. We lost a whole branch of the family, basically. So my mom in particular is super sensitive about this.
Today's Internet Editors
In Stephenson's future, truth, privacy, and a modicum of psychological safety don't come cheap.
Whether you realize it or not, we already live in an information ecosystem that's dominated by editors. The high-end ones, of course, still have jobs at the New York Times, CNN, and The Atlantic. We actually call them editors, and their job is to decide what gets written up, commented on, and shared with those who opt in to read their content. To my knowledge, editors like these haven't made the leap to working directly for the ultra-wealthy. But please let me know if you have a tip!
Our social media feeds also have teams of paid editors.
In this case, Stephenson wasn't speculating. These paid editors really do work in "Third World eyeball farms."
How is it that your Instagram or TikTok feed isn't full of vile garbage? You might assume that it's the mythic algorithm. There is some combination of code and artificial intelligence that keeps the bad stuff away from the baby pictures, webinar announcements, and funny cat videos that you ostensibly want to see. And that is true to an extent. But it's not the whole truth.
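To make that division of labor concrete, here is a minimal, hypothetical sketch of how this kind of triage is often described: an automated classifier handles the content it is confident about, and everything it can't place gets pushed into a queue for a human reviewer. The thresholds, names, and scoring function below are my own invention for illustration, not any platform's actual system.

```python
from dataclasses import dataclass, field
from collections import deque

@dataclass
class Post:
    post_id: str
    text: str

@dataclass
class ModerationPipeline:
    """A toy triage pipeline: the model handles the confident calls,
    and everything it can't classify lands in a human review queue."""
    remove_threshold: float = 0.95   # hypothetical: auto-remove above this score
    approve_threshold: float = 0.05  # hypothetical: auto-approve below this score
    human_queue: deque = field(default_factory=deque)

    def score(self, post: Post) -> float:
        # Stand-in for a real classifier; returns a "likely harmful" score in [0, 1].
        return 0.5

    def triage(self, post: Post) -> str:
        score = self.score(post)
        if score >= self.remove_threshold:
            return "removed"              # the machine is sure: delete it
        if score <= self.approve_threshold:
            return "approved"             # the machine is sure: let it through
        self.human_queue.append(post)     # everything in between: a person has to look
        return "queued_for_human_review"

pipeline = ModerationPipeline()
print(pipeline.triage(Post("1", "a borderline post the model can't place")))
print(len(pipeline.human_queue))  # -> 1
```

The point of the sketch is the middle branch: the "algorithm" only looks seamless because a person, somewhere, is working through that queue.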
Jeff Bezos has called the people who do this work "artificial artificial intelligence." A dehumanizing title for dehumanizing work. In his book on the subject of labor under platform capitalism, Philip Jones writes:
Platforms outsource their labour to keep it off the books and hidden from users, investors and customers, to appear more technologically sophisticated than they are, and this is no more the case than it is with the data work that powers artificial intelligence.
The people doing this work are often refugees. Non-profits and NGOs set up ramshackle "offices" full of computers right inside the refugee camps themselves. "Cramped and airless workspaces," describes Jones, "festooned with a jumble of cables and loose wires, are the built antithesis to the near celestial campuses where the new masters of the universe reside." The refugees and other microworkers who flag obscene content, label images to train AI, and do other small digital tasks make pennies—while founders and venture capitalists generate millions from their labor.
Researcher Sarah T. Roberts, who studies commercial content moderation, told Harvard Business Review that that's just how the tech companies like it:
The companies favor this kind of relationship. It sets up a convenient kind of plausible deniability for some of the harms that can come from the work.
Of course, the organizations that set up these workspaces promise opportunities, tech training, and fair pay. Preemptive Love, an organization with a mission to end war and disrupt the cycle of violence, claims on their website that "Jobs are the most powerful weapon against war." In a 2019 blog post on the site, Ben Irwin touted microwork specifically as an antidote to the cycle of violence in Iraq.
For centuries, the overdeveloped world in the Global North has extracted wealth from underdeveloped communities in the Global South. We've exploited their mines, farms, and forests. We've used their wealth to accumulate our own. We've enslaved their people to lower the costs of our gluttonous consumption. With natural resources becoming ever more depleted, we're now pillaging their human resources in new ways.
Every hour someone spends on microtasks is an hour not spent building the economy in their own communities.
They're not negotiating supply chains that benefit their region. They're not working together to get everyone fed. They're not repairing homes or building new ones. Instead, their time is cannibalized by the prospect of waged work, and their minds are colonized by our digital garbage.
But don't these people need jobs? Maybe. Or maybe our idea of a job—work that's done for cash compensation—and its position as an economic necessity is a product of capitalist realism. What if we stopped interfering and simply asked what people need to facilitate community-led labor and needs-meeting systems? Our constant push to get people into something that, to us, looks like a job fuels, rather than disrupts, the cycle of poverty.
What's more, in the United States, we've made a big deal about avoiding sweatshop labor. Liberals and progressives buy "ethically made" clothing, housewares, and skincare. Brands leverage their commitment to fair labor practices as a way of attracting business. But we ignore the ways we directly benefit from digital sweatshop labor every time we open our favorite social media app.
Build Back Better Networks?
There's been murmuring about migrating to private or niche communities for years. I've done my own fair share of this murmuring!
But today, I'm much more cautious about how I talk about private or niche communities. The reason is that too little care is given to moderation as the core product of the community. In many ways, these small communities and social networks—especially the for-profit ones—are driven by the same logic and reproduce the same labor relations that large commercial social networks do.
Sarah Roberts, the commercial content moderation researcher I mentioned earlier, cites Reddit—with its user-moderated subreddits—as a counterexample. She points to how successful subreddits have "community leader-moderators" and "clear rules" for engagement. Redditors know what the policies and expectations are of each subreddit they belong to. By joining a subreddit, the Redditor opts in to following those rules—or risks getting their post taken down or being removed from the group.
However, the effectiveness of unpaid community moderators does not equate to institutional support. “Reddit has contributed hardly anything in terms of moderating tools [or] crowd control,” Shanna Mann, the founder and former moderator of r/AskWomenOver30, explained to me. That subreddit boasted between 60,000 and 70,000 members before she stepped down.
“[Reddit] would be nothing without the huge corpus of actual communities there and the value that they drive,” Mann argued. Unfortunately, Reddit has been less than accommodating of its volunteer moderators over the years—and especially in the last 6 months.
In April, after Reddit announced it would begin charging for access to its API, many moderators led a "blackout" that effectively shut down their communities.
This shed some light on how valuable—or not—Reddit thought their moderators were.
At issue were apps that moderators relied on to make their responsibilities easier, as well as a suite of apps that improved accessibility for users. While Reddit caved to public pressure to exempt accessibility-focused apps, the apps that provided moderator tools had to shut down. Mann explained that the apps were necessary because Reddit had “failed so miserably in providing anything of substance for so many years.”
Part of the moderator-led blackout included labeling subreddits as Not Safe for Work so that Reddit couldn't monetize their pages. Others completely blocked access or switched them to “Read Only.” And a couple of delightful (and massive) subreddits protested by allowing only images of John Oliver.
As the protest wore on, Reddit executives got antsy. They threatened to expel protesting moderators who didn't reopen their subreddits and to replace them with volunteers who sided with the company. These weren't idle threats, it turned out.
Moderators were, in fact, expelled. And they were replaced by new moderators with questionable experience and expertise. Some moderators who had been purged by Reddit told Ars Technica that they feared moderators without proper vetting could actually jeopardize Redditor safety. The former mods of r/canning worried that bad advice would lead to someone succumbing to botulism, for example. A researcher at Cornell pointed to the loss of institutional knowledge that occurs when a whole group of experienced users is replaced by a whole group of inexperienced users.
So, while Reddit might be a good example of how moderators create spaces online with clear rules and patterns of management, Reddit itself doesn't seem to view their contributions as particularly valuable.
Which, I have to say, is ludicrous.
In the backlash over the shakeups at both Reddit and Twitter, a different class of social media started to make waves.
Called "federated" social media, these platforms are decentralized—relying on distinct instances of their networks managed by individuals or groups. Think Mastodon and Lemmy. Mann seems skeptical, saying, “They’ve fixed one ill and replaced it with a different [one].” She pointed out that the lack of central planning and policy-making means that there is also no oversight. “That also means that for every good community, you have a bad community.”
If I had to guess, I’d say that ratio is nowhere near one-to-one.
What you get in federated social media depends on what instance or server you're on. It's sort of like how what state you live in determines what access you have to programs or even how your civil rights are upheld. For example, in Oregon, the minimum wage is over $14, and tipped employees must earn at least that amount plus whatever tips they receive. Here in Pennsylvania, the minimum wage is still $7.25, and tipped employees earn only $2.83 per hour plus tips.
While we have considerably more mobility online than we do geographically, anyone who's spent years building up a presence within one community or on one network feels rooted to that space in much the same way you do when you've lived somewhere for years.
Okay, so what about private communities?
There is a lot of good that happens in private or niche communities. But when things go wrong, it can be devastating in a different way.
The State of Moderation
I ran a paid community of one sort or another for more than a decade. To say I learned a lot about moderation and community management in that time is a severe understatement.
By the time I was burned out and deeply depressed, the most valuable and salient lesson I had learned was that a community survives or thrives based on how its leaders invest in its structure. I'd learned to constantly ask the question, as Rosie Sherry asks it:
How can the way that we structure and set the groundwork for our communities impact the type of conversations that take place in our communities and how they're moderated?
The problem all online communities face today—whether they're Facebook groups, Discord servers, Circles, or Mighty Networks—is working against the totalizing culture of the social web. Every moderator and community manager is caught between a rock and a hard place. They're trying to coax lurkers out from the shadows to spark engagement, even when those lurkers might be hiding because they've learned to fear online discourse. At the same time, moderators are trying to manage the speech of those who are perhaps too eager to jump into a conversation without context. To paraphrase Jia Tolentino:
We are all living in the internet that social media has created.
Another problem that most online communities face is the lack of investment in people who are willing or capable of working in this weird—and at times legitimately traumatizing—milieu.
I made a decision early to hire a full-time employee to manage and moderate the business owner community I ran. When I made that hire, my business was very profitable, and the investment felt entirely doable. But as the business shifted from its highly profitable training and education model to the decidedly less profitable community model, that investment remained critical while also becoming less and less financially sustainable. The community itself never generated the revenue needed to cover our two modest full-time salaries. I was always augmenting the community revenue with higher-priced offers to float our payroll.
As I've observed other paid communities over the years, I rarely see that type of investment being made. Not only are there seldom full-time community managers for a product that may cost anywhere from $25 to $300 per month, but there is seldom an investment in the operational and philosophical structure of the community. Sure, there are values and policies. But these communities rarely have the mechanisms required to translate them into the daily experience of users.
To some extent, all for-profit paid communities run on the same logic that Facebook and Twitter do: users create the content for free so that the owners can cover costs and enjoy the profit. I can imagine a scenario in which this is ethical by virtue of the fact that well-managed private paid communities really can deliver outsized value to their members. The problem is that, like Facebook and Twitter, owners try to keep the costs associated with the community as low as possible. And in that, their reliance on free labor is also exploitative.
Because platforms like Facebook and Twitter have made moderation hidden work that we wave away as a product of "the algorithm" or some secret AI sauce, we also disregard the labor of moderators and community managers. It ends up being volunteer labor, or it's folded into the duties of an under-resourced virtual assistant. Again, the idea that anybody can do that work without the proper training, compensation, and mental health support is something we inherit from social media platforms.
May You Live in Less Interesting Times
These are interesting times on the social web. Most of us agree the whole thing sucks. It's damaging our mental health, physical health, civil society, culture, and relationships. It's not providing real business benefits. But we also can’t seem to extricate ourselves from it.
I think we're living in times much closer to what Neal Stephenson imagined than we'd like to be.
In an interview with PC Mag, Stephenson described how he's come to see the internet—and its near future—as a sort of "doomsday machine." He pointed to the way that social media systems are explicitly designed not to have humans "in the loop" because humans aren't scalable, which means the systems can't be profitable.
I don't disagree. But I do wonder how much Stephenson thought about the humans who have indeed been caught in the loop. He seems to have known about them; he dubbed their workplaces Third World eyeball farms, remember? Perhaps what's missing from his response to PC Mag are the humans caught in the gears of the machine, the ones stuck in the loop with no hope for rest or satisfaction.
I don't mean to criticize Stephenson for this response. I simply want to point out how easy it is to write off this labor as cheap magic.
He went on to tell PC Mag that "the only way to get good content out of the internet is by having humans in the loop." But unfortunately, he says, "humans are expensive ... if you did want a curated, edited stream ... you would have to pay for it."
Moderation, community management, editing as Stephenson dubs it—these are some of the most valuable skills of the 21st-century economy. But no one seems willing to pay for them. We want the results without the costs. We want the labor without the laborers.
If we want good content, let alone feeds that don't rot from the inside with mis- or disinformation, it's going to be expensive.
But, unlike in Stephenson's dystopian vision, that cost doesn't have to be a matter of class and privilege. We don't have to settle for an internet that works for the rich but further divides the poor. We can share the costs.
This isn't the internet we have today. It's not the business model that internet companies are built on. But that doesn't mean that the next generation of companies can't be built on something that truly values the human touch. If we want a better internet, it's going to need a better business model—one that works for everyone instead of a few billionaires.