The laborers who keep dick pics and beheadings out of your Facebook feed

Adrian Chen
October 2014


Google’s offices are legendary: multicoloured bikes and Segways, endless gourmet food, breakout zones with ping-pong and pinball, massages, yoga and free laundry. Its working conditions are considered among the best in the world. But not everyone working for Google has quite the same experience. In offices in Manila, thousands of people sit at old computers in dingy buildings, keeping dick pics and beheadings out of our feeds.

If the space does not resemble a typical startup’s office, the image on [the] screen does not resemble typical startup work: It appears to show a super-close-up photo of a two-pronged dildo wedged in a vagina. I say appears because I can barely begin to make sense of the image, a baseball-card-sized abstraction of flesh and translucent pink plastic, before the moderator disappears it with a casual click of his mouse.

The very demographic shift that has teens fleeing services like Facebook is also what makes content moderation increasingly important. Adrian Chen calls it the “Grandma Problem” — a new wave of older users unfamiliar with “the Internet’s panoply of jerks, racists, creeps, criminals, and bullies”, who “won’t continue to log on if they find their family photos sandwiched between a gruesome Russian highway accident and a hardcore porn video”.

It is now estimated that the number of content moderators, employed as contractors, is in the hundreds of thousands, twice the “total headcount at Google and nearly 14 times that of Facebook”.

The process of content screening is fascinating. For a relatively new app called Whisper (where people post anonymous secrets), content is actively moderated, which means that everything uploaded is screened rather than simply flagged as offensive by users — a very labour-intensive process. A worker looks at a grid of pictures, zooming in on content that clearly contravenes the site’s terms (pornography, gore, minors, sexual solicitation, sexual body parts). But much of the content is ambiguous. The moderator comes across a stock image of a man’s chiseled torso and the text, “I want to have a gay experience, M18 here”. He has only seconds to determine whether this is “the confession of a hidden desire (allowed) or a hookup request (forbidden)”.

Chen spares a thought for the poor souls who shield us from the worst of society. The industry itself seems to have agreed to keep things quiet. Workers are paid a fraction of what an Australian would receive for the same emotionally draining work — and burnout is common. Days on end of “brutal street fights, animal torture, suicide bombings, decapitations, and horrific traffic accidents” take their toll.

What is the psychological impact of watching these horrible videos day in, day out? Some workers report supercharged sex drives; others find that they no longer have any desire to be intimate with their partners. The moderators all have their individual torments, the video that they know will always stay with them. While most workers have access to psychologists, it’s often too little, too late. Until a computer can work out the difference between solicitation and hidden desire, “the war will go on”.


Future Perfect's Slow Reading offers a synoptic look at the best of longform journalism