Isn't Facebook great? (It's not.) But isn't it nice and clean and kid friendly? This is true for a very specific reason: the social media giant outsources the gnarly task of finding and deleting inappropriate content. In the November issue of Wired, Adrian Chen offers a peek into the darkest corners of the industry. It's only a little horrifying.
It's not just Facebook, of course. Pretty much any social media site you can think of uses some sort of moderation to keep abusive content off its pages. Chen specifically visited the offices of a company in the Philippines that handles moderation for Whisper, the not-so-anonymous secret-sharing app. There, contracted workers spend their days looking at images of everything from bestiality to brutal violence. The work takes a heavy toll on content moderators, of whom there are an estimated 100,000 worldwide.
The moderator interviewed in the piece left his job not long after Chen's visit. Apparently, the average tenure for content moderators is between three and six months; it's somewhat incredible that workers last even that long. One of Chen's sources earned just $300 (£187) per month.