The Verge has published an exposé of life at one of Facebook's content moderation contractors:
The panic attacks started after Chloe watched a man die. She spent the past three and a half weeks in training, trying to harden herself against the daily onslaught of disturbing posts: the hate speech, the violent attacks, the graphic pornography. In a few more days, she will become a full-time Facebook content moderator, or what the company she works for, a professional services vendor named Cognizant, opaquely calls a “process executive.”

The piece doesn't address the political bias guiding Facebook's and others' attempts at moderation, probably because the publication and author share it. The article comes with a trigger warning for "mental health issues and racism"--but not for violence, such as the leading paragraph's disturbing account of a man being stabbed to death.
For this portion of her education, Chloe will have to moderate a Facebook post in front of her fellow trainees. When it’s her turn, she walks to the front of the room, where a monitor displays a video that has been posted to the world’s largest social network. None of the trainees have seen it before, Chloe included. She presses play.
The video depicts a man being murdered. Someone is stabbing him, dozens of times, while he screams and begs for his life. Chloe’s job is to tell the room whether this post should be removed. She knows that section 13 of the Facebook community standards prohibits videos that depict the murder of one or more people. When Chloe explains this to the class, she hears her voice shaking.
Returning to her seat, Chloe feels an overpowering urge to sob. Another trainee has gone up to review the next post, but Chloe cannot concentrate. She leaves the room, and begins to cry so hard that she has trouble breathing.
No one tries to comfort her. This is the job she was hired to do. And for the 1,000 people like Chloe moderating content for Facebook at the Phoenix site, and for 15,000 content reviewers around the world, today is just another day at the office.
The article focuses on the working conditions of the underpaid employees of Cognizant, a contractor moderating content for Facebook, but stumbles on a serious problem for the socials in their efforts to excise right-wing expression and conspiracy theories: the potential radicalization of the gatekeepers.
The moderators told me it’s a place where the conspiracy videos and memes that they see each day gradually lead them to embrace fringe views. One auditor walks the floor promoting the idea that the Earth is flat. A former employee told me he has begun to question certain aspects of the Holocaust. Another former employee, who told me he has mapped every escape route out of his house and sleeps with a gun at his side, said: “I no longer believe 9/11 was a terrorist attack.”

I don't know whether there's something about our historical moment that makes conspiracy theories more appealing, or whether exposure to them is simply so much greater, but it seems some percentage of us is always going to buy into them; as exposure rises, the gross number of casual conspiracy theorists is going to keep rising.
Like most of the former moderators I spoke with, Chloe quit after about a year.
Among other things, she had grown concerned about the spread of conspiracy theories among her colleagues.
One QA often discussed his belief that the Earth is flat with colleagues, and “was actively trying to recruit other people” into believing, another moderator told me. One of Miguel’s colleagues once referred casually to “the Holohoax,” in what Miguel took as a signal that the man was a Holocaust denier.
Conspiracy theories were often well received on the production floor, six moderators told me. After the Parkland shooting last year, moderators were initially horrified by the attacks. But as more conspiracy content was posted to Facebook and Instagram, some of Chloe’s colleagues began expressing doubts.
“People really started to believe these posts they were supposed to be moderating,” she says. “They were saying, ‘Oh gosh, they weren’t really there. Look at this CNN video of David Hogg — he’s too old to be in school.’ People started Googling things instead of doing their jobs and looking into conspiracy theories about them. We were like, ‘Guys, no, this is the crazy stuff we’re supposed to be moderating. What are you doing?’”

If people are falling for flat-earth theory and other silliness, just imagine what happens when someone of reasonable intelligence is exposed to the truth about, say, black crime as explicated in the work of Colin Flaherty.
Eventually they'll take moderation out of the hands of humans entirely and turn it over to AI.
That people don’t know there are human beings doing this work is, of course, by design. Facebook would rather talk about its advancements in artificial intelligence, and dangle the prospect that its reliance on human moderators will decline over time.
But given the limits of the technology, and the infinite varieties of human speech, such a day appears to be very far away. In the meantime, the call center model of content moderation is taking an ugly toll on many of its workers. As first responders on platforms with billions of users, they are performing a critical function of modern civil society, while being paid less than half as much as many others who work on the front lines. They do the work as long as they can — and when they leave, an NDA ensures that they retreat even further into the shadows.

The sheer volume of content monitored by AI will create a great black hole into which human communications disappear.
The very profusion of information may perversely lead to a world where the average person is in fact less informed and less exposed to truth. Brave new world.
1 comment:
"The piece doesn't address the political bias guiding Facebook and others' attempts at moderation, probably because the publication and author share it."
Yup, I had that same thought reading the article.
"Eventually they'll take moderation out of the hands of humans entirely and turn it over to AI."
And I had that same thought reading the article, too, even without reading down to where it said it.