Tech giants Facebook and Google routinely work with third-party companies to moderate the content users post on their platforms. Accenture is one of those outside contractors, and it operates Google’s largest moderation site in Austin. There, workers spend hours watching and flagging YouTube videos. The problem is that these workers are constantly exposed to disturbing scenes of graphic violence and sexual crimes.
Casey Newton is Silicon Valley editor for The Verge. He says moderators are experiencing mental health problems because of their work.
“I have talked to an increasing number of people who are very worried about how the job changes their personality. I’ve talked with people who have been diagnosed with post-traumatic stress disorder and other related conditions,” Newton says. “When you’re doing this work, you never know when it is going to be the moment that you see … the thing that haunts you forever.”
Newton says that, in a way, moderators operate like homicide detectives. But unlike detectives, who might see a crime scene once a day, moderators sort through 120 videos per day, many of which contain graphic content.
“Unlike police officers who we all pay for through our tax dollars and who get pensions, these folks tend to be paid on the low end of the scale,” Newton says. “Making money that is similar to what a janitor might make.”
Written by Antonio Cueto.