"Out of sight, out of mind" doesn't work when cleaning out the darkest corners of social media platforms.。
That's what the documentary The Cleaners, which airs Monday night on PBS, reveals about content moderators in the Philippines who are relentlessly bombarded with violent, graphic, and disturbing images and videos. Ahead of the TV release, at a San Francisco screening, filmmakers Hans Block and Moritz Riesewieck praised the subjects of the documentary for being "brave enough to talk to us."
Manila has become a de facto headquarters for content moderation for some of the biggest social media and content services, like Facebook, Instagram, YouTube, Google, and Twitter. But outsourcing the harmful work doesn't eliminate the problem.
The Cleaners' filmmakers, a German duo, spent time getting to know five former moderators in the Philippine capital who earned $1 to $3 per hour working for third-party content review companies. The moderators anonymously shared their experiences filtering out inappropriate content and the roles they played in deciding what counts as propaganda, art, or news.
The film takes on the issues the moderators face from viewing thousands of disturbing posts in a single shift, and it also examines the censorship problems the platforms straddle with their policies on removing certain images. Dangerous fake news has spread on platforms like Facebook in Myanmar, where the Rohingya ethnic minority is persecuted. In the film, a Rohingya woman shares the horrors she has endured and how Facebook helped make her country so unsafe that she fled.
This isn't a new occurrence by any means, but now lawsuits are rolling in on a regular basis. Just a few months ago, a former Facebook moderator sued, accusing the platform of causing psychological harm. Former Microsoft employees sued for similar reasons, alleging trauma from reviewing child porn. Yet content moderation is still left to human workers. The short 2017 documentary The Moderators, shot in India, offered a glimpse at how unprepared these workers are for the type of work they're subjected to.
The Cleaners features interviews with people who knew a fellow worker, a specialist in self-harm videos, who killed himself.
The solution feels like it should be robot moderators, but as the film explores, AI doesn't yet grasp context and gray areas. The struggle between free speech and censorship keeps humans necessary in the undesirable role. The iconic photo of a girl running naked during the Vietnam War technically violates nudity guidelines. "Delete!" says a moderator in the film when shown the photo.
At the Cinematografo Film Festival screening in San Francisco last week, the filmmakers said humans are better at analyzing a picture and piecing together what it really means. But it comes at a cost. The pair called the digital cleaning job "a form of mental abuse." One moderator had become an expert in beheading videos and had seen hundreds of them. Another spoke about how viewing child porn disturbed her forever.
The secretive nature of the work compounds the psychological problems. According to the filmmakers, workers are strongly pressured to keep their job titles off LinkedIn profiles and not to share job details with family members. Even the companies whose content they review are supposed to be referred to only by code names.
As the people behind the moderation get more attention, the companies have to share more information. Facebook said in July it was growing its safety team to 20,000 workers by the end of this year, including 7,500 content reviewers. "This job is not for everyone," Facebook wrote in a blog post detailing the hiring and training processes and how the company is "taking care of the reviewers" with mental health resources and pleasant work environments.
The Cleaners airs in the U.S. on PBS Monday at 10 p.m. local time.