Social media's dirty secret: workers in the developing world screen graphic content for Facebook and Google
Most people are aware that lots of objectionable images and videos are uploaded to social media platforms every day.
Footage of sexual assault, suicide, beheadings and other violent or graphic material has to be screened or removed before, or at least shortly after, it becomes public.
You might think this work is done by an algorithm. In fact, most of it is outsourced to the developing world, where low-wage workers in darkened offices must view 25,000 disturbing images in a single shift.
And those workers don't work for Facebook or Google or Twitter; they work for third-party companies whose names are kept secret.
It's that shadowy world that Hans Block and Moritz Riesewieck examined in their film, The Cleaners, which was shown at Toronto's Hot Docs film festival this year.
The pair focused on several workers in Manila, in the Philippines, where much of the outsourced content-screening is done.
What does it say that companies like Facebook outsource what would seem to be such an important responsibility?
"In a way it's like an outsourcing of responsibility," Block said.
"It's a very important job, because Facebook and Twitter and Google are not anymore just a cute tool to share holiday pictures. They have become a digital public sphere, where a lot of debates are taking place where political content is uploaded."
So the job of deciding what gets allowed online, and what doesn't, is exceptionally important in a world where billions of people get all their news and information from social media.
The working conditions exact a terrible toll on the people in the film. One worker, whose job was to screen self-harm videos, took his own life; the company tried to cover it up, Riesewieck said, claiming the death had nothing to do with his work. Others, after watching thousands of videos of terrorist attacks, are afraid to go outside.
"It's quite difficult to prove that this is a result of the job. But it's quite obvious actually," he said.
In a broader sense, though, these workers are making decisions that can dramatically influence politics around the world, not just by what they take down, but by what they leave up.
The Rohingya crisis in Myanmar presents a clear example.
In Myanmar, Facebook is the internet, Block said.
"There, misinformation and hatred is not being deleted on the platform, and thousands of Rohingya are being displaced out of the country," he said.
"And Facebook plays a very crucial role in that, because they are not deleting content—they are allowing content like this to be uploaded."
The cultural background of the workers influences their decisions about whether to "ignore" or "delete" content. In one example, a nude painting of U.S. President Donald Trump was deleted because, according to the worker involved, you cannot criticize the president of a country. That is true in the Philippines, but not in other countries, Block said.
Ultimately, Block and Riesewieck are trying to show that social media companies are not the neutral platforms they claim to be. They also argue that, as the main source of information for so many people around the world, these companies must take responsibility for moderating their content far more seriously, and certainly not outsource that job to underpaid workers who have only seconds to make decisions that could result in war, death or assault.
"They alone have the power to decide what is available and what is not. And we should really question this power, and we should get the power back—and we should take responsibility back for what's going on there," Block said.