
Meta’s Gruesome Content Broke Him. Now He Wants It to Pay

The case is a first from a content moderator outside the company’s home country. In May 2020, Meta (then Facebook) reached a $52 million settlement with US-based moderators who developed PTSD from working for the company. But previous reporting has found that many of the company’s international moderators doing nearly identical work face lower pay and receive less support while working in countries with fewer mental health care services and labor rights. While US-based moderators made around $15 per hour, moderators in places like India, the Philippines, and Kenya make much less, according to 2019 reporting from The Verge.

“The whole point of sending content moderation work overseas and far away is to hold it at arm’s length, and to reduce the cost of this business function,” says Paul Barrett, deputy director of the Center for Business and Human Rights at New York University, who authored a 2020 report on outsourced content moderation. But content moderation is critical for platforms to continue to operate, keeping off the platform the kind of content that would drive users and advertisers away. “Content moderation is a core vital business function, not something peripheral or an afterthought. But there’s a powerful irony from the fact that the whole arrangement is set up to offload responsibility,” he says. (A summarized version of Barrett’s report was included as evidence in the current case in Kenya on behalf of Motaung.)

Barrett says that other outsourcers, like those in the apparel industry, would find it unthinkable today to say that they bear no responsibility for the conditions in which their clothes are manufactured.

“I think technology companies, being younger and in some ways more arrogant, think that they can kind of pull this trick off,” he says.

A Sama moderator, speaking to WIRED on the condition of anonymity out of concern for retaliation, described having to review thousands of pieces of content daily, often deciding in 55 seconds or less what could and could not stay on the platform. Sometimes that content could be “something graphic, hate speech, bullying, incitement, something sexual,” they say. “You should expect anything.”

Crider, of Foxglove Legal, says that the systems and processes Sama moderators are exposed to—and that have been shown to be mentally and emotionally damaging—are all designed by Meta. (The case also alleges that Sama engaged in labor abuses through union-busting activities, but does not allege that Meta was part of this effort.)

“This is about the wider complaints about the system of work being inherently harmful, inherently toxic, and exposing people to an unacceptable level of risk,” Crider says. “That system is functionally identical, whether the person is in Mountain View, in Austin, in Warsaw, in Barcelona, in Dublin, or in Nairobi. And so from our perspective, the point is that it’s Facebook designing the system that is a driver of injury and a risk for PTSD for people.”

Crider says that in many countries, particularly those that rely on British common law, courts will often look to decisions in other, similar nations to help frame their own, and that Motaung’s case could be a blueprint for outsourced moderators in other countries. “While it doesn’t set any formal precedent, I hope that this case could set a landmark for other jurisdictions considering how to grapple with these large multinationals.”

