This is the first such case brought by a content moderator outside the company's home country. In May 2020, Meta (then Facebook) reached a $52 million settlement with US moderators who developed post-traumatic stress disorder as a result of working for the company. But previous reporting has shown that many of the company's international moderators, who do much the same work, face lower pay and less support while working in countries with fewer mental health services and weaker labor rights. While US moderators earn about $15 an hour, moderators in places like India, the Philippines, and Kenya earn significantly less, according to 2019 reporting by The Verge.

“The whole point of sending content moderation work offshore and far away is to keep it at arm’s length and reduce the cost of that business function,” says Paul Barrett, deputy director of New York University’s Center for Business and Human Rights, who authored a 2020 report on outsourced content moderation. Yet content moderation is essential to keeping platforms running, screening out the content that would drive users and advertisers away. “Content moderation is a core, vital business function, not something peripheral or an afterthought. But there is a strong irony in the fact that the whole arrangement is set up to deflect responsibility,” he says. (An abridged version of Barrett’s report was entered into evidence in the current case in Kenya on behalf of Motaung.)

Barrett says that companies that outsource in other industries, such as the garment trade, would today find it unthinkable to claim they are not responsible for the conditions in which their clothes are made.

“I think the technology companies, being younger and more brash in a way, think they can pull this stunt,” he says.

One Sama moderator, who spoke to WIRED on the condition of anonymity out of concern for retaliation, described having to review thousands of pieces of content daily, often with 55 seconds or less to decide what can and cannot stay on the platform. Sometimes that content can be “something graphic, hate speech, bullying, incitement, something sexual,” they say. “You have to expect anything.”

Foxglove Legal’s Crider says that the systems and processes Sama’s moderators are subjected to, which have been shown to cause mental and emotional harm, were designed by Meta. (The case also alleges that Sama engaged in workplace abuses, including union-busting activities, but does not allege that Meta was part of that effort.)

“It’s about a broader complaint that the system of work is inherently harmful, toxic, and exposes people to unacceptably high levels of risk,” Crider says. “That system is functionally identical whether the person is in Mountain View, Austin, Warsaw, Barcelona, Dublin, or Nairobi. So, from our perspective, the point is that Facebook designed a system that is causing trauma and a risk of PTSD for people.”

Crider says that in many countries, especially those whose legal systems are rooted in British common law, courts will often look to decisions from other, similar jurisdictions to help shape their own, and that the Motaung case could provide a blueprint for outsourced moderators in other countries. “While it does not set any formal precedent, I hope this case can serve as a guide for other jurisdictions considering how to deal with these large multinational companies.”
