A former Meta employee who worked on content moderation systems and policies, and who spoke to WIRED on condition of anonymity, says, however, that mass reporting could at least result in certain pieces of content or accounts being flagged for review. And the more often a certain type of content is flagged, the more likely the algorithm is to flag it in the future. However, for languages such as Bulgarian, where there is less material to train the algorithm on and the AI may therefore be less accurate, the former employee says it is more likely that a human moderator will make the final decision on whether or not to remove a piece of content.

Meta spokesperson Ben Walters told WIRED that Meta does not remove content based on the number of reports. “If a piece of content doesn’t violate our community standards, no matter how many reports it gets, it won’t be removed,” he says.

Some moderation issues may be the result of human error. “There will be error rates, there will be things removed that Meta didn’t intend to remove. This happens,” the former employee says. And these errors are even more likely in non-English languages. Content moderators are often given only a few seconds to review a post before deciding whether it should stay online, and speed is one of the metrics by which their performance is measured.

There is also the real possibility of bias among human moderators. “The majority of the population actually supports Russia, even after the war in Ukraine,” says Galev. It is not unreasonable to think that some moderators might also hold such views, Galev says, especially in a country with limited independent media.

This opacity can cause confusion. “There is a lack of transparency about who decides, who takes the decision,” says Ivan Radev, a board member of the Association of European Journalists in Bulgaria, a nonprofit organization that issued a statement condemning Bird.bg’s publication of information about the Telus employee. “These sentiments are fueling discontent in Bulgaria.”

The imbalance between the ability of coordinated campaigns to flag content and that of individuals or small civil society organizations, whose reports go to the same moderators, has helped create the impression in Bulgaria that Meta favors pro-Russian content over pro-Ukrainian content.

Just over half of Bulgaria’s 6.87 million people use Facebook, which is the dominant social media platform in the country. Bulgaria has long been a target of Russian trolls and pro-Russian propaganda, especially since the beginning of the war in Ukraine. Both sympathetic local media and Russian disinformation operations push the pro-Russian narrative, blaming NATO for the conflict.

Ezekiev, the BOEC member, told WIRED that he was never told why his content was removed or how the decision was made. “If you raise your voice against propaganda and talk about the war in Ukraine, your account can be blocked,” he says. Meta’s own lack of transparency about its moderation processes, Ezekiev says, makes the whole situation even murkier.

It was this frustration that prompted BOEC to protest at the Telus office in Sofia, which resulted in employees with little power over Meta’s policies being attacked and harassed, even though there is no evidence that Telus moderators deviated from Meta’s instructions.

In February, Bulgarian media reported that Telus was closing its operations in the country and moving them to Germany. “As part of the consolidation of operations, the work that Telus International does for Meta in Sofia will be transferred to another of our sites,” says Telus spokesperson Michelle Brodovich. “Telus International continues to work successfully with Meta, ensuring a high level of professional standards.” The company did not specify whether investigations into its work in Bulgaria contributed to the decision.
