
How the death of a British teenager changed social media

Instagram also hides results for certain search terms, but only if the terms or phrases themselves promote or encourage self-harm, says Tara Hopkins, Instagram’s EMEA head of public policy. “For other search terms related to suicide/self-harm that are not inherently offensive, we display a support message before displaying the results.” The company declined to say how many search terms it blocks.

Meta, which owns Instagram, says it balances concerns for children’s safety with young people’s freedom of expression. The company admitted that the two posts Molly saw that were shown in court violated Instagram’s policies at the time. But Elizabeth Lagon, head of policy for health and wellbeing at Meta, told the inquest last week that it was “important to give people that voice” as they struggle with suicidal thoughts. When Russell family attorney Oliver Sanders asked Lagon whether she agreed that the content viewed by Molly and shown in court was “dangerous,” Lagon said, “I think it’s safe for people to be able to express themselves.”

These comments embody what researchers say are the main differences between the two platforms. “Pinterest is much more concerned with being assertive, being clear, and deplatforming content that doesn’t meet their standards,” says Samuel Woolley, program director of the Advocacy Research Lab at the University of Texas at Austin. “Instagram and Facebook … tend to be much more concerned about running afoul of free speech.”

Pinterest didn’t always work this way. Hoffman told the inquest that Pinterest’s guidance was, “when in doubt, lean toward … lighter content moderation.” But Molly’s death in 2017 coincided with the fallout from the 2016 US presidential election, when Pinterest was implicated in the spread of Russian propaganda. Around that time, Pinterest began banning entire topics that didn’t fit the platform’s mission, such as vaccines or conspiracy theories.

This is in stark contrast to Instagram. “Meta’s platforms, including Instagram, are driven by the principle of wanting to exist as infrastructural information tools, [like] the telephone or [telecoms company] AT&T, not as a social media company,” Woolley says. Facebook shouldn’t be the “arbiter of truth,” Meta founder and CEO Mark Zuckerberg argued in 2020.

The inquest also highlighted differences in how transparent the two platforms were willing to be. “Pinterest immediately provided material on Molly’s Pinterest activity, including not only the Pins Molly saved, but also the Pins she [clicked] on and scrolled past,” Varney says. Meta has never provided this level of detail to the court, and much of the information the company has shared has been redacted, she adds. For example, the company revealed that in the six months before Molly’s death, Instagram recommended 30 accounts to her with names relating to sad or depressing themes. However, the real names of these accounts have been redacted, with the platform citing the privacy of its users.

Varney agrees that both platforms have made improvements since 2017. She says search results for self-harm terms on Pinterest don’t contain the same level of graphic content as they did five years ago. But Instagram’s changes were too little, too late, she argues, adding that Meta didn’t ban graphic images of self-harm and suicide until 2019.

