Instagram tweaks algorithm after Palestinian censorship accusations
Instagram is changing its app to show more viral and topical posts amid complaints from its staff that pro-Palestinian content was not seen by users during the recent conflict in Gaza.
Until now, the social media app has prioritized original content in the “stories” it displays at the top of a user’s feed over content that is re-shared or reposted by other people.
Now, Instagram will treat original and reposted content the same, according to two people familiar with the situation and internal staff posts, a move that will help breaking-news posts reach a wider audience.
A spokesperson said there had been an increase in the number of users sharing messages about the recent conflict in Gaza, but the way the app is currently configured had a “bigger impact than expected” on the number of people who saw the messages.
“Stories that re-share the feed posts aren’t getting the reach that people expect from them, and it’s not a good experience,” the spokesperson said. “Over time, we’ll shift to giving re-shared posts equal weight as we do originally produced stories.”
Instagram said the move did not fully address issues with pro-Palestinian content, but had been under consideration for some time.
The spokesperson said the algorithm had “tricked people into believing that we were removing stories on particular topics or viewpoints”, but added: “We want to be really clear – this is not the case. This applied to all posts that are re-shared in stories, regardless of their subject.”
A group of 50 employees within Facebook, the owner of Instagram, had raised concerns over the suppression of pro-Palestinian voices, an employee involved said.
The employee said the group had filed more than 80 appeals over content removed by the company’s automated moderation system. BuzzFeed earlier reported the group’s existence.
Facebook’s algorithms had flagged words commonly used by Palestinian users, such as “martyr” and “resistance,” as incitement to violence, and had deleted posts about the al-Aqsa mosque after mistakenly linking Islam’s third-most sacred site to a terrorist organization, according to US media reports.
The employee told the Financial Times he did not believe there was any deliberate censorship on Facebook’s part, but suggested that “large-scale moderation is biased against marginalized groups” and leads to excessive takedowns.
Facebook said: “We know that several issues have impacted the ability of people to share on our apps. We are sorry for anyone who felt they could not draw attention to important events, or felt it was a deliberate suppression of their voice. This was never our intention, and we never want to silence any particular community or point of view.”