Meta Platforms has reacted quickly after Instagram Reels feeds worldwide apparently showed inappropriate content, including videos of people being beaten or even killed.
CNN reported that Meta has apologised for a technical error after some users complained they saw violent, graphic videos in their Instagram Reels feed.
It is not clear how many people were impacted by the “technical error” or indeed what the nature of the error was.
Meta apology
“We have fixed an error that caused some users to see content in their Instagram Reels feed that should not have been recommended,” a Meta spokesperson was quoted by CNN as saying in a statement. “We apologise for the mistake.”
The technical error had reportedly allowed violent and “not safe for work” content in Reels feeds, despite some users having enabled the “sensitive content control” setting meant to filter such material.
It should be noted that violent and graphic videos are prohibited under Meta’s policy and the company usually removes such content to protect users.
However, exceptions can be made for videos that raise awareness of issues such as human rights abuses and conflict.
Content moderation scrutiny
The technical error comes at a difficult time for Meta.
Mark Zuckerberg’s firm is competing fiercely with rivals such as TikTok, and Meta has been heavily promoting short-form video engagement on its platforms in recent years.
And Meta’s moderation policies have come under close scrutiny after it decided last month to scrap its US fact-checking program on Facebook, Instagram and Threads.
The axing of fact-checking on three of the world’s biggest social media platforms, which together serve more than 3 billion users globally, came as Mark Zuckerberg continued his attempts to woo US President Donald Trump.
Meta ultimately replaced its fact-checkers with a Community Notes system.