According to the Daily Mail, social media giant Meta has confirmed that a bug caused a spread of misinformation on Facebook between October and March.
Meta, Facebook’s parent company, acknowledged that the problem caused a “surge of misinformation” and other harmful content to appear in users’ News Feeds.
According to an internal memo, Meta engineers failed to suppress posts from ‘repeat misinformation offenders’ for more than six months.
During the war in Ukraine, Facebook’s systems also failed to demote or remove content involving nudity, violence, and Russian state media, according to the paper.
Speaking to The Verge, Meta spokesperson Joe Osborne said the company had identified inconsistencies in downranking that were linked to small, temporary increases in internal metrics.
He added that Meta traced the root cause to a software bug, applied the necessary fixes, and found that the glitch had no meaningful, long-term impact on its metrics.
According to The Verge, rather than suppressing posts from repeat misinformation offenders that had been reviewed by the company’s network of outside fact-checkers, the News Feed instead gave those posts wider distribution.
Views of content from Facebook accounts labelled as repeat ‘misinformation offenders’ increased by as much as 30%.
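To make the reported failure concrete, here is a minimal, hypothetical sketch of how a feed-ranking step might demote fact-checker-flagged posts, and how skipping that demotion would let such posts keep their full reach. This is not Meta’s actual code; the function names, field names, and demotion factor are invented for illustration.

```python
from dataclasses import dataclass

# Hypothetical demotion multiplier; Facebook's real values are not public.
MISINFO_DEMOTION_FACTOR = 0.1


@dataclass
class Post:
    post_id: str
    base_score: float              # engagement-based ranking score
    flagged_by_fact_checkers: bool # marked by outside fact-checkers


def rank_score(post: Post, apply_demotion: bool = True) -> float:
    """Return the feed-ranking score for a post.

    When apply_demotion is True, flagged posts are down-ranked by a small
    multiplier. The reported bug behaves as if this step were skipped:
    flagged posts keep their full score and reach far more viewers.
    """
    score = post.base_score
    if apply_demotion and post.flagged_by_fact_checkers:
        score *= MISINFO_DEMOTION_FACTOR
    return score


if __name__ == "__main__":
    flagged = Post("p1", base_score=100.0, flagged_by_fact_checkers=True)
    print(rank_score(flagged))                        # intended: 10.0 (demoted)
    print(rank_score(flagged, apply_demotion=False))  # buggy: 100.0 (full reach)
```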
The problem was finally resolved three weeks ago, on March 11, according to the internal document.