Facebook is taking aim at individual user accounts in its latest attempt to stem the flow of misinformation shared across the social network.

The tech giant has long focused much of its effort on pages and groups when tackling the ever-present issue, which has become even more problematic with the spread of misleading claims about Covid-19 and vaccines.

But now Facebook is taking the battle to individual accounts, effectively burying a user’s posts further down the News Feed if they have repeatedly shared content that has been rated false by one of the firm’s fact-checkers.

Facebook has announced new ways it hopes to tackle the spread of misinformation on its platform (Facebook/PA)

This goes beyond Facebook’s existing approach, which reduces a single post’s reach in the News Feed if it has been deemed false.

The social network has also unveiled new ways of letting people know they might be interacting with hotbeds of misinformation, showing a pop-up message, before they like a page, warning that the page has previously shared false claims.

Facebook said the move is designed to “help people make an informed decision about whether they want to follow the page”.

“Whether it’s false or misleading content about Covid-19 and vaccines, climate change, elections or other topics, we’re making sure fewer people see misinformation on our apps,” the company said.