Facebook steps up efforts to tackle misinformation

Facebook has stepped up its efforts to tackle fake news by sending warnings to users who may have seen inaccurate information online.

The platform will show messages in the news feed to users who have interacted with content that has been reviewed and found to be misinformation.

Users who have liked, reacted to, commented on or shared the misinformation will be alerted, and the content itself will be removed.

The company said to date it has directed over 2 billion people to resources on COVID-19 from the World Health Organisation and other authorities. It’s also working with over 60 fact-checking organisations to review and rate content in more than 50 languages.

Facebook said: “We want to connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources in case they see or hear these claims again off of Facebook. People will start seeing these messages in the coming weeks.”

During March, Facebook displayed warnings on around 40 million posts related to COVID-19, based on about 4,000 articles reviewed by its independent fact-checking partners. It said that when people saw these warnings, 95% of the time they did not go on to view the original content.

Last month Facebook worked with the UK Government to launch a Coronavirus Information Service on WhatsApp. The chatbot provides anyone with trustworthy and timely information about coronavirus.

To launch the service, send “hi” to +44 7860 064422 on WhatsApp. The service also allows the Government to send messages to WhatsApp users, and lets users find trusted answers to common questions.

Prof Yvonne Doyle, Medical Director, Public Health England, said: “This service will help us ensure the public has a trusted source for the right information about coronavirus, updated with the latest public health guidance and providing assurance that they are not misled by any of the false information circulating.”