Facebook announced on Thursday that the company will no longer show health groups in its recommendations, saying it was crucial that people get health information from “authoritative sources.”

Facebook said in a blog post that over the past year it had taken down more than 1 million groups that violated its policies on misinformation and harmful content.

Advocacy group Avaaz said in a report last month that misleading health content has racked up an estimated 3.8 billion views on Facebook over the past year, peaking during the coronavirus pandemic.

Facebook, under pressure to curb such misinformation on its platform, has made increasing credible health information a key element of its response. It also removes certain false claims about COVID-19 that it determines could cause imminent harm.

The social media platform also said that it would bar administrators and moderators of groups that have been taken down for policy violations from creating any new groups for a period of time.

Facebook said in the blog post that it also now limits the spread of groups tied to violence by removing them from its recommendations and searches, and soon, by reducing their content in its news feed.

Last month, Facebook removed nearly 800 QAnon conspiracy groups for posts celebrating violence, showing intent to use weapons, or attracting followers with patterns of violent behavior.

Twitter also said in a tweet on Thursday that the platform had reduced impressions on QAnon-related tweets by more than 50 percent through its “work to deamplify content and accounts” associated with the conspiracy theory. In July, the social media company said it would stop recommending QAnon content and accounts in a crackdown it expected would affect about 150,000 accounts.

On Thursday, Twitter laid out in a blog post how it assesses groups and content for coordinated harmful activity, saying it must find evidence that individuals associated with a group or campaign are engaged in some kind of coordination that may harm others.

Twitter said it prohibits all forms of technical coordination, but for social coordination to break its rules, there must be evidence of physical or psychological harm, or “informational” harm caused by false or misleading content.

Source: https://gadgets.ndtv.com/social-networking/news/facebook-says-it-will-no-longer-show-health-groups-in-recommendations-2297581
