Facebook knows it has a hoax problem


Facebook CEO Mark Zuckerberg at a 2016 “Innovation Hub” event in Berlin, Germany.

Facebook is promising yet again to tackle the spread of
misinformation in your News Feed.

“We value authentic communication, and hear consistently from those who use Facebook that they prefer not to see misinformation,” Adam Mosseri, the vice president of product management, said in a statement to TechCrunch Thursday following widespread criticism of the platform’s performance as a news conduit during the election.

It remains unclear what “authentic communication” means, or why
Facebook seems to acknowledge that this is a problem only because
its users say “they prefer not to see misinformation” much in the
same way that you might prefer not to eat a bologna sandwich
covered in sand. The company did not immediately respond to
Mashable’s request for comment.

Here’s the full statement, as published on TechCrunch:

We take misinformation on Facebook very seriously. We value authentic communication, and hear consistently from those who use Facebook that they prefer not to see misinformation. In News Feed we use various signals based on community feedback to determine which posts are likely to contain inaccurate information, and reduce their distribution. In Trending we look at a variety of signals to help make sure the topics being shown are reflective of real-world events, and take additional steps to prevent false or misleading content from appearing. Despite these efforts we understand there’s so much more we need to do, and that is why it’s important that we keep improving our ability to detect misinformation. We’re committed to continuing to work on this issue and improve the experiences on our platform.

Facebook says it identifies inaccurate information based on “community feedback,” but it’s hard to tell what that really means.
The social network allows any user to report a piece of content as “false news.” And Facebook has acknowledged before that hoaxes are an issue. In January 2015, the social network said, as it did Thursday:

We’ve heard from people that they want to see fewer stories that are hoaxes, or misleading news. Today’s update to News Feed reduces the distribution of posts that people have reported as hoaxes and adds an annotation to posts that have received many of these types of reports to warn others on Facebook.

In December 2015, it said so again:

One example of a type of viral post that people report they don’t enjoy seeing in their News Feed are hoaxes. If there is a viral story about a hoax, it can get a lot of reshares and comments, which would normally help us infer it might be an interesting story. However, we’ve heard feedback that people don’t want to see these stories as much as other posts in their News Feed.

And in August 2016, Facebook referenced the problem once more:

One of our News Feed values is to have authentic communication on our platform. That’s why we work hard to understand what type of stories and posts people consider genuine, so we can show more of them in News Feed.
