Facebook users will be warned before sharing a story that's actually fake news, the social media giant says. Bogus news sites — such as these stories from "USA Daily News 24," a site that's registered in Veles, Macedonia — have been blamed for the spread of misinformation online.
Raphael Satter/AP

Providing new details about how it's trying to counter the spread of fake news on its services, Facebook says it's working with fact-checking groups to identify bogus stories — and to warn users if a story they're trying to share has been reported as fake.
Facebook also says it will let users report a possible hoax by clicking the upper right-hand corner of a post and choosing one of four reasons they want to flag it — from "It's spam" to "It's a fake news story."
If a story is deemed false, it will be tagged with an alert message saying it's been "disputed by 3rd party fact-checkers."
A mockup provided by Facebook shows the screens it will use to allow users to report a potential hoax or fake news story.
Facebook

The social media giant was sharply criticized after the Nov. 8 election, as false stories were blamed for adding confusion to a dynamic campaign season. Since then, fake news and conspiracy theories have also been identified as a motivating factor in a man's assault on a pizza restaurant in Washington, D.C.
In the wake of that and other incidents, some called for Facebook to hire editors to vet news stories. In today's update from Adam Mosseri, the Facebook vice president in charge of its News Feed, the company is effectively outsourcing that job to third-party groups that it says have signed on to Poynter's International Fact Checking Code of Principles.
The update to Facebook's plan to cope with bogus information comes nearly one month after CEO Mark Zuckerberg acknowledged that Facebook had "much more work" to do in how it handles false stories.
Today's news touches on four of the seven areas that Zuckerberg listed as part of his company's fight against misinformation. It remains to be seen whether the moves will satisfy Facebook's critics — both inside and outside the company's ranks — who've faulted the way it deals with controversial, offensive and/or fake posts. As NPR's Aarti Shahani reported in November, that effort has grown to include thousands of overseas subcontractors.
In a news release outlining how Facebook's new reporting and flagging process will work, Mosseri said the company will rely on its users to report a story as potentially bogus, "along with other signals." The story would then be sent to fact-checkers.
"If the fact-checking organizations identify a story as fake," Mosseri said, "it will get flagged as disputed and there will be a link to the corresponding article explaining why. Stories that have been disputed may also appear lower in News Feed."
Mosseri added, "It will still be possible to share these stories, but you will see a warning that the story has been disputed as you share."
The flagged story will also be rejected if anyone tries to turn it into a promoted ad, Facebook says.
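The process described above — user reports combined with other signals, third-party review, a "disputed" label linking to the fact-checkers' explanation, lower placement in News Feed, a warning at share time and a ban on promotion as an ad — amounts to a simple moderation pipeline. The sketch below is a hypothetical Python illustration of that flow; the class and function names, the report threshold and the ranking penalty are all assumptions made for the example and are not Facebook's implementation.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional


class ReportReason(Enum):
    SPAM = "It's spam"
    FAKE_NEWS = "It's a fake news story"
    # The article says there are four reasons in total; the other two aren't named.


@dataclass
class Post:
    url: str
    reports: List[ReportReason] = field(default_factory=list)
    disputed: bool = False
    dispute_link: Optional[str] = None  # link to the fact-checkers' explanation

    @property
    def rank_weight(self) -> float:
        # Disputed stories "may also appear lower in News Feed."
        return 0.5 if self.disputed else 1.0


REPORT_THRESHOLD = 5  # invented number; Facebook combines reports with "other signals"


def should_send_to_fact_checkers(post: Post) -> bool:
    """Queue a story for third-party review once enough users flag it as fake."""
    fake_reports = [r for r in post.reports if r is ReportReason.FAKE_NEWS]
    return len(fake_reports) >= REPORT_THRESHOLD


def apply_fact_check(post: Post, is_fake: bool, explanation_url: str) -> None:
    """Record the fact-checkers' verdict: a fake story gets the 'disputed' flag."""
    if is_fake:
        post.disputed = True
        post.dispute_link = explanation_url


def share(post: Post) -> str:
    """Sharing stays possible, but a disputed story shows a warning first."""
    if post.disputed:
        return f"Warning: disputed by 3rd party fact-checkers. See {post.dispute_link}"
    return "Shared."


def promote_as_ad(post: Post) -> bool:
    """A flagged story is rejected if someone tries to turn it into a promoted ad."""
    return not post.disputed
```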
While fake news created a stir because of its intersection with U.S. politics, many of the people behind the sites say they're mainly in it for the money.
Here's how Craig Silverman of BuzzFeed News described what he found in researching the phenomenon on Wednesday's Fresh Air:
"Facebook directly doesn't really earn them a lot of money. But the key thing about Facebook — and this is true whether you're running a politics site out of Macedonia or whether you run a very large website in the U.S. — Facebook is the biggest driver of traffic to, you know, news websites in the world now. You know, 1.8 billion people log into Facebook every month."
Today, Facebook says it has "found that a lot of fake news is financially motivated" — and that it's taking steps to remove some of that incentive.
"On the buying side we've eliminated the ability to spoof domains, which will reduce the prevalence of sites that pretend to be real publications," Mosseri says. "On the publisher side, we are analyzing publisher sites to detect where policy enforcement actions might be necessary."