Posted: Nov. 19, 2016

Facebook Inc, facing withering criticism for failing to stem a flood of phony news articles in the run-up to the U.S. presidential election, is taking a series of steps to weed out hoaxes and other types of false information, chief executive Mark Zuckerberg said in a Facebook post Friday evening.

Facebook has long insisted that it is a technology company and not a publisher, and rejects the idea that it should be held responsible for the content that its users circulate on the platform. Just after the election, Zuckerberg said the notion that fake or misleading news on Facebook had helped swing the election to Donald Trump was "a crazy idea."

Zuckerberg then said last Saturday that more than 99 percent of what people see on Facebook is authentic, calling only "a very small amount" fake news and hoaxes.

But in his Friday posting, Zuckerberg struck a decidedly different tone. He said Facebook has been working on the issue of misinformation for a long time, calling the problem "complex, both technically and philosophically."

"While the percentage of misinformation is relatively small, we have much more work ahead on our roadmap," Zuckerberg said.

He outlined a series of steps that were already underway, including greater use of automation to "detect what people will flag as false before they do it themselves."

He also said Facebook would make it easier to report false content, work with third-party verification organizations and journalists on fact-checking efforts, and explore posting warning labels on content that has been flagged as false. The company will also try to prevent fake-news providers from making money through its advertising system, as it had previously announced.

Zuckerberg said Facebook must be careful not to discourage the sharing of opinions or to mistakenly restrict accurate content. "We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties," he said.

Facebook historically has relied on users to report links as false and share links to myth-busting sites, including Snopes, to determine if it can confidently classify stories as misinformation, Zuckerberg said. The service has extensive community standards on what kinds of content are acceptable.


Facebook faced international outcry earlier this year after it removed an iconic Vietnam War photo due to nudity, a decision that was later reversed. The thorniest content issues are decided by a group of top executives at Facebook, and there have been extensive internal conversations at the company in recent months over content controversies, people familiar with the discussions say.

Among the fake news stories that circulated ahead of the U.S. election were reports erroneously alleging that Pope Francis had endorsed Trump and that a federal agent who had been investigating Democratic candidate Hillary Clinton had been found dead. (Reporting by David Bailey in Minneapolis; Editing by Jonathan Weber and Mary Milliken)