Posted: 2016-05-18 | Updated: 2017-05-19

Facebook Must Be Held Accountable To The Public

There is no such thing as neutrality when it comes to media. That has long been a fiction, one that traditional news media needs and insists on.
Photo: Facebook CEO Mark Zuckerberg delivers the keynote address at the F8 Facebook Developer Conference in San Francisco, April 12, 2016. (AP Photo/Eric Risberg, File)

A pair of Gizmodo stories have prompted journalists to ask questions about Facebook's power to manipulate political opinion in an already heated election year. If the claims are accurate, Facebook contractors have suppressed some conservative news, and their curatorial hand shapes the Facebook Trending list more than the public realizes. Mark Zuckerberg took to his Facebook page to argue that Facebook does everything possible to be neutral and that significant procedures are in place to minimize biased coverage. He also promised to look into the accusations.

As this conversation swirls around intentions and explicit manipulation, some significant issues are missing. First, all systems are biased. There is no such thing as neutrality when it comes to media. That has long been a fiction, one that traditional news media needs and insists on, even as scholars highlight that journalists reveal their biases through everything from small facial twitches to choice of frames and topics of interest. It's also dangerous to assume that the "solution" is to make sure that "both" sides of an argument are heard equally; this is the source of tremendous conflict around how heated topics like climate change and evolution are covered. It is even more dangerous, however, to think that removing humans and relying more on algorithms and automation will remove this bias.

Recognizing bias and enabling processes to grapple with it must be part of any curatorial process, algorithmic or otherwise. As we move into the development of algorithmic models to shape editorial decisions and curation, we need to find a sophisticated way of grappling with the biases that shape development, training sets, quality assurance, and error correction, not to mention explicit acts of "human" judgment.



There never was neutrality, and there never will be.

This issue goes far beyond the Trending box in the corner of your Facebook profile, and this latest wave of concerns is only the tip of the iceberg around how powerful actors can affect or shape political discourse. What is of concern right now is not that human beings are playing a role in shaping the news -- they always have -- it is the veneer of objectivity provided by Facebook's interface, the claims of neutrality enabled by the integration of algorithmic processes, and the assumption that what is prioritized reflects only the interests and actions of the users (the "public sphere") and not those of Facebook, advertisers, or other powerful entities.

The key challenge that emerges out of this debate concerns accountability. In theory, news media is accountable to the public. Like neutrality, this is more a desired goal than something that's consistently realized, but traditional news media has a host of processes in place to address the possibility of manipulation: ombudspeople, whistleblowers, public editors, and myriad alternate media organizations. Facebook and other technology companies have not, historically, been included in that conversation.

I have tremendous respect for Mark Zuckerberg, but I think his stance that Facebook will be neutral as long as he's in charge is a dangerous statement. This is what it means to be a benevolent dictator, and there are plenty of people around the world who disagree with his values, commitments, and logics. As a progressive American, I have a lot more in common with Mark than not, but I am painfully aware of the neoliberal American value systems that are baked into the very architecture of Facebook and our society as a whole.

Who Controls the Public Sphere in an Era of Algorithms?

In light of this public conversation, I'm delighted to announce that Data & Society has been developing a project that asks who controls the public sphere in an era of algorithms. As part of this process, we convened a workshop and have produced a series of documents that we think are valuable to the conversation:

  • Assumptions and Questions, a background primer
  • Case Studies, a complement to the contemporary issues primer
  • Workshop Summary, extended notes from the workshop

These documents provide historical context, highlight how media has always been engaged in power struggles, showcase the challenges that new media face, and offer case studies that reveal the complexities going forward.

This conversation is by no means over. It is only just beginning. My hope is that we quickly leave the state of fear and start imagining mechanisms of accountability that we, as a society, can live with. Institutions like Facebook have tremendous power, and they can wield that power for good or evil. But for society to function responsibly, there must be checks and balances regardless of the intentions of any one institution or its leader.

******

Points/public spheres: In "Facebook Must Be Accountable to the Public," danah boyd comments on the recent clamor around Facebook's Trending box and introduces Data & Society's developing Algorithms and Publics project, including a set of documents occasioned by the Who Controls the Public Sphere in an Era of Algorithms? workshop.

This post originally appeared in Points.
