
Facebook Trending story: The Wizard of Oz algorithm


Story highlights

Facebook's Trending topics aren't just a product of a computer algorithm; journalists shape them

Ed Finn says Facebook didn't want to admit it, but the reality is that human judgment inevitably enters the picture

Editor’s Note: Ed Finn is the founding director of the Center for Science and the Imagination at Arizona State University, where he is an assistant professor with a joint appointment in the School of Arts, Media and Engineering and the Department of English. A former journalist for Time, Slate and Popular Science, he is the co-editor of “Hieroglyph: Stories and Visions for a Better Future” (William Morrow, September 2014) and author of “Culture Machines,” a book about the present and future of algorithms (MIT Press, forthcoming). The opinions expressed in this commentary are his.

CNN —  

The recent scandal over Facebook’s Trending Topics news module goes deeper than the revelation that humans were hiding behind the algorithm all along. It should come as no surprise that Facebook has bias – every organization does. It’s what you do about the bias, how you attempt to disclose it and manage it, that makes a difference. News organizations have been grappling with that question for a long time, creating formal and informal codes of conduct, oversight systems and transparency rules.

But of course, Facebook doesn’t want to be a news organization (or be seen as taking a political stance). As Will Oremus pointed out in Slate, that would be bad for business: people think much more favorably of technology companies than they do of the Fourth Estate. So it is no surprise that, in reacting to the scandal, Facebook seems desperate to avoid looking like a news agency. It stands, according to VP Justin Osofsky, for “a free flow of ideas and culture across nations.”


This is a lovely sentiment, and I’m sure many people who work at Facebook and use the platform believe in it. But it’s not what Facebook does. We know this for two reasons. First, imagine if the company took that credo at face value.

A truly democratic network where the most popular content wins would be filled with cute pet videos, ice bucket challenges and, one presumes, vast troves of porn. The company got a wake-up call about this problem during the Ferguson tragedy in 2014, when its algorithms did a poor job of sharing news coverage of that story.