
It’s time to get rid of the Facebook “news feed,” because it’s not news



Tags: Zuckerbook  


Published by nherting
Writer Rating: 0
Posted on 2016-11-19
Writer Description: current events
This writer has written 195 articles.


In the wake of the US election, critics have blamed Facebook for bringing about—at least in part—Trump's surprise win. A BuzzFeed report showed that Facebook users interacted far more with "fake news" stories about both candidates than they did with mainstream news outlets before the election. This wouldn't seem like such a big deal if it weren't for a Pew Research Center survey showing that 44% of Americans get news from Facebook.

But proving whether fake news influenced the election more than the usual political propaganda is impossible. What's certain is that fake news on Facebook is a symptom of a larger problem: the company is trying to play contradictory roles as both trustworthy news publisher and fun social media platform for personal sharing. The problem is that it cannot be both—at least not without making some changes.

Facebook shapes people's perceptions

When you log into your Facebook account, your default page is dominated by a cascading "news feed," automatically selected for your pleasure, which consists of whatever your friends have shared. The company uses a mix of secret-sauce algorithms to choose which pieces of news you see. Some items are displayed based on what you've responded to before. For example, if you always like or reply to news from Trevor but ignore news from Mike, you're going to see more Trevor and less Mike.
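To make the "more Trevor, less Mike" effect concrete, here is a minimal sketch of engagement-weighted feed ranking in Python. Facebook's actual signals and weights are proprietary; the interaction types, weights, and data shapes below are illustrative assumptions, not the real algorithm.

```python
from collections import defaultdict

# Hypothetical weights for past interactions. Real ranking uses many more
# signals; this only illustrates "reward the friends you engage with."
INTERACTION_WEIGHTS = {"like": 1.0, "reply": 2.0, "ignore": -0.5}

def score_friends(interactions):
    """Accumulate a per-friend affinity score from past interactions."""
    scores = defaultdict(float)
    for friend, action in interactions:
        scores[friend] += INTERACTION_WEIGHTS.get(action, 0.0)
    return scores

def rank_feed(posts, scores):
    """Order candidate posts by the viewer's affinity for the poster."""
    return sorted(posts, key=lambda post: scores[post["author"]], reverse=True)

history = [("Trevor", "like"), ("Trevor", "reply"), ("Mike", "ignore")]
posts = [{"author": "Mike", "text": "..."}, {"author": "Trevor", "text": "..."}]
print([p["author"] for p in rank_feed(posts, score_friends(history))])
# -> ['Trevor', 'Mike']
```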

Other news that Facebook thinks you probably want to see is fed to you based on your profile. The Wall Street Journal has an incredible infographic on this, showing how Democrats on Facebook see liberal-leaning stories and Republicans see conservative-leaning ones.

Mark Zuckerberg has protested since the election that it's preposterous to believe that Facebook could change anyone's mind based on their news feed—yet the company behaves as if it does. Given that Facebook's main goal is to serve you ads and get you to buy things, its number-one priority is keeping you glued to your feed. If you see a bunch of things you hate in your feed, you're going to stop looking at it and take your clicks elsewhere. Common sense dictates that Facebook should avoid showing you news that will upset you or make you angry.

But Facebook's decision goes beyond common sense. It's based on real data the company gathered in a 2012 experiment in which algorithms fed users positive and negative stories to see whether they would affect people's moods. Sure enough, the people fed positive news responded with more indicators of happiness. To the extent that people at Facebook believe their own data analysis, they know that the news feed affects people's emotions and shapes their perceptions of the world. Their business depends on it.
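As a toy reconstruction of that experimental setup: assign each user to a cohort, then bias what their feed shows by post sentiment. The cohort names, hashing scheme, and sentiment labels here are assumptions for illustration; Facebook's actual 2012 methodology was more involved (it reduced exposure rather than fully suppressing posts).

```python
import hashlib

def assign_cohort(user_id):
    """Stable 50/50 split of users into 'positive' and 'negative' conditions."""
    digest = hashlib.sha256(user_id.encode()).digest()
    return "positive" if digest[0] % 2 == 0 else "negative"

def filter_feed(posts, cohort):
    """Toy version of the manipulation: show only posts whose sentiment
    matches the user's assigned condition."""
    return [p for p in posts if p["sentiment"] == cohort]

posts = [{"text": "Great day!", "sentiment": "positive"},
         {"text": "Awful news.", "sentiment": "negative"}]
print(filter_feed(posts, assign_cohort("user-42")))
```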

Though there's something ineffably creepy about Facebook manipulating our emotions, the site is no different than any other ad-driven business. Television series toy with your feelings to keep you coming back for more, even after you've seen the same stupid ad on Hulu nine times. Hollywood pumps out sequels to get you to pay $16 for a ticket. Facebook's big innovation was the discovery that it could sell ads against people's friendship networks. We consume each other's posts on Facebook the same way we consume new episodes of Mr. Robot and with the same result. Our eyeballs translate into ratings, which translate into ad dollars.

The trouble with "news feed"

The problem is that Facebook decided to go beyond friendship networks. For several years now, the company has courted professional news outlets, ranging from the New York Times and BuzzFeed to Breitbart and FOX News, promising them prominent placement in users' feeds. This courting intensified with the creation of the Facebook Instant service in 2015, which allows media companies to publish stories directly on Facebook and share ad revenues.

Facebook has already shown what happens when it lets algorithms curate news unsupervised: in August 2016 the company fired the human editors of its "trending" section and replaced them with an algorithm. With no human checks on it, that algorithm immediately started promoting fake news.

Facebook's algorithms are great at keeping people glued to their screens, but they are terrible at distinguishing real news from fake. Yet the most prominent feature of Facebook is called a "news feed." Given the company's almost hilarious inability to identify news, this feature clearly has a misleading name.

"But what is news?" you might ask. "Are you talking about liberal media?" No. I am talking about stories published by professional media organizations—organizations that take legal and ethical responsibility for what they publish. They pledge to print news that is the truth. They may define truth differently, depending on whether they're Mother Jones or the National Review. Nevertheless, they come up with a definition and attempt to stick to it.

Plus, if those publications post articles that are libelous, infringing, obscene, or otherwise unlawful, they can be held liable in a court of law for what they've published. In short, news comes from organizations that stand by what they post as the truth. That's why the public can trust news from a professional media organization more than that rant Uncle Tommy posted about chemtrails on Facebook. Unfortunately, however, Uncle Tommy's rant is classified as part of the same "news feed" that contains headlines from your most trusted professional media sources.

The legal issues

Our concerns with Facebook's news feed aren't just a matter of truthy semantics, either. If you want to understand what's at stake, think about two rules that the US legal system uses to regulate the news media. One is the First Amendment, which has been interpreted to mean that a publisher has the right to publish lawful information without interference from the government. The other is a rather obscure section of the Communications Decency Act known as "CDA 230."

Section 230 offers immunity from liability to "intermediaries" or "interactive computer services" that publish material created by other people. The classic example of a person protected by CDA 230 is a blogger who allows open comments on their stories. Under CDA 230, the blogger cannot be sued if a commenter says something libelous or posts an infringing movie. The blogger is merely an "intermediary" and thus does not bear responsibility for what commenters say.

Facebook enjoys the protection of both the First Amendment and CDA 230, but CDA 230 is key to its survival. It means that Facebook cannot be held liable for anything posted on Facebook, whether written by Uncle Tommy or a BuzzFeed journalist. Without CDA 230, President-Elect Trump could sue Facebook for libel when users post fake news about him. Angry consumers could sue Facebook for fraud when quack doctors post false claims about herbal remedies. The government could sue Facebook for obscenity when a naughty photo goes viral. CDA 230 is what's keeping Facebook from being sued out of existence.

But CDA 230 may not protect Facebook if the company continues to use algorithms to shape people's experience in the news feed. We're entering a legal gray area with CDA 230, where courts are still undecided on what actions boot an organization out of the protected category. If Facebook starts acting too much like a publisher, or what CDA 230 calls an "information content provider," it may open itself up to liability.

Given that one of the traditional roles of publishers is to choose which stories to show their readers, a court might decide that Facebook is actually a publisher. After all, if I'm following 100 people and Facebook only shows me "news" from 20 of them, is that not taking an editorial role?

Courts could decide it either way. It's not out of the realm of possibility that a few important court cases could redefine Facebook as an information content provider. That would force the company to take responsibility for fake news in the most dramatic way possible. It would also destroy its business.

Put up or shut up, Facebook

Facebook wants to have it both ways, both ethically and legally. The company wants to be seen as a source of news; it offers media outlets a chance to use Facebook Instant because having native content from the New York Times enhances Facebook's credibility. And yet it also wants to retreat into the safe harbor of being a mere intermediary that does not distinguish between Uncle Tommy's rants and a Washington Post investigation. That's why Zuckerberg has responded to criticisms of fake news by saying Facebook's job is not to determine what's true.

Facebook needs to stop playing both sides, but it keeps trying to walk the line. Vice correspondent Nellie Bowles said on Twitter, "On the phone with Facebook PR and they literally ask me 'what is truth'." As Bowles pointed out in a later article, Facebook already has an answer to this question: the company just officially banned fake news sources from buying ads. Clearly, Zuckerberg et al. know how to distinguish real from fake but choose to feign ignorance when it suits them.

Facebook, stop it. Stop calling it a "news feed." Call it a "friend feed." Call it a "fun feed." Call it whatever you want—just don't call it news, because it's a dangerous misrepresentation for the 44% of Americans who look to Facebook for actual news.

There are several ways forward from here. Facebook could get out of the news business entirely and go back to being a pure social network. It could abandon media partnership initiatives like Facebook Instant and Facebook Live. It could tune its algorithms to feed people only fun updates from friends, and enjoy the benefits of being one of the world's biggest entertainment companies.

Or the company could go in a more radical direction and declare itself an actual news provider. It could hire a legitimate team of news writers and editors and bring readers the news they seem to crave. But this scenario seems unlikely, given what I said earlier about how Facebook's business model depends on CDA 230 immunity. Almost nothing would get posted to Facebook if the company had to take full legal responsibility for everything in the feed.

Or, as a third option, Facebook could keep walking the line, but in a much more honest fashion. It might do this by creating a separate news section, curated in part by humans and perhaps by better algorithms. This section could draw from a vast array of professional media organizations, the same way Google News does. It would also be separate from the friend feed, to make it easy for users to understand when they're reading something from an accountable organization and when they're reading goofball crap.
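As a rough illustration of how that separation might work, here is a short Python sketch that routes incoming items into a "news" tab or a "friends" tab using a whitelist of accountable outlets. The domain list and item fields are assumptions for illustration, not Facebook's actual data model, and real curation would also involve human editors and far richer signals.

```python
# Illustrative whitelist of professional media domains (assumed, not real config).
ACCOUNTABLE_SOURCES = {"nytimes.com", "washingtonpost.com",
                       "motherjones.com", "nationalreview.com"}

def split_feed(items):
    """Route each item to the 'news' tab if its source domain is on the
    whitelist of accountable outlets, otherwise to the friend feed."""
    news, friends = [], []
    for item in items:
        if item.get("source_domain") in ACCOUNTABLE_SOURCES:
            news.append(item)
        else:
            friends.append(item)
    return {"news": news, "friends": friends}

items = [
    {"author": "Uncle Tommy", "source_domain": None, "text": "chemtrails!!"},
    {"author": "NYT", "source_domain": "nytimes.com", "text": "Investigation..."},
]
tabs = split_feed(items)
print(len(tabs["news"]), len(tabs["friends"]))  # -> 1 1
```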

No matter what Facebook does, it needs to stop pretending that "news" can be anything from anyone, granting equal weight to the trivial and the truly important. Most people think that news means truth and accountability. The public depends on the company to help show them what's important in the world. Facebook needs to take full responsibility for posting news or get out of the game. That's not just a matter of ethics—it's good business.


Sources:
http://arstechnica.com/staff/2016/11/its-time-to-get-rid-of-the-facebook-news-feed-because-its-not-news/

Article Rating: 1.0000



You have the right to stay anonymous in your comments, share at your own discretion.

Anonymous: 2017-01-02 07:14:50 ID:25

It should just draw from the front pages of BBC, FOX, CNN, and NBC, take all the stories that they all share, and use those.

Anonymous: 2017-01-11 12:24:09 ID:69

All of those are propaganda as well, though.

Anonymous: 2017-01-22 09:44:42 ID:96

No different than this website claiming to be freedom-based news. The indented "anonymous" replies apparently come from the website owner, unless maybe from the "journalists" attached to each article.