Why Facebook won't let you turn off its news feed algorithm - The Washington Post
shared by Javier E on 13 Nov 21
-
In at least two experiments over the years, Facebook has explored what happens when it turns off its controversial news feed ranking system — the software that decides for each user which posts they’ll see and in what order, internal documents show. Turning it off leaves users to see all the posts from all of their friends in simple, chronological order.
-
The internal research documents, some previously unreported, help to explain why Facebook seems so wedded to its automated ranking system, known as the news feed algorithm.
-
Previously reported internal documents, which Haugen provided to regulators and media outlets, including The Washington Post, have shown how Facebook crafts its ranking system to keep users hooked, sometimes at the cost of angering or misinforming them.
-
In testimony to U.S. Congress and abroad, whistleblower Frances Haugen has pointed to the algorithm as central to the social network’s problems, arguing that it systematically amplifies and rewards hateful, divisive, misleading and sometimes outright false content by putting it at the top of users’ feeds.
-
The political push raises an old question for Facebook: Why not just give users the power to turn off their feed ranking algorithms voluntarily? Would letting users opt to see every post from the people they follow, in chronological order, be so bad?
-
The documents suggest that Facebook’s defense of algorithmic rankings stems not only from its business interests, but from a paternalistic conviction, backed by data, that its sophisticated personalization software knows what users want better than the users themselves.
-
Since 2009, three years after it launched the news feed, Facebook has used software that predicts which posts each user will find most interesting and places those at the top of their feeds while burying others. That system, which has evolved in complexity to take in as many as 10,000 pieces of information about each post, has fueled the news feed’s growth into a dominant information source.
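The score-and-sort approach described here can be sketched in deliberately simplified form. The signal names, weights, and data below are invented for illustration only; Facebook's real system uses thousands of signals and learned models, none of which are public:

```python
from datetime import datetime, timezone

# Hypothetical posts with made-up engagement predictions. In a real system,
# these probabilities would come from machine-learned models, not constants.
POSTS = [
    {"author": "alice",
     "posted": datetime(2021, 11, 12, 11, 30, tzinfo=timezone.utc),
     "predicted_like": 0.10, "predicted_comment": 0.01},
    {"author": "bob",
     "posted": datetime(2021, 11, 12, 9, 0, tzinfo=timezone.utc),
     "predicted_like": 0.70, "predicted_comment": 0.20},
]

def chronological_feed(posts):
    """Newest posts first; no prediction involved."""
    return sorted(posts, key=lambda p: p["posted"], reverse=True)

def ranked_feed(posts):
    """Score each post from engagement predictions, highest score first."""
    def score(p):
        # Weighted sum of predicted interactions (weights are invented).
        return 1.0 * p["predicted_like"] + 5.0 * p["predicted_comment"]
    return sorted(posts, key=score, reverse=True)
```

With this toy data the two feeds disagree: chronological ordering surfaces alice's newer post first, while the ranked feed puts bob's higher-scoring post on top — which is exactly the kind of divergence the experiments described in the article were measuring.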
-
The proliferation of false information, conspiracy theories and partisan propaganda on Facebook and other social networks has led some to wonder whether we wouldn’t all be better off with a simpler, older system: one that simply shows people all the messages, pictures and videos from everyone they follow, in the order they were posted.
-
“Whenever we’ve tried to compare ranked and unranked feeds, ranked feeds just seem better,” wrote an employee in a memo titled “Is ranking good?”, which was posted to the company’s internal network, Facebook Workplace, in 2018.
-
That employee, who said they had worked on and studied the news feed for two years, went on to question whether automated ranking might also come with costs that are harder to measure than the benefits. “Even asking this question feels slightly blasphemous at Facebook,” they added.
-
In 2014, another internal report, titled “Feed ranking is good,” summarized the results of tests that found allowing users to turn off the algorithm led them to spend less time in their news feeds, post less often and interact less.
-
Without an algorithm deciding which posts to show at the top of users’ feeds, concluded the report’s author, whose name was redacted, “Facebook would probably be shrinking.”
-
What many users may not realize is that Facebook actually does offer an option to see a mostly chronological feed, called “most recent.”
-
But there’s a catch: The setting applies only for as long as you stay logged in. When you leave and come back, the ranking algorithm will be back on.
-
The longer Facebook left the user’s feed in chronological order, the less time they spent on it, the less they posted, and the less often they returned to Facebook.
-
A separate report from 2018, first described by Alex Kantrowitz’s newsletter Big Technology, found that turning off the algorithm unilaterally for a subset of Facebook users, and showing them posts mostly in the order they were posted, led to “massive engagement drops.” Notably, it also found that users saw more low-quality content in their feeds, at least at first, although the company’s researchers were able to mitigate that with more aggressive “integrity” measures.
-
Nick Clegg, the company’s vice president of global affairs, said in a TV interview last month that if Facebook were to remove the news feed algorithm, “the first thing that would happen is that people would see more, not less, hate speech; more, not less, misinformation; more, not less, harmful content. Why? Because those algorithmic systems precisely are designed like a great sort of giant spam filter to identify and deprecate and downgrade bad content.”
-
Because the algorithm has always been there, Facebook users haven’t been given the time or the tools to curate their feeds for themselves in thoughtful ways. In other words, Facebook has never really given a chronological news feed a fair shot to succeed.
-
Some critics say that’s a straw-man argument. Simply removing automated rankings for a subset of users, on a social network that has been built to rely heavily on those systems, is not the same as designing a service to work well without them, argued Ben Grosser, a professor of new media at the University of Illinois at Urbana-Champaign.
-
Those users’ feeds are no longer curated, but the posts they’re seeing are still influenced by the algorithm’s reward systems. That is, they’re still seeing content from people and publishers who are vying for the likes, shares and comments that drive Facebook’s recommendations.
-
“My experience from watching a chronological feed within a social network that isn’t always trying to optimize for growth is that a lot of these problems” — such as hate speech, trolling and manipulative media — “just don’t exist.”
-
Facebook has not taken an official stand on the legislation that would require social networks to offer a chronological feed option, but Clegg said in an op-ed last month that the company is open to regulation around algorithms, transparency, and user controls. Twitter, for its part, signaled potential support for the bills.
-
“I think users have the right to expect social media experiences free of recommendation algorithms,” Maréchal added. “As a user, I want to have as much control over my own experience as possible, and recommendation algorithms take that control away from me.”
-
“Only companies themselves can do the experiments to find the answers. And as talented as industry researchers are, we can’t trust executives to make decisions in the public interest based on that research, or to let the public and policymakers access that research.”