How YouTube Drives People to the Internet's Darkest Corners - WSJ
www.wsj.com/...ets-darkest-corners-1518020478
shared by Javier E on 10 Feb 18
youtube social media advertising google partisan extremism research bias
-
YouTube is the new television, with more than 1.5 billion users, and videos the site recommends have the power to influence viewpoints around the world.
-
Those recommendations often present divisive, misleading or false content despite changes the site has recently made to highlight more-neutral fare, a Wall Street Journal investigation found.
-
Behind that growth is an algorithm that creates personalized playlists. YouTube says these recommendations drive more than 70% of its viewing time, making the algorithm one of the biggest determinants of what people watch.
-
People cumulatively watch more than a billion YouTube hours daily world-wide, a 10-fold increase from 2012.
-
After the Journal this week provided examples of how the site still promotes deceptive and divisive videos, YouTube executives said the recommendations were a problem.
-
When users show a political bias in what they choose to view, YouTube typically recommends videos that echo those biases, often with more-extreme viewpoints.
-
Such recommendations play into concerns about how social-media sites can amplify extremist voices, sow misinformation and isolate users in “filter bubbles.”
-
Unlike Facebook Inc.'s and Twitter Inc.'s sites, where users see content from accounts they choose to follow, YouTube takes an active role in pushing information to users that they likely wouldn't otherwise have seen.
-
“That sounded great when it was all about free speech and ‘in the marketplace of ideas, only the best ones win.’ But we’re seeing again and again that that’s not what happens. What’s happening instead is the systems are being gamed and people are being gamed.”
-
YouTube has been tweaking its algorithm since last autumn to surface what its executives call “more authoritative” news sources.
-
YouTube last week said it is considering a design change to promote relevant information from credible news sources alongside videos that push conspiracy theories.
-
The Journal investigation found YouTube’s recommendations often lead users to channels that feature conspiracy theories, partisan viewpoints and misleading videos, even when those users haven’t shown interest in such content.
-
YouTube engineered its algorithm several years ago to make the site “sticky”—to recommend videos that keep users staying to watch still more, said current and former YouTube engineers who helped build it. The site earns money selling ads that run before and during videos.
-
YouTube’s algorithm tweaks don’t appear to have changed how the site recommends videos on its home page, where the algorithm builds a personalized feed for each logged-in user based largely on what that user has watched.
-
There is another way to calculate recommendations, demonstrated by YouTube’s parent, Alphabet Inc.’s Google. It has designed its search-engine algorithms to recommend sources that are authoritative, not just popular.
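A toy contrast between the two ranking philosophies, with invented numbers and a made-up authority weight (nothing here reflects Google's or YouTube's actual signals):

```python
# Toy contrast: ranking purely by engagement vs. blending in a
# source-authority weight. All scores and weights are invented for
# this sketch; they are not Google's or YouTube's actual formula.
videos = [
    {"title": "Mainstream report", "views": 200_000, "authority": 0.9},
    {"title": "Fringe conspiracy", "views": 900_000, "authority": 0.1},
]

by_popularity = sorted(videos, key=lambda v: v["views"], reverse=True)
by_authority = sorted(videos, key=lambda v: v["views"] * v["authority"],
                      reverse=True)

print([v["title"] for v in by_popularity])  # fringe video ranks first
print([v["title"] for v in by_authority])   # mainstream report ranks first
```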
-
Google spokeswoman Crystal Dahlen said that Google improved its algorithm last year “to surface more authoritative content, to help prevent the spread of blatantly misleading, low-quality, offensive or downright false information,” adding that it is “working with the YouTube team to help share learnings.”
-
In recent weeks, YouTube has expanded that change to other news-related queries. Since then, the Journal’s tests show, news searches on YouTube return fewer videos from highly partisan channels.
-
YouTube’s recommendations became even more effective at keeping people on the site in 2016, when the company began employing an artificial-intelligence technique called a deep neural network that makes connections between videos that humans wouldn’t. The algorithm uses hundreds of signals, YouTube says, but the most important remains what a given user has watched.
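A minimal sketch of the watch-history-driven scoring such a system might perform, assuming each video has already been mapped to a learned embedding vector; the dimensions, data and scoring rule here are hypothetical simplifications, not YouTube's model:

```python
# Minimal embedding-based recommender sketch: average the embeddings of
# a user's watched videos, then rank candidates by similarity to that
# average. In a real system the embeddings would be learned by a deep
# network trained on engagement data; here they are random placeholders.
import numpy as np

rng = np.random.default_rng(0)
NUM_VIDEOS, EMBED_DIM = 1_000, 16
video_embeddings = rng.normal(size=(NUM_VIDEOS, EMBED_DIM))

def recommend(watch_history, k=5):
    """Rank videos by dot-product similarity to the mean embedding
    of the user's watch history."""
    user_vector = video_embeddings[watch_history].mean(axis=0)
    scores = video_embeddings @ user_vector
    scores[watch_history] = -np.inf  # never re-recommend watched videos
    return np.argsort(scores)[::-1][:k]

print(recommend(watch_history=[3, 42, 7]))
```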
-
Using a deep neural network makes the recommendations more of a black box to engineers than previous techniques.
-
“We don’t have to think as much,” he said. “We’ll just give it some raw data and let it figure it out.”
-
To better understand the algorithm, the Journal enlisted former YouTube engineer Guillaume Chaslot, who worked on its recommendation engine, to analyze thousands of YouTube’s recommendations on the most popular news-related queries.
-
Mr. Chaslot created a computer program that simulates the “rabbit hole” users often descend into when surfing the site. In the Journal study, the program collected the top five results to a given search. Next, it gathered the top three recommendations that YouTube promoted once the program clicked on each of those results. Then it gathered the top three recommendations for each of those promoted videos, continuing four clicks from the original search.
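A rough sketch of that tree-shaped crawl as the article describes it: five search results, then the top three recommendations at each step, four clicks deep. The two fetch functions are hypothetical stand-ins that return fake IDs so the sketch runs end to end; a real crawler would scrape the site or use an API:

```python
# Sketch of the rabbit-hole crawl: start from the top five search
# results, then repeatedly follow the top three recommended videos,
# four clicks deep. The fetch functions are hypothetical stand-ins
# returning fake IDs; a real crawler would fetch live YouTube data.

def fetch_search_results(query, n=5):
    """Stand-in for the top-n search results for a query."""
    return [f"{query}-result-{i}" for i in range(n)]

def fetch_recommendations(video_id, n=3):
    """Stand-in for the top-n 'Up next' recommendations for a video."""
    return [f"{video_id}/rec-{i}" for i in range(n)]

def crawl(query, depth=4, branching=3):
    """Collect every video reachable within `depth` recommendation
    clicks of the top search results."""
    frontier = fetch_search_results(query)
    collected = list(frontier)
    for _ in range(depth):
        frontier = [rec for vid in frontier
                    for rec in fetch_recommendations(vid, n=branching)]
        collected.extend(frontier)
    return collected

videos = crawl("north korea")
print(len(videos))  # 5 + 15 + 45 + 135 + 405 = 605 video IDs
```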
-
The first analysis, of November’s top search terms, showed YouTube frequently led users to divisive and misleading videos. On the 21 news-related searches that remained after eliminating queries about entertainment, sports and gaming (searches such as “Trump,” “North Korea” and “bitcoin”), YouTube most frequently recommended these videos:
-
The algorithm doesn’t seek out extreme videos, they said, but looks for clips that data show are already drawing high traffic and keeping people on the site. Those videos often tend to be sensationalist and on the extreme fringe, the engineers said.
-
Repeated tests by the Journal as recently as this week showed the home page often fed far-right or far-left videos to users who watched relatively mainstream news sources, such as Fox News and MSNBC.
-
Searching some topics and then returning to the home page without doing a new search can produce recommendations that push users toward conspiracy theories, even when those users sought out only mainstream sources.
-
After searching for “9/11” last month, then clicking on a single CNN clip about the attacks and returning to the home page, the fifth and sixth recommended videos were about claims the U.S. government carried out the attacks. One, titled “Footage Shows Military Plane hitting WTC Tower on 9/11—13 Witnesses React,” had 5.3 million views.