dr tech on 14 Jun 24: "Exposure to false and inflammatory content is remarkably low: just 1% of Twitter users accounted for 80% of exposure to dubious websites during the 2016 U.S. election, and that exposure is heavily concentrated among a small fringe of users actively seeking such content out. Further examples: 6.3% of YouTube users were responsible for 79.8% of exposure to extremist channels from July to December 2020, and 85% of vaccine-sceptical content was consumed by less than 1% of US citizens between 2016 and 2019. Conventional wisdom blames platform algorithms for spreading misinformation, but the evidence suggests user preferences play an outsized role: a mere 0.04% of YouTube's algorithmic recommendations directed users to extremist content. It's tempting to draw a straight line between social media use and societal ills, but studies rigorously designed to untangle cause and effect often come up short."