
Digit_al Society: Group items tagged "algorithm Trust"

dr tech

Who do you trust? How data is helping us decide | Technology | The Guardian - 0 views

  • "Should we embrace these new trust algorithms? Baveja and Shapiro acknowledge the responsibility that comes with trying to take ethical decisions and translate them into code. How much of our personal information do we want trawled through in this way? And how comfortable are we with letting an algorithm judge who is trustworthy?"
dr tech

YouTube is more likely to serve problematic videos than useful ones, study (and common ... - 1 views

  • "The streaming video company's recommendation algorithm can sometimes send you on an hours-long video binge so captivating that you never notice the time passing. But according to a study from software nonprofit Mozilla Foundation, trusting the algorithm means you're actually more likely to see videos featuring sexualized content and false claims than personalized interests."
dr tech

To Evaluate Meta's Shift, Focus on the Product Changes, Not the Moderation - 0 views

  • "The announcement that Meta would be changing their approach to political content and discussions of gender is concerning, though it is unclear exactly what those changes are. Given that many product changes regarding those content areas were used in high-risk settings, a change intended to allay US free speech concerns could lead to incitement of violence elsewhere. For example, per this post from Meta, reducing "content that has been shared by a chain of two or more people" was a content-neutral product change done to protect people in Ethiopia, where algorithms have been implicated in the spread of ethnic violence. A similar change - removing optimizations for reshared content - was discussed in this post concerning reductions in political content. Will those changes be undone? Globally? Such changes could also lead to increased amplification of attention-getting discussions of gender. Per this report from Equimundo and Futures Without Violence, 40% of young men trust at least one "manosphere" influencer - influencers who often exploit algorithmic incentives by posting increasingly extreme, attention-getting mixes of ideas about self-improvement, aggression, and traditional gender roles."
dr tech

- 0 views

  • "There is also a lot of research that both third-party fact-checking and Community Notes can be really effective at reducing misperceptions. But - and this is a significant caveat - neither works well as a complete solution for lies on social media. When Twitter was working on Birdwatch, they claimed it would "not replace other labels and fact checks Twitter currently uses". But as I've written about before, Musk scaled back Twitter's Trust and Safety team significantly and positioned Community Notes as the replacement. As Yoel Roth, Twitter's former head of Trust and Safety, told WIRED, "The intention of Birdwatch was always to be a complement to, rather than a replacement for, Twitter's other misinformation methods." In fact, research on various attempts to mitigate COVID misinformation found that a layered, "Swiss cheese" approach might work best, where some efforts work well sometimes, but collectively the system catches most falsehoods."
dr tech

Are Google search results politically biased? | Jeff Hancock et al | Opinion | The Guar... - 1 views

  • "This way of thinking about search results is wrong. Recent studies suggest that search engines, rather than providing a neutral way to find information, may actually play a major role in shaping public opinion on political issues and candidates. Some research has even argued that search results can affect the outcomes of close elections. In a study aptly titled In Google We Trust, participants heavily prioritized the first page of search results, and the order of the results on that page, and continued to do so even when researchers reversed the order of the actual results."