
Group items tagged: reddit

Ed Webb

At age 13, I joined the alt-right, aided by Reddit and Google

  • Now, I’m 16, and I’ve been able to reflect on how I got sucked into that void—and how others do, too. My brief infatuation with the alt-right has helped me understand the ways big tech companies and their algorithms are contributing to the problem of radicalization—and why it’s so important to be skeptical of what you read online.
  • while a quick burst of radiation probably won’t give you cancer, prolonged exposure is far more dangerous. The same is true for the alt-right. I knew that the messages I was seeing were wrong, but the more I saw them, the more curious I became. I was unfamiliar with most of the popular discussion topics on Reddit. And when you want to know more about something, what do you do? You probably don’t think to go to the library and check out a book on that subject, and then fact check and cross reference what you find. If you just google what you want to know, you can get the information you want within seconds.
  • I started googling things like “Illegal immigration,” “Sandy Hook actors,” and “Black crime rate.” And I found exactly what I was looking for.
  • The articles and videos I first found all backed up what I was seeing on Reddit—posts that asserted a skewed version of actual reality, using carefully selected, out-of-context, and dubiously sourced statistics that propped up a hateful world view. On top of that, my online results were heavily influenced by something called an algorithm. I understand algorithms to be secretive bits of code that a website like YouTube will use to prioritize content that you are more likely to click on first. Because all of the content I was reading or watching was from far-right sources, all of the links that the algorithms dangled on my screen for me to click were from far-right perspectives.
  • I spent months isolated in my room, hunched over my computer, removing and approving memes on Reddit and watching conservative “comedians” that YouTube served up to me.
  • The inflammatory language and radical viewpoints used by the alt-right worked in YouTube and Google’s favor—the more videos and links I clicked on, the more ads I saw, and in turn, the more ad revenue they generated.
  • the biggest step in my recovery came when I attended a pro-Trump rally in Washington, D.C., in September 2017, about a month after the “Unite the Right” rally in Charlottesville, Virginia
  • The difference between the online persona of someone who identifies as alt-right and the real thing is so extreme that you would think they are different people. Online, they have the power of fake and biased news to form their arguments. They sound confident and usually deliver their standard messages strongly. When I met them in person at the rally, they were awkward and struggled to back up their statements. They tripped over their own words, and when they were called out by any counter-protestors in the crowd, they would immediately use a stock response such as “You’re just triggered.”
  • Seeing for myself that the people I was talking to online were weak, confused, and backwards was the turning point for me.
  • we’re too far gone to reverse the damage that the alt-right has done to the internet and to naive adolescents who don’t know any better—children like the 13-year-old boy I was. It’s convenient for a massive internet company like Google to deliberately ignore why people like me get misinformed in the first place, as their profit-oriented algorithms continue to steer ignorant, malleable people into the jaws of the far-right
  • Dylann Roof, the white supremacist who murdered nine people in a Charleston, South Carolina, church in 2015, was radicalized by far-right groups that spread misinformation with the aid of Google’s algorithms.
  • Over the past couple months, I’ve been getting anti-immigration YouTube ads that feature an incident presented as a “news” story, about two immigrants who raped an American girl. The ad offers no context or sources, and uses heated language to denounce immigration and call for our county to allow ICE to seek out illegal immigrants within our area. I wasn’t watching a video about immigration or even politics when those ads came on; I was watching the old Monty Python “Cheese Shop” sketch. How does British satire, circa 1972, relate to America’s current immigration debate? It doesn’t.
  • tech companies need to be held accountable for the radicalization that results from their systems and standards.
  • anyone can be manipulated like I was. It’s so easy to find information online that we collectively forget that so much of the content the internet offers us is biased
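
The annotations above describe recommendation algorithms, in plain terms, as engagement-driven ranking: code that promotes whatever a given user is most likely to click. As a rough, hypothetical illustration of that feedback loop only (this is not YouTube's or Google's actual system; every name and data structure below is invented), a ranker that scores candidates purely by past clicks on the same topic will keep surfacing more of whatever the user already engages with:

# Hypothetical sketch of engagement-driven ranking; not any real platform's code.
from collections import Counter

def rank_by_engagement(candidates, click_history):
    """Order candidate items so that topics the user clicked most come first."""
    topic_clicks = Counter(item["topic"] for item in click_history)
    # "Predicted engagement" here is just the count of past clicks on the same
    # topic; anything the user has never clicked scores zero and sinks down.
    return sorted(candidates,
                  key=lambda item: topic_clicks[item["topic"]],
                  reverse=True)

history = [{"topic": "far-right commentary"}] * 5 + [{"topic": "comedy"}]
candidates = [
    {"title": "Mainstream news report", "topic": "news"},
    {"title": "Sketch comedy clip", "topic": "comedy"},
    {"title": "Reactionary 'comedian' rant", "topic": "far-right commentary"},
]
for item in rank_by_engagement(candidates, history):
    print(item["title"])  # the already-clicked topic ranks first

Even this toy version shows why a reader who clicks a few far-right links soon sees mostly far-right links: the ranking optimizes for predicted clicks, not for accuracy or balance.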
Ed Webb

The "manosphere" is getting more toxic as angry men join the incels - MIT Technology Re... - 0 views

  • speech in the most extreme manosphere groups on Reddit, known as subreddits, was far more hateful than the speech of a random sample of Reddit users, and more on the wavelength of fringe far-right hate groups like those that frequent the social network Gab. And it’s getting worse. Over time the toxicity score has risen across all manosphere forums.
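
The rising “toxicity score” described above is a measurement: researchers score individual posts and average the scores per forum over time. The study behind the article used trained classifiers; the sketch below is only a toy, lexicon-based stand-in (the word list and function names are invented) meant to show what an average toxicity score per year looks like as a computation:

# Toy illustration only; real toxicity scoring uses trained classifiers,
# not a hand-written word list.
TOXIC_TERMS = {"hate", "vermin", "subhuman"}  # stand-in lexicon

def toxicity(post: str) -> float:
    """Fraction of words in a post that appear in the toxic lexicon."""
    words = post.lower().split()
    if not words:
        return 0.0
    return sum(w.strip(".,!?") in TOXIC_TERMS for w in words) / len(words)

def mean_toxicity_by_year(posts):
    """posts: iterable of (year, text) pairs; returns {year: mean score}."""
    totals, counts = {}, {}
    for year, text in posts:
        totals[year] = totals.get(year, 0.0) + toxicity(text)
        counts[year] = counts.get(year, 0) + 1
    return {year: totals[year] / counts[year] for year in totals}

example = [(2013, "regular discussion about a game patch"),
           (2016, "these people are subhuman vermin")]
print(mean_toxicity_by_year(example))  # the 2016 average is higher

A rising curve of such yearly averages is what a claim like “the toxicity score has risen across all manosphere forums” summarizes.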
Ed Webb

4channers Hunt Down Detroit Couple Taunting Dying Girl While Reddit Donates to the Vict...

  • I love how fast the internet acts on shit like this. This article was posted at 2:30 today. The husband apologized by 6:30. lol
  • Mutual coercion, or simply coercion? Who polices taste? Do we applaud the actions of 4channers and other lulz-seeking internet vigilantes?
Ed Webb

The fight against toxic gamer culture has moved to the classroom - The Verge

  • If there were any lessons to be learned from Gamergate — from how to recognize bad faith actors or steps on how to protect yourself, to failings in law enforcement or therapy focused on the internet — the education system doesn’t seem to have fully grasped these concepts.
  • It’s a problem that goes beyond topics specific to the gaming industry, extending to feminism, politics, and philosophy. “Suddenly everyone who watches Jordan Peterson videos thinks they know what postmodernism is,” says Emma Vossen, a postdoctoral fellow with a PhD in gender and games. These problems with students are not about disagreements or debates, or even about kids acting out, but rather about harassers in the classroom who have tapped into social media as a powerful weapon. Many educators can’t grasp that, says Vossen. “This is about students who could potentially access this hate movement that’s circling around you and use it against you,” she says. “This is about being afraid to give bad marks to students because they might go to their favorite YouTuber with a little bit of personal information about you that could be used to dox you.” Every word you say can be taken out of context, twisted, and used against you. “Education has no idea how to deal with this problem,” Vossen says. “And I think it’s only going to get worse.”
  • An educator’s job is no longer just about teaching, but helping students unlearn false or even harmful information they’ve picked up from the internet.
  • “If we started teaching students the basics of feminism at a very young age,” Wilcox says, “they would have a far better appreciation for how different perspectives will lead to different outcomes, and how the distribution of power and privilege in society can influence who gets to speak in the first place.”