
Dystopias: Group items tagged "memes"


Ed Webb

Endtime for Hitler: On the Downfall of the Downfall Parodies - Mark Dery - Doom Patrol:...

  • Endtime for Hitler: On the Downfall of the Downfall Parodies
  • Hitler left an inexhaustible fund of unforgettable images; Riefenstahl’s Triumph of the Will alone is enough to make him a household deity of the TV age.
  • The Third Reich was the first thoroughly modern totalitarian horror, scripted by Hitler and mass-marketed by Goebbels, a tour de force of media spectacle and opinion management that America’s hidden persuaders—admen, P.R. flacks, political campaign managers—studied assiduously.  A Mad Man in both senses, Hitler sold the German volk on a racially cleansed utopia, a thousand-year empire whose kitschy grandeur was strictly Forest Lawn Parthenon.
  • Hitler, unlike Stalin or Mao, was an intuitive master of media stagecraft. David Bowie’s too-clever quip that Hitler was the first rock star, for which Bowie was widely reviled at the time, was spot-on.
  • the media like Hitler because Hitler liked the media
  • Perhaps that’s why he continues to mesmerize us: because he flickers, irresolvably, between the seemingly inhuman and the all too human.
  • His psychopathology is a queasy funhouse reflection, straight out of Nightmare Alley, of the instrumental rationality of the machine age. The genocidal assembly lines of Hitler’s death camps are a grotesque parody of Fordist mechanization, just as the Nazis’ fastidious recycling of every remnant of their victims but their smoke—their gold fillings melted down for bullion, their hair woven into socks for U-boat crewmen—is a depraved caricature of the Taylorist mania for workplace efficiency.
  • there’s something perversely comforting about Hitler’s unchallenged status as the metaphysical gravitational center of all our attempts at philosophizing evil
  • he prefigured postmodernity: the annexation of politics by Hollywood and Madison Avenue, the rise of the celebrity as a secular icon, the confusion of image and reality in a Matrix world. He regarded existence “as a kind of permanent parade before a gigantic audience” (Fest), calculating the visual impact of every histrionic pose, every propaganda tagline, every monumental building
  • By denying everyone’s capability, at least in theory, for Hitlerian evil, we let ourselves off the hook
  • Yet Hitler, paradoxically, is also a shriveled untermensch, the prototypical nonentity; a face in the crowd in an age of crowds, instantly forgettable despite his calculated efforts to brand himself (the toothbrush mustache of the military man coupled with the flopping forelock of the art-school bohemian)
  • there was always a comic distance between the public image of the world-bestriding, godlike Führer and his Inner Adolf, a nail-biting nebbish tormented by flatulence. Knowingly or not, the Downfall parodies dance in the gap between the two. More immediately, they rely on the tried-and-true gimmick of bathos. What makes the Downfall parodies so consistently hilarious is the incongruity of whatever viral topic is making the Führer go ballistic and the outsized scale of his Götterdämmerung-strength tirade.
  • The Downfall meme dramatizes the cultural logic of our remixed, mashed-up times, when digital technology allows us to loot recorded history, prying loose any signifier that catches our magpie eyes and repurposing it to any end. The near-instantaneous speed with which parodists use these viral videos to respond to current events underscores the extent to which the social Web, unlike the media ecologies of Hitler’s day, is a many-to-many phenomenon, more collective cacophony than one-way rant. As well, the furor (forgive pun) over YouTube’s decision to capitulate to the movie studio’s takedown demand, rather than standing fast in defense of Fair Use (a provision in copyright law that protects the re-use of a work for purposes of parody), indicates the extent to which ordinary people feel that commercial culture is somehow theirs, to misread or misuse as the spirit moves them.
  • the closest thing we have to a folk culture, the connective tissue that binds us as a society
  • SPIEGEL: Can you also get your revenge on him by using comedy? Brooks: “Yes, absolutely. Of course it is impossible to take revenge for 6 million murdered Jews. But by using the medium of comedy, we can try to rob Hitler of his posthumous power and myths. [...] We take away from him the holy seriousness that always surrounded him and protected him like a cordon.”
  • risking the noose, some Germans laughed off their fears and mocked the Orwellian boot stamping on the human face, giving vent to covert opposition through flüsterwitze (“whispered jokes”). Incredibly, even Jews joked about their plight, drawing on the absurdist humor that is quintessentially Jewish to mock the Nazis even as they lightened the intolerable burden of Jewish life in the shadow of the swastika. Rapaport offers a sample of Jewish humor in Hitler’s Germany: “A Jew is arrested during the war, having been denounced for killing a Nazi at 10 P.M. and even eating the brain of his victim. This is his defense: In the first place, a Nazi hasn’t got any brain. Secondly, a Jew doesn’t eat anything that comes from a pig. And thirdly, he could not have killed the Nazi at 10 P.M. because at that time everybody listens to the BBC broadcast.”
  • Brilliant
Ed Webb

The Digital Maginot Line

  • The Information World War has already been going on for several years. We called the opening skirmishes “media manipulation” and “hoaxes”, assuming that we were dealing with ideological pranksters doing it for the lulz (and that lulz were harmless). In reality, the combatants are professional, state-employed cyberwarriors and seasoned amateur guerrillas pursuing very well-defined objectives with military precision and specialized tools. Each type of combatant brings a different mental model to the conflict, but uses the same set of tools.
  • There are also small but highly-skilled cadres of ideologically-motivated shitposters whose skill at information warfare is matched only by their fundamental incomprehension of the real damage they’re unleashing for lulz. A subset of these are conspiratorial — committed truthers who were previously limited to chatter on obscure message boards until social platform scaffolding and inadvertently-sociopathic algorithms facilitated their evolution into leaderless cults able to spread a gospel with ease.
  • There’s very little incentive not to try everything: this is a revolution that is being A/B tested.
  • The combatants view this as a Hobbesian information war of all against all and a tactical arms race; the other side sees it as a peacetime civil governance problem.
  • Our most technically-competent agencies are prevented from finding and countering influence operations because of the concern that they might inadvertently engage with real U.S. citizens as they target Russia’s digital illegals and ISIS’ recruiters. This capability gap is eminently exploitable; why execute a lengthy, costly, complex attack on the power grid when there is virtually no cost, in terms of dollars as well as consequences, to attack a society’s ability to operate with a shared epistemology? This leaves us in a terrible position, because there are so many more points of failure
  • Cyberwar, most people thought, would be fought over infrastructure — armies of state-sponsored hackers and the occasional international crime syndicate infiltrating networks and exfiltrating secrets, or taking over critical systems. That’s what governments prepared and hired for; it’s what defense and intelligence agencies got good at. It’s what CSOs built their teams to handle. But as social platforms grew, acquiring standing audiences in the hundreds of millions and developing tools for precision targeting and viral amplification, a variety of malign actors simultaneously realized that there was another way. They could go straight for the people, easily and cheaply. And that’s because influence operations can, and do, impact public opinion. Adversaries can target corporate entities and transform the global power structure by manipulating civilians and exploiting human cognitive vulnerabilities at scale. Even actual hacks are increasingly done in service of influence operations: stolen, leaked emails, for example, were profoundly effective at shaping a national narrative in the U.S. election of 2016.
  • The substantial time and money spent on defense against critical-infrastructure hacks is one reason why poorly-resourced adversaries choose to pursue a cheap, easy, low-cost-of-failure psy-ops war instead
  • Information war combatants have certainly pursued regime change: there is reasonable suspicion that they succeeded in a few cases (Brexit) and clear indications of it in others (Duterte). They’ve targeted corporations and industries. And they’ve certainly gone after mores: social media became the main battleground for the culture wars years ago, and we now describe the unbridgeable gap between two polarized Americas using technological terms like filter bubble. But ultimately the information war is about territory — just not the geographic kind. In a warm information war, the human mind is the territory. If you aren’t a combatant, you are the territory. And once a combatant wins over a sufficient number of minds, they have the power to influence culture and society, policy and politics.
  • If an operation is effective, the message will be pushed into the feeds of sympathetic real people who will amplify it themselves. If it goes viral or triggers a trending algorithm, it will be pushed into the feeds of a huge audience. Members of the media will cover it, reaching millions more. If the content is false or a hoax, perhaps there will be a subsequent correction article – it doesn’t matter, no one will pay attention to it.
  • The 2014-2016 influence operation playbook went something like this: a group of digital combatants decided to push a specific narrative, something that fit a long-term narrative but also had a short-term news hook. They created content: sometimes a full blog post, sometimes a video, sometimes quick visual memes. The content was posted to platforms that offer discovery and amplification tools. The trolls then activated collections of bots and sockpuppets to blanket the biggest social networks with the content. Some of the fake accounts were disposable amplifiers, used mostly to create the illusion of popular consensus by boosting like and share counts. Others were highly backstopped personas run by real human beings, who developed standing audiences and long-term relationships with sympathetic influencers and media; those accounts were used for precision messaging with the goal of reaching the press. Israeli company Psy Group marketed precisely these services to the 2016 Trump Presidential campaign; as their sales brochure put it, “Reality is a Matter of Perception”.
  • This shift from targeting infrastructure to targeting the minds of civilians was predictable. Theorists like Edward Bernays, Hannah Arendt, and Marshall McLuhan saw it coming decades ago. As early as 1970, McLuhan wrote, in Culture is our Business, “World War III is a guerrilla information war with no division between military and civilian participation.”
  • Combatants are now focusing on infiltration rather than automation: leveraging real, ideologically-aligned people to inadvertently spread real, ideologically-aligned content instead. Hostile state intelligence services in particular are now increasingly adept at operating collections of human-operated precision personas, often called sockpuppets, or cyborgs, that will escape punishment under the bot laws. They will simply work harder to ingratiate themselves with real American influencers, to join real American retweet rings. If combatants need to quickly spin up a digital mass movement, well-placed personas can rile up a sympathetic subreddit or Facebook Group populated by real people, hijacking a community in the way that parasites mobilize zombie armies.
  • Attempts to legislate away 2016 tactics primarily have the effect of triggering civil libertarians, giving them an opportunity to push the narrative that regulators just don’t understand technology, so any regulation is going to be a disaster.
  • The entities best suited to mitigate the threat of any given emerging tactic will always be the platforms themselves, because they can move fast when so inclined or incentivized. The problem is that many of the mitigation strategies advanced by the platforms are the information integrity version of greenwashing; they’re a kind of digital security theater, the TSA of information warfare
  • Algorithmic distribution systems will always be co-opted by the best resourced or most technologically capable combatants. Soon, better AI will rewrite the playbook yet again — perhaps the digital equivalent of Blitzkrieg in its potential for capturing new territory. AI-generated audio and video deepfakes will erode trust in what we see with our own eyes, leaving us vulnerable both to faked content and to the discrediting of the actual truth by insinuation. Authenticity debates will commandeer media cycles, pushing us into an infinite loop of perpetually investigating basic facts. Chronic skepticism and the cognitive DDoS will increase polarization, leading to a consolidation of trust in distinct sets of right- and left-wing authority figures – thought oligarchs speaking to entirely separate groups
  • platforms aren’t incentivized to engage in the profoundly complex arms race against the worst actors when they can simply point to transparency reports showing that they caught a fair number of the mediocre actors
  • What made democracies strong in the past — a strong commitment to free speech and the free exchange of ideas — makes them profoundly vulnerable in the era of democratized propaganda and rampant misinformation. We are (rightfully) concerned about silencing voices or communities. But our commitment to free expression makes us disproportionately vulnerable in the era of chronic, perpetual information war. Digital combatants know that once speech goes up, we are loath to moderate it; to retain this asymmetric advantage, they push an all-or-nothing absolutist narrative that moderation is censorship, that spammy distribution tactics and algorithmic amplification are somehow part of the right to free speech.
  • We need an understanding of free speech that is hardened against the environment of a continuous warm war on a broken information ecosystem. We need to defend the fundamental value from itself becoming a prop in a malign narrative.
  • Unceasing information war is one of the defining threats of our day. This conflict is already ongoing, but (so far, in the United States) it’s largely bloodless and so we aren’t acknowledging it despite the huge consequences hanging in the balance. It is as real as the Cold War was in the 1960s, and the stakes are staggeringly high: the legitimacy of government, the persistence of societal cohesion, even our ability to respond to the impending climate crisis.
  • Influence operations exploit divisions in our society using vulnerabilities in our information ecosystem. We have to move away from treating this as a problem of giving people better facts, or stopping some Russian bots, and move towards thinking about it as an ongoing battle for the integrity of our information infrastructure – easily as critical as the integrity of our financial markets.
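The amplification mechanics quoted in the playbook annotation above — disposable bot accounts inflating like and share counts to create the illusion of consensus and trip a trending algorithm — can be sketched in a few lines. This is an illustrative model only, not any platform's actual code: the threshold, the engagement weights, and the one-like-one-share bot behavior are all assumptions.

```python
# Hypothetical sketch of bot amplification against a naive trending rule.
# All numbers and weights are invented for illustration.
TRENDING_THRESHOLD = 500  # assumed engagement score needed to "trend"

def engagement_score(likes: int, shares: int) -> int:
    # Assumption: shares are weighted more heavily than likes.
    return likes + 3 * shares

def amplify(likes: int, shares: int, bots: int) -> tuple[int, int]:
    # Each disposable bot account likes and shares the post once.
    return likes + bots, shares + bots

organic_likes, organic_shares = 40, 20  # modest real engagement
boosted_likes, boosted_shares = amplify(organic_likes, organic_shares, bots=150)

# Organic engagement alone stays below the threshold...
print(engagement_score(organic_likes, organic_shares) >= TRENDING_THRESHOLD)  # False
# ...but a small bot fleet pushes it over, after which real sympathetic
# users see it in their feeds and amplify it themselves.
print(engagement_score(boosted_likes, boosted_shares) >= TRENDING_THRESHOLD)  # True
```

Once the threshold is crossed, the article notes, the operation no longer needs the bots: organic distribution and press coverage take over.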
Ed Webb

At age 13, I joined the alt-right, aided by Reddit and Google

  • Now, I’m 16, and I’ve been able to reflect on how I got sucked into that void—and how others do, too. My brief infatuation with the alt-right has helped me understand the ways big tech companies and their algorithms are contributing to the problem of radicalization—and why it’s so important to be skeptical of what you read online.
  • while a quick burst of radiation probably won’t give you cancer, prolonged exposure is far more dangerous. The same is true for the alt-right. I knew that the messages I was seeing were wrong, but the more I saw them, the more curious I became. I was unfamiliar with most of the popular discussion topics on Reddit. And when you want to know more about something, what do you do? You probably don’t think to go to the library and check out a book on that subject, and then fact check and cross reference what you find. If you just google what you want to know, you can get the information you want within seconds.
  • I started googling things like “Illegal immigration,” “Sandy Hook actors,” and “Black crime rate.” And I found exactly what I was looking for.
  • The articles and videos I first found all backed up what I was seeing on Reddit—posts that asserted a skewed version of actual reality, using carefully selected, out-of-context, and dubiously sourced statistics that propped up a hateful world view. On top of that, my online results were heavily influenced by something called an algorithm. I understand algorithms to be secretive bits of code that a website like YouTube will use to prioritize content that you are more likely to click on first. Because all of the content I was reading or watching was from far-right sources, all of the links that the algorithms dangled on my screen for me to click were from far-right perspectives.
  • I spent months isolated in my room, hunched over my computer, removing and approving memes on Reddit and watching conservative “comedians” that YouTube served up to me.
  • The inflammatory language and radical viewpoints used by the alt-right worked to YouTube and Google’s favor—the more videos and links I clicked on, the more ads I saw, and in turn, the more ad revenue they generated.
  • the biggest step in my recovery came when I attended a pro-Trump rally in Washington, D.C., in September 2017, about a month after the “Unite the Right” rally in Charlottesville, Virginia
  • The difference between the online persona of someone who identifies as alt-right and the real thing is so extreme that you would think they are different people. Online, they have the power of fake and biased news to form their arguments. They sound confident and usually deliver their standard messages strongly. When I met them in person at the rally, they were awkward and struggled to back up their statements. They tripped over their own words, and when they were called out by any counterprotesters in the crowd, they would immediately use a stock response such as “You’re just triggered.”
  • Seeing for myself that the people I was talking to online were weak, confused, and backwards was the turning point for me.
  • we’re too far gone to reverse the damage that the alt-right has done to the internet and to naive adolescents who don’t know any better—children like the 13-year-old boy I was. It’s convenient for a massive internet company like Google to deliberately ignore why people like me get misinformed in the first place, as their profit-oriented algorithms continue to steer ignorant, malleable people into the jaws of the far-right
  • Dylann Roof, the white supremacist who murdered nine people in a Charleston, South Carolina, church in 2015, was radicalized by far-right groups that spread misinformation with the aid of Google’s algorithms.
  • Over the past couple months, I’ve been getting anti-immigration YouTube ads that feature an incident presented as a “news” story, about two immigrants who raped an American girl. The ad offers no context or sources, and uses heated language to denounce immigration and call for our county to allow ICE to seek out illegal immigrants within our area. I wasn’t watching a video about immigration or even politics when those ads came on; I was watching the old Monty Python “Cheese Shop” sketch. How does British satire, circa 1972, relate to America’s current immigration debate? It doesn’t.
  • tech companies need to be held accountable for the radicalization that results from their systems and standards.
  • anyone can be manipulated like I was. It’s so easy to find information online that we collectively forget that so much of the content the internet offers us is biased
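The feedback loop the author describes — every far-right click making far-right results rank higher — can be sketched as a toy ranking function. This is a hypothetical illustration, not YouTube's or Google's real algorithm; the topic labels and the click-count scoring rule are assumptions.

```python
from collections import Counter

def rank(candidates, click_history):
    """Rank candidate items by affinity to topics the user already clicked.

    Toy model of engagement-driven recommendation: the more a topic was
    clicked in the past, the higher new items on that topic are ranked,
    which in turn makes them more likely to be clicked again.
    """
    topic_counts = Counter(item["topic"] for item in click_history)
    # Counter returns 0 for unseen topics, so unfamiliar content sinks.
    return sorted(candidates, key=lambda item: topic_counts[item["topic"]], reverse=True)

history = [{"topic": "far-right"}, {"topic": "far-right"}, {"topic": "comedy"}]
feed = rank(
    [{"id": 1, "topic": "news"}, {"id": 2, "topic": "far-right"}, {"id": 3, "topic": "comedy"}],
    history,
)
print([item["id"] for item in feed])  # → [2, 3, 1]
```

With two prior far-right clicks, the far-right candidate ranks first and neutral news last — the self-reinforcing spiral the essay warns about, produced by nothing more sinister than optimizing for predicted engagement.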