
Home/ Dystopias/ Group items tagged review


Ed Webb

A woman first wrote the prescient ideas Huxley and Orwell made famous - Quartzy - 1 views

  • In 1919, a British writer named Rose Macaulay published What Not, a novel about a dystopian future—a brave new world if you will—where people are ranked by intelligence, the government mandates mind training for all citizens, and procreation is regulated by the state. You’ve probably never heard of Macaulay or What Not. However, Aldous Huxley, author of the science fiction classic Brave New World, hung out in the same London literary circles as she did, and his 1932 book contains many concepts that Macaulay first introduced in her work. In 2019, you’ll be able to read Macaulay’s book yourself and compare the texts, as the British publisher Handheld Press is planning to re-release the forgotten novel in March. It’s been out of print since the year it was first released.
  • The resurfacing of What Not also makes this a prime time to consider another work that influenced Huxley’s Brave New World, the 1923 novel We by Yevgeny Zamyatin. What Not and We are lost classics about a future that foreshadows our present. Notably, they are also hidden influences on some of the most significant works of 20th century fiction, Brave New World and George Orwell’s 1984.
  • In Macaulay’s book—which is a hoot and well worth reading—a democratically elected British government has been replaced with a “United Council, five minds with but a single thought—if that,” as she put it. Huxley’s Brave New World is run by a similarly small group of elites known as “World Controllers.”
  • citizens of What Not are ranked based on their intelligence from A to C3 and can’t marry or procreate with someone of the same rank to ensure that intelligence is evenly distributed
  • Brave New World is more futuristic and preoccupied with technology than What Not. In Huxley’s world, procreation and education have become completely mechanized and emotions are strictly regulated pharmaceutically. Macaulay’s Britain is just the beginning of this process, and its characters are not yet completely indoctrinated into the new ways of the state—they resist it intellectually and question its endeavors, like the newly-passed Mental Progress Act. She writes: He did not like all this interfering, socialist what-not, which was both upsetting the domestic arrangements of his tenants and trying to put into their heads more learning than was suitable for them to have. For his part he thought every man had a right to be a fool if he chose, yes, and to marry another fool, and to bring up a family of fools too.
  • Where Huxley pairs dumb but pretty and “pneumatic” ladies with intelligent gentlemen, Macaulay’s work is decidedly less sexist.
  • We was published in French, Dutch, and German. An English version was printed and sold only in the US. When Orwell wrote about We in 1946, it was only because he’d managed to borrow a hard-to-find French translation.
  • While Orwell never indicated that he read Macaulay, he shares her subversive and subtle linguistic skills and satirical sense. His protagonist, Winston—like Kitty—works for the government in its Ministry of Truth, or Minitrue in Newspeak, where he rewrites historical records to support whatever Big Brother currently says is good for the regime. Macaulay would no doubt have approved of Orwell’s wit. And his state ministries bear a striking similarity to those she wrote about in What Not.
  • Orwell was familiar with Huxley’s novel and gave it much thought before writing his own blockbuster. Indeed, in 1946, before the release of 1984, he wrote a review of Zamyatin’s We (pdf), comparing the Russian novel with Huxley’s book. Orwell declared Huxley’s text derivative, writing in his review of We in The Tribune: The first thing anyone would notice about We is the fact—never pointed out, I believe—that Aldous Huxley’s Brave New World must be partly derived from it. Both books deal with the rebellion of the primitive human spirit against a rationalised, mechanized, painless world, and both stories are supposed to take place about six hundred years hence. The atmosphere of the two books is similar, and it is roughly speaking the same kind of society that is being described, though Huxley’s book shows less political awareness and is more influenced by recent biological and psychological theories.
  • In We, the story is told by D-503, a male engineer, while in Brave New World we follow Bernard Marx, a protagonist with a proper name. Both characters live in artificial worlds, separated from nature, and they recoil when they first encounter people who exist outside of the state’s constructed and controlled cities.
  • Although We is barely known compared to Orwell and Huxley’s later works, I’d argue that it’s among the best literary science fictions of all time, and it’s highly relevant, as it was when first written. Noam Chomsky calls it “more perceptive” than both 1984 and Brave New World. Zamyatin’s futuristic society was so on point, he was exiled from the Soviet Union because it was such an accurate description of life in a totalitarian regime, though he wrote it before Stalin took power.
  • Macaulay’s work is more subtle and funny than Huxley’s. Despite being a century old, What Not is remarkably relevant and readable, a satire that only highlights how little has changed in the years since its publication and how dangerous and absurd state policies can be. In this sense then, What Not reads more like George Orwell’s 1949 novel 1984 
  • Orwell was critical of Zamyatin’s technique. “[We] has a rather weak and episodic plot which is too complex to summarize,” he wrote. Still, he admired the work as a whole. “[Its] intuitive grasp of the irrational side of totalitarianism—human sacrifice, cruelty as an end in itself, the worship of a Leader who is credited with divine attributes—[…] makes Zamyatin’s book superior to Huxley’s,”
  • Like our own tech magnates and nations, the United State of We is obsessed with going to space.
  • Perhaps in 2019 Macaulay’s What Not, a clever and subversive book, will finally get its overdue recognition.
Ed Webb

Would You Protect Your Computer's Feelings? Clifford Nass Says Yes. - ProfHacker - The ... - 0 views

  • The Man Who Lied to His Laptop condenses for a popular audience an argument that Nass has been making for at least 15 years: humans do not differentiate between computers and people in their social interactions.
  • At first blush, this sounds absurd. Everyone knows that it's "just a computer," and of course computers don't have feelings. And yet. Nass has a slew of amusing stories—and, crucially, studies based on those stories—indicating that, no matter what "everyone knows," people act as if the computer secretly cares. For example: In one study, users reviewed a software package, either on the same computer they'd used it on, or on a different computer. Consistently, participants gave the software better ratings when they reviewed it on the same computer—as if they didn't want the computer to feel bad. What's more, Nass notes, "every one of the participants insisted that she or he would never bother being polite to a computer" (7).
  • Nass found that users given completely random praise by a computer program liked it more than the same program without praise, even though they knew in advance the praise was meaningless. In fact, they liked it as much as the same program, if they were told the praise was accurate. (In other words, flattery was as well received as praise, and both were preferred to no positive comments.) Again, when questioned about the results, users angrily denied any difference at all in their reactions.
  •  
    How do you interact with the computing devices in your life?
Ed Webb

Narrative Napalm | Noah Kulwin - 0 views

  • there are books whose fusion of factual inaccuracy and moral sophistry is so total that they can only be written by Malcolm Gladwell
  • Malcolm Gladwell’s decades-long shtick has been to launder contrarian thought and corporate banalities through his positions as a staff writer at The New Yorker and author at Little, Brown and Company. These institutions’ disciplining effect on Gladwell’s prose, getting his rambling mind to conform to clipped sentences and staccato revelations, has belied his sly maliciousness and explosive vacuity: the two primary qualities of Gladwell’s oeuvre.
  • as is typical with Gladwell’s books and with many historical podcasts, interrogation of the actual historical record and the genuine moral dilemmas it poses—not the low-stakes bait that he trots out as an MBA case study in War—is subordinated to fluffy bullshit and biographical color
  • by taking up military history, Gladwell’s half-witted didacticism threatens to convince millions of people that the only solution to American butchery is to continue shelling out for sharper and larger knives
  • Although the phrase “Bomber Mafia” traditionally refers to the pre-World War II staff and graduates of the Air Corps Tactical School, Gladwell’s book expands the term to include both kooky tinkerers and buttoned-down military men. Wild, far-seeing mavericks, they understood that the possibilities of air power had only just been breached. They were also, as Gladwell insists at various points, typical Gladwellian protagonists: secluded oddballs whose technical zealotry and shared mission gave them a sense of community that propelled them beyond any station they could have achieved on their own.
  • Gladwell’s narrative is transmitted as seamlessly as the Wall Street or Silicon Valley koans that appear atop LinkedIn profiles, Clubhouse accounts, and Substack missives.
  • Gladwell has built a career out of making banality seem fresh
  • Drawing a false distinction between the Bomber Mafia and the British and American military leaders who preceded them allows Gladwell to make the case that a few committed brainiacs developed a humane, “tactical” kind of air power that has built the security of the world we live in today.
  • By now, the press cycle for every Gladwell book release is familiar: experts and critics identify logical flaws and factual errors, they are ignored, Gladwell sells a zillion books, and the world gets indisputably dumber for it.
  • “What actually happened?” Gladwell asks of the Blitz. “Not that much! The panic never came,” he answers, before favorably referring to an unnamed “British government film from 1940,” which is in actuality the Academy Award-nominated propaganda short London Can Take It!, now understood to be emblematic of how the myth of the stoic Brit was manufactured.
  • Gladwell goes to great pains to portray Curtis “Bombs Away” LeMay as merely George Patton-like: a prima donna tactician with some masculinity issues. In reality, LeMay bears a closer resemblance to another iconic George C. Scott performance, one that LeMay directly inspired: Dr. Strangelove’s General Buck Turgidson, who at every turn attempts to force World War III and, at the movie’s close, when global annihilation awaits, soberly warns of a “mineshaft gap” between the United States and the Commies. That, as Gladwell might phrase it, was the “real” Curtis LeMay: a violent reactionary who was never killed or tried because he had the luck to wear the brass of the correct country on his uniform. “I suppose if I had lost the war, I would have been tried as a war criminal,” LeMay once told an Air Force cadet. “Fortunately, we were on the winning side.”
  • Why would Malcolm Gladwell, who seems to admire LeMay so much, talk at such great length about the lethality of LeMay’s Japanese firebombing? The answer lies in what this story leaves out. Mentioned only glancingly in Gladwell’s story are the atomic bombs dropped on Japan. The omission allows for a stupid and classically Gladwell argument: that indiscriminate firebombing brought a swift end to the war, and its attendant philosophical innovations continue to envelop us in a blanket of security that has not been adequately appreciated
  • While LeMay’s 1945 firebombing campaign was certainly excessive—and represented the same base indifference to human life that got Nazis strung up at Nuremberg—it did not end the war. The Japanese were not solely holding out because their military men were fanatical in ways that the Americans weren’t, as Gladwell seems to suggest, citing Conrad Crane, an Army staff historian and hagiographer of LeMay’s[1]; they were holding out because they wanted better terms of surrender—terms they had the prospect of negotiating with the Soviet Union. The United States, having already developed an atomic weapon—and having made the Soviet Union aware of it—decided to drop it as it became clear the Soviet Union was readying to invade Japan. On August 6, the United States dropped a bomb on Hiroshima. Three days later, and mere hours after the Soviet Union formally declared war on the morning of August 9, the Americans dropped the second atomic bomb on Nagasaki. An estimated 210,000 people were killed, the majority of them on the days of the bombings. It was the detonation of these bombs that forced the end of the war. The Japanese unconditional surrender to the Americans was announced on August 15 and formalized on the deck of the USS Missouri on September 2. As historians like Martin Sherwin and Tsuyoshi Hasegawa have pointed out, by dropping the bombs, the Truman administration had kept the Communist threat out of Japan. Imperial Japan was staunchly anticommunist, and under American post-war dominion, the country would remain that way. But Gladwell is unequipped to supply the necessary geopolitical context that could meaningfully explain why the American government would force an unconditional surrender when the possibility of negotiation remained totally live.
  • In 1968, he would join forces with segregationist George Wallace as the vice-presidential candidate on his “American Independent Party” ticket, a fact literally relegated to a footnote in Gladwell’s book. This kind of omission is par for the course in The Bomber Mafia. While Gladwell constantly reminds the reader that the air force leadership was trying to wage more effective wars so as to end all wars, he cannot help but shove under the rug that which is inconvenient
  • This is truly a lesson for the McKinsey set and passive-income crowd for whom The Bomber Mafia is intended: doing bad things is fine, so long as you privately feel bad about it.
  • The British advocacy group Action on Armed Violence just this month estimated that between 2016 and 2020 in Afghanistan, there were more than 2,100 civilians killed and 1,800 injured by air strikes; 37 percent of those killed were children.
  •  
    An appropriately savage review of Gladwell's foray into military history. Contrast with the elegance of KSR's The Lucky Strike which actually wrestles with the moral issues.
Ed Webb

Our Digitally Undying Memories - The Chronicle Review - The Chronicle of Higher Education - 0 views

  • as Viktor Mayer-Schönberger argues convincingly in his book Delete: The Virtue of Forgetting in the Digital Age (Princeton University Press, 2009), the costs of such powerful collective memory are often higher than we assume.
  • "Total recall" renders context, time, and distance irrelevant. Something that happened 40 years ago—whether youthful or scholarly indiscretion—still matters and can come back to harm us as if it had happened yesterday.
  • an important "third wave" of work about the digital environment. In the late 1990s and early 2000s, we saw books like Nicholas Negroponte's Being Digital (Knopf, 1995) and Howard Rheingold's The Virtual Community: Homesteading on the Electronic Frontier (Addison-Wesley, 1993) and Smart Mobs: The Next Social Revolution (Perseus, 2002), which idealistically described the transformative powers of digital networks. Then we saw shallow blowback, exemplified by Susan Jacoby's The Age of American Unreason (Pantheon, 2008).
  • For most of human history, forgetting was the default and remembering the challenge.
  • Chants, songs, monasteries, books, libraries, and even universities were established primarily to overcome our propensity to forget over time. The physical and economic limitations of all of those technologies and institutions served us well. Each acted not just as memory aids but also as filters or editors. They helped us remember much by helping us discard even more.
    • Ed Webb
       
      Excellent point, well made.
  • Just because we have the vessels, we fill them.
  • Even 10 years ago, we did not consider that words written for a tiny audience could reach beyond, perhaps to someone unforgiving, uninitiated in a community, or just plain unkind.
  • Remembering to forget, as Elvis argued, is also essential to getting over heartbreak. And, as Jorge Luis Borges wrote in his 1942 (yep, I Googled it to find the date) story "Funes el memorioso," it is just as important to the act of thinking. Funes, the young man in the story afflicted with an inability to forget anything, can't make sense of it. He can't think abstractly. He can't judge facts by relative weight or seriousness. He is lost in the details. Painfully, Funes cannot rest.
  • Our use of the proliferating data and rudimentary filters in our lives renders us incapable of judging, discriminating, or engaging in deductive reasoning. And inductive reasoning, which one could argue is entering a golden age with the rise of huge databases and the processing power needed to detect patterns and anomalies, is beyond the reach of lay users of the grand collective database called the Internet.
  • the default habits of our species: to record, retain, and release as much information as possible
  • Perhaps we just have to learn to manage wisely how we digest, discuss, and publicly assess the huge archive we are building. We must engender cultural habits that ensure perspective, calm deliberation, and wisdom. That's hard work.
  • we choose the nature of technologies. They don't choose us. We just happen to choose unwisely with some frequency
  • surveillance as the chief function of electronic government
  • critical information studies
  • Siva Vaidhyanathan is an associate professor of media studies and law at the University of Virginia. His next book, The Googlization of Everything, is forthcoming from the University of California Press.
  • Nietzsche's _On the Use and Disadvantage of History for Life_
  • Google compresses, if not eliminates, temporal context. This is likely only to exacerbate the existing problem in politics of taking one's statements out of context. A politician whose views on a subject have evolved quite logically over decades in light of changing knowledge and/or circumstances is held up in attack ads as a flip-flopper because consecutive Google entries have him/her saying two opposite things about the same subject -- and never mind that between the two statements, the Berlin Wall may have fallen or the economy crashed harder than at any other time since 1929.
weismans95

Why Big Brother Isn't Watching You - 2 views

  •  
    Here's a contrasting but optimistic view
Ed Webb

How they make those adverts go straight to your head - CNN.com - 0 views

  • "neuromarketing"
  • Currently there are three methodologies covered under the term neuromarketing: functional MRI, measuring skin temperature fluctuations, and utilizing Electroencephalography (EEG), which is the main technology currently used.
  • there has been little neuromarketing research published in peer-reviewed scientific journals, and there are too few publicly accessible data sets from controlled studies to demonstrate conclusively that buying behavior can be correlated with specific brain activity. "The major neuromarketing firms say that their client work demonstrates this, but none of this has been published in a way that the scientific community can critique it,"
Ed Webb

What killed Caprica? - 0 views

  • Caprica may have gone too far, tried to cover too much. It broke one of the cardinal rules of mainstream science fiction, which is that if you have a strange alternate universe you'd better populate it with recognizable, ordinary characters. But I like the kind of thought-experiment audaciousness that says, Hell yes we are going to give you complicated characters who defy stereotypes, and put them in a world whose rules you'll have to think hard to understand. It's too late to bring Caprica back. But I hope that this show is the first part of a new wave of science fiction on TV. Like The Sarah Connor Chronicles, Dollhouse, and Fringe, Caprica tackles singularity-level technology as a political and economic phenomenon - not as an escapist fantasy. And that's why it was a show worth watching, even when it stumbled.
Ed Webb

Programmed for Love: The Unsettling Future of Robotics - The Chronicle Review - The Chr... - 0 views

  • Her prediction: Companies will soon sell robots designed to baby-sit children, replace workers in nursing homes, and serve as companions for people with disabilities. All of which to Turkle is demeaning, "transgressive," and damaging to our collective sense of humanity. It's not that she's against robots as helpers—building cars, vacuuming floors, and helping to bathe the sick are one thing. She's concerned about robots that want to be buddies, implicitly promising an emotional connection they can never deliver.
  • We are already cyborgs, reliant on digital devices in ways that many of us could not have imagined just a few years ago
  • "We are hard-wired that if something meets extremely primitive standards, either eye contact or recognition or very primitive mutual signaling, to accept it as an Other because as animals that's how we're hard-wired—to recognize other creatures out there."
  • "Can a broken robot break a child?" they asked. "We would not consider the ethics of having children play with a damaged copy of Microsoft Word or a torn Raggedy Ann doll. But sociable robots provoke enough emotion to make this ethical question feel very real."
  • "The concept of robots as baby sitters is, intellectually, one that ought to appeal to parents more than the idea of having a teenager or similarly inexperienced baby sitter responsible for the safety of their infants," he writes. "Their smoke-detection capabilities will be better than ours, and they will never be distracted for the brief moment it can take an infant to do itself some terrible damage or be snatched by a deranged stranger."
  • "What if we get used to relationships that are made to measure?" Turkle asks. "Is that teaching us that relationships can be just the way we want them?" After all, if a robotic partner were to become annoying, we could just switch it off.
  • We've reached a moment, she says, when we should make "corrections"—to develop social norms to help offset the feeling that we must check for messages even when that means ignoring the people around us. "Today's young people have a special vulnerability: Although always connected, they feel deprived of attention," she writes. "Some, as children, were pushed on swings while their parents spoke on cellphones. Now these same parents do their e-mail at the dinner table." One 17-year-old boy even told her that at least a robot would remember everything he said, contrary to his father, who often tapped at a BlackBerry during conversations.
Ed Webb

An Algorithm Summarizes Lengthy Text Surprisingly Well - MIT Technology Review - 0 views

  • As information overload grows ever worse, computers may become our only hope for handling a growing deluge of documents. And it may become routine to rely on a machine to analyze and paraphrase articles, research papers, and other text for you.
  • Parsing language remains one of the grand challenges of artificial intelligence (see “AI’s Language Problem”). But it’s a challenge with enormous commercial potential. Even limited linguistic intelligence—the ability to parse spoken or written queries, and to respond in more sophisticated and coherent ways—could transform personal computing. In many specialist fields—like medicine, scientific research, and law—condensing information and extracting insights could have huge commercial benefits.
  • The system experiments in order to generate summaries of its own using a process called reinforcement learning. Inspired by the way animals seem to learn, this involves providing positive feedback for actions that lead toward a particular objective. Reinforcement learning has been used to train computers to do impressive new things, like playing complex games or controlling robots (see “10 Breakthrough Technologies 2017: Reinforcement Learning”). Those working on conversational interfaces are increasingly now looking at reinforcement learning as a way to improve their systems.
  • “At some point, we have to admit that we need a little bit of semantics and a little bit of syntactic knowledge in these systems in order for them to be fluid and fluent,”
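The reinforcement-learning approach described in this item rewards the system for summaries that score well against a reference. A minimal sketch of that reward step, assuming a ROUGE-1-style unigram-overlap reward (the function names and scoring choice here are illustrative, not the article's actual system, which backpropagates the reward through a neural policy):

```python
# Toy illustration of the reward signal in reinforcement-learning
# summarization: candidate summaries are scored against a reference,
# and the highest-reward candidate is reinforced. Real systems use
# this score to update a neural policy; this sketch shows only the
# scoring and selection steps.

def rouge1_f(candidate: str, reference: str) -> float:
    """ROUGE-1-style F1: unigram overlap between candidate and reference."""
    cand = candidate.lower().split()
    ref = reference.lower().split()
    if not cand or not ref:
        return 0.0
    # Count each candidate word at most as often as it appears in the reference.
    overlap = sum(min(cand.count(w), ref.count(w)) for w in set(cand))
    precision = overlap / len(cand)
    recall = overlap / len(ref)
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

def best_candidate(candidates: list[str], reference: str) -> str:
    """Pick the candidate summary earning the highest reward."""
    return max(candidates, key=lambda c: rouge1_f(c, reference))
```

For example, given the reference "the cat sat on the mat", the candidate "the cat sat" outscores "the dog ran", so it would receive the stronger positive feedback.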
Ed Webb

The future of stupid fears | Bryan Alexander - 0 views

  • Culture of Fear argues that media and political fear-mongering teaches consumers and voters to see problems in terms of stories about heroic individuals, rather than about social or political factors.  The contexts get set aside, replaced with more relatable tales of villainous criminals and virtuous victims, which Glassner calls “neurologizing social problems” (217). There is also a curious, quietly conservative politics of the family involved.  Such fears emphasize stranger danger, which is actually statistically very rare.  Instead, they minimize the far more likely source of harm most Americans face: our family members (31).
  • fake fears reveal cultural anxieties, much as horror stories do
  • “news is what happens to your editors.”  By that he means “editors – and their bosses… [and] their families, friends, and business associates”(201)
  • Our politics clearly adore fear, notably from the Trump administration and its emphasis on immigrant-driven carnage.  Our news media continue to worship at the altar of “if it bleeds, it leads.”
  • that CNN is the opposite of a fringe news service.  Between Fox and MSNBC it occupies a neutral, middle ground.  It is, putatively, the sober center.  And it simply adores scaring the hell out of us
  • What does the likelihood of even more stupid fear-mongering mean for education?  It simply means, as I said years ago, we have to teach people to resist this stuff.  In our quest to teach digital literacy we should encourage students – of all ages – to avoid tv news, or to sample it judiciously, with great skepticism.  We should assist them in recognizing when politicians fire up fear campaigns based on poor facts.
  • politicians peddle terror because it often works
  • the negative impacts of such fear – the misdirection of resources, the creation of bad policy, the encouragement of mean world syndrome, the furtherance of racism – the promulgation of real damage
Ed Webb

Review: Google Chrome has become surveillance software. It's time to switch. - Silicon ... - 0 views

  • There are ways to defang Chrome, which is much more complicated than just using “Incognito Mode.” But it’s much easier to switch to a browser not owned by an advertising company.