
Dystopias group: items tagged "fashion"


Ed Webb

Look Closely, Doctor - See the Camera? - NYTimes.com

  • a delusion is just a delusion, psychosis is psychosis, and the scenery is incidental
  • Some experts studying conditions like Truman Show delusion and other culture-bound delusions, which are specific to a time or place, are questioning the premise that culture is only incidental to psychosis, even as a growing body of evidence has pointed to brain abnormalities and other biological causes for illnesses like schizophrenia.
  • Another patient traveled to New York City and showed up at a federal building in downtown Manhattan seeking asylum so he could get off his reality show, Dr. Gold said. The patient reported that he also came to New York to see if the Twin Towers were still standing, because he believed that seeing their destruction on Sept. 11 on television was part of his reality show. If they were still standing, he said, then he would know that the terrorist attack was all part of the script.
Ed Webb

The Social Split Between TV and Movie Dystopias - NYTimes.com

  • Dystopian parables like “The Walking Dead,” where zombies rule the earth, are an increasingly fashionable genre of entertainment, but the degree of apocalyptic pessimism is very different depending on the size of the screen. The dividing line between television and movies seems to be class conflict. Television shows posit a hideous future with a silver lining; survivors, good or bad, are more or less equals. Movies like “Divergent,” “Snowpiercer” and “Elysium” foresee societal divisions that last into Armageddon and beyond and that define a new, inevitably Orwellian world order that emerges from the ruins of civilization.
  • Movies project a morose, scary future where man is his own worst enemy, whereas television can’t entirely suppress a smile. There is something positive about the end of the world on shows like “The Walking Dead,” “Z Nation” on Syfy and “The Last Ship” on TNT. True, civilization as we know it is gone, but so is social stratification. Survivors don’t group into castes according to birth, race, income or religion. People of all kinds bond with whoever seems friendly, or at least unthreatening.
  • Dystopian movies based on young-adult novels understandably focus on the oppression of young adults, but in “Divergent” and “The Hunger Games,” a despotic elite divides the little people into cliques, only there is no prom in sight. Engels wrote about “contests between exploiting and exploited, ruling and oppressed classes.” He meant in the movies. On TV, all men are equal and equally at peril in the apocalypse.
Ed Webb

Could self-aware cities be the first forms of artificial intelligence?

  • People have speculated before about the idea that the Internet might become self-aware and turn into the first "real" A.I., but could it be more likely to happen to cities, in which humans actually live and work and navigate, generating an even more chaotic system?
  • "By connecting and providing visibility into disparate systems, cities and buildings can operate like living organisms, sensing and responding quickly to potential problems before they occur to protect citizens, save resources and reduce energy consumption and carbon emissions," reads the invitation to IBM's PULSE 2010 event.
  • And Cisco is already building the first of these smart cities: Songdo, a Korean "instant city," which will be completely controlled by computer networks — including ubiquitous Telepresence applications, video screens which could be used for surveillance. Cisco's chief globalization officer, Wim Elfrink, told the San Jose Mercury News: "Everything will be connected - buildings, cars, energy - everything. This is the tipping point. When we start building cities with technology in the infrastructure, it's beyond my imagination what that will enable."
  • Urbanscale founder Adam Greenfield has written a lot about ubiquitous computing in urban environments, most notably in 2006's Everyware, which posits that computers will "effectively disappear" as objects around us become "smart" in ways that are nearly invisible to lay-people.
  • tailored advertising just about anywhere
  • Some futurists are still predicting that cities will become closer to arcologies — huge slabs of integrated urban life, like a whole city in a single block — as they grapple with the need to house so many people in an efficient fashion. The implications for heating and cooling an arcology, let alone dealing with waste disposal, are mind-boggling. Could a future arcology become our first machine mind?
  • Science fiction gives us the occasional virtual world that looks rural — like Doctor Who's visions of life inside the Matrix, which mostly looks (not surprisingly) like a gravel quarry — but for the most part, virtual worlds are always urban.
  • So here's why cities might have an edge over, say, the Internet as a whole, when it comes to developing self awareness. Because every city is different, and every city has its own identity and sense of self — and this informs everything from urban planning to the ways in which parking and electricity use are mapped out. The more sophisticated the integrated systems associated with a city become, the more they'll reflect the city's unique personality, and the more programmers will try to imbue their computers with a sense of this unique urban identity. And a sense of the city's history, and the ways in which the city has evolved and grown, will be important for a more sophisticated urban planning system to grasp the future — so it's very possible to imagine this leading to a sense of personal history, on the part of a computer that identifies with the city it helps to manage.
  • next time you're wandering around your city, looking up at the outcroppings of huge buildings, the wild tides of traffic and the frenzy of construction and demolition, don't just think of it as a place haunted by history. Try, instead, to imagine it coming to life in a new way, opening its millions of electronic eyes, and greeting you with the first gleaming of independent thought
  • I can't wait for the day when city AIs decide to go to war with other city AIs over allocation of federal funds.
  • John Shirley has San Francisco as a sentient being in City Come A-Walkin'
  • I doubt cities will ever be networked so smoothly... they are all about fractions, sections, niches, subcultures, ethnicities, neighborhoods, markets, underground markets. It's literally like herding cats... I don't see it as feasible. It would be a schizophrenic intelligence at best. Which, Wintermute was, I suppose...
  • This is beginning to sound just like the cities we have read about. It reminds me of the Burning Chrome stories, as an element in all those stories was machines and technology at every turn. With recent advances in technology it is alarming to see that an element of many science fiction tales is finally coming true: a city that acts as a machine in itself. Who is to say that this city won't become one with a highly active hacker underbelly?
Ed Webb

The Imaginative Reality of Ursula K. Le Guin | VQR Online

  • The founders of this anarchist society made up a new language because they realized you couldn’t have a new society and an old language. They based the new language on the old one but changed it enormously. It’s simply an illustration of what Orwell was saying in his great essay about how writing English clearly is a political matter.
    • Ed Webb
       
      Le Guin, of course, admires "Politics and the English Language." Real-world examples of people changing languages to change society include the invention of modern Turkish and modern Hebrew.
  • There are advantages and disadvantages to living a very long time, as I have. One of the advantages is that you can’t help having a long view. You’ve seen it come and seen it go. Something that’s being announced as the absolute only way to write, you recognize as a fashion, a fad, trendy—the way to write right now if you want to sell right now to a right now editor. But there’s also the long run to consider. Nothing’s deader than last year’s trend. 
  • Obviously, the present tense has certain uses that it’s wonderfully suited for. But recently it has been adopted blindly, as the only way to tell a story—often by young writers who haven’t read very much. Well, it’s a good way to tell some stories, not a good way to tell others. It’s inherently limiting. I call it “flashlight focus.” You see a spot ahead of you and it is dark all around it. That’s great for high suspense, high drama, cut-to-the-chase writing. But if you want to tell a big, long story, like the books of Elena Ferrante, or Jane Smiley’s The Last Hundred Years trilogy, which moves year by year from 1920 to 2020—the present tense would cripple those books. To assume that the present tense is literally “now” and the past tense literally remote in time is extremely naïve. 
  • Henry James did the limited third person really well, showing us the way to do it. He milked that cow successfully. And it’s a great cow, it still gives lots of milk. But if you read only contemporary stuff, always third-person limited, you don’t realize that point of view in a story is very important and can be very movable. It’s here where I suggest that people read books like Woolf’s To the Lighthouse to see what she does by moving from mind to mind. Or Tolstoy’s War and Peace for goodness’ sake. Wow. The way he slides from one point of view to another without you knowing that you’ve changed point of view—he does it so gracefully. You know where you are, whose eyes you are seeing through, but you don’t have the sense of being jerked from place to place. That’s mastery of a craft.
  • Any of us who grew up reading eighteenth- or nineteenth-century fiction are perfectly at home with what is called “omniscience.” I myself call it “authorial” point of view because the term “omniscience,” the idea of an author being omniscient, is so often used in a judgmental way, as if it were a bad thing. But the author, after all, is the author of all these characters, the maker, the inventor of them. In fact all the characters are the author if you come right down to the honest truth of it. So the author has the perfect right to know what they’re thinking. If the author doesn’t tell you what they are thinking … why? This is worth thinking about. Often it’s simply to spin out suspense by not telling you what the author knows. Well, that’s legitimate. This is art. But I’m trying to get people to think about their choices here, because there are so many beautiful choices that are going unused. In a way, first person and limited third are the easiest ones, the least interesting.
  • to preach that story is conflict, always to ask, “Where’s the conflict in your story?”—this needs some thinking about. If you say that story is about conflict, that plot must be based on conflict, you’re limiting your view of the world severely. And in a sense making a political statement: that life is conflict, so in stories conflict is all that really matters. This is simply untrue. To see life as a battle is a narrow, social-Darwinist view, and a very masculine one. Conflict, of course, is part of life, I’m not saying you should try to keep it out of your stories, just that it’s not their only lifeblood. Stories are about a lot of different things
  • The first decade of her career, beginning in the sixties, included some of her most well-known works of fiction: A Wizard of Earthsea, The Left Hand of Darkness, The Dispossessed, and The Lathe of Heaven. Each of these works imagined not just worlds, but homes, homes that became real for her readers, homes where protagonists were women, people of color, gender fluid, anticapitalist—imaginary homes that did not simply spin out our worst dystopic fears for the future like so many of the apocalyptic novels of today, but also modeled other ways of being, other ways to create home.
  • “Children know perfectly well that unicorns aren’t real,” Le Guin once said. “But they also know that books about unicorns, if they are good books, are true books.”
  • “Fake rules” and “alternative facts” are used in our time not to increase moral understanding and social possibility but to increase power for those who already have it. A war on language has unhinged words from their meaning, language from its capacity as truth-teller. But perhaps, counterintuitively, it is in the realm of the imagination, the fictive, where we can best re-ground ourselves in the real and the true.
  • you can’t find your own voice if you aren’t listening for it. The sound of your writing is an essential part of what it’s doing. Our teaching of writing tends to ignore it, except maybe in poetry. And so we get prose that goes clunk, clunk, clunk. And we don’t know what’s wrong with it
  • [Interviewer:] You emphasize the importance of understanding grammar and grammar terminology but also the importance of interrogating its rules. You point out that it is a strange phenomenon that grammar is the tool of our trade and yet so many writers steer away from an engagement with it.
    [Le Guin:] In my generation and for a while after—I was born in 1929—we were taught grammar right from the start. It was quietly drilled into us. We knew the names of the parts of speech, we had a working acquaintance with how English works, which they don’t get in most schools anymore. There is so much less reading in schools, and very little teaching of grammar. For a writer this is kind of like being thrown into a carpenter’s shop without ever having learned the names of the tools or handled them consciously. What do you do with a Phillips screwdriver? What is a Phillips screwdriver? We’re not equipping people to write; we’re just saying, “You too can write!” or “Anybody can write, just sit down and do it!” But to make anything, you’ve got to have the tools to make it.
  • [Interviewer:] In your book on writing, Steering the Craft, you say that morality and language are linked, but that morality and correctness are not the same thing. Yet we often confuse them in the realm of grammar.
    [Le Guin:] The “grammar bullies”—you read them in places like the New York Times—tell you what is correct: You must never use “hopefully.” “Hopefully, we will be going there on Tuesday.” That is incorrect and wrong and you are basically an ignorant pig if you say it. This is judgmentalism. The game that is being played there is a game of social class. It has nothing to do with the morality of writing and speaking and thinking clearly, which Orwell, for instance, talked about so well. It’s just affirming that I am from a higher class than you are. The trouble is that people who aren’t taught grammar very well in school fall for these statements from these pundits, delivered with vast authority from above. I’m fighting that. A very interesting case in point is using “they” as a singular. This offends the grammar bullies endlessly; it is wrong, wrong, wrong! Well, it was right until the eighteenth century, when they invented the rule that “he” includes “she.” It didn’t exist in English before then; Shakespeare used “they” instead of “he or she”—we all do, we always have done, in speaking, in colloquial English. It took the women’s movement to bring it back to English literature. And it is important. Because it’s a crossroads between correctness bullying and the moral use of language. If “he” includes “she” but “she” doesn’t include “he,” a big statement is being made, with huge social and moral implications. But we don’t have to use “he” that way—we’ve got “they.” Why not use it?