Dystopias / Group items tagged futurism

Ed Webb

Does the Digital Classroom Enfeeble the Mind? - NYTimes.com

  • My father would have been unable to “teach to the test.” He once complained about errors in a sixth-grade math textbook, so he had the class learn math by designing a spaceship. My father would have been spat out by today’s test-driven educational regime.
  • A career in computer science makes you see the world in its terms. You start to see money as a form of information display instead of as a store of value. Money flows are the computational output of a lot of people planning, promising, evaluating, hedging and scheming, and those behaviors start to look like a set of algorithms. You start to see the weather as a computer processing bits tweaked by the sun, and gravity as a cosmic calculation that keeps events in time and space consistent. This way of seeing is becoming ever more common as people have experiences with computers. While it has its glorious moments, the computational perspective can at times be uniquely unromantic. Nothing kills music for me as much as having some algorithm calculate what music I will want to hear. That seems to miss the whole point. Inventing your musical taste is the point, isn’t it? Bringing computers into the middle of that is like paying someone to program a robot to have sex on your behalf so you don’t have to. And yet it seems we benefit from shining an objectifying digital light to disinfect our funky, lying selves once in a while. It’s heartless to have music chosen by digital algorithms. But at least there are fewer people held hostage to the tastes of bad radio D.J.’s than there once were. The trick is being ambidextrous, holding one hand to the heart while counting on the digits of the other.
  • The future of education in the digital age will be determined by our judgment of which aspects of the information we pass between generations can be represented in computers at all. If we try to represent something digitally when we actually can’t, we kill the romance and make some aspect of the human condition newly bland and absurd. If we romanticize information that shouldn’t be shielded from harsh calculations, we’ll suffer bad teachers and D.J.’s and their wares.
  • Some of the top digital designs of the moment, both in school and in the rest of life, embed the underlying message that we understand the brain and its workings. That is false. We don’t know how information is represented in the brain. We don’t know how reason is accomplished by neurons. There are some vaguely cool ideas floating around, and we might know a lot more about these things any moment now, but at this moment, we don’t. You could spend all day reading literature about educational technology without being reminded that this frontier of ignorance lies before us. We are tempted by the demons of commercial and professional ambition to pretend we know more than we do.
  • Outside school, something similar happens. Students spend a lot of time acting as trivialized relays in giant schemes designed for the purposes of advertising and other revenue-minded manipulations. They are prompted to create databases about themselves and then trust algorithms to assemble streams of songs and movies and stories for their consumption. We see the embedded philosophy bloom when students assemble papers as mash-ups from online snippets instead of thinking and composing on a blank piece of screen. What is wrong with this is not that students are any lazier now or learning less. (It is probably even true, I admit reluctantly, that in the presence of the ambient Internet, maybe it is not so important anymore to hold an archive of certain kinds of academic trivia in your head.) The problem is that students could come to conceive of themselves as relays in a transpersonal digital structure. Their job is then to copy and transfer data around, to be a source of statistics, whether to be processed by tests at school or by advertising schemes elsewhere.
  • If students don’t learn to think, then no amount of access to information will do them any good.
  • To the degree that education is about the transfer of the known between generations, it can be digitized, analyzed, optimized and bottled or posted on Twitter. To the degree that education is about the self-invention of the human race, the gargantuan process of steering billions of brains into unforeseeable states and configurations in the future, it can continue only if each brain learns to invent itself. And that is beyond computation because it is beyond our comprehension.
  • Roughly speaking, there are two ways to use computers in the classroom. You can have them measure and represent the students and the teachers, or you can have the class build a virtual spaceship. Right now the first way is ubiquitous, but the virtual spaceships are being built only by tenacious oddballs in unusual circumstances. More spaceships, please.
  •  
    How do we get this right - use the tech for what it can do well, develop our brains for what the tech can't do? Who's up for building a spaceship?
Ed Webb

Dangerous Minds | Sci-Fi disco hit: Dee D. Jackson's 'Automatic Lover'

  • In the future there will be no love, sex will be provided by robots… and we’ll all be listening to eurodisco: “Automatic Lover”, a worldwide hit for Dee D. Jackson in 1978.
Ed Webb

Near Future Laboratory » Blog Archive » I Quote: ….

  • Warren
    • Ed Webb
       
      Warren Ellis, author of Transmetropolitan (also check out his Freakangels - Google will get you there, or warrenellis.com)
  • The Caryatids
    • Ed Webb
       
      Bruce Sterling's most recently published novel, quite dystopian.
  • Neuromancer
    • Ed Webb
       
      William Gibson's first published novel, mostly reckoned to be the first cyberpunk novel, although that's not undisputed, and there were short stories before that, some of which we've read.
  • ((via @russelldavies from his post here.))
    • Ed Webb
       
Russell T Davies is the main force behind the recent revival of Doctor Who, as well as its spin-off Torchwood.
  • Advertising is probably the worst kind of SciFi in this regard. No integrity. No expectations that what is being presented has any relationship to possibility. Maybe SciFi as it was has switched places with advertising: Warren Ellis says SciFi does extrapolation. Advertising-as-SciFi does… what? Dishonest predicting?
Ed Webb

Antimatter? Not such a big deal | Roz Kaveney | Comment is free | The Guardian

  • One problem with being a long-term reader of science fiction and fantasy is that you get blasé about science itself because you have seen it all before. My sense of wonder was overloaded by the time I was 16; I am never going to get that rush again. Even major breakthroughs make me go 'Whatever!'. Partly that's because, despite all our advances, we still haven't got time travel, reptilian visitors from the Galactic Federation, or telepathy. Instead, we get the depressing environmental disasters that JG Ballard described, and crazed grinning fundamentalist politicians straight out of Philip K Dick. (I'm sure that if I went through all his Ace doubles from the early 1960s, I would find Sarah Palin somewhere.) We don't get the stories where someone smart gets to fix the problem with a bent paperclip; we get the grim logical stories where we are all going to die.
  • One of the reasons why Dick and Ballard speak to our condition so well is that they saw the future and it was pretty rubbish.
  • It is almost a cliché that most sci fi is a way of looking sideways at the time in which it was written – the reason why William Gibson's Neuromancer felt so relevant in the 1980s was simply that it was a book whose imagined technology was mostly just around the corner, and whose doomed hipster technobandits were already walking down mean streets in cities near us. It's significant that Gibson has moved to writing contemporary fiction with hardly a change of register.
Ed Webb

Why Doesn't Anyone Pay Attention Anymore? | HASTAC

  • We also need to distinguish what scientists know about human neurophysiology from our all-too-human discomfort with cultural and social change.  I've been an English professor for over twenty years and have heard how students don't pay attention, can't read a long novel anymore, and are in decline against some unspecified norm of an idealized past quite literally every year that I have been in this profession. In fact, how we educators should address this dire problem was the focus of the very first faculty meeting I ever attended.
  • Whenever I hear about attentional issues in debased contemporary society, whether blamed on television, VCR's, rock music, or the desktop, I assume that the critic was probably, like me, the one student who actually read Moby Dick and who had little awareness that no one else did.
  • This is not really a discussion about the biology of attention; it is about the sociology of change.
  • The brain is always changed by what it does.  That's how we learn, from infancy on, and that's how a baby born in New York has different cultural patterns of behavior, language, gesture, interaction, socialization, and attention than a baby born the same day in Beijing. That's as true for the historical moment into which we are born as it is for the geographical location.  Our attention is shaped by all we do, and reshaped by all we do.  That is what learning is.  The best we can do as educators is find ways to improve our institutions of learning to help our kids be prepared for their future--not for our past.
  • I didn't find the article nearly as stigmatizing and retrograde as I do the knee-jerk Don't Tread on Me reactions of everyone I've seen respond--most of which amount to foolish technolibertarian celebrations of the anonymous savior Technology (Cathy, you don't do that there, even if you also have nothing good to say about the NYT piece). If anything, the article showed that these kids (like all of us!) are profoundly distressed by today's media ecology. They seem to have a far more subtle perspective on things than most others. Frankly I'm a bit gobsmacked that everyone hates this article so much. As for the old chestnut that "we need new education for the information age," it's worth pointing out that there was no formal, standardized education system before the industrial age. Compulsory education is a century-old experiment. And yes, it ought to be discarded. But that's a frightening prospect for almost everyone, including those who advocate for it. I wonder how many of the intelligentsia who raise their fists and cry, "We need a different education system!" still partake of the old system for their own kids. We don't in my house, for what it's worth, and it's a huge pain in the ass.
  • Cathy -- I really appreciate the distinctions you make between "the biology of attention" and "the sociology of change." And I agree that more complex and nuanced conversations about technology's relationship to attention, diversion, focus, and immersion will be more productive (than either nostalgia or utopic futurism). For example, it seems like a strange oversight (in the NYT piece) to bemoan the ability of "kids these days" to focus, read immersively, or Pay Attention, yet report without comment that these same kids can edit video for hours on end -- creative, immersive work which, I would imagine, requires more than a little focus. It seems that perhaps the question is not whether we can still pay attention or focus, but what those diverse forms of immersion within different media (will) look like.
  •  
    I recommend both this commentary and the original NYT piece to which it links and on which it comments.
Ed Webb

The stories of Ray Bradbury. - By Nathaniel Rich - Slate Magazine

  • Thanks to Fahrenheit 451, now required reading for every American middle-schooler, Bradbury is generally thought of as a writer of novels, but his talents—particularly his mastery of the diabolical premise and the brain-exploding revelation—are best suited to the short form.
  • The best stories have a strange familiarity about them. They're like long-forgotten acquaintances--you know you've met them somewhere before. There is, for instance, the tale of the time traveler who goes back in time and accidentally steps on a butterfly, thereby irrevocably changing the course of history ("A Sound of Thunder"). There's the one about the man who buys a robotic husband to live with his wife so that he can be free to travel and pursue adventure--that's "Marionettes, Inc." (Not to be confused with "I Sing the Body Electric!" about the man who buys a robotic grandmother to comfort his children after his wife dies.) Or "The Playground," about the father who changes places with his son so that he can spare his boy the cruelty of childhood--forgetting exactly how cruel childhood can be. The stories are familiar because they've been adapted, and plundered from, by countless other writers--in books, television shows, and films. To the extent that there is a mythology of our age, Bradbury is one of its creators.
  • "But Bradbury's skill is in evoking exactly how soul-annihilating that world is."    Of course, this also displays one of the key facts of Bradbury's work -- and a trend in science fiction that is often ignored. He's a reactionary of the first order, deeply distrustful of technology and even the notion of progress. Many science fiction writers had begun to rewrite the rules of women in space by the time Bradbury had women in long skirts hauling pots and pans over the Martian landscape. And even he wouldn't disagree. In his famous Playboy interview he responded to a question about predicting the future with, "It's 'prevent the future', that's the way I put it. Not predict it, prevent it."
  • And for the record, I've never understood why a writer who recognizes technology is labeled a "sci-fi writer", as if being a "sci-fi writer" were equal to being some sort of substandard, second-rate hack. The great Kurt Vonnegut managed to get stuck in that drawer after he recognized technology in his first novel, "Player Piano". No matter that he turned out to be (imo) one of the greatest authors of the 20th century, period.
  • it's chilling how prescient he was about modern media culture in Fahrenheit 451. It's not a Luddite screed against TV. It's a speculative piece on what happens when we become divorced from the past and more attuned to images on the screen than we are to each other.
  • A favorite author of mine since I was in elementary school way back when mammoths roamed the earth. To me, he was an ardent enthusiast of technology, but also recognized its potential for separating us from one another while at the same time seemingly making us more "connected" in a superficial and transitory way.
  • Bradbury is undeniably skeptical of technology and the risks it brings, particularly the risk that what we'd now call "virtualization" will replace actual emotional, intellectual or physical experience. On the other hand, however, I don't think there's anybody who rhapsodizes about the imaginative possibilities of rocketships and robots the way Bradbury does, and he's built entire setpieces around the idea of technological wonders creating new experiences.    I'm not saying he doesn't have a Luddite streak, more that he has feet in both camps and is harder to pin down than a single label allows. And I'll also add that in his public pronouncements of late, the Luddite streak has come out more strongly--but I tend to put much of that down to the curmudgeonliness of a ninety-year-old man.
  • I don't think he is a luddite so much as he is the little voice that whispers "be careful what you wish for." We have been sold the beautiful myth that technology will buy us free time, but we are busier than ever. TV was supposed to enlighten the masses, instead we have "reality TV" and a news network that does not let facts get in the way of its ideological agenda. We romanticize childhood, ignoring children's aggressive impulses, then feed them on a steady diet of violent video games.  
Ed Webb

BBC News - The printed future of Christmas dinner

  •  
    Is Christmas Eve the new April 1st?
  •  
    I have been wanting to comment on this for a while, but I seem to lack internet access at my house. This takes the artistry and the science out of cooking. Many people argue that cooking is an art, not a science, but I see it as both; if you remove food from cooking, though, you lose it as both an art and a science. What difficulty is there in mixing brown goop with red goop and getting apple pie? To make a good apple pie you have to experiment with a number of different ingredients. This does look like it could be applicable to problems with overpopulation, though.
Ed Webb

Boy of 12 hauled out of class by police over David Cameron Facebook protest - mirror.co.uk

  •  
    "If you want a picture of the future, imagine a boot stamping on a human face- forever."
Ed Webb

The Case For Public Shaming of Vancouver Rioters « publicshamingeternus

  •  
    The future of justice?
Ed Webb

untitled

  •  
    In 1859, a solar storm threw an electromagnetic pulse at Earth so strong it fried the telegraph system. A whole lot more is on the line now.