
Dystopias: Group items tagged learning

Ed Webb

An Algorithm Summarizes Lengthy Text Surprisingly Well - MIT Technology Review - 0 views

  • As information overload grows ever worse, computers may become our only hope for handling a growing deluge of documents. And it may become routine to rely on a machine to analyze and paraphrase articles, research papers, and other text for you.
  • Parsing language remains one of the grand challenges of artificial intelligence (see “AI’s Language Problem”). But it’s a challenge with enormous commercial potential. Even limited linguistic intelligence—the ability to parse spoken or written queries, and to respond in more sophisticated and coherent ways—could transform personal computing. In many specialist fields—like medicine, scientific research, and law—condensing information and extracting insights could have huge commercial benefits.
  • The system experiments in order to generate summaries of its own using a process called reinforcement learning. Inspired by the way animals seem to learn, this involves providing positive feedback for actions that lead toward a particular objective. Reinforcement learning has been used to train computers to do impressive new things, like playing complex games or controlling robots (see “10 Breakthrough Technologies 2017: Reinforcement Learning”). Those working on conversational interfaces are increasingly now looking at reinforcement learning as a way to improve their systems.
  • “At some point, we have to admit that we need a little bit of semantics and a little bit of syntactic knowledge in these systems in order for them to be fluid and fluent,”
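The feedback loop the article describes — rewarding actions that move toward an objective — can be sketched with a toy epsilon-greedy bandit. This is a minimal illustration of the general reinforcement-learning idea, not the summarization system's actual method; the reward values and hyperparameters are invented for the example.

```python
import random

def train_bandit(rewards, steps=2000, epsilon=0.1, seed=0):
    """Minimal reinforcement-learning loop: an epsilon-greedy agent
    learns which action yields the most positive feedback."""
    rng = random.Random(seed)
    estimates = [0.0] * len(rewards)   # estimated value of each action
    counts = [0] * len(rewards)
    for _ in range(steps):
        if rng.random() < epsilon:                     # explore a random action
            action = rng.randrange(len(rewards))
        else:                                          # exploit the best so far
            action = max(range(len(rewards)), key=lambda a: estimates[a])
        reward = rewards[action] + rng.gauss(0, 0.1)   # noisy positive feedback
        counts[action] += 1
        # incremental running average of observed reward
        estimates[action] += (reward - estimates[action]) / counts[action]
    return estimates

# The agent should discover that the third action pays best.
values = train_bandit([0.1, 0.5, 0.9])
```

A summarization system works the same way in spirit: generated summaries stand in for actions, and a quality score stands in for the reward.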
Ed Webb

Stephen Downes: A World to Change - 0 views

  • we need, first, to take charge of our own learning, and next, help others take charge of their own learning. We need to move beyond the idea that an education is something that is provided for us, and toward the idea that an education is something that we create for ourselves. It is time, in other words, that we change our attitude toward learning and the educational system in general. That is not to advocate throwing learners off the bus to fend for themselves. It is hard to be self-reliant, to take charge of one's own learning, and people shouldn't have to do it alone. It is instead to articulate a way we as a society approach education and learning, beginning with an attitude, through the development of supports and a system, through to the techniques and technologies that support that.
  •  
    For those interested in blogging further about education, more food for thought
Ed Webb

What we still haven't learned from Gamergate - Vox - 0 views

  • Harassment and misogyny had been problems in the community for years before this; the deep resentment and anger toward women that powered Gamergate percolated for years on internet forums. Robert Evans, a journalist who specializes in extremist communities and the host of the Behind the Bastards podcast, described Gamergate to me as partly organic and partly born out of decades-long campaigns by white supremacists and extremists to recruit heavily from online forums. “Part of why Gamergate happened in the first place was because you had these people online preaching to these groups of disaffected young men,” he said. But what Gamergate had that those previous movements didn’t was an organized strategy, made public, cloaking itself as a political movement with a flimsy philosophical stance, its goals and targets amplified by the power of Twitter and a hashtag.
  • The hate campaign, we would later learn, was the moment when our ability to repress toxic communities and write them off as just “trolls” began to crumble. Gamergate ultimately gave way to something deeper, more violent, and more uncontrollable.
  • Police have to learn how to keep the rest of us safe from internet mobs
  • the justice system continues to be slow to understand the link between online harassment and real-life violence
  • In order to increase public safety this decade, it is imperative that police — and everyone else — become more familiar with the kinds of communities that engender toxic, militant systems of harassment, and the online and offline spaces where these communities exist. Increasingly, that means understanding social media’s dark corners, and the types of extremism they can foster.
  • Businesses have to learn when online outrage is manufactured
  • There’s a difference between organic outrage that arises because an employee actually does something outrageous, and invented outrage that’s an excuse to harass someone whom a group has already decided to target for unrelated reasons — for instance, because an employee is a feminist. A responsible business would ideally figure out which type of outrage is occurring before it punished a client or employee who was just doing their job.
  • Social media platforms didn’t learn how to shut down disingenuous conversations over ethics and free speech before they started to tear their cultures apart
  • Dedication to free speech over the appearance of bias is especially important within tech culture, where a commitment to protecting free speech is both a banner and an excuse for large corporations to justify their approach to content moderation — or lack thereof.
  • Reddit’s free-speech-friendly moderation stance resulted in the platform tacitly supporting pro-Gamergate subforums like r/KotakuInAction, which became a major contributor to Reddit’s growing alt-right community. Twitter rolled out a litany of moderation tools in the wake of Gamergate, intended to allow harassment targets to perpetually block, mute, and police their own harassers — without actually effectively making the site unwelcome for the harassers themselves. And YouTube and Facebook, with their algorithmic amplification of hateful and extreme content, made no effort to recognize the violence and misogyny behind pro-Gamergate content, or police them accordingly.
  • All of these platforms are wrestling with problems that seem to have grown beyond their control; it’s arguable that if they had reacted more swiftly to slow the growth of the internet’s most toxic and misogynistic communities back when those communities, particularly Gamergate, were still nascent, they could have prevented headaches in the long run — and set an early standard for how to deal with ever-broadening issues of extremist content online.
  • Violence against women is a predictor of other kinds of violence. We need to acknowledge it.
  • Somehow, the idea that all of that sexism and anti-feminist anger could be recruited, harnessed, and channeled into a broader white supremacist movement failed to generate any real alarm, even well into 2016
  • many of the perpetrators of real-world violence are radicalized online first
  • It remains difficult for many to accept the throughline from online abuse to real-world violence against women, much less the fact that violence against women, online and off, is a predictor of other kinds of real-world violence
  • Politicians and the media must take online “ironic” racism and misogyny seriously
  • Gamergate masked its misogyny in a coating of shrill yelling that had most journalists in 2014 writing off the whole incident as “satirical” and immature “trolling,” and very few correctly predicting that Gamergate’s trolling was the future of politics
  • Gamergate was all about disguising a sincere wish for violence and upheaval by dressing it up in hyperbole and irony in order to confuse outsiders and make it all seem less serious.
  • Gamergate simultaneously masqueraded as legitimate concern about ethics that demanded audiences take it seriously, and as total trolling that demanded audiences dismiss it entirely. Both these claims served to obfuscate its real aim — misogyny, and, increasingly, racist white supremacy
  • The public’s failure to understand and accept that the alt-right’s misogyny, racism, and violent rhetoric is serious goes hand in hand with its failure to understand and accept that such rhetoric is identical to that of President Trump
  • deploying offensive behavior behind a guise of mock outrage, irony, trolling, and outright misrepresentation, in order to mask the sincere extremism behind the message.
  • many members of the media, politicians, and members of the public still struggle to accept that Trump’s rhetoric is having violent consequences, despite all evidence to the contrary.
  • The movement’s insistence that it was about one thing (ethics in journalism) when it was about something else (harassing women) provided a case study for how extremists would proceed to drive ideological fissures through the foundations of democracy: by building a toxic campaign of hate beneath a veneer of denial.
Ed Webb

Does the Digital Classroom Enfeeble the Mind? - NYTimes.com - 0 views

  • My father would have been unable to “teach to the test.” He once complained about errors in a sixth-grade math textbook, so he had the class learn math by designing a spaceship. My father would have been spat out by today’s test-driven educational regime.
  • A career in computer science makes you see the world in its terms. You start to see money as a form of information display instead of as a store of value. Money flows are the computational output of a lot of people planning, promising, evaluating, hedging and scheming, and those behaviors start to look like a set of algorithms. You start to see the weather as a computer processing bits tweaked by the sun, and gravity as a cosmic calculation that keeps events in time and space consistent. This way of seeing is becoming ever more common as people have experiences with computers. While it has its glorious moments, the computational perspective can at times be uniquely unromantic. Nothing kills music for me as much as having some algorithm calculate what music I will want to hear. That seems to miss the whole point. Inventing your musical taste is the point, isn’t it? Bringing computers into the middle of that is like paying someone to program a robot to have sex on your behalf so you don’t have to. And yet it seems we benefit from shining an objectifying digital light to disinfect our funky, lying selves once in a while. It’s heartless to have music chosen by digital algorithms. But at least there are fewer people held hostage to the tastes of bad radio D.J.’s than there once were. The trick is being ambidextrous, holding one hand to the heart while counting on the digits of the other.
  • The future of education in the digital age will be determined by our judgment of which aspects of the information we pass between generations can be represented in computers at all. If we try to represent something digitally when we actually can’t, we kill the romance and make some aspect of the human condition newly bland and absurd. If we romanticize information that shouldn’t be shielded from harsh calculations, we’ll suffer bad teachers and D.J.’s and their wares.
  • Some of the top digital designs of the moment, both in school and in the rest of life, embed the underlying message that we understand the brain and its workings. That is false. We don’t know how information is represented in the brain. We don’t know how reason is accomplished by neurons. There are some vaguely cool ideas floating around, and we might know a lot more about these things any moment now, but at this moment, we don’t. You could spend all day reading literature about educational technology without being reminded that this frontier of ignorance lies before us. We are tempted by the demons of commercial and professional ambition to pretend we know more than we do.
  • Outside school, something similar happens. Students spend a lot of time acting as trivialized relays in giant schemes designed for the purposes of advertising and other revenue-minded manipulations. They are prompted to create databases about themselves and then trust algorithms to assemble streams of songs and movies and stories for their consumption. We see the embedded philosophy bloom when students assemble papers as mash-ups from online snippets instead of thinking and composing on a blank piece of screen. What is wrong with this is not that students are any lazier now or learning less. (It is probably even true, I admit reluctantly, that in the presence of the ambient Internet, maybe it is not so important anymore to hold an archive of certain kinds of academic trivia in your head.) The problem is that students could come to conceive of themselves as relays in a transpersonal digital structure. Their job is then to copy and transfer data around, to be a source of statistics, whether to be processed by tests at school or by advertising schemes elsewhere.
  • If students don’t learn to think, then no amount of access to information will do them any good.
  • To the degree that education is about the transfer of the known between generations, it can be digitized, analyzed, optimized and bottled or posted on Twitter. To the degree that education is about the self-invention of the human race, the gargantuan process of steering billions of brains into unforeseeable states and configurations in the future, it can continue only if each brain learns to invent itself. And that is beyond computation because it is beyond our comprehension.
  • Roughly speaking, there are two ways to use computers in the classroom. You can have them measure and represent the students and the teachers, or you can have the class build a virtual spaceship. Right now the first way is ubiquitous, but the virtual spaceships are being built only by tenacious oddballs in unusual circumstances. More spaceships, please.
  •  
    How do we get this right - use the tech for what it can do well, develop our brains for what the tech can't do? Who's up for building a spaceship?
Ed Webb

Hechinger Report | What can we learn from Finland?: A Q&A with Dr. Pasi Sahlberg - 1 views

  • If you want to learn something from Finland, it’s the implementation of ideas. It’s looking at education as nation-building. We have very carefully kept the business of education in the hands of educators. It’s practically impossible to become a superintendent without also being a former teacher. … If you have people [in leadership positions] with no background in teaching, they’ll never have the type of communication they need.
  • Finns don’t believe you can reliably measure the essence of learning. You know, one big difference in thinking about education and the whole discourse is that in the U.S. it’s based on a belief in competition. In my country, we are in education because we believe in cooperation and sharing. Cooperation is a core starting point for growth.
Ed Webb

The quiet rise of machine learning - O'Reilly Radar - 0 views

  • machine learning research mirrors the way cryptography research developed around the middle of the 20th century. Much of the cutting edge research was done in secret, and we're only finding out now, 40 or 50 years later, what GCHQ or the NSA was doing back then. I'm hopeful that it won't take quite that long for Amazon or Google to tell us what they're thinking about today
  • All the components of the system are thought of as agents — effectively "smart" pieces of software
  • increased adaptability in the face of asynchronously arriving data
  • There is no central master-scheduler overseeing the network — optimization arises through emerging complexity and social convention
  • a geographically distributed sensor architecture
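The pattern sketched in these annotations — independent "smart" agents consuming asynchronously arriving sensor data, with no central master-scheduler — can be illustrated in a few lines. The `SensorAgent` class and the sample readings below are my own illustration of the general architecture, not code from the systems the article discusses.

```python
import queue
import threading

class SensorAgent:
    """An agent: a 'smart' piece of software that maintains a running
    average of whatever readings happen to reach it, whenever they arrive."""
    def __init__(self, name):
        self.name = name
        self.inbox = queue.Queue()   # data arrives here asynchronously
        self.total = 0.0
        self.count = 0

    def run(self):
        while True:
            reading = self.inbox.get()
            if reading is None:      # sentinel value: shut the agent down
                break
            self.total += reading
            self.count += 1

    def average(self):
        return self.total / self.count if self.count else 0.0

# Two agents standing in for a geographically distributed sensor
# architecture; no scheduler tells either one what to do or when.
agents = {n: SensorAgent(n) for n in ("north", "south")}
threads = [threading.Thread(target=a.run) for a in agents.values()]
for t in threads:
    t.start()
agents["north"].inbox.put(10.0)
agents["south"].inbox.put(4.0)
agents["south"].inbox.put(6.0)
for a in agents.values():
    a.inbox.put(None)
for t in threads:
    t.join()
```

Each agent adapts only to the data it receives; any coordination emerges from how the agents are wired together rather than from central control.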
Ed Webb

Smartphones are making us stupid - and may be a 'gateway drug' | The Lighthouse - 0 views

  • rather than making us smarter, mobile devices reduce our cognitive ability in measurable ways
  • “There’s lots of evidence showing that the information you learn on a digital device, doesn’t get retained very well and isn’t transferred across to the real world,”
  • “You’re also quickly conditioned to attend to lots of attention-grabbing signals, beeps and buzzes, so you jump from one task to the other and you don’t concentrate.”
  • Not only do smartphones affect our memory and our concentration, research shows they are addictive – to the point where they could be a ‘gateway drug’ making users more vulnerable to other addictions.
  • Smartphones are also linked to reduced social interaction, inadequate sleep, poor real-world navigation, and depression.
  • “The more time that kids spend on digital devices, the less empathetic they are, and the less they are able to process and recognise facial expressions, so their ability to actually communicate with each other is decreased.”
  • “Casino-funded research is designed to keep people gambling, and app software developers use exactly the same techniques. They have lots of buzzes and icons so you attend to them, they have things that move and flash so you notice them and keep your attention on the device.”
  • Around 90 per cent of US university students are thought to experience ‘phantom vibrations', so the researcher took a group to a desert location with no cell reception – and found that even after four days, around half of the students still thought their pocket was buzzing with Facebook or text notifications.
  • “Collaboration is a buzzword with software companies who are targeting schools to get kids to use these collaboration tools on their iPads – but collaboration decreases when you're using these devices,”
  • “All addiction is based on the same craving for a dopamine response, whether it's drug, gambling, alcohol or phone addiction,” he says. “As the dopamine response drops off, you need to increase the amount you need to get the same result, you want a little bit more next time. Neurologically, they all look the same. “We know – there are lots of studies on this – that once we form an addiction to something, we become more vulnerable to other addictions. That’s why there’s concerns around heavy users of more benign, easily-accessed drugs like alcohol and marijuana as there’s some correlation with usage of more physically addictive drugs like heroin, and neurological responses are the same.”
  • parents can also fall victim to screens which distract from their child’s activities or conversations, and most adults will experience this with friends and family members too.
  • “We also know that if you learn something on an iPad you are less likely to be able to transfer that to another device or to the real world,”
  • a series of studies have tested this with children who learn to construct a project with ‘digital’ blocks and then try the project with real blocks. “They can’t do it - they start from zero again,”
  • “Our brains can’t actually multitask, we have to switch our attention from one thing to another, and each time you switch, there's a cost to your attentional resources. After a few hours of this, we become very stressed.” That also causes us to forget things
  • A study from Norway recently tested how well kids remembered what they learned on screens. One group of students received information on a screen and were asked to memorise it; the second group received the same information on paper. Both groups were tested on their recall. Unsurprisingly, the children who received the paper version remembered more of the material. But the children with the electronic version were also found to be more stressed,
  • The famous ‘London taxi driver experiments’ found that memorising large maps caused the hippocampus to expand in size. Williams says that the reverse is going to happen if we don’t use our brain and memory to navigate. “Our brains are just like our muscles. We ‘use it or lose it’ – in other words, if we use navigation devices for directions rather than our brains, we will lose that ability.”
  • numerous studies also link smartphone use with sleeplessness and anxiety. “Some other interesting research has shown that the more friends you have on social media, the less friends you are likely to have in real life, the less actual contacts you have and the greater likelihood you have of depression,”
  • 12-month-old children whose carers regularly use smartphones have poorer facial expression perception
  • turning off software alarms and notifications, putting strict time limits around screen use, keeping screens out of bedrooms, minimising social media and replacing screens with paper books, paper maps and other non-screen activities can all help minimise harm from digital devices including smartphones
Ed Webb

Why Doesn't Anyone Pay Attention Anymore? | HASTAC - 0 views

  • We also need to distinguish what scientists know about human neurophysiology from our all-too-human discomfort with cultural and social change.  I've been an English professor for over twenty years and have heard how students don't pay attention, can't read a long novel anymore, and are in decline against some unspecified norm of an idealized past quite literally every year that I have been in this profession. In fact, how we educators should address this dire problem was the focus of the very first faculty meeting I ever attended.
  • Whenever I hear about attentional issues in debased contemporary society, whether blamed on television, VCR's, rock music, or the desktop, I assume that the critic was probably, like me, the one student who actually read Moby Dick and who had little awareness that no one else did.
  • This is not really a discussion about the biology of attention; it is about the sociology of change.
  • The brain is always changed by what it does.  That's how we learn, from infancy on, and that's how a baby born in New York has different cultural patterns of behavior, language, gesture, interaction, socialization, and attention than a baby born the same day in Beijing. That's as true for the historical moment into which we are born as it is for the geographical location.  Our attention is shaped by all we do, and reshaped by all we do.  That is what learning is.  The best we can do as educators is find ways to improve our institutions of learning to help our kids be prepared for their future--not for our past.
  • I didn't find the article nearly as stigmatizing and retrograde as I do the knee-jerk Don't Tread on Me reactions of everyone I've seen respond--most of which amount to foolish technolibertarian celebrations of the anonymous savior Technology (Cathy, you don't do that there, even if you also have nothing good to say about the NYT piece). If anything, the article showed that these kids (like all of us!) are profoundly distressed by today's media ecology. They seem to have a far more subtle perspective on things than most others. Frankly I'm a bit gobsmacked that everyone hates this article so much. As for the old chestnut that "we need new education for the information age," it's worth pointing out that there was no formal, standardized education system before the industrial age. Compulsory education is a century-old experiment. And yes, it ought to be discarded. But that's a frightening prospect for almost everyone, including those who advocate for it. I wonder how many of the intelligentsia who raise their fists and cry, "We need a different education system!" still partake of the old system for their own kids. We don't in my house, for what it's worth, and it's a huge pain in the ass.
  • Cathy -- I really appreciate the distinctions you make between the "the biology of attention" and "the sociology of change." And I agree that more complex and nuanced conversations about technology's relationship to attention, diversion, focus, and immersion will be more productive (than either nostalgia or utopic futurism). For example, it seems like a strange oversight (in the NYT piece) to bemoan the ability of "kids these days" to focus, read immersively, or Pay Attention, yet report without comment that these same kids can edit video for hours on end -- creative, immersive work which, I would imagine, requires more than a little focus. It seems that perhaps the question is not whether we can still pay attention or focus, but what those diverse forms of immersion within different media (will) look like.
  •  
    I recommend both this commentary and the original NYT piece to which it links and on which it comments.
Ed Webb

Duke coed's scandalous sex ratings go viral - TODAY People - TODAYshow.com - 0 views

  •  
    Lessons to learn here include that the quickest way to a publishing contract is notoriety. The media are hungry, but they feed on themselves, ultimately. Lives are catalysts for an otherwise self-sustaining hype cycle.
Ed Webb

Babies treat 'social robots' as sentient beings | KurzweilAI - 1 views

  • UW researchers hypothesized that babies would be more likely to view the robot as a psychological being if they saw other friendly human beings socially interacting with it. “Babies look to us for guidance in how to interpret things, and if we treat something as a psychological agent, they will, too,” Meltzoff said. “Even more remarkably, they will learn from it, because social interaction unlocks the key to early learning.”
  • “The study suggests that if you want to build a companion robot, it is not sufficient to make it look human,” said Rao. “The robot must also be able to interact socially with humans, an interesting challenge for robotics.”
Ed Webb

News: Cheating and the Generational Divide - Inside Higher Ed - 0 views

  • such attitudes among students can develop from the notion that all of education can be distilled into performance on a test -- which today's college students have absorbed from years of schooling under No Child Left Behind -- and not that education is a process in which one grapples with difficult material.
    • Ed Webb
       
      Exactly so. If the focus of education is moved away from testing regurgitated factoids and toward building genuine skills of critical analysis and effective communication, the apparent 'gap' in understanding of what cheating is will surely go away.
  •  
    I'd love to know what you Dystopians think about this.
  •  
    Institutional education puts far too much pressure on students to do well in tests. This, I believe, forces students to cheat, because if you do not perform well in this one form of evaluation you are clearly not educated well enough, not trying hard enough or just plain dumb. I doubt there are many instances outside of institutional education where you would need to memorize a number of facts for a small period of time where your very future is at stake. To me the only cheating is plagiarism. If you're taking a standardized test and you don't know the answer to question 60 but the student next to you does, how would it hurt anyone to share that answer? You're learning the answer to question 60. It's the same knowledge you'll learn when you get the test back and realize the answer to 60 was A not B. Again though, when will this scenario occur outside of schooling?
Ed Webb

elearnspace › The algorithms that rule our lives - 1 views

  • A significant difficulty that learning analytics needs to address is the possible return to behaviourism where we make decisions about learning only on observable behaviours of learners. Nonetheless, algorithms define our lives and how organizations interact with us. It’s a data-driven world, and the algorithm reigns supreme.
  •  
    Should we be worried about the growing dominance of algorithms in steering our fates?
Ed Webb

Shareable: The Exterminator's Want-Ad - 1 views

  • So, this moldy jail I was in was this old dot-com McMansion, out in the Permanent Foreclosure Zone in the dead suburbs. That's where they cooped us up. This gated community was built for some vanished rich people. That was their low-intensity prison for us rehab detainees.
  • This place outside was a Beltway suburb before Washington was abandoned. The big hurricane ran right over it, and crushed it down pretty good, so now it was a big green hippie jungle. Our prison McMansion had termites, roaches, mold and fleas, but once it was a nice house. This rambling wreck of a town was half storm-debris. All the lawns were replaced with wet, weedy, towering patches of bamboo, or marijuana -- or hops, or kenaf, whatever (I never could tell those farm crops apart). The same goes for the "garden roofs," which were dirt piled on top of the dirty houses. There were smelly goats running loose, chickens cackling. Salvaged umbrellas and chairs toppled in the empty streets. No traffic signs, because there were no cars.
  • The rich elite just blew it totally. They dropped their globalized ball. They panicked. So they're in jail, like I was. Or they're in exile somewhere, or else they jumped out of penthouses screaming when the hyperinflation ate them alive.
  • So, my cellmate Claire was this forty-something career lobbyist who used to be my boss inside the Beltway. Claire was full of horror stories about the cruelty of the socialist regime. Because, in the old days before we got ourselves arrested, alarmist tales of this kind were Claire's day-job. Claire peddled political spin to the LameStream Media to make sure that corporations stayed in command, so that situations like our present world stayed impossible.
  • Claire and I hated the sharing networks, because we were paid to hate them. We hated all social networks, like Facebook, because they destroyed the media that we owned. We certainly hated free software, because it was like some ever-growing anti-commercial fungus. We hated search engines and network aggregators, people like Google -- not because Google was evil, but because they weren't. We really hated "file-sharers" -- the swarming pirates who were chewing up the wealth of our commercial sponsors.
  • We despised green power networks because climate change was a myth. Until the climate actually changed. Then the honchos who paid us started drinking themselves to death.
  • This prison game was diabolical. It was very entertaining, and compulsively playable. This game had been designed by left-wing interaction designers, the kind of creeps who built not-for-profit empires like Wikipedia. Except they'd designed it for losers like us. Everybody in rehab had to role-play. We had to build ourselves another identity, because this new pretend-identity was supposed to help us escape the stifling spiritual limits of our previous, unliberated, greedy individualist identities. In this game, I played an evil dwarf. With an axe. Which would have been okay, because that identity was pretty much me all along. Except that the game's reward system had been jiggered to reward elaborate acts of social collaboration. Of course we wanted to do raids and looting and cool fantasy fighting, but that wasn't on. We were very firmly judged on the way we played this rehab game. It was never about grabbing the gold. It was all about forming trust coalitions so as to collectively readjust our fantasy infrastructure.
  • they were scanning us all the time. Nobody ever gets it about the tremendous power of network surveillance. That's how they ruled the world, though: by valuing every interaction, by counting every click. Every time one termite touched the feelers of another termite, they were adding that up. In a database. Everybody was broke: extremely poor, like preindustrial hard-scrabble poor, very modest, very "green." But still surviving. The one reason we weren't all chewing each other's cannibal thighbones (like the people on certain more disadvantaged continents), was because they'd stapled together this survival regime out of socialist software. It was very social. Ultra-social. No "privatization," no "private sector," and no "privacy." They pretended that it was all about happiness and kindliness and free-spirited cooperation and gay rainbow banners and all that. It was really a system that was firmly based on "social capital." Everything social was your only wealth. In a real "gift economy," you were the gift. You were living by your karma. Instead of a good old hundred-dollar bill, you just had a virtual facebooky thing with your own smiling picture on it, and that picture meant "Please Invest in the Bank of Me!"
  • These Lifestyle of Health and Sustainability geeks were maybe seven percent of America's population. But the termite people had seized power. They were the Last Best Hope of a society on the skids. They owned all the hope because they had always been the ones who knew our civilization was hopeless. So, I was in their prison until I got my head around that new reality. Until I realized that this was inevitable. That it was the way forward. That I loved Little Brother. After that, I could go walkies.
  • I learned to sit still and read a lot. Because that looks like innocent behavior.
  • Jean-Paul Sartre (who was still under copyright, so I reckon they stole his work). I learned some things from him. That changed me. "Hell is other people." That is the sinister side of a social-software shared society: that people suck, that hell is other people. Sharing with people is hell. When you share, then no matter how much money you have, they just won't leave you alone. I quoted Jean-Paul Sartre to the parole board. A very serious left-wing philosopher: lots of girlfriends (even feminists), he ate speed all the time, he hung out with Maoists. Except for the Maoist part, Jean-Paul Sartre is my guru. My life today is all about my Existential authenticity. Because I'm a dissident in this society.
  • social networks versus bandit mafias is like Ninjas Versus Pirates: it's a counterculture fight to the finish
  • the European Red Cross happened to show up during that episode (because they like gunfire). The Europeans are all prissy about the situation, of course. They are like: "What's with these illegal detainees in orange jumpsuits, and how come they don't have proper medical care?" So, I finally get paroled. I get amnestied.
  • in a network society, the power is ALL personal. "The personal is political." You mess with the tender feelings of a network maven, and she's not an objective bureaucrat following the rule of law. She's more like: "To the Bastille with this subhuman irritation!"
  • like "Heavy Weather" with a post-technology green catastrophe thrown in
Ed Webb

How to Mark a Book - 0 views

  • A book is more like the score of a piece of music than it is like a painting. No great musician confuses a symphony with the printed sheets of music. Arturo Toscanini reveres Brahms, but Toscanini's score of the G minor Symphony is so thoroughly marked up that no one but the maestro himself can read it. The reason why a great conductor makes notations on his musical scores -- marks them up again and again each time he returns to study them--is the reason why you should mark your books.
    • Ed Webb
       
      This is an excellent analogy.
  • the physical act of writing, with your own hand, brings words and sentences more sharply before your mind and preserves them better in your memory. To set down your reaction to important words and sentences you have read, and the questions they have raised in your mind, is to preserve those reactions and sharpen those questions.
    • Ed Webb
       
      The effect of new technologies here is still imperfectly understood. But there is some evidence that typing notes is less efficacious than handwriting them, in terms of committing information to memory and developing thought.
  • that is exactly what reading a book should be: a conversation between you and the author. Presumably he knows more about the subject than you do; naturally, you'll have the proper humility as you approach him. But don't let anybody tell you that a reader is supposed to be solely on the receiving end. Understanding is a two-way operation; learning doesn't consist in being an empty receptacle. The learner has to question himself and question the teacher. He even has to argue with the teacher, once he understands what the teacher is saying. And marking a book is literally an expression of differences, or agreements of opinion, with the author
  • ...4 more annotations...
  • Underlining (or highlighting): of major points, of important or forceful statements.
    Vertical lines at the margin: to emphasize a statement already underlined.
    Star, asterisk, or other doo-dad at the margin: to be used sparingly, to emphasize the ten or twenty most important statements in the book. (You may want to fold the bottom corner of each page on which you use such marks. It won't hurt the sturdy paper on which most modern books are printed, and you will be able to take the book off the shelf at any time and, by opening it at the folded-corner page, refresh your recollection of the book.)
    Numbers in the margin: to indicate the sequence of points the author makes in developing a single argument.
    Numbers of other pages in the margin: to indicate where else in the book the author made points relevant to the point marked; to tie up the ideas in a book, which, though they may be separated by many pages, belong together.
    Circling or highlighting of key words or phrases.
    Writing in the margin, or at the top or bottom of the page, for the sake of: recording questions (and perhaps answers) which a passage raised in your mind; reducing a complicated discussion to a simple statement; recording the sequence of major points right through the book.
    I use the end-papers at the back of the book to make a personal index of the author's points in the order of their appearance.
    • Ed Webb
       
      This is a good schema. You can develop your own that accomplishes the same. The key is to have a schema and apply it consistently.
  • you may say that this business of marking books is going to slow up your reading. It probably will. That's one of the reasons for doing it
  • Some things should be read quickly and effortlessly and some should be read slowly and even laboriously.
  • Why is marking up a book indispensable to reading? First, it keeps you awake. (And I don't mean merely conscious; I mean awake.) In the second place, reading, if it is active, is thinking, and thinking tends to express itself in words, spoken or written. The marked book is usually the thought-through book. Finally, writing helps you remember the thoughts you had, or the thoughts the author expressed.
Ed Webb

Artificial Intelligence and the Future of Humans | Pew Research Center - 0 views

  • experts predicted networked artificial intelligence will amplify human effectiveness but also threaten human autonomy, agency and capabilities
  • most experts, regardless of whether they are optimistic or not, expressed concerns about the long-term impact of these new tools on the essential elements of being human. All respondents in this non-scientific canvassing were asked to elaborate on why they felt AI would leave people better off or not. Many shared deep worries, and many also suggested pathways toward solutions. The main themes they sounded about threats and remedies are outlined in the accompanying table.
  • CONCERNS
    Human agency: Individuals are experiencing a loss of control over their lives. Decision-making on key aspects of digital life is automatically ceded to code-driven, "black box" tools. People lack input and do not learn the context about how the tools work. They sacrifice independence, privacy and power over choice; they have no control over these processes. This effect will deepen as automated systems become more prevalent and complex.
    Data abuse: Data use and surveillance in complex systems is designed for profit or for exercising power. Most AI tools are and will be in the hands of companies striving for profits or governments striving for power. Values and ethics are often not baked into the digital systems making people's decisions for them. These systems are globally networked and not easy to regulate or rein in.
    Job loss: The AI takeover of jobs will widen economic divides, leading to social upheaval. The efficiencies and other economic advantages of code-based machine intelligence will continue to disrupt all aspects of human work. While some expect new jobs will emerge, others worry about massive job losses, widening economic divides and social upheavals, including populist uprisings.
    Dependence lock-in: Reduction of individuals' cognitive, social and survival skills. Many see AI as augmenting human capacities but some predict the opposite - that people's deepening dependence on machine-driven networks will erode their abilities to think for themselves, take action independent of automated systems and interact effectively with others.
    Mayhem: Autonomous weapons, cybercrime and weaponized information. Some predict further erosion of traditional sociopolitical structures and the possibility of great loss of lives due to accelerated growth of autonomous military applications and the use of weaponized information, lies and propaganda to dangerously destabilize human groups. Some also fear cybercriminals' reach into economic systems.
  • ...18 more annotations...
  • AI and ML [machine learning] can also be used to increasingly concentrate wealth and power, leaving many people behind, and to create even more horrifying weapons
  • “In 2030, the greatest set of questions will involve how perceptions of AI and their application will influence the trajectory of civil rights in the future. Questions about privacy, speech, the right of assembly and technological construction of personhood will all re-emerge in this new AI context, throwing into question our deepest-held beliefs about equality and opportunity for all. Who will benefit and who will be disadvantaged in this new world depends on how broadly we analyze these questions today, for the future.”
  • SUGGESTED SOLUTIONS
    Global good is No. 1: Improve human collaboration across borders and stakeholder groups. Digital cooperation to serve humanity's best interests is the top priority. Ways must be found for people around the world to come to common understandings and agreements - to join forces to facilitate the innovation of widely accepted approaches aimed at tackling wicked problems and maintaining control over complex human-digital networks.
    Values-based system: Develop policies to assure AI will be directed at 'humanness' and common good. Adopt a 'moonshot mentality' to build inclusive, decentralized intelligent digital networks 'imbued with empathy' that help humans aggressively ensure that technology meets social and ethical responsibilities. Some new level of regulatory and certification process will be necessary.
    Prioritize people: Alter economic and political systems to better help humans 'race with the robots'. Reorganize economic and political systems toward the goal of expanding humans' capacities and capabilities in order to heighten human/AI collaboration and staunch trends that would compromise human relevance in the face of programmed intelligence.
  • “I strongly believe the answer depends on whether we can shift our economic systems toward prioritizing radical human improvement and staunching the trend toward human irrelevance in the face of AI. I don’t mean just jobs; I mean true, existential irrelevance, which is the end result of not prioritizing human well-being and cognition.”
  • We humans care deeply about how others see us – and the others whose approval we seek will increasingly be artificial. By then, the difference between humans and bots will have blurred considerably. Via screen and projection, the voice, appearance and behaviors of bots will be indistinguishable from those of humans, and even physical robots, though obviously non-human, will be so convincingly sincere that our impression of them as thinking, feeling beings, on par with or superior to ourselves, will be unshaken. Adding to the ambiguity, our own communication will be heavily augmented: Programs will compose many of our messages and our online/AR appearance will [be] computationally crafted. (Raw, unaided human speech and demeanor will seem embarrassingly clunky, slow and unsophisticated.) Aided by their access to vast troves of data about each of us, bots will far surpass humans in their ability to attract and persuade us. Able to mimic emotion expertly, they’ll never be overcome by feelings: If they blurt something out in anger, it will be because that behavior was calculated to be the most efficacious way of advancing whatever goals they had ‘in mind.’ But what are those goals?
  • AI will drive a vast range of efficiency optimizations but also enable hidden discrimination and arbitrary penalization of individuals in areas like insurance, job seeking and performance assessment
  • The record to date is that convenience overwhelms privacy
  • As AI matures, we will need a responsive workforce, capable of adapting to new processes, systems and tools every few years. The need for these fields will arise faster than our labor departments, schools and universities are acknowledging
  • AI will eventually cause a large number of people to be permanently out of work
  • Newer generations of citizens will become more and more dependent on networked AI structures and processes
  • there will exist sharper divisions between digital ‘haves’ and ‘have-nots,’ as well as among technologically dependent digital infrastructures. Finally, there is the question of the new ‘commanding heights’ of the digital network infrastructure’s ownership and control
  • As a species we are aggressive, competitive and lazy. We are also empathic, community minded and (sometimes) self-sacrificing. We have many other attributes. These will all be amplified
  • Given historical precedent, one would have to assume it will be our worst qualities that are augmented
  • Our capacity to modify our behaviour, subject to empathy and an associated ethical framework, will be reduced by the disassociation between our agency and the act of killing
  • We cannot expect our AI systems to be ethical on our behalf – they won’t be, as they will be designed to kill efficiently, not thoughtfully
  • the Orwellian nightmare realised
  • “AI will continue to concentrate power and wealth in the hands of a few big monopolies based on the U.S. and China. Most people – and parts of the world – will be worse off.”
  • The remainder of this report is divided into three sections that draw from hundreds of additional respondents’ hopeful and critical observations: 1) concerns about human-AI evolution, 2) suggested solutions to address AI’s impact, and 3) expectations of what life will be like in 2030, including respondents’ positive outlooks on the quality of life and the future of work, health care and education
Ed Webb

Can Economists and Humanists Ever Be Friends? | The New Yorker - 0 views

  • There is something thrilling about the intellectual audacity of thinking that you can explain ninety per cent of behavior in a society with one mental tool.
  • education, which they believe is a form of domestication
  • there is no moral dimension to this economic analysis: utility is a fundamentally amoral concept
  • ...11 more annotations...
  • intellectual overextension is often found in economics, as Gary Saul Morson and Morton Schapiro explain in their wonderful book “Cents and Sensibility: What Economics Can Learn from the Humanities” (Princeton). Morson and Schapiro—one a literary scholar and the other an economist—draw on the distinction between hedgehogs and foxes made by Isaiah Berlin in a famous essay from the nineteen-fifties, invoking an ancient Greek fragment: “The fox knows many things, but the hedgehog one big thing.” Economists tend to be hedgehogs, forever on the search for a single, unifying explanation of complex phenomena. They love to look at a huge, complicated mass of human behavior and reduce it to an equation: the supply-and-demand curves; the Phillips curve, which links unemployment and inflation; or mb=mc, which links a marginal benefit to a marginal cost—meaning that the fourth slice of pizza is worth less to you than the first. These are powerful tools, which can be taken too far. Morson and Schapiro cite the example of Gary Becker, the Nobel laureate in economics in 1992. Becker is a hero to many in the field, but, for all the originality of his thinking, to outsiders he can stand for intellectual overconfidence. He thought that “the economic approach is a comprehensive one that is applicable to all human behavior.” Not some, not most—all
  • Becker analyzed, in his own words, “fertility, education, the uses of time, crime, marriage, social interactions, and other ‘sociological,’ ‘legal,’ and ‘political problems,’ ” before concluding that economics explained everything
  • The issue here is one of overreach: taking an argument that has worthwhile applications and extending it further than it usefully goes. Our motives are often not what they seem: true. This explains everything: not true. After all, it’s not as if the idea that we send signals about ourselves were news; you could argue that there is an entire social science, sociology, dedicated to the subject. Classic practitioners of that discipline study the signals we send and show how they are interpreted by those around us, as in Erving Goffman’s “The Presentation of Self in Everyday Life,” or how we construct an entire identity, both internally and externally, from the things we choose to be seen liking—the argument of Pierre Bourdieu’s masterpiece “Distinction.” These are rich and complicated texts, which show how rich and complicated human difference can be. The focus on signalling and unconscious motives in “The Elephant in the Brain,” however, goes the other way: it reduces complex, diverse behavior to simple rules.
  • “A traditional cost-benefit analysis could easily have led to the discontinuation of a project widely viewed as being among the most successful health interventions in African history.”
  • Another part of me, though, is done with it, with the imperialist ambitions of economics and its tendency to explain away differences, to ignore culture, to exalt reductionism. I want to believe Morson and Schapiro and Desai when they posit that the gap between economics and the humanities can be bridged, but my experience in both writing fiction and studying economics leads me to think that they’re wrong. The hedgehog doesn’t want to learn from the fox. The realist novel is a solemn enemy of equations. The project of reducing behavior to laws and the project of attending to human beings in all their complexity and specifics are diametrically opposed. Perhaps I’m only talking about myself, and this is merely an autobiographical reflection, rather than a general truth, but I think that if I committed any further to economics I would have to give up writing fiction. I told an economist I know about this, and he laughed. He said, “Sounds like you’re maximizing your utility.” 
  • finance is full of “attribution errors,” in which people view their successes as deserved and their failures as bad luck. Desai notes that in business, law, or pedagogy we can gauge success only after months or years; in finance, you can be graded hour by hour, day by day, and by plainly quantifiable measures. What’s more, he says, “the ‘discipline of the market’ shrouds all of finance in a meritocratic haze.” And so people who succeed in finance “are susceptible to developing massively outsized egos and appetites.”
  • one of the things I liked about economics, finance, and the language of money was their lack of hypocrisy. Modern life is full of cant, of people saying things they don’t quite believe. The money guys, in private, don’t go in for cant. They’re more like Mafia bosses. I have to admit that part of me resonates to that coldness.
  • Economics, Morson and Schapiro say, has three systematic biases: it ignores the role of culture, it ignores the fact that “to understand people one must tell stories about them,” and it constantly touches on ethical questions beyond its ken. Culture, stories, and ethics are things that can’t be reduced to equations, and economics accordingly has difficulty with them
  • There is something thrilling about the intellectual audacity of thinking that you can explain ninety per cent of behavior in a society with one mental tool
  • According to Hanson and Simler, these unschooled workers “won’t show up for work reliably on time, or they have problematic superstitions, or they prefer to get job instructions via indirect hints instead of direct orders, or they won’t accept tasks and roles that conflict with their culturally assigned relative status with co-workers, or they won’t accept being told to do tasks differently than they had done them before.”
  • The idea that Maya Angelou’s career amounts to nothing more than a writer shaking her tail feathers to attract the attention of a dominant male is not just misleading; it’s actively embarrassing.
Ed Webb

The Coronavirus and Our Future | The New Yorker - 0 views

  • I’ve spent my life writing science-fiction novels that try to convey some of the strangeness of the future. But I was still shocked by how much had changed, and how quickly.
  • the change that struck me seemed more abstract and internal. It was a change in the way we were looking at things, and it is still ongoing. The virus is rewriting our imaginations. What felt impossible has become thinkable. We’re getting a different sense of our place in history. We know we’re entering a new world, a new era. We seem to be learning our way into a new structure of feeling.
  • The Anthropocene, the Great Acceleration, the age of climate change—whatever you want to call it, we’ve been out of synch with the biosphere, wasting our children’s hopes for a normal life, burning our ecological capital as if it were disposable income, wrecking our one and only home in ways that soon will be beyond our descendants’ ability to repair. And yet we’ve been acting as though it were 2000, or 1990—as though the neoliberal arrangements built back then still made sense. We’ve been paralyzed, living in the world without feeling it.
  • ...24 more annotations...
  • We realize that what we do now, well or badly, will be remembered later on. This sense of enacting history matters. For some of us, it partly compensates for the disruption of our lives.
  • Actually, we’ve already been living in a historic moment. For the past few decades, we’ve been called upon to act, and have been acting in a way that will be scrutinized by our descendants. Now we feel it. The shift has to do with the concentration and intensity of what’s happening. September 11th was a single day, and everyone felt the shock of it, but our daily habits didn’t shift, except at airports; the President even urged us to keep shopping. This crisis is different. It’s a biological threat, and it’s global. Everyone has to change together to deal with it. That’s really history.
  • There are 7.8 billion people alive on this planet—a stupendous social and technological achievement that’s unnatural and unstable. It’s made possible by science, which has already been saving us. Now, though, when disaster strikes, we grasp the complexity of our civilization—we feel the reality, which is that the whole system is a technical improvisation that science keeps from crashing down
  • Today, in theory, everyone knows everything. We know that our accidental alteration of the atmosphere is leading us into a mass-extinction event, and that we need to move fast to dodge it. But we don’t act on what we know. We don’t want to change our habits. This knowing-but-not-acting is part of the old structure of feeling.
  • remember that you must die. Older people are sometimes better at keeping this in mind than younger people. Still, we’re all prone to forgetting death. It never seems quite real until the end, and even then it’s hard to believe. The reality of death is another thing we know about but don’t feel.
  • it is the first of many calamities that will likely unfold throughout this century. Now, when they come, we’ll be familiar with how they feel.
  • water shortages. And food shortages, electricity outages, devastating storms, droughts, floods. These are easy calls. They’re baked into the situation we’ve already created, in part by ignoring warnings that scientists have been issuing since the nineteen-sixties
  • Imagine what a food scare would do. Imagine a heat wave hot enough to kill anyone not in an air-conditioned space, then imagine power failures happening during such a heat wave.
  • science fiction is the realism of our time
  • Science-fiction writers don’t know anything more about the future than anyone else. Human history is too unpredictable; from this moment, we could descend into a mass-extinction event or rise into an age of general prosperity. Still, if you read science fiction, you may be a little less surprised by whatever does happen. Often, science fiction traces the ramifications of a single postulated change; readers co-create, judging the writers’ plausibility and ingenuity, interrogating their theories of history. Doing this repeatedly is a kind of training. It can help you feel more oriented in the history we’re making now. This radical spread of possibilities, good to bad, which creates such a profound disorientation; this tentative awareness of the emerging next stage—these are also new feelings in our time.
  • Do we believe in science? Go outside and you’ll see the proof that we do everywhere you look. We’re learning to trust our science as a society. That’s another part of the new structure of feeling.
  • This mixture of dread and apprehension and normality is the sensation of plague on the loose. It could be part of our new structure of feeling, too.
  • there are charismatic mega-ideas. “Flatten the curve” could be one of them. Immediately, we get it. There’s an infectious, deadly plague that spreads easily, and, although we can’t avoid it entirely, we can try to avoid a big spike in infections, so that hospitals won’t be overwhelmed and fewer people will die. It makes sense, and it’s something all of us can help to do. When we do it—if we do it—it will be a civilizational achievement: a new thing that our scientific, educated, high-tech species is capable of doing. Knowing that we can act in concert when necessary is another thing that will change us.
  • People who study climate change talk about "the tragedy of the horizon." The tragedy is that we don't care enough about those future people, our descendants, who will have to fix, or just survive on, the planet we're now wrecking. We like to think that they'll be richer and smarter than we are and so able to handle their own problems in their own time. But we're creating problems that they'll be unable to solve. You can't fix extinctions, or ocean acidification, or melted permafrost, no matter how rich or smart you are. The fact that these problems will occur in the future lets us take a magical view of them. We go on exacerbating them, thinking—not that we think this, but the notion seems to underlie our thinking—that we will be dead before it gets too serious. The tragedy of the horizon is often something we encounter, without knowing it, when we buy and sell. The market is wrong; the prices are too low. Our way of life has environmental costs that aren't included in what we pay, and those costs will be borne by our descendants. We are operating a multigenerational Ponzi scheme.
  • We’ve decided to sacrifice over these months so that, in the future, people won’t suffer as much as they would otherwise. In this case, the time horizon is so short that we are the future people.
  • Amid the tragedy and death, this is one source of pleasure. Even though our economic system ignores reality, we can act when we have to. At the very least, we are all freaking out together. To my mind, this new sense of solidarity is one of the few reassuring things to have happened in this century. If we can find it in this crisis, to save ourselves, then maybe we can find it in the big crisis, to save our children and theirs.
  • Thatcher said that “there is no such thing as society,” and Ronald Reagan said that “government is not the solution to our problem; government is the problem.” These stupid slogans marked the turn away from the postwar period of reconstruction and underpin much of the bullshit of the past forty years
  • We are individuals first, yes, just as bees are, but we exist in a larger social body. Society is not only real; it’s fundamental. We can’t live without it. And now we’re beginning to understand that this “we” includes many other creatures and societies in our biosphere and even in ourselves. Even as an individual, you are a biome, an ecosystem, much like a forest or a swamp or a coral reef. Your skin holds inside it all kinds of unlikely coöperations, and to survive you depend on any number of interspecies operations going on within you all at once. We are societies made of societies; there are nothing but societies. This is shocking news—it demands a whole new world view.
  • It’s as if the reality of citizenship has smacked us in the face.
  • The neoliberal structure of feeling totters. What might a post-capitalist response to this crisis include? Maybe rent and debt relief; unemployment aid for all those laid off; government hiring for contact tracing and the manufacture of necessary health equipment; the world’s militaries used to support health care; the rapid construction of hospitals.
  • If the project of civilization—including science, economics, politics, and all the rest of it—were to bring all eight billion of us into a long-term balance with Earth’s biosphere, we could do it. By contrast, when the project of civilization is to create profit—which, by definition, goes to only a few—much of what we do is actively harmful to the long-term prospects of our species.
  • Economics is a system for optimizing resources, and, if it were trying to calculate ways to optimize a sustainable civilization in balance with the biosphere, it could be a helpful tool. When it’s used to optimize profit, however, it encourages us to live within a system of destructive falsehoods. We need a new political economy by which to make our calculations. Now, acutely, we feel that need.
  • We’ll remember this even if we pretend not to. History is happening now, and it will have happened. So what will we do with that?
  • How we feel is shaped by what we value, and vice versa. Food, water, shelter, clothing, education, health care: maybe now we value these things more, along with the people whose work creates them. To survive the next century, we need to start valuing the planet more, too, since it’s our only home.
Ed Webb

DK Matai: The Rise of The Bio-Info-Nano Singularity - 0 views

  • The human capability for information processing is limited, yet there is an accelerating change in the development and deployment of new technology. This relentless wave upon wave of new information and technology causes an overload on the human mind by eventually flooding it. The resulting acopia -- inability to cope -- has to be solved by the use of ever more sophisticated information intelligence. Extrapolating these capabilities suggests the near-term emergence and visibility of self-improving neural networks, "artificial" intelligence, quantum algorithms, quantum computing and super-intelligence. This metamorphosis is so much beyond present human capabilities that it becomes impossible to understand it with the pre-conceptions and conditioning of the present mindset, societal make-up and existing technology
  • The Bio-Info-Nano Singularity is a transcendence to a wholly new regime of mind, society and technology, in which we have to learn to think in a new way in order to survive as a species.
  • What is globalized human society going to do with the mass of unemployed human beings that are rendered obsolete by the approaching super-intelligence of the Bio-Info-Nano Singularity?
  • ...5 more annotations...
  • Nothing futurists predict ever comes true, but, by the time the time comes, everybody has forgotten they said it--and then they are free to say something else that never will come true but that everybody will have forgotten they said by the time the time comes
  • Most of us will become poisoned troglodytes in a techno dystopia
  • Any engineer can make 'stuff' go faster, kill deader, sort quicker, fly higher, record sharper, destroy more completely, etc. We have a surfeit of that kind of creativity. What we need is some kind of genius to create a society that treats each other with equality, justice, caring and cooperativeness. The concept of 'singularity' doesn't excite me nearly as much as the idea that sometime we might be able to move beyond the civilization level of a troop of chimpanzees. I'm hoping that genius comes before we manage to destroy what little civilization we have with all our neat "stuff".
  • There's a lot of abstraction in this article, which is a trend in what I have read from a number of movements taking up the Singularity cause: a nebulous but optimistic prediction of an incomprehensibly advanced future, wherein through technology and science we achieve quasi-immortality, absolute control of thought, omniscience, or transcendence from the human entirely.
  • Welcome to the Frankenstein plot. This is a very common Hollywood plot, the idea of a man-made creation running amok. The trend the author describes can also be pictured as an asymptotic curve on a graph, where scientific achievement parallels time at first, then gradually goes vertical until infinite scientific knowledge and invention occur in an incredibly short time.
Ed Webb

Video: Japanese Fembot Learns to Sing By Mimicking Pop Stars | Popular Science - 0 views

  • via Bryan Alexander's Twitter feed
Ed Webb

Schools Urged To Teach Youth Digital Citizenship : NPR - 0 views

  • not being trained in digital citizenship never caused a problem for me. I knew what was right and wrong, and I did the right thing. Why is this being treated so differently??? "Nobody has come out and said, 'This is how it's supposed to be.'" This is part of my issue with "education". Students are learning that they only need to do what they are told. If there isn't a rule, it must be OK. There's no thought, no critical evaluation, no drawing of parallels that if something is wrong in this circumstance, then it must also be in this other situation. People need to be allowed (forced? certainly encouraged) to think for themselves -- and to be responsible for their own actions!
    • Ed Webb
       
      Do you agree with this comment? Are issues such as ethics, courtesy etc different in the digital domain, or can/should values cross over? Is there a need for training or education specific to the online rather than common to the offline and online?
  • "For the most part, kids who are in college today never received any form of digital citizenship or media training when they were in high school or middle school."