Mindamp: Group items tagged "distracted"


David McGavock

Multitasking, social media and distraction: Research review Journalist's Resource: Rese... - 0 views

  • researchers have tried to assess how humans are coping in this highly connected environment and how “chronic multitasking” may diminish our capacity to function effectively.
  • Clifford Nass notes that scholarship has remained firm in the overall assessment: “The research is almost unanimous, which is very rare in social science, and it says that people who chronically multitask show an enormous range of deficits. They’re basically terrible at all sorts of cognitive tasks, including multitasking.”
  • Below are more than a dozen representative studies in these areas:
  • ...9 more annotations...
  • The researchers conclude that the experiments “suggest that heavy media multitaskers are distracted by the multiple streams of media they are consuming, or, alternatively, that those who infrequently multitask are more effective at volitionally allocating their attention in the face of distractions.”
  • Members of the ‘Net Generation’ reported more multitasking than members of ‘Generation X,’ who reported more multitasking than members of the ‘Baby Boomer’ generation. The choices of which tasks to combine for multitasking were highly correlated across generations, as were difficulty ratings of specific multitasking combinations.
  • same time, these experts predicted that the impact of networked living on today’s young will drive them to thirst for instant gratification, settle for quick choices, and lack patience
  • similar mental limitations in the types of tasks that can be multitasked.
  • survey about the future of the Internet, technology experts and stakeholders were fairly evenly split as to whether the younger generation’s always-on connection to people and information will turn out to be a net positive or a net negative by 2020.
  • said many of the young people growing up hyperconnected to each other and the mobile Web and counting on the Internet as their external brain will be nimble, quick-acting multitaskers who will do well in key respects.
  • The educational implications include allowing students short ‘technology breaks’ to reduce distractions and teaching students metacognitive strategies regarding when interruptions negatively impact learning.”
  • The data suggest that “using Facebook and texting while doing schoolwork were negatively predictive of overall GPA.” However, “emailing, talking on the phone, and using IM were not related to overall GPA.”
  • Regression analyses revealed that increased media multitasking was associated with higher depression and social anxiety symptoms, even after controlling for overall media use and the personality traits of neuroticism and extraversion.
  •  
    Clifford Nass notes that scholarship has remained firm in the overall assessment: "The research is almost unanimous, which is very rare in social science, and it says that people who chronically multitask show an enormous range of deficits. They're basically terrible at all sorts of cognitive tasks, including multitasking."
David McGavock

I Was So Right About Distraction in Now You See it: Darn it all! | HASTAC - 1 views

  • I argue that we are always multitasking and sometimes we do it more adeptly than others and it is incumbent on us to take our own internal inventory and decide what we are doing well and what we are not. And then to ask why.
  • The point is too many new technologies at once are distracting.   So is too much life.  So is too much anything that is new, cumbersome, non-routinized. 
  • But there's been so much punditry about "multitasking," as if Twitter is the only thing that makes our life's tasks multiple. As I've said many times, heartache (emotional overload) and heartburn (physical ailments) are far more distracting than email . . . and they make it harder to learn new technologies too.
  • ...3 more annotations...
  • Multitasking is not a symptom of technology.   The problem is that I am having to learn everything from scratch, all the time, all at once.
  • The same, by the way, is also true when your worklife depends on technology and the technology changes.
    • David McGavock
       
      This is the most frustrating for my computer clients.
  • I say that unlearning, in fact, makes us pay attention to the world in a new way.  George Lakoff says it is useful to become "reflective about our reflexes."
  •  
    "Blaming "the Internet" or "social media" for contemporary distraction falls into a typical pattern of one genereration blaming any new technology for supposed ills, including supposed shortcomings of the younger generation (who seem to adopt new technologies and adapt to them much more easily than do their parents).  "
David McGavock

I Was So Right About Distraction in Now You See it: Darn it all! | HASTAC - 1 views

  • I argue that we are always multitasking and sometimes we do it more adeptly than others and it is incumbent on us to take our own internal inventory and decide what we are doing well and what we are not
  • The problem is that I am having to learn everything from scratch, all the time, all at once.
  • Is it the technology or the stream of non-stop decision-making that doesn't seem to stick to a 9-5 workday but follows you home from the office, at night, on weekends, on summer vacation?  
  • ...7 more annotations...
  • I advocate avoiding distraction but going deep, introspective, and finding out what exactly is freaking you out.
  • heartache (emotional overload) and heartburn (physical ailments) are far more distracting than email . . . and they make it harder to learn new technologies too.
  • what makes you distracted is that you are doing too many non-automatic, non-reflexive things at once.
  • The same, by the way, is also true when your worklife depends on technology and the technology changes.
  • your former patterns and reflexes don't serve you invisibly, efficiently, automatically. 
  • unlearning, in fact, makes us pay attention to the world in a new way.  George Lakoff says it is useful to become "reflective about our reflexes."
  • I am hoping that the result of this tedious, difficult, uneven, sometimes triumphant, sometime despairing transition time will be a fresh new way of looking at the world, now that so much of the world I took for granted, so many of the collaborations and processes and bureaucracies and patterns and expertise is so vividly transparent.
  •  
    " I am hoping that the result of this tedious, difficult, uneven, sometimes triumphant, sometime despairing transition time will be a fresh new way of looking at the world, now that so much of the world I took for granted, so many of the collaborations and processes and bureaucracies and patterns and expertise is so vividly transparent."
David McGavock

What Happened to Downtime? The Extinction of Deep Thinking & Sacred Space :: Articles :... - 0 views

  • What Happened to Downtime? The Extinction of Deep Thinking & Sacred Space, by Scott Belsky. Interruption-free space is sacred. Yet, in the digital era we live in, we are losing hold of the few sacred spaces that remain untouched by email, the internet, people, and other forms of distraction. Our cars now have mobile phone integration and a thousand satellite radio stations. When walking from one place to another, we have our devices streaming data from dozens of sources. Even at our bedside, we now have our iPads with heaps of digital apps and the world's information at our fingertips.
  • Why do we crave distraction over downtime?

Why do we give up our sacred space so easily? Because space is scary.
  • It is now possible to always feel loved and cared for, thanks to the efficiency of our “comment walls” on Facebook and seamless connection with everyone we've ever known. Your confidence and self-esteem can quickly be reassured by checking your number of “followers” on Twitter or the number of “likes” garnered by your photographs and blog posts.
  • ...6 more annotations...
  • Our insatiable need to tune into information – at the expense of savoring our downtime – is a form of “work” (something I call “insecurity work”
  • five potential mindsets and solutions for consideration:
  • 1. Rituals for unplugging.

  • We need some rules. When it comes to scheduling, we will need to allocate blocks of time for deep thinking. Maybe you will carve out a 1-2 hour block on your calendar every day for taking a walk or grabbing a cup of coffee and just pondering some of those bigger things.
  • 3. Meditation and naps to clear the mind.

  • It is supremely important that we recognize the power of our insecurities and, at the very least, acknowledge where our anxiety comes from. Awareness is always the first step in solving any problem.


  •  
    Interruption-free space is sacred. Yet, in the digital era we live in, we are losing hold of the few sacred spaces that remain untouched by email, the internet, people, and other forms of distraction.
David McGavock

Is Google Making Us Stupid? - Nicholas Carr - The Atlantic - 1 views

  • I’m not thinking the way I used to think. I can feel it most strongly when I’m reading. Immersing myself in a book or a lengthy article used to be easy.
  • I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.
  • The Web has been a godsend to me as a writer. Research that once required days in the stacks or periodical rooms of libraries can now be done in minutes
  • ...24 more annotations...
  • I’m as likely as not to be foraging in the Web’s info-thickets: reading and writing e-mails, scanning headlines and blog posts, watching videos and listening to podcasts, or just tripping from link to link to link.
  • For me, as for others, the Net is becoming a universal medium, the conduit for most of the information that flows through my eyes and ears and into my mind.
  • As the media theorist Marshall McLuhan pointed out in the 1960s, media are not just passive channels of information. They supply the stuff of thought, but they also shape the process of thought.
  • The more they use the Web, the more they have to fight to stay focused on long pieces of writing.
  • “I can’t read War and Peace anymore,” he admitted. “I’ve lost the ability to do that.”
    • David McGavock
       
      Unlikely. He hasn't lost the ability but the desire.
  • recently published study of online research habits , conducted by scholars from University College London, suggests that we may well be in the midst of a sea change in the way we read and think.
  • new forms of “reading” are emerging as users “power browse” horizontally through titles, contents pages and abstracts going for quick wins.
  • we may well be reading more today than we did in the 1970s or 1980s
  • “We are how we read.” Wolf worries that the style of reading promoted by the Net, a style that puts “efficiency” and “immediacy” above all else, may be weakening our capacity for the kind of deep reading that emerged when an earlier technology, the printing press, made long and complex works of prose commonplace.
  • the media or other technologies we use in learning and practicing the craft of reading play an important part in shaping the neural circuits inside our brains.
  • even the adult mind “is very plastic.” Nerve cells routinely break old connections and form new ones. “The brain,” according to Olds, “has the ability to reprogram itself on the fly, altering the way it functions.”
  • Lewis Mumford  described how the clock “disassociated time from human events and helped create the belief in an independent world of mathematically measurable sequences.”
  • In deciding when to eat, to work, to sleep, to rise, we stopped listening to our senses and started obeying the clock.
  • The Internet, an immeasurably powerful computing system, is subsuming most of our other intellectual technologies. It’s becoming our map and our clock, our printing press and our typewriter, our calculator and our telephone, and our radio and TV.
  • The Net’s influence doesn’t end at the edges of a computer screen, either. As people’s minds become attuned to the crazy quilt of Internet media, traditional media have to adapt to the audience’s new expectations.
  • The Net’s intellectual ethic remains obscure.
    • David McGavock
       
      So the net has ethics?? This anthropomorphism takes away our responsibility
  • The goal, as Taylor defined it in his celebrated 1911 treatise, The Principles of Scientific Management, was to identify and adopt, for every job, the “one best method” of work and thereby to effect “the gradual substitution of science for rule of thumb throughout the mechanic arts.”
  • In Google’s view, information is a kind of commodity, a utilitarian resource that can be mined and processed with industrial efficiency. The more pieces of information we can “access” and the faster we can extract their gist, the more productive we become as thinkers.
  • their easy assumption that we’d all “be better off” if our brains were supplemented, or even replaced, by an artificial intelligence is unsettling. It suggests a belief that intelligence is the output of a mechanical process, a series of discrete steps that can be isolated, measured, and optimized.
  • there’s little place for the fuzziness of contemplation. Ambiguity is not an opening for insight but a bug to be fixed. The human brain is just an outdated computer that needs a faster processor and a bigger hard drive.
  • The last thing these companies want is to encourage leisurely reading or slow, concentrated thought. It’s in their economic interest to drive us to distraction.
    • David McGavock
       
      I find this the most compelling argument. "Business" has an interest in selling things. Moving us faster, increasing our "seeking" instinct is one of the keys to this consumption frenzy. The individual needs to understand and manage these forces.
  • The kind of deep reading that a sequence of printed pages promotes is valuable not just for the knowledge we acquire from the author’s words but for the intellectual vibrations those words set off within our own minds.
  • we make our own associations, draw our own inferences and analogies, foster our own ideas.
  • As we are drained of our “inner repertory of dense cultural inheritance,” Foreman concluded, we risk turning into “‘pancake people’—spread wide and thin.”
    • David McGavock
       
      I like this metaphor. Pancake people
David McGavock

Shambhala Sun - Wisdom 2.0: The Digital World Connects (July 2011) - 3 views

  • Soren Gordhamer, author of Wisdom 2.0: Ancient Secrets for the Creative and Constantly Connected, has made it his business to explore how technology alters our patterns of living and how we can work creatively with those changes.
  • Wisdom 2.0 is a conversation among people who have accepted technology and its benefits, but who are also deeply concerned about the inner life.
  • Gordhamer says that Wisdom 2.0 is occurring whenever people realize that “each of us longs for wisdom, and that is what’s really important in life. It’s easy to get caught in the speed of life and forget what we’re here for, our true path.”
  •  
    "Unprecedented communication and information-but also speed, stress, and 24/7 distraction. BARRY BOYCE reports on a group of far-thinking digital leaders who are using mindfulness to humanize the brave new world they have created. "
  •  
    Soren Gordhamer's book Wisdom 2.0 is a very good read in this regard.
David McGavock

Infotention, internal: Attentional strategies | Social Media Classroom - 2 views

  • We can talk about attention training, multitasking, the dangers of distraction.
  • What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it"
  • Infotention
  • ...10 more annotations...
  • brain-powered attention skills and computer-powered information filters.
  • work best together with a third element — sociality
  • Honing the mental ability to employ the form of attention appropriate for each moment
  • put together intelligence dashboards, news radars, and information filters from online tools like persistent search and RSS is the external technical component of information literacy.
  • Infotention involves a third element
  • other people
  • recommendations that make it possible to find fresh and useful signals amid the overwhelming noise of the Internet
  • Mentally trained, technologically augmented, socially mediated
  • detecting information that could be valuable specifically to you, whenever and wherever it is useful to you.
  • crap detection skills and basic mindfulness come in. 
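The "external technical component" described above, filters and radars built from tools like persistent search and RSS, can be sketched as a minimal keyword filter over a feed. The feed contents and keyword list below are invented for illustration, not taken from any real dashboard:

```python
import xml.etree.ElementTree as ET

# A tiny hypothetical RSS feed, stood in for a real subscription.
RSS_SAMPLE = """<rss version="2.0"><channel>
  <item><title>Attention training for knowledge workers</title></item>
  <item><title>Celebrity gossip roundup</title></item>
  <item><title>New RSS dashboard tools reviewed</title></item>
</channel></rss>"""

def filter_items(rss_xml, keywords):
    """Return item titles containing any of the given keywords (case-insensitive)."""
    root = ET.fromstring(rss_xml)
    titles = [t.text for t in root.iter("title")]
    return [t for t in titles
            if any(k.lower() in t.lower() for k in keywords)]

print(filter_items(RSS_SAMPLE, ["attention", "RSS"]))
# → ['Attention training for knowledge workers', 'New RSS dashboard tools reviewed']
```

In the scheme the notes describe, such machine filtering would then be combined with the third element, social recommendations from trusted people, to separate useful signal from noise.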
David McGavock

Freedom - Windows and Mac Internet Blocking Software - 0 views

  •  
    "Freedom is a simple productivity application that locks you away from the internet on Mac or Windows computers for up to eight hours at a time."
  •  
    A tool that helps one focus: turn off the distractions.
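The blocking idea can be illustrated with a toy blocklist check. This is only a sketch of the concept, with hypothetical domains; it is not how Freedom itself is implemented (Freedom blocks at the network level):

```python
from urllib.parse import urlparse

BLOCKLIST = {"twitter.com", "facebook.com"}  # hypothetical distraction list

def is_blocked(url: str) -> bool:
    """True if the URL's host is on the blocklist (subdomains included)."""
    host = urlparse(url).netloc.lower()
    return any(host == d or host.endswith("." + d) for d in BLOCKLIST)

print(is_blocked("https://www.facebook.com/feed"))  # → True
print(is_blocked("https://example.org/article"))    # → False
```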
David McGavock

The Myth Of AI | Edge.org - 1 views

  • what I'm proposing is that if AI was a real thing, then it probably would be less of a threat to us than it is as a fake thing.
  • it adds a layer of religious thinking to what otherwise should be a technical field.
  • we can talk about pattern classification.
  • ...38 more annotations...
  • But when you add to it this religious narrative that's a version of the Frankenstein myth, where you say well, but these things are all leading to a creation of life, and this life will be superior to us and will be dangerous
  • I'm going to go through a couple of layers of how the mythology does harm.
  • this overall atmosphere of accepting the algorithms as doing a lot more than they do. In the case of Netflix, the recommendation engine is serving to distract you from the fact that there's not much choice anyway.
  • If a program tells you, well, this is how things are, this is who you are, this is what you like, or this is what you should do, we have a tendency to accept that.
  • our economy has shifted to what I call a surveillance economy, but let's say an economy where algorithms guide people a lot, we have this very odd situation where you have these algorithms that rely on big data in order to figure out who you should date, who you should sleep with, what music you should listen to, what books you should read, and on and on and on
  • people often accept that
  • all this overpromising that AIs will be about to do this or that. It might be to become fully autonomous driving vehicles instead of only partially autonomous, or it might be being able to fully have a conversation as opposed to only having a useful part of a conversation to help you interface with the device.
  • other cases where the recommendation engine is not serving that function, because there is a lot of choice, and yet there's still no evidence that the recommendations are particularly good.
  • there's no way to tell where the border is between measurement and manipulation in these systems.
  • if the preponderance of those people have grown up in the system and are responding to whatever choices it gave them, there's not enough new data coming into it for even the most ideal or intelligent recommendation engine to do anything meaningful.
  • it simply turns into a system that measures which manipulations work, as opposed to which ones don't work, which is very different from a virginal and empirically careful system that's trying to tell what recommendations would work had it not intervened
  • What's not clear is where the boundary is.
  • If you ask: is a recommendation engine like Amazon more manipulative, or more of a legitimate measurement device? There's no way to know.
  • we don't know to what degree they're measurement versus manipulation.
  • If people are deciding what books to read based on a momentum within the recommendation engine that isn't going back to a virgin population, that hasn't been manipulated, then the whole thing is spun out of control and doesn't mean anything anymore
  • not so much a rise of evil as a rise of nonsense.
  • because of the mythology about AI, the services are presented as though they are these mystical, magical personas. IBM makes a dramatic case that they've created this entity that they call different things at different times—Deep Blue and so forth.
  • Cortana or a Siri
  • This pattern—of AI only working when there's what we call big data, but then using big data in order to not pay large numbers of people who are contributing—is a rising trend in our civilization, which is totally non-sustainable
    • David McGavock
       
      Key relationship between automation of tasks, downsides, and expectation for AI
  • If you talk about AI as a set of techniques, as a field of study in mathematics or engineering, it brings benefits. If we talk about AI as a mythology of creating a post-human species, it creates a series of problems that I've just gone over, which include acceptance of bad user interfaces, where you can't tell if you're being manipulated or not, and everything is ambiguous.
  • It creates incompetence, because you don't know whether recommendations are coming from anything real or just self-fulfilling prophecies from a manipulative system that spun off on its own, and economic negativity, because you're gradually pulling formal economic benefits away from the people who supply the data that makes the scheme work.
  • I'm going to give you two scenarios.
  • let's suppose somebody comes up with a way to 3-D print a little assassination drone that can go buzz around and kill somebody. Let's suppose that these are cheap to make.
  • Having said all that, let's address directly this problem of whether AI is going to destroy civilization and people, and take over the planet and everything.
  • some disaffected teenagers, or terrorists, or whoever start making a bunch of them, and they go out and start killing people randomly
  • This idea that some lab somewhere is making these autonomous algorithms that can take over the world is a way of avoiding the profoundly uncomfortable political problem, which is that if there's some actuator that can do harm, we have to figure out some way that people don't do harm with it.
    • David McGavock
       
      Another key - focus on the actuator, not the agent that exploits it.
  • the part that causes the problem is the actuator. It's the interface to physicality
  • not so much whether it's a bunch of teenagers or terrorists behind it or some AI
  • The sad fact is that, as a society, we have to do something to not have little killer drones proliferate.
  • What we don't have to worry about is the AI algorithm running them, because that's speculative.
  • another one where there's so-called artificial intelligence, some kind of big data scheme, that's doing exactly the same thing, that is self-directed and taking over 3-D printers, and sending these things off to kill people.
  • There's a whole other problem area that has to do with neuroscience, where if we pretend we understand things before we do, we do damage to science,
  • You have to be able to accept what your ignorances are in order to do good science. To reject your own ignorance just casts you into a silly state where you're a lesser scientist.
  • To my mind, the mythology around AI is a re-creation of some of the traditional ideas about religion, but applied to the technical world.
  • The notion of this particular threshold—which is sometimes called the singularity, or super-intelligence, or all sorts of different terms in different periods—is similar to divinity.
  • In the history of organized religion, it's often been the case that people have been disempowered precisely to serve what were perceived to be the needs of some deity or another, where in fact what they were doing was supporting an elite class that was the priesthood for that deity.
    • David McGavock
       
      Technical priesthood.
  • If AI means this mythology of this new creature we're creating, then it's just a stupid mess that's confusing everybody, and harming the future of the economy. If what we're talking about is a set of algorithms and actuators that we can improve and apply in useful ways, then I'm very interested, and I'm very much a participant in the community that's improving those things.
  • A lot of people in the religious world are just great, and I respect and like them. That goes hand-in-hand with my feeling that some of the mythology in big religion still leads us into trouble that we impose on ourselves and don't need.
  •  
    "The idea that computers are people has a long and storied history. It goes back to the very origins of computers, and even from before. There's always been a question about whether a program is something alive or not since it intrinsically has some kind of autonomy at the very least, or it wouldn't be a program. There has been a domineering subculture-that's been the most wealthy, prolific, and influential subculture in the technical world-that for a long time has not only promoted the idea that there's an equivalence between algorithms and life, and certain algorithms and people, but a historical determinism that we're inevitably making computers that will be smarter and better than us and will take over from us."