
New Media Ethics 2009 course: Group items tagged 'Brain'


Weiye Loh

LRB · Jim Holt · Smarter, Happier, More Productive - 0 views

  • There are two ways that computers might add to our wellbeing. First, they could do so indirectly, by increasing our ability to produce other goods and services. In this they have proved something of a disappointment. In the early 1970s, American businesses began to invest heavily in computer hardware and software, but for decades this enormous investment seemed to pay no dividends. As the economist Robert Solow put it in 1987, ‘You can see the computer age everywhere but in the productivity statistics.’ Perhaps too much time was wasted in training employees to use computers; perhaps the sorts of activity that computers make more efficient, like word processing, don’t really add all that much to productivity; perhaps information becomes less valuable when it’s more widely available. Whatever the case, it wasn’t until the late 1990s that some of the productivity gains promised by the computer-driven ‘new economy’ began to show up – in the United States, at any rate. So far, Europe appears to have missed out on them.
  • The other way computers could benefit us is more direct. They might make us smarter, or even happier. They promise to bring us such primary goods as pleasure, friendship, sex and knowledge. If some lotus-eating visionaries are to be believed, computers may even have a spiritual dimension: as they grow ever more powerful, they have the potential to become our ‘mind children’. At some point – the ‘singularity’ – in the not-so-distant future, we humans will merge with these silicon creatures, thereby transcending our biology and achieving immortality. It is all of this that Woody Allen is missing out on.
  • But there are also sceptics who maintain that computers are having the opposite effect on us: they are making us less happy, and perhaps even stupider. Among the first to raise this possibility was the American literary critic Sven Birkerts. In his book The Gutenberg Elegies (1994), Birkerts argued that the computer and other electronic media were destroying our capacity for ‘deep reading’. His writing students, thanks to their digital devices, had become mere skimmers and scanners and scrollers. They couldn’t lose themselves in a novel the way he could. This didn’t bode well, Birkerts thought, for the future of literary culture.
  • Suppose we found that computers are diminishing our capacity for certain pleasures, or making us worse off in other ways. Why couldn’t we simply spend less time in front of the screen and more time doing the things we used to do before computers came along – like burying our noses in novels? Well, it may be that computers are affecting us in a more insidious fashion than we realise. They may be reshaping our brains – and not for the better. That was the drift of ‘Is Google Making Us Stupid?’, a 2008 cover story by Nicholas Carr in the Atlantic.
  • Carr thinks that he was himself an unwitting victim of the computer’s mind-altering powers. Now in his early fifties, he describes his life as a ‘two-act play’, ‘Analogue Youth’ followed by ‘Digital Adulthood’. In 1986, five years out of college, he dismayed his wife by spending nearly all their savings on an early version of the Apple Mac. Soon afterwards, he says, he lost the ability to edit or revise on paper. Around 1990, he acquired a modem and an AOL subscription, which entitled him to spend five hours a week online sending email, visiting ‘chat rooms’ and reading old newspaper articles. It was around this time that the programmer Tim Berners-Lee wrote the code for the World Wide Web, which, in due course, Carr would be restlessly exploring with the aid of his new Netscape browser.
  • Carr launches into a brief history of brain science, which culminates in a discussion of ‘neuroplasticity’: the idea that experience affects the structure of the brain. Scientific orthodoxy used to hold that the adult brain was fixed and immutable: experience could alter the strengths of the connections among its neurons, it was believed, but not its overall architecture. By the late 1960s, however, striking evidence of brain plasticity began to emerge. In one series of experiments, researchers cut nerves in the hands of monkeys, and then, using microelectrode probes, observed that the monkeys’ brains reorganised themselves to compensate for the peripheral damage. Later, tests on people who had lost an arm or a leg revealed something similar: the brain areas that used to receive sensory input from the lost limbs seemed to get taken over by circuits that register sensations from other parts of the body (which may account for the ‘phantom limb’ phenomenon). Signs of brain plasticity have been observed in healthy people, too. Violinists, for instance, tend to have larger cortical areas devoted to processing signals from their fingering hands than do non-violinists. And brain scans of London cab drivers taken in the 1990s revealed that they had larger than normal posterior hippocampuses – a part of the brain that stores spatial representations – and that the increase in size was proportional to the number of years they had been in the job.
  • The brain’s ability to change its own structure, as Carr sees it, is nothing less than ‘a loophole for free thought and free will’. But, he hastens to add, ‘bad habits can be ingrained in our neurons as easily as good ones.’ Indeed, neuroplasticity has been invoked to explain depression, tinnitus, pornography addiction and masochistic self-mutilation (this last is supposedly a result of pain pathways getting rewired to the brain’s pleasure centres). Once new neural circuits become established in our brains, they demand to be fed, and they can hijack brain areas devoted to valuable mental skills. Thus, Carr writes: ‘The possibility of intellectual decay is inherent in the malleability of our brains.’ And the internet ‘delivers precisely the kind of sensory and cognitive stimuli – repetitive, intensive, interactive, addictive – that have been shown to result in strong and rapid alterations in brain circuits and functions’. He quotes the brain scientist Michael Merzenich, a pioneer of neuroplasticity and the man behind the monkey experiments in the 1960s, to the effect that the brain can be ‘massively remodelled’ by exposure to the internet and online tools like Google. ‘THEIR HEAVY USE HAS NEUROLOGICAL CONSEQUENCES,’ Merzenich warns in caps – in a blog post, no less.
  • It’s not that the web is making us less intelligent; if anything, the evidence suggests it sharpens more cognitive skills than it dulls. It’s not that the web is making us less happy, although there are certainly those who, like Carr, feel enslaved by its rhythms and cheated by the quality of its pleasures. It’s that the web may be an enemy of creativity. Which is why Woody Allen might be wise in avoiding it altogether.
  • The empirical support for Carr’s conclusion is both slim and equivocal. To begin with, there is evidence that web surfing can increase the capacity of working memory. And while some studies have indeed shown that ‘hypertexts’ impede retention – in a 2001 Canadian study, for instance, people who read a version of Elizabeth Bowen’s story ‘The Demon Lover’ festooned with clickable links took longer and reported more confusion about the plot than did those who read it in an old-fashioned ‘linear’ text – others have failed to substantiate this claim. No study has shown that internet use degrades the ability to learn from a book, though that doesn’t stop people feeling that this is so – one medical blogger quoted by Carr laments, ‘I can’t read War and Peace any more.’
Weiye Loh

BrainGate gives paralysed the power of mind control | Science | The Observer - 0 views

  • Brain-computer interface, or BCI, is a branch of science exploring how computers and the human brain can be meshed together. It sounds like science fiction (and can look like it too), but it is motivated by a desire to help chronically injured people. They include those who have lost limbs, people with Lou Gehrig's disease, or those who have been paralysed by severe spinal-cord injuries. But the group of people it might help the most are those whom medicine assumed were beyond all hope: sufferers of "locked-in syndrome".
  • These are often stroke victims whose perfectly healthy minds end up trapped inside bodies that can no longer move. The most famous example was French magazine editor Jean-Dominique Bauby who managed to dictate a memoir, The Diving Bell and the Butterfly, by blinking one eye. In the book, Bauby, who died in 1997 shortly after the book was published, described the prison his body had become for a mind that still worked normally.
  • Now the project is involved with a second set of human trials, pushing the technology to see how far it goes and trying to miniaturise it and make it wireless for a better fit in the brain. BrainGate's concept is simple. It posits that the problem for most patients does not lie in the parts of the brain that control movement, but with the fact that the pathways connecting the brain to the rest of the body, such as the spinal cord, have been broken. BrainGate plugs into the brain, picks up the right neural signals and beams them into a computer where they are translated into moving a cursor or controlling a computer keyboard. By this means, paralysed people can move a robot arm or drive their own wheelchair, just by thinking about it.
  • He and his team are decoding the language of the human brain. This language is made up of electronic signals fired by billions of neurons and it controls everything from our ability to move, to think, to remember and even our consciousness itself. Donoghue's genius was to develop a deceptively small device that can tap directly into the brain and pick up those signals for a computer to translate them. Gold wires are implanted into the brain's tissue at the motor cortex, which controls movement. Those wires feed back to a tiny array – an information storage device – attached to a "pedestal" in the skull. Another wire feeds from the array into a computer. A test subject with BrainGate looks as if they have a large plug coming out of the top of their head. Or, as Donoghue's son once described it, they resemble the "human batteries" in The Matrix.
  • BrainGate's highly advanced computer programs are able to decode the neuron signals picked up by the wires and translate them into the subject's desired movement. In crude terms, it is a form of mind-reading based on the idea that thinking about moving a cursor to the right will generate detectably different brain signals than thinking about moving it to the left.
  • The technology has developed rapidly, and last month BrainGate passed a vital milestone when one paralysed patient went past 1,000 days with the implant still in her brain and allowing her to move a computer cursor with her thoughts. The achievement, reported in the prestigious Journal of Neural Engineering, showed that the technology can continue to work inside the human body for unprecedented amounts of time.
  • Donoghue talks enthusiastically of one day hooking up BrainGate to a system of electronic stimulators plugged into the muscles of the arm or legs. That would open up the prospect of patients moving not just a cursor or their wheelchair, but their own bodies.
  • If Nagle's motor cortex was no longer working healthily, the entire BrainGate project could have been rendered pointless. But when Nagle was plugged in and asked to imagine moving his limbs, the signals beamed out with a healthy crackle. "We asked him to imagine moving his arm to the left and to the right and we could hear the activity," Donoghue says. When Nagle first moved a cursor on a screen using only his thoughts, he exclaimed: "Holy shit!"
  • BrainGate and other BCI projects have also piqued the interest of the government and the military. BCI is melding man and machine like no other sector of medicine or science and there are concerns about some of the implications. First, beyond detecting and translating simple movement commands, BrainGate may one day pave the way for mind-reading. A device to probe the innermost thoughts of captured prisoners or dissidents would prove very attractive to some future military or intelligence service. Second, there is the idea that BrainGate or other BCI technologies could pave the way for robot warriors controlled by distant humans using only their minds. At a conference in 2002, a senior American defence official, Anthony Tether, enthused over BCI. "Imagine a warrior with the intellect of a human and the immortality of a machine." Anyone who has seen Terminator might worry about that.
  • Donoghue acknowledges the concerns but has little time for them. When it comes to mind-reading, current BrainGate technology has enough trouble with translating commands for making a fist, let alone probing anyone's mental secrets.
  • As for robot warriors, Donoghue was slightly more circumspect. At the moment most BCI research, including BrainGate projects, that touch on the military is focused on working with prosthetic limbs for veterans who have lost arms and legs. But Donoghue thinks it is healthy for scientists to be aware of future issues. "As long as there is a rational dialogue and scientists think about where this is going and what is the reasonable use of the technology, then we are on a good path," he says.
  •  
    The robotic arm clutched a glass and swung it over a series of coloured dots that resembled a Twister gameboard. Behind it, a woman sat entirely immobile in a wheelchair. Slowly, the arm put the glass down, narrowly missing one of the dots. "She's doing that!" exclaims Professor John Donoghue, watching a video of the scene on his office computer - though the woman onscreen had not moved at all. "She actually has the arm under her control," he says, beaming with pride. "We told her to put the glass down on that dot." The woman, who is almost completely paralysed, was using Donoghue's groundbreaking technology to control the robot arm using only her thoughts. Called BrainGate, the device is implanted into her brain and hooked up to a computer to which she sends mental commands. The video played on, giving Donoghue, a silver-haired and neatly bearded man of 62, even more reason to feel pleased. The patient was not satisfied with her near miss and the robot arm lifted the glass again. After a brief hover, the arm positioned the glass on the dot.
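The decoding idea quoted in the annotations above (imagining a move to the left produces detectably different firing patterns from imagining a move to the right) is, at bottom, a classification problem. The sketch below is a toy illustration of that idea only, not BrainGate's actual algorithm; the channel counts, firing-rate model and decoder are invented for illustration. It simulates firing rates on a few hypothetical electrode channels and fits a linear decoder that maps them to a left/right cursor direction.

```python
# Toy sketch of neural cursor decoding (NOT BrainGate's actual algorithm).
# Assumption: each imagined direction (left/right) shifts the mean firing
# rate of a few recorded channels; a linear decoder can then separate them.
import numpy as np

rng = np.random.default_rng(0)
n_channels = 16   # hypothetical electrode channels
n_trials = 200    # trials per imagined direction

# Simulate firing rates (spikes/sec): each direction has its own mean pattern.
mean_left = rng.uniform(5, 20, n_channels)
mean_right = mean_left + rng.normal(0, 4, n_channels)  # direction-specific shift
X = np.vstack([
    rng.normal(mean_left, 3.0, (n_trials, n_channels)),
    rng.normal(mean_right, 3.0, (n_trials, n_channels)),
])
y = np.array([-1] * n_trials + [+1] * n_trials)  # -1 = left, +1 = right

# Fit a least-squares linear decoder: firing rates -> direction label.
X1 = np.hstack([X, np.ones((X.shape[0], 1))])  # add a bias column
w, *_ = np.linalg.lstsq(X1, y, rcond=None)

# Decode: the sign of the projection gives the cursor direction.
pred = np.sign(X1 @ w)
print(f"training accuracy: {np.mean(pred == y):.1%}")
```

Real systems decode continuous movement from spike counts in short time bins, usually with more sophisticated estimators, but the toy version shows why two distinct imagined movements are machine-separable at all.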
Weiye Loh

The internet: is it changing the way we think? | Technology | The Observer - 0 views

  • Every 50 years or so, American magazine the Atlantic lobs an intellectual grenade into our culture. In the summer of 1945, for example, it published an essay by the Massachusetts Institute of Technology (MIT) engineer Vannevar Bush entitled "As We May Think". It turned out to be the blueprint for what eventually emerged as the world wide web. Two summers ago, the Atlantic published an essay by Nicholas Carr, one of the blogosphere's most prominent (and thoughtful) contrarians, under the headline "Is Google Making Us Stupid?".
  • Carr wrote, "I've had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn't going – so far as I can tell – but it's changing. I'm not thinking the way I used to think. I can feel it most strongly when I'm reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument and I'd spend hours strolling through long stretches of prose. That's rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I'm always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle."
  • Carr's target was not really the world's leading search engine, but the impact that ubiquitous, always-on networking is having on our cognitive processes. His argument was that our deepening dependence on networking technology is indeed changing not only the way we think, but also the structure of our brains.
  • Carr's article touched a nerve and has provoked a lively, ongoing debate on the net and in print (he has now expanded it into a book, The Shallows: What the Internet Is Doing to Our Brains). This is partly because he's an engaging writer who has vividly articulated the unease that many adults feel about the way their modi operandi have changed in response to ubiquitous networking.
  • Who bothers to write down or memorise detailed information any more, for example, when they know that Google will always retrieve it if it's needed again? The web has become, in a way, a global prosthesis for our collective memory.
  • It is easy to dismiss Carr's concern as just the latest episode of the moral panic that always accompanies the arrival of a new communications technology. People fretted about printing, photography, the telephone and television in analogous ways. It even bothered Plato, who argued that the technology of writing would destroy the art of remembering.
  • Many commentators who accept the thrust of his argument seem not only untroubled by its far-reaching implications but are positively enthusiastic about them. When the Pew Research Centre's Internet & American Life project asked its panel of more than 370 internet experts for their reaction, 81% of them agreed with the proposition that "people's use of the internet has enhanced human intelligence".
  • As a writer, thinker, researcher and teacher, what I can attest to is that the internet is changing our habits of thinking, which isn't the same thing as changing our brains. The brain is like any other muscle – if you don't stretch it, it gets both stiff and flabby. But if you exercise it regularly, and cross-train, your brain will be flexible, quick, strong and versatile.
  • The internet is analogous to a weight-training machine for the brain, as compared with the free weights provided by libraries and books. Each method has its advantage, but used properly one works you harder. Weight machines are directive and enabling: they encourage you to think you've worked hard without necessarily challenging yourself. The internet can be the same: it often tells us what we think we know, spreading misinformation and nonsense while it's at it. It can substitute surface for depth, imitation for originality, and its passion for recycling would surpass the most committed environmentalist.
  • I've seen students' thinking habits change dramatically: if information is not immediately available via a Google search, students are often stymied. But of course what a Google search provides is not the best, wisest or most accurate answer, but the most popular one.
  • But knowledge is not the same thing as information, and there is no question to my mind that the access to raw information provided by the internet is unparalleled and democratising. Admittance to elite private university libraries and archives is no longer required, as they increasingly digitise their archives. We've all read the jeremiads that the internet sounds the death knell of reading, but people read online constantly – we just call it surfing now. What they are reading is changing, often for the worse; but it is also true that the internet increasingly provides a treasure trove of rare books, documents and images, and as long as we have free access to it, then the internet can certainly be a force for education and wisdom, and not just for lies, damned lies, and false statistics.
  • In the end, the medium is not the message, and the internet is just a medium, a repository and an archive. Its greatest virtue is also its greatest weakness: it is unselective. This means that it is undiscriminating, in both senses of the word. It is indiscriminate in its principles of inclusion: anything at all can get into it. But it also – at least so far – doesn't discriminate against anyone with access to it. This is changing rapidly, of course, as corporations and governments seek to exert control over it. Knowledge may not be the same thing as power, but it is unquestionably a means to power. The question is, will we use the internet's power for good, or for evil? The jury is very much out. The internet itself is disinterested: but what we use it for is not.
  •  
    The internet: is it changing the way we think? American writer Nicholas Carr's claim that the internet is not only shaping our lives but physically altering our brains has sparked a lively and ongoing debate, says John Naughton. Below, a selection of writers and experts offer their opinion.
Weiye Loh

Skepticblog » Kurzweil vs. Myers on Brain Complexity - 1 views

  • Kurzweil is still claiming that we can infer something about how much complexity is in the brain from the genome. He writes: "The amount of information in the genome (after lossless compression, which is feasible because of the massive redundancy in the genome) is about 50 million bytes (down from 800 million bytes in the uncompressed genome). It is true that the information in the genome goes through a complex route to create a brain, but the information in the genome constrains the amount of information in the brain prior to the brain’s interaction with its environment."
  •  
    There is an interesting blog debate going on between PZ Myers and Ray Kurzweil about the complexity of the brain - a topic that I too blog about and so I thought I would offer my thoughts. The "debate" started with a talk by Kurzweil at the Singularity Summit, a press summary of which prompted this response from PZ Myers. Kurzweil then responded here, and Myers responded to his response here.
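Kurzweil's byte figures can be sanity-checked with back-of-the-envelope arithmetic. The snippet below is an illustrative addition (the roughly 3.2 billion base-pair genome size is a standard estimate, not a number from Kurzweil's post): with four possible bases, each position in the sequence carries at most 2 bits.

```python
# Back-of-the-envelope check on the genome figures quoted above.
# Assumption (not from Kurzweil's post): ~3.2e9 base pairs, 2 bits per base.
base_pairs = 3.2e9
bits_per_base = 2  # log2(4): each base is one of A, C, G, T
raw_bytes = base_pairs * bits_per_base / 8
print(f"uncompressed genome: {raw_bytes / 1e6:.0f} million bytes")  # ~800

compressed_bytes = 50e6  # Kurzweil's post-compression figure
print(f"implied compression ratio: {raw_bytes / compressed_bytes:.0f}x")  # ~16x
```

The arithmetic reproduces the 800-million-byte figure. The contested step in the debate is not this number but the next one: whether 50 million bytes of generative specification usefully bounds the complexity of the brain that grows from it, which is what Myers disputes.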
Weiye Loh

Roger Pielke Jr.'s Blog: Science Impact - 0 views

  • The Guardian has a blog post up by three neuroscientists decrying the state of hype in the media related to their field, which is fueled in part by their colleagues seeking "impact." 
  • Anyone who has followed recent media reports that electrical brain stimulation "sparks bright ideas" or "unshackles the genius within" could be forgiven for believing that we stand on the frontier of a brave new world. As James Gallagher of the BBC put it, "Are we entering the era of the thinking cap – a device to supercharge our brains?" The answer, we would suggest, is a categorical no. Such speculations begin and end in the colourful realm of science fiction. But we are also in danger of entering the era of the "neuro-myth", where neuroscientists sensationalise and distort their own findings in the name of publicity. The tendency for scientists to over-egg the cake when dealing with the media is nothing new, but recent examples are striking in their disregard for accurate reporting to the public. We believe the media and academic community share a collective responsibility to prevent pseudoscience from masquerading as neuroscience.
  • They identify an "unacceptable gulf between, on the one hand, the evidence-bound conclusions reached in peer-reviewed scientific journals, and on the other, the heavy spin applied by scientists to achieve publicity in the media. Are we as neuroscientists so unskilled at communicating with the public, or so low in our estimation of the public's intelligence, that we see no alternative but to mislead and exaggerate?"
  • Somewhere down the line, achieving an impact in the media seems to have become the goal in itself, rather than what it should be: a way to inform and engage the public with clarity and objectivity, without bias or prejudice. Our obsession with impact is not one-sided. The craving of scientists for publicity is fuelled by a hurried and unquestioning media, an academic community that disproportionately rewards publication in "high impact" journals such as Nature, and by research councils that emphasise the importance of achieving "impact" while at the same time delivering funding cuts. Academics are now pushed to attend media training courses, instructed about "pathways to impact", required to include detailed "impact summaries" when applying for grant funding, and constantly reminded about the importance of media engagement to further their careers. Yet where in all of this strategising and careerism is it made clear why public engagement is important? Where is it emphasised that the most crucial consideration in our interactions with the media is that we are accurate, honest and open about the limitations of our research?
Weiye Loh

Being Bilingual May Boost Your Brain Power : NPR - 0 views

  • "There is absolutely no evidence that bilingual acquisition leads to confusion, and there is no evidence that bilingual acquisition leads to delay," she said.
  • No matter what language a person is speaking at the moment, both languages are active in the brain.
  • This means that bilinguals have to do something that monolinguals don't do — they have to keep the two languages separate. Bialystok likens it to tuning into the right signal on the radio or television: The brain has to keep the two channels separate and pay attention to only one.
  • Constantly engaging this executive control function is a form of mental exercise, explains Bialystok, and some researchers, including herself, believe that this can be beneficial for the brain. Bilingual speakers have been shown to perform better on a variety of cognitive tasks, and one study Bialystok did found that dementia set in four to five years later in people who spent their lives speaking two languages instead of one.
Weiye Loh

Paul Crowley's Blog - A survey of anti-cryonics writing - 0 views

  • To its advocates, cryonics offers almost eternal life. To its critics, cryonics is pseudoscience; the idea that we could freeze someone today in such a way that future technology might be able to re-animate them is nothing more than wishful thinking born of the desire to avoid death. Many who battle nonsense dressed as science have spoken out against it: see for example Nano Nonsense and Cryonics, a 2001 article by celebrated skeptic Michael Shermer; or check the Skeptic’s Dictionary or Quackwatch entries on the subject, or for more detail read the essay Cryonics–A futile desire for everlasting life by “Invisible Flan”.
  • And of course the pro-cryonics people have written reams and reams of material such as Ben Best’s Scientific Justification of Cryonics Practice on why they think this is more plausible than I might think, going into tremendous technical detail setting out arguments for its plausibility and addressing particular difficulties. It’s almost enough to make you want to sign up on the spot. Except, of course, that plenty of totally unscientific ideas are backed by reams of scientific-sounding documents good enough to fool non-experts like me. Backed by the deep pockets of the oil industry, global warming denialism has produced thousands of convincing-sounding arguments against the scientific consensus on CO2 and AGW.
  • Nano Nonsense and Cryonics goes for the nitty-gritty right away in the opening paragraph: “To see the flaw in this system, thaw out a can of frozen strawberries. During freezing, the water within each cell expands, crystallizes, and ruptures the cell membranes. When defrosted, all the intracellular goo oozes out, turning your strawberries into runny mush. This is your brain on cryonics.” This sounds convincing, but doesn’t address what cryonicists actually claim. Ben Best, President and CEO of the Cryonics Institute, replies in the comments: “Strawberries (and mammalian tissues) are not turned to mush by freezing because water expands and crystallizes inside the cells. Water crystallizes in the extracellular space because more nucleators are found extracellularly. As water crystallizes in the extracellular space, the extracellular salt concentration increases, causing cells to lose water osmotically and shrink. Ultimately the cell membranes are broken by crushing from extracellular ice and/or high extracellular salt concentration. […] Cryonics organizations use vitrification perfusion before cooling to cryogenic temperatures. With good brain perfusion, vitrification can reduce ice formation to negligible amounts.”
  • The Skeptic’s Dictionary entry is no advance. Again, it refers erroneously to a “mushy brain”. It points out that the technology to reanimate those in storage does not already exist, but provides no help for us non-experts in assessing whether it is a plausible future technology, like super-fast computers or fusion power, or whether it is as crazy as the sand-powered tank; it simply asserts baldly and to me counterintuitively that it is the latter. Again, perhaps cryonic reanimation is a sand-powered tank, but I can explain to you why a sand-powered tank is implausible if you don’t already know, and if cryonics is in the same league I’d appreciate hearing the explanation.
  • Another part of the article points out the well-known difficulties with whole-body freezing — because the focus is on achieving the best possible preservation of the brain, other parts suffer more. But the reason why the brain is the focus is that you can afford to be a lot bolder in repairing other parts of the body — unlike the brain, if my liver doesn’t survive the freezing, it can be replaced altogether.
  • Further, the article ignores one of the most promising possibilities for reanimation, that of scanning and whole-brain emulation, a route that requires some big advances in computer and scanning technology as well as our understanding of the lowest levels of the brain’s function, but which completely sidesteps any problems with repairing either damage from the freezing process or whatever it was that led to legal death.
  • Sixteen years later, it seems that hasn’t changed; in fact, as far as the issue of technical feasibility goes it is starting to look as if on all the Earth, or at least all the Internet, there is not one person who has ever taken the time to read and understand cryonics claims in any detail, still considers it pseudoscience, and has written a paper, article or even a blog post to rebut anything that cryonics advocates actually say. In fact, the best of the comments on my first blog post on the subject are already a higher standard than anything my searches have turned up.
  • I don’t have anything useful to add, I just wanted to say that I feel exactly as you do about cryonics and living forever. And I thought that this statement: I know that I don’t know enough to judge. shows extreme wisdom. If only people wishing to comment on global warming would apply the same test.
  • WRT global warming, the mistake people make is trying to go direct to the first-order evidence, which is much too complicated and too easy to misrepresent to hope to directly interpret unless you make it your life’s work, and even then only in a particular area. The correct thing to do is to collect second-order evidence, such as that every major scientific academy has backed the IPCC.
    • Weiye Loh: First-order evidence vs second-order evidence...
Weiye Loh

New Study Shows EMF Effect On Brain - So What? « Health « Skeptic North - 0 views

  • In the past 6 months, Skeptic North has run several articles about WiFi, cell phones and the purported health effects that radio-frequency electro-magnetic fields (RF-EMF) may or may not have on the brain and body. Despite the overwhelming evidence to the contrary, the controversy remains a popular story in the media, perhaps because of the ubiquitous nature of cellular technology and the popular meme of the hidden dangers of modern life.
  • All over the news this week were reports of a new study purporting to show a link between cell phone EMF radiation and increased brain metabolism. The study was conducted by a well-respected group of NIH researchers conducting tests at the Brookhaven National Laboratory in the US. The study is available in abstract for free here, but you have to pay to have access to the full text version. Reporters all over network news told a simplified story of the paper by Volkow et al, published in the Journal of the American Medical Association, and drew conclusions outside the scope of the study, inflaming an already overheated debate.
  • I and others have continued to insist that there is no good evidence of any effect by the microwave radiation emitted by cell phones, cell towers and other electronics on the brain and other body systems, let alone that this radiation causes cancer or other serious illnesses. The syndrome referred to as electro-hypersensitivity remains un-proved and unfounded, but it seems like there is new evidence that shows that the normal levels of radiation emitted by a cell phone can affect neuronal cells in the brain.
  •  
    This study is what it is: another thread in the fine woven cloth of our knowledge of nature. It is not a panacea, nor does it refute any evidence that there is or is not a risk of health effects from cell phone use, nor does it show a mechanism of how EMF could alter biological systems. Anybody who purports that it does is basing this conclusion on ideology, not science.
Weiye Loh

Your Brain on Computers - Attached to Technology and Paying a Price - NYTimes.com - 0 views

  • The message had slipped by him amid an electronic flood: two computer screens alive with e-mail, instant messages, online chats, a Web browser and the computer code he was writing.
  • Even after he unplugs, he craves the stimulation he gets from his electronic gadgets. He forgets things like dinner plans, and he has trouble focusing on his family.
  • “It seems like he can no longer be fully in the moment.”
  • Scientists say juggling e-mail, phone calls and other incoming information can change how people think and behave. They say our ability to focus is being undermined by bursts of information.
  • These play to a primitive impulse to respond to immediate opportunities and threats. The stimulation provokes excitement — a dopamine squirt — that researchers say can be addictive.
  • While many people say multitasking makes them more productive, research shows otherwise. Heavy multitaskers actually have more trouble focusing and shutting out irrelevant information, scientists say, and they experience more stress.
  • Even after the multitasking ends, fractured thinking and lack of focus persist. In other words, this is also your brain off computers.
  •  
    Your Brain on Computers: Hooked on Gadgets, and Paying a Mental Price
Weiye Loh

It's Only A Theory: From the 2010 APA in Boston: Neuropsychology and ethics - 0 views

  • Joshua Greene from Harvard, known for his research on "neuroethics," the neurological underpinnings of ethical decision making in humans. The title of Greene's talk was "Beyond point-and-shoot morality: why cognitive neuroscience matters for ethics."
  • What Greene is interested in is finding out which factors moral judgment is sensitive to, and whether it is sensitive to the relevant factors. He presented his dual process theory of morality. In this respect, he proposed an analogy with a camera. Cameras have automatic (point and shoot) settings as well as manual controls. The first mode is good enough for most purposes; the second allows the user to fine tune the settings more carefully. The two modes allow for a nice combination of efficiency and flexibility.
  • The idea is that the human brain also has two modes, a set of efficient automatic responses and a manual mode that makes us more flexible in response to non-standard situations. A non-moral example is our response to potential threats. Here the amygdala is very fast and efficient at focusing on potential threats (e.g., the outline of eyes in the dark), even when there actually is no threat (it's a controlled experiment in a lab, no lurking predator around).
  • Delayed gratification illustrates the interaction between the two modes. The brain is attracted by immediate rewards, no matter what kind. However, when larger rewards are eventually going to become available, other parts of the brain come into play to override (sometimes) the immediate urge.
  • Greene's research shows that our automatic setting is "Kantian," meaning that our intuitive responses are deontological, rule driven. The manual setting, on the other hand, tends to be more utilitarian / consequentialist. Accordingly, the first mode involves emotional areas of the brain, the second one involves more cognitive areas.
  • The evidence comes from the (in)famous trolley dilemma and its many variations.
  • When people refuse to intervene in the footbridge (as opposed to the lever) version of the dilemma, they do so because of a strong emotional response, which contradicts the otherwise utilitarian calculus they make when considering the lever version.
  • Psychopaths turn out to be more utilitarian than normal subjects - presumably not because consequentialism is inherently pathological, but because their emotional responses are stunted. Mood also affects the results, with people exposed to comedy (to enhance mood), for instance, more likely to say that it is okay to push the guy off the footbridge.
  • In a more recent experiment, subjects were asked to say which action carried the better consequences, which made them feel worse, and which was overall morally acceptable. The idea was to separate the cognitive, emotional and integrative aspects of moral decision making. Predictably, activity in the amygdala correlated with deontological judgment, activity in more cognitive areas was associated with utilitarianism, and different brain regions became involved in integrating the two.
  • Another recent experiment used visual vs. verbal descriptions of moral dilemmas. Turns out that more visual people tend to behave emotionally / deontologically, while more verbal people are more utilitarian.
  • Studies show that interfering with moral judgment by engaging subjects with a cognitive task slows down (though it does not reverse) utilitarian judgment, but has no effect on deontological judgment. Again, this is in agreement with the conclusion that the former mode is the result of cognition, the latter of emotion.
  • Nice to know, by the way, that when experimenters controlled for "real world expectations" that people have about trolleys, or when they used more realistic scenarios than trolleys and bridges, the results don't vary. In other words, trolley thought experiments are actually informative, contrary to popular criticisms.
  • What factors affect people's decision making in moral judgment? The main one is proximity, with people feeling much stronger obligations if they are present to the event posing the dilemma, or even relatively near (a disaster happens in a nearby country), as opposed to when they are far (a country on the other side of the world).
  • Greene's general conclusion is that neuroscience matters to ethics because it reveals the hidden mechanisms of human moral decision making. However, he says this is interesting to philosophers because it may lead one to question ethical theories that are implicitly or explicitly based on such judgments. But neither philosophical deontology nor consequentialism are in fact based on common moral judgments, seems to me. They are the result of explicit analysis. (Though Greene raises the possibility that some philosophers engage in rationalizing, rather than reasoning, as in Kant's famously convoluted idea that masturbation is wrong because one is using oneself as a means to an end...)
  • This is not to say that understanding moral decision making in humans isn't interesting or in fact even helpful in real life cases. An example of the latter is the common moral condemnation of incest, which is an emotional reaction that probably evolved to avoid genetically diseased offspring. It follows that science can tell us that there is nothing morally wrong in cases of incest when precautions have been taken to avoid pregnancy (and assuming psychological reactions are also accounted for). Greene puts this in terms of science helping us to transform difficult ought questions into easier ought questions.
Weiye Loh

The Creativity Crisis - Newsweek - 0 views

  • The accepted definition of creativity is production of something original and useful, and that’s what’s reflected in the tests. There is never one right answer. To be creative requires divergent thinking (generating many unique ideas) and then convergent thinking (combining those ideas into the best result).
  • Torrance’s tasks, which have become the gold standard in creativity assessment, measure creativity perfectly. What’s shocking is how incredibly well Torrance’s creativity index predicted those kids’ creative accomplishments as adults.
  • The correlation to lifetime creative accomplishment was more than three times stronger for childhood creativity than childhood IQ.
  • There is one crucial difference between IQ and CQ scores. With intelligence, there is a phenomenon called the Flynn effect—each generation, scores go up about 10 points. Enriched environments are making kids smarter. With creativity, a reverse trend has just been identified and is being reported for the first time here: American creativity scores are falling.
  • Creativity scores had been steadily rising, just like IQ scores, until 1990. Since then, creativity scores have consistently inched downward.
  • It is the scores of younger children in America—from kindergarten through sixth grade—for whom the decline is “most serious.”
  • It’s too early to determine conclusively why U.S. creativity scores are declining. One likely culprit is the number of hours kids now spend in front of the TV and playing videogames rather than engaging in creative activities. Another is the lack of creativity development in our schools. In effect, it’s left to the luck of the draw who becomes creative: there’s no concerted effort to nurture the creativity of all children.
  • Around the world, though, other countries are making creativity development a national priority.
  • In China there has been widespread education reform to extinguish the drill-and-kill teaching style. Instead, Chinese schools are adopting a problem-based learning approach.
  • When faculty of a major Chinese university asked Plucker to identify trends in American education, he described our focus on standardized curriculum, rote memorization, and nationalized testing.
  • Overwhelmed by curriculum standards, American teachers warn there’s no room in the day for a creativity class.
  • The age-old belief that the arts have a special claim to creativity is unfounded. When scholars gave creativity tasks to both engineering majors and music majors, their scores lay along an identical spectrum, with the same high averages and standard deviations.
  • The argument that we can’t teach creativity because kids already have too much to learn is a false trade-off. Creativity isn’t about freedom from concrete facts. Rather, fact-finding and deep research are vital stages in the creative process.
  • The lore of pop psychology is that creativity occurs on the right side of the brain. But we now know that if you tried to be creative using only the right side of your brain, it’d be like living with ideas perpetually at the tip of your tongue, just beyond reach.
  • Creativity requires constant shifting, blender pulses of both divergent thinking and convergent thinking, to combine new information with old and forgotten ideas. Highly creative people are very good at marshaling their brains into bilateral mode, and the more creative they are, the more they dual-activate.
  • “Creativity can be taught,” says James C. Kaufman, professor at California State University, San Bernardino. What’s common about successful programs is they alternate maximum divergent thinking with bouts of intense convergent thinking, through several stages. Real improvement doesn’t happen in a weekend workshop. But when applied to the everyday process of work or school, brain function improves.
  • Highly creative adults tended to grow up in families embodying opposites. Parents encouraged uniqueness, yet provided stability. They were highly responsive to kids’ needs, yet challenged kids to develop skills. This resulted in a sort of adaptability: in times of anxiousness, clear rules could reduce chaos—yet when kids were bored, they could seek change, too. In the space between anxiety and boredom was where creativity flourished.
  • Highly creative adults frequently grew up with hardship. Hardship by itself doesn’t lead to creativity, but it does force kids to become more flexible—and flexibility helps with creativity.
  • In early childhood, distinct types of free play are associated with high creativity. Preschoolers who spend more time in role-play (acting out characters) have higher measures of creativity: voicing someone else’s point of view helps develop their ability to analyze situations from different perspectives. When playing alone, highly creative first graders may act out strong negative emotions: they’ll be angry, hostile, anguished.
  • In middle childhood, kids sometimes create paracosms—fantasies of entire alternative worlds. Kids revisit their paracosms repeatedly, sometimes for months, and even create languages spoken there. This type of play peaks at age 9 or 10, and it’s a very strong sign of future creativity.
  • From fourth grade on, creativity no longer occurs in a vacuum; researching and studying become an integral part of coming up with useful solutions. But this transition isn’t easy. As school stuffs more complex information into their heads, kids get overloaded, and creativity suffers. When creative children have a supportive teacher—someone tolerant of unconventional answers, occasional disruptions, or detours of curiosity—they tend to excel. When they don’t, they tend to underperform and drop out of high school or don’t finish college at high rates.
  • They’re quitting because they’re discouraged and bored, not because they’re dark, depressed, anxious, or neurotic. It’s a myth that creative people have these traits. (Those traits actually shut down creativity; they make people less open to experience and less interested in novelty.) Rather, creative people, for the most part, exhibit active moods and positive affect. They’re not particularly happy—contentment is a kind of complacency creative people rarely have. But they’re engaged, motivated, and open to the world.
  • A similar study of 1,500 middle schoolers found that those high in creative self-efficacy had more confidence about their future and ability to succeed. They were sure that their ability to come up with alternatives would aid them, no matter what problems would arise.
  •  
    The Creativity Crisis: For the first time, research shows that American creativity is declining. What went wrong, and how we can fix it.
Weiye Loh

Inside the Mind of a Psychopath: Scientific American - 0 views

  • Aided by EEGs and brain scans, scientists have discovered that psychopaths possess significant impairments that affect their ability to feel emotions, read other people’s cues and learn from their mistakes.
  • These deficiencies may be apparent in children who are as young as five years old.
  • When you tally trials, prison stays and inflicted damage, psychopaths cost us $250 billion to $400 billion a year.
  • Psychopaths have traditionally been considered untreatable, but novel forms of therapy show promise.
  •  
    Neuroscientists are discovering that some of the most cold-blooded killers aren't bad. They suffer from a brain abnormality that sets them adrift in an emotionless world.
Weiye Loh

Twitter, Facebook Won't Make You Immoral - But TV News Might | Wired Science | Wired.com - 1 views

  • It’s too soon to say that Twitter and Facebook destroy the mental foundations of morality, but not too soon to ask what they’re doing.
  • In the paper, published Monday in the Proceedings of the National Academy of Sciences, 13 people were shown documentary-style multimedia narratives designed to arouse empathy. Researchers recorded their brain activity and found that empathy is as deeply rooted in the human psyche as fear and anger.
  • They also noticed that empathic brain systems took an average of six to eight seconds to start up. The researchers didn’t connect this to media consumption habits, but the study’s press release fueled speculation that the Facebook generation could turn into sociopaths.
  • Entitled "Can Twitter Make You Amoral? Rapid-fire Media May Confuse Your Moral Compass," it claimed that the research "raises questions about the emotional cost — particularly for the developing brain — of heavy reliance on a rapid stream of news snippets obtained through television, online feeds or social networks such as Twitter."
  • Compared to in-depth news coverage, first-person Tweets of on-the-ground events, such as the 2008 Mumbai bombings, are generally unmoving. But in those situations, Twitter’s primary use is in gathering useful, immediate facts, not storytelling.
  • Most people who read a handful of words about a friend’s heartache, or see a link to a tragic story, would likely follow it up. But with video news stories, the possibility of a short-circuited neurobiology of compassion becomes more real. Research suggests that people are far more empathic when stories are told in a linear way, without quick shot-to-shot edits. In a 1996 Empirical Studies of the Arts paper, researchers showed three versions of an ostensibly tear-jerking story to 120 test subjects. "Subjects had significantly more favorable impressions of the victimized female protagonist than of her male opponent only when the story structure was linear," they concluded.
  • A review of tabloid news formats in the Journal of Broadcasting & Electronic Media found that jarring, rapid-fire visual storytelling produced a physiological arousal that led to better recall of what was seen, but only if the original subject matter was dull. If it was already arousing, tabloid storytelling appeared to produce a cognitive overload that actually prevented stories from sinking in.
  • "Quick cuts will draw and retain a viewer’s focus even if the content is uninteresting," said freelance video producer Jill Bauerle. "MTV-like jump cuts, which have become the standard for many editors, serve as a sort of eye candy to keep eyeballs peeled to screen."
  • If compassion can only be activated by sustained attention, which is prevented by fast-cut editing, then the ability to be genuinely moved by another’s story could atrophy. It might even fail to properly develop in children, whose brains are being formed in ways that will last a lifetime. More research is clearly needed, including a replication of the original empathy findings, but the hypothesis is plausible.
Weiye Loh

Expectations can cancel out the benefit of pain drugs - 0 views

  • People who don't believe their pain medicine will work can actually reduce or even cancel out the effectiveness of the drug, and images of their brains show how they are doing it, scientists said.
  • Researchers from Britain and Germany used brain scans to map how a person's feelings and past experiences can influence the effectiveness of medicines, and found that a powerful painkilling drug with a true biological effect can appear not to be working if a patient has been primed to expect it to fail.
  • By contrast, positive expectations about the treatment doubled the natural physiological or biochemical effect of an opioid drug among 22 healthy volunteers in the study.
  • "The brain imaging is telling us that patients really are switching on and off parts of their brains through the mechanisms of expectation -- positive and negative," said Irene Tracy of Britain's Oxford University, who led the research. "(The effect of expectations) is powerful enough to give real added benefits of the drug, and unfortunately it is also very capable of overriding the true analgesic effect." The placebo effect is the real benefit seen when patients are given dummy treatments but believe they will do them good. The nocebo effect is the opposite, when patients get real negative effects when they have doubts about a treatment.
  • For their study, the scientists used the drug remifentanil, a potent ultra short-acting synthetic opioid painkiller which is marketed by drugmakers GlaxoSmithKline and Abbott as Ultiva. The study was published in the Science Translational Medicine journal on Wednesday. Volunteers were put in an MRI scanner and had heat applied to one leg. They were asked to rate pain on a 1 to 100 scale. Unknown to the volunteers, the researchers started giving the drug via infusion to see what effects there would be when the volunteers had no knowledge or expectation of treatment. The average initial pain rating of 66 went down to 55. The volunteers were then told they would now start to get the drug, although no change was actually made and they just continued receiving the opioid at the same dose. The average pain ratings dropped further to 39. The volunteers were then told the drug had been stopped and warned that there may be an increase in pain. In reality, the drug was still being given at the same dose, but their pain intensity increased to 64 -- meaning the pain was almost as bad as it had been at the beginning, before they had had any drug.
  • Tracey said there may be lessons for the design of clinical trials, which often compare an experimental drug against a dummy pill to see if there is any effect beyond the placebo effect. "We should control for the effect of people's expectations on the results of any clinical trial," she said. "At the very least we should make sure we minimize any negative expectations to make sure we're not masking true efficacy in a trial drug."
Weiye Loh

Being Bilingual: Beneficial Workout for the Brain - Research - The Chronicle of Higher ... - 0 views

  • "Bilingual babies pay attention to visual information whether it is specific to their language or not," said Janet F. Werker, director of the Infant Studies Centre at the University of British Columbia.
  •  
    In the latest research, described Friday at the American Association for the Advancement of Science, the onset of the symptoms of Alzheimer's disease was delayed by more than four years in elderly bilingual adults, even though they had identical brain damage compared with a group of adults in the study who spoke only one language. "It's not that being bilingual prevents Alzheimer's," said Ellen Bialystok, a professor of psychology at York University, in Toronto. "It's just that you are better able to cope."
Weiye Loh

A Clockwork Chemistry - Guy Kahane - Project Syndicate - 0 views

  •  
    Over the past decade, an army of psychologists, neuroscientists, and evolutionary biologists has been busy trying to uncover the neural "clockwork" that underlies human morality. They have started to trace the evolutionary origins of pro-social sentiments such as empathy, and have begun to uncover the genes that dispose some individuals to senseless violence and others to acts of altruism, and the pathways in our brain that shape our ethical decisions. And to understand how something works is also to begin to see ways to modify and even control it. Indeed, scientists have not only identified some of the brain pathways that shape our ethical decisions, but also chemical substances that modulate this neural activity.
Weiye Loh

How facts backfire - The Boston Globe - 0 views

  • A few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.
  • How facts backfire. Researchers discover a surprising threat to democracy: our brains. By Joe Keohane, July 11, 2010.
Weiye Loh

Rationally Speaking: The sorry state of higher education - 0 views

  • two disconcerting articles crossed my computer screen, both highlighting the increasingly sorry state of higher education, though from very different perspectives. The first is "Ed Dante's" (a pseudonym) piece in the Chronicle of Higher Education, entitled The Shadow Scholar. The second is Gregory Petsko's A Faustian Bargain, published, of all places, in Genome Biology.
  • There is much to be learned by educators from the Shadow Scholar piece, except the moral that "Dante" would like us to take from it. The anonymous author writes: "Pointing the finger at me is too easy. Why does my business thrive? Why do so many students prefer to cheat rather than do their own work? Say what you want about me, but I am not the reason your students cheat."
  • The point is that plagiarism and cheating happen for a variety of reasons, one of which is the existence of people like Mr. Dante and his company, who have set up a business that is clearly unethical and should be illegal. So pointing fingers at him and his ilk is perfectly reasonable. Yes, there obviously is a "market" for cheating in higher education, and there are complex reasons for it, but he is in a position similar to that of the drug dealer who insists that he is simply supplying a commodity to satisfy society's demand. That is much too easy a way out, and one that doesn't fly in the case of drug dealers and shouldn't fly in the case of ghost cheaters.
  • As a teacher at the City University of New York, I am constantly aware of the possibility that my students might cheat on their tests. I do take some elementary precautionary steps
  • Still, my job is not that of a policeman. My students are adults who are, in theory, there to learn. If they don't value that learning and prefer to pay someone else to fake it, so be it; ultimately it is they who lose, in the most fundamental sense of the term. Just like drug addicts, to return to my earlier metaphor. And just as in that other case, it is enablers like Mr. Dante who simply can't duck the moral blame.
  • An open letter to the president of SUNY-Albany, penned by molecular biologist Gregory Petsko. The SUNY-Albany president has recently announced the closing — for budgetary reasons — of the departments of French, Italian, Classics, Russian and Theater Arts at his university.
  • Petsko begins by taking on one of the alleged reasons why SUNY-Albany is slashing the humanities: low enrollment. He correctly points out that the problem could be solved overnight with the stroke of a pen: stop abdicating your responsibilities as educators and actually put constraints on what students have to take in order to graduate. Make courses in English literature, foreign languages, philosophy and critical thinking, and the arts either mandatory or part of a small set of options from which students must choose in order to graduate.
  • But, you might say, that's cheating the market! Students clearly don't want to take those courses, and a business should cater to its customers. That type of reasoning is among the most pernicious and idiotic I've ever heard. Students are not clients (if anything, their parents, who usually pay the tuition, are); they are not shopping for a new bag or pair of shoes. They do not know what is best for them educationally; that's why they go to college to begin with. If you are not convinced of how absurd the students-as-clients argument is, consider an analogy: does anyone with functioning brain cells argue that since patients in a hospital pay a bill, they should be dictating how the brain surgeon operates? I didn't think so.
  • Petsko then tackles the second lame excuse given by the president of SUNY-Albany (one common among the upper administrations of plenty of public universities): I can't do otherwise because of the legislature's draconian cuts. Except that university budgets are simply too complicated for there to be no other option. I know this first hand: I'm on a special committee at my own college looking at how to deal creatively with budget cuts handed down to us by the very same (admittedly small-minded and dysfunctional) New York state legislature that prompted SUNY-Albany's action. As Petsko points out, the president there didn't even think of involving the faculty and staff in a broad discussion of how to deal with the crisis; he simply announced the cuts on a Friday afternoon and then ran for cover. An example of very poor leadership, to say the least, and downright hypocrisy considering all the talk the same administrator has been dishing out about the university "community."
  • Finally, there is the argument that the humanities don't pay their own way, unlike (some of) the sciences (some of the time). That is indubitably true, but irrelevant. Universities are not businesses; they are places of higher learning. Yes, of course they need to deal with budgets, fundraising and all the rest. But the financial and administrative side has one goal and one goal only: to provide the best education to the students who attend the university.
  • That education simply must include the sciences, philosophy, literature, and the arts, as well as more technical or pragmatic offerings such as medicine, business and law. Why? Because that’s the kind of liberal education that makes for an informed and intelligent citizenry, without which our democracy is but empty talk, and our lives nothing but slavery to the marketplace.
  • Maybe this is not how education works in the US. I thought that general (or compulsory) education (i.e., up to high school) is designed to make sure that citizens of a democratic country can perform their civic duties. A balanced and well-rounded education, which includes a healthy mixture of science and humanities, is indeed very important for this purpose. However, college-level education is for personal growth, and therefore the person must have a large say in what kind of classes he or she chooses to take. I am disturbed by Massimo's hospital analogy. Students are not ill. They don't go to college to be cured, or to be made good citizens. They go to college to learn things that *they* want to learn. Patients are passive. Students are not. I agree that students typically do not know what kind of education is good for them. But who does?
  • students do have a say in their education: they pick their major, and there are electives. But I object to the idea that they can customize their major any way they want. That assumes they know what the best education for them is; they don't. That's the point of education.
  • The students are in your class to get a good grade; any learning that takes place is purely incidental. Those good grades will look good on their transcripts and might convince a future employer that they are smart and thus worth paying more.
  • I don't know what the dollar to GPA exchange rate is these days, but I don't doubt that there is one.
  • Just how many of your students do you think will remember the extensive, complex jargon of philosophy more than a couple of months after they leave your classroom?
  • "And our lives nothing but slavery to the marketplace." We are there. Welcome. Where have you been all this time? In a capitalistic/plutocratic society, money is power (and free speech too, according to the Supreme Court). Money means a larger/better house/car/clothing/vacation than your neighbor's, and consequently better mating opportunities. You can mostly blame the women for that one, I think, just like the peacock's tail.
  • If a student of surgery fails to learn, they might maim, kill or cripple someone. If a student of aircraft engineering fails to learn, they might design a faulty airplane that crashes and kills people. If a student of chemistry fails to learn, they might design a faulty drug with unintended and unfortunate side effects. But what exactly would be the harm if a student of philosophy fails to learn what Aristotle had to say about the elements, or what Plato had to say about perfect forms? These things are so divorced from people's everyday activities as to be rendered all but meaningless.
  • human knowledge grows by leaps and bounds every day, but human brain capacity does not, so the portion of human knowledge you can personally hold gets smaller by the minute. Learn (and remember) as much as you can as fast as you can and you will still lose ground. You certainly have your work cut out for you emphasizing the importance of Thales in the Age of Twitter and whatever follows it next year.
Weiye Loh

Meet the Ethical Placebo: A Story that Heals | NeuroTribes - 0 views

  • In modern medicine, placebos are associated with another form of deception — a kind that has long been thought essential for conducting randomized clinical trials of new drugs, the statistical rock upon which the global pharmaceutical industry was built. One group of volunteers in an RCT gets the novel medication; another group (the “control” group) gets pills or capsules that look identical to the allegedly active drug, but contain only an inert substance like milk sugar. These faux drugs are called placebos.
  • Inevitably, the health of some people in both groups improves, while the health of others grows worse. Symptoms of illness fluctuate for all sorts of reasons, including simple regression to the mean (see the short simulation after these notes).
  • Since the goal of an RCT, from Big Pharma's perspective, is to demonstrate the effectiveness of a new drug, the return to robust health of a volunteer in the control group is considered a statistical distraction. If too many people in the trial get better after downing sugar pills, the real drug will look worse by comparison — sometimes fatally so for the purpose of earning approval from the Food and Drug Administration.
  • For a complex and somewhat mysterious set of reasons, it is becoming increasingly difficult for experimental drugs to prove their superiority to sugar pills in RCTs.
  • Only in recent years, however, has it become obvious that the abatement of symptoms in control-group volunteers — the so-called placebo effect — is worthy of study outside the context of drug trials, and is in fact profoundly good news to anyone but investors in Pfizer, Roche, and GlaxoSmithKline.
  • The emerging field of placebo research has revealed that the body’s repertoire of resilience contains a powerful self-healing network that can help reduce pain and inflammation, lower the production of stress chemicals like cortisol, and even tame high blood pressure and the tremors of Parkinson’s disease.
  • more and more studies each year — by researchers like Fabrizio Benedetti at the University of Turin, author of a superb new book called The Patient's Brain, and neuroscientist Tor Wager at the University of Colorado — demonstrate that the placebo effect might be useful in treating a wide range of ills. Then why aren't doctors supposed to use it?
  • The medical establishment’s ethical problem with placebo treatment boils down to the notion that for fake drugs to be effective, doctors must lie to their patients. It has been widely assumed that if a patient discovers that he or she is taking a placebo, the mind/body password will no longer unlock the network, and the magic pills will cease to do their job.
  • For “Placebos Without Deception,” the researchers tracked the health of 80 volunteers with irritable bowel syndrome for three weeks as half of them took placebos and the other half didn’t.
  • In a previous study published in the British Medical Journal in 2008, Kaptchuk and Kirsch demonstrated that placebo treatment can be highly effective for alleviating the symptoms of IBS. This time, however, instead of the trial being “blinded,” it was “open.” That is, the volunteers in the placebo group knew that they were getting only inert pills — which they were instructed to take religiously, twice a day. They were also informed that, just as Ivan Pavlov trained his dogs to drool at the sound of a bell, the body could be trained to activate its own built-in healing network by the act of swallowing a pill.
  • In other words, in addition to the bogus medication, the volunteers were given a true story — the story of the placebo effect. They also received the care and attention of clinicians, which have been found in many other studies to be crucial for eliciting placebo effects. The combination of the story and a supportive clinical environment were enough to prevail over the knowledge that there was really nothing in the pills. People in the placebo arm of the trial got better — clinically, measurably, significantly better — on standard scales of symptom severity and overall quality of life. In fact, the volunteers in the placebo group experienced improvement comparable to patients taking a drug called alosetron, the standard of care for IBS. Meet the ethical placebo: a powerfully effective faux medication that meets all the standards of informed consent.
  • The study is hardly the last word on the subject, but more like one of the first. Its modest sample size and brief duration leave plenty of room for followup research. (What if “ethical” placebos wear off more quickly than deceptive ones? Does the fact that most of the volunteers in this study were women have any bearing on the outcome? Were any of the volunteers skeptical that the placebo effect is real, and did that affect their response to treatment?) Before some eager editor out there composes a tweet-baiting headline suggesting that placebos are about to drive Big Pharma out of business, he or she should appreciate the fact that the advent of AMA-approved placebo treatments would open numerous cans of fascinatingly tangled worms. For example, since the precise nature of placebo effects is shaped largely by patients’ expectations, would the advertised potency and side effects of theoretical products like Placebex and Therastim be subject to change by Internet rumors, requiring perpetual updating?
  • It’s common to use the word “placebo” as a synonym for “scam.” Economists talk about placebo solutions to our economic catastrophe (tax cuts for the rich, anyone?). Online skeptics mock the billion-dollar herbal-medicine industry by calling it Big Placebo. The fact that our brains and bodies respond vigorously to placebos given in warm and supportive clinical environments, however, turns out to be very real.
  • We’re also discovering that the power of narrative is embedded deeply in our physiology.
  • in the real world of doctoring, many physicians prescribe medications at dosages too low to have an effect on their own, hoping to tap into the body’s own healing resources — though this is mostly acknowledged only in whispers, as a kind of trade secret.
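A brief aside on the "regression to the mean" mentioned in the second note above, since it is the main reason control groups are needed at all. The simulation below is a minimal sketch with invented parameters, not data from any trial cited here: it enrolls only "patients" whose noisy symptom score looks bad at screening, treats them with nothing, and re-measures.

```python
import random

random.seed(1)

def measure(true_level):
    # One noisy observation of a patient's stable underlying symptom level.
    return true_level + random.gauss(0, 10)

# True symptom levels in a hypothetical population (mean 50, sd 10).
population = [random.gauss(50, 10) for _ in range(10_000)]

# Enroll only patients who score badly (>= 70) at screening.
enrolled = [(p, s) for p, s in ((p, measure(p)) for p in population) if s >= 70]

mean_screening = sum(s for _, s in enrolled) / len(enrolled)
mean_followup  = sum(measure(p) for p, _ in enrolled) / len(enrolled)

print(f"mean score at screening:            {mean_screening:.1f}")
print(f"mean score at follow-up, untreated: {mean_followup:.1f}")
# The follow-up mean falls by several points with no treatment at all,
# because screening preferentially selected patients whose measurement
# error happened to be high that day: regression to the mean.
```

Anyone who improves this much on a sugar pill has not necessarily experienced a placebo effect; part of the apparent recovery is a statistical artifact, which is exactly why trialists treat the control group's improvement as the baseline to beat.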
Weiye Loh

m.guardian.co.uk - 0 views

  • I got an email from a science teacher about a 13-year-old pupil. Both have to remain anonymous. This pupil wrote an article about Brain Gym for her school paper, explaining why it's nonsense: the essay is respectful, straightforward, and factual. But the school decided they couldn't print it, because it would offend teachers in the junior school who use Brain Gym. Now, this is weak-minded, and perhaps even vicious. More interesting, though, is how often children are able to spot bullshit, and how often adults want to shut them up.
  • Emily Rosa is the youngest person ever to have published a scientific paper in JAMA, one of the most influential medical journals in the world. At the age of nine she saw a TV programme about nurses who practise "Therapeutic Touch", claiming they can detect and manipulate a "human energy field" by hovering their hands above a patient.
  • Rosa conceived and executed an experiment to test whether they really could detect this "field". Twenty-one experienced practitioners put their palms on a table, behind a screen. Rosa flipped a coin, hovered her hand over the therapist's left or right palm accordingly, and waited for them to say which it was. The therapists performed no better than chance, and with 280 attempts there was sufficient statistical power to show that these claims were bunk (a rough power calculation appears after these notes). Therapeutic Touch practitioners, including some in university posts, were deeply unhappy: they insisted loudly that JAMA was wrong to publish the study.
  • Rhys Morgan, a schoolboy with Crohn's disease. Last year, chatting on crohnsforum.com, he saw people recommending "Miracle Mineral Solution", which turned out to be industrial bleach, sold with a dreary conspiracy theory as a cure for Aids, cancer and so on. Aged 15, he was perfectly capable of exploring the evidence, finding official documents, and explaining why it was dangerous. The adults banned him. Since then he's got his story on The One Show, while the chief medical officer for Wales, the Food Standards Agency and Trading Standards have waded in.
  • If every school taught the basics – randomised trials, blinding, cohort studies, and why systematic reviews are better than cherry-picking your evidence – it would help everyone navigate the world, and learn some of the most important ideas in the whole of science.
  • Information is more easily accessible now than ever before, and smart, motivated people can sidestep traditional routes to obtain knowledge and disseminate it.
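The Rosa study's "sufficient statistical power" claim is easy to sanity-check with nothing but the standard library. The sketch below finds the score needed to beat chance and asks how often a genuinely able practitioner would fail to reach it; only the 280-trial count comes from the story above, and the 2/3 hit rate is a hypothetical ability level chosen for illustration.

```python
from math import comb

N = 280  # total guesses in the experiment

def upper_tail(n, p, k):
    # P(X >= k) for X ~ Binomial(n, p), computed exactly.
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Smallest score that pure guessing (p = 0.5) reaches less than 5% of the time.
k_crit = next(k for k in range(N + 1) if upper_tail(N, 0.5, k) < 0.05)
print(f"need at least {k_crit}/{N} correct to beat chance at p < 0.05")

# Power: how often a practitioner who is right 2 times in 3 clears that bar.
power = upper_tail(N, 2 / 3, k_crit)
print(f"probability a 2/3-accurate practitioner passes: {power:.5f}")
```

With 280 trials, even a modest real ability would be detected almost every time, so "no better than chance" is a damning result rather than an underpowered one.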