
TOK@ISPrague: Group items tagged "cognition"


Lawrence Hrubes

Henry Gustave Molaison: The Basis for 'Memento' and the World's Most Celebrated Amnesia... - 0 views

  •  
    "Scoville later called the operation "a tragic mistake" and warned neurosurgeons never to repeat it, but neuroscience and cognitive psychology benefitted hugely. The operation could not have been better designed if the intent had been to create a new kind of experimental object that showed where in the brain memory lived: there was no other way that Molaison's brain injuries could have occurred, and no other way that the precision of his memory damage could have been brought about. Molaison gave scientists a way to map cognitive functions onto brain structures. It became possible to subdivide memory into different types and to locate their cerebral Zip Codes."
Lawrence Hrubes

The Interpreter - The New Yorker - 2 views

  • Everett, who this past fall became the chairman of the Department of Languages, Literature, and Cultures at Illinois State University, has been publishing academic books and papers on the Pirahã (pronounced pee-da-HAN) for more than twenty-five years. But his work remained relatively obscure until early in 2005, when he posted on his Web site an article titled “Cultural Constraints on Grammar and Cognition in Pirahã,” which was published that fall in the journal Cultural Anthropology. The article described the extreme simplicity of the tribe’s living conditions and culture. The Pirahã, Everett wrote, have no numbers, no fixed color terms, no perfect tense, no deep memory, no tradition of art or drawing, and no words for “all,” “each,” “every,” “most,” or “few”—terms of quantification believed by some linguists to be among the common building blocks of human cognition. Everett’s most explosive claim, however, was that Pirahã displays no evidence of recursion, a linguistic operation that consists of inserting one phrase inside another of the same type, as when a speaker combines discrete thoughts (“the man is walking down the street,” “the man is wearing a top hat”) into a single sentence (“The man who is wearing a top hat is walking down the street”). Noam Chomsky, the influential linguistic theorist, has recently revised his theory of universal grammar, arguing that recursion is the cornerstone of all languages, and is possible because of a uniquely human cognitive ability. Steven Pinker, the Harvard cognitive scientist, calls Everett’s paper “a bomb thrown into the party.”
  •  
    Pirahã tribe in Brazil studied by linguists and anthropologists; a toy illustration of recursion follows this entry.
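
Recursion, the operation at the center of Everett's dispute with Chomsky, can be made concrete in a few lines of code. The sketch below is a toy illustration built from the article's own example sentences; the grammar and function names are invented here, and it is not a model of Pirahã or of any published analysis. It shows how embedding a phrase inside a phrase of the same type lets finite rules generate unboundedly long sentences.

```python
# Toy illustration of syntactic recursion, using the article's example.
# The grammar is invented for illustration only.

def noun_phrase(depth: int) -> str:
    if depth == 0:
        return "the man"  # base case: a bare noun phrase
    # Recursive case: insert a phrase of the same type inside itself.
    return f"{noun_phrase(depth - 1)} who is wearing a top hat"

def sentence(depth: int) -> str:
    return f"{noun_phrase(depth)} is walking down the street"

print(sentence(0))  # the man is walking down the street
print(sentence(1))  # the man who is wearing a top hat is walking down the street
print(sentence(2))  # a second level of embedding, and so on without limit
```

On Everett's account, Pirahã speakers would use only the depth-0 form, conveying the extra information in separate sentences rather than by embedding; on Chomsky's revised view, the recursive rule itself is the uniquely human cognitive ingredient.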
markfrankel18

Paul Bloom: The Case Against Empathy : The New Yorker - 0 views

  • Empathy research is thriving these days, as cognitive neuroscience undergoes what some call an “affective revolution.” There is increasing focus on the emotions, especially those involved in moral thought and action. We’ve learned, for instance, that some of the same neural systems that are active when we are in pain become engaged when we observe the suffering of others. Other researchers are exploring how empathy emerges in chimpanzees and other primates, how it flowers in young children, and the sort of circumstances that trigger it.
  • Rifkin calls for us to make the leap to “global empathic consciousness.” He sees this as the last best hope for saving the world from environmental destruction, and concludes with the plaintive question “Can we reach biosphere consciousness and global empathy in time to avoid planetary collapse?”
  • This enthusiasm may be misplaced, however. Empathy has some unfortunate features—it is parochial, narrow-minded, and innumerate. We’re often at our best when we’re smart enough not to rely on it.
  • This interest isn’t just theoretical. If we can figure out how empathy works, we might be able to produce more of it.
markfrankel18

Why We Need Answers: The Theory of Cognitive Closure : The New Yorker - 0 views

  • The human mind is incredibly averse to uncertainty and ambiguity; from an early age, we respond to uncertainty or lack of clarity by spontaneously generating plausible explanations. What’s more, we hold on to these invented explanations as having intrinsic value of their own. Once we have them, we don’t like to let them go.
  • Heightened need for cognitive closure can bias our choices, change our preferences, and influence our mood. In our rush for definition, we tend to produce fewer hypotheses and search less thoroughly for information. We become more likely to form judgments based on early cues (something known as impressional primacy), and as a result become more prone to anchoring and correspondence biases (using first impressions as anchors for our decisions and not accounting enough for situational variables). And, perversely, we may not even realize how much we are biasing our own judgments.
  • In 2010, Kruglanski and colleagues looked specifically at the need for cognitive closure as part of the response to terrorism.
  • It’s a self-reinforcing loop: we search energetically, but once we’ve seized onto an idea we remain crystallized at that point. And if we’ve externally committed ourselves to our position by tweeting or posting or speaking? We crystallize our judgment all the more, so as not to appear inconsistent. It’s why false rumors start—and why they die such hard deaths. It’s a dynamic that can have consequences far nastier than a minor media snafu.
Lawrence Hrubes

James Flynn: Why our IQ levels are higher than our grandparents' | Video on TED.com - 0 views

  •  
    "It's called the "Flynn effect" -- the fact that each generation scores higher on an IQ test than the generation before it. Are we actually getting smarter, or just thinking differently? In this fast-paced spin through the cognitive history of the 20th century, moral philosopher James Flynn suggests that changes in the way we think have had surprising (and not always positive) consequences. James Flynn challenges our fundamental assumptions about intelligence."
markfrankel18

To Understand Religion, Think Football - Issue 17: Big Bangs - Nautilus - 5 views

  • The invention of religion is a big bang in human history. Gods and spirits helped explain the unexplainable, and religious belief gave meaning and purpose to people struggling to survive. But what if everything we thought we knew about religion was wrong? What if belief in the supernatural is window dressing on what really matters—elaborate rituals that foster group cohesion, creating personal bonds that people are willing to die for? Anthropologist Harvey Whitehouse thinks too much talk about religion is based on loose conjecture and simplistic explanations. Whitehouse directs the Institute of Cognitive and Evolutionary Anthropology at Oxford University. For years he’s been collaborating with scholars around the world to build a massive body of data that grounds the study of religion in science. Whitehouse draws on an array of disciplines—archeology, ethnography, history, evolutionary psychology, cognitive science—to construct a profile of religious practices.
  • I suppose people do try to fill in the gaps in their knowledge by invoking supernatural explanations. But many other situations prompt supernatural explanations. Perhaps the most common one is thinking there’s a ritual that can help us when we’re doing something with a high risk of failure. Lots of people go to football matches wearing their lucky pants or lucky shirt. And you get players doing all sorts of rituals when there’s a high-risk situation like taking a penalty kick.
  • We tend to take a few bits and pieces of the most familiar religions and see them as emblematic of what’s ancient and pan-human. But those things that are ancient and pan-human are actually ubiquitous and not really part of world religions. Again, it really depends on what we mean by “religion.” I think the best way to answer that question is to try and figure out which cognitive capacities came first.
Lawrence Hrubes

An Artist with Amnesia - The New Yorker - 2 views

  • Lately, Johnson draws for pleasure, but for three decades she had a happily hectic career as an illustrator, sometimes presenting clients with dozens of sketches a day. Her playful watercolors once adorned packages of Lotus software; for a program called Magellan, she created a ship whose masts were tethered to billowing diskettes. She made a popular postcard of two red parachutes tied together, forming a heart; several other cards were sold for years at MOMA’s gift shop. Johnson produced half a dozen covers for this magazine, including one, from 1985, that presented a sunny vision of an artist’s life: a loft cluttered with pastel canvases, each of them depicting a fragment of the skyline that is framed by a picture window. It’s as if the paintings were jigsaw pieces, and the city a puzzle being solved. Now Johnson is obsessed with making puzzles. Many times a day, she uses her grids as foundations for elaborate arrangements of letters on a page—word searches by way of Mondrian. For all the dedication that goes into her puzzles, however, they are confounding creations: very few are complete. She is assembling one of the world’s largest bodies of unfinished art.
  • Nicholas Turk-Browne, a cognitive neuroscientist at Princeton, entered the lab and greeted Johnson in the insistently zippy manner of a kindergarten teacher: “Lonni Sue! We’re going to put you in a kind of space machine and take pictures of your brain!” A Canadian with droopy dark-brown hair, he typically speaks with mellow precision. Though they had met some thirty times before, Johnson continued to regard him as an amiable stranger. Turk-Browne is one of a dozen scientists, at Princeton and at Johns Hopkins, who have been studying her, with Aline and Maggi’s consent. Aline told me, “When we realized the magnitude of Lonni Sue’s illness, my mother and I promised each other to turn what could be a tragedy into something which could help others.” Cognitive science has often gained crucial insights by studying people with singular brains, and Johnson is the first person with profound amnesia to be examined extensively with an fMRI. Several papers have been published about Johnson, and the researchers say that she could fuel at least a dozen more.
Lawrence Hrubes

If Animals Have Rights, Should Robots? - The New Yorker - 0 views

  • People projected thoughts into Harambe’s mind. “Our tendency is to see our actions through human lenses,” a neuroscientist named Kurt Gray told the network as the frenzy peaked. “We can’t imagine what it’s like to actually be a gorilla. We can only imagine what it’s like to be us being a gorilla.” This simple fact is responsible for centuries of ethical dispute. One Harambe activist might believe that killing a gorilla as a safeguard against losing human life is unjust due to our cognitive similarity: the way gorillas think is a lot like the way we think, so they merit a similar moral standing. Another might believe that gorillas get their standing from a cognitive dissimilarity: because of our advanced powers of reason, we are called to rise above the cat-eat-mouse game, to be special protectors of animals, from chickens to chimpanzees. (Both views also support untroubled omnivorism: we kill animals because we are but animals, or because our exceptionalism means that human interests win.) These beliefs, obviously opposed, mark our uncertainty about whether we’re rightful peers or masters among other entities with brains.
  • The big difference, they argue, is “sentience.” Many animals have it; zygotes and embryos don’t. Colb and Dorf define sentience as “the ability to have subjective experiences,” which is a little tricky, because animal subjectivity is what’s hard for us to pin down. A famous paper called “What Is It Like to Be a Bat?,” by the philosopher Thomas Nagel, points out that even if humans were to start flying, eating bugs, and getting around by sonar they would not have a bat’s full experience, or the batty subjectivity that the creature had developed from birth.
  • If animals suffer, the philosopher Peter Singer noted in “Animal Liberation” (1975), shouldn’t we include them in the calculus of minimizing pain? Such an approach to peership has advantages: it establishes the moral claims of animals without projecting human motivations onto them. But it introduces other problems. Bludgeoning your neighbor is clearly worse than poisoning a rat.
markfrankel18

The Older Mind May Just Be a Fuller Mind - NYTimes.com - 0 views

  • Now comes a new kind of challenge to the evidence of a cognitive decline, from a decidedly digital quarter: data mining, based on theories of information processing. In a paper published in Topics in Cognitive Science, a team of linguistic researchers from the University of Tübingen in Germany used advanced learning models to search enormous databases of words and phrases. Since educated older people generally know more words than younger people, simply by virtue of having been around longer, the experiment simulates what an older brain has to do to retrieve a word. And when the researchers incorporated that difference into the models, the aging “deficits” largely disappeared.
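
The information-processing argument here lends itself to a toy simulation. The sketch below is a minimal, assumption-laden illustration, not the Tübingen team's model: the vocabulary sizes and the linear-search retrieval strategy are invented for the example. It shows how a larger lexicon alone produces slower word retrieval even when the retrieval mechanism is unchanged.

```python
import random

def retrieval_steps(lexicon_size: int, trials: int = 10_000) -> float:
    """Average candidates scanned before the target word is found,
    modeling retrieval as a linear search through an unordered lexicon."""
    total = 0
    for _ in range(trials):
        # The target is equally likely to sit anywhere in the lexicon.
        total += random.randint(1, lexicon_size)
    return total / trials

# Assumed vocabulary sizes for a younger and an older, word-richer speaker.
young, old = 20_000, 40_000
print(f"younger speaker: ~{retrieval_steps(young):,.0f} steps per word")
print(f"older speaker:   ~{retrieval_steps(old):,.0f} steps per word")
# The slowdown reflects only the larger search space, not any decline
# in the search process itself, which is the shape of the paper's claim.
```

On this crude model the older speaker needs roughly twice as many steps purely because there is more to search: an apparent "deficit" that disappears once vocabulary size is taken into account, much as the researchers report.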
Lawrence Hrubes

The Trouble with Snooze Buttons (and with Modern Sleep) : The New Yorker - 1 views

  •  
    Research into sleep patterns, wake times, the effects of light and dark, and how these affect cognitive abilities
artuscaer

What are Ethical Dilemmas? | TOKTalk.net - 2 views

  •  
    This is also related to psychology through the idea of cognitive dissonance.
Lawrence Hrubes

Is Bilingualism Really an Advantage? - The New Yorker - 1 views

  • Many modern language researchers agree with that premise. Not only does speaking multiple languages help us to communicate but bilingualism (or multilingualism) may actually confer distinct advantages to the developing brain. Because a bilingual child switches between languages, the theory goes, she develops enhanced executive control, or the ability to effectively manage what are called higher cognitive processes, such as problem-solving, memory, and thought. She becomes better able to inhibit some responses, promote others, and generally emerges with a more flexible and agile mind. It’s a phenomenon that researchers call the bilingual advantage.
  • For the first half of the twentieth century, researchers actually thought that bilingualism put a child at a disadvantage, something that hurt her I.Q. and verbal development. But, in recent years, the notion of a bilingual advantage has emerged from research to the contrary, research that has seemed both far-reaching and compelling, much of it coming from the careful work of the psychologist Ellen Bialystok. For many tasks, including ones that involve working memory, bilingual speakers seem to have an edge. In a 2012 review of the evidence, Bialystok showed that bilinguals did indeed show enhanced executive control, a quality that has been linked, among other things, to better academic performance. And when it comes to qualities like sustained attention and switching between tasks effectively, bilinguals often come out ahead. It seems fairly evident then that, given a choice, you should raise your child to speak more than one language.
  • Systematically, de Bruin combed through conference abstracts from a hundred and sixty-nine conferences, between 1999 and 2012, that had to do with bilingualism and executive control. The rationale was straightforward: conferences are places where people present in-progress research. They report on studies that they are running, initial results, initial thoughts. If there were a systematic bias in the field against reporting negative results—that is, results that show no effects of bilingualism—then there should be many more findings of that sort presented at conferences than actually become published. That’s precisely what de Bruin found. At conferences, about half the presented results provided either complete or partial support for the bilingual advantage on certain tasks, while half provided partial or complete refutation. When it came to the publications that appeared after the preliminary presentation, though, the split was decidedly different. Sixty-eight per cent of the studies that demonstrated a bilingual advantage found a home in a scientific journal, compared to just twenty-nine per cent of those that found either no difference or a monolingual edge. “Our overview,” de Bruin concluded, “shows that there is a distorted image of the actual study outcomes on bilingualism, with researchers (and media) believing that the positive effect of bilingualism on nonlinguistic cognitive processes is strong and unchallenged.”
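
De Bruin's rates make the distortion easy to quantify. The arithmetic below is a back-of-the-envelope sketch: the round figure of a hundred conference results is an assumed starting point, while the even split and the publication rates are the ones reported above.

```python
# Back-of-the-envelope calculation from de Bruin's reported rates.
# Assumption: 100 conference results, split evenly as described above.
supportive_at_conference = 50
contrary_at_conference = 50

pub_rate_supportive = 0.68  # share of supportive studies later published
pub_rate_contrary = 0.29    # share of null or contrary studies published

published_supportive = supportive_at_conference * pub_rate_supportive  # 34.0
published_contrary = contrary_at_conference * pub_rate_contrary        # 14.5

share = published_supportive / (published_supportive + published_contrary)
print(f"{share:.0%} of the published record favors the bilingual advantage")
# -> 70%, even though the underlying conference results were split 50/50.
```

On these figures the filter, not the phenomenon, manufactures most of the apparent consensus: an evenly divided evidence base reads as roughly seventy per cent supportive once it passes through the journals.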
Lawrence Hrubes

The Power of Touch - The New Yorker - 0 views

  • At a home in the Romanian city of Iași, Carlson measured cortisol levels in a group of children ranging from two months to three years old. The caregiver-to-child ratio was twenty to one, and most of the children had experienced severe neglect and sensory deprivation. Multiple times a day, Carlson took saliva samples, tracking how cortisol levels fluctuated in response to stressful events. The children, she found, were hormonally off kilter. Under normal conditions, cortisol peaks just before we wake up and then tapers off; in the leagăne infants, it peaked in the afternoon and remained elevated. Those levels, in turn, correlated with faltering performance on numerous cognitive and physical assessments. Then Carlson tried an intervention modelled on the work of Joseph Sparling, a child-development specialist, and the outcomes changed. When half of the orphans received more touching from more caregivers—an increase in hugs, holding, and the making of small adjustments to clothes and hair—their performance markedly improved. They grew bigger, stronger, and more responsive, both cognitively and emotionally, and they reacted better to stress.
  • Touch is the first of the senses to develop in the human infant, and it remains perhaps the most emotionally central throughout our lives. While many researchers have appreciated its power, others have been more circumspect. Writing in 1928, John B. Watson, one of the originators of the behaviorist school of psychology, urged parents to maintain a physical boundary between themselves and their children: “Never hug and kiss them, never let them sit on your lap. If you must, kiss them once on the forehead when they say goodnight. Shake hands with them in the morning. Give them a pat on the head if they have made an extraordinarily good job on a difficult task.” Watson acknowledged that children must be bathed, clothed, and cared for, but he believed that excessive touching—that is, caressing—would create “mawkish” adults. An untouched child, he argued, “enters manhood so bulwarked with stable work and emotional habits that no adversity can quite overwhelm him.” Now we know that, to attain that result, he should have suggested the opposite: touch, as frequent and as caring as possible.
  • And yet touch is rarely purely physical. Field’s more recent work has shown that the brain is very good at distinguishing an emotional touch from a similar, but non-emotional, one. A massage chair is not a masseuse. Certain touch receptors exist solely to convey emotion to the brain, rather than sensory information about the external environment. A recent study shows that we can identify other people’s basic emotions based on how they touch us, even when they are separated from us by a curtain. And the emotions that are communicated by touch can go on to shape our behavior. One recent review found that, even if we have no conscious memory of a touch—a hand on the shoulder, say—we may be more likely to agree to a request, respond more (or less) positively to a person or product, or form closer bonds with someone.
markfrankel18

What's a Metaphor For? - The Chronicle Review - The Chronicle of Higher Education - 1 views

  • "Metaphorical thinking—our instinct not just for describing but for comprehending one thing in terms of another—shapes our view of the world, and is essential to how we communicate, learn, discover and invent. ... Our understanding of metaphor is in the midst of a metamorphosis. For centuries, metaphor has been seen as a kind of cognitive frill, a pleasant but essentially useless embellishment to 'normal' thought. Now, the frill is gone. New research in the social and cognitive sciences makes it increasingly plain that metaphorical thinking influences our attitudes, beliefs, and actions in surprising, hidden, and often oddball ways." Geary further unpacks metaphor's influence in his foreword: "Metaphor conditions our interpretations of the stock market and, through advertising, it surreptitiously infiltrates our purchasing decisions. In the mouths of politicians, metaphor subtly nudges public opinion; in the minds of businesspeople, it spurs creativity and innovation. In science, metaphor is the preferred nomenclature for new theories and new discoveries; in psychology, it is the natural language of human relationships and emotions."
  • The upshot of the boom in metaphor studies, Geary makes clear, is the overturning of that presumption toward literalism: Nowadays, it's believers in a literalism that goes all the way down (so to speak) who are on the defensive in intellectual life, and explorers of metaphor who are on the ascendant. As a result, Geary hardly feels a need to address literalism, devoting most of his book to how metaphor connects to etymology, money, mind, politics, pleasure, science, children, the brain, the body, and such literary forms as the proverb and aphorism.
Lawrence Hrubes

Joshua Foer: John Quijada and Ithkuil, the Language He Invented : The New Yorker - 1 views

  •  
    "Languages are something of a mess. They evolve over centuries through an unplanned, democratic process that leaves them teeming with irregularities, quirks, and words like "knight." No one who set out to design a form of communication would ever end up with anything like English, Mandarin, or any of the more than six thousand languages spoken today. "Natural languages are adequate, but that doesn't mean they're optimal," John Quijada told me. Quijada had spent three decades inventing in his spare time. Ithkuil had never been spoken by anyone other than Quijada, and he assumed that it never would be. In his preface, Quijada wrote that his "greater goal" was "to attempt the creation of what human beings, left to their own devices, would never create naturally, but rather only by conscious intellectual effort: an idealized language whose aim is the highest possible degree of logic, efficiency, detail, and accuracy in cognitive expression via spoken human language, while minimizing the ambiguity, vagueness, illogic, redundancy, polysemy (multiple meanings) and overall arbitrariness that is seemingly ubiquitous in natural human language." Ithkuil has two seemingly incompatible ambitions: to be maximally precise but also maximally concise, capable of capturing nearly every thought that a human being could have while doing so in as few sounds as possible. "
Lawrence Hrubes

Arguments Against God - NYTimes.com - 2 views

  • L.A.: O.K. So the question is, why do I say that theism is false, rather than just unproven? Because the question has been settled to my satisfaction. I say “there is no God” with the same confidence I say “there are no ghosts” or “there is no magic.” The main issue is supernaturalism — I deny that there are beings or phenomena outside the scope of natural law.
  • That’s not to say that I think everything is within the scope of human knowledge. Surely there are things not dreamt of in our philosophy, not to mention in our science – but that fact is not a reason to believe in supernatural beings. I think many arguments for the existence of a God depend on the insufficiencies of human cognition. I readily grant that we have cognitive limitations. But when we bump up against them, when we find we cannot explain something — like why the fundamental physical parameters happen to have the values that they have — the right conclusion to draw is that we just can’t explain the thing. That’s the proper place for agnosticism and humility. But getting back to your question: I’m puzzled why you are puzzled how rational people could disagree about the existence of God. Why not ask about disagreements among theists? Jews and Muslims disagree with Christians about the divinity of Jesus; Protestants disagree with Catholics about the virginity of Mary; Protestants disagree with Protestants about predestination, infant baptism and the inerrancy of the Bible. Hindus think there are many gods while Unitarians think there is at most one. Don’t all these disagreements demand explanation too? Must a Christian Scientist say that Episcopalians are just not thinking clearly? Are you going to ask a Catholic if she thinks there are no good reasons for believing in the angel Moroni?
markfrankel18

We are more rational than those who nudge us - Steven Poole - Aeon - 3 views

  • We are told that we are an irrational tangle of biases, to be nudged any which way. Does this claim stand to reason?
  • A culture that believes its citizens are not reliably competent thinkers will treat those citizens differently to one that respects their reflective autonomy. Which kind of culture do we want to be? And we do have a choice. Because it turns out that the modern vision of compromised rationality is more open to challenge than many of its followers accept.
  • Modern skepticism about rationality is largely motivated by years of experiments on cognitive bias.
  • The thorny question is whether these widespread departures from the economic definition of ‘rationality’ should be taken to show that we are irrational, or whether they merely show that the economic definition of rationality is defective.
  • During the development of game theory and decision theory in the mid-20th century, a ‘rational’ person in economic terms became defined as a lone individual whose decisions were calculated to maximise self-interest, and whose preferences were (logically or mathematically) consistent in combination and over time. It turns out that people are not in fact ‘rational’ in this homo economicus way.
  • There has been some controversy over the correct statistical interpretations of some studies, and several experiments that ostensibly demonstrate ‘priming’ effects, in particular, have notoriously proven difficult to replicate. But more fundamentally, the extent to which such findings can show that we are acting irrationally often depends on what we agree should count as ‘rational’ in the first place.
  • If we want to understand others, we can always ask what is making their behaviour ‘rational’ from their point of view. If, on the other hand, we just assume they are irrational, no further conversation can take place.
  • And so there is less reason than many think to doubt humans’ ability to be reasonable. The dissenting critiques of the cognitive-bias literature argue that people are not, in fact, as individually irrational as the present cultural climate assumes. And proponents of debiasing argue that we can each become more rational with practice. But even if we each acted as irrationally as often as the most pessimistic picture implies, that would be no cause to flatten democratic deliberation into the weighted engineering of consumer choices, as nudge politics seeks to do. On the contrary, public reason is our best hope for survival.
Lawrence Hrubes

The Bitter Fight Over the Benefits of Bilingualism - The Atlantic - 0 views

  • It’s an intuitive claim, but also a profound one. It asserts that the benefits of bilingualism extend well beyond the realm of language, and into skills that we use in every aspect of our lives. This view is now widespread, heralded by a large community of scientists, promoted in books and magazines, and pushed by advocacy organizations.
  • But a growing number of psychologists say that this mountain of evidence is actually a house of cards, built upon flimsy foundations.
  • Jon Andoni Duñabeitia, a cognitive neuroscientist at the Basque Center on Cognition, Brain, and Language, was one of them. In two large studies, involving 360 and 504 children respectively, he found no evidence that Basque kids, raised on Basque and Spanish at home and at school, had better mental control than monolingual Spanish children.
  • Similar controversies have popped up throughout psychology, fueling talk of a “reproducibility crisis” in which scientists struggle to duplicate classic textbook results. In many of these cases, classic psychological phenomena that seem to be backed by years of supportive evidence, suddenly become fleeting and phantasmal. The causes are manifold. Journals are more likely to accept positive, attention-grabbing papers than negative, contradictory ones, which pushes scientists towards running small studies or tweaking experiments on the fly—practices that lead to flashy, publishable discoveries that may not actually be true.