TOK@ISPrague / Group items tagged: exceptions

Lawrence Hrubes

Most People Can’t Multitask, But a Few Are Exceptional : The New Yorker - 0 views

  • In 2012, David Strayer found himself in a research lab, on the outskirts of London, observing something he hadn’t thought possible: extraordinary multitasking. For his entire career, Strayer, a professor of psychology at the University of Utah, had been studying attention—how it works and how it doesn’t. Methods had come and gone, theories had replaced theories, but one constant remained: humans couldn’t multitask. Each time someone tried to focus on more than one thing at a time, performance suffered. Most recently, Strayer had been focussing on people who drive while on the phone. Over the course of a decade, he and his colleagues had demonstrated that drivers using cell phones—even hands-free devices—were at just as high a risk of accidents as intoxicated ones. Reaction time slowed, and attention decreased to the point where they’d miss more than half the things they’d otherwise see—a billboard or a child by the road, it mattered not.
  • What, then, was going on here in the London lab? The woman he was looking at—let’s call her Cassie—was an exception to what twenty-five years of research had taught him. As she took on more and more tasks, she didn’t get worse. She got better. There she was, driving, doing complex math, responding to barking prompts through a cell phone, and she wasn’t breaking a sweat. She was, in other words, what Strayer would ultimately decide to call a supertasker.
  • Cassie in particular was the best multitasker he had ever seen. “It’s a really, really hard test,” Strayer recalls. “Some people come out woozy—I have a headache, that really kind of hurts, that sort of thing. But she solved everything.”
  • Their task was simple: keep your eyes on the road; keep a safe distance; brake as required. If they failed to do so, they’d eventually collide with their pace car. Then came the multitasking additions. They would have to not only drive the car but follow audio instructions from a cell phone. Specifically, they would hear a series of words, ranging from two to five at a time, and be asked to recall them in the right order. And there was a twist. Interspersed with the words were math problems. If they heard one of those, the drivers had to answer “true,” if the problem was solved correctly, or “false,” if it wasn’t. They would, for instance, hear “cat” and immediately after, “is three divided by one, minus one, equal to two?” followed by “box,” another problem, and so on. Intermittently, they would hear a prompt to “recall,” at which point, they’d have to repeat back all the words they’d heard since the last prompt. The agony lasted about an hour and a half.
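The trial sequence described above is, in effect, an operation-span task. Below is a minimal Python sketch of that structure; the word pool, the exactly divisible arithmetic items, and the trial lengths are illustrative assumptions, not the study's actual materials.

```python
import random

# Illustrative word pool; the study's real materials are not given above.
WORDS = ["cat", "box", "pen", "sky", "cup", "map", "rug"]

def math_item():
    # Build an exactly divisible item such as
    # "is three divided by one, minus one, equal to two?"
    b = random.choice([1, 2, 3])
    a = b * random.randint(1, 3)
    c = random.randint(0, 2)
    correct = a // b - c
    claimed = random.choice([correct, correct + 1])  # roughly half are false
    text = f"is {a} divided by {b}, minus {c}, equal to {claimed}?"
    return text, claimed == correct

def run_trial():
    words = random.sample(WORDS, random.randint(2, 5))  # two to five words
    for w in words:
        print(f"hear: {w}")
        problem, is_true = math_item()  # a problem follows each word
        print(f"hear: {problem} -> respond {'true' if is_true else 'false'}")
    print("hear: recall -> repeat back:", ", ".join(words))

run_trial()
```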
markfrankel18

Fake news and gut instincts: Why millions of Americans believe things that aren't true ... - 2 views

  • "There are many individual qualities that seem like they should promote accuracy, but don't. Valuing evidence, however, appears to be an exception. The bigger the role evidence plays in shaping a person's beliefs, the more accurate that person tends to be. We aren't the only ones who have observed a pattern like this. Another recent study shows that people who exhibit higher scientific curiosity also tend to adopt more accurate beliefs about politically charged science topics, such as fracking and global warming."
markfrankel18

The idiotic argument Americans use in almost any political debate - Quartz - 0 views

  • A logical fallacy like the fallacious slippery slope has no place in political discussions. In fact, there is no place for slippery slope arguments in human discourse at all, with the possible exception of, “If you eat that first chip, you’re going to finish the whole bag.” (This isn’t a real slippery slope argument, since there’s a plausible and compelling reason why A will lead to Z: the deliciousness of chips.) Even kindergarteners, who are the most absolutist of beings, understand that just because they can yell and run around outside, that doesn’t mean they can do the same thing inside; and just because the teacher won’t let them eat snacks whenever they want, that doesn’t mean that they’ll never have snacks. As adults, we must navigate an even more complex landscape of rights and restrictions, and for the most part, we do it pretty well.
Lawrence Hrubes

Why Nothing Is Truly Alive - NYTimes.com - 0 views

  • What is life? Science cannot tell us. Since the time of Aristotle, philosophers and scientists have struggled and failed to produce a precise, universally accepted definition of life. To compensate, modern textbooks point to characteristics that supposedly distinguish the living from the inanimate, the most important of which are organization, growth, reproduction and evolution. But there are numerous exceptions: both living things that lack some of the ostensibly distinctive features of life and inanimate things that have properties of the living.
markfrankel18

Why is the 'mor' in 'Voldemort' so evil-sounding? - The Week - 0 views

  • So what's the deal with "mor"? Is there something to the syllable that suits it for melancholy, darkness, and villainy? We have to be careful here. There are more words out there that have "mor" that don't carry such dark tones. The names Morgan, Maureen, and Maurice aren't so sinister (well, possibly excepting the case of Piers Morgan), and people just wanted more and more of Mork from Ork. So we can't say that this "mor" sound carries darkness and death wherever it goes. But we can say that it has some dark associations available if we want to use them. For starters, the Latin "mor" root (as in moribund and mortal and French words such as morte) refers to death; there is an old Germanic root mora for darkness, which shows up in words such as murky; our modern word murder comes from an Old English word morth for the same; and, of course, a morgue is a place where dead bodies are kept. That's enough to give a familiar ring. And every evil name that has "mor" in it adds to the weight of the association, especially when they're famous evil names. In fact, "mor" may be what is sometimes called a phonestheme: a part of a word that tends to carry a certain connotation not because of etymology or formal definition but just by association.
markfrankel18

Why Nothing Is Truly Alive - NYTimes.com - 0 views

  • Life is a concept, not a reality.
markfrankel18

On the Face of It: How We Vote : The New Yorker - 0 views

  • In 2003, the Princeton psychologist Alexander Todorov began to suspect that, except for those people who have hard-core political beliefs, the reasons we vote for particular candidates could have less to do with politics and more to do with basic cognitive processes—in particular, perception. When people are asked about their ideal leader, one of the single most important characteristics that they say they look for is competence—how qualified and capable a candidate is. Todorov wondered whether that judgment was made on the basis of intuitive responses to basic facial features rather than on any deep, rational calculus. It would make sense: in the past, extensive research has shown just how quickly we form impressions of people’s character traits, even before we’ve had a conversation with them. That impression then colors whatever else we learn about them, from their hobbies to, presumably, their political abilities. In other words, when we think that we are making rational political judgments, we could be, in fact, judging someone at least partly based on a fleeting impression of his or her face.
  • Starting that fall, and through the following spring, Todorov showed pairs of portraits to roughly a thousand people, and asked them to rate the competence of each person. Unbeknownst to the test subjects, they were looking at candidates for the House and Senate in 2000, 2002, and 2004. In study after study, participants’ responses to the question of whether someone looked competent predicted actual election outcomes at a rate much higher than chance—from sixty-six to seventy-three per cent of the time. Even looking at the faces for as little as one second, Todorov found, yielded the exact same result: a snap judgment that generally identified the winners and losers.
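A quick binomial calculation shows why sixty-six to seventy-three per cent is "much higher than chance" once aggregated over many contests. The sketch below assumes a hypothetical 300 contests and a fifty-fifty guessing baseline; the article does not give the exact number of races.

```python
from math import comb

def p_at_least(n, k, p=0.5):
    # P(X >= k) for X ~ Binomial(n, p): the chance of calling at least
    # k of n contests correctly by coin-flip guessing.
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

n = 300            # hypothetical number of contests rated
k = int(0.66 * n)  # 66 per cent correct, the article's low end
print(f"{k}/{n} correct; probability of doing this well by chance: "
      f"{p_at_least(n, k):.1e}")
```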
markfrankel18

Chasing Coincidences - Issue 4: The Unlikely - Nautilus - 0 views

  • The simple question might be “why do such unlikely coincidences occur in our lives?” But the real question is how to define the unlikely. You know that a situation is uncommon just from experience. But even the concept of “uncommon” assumes that like events in the category are common. How do we identify the other events to which we can compare this coincidence? If you can identify other events as likely, then you can calculate the mathematical probability of this particular event as exceptional.
  • We are exposed to possible events all the time: some of them probable, but many of them highly improbable. Each rare event—by itself—is unlikely. But by the mere act of living, we constantly draw cards out of decks. Because something must happen when a card is drawn, so to speak, the highly improbable does appear from time to time.
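The card-drawing point can be made concrete: if a coincidence has probability p on any single occasion, the chance that it happens at least once across n occasions is 1 - (1 - p)^n, which grows surprisingly fast. A minimal sketch, assuming a one-in-a-million event:

```python
# The one-in-a-million figure is an assumed example, not from the article.
p = 1e-6
for n in (1_000, 100_000, 1_000_000, 10_000_000):
    print(f"{n:>10,} occasions -> P(at least once) = {1 - (1 - p) ** n:.3f}")
```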
Lawrence Hrubes

Germanwings 9525, Technology, and the Question of Trust - The New Yorker - 2 views

  • Shortly before the dreadful crash of Germanwings Flight 9525, I happened to be reading part of “The Second Machine Age,” a book by two academics at M.I.T., Erik Brynjolfsson and Andrew McAfee, about the coming automation of many professions previously thought of as impervious to technological change, such as those of drivers, doctors, market researchers, and soldiers. With the advances being made in robotics, data analysis, and artificial intelligence, Brynjolfsson and McAfee argue, we are on the cusp of a third industrial revolution.
  • The U.S. military appears to be moving in the direction of eliminating pilots, albeit tentatively. The Pentagon and the C.I.A. have long operated unmanned drones, including the Predator, which are used for reconnaissance and bombing missions. In 2013, the U.S. Air Force successfully tested the QF-16 fighter-bomber, which is practically identical to the F-16, except that it doesn’t have a pilot onboard. The plane is flown remotely. Earlier this year, Boeing, the manufacturer of the QF-16, delivered the first of what will be more than a hundred QF-16s to the Air Force. Initially, the planes will be used as flying targets for F-16 pilots to engage during training missions. But at least some military observers expect the QF-16 to end up being used in attack missions.
  • Until now, most executives in the airline industry have assumed that few people would be willing to book themselves and their families on unmanned flights—and they haven’t seriously considered turning commercial aircraft into drones or self-operating vehicles. By placing experienced fliers in the cockpit, the airlines signal to potential customers that their safety is of paramount importance—and not only because the crew members are skilled; their safety is at stake, too. In the language of game theory, this makes the aircraft’s commitment to safety more credible. Without a human flight crew, how could airlines send the same signal?
Lawrence Hrubes

Aboriginal legends reveal ancient secrets to science - BBC News - 0 views

  • Scientists are beginning to tap into a wellspring of knowledge buried in the ancient stories of Australia's Aboriginal peoples. But the loss of indigenous languages could mean it is too late to learn from them.
  • "They describe this gigantic wave coming very far inland and killing everybody except those who were up on the mountaintops, and they actually name all the different locations where people survived," says Mr Hamacher. He and Mr Goff took core samples from locations between 500m and 1km (0.6 miles) inland, and at each spot, they found a layer of ocean sediment, about 2m down, indicating that a tsunami likely washed over the area hundreds, or possibly thousands, of years ago. The samples need further analysis but Mr Hamacher says it is a "very exciting" result that suggests the legend could be true.
  • They also employ a rigid kin-based, cross-generational system of fact-checking stories, involving grandchildren, parents, and elders, which Mr Reid says doesn't seem to be used by other cultures.
  • But there is a problem - Indigenous languages are dying off at an alarming rate, making it increasingly difficult for scientists and other experts to benefit from such knowledge. More than 100 languages have already become extinct since white settlement.
Philip Drobis

BBC News - A Point Of View: What is history's role in society? - 2 views

  • Fostering innovation and helping people to think analytically,
  • Called simply Bronze, it celebrates a metal so important it has its own age of history attached to it, and so responsive to the artist's skill that it breathes life into gods, humans, mythological creatures and animals with equal success.
  • It is remarkable to think that had Bronze been mounted, say, 15 years ago, the portrait of the past that it delivered would have been subtly different. History is very far from a done deal.
  • Historians are always rewriting the past. The focus on what is or is not important in history is partly determined by the time they themselves live in, and therefore the questions that they ask.
  • The practice of micro-history, for example - the way you could construct pictures of forgotten communities or individual lives from state, parish or court records - proved breathtaking.
  • A man claiming to be him walks back into both. But is he really Martin Guerre? With no images or mirrors in such places (how does that affect memory, and the construction of identity?) no-one can be sure. Except, surely, his wife?
  • The study of history, English, philosophy or art doesn't really help anyone get a job and does not contribute to the economy to the same degree that science or engineering or business studies obviously do. Well, let's run a truck through that fast, shall we? The humanities, alongside filling one in on human history, teach people how to think analytically while at the same time noting and appreciating innovation and creativity. Not a bad set of skills for most jobs, wouldn't you say? As for the economy - what about the billion-pound industries of publishing, art, television, theatre, film - all of which draw on our love of, as well as our apparently insatiable appetite for, stories, be they history or fiction?
  • No-one would dare to mess with science in the way they mess with history.
  • But larger topics, such as emotions or physical pain - their role and changing meanings within history - are very much up for grabs, with big studies.
  • Ties in with what we have been discussing.
markfrankel18

What Elvish, Klingon, and Dothraki Reveal about Real Language & the Essence of Human Communication - 1 views

  • Language, Darwin believed, was not a conscious invention but a phenomenon “slowly and unconsciously developed by many steps.” But what makes a language a language? In this short animation from TED Ed, linguist John McWhorter, author of the indispensable The Power of Babel: A Natural History of Language (public library), explores the fascinating world of fantasy constructed languages — known as conlangs — from Game of Thrones’ Dothraki to Avatar’s Na’vi to Star Trek’s Klingon to Lord of the Rings’ Elvish. Though fictional, these conlangs reveal a great deal about the fundamentals of real human communication and help us understand the essential components of a successful language — extensive vocabulary, consistent grammar rules but peppered with exceptions, and just the right amount of room for messiness and evolution.
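The "consistent grammar rules but peppered with exceptions" ingredient is easy to see in miniature. Below is a toy sketch of a pluralizer with one regular rule and an exceptions table; the words and rules are arbitrary illustrations, not drawn from any of the conlangs mentioned.

```python
# Regular rule plus an exceptions table: rules with just enough messiness.
EXCEPTIONS = {"mouse": "mice", "child": "children", "sheep": "sheep"}

def pluralize(noun: str) -> str:
    if noun in EXCEPTIONS:                       # exceptions override the rule
        return EXCEPTIONS[noun]
    if noun.endswith(("s", "x", "z", "ch", "sh")):
        return noun + "es"                       # regular rule, sibilant endings
    return noun + "s"                            # regular rule, default case

for w in ["dragon", "box", "mouse", "sheep"]:
    print(w, "->", pluralize(w))
```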
Lawrence Hrubes

Why Startups Love Moleskines - The New Yorker - 0 views

  • Digital note-taking apps also leave their users only a finger-swipe away from e-mail or Candy Crush. An article on digital distraction in the June issue of The Harvard Business Review cites an estimate, by the Information Overload Research Group, that the problem is costing the U.S. economy nine hundred and ninety-seven billion dollars a year. “The digital world provides a lot of opportunity to waste a lot of time,” Allen said. A paper notebook, by contrast, is a walled garden, free from detours (except doodling), and requiring no learning curve. A growing body of research supports the idea that taking notes works better on paper than on laptops, in terms of comprehension, memorization, and other cognitive benefits.
Lawrence Hrubes

Walter Mischel, The Marshmallow Test, and Self-Control - The New Yorker - 1 views

  • Mischel’s story isn’t surprising—nicotine is addictive, and quitting is difficult—except for one thing: Mischel is the creator of the marshmallow test, one of the most famous experiments in the history of psychology, which is often cited as evidence of the importance of self-control. In the original test, which was administered at the Bing Nursery School, at Stanford, in the nineteen-sixties, Mischel’s team would present a child with a treat (marshmallows were just one option) and tell her that she could either eat the one treat immediately or wait alone in the room for several minutes until the researcher returned, at which point she could have two treats. The promised treats were always visible and the child knew that all she had to do to stop the agonizing wait was ring a bell to call the experimenter back—although in that case, she wouldn’t get the second treat. The longer a child delayed gratification, Mischel found—that is, the longer she was able to wait—the better she would fare later in life at numerous measures of what we now call executive function. She would perform better academically, earn more money, and be healthier and happier. She would also be more likely to avoid a number of negative outcomes, including jail time, obesity, and drug use.
  • It was not until one day in the late nineteen-sixties, when he saw a man with metastasized lung cancer in the halls of Stanford’s medical school—chest exposed, head shaved, little green “x” marks all over his body, marking the points where radiation would go—that Mischel realized he was fooling himself. Finally, something clicked. From then on, each time he wanted a cigarette (approximately every three minutes, by his count) he would create a picture in his mind of the man in the hallway. As he described it to me, “I changed the objective value of the cigarette. It went from something I craved to something disgusting.” He hasn’t had a smoke since.
  • "Mischel, who is now eighty-four years old, has just published his first popular book, "The Marshmallow Test: Mastering Self-Control." It is part memoir, part scientific analysis, and part self-help guide. In the book, he describes the original impetus for the marshmallow study. At the time, his daughters, Judith, Rebecca, and Linda, were three, four, and five years old, respectively. "I began to see this fascinating phenomenon where they morphed from being highly impulsive, immediate creatures who couldn't delay anything," he told me. "There were these amazingly rapid changes—everything around them was the same, but something inside them had changed. I realized I didn't have a clue what was going on in their heads." He wondered what it was that had enabled them to go from deciding that they wanted to wait to actually being able to do so. He found the answer among their classmates at the Bing preschool."
markfrankel18

We Didn't Eat the Marshmallow. The Marshmallow Ate Us. - NYTimes.com - 4 views

  • The marshmallow study captured the public imagination because it is a funny story, easily told, that appears to reduce the complex social and psychological question of why some people succeed in life to a simple, if ancient, formulation: Character is destiny. Except that in this case, the formulation isn’t coming from the Greek philosopher Heraclitus or from a minister preaching that “patience is a virtue” but from science, that most modern of popular religions.
  • But how our brains work is just one of many factors that drive the choices we make. Just last year, a study by researchers at the University of Rochester called the conclusions of the Stanford experiments into question, showing that some children were more likely to eat the first marshmallow when they had reason to doubt the researcher’s promise to come back with a second one. In the study, published in January 2013 in Cognition under the delectable title “Rational Snacking,” Celeste Kidd, Holly Palmeri and Richard N. Aslin wrote that for a child raised in an unstable environment, “the only guaranteed treats are the ones you have already swallowed,” while a child raised in a more stable environment, in which promises are routinely delivered upon, might be willing to wait a few more minutes, confident that he will get that second treat. (A short expected-value sketch of this trade-off appears after this list.)
  • Willpower can do only so much for children facing domestic instability, poor physical health or intellectual deficits.
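The "Rational Snacking" finding above has a simple expected-value reading: waiting is rational only if the child's trust in the promised second treat is high enough. A minimal sketch, assuming utility is linear in treats:

```python
# With linear utility in treats (an assumption), waiting beats eating now
# only when the child's trust q in the promise exceeds one half.
def expected_treats(q: float) -> tuple[float, float]:
    eat_now = 1.0        # one treat, already guaranteed ("already swallowed")
    wait = 2.0 * q       # two treats, delivered with probability q
    return eat_now, wait

for q in (0.2, 0.5, 0.8):
    now, wait = expected_treats(q)
    print(f"trust q={q:.1f}: eat now = {now:.1f}, wait = {wait:.1f} "
          f"-> {'wait' if wait > now else 'eat now'}")
```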
markfrankel18

What Are You So Afraid Of? - NYTimes.com - 0 views

  • Fear, arriving in layers in which genetic legacy converges with personal experience, is vital to our survival. When we freeze, stop in our tracks or take flight, it is a biological response to what we sense as near and present danger. All the same, it observes its own absurd hierarchy, in which we often harbor an abiding anxiety for the wrong things.
  • Except that we do choose, and what we choose are generally the ordinary fears such as heights, public speaking, insects, reptiles.
  • The biologist E. O. Wilson has observed that while we fear snakes, spiders, darkness, open spaces and closed spaces, we do not fear the more likely instruments of danger — knives, guns, cars, electrical sockets — because, he says, “our species has not been exposed to these lethal agents long enough in evolutionary time to have acquired the predisposing genes that ensure automatic avoidance.” Which is to say, fear, real fear, deep fear, the kind that changes our habits and actions, is not something on which we are likely to follow sensible instruction.
markfrankel18

How Our Minds Mislead Us: The Marvels and Flaws of Our Intuition | Brain Pickings - 2 views

  • There is no sharp line between intuition and perception. … Perception is predictive. . . . If you want to understand intuition, it is very useful to understand perception, because so many of the rules that apply to perception apply as well to intuitive thinking. Intuitive thinking is quite different from perception. Intuitive thinking has language. Intuitive thinking has a lot of word knowledge organized in different ways more than mere perception. But some very basic characteristics [of] perception are extended almost directly to intuitive thinking.
  • What’s interesting is that many a time people have intuitions that they’re equally confident about except they’re wrong. That happens through the mechanism I call “the mechanism of substitution.” You have been asked a question, and instead you answer another question, but that answer comes by itself with complete confidence, and you’re not aware that you’re doing something that you’re not an expert on because you have one answer. Subjectively, whether it’s right or wrong, it feels exactly the same. Whether it’s based on a lot of information, or a little information, this is something that you may step back and have a look at. But the subjective sense of confidence can be the same for intuition that arrives from expertise, and for intuitions that arise from heuristics. . . .
  • The Marvels and Flaws of Intuition (from the Brain Pickings Blog)
markfrankel18

Let's call everyone "they": Gender-neutral language should be the norm, not the exception - 1 views

  • When someone refers to a Hispanic person or a lesbian or someone with blue eyes or curly hair or double-jointed thumbs, it’s not unreasonable to question whether the detail is merely a rhetorical flourish of sorts or is, in the speaker’s mind, substantively relevant. Such superfluous details can often be offered as a way of reinforcing a stereotype without outright stating it. It is no surprise that we often respond to statements like “My black coworker is always late” or “That blonde student is actually quite smart” or “My gay friend is so dramatic” by asking what, exactly, the speaker is “trying to say.” We are entitled to our indignation because the language presented a very clear alternative; these traits needn’t have been mentioned at all.
  • When it comes to gender, on the other hand, the language has traditionally presented no elegant alternative. We are forced to clumsily dance around the matter, or to give in and refer to our coworkers, students and friends as “he” or “she.”
  • Instead, we ought to revert to the gender-neutral "they" whenever gender is not explicitly relevant. Not least because, if the goal is greater inclusion, limiting the use of the singular they to these cases doesn't even have the desired effect.
Lawrence Hrubes

The Great A.I. Awakening - The New York Times - 1 views

  • Translation, however, is an example of a field where this approach fails horribly, because words cannot be reduced to their dictionary definitions, and because languages tend to have as many exceptions as they have rules. More often than not, a system like this is liable to translate “minister of agriculture” as “priest of farming.” Still, for math and chess it worked great, and the proponents of symbolic A.I. took it for granted that no activities signaled “general intelligence” better than math and chess. (A toy lookup translator after this list reproduces this failure mode.)
  • A rarefied department within the company, Google Brain, was founded five years ago on this very principle: that artificial “neural networks” that acquaint themselves with the world via trial and error, as toddlers do, might in turn develop something like human flexibility. This notion is not new — a version of it dates to the earliest stages of modern computing, in the 1940s — but for much of its history most computer scientists saw it as vaguely disreputable, even mystical. Since 2011, though, Google Brain has demonstrated that this approach to artificial intelligence could solve many problems that confounded decades of conventional efforts. Speech recognition didn’t work very well until Brain undertook an effort to revamp it; the application of machine learning made its performance on Google’s mobile platform, Android, almost as good as human transcription. The same was true of image recognition. Less than a year ago, Brain for the first time commenced with the gut renovation of an entire consumer product, and its momentous results were being celebrated tonight.
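The dictionary-lookup failure described in the first annotation above is easy to reproduce with a toy word-for-word translator. The one-sense-per-word lexicon below is illustrative (mapped to English glosses for readability); it is a caricature of symbolic translation, not any real system.

```python
# One dictionary sense per word, no phrase-level context: the failure mode
# the article describes for symbolic A.I. translation.
LEXICON = {
    "minister": "priest",      # the religious sense, wrong for officialdom
    "of": "of",
    "agriculture": "farming",  # misses the fixed title "minister of agriculture"
}

def word_for_word(phrase: str) -> str:
    # Substitute each word's single listed sense, leaving unknown words alone.
    return " ".join(LEXICON.get(w, w) for w in phrase.split())

print(word_for_word("minister of agriculture"))  # -> "priest of farming"
```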
Lawrence Hrubes

When Art Is Dangerous (or Not) - NYTimes.com - 7 views

  • THE only time art ever seems to make news here in the West anymore is when a Pollock or Warhol sells for a sum commensurate with the budget of a “Transformers” film. It seems bizarre, then, to find ourselves grappling with international crises in which art is the issue: the imbroglio involving the Sony movie “The Interview,” the massacre at Charlie Hebdo in Paris. The incomprehension, whether bemused or horrified, that we feel toward people who take up arms against the creators of cartoons or comedies is a chastening reminder that there are still cultures in which art is not a harmless diversion or commodity, but something real and volatile, a potential threat to be violently suppressed. These attacks are, in a way, a savage, atavistic show of respect.
  • Kurt Vonnegut Jr. likened the cumulative firepower of all the art and literature directed against the Vietnam War to “the explosive force of a very large banana-cream pie — a pie two meters in diameter, 20 centimeters thick, and dropped from a height of 10 meters or more.” A lot of artists in America tend to be self-deprecating futilitarians, because we’ve grown up in a culture in which art doesn’t matter except, occasionally, as a high-end investment. When art has been controversial here it’s most often been because it’s deemed obscene. (Sex is our tawdry Muhammad, the thing that cannot be depicted.) But it’s hard to think of a time in our recent history when art gave any cause for alarm to anyone in power.