TOK@ISPrague / Group items tagged: sources

Lawrence Hrubes

How a Raccoon Became an Aardvark : The New Yorker - 0 views

  • In July of 2008, Dylan Breves, then a seventeen-year-old student from New York City, made a mundane edit to a Wikipedia entry on the coati. The coati, a member of the raccoon family, is “also known as … a Brazilian aardvark,” Breves wrote. He did not cite a source for this nickname, and with good reason: he had invented it. He and his brother had spotted several coatis while on a trip to the Iguaçu Falls, in Brazil, where they had mistaken them for actual aardvarks.
  • Over time, though, something strange happened: the nickname caught on. About a year later, Breves searched online for the phrase “Brazilian aardvark.” Not only was his edit still on Wikipedia, but his search brought up hundreds of other Web sites about coatis. References to the so-called “Brazilian aardvark” have since appeared in the Independent, the Daily Mail, and even in a book published by the University of Chicago. Breves’s role in all this seems clear: a Google search for “Brazilian aardvark” will return no mentions before Breves made the edit, in July, 2008. The claim that the coati is known as a Brazilian aardvark still remains on its Wikipedia entry, only now it cites a 2010 article in the Telegraph as evidence.
  • This kind of feedback loop—wherein an error that appears on Wikipedia then trickles to sources that Wikipedia considers authoritative, which are in turn used as evidence for the original falsehood—is a documented phenomenon. There’s even a Wikipedia article describing it.
Lawrence Hrubes

Period. Full Stop. Point. Whatever It's Called, It's Going Out of Style - The New York ... - 0 views

  • The period — the full-stop signal we all learn as children, whose use stretches back at least to the Middle Ages — is gradually being felled in the barrage of instant messaging that has become synonymous with the digital age
  • Increasingly, says Professor Crystal, whose books include “Making a Point: The Persnickety Story of English Punctuation,” the period is being deployed as a weapon to show irony, syntactic snark, insincerity, even aggression
  • At the same time, he said he found that British teenagers were increasingly eschewing emoticons and abbreviations such as “LOL” (laughing out loud) or “ROTF” (rolling on the floor) in text messages because they had been adopted by their parents and were therefore considered “uncool”
  •  
    note: this article was written with an intentional lack of periods
markfrankel18

How politics makes us stupid - Vox - 0 views

  • In April and May of 2013, Yale Law professor Dan Kahan — working with coauthors Ellen Peters, Erica Cantrell Dawson, and Paul Slovic — set out to test a question that continuously puzzles scientists: why isn’t good evidence more effective in resolving political debates? For instance, why doesn’t the mounting proof that climate change is a real threat persuade more skeptics?
  • The leading theory, Kahan and his coauthors wrote, is the Science Comprehension Thesis, which says the problem is that the public doesn’t know enough about science to judge the debate. It’s a version of the More Information Hypothesis: a smarter, better educated citizenry wouldn’t have all these problems reading the science and accepting its clear conclusion on climate change. But Kahan and his team had an alternative hypothesis. Perhaps people aren’t held back by a lack of knowledge. After all, they don’t typically doubt the findings of oceanographers or the existence of other galaxies. Perhaps there are some kinds of debates where people don’t want to find the right answer so much as they want to win the argument. Perhaps humans reason for purposes other than finding the truth — purposes like increasing their standing in their community, or ensuring they don’t piss off the leaders of their tribe. If this hypothesis proved true, then a smarter, better-educated citizenry wouldn’t put an end to these disagreements. It would just mean the participants are better equipped to argue for their own side.
  • Kahan doesn’t find it strange that we react to threatening information by mobilizing our intellectual artillery to destroy it. He thinks it’s strange that we would expect rational people to do anything else.
  • Kahan’s studies, depressing as they are, are also the source of his optimism: he thinks that if researchers can just develop a more evidence-based model of how people treat questions of science as questions of identity, then scientists could craft a communications strategy that would avoid those pitfalls. "My hypothesis is we can use reason to identify the sources of the threats to our reason and then we can use our reason to devise methods to manage and control those processes," he says.
markfrankel18

The Problem With History Classes - Atlantic Mobile - 1 views

  • Currently, most students learn history as a set narrative—a process that reinforces the mistaken idea that the past can be synthesized into a single, standardized chronicle of several hundred pages. This teaching pretends that there is a uniform collective story, which is akin to saying everyone remembers events the same. Yet, history is anything but agreeable. It is not a collection of facts deemed to be "official" by scholars on high. It is a collection of historians exchanging different, often conflicting analyses. And rather than vainly seeking to transcend the inevitable clash of memories, American students would be better served by descending into the bog of conflict and learning the many "histories" that compose the American national story.
  • History may be an attempt to memorialize and preserve the past, but it is not memory; memories can serve as primary sources, but they do not stand alone as history. A history is essentially a collection of memories, analyzed and reduced into meaningful conclusions—but that collection depends on the memories chosen.
  • Although, as Urist notes, the AP course is "designed to teach students to think like historians," my own experience in that class suggests that it fails to achieve that goal. The course’s framework has always served as an outline of important concepts aiming to allow educators flexibility in how to teach; it makes no reference to historiographical conflicts. Historiography was an epiphany for me because I had never before come face-to-face with how historians think and reason—how they construct an argument, what sources animate that approach, and how their position responds to other historians. When I took AP U.S. History, I jumbled these diverse histories into one indistinct narrative. Although the test involved open-ended essay questions, I was taught that graders were looking for a firm thesis—forcing students to adopt a side. The AP test also, unsurprisingly, rewards students who cite a wealth of supporting details. By the time I took the test in 2009, I was a master at "checking boxes," weighing political factors equally against those involving socioeconomics and ensuring that previously neglected populations like women and ethnic minorities received their due. I did not know that I was pulling ideas from different historiographical traditions. I still subscribed to the idea of a prevailing national narrative and served as an unwitting sponsor of synthesis, oblivious to the academic battles that made such synthesis impossible.  
markfrankel18

The Science of Why We Don't Believe Science | Mother Jones - 0 views

  • "A MAN WITH A CONVICTION is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point."
  • The theory of motivated reasoning builds on a key insight of modern neuroscience: Reasoning is actually suffused with emotion (or what researchers often call "affect"). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds—fast enough to detect with an EEG device, but long before we're aware of it. That shouldn't be surprising: Evolution required us to react very quickly to stimuli in our environment. It's a "basic human survival skill," explains political scientist Arthur Lupia of the University of Michigan. We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself. We're not driven only by emotions, of course—we also reason, deliberate. But reasoning comes later, works slower—and even then, it doesn't take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that's highly biased, especially on topics we care a great deal about.
  • In other words, when we think we're reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt: We may think we're being scientists, but we're actually being lawyers
  • A key question—and one that's difficult to answer—is how "irrational" all this is. On the one hand, it doesn't make sense to discard an entire belief system, built up over a lifetime, because of some new snippet of information.
  • Okay, so people gravitate toward information that confirms what they believe, and they select sources that deliver it. Same as it ever was, right? Maybe, but the problem is arguably growing more acute, given the way we now consume information—through the Facebook links of friends, or tweets that lack nuance or context, or "narrowcast" and often highly ideological media that have relatively small, like-minded audiences. Those basic human survival skills of ours, says Michigan's Arthur Lupia, are "not well-adapted to our information age."
markfrankel18

Psychiatry's Mind-Brain Problem - The New York Times - 1 views

  • Recently, a psychiatric study on first episodes of psychosis made front-page news. People seemed quite surprised by the finding: that lower doses of psychotropic drugs, when combined with individual psychotherapy, family education and a focus on social adaptation, resulted in decreased symptoms and increased wellness. But the real surprise — and disappointment — was that this was considered so surprising.
  • Unfortunately, Dr. Kane’s study arrives alongside a troubling new reality. His project was made possible by funding from the National Institute of Mental Health before it implemented a controversial requirement: Since 2014, in order to receive the institute’s support, clinical researchers must explicitly focus on a target such as a biomarker or neural circuit. It is hard to imagine how Dr. Kane’s study (or one like it) would get funding today, since it does not do this. In fact, psychiatry at present has yet to adequately identify any specific biomarkers or circuits for its major illnesses.
markfrankel18

The Return of History - The New York Times - 1 views

  • That the Islamic State has made violent use of history shouldn’t come as a surprise. Perhaps more surprising is that in all those places where a modern nation has been grafted onto an ancient culture, history has returned with a vengeance. From Confucian China to Buddhist Myanmar to Hindu India, history has become the source of a fierce new conservatism that is being used to curb freedoms of women and stoke hatred of minorities. As the ultimate source of legitimacy, history has become a way for modernizing societies to procure the trappings of modernity while guarding themselves from its values.
markfrankel18

Disputing Korean Narrative on 'Comfort Women,' a Professor Draws Fierce Backlash - The ... - 0 views

  • When she published her book on Korean “comfort women” in 2013, Park Yu-ha wrote that she felt “a bit fearful” of how it might be received. After all, she said, it challenged “the common knowledge” about the wartime sex slaves. But even she was not prepared for the severity of the backlash. In February, a South Korean court ordered Ms. Park’s book, “Comfort Women of the Empire,” redacted in 34 sections where it found her guilty of defaming former comfort women with false facts. Ms. Park is also on trial on the criminal charge of defaming the aging women, widely accepted here as an inviolable symbol of Korea’s suffering under colonial rule by Japan and its need for historical justice, and she is being sued for defamation by some of the women themselves.
markfrankel18

It's not just climate-change deniers-conservatives and liberals distrust science equall... - 1 views

  • we not only discount or dismiss scientific information inconsistent with our ideology; we may also distrust and attack its source(s).
  • Furthermore, and contrary to popular belief, this biased processing is most likely to occur among people who have greater cognitive and reasoning capabilities–not less. Where the two sets of explanations for ideological divides on science differ is on how motivated reasoning leads to bias.
markfrankel18

The Science of Why We Don't Believe Science - 2 views

  • "A MAN WITH A CONVICTION is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point." So wrote the celebrated Stanford University psychologist Leon Festinger [1] (PDF), in a passage that might have been referring to climate change denial—the persistent rejection, on the part of so many Americans today, of what we know about global warming and its human causes. But it was too early for that—this was the 1950s—and Festinger was actually describing a famous case study [2] in psychology. Festinger and several of his colleagues had infiltrated the Seekers, a small Chicago-area cult whose members thought they were communicating with aliens—including one, "Sananda," who they believed was the astral incarnation of Jesus Christ. The group was led by Dorothy Martin, a Dianetics devotee who transcribed the interstellar messages through automatic writing.
  • In the annals of denial, it doesn't get much more extreme than the Seekers. They lost their jobs, the press mocked them, and there were efforts to keep them away from impressionable young minds. But while Martin's space cult might lie at on the far end of the spectrum of human self-delusion, there's plenty to go around. And since Festinger's day, an array of new discoveries in psychology and neuroscience has further demonstrated how our preexisting beliefs, far more than any new facts, can skew our thoughts and even color what we consider our most dispassionate and logical conclusions. This tendency toward so-called "motivated reasoning [5]" helps explain why we find groups so polarized over matters where the evidence is so unequivocal: climate change, vaccines, "death panels," the birthplace and religion of the president [6] (PDF), and much else. It would seem that expecting people to be convinced by the facts flies in the face of, you know, the facts.
Lawrence Hrubes

'Son of Saul,' Kierkegaard and the Holocaust - The New York Times - 1 views

  • The spectacular success of science in the past 300 years has raised hopes that it also holds the key to guiding human beings towards a good life. Psychology and neuroscience have become a main source of life advice in the popular media. But philosophers have long held reservations about this scientific orientation to how to live life.
  • The 18th-century Scottish philosopher David Hume, for instance, famously pointed out that no amount of fact can legislate value, moral or otherwise. You cannot derive ought from is.
  • Science is the best method we have for approaching the world objectively. But in fact it is not science per se that is the problem, from the point of view of subjectivity. It is objectivizing, in any of its forms. One can frame a decision, for example, in objective terms. One might decide between career choices by weighing differences in workloads, prestige, pay and benefits between, say, working for an advanced technology company versus working for a studio in Hollywood. We are often encouraged to make choices by framing them in this way. Alternatively, one might try to frame the decision more in terms of what it might be like to work in either occupation; in this case, one needs to have the patience to dwell in experience long enough for one’s feelings about either alternative to emerge. In other words, one might deliberate subjectively.
  • Most commonly, we turn our back on subjectivity to escape from pain. Suffering, one’s own, or others’, might become bearable, one hopes, when one takes a step back and views it objectively, conceptually, abstractly. And when it comes to something as monumental as the Holocaust, one’s mind cannot help but be numbed by the sheer magnitude of it. How could one feel the pain of all those people, sympathize with millions? Instead one is left with the “facts,” the numbers.
markfrankel18

How an Archive of the Internet Could Change History - The New York Times - 0 views

  • Building an archive has always required asking a couple of simple but thorny questions: What will we save and how? Whose stories are the most important and why? In theory, the internet already functions as a kind of archive: Any document, video or photo can in principle remain there indefinitely, available to be viewed by anyone with a connection. But in reality, things disappear constantly.
  • But there’s still a low-grade urgency to save our social media for posterity — and it’s particularly urgent in cases in which social media itself had a profound influence on historic events.
  • Social media might one day offer a dazzling, and even overwhelming, array of source material for historians. Such an abundance presents a logistical challenge (the total number of tweets ever written is nearing half a trillion) as well as an ethical one (will people get to opt out of having ephemeral thoughts entered into the historical record?). But this plethora of new media and materials may function as a totally new type of archive: a multidimensional ledger of events that academics, scholars, researchers and the general public can parse to generate a more prismatic recollection of history.
markfrankel18

The Dangers of Certainty: A Lesson From Auschwitz - NYTimes.com - 1 views

  • The ascent of man was secured through scientific creativity. But unlike many of his more glossy and glib contemporary epigones, Dr. Bronowski was never reductive in his commitment to science. Scientific activity was always linked to artistic creation. For Bronowski, science and art were two neighboring mighty rivers that flowed from a common source: the human imagination. Newton and Shakespeare, Darwin and Coleridge, Einstein and Braque: all were interdependent facets of the human mind and constituted what was best and most noble about the human adventure.
  • For Dr. Bronowski, the moral consequence of knowledge is that we must never judge others on the basis of some absolute, God-like conception of certainty.
  • At this point, in the final minutes of the show, the scene suddenly shifts to Auschwitz, where many members of Bronowski’s family were murdered. Then this happened. Please stay with it. This short video from the show lasts only four minutes or so. [Video: Dr. Jacob Bronowski's argument against certainty, made at Auschwitz for his show "The Ascent of Man." Watch on YouTube.] It is, I am sure you agree, an extraordinary and moving moment. Bronowski dips his hand into the muddy water of a pond which contained the remains of his family members and the members of countless other families. All victims of the same hatred: the hatred of the other human being. By contrast, he says — just before the camera hauntingly cuts to slow motion — “We have to touch people.”
markfrankel18

When are you dead? - 2011 SPRING - Stanford Medicine Magazine - Stanford University Sch... - 0 views

  • A little more than 40 years ago, a partially functioning brain would not have gotten in the way of organ donation; irreversible cardiopulmonary failure was still the only standard for determining death. But during the 1970s, that began to change, and by the early 1980s, the cessation of all brain activity — brain death — had become a widely accepted standard. In the transplant community, brain death was attractive for one particular reason: The bodies of such donors could remain on respirators to keep their organs healthy, even during much of the organ-removal surgery. Today, the medical establishment, facing a huge shortage of organs, needs new sources for transplantation. One solution has been a return to procuring organs from patients who die of heart failure. Before dying, these patients are likely to have been in a coma, sustained by a ventilator, with very minimal brain function — a hopeless distance from what we mean by consciousness. Still, many people, including some physicians, consider this type of organ donation, known as “donation after cardiac death” or DCD, as akin to murder.
markfrankel18

And the Word of the Year Is... Selfie! : The New Yorker - 0 views

  • Hold on to your monocles, friends—the Oxford Dictionaries Word of the Year for 2013 is “selfie.” It’s an informal noun (plural: selfies) defined as “a photograph that one has taken of oneself, typically one taken with a smartphone or webcam and uploaded to a social media website.” It was first used in 2002, in an Australian online forum (compare the Australian diminutives “barbie” for barbecue and “firie” for firefighter), and it first appeared as a hashtag, #selfie, on Flickr, in 2004.
  • The word “selfie” is not yet in the O.E.D., but it is currently being considered for future inclusion; whether the word makes it into the history books is truly for the teens to decide. As Ben Zimmer wrote at Language Log, “Youth slang is the obvious source for much of our lexical innovation, like it or not.” And despite its cloying tone, that Oxford Dictionaries blog post from August does allude to the increasingly important distinction between “acronym” and “initialism”—either of which may describe the expression “LOL,” depending on whether you pronounce it “lawl” or “ell-oh-ell.” The kids are going to be all right. Not “alright.” But all right.
Michael Peters

Increasing Number Of Men Pressured To Accept Realistic Standards Of Female Beauty | The... - 0 views

  •  
    Satire, but totally on-point.
markfrankel18

You Need to Hear This Extremely Rare Recording  - The Message - Medium - 0 views

  • “Rare” is such a quizzical descriptor, a blatant contradiction of the very nature of digital culture. Rarity describes a state of scarcity, and as we enter a proto-post-scarcity economy, digital stuff defies such shortages. Things are no longer rare; they are either popular or unpopular. Rarity itself has become very rare.
Lawrence Hrubes

BBC News - Sleep's memory role discovered - 0 views

  • The mechanism by which a good night's sleep improves learning and memory has been discovered by scientists. The team in China and the US used advanced microscopy to witness new connections between brain cells - synapses - forming during sleep. Their study, published in the journal Science, showed even intense training could not make up for lost sleep. Experts said it was an elegant and significant study, which uncovered the mechanisms of memory. It is well known that sleep plays an important role in memory and learning. But what actually happens inside the brain has been a source of considerable debate.
markfrankel18

The End of 'Genius' - NYTimes.com - 0 views

  • WHERE does creativity come from? For centuries, we’ve had a clear answer: the lone genius. The idea of the solitary creator is such a common feature of our cultural landscape (as with Newton and the falling apple) that we easily forget it’s an idea in the first place. But the lone genius is a myth that has outlived its usefulness. Fortunately, a more truthful model is emerging: the creative network, as with the crowd-sourced Wikipedia or the writer’s room at “The Daily Show” or — the real heart of creativity — the intimate exchange of the creative pair, such as John Lennon and Paul McCartney and myriad other examples with which we’ve yet to fully reckon.
  • The pair is the primary creative unit — not just because pairs produce such a staggering amount of work but also because they help us to grasp the concept of dialectical exchange. At its heart, the creative process itself is about a push and pull between two entities, two cultures or traditions, or two people, or even a single person and the voice inside her head.