
TOK Friends: Group items tagged "error"


Javier E

The Social Side of Reasoning - NYTimes.com - 0 views

  • We have a very hard time sticking to rules of deductive logic, and we constantly make basic errors in statistical reasoning.  Most importantly, we are strongly inclined to “confirmation-bias”: we systematically focus on data that support a view we hold and ignore data that count against it.
  • These facts suggest that our evolutionary development has not done an especially good job of making us competent reasoners.  Sperber and Mercier, however, point out that this is true only if the point of reasoning is to draw true conclusions.
  • it makes sense to think that the evolutionary point of human reasoning is to win arguments, not to reach the truth.
  • ...7 more annotations...
  • The root of the dilemma is the distinction between seeking the truth and winning an argument.  The distinction makes sense for cases where someone does not care about knowing the truth and argues only to convince other people of something, whether or not it’s true.
  • how do I justify a belief and so come to know that it’s true?  There are competing philosophical answers to this question, but one fits particularly well with Sperber and Mercier’s approach.  This is the view that justification is a matter of being able to convince other people that a claim is correct
  • The key point is that justification — and therefore knowledge of the truth — is a social process.  This need not mean that claims are true because we come to rational agreement about them.  But such agreement, properly arrived at, is the best possible justification of a claim to truth. 
  • This pragmatic view understands seeking the truth as a special case of trying to win an argument: not winning by coercing or tricking people into agreement, but by achieving agreement through honest arguments.
  • The important practical conclusion is that finding the truth does require winning arguments, but not in the sense that my argument defeats yours.  Rather, we find an argument that defeats all contrary arguments.
  • the philosophical view gains plausibility from its convergence with the psychological account.
  • This symbiosis is an instructive example of how philosophy and empirical psychology can fruitfully interact.
Javier E

Don't Blink! The Hazards of Confidence - NYTimes.com - 0 views

  • people who face a difficult question often answer an easier one instead, without realizing it. We were required to predict a soldier’s performance in officer training and in combat, but we did so by evaluating his behavior over one hour in an artificial situation. This was a perfect instance of a general rule that I call WYSIATI, “What you see is all there is.” We had made up a story from the little we knew but had no way to allow for what we did not know about the individual’s future, which was almost everything that would actually matter.
  • the exaggerated expectation of consistency is a common error. We are prone to think that the world is more regular and predictable than it really is, because our memory automatically and continuously maintains a story about what is going on, and because the rules of memory tend to make that story as coherent as possible and to suppress alternatives.
  • The confidence we experience as we make a judgment is not a reasoned evaluation of the probability that it is right. Confidence is a feeling, one determined mostly by the coherence of the story and by the ease with which it comes to mind, even when the evidence for the story is sparse and unreliable. The bias toward coherence favors overconfidence. An individual who expresses high confidence probably has a good story, which may or may not be true.
  • ...1 more annotation...
  • When a compelling impression of a particular event clashes with general knowledge, the impression commonly prevails. And this goes for you, too. The confidence you will experience in your future judgments will not be diminished by what you just read, even if you believe every word.
julia rhodes

Brainlike Computers, Learning From Experience - NYTimes.com - 0 views

  • Computers have entered the age when they are able to learn from their own mistakes, a development that is about to turn the digital world on its head.
  • Not only can it automate tasks that now require painstaking programming — for example, moving a robot’s arm smoothly and efficiently — but it can also sidestep and even tolerate errors, potentially making the term “computer crash” obsolete.
  • The new computing approach, already in use by some large technology companies, is based on the biological nervous system, specifically on how neurons react to stimuli and connect with other neurons to interpret information.
  • ...6 more annotations...
  • In coming years, the approach will make possible a new generation of artificial intelligence systems that will perform some functions that humans do with ease: see, speak, listen, navigate, manipulate and control.
  • They are not “programmed.” Rather, the connections between the circuits are “weighted” according to correlations in data that the processor has already “learned.” Those weights are then altered as data flows into the chip, causing them to change their values and to “spike.” That generates a signal that travels to other components and, in reaction, changes the neural network, in essence programming the next actions much the same way that information alters human thoughts and actions. (A minimal sketch of this kind of weight update appears after this list.)
  • The new approach, used in both hardware and software, is being driven by the explosion of scientific knowledge about the brain. Kwabena Boahen, a computer scientist who leads Stanford’s Brains in Silicon research program, said that is also its limitation, as scientists are far from fully understanding how brains function.
  • “We’re moving from engineering computing systems to something that has many of the characteristics of biological computing,” said Larry Smarr
  • Traditional computers are also remarkably energy inefficient, especially when compared to actual brains, which the new neurons are built to mimic. I.B.M. announced last year that it had built a supercomputer simulation of the brain that encompassed roughly 10 billion neurons — more than 10 percent of a human brain. It ran about 1,500 times more slowly than an actual brain. Further, it required several megawatts of power, compared with just 20 watts of power used by the biological brain.
  • Running the program, known as Compass, which attempts to simulate a brain, at the speed of a human brain would require a flow of electricity in a conventional computer that is equivalent to what is needed to power both San Francisco and New York, Dr. Modha said.
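
The “weighted, not programmed” idea in the annotations above is easier to grasp in miniature. The following is only an illustrative Python sketch assuming a simple Hebbian-style update; the array sizes, firing threshold, learning rate and the helper name `hebbian_step` are all invented for the example, and none of this is the actual design of the chips the article describes.

```python
import numpy as np

def hebbian_step(weights, pre, post, lr=0.01):
    """Nudge each connection in proportion to how often its input and output fire together."""
    return weights + lr * np.outer(post, pre)

rng = np.random.default_rng(0)
weights = rng.normal(scale=0.1, size=(4, 8))      # 8 inputs feeding 4 units

for _ in range(1000):
    pre = (rng.random(8) < 0.2).astype(float)     # sparse incoming "spikes"
    post = (weights @ pre > 0.5).astype(float)    # units crossing threshold "spike" in turn
    weights = hebbian_step(weights, pre, post)    # the data, not a programmer, reshapes the network
```

The only point of the toy is that repeated exposure to correlated data changes the weights, and the changed weights change the network’s future responses, which is the sense in which such a system “learns” rather than being programmed.
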
Javier E

Does Everything Happen for a Reason? - NYTimes.com - 1 views

  • we asked people to reflect on significant events from their own lives, such as graduations, the births of children, falling in love, the deaths of loved ones and serious illnesses. Unsurprisingly, a majority of religious believers said they thought that these events happened for a reason and that they had been purposefully designed (presumably by God). But many atheists did so as well, and a majority of atheists in a related study also said that they believed in fate — defined as the view that life events happen for a reason and that there is an underlying order to life
  • British atheists were just as likely as American atheists to believe that their life events had underlying purposes, even though Britain is far less religious than America.
  • even young children show a bias to believe that life events happen for a reason — to “send a sign” or “to teach a lesson.” This belief exists regardless of how much exposure the children have had to religion at home, and even if they’ve had none at all.
  • ...6 more annotations...
  • This tendency to see meaning in life events seems to reflect a more general aspect of human nature: our powerful drive to reason in psychological terms, to make sense of events and situations by appealing to goals, desires and intentions
  • This drive serves us well when we think about the actions of other people, who actually possess these psychological states, because it helps us figure out why people behave as they do and to respond appropriately.
  • But it can lead us into error when we overextend it, causing us to infer psychological states even when none exist. This fosters the illusion that the world itself is full of purpose and design.
  • we found that highly paranoid people (who tend to obsess over other people’s hidden motives and intentions) and highly empathetic people (who think deeply about other people’s goals and emotions) are particularly likely to believe in fate and to believe that there are hidden messages and signs embedded in their own life events. In other words, the more likely people are to think about other people’s purposes and intentions, the more likely they are to also infer purpose and intention in human life itself.
  • the belief also has some ugly consequences. It tilts us toward the view that the world is a fundamentally fair place, where goodness is rewarded and badness punished. It can lead us to blame those who suffer from disease and who are victims of crimes, and it can motivate a reflexive bias in favor of the status quo — seeing poverty, inequality and oppression as reflecting the workings of a deep and meaningful plan.
  • even those who are devout should agree that, at least here on Earth, things just don’t naturally work out so that people get what they deserve. If there is such a thing as divine justice or karmic retribution, the world we live in is not the place to find it. Instead, the events of human life unfold in a fair and just manner only when individuals and society work hard to make this happen. We should resist our natural urge to think otherwise.
Javier E

New Truths That Only One Can See - NYTimes.com - 1 views

  • Replication, the ability of another lab to reproduce a finding, is the gold standard of science, reassurance that you have discovered something true. But that is getting harder all the time.
  • With the most accessible truths already discovered, what remains are often subtle effects, some so delicate that they can be conjured up only under ideal circumstances, using highly specialized techniques.
  • Taking into account the human tendency to see what we want to see, unconscious bias is inevitable.
  • ...8 more annotations...
  • He and his colleagues could not replicate 47 of 53 landmark papers about cancer. Some of the results could not be reproduced even with the help of the original scientists working in their own labs.
  • Paradoxically, the hottest fields, with the most people pursuing the same questions, are most prone to error, Dr. Ioannidis argued. If one of five competing labs is alone in finding an effect, that result is the one likely to be published. But there is a four in five chance that it is wrong. Papers reporting negative conclusions are more easily ignored. (A toy simulation after this list illustrates the arithmetic.)
  • The effect is amplified by competition for a shrinking pool of grant money and also by the design of so many experiments — with small sample sizes (cells in a lab dish or people in an epidemiological pool) and weak standards for what passes as statistically significant. That makes it all the easier to fool oneself.
  • The fear that much published research is tainted has led to proposals to make replication easier by providing more detailed documentation, including videos of difficult procedures.
  • A call for the establishment of independent agencies to replicate experiments has led to a backlash, a fear that perfectly good results will be thrown out.
  • Scientists talk about “tacit knowledge,” the years of mastery it can take to perform a technique. The image they convey is of an experiment as unique as a Rembrandt.
  • Embedded in the tacit knowledge may be barely perceptible tweaks and jostles — ways of unknowingly smuggling one’s expectations into the results, like a message coaxed from a Ouija board.
  • Exciting new results will continue to appear. But as the quarry becomes more elusive, the trophies are bound to be fewer and fewer.
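
Dr. Ioannidis’s four-in-five point in the annotations above can be made concrete with a toy simulation. The numbers below are assumptions chosen only for illustration (a 10 percent base rate of real effects, a 5 percent false-positive rate, 50 percent power, five competing labs), and the helper name `lone_positive_false_rate` is invented; this is a sketch of the general winner’s-curse logic, not a reconstruction of his actual analysis.

```python
import random

def lone_positive_false_rate(trials=100_000, base_rate=0.10, alpha=0.05, power=0.50, labs=5):
    """Among cases where exactly one lab reports an effect, how often is the effect not real?"""
    published = false_published = 0
    for _ in range(trials):
        effect_is_real = random.random() < base_rate
        p_hit = power if effect_is_real else alpha                 # real effects found 50% of the time,
        hits = sum(random.random() < p_hit for _ in range(labs))   # null effects "found" 5% of the time
        if hits == 1:                        # the lone "discovery" is the one that gets published
            published += 1
            if not effect_is_real:
                false_published += 1
    return false_published / published

print(f"Lone positive findings that are false: {lone_positive_false_rate():.0%}")
```

Under these invented inputs the lone positive turns out to be false most of the time, which is the intuition behind the excerpt’s warning that isolated findings in crowded fields are the ones most likely to be published and wrong.
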
Javier E

The Mental Virtues - NYTimes.com - 0 views

  • Even if you are alone in your office, you are thinking. Thinking well under a barrage of information may be a different sort of moral challenge than fighting well under a hail of bullets, but it’s a character challenge nonetheless.
  • some of the cerebral virtues. We can all grade ourselves on how good we are at each of them.
  • love of learning. Some people are just more ardently curious than others, either by cultivation or by nature.
  • ...12 more annotations...
  • courage. The obvious form of intellectual courage is the willingness to hold unpopular views. But the subtler form is knowing how much risk to take in jumping to conclusions.
  • Intellectual courage is self-regulation, Roberts and Wood argue, knowing when to be daring and when to be cautious. The philosopher Thomas Kuhn pointed out that scientists often simply ignore facts that don’t fit with their existing paradigms, but an intellectually courageous person is willing to look at things that are surprisingly hard to look at.
  • The median point between flaccidity and rigidity is the virtue of firmness. The firm believer can build a steady worldview on solid timbers but still delight in new information. She can gracefully adjust the strength of her conviction to the strength of the evidence. Firmness is a quality of mental agility.
  • humility, which is not letting your own desire for status get in the way of accuracy. The humble person fights against vanity and self-importance.
  • wisdom isn’t a body of information. It’s the moral quality of knowing how to handle your own limitations.
  • autonomy
  • Autonomy is the median of knowing when to bow to authority and when not to, when to follow a role model and when not to, when to adhere to tradition and when not to.
  • generosity. This virtue starts with the willingness to share knowledge and give others credit. But it also means hearing others as they would like to be heard, looking for what each person has to teach and not looking to triumphantly pounce upon their errors.
  • thinking well means pushing against the grain of our nature — against vanity, against laziness, against the desire for certainty, against the desire to avoid painful truths. Good thinking isn’t just adopting the right technique. It’s a moral enterprise and requires good character, the ability to go against our lesser impulses for the sake of our higher ones.
  • The humble researcher doesn’t become arrogant toward his subject, assuming he has mastered it. Such a person is open to learning from anyone at any stage in life.
  • Warren Buffett made a similar point in his own sphere, “Investing is not a game where the guy with the 160 I.Q. beats the guy with the 130 I.Q. Once you have ordinary intelligence, what you need is the temperament to control the urges that get other people into trouble.”
  • Good piece. I only wish David had written more about all the forces that work _against_ the virtues he describes. The innumerable examples of corporate suppression/spin of "inconvenient" truths (e.g., GM, Toyota, et al.); the virtual acceptance that lying is a legitimate tactic in political campaigns; our preoccupation with celebrity, appearances, and "looking good" in every imaginable transaction; all make the quiet virtues that DB describes even more heroic than he suggests.
carolinewren

2014 Was Hottest Year on Record, Surpassing 2010 - NYTimes.com - 0 views

  • Last year was the hottest in earth’s recorded history, scientists reported on Friday, underscoring scientific warnings about the risks of runaway emissions and undermining claims by climate-change contrarians that global warming had somehow stopped.
  • 2014 now surpasses 2010 as the warmest year in a global temperature record that stretches back to 1880. The 10 warmest years on record have all occurred since 1997, a reflection of the relentless planetary warming that scientists say is a consequence of human emissions and poses profound long-term risks to civilization and to the natural world.
  • Longstanding claims by climate-change skeptics that global warming has stopped, seized on by politicians in Washington to justify inaction on emissions, depend on a particular starting year: 1998, when an unusually powerful El Niño produced the hottest year of the 20th century.
  • ...3 more annotations...
  • John R. Christy, an atmospheric scientist at the University of Alabama in Huntsville who is known for his skepticism about the seriousness of global warming, pointed out in an interview that 2014 had surpassed the other record-warm years by only a few hundredths of a degree, well within the error margin of global temperature measurements.
  • indicates that global warming has not ‘stopped in 1998,’ as some like to falsely claim.”
  • It’s because the planet is warming. The basic issue is the long-term trend, and it is not going away.”
charlottedonoho

Daylight Savings Time May Affect Your Sleeping Cycle More Than You Think « CB... - 0 views

  • In a nation that’s already sleep deprived, an hour of lost sleep can take its toll.
  • “There are more chances of errors the next day, more chances of drowsy driving,” Walia said. “It has been shown that more accidents occur the next day when the daylight changing time occurs, so it can definitely have a significant impact on functioning as well.”
Javier E

Conservative Delusions About Inflation - NYTimes.com - 0 views

  • the stark partisan divide over issues that should be simply factual, like whether the planet is warming or evolution happened.
  • The problem, in other words, isn’t ignorance; it’s wishful thinking. Confronted with a conflict between evidence and what they want to believe for political and/or religious reasons, many people reject the evidence. And knowing more about the issues widens the divide, because the well informed have a clearer view of which evidence they need to reject to sustain their belief system.
  • the similar state of affairs when it comes to economics, monetary economics in particular.
  • ...7 more annotations...
  • Above all, there were many dire warnings about the evils of “printing money.” For example, in May 2009 an editorial in The Wall Street Journal warned that both interest rates and inflation were set to surge “now that Congress and the Federal Reserve have flooded the world with dollars.” In 2010 a virtual Who’s Who of conservative economists and pundits sent an open letter to Ben Bernanke warning that his policies risked “currency debasement and inflation.”
  • Although the Fed continued on its expansionary course — its balance sheet has grown to more than $4 trillion, up fivefold since the start of the crisis — inflation stayed low. For the most part, the funds the Fed injected into the economy simply piled up either in bank reserves or in cash holdings by individuals — which was exactly what economists on the other side of the divide had predicted would happen.
  • In fact, hardly any of the people who predicted runaway inflation have acknowledged that they were wrong, and that the error suggests something amiss with their approach. Some have offered lame excuses; some, following in the footsteps of climate-change deniers, have gone down the conspiracy-theory rabbit hole, claiming that we really do have soaring inflation, but the government is lying about the numbers
  • Mainly, though, the currency-debasement crowd just keeps repeating the same lines, ignoring its utter failure in prognostication.
  • Isn’t the question of how to manage the money supply a technical issue, not a matter of theological doctrine?
  • Well, it turns out that money is indeed a kind of theological issue. Many on the right are hostile to any kind of government activism, seeing it as the thin edge of the wedge — if you concede that the Fed can sometimes help the economy by creating “fiat money,” the next thing you know liberals will confiscate your wealth and give it to the 47 percent.
  • if you look at the internal dynamics of the Republican Party, it’s obvious that the currency-debasement, return-to-gold faction has been gaining strength even as its predictions keep failing.
johnsonma23

Ohio Men Wrongly Convicted of Murder After 39 Years Released - NBC News.com - 0 views

  • Two Ohio men wrongly accused of murder four decades ago are walking free Friday morning after spending 39 years behind bars.
  • Jackson had been the longest-held U.S. prisoner to be exonerated.
  • convicted along with Bridgeman and Bridgeman’s brother, Ronnie, in the 1975 shooting death and robbery of Harold Franks, a Cleveland-area money order salesman.
  • ...2 more annotations...
  • But that witness, now 53, recanted his testimony last year, saying he was coerced by detectives, according to court documents.
  • Jackson was originally sentenced to death but that sentence was vacated because of a paperwork error. The Bridgeman brothers remained on death row until Ohio declared the death penalty unconstitutional in 1978.
johnsonma23

What 'White Privilege' Really Means - NYTimes.com - 0 views

  • Critical philosophy of race, like critical race theory in legal studies, seeks to understand the disadvantages of nonwhite racial groups in society (blacks especially) by understanding social customs, laws, and legal practices.
  • What’s happening in Ferguson is the result of several recent historical factors and deeply entrenched racial attitudes, as well as a breakdown in participatory democracy.
  • It’s too soon to tell, but “Don’t Shoot” could become a real political movement — or it could peter out as the morally outraged self-expression of the moment, like Occupy Wall Street.
  • ...5 more annotations...
  • people in such circumstances would vote for political representatives on all levels of government who would be their advocates.
  • Middle-class and poor blacks in the United States do less well than whites with the same income on many measures of human well-being: educational attainment, family wealth, employment, health, longevity, infant mortality.
  • But the value of money pales in contrast to the tragedy this country is now forced to deal with. A tragedy is the result of a mistake, of an error in judgment that is based on habit and character, which brings ruin
  • People are now stopped by the police for suspicion of misdemeanor offenses and those encounters quickly escalate. The death of Michael Brown, like the death of Trayvon Martin before him and the death of Oscar Grant before him, may be but the tip of an iceberg.
  • Exactly why unarmed young black men are the target of choice, as opposed to unarmed young white women, or unarmed old black women, or even unarmed middle-aged college professors, is an expression of a long American tradition of suspicion and terrorization of members of those groups who have the lowest status in our society.
Javier E

André Glucksmann, French Philosopher Who Renounced Marxism, Dies at 78 - The ... - 0 views

  • In 1975, in “The Cook and the Cannibal,” Mr. Glucksmann subjected Marxism to a scalding critique. Two years later, he broadened his attack in his most influential work, “The Master Thinkers,” which drew a direct line from the philosophies of Marx, Hegel, Fichte and Nietzsche to the enormities of Nazism and Soviet Communism. It was they, he wrote in his conclusion, who “erected the mental apparatus which is indispensable for launching the grand final solutions of the 20th century.”
  • An instant best seller, the book put him in the company of several like-minded former radicals, notably Bernard-Henri Lévy and Pascal Bruckner. Known as the nouveaux philosophes, a term coined by Mr. Lévy, they became some of France’s most prominent public intellectuals, somewhat analogous to the neoconservatives in the United States, but with a lingering leftist orientation.
  • Their apostasy sent shock waves through French intellectual life, and onward to Moscow, which depended on the cachet afforded by Jean-Paul Sartre and other leftist philosophers
  • ...8 more annotations...
  • “It was André Glucksmann who dealt the decisive blow to Communism in France,”
  • “In the West, he presented the anti-totalitarian case more starkly and more passionately than anyone else in modern times,
  • “He was a passionate defender of the superoppressed, whether it was the prisoners of the Gulag, the Bosnians and Kosovars, gays during the height of the AIDS crisis, the Chechens under Putin or the Iraqis under Saddam,” he said. “When he turned against Communism, it was because he realized that Communists were not on the same side.”
  • After earning the teaching degree known as an agrégation from the École Normale Supérieure de Saint-Cloud in 1961, Mr. Glucksmann enrolled in the National Center for Scientific Research to pursue a doctorate under Raymond Aron — an odd matchup because Aron was France’s leading anti-Marxist intellectual.
  • His subsequent turn away from Marxism made him a reviled figure on the left, and former comrades looked on aghast as he became one of France’s most outspoken defenders of the United States. He argued for President Ronald Reagan’s policy of nuclear deterrence toward the Soviet Union, intervention in the Balkans and both American invasions of Iraq. In 2007, he supported the candidacy of Nicolas Sarkozy for the French presidency.
  • “There is the Glucksmann who was right and the Glucksmann who could — with the same fervor, the same feeling of being in the right — be wrong,” Mr. Lévy wrote in a posthumous appreciation for Le Monde. “What set him apart from others under such circumstances is that he would admit his error, and when he came around he was fanatical about studying his mistake, mulling it over, understanding it.”
  • In his most recent book, “Voltaire Counterattacks,” published this year, he positioned France’s greatest philosopher, long out of favor, as a penetrating voice perfectly suited to the present moment.
  • “I think thought is an individual action, not one of a party,” Mr. Glucksmann told The Chicago Tribune in 1991. “First you think. And if that corresponds with the Left, then you are of the Left; if Right, then you are of the Right. But this idea of thinking Left or Right is a sin against the spirit and an illusion.”
katrinaskibicki

Revolutionary discovery: Scientists find gravitational waves Einstein predicted - 0 views

  • Einstein has been proven right – again. For the first time ever, scientists have directly detected gravitational waves, bizarre ripples in space-time foreseen by Einstein a century ago. The discovery was the final, acid test of Einstein’s celebrated general theory of relativity, and once again Einstein’s genius held up to scrutiny.
  • The waves in question arose during the close approach of two black holes some 1.3 billion years ago, when multicellular life began to spread on Earth. Traveling at the speed of light, the waves reached our planet in September -- precisely when an observatory built to detect them was emerging from a long hiatus.
  • ...2 more annotations...
  • When scientists first saw the data suggesting that they’d captured a gravitational wave, they thought the results seemed too good to be true. Past claims of gravitational waves have proven unreliable, and there are many possible sources of error.
  • Gravitational waves confirmed: Astrophysicists have announced the discovery of gravitational waves, ripples that travel at the speed of light through the fabric of space-time. A 1916 theory of Albert Einstein’s predicted their existence.
  • A new scientific discovery shows that Einstein's predictions were correct, yet again!
kushnerha

Diversity Makes You Brighter - The New York Times - 0 views

  • Diversity improves the way people think. By disrupting conformity, racial and ethnic diversity prompts people to scrutinize facts, think more deeply and develop their own opinions. Our findings show that such diversity actually benefits everyone, minorities and majority alike.
  • When trading, participants could observe the behavior of their counterparts and decide what to make of it. Think of yourself in similar situations: Interacting with others can bring new ideas into view, but it can also cause you to adopt popular but wrong ones.
  • It depends how deeply you contemplate what you observe. So if you think that something is worth $100, but others are bidding $120 for it, you may defer to their judgment and up the ante (perhaps contributing to a price bubble) or you might dismiss them and stand your ground.
  • ...6 more annotations...
  • When participants were in diverse company, their answers were 58 percent more accurate. The prices they chose were much closer to the true values of the stocks. As they spent time interacting in diverse groups, their performance improved. In homogeneous groups, whether in the United States or in Asia, the opposite happened. When surrounded by others of the same ethnicity or race, participants were more likely to copy others, in the wrong direction. Mistakes spread as participants seemingly put undue trust in others’ answers, mindlessly imitating them. In the diverse groups, across ethnicities and locales, participants were more likely to distinguish between wrong and accurate answers. Diversity brought cognitive friction that enhanced deliberation.
  • For our study, we intentionally chose a situation that required analytical thinking, seemingly unaffected by ethnicity or race. We wanted to understand whether the benefits of diversity stem, as the common thinking has it, from some special perspectives or skills of minorities.
  • What we actually found is that these benefits can arise merely from the very presence of minorities.
  • before participants interacted, there were no statistically significant differences between participants in the homogeneous or diverse groups. Minority members did not bring some special knowledge.
  • When surrounded by people “like ourselves,” we are easily influenced, more likely to fall for wrong ideas. Diversity prompts better, critical thinking. It contributes to error detection. It keeps us from drifting toward miscalculation.
  • Our findings suggest that racial and ethnic diversity matter for learning, the core purpose of a university. Increasing diversity is not only a way to let the historically disadvantaged into college, but also to promote sharper thinking for everyone.
kushnerha

The Words That Killed Medieval Jews - The New York Times - 0 views

  • DO harsh words lead to violent acts? At a moment when hate speech seems to be proliferating, it’s a question worth asking.
  • worry that heated anti-Muslim political rhetoric would spark an increase in attacks against Muslims.
  • Some claim that last month’s mass shooting in Colorado Springs was provoked by Carly Fiorina’s assertion that Planned Parenthood was “harvesting baby parts”; Mrs. Fiorina countered that language could not be held responsible for the deeds of a “deranged” man.
  • ...12 more annotations...
  • beating of a homeless Hispanic man in Boston, allegedly inspired by Donald J. Trump’s anti-immigration rhetoric, and by the shooting deaths of police officers in California, Texas and Illinois, which some have attributed to anti-police sentiment expressed at Black Lives Matter protests.
  • history does show that a heightening of rhetoric against a certain group can incite violence against that group, even when no violence is called for. When a group is labeled hostile and brutal, its members are more likely to be treated with hostility and brutality. Visual images are particularly powerful, spurring actions that may well be unintended by the images’ creators.
  • Official Christian theology and policy toward Jews remained largely unchanged in the Middle Ages. Over roughly 1,000 years, Christianity condemned the major tenets of Judaism and held “the Jews” responsible for the death of Jesus. But the terms in which these ideas were expressed changed radically.
  • Before about 1100, Christian devotions focused on Christ’s divine nature and triumph over death. Images of the crucifixion showed Jesus alive and healthy on the cross. For this reason, his killers were not major focuses in Christian thought. No anti-Jewish polemics were composed during these centuries
  • In an effort to spur compassion among Christian worshipers, preachers and artists began to dwell in vivid detail on Christ’s pain. Christ morphed from triumphant divine judge to suffering human savior. A parallel tactic, designed to foster a sense of Christian unity, was to emphasize the cruelty of his supposed tormentors, the Jews.
  • The “Goad of Love,” a retelling of the crucifixion that is considered the first anti-Jewish Passion treatise, was written around 1155-80. It describes Jews as consumed with sadism and blood lust. They were seen as enemies not only of Christ, but also of living Christians; it was at this time that Jews began to be accused of ritually sacrificing Christian children.
  • Ferocious anti-Jewish rhetoric began to permeate sermons, plays and polemical texts. Jews were labeled demonic and greedy. In one diatribe, the head of the most influential monastery in Christendom thundered at the Jews: “Why are you not called brute animals? Why not beasts?” Images began to portray Jews as hooknosed caricatures of evil.
  • the First Crusade had called only for an “armed pilgrimage” to retake Jerusalem from Muslims, the first victims of the Crusade were not the Turkish rulers of Jerusalem but Jewish residents of the German Rhineland. Contemporary accounts record the crusaders asking why, if they were traveling to a distant land to “kill and to subjugate all those kingdoms that do not believe in the Crucified,” they should not also attack “the Jews, who killed and crucified him?”
  • At no point did Christian authorities promote or consent to the violence. Christian theology, which applied the Psalm verse “Slay them not” to Jews, and insisted that Jews were not to be killed for their religion, had not changed. Clerics were at a loss to explain the attacks. A churchman from a nearby town attributed the massacres to “some error of mind.”
  • But not all the Rhineland killers were crazy. The crusaders set out in the Easter season. Both crusade and Easter preaching stirred up rage about the crucifixion and fear of hostile and threatening enemies.
  • Sometimes the perpetrators were zealous holy warriors, sometimes they were opportunistic business rivals, sometimes they were parents grieving for children lost to accident or crime, or fearful of the ravages of a new disease.
  • Some may well have been insane. But sane or deranged, they did not pick their victims in a vacuum. It was repeated and dehumanizing excoriation that led those medieval Christians to attack people who had long been their neighbors.
paisleyd

'Brain training' app may improve memory, daily functioning of people with schizophrenia... - 0 views

  • A 'brain training' iPad game developed and tested by researchers at the University of Cambridge may improve the memory of patients with schizophrenia
  • Schizophrenia is a long-term mental health condition that causes a range of psychological symptoms, ranging from changes in behaviour through to hallucinations and delusions
  • patients are still left with debilitating cognitive impairments, including in their memory
  • ...10 more annotations...
  • increasing evidence that computer-assisted training and rehabilitation can help people with schizophrenia overcome some of their symptoms
  • Schizophrenia is estimated to cost £13.1 billion per year in total in the UK, so even small improvements in cognitive functions could help patients make the transition to independent living
  • The game, Wizard, was the result of a nine-month collaboration between psychologists, neuroscientists, a professional game-developer and people with schizophrenia
  • The memory task was woven into a narrative in which the player was allowed to choose their own character and name; the game rewarded progress with additional in-game activities to provide the user with a sense of progression independent of the cognitive training process
  • Participants in the training group played the memory game for a total of eight hours over a four-week period; participants in the control group continued their treatment as usual. At the end of the four weeks, the researchers tested all participants' episodic memory using the Cambridge Neuropsychological Test Automated Battery (CANTAB) PAL, as well as their level of enjoyment and motivation, and their score on the Global Assessment of Functioning (GAF) scale
  • patients who had played the memory game made significantly fewer errors and needed significantly fewer attempts to remember the location of different patterns in the CANTAB PAL test relative to the control group. In addition, patients in the cognitive training group saw an increase in their score on the GAF scale
  • Because the game is interesting, even those patients with a general lack of motivation are spurred on to continue the training
  • used in conjunction with medication and current psychological therapies, this could help people with schizophrenia minimise the impact of their illness on everyday life
  • It is not clear exactly how the apps also improved the patients' daily functioning, but the researchers suggest it may be because improvements in memory had a direct impact on global functions or that the cognitive training may have had an indirect impact on functionality by improving general motivation and restoring self-esteem
  • This new app will allow the Wizard memory game to become widely available, inexpensively. State-of-the-art neuroscience at the University of Cambridge, combined with the innovative approach at Peak, will help bring the games industry to a new level and promote the benefits of cognitive enhancement
kushnerha

BBC - Future - What Sherlock Holmes taught us about the mind - 0 views

  • The century-old detective stories are being studied by today’s neurologists – but why? As it turns out, not even modern technology can replace their lessons in rational thinking.
  • Arthur Conan Doyle was a physician himself, and there is evidence that he modelled the character of Holmes on one of the leading doctors of the day, Joseph Bell of the Royal Edinburgh Infirmary. “I thought I would try my hand at writing a story where the hero would treat crime as Dr Bell treated disease,”
  • Conan Doyle may have also drawn some inspiration from other doctors, such as William Gowers, who wrote the Bible of Neurology
  • ...11 more annotations...
  • Gowers often taught his students to begin their diagnosis from the moment a patient walked through the door
  • “Did you notice him as he came into the room? If you did not then you should have done so. One of the habits to be acquired and never omitted is to observe a patient as he enters the room; to note his aspect and his gait. If you did so, you would have seen that he seemed lame, and you may have been struck by that which must strike you now – an unusual tint of his face.”
  • the importance of the seemingly inconsequential that seems to inspire both men. “It has long been an axiom of mine that the little things are infinitely the most important,” Conan Doyle wrote
  • Both Gowers and Holmes also warned against letting your preconceptions fog your judgement. For both men, cool, unprejudiced observation was the order of the day. It is for this reason that Holmes chastises Watson in “A Scandal in Bohemia”: “You see, but you do not observe. The distinction is clear.”
  • Gowers: “The method you should adopt is this: Whenever you find yourself in the presence of a case that is not familiar to you in all its detail forget for a time all your types and all your names. Deal with the case as one that has never been seen before, and work it out as a new problem sui generis, to be investigated as such.”
  • both men “reasoned backwards”, for instance, dissecting all the possible paths that may have led to a particular disease (in Gowers’ case) or murder (in Holmes’)
  • Holmes’ most famous aphorism: “When you have eliminated the impossible, whatever remains, however improbable, must be the truth.”
  • the most important lesson to be learned, from both Gowers and Holmes, is the value of recognising your errors. “Gentlemen – It is always pleasant to be right, but it is generally a much more useful thing to be wrong,” wrote Gowers
  • This humility is key in beating the ‘curse of expertise’ that afflicts so many talented and intelligent people.
  • University College London has documented many instances in which apparent experts in both medicine and forensic science have allowed their own biases to cloud their judgements – sometimes even in life or death situations.
  • Even the most advanced technology can never replace the powers of simple observation and rational deduction.
kushnerha

'Run, Hide, Fight' Is Not How Our Brains Work - The New York Times - 0 views

  • One suggestion, promoted by the Federal Bureau of Investigation and Department of Homeland Security, and now widely disseminated, is “run, hide, fight.” The idea is: Run if you can; hide if you can’t run; and fight if all else fails. This three-step program appeals to common sense, but whether it makes scientific sense is another question.
  • Underlying the idea of “run, hide, fight” is the presumption that volitional choices are readily available in situations of danger. But the fact is, when you are in danger, whether it is a bicyclist speeding at you or a shooter locked and loaded, you may well find yourself frozen, unable to act and think clearly.
  • Freezing is not a choice. It is a built-in impulse controlled by ancient circuits in the brain involving the amygdala and its neural partners, and is automatically set into motion by external threats. By contrast, the kinds of intentional actions implied by “run, hide, fight” require newer circuits in the neocortex.
  • ...7 more annotations...
  • Contemporary science has refined the old “fight or flight” concept — the idea that those are the two hard-wired options when in mortal danger — to the updated “freeze, flee, fight.”
  • Why do we freeze? It’s part of a predatory defense system that is wired to keep the organism alive. Not only do we do it, but so do other mammals and other vertebrates. Even invertebrates — like flies — freeze. If you are freezing, you are less likely to be detected if the predator is far away, and if the predator is close by, you can postpone the attack (movement by the prey is a trigger for attack)
  • The freezing reaction is accompanied by a hormonal surge that helps mobilize your energy and focus your attention. While the hormonal and other physiological responses that accompany freezing are there for good reason, in highly stressful situations the secretions can be excessive and create impediments to making informed choices.
  • Sometimes freezing is brief and sometimes it persists. This can reflect the particular situation you are in, but also your individual predisposition. Some people naturally have the ability to think through a stressful situation, or to even be motivated by it, and will more readily run, hide or fight as required.
  • we have created a version of this predicament using rats. The animals have been trained, through trial and error, to “know” how to escape in a certain dangerous situation. But when they are actually placed in the dangerous situation, some rats simply cannot execute the response — they stay frozen. If, however, we artificially shut down a key subregion of the amygdala in these rats, they are able to overcome the built-in impulse to freeze and use their “knowledge” about what to do.
  • shown that if people cognitively reappraise a situation, it can dampen their amygdala activity. This dampening may open the way for conceptually based actions, like “run, hide, fight,” to replace freezing and other hard-wired impulses.
  • How to encourage this kind of cognitive reappraisal? Perhaps we could harness the power of social media to conduct a kind of collective cultural training in which we learn to reappraise the freezing that occurs in dangerous situations. In most of us, freezing will occur no matter what. It’s just a matter of how long it will last.
Javier E

After the Fact - The New Yorker - 1 views

  • newish is the rhetoric of unreality, the insistence, chiefly by Democrats, that some politicians are incapable of perceiving the truth because they have an epistemological deficit: they no longer believe in evidence, or even in objective reality.
  • the past of proof is strange and, on its uncertain future, much in public life turns. In the end, it comes down to this: the history of truth is cockamamie, and lately it’s been getting cockamamier.
  • Michael P. Lynch is a philosopher of truth. His fascinating new book, “The Internet of Us: Knowing More and Understanding Less in the Age of Big Data,” begins with a thought experiment: “Imagine a society where smartphones are miniaturized and hooked directly into a person’s brain.” As thought experiments go, this one isn’t much of a stretch. (“Eventually, you’ll have an implant,” Google’s Larry Page has promised, “where if you think about a fact it will just tell you the answer.”) Now imagine that, after living with these implants for generations, people grow to rely on them, to know what they know and forget how people used to learn—by observation, inquiry, and reason. Then picture this: overnight, an environmental disaster destroys so much of the planet’s electronic-communications grid that everyone’s implant crashes. It would be, Lynch says, as if the whole world had suddenly gone blind. There would be no immediate basis on which to establish the truth of a fact. No one would really know anything anymore, because no one would know how to know. I Google, therefore I am not.
  • ...20 more annotations...
  • In England, the abolition of trial by ordeal led to the adoption of trial by jury for criminal cases. This required a new doctrine of evidence and a new method of inquiry, and led to what the historian Barbara Shapiro has called “the culture of fact”: the idea that an observed or witnessed act or thing—the substance, the matter, of fact—is the basis of truth and the only kind of evidence that’s admissible not only in court but also in other realms where truth is arbitrated. Between the thirteenth century and the nineteenth, the fact spread from law outward to science, history, and journalism.
  • Lynch isn’t terribly interested in how we got here. He begins at the arrival gate. But altering the flight plan would seem to require going back to the gate of departure.
  • Lynch thinks we are frighteningly close to this point: blind to proof, no longer able to know. After all, we’re already no longer able to agree about how to know. (See: climate change, above.)
  • Empiricists believed they had deduced a method by which they could discover a universe of truth: impartial, verifiable knowledge. But the movement of judgment from God to man wreaked epistemological havoc.
  • For the length of the eighteenth century and much of the nineteenth, truth seemed more knowable, but after that it got murkier. Somewhere in the middle of the twentieth century, fundamentalism and postmodernism, the religious right and the academic left, met up: either the only truth is the truth of the divine or there is no truth; for both, empiricism is an error.
  • That epistemological havoc has never ended: much of contemporary discourse and pretty much all of American politics is a dispute over evidence. An American Presidential debate has a lot more in common with trial by combat than with trial by jury,
  • came the Internet. The era of the fact is coming to an end: the place once held by “facts” is being taken over by “data.” This is making for more epistemological mayhem, not least because the collection and weighing of facts require investigation, discernment, and judgment, while the collection and analysis of data are outsourced to machines
  • “Most knowing now is Google-knowing—knowledge acquired online,”
  • We now only rarely discover facts, Lynch observes; instead, we download them.
  • “The Internet didn’t create this problem, but it is exaggerating it,”
  • nothing could be less well settled in the twenty-first century than whether people know what they know from faith or from facts, or whether anything, in the end, can really be said to be fully proved.
  • In his 2012 book, “In Praise of Reason,” Lynch identified three sources of skepticism about reason: the suspicion that all reasoning is rationalization, the idea that science is just another faith, and the notion that objectivity is an illusion. These ideas have a specific intellectual history, and none of them are on the wane.
  • Their consequences, he believes, are dire: “Without a common background of standards against which we measure what counts as a reliable source of information, or a reliable method of inquiry, and what doesn’t, we won’t be able to agree on the facts, let alone values.”
  • When we Google-know, Lynch argues, we no longer take responsibility for our own beliefs, and we lack the capacity to see how bits of facts fit into a larger whole
  • Essentially, we forfeit our reason and, in a republic, our citizenship. You can see how this works every time you try to get to the bottom of a story by reading the news on your smartphone.
  • what you see when you Google “Polish workers” is a function of, among other things, your language, your location, and your personal Web history. Reason can’t defend itself. Neither can Google.
  • Trump doesn’t reason. He’s a lot like that kid who stole my bat. He wants combat. Cruz’s appeal is to the judgment of God. “Father God, please . . . awaken the body of Christ, that we might pull back from the abyss,” he preached on the campaign trail. Rubio’s appeal is to Google.
  • Is there another appeal? People who care about civil society have two choices: find some epistemic principles other than empiricism on which everyone can agree or else find some method other than reason with which to defend empiricism
  • Lynch suspects that doing the first of these things is not possible, but that the second might be. He thinks the best defense of reason is a common practical and ethical commitment.
  • That, anyway, is what Alexander Hamilton meant in the Federalist Papers, when he explained that the United States is an act of empirical inquiry: “It seems to have been reserved to the people of this country, by their conduct and example, to decide the important question, whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force.”
Javier E

Opinion | Knowledge, Ignorance and Climate Change - The New York Times - 1 views

  • the value of being aware of our ignorance has been a recurring theme in Western thought: René Descartes said it’s necessary to doubt all things to build a solid foundation for science; and Ludwig Wittgenstein, reflecting on the limits of language, said that “the difficulty in philosophy is to say no more than we know.”
  • Sometimes, when it appears that someone is expressing doubt, what he is really doing is recommending a course of action. For example, if I tell you that I don’t know whether there is milk in the fridge, I’m not exhibiting philosophical wisdom — I’m simply recommending that you check the fridge before you go shopping.
  • According to NASA, at least 97 percent of actively publishing climate scientists think that “climate-warming trends over the past century are extremely likely caused by human activities.”
  • ...14 more annotations...
  • As a philosopher, I have nothing to add to the scientific evidence of global warming, but I can tell you how it’s possible to get ourselves to sincerely doubt things, despite abundant evidence to the contrary
  • scenarios suggest that it’s possible to feel as though you don’t know something even when possessing enormous evidence in its favor. Philosophers call scenarios like these “skeptical pressure” cases
  • In general, a skeptical pressure case is a thought experiment in which the protagonist has good evidence for something that he or she believes, but the reader is reminded that the protagonist could have made a mistake
  • If the story is set up in the right way, the reader will be tempted to think that the protagonist’s belief isn’t genuine knowledge
  • When presented with these thought experiments, some philosophy students conclude that what these examples show is that knowledge requires full-blown certainty. In these skeptical pressure cases, the evidence is overwhelming, but not 100 percent. It’s an attractive idea, but it doesn’t sit well with the fact that we ordinarily say we know lots of things with much lower probability.
  • Although there is no consensus about how it arises, a promising idea defended by the philosopher David Lewis is that skeptical pressure cases often involve focusing on the possibility of error. Once we start worrying and ruminating about this possibility, no matter how far-fetched, something in our brains causes us to doubt. The philosopher Jennifer Nagel aptly calls this type of effect “epistemic anxiety.”
  • In my own work, I have speculated that an extreme version of this phenomenon is operative in obsessive compulsive disorder
  • The standard response by climate skeptics is a lot like our reaction to skeptical pressure cases. Climate skeptics understand that 97 percent of scientists disagree with them, but they focus on the very tiny fraction of holdouts. As in the lottery case, this focus might be enough to sustain their skepticism.
  • Anti-vaccine proponents, for example, aware that medical professionals disagree with their position, focus on any bit of fringe research that might say otherwise.
  • Skeptical allure can be gripping. Piling on more evidence does not typically shake you out of it, just as making it even more probable that you will lose the lottery does not all of a sudden make you feel like you know your ticket is a loser.
  • One way to counter the effects of skepticism is to stop talking about “knowledge” and switch to talking about probabilities. Instead of saying that you don’t know some claim, try to estimate the probability that it is true. As hedge fund managers, economists, policy researchers, doctors and bookmakers have long been aware, the way to make decisions while managing risk is through probabilities. (A minimal worked example appears after this list.)
  • Once we switch to this perspective, claims to “not know,” like those made by Trump, lose their force and we are pushed to think more carefully about the existing data and engage in cost-benefit analyses.
  • It’s easy to say you don’t know, but it’s harder to commit to an actual low probability estimate in the face of overwhelming contrary evidence.
  • Socrates was correct that awareness of one’s ignorance is virtuous, but philosophers have subsequently uncovered many pitfalls associated with claims of ignorance. An appreciation of these issues can help elevate public discourse on important topics, including the future of our planet.
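
The switch from “I don’t know” to explicit probabilities that the excerpt above recommends is, in practice, ordinary expected-cost reasoning. The sketch below uses invented numbers purely for illustration (a possible loss of 100, a mitigation cost of 20, mitigation that removes 80 percent of the loss) and an invented helper name `expected_costs`; nothing in it comes from the article itself.

```python
def expected_costs(p, loss_if_true, mitigation_cost, mitigation_effectiveness=0.8):
    """Expected cost of doing nothing versus acting, given probability p that the claim is true."""
    do_nothing = p * loss_if_true
    act = mitigation_cost + p * loss_if_true * (1 - mitigation_effectiveness)
    return do_nothing, act

# At each of these probabilities, acting is cheaper in expectation than doing nothing,
# even though none of them amounts to certainty.
for p in (0.50, 0.90, 0.97):
    do_nothing, act = expected_costs(p, loss_if_true=100.0, mitigation_cost=20.0)
    print(f"p = {p:.2f}: do nothing -> {do_nothing:5.1f}, act -> {act:5.1f}")
```

Once a claim carries a number like this, saying “I don’t know” no longer settles anything; you have to commit to an estimate and weigh the costs, which is exactly the shift the piece argues for.
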