TOK Friends: Group items matching “Perspective” in title, tags, annotations or url

Are scientists blocking their own progress? - The Washington Post - 1 views

  • Max Planck won a Nobel prize for his revolutionary work in quantum mechanics, but it was his interest in the philosophy of science that led to what is now called “Planck’s Principle.” Planck argued that science was an evolving system of thought which changes slowly over time, fueled by the deaths of old ideas. As he wrote in his 1968 autobiography: “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”
  • Is our understanding of the world based in pure objective reason, or are the theories that underpin it shaped by generational biases? Do our most famous thinkers actually block new ideas from gaining ground?
  • A new paper published by the National Bureau of Economic Research suggests that fame does play a significant role in deciding when and whether new scientific ideas can gain traction. When a prominent scientist dies, the paper’s authors found, the number of articles published by his or her collaborators tends to fall “precipitously” in the years following the death — those supporters tend not to continue advocating for a once-famous scientist’s ideas once the scientist is gone.
  • the number of research articles written by other scientists — including those with opposing ideas — increases by 8 percent on average, implying that the work of these scientists had been stifled before, but that after the death of a ubiquitous figure, the field becomes more open to new ideas. The study also found that these new articles are less likely to cite previous research and are more likely to be cited by others in the field. Death signifies a changing of the guard
  • Our instinct is often to view science as a concrete tower, growing ever upward and built upon the immovable foundations of earlier pioneers.  Sir Isaac Newton famously characterized this as “standing on the shoulders of giants.”
  • Mid-20th century philosopher Thomas Kuhn was among the first to come to this conclusion, in his 1962 book “The Structure of Scientific Revolutions.” He argued that scientific theories appeared in punctuated “paradigm shifts,” in which the underlying assumptions of a field are questioned and eventually overthrown
  • Kuhn’s book was, to some extent, a paradigm shift in its own right. According to his logic, commonly held notions in science were bound to change and become outdated. What we believe today will tomorrow be revised, rewritten — and in the most extreme cases ridiculed.
  • the journal Nature earlier this year said scientific data is prone to bias because researchers design experiments and make observations in ways that support hypotheses
  • equally as important are simple shifts in perspective. It only takes one researcher seeing an accepted scientific model in a new light for a solidified paradigm to enter what Kuhn called a “crisis phase” and beg for alternative explanations
  • The NBER study shows that those who questioned consensus ought to be given the opportunity to make their case, not ignored, silenced or pushed to the back of the line.
  • We’re likely to see these “paradigm shifts” happen at a much faster rate as data and research become easier to share worldwide. For some, this reality might seem chaotic; for the truly curious, it is exhilarating. The result may be a more democratic version of science — one in which the progress of ideas doesn’t have to wait until the funeral of a great mind.

Philosophy's True Home - The New York Times - 0 views

  • We’ve all heard the argument that philosophy is isolated, an “ivory tower” discipline cut off from virtually every other progress-making pursuit of knowledge, including math and the sciences, as well as from the actual concerns of daily life. The reasons given for this are many. In a widely read essay in this series, “When Philosophy Lost Its Way,” Robert Frodeman and Adam Briggle claim that it was philosophy’s institutionalization in the university in the late 19th century that separated it from the study of humanity and nature, now the province of social and natural sciences.
  • This institutionalization, the authors claim, led it to betray its central aim of articulating the knowledge needed to live virtuous and rewarding lives. I have a different view: Philosophy isn’t separated from the social, natural or mathematical sciences, nor is it neglecting the study of goodness, justice and virtue, which was never its central aim.
  • identified philosophy with informal linguistic analysis. Fortunately, this narrow view didn’t stop them from contributing to the science of language and the study of law. Now long gone, neither movement defined the philosophy of its day and neither arose from locating it in universities.
  • The authors claim that philosophy abandoned its relationship to other disciplines by creating its own purified domain, accessible only to credentialed professionals. It is true that from roughly 1930 to 1950, some philosophers — logical empiricists, in particular — did speak of philosophy having its own exclusive subject matter. But since that subject matter was logical analysis aimed at unifying all of science, interdisciplinarity was front and center.
  • Philosophy also played a role in 20th-century physics, influencing the great physicists Albert Einstein, Niels Bohr and Werner Heisenberg. The philosophers Moritz Schlick and Hans Reichenbach reciprocated that interest by assimilating the new physics into their philosophies.
  • developed ideas relating logic to linguistic meaning that provided a framework for studying meaning in all human languages. Others, including Paul Grice and J.L. Austin, explained how linguistic meaning mixes with contextual information to enrich communicative contents and how certain linguistic performances change social facts. Today a new philosophical conception of the relationship between meaning and cognition adds a further dimension to linguistic science.
  • Decision theory — the science of rational norms governing action, belief and decision under uncertainty — was developed by the 20th-century philosophers Frank Ramsey, Rudolf Carnap, Richard Jeffrey and others. It plays a foundational role in political science and economics by telling us what rationality requires, given our evidence, priorities and the strength of our beliefs. Today, no area of philosophy is more successful in attracting top young minds.
  • Philosophy also assisted psychology in its long march away from narrow behaviorism and speculative Freudianism. The mid-20th-century functionalist perspective pioneered by Hilary Putnam was particularly important. According to it, pain, pleasure and belief are neither behavioral dispositions nor bare neurological states. They are interacting internal causes, capable of very different physical realizations, that serve the goals of individuals in specific ways. This view is now embedded in cognitive psychology and neuroscience.
  • philosopher-mathematicians Gottlob Frege, Bertrand Russell, Kurt Gödel, Alonzo Church and Alan Turing invented symbolic logic, helped establish the set-theoretic foundations of mathematics, and gave us the formal theory of computation that ushered in the digital age
  • Philosophy of biology is following a similar path. Today’s philosophy of science is less accessible than Aristotle’s natural philosophy chiefly because it systematizes a larger, more technically sophisticated body of knowledge.
  • Philosophy’s interaction with mathematics, linguistics, economics, political science, psychology and physics requires specialization. Far from fostering isolation, this specialization makes communication and cooperation among disciplines possible. This has always been so.
  • Nor did scientific progress rob philosophy of its former scientific subject matter, leaving it to concentrate on the broadly moral. In fact, philosophy thrives when enough is known to make progress conceivable, but it remains unachieved because of methodological confusion. Philosophy helps break the impasse by articulating new questions, posing possible solutions and forging new conceptual tools.
  • Our knowledge of the universe and ourselves expands like a ripple surrounding a pebble dropped in a pool. As we move away from the center of the spreading circle, its area, representing our secure knowledge, grows. But so does its circumference, representing the border where knowledge blurs into uncertainty and speculation, and methodological confusion returns. Philosophy patrols the border, trying to understand how we got there and to conceptualize our next move.  Its job is unending.
  • Although progress in ethics, political philosophy and the illumination of life’s meaning has been less impressive than advances in some other areas, it is accelerating.
  • the advances in our understanding because of careful formulation and critical evaluation of theories of goodness, rightness, justice and human flourishing by philosophers since 1970 compare well to the advances made by philosophers from Aristotle to 1970
  • The knowledge required to maintain philosophy’s continuing task, including its vital connection to other disciplines, is too vast to be held in one mind. Despite the often-repeated idea that philosophy’s true calling can only be fulfilled in the public square, philosophers actually function best in universities, where they acquire and share knowledge with their colleagues in other disciplines. It is also vital for philosophers to engage students — both those who major in the subject, and those who do not. Although philosophy has never had a mass audience, it remains remarkably accessible to the average student; unlike the natural sciences, its frontiers can be reached in a few undergraduate courses.

Diversity Makes You Brighter - The New York Times - 0 views

  • Diversity improves the way people think. By disrupting conformity, racial and ethnic diversity prompts people to scrutinize facts, think more deeply and develop their own opinions. Our findings show that such diversity actually benefits everyone, minorities and majority alike.
  • When trading, participants could observe the behavior of their counterparts and decide what to make of it. Think of yourself in similar situations: Interacting with others can bring new ideas into view, but it can also cause you to adopt popular but wrong ones.
  • It depends how deeply you contemplate what you observe. So if you think that something is worth $100, but others are bidding $120 for it, you may defer to their judgment and up the ante (perhaps contributing to a price bubble) or you might dismiss them and stand your ground.
  • When participants were in diverse company, their answers were 58 percent more accurate. The prices they chose were much closer to the true values of the stocks. As they spent time interacting in diverse groups, their performance improved. In homogeneous groups, whether in the United States or in Asia, the opposite happened. When surrounded by others of the same ethnicity or race, participants were more likely to copy others, in the wrong direction. Mistakes spread as participants seemingly put undue trust in others’ answers, mindlessly imitating them. In the diverse groups, across ethnicities and locales, participants were more likely to distinguish between wrong and accurate answers. Diversity brought cognitive friction that enhanced deliberation.
  • For our study, we intentionally chose a situation that required analytical thinking, seemingly unaffected by ethnicity or race. We wanted to understand whether the benefits of diversity stem, as the common thinking has it, from some special perspectives or skills of minorities.
  • What we actually found is that these benefits can arise merely from the very presence of minorities.
  • before participants interacted, there were no statistically significant differences between participants in the homogeneous or diverse groups. Minority members did not bring some special knowledge.
  • When surrounded by people “like ourselves,” we are easily influenced, more likely to fall for wrong ideas. Diversity prompts better, critical thinking. It contributes to error detection. It keeps us from drifting toward miscalculation.
  • Our findings suggest that racial and ethnic diversity matter for learning, the core purpose of a university. Increasing diversity is not only a way to let the historically disadvantaged into college, but also to promote sharper thinking for everyone.

BBC - Future - The surprising downsides of being clever - 0 views

  • If ignorance is bliss, does a high IQ equal misery? Popular opinion would have it so. We tend to think of geniuses as being plagued by existential angst, frustration, and loneliness. Think of Virginia Woolf, Alan Turing, or Lisa Simpson – lone stars, isolated even as they burn their brightest. As Ernest Hemingway wrote: “Happiness in intelligent people is the rarest thing I know.”
  • Combing California’s schools for the creme de la creme, he selected 1,500 pupils with an IQ of 140 or more – 80 of whom had IQs above 170. Together, they became known as the “Termites”, and the highs and lows of their lives are still being studied to this day.
  • Termites’ average salary was twice that of the average white-collar job. But not all the group met Terman’s expectations – there were many who pursued more “humble” professions such as police officers, seafarers, and typists. For this reason, Terman concluded that “intellect and achievement are far from perfectly correlated”. Nor did their smarts endow personal happiness. Over the course of their lives, levels of divorce, alcoholism and suicide were about the same as the national average.
  • One possibility is that knowledge of your talents becomes something of a ball and chain. Indeed, during the 1990s, the surviving Termites were asked to look back at the events in their 80-year lifespan. Rather than basking in their successes, many reported that they had been plagued by the sense that they had somehow failed to live up to their youthful expectations.
  • The most notable, and sad, case concerns the maths prodigy Sufiah Yusof. Enrolled at Oxford University aged 12, she dropped out of her course before taking her finals and started waitressing. She later worked as a call girl, entertaining clients with her ability to recite equations during sexual acts.
  • Another common complaint, often heard in student bars and internet forums, is that smarter people somehow have a clearer vision of the world’s failings. Whereas the rest of us are blinkered from existential angst, smarter people lie awake agonising over the human condition or other people’s folly.
  • MacEwan University in Canada found that those with the higher IQ did indeed feel more anxiety throughout the day. Interestingly, most worries were mundane, day-to-day concerns, though; the high-IQ students were far more likely to be replaying an awkward conversation, than asking the “big questions”. “It’s not that their worries were more profound, but they are just worrying more often about more things,” says Penney. “If something negative happened, they thought about it more.”
  • seemed to correlate with verbal intelligence – the kind tested by word games in IQ tests, compared to prowess at spatial puzzles (which, in fact, seemed to reduce the risk of anxiety). He speculates that greater eloquence might also make you more likely to verbalise anxieties and ruminate over them. It’s not necessarily a disadvantage, though. “Maybe they were problem-solving a bit more than most people,” he says – which might help them to learn from their mistakes.
  • The harsh truth, however, is that greater intelligence does not equate to wiser decisions; in fact, in some cases it might make your choices a little more foolish.
  • we need to turn our minds to an age-old concept: “wisdom”. His approach is more scientific than it might at first sound. “The concept of wisdom has an ethereal quality to it,” he admits. “But if you look at the lay definition of wisdom, many people would agree it’s the idea of someone who can make good unbiased judgement.”
  • “my-side bias” – our tendency to be highly selective in the information we collect so that it reinforces our previous attitudes. The more enlightened approach would be to leave your assumptions at the door as you build your argument – but Stanovich found that smarter people are almost no more likely to do so than people with distinctly average IQs.
  • People who ace standard cognitive tests are in fact slightly more likely to have a “bias blind spot”. That is, they are less able to see their own flaws, even though they are quite capable of criticising the foibles of others. And they have a greater tendency to fall for the “gambler’s fallacy”
  • A tendency to rely on gut instincts rather than rational thought might also explain why a surprisingly high number of Mensa members believe in the paranormal; or why someone with an IQ of 140 is about twice as likely to max out their credit card.
  • “The people pushing the anti-vaccination meme on parents and spreading misinformation on websites are generally of more than average intelligence and education.” Clearly, clever people can be dangerously, and foolishly, misguided.
  • spent the last decade building tests for rationality, and he has found that fair, unbiased decision-making is largely independent of IQ.
  • Crucially, Grossmann found that IQ was not related to any of these measures, and certainly didn’t predict greater wisdom. “People who are very sharp may generate, very quickly, arguments [for] why their claims are the correct ones – but may do it in a very biased fashion.”
  • employers may well start testing these abilities in place of IQ; Google has already announced that it plans to screen candidates for qualities like intellectual humility, rather than sheer cognitive prowess.
  • He points out that we often find it easier to leave our biases behind when we consider other people, rather than ourselves. Along these lines, he has found that simply talking through your problems in the third person (“he” or “she”, rather than “I”) helps create the necessary emotional distance, reducing your prejudices and leading to wiser arguments.
  • If you’ve been able to rest on the laurels of your intelligence all your life, it could be very hard to accept that it has been blinding your judgement. As Socrates had it: the wisest person really may be the one who can admit he knows nothing.

The Science of Older and Wiser - The New York Times - 0 views

  • The question remains compelling: What is wisdom, and how does it play out in individual lives?
  • Most psychologists agree that if you define wisdom as maintaining positive well-being and kindness in the face of challenges, it is one of the most important qualities one can possess to age successfully — and to face physical decline and death.
  • Based on an analysis of their answers, she determined that wisdom consists of three key components: cognition, reflection and compassion.
  • While younger people were faster in tests of cognitive performance, older people showed “greater sensitivity to fine-grained differences,” the study found.
  • one must take time to gain insights and perspectives from one’s cognitive knowledge to be wise (the reflective dimension). Then one can use those insights to understand and help others (the compassionate dimension).
  • Wisdom, she has found, is the ace in the hole that can help even severely impaired people find meaning, contentment and acceptance in later life.
  • “Wise people are able to accept reality as it is, with equanimity,” Professor Ardelt said.
  • True personal wisdom involves five elements, said Professor Staudinger, now a life span psychologist and professor at Columbia University. They are self-insight; the ability to demonstrate personal growth; self-awareness in terms of your historical era and your family history; understanding that priorities and values, including your own, are not absolute; and an awareness of life’s ambiguities.
  • True wisdom involves recognizing the negative both within and outside ourselves and trying to learn from it, she said.
  • If you are wise, she said, “You’re not only regulating your emotional state, you’re also attending to another person’s emotional state.” She added: “You’re not focusing so much on what you need and deserve, but on what you can contribute.”
  • Continuing education can be an important way to cultivate wisdom in the later years, researchers say, for one thing because it combats isolation.
  • Reflecting on the meaning and structure of their lives, she said, can help people thrive after the balance shifts and there is much less time left than has gone before.

The Age of Protest - The New York Times - 0 views

  • If you go to The Guardian’s website these days you can find a section that is just labeled “Protest.” So now, with your morning coffee, you can get your news, weather, sports — and protests.
  • In my view, this age of protest is driven, in part, by the fact that the three largest forces on the planet — globalization, Moore’s law and Mother Nature — are all in acceleration, creating an engine of disruption that is stressing strong countries and middle classes and blowing up weak ones, while superempowering individuals and transforming the nature of work, leadership and government all at once.
  • When you get that much agitation in a world where everyone with a smartphone is now a reporter, news photographer and documentary filmmaker, it’s a wonder that every newspaper doesn’t have a “Protest” section.
  • “People everywhere seem to be morally aroused,” said Seidman. “The philosopher David Hume argued that ‘the moral imagination diminishes with distance.’ It would follow that the opposite is also true: As distance decreases, the moral imagination increases. Now that we have no distance — it’s like we’re all in a crowded theater, making everything personal — we are experiencing the aspirations, hopes, frustrations, plights of others in direct and visceral ways.”
  • “A dentist from Minnesota shoots a cherished lion in Zimbabwe named Cecil, and days later everyone in the world knows about it, triggering a tsunami of moral outrage on Twitter and Facebook. As a result, some people try to shut down his dental practice by posting negative reviews on Yelp and spray paint ‘Lion Killer’ on his Florida vacation home. Almost 400,000 people then sign a petition in one day on Change.org demanding that Delta Air Lines change their policy of transporting trophy kills. Delta does so and other airlines follow. And then hunters who contribute to Zimbabwe’s tourism industry protest the protest, claiming that they were being discriminated against.”
  • That we are becoming more morally aroused “is generally a good thing,” argued Seidman. Institutionalized racism in police departments, or in college fraternities, is real and had been tolerated for way too long. That it’s being called out is a sign of a society’s health “and re-engagement.”
  • But when moral arousal manifests as moral outrage, he added, “it can either inspire or repress a serious conversation or the truth.”
  • “If moral outrage, as justified as it may be, is followed immediately by demands for firings or resignations,” argued Seidman, “it can result in a vicious cycle of moral outrage being met with equal outrage, as opposed to a virtuous cycle of dialogue and the hard work of forging real understanding and enduring agreements.”
  • Furthermore, “when moral outrage skips over moral conversation, then the outcome is likely going to be acquiescence, not inspired solutions,” Seidman added. It can also feed the current epidemic of inauthentic apologies, “since apologies extracted under pressure are like telling a child, ‘Just say you’re sorry,’ to move past the issue without ever making amends.”
  • it’s as if “we’re living in a never-ending storm,” he said. Alas, though, resolving moral disputes “requires perspective, fuller context and the ability to make meaningful distinctions.”
  • requires leaders with the courage and empathy “to inspire people to pause to reflect, so that instead of reacting by yelling in 140 characters they can channel all this moral outrage into deep and honest conversations.”

Anxiety and Depression Are on an 80-Year Upswing -- Science of Us - 1 views

  • Ever since the 1930s, young people in America have reported feeling increasingly anxious and depressed. And no one knows exactly why. One of the researchers who has done the most work on this subject is Dr. Jean Twenge, a social psychologist at San Diego State University who is the author of Generation Me: Why Today’s Young Americans Are More Confident, Assertive, Entitled—and More Miserable Than Ever Before. She’s published a handful of articles on this trajectory, and the underlying story, she thinks, is a rather negative one. “I think the research tells us that modern life is not good for mental health,” she said.
  • The words “depression” and “anxiety” themselves, after all, mean very different things to someone asked about them in 1935 as compared to 1995, so surveys that invoke these concepts directly only have limited utility for longitudinal study. To get around this, Twenge prefers to rely on surveys and inventories in which respondents are asked about specific symptoms which are frequently correlated with anxiety and depression
  • Much of the richest data on this question, then, comes from the Minnesota Multiphasic Personality Inventory (MMPI), which has been administered to high school and college students since the 1930s — and which includes many questions about symptoms. Specifically, it asks — among many other things — whether respondents feel well-rested when they wake up, whether they have trouble thinking, and whether they have experienced dizzy spells, headaches, shortness of breath, a racing heart, and so on.
  • The trendlines are obvious: Asked the same questions at about the same points in their lives, Americans are, over time, experiencing worse and worse symptoms associated with anxiety and depression.
  • there’s an interesting recent wrinkle to this trajectory. In a paper published in 2014 in Social Indicators Research, Twenge tracked the results of the Monitoring the Future (MtF) survey, “a nationally representative sample of U.S. 12th graders [administered] every year since 1976,” between 1982 and 2013. Like the MMPI, the MtF asks students about symptoms in a manner that should be generally resistant to cultural change: The somatic items Twenge examined asked about trouble sleeping, remembering things, thinking/concentrating, and learning, as well as shortness of breath. An interesting recent pattern emerged on these measures:
  • All the items end up significantly higher than where they started, but for many of them most of the increase happens over the first half of the time period in question. From the late 1990s or so until 2013, many of the items bounce around a bit but ultimately remain flat, or flat-ish.
  • drugs — Prozac and Lexapro, among others — have been prescribed to millions of people who experience these symptoms, many of whom presumably saw some improvement once the drugs kicked in, so this explanation at least makes intuitive sense
  • there are likely other factors leading to the plateau as well, said Twenge. For one thing, the “crime rate is lower [today] than it was when it peaked in the early 1990s,” and dealing with crime can lead to anxiety and depression symptoms. Other indicators of youth well-being, like teen pregnancy, were also significantly higher back then, and could have accounted for the trajectory visible on the graphs.“For whatever reason,” said Twenge, “if you look at what was going on back then, the early 1990s were not a good time, particularly for young people.”
  • “Obviously there’s a lot of good things about societal and technological progress,” she said, “and in a lot of ways our lives are much easier than, say, our grandparents’ or great-grandparents’ lives. But there’s a paradox here that we seem to have so much ease and relative economic prosperity compared to previous centuries, yet there’s this dissatisfaction, there’s this unhappiness, there are these mental health issues in terms of depression and anxiety.
  • She thinks the primary problem is that “modern life doesn’t give us as many opportunities to spend time with people and connect with them, at least in person, compared to, say, 80 years ago or 100 years ago. Families are smaller, the divorce rate is higher, people get married much later in life.”
  • it may simply be the case that many people who lived in less equal, more “traditional” times were forced into close companionship with a lot of other people, and that this shielded them from certain psychological problems, whatever else was going on in their lives.
  • She was virtually never alone — and that can be a bad thing, clearly, but from a mental health perspective being surrounded by people is a good thing.”
  • the shift away from this sort of life has also brought with it a shift in values, and Twenge thinks that this, too, can account for the increase in anxiety and depression. “There’s clear evidence that the focus on money, fame, and image has gone up,
  • “and there’s also clear evidence that people who focus on money, fame, and image are more likely to be depressed and anxious.”
  • “It’s so tempting to say the world is going to hell in a handbasket and everything’s bad, but there are so many good things about modern life,” she said. So maybe the key message here is that while there’s no way to go back to family farms and young marriage and parenthood — and, from an equality standpoint, we wouldn’t want to anyway — modern life needs to do a better job of connecting people to one another, and encouraging them to adopt the sorts of goals and outlooks that will make them happy.

Opinion | Knowledge, Ignorance and Climate Change - The New York Times - 1 views

  • the value of being aware of our ignorance has been a recurring theme in Western thought: René Descartes said it’s necessary to doubt all things to build a solid foundation for science; and Ludwig Wittgenstein, reflecting on the limits of language, said that “the difficulty in philosophy is to say no more than we know.”
  • Sometimes, when it appears that someone is expressing doubt, what he is really doing is recommending a course of action. For example, if I tell you that I don’t know whether there is milk in the fridge, I’m not exhibiting philosophical wisdom — I’m simply recommending that you check the fridge before you go shopping.
  • According to NASA, at least 97 percent of actively publishing climate scientists think that “climate-warming trends over the past century are extremely likely caused by human activities.”
  • As a philosopher, I have nothing to add to the scientific evidence of global warming, but I can tell you how it’s possible to get ourselves to sincerely doubt things, despite abundant evidence to the contrary
  • scenarios suggest that it’s possible to feel as though you don’t know something even when possessing enormous evidence in its favor. Philosophers call scenarios like these “skeptical pressure” cases
  • In general, a skeptical pressure case is a thought experiment in which the protagonist has good evidence for something that he or she believes, but the reader is reminded that the protagonist could have made a mistake
  • If the story is set up in the right way, the reader will be tempted to think that the protagonist’s belief isn’t genuine knowledge
  • When presented with these thought experiments, some philosophy students conclude that what these examples show is that knowledge requires full-blown certainty. In these skeptical pressure cases, the evidence is overwhelming, but not 100 percent. It’s an attractive idea, but it doesn’t sit well with the fact that we ordinarily say we know lots of things with much lower probability.
  • Although there is no consensus about how it arises, a promising idea defended by the philosopher David Lewis is that skeptical pressure cases often involve focusing on the possibility of error. Once we start worrying and ruminating about this possibility, no matter how far-fetched, something in our brains causes us to doubt. The philosopher Jennifer Nagel aptly calls this type of effect “epistemic anxiety.”
  • In my own work, I have speculated that an extreme version of this phenomenon is operative in obsessive compulsive disorder
  • The standard response by climate skeptics is a lot like our reaction to skeptical pressure cases. Climate skeptics understand that 97 percent of scientists disagree with them, but they focus on the very tiny fraction of holdouts. As in the lottery case, this focus might be enough to sustain their skepticism.
  • Anti-vaccine proponents, for example, aware that medical professionals disagree with their position, focus on any bit of fringe research that might say otherwise.
  • Skeptical allure can be gripping. Piling on more evidence does not typically shake you out of it, just as making it even more probable that you will lose the lottery does not all of a sudden make you feel like you know your ticket is a loser.
  • One way to counter the effects of skepticism is to stop talking about “knowledge” and switch to talking about probabilities. Instead of saying that you don’t know some claim, try to estimate the probability that it is true. As hedge fund managers, economists, policy researchers, doctors and bookmakers have long been aware, the way to make decisions while managing risk is through probabilities.
  • Once we switch to this perspective, claims to “not know,” like those made by Trump, lose their force and we are pushed to think more carefully about the existing data and engage in cost-benefit analyses.
  • It’s easy to say you don’t know, but it’s harder to commit to an actual low probability estimate in the face of overwhelming contrary evidence.
  • Socrates was correct that awareness of one’s ignorance is virtuous, but philosophers have subsequently uncovered many pitfalls associated with claims of ignorance. An appreciation of these issues can help elevate public discourse on important topics, including the future of our planet.

When bias beats logic: why the US can't have a reasoned gun debate | US news | The Guar... - 1 views

  • Jon Stokes, a writer and software developer, said he is frustrated after each mass shooting by “the sentiment among very smart people, who are used to detail and nuance and doing a lot of research, that this is cut and dried, this is black and white”.
  • Stokes has lived on both sides of America’s gun culture war, growing up in rural Louisiana, where he got his first gun at age nine, and later studying at Harvard and the University of Chicago, where he adopted some of a big-city resident’s skepticism about guns. He’s written articles about the gun geek culture behind the popularity of the AR-15, why he owns a military-style rifle, and why gun owners are so skeptical of tech-enhanced “smart guns”.
  • Even to suggest that the debate is more complicated – that learning something about guns, by taking a course on how to safely carry a concealed weapon, or learning how to fire a gun, might shift their perspective on whichever solution they have just heard about on TV – “just upsets them, and they basically say you’re trying to obscure the issue”.
  • In early 2013, a few months after the mass shooting at Sandy Hook elementary school, a Yale psychologist created an experiment to test how political bias affects our reasoning skills. Dan Kahan was attempting to understand why public debates over social problems remain deadlocked, even when good scientific evidence is available. He decided to test a question about gun control.
  • Then Kahan ran the same test again. This time, instead of evaluating skin cream trials, participants were asked to evaluate whether a law banning citizens from carrying concealed firearms in public made crime go up or down. The result: when liberals and conservatives were confronted with a set of results that contradicted their political assumptions, the smartest people were barely more likely to arrive at the correct answer than the people with no math skills at all. Political bias had erased the advantages of stronger reasoning skills.
  • The reason that measurable facts were sidelined in political debates was not that people have poor reasoning skills, Kahan concluded. Presented with a conflict between holding to their beliefs or finding the correct answer to a problem, people simply went with their tribe.
  • It was a reasonable strategy on the individual level — and a “disastrous” one for tackling social change, he concluded.
  • But the biggest distortion in the gun control debate is the dramatic empathy gap between different kinds of victims. It’s striking how puritanical the American imagination is, how narrow its range of sympathy. Mass shootings, in which the perpetrator kills complete strangers at random in a public place, prompt an outpouring of grief for the innocent lives lost. These shootings are undoubtedly horrifying, but they account for a tiny percentage of America’s overall gun deaths each year.
  • The roughly 60 gun suicides each day, the 19 black men and boys lost each day to homicide, do not inspire the same reaction, even though they represent the majority of gun violence victims. Yet there are meaningful measures which could save lives here — targeted interventions by frontline workers in neighborhoods where the gun homicide rate is 400 times higher than in other developed countries, awareness campaigns to help gun owners in rural states learn about how to identify suicide risk and intervene with friends in trouble.
  • When it comes to suicide, “there is so much shame about that conversation … and where there is shame there is also denial,”
  • When young men of color are killed, “you have disdain and aggression,” fueled by the type of white supremacist argument which equates blackness with criminality.

Jonathan Franzen Is Fine With All of It - The New York Times - 0 views

  • If you’re in a state of perpetual fear of losing market share for you as a person, it’s just the wrong mind-set to move through the world with.” Meaning that if your goal is to get liked and retweeted, then you are perhaps molding yourself into the kind of person you believe will get those things, whether or not that person resembles the actual you. The writer’s job is to say things that are uncomfortable and hard to reduce. Why would a writer mold himself into a product?
  • And why couldn’t people hear him about the social effects this would have? “The internet is all about destroying the elite, destroying the gatekeepers,” he said. “The people know best. You take that to its conclusion, and you get Donald Trump. What do those Washington insiders know? What does the elite know?
  • So he decided to withdraw from it all. After publicity for “The Corrections” ended, he decided he would no longer read about himself — not reviews, not think pieces, not stories, and then, as they came, not status updates and not tweets. He didn’t want to hear reaction to his work. He didn’t want to see the myriad ways he was being misunderstood. He didn’t want to know what the hashtags were.
  • I stopped reading reviews because I noticed all I remember is the negatives. Whatever fleeting pleasure you have in someone applying a laudatory adjective to your book is totally washed away by the unpleasantness of remembering the negative things for the rest of your life verbatim.
  • Franzen thinks that there’s no way for a writer to do good work — to write something that can be called “consuming and extraordinarily moving” — without putting a fence around yourself so that you can control the input you encounter. So that you could have a thought that isn’t subject to pushback all the time from anyone who has ever met you or heard of you or expressed interest in hearing from you. Without allowing yourself to think for a minute.
  • It’s not just writers. It’s everyone. The writer is just an extreme case of something everyone struggles with. “On the one hand, to function well, you have to believe in yourself and your abilities and summon enormous confidence from somewhere. On the other hand, to write well, or just to be a good person, you need to be able to doubt yourself — to entertain the possibility that you’re wrong about everything, that you don’t know everything, and to have sympathy with people whose lives and beliefs and perspectives are very different from yours.”
  • “This balancing act” — the confidence that you know everything plus the ability to believe that you don’t — “only works, or works best, if you reserve a private space for it.”
  • Can you write clearly about something that you don’t yourself swim in? Don’t you have to endure it and hate it most of the time like the rest of us?
  • his answer was no. No. No, you absolutely don’t. You can miss a meme, and nothing really changes. You can be called fragile, and you will live. “I’m pretty much the opposite of fragile. I don’t need internet engagement to make me vulnerable. Real writing makes me — makes anyone doing it — vulnerable.”
  • Has anyone considered that the interaction is the fragility? Has anyone considered that letting other people define how you fill your day and what they fill your head with — a passive, postmodern stream of other people’s thoughts — is the fragility?

Higgs Boson Gets Nobel Prize, But Physicists Still Don't Know What It's Telling Them - ... - 2 views

  • This morning, two physicists who 50 years ago theorized the existence of this particle, which is responsible for conferring mass to all other known particles in the universe, got the Nobel, the highest prize in science.
  • left physicists without a clear roadmap of where to go next
  • No one is sure which of these models, if any, will eventually describe reality
  • Some of them look at the data and say that we need to throw out speculative ideas such as supersymmetry and the multiverse, models that look elegant mathematically but are unprovable from an experimental perspective. Others look at the exact same data and come to the opposite conclusion.
  • we’ve entered a very deep crisis.
  • Though happy to know the Higgs was there, many scientists had hoped it would turn out to be strange, to defy their predictions in some way and give a hint as to which models beyond the Standard Model were correct.
  • One possibility has been brought up that even physicists don’t like to think about. Maybe the universe is even stranger than they think. Like, so strange that even post-Standard Model models can’t account for it. Some physicists are starting to question whether or not our universe is natural.
  • The multiverse idea has two strikes against it, though. First, physicists would refer to it as an unnatural explanation because it simply happened by chance. And second, no real evidence for it exists and we have no experiment that could currently test for it.
  • physicists are still in the dark. We can see vague outlines ahead of us but no one knows what form they will take when we reach them.

Opinion | Speaking as a White Male … - The New York Times - 0 views

  • If you go back to the intellectuals of the 1950s, you get the impression that they thought individuals could very much determine their own beliefs.
  • Busy fighting communism and fascism, people back then emphasized individual reason and were deeply allergic to groupthink.
  • We don’t think this way anymore, and in fact thinking this way can get you into trouble. I guess the first step was the rise of perspectivism
  • This is the belief, often traced back to Nietzsche, that what you believe is determined by where you stand: Our opinions are not guided by objective truth, because there is no such thing; they are guided by our own spot in society.
  • Then came Michel Foucault and critical race theorists and the rest, and the argument that society is structured by elites to preserve their privilege.
  • Now we are at a place where it is commonly assumed that your perceptions are something that come to you through your group, through your demographic identity.
  • What does that mean? After you’ve stated your group identity, what is the therefore that follows?
  • We’ve shifted from an emphasis on individual judgment toward a greater emphasis on collective experience.
  • Under what circumstances should we embrace the idea that collective identity shapes our thinking? Under what circumstances should we resist collective identity and insist on the primacy of individual discretion, and our common humanity?
  • On the one hand, the drive to bring in formerly marginalized groups has obviously been one of the great achievements of our era
  • Wider inclusion has vastly improved public debate
  • other times, group identity seems irrelevant to many issues
  • And there are other times when collective thinking seems positively corrupting. Why are people’s views of global warming, genetically modified foods and other scientific issues strongly determined by political label? That seems ridiculous.
  • Our whole education system is based on the idea that we train individuals to be critical thinkers. Our political system is based on the idea that persuasion and deliberation lead to compromise and toward truth. The basis of human dignity is our capacity to make up our own minds
  • One of the things I’ve learned in a lifetime in journalism is that people are always more unpredictable than their categories.
  • the notion that group membership determines opinion undermines all that.
  • If it’s just group against group, deliberation is a sham, beliefs are just masks groups use to preserve power structures, and democracy is a fraud.
  • The epistemological foundation of our system is in surprisingly radical flux.

Assessing the Value of Buddhism, for Individuals and for the World - The New York Times - 0 views

  • Robert Wright sketches an answer early in “Why Buddhism Is True.” He settles on a credible blend that one might call Western Buddhism, a largely secular approach to life and its problems but not devoid of a spiritual dimension. The centerpiece of the approach is the practice of mindful meditation.
  • The goal of “Why Buddhism Is True” is ambitious: to demonstrate “that Buddhism’s diagnosis of the human predicament is fundamentally correct, and that its prescription is deeply valid and urgently important.”
  • It is reasonable to claim that Buddhism, with its focus on suffering, addresses critical aspects of the human predicament. It is also reasonable to suggest that the prescription it offers may be applicable and useful to resolve that predicament.
  • To produce his demonstrations and to support the idea that Buddhism is “true,” Wright relies on science, especially on evolutionary psychology, cognitive science and neuroscience.
  • Wright is up to the task: He’s a Buddhist who has written about religion and morality from a scientific perspective — he is most famous for his 1994 book, “The Moral Animal.”
  • First, the beneficial powers of meditation come from the possibility of realizing that our emotive reactions and the consequent feelings they engender — which operate in automated fashion, outside our deliberate control — are often inappropriate and even counterproductive relative to the situations that trigger them.
  • Second, the mismatch between causes and responses is rooted in evolution. We have inherited from our nonhuman and human forerunners a complex affect apparatus suited to life circumstances very different from ours
  • Third, meditation allows us to realize that the idea of the self as director of our decisions is an illusion, and that the degree to which we are at the mercy of a weakly controlled system places us at a considerable disadvantage
  • Fourth, the awareness brought on by meditation helps the construction of a truly enlightened humanity and counters the growing tribalism of contemporary societies.
  • when, in modern life, emotions such as fear and anger are incorrectly and unnecessarily engaged — for example, road rage — Wright calls the respective feelings “false” or “illusory.” Such feelings, however, are no less true than the thirst, hunger or pain that Wright accepts and welcomes
  • We can agree that mindful meditation promotes a distancing effect and thus may increase our chances of combining affect and reason advantageously. Meditation can help us glean the especially flawed and dislocated status of humans in modern societies, and help us see how social and political conflicts appear to provoke resentment and anger so easily.
  • How does one scale up, from many single individuals to populations, in time to prevent the social catastrophes that seem to be looming?

At the Existentialist Café: Freedom, Being, and Apricot Cocktails with Jean-P... - 0 views

  • The phenomenologists’ leading thinker, Edmund Husserl, provided a rallying cry, ‘To the things themselves!’ It meant: don’t waste time on the interpretations that accrue upon things, and especially don’t waste time wondering whether the things are real. Just look at this that’s presenting itself to you, whatever this may be, and describe it as precisely as possible.
  • You might think you have defined me by some label, but you are wrong, for I am always a work in progress. I create myself constantly through action, and this is so fundamental to my human condition that, for Sartre, it is the human condition, from the moment of first consciousness to the moment when death wipes it out. I am my own freedom: no more, no less.
  • Sartre wrote like a novelist — not surprisingly, since he was one. In his novels, short stories and plays as well as in his philosophical treatises, he wrote about the physical sensations of the world and the structures and moods of human life. Above all, he wrote about one big subject: what it meant to be free. Freedom, for him, lay at the heart of all human experience, and this set humans apart from all other kinds of object.
  • Sartre listened to his problem and said simply, ‘You are free, therefore choose — that is to say, invent.’ No signs are vouchsafed in this world, he said. None of the old authorities can relieve you of the burden of freedom. You can weigh up moral or practical considerations as carefully as you like, but ultimately you must take the plunge and do something, and it’s up to you what that something is.
  • Even if the situation is unbearable — perhaps you are facing execution, or sitting in a Gestapo prison, or about to fall off a cliff — you are still free to decide what to make of it in mind and deed. Starting from where you are now, you choose. And in choosing, you also choose who you will be.
  • The war had made people realise that they and their fellow humans were capable of departing entirely from civilised norms; no wonder the idea of a fixed human nature seemed questionable.
  • If this sounds difficult and unnerving, it’s because it is. Sartre does not deny that the need to keep making decisions brings constant anxiety. He heightens this anxiety by pointing out that what you do really matters. You should make your choices as though you were choosing on behalf of the whole of humanity, taking the entire burden of responsibility for how the human race behaves. If you avoid this responsibility by fooling yourself that you are the victim of circumstance or of someone else’s bad advice, you are failing to meet the demands of human life and choosing a fake existence, cut off from your own ‘authenticity’.
  • Along with the terrifying side of this comes a great promise: Sartre’s existentialism implies that it is possible to be authentic and free, as long as you keep up the effort.
  • almost all agreed that it was, as an article in Les nouvelles littéraires phrased it, a ‘sickening mixture of philosophic pretentiousness, equivocal dreams, physiological technicalities, morbid tastes and hesitant eroticism … an introspective embryo that one would take distinct pleasure in crushing’.
  • he offered a philosophy designed for a species that had just scared the hell out of itself, but that finally felt ready to grow up and take responsibility.
  • In this rebellious world, just as with the Parisian bohemians and Dadaists in earlier generations, everything that was dangerous and provocative was good, and everything that was nice or bourgeois was bad.
  • Such interweaving of ideas and life had a long pedigree, although the existentialists gave it a new twist. Stoic and Epicurean thinkers in the classical world had practised philosophy as a means of living well, rather than of seeking knowledge or wisdom for their own sake. By reflecting on life’s vagaries in philosophical ways, they believed they could become more resilient, more able to rise above circumstances, and better equipped to manage grief, fear, anger, disappointment or anxiety.
  • In the tradition they passed on, philosophy is neither a pure intellectual pursuit nor a collection of cheap self-help tricks, but a discipline for flourishing and living a fully human, responsible life.
  • For Kierkegaard, Descartes had things back to front. In his own view, human existence comes first: it is the starting point for everything we do, not the result of a logical deduction. My existence is active: I live it and choose it, and this precedes any statement I can make about myself.
  • Studying our own moral genealogy cannot help us to escape or transcend ourselves. But it can enable us to see our illusions more clearly and lead a more vital, assertive existence.
  • What was needed, he felt, was not high moral or theological ideals, but a deeply critical form of cultural history or ‘genealogy’ that would uncover the reasons why we humans are as we are, and how we came to be that way. For him, all philosophy could even be redefined as a form of psychology, or history.
  • For those oppressed on grounds of race or class, or for those fighting against colonialism, existentialism offered a change of perspective — literally, as Sartre proposed that all situations be judged according to how they appeared in the eyes of those most oppressed, or those whose suffering was greatest.
  • She observed that we need not expect moral philosophers to ‘live by’ their ideas in a simplistic way, as if they were following a set of rules. But we can expect them to show how their ideas are lived in. We should be able to look in through the windows of a philosophy, as it were, and see how people occupy it, how they move about and how they conduct themselves.
  • the existentialists inhabited their historical and personal world, as they inhabited their ideas. This notion of ‘inhabited philosophy’ is one I’ve borrowed from the English philosopher and novelist Iris Murdoch, who wrote the first full-length book on Sartre and was an early adopter of existentialism
  • What is existentialism anyway?
  • An existentialist who is also phenomenological provides no easy rules for dealing with this condition, but instead concentrates on describing lived experience as it presents itself. By describing experience well, he or she hopes to understand this existence and awaken us to ways of living more authentic lives.
  • Existentialists concern themselves with individual, concrete human existence. They consider human existence different from the kind of being other things have. Other entities are what they are, but as a human I am whatever I choose to make of myself at every moment. I am free, and therefore I’m responsible for everything I do, a dizzying fact which causes an anxiety inseparable from human existence itself.
  • On the other hand, I am only free within situations, which can include factors in my own biology and psychology as well as physical, historical and social variables of the world into which I have been thrown. Despite the limitations, I always want more: I am passionately involved in personal projects of all kinds. Human existence is thus ambiguous: at once boxed in by borders and yet transcendent and exhilarating.
  • The first part of this is straightforward: a phenomenologist’s job is to describe. This is the activity that Husserl kept reminding his students to do. It meant stripping away distractions, habits, clichés of thought, presumptions and received ideas, in order to return our attention to what he called the ‘things themselves’. We must fix our beady gaze on them and capture them exactly as they appear, rather than as we think they are supposed to be.
  • Husserl therefore says that, to phenomenologically describe a cup of coffee, I should set aside both the abstract suppositions and any intrusive emotional associations. Then I can concentrate on the dark, fragrant, rich phenomenon in front of me now. This ‘setting aside’ or ‘bracketing out’ of speculative add-ons Husserl called epoché — a term borrowed from the ancient Sceptics,
  • The point about rigour is crucial; it brings us back to the first half of the command to describe phenomena. A phenomenologist cannot get away with listening to a piece of music and saying, ‘How lovely!’ He or she must ask: is it plaintive? is it dignified? is it colossal and sublime? The point is to keep coming back to the ‘things themselves’ — phenomena stripped of their conceptual baggage — so as to bail out weak or extraneous material and get to the heart of the experience.
  • Husserlian ‘bracketing out’ or epoché allows the phenomenologist to temporarily ignore the question ‘But is it real?’, in order to ask how a person experiences his or her world. Phenomenology gives a formal mode of access to human experience. It lets philosophers talk about life more or less as non-philosophers do, while still being able to tell themselves they are being methodical and rigorous.
  • Besides claiming to transform the way we think about reality, phenomenologists promised to change how we think about ourselves. They believed that we should not try to find out what the human mind is, as if it were some kind of substance. Instead, we should consider what it does, and how it grasps its experiences.
  • For Brentano, this reaching towards objects is what our minds do all the time. Our thoughts are invariably of or about something, he wrote: in love, something is loved, in hatred, something is hated, in judgement, something is affirmed or denied. Even when I imagine an object that isn’t there, my mental structure is still one of ‘about-ness’ or ‘of-ness’.
  • Except in deepest sleep, my mind is always engaged in this aboutness: it has ‘intentionality’. Having taken the germ of this from Brentano, Husserl made it central to his whole philosophy.
  • Husserl saw in the idea of intentionality a way to sidestep two great unsolved puzzles of philosophical history: the question of what objects ‘really’ are, and the question of what the mind ‘really’ is. By doing the epoché and bracketing out all consideration of reality from both topics, one is freed to concentrate on the relationship in the middle. One can apply one’s descriptive energies to the endless dance of intentionality that takes place in our lives: the whirl of our minds as they seize their intended phenomena one after the other and whisk them around the floor,
  • Understood in this way, the mind hardly is anything at all: it is its aboutness. This makes the human mind (and possibly some animal minds) different from any other naturally occurring entity. Nothing else can be as thoroughly about or of things as the mind is:
  • Some Eastern meditation techniques aim to still this scurrying creature, but the extreme difficulty of this shows how unnatural it is to be mentally inert. Left to itself, the mind reaches out in all directions as long as it is awake — and even carries on doing it in the dreaming phase of its sleep.
  • a mind that is experiencing nothing, imagining nothing, or speculating about nothing can hardly be said to be a mind at all.
  • Three simple ideas — description, phenomenon, intentionality — provided enough inspiration to keep roomfuls of Husserlian assistants busy in Freiburg for decades. With all of human existence awaiting their attention, how could they ever run out of things to do?
  • For Sartre, this gives the mind an immense freedom. If we are nothing but what we think about, then no predefined ‘inner nature’ can hold us back. We are protean.
  • way of this interpretation. Real, not real; inside, outside; what difference did it make? Reflecting on this, Husserl began turning his phenomenology into a branch of ‘idealism’ — the philosophical tradition which denied external reality and defined everything as a kind of private hallucination.
  • For Sartre, if we try to shut ourselves up inside our own minds, ‘in a nice warm room with the shutters closed’, we cease to exist. We have no cosy home: being out on the dusty road is the very definition of what we are.
  • One might think that, if Heidegger had anything worth saying, he could have communicated it in ordinary language. The fact is that he does not want to be ordinary, and he may not even want to communicate in the usual sense. He wants to make the familiar obscure, and to vex us. George Steiner thought that Heidegger’s purpose was less to be understood than to be experienced through a ‘felt strangeness’.
  • He takes Dasein in its most ordinary moments, then talks about it in the most innovative way he can. For Heidegger, Dasein’s everyday Being is right here: it is Being-in-the-world, or In-der-Welt-sein. The main feature of Dasein’s everyday Being-in-the-world right here is that it is usually busy doing something.
  • Thus, for Heidegger, all Being-in-the-world is also a ‘Being-with’ or Mitsein. We cohabit with others in a ‘with-world’, or Mitwelt. The old philosophical problem of how we prove the existence of other minds has now vanished. Dasein swims in the with-world long before it wonders about other minds.
  • Sometimes the best-educated people were those least inclined to take the Nazis seriously, dismissing them as too absurd to last. Karl Jaspers was one of those who made this mistake, as he later recalled, and Beauvoir observed similar dismissive attitudes among the French students in Berlin.
  • In any case, most of those who disagreed with Hitler’s ideology soon learned to keep their view to themselves. If a Nazi parade passed on the street, they would either slip out of view or give the obligatory salute like everyone else, telling themselves that the gesture meant nothing if they did not believe in it. As the psychologist Bruno Bettelheim later wrote of this period, few people will risk their life for such a small thing as raising an arm — yet that is how one’s powers of resistance are eroded away, and eventually one’s responsibility and integrity go with them.
  • for Arendt, if you do not respond adequately when the times demand it, you show a lack of imagination and attention that is as dangerous as deliberately committing an abuse. It amounts to disobeying the one command she had absorbed from Heidegger in those Marburg days: Think!
  • ‘Everything takes place under a kind of anaesthesia. Objectively dreadful events produce a thin, puny emotional response. Murders are committed like schoolboy pranks. Humiliation and moral decay are accepted like minor incidents.’ Haffner thought modernity itself was partly to blame: people had become yoked to their habits and to mass media, forgetting to stop and think, or to disrupt their routines long enough to question what was going on.
  • Heidegger’s former lover and student Hannah Arendt would argue, in her 1951 study The Origins of Totalitarianism, that totalitarian movements thrived at least partly because of this fragmentation in modern lives, which made people more vulnerable to being swept away by demagogues. Elsewhere, she coined the phrase ‘the banality of evil’ to describe the most extreme failures of personal moral awareness.
  • His communicative ideal fed into a whole theory of history: he traced all civilisation to an ‘Axial Period’ in the fifth century BC, during which philosophy and culture exploded simultaneously in Europe, the Middle East and Asia, as though a great bubble of minds had erupted from the earth’s surface. ‘True philosophy needs communion to come into existence,’ he wrote, and added, ‘Uncommunicativeness in a philosopher is virtually a criterion of the untruth of his thinking.’
  • The idea of being called to authenticity became a major theme in later existentialism, the call being interpreted as saying something like ‘Be yourself!’, as opposed to being phony. For Heidegger, the call is more fundamental than that. It is a call to take up a self that you didn’t know you had: to wake up to your Being. Moreover, it is a call to action. It requires you to do something: to take a decision of some sort.
  • Being and Time contained at least one big idea that should have been of use in resisting totalitarianism. Dasein, Heidegger wrote there, tends to fall under the sway of something called das Man or ‘the they’ — an impersonal entity that robs us of the freedom to think for ourselves. To live authentically requires resisting or outwitting this influence, but this is not easy because das Man is so nebulous. Man in German does not mean ‘man’ as in English (that’s der Mann), but a neutral abstraction, something like ‘one’ in the English phrase ‘one doesn’t do that’,
  • for Heidegger, das Man is me. It is everywhere and nowhere; it is nothing definite, but each of us is it. As with Being, it is so ubiquitous that it is difficult to see. If I am not careful, however, das Man takes over the important decisions that should be my own. It drains away my responsibility or ‘answerability’. As Arendt might put it, we slip into banality, failing to think.
  • Jaspers focused on what he called Grenzsituationen — border situations, or limit situations. These are the moments when one finds oneself constrained or boxed in by what is happening, but at the same time pushed by these events towards the limits or outer edge of normal experience. For example, you might have to make a life-or-death choice, or something might remind you suddenly of your mortality,
  • Jaspers’ interest in border situations probably had much to do with his own early confrontation with mortality. From childhood, he had suffered from a heart condition so severe that he always expected to die at any moment. He also had emphysema, which forced him to speak slowly, taking long pauses to catch his breath. Both illnesses meant that he had to budget his energies with care in order to get his work done without endangering his life.
  • If I am to resist das Man, I must become answerable to the call of my ‘voice of conscience’. This call does not come from God, as a traditional Christian definition of the voice of conscience might suppose. It comes from a truly existentialist source: my own authentic self. Alas, this voice is one I do not recognise and may not hear, because it is not the voice of my habitual ‘they-self’. It is an alien or uncanny version of my usual voice. I am familiar with my they-self, but not with my unalienated voice — so, in a weird twist, my real voice is the one that sounds strangest to me.
  • Marcel developed a strongly theological branch of existentialism. His faith distanced him from both Sartre and Heidegger, but he shared a sense of how history makes demands on individuals. In his essay ‘On the Ontological Mystery’, written in 1932 and published in the fateful year of 1933, Marcel wrote of the human tendency to become stuck in habits, received ideas, and a narrow-minded attachment to possessions and familiar scenes. Instead, he urged his readers to develop a capacity for remaining ‘available’ to situations as they arise. Similar ideas of disponibilité or availability had been explored by other writers,
  • Marcel made it his central existential imperative. He was aware of how rare and difficult it was. Most people fall into what he calls ‘crispation’: a tensed, encrusted shape in life — ‘as though each one of us secreted a kind of shell which gradually hardened and imprisoned him’.
  • Bettelheim later observed that, under Nazism, only a few people realised at once that life could not continue unaltered: these were the ones who got away quickly. Bettelheim himself was not among them. Caught in Austria when Hitler annexed it, he was sent first to Dachau and then to Buchenwald, but was then released in a mass amnesty to celebrate Hitler’s birthday in 1939 — an extraordinary reprieve, after which he left at once for America.
  • we are used to reading philosophy as offering a universal message for all times and places — or at least as aiming to do so. But Heidegger disliked the notion of universal truths or universal humanity, which he considered a fantasy. For him, Dasein is not defined by shared faculties of reason and understanding, as the Enlightenment philosophers thought. Still less is it defined by any kind of transcendent eternal soul, as in religious tradition. We do not exist on a higher, eternal plane at all. Dasein’s Being is local: it has a historical situation, and is constituted in time and place.
  • For Marcel, learning to stay open to reality in this way is the philosopher’s prime job. Everyone can do it, but the philosopher is the one who is called on above all to stay awake, so as to be the first to sound the alarm if something seems wrong.
  • Second, it also means understanding that we are historical beings, and grasping the demands our particular historical situation is making on us. In what Heidegger calls ‘anticipatory resoluteness’, Dasein discovers ‘that its uttermost possibility lies in giving itself up’. At that moment, through Being-towards-death and resoluteness in facing up to one’s time, one is freed from the they-self and attains one’s true, authentic self.
  • If we are temporal beings by our very nature, then authentic existence means accepting, first, that we are finite and mortal. We will die: this all-important realisation is what Heidegger calls authentic ‘Being-towards-Death’, and it is fundamental to his philosophy.
  • Hannah Arendt, instead, left early on: she had the benefit of a powerful warning. Just after the Nazi takeover, in spring 1933, she had been arrested while researching materials on anti-Semitism for the German Zionist Organisation at Berlin’s Prussian State Library. Her apartment was searched; both she and her mother were locked up briefly, then released. They fled, without stopping to arrange travel documents. They crossed to Czechoslovakia (then still safe) by a method that sounds almost too fabulous to be true: a sympathetic German family on the border had a house with its front door in Germany and its back door in Czechoslovakia. The family would invite people for dinner, then let them leave through the back door at night.
  • As Sartre argued in his 1943 review of The Stranger, basic phenomenological principles show that experience comes to us already charged with significance. A piano sonata is a melancholy evocation of longing. If I watch a soccer match, I see it as a soccer match, not as a meaningless scene in which a number of people run around taking turns to apply their lower limbs to a spherical object. If the latter is what I’m seeing, then I am not watching some more essential, truer version of soccer; I am failing to watch it properly as soccer at all.
  • Much as they liked Camus personally, neither Sartre nor Beauvoir accepted his vision of absurdity. For them, life is not absurd, even when viewed on a cosmic scale, and nothing can be gained by saying it is. Life for them is full of real meaning, although that meaning emerges differently for each of us.
  • For Sartre, we show bad faith whenever we portray ourselves as passive creations of our race, class, job, history, nation, family, heredity, childhood influences, events, or even hidden drives in our subconscious which we claim are out of our control. It is not that such factors are unimportant: class and race, in particular, he acknowledged as powerful forces in people’s lives, and Simone de Beauvoir would soon add gender to that list.
  • Sartre takes his argument to an extreme point by asserting that even war, imprisonment or the prospect of imminent death cannot take away my existential freedom. They form part of my ‘situation’, and this may be an extreme and intolerable situation, but it still provides only a context for whatever I choose to do next. If I am about to die, I can decide how to face that death. Sartre here resurrects the ancient Stoic idea that I may not choose what happens to me, but I can choose what to make of it, spiritually speaking.
  • But the Stoics cultivated indifference in the face of terrible events, whereas Sartre thought we should remain passionately, even furiously engaged with what happens to us and with what we can achieve. We should not expect freedom to be anything less than fiendishly difficult.
  • Freedom does not mean entirely unconstrained movement, and it certainly does not mean acting randomly. We often mistake the very things that enable us to be free — context, meaning, facticity, situation, a general direction in our lives — for things that define us and take away our freedom. It is only with all of these that we can be free in a real sense.
  • Nor did he mean that privileged groups have the right to pontificate to the poor and downtrodden about the need to ‘take responsibility’ for themselves. That would be a grotesque misreading of Sartre’s point, since his sympathy in any encounter always lay with the more oppressed side. But for each of us — for me — to be in good faith means not making excuses for myself.
  • Camus’ novel gives us a deliberately understated vision of heroism and decisive action compared to those of Sartre and Beauvoir. One can only do so much. It can look like defeatism, but it shows a more realistic perception of what it takes to actually accomplish difficult tasks like liberating one’s country.
  • Camus just kept returning to his core principle: no torture, no killing — at least not with state approval. Beauvoir and Sartre believed they were taking a more subtle and more realistic view. If asked why a couple of innocuous philosophers had suddenly become so harsh, they would have said it was because the war had changed them in profound ways. It had shown them that one’s duties to humanity could be more complicated than they seemed. ‘The war really divided my life in two,’ Sartre said later.
  • Poets and artists ‘let things be’, but they also let things come out and show themselves. They help to ease things into ‘unconcealment’ (Unverborgenheit), which is Heidegger’s rendition of the Greek term alētheia, usually translated as ‘truth’. This is a deeper kind of truth than the mere correspondence of a statement to reality, as when we say ‘The cat is on the mat’ and point to a mat with a cat on it. Long before we can do this, both cat and mat must ‘stand forth out of concealedness’. They must un-hide themselves.
  • Heidegger does not use the word ‘consciousness’ here because — as with his earlier work — he is trying to make us think in a radically different way about ourselves. We are not to think of the mind as an empty cavern, or as a container filled with representations of things. We are not even supposed to think of it as firing off arrows of intentional ‘aboutness’, as in the earlier phenomenology of Brentano. Instead, Heidegger draws us into the depths of his Schwarzwald, and asks us to imagine a gap with sunlight filtering in. We remain in the forest, but we provide a relatively open spot where other beings can bask for a moment. If we did not do this, everything would remain in the thickets, hidden even to itself.
  • The astronomer Carl Sagan began his 1980 television series Cosmos by saying that human beings, though made of the same stuff as the stars, are conscious and are therefore ‘a way for the cosmos to know itself’. Merleau-Ponty similarly quoted his favourite painter Cézanne as saying, ‘The landscape thinks itself in me, and I am its consciousness.’ This is something like what Heidegger thinks humanity contributes to the earth. We are not made of spiritual nothingness; we are part of Being, but we also bring something unique with us. It is not much: a little open space, perhaps with a path and a bench like the one the young Heidegger used to sit on to do his homework. But through us, the miracle occurs.
  • Beauty aside, Heidegger’s late writing can also be troubling, with its increasingly mystical notion of what it is to be human. If one speaks of a human being mainly as an open space or a clearing, or a means of ‘letting beings be’ and dwelling poetically on the earth, then one doesn’t seem to be talking about any recognisable person. The old Dasein has become less human than ever. It is now a forestry feature.
  • Even today, Jaspers, the dedicated communicator, is far less widely read than Heidegger, who has influenced architects, social theorists, critics, psychologists, artists, film-makers, environmental activists, and innumerable students and enthusiasts — including the later deconstructionist and post-structuralist schools, which took their starting point from his late thinking. Having spent the late 1940s as an outsider and then been rehabilitated, Heidegger became the overwhelming presence in university philosophy all over the European continent from then on.
  • As Levinas reflected on this experience, it helped to lead him to a philosophy that was essentially ethical, rather than ontological like Heidegger’s. He developed his ideas from the work of Jewish theologian Martin Buber, whose I and Thou in 1923 had distinguished between my relationship with an impersonal ‘it’ or ‘them’, and the direct personal encounter I have with a ‘you’. Levinas took it further: when I encounter you, we normally meet face-to-face, and it is through your face that you, as another person, can make ethical demands on me. This is very different from Heidegger’s Mitsein or Being-with, which suggests a group of people standing alongside one another, shoulder to shoulder as if in solidarity — perhaps as a unified nation or Volk.
  • For Levinas, we literally face each other, one individual at a time, and that relationship becomes one of communication and moral expectation. We do not merge; we respond to one another. Instead of being co-opted into playing some role in my personal drama of authenticity, you look me in the eyes — and you remain Other. You remain you.
  • This relationship is more fundamental than the self, more fundamental than consciousness, more fundamental even than Being — and it brings an unavoidable ethical obligation. Ever since Husserl, phenomenologists and existentialists had been trying to stretch the definition of existence to incorporate our social lives and relationships. Levinas did more: he turned philosophy around entirely so that these relationships were the foundation of our existence, not an extension of it.
  • Her last work, The Need for Roots, argues, among other things, that none of us has rights, but each one of us has a near-infinite degree of duty and obligation to the other. Whatever the underlying cause of her death — and anorexia nervosa seems to have been involved — no one could deny that she lived out her philosophy with total commitment. Of all the lives touched on in this book, hers is surely the most profound and challenging application of Iris Murdoch’s notion that a philosophy can be ‘inhabited’.
  • Other thinkers took radical ethical turns during the war years. The most extreme was Simone Weil, who actually tried to live by the principle of putting other people’s ethical demands first. Having returned to France after her travels through Germany in 1932, she had worked in a factory so as to experience the degrading nature of such work for herself. When France fell in 1940, her family fled to Marseilles (against her protests), and later to the US and to Britain. Even in exile, Weil made extraordinary sacrifices. If there were people in the world who could not sleep in a bed, she would not do so either, so she slept on the floor.
  • The mystery tradition had roots in Kierkegaard’s ‘leap of faith’. It owed much to the other great nineteenth-century mystic of the impossible, Dostoevsky, and to older theological notions. But it also grew from the protracted trauma that was the first half of the twentieth century. Since 1914, and especially since 1939, people in Europe and elsewhere had come to the realisation that we cannot fully know or trust ourselves; that we have no excuses or explanations for what we do — and yet that we must ground our existence and relationships on something firm, because otherwise we cannot survive.
  • One striking link between these radical ethical thinkers, all on the fringes of our main story, is that they had religious faith. They also granted a special role to the notion of ‘mystery’ — that which cannot be known, calculated or understood, especially when it concerns our relationships with each other. Heidegger was different from them, since he rejected the religion he grew up with and had no real interest in ethics — probably as a consequence of his having no real interest in the human.
  • Meanwhile, the Christian existentialist Gabriel Marcel was also still arguing, as he had since the 1930s, that ethics trumps everything else in philosophy and that our duty to each other is so great as to play the role of a transcendent ‘mystery’. He too had been led to this position partly by a wartime experience: during the First World War he had worked for the Red Cross’ Information Service, with the unenviable job of answering relatives’ inquiries about missing soldiers. Whenever news came, he passed it on, and usually it was not good. As Marcel later said, this task permanently inoculated him against warmongering rhetoric of any kind, and it made him aware of the power of what is unknown in our lives.
  • As the play’s much-quoted and frequently misunderstood final line has it: ‘Hell is other people.’ Sartre later explained that he did not mean to say that other people were hellish in general. He meant that after death we become frozen in their view, unable any longer to fend off their interpretation. In life, we can still do something to manage the impression we make; in death, this freedom goes and we are left entombed in other people’s memories and perceptions.
  • We have to do two near-impossible things at once: understand ourselves as limited by circumstances, and yet continue to pursue our projects as though we are truly in control. In Beauvoir’s view, existentialism is the philosophy that best enables us to do this, because it concerns itself so deeply with both freedom and contingency. It acknowledges the radical and terrifying scope of our freedom in life, but also the concrete influences that other philosophies tend to ignore: history, the body, social relationships and the environment.
  • The aspects of our existence that limit us, Merleau-Ponty says, are the very same ones that bind us to the world and give us scope for action and perception. They make us what we are. Sartre acknowledged the need for this trade-off, but he found it more painful to accept. Everything in him longed to be free of bonds, of impediments and limitations
  • Of course we have to learn this skill of interpreting and anticipating the world, and this happens in early childhood, which is why Merleau-Ponty thought child psychology was essential to philosophy. This is an extraordinary insight. Apart from Rousseau, very few philosophers before him had taken childhood seriously; most wrote as though all human experience were that of a fully conscious, rational, verbal adult who has been dropped into this world from the sky — perhaps by a stork.
  • For Merleau-Ponty, we cannot understand our experience if we don’t think of ourselves in part as overgrown babies. We fall for optical illusions because we once learned to see the world in terms of shapes, objects and things relevant to our own interests. Our first perceptions came to us in tandem with our first active experiments in observing the world and reaching out to explore it, and are still linked with those experiences.
  • Another factor in all of this, for Merleau-Ponty, is our social existence: we cannot thrive without others, or not for long, and we need this especially in early life. This makes solipsistic speculation about the reality of others ridiculous; we could never engage in such speculation if we hadn’t already been formed by them.
  • As Descartes could have said (but didn’t), ‘I think, therefore other people exist.’ We grow up with people playing with us, pointing things out, talking, listening, and getting us used to reading emotions and movements; this is how we become capable, reflective, smoothly integrated beings.
  • In general, Merleau-Ponty thinks human experience only makes sense if we abandon philosophy’s time-honoured habit of starting with a solitary, capsule-like, immobile adult self, isolated from its body and world, which must then be connected up again — adding each element around it as though adding clothing to a doll. Instead, for him, we slide from the womb to the birth canal to an equally close and total immersion in the world. That immersion continues as long as we live, although we may also cultivate the art of partially withdrawing from time to time when we want to think or daydream.
  • When he looks for his own metaphor to describe how he sees consciousness, he comes up with a beautiful one: consciousness, he suggests, is like a ‘fold’ in the world, as though someone had crumpled a piece of cloth to make a little nest or hollow. It stays for a while, before eventually being unfolded and smoothed away. There is something seductive, even erotic, in this idea of my conscious self as an improvised pouch in the cloth of the world. I still have my privacy — my withdrawing room. But I am part of the world’s fabric, and I remain formed out of it for as long as I am here.
  • By the time of these works, Merleau-Ponty is taking his desire to describe experience to the outer limits of what language can convey. Just as with the late Husserl or Heidegger, or Sartre in his Flaubert book, we see a philosopher venturing so far from shore that we can barely follow. Emmanuel Levinas would head out to the fringes too, eventually becoming incomprehensible to all but his most patient initiates.
  • Sartre once remarked — speaking of a disagreement they had about Husserl in 1941 — that ‘we discovered, astounded, that our conflicts had, at times, stemmed from our childhood, or went back to the elementary differences of our two organisms’. Merleau-Ponty also said in an interview that Sartre’s work seemed strange to him, not because of philosophical differences, but because of a certain ‘register of feeling’, especially in Nausea, that he could not share. Their difference was one of temperament and of the whole way the world presented itself to them.
  • The two also differed in their purpose. When Sartre writes about the body or other aspects of experience, he generally does it in order to make a different point. He expertly evokes the grace of his café waiter, gliding between the tables, bending at an angle just so, steering the drink-laden tray through the air on the tips of his fingers — but he does it all in order to illustrate his ideas about bad faith. When Merleau-Ponty writes about skilled and graceful movement, the movement itself is his point. This is the thing he wants to understand.
  • We can never move definitively from ignorance to certainty, for the thread of the inquiry will constantly lead us back to ignorance again. This is the most attractive description of philosophy I’ve ever read, and the best argument for why it is worth doing, even (or especially) when it takes us no distance at all from our starting point.
  • By prioritising perception, the body, social life and childhood development, Merleau-Ponty gathered up philosophy’s far-flung outsider subjects and brought them in to occupy the centre of his thought.
  • In his inaugural lecture at the Collège de France on 15 January 1953, published as In Praise of Philosophy, he said that philosophers should concern themselves above all with whatever is ambiguous in our experience. At the same time, they should think clearly about these ambiguities, using reason and science. Thus, he said, ‘The philosopher is marked by the distinguishing trait that he possesses inseparably the taste for evidence and the feeling for ambiguity.’ A constant movement is required between these two
  • As Sartre wrote in response to Hiroshima, humanity had now gained the power to wipe itself out, and must decide every single day that it wanted to live. Camus also wrote that humanity faced the task of choosing between collective suicide and a more intelligent use of its technology — ‘between hell and reason’. After 1945, there seemed little reason to trust in humanity’s ability to choose well.
  • Merleau-Ponty observed in a lecture of 1951 that, more than any previous century, the twentieth century had reminded people how ‘contingent’ their lives were — how at the mercy of historical events and other changes that they could not control. This feeling went on long after the war ended. After the A-bombs were dropped on Hiroshima and Nagasaki, many feared that a Third World War would not be long in coming, this time between the Soviet Union and the United States.

The Fallacy of the 'I Turned Out Fine' Argument - The New York Times - 0 views

  • Most of the messages centered on one single, repeated theme: “I was smacked as a child and I turned out just fine.”
  • It makes sense, doesn’t it? Many of us think, “If I had something happen to me and nothing went wrong, then surely it’s fine for everyone else.”
  • The “I turned out just fine” argument is popular. It means that, based on our personal experience, we know what works and what doesn't. But the argument has fatal flaws.
  • ...12 more annotations...
  • It's what's known as an anecdotal fallacy. This fallacy, in simple terms, states that “I'm not negatively affected (as far as I can tell), so it must be O.K. for everyone.”
  • We are relying on a sample size of one. Ourselves, or someone we know. And we are applying that result to everyone
  • It relies on a decision-making shortcut known as the availability heuristic. Related to the anecdotal fallacy, it’s where we draw on information that is immediately available to us when we make a judgment call.
  • studies show that the availability heuristic is a cognitive bias that can cloud us from making accurate decisions utilizing all the information available. It blinds us to our own prejudices
  • It dismisses well-substantiated, scientific evidence. To say “I turned out fine” is an arrogant dismissal of an alternative evidence-based view
  • The statement closes off discourse and promotes a single perspective that is oblivious to alternatives that may be more enlightened. Anecdotal evidence often undermines scientific results, to our detriment.
  • It leads to entrenched attitudes.
  • Perhaps an inability to engage with views that run counter to our own suggests that we did not turn out quite so “fine.”
  • Where is the threshold for what constitutes having turned out fine? If it means we avoided prison, we may be setting the bar too low. Gainfully employed and have a family of our own? Still a pretty basic standard
  • It is as reasonable to say “I turned out fine because of this” as it is to say “I turned out fine in spite of this.”
  • To claim that on this basis spanking a child is fine means that we fall victim to anecdote, rely on our availability heuristic (thereby dismissing all broader data to the contrary), dismiss alternate views, fail to learn and progress by engaging with a challenging idea.
  • We expect our children to embrace learning and to progress in their thinking as they grow older. They deserve to expect the same from us.

Want to help your child succeed in school? Add language to the math, reading mix -- Sci... - 0 views

  • Research shows that the more skills children bring with them to kindergarten -- in basic math, reading, even friendship and cooperation -- the more likely they will succeed in those same areas in school.
  • Now it's time to add language to that mix of skills, says a new University of Washington-led study. Not only does a child's use of vocabulary and grammar predict future proficiency with the spoken and written word, but it also affects performance in other subject areas.
  • The team analyzed academic and behavioral assessments, assigned standardized scores and looked at how scores correlated in grades 1, 3, and 5. Growth curve modeling allowed the team to look at children's levels of performance across time and investigate rates of change at specific times in elementary school.
  • ...4 more annotations...
  • Reading ability in kindergarten predicted reading, math and language skills later on; and math proficiency correlated with math and reading performance over time.
  • Measuring the impact of one skill on another, in addition to measuring growth in the same skill, provides more of a "whole child" perspective, Pace said. A child who enters school with little exposure to number sense or spatial concepts but with strong social skills may benefit from that emotional buffer.
  • Researchers expected to find that the effects of kindergarten readiness would wear off by third grade, the time when elementary school curriculum transitions from introducing foundational skills to helping students apply those skills as they delve deeper into content areas. But according to the study, children's performance in kindergarten continues to predict their performance in grades three through five.
  • The study also represents an opportunity to rethink what skills are considered measures of kindergarten-readiness, she said.

Opinion | The Strange Failure of the Educated Elite - The New York Times - 0 views

  • We replaced a system based on birth with a fairer system based on talent. We opened up the universities and the workplace to Jews, women and minorities. University attendance surged, creating the most educated generation in history. We created a new boomer ethos, which was egalitarian (bluejeans everywhere!), socially conscious (recycling!) and deeply committed to ending bigotry.
  • The older establishment won World War II and built the American Century. We, on the other hand, led to Donald Trump. The chief accomplishment of the current educated elite is that it has produced a bipartisan revolt against itself.
  • the new meritocratic aristocracy has come to look like every other aristocracy. The members of the educated class use their intellectual, financial and social advantages to pass down privilege to their children, creating a hereditary elite that is ever more insulated from the rest of society. We need to build a meritocracy that is true to its values, truly open to all.
  • ...17 more annotations...
  • But the narrative is insufficient. The real problem with the modern meritocracy can be found in the ideology of meritocracy itself. Meritocracy is a system built on the maximization of individual talent, and that system unwittingly encourages several ruinous beliefs:
  • Exaggerated faith in intelligence.
  • Many of the great failures of the last 50 years, from Vietnam to Watergate to the financial crisis, were caused by extremely intelligent people who didn’t care about the civic consequences of their actions.
  • Misplaced faith in autonomy
  • The meritocracy is based on the metaphor that life is a journey. On graduation days, members of the educated class give their young Dr. Seuss’ “Oh, the Places You’ll Go!” which shows a main character, “you,” who goes on a solitary, unencumbered journey through life toward success. If you build a society upon this metaphor you will wind up with a society high in narcissism and low in social connection
  • Life is not really an individual journey. Life is more like settling a sequence of villages. You help build a community at home, at work, in your town and then you go off and settle more villages.
  • Instead of seeing the self as the seat of the soul, the meritocracy sees the self as a vessel of human capital, a series of talents to be cultivated and accomplishments to be celebrated.
  • Misplaced notion of the self
  • If you base a society on a conception of self that is about achievement, not character, you will wind up with a society that is demoralized; that puts little emphasis on the sorts of moral systems that create harmony within people, harmony between people and harmony between people and their ultimate purpose.
  • Inability to think institutionally.
  • Previous elites poured themselves into institutions and were pretty good at maintaining existing institutions, like the U.S. Congress, and building new ones, like the postwar global order.
  • The current generation sees institutions as things they pass through on the way to individual success. Some institutions, like Congress and the political parties, have decayed to the point of uselessness, while others, like corporations, lose their generational consciousness
  • Misplaced idolization of diversity
  • But diversity is a midpoint, not an endpoint. Just as a mind has to be opened so that it can close on something, an organization has to be diverse so that different perspectives can serve some end.
  • Diversity for its own sake, without a common telos, is infinitely centrifugal, and leads to social fragmentation.
  • The essential point is this: Those dimwitted, stuck up blue bloods in the old establishment had something we meritocrats lack — a civic consciousness, a sense that we live life embedded in community and nation, that we owe a debt to community and nation and that the essence of the admirable life is community before self.
  • The meritocracy is here to stay, thank goodness, but we probably need a new ethos to reconfigure it — to redefine how people are seen, how applicants are selected, how social roles are understood and how we narrate a common national purpose

Sex, Morality, and Modernity: Can Immanuel Kant Unite Us? - The Atlantic - 1 views

  • Before I jump back into the conversation about sexual ethics that has unfolded on the Web in recent days, inspired by Emily Witt's n+1 essay "What Do You Desire?" and featuring a fair number of my favorite writers, it's worth saying a few words about why I so value debate on this subject, and my reasons for running through some sex-life hypotheticals near the end of this article.
  • As we think and live, the investment required to understand one another increases. So do the stakes of disagreeing. 18-year-olds on the cusp of leaving home for the first time may disagree profoundly about how best to live and flourish, but the disagreements are abstract. It is easy, at 18, to express profound disagreement with, say, a friend's notions of child-rearing. To do so when he's 28, married, and raising a son or daughter is delicate, and perhaps best avoided
  • I have been speaking of friends. The gulfs that separate strangers can be wider and more difficult to navigate because there is no history of love and mutual goodwill as a foundation for trust. Less investment has been made, so there is less incentive to persevere through the hard parts.
  • ...27 more annotations...
  • I've grown very close to new people whose perspectives are radically different than mine.
  • It floors me: These individuals are all repositories of wisdom. They've gleaned it from experiences I'll never have, assumptions I don't share, and brains wired different than mine. I want to learn what they know.
  • Does that get us anywhere? A little ways, I think.
  • "Are we stuck with a passé traditionalism on one hand, and total laissez-faire on the other?" Is there common ground shared by the orthodox-Christian sexual ethics of a Rod Dreher and those who treat consent as their lodestar?
  • Gobry suggests that Immanuel Kant provides a framework everyone can and should embrace, wherein consent isn't nearly enough to make a sexual act moral--we must, in addition, treat the people in our sex lives as ends, not means.
  • Here's how Kant put it: "Act in such a way that you treat humanity, whether in your own person or in the person of any other, never merely as a means to an end, but always at the same time as an end."
  • the disappearance of a default sexual ethic in America and the divergence of our lived experiences means we have more to learn from one another than ever, even as our different choices raise the emotional stakes.
  • Nor does it seem intuitively obvious that a suffering, terminally ill 90-year-old is regarding himself as a means, or an object, if he prefers to end his life with a lethal injection rather than waiting three months in semi-lucid agony for his lungs to slowly shut down and suffocate him. (Kant thought suicide impermissible.) The terminally ill man isn't denigrating his own worth or the preciousness of life or saying it's permissible "any time" it is difficult. He believes ending his life is permissible only because the end is nigh, and the interim affords no opportunity for "living" in anything except a narrow biological sense.
  • It seems to me that, whether we're talking about a three-week college relationship or a 60-year marriage, it is equally possible to treat one's partner as a means or as an end (though I would agree that "treating as means" is more common in hookups than marriage)
  • my simple definition is this: It is wrong to treat human persons in such a way that they are reduced to objects. This says nothing about consent: a person may consent to be used as an object, but it is still wrong to use them that way. It says nothing about utility: society may approve of using some people as objects; whether those people are actual slaves or economically oppressed wage-slaves it is still wrong to treat them like objects. What it says, in fact, is that human beings have intrinsic worth and dignity such that treating them like objects is wrong.
  • what it means to treat someone as a means, or as an object, turns out to be in dispute.
  • Years ago, I interviewed a sister who was acting as a surrogate for a sibling who couldn't carry her own child. The notion that either regarded the other (or themselves) as an object seems preposterous to me. Neither was treating the other as a means, because they both freely chose, desired and worked in concert to achieve the same end.
  • It seems to me that the Kantian insight is exactly the sort of challenge traditionalist Christians should make to college students as they try to persuade them to look more critically at hookup culture. I think a lot of college students casually mislead one another about their intentions and degree of investment, feigning romantic interest when actually they just want to have sex. Some would say they're transgressing against consent. I think Kant has a more powerful challenge. 
  • Ultimately, Kant only gets us a little way in this conversation because, outside the realm of sex, he thinks consent goes a long way toward mitigating the means problem, whereas in the realm of sex, not so much. This is inseparable from notions he has about sex that many of us just don't share.
  • two Biblical passages fit my moral intuition even better than Kant. “Love your neighbor as yourself.” And “therefore all things whatsoever ye would that men should do to you, do ye even so to them.”
  • "do unto others..." is extremely demanding, hard to live up to, and a very close fit with my moral intuitions.
  • "Do unto others" is also enough to condemn all sorts of porn, and to share all sorts of common ground with Dreher beyond consent. Interesting that it leaves us with so many disagreements too. "Do unto others" is core to my support for gay marriage.
  • (Are our bones always to be trusted?) The sexual behavior parents would be mortified by is highly variable across time and cultures. So how can I regard it as a credible guide of inherent wrong? Professional football and championship boxing are every bit as violent and far more physically damaging to their participants than that basement scene, yet their cultural familiarity is such that most people don't feel them to be morally suspect. Lots of parents are proud, not mortified, when a son makes the NFL.
  • "Porn operates in fantasy the way boxing and football operate in fantasy. The injuries are quite real." He is, as you can see, uncomfortable with both. Forced at gunpoint to choose which of two events could proceed on a given night, an exact replica of the San Francisco porn shoot or an Ultimate Fighting Championship tournament--if I had to shut one down and grant the other permission to proceed--what would the correct choice be?
  • insofar as there is something morally objectionable here, it's that the audience is taking pleasure in the spectacle of someone being abused, whether that abuse is fact or convincing illusion. Violent sports and violent porn interact with dark impulses in humanity, as their producers well know.
  • If Princess Donna was failing to "do unto others" at all, the audience was arguably who she failed. Would she want others to entertain her by stoking her dark human impulses? Then again, perhaps she is helping to neuter and dissipate them in a harmless way. That's one theory of sports, isn't it? We go to war on the gridiron as a replacement for going to war? And the rise in violent porn has seemed to coincide with falling, not rising, incidence of sexual violence. 
  • On all sorts of moral questions I can articulate confident judgments. But I am confident in neither my intellect nor my gut when it comes to judging Princess Donna, or whether others are transgressing against themselves or "nature" when doing things that I myself wouldn't want to do. Without understanding their mindset, why they find that thing desirable, or what it costs them, if anything, I am loath to declare that it's grounded in depravity or inherently immoral just because it triggers my disgust instinct, especially if the people involved articulate a plausible moral code that they are following, and it even passes a widely held standard like "do unto others."
  • Here's another way to put it. Asked to render moral judgments about sexual behaviors, there are some I would readily label as immoral. (Rape is an extreme example. Showing the topless photo your girlfriend sent to your best friend is a milder one.) But I often choose to hold back and err on the side of not rendering a definitive judgment, knowing that occasionally means I'll fail to label as unethical some things that actually turn out to be morally suspect.
  • Partly I take that approach because, unlike Dreher, I don't see any great value or urgency in the condemnations, and unlike Douthat, I worry more about wrongful stigma than lack of rightful stigmas
  • In a society where notions of sexual morality aren't coercively enforced by the church or the state, what purpose is condemnation serving?
  • People are great! Erring on the side of failing to condemn permits at least the possibility of people from all of these world views engaging in conversation with one another.
  • Dreher worries about the fact that, despite our discomfort, neither Witt nor I can bring ourselves to say that the sexual acts performed during the S.F. porn shoot were definitely wrong. Does that really matter? My interlocutors perhaps see a cost more clearly than me, as well they might. My bias is that just arguing around the fire is elevating.

Right and Left React to the Paris Climate Agreement News - The New York Times - 0 views

  • The political news cycle is fast, and keeping up can be overwhelming. Trying to find differing perspectives worth your time is even harder. That’s why we have scoured the internet for political writing from the right and left that you might not have seen.
  • “Its breakthrough was not in lifting nations up to higher levels of ambition, but rather in dropping expectations to the lowest common denominator.”
  • He argues that the treaty did little to reduce emissions because of one central flaw in the agreement’s logic: the “pledge and review” process that governed international talks. “That logic relied on a misunderstanding of what motivates developing nations,” he writes.
  • ...7 more annotations...
  • “There are a few reasons that explain conservatives who were Never-Trumpers during the election, and who remain anti-Trump today. [...] They do not believe that America is engaged in a civil war, with the survival of America as we know it at stake.
    • dicindioha
       
      um interesting...
  • “That means praise him when he’s right, and find the most plausible possible defense when he’s wrong.”
  • “The conservative reaction to Trump’s Paris decision really drove home how this is all — and I do mean all — about waging culture war against the left.”
  • Rather than seeing the science in “pragmatic terms,” the president and the G.O.P. have made the issue into a “tribal struggle.” The cost of the right’s “desire to piss off lefty tree-huggers,” however, is an uncertain future for our grandchildren.
  • “A man who wished to become the most powerful man in the world [...] was granted his wish. Surely he must have imagined that more power meant more flattery, a grander image, a greater hall of mirrors reflecting back his magnificence. But he misunderstood power and prominence.”
  • “When it comes to decisions about strangers, the easiest, most accessible shortcut is our first impression. Unknowledgeable voters go for this shortcut.”
    • dicindioha
       
      choosing a favored candidate
  • he writes about his work on first impressions and their effect on political outcomes. When we don’t have a lot of information, he explains, our brains rely on “shortcuts”; low-information voters tend to rely on appearance to guide their decisions.
    • dicindioha
       
      ***
  •  
    the right and the left are presented as being in heavy disagreement in the quotes from these articles. the idea that climate change needs to be stopped will not get through to some minds, and that is very frustrating. this article also has a brief excerpt on quickly choosing a favored candidate based on limited information, similar to an interview!