
Home / TOK Friends / Group items tagged critical theory


Javier E

How Conservative Media Lost to the MSM and Failed the Rank and File - Conor Friedersdor...

  • Before rank-and-file conservatives ask, "What went wrong?", they should ask themselves a question every bit as important: "Why were we the last to realize that things were going wrong for us?"
  • It is easy to close oneself off inside a conservative echo chamber. And right-leaning outlets like Fox News and Rush Limbaugh's show are far more intellectually closed than CNN or public radio.
  • Since the very beginning of the election cycle, conservative media has been failing you. With a few exceptions, they haven't tried to rigorously tell you the truth, or even to bring you intellectually honest opinion. What they've done instead helps to explain why the right failed to triumph in a very winnable election.
  • Conservatives were at a disadvantage because Romney supporters like Jennifer Rubin and Hugh Hewitt saw it as their duty to spin constantly for their favored candidate rather than being frank about his strengths and weaknesses.
  • Conservatives were at an information disadvantage because so many right-leaning outlets wasted time on stories the rest of America dismissed as nonsense. WorldNetDaily brought you birtherism. Forbes brought you Kenyan anti-colonialism. National Review obsessed about an imaginary rejection of American exceptionalism, misrepresenting an Obama quote in the process, and Andy McCarthy was interviewed widely about his theory that Obama, aka the Drone Warrior in Chief, allied himself with our Islamist enemies in a "Grand Jihad" against America. Seriously? 
  • Conservatives were at a disadvantage because their information elites pandered in the most cynical, self-defeating ways, treating would-be candidates like Sarah Palin and Herman Cain as if they were plausible presidents rather than national jokes who'd lose worse than George McGovern.
  • How many hours of Glenn Beck conspiracy theories did Fox News broadcast to its viewers? How many hours of transparently mindless Sean Hannity content is still broadcast daily? Why don't Americans trust Republicans on foreign policy as they once did? In part because conservatism hasn't grappled with the foreign-policy failures of George W. Bush. A conspiracy of silence surrounds the subject. Romney could neither run on the man's record nor repudiate it.
  • Most conservative pundits know better than this nonsense -- not that they speak up against it. They see criticizing their own side as a sign of disloyalty. I see a coalition that has lost all perspective, partly because there's no cost to broadcasting or publishing inane bullshit. In fact, it's often very profitable. A lot of cynical people have gotten rich broadcasting and publishing red meat for movement conservative consumption.
  • On the biggest political story of the year, the conservative media just got its ass handed to it by the mainstream media. And movement conservatives, who believe the MSM is more biased and less rigorous than their alternatives, have no way to explain how their trusted outlets got it wrong, while the New York Times got it right. Hint: The Times hired the most rigorous forecaster it could find. It ought to be an eye-opening moment.
Javier E

About Face: Emotions and Facial Expressions May Not Be Directly Related | Boston Magazine

  • Ekman had traveled the globe with photographs that showed faces experiencing six basic emotions—happiness, sadness, fear, disgust, anger, and surprise. Everywhere he went, from Japan to Brazil to the remotest village of Papua New Guinea, he asked subjects to look at those faces and then to identify the emotions they saw on them. To do so, they had to pick from a set list of options presented to them by Ekman. The results were impressive. Everybody, it turned out, even preliterate Fore tribesmen in New Guinea who’d never seen a foreigner before in their lives, matched the same emotions to the same faces. Darwin, it seemed, had been right.
  • Ekman’s findings energized the previously marginal field of emotion science. Suddenly, researchers had an objective way to measure and compare human emotions—by reading the universal language of feeling written on the face. In the years that followed, Ekman would develop this idea, arguing that each emotion is like a reflex, with its own circuit in the brain and its own unique pattern of effects on the face and the body. He and his peers came to refer to it as the Basic Emotion model—and it had significant practical applications
  • What if he’s wrong?
  • Barrett is a professor of psychology at Northeastern
  • her research has led her to conclude that each of us constructs them in our own individual ways, from a diversity of sources: our internal sensations, our reactions to the environments we live in, our ever-evolving bodies of experience and learning, our cultures.
  • if Barrett is correct, we’ll need to rethink how we interpret mental illness, how we understand the mind and self, and even what psychology as a whole should become in the 21st century.
  • The problem was the options that Ekman had given his subjects when asking them to identify the emotions shown on the faces they were presented with. Those options, Barrett discovered, had limited the ways in which people allowed themselves to think. Barrett explained the problem to me this way: “I can break that experiment really easily, just by removing the words. I can just show you a face and ask how this person feels. Or I can show you two faces, two scowling faces, and I can say, ‘Do these people feel the same thing?’ And agreement drops into the toilet.”
  • Just as that first picture of the bee actually wasn’t a picture of a bee for me until I taught myself that it was, my emotions aren’t actually emotions until I’ve taught myself to think of them that way. Without that, I have only a meaningless mishmash of information about what I’m feeling.
  • emotion isn’t a simple reflex or a bodily state that’s hard-wired into our DNA, and it’s certainly not universally expressed. It’s a contingent act of perception that makes sense of the information coming in from the world around you, how your body is feeling in the moment, and everything you’ve ever been taught to understand as emotion. Culture to culture, person to person even, it’s never quite the same. What’s felt as sadness in one person might as easily be felt as weariness in another, or frustration in someone else.
  • The brain, it turns out, doesn’t consciously process every single piece of information that comes its way. Think of how impossibly distracting the regular act of blinking would be if it did. Instead, it pays attention to what you need to pay attention to, then raids your memory stores to fill in the blanks.
  • In many quarters, Barrett was angrily attacked for her ideas, and she’s been the subject of criticism ever since. “I think Lisa does a disservice to the actual empirical progress that we’re making,” says Dacher Keltner, a Berkeley psychologist
  • Keltner told me that he himself has coded thousands of facial expressions using Ekman’s system, and the results are strikingly consistent: Certain face-emotion combinations recur regularly, and others never occur. “That tells me, ‘Wow, this approach to distinct emotions has real power,’” he says.
  • Ekman reached the peak of his fame in the years following 2001. That’s the year the American Psychological Association named him one of the most influential psychologists of the 20th century. The next year, Malcolm Gladwell wrote an article about him in the New Yorker, and in 2003 he began working pro bono for the TSA. A year later, riding the updraft of success, he left his university post and started the Paul Ekman Group,
  • Barrett sent a small research team to visit the isolated Himba tribe in Namibia, in southern Africa. The plan was this: The team, led by Maria Gendron, would do a study similar to Ekman’s original cross-cultural one, but without providing any of the special words or context-heavy stories that Ekman had used to guide his subjects’ answers. Barrett’s researchers would simply hand a jumbled pile of different expressions (happy, sad, fearful, angry, disgusted, and neutral) to their subjects, and would ask them to sort them into six piles. If emotional expressions are indeed universal, they reasoned, then the Himba would put all low-browed, tight-lipped expressions into an anger pile, all wrinkled-nose faces into a disgust pile, and so on.
  • It didn’t happen that way. The Himba sorted some of the faces in ways that aligned with Ekman’s theory: smiling faces went into one pile, wide-eyed fearful faces went into another, and affectless faces went mostly into a third. But in the other three piles, the Himba mixed up angry scowls, disgusted grimaces, and sad frowns. Without any suggestive context, of the kind that Ekman had originally provided, they simply didn’t recognize the differences that leap out so naturally to Westerners.
  • “What we’re trying to do,” she told me, “is to just get people to pay attention to the fact that there’s a mountain of evidence that does not support the idea that facial expressions are universally recognized as emotional expressions.” That’s the crucial point, of course, because if we acknowledge that, then the entire edifice that Paul Ekman and others have been constructing for the past half-century comes tumbling down. And all sorts of things that we take for granted today—how we understand ourselves and our relationships with others, how we practice psychology
  • Barrett’s theory is still only in its infancy. But other researchers are beginning to take up her ideas, sometimes in part, sometimes in full, and where the science will take us as it expands is impossible to predict. It’s even possible that Barrett will turn out to be wrong, as she herself acknowledges. “Every scientist has to face that,” she says. Still, if she is right, then perhaps the most important change we’ll need to make is in our own heads. If our emotions are not universal physiological responses but concepts we’ve constructed from various biological signals and stashed memories, then perhaps we can exercise more control over our emotional lives than we’ve assumed.
  • “Every experience you have now is seeding your experience for the future,” Barrett told me. “Knowing that, would you choose to do what you’re doing now?” She paused a beat and looked me in the eye. “Well? Would you? You are the architect of your own experience.”
Javier E

False consciousness

  • Marx’s works, including “The Communist Manifesto”, written with Friedrich Engels in 1848, may have had more impact on the modern world than many suppose. Of the manifesto’s ten principal demands, perhaps four have been met in many rich countries, including “free education for all children in public schools” and a “progressive or graduated income tax”.
  • Mr Stedman Jones’s book is above all an intellectual biography, which focuses on the philosophical and political context in which Marx wrote.
  • Marx did not invent communism. Radicals, including Pierre-Joseph Proudhon (1809-65) and the Chartist movement in England, had long used language that modern-day readers would identify as “Marxist”—“to enjoy political equality, abolish property”; “reserve army of labour” and so forth.
  • What, then, was his contribution?
  • Far more significantly, he attempted to provide an overall theoretical description of how capitalism worked
  • in many parts the author is highly critical. For instance, he points out that Marx displayed “condescension towards developments in political economy”
  • More damning, the “Grundrisse”, an unfinished manuscript which many neo-Marxists see as a treasure trove of theory, has “defects [in the] core arguments”.
  • The author encapsulates a feeling of many students of Marx: read the dense, theoretical chapters of “Capital” closely, and no matter how much you try, it is hard to escape the conclusion that there is plenty of nonsense in there.
  • The real value of such a work, in Mr Stedman Jones’s eyes, lies in its documentation of the actual day-to-day life faced by the English working classes.
  • He did not pay enough attention, for example, to objective measures of living standards (such as real wages), which by the 1850s were clearly improving.
Javier E

Who Decides What's Racist? - Persuasion

  • In a now-deleted tweet from May 22, 2020, Nikole Hannah-Jones, a Pulitzer Prize-winning reporter for The New York Times, opined, “There is a difference between being politically black and being racially black.”
  • The implication of Hannah-Jones’s tweet and candidate Biden’s quip seems to be that you can have African ancestry, dark skin, textured hair, and perhaps even some “culturally black” traits regarding tastes in food, music, and ways of moving through the world. But unless you hold the “correct” political beliefs and values, you are not authentically black.
  • Shelly Eversley’s The Real Negro suggests that in the latter half of the 20th century, the criteria of what constitutes “authentic” black experience moved from perceptible outward signs, like the fact of being restricted to segregated public spaces and speaking in a “black” dialect, to psychological, interior signs. In this new understanding, Eversley writes, “the ‘truth’ about race is felt, not performed, not seen.”
  • This insight goes a long way to explaining the current fetishization of experience, especially if it is (redundantly) “lived.” Black people from all walks of life find themselves deferred to by non-blacks
  • black people certainly don’t all “feel” or “experience” the same things. Nor do they all “experience” the same event in an identical way. Finally, even when their experiences are similar, they don’t all think about or interpret their experiences in the same way.
  • we must begin to attend in a serious way to heterodox black voices
  • This need is especially urgent given the ideological homogeneity of the “antiracist” outlook and efforts of elite institutions, including media, corporations, and an overwhelmingly progressive academia. For the arbiters of what it means to be black that dominate these institutions, there is a fairly narrowly prescribed “authentic” black narrative, black perspective, and black position on every issue that matters.
  • When we hear the demand to “listen to black voices,” what is usually meant is “listen to the right black voices.”
  • Many non-black people have heard a certain construction of “the black voice” so often that they are perplexed by black people who don’t fit the familiar model.
  • Similarly, many activists are not in fact “pro-black”: they are pro a rather specific conception of “blackness” that is not necessarily endorsed by all black people.
  • This is where our new website, Free Black Thought (FBT), seeks to intervene in the national conversation. FBT honors black individuals for their distinctive, diverse, and heterodox perspectives, and offers up for all to hear a polyphony, perhaps even a cacophony, of different and differing black voices.
  • The practical effects of the new antiracism are everywhere to be seen, but in few places more clearly than in our children’s schools
  • one might reasonably question what could be wrong with teaching children “antiracist” precepts. But the details here are full of devils.
  • To take an example that could affect millions of students, the state of California has adopted a statewide Ethnic Studies Model Curriculum (ESMC) that reflects “antiracist” ideas. The ESMC’s content inadvertently confirms that contemporary antiracism is often not so much an extension of the civil rights movement as, in certain respects, a tacit abandonment of its ideals.
  • It has thus been condemned as a “perversion of history” by Dr. Clarence Jones, MLK’s legal counsel, advisor, speechwriter, and Scholar in Residence at the Martin Luther King, Jr. Institute at Stanford University.
  • Essentialist thinking about race has also gained ground in some schools. For example, in one elite school, students “are pressured to conform their opinions to those broadly associated with their race and gender and to minimize or dismiss individual experiences that don’t match those assumptions.” These students report feeling that “they must never challenge any of the premises of [the school’s] ‘antiracist’ teachings.”
  • Or consider the third-grade students at R.I. Meyerholz Elementary School in Cupertino, California.
  • The children with “white” in their identity map were taught that they were part of the “dominant culture” which has been “created and maintained…to hold power and stay in power.” They were also taught that they had “privilege” and that “those with privilege have power over others.”
  • In contrast, the non-white students were taught that they were “folx (sic) who do not benefit from their social identities,” and “have little to no privilege and power.”
  • Or take New York City’s public school system, one of the largest educators of non-white children in America. In an effort to root out “implicit bias,” former Schools Chancellor Richard Carranza had his administrators trained in the dangers of “white supremacy culture.”
  • A slide from a training presentation listed “perfectionism,” “individualism,” “objectivity” and “worship of the written word” as white supremacist cultural traits to be “dismantled,”
  • Finally, some schools are adopting antiracist ideas of the sort espoused by Ibram X. Kendi, according to whom, if metrics such as tests and grades reveal disparities in achievement, the project of measuring achievement must itself be racist.
  • Parents are justifiably worried about such innovations. What black parent wants her child to hear that grading or math are “racist” as a substitute for objective assessment and real learning? What black parent wants her child told she shouldn’t worry about working hard, thinking objectively, or taking a deep interest in reading and writing because these things are not authentically black?
  • Clearly, our children’s prospects for success depend on the public being able to have an honest and free-ranging discussion about this new antiracism and its utilization in schools. Even if some black people have adopted its tenets, many more, perhaps most, hold complex perspectives that draw from a constellation of rather different ideologies.
  • So let’s listen to what some heterodox black people have to say about the new antiracism in our schools.
  • Coleman Hughes, a fellow at the Manhattan Institute, points to a self-defeating feature of Kendi-inspired grading and testing reforms: If we reject high academic standards for black children, they are unlikely to rise to “those same rejected standards” and racial disparity is unlikely to decrease
  • Chloé Valdary, the founder of Theory of Enchantment, worries that antiracism may “reinforce a shallow dogma of racial essentialism by describing black and white people in generalizing ways” and discourage “fellowship among peers of different races.”
  • We hope it’s obvious that the point we’re trying to make is not that everyone should accept uncritically everything these heterodox black thinkers say. Our point in composing this essay is that we all desperately need to hear what these thinkers say so we can have a genuine conversation
  • We promote no particular politics or agenda beyond a desire to offer a wide range of alternatives to the predictable fare emanating from elite mainstream outlets. At FBT, Marxists rub shoulders with laissez-faire libertarians. We have no desire to adjudicate who is “authentically black” or whom to prefer.
katherineharron

How to be a human lie detector of fake news - CNN

  • Fake news existed long before the internet. In an essay on political lying in the early 18th century, the writer Jonathan Swift noted that "Falsehood flies and the truth comes limping after it." You have to hire a train to pull the truth, explained English pastor Charles Spurgeon in the 19th century, while a lie is "light as a feather ... a breath will carry it."
  • MIT researchers recently studied more than 10 years' worth of data on the most shared stories on Facebook. Their study covered conspiracy theories about the Boston bombings, misleading reports on natural disasters, unfounded business rumors and incorrect scientific claims. There is an inundation of false medical advice online, for example, that encourages people to avoid life-saving treatments such as vaccines and promotes unproven therapies. (Gwyneth Paltrow's Goop is just one example.)
  • The psychological research does, however, offer us a silver lining to this storm cloud, with various experiments demonstrating that people can learn to be better lie detectors with a little training in critical thinking.
  • If you would like to improve your own lie detection, a good first step is to learn the common logical fallacies -- red herrings, appeals to ignorance, straw men and "ad populum" appeals to the bandwagon -- that purveyors of misinformation may use to create the illusion of truth.
  • These efforts are often called “inoculations,” since they use a real-life example in one domain to teach people the strategies used to spread lies, thereby equipping them to spot those strategies more easily. Educating people about the tobacco industry’s attempts to question the medical consensus on smoking, for example, led people to be more skeptical of articles denying climate change, according to one study.
  • Another project, aimed at inoculating students at North Carolina State University in Raleigh, involved a course on misinformation throughout history. The class was taught about everything from the myth that aliens somehow built the Egyptian pyramids to the theories that NASA’s moon landings were faked. Along the way, the students had to identify the erroneous logic that helped create those arguments, and the motivations that may lead some people to spread such ideas.
  • You could also try basic strategies such as cross-checking different outlets and finding the original source of a claim. You might also look at independent fact-checking websites used in the MIT study such as Snopes, PolitiFact and TruthOrFiction.com.
  • The psychological literature offers us one good strategy against bias, called the “consider the opposite” method. This involves asking yourself whether you would have been as credulous of a claim had it run counter to your own opinions. And if not, what kind of additional scrutiny might you have applied? This should help you to identify the weaknesses in your own thinking.
  • Falsehoods may fly, but with this lie detection kit, you can better ensure your actions and beliefs remain grounded in the truth.
Javier E

'Follow the science': As Year 3 of the pandemic begins, a simple slogan becomes a polit...

  • advocates for each side in the masking debate are once again claiming the mantle of science to justify political positions
  • pleas to “follow the science” have consistently yielded to use of the phrase as a rhetorical land mine.
  • “So much is mixed up with science — risk and values and politics. The phrase can come off as sanctimonious,” she said, “and the danger is that it says, ‘These are the facts,’ when it should say, ‘This is the situation as we understand it now, and that understanding will keep changing.’”
  • The pandemic’s descent from medical emergency to political flash point can be mapped as a series of surges of bickering over that one simple phrase. “Follow the science!” people on both sides insisted, as the guidance from politicians and public health officials shifted over the past two years from anti-mask to pro-mask to “keep on masking” to more refined recommendations about which masks to wear and now to a spotty lifting of mandates.
  • demands that the other side “follow the science” are often a complete rejection of another person’s cultural and political identity: “It’s not just people believing the scientific research that they agree with. It’s that in this extreme polarization we live with, we totally discredit ideas because of who holds them.”
  • “I’m struggling as much as anyone else,” she said. “Our job as informed citizens in the pandemic is to be like judges and synthesize information from both sides, but with the extreme polarization, nobody really trusts each other enough to know how to judge their information.”
  • Many people end up putting their trust in some subset of the celebrity scientists they see online or on TV. “Follow the science” often means “follow the scientists” — a distinction that offers insight into why there’s so much division over how to cope with the virus,
  • although a slim majority of the Americans surveyed don’t believe that “scientists adjust their findings to get the answers they want,” 31 percent do believe scientists cook the books and another 16 percent were unsure.
  • Those who mistrust scientists were vastly less likely to be worried about getting covid-19 — and more likely to be supporters of former president Donald Trump,
  • A person’s beliefs about scientists’ integrity “is the strongest and most consistent predictor of views about … the threats from covid-19,”
  • When a large minority of Americans believe scientists’ conclusions are determined by their own opinions, that demonstrates a widespread “misunderstanding of scientific methods, uncertainty, and the incremental nature of scientific inquiry,” the sociologists concluded.
  • Americans’ confidence in science has declined in recent decades, especially among Republicans, according to Gallup polls
  • The survey found last year that 64 percent of Americans said they had “a great deal” or “quite a lot” of confidence in science, down from 70 percent who said that back in 1975
  • Confidence in science jumped among Democrats, from 67 percent in the earlier poll to 79 percent last year, while Republicans’ confidence cratered during the same period from 72 percent to 45 percent.
  • The fact that both sides want to be on the side of “science” “bespeaks tremendous confidence or admiration for a thing called ‘science,’ ”
  • Even in this time of rising mistrust, everybody wants to have the experts on their side.
  • That’s been true in American debates regarding science for many years
  • Four decades ago, when arguments about climate change were fairly new, people who rejected the idea looked at studies showing a connection between burning coal and acid rain and dubbed them “junk science.” The “real” science, those critics said, showed otherwise.
  • “Even though the motive was to reject a scientific consensus, there was still a valorization of expertise,”
  • “Even people who took a horse dewormer when they got covid-19 were quick to note that the drug was created by a Nobel laureate,” he said. “Almost no one says they’re anti-science.”
  • “There isn’t a thing called ‘the science.’ There are multiple sciences with active disagreements with each other. Science isn’t static.”
  • The problem is that the phrase has become more a political slogan than a commitment to neutral inquiry, “which bespeaks tremendous ignorance about what science is,”
  • Scientists and laypeople alike are often guilty of presenting science as a monolithic statement of fact, rather than an ever-evolving search for evidence to support theories.
  • while scientists are trained to be comfortable with uncertainty, a pandemic that has killed and sickened millions has made many people eager for definitive solutions.
  • “I just wish when people say ‘follow the science,’ it’s not the end of what they say, but the beginning, followed by ‘and here’s the evidence.’”
  • As much as political leaders may pledge to “follow the science,” they answer to constituents who want answers and progress, so the temptation is to overpromise.
  • It’s never easy to follow the science, many scientists warn, because people’s behaviors are shaped as much by fear, folklore and fake science as by well-vetted studies or evidence-based government guidance.
  • “Science cannot always overcome fear,”
  • Some of the states with the lowest covid case rates and highest vaccination rates nonetheless kept many students in remote learning for the longest time, a phenomenon she attributed to “letting fear dominate our narrative.”
  • “That’s been true of the history of science for a long time,” Gandhi said. “As much as we try to be rigorous about fact, science is always subject to the political biases of the time.”
  • A study published in September indicates that people who trust in science are actually more likely to believe fake scientific findings and to want to spread those falsehoods
  • The study, reported in the Journal of Experimental Social Psychology, found that trusting in science did not give people the tools they need to understand that the scientific method leads not to definitive answers, but to ever-evolving theories about how the world works.
  • Rather, people need to understand how the scientific method works, so they can ask good questions about studies.
  • Trust in science alone doesn’t arm people against misinformation,
  • Overloaded with news about studies and predictions about the virus’s future, many people just tune out the information flow,
  • That winding route is what science generally looks like, Swann said, so people who are frustrated and eager for solid answers are often drawn into dangerous “wells of misinformation, and they don’t even realize it,” she said. “If you were told something every day by people you trusted, you might believe it, too.”
  • With no consensus about how and when the pandemic might end, or about which public health measures to impose and how long to keep them in force, following the science seems like an invitation to a very winding, even circular path.
Javier E

Opinion | Empathy Is Exhausting. There Is a Better Way. - The New York Times

  • “What can I even do?” Many people are feeling similarly defeated, and many others are outraged by the political inaction that ensues. A Muslim colleague of mine said she was appalled to see so much indifference to the atrocities and innocent lives lost in Gaza and Israel. How could anyone just go on as if nothing had happened?
  • inaction isn’t always caused by apathy. It can also be the product of empathy. More specifically, it can be the result of what psychologists call empathic distress: hurting for others while feeling unable to help.
  • I felt it intensely this fall, as violence escalated abroad and anger echoed across the United States. Helpless as a teacher, unsure of how to protect my students from hostility and hate. Useless as a psychologist and writer, finding words too empty to offer any hope. Powerless as a parent, searching for ways to reassure my kids that the world is a safe place and most people are good. Soon I found myself avoiding the news altogether and changing the subject when war came up
  • Understanding how empathy can immobilize us like that is a critical step for helping others — and ourselves.
  • Early researchers labeled it compassion fatigue and described it as the cost of caring.
  • Having concluded that nothing they do will make a difference, they start to become indifferent.
  • The symptoms of empathic distress were originally diagnosed in health care, with nurses and doctors who appeared to become insensitive to the pain of their patients.
  • Empathic distress explains why many people have checked out in the wake of these tragedies
  • when two neuroscientists, Olga Klimecki and Tania Singer, reviewed the evidence, they discovered that “compassion fatigue” is a misnomer. Caring itself is not costly. What drains people is not merely witnessing others’ pain but feeling incapable of alleviating it.
  • In times of sustained anguish, empathy is a recipe for more distress, and in some cases even depression. What we need instead is compassion.
  • empathy and compassion aren’t the same. Empathy absorbs others’ emotions as your own: “I’m hurting for you.”
  • Compassion focuses your action on their emotions: “I see that you’re hurting, and I’m here for you.”
  • “Empathy is biased,” the psychologist Paul Bloom writes. It’s something we usually reserve for our own group, and in that sense, it can even be “a powerful force for war and atrocity.”
  • Dr. Singer and their colleagues trained people to empathize by trying to feel other people’s pain. When the participants saw someone suffering, it activated a neural network that would light up if they themselves were in pain. It hurt. And when people can’t help, they escape the pain by withdrawing.
  • To combat this, the Klimecki and Singer team taught their participants to respond with compassion rather than empathy — focusing not on sharing others’ pain but on noticing their feelings and offering comfort.
  • A different neural network lit up, one associated with affiliation and social connection. This is why a growing body of evidence suggests that compassion is healthier for you and kinder to others than empathy:
  • When you see others in pain, instead of causing you to get overloaded and retreat, compassion motivates you to reach out and help
  • The most basic form of compassion is not assuaging distress but acknowledging it.
  • in my research, I’ve found that being helpful has a secondary benefit: It’s an antidote to feeling helpless.
  • To figure out who needs your support after something terrible happens, the psychologist Susan Silk suggests picturing a dart board, with the people closest to the trauma in the bull’s-eye and those more peripherally affected in the outer rings.
  • Once you’ve figured out where you belong on the dart board, look for support from people outside your ring, and offer it to people closer to the center.
  • Even if people aren’t personally in the line of fire, attacks targeting members of a specific group can shatter a whole population’s sense of security.
  • If you notice that people in your life seem disengaged around an issue that matters to you, it’s worth considering whose pain they might be carrying.
  • Instead of demanding that they do more, it may be time to show them compassion — and help them find compassion for themselves, too.
  • Your small gesture of kindness won’t end the crisis in the Middle East, but it can help someone else. And that can give you the strength to help more.
Emily Horwitz

News from The Associated Press - 0 views

  • If you saw the film "Argo," no, you didn't miss this development, which is recounted in Mendez's book about the real-life operation. It wasn't there because director Ben Affleck and screenwriter Chris Terrio replaced it with an even more dramatic scenario, involving canceled flight reservations, suspicious Iranian officials who call the Hollywood office of the fake film crew (a call answered just in time), and finally a heart-pounding chase on the tarmac just as the plane's wheels lift off, seconds from catastrophe.
  • they've caught some flak for the liberties they took in the name of entertainment.
  • And they aren't alone - two other high-profile best-picture nominees this year, Kathryn Bigelow's "Zero Dark Thirty" and Steven Spielberg's "Lincoln," have also been criticized for different sorts of factual issues.
  • ...15 more annotations...
  • But because these three major films are in contention, the issue has come to the forefront of this year's Oscar race, and with it a thorny cultural question: Does the audience deserve the truth, the whole truth and nothing but? Surely not, but just how much fiction is OK?
  • In response to a complaint by a Connecticut congressman, Kushner acknowledged he'd changed the details for dramatic effect, having two Connecticut congressmen vote against the amendment when, in fact, all four voted for it. (The names of those congressmen were changed, to avoid changing the vote of specific individuals.)
  • Kushner said he had "adhered to time-honored and completely legitimate standards for the creation of historical drama, which is what 'Lincoln' is. I hope nobody is shocked to learn that I also made up dialogue and imagined encounters and invented characters."
  • "Maybe changing the vote went too far," says Richard Walter, chairman of screenwriting at the University of California, Los Angeles. "Maybe there was another way to do it. But really, it's not terribly important. People accept that liberties will be taken. A movie is a movie. People going for a history lesson are going to the wrong place."
  • Walter says he always tells his students: "Go for the feelings. Because the only thing that's truly real in the movies are the feelings that people feel when they watch."
  • No subject or individual's life is compelling and dramatic enough by itself, he says, that it neatly fits into a script with three acts, subplots, plot twists and a powerful villain.
  • Reeves, who actually gave the "Lincoln" script a negative review because he thought it was too heavy on conversation and lacking action. He adds, though, that when the subject is as famous as Lincoln, one has a responsibility to be more faithful to the facts.
  • "This is fraught territory," he says. "You're always going to have to change something, and you're always going to get in some sort of trouble, with somebody," he says.
  • Futterman also doesn't begrudge the "Argo" filmmakers, because he feels they use a directorial style that implies some fun is being had with the story. "All the inside joking about Hollywood - tonally, you get a sense that something is being played with," he says.
  • Futterman says he was sympathetic to those concerns and would certainly have addressed them in the script, had he anticipated them.
  • Of the three Oscar-nominated films in question, "Zero Dark Thirty" has inspired the most fervent debate. The most intense criticism, despite acclaim for the filmmaking craft involved, has been about its depictions of interrogations, with some, including a group of senators, saying the film misleads viewers for suggesting that torture provided information that helped the CIA find Osama bin Laden.
  • have been questions about the accuracy of the depiction of the main character, a CIA officer played by Jessica Chastain; the real person - or even combination of people, according to some theories - that she plays remains anonymous.
  • screenwriters have a double responsibility: to the material and to the audience.
  • The debate over "Argo" has been much less intense, though there has been some grumbling from former officials in Britain and New Zealand that their countries were portrayed incorrectly in the film as offering no help at all to the six Americans, whereas actually, as Mendez writes, they did provide some help.
  • "When I am hungry and crave a tuna fish sandwich, I don't go to a hardware store," he says. "When I seek a history lesson, I do not go to a movie theater. I loved `Argo' even though I know there was no last-minute turn-around via a phone call from President Carter, nor were there Iranian police cars chasing the plane down the tarmac as it took off. So what? These conceits simply make the movie more exciting."
  • This article reaffirmed my feelings that we can't trust everything that we see or hear through the media, because it is often skewed to better captivate the target audience. As the article stated, there appears to be a fine line in catering to the attention span of the audience, and respecting the known facts of a given event that is portrayed by a movie.
Javier E

Adam Kirsch: Art Over Biology | The New Republic - 1 views

  • Nietzsche, who wrote in Human, All Too Human, under the rubric “Art dangerous for the artist,” about the particular ill-suitedness of the artist to flourishing in a modern scientific age: When art seizes an individual powerfully, it draws him back to the views of those times when art flowered most vigorously.... The artist comes more and more to revere sudden excitements, believes in gods and demons, imbues nature with a soul, hates science, becomes unchangeable in his moods like the men of antiquity, and desires the overthrow of all conditions that are not favorable to art.... Thus between him and the other men of his period who are the same age a vehement antagonism is finally generated, and a sad end
  • What is modern is the sense of the superiority of the artist’s inferiority, which is only possible when the artist and the intellectual come to see the values of ordinary life—prosperity, family, worldly success, and happiness—as inherently contemptible.
  • Art, according to a modern understanding that has not wholly vanished today, is meant to be a criticism of life, especially of life in a materialist, positivist civilization such as our own. If this means the artist does not share in civilization’s boons, then his suffering will be a badge of honor.
  • ...18 more annotations...
  • The iron law of Darwinian evolution is that everything that exists strives with all its power to reproduce, to extend life into the future, and that every feature of every creature can be explained as an adaptation toward this end. For the artist to deny any connection with the enterprise of life, then, is to assert his freedom from this universal imperative; to reclaim negatively the autonomy that evolution seems to deny to human beings. It is only because we can freely choose our own ends that we can decide not to live for life, but for some other value that we posit. The artist’s decision to produce spiritual offspring rather than physical ones is thus allied to the monk’s celibacy and the warrior’s death for his country, as gestures that deny the empire of mere life.
  • Animals produce beauty on their bodies; humans can also produce it in their artifacts. The natural inference, then, would be that art is a human form of sexual display, a way for males to impress females with spectacularly redundant creations.
  • For Darwin, the human sense of beauty was not different in kind from the bird’s.
  • Still, Darwin recognized that the human sense of beauty was mediated by “complex ideas and trains of thought,” which make it impossible to explain in terms as straightforward as a bird’s:
  • Put more positively, one might say that any given work of art can be discussed critically and historically, but not deduced from the laws of evolution.
  • with the rise of evolutionary psychology, it was only a matter of time before the attempt was made to explain art in Darwinian terms. After all, if ethics and politics can be explained by game theory and reciprocal altruism, there is no reason why aesthetics should be different: in each case, what appears to be a realm of human autonomy can be reduced to the covert expression of biological imperatives
  • Still, there is an unmistakable sense in discussions of Darwinian aesthetics that by linking art to fitness, we can secure it against charges of irrelevance or frivolousness—that mattering to reproduction is what makes art, or anything, really matter.
  • The first popular effort in this direction was the late Denis Dutton’s much-discussed book The Art Instinct, which appeared in 2009.
  • Dutton’s Darwinism was aesthetically conservative: “Darwinian aesthetics,” he wrote, “can restore the vital place of beauty, skill, and pleasure as high artistic values.” Dutton’s argument has recently been reiterated and refined by a number of new books,
  • “The universality of art and artistic behaviors, their spontaneous appearance everywhere across the globe ... and the fact that in most cases they can be easily recognized as artistic across cultures suggest that they derive from a natural, innate source: a universal human psychology.”
  • Again like language, art is universal in the sense that any local expression of it can be “learned” by anyone.
  • Yet earlier theorists of evolution were reluctant to say that art was an evolutionary adaptation like language, for the simple reason that it does not appear to be evolutionarily adaptive.
  • Stephen Jay Gould suggested that art was not an evolutionary adaptation but what he called a “spandrel”—that is, a showy but accidental by-product of other adaptations that were truly functional
  • the very words “success” and “failure,” despite themselves, bring an emotive and ethical dimension into the discussion, so impossible is it for human beings to inhabit a valueless world. In the nineteenth century, the idea that fitness for survival was a positive good motivated social Darwinism and eugenics. Proponents of these ideas thought that in some way they were serving progress by promoting the flourishing of the human race, when the basic premise of Darwinism is that there is no such thing as progress or regress, only differential rates of reproduction
  • In particular, Darwin suggests that it is impossible to explain the history or the conventions of any art by the general imperatives of evolution
  • Boyd begins with the premise that human beings are pattern-seeking animals: both our physical perceptions and our social interactions are determined by our brain’s innate need to find and to make coherent patterns
  • Art, then, can be defined as the calisthenics of pattern-finding. “Just as animal physical play refines performance, flexibility, and efficiency in key behaviors,” Boyd writes, “so human art refines our performance in our key perceptual and cognitive modes, in sight (the visual arts), sound (music), and social cognition (story). These three modes of art, I propose, are adaptations ... they show evidence of special design in humans, design that offers survival and especially reproductive advantages.”
Javier E

Untier Of Knots « The Dish - 0 views

  • Benedict XVI and John Paul II focused on restoring dogmatic certainty as the counterpart to papal authority. Francis is arguing that both, if taken too far, can be sirens leading us away from God, not ensuring our orthodoxy but sealing us off in calcified positions and rituals that can come to mean nothing outside themselves
  • In this quest to seek and find God in all things there is still an area of uncertainty. There must be. If a person says that he met God with total certainty and is not touched by a margin of uncertainty, then this is not good. For me, this is an important key. If one has the answers to all the questions – that is the proof that God is not with him. It means that he is a false prophet using religion for himself. The great leaders of the people of God, like Moses, have always left room for doubt. You must leave room for the Lord, not for our certainties; we must be humble.
  • If the Christian is a restorationist, a legalist, if he wants everything clear and safe, then he will find nothing. Tradition and memory of the past must help us to have the courage to open up new areas to God.
  • ...31 more annotations...
  • In the end, you realize your only real option – against almost every fiber in your irate being – is to take each knot in turn, patiently and gently undo it, loosen a little, see what happens, and move on to the next. You will never know exactly when all the knots will resolve themselves – it can happen quite quickly after a while or seemingly never. But you do know that patience, and concern with the here and now, is the only way to “solve” the “problem.” You don’t look forward with a plan; you look down with a practice.
  • we can say what God is not, we can speak of his attributes, but we cannot say what He is. That apophatic dimension, which reveals how I speak about God, is critical to our theology
  • I would also classify as arrogant those theologies that not only attempted to define with certainty and exactness God’s attributes, but also had the pretense of saying who He was.
  • It is only in living that we achieve hints and guesses – and only hints and guesses – of what the Divine truly is. And because the Divine is found and lost by humans in time and history, there is no reachable truth for humans outside that time and history.
  • We are part of an unfolding drama in which the Christian, far from clinging to some distant, pristine Truth he cannot fully understand, will seek to understand and discern the “signs of the times” as one clue as to how to live now, in the footsteps of Jesus. Or in the words of T.S. Eliot, There is only the fight to recover what has been lost And found and lost again and again: and now, under conditions That seem unpropitious. But perhaps neither gain nor loss. For us, there is only the trying. The rest is not our business.
  • Ratzinger’s Augustinian notion of divine revelation: it is always a radical gift; it must always be accepted without question; it comes from above to those utterly unworthy below; and we are too flawed, too sinful, too human to question it in even the slightest respect. And if we ever compromise an iota on that absolute, authentic, top-down truth, then we can know nothing as true. We are, in fact, lost for ever.
  • A Christian life is about patience, about the present and about trust that God is there for us. It does not seek certainty or finality to life’s endless ordeals and puzzles. It seeks through prayer and action in the world to listen to God’s plan and follow its always-unfolding intimations. It requires waiting. It requires diligence
  • We may never know why exactly Benedict resigned as he did. But I suspect mere exhaustion of the body and mind was not the whole of it. He had to see, because his remains such a first-rate mind, that his project had failed, that the levers he continued to pull – more and more insistent doctrinal orthodoxy, more political conflict with almost every aspect of the modern world, more fastidious control of liturgy – simply had no impact any more.
  • The Pope must accompany those challenging existing ways of doing things! Others may know better than he does. Or, to feminize away the patriarchy: I dream of a church that is a mother and shepherdess. The church’s ministers must be merciful, take responsibility for the people, and accompany them like the good Samaritan, who washes, cleans, and raises up his neighbor. This is pure Gospel.
  • the key to Francis’ expression of faith is an openness to the future, a firm place in the present, and a willingness to entertain doubt, to discern new truths and directions, and to grow. Think of Benedict’s insistence on submission of intellect and will to the only authentic truth (the Pope’s), and then read this: Within the Church countless issues are being studied and reflected upon with great freedom. Differing currents of thought in philosophy, theology, and pastoral practice, if open to being reconciled by the Spirit in respect and love, can enable the Church to grow, since all of them help to express more clearly the immense riches of God’s word. For those who long for a monolithic body of doctrine guarded by all and leaving no room for nuance, this might appear as undesirable and leading to confusion. But in fact such variety serves to bring out and develop different facets of the inexhaustible riches of the Gospel.
  • Francis, like Jesus, has had such an impact in such a short period of time simply because of the way he seems to be. His being does not rely on any claims to inherited, ecclesiastical authority; his very way of life is the only moral authority he wants to claim.
  • faith is, for Francis, a way of life, not a set of propositions. It is a way of life in community with others, lived in the present yet always, deeply, insistently aware of eternity.
  • Father Howard Gray S.J. has put it simply enough: Ultimately, Ignatian spirituality trusts the world as a place where God dwells and labors and gathers all to himself in an act of forgiveness where that is needed, and in an act of blessing where that is prayed for.
  • Underlying all this is a profound shift away from an idea of religion as doctrine and toward an idea of religion as a way of life. Faith is a constantly growing garden, not a permanently finished masterpiece
  • Some have suggested that much of what Francis did is compatible with PTSD. He disowned his father and family business, and he chose to live homeless, and close to naked, in the neighboring countryside, among the sick and the animals. From being the dashing man of society he had once been, he became a homeless person with what many of us today would call, at first blush, obvious mental illness.
  • these actions – of humility, of kindness, of compassion, and of service – are integral to Francis’ resuscitation of Christian moral authority. He is telling us that Christianity, before it is anything else, is a way of life, an orientation toward the whole, a living commitment to God through others. And he is telling us that nothing – nothing – is more powerful than this.
  • I would not speak about, not even for those who believe, an “absolute” truth, in the sense that absolute is something detached, something lacking any relationship. Now, the truth is a relationship! This is so true that each of us sees the truth and expresses it, starting from oneself: from one’s history and culture, from the situation in which one lives, etc. This does not mean that the truth is variable and subjective. It means that it is given to us only as a way and a life. Was it not Jesus himself who said: “I am the way, the truth, the life”? In other words, the truth is one with love, it requires humbleness and the willingness to be sought, listened to and expressed.
  • “proselytism is solemn nonsense.” That phrase – deployed by the Pope in dialogue with the Italian atheist Eugenio Scalfari (as reported by Scalfari) – may seem shocking at first. But it is not about denying the revelation of Jesus. It is about how that revelation is expressed and lived. Evangelism, for Francis, is emphatically not about informing others about the superiority of your own worldview and converting them to it. That kind of proselytism rests on a form of disrespect for another human being. Something else is needed:
  • Instead of seeming to impose new obligations, Christians should appear as people who wish to share their joy, who point to a horizon of beauty and who invite others to a delicious banquet. It is not by proselytizing that the Church grows, but “by attraction.”
  • what you see in the life of Saint Francis is a turn from extreme violence to extreme poverty, as if only the latter could fully compensate for the reality of the former. This was not merely an injunction to serve the poor. It is the belief that it is only by being poor or becoming poor that we can come close to God
  • Pope Francis insists – and has insisted throughout his long career in the church – that poverty is a key to salvation. And in choosing the name Francis, he explained last March in Assisi, this was the central reason why:
  • Saint Francis. His conversion came after he had gone off to war in defense of his hometown, and, after witnessing horrifying carnage, became a prisoner of war. After his release from captivity, his strange, mystical journey began.
  • the priority of practice over theory, of life over dogma. Evangelization is about sitting down with anyone anywhere and listening and sharing and being together. A Christian need not be afraid of this encounter. Neither should an atheist. We are in this together, in the same journey of life, with the same ultimate mystery beyond us. When we start from that place – of radical humility and radical epistemological doubt – proselytism does indeed seem like nonsense, a form of arrogance and detachment, reaching for power, not freedom. And evangelization is not about getting others to submit their intellect and will to some new set of truths; it is about an infectious joy for a new way of living in the world. All it requires – apart from joy and faith – is patience.
  • “Preach the Gospel always. If necessary, with words.”
  • But there is little sense that a political or economic system can somehow end the problem of poverty in Francis’ worldview. And there is the discomfiting idea that poverty itself is not an unmitigated evil. There is, indeed, a deep and mysterious view, enunciated by Jesus, and held most tenaciously by Saint Francis, that all wealth, all comfort, and all material goods are suspect and that poverty itself is a kind of holy state to which we should all aspire.
  • Not only was Saint Francis to become homeless and give up his patrimony, he was to travel on foot, wearing nothing but a rough tunic held together with rope. Whatever else it is, this is not progressivism. It sees no structural, human-devised system as a permanent improver of our material lot. It does not envision a world without poverty, but instead a church of the poor and for the poor. The only material thing it asks of the world, or of God, is daily bread – and only for today, never for tomorrow.
  • From this perspective, the idea that a society should be judged by the amount of things it can distribute to as many people as possible is anathema. The idea that there is a serious social and political crisis if we cannot keep our wealth growing every year above a certain rate is an absurdity.
  • this is a 21st-century heresy. Which means, I think, that this Pope is already emerging and will likely only further emerge as the most potent critic of the newly empowered global capitalist project.
  • Now, the only dominant ideology in the world is the ideology of material gain – either through the relatively free markets of the West or the state-controlled markets of the East. And so the church’s message is now harder to obscure. It stands squarely against the entire dominant ethos of our age. It is the final resistance.
  • For Francis, history has not come to an end, and capitalism, in as much as it is a global ideology that reduces all of human activity to the cold currency of wealth, is simply another “ism” to be toppled in humankind’s unfolding journey toward salvation on earth.
  • Francis will grow as the church reacts to him; it will be a dynamic, not a dogma; and it will be marked less by the revelation of new things than by the new recognition of old things, in a new language. It will be, if its propitious beginnings are any sign, a patient untying of our collective, life-denying knots.
Javier E

John Roberts, the Umpire in Chief - The New York Times - 0 views

  • The Roberts-Scalia debate is part of a longstanding argument about how judges should interpret laws passed by Congress.
  • the chief justice embraces an approach called “purposivism,” while Justice Scalia prefers “textualism.”
  • In Judge Katzmann’s account, purposivism has been the approach favored for most of American history by conservative and liberal judges, senators, and representatives, as well as administrative agencies. Purposivism holds that judges shouldn’t confine themselves to the words of a law but should try to discern Congress’s broader purposes.
  • ...6 more annotations...
  • In the 1980s, when he was a lower court judge, Justice Scalia began to champion a competing view of statutory interpretation, textualism, which holds that judges should confine themselves to interpreting the words that Congress chose without trying to discern Congress’s broader purposes.
  • Textualism, in this view, promises to constrain judicial activism by preventing judges from roving through legislative history in search of evidence that supports their own policy preferences. But in the view of its critics, like Chief Judge Katzmann, textualism “increases the probability that a judge will construe a law in a manner that the legislators did not intend.”
  • Judge Katzmann, who was appointed by President Bill Clinton, also accuses Justice Scalia of inconsistency for consulting the intent of the framers in the case of constitutional interpretation but not statutory interpretation.
  • The chief justice’s embrace of bipartisan judicial restraint in the second Affordable Care Act case was consistent with his embrace of the same philosophy in the first Affordable Care Act case in 2012, where he quoted one of his heroes, Justice Oliver Wendell Holmes Jr: “The rule is settled that as between two possible interpretations of a statute, by one of which it would be unconstitutional and by the other valid, our plain duty is to adopt that which will save the Act.”
  • Chief Justice Roberts was not, as Justice Scalia charged, rewriting the law. Instead he was advancing the view that he championed soon after his confirmation: In a polarized age, it is important for the Supreme Court to maintain its institutional legitimacy by deferring to the political branches.
  • Chief Justice Roberts’s relatively consistent embrace of judicial deference to democratic decisions supports his statement during his confirmation hearings that judges should be like umpires calling “balls and strikes.” As he put it then: “Umpires don’t make the rules, they apply them. The role of an umpire and a judge is critical. They make sure everybody plays by the rules, but it is a limited role. Nobody ever went to a ballgame to see the umpire.”
Javier E

What 'White Privilege' Really Means - NYTimes.com - 0 views

  • This week’s conversation is with Naomi Zack, a professor of philosophy at the University of Oregon and the author of “The Ethics and Mores of Race: Equality After the History of Philosophy.”
  • My first book, “Race and Mixed Race” (1991) was an analysis of the incoherence of U.S. black/white racial categories in their failure to allow for mixed race. In “Philosophy of Science and Race,” I examined the lack of a scientific foundation for biological notions of human races, and in “The Ethics and Mores of Race,” I turned to the absence of ideas of universal human equality in the Western philosophical tradition.
  • Critical philosophy of race, like critical race theory in legal studies, seeks to understand the disadvantages of nonwhite racial groups in society (blacks especially) by understanding social customs, laws, and legal practices.
  • ...14 more annotations...
  • What’s happening in Ferguson is the result of several recent historical factors and deeply entrenched racial attitudes, as well as a breakdown in participatory democracy.
  • In Ferguson, the American public has awakened to images of local police, fully decked out in surplus military gear from our recent wars in Iraq and Afghanistan, who are deploying all that in accordance with a now widespread “broken windows” policy, which was established on the hypothesis that if small crimes and misdemeanors are checked in certain neighborhoods, more serious crimes will be deterred. But this policy quickly intersected with police racial profiling already in existence to result in what has recently become evident as a propensity to shoot first.
  • How does this “broken windows” policy relate to the tragic deaths of young black men/boys? N.Z.: People are now stopped by the police for suspicion of misdemeanor offenses and those encounters quickly escalate.
  • Young black men are the convenient target of choice in the tragic intersection of the broken windows policy, the domestic effects of the war on terror and police racial profiling.
  • Why do you think that young black men are disproportionately targeted? N.Z.: Exactly why unarmed young black men are the target of choice, as opposed to unarmed young white women, or unarmed old black women, or even unarmed middle-aged college professors, is an expression of a long American tradition of suspicion and terrorization of members of those groups who have the lowest status in our society and have suffered the most extreme forms of oppression, for centuries.
  • Police in the United States are mostly white and mostly male. Some confuse their work roles with their own characters. As young males, they naturally pick out other young male opponents. They have to win, because they are the law, and they have the moral charge of protecting.
  • So young black males, who have less status than they do, and are already more likely to be imprisoned than young white males, are natural suspects.
  • Besides the police, a large segment of the white American public believes they are in danger from blacks, especially young black men, who they think want to rape young white women. This is an old piece of American mythology that has been invoked to justify crimes against black men, going back to lynching. The perceived danger of blacks becomes very intense when blacks are harmed.
  • The term “white privilege” is misleading. A privilege is special treatment that goes beyond a right. It’s not so much that being white confers privilege but that not being white means being without rights in many cases. Not fearing that the police will kill your child for no reason isn’t a privilege. It’s a right. 
  • that is what “white privilege” is meant to convey, that whites don’t have many of the worries nonwhites, especially blacks, do.
  • Other examples of white privilege include all of the ways that whites are unlikely to end up in prison for some of the same things blacks do, not having to worry about skin-color bias, not having to worry about being pulled over by the police while driving or stopped and frisked while walking in predominantly white neighborhoods, having more family wealth because your parents and other forebears were not subject to Jim Crow and slavery.
  • Probably all of the ways in which whites are better off than blacks in our society are forms of white privilege.
  • Over half a century later, it hasn’t changed much in the United States. Black people are still imagined to have a hyper-physicality in sports, entertainment, crime, sex, politics, and on the street. Black people are not seen as people with hearts and minds and hopes and skills but as cyphers that can stand in for anything whites themselves don’t want to be or think they can’t be.
  • race is through and through a social construct, previously constructed by science, now by society, including its most extreme victims. But, we cannot abandon race, because people would still discriminate and there would be no nonwhite identities from which to resist. Also, many people just don’t want to abandon race and they have a fundamental right to their beliefs. So race remains with us as something that needs to be put right.
johnsonma23

What 'White Privilege' Really Means - NYTimes.com

  • It’s too soon to tell, but “Don’t Shoot” could become a real political movement — or it could peter out as the morally outraged self-expression of the moment, like Occupy Wall Street.
  • people in such circumstances would vote for political representatives on all levels of government who would be their advocates.
  • Middle-class and poor blacks in the United States do less well than whites with the same income on many measures of human well-being: educational attainment, family wealth, employment, health, longevity, infant mortality.
  • But the value of money pales in contrast to the tragedy this country is now forced to deal with. A tragedy is the result of a mistake, of an error in judgment that is based on habit and character, which brings ruin
  • People are now stopped by the police for suspicion of misdemeanor offenses and those encounters quickly escalate. The death of Michael Brown, like the death of Trayvon Martin before him and the death of Oscar Grant before him, may be but the tip of an iceberg.
Javier E

The Obama Boom - The New York Times

  • What did Mr. Obama do that was supposed to kill jobs? Quite a lot, actually. He signed the 2010 Dodd-Frank financial reform, which critics claimed would crush employment by starving businesses of capital.
  • He raised taxes on high incomes, especially at the very top, where average tax rates rose by about six and a half percentage points after 2012, a step that critics claimed would destroy incentives.
  • Yet none of the dire predicted consequences of these policies have materialized.
  • And he enacted a health reform that went into full effect in 2014, amid claims that it would have catastrophic effects on employment.
  • what do we learn from this impressive failure to fail? That the conservative economic orthodoxy dominating the Republican Party is very, very wrong.
  • conservative orthodoxy has a curiously inconsistent view of the abilities and motivations of corporations and wealthy individuals — I mean, job creators.
  • On one side, this elite is presumed to be a bunch of economic superheroes, able to deliver universal prosperity by summoning the magic of the marketplace. On the other side, they’re depicted as incredibly sensitive flowers who wilt in the face of adversity — raise their taxes a bit, subject them to a few regulations, or for that matter hurt their feelings in a speech or two, and they’ll stop creating jobs and go sulk in their tents, or more likely their mansions.
  • It’s a doctrine that doesn’t make much sense, but it conveys a clear message that, whaddya know, turns out to be very convenient for the elite: namely, that injustice is a law of nature, that we’d better not do anything to make our society less unequal or protect ordinary families from financial risks. Because if we do, the usual suspects insist, we’ll be severely punished by the invisible hand, which will collapse the economy.
  • From a conservative point of view, Mr. Obama did everything wrong, afflicting the comfortable (slightly) and comforting the afflicted (a lot), and nothing bad happened. We can, it turns out, make our society better after all.
Javier E

At the Existentialist Café: Freedom, Being, and Apricot Cocktails with Jean-P...

  • The phenomenologists’ leading thinker, Edmund Husserl, provided a rallying cry, ‘To the things themselves!’ It meant: don’t waste time on the interpretations that accrue upon things, and especially don’t waste time wondering whether the things are real. Just look at this that’s presenting itself to you, whatever this may be, and describe it as precisely as possible.
  • You might think you have defined me by some label, but you are wrong, for I am always a work in progress. I create myself constantly through action, and this is so fundamental to my human condition that, for Sartre, it is the human condition, from the moment of first consciousness to the moment when death wipes it out. I am my own freedom: no more, no less.
  • Sartre wrote like a novelist — not surprisingly, since he was one. In his novels, short stories and plays as well as in his philosophical treatises, he wrote about the physical sensations of the world and the structures and moods of human life. Above all, he wrote about one big subject: what it meant to be free. Freedom, for him, lay at the heart of all human experience, and this set humans apart from all other kinds of object.
  • Sartre listened to his problem and said simply, ‘You are free, therefore choose — that is to say, invent.’ No signs are vouchsafed in this world, he said. None of the old authorities can relieve you of the burden of freedom. You can weigh up moral or practical considerations as carefully as you like, but ultimately you must take the plunge and do something, and it’s up to you what that something is.
  • Even if the situation is unbearable — perhaps you are facing execution, or sitting in a Gestapo prison, or about to fall off a cliff — you are still free to decide what to make of it in mind and deed. Starting from where you are now, you choose. And in choosing, you also choose who you will be.
  • The war had made people realise that they and their fellow humans were capable of departing entirely from civilised norms; no wonder the idea of a fixed human nature seemed questionable.
  • If this sounds difficult and unnerving, it’s because it is. Sartre does not deny that the need to keep making decisions brings constant anxiety. He heightens this anxiety by pointing out that what you do really matters. You should make your choices as though you were choosing on behalf of the whole of humanity, taking the entire burden of responsibility for how the human race behaves. If you avoid this responsibility by fooling yourself that you are the victim of circumstance or of someone else’s bad advice, you are failing to meet the demands of human life and choosing a fake existence, cut off from your own ‘authenticity’.
  • Along with the terrifying side of this comes a great promise: Sartre’s existentialism implies that it is possible to be authentic and free, as long as you keep up the effort.
  • almost all agreed that it was, as an article in Les nouvelles littéraires phrased it, a ‘sickening mixture of philosophic pretentiousness, equivocal dreams, physiological technicalities, morbid tastes and hesitant eroticism … an introspective embryo that one would take distinct pleasure in crushing’.
  • he offered a philosophy designed for a species that had just scared the hell out of itself, but that finally felt ready to grow up and take responsibility.
  • In this rebellious world, just as with the Parisian bohemians and Dadaists in earlier generations, everything that was dangerous and provocative was good, and everything that was nice or bourgeois was bad.
  • Such interweaving of ideas and life had a long pedigree, although the existentialists gave it a new twist. Stoic and Epicurean thinkers in the classical world had practised philosophy as a means of living well, rather than of seeking knowledge or wisdom for their own sake. By reflecting on life’s vagaries in philosophical ways, they believed they could become more resilient, more able to rise above circumstances, and better equipped to manage grief, fear, anger, disappointment or anxiety.
  • In the tradition they passed on, philosophy is neither a pure intellectual pursuit nor a collection of cheap self-help tricks, but a discipline for flourishing and living a fully human, responsible life.
  • For Kierkegaard, Descartes had things back to front. In his own view, human existence comes first: it is the starting point for everything we do, not the result of a logical deduction. My existence is active: I live it and choose it, and this precedes any statement I can make about myself.
  • Studying our own moral genealogy cannot help us to escape or transcend ourselves. But it can enable us to see our illusions more clearly and lead a more vital, assertive existence.
  • What was needed, he felt, was not high moral or theological ideals, but a deeply critical form of cultural history or ‘genealogy’ that would uncover the reasons why we humans are as we are, and how we came to be that way. For him, all philosophy could even be redefined as a form of psychology, or history.
  • For those oppressed on grounds of race or class, or for those fighting against colonialism, existentialism offered a change of perspective — literally, as Sartre proposed that all situations be judged according to how they appeared in the eyes of those most oppressed, or those whose suffering was greatest.
  • She observed that we need not expect moral philosophers to ‘live by’ their ideas in a simplistic way, as if they were following a set of rules. But we can expect them to show how their ideas are lived in. We should be able to look in through the windows of a philosophy, as it were, and see how people occupy it, how they move about and how they conduct themselves.
  • the existentialists inhabited their historical and personal world, as they inhabited their ideas. This notion of ‘inhabited philosophy’ is one I’ve borrowed from the English philosopher and novelist Iris Murdoch, who wrote the first full-length book on Sartre and was an early adopter of existentialism
  • What is existentialism anyway?
  • An existentialist who is also phenomenological provides no easy rules for dealing with this condition, but instead concentrates on describing lived experience as it presents itself. — By describing experience well, he or she hopes to understand this existence and awaken us to ways of living more authentic lives.
  • Existentialists concern themselves with individual, concrete human existence. — They consider human existence different from the kind of being other things have. Other entities are what they are, but as a human I am whatever I choose to make of myself at every moment. I am free — — and therefore I’m responsible for everything I do, a dizzying fact which causes — an anxiety inseparable from human existence itself.
  • On the other hand, I am only free within situations, which can include factors in my own biology and psychology as well as physical, historical and social variables of the world into which I have been thrown. — Despite the limitations, I always want more: I am passionately involved in personal projects of all kinds. — Human existence is thus ambiguous: at once boxed in by borders and yet transcendent and exhilarating. —
  • The first part of this is straightforward: a phenomenologist’s job is to describe. This is the activity that Husserl kept reminding his students to do. It meant stripping away distractions, habits, clichés of thought, presumptions and received ideas, in order to return our attention to what he called the ‘things themselves’. We must fix our beady gaze on them and capture them exactly as they appear, rather than as we think they are supposed to be.
  • Husserl therefore says that, to phenomenologically describe a cup of coffee, I should set aside both the abstract suppositions and any intrusive emotional associations. Then I can concentrate on the dark, fragrant, rich phenomenon in front of me now. This ‘setting aside’ or ‘bracketing out’ of speculative add-ons Husserl called epoché — a term borrowed from the ancient Sceptics,
  • The point about rigour is crucial; it brings us back to the first half of the command to describe phenomena. A phenomenologist cannot get away with listening to a piece of music and saying, ‘How lovely!’ He or she must ask: is it plaintive? is it dignified? is it colossal and sublime? The point is to keep coming back to the ‘things themselves’ — phenomena stripped of their conceptual baggage — so as to bail out weak or extraneous material and get to the heart of the experience.
  • Husserlian ‘bracketing out’ or epoché allows the phenomenologist to temporarily ignore the question ‘But is it real?’, in order to ask how a person experiences his or her world. Phenomenology gives a formal mode of access to human experience. It lets philosophers talk about life more or less as non-philosophers do, while still being able to tell themselves they are being methodical and rigorous.
  • Besides claiming to transform the way we think about reality, phenomenologists promised to change how we think about ourselves. They believed that we should not try to find out what the human mind is, as if it were some kind of substance. Instead, we should consider what it does, and how it grasps its experiences.
  • For Brentano, this reaching towards objects is what our minds do all the time. Our thoughts are invariably of or about something, he wrote: in love, something is loved, in hatred, something is hated, in judgement, something is affirmed or denied. Even when I imagine an object that isn’t there, my mental structure is still one of ‘about-ness’ or ‘of-ness’.
  • Except in deepest sleep, my mind is always engaged in this aboutness: it has ‘intentionality’. Having taken the germ of this from Brentano, Husserl made it central to his whole philosophy.
  • Husserl saw in the idea of intentionality a way to sidestep two great unsolved puzzles of philosophical history: the question of what objects ‘really’ are, and the question of what the mind ‘really’ is. By doing the epoché and bracketing out all consideration of reality from both topics, one is freed to concentrate on the relationship in the middle. One can apply one’s descriptive energies to the endless dance of intentionality that takes place in our lives: the whirl of our minds as they seize their intended phenomena one after the other and whisk them around the floor,
  • Understood in this way, the mind hardly is anything at all: it is its aboutness. This makes the human mind (and possibly some animal minds) different from any other naturally occurring entity. Nothing else can be as thoroughly about or of things as the mind is:
  • Some Eastern meditation techniques aim to still this scurrying creature, but the extreme difficulty of this shows how unnatural it is to be mentally inert. Left to itself, the mind reaches out in all directions as long as it is awake — and even carries on doing it in the dreaming phase of its sleep.
  • a mind that is experiencing nothing, imagining nothing, or speculating about nothing can hardly be said to be a mind at all.
  • Three simple ideas — description, phenomenon, intentionality — provided enough inspiration to keep roomfuls of Husserlian assistants busy in Freiburg for decades. With all of human existence awaiting their attention, how could they ever run out of things to do?
  • For Sartre, this gives the mind an immense freedom. If we are nothing but what we think about, then no predefined ‘inner nature’ can hold us back. We are protean.
  • way of this interpretation. Real, not real; inside, outside; what difference did it make? Reflecting on this, Husserl began turning his phenomenology into a branch of ‘idealism’ — the philosophical tradition which denied external reality and defined everything as a kind of private hallucination.
  • For Sartre, if we try to shut ourselves up inside our own minds, ‘in a nice warm room with the shutters closed’, we cease to exist. We have no cosy home: being out on the dusty road is the very definition of what we are.
  • One might think that, if Heidegger had anything worth saying, he could have communicated it in ordinary language. The fact is that he does not want to be ordinary, and he may not even want to communicate in the usual sense. He wants to make the familiar obscure, and to vex us. George Steiner thought that Heidegger’s purpose was less to be understood than to be experienced through a ‘felt strangeness’.
  • He takes Dasein in its most ordinary moments, then talks about it in the most innovative way he can. For Heidegger, Dasein’s everyday Being is right here: it is Being-in-the-world, or In-der-Welt-sein. The main feature of Dasein’s everyday Being-in-the-world right here is that it is usually busy doing something.
  • Thus, for Heidegger, all Being-in-the-world is also a ‘Being-with’ or Mitsein. We cohabit with others in a ‘with-world’, or Mitwelt. The old philosophical problem of how we prove the existence of other minds has now vanished. Dasein swims in the with-world long before it wonders about other minds.
  • Sometimes the best-educated people were those least inclined to take the Nazis seriously, dismissing them as too absurd to last. Karl Jaspers was one of those who made this mistake, as he later recalled, and Beauvoir observed similar dismissive attitudes among the French students in Berlin.
  • In any case, most of those who disagreed with Hitler’s ideology soon learned to keep their view to themselves. If a Nazi parade passed on the street, they would either slip out of view or give the obligatory salute like everyone else, telling themselves that the gesture meant nothing if they did not believe in it. As the psychologist Bruno Bettelheim later wrote of this period, few people will risk their life for such a small thing as raising an arm — yet that is how one’s powers of resistance are eroded away, and eventually one’s responsibility and integrity go with them.
  • for Arendt, if you do not respond adequately when the times demand it, you show a lack of imagination and attention that is as dangerous as deliberately committing an abuse. It amounts to disobeying the one command she had absorbed from Heidegger in those Marburg days: Think!
  • ‘Everything takes place under a kind of anaesthesia. Objectively dreadful events produce a thin, puny emotional response. Murders are committed like schoolboy pranks. Humiliation and moral decay are accepted like minor incidents.’ Haffner thought modernity itself was partly to blame: people had become yoked to their habits and to mass media, forgetting to stop and think, or to disrupt their routines long enough to question what was going on.
  • Heidegger’s former lover and student Hannah Arendt would argue, in her 1951 study The Origins of Totalitarianism, that totalitarian movements thrived at least partly because of this fragmentation in modern lives, which made people more vulnerable to being swept away by demagogues. Elsewhere, she coined the phrase ‘the banality of evil’ to describe the most extreme failures of personal moral awareness.
  • His communicative ideal fed into a whole theory of history: he traced all civilisation to an ‘Axial Period’ in the fifth century BC, during which philosophy and culture exploded simultaneously in Europe, the Middle East and Asia, as though a great bubble of minds had erupted from the earth’s surface. ‘True philosophy needs communion to come into existence,’ he wrote, and added, ‘Uncommunicativeness in a philosopher is virtually a criterion of the untruth of his thinking.’
  • The idea of being called to authenticity became a major theme in later existentialism, the call being interpreted as saying something like ‘Be yourself!’, as opposed to being phony. For Heidegger, the call is more fundamental than that. It is a call to take up a self that you didn’t know you had: to wake up to your Being. Moreover, it is a call to action. It requires you to do something: to take a decision of some sort.
  • Being and Time contained at least one big idea that should have been of use in resisting totalitarianism. Dasein, Heidegger wrote there, tends to fall under the sway of something called das Man or ‘the they’ — an impersonal entity that robs us of the freedom to think for ourselves. To live authentically requires resisting or outwitting this influence, but this is not easy because das Man is so nebulous. Man in German does not mean ‘man’ as in English (that’s der Mann), but a neutral abstraction, something like ‘one’ in the English phrase ‘one doesn’t do that’,
  • for Heidegger, das Man is me. It is everywhere and nowhere; it is nothing definite, but each of us is it. As with Being, it is so ubiquitous that it is difficult to see. If I am not careful, however, das Man takes over the important decisions that should be my own. It drains away my responsibility or ‘answerability’. As Arendt might put it, we slip into banality, failing to think.
  • Jaspers focused on what he called Grenzsituationen — border situations, or limit situations. These are the moments when one finds oneself constrained or boxed in by what is happening, but at the same time pushed by these events towards the limits or outer edge of normal experience. For example, you might have to make a life-or-death choice, or something might remind you suddenly of your mortality,
  • Jaspers’ interest in border situations probably had much to do with his own early confrontation with mortality. From childhood, he had suffered from a heart condition so severe that he always expected to die at any moment. He also had emphysema, which forced him to speak slowly, taking long pauses to catch his breath. Both illnesses meant that he had to budget his energies with care in order to get his work done without endangering his life.
  • If I am to resist das Man, I must become answerable to the call of my ‘voice of conscience’. This call does not come from God, as a traditional Christian definition of the voice of conscience might suppose. It comes from a truly existentialist source: my own authentic self. Alas, this voice is one I do not recognise and may not hear, because it is not the voice of my habitual ‘they-self’. It is an alien or uncanny version of my usual voice. I am familiar with my they-self, but not with my unalienated voice — so, in a weird twist, my real voice is the one that sounds strangest to me.
  • Marcel developed a strongly theological branch of existentialism. His faith distanced him from both Sartre and Heidegger, but he shared a sense of how history makes demands on individuals. In his essay ‘On the Ontological Mystery’, written in 1932 and published in the fateful year of 1933, Marcel wrote of the human tendency to become stuck in habits, received ideas, and a narrow-minded attachment to possessions and familiar scenes. Instead, he urged his readers to develop a capacity for remaining ‘available’ to situations as they arise. Similar ideas of disponibilité or availability had been explored by other writers,
  • Marcel made it his central existential imperative. He was aware of how rare and difficult it was. Most people fall into what he calls ‘crispation’: a tensed, encrusted shape in life — ‘as though each one of us secreted a kind of shell which gradually hardened and imprisoned him’.
  • Bettelheim later observed that, under Nazism, only a few people realised at once that life could not continue unaltered: these were the ones who got away quickly. Bettelheim himself was not among them. Caught in Austria when Hitler annexed it, he was sent first to Dachau and then to Buchenwald, but was then released in a mass amnesty to celebrate Hitler’s birthday in 1939 — an extraordinary reprieve, after which he left at once for America.
  • we are used to reading philosophy as offering a universal message for all times and places — or at least as aiming to do so. But Heidegger disliked the notion of universal truths or universal humanity, which he considered a fantasy. For him, Dasein is not defined by shared faculties of reason and understanding, as the Enlightenment philosophers thought. Still less is it defined by any kind of transcendent eternal soul, as in religious tradition. We do not exist on a higher, eternal plane at all. Dasein’s Being is local: it has a historical situation, and is constituted in time and place.
  • For Marcel, learning to stay open to reality in this way is the philosopher’s prime job. Everyone can do it, but the philosopher is the one who is called on above all to stay awake, so as to be the first to sound the alarm if something seems wrong.
  • Second, it also means understanding that we are historical beings, and grasping the demands our particular historical situation is making on us. In what Heidegger calls ‘anticipatory resoluteness’, Dasein discovers ‘that its uttermost possibility lies in giving itself up’. At that moment, through Being-towards-death and resoluteness in facing up to one’s time, one is freed from the they-self and attains one’s true, authentic self.
  • If we are temporal beings by our very nature, then authentic existence means accepting, first, that we are finite and mortal. We will die: this all-important realisation is what Heidegger calls authentic ‘Being-towards-Death’, and it is fundamental to his philosophy.
  • Hannah Arendt, instead, left early on: she had the benefit of a powerful warning. Just after the Nazi takeover, in spring 1933, she had been arrested while researching materials on anti-Semitism for the German Zionist Organisation at Berlin’s Prussian State Library. Her apartment was searched; both she and her mother were locked up briefly, then released. They fled, without stopping to arrange travel documents. They crossed to Czechoslovakia (then still safe) by a method that sounds almost too fabulous to be true: a sympathetic German family on the border had a house with its front door in Germany and its back door in Czechoslovakia. The family would invite people for dinner, then let them leave through the back door at night.
  • As Sartre argued in his 1943 review of The Stranger, basic phenomenological principles show that experience comes to us already charged with significance. A piano sonata is a melancholy evocation of longing. If I watch a soccer match, I see it as a soccer match, not as a meaningless scene in which a number of people run around taking turns to apply their lower limbs to a spherical object. If the latter is what I’m seeing, then I am not watching some more essential, truer version of soccer; I am failing to watch it properly as soccer at all.
  • Much as they liked Camus personally, neither Sartre nor Beauvoir accepted his vision of absurdity. For them, life is not absurd, even when viewed on a cosmic scale, and nothing can be gained by saying it is. Life for them is full of real meaning, although that meaning emerges differently for each of us.
  • For Sartre, we show bad faith whenever we portray ourselves as passive creations of our race, class, job, history, nation, family, heredity, childhood influences, events, or even hidden drives in our subconscious which we claim are out of our control. It is not that such factors are unimportant: class and race, in particular, he acknowledged as powerful forces in people’s lives, and Simone de Beauvoir would soon add gender to that list.
  • Sartre takes his argument to an extreme point by asserting that even war, imprisonment or the prospect of imminent death cannot take away my existential freedom. They form part of my ‘situation’, and this may be an extreme and intolerable situation, but it still provides only a context for whatever I choose to do next. If I am about to die, I can decide how to face that death. Sartre here resurrects the ancient Stoic idea that I may not choose what happens to me, but I can choose what to make of it, spiritually speaking.
  • But the Stoics cultivated indifference in the face of terrible events, whereas Sartre thought we should remain passionately, even furiously engaged with what happens to us and with what we can achieve. We should not expect freedom to be anything less than fiendishly difficult.
  • Freedom does not mean entirely unconstrained movement, and it certainly does not mean acting randomly. We often mistake the very things that enable us to be free — context, meaning, facticity, situation, a general direction in our lives — for things that define us and take away our freedom. It is only with all of these that we can be free in a real sense.
  • Nor did he mean that privileged groups have the right to pontificate to the poor and downtrodden about the need to ‘take responsibility’ for themselves. That would be a grotesque misreading of Sartre’s point, since his sympathy in any encounter always lay with the more oppressed side. But for each of us — for me — to be in good faith means not making excuses for myself.
  • Camus’ novel gives us a deliberately understated vision of heroism and decisive action compared to those of Sartre and Beauvoir. One can only do so much. It can look like defeatism, but it shows a more realistic perception of what it takes to actually accomplish difficult tasks like liberating one’s country.
  • Camus just kept returning to his core principle: no torture, no killing — at least not with state approval. Beauvoir and Sartre believed they were taking a more subtle and more realistic view. If asked why a couple of innocuous philosophers had suddenly become so harsh, they would have said it was because the war had changed them in profound ways. It had shown them that one’s duties to humanity could be more complicated than they seemed. ‘The war really divided my life in two,’ Sartre said later.
  • Poets and artists ‘let things be’, but they also let things come out and show themselves. They help to ease things into ‘unconcealment’ (Unverborgenheit), which is Heidegger’s rendition of the Greek term alētheia, usually translated as ‘truth’. This is a deeper kind of truth than the mere correspondence of a statement to reality, as when we say ‘The cat is on the mat’ and point to a mat with a cat on it. Long before we can do this, both cat and mat must ‘stand forth out of concealedness’. They must un-hide themselves.
  • Heidegger does not use the word ‘consciousness’ here because — as with his earlier work — he is trying to make us think in a radically different way about ourselves. We are not to think of the mind as an empty cavern, or as a container filled with representations of things. We are not even supposed to think of it as firing off arrows of intentional ‘aboutness’, as in the earlier phenomenology of Brentano. Instead, Heidegger draws us into the depths of his Schwarzwald, and asks us to imagine a gap with sunlight filtering in. We remain in the forest, but we provide a relatively open spot where other beings can bask for a moment. If we did not do this, everything would remain in the thickets, hidden even to itself.
  • The astronomer Carl Sagan began his 1980 television series Cosmos by saying that human beings, though made of the same stuff as the stars, are conscious and are therefore ‘a way for the cosmos to know itself’. Merleau-Ponty similarly quoted his favourite painter Cézanne as saying, ‘The landscape thinks itself in me, and I am its consciousness.’ This is something like what Heidegger thinks humanity contributes to the earth. We are not made of spiritual nothingness; we are part of Being, but we also bring something unique with us. It is not much: a little open space, perhaps with a path and a bench like the one the young Heidegger used to sit on to do his homework. But through us, the miracle occurs.
  • Beauty aside, Heidegger’s late writing can also be troubling, with its increasingly mystical notion of what it is to be human. If one speaks of a human being mainly as an open space or a clearing, or a means of ‘letting beings be’ and dwelling poetically on the earth, then one doesn’t seem to be talking about any recognisable person. The old Dasein has become less human than ever. It is now a forestry feature.
  • Even today, Jaspers, the dedicated communicator, is far less widely read than Heidegger, who has influenced architects, social theorists, critics, psychologists, artists, film-makers, environmental activists, and innumerable students and enthusiasts — including the later deconstructionist and post-structuralist schools, which took their starting point from his late thinking. Having spent the late 1940s as an outsider and then been rehabilitated, Heidegger became the overwhelming presence in university philosophy all over the European continent from then on.
  • As Levinas reflected on this experience, it helped to lead him to a philosophy that was essentially ethical, rather than ontological like Heidegger’s. He developed his ideas from the work of Jewish theologian Martin Buber, whose I and Thou in 1923 had distinguished between my relationship with an impersonal ‘it’ or ‘them’, and the direct personal encounter I have with a ‘you’. Levinas took it further: when I encounter you, we normally meet face-to-face, and it is through your face that you, as another person, can make ethical demands on me. This is very different from Heidegger’s Mitsein or Being-with, which suggests a group of people standing alongside one another, shoulder to shoulder as if in solidarity — perhaps as a unified nation or Volk.
  • For Levinas, we literally face each other, one individual at a time, and that relationship becomes one of communication and moral expectation. We do not merge; we respond to one another. Instead of being co-opted into playing some role in my personal drama of authenticity, you look me in the eyes — and you remain Other. You remain you.
  • This relationship is more fundamental than the self, more fundamental than consciousness, more fundamental even than Being — and it brings an unavoidable ethical obligation. Ever since Husserl, phenomenologists and existentialists had been trying to stretch the definition of existence to incorporate our social lives and relationships. Levinas did more: he turned philosophy around entirely so that these relationships were the foundation of our existence, not an extension of it.
  • Her last work, The Need for Roots, argues, among other things, that none of us has rights, but each one of us has a near-infinite degree of duty and obligation to the other. Whatever the underlying cause of her death — and anorexia nervosa seems to have been involved — no one could deny that she lived out her philosophy with total commitment. Of all the lives touched on in this book, hers is surely the most profound and challenging application of Iris Murdoch’s notion that a philosophy can be ‘inhabited’.
  • Other thinkers took radical ethical turns during the war years. The most extreme was Simone Weil, who actually tried to live by the principle of putting other people’s ethical demands first. Having returned to France after her travels through Germany in 1932, she had worked in a factory so as to experience the degrading nature of such work for herself. When France fell in 1940, her family fled to Marseilles (against her protests), and later to the US and to Britain. Even in exile, Weil made extraordinary sacrifices. If there were people in the world who could not sleep in a bed, she would not do so either, so she slept on the floor.
  • The mystery tradition had roots in Kierkegaard’s ‘leap of faith’. It owed much to the other great nineteenth-century mystic of the impossible, Dostoevsky, and to older theological notions. But it also grew from the protracted trauma that was the first half of the twentieth century. Since 1914, and especially since 1939, people in Europe and elsewhere had come to the realisation that we cannot fully know or trust ourselves; that we have no excuses or explanations for what we do — and yet that we must ground our existence and relationships on something firm, because otherwise we cannot survive.
  • One striking link between these radical ethical thinkers, all on the fringes of our main story, is that they had religious faith. They also granted a special role to the notion of ‘mystery’ — that which cannot be known, calculated or understood, especially when it concerns our relationships with each other. Heidegger was different from them, since he rejected the religion he grew up with and had no real interest in ethics — probably as a consequence of his having no real interest in the human.
  • Meanwhile, the Christian existentialist Gabriel Marcel was also still arguing, as he had since the 1930s, that ethics trumps everything else in philosophy and that our duty to each other is so great as to play the role of a transcendent ‘mystery’. He too had been led to this position partly by a wartime experience: during the First World War he had worked for the Red Cross’ Information Service, with the unenviable job of answering relatives’ inquiries about missing soldiers. Whenever news came, he passed it on, and usually it was not good. As Marcel later said, this task permanently inoculated him against warmongering rhetoric of any kind, and it made him aware of the power of what is unknown in our lives.
  • As the play’s much-quoted and frequently misunderstood final line has it: ‘Hell is other people.’ Sartre later explained that he did not mean to say that other people were hellish in general. He meant that after death we become frozen in their view, unable any longer to fend off their interpretation. In life, we can still do something to manage the impression we make; in death, this freedom goes and we are left entombed in other people’s memories and perceptions.
  • We have to do two near-impossible things at once: understand ourselves as limited by circumstances, and yet continue to pursue our projects as though we are truly in control. In Beauvoir’s view, existentialism is the philosophy that best enables us to do this, because it concerns itself so deeply with both freedom and contingency. It acknowledges the radical and terrifying scope of our freedom in life, but also the concrete influences that other philosophies tend to ignore: history, the body, social relationships and the environment.
  • The aspects of our existence that limit us, Merleau-Ponty says, are the very same ones that bind us to the world and give us scope for action and perception. They make us what we are. Sartre acknowledged the need for this trade-off, but he found it more painful to accept. Everything in him longed to be free of bonds, of impediments and limitations.
  • Of course we have to learn this skill of interpreting and anticipating the world, and this happens in early childhood, which is why Merleau-Ponty thought child psychology was essential to philosophy. This is an extraordinary insight. Apart from Rousseau, very few philosophers before him had taken childhood seriously; most wrote as though all human experience were that of a fully conscious, rational, verbal adult who has been dropped into this world from the sky — perhaps by a stork.
  • For Merleau-Ponty, we cannot understand our experience if we don’t think of ourselves in part as overgrown babies. We fall for optical illusions because we once learned to see the world in terms of shapes, objects and things relevant to our own interests. Our first perceptions came to us in tandem with our first active experiments in observing the world and reaching out to explore it, and are still linked with those experiences.
  • Another factor in all of this, for Merleau-Ponty, is our social existence: we cannot thrive without others, or not for long, and we need this especially in early life. This makes solipsistic speculation about the reality of others ridiculous; we could never engage in such speculation if we hadn’t already been formed by them.
  • As Descartes could have said (but didn’t), ‘I think, therefore other people exist.’ We grow up with people playing with us, pointing things out, talking, listening, and getting us used to reading emotions and movements; this is how we become capable, reflective, smoothly integrated beings.
  • In general, Merleau-Ponty thinks human experience only makes sense if we abandon philosophy’s time-honoured habit of starting with a solitary, capsule-like, immobile adult self, isolated from its body and world, which must then be connected up again — adding each element around it as though adding clothing to a doll. Instead, for him, we slide from the womb to the birth canal to an equally close and total immersion in the world. That immersion continues as long as we live, although we may also cultivate the art of partially withdrawing from time to time when we want to think or daydream.
  • When he looks for his own metaphor to describe how he sees consciousness, he comes up with a beautiful one: consciousness, he suggests, is like a ‘fold’ in the world, as though someone had crumpled a piece of cloth to make a little nest or hollow. It stays for a while, before eventually being unfolded and smoothed away. There is something seductive, even erotic, in this idea of my conscious self as an improvised pouch in the cloth of the world. I still have my privacy — my withdrawing room. But I am part of the world’s fabric, and I remain formed out of it for as long as I am here.
  • By the time of these works, Merleau-Ponty is taking his desire to describe experience to the outer limits of what language can convey. Just as with the late Husserl or Heidegger, or Sartre in his Flaubert book, we see a philosopher venturing so far from shore that we can barely follow. Emmanuel Levinas would head out to the fringes too, eventually becoming incomprehensible to all but his most patient initiates.
  • Sartre once remarked — speaking of a disagreement they had about Husserl in 1941 — that ‘we discovered, astounded, that our conflicts had, at times, stemmed from our childhood, or went back to the elementary differences of our two organisms’. Merleau-Ponty also said in an interview that Sartre’s work seemed strange to him, not because of philosophical differences, but because of a certain ‘register of feeling’, especially in Nausea, that he could not share. Their difference was one of temperament and of the whole way the world presented itself to them.
  • The two also differed in their purpose. When Sartre writes about the body or other aspects of experience, he generally does it in order to make a different point. He expertly evokes the grace of his café waiter, gliding between the tables, bending at an angle just so, steering the drink-laden tray through the air on the tips of his fingers — but he does it all in order to illustrate his ideas about bad faith. When Merleau-Ponty writes about skilled and graceful movement, the movement itself is his point. This is the thing he wants to understand.
  • We can never move definitively from ignorance to certainty, for the thread of the inquiry will constantly lead us back to ignorance again. This is the most attractive description of philosophy I’ve ever read, and the best argument for why it is worth doing, even (or especially) when it takes us no distance at all from our starting point.
  • By prioritising perception, the body, social life and childhood development, Merleau-Ponty gathered up philosophy’s far-flung outsider subjects and brought them in to occupy the centre of his thought.
  • In his inaugural lecture at the Collège de France on 15 January 1953, published as In Praise of Philosophy, he said that philosophers should concern themselves above all with whatever is ambiguous in our experience. At the same time, they should think clearly about these ambiguities, using reason and science. Thus, he said, ‘The philosopher is marked by the distinguishing trait that he possesses inseparably the taste for evidence and the feeling for ambiguity.’ A constant movement is required between these two
  • As Sartre wrote in response to Hiroshima, humanity had now gained the power to wipe itself out, and must decide every single day that it wanted to live. Camus also wrote that humanity faced the task of choosing between collective suicide and a more intelligent use of its technology — ‘between hell and reason’. After 1945, there seemed little reason to trust in humanity’s ability to choose well.
  • Merleau-Ponty observed in a lecture of 1951 that, more than any previous century, the twentieth century had reminded people how ‘contingent’ their lives were — how at the mercy of historical events and other changes that they could not control. This feeling went on long after the war ended. After the A-bombs were dropped on Hiroshima and Nagasaki, many feared that a Third World War would not be long in coming, this time between the Soviet Union and the United States.
caelengrubb

Free Market - Econlib - 0 views

  • Free market” is a summary term for an array of exchanges that take place in society.
  • Each exchange is undertaken as a voluntary agreement between two people or between groups of people represented by agents. These two individuals (or agents) exchange two economic goods, either tangible commodities or nontangible services
  • Both parties undertake the exchange because each expects to gain from it. Also, each will repeat the exchange next time (or refuse to) because his expectation has proved correct (or incorrect) in the recent past.
  • ...25 more annotations...
  • Trade, or exchange, is engaged in precisely because both parties benefit; if they did not expect to gain, they would not agree to the exchange.
  • This simple reasoning refutes the argument against free trade typical of the “mercantilist” period of sixteenth- to eighteenth-century Europe and classically expounded by the famed sixteenth-century French essayist Montaigne.
  • The mercantilists argued that in any trade, one party can benefit only at the expense of the other—that in every transaction there is a winner and a loser, an “exploiter” and an “exploited.”
  • We can immediately see the fallacy in this still-popular viewpoint: the willingness and even eagerness to trade means that both parties benefit. In modern game-theory jargon, trade is a win-win situation, a “positive-sum” rather than a “zero-sum” or “negative-sum” game.
  • Each one values the two goods or services differently, and these differences set the scene for an exchange.
  • Two factors determine the terms of any agreement: how much each participant values each good in question, and each participant’s bargaining skills.
  • the market in relation to how favorably buyers evaluate these goods—in shorthand, by the interaction of their supply with the demand for them.
  • On the other hand, given the buyers’ evaluation, or demand, for a good, if the supply increases, each unit of supply—each baseball card or loaf of bread—will fall in value, and therefore the price of the good will fall. The reverse occurs if the supply of the good decreases.
  • The market, then, is not simply an array; it is a highly complex, interacting latticework of exchanges.
  • Production begins with natural resources, and then various forms of machines and capital goods, until finally, goods are sold to the consumer.
  • At each stage of production from natural resource to consumer good, money is voluntarily exchanged for capital goods, labor services, and land resources. At each step of the way, terms of exchanges, or prices, are determined by the voluntary interactions of suppliers and demanders. This market is “free” because choices, at each step, are made freely and voluntarily.
  • The free market and the free price system make goods from around the world available to consumers.
  • Saving and investment can then develop capital goods and increase the productivity and wages of workers, thereby increasing their standard of living.
  • The free competitive market also rewards and stimulates technological innovation that allows the innovator to get a head start in satisfying consumer wants in new and creative ways.
  • Government, in every society, is the only lawful system of coercion. Taxation is a coerced exchange, and the heavier the burden of taxation on production, the more likely it is that economic growth will falter and decline
  • The ultimate in government coercion is socialism.
  • Under socialist central planning the socialist planning board lacks a price system for land or capital goods.
  • Market socialism is, in fact, a contradiction in terms.
  • The fashionable discussion of market socialism often overlooks one crucial aspect of the market: When two goods are exchanged, what is really exchanged is the property titles in those goods.
  • This means that the key to the existence and flourishing of the free market is a society in which the rights and titles of private property are respected, defended, and kept secure.
  • The key to socialism, on the other hand, is government ownership of the means of production, land, and capital goods.
  • Under socialism, therefore, there can be no market in land or capital goods worthy of the name.
  • Some critics of the free market argue that property rights are in conflict with “human” rights. But the critics fail to realize that in a free-market system, every person has a property right over his own person and his own labor and can make free contracts for those services.
  • A common charge against the free-market society is that it institutes “the law of the jungle,” of “dog eat dog,” that it spurns human cooperation for competition and exalts material success as opposed to spiritual values, philosophy, or leisure activities.
  • It is the coercive countries with little or no market activity—the notable examples in the last half of the twentieth century were the communist countries—where the grind of daily existence not only impoverishes people materially but also deadens their spirit.
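The annotations above make two mechanical claims: voluntary trade is positive-sum because the parties value the goods differently, and, with demand fixed, a larger supply pushes the price of each unit down. A minimal sketch of both, in Python — the valuations and quantities are illustrative assumptions, not figures from the article:

```python
# (1) Trade is positive-sum: it happens only when a mutually beneficial
#     price exists, and at any such price both parties gain.
# (2) Given fixed demand, more supply -> a lower market-clearing price.

def trade_surplus(value_seller, value_buyer):
    """Gains from one exchange at the midpoint price.

    value_seller: what the good is worth to its current owner
    value_buyer: what it is worth to the prospective buyer
    Returns (seller_gain, buyer_gain), or None if no price makes both
    better off -- in which case the voluntary trade simply never occurs.
    """
    if value_buyer <= value_seller:
        return None                       # no mutually beneficial price
    price = (value_seller + value_buyer) / 2  # any price in between works
    return price - value_seller, value_buyer - price

def clearing_price(buyer_valuations, supply):
    """Highest price at which all `supply` units find willing buyers."""
    bids = sorted(buyer_valuations, reverse=True)
    return bids[min(supply, len(bids)) - 1]

# (1) Seller values a baseball card at 3, buyer at 9: both gain 3.
print(trade_surplus(3, 9))                # -> (3.0, 3.0)

# (2) Same five buyers of bread; doubling the loaves halves the price.
buyers = [10, 8, 6, 4, 2]                 # each buyer's valuation of a loaf
print(clearing_price(buyers, supply=2))   # -> 8
print(clearing_price(buyers, supply=4))   # -> 4
```

This is the excerpt's point in miniature: no "exploiter" and "exploited" is needed to explain the exchange, and the fall in per-unit value as supply grows is just the demand schedule being walked down.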
Javier E

The Dictionary Is Telling People How to Speak Again - The Atlantic - 1 views

  • print dictionaries have embodied certain ideas about democracy and capitalism that seem especially American—specifically, the notion that “good” English can be packaged and sold, becoming accessible to anyone willing to work hard enough to learn it.
  • Massive social changes in the 1960s accompanied the appearance of Webster’s Third, and a new era arose for dictionaries: one in which describing how people use language became more important than showing them how to do so properly. But that era might finally be coming to an end, thanks to the internet, the decline of print dictionaries, and the political consequences of an anything-goes approach to language.
  • The standard way of describing these two approaches in lexicography is to call them “descriptivist” and “prescriptivist.” Descriptivist lexicographers, steeped in linguistic theory, eschew value judgements about so-called correct English and instead describe how people are using the language. Prescriptivists, by contrast, inform readers which usage is “right” and which is “wrong.”
  • ...11 more annotations...
  • Many American readers, though, didn’t want a non-hierarchical assessment of their language. They wanted to know which usages were “correct,” because being able to rely on a dictionary to tell you how to sound educated and upper class made becoming upper class seem as if it might be possible. That’s why the public responded badly to Webster’s latest: They craved guidance and rules.
  • Webster’s Third so unnerved critics and customers because the American idea of social mobility is limited, provisional, and full of paradoxes
  • There’s no such thing as social mobility if everyone can enjoy it. To be allowed to move around within a hierarchy implies that the hierarchy must be left largely intact. But in America, people have generally accepted the idea of inherited upper-class status, while seeing upward social mobility as something that must be earned.
  • In a 2001 Harper’s essay about the Webster’s Third controversy, David Foster Wallace called the publication of the dictionary “the Fort Sumter of the contemporary usage wars.”
  • for decades after the publication of Webster’s Third, people still had intense opinions about dictionaries. In the 1990s, an elderly copy editor once told me, with considerable vehemence, that Merriam-Webster’s Dictionaries were “garbage.” She would only use Houghton Mifflin’s American Heritage Dictionary, which boasted a Usage Panel of experts to advise readers about the finer points of English grammar
  • what descriptivists do: They describe rather than judge. Nowadays, this approach to dictionary making is generally not contested or even really discussed.
  • In his 2009 book Going Nucular, Geoffrey Nunberg observes that we now live in a culture in which there are no clear distinctions between highbrow, middlebrow, and lowbrow culture. It stands to reason that in a society in which speaking in a recognizably “highbrow” way confers no benefits, dictionaries will likely matter less
  • If American Heritage was aggressively branding itself in the 1960s, Merriam-Webster is doing the same now.
  • The company has a feisty blog and Twitter feed that it uses to criticize linguistic and grammatical choices. President Trump and his administration are regular catalysts for social-media clarifications by Merriam-Webster. The company seems bothered when Trump and his associates change the meanings of words for their own convenience, or when they debase the language more generally.
  • it seems that the way the company has regained its relevance in the post-print era is by having strong opinions about how people should use English.
  • It may be that in spite of Webster’s Third’s noble intentions, language may just be too human a thing to be treated in an entirely detached, scientific way. Indeed, I’m not sure I want to live in a society in which citizens can’t call out government leaders when they start subverting language in distressing ways.
Javier E

'I Like to Watch,' by Emily Nussbaum book review - The Washington Post - 0 views

  • Nussbaum’s case: That television could be great, and not because it was “novelistic” or “cinematic” but because it was, simply, television, “episodic, collaborative, writer-driven, and formulaic” by design.
  • According to Nussbaum, a TV show achieved greatness not despite these facts (which assumes they are limitations) but because of them (which sees them as an infrastructure that provokes creativity and beauty — “the sort that govern sonnets”).
  • Nussbaum’s once-iconoclastic views have become mainstream.
  • ...8 more annotations...
  • It is increasingly common to find yourself apologizing not for watching too much TV but for having failed to spend 70 hours of your precious, finite life binge-watching one of the Golden Age of Television’s finest offerings.
  • Nussbaum writes of her male classmates at NYU, where she was a literature doctoral student in the late 1990s. These men worshiped literature and film; they thought TV was trash. These men “were also, not coincidentally, the ones whose opinions tended to dominate mainstream media conversation.”
  • the same forces that marginalize the already-marginalized still work to keep TV shows by and about women, people of color, and LGBTQ+ individuals on a lower tier than those about cis, straight, white men: Your Tony Sopranos, your Walter Whites, your Don Drapers, your True Detectives
  • Over and over, Nussbaum pushes back against a hierarchy that rewards dramas centered on men and hyperbolically masculine pursuits (dealing drugs, being a cop, committing murders, having sex with beautiful women) and shoves comedies and whatever scans as “female” to the side.
  • Nussbaum sticks up for soaps, rom-coms, romance novels and reality television, “the genres that get dismissed as fluff, which is how our culture regards art that makes women’s lives look like fun.”
  • Nussbaum’s writing consistently comes back to the question of “whose stories carried weight . . . what kind of creativity counted as ambitious, and who . . . deserved attention . . . Whose story counted as universal?”
  • What does it mean to think morally about the art we consume — and, by extension, financially support, and center in our emotional and imaginative lives? The art that informs, on some near-cellular level, who we want to know and love and be?
  • maybe the next frontier of cultural thought is in thinking more cohesively about what we’ve long compartmentalized — of not stashing conflicting feelings about good art by bad men in some dark corner of our minds, but in holding our discomfort and contradictions up to the light, for a clearer view.
katherineharron

Pew Research Center finds widespread agreement about the 'made-up news' malady - CNN - 0 views

  • Survey people about a range of issues, ask which issues are a "very big problem for the country," and more Americans will cite "made-up news" than terrorism, illegal immigration, racism or sexism.
  • Of course, some point the finger primarily at President Trump while others blame irresponsible news outlets. People are using different definitions of "made-up." But the study shows a widespread awareness of what's sometimes called the War on Truth.
  • 1: Pew says "Americans blame political leaders and activists far more than journalists for the creation of made-up news but put the responsibility on the news media to fix it." Only 9% say the onus is mostly on the tech companies.
    2: When people bemoan made-up news, they're not just talking about politics: 61% of respondents said there's a lot of bogus content out there about entertainment and celebrities.
    3: "52% of Americans have shared made-up news knowingly and/or unknowingly." Almost everyone says they only found out the info was bogus after sharing.
    4: Here is a hopeful sign! 78% "say they have checked the facts in news stories themselves."
  • ...2 more annotations...
  • The attorney representing 10 of the families who lost relatives in the Sandy Hook massacre told me that he welcomed YouTube's Wednesday action, but said it was "too late to undo the harm" that has been caused to his clients from conspiracy theories circulating on the platform over the past several years. "Sandy Hook happened now nearly seven years ago, and so during that entire time the clients were subject to hostile postings on YouTube that disseminated this false narrative and caused undue harassment, threats, and fallacies as they were trying to heal," said the attorney, Josh Koskoff. "At the same time, better late than never."
  • Moving forward, it will be interesting to see if other social media companies adopt guidelines similar to the ones YouTube announced on Wednesday regarding content that denies well-documented violent events like Sandy Hook. "All social media platforms who have not taken this step, should look in the mirror and decide whether they want to continue to facilitate harassment and hate in this day and age where that has serious consequences," Koskoff told me. And Pozner echoed that, saying that he hoped "Twitter and other hosting platforms will follow suit in implementing and enforcing more socially responsible policies."
Javier E

Pandemic-Era Politics Are Ruining Public Education - The Atlantic - 0 views

  • You’re also the nonvoting, perhaps unwitting, subject of adults’ latest pedagogical experiments: either relentless test prep or test abolition; quasi-religious instruction in identity-based virtue and sin; a flood of state laws to keep various books out of your hands and ideas out of your head.
  • Your parents, looking over your shoulder at your education and not liking what they see, have started showing up at school-board meetings in a mortifying state of rage. If you live in Virginia, your governor has set up a hotline where they can rat out your teachers to the government. If you live in Florida, your governor wants your parents to sue your school if it ever makes you feel “discomfort” about who you are
  • Adults keep telling you the pandemic will never end, your education is being destroyed by ideologues, digital technology is poisoning your soul, democracy is collapsing, and the planet is dying—but they’re counting on you to fix everything when you grow up.
  • ...37 more annotations...
  • It isn’t clear how the American public-school system will survive the COVID years. Teachers, whose relative pay and status have been in decline for decades, are fleeing the field. In 2021, buckling under the stresses of the pandemic, nearly 1 million people quit jobs in public education, a 40 percent increase over the previous year.
  • These kids, and the investments that come with them, may never return—the beginning of a cycle of attrition that could continue long after the pandemic ends and leave public schools even more underfunded and dilapidated than before. “It’s an open question whether the public-school system will recover,” Steiner said. “That is a real concern for democratic education.”
  • The high-profile failings of public schools during the pandemic have become a political problem for Democrats, because of their association with unions, prolonged closures, and the pedagogy of social justice, which can become a form of indoctrination.
  • The party that stands for strong government services in the name of egalitarian principles supported the closing of schools far longer than either the science or the welfare of children justified, and it has been woefully slow to acknowledge how much this damaged the life chances of some of America’s most disadvantaged students.
  • Public education is too important to be left to politicians and ideologues. Public schools still serve about 90 percent of children across red and blue America.
  • Since the common-school movement in the early 19th century, the public school has had an exalted purpose in this country. It’s our core civic institution—not just because, ideally, it brings children of all backgrounds together in a classroom, but because it prepares them for the demands and privileges of democratic citizenship. Or at least, it needs to.
  • What is school for? This is the kind of foundational question that arises when a crisis shakes the public’s faith in an essential institution. “The original thinkers about public education were concerned almost to a point of paranoia about creating self-governing citizens,”
  • “Horace Mann went to his grave having never once uttered the phrase college- and career-ready. We’ve become more accustomed to thinking about the private ends of education. We’ve completely lost the habit of thinking about education as citizen-making.”
  • School can’t just be an economic sorting system. One reason we have a stake in the education of other people’s children is that they will grow up to be citizens.
  • Public education is meant not to mirror the unexamined values of a particular family or community, but to expose children to ways that other people, some of them long dead, think.
  • If the answer were simply to push more and more kids into college, the United States would be entering its democratic prime
  • So the question isn’t just how much education, but what kind. Is it quaint, or utopian, to talk about teaching our children to be capable of governing themselves?
  • The COVID era, with Donald Trump out of office but still in power and with battles over mask mandates and critical race theory convulsing Twitter and school-board meetings, shows how badly Americans are able to think about our collective problems—let alone read, listen, empathize, debate, reconsider, and persuade in the search for solutions.
  • democratic citizenship can, at least in part, be learned.
  • The history warriors build their metaphysics of national good or evil on a foundation of ignorance. In a 2019 survey, only 40 percent of Americans were able to pass the test that all applicants for U.S. citizenship must take, which asks questions like “Who did the United States fight in World War II?” and “We elect a President for how many years?” The only state in which a majority passed was Vermont.
  • The orthodoxies currently fighting for our children’s souls turn the teaching of U.S. history into a static and morally simple quest for some American essence. They proceed from celebration or indictment toward a final judgment—innocent or guilty—and bury either oppression or progress in a subordinate clause. The most depressing thing about this gloomy pedagogy of ideologies in service to fragile psyches is how much knowledge it takes away from students who already have so little.
  • A central goal for history, social-studies, and civics instruction should be to give students something more solid than spoon-fed maxims—to help them engage with the past on its own terms, not use it as a weapon in the latest front of the culture wars.
  • Releasing them to do “research” in the vast ocean of the internet without maps and compasses, as often happens, guarantees that they will drown before they arrive anywhere.
  • The truth requires a grounding in historical facts, but facts are quickly forgotten without meaning and context
  • The goal isn’t just to teach students the origins of the Civil War, but to give them the ability to read closely, think critically, evaluate sources, corroborate accounts, and back up their claims with evidence from original documents.
  • This kind of instruction, which requires teachers to distinguish between exposure and indoctrination, isn’t easy; it asks them to be more sophisticated professionals than their shabby conditions and pay (median salary: $62,000, less than accountants and transit police) suggest we are willing to support.
  • To do that, we’ll need to help kids restore at least part of their crushed attention spans.
  • staring at a screen for hours is a heavy depressant, especially for teenagers.
  • we’ll look back on the amount of time we let our children spend online with the same horror that we now feel about earlier generations of adults who hooked their kids on smoking.
  • “It’s not a choice between tech or no tech,” Bill Tally, a researcher with the Education Development Center, told me. “The question is what tech infrastructure best enables the things we care about,” such as deep engagement with instructional materials, teachers, and other students.
  • The pandemic should have forced us to reassess what really matters in public school; instead, it’s a crisis that we’ve just about wasted.
  • Like learning to read as historians, learning to sift through the tidal flood of memes for useful, reliable information can emancipate children who have been heedlessly hooked on screens by the adults in their lives.
  • Finally, let’s give children a chance to read books—good books. It’s a strange feature of all the recent pedagogical innovations that they’ve resulted in the gradual disappearance of literature from many classrooms.
  • The best way to interest young people in literature is to have them read good literature, and not just books that focus with grim piety on the contemporary social and psychological problems of teenagers.
  • We sell them insultingly short in thinking that they won’t read unless the subject is themselves. Mirrors are ultimately isolating; young readers also need windows, even if the view is unfamiliar, even if it’s disturbing
  • connection through language to universal human experience and thought is the reward of great literature, a source of empathy and wisdom.
  • The culture wars, with their atmosphere of resentment, fear, and petty faultfinding, are hostile to the writing and reading of literature.
  • W. E. B. Du Bois wrote: “Nations reel and stagger on their way; they make hideous mistakes; they commit frightful wrongs; they do great and beautiful things. And shall we not best guide humanity by telling the truth about all this, so far as the truth is ascertainable?”
  • The classroom has become a half-abandoned battlefield, where grown-ups who claim to be protecting students from the virus, from books, from ideologies and counter-ideologies end up using children to protect themselves and their own entrenched camps.
  • American democracy can’t afford another generation of adults who don’t know how to talk and listen and think. We owe our COVID-scarred children the means to free themselves from the failures of the past and the present.
  • Students are leaving as well. Since 2020, nearly 1.5 million children have been removed from public schools to attend private or charter schools or be homeschooled.
  • “COVID has encouraged poor parents to question the quality of public education. We are seeing diminished numbers of children in our public schools, particularly our urban public schools.” In New York, more than 80,000 children have disappeared from city schools; in Los Angeles, more than 26,000; in Chicago, more than 24,000.