TOK Friends: Group items tagged “oppression”

It’s Win-Win When Trump and the Democrats Work Together

  • The “punch a Nazi” thread that became popular earlier this year among the left-liberal journalistic class opened my eyes to this, as more than a few liberal thought leaders loved it when they saw a video of Richard Spencer being clocked by a masked thug.
  • How has political violence now become acceptable on lefty Twitter and among one in five college students? I’d argue that it’s too easy to overlook the influence of the neo-Marxist ideology now pervasive on countless campuses — specifically the late philosopher Herbert Marcuse’s concepts of “violence of defense” and “violence of aggression” in the context of what he called “repressive tolerance.” For parts of the New Left, racist democratic capitalism perpetuates so much systemic oppression that any defense of it or acquiescence in it amounts to violence against the victims. Therefore violence in defense of the victims is perfectly defensible. It just levels the playing field.
  • Hence it’s okay to punch a Nazi, but not okay to punch a communist. It’s defensible for an oppressed person of color to assault a white person but never the other way round. Hence a recent discussion in The Guardian about whether cold-cocking a racist is defensible: “A punch may be uncivil, but racism is worse.”
  • Actually, speech is not just interchangeable with violence; even silence is! One of the more popular signs at the rally in Boston a few weeks back was the following: “White Silence = Violence.” If you are not actively speaking out against white supremacy, in other words, you are actively enforcing it. Once you’ve apologized for being born white, and asked permission to speak, your next and only step is to inveigh against racism/sexism, etc. … or be accused of being a white supremacist yourself. At some point your head begins to explode. What is this: a Maoist boot camp?
  • We often discuss these things in the media without understanding the core ideas that animate them. But it’s important to understand that for the social-justice left, there is nothing irrational about any of this. If you take their ideas seriously, oppressive speech is violence and self-defense is legitimate. Violence is therefore not some regrettable incident. Violence to achieve liberation is a key part of the ideology they believe in.

YOU ARE NOT A RACIST TO CRITICIZE CRITICAL RACE THEORY. - It Bears Mentioning

  • The early writings by people like Regina Austin, Richard Delgado, and Kimberlé Crenshaw are simply hard-leftist legal analysis, proposing a revised conception of justice that takes oppression into account, including a collective sense of subordinate group identity. These are hardly calls to turn schools into Maoist re-education camps fostering star chambers and struggle sessions. However, this, indeed, is what is happening to educational institutions across the country.
  • "What we are interested in here might be termed “critical pedagogy.” “Critical pedagogy” names — without exhaustively defining — the host of concepts, terms, practices, and theories that have lately taken hold in many public and private schools. This term alludes to a connection to CRT — it might be thought of as critical race theory as applied to schooling — but also to “critical studies” and “critical theory,” a broader set of contemporary philosophical ideas that have been particularly influential in certain circles of the modern Left."
  • In a dialogue premised on good faith, we can assume that when politicos and parents decry “Critical Race Theory,” what they refer to is the idea of oppression and white perfidy treated as the main meal of an entire school’s curriculum.
  • it is no tort to call it “CRT” in shorthand when: 1) these developments are descended from its teachings and 2) their architects openly bill themselves as following the tenets of CRT.
  • what most of us (as opposed to the Establishment in schools of education) think, and are correct about, is this:
  • 1. Young children should not be taught, if white, to feel guilty and, if black, to feel a) oppressed and b) wary of white kids around them (and, if South Asian, to be very, very confused …).
  • 2. Young children should not be taught that the American story is mainly (note I write mainly rather than only, but mainly is just as awful here) one of oppression and racism. Not because it’s unpleasant and because sinister characters want to “hide” it, but because it’s dumb.
  • 3. While there is room for the above ideas to be presented to children as some among many – maybe; I’m bending over backwards here – this kind of thought should certainly not be the fulcrum of a school’s entire curriculum, as has been reported at schools like Dalton and others in New York.
  • 1) Criticizing Critical Race Theory as it operates in 2021 does not require perusing the oeuvre of Kimberlé Crenshaw, and the critique is not invalidated by the differences between what articles like that contained and what’s happening in our schools now.
  • 2) Criticizing Critical Race Theory does not mean teaching students that America has been nothing but great.

The Philosopher Redefining Equality | The New Yorker

  • The bank experience showed how you could be oppressed by hierarchy, working in an environment where you were neither free nor equal. But this implied that freedom and equality were bound together in some way beyond the basic state of being unenslaved, which was an unorthodox notion. Much social thought is rooted in the idea of a conflict between the two.
  • If individuals exercise freedoms, conservatives like to say, some inequalities will naturally result. Those on the left basically agree—and thus allow constraints on personal freedom in order to reduce inequality. The philosopher Isaiah Berlin called the opposition between equality and freedom an “intrinsic, irremovable element in human life.” It is our fate as a society, he believed, to haggle toward a balance between them.
  • What if they weren’t opposed, Anderson wondered, but, like the sugar-phosphate chains in DNA, interlaced in a structure that we might not yet understand?
  • At fifty-nine, Anderson is the chair of the University of Michigan’s department of philosophy and a champion of the view that equality and freedom are mutually dependent, enmeshed in changing conditions through time.
  • She has built a case, elaborated across decades, that equality is the basis for a free society
  • Because she brings together ideas from both the left and the right to battle increasing inequality, Anderson may be the philosopher best suited to this awkward moment in American life. She builds a democratic frame for a society in which people come from different places and are predisposed to disagree.
  • she sketched out the entry-level idea that one basic way to expand equality is by expanding the range of valued fields within a society.
  • “The ability not to have an identity that one carries from sphere to sphere but, rather, to be able to slip in and adopt whatever values and norms are appropriate while retaining one’s identities in other domains?” She paused. “That is what it is to be free.”
  • How do you move from a basic model of egalitarian variety, in which everybody gets a crack at being a star at something, to figuring out how to respond to a complex one, where people, with different allotments of talent and virtue, get unequal starts, and often meet with different constraints along the way?
  • The problem, she proposed, was that contemporary egalitarian thinkers had grown fixated on distribution: moving resources from lucky-seeming people to unlucky-seeming people, as if trying to spread the luck around.
  • Egalitarians should agree about clear cases of blameless misfortune: the quadriplegic child, the cognitively impaired adult, the teen-ager born into poverty with junkie parents. But Anderson balked there, too. By categorizing people as lucky or unlucky, she argued, these egalitarians set up a moralizing hierarchy.
  • By letting the lucky class go on reaping the market’s chancy rewards while asking others to concede inferior status in order to receive a drip-drip-drip of redistributive aid, these egalitarians were actually entrenching people’s status as superior or subordinate.
  • she imagined some citizens getting a state check and a bureaucratic letter:
  • To the stupid and untalented: Unfortunately, other people don’t value what little you have to offer in the system of production. . . . Because of the misfortune that you were born so poorly endowed with talents, we productive ones will make it up to you: we’ll let you share in the bounty of what we have produced with our vastly superior and highly valued abilities. . . .
  • To the ugly and socially awkward: . . . Maybe you won’t be such a loser in love once potential dates see how rich you are.
  • In Anderson’s view, the way forward was to shift from distributive equality to what she called relational, or democratic, equality: meeting as equals, regardless of where you were coming from or going to.
  • This was, at heart, an exercise of freedom. The trouble was that many people, picking up on libertarian misconceptions, thought of freedom only in the frame of their own actions.
  • To be truly free, in Anderson’s assessment, members of a society had to be able to function as human beings (requiring food, shelter, medical care), to participate in production (education, fair-value pay, entrepreneurial opportunity), to execute their role as citizens (freedom to speak and to vote), and to move through civil society (parks, restaurants, workplaces, markets, and all the rest).
  • Anderson’s democratic model shifted the remit of egalitarianism from the idea of equalizing wealth to the idea that people should be equally free, regardless of their differences.
  • A society in which everyone had the same material benefits could still be unequal, in this crucial sense; democratic equality, being predicated on equal respect, wasn’t something you could simply tax into existence. “People, not nature, are responsible for turning the natural diversity of human beings into oppressive hierarchies.”
  • Her first book, “Value in Ethics and Economics,” appeared that year, announcing one of her major projects: reconciling value (an amorphous ascription of worth that is a keystone of ethics and economics) with pluralism (the fact that people seem to value things in different ways).
  • Philosophers have often assumed that pluralistic value reflects human fuzziness—we’re loose, we’re confused, and we mix rational thought with sentimental responses.
  • She offered an “expressive” theory: in her view, each person’s values could be various because they were socially expressed, and thus shaped by the range of contexts and relationships at play in a life. Instead of positing value as a basic, abstract quality across society (the way “utility” functioned for economists), she saw value as something determined by the details of an individual’s history.
  • Like her idea of relational equality, this model resisted the temptation to flatten human variety toward a unifying standard. In doing so, it helped expand the realm of free and reasoned economic choice.
  • Anderson’s model unseated the premises of rational-choice theory, in which individuals invariably make utility-maximizing decisions, occasionally in heartless-seeming ways. It ran with, rather than against, moral intuition. Because values were plural, it was perfectly rational to choose to spend evenings with your family, say, and to feel guilt toward the people you left in the lurch at work.
  • The theory also pointed out the limits on free-market ideologies, such as libertarianism.
  • In ethics, it broke across old factional debates. The core idea “has been picked up on by people across quite a range of positions,” Peter Railton, one of Anderson’s longtime colleagues, says. “Kantians and consequentialists alike”—people who viewed morality in terms of duties and obligations, and those who measured the morality of actions by their effects in the world—“could look at it and see something important.”
  • Traditionally, the discipline is taught through a-priori thought—you start with basic principles and reason forward. Anderson, by contrast, sought to work empirically, using information gathered from the world, identifying problems to be solved not abstractly but through the experienced problems of real people.
  • In 2004, the Stanford Encyclopedia of Philosophy asked Anderson to compose its entry on the moral philosophy of John Dewey, who helped carry pragmatist methods into the social realm. Dewey had an idea of democracy as a system of good habits that began in civil life. He was an anti-ideologue with an eye for pluralism.
  • “Dewey argued that the primary problems for ethics in the modern world concerned the ways society ought to be organized, rather than personal decisions of the individual.”
  • She started working with historians, trying to hone her understanding of ideas by studying them in the context of their creation. Take Rousseau’s apparent support of direct democracy. It’s rarely mentioned that, at the moment when he made that argument, his home town of Geneva had been taken over by oligarchs who claimed to represent the public. Pragmatism said that an idea was an instrument, which naturally gave rise to such questions as: an instrument for what, and where, and when?
  • In “What Is the Point of Equality?,” Anderson had already started to drift away from what philosophers, following Rawls, call ideal theory, based on an end vision for a perfectly just society. As Anderson began a serious study of race in America, though, she found herself losing faith in that approach entirely.
  • Broadly, there’s a culturally right and a culturally left ideal theory for race and society. The rightist version calls for color blindness. Instead of making a fuss about skin and ethnicity, its advocates say, society should treat people as people, and let the best and the hardest working rise.
  • The leftist theory envisions identity communities: for once, give black people (or women, or members of other historically oppressed groups) the resources and opportunities they need, including, if they want it, civil infrastructure for themselves.
  • In “The Imperative of Integration,” published in 2010, Anderson tore apart both of these models. Sure, it might be nice to live in a color-blind society, she wrote, but that’s nothing like the one that exists.
  • But the case for self-segregation was also weak. Affinity groups provided welcome comfort, yet that wasn’t the same as power or equality, Anderson pointed out. And there was a goose-and-gander problem. Either you let only certain groups self-segregate (certifying their subordinate status) or you also permitted, say, white men to do it.
  • Anderson’s solution was “integration,” a concept that, especially in progressive circles, had been uncool since the late sixties. Integration, by her lights, meant mixing on the basis of equality.
  • in attending to these empirical findings over doctrine, she announced herself as a non-ideal theorist: a philosopher with no end vision of society. The approach recalls E. L. Doctorow’s description of driving at night: “You can see only as far as the headlights, but you can make the whole trip that way.”
  • For others, though, a white woman making recommendations on race policy raised questions of perspective. She was engaging through a mostly white Anglo-American tradition. She worked from the premise that, because she drew on folders full of studies, the limits of her own perspective were not constraining.
  • Some philosophers of color welcomed the book. “She’s taking the need for racial justice seriously, and you could hardly find another white political philosopher over a period of decades doing that.”
  • Recently, Anderson changed the way she assigns undergraduate essays: instead of requiring students to argue a position and fend off objections, doubling down on their original beliefs, she asks them to discuss their position with someone who disagrees, and to explain how and why, if at all, the discussion changed their views.
  • The challenge of pluralism is the challenge of modern society: maintaining equality amid difference in a culture given to constant and unpredictable change.
  • Rather than fighting for the ascendancy of certain positions, Anderson suggests, citizens should fight to bolster healthy institutions and systems—those which insure that all views and experiences will be heard. Today’s righteous projects, after all, will inevitably seem fatuous and blinkered from the vantage of another age.
  • Anderson zeroed in on Adam Smith, whose “The Wealth of Nations,” published in 1776, is taken as a keystone of free-market ideology. At the time, English labor was subject to uncompensated apprenticeships, domestic servitude, and some measure of clerical dominion.
  • Smith saw the markets as an escape from that order. Their “most important” function, he explained, was to bring “liberty and security” to those “who had before lived almost in a continual state of war with their neighbours, and of servile dependency upon their superiors.”
  • Smith, in other words, was an egalitarian. He had written “The Wealth of Nations” in no small part to be a solution to what we’d now call structural inequality—the intractable, compounding privileges of an arbitrary hierarchy.
  • It was a historical irony that, a century later, writers such as Marx pointed to the market as a structure of dominion over workers; in truth, Smith and Marx had shared a socioeconomic project. And yet Marx had not been wrong to trash Smith’s ideas, because, during the time between them, the world around Smith’s model had changed, and it was no longer a useful tool.
  • Today, people still try to use, variously, both Smith’s and Marx’s tools on a different, postindustrial world: “Images of free market society that made sense prior to the Industrial Revolution continue to circulate today as ideals, blind to the gross mismatch between the background social assumptions reigning in the seventeenth and eighteenth centuries, and today’s institutional realities. We are told that our choice is between free markets and state control, when most adults live their working lives under a third thing entirely: private government.”
  • The unnaturalness of this top-heavy arrangement, combined with growing evidence of power abuses, has given many people reason to believe that something is fishy about the structure of American equality. Socialist and anti-capitalist models are again in vogue.
  • Anderson offers a different corrective path. She thinks it’s fine for some people to earn more than others. If you’re a brilliant potter, and people want to pay you more than the next guy for your pottery, great!
  • The problem isn’t that talent and income are distributed in unequal parcels. The problem is that Jeff Bezos earns more than a hundred thousand dollars a minute, while Amazon warehouse employees, many talented and hardworking, have reportedly resorted to urinating in bottles in lieu of a bathroom break. That circumstance reflects some structure of hierarchical oppression. It is a rip in the democratic fabric, and it’s increasingly the norm.
  • Andersonism holds that we don’t have to give up on market society if we can recognize and correct for its limitations—it may even be our best hope, because it’s friendlier to pluralism than most alternatives are.
  • we must be flexible. We must remain alert. We must solve problems collaboratively, in the moment, using society’s ears and eyes and the best tools that we can find.
  • “You can see that, from about 1950 to 1970, the typical American’s wages kept up with productivity growth,” she said. Then, around 1974, she went on, hourly compensation stagnated. American wages have been effectively flat for the past few decades, with the gains of productivity increasingly going to shareholders and to salaries for big bosses.
  • What changed? Anderson rattled off a constellation of factors, from strengthened intellectual-property law to winnowed antitrust law. Financialization, deregulation. Plummeting taxes on capital alongside rising payroll taxes. Privatization, which exchanged modest public-sector salaries for C.E.O. paydays. She gazed into the audience and blinked. “So now we have to ask: What has been used to justify this rather dramatic shift of labor-share of income?”
  • It was no wonder that industrial-age thinking was riddled with contradictions: it reflected what Anderson called “the plutocratic reversal” of classical liberal ideas. Those perversely reversed ideas about freedom were the ones that found a home in U.S. policy, and, well, here we were.

At the Existentialist Café: Freedom, Being, and Apricot Cocktails with Jean-P...

  • The phenomenologists’ leading thinker, Edmund Husserl, provided a rallying cry, ‘To the things themselves!’ It meant: don’t waste time on the interpretations that accrue upon things, and especially don’t waste time wondering whether the things are real. Just look at this that’s presenting itself to you, whatever this may be, and describe it as precisely as possible.
  • You might think you have defined me by some label, but you are wrong, for I am always a work in progress. I create myself constantly through action, and this is so fundamental to my human condition that, for Sartre, it is the human condition, from the moment of first consciousness to the moment when death wipes it out. I am my own freedom: no more, no less.
  • Sartre wrote like a novelist — not surprisingly, since he was one. In his novels, short stories and plays as well as in his philosophical treatises, he wrote about the physical sensations of the world and the structures and moods of human life. Above all, he wrote about one big subject: what it meant to be free. Freedom, for him, lay at the heart of all human experience, and this set humans apart from all other kinds of object.
  • Sartre listened to his problem and said simply, ‘You are free, therefore choose — that is to say, invent.’ No signs are vouchsafed in this world, he said. None of the old authorities can relieve you of the burden of freedom. You can weigh up moral or practical considerations as carefully as you like, but ultimately you must take the plunge and do something, and it’s up to you what that something is.
  • Even if the situation is unbearable — perhaps you are facing execution, or sitting in a Gestapo prison, or about to fall off a cliff — you are still free to decide what to make of it in mind and deed. Starting from where you are now, you choose. And in choosing, you also choose who you will be.
  • The war had made people realise that they and their fellow humans were capable of departing entirely from civilised norms; no wonder the idea of a fixed human nature seemed questionable.
  • If this sounds difficult and unnerving, it’s because it is. Sartre does not deny that the need to keep making decisions brings constant anxiety. He heightens this anxiety by pointing out that what you do really matters. You should make your choices as though you were choosing on behalf of the whole of humanity, taking the entire burden of responsibility for how the human race behaves. If you avoid this responsibility by fooling yourself that you are the victim of circumstance or of someone else’s bad advice, you are failing to meet the demands of human life and choosing a fake existence, cut off from your own ‘authenticity’.
  • Along with the terrifying side of this comes a great promise: Sartre’s existentialism implies that it is possible to be authentic and free, as long as you keep up the effort.
  • almost all agreed that it was, as an article in Les nouvelles littéraires phrased it, a ‘sickening mixture of philosophic pretentiousness, equivocal dreams, physiological technicalities, morbid tastes and hesitant eroticism … an introspective embryo that one would take distinct pleasure in crushing’.
  • he offered a philosophy designed for a species that had just scared the hell out of itself, but that finally felt ready to grow up and take responsibility.
  • In this rebellious world, just as with the Parisian bohemians and Dadaists in earlier generations, everything that was dangerous and provocative was good, and everything that was nice or bourgeois was bad.
  • Such interweaving of ideas and life had a long pedigree, although the existentialists gave it a new twist. Stoic and Epicurean thinkers in the classical world had practised philosophy as a means of living well, rather than of seeking knowledge or wisdom for their own sake. By reflecting on life’s vagaries in philosophical ways, they believed they could become more resilient, more able to rise above circumstances, and better equipped to manage grief, fear, anger, disappointment or anxiety.
  • In the tradition they passed on, philosophy is neither a pure intellectual pursuit nor a collection of cheap self-help tricks, but a discipline for flourishing and living a fully human, responsible life.
  • For Kierkegaard, Descartes had things back to front. In his own view, human existence comes first: it is the starting point for everything we do, not the result of a logical deduction. My existence is active: I live it and choose it, and this precedes any statement I can make about myself.
  • What was needed, [Nietzsche] felt, was not high moral or theological ideals, but a deeply critical form of cultural history or ‘genealogy’ that would uncover the reasons why we humans are as we are, and how we came to be that way. For him, all philosophy could even be redefined as a form of psychology, or history.
  • Studying our own moral genealogy cannot help us to escape or transcend ourselves. But it can enable us to see our illusions more clearly and lead a more vital, assertive existence.
  • For those oppressed on grounds of race or class, or for those fighting against colonialism, existentialism offered a change of perspective — literally, as Sartre proposed that all situations be judged according to how they appeared in the eyes of those most oppressed, or those whose suffering was greatest.
  • the existentialists inhabited their historical and personal world, as they inhabited their ideas. This notion of ‘inhabited philosophy’ is one I’ve borrowed from the English philosopher and novelist Iris Murdoch, who wrote the first full-length book on Sartre and was an early adopter of existentialism.
  • She observed that we need not expect moral philosophers to ‘live by’ their ideas in a simplistic way, as if they were following a set of rules. But we can expect them to show how their ideas are lived in. We should be able to look in through the windows of a philosophy, as it were, and see how people occupy it, how they move about and how they conduct themselves.
  • What is existentialism anyway?
  • Existentialists concern themselves with individual, concrete human existence. — They consider human existence different from the kind of being other things have. Other entities are what they are, but as a human I am whatever I choose to make of myself at every moment. I am free — and therefore I’m responsible for everything I do, a dizzying fact which causes an anxiety inseparable from human existence itself.
  • On the other hand, I am only free within situations, which can include factors in my own biology and psychology as well as physical, historical and social variables of the world into which I have been thrown. — Despite the limitations, I always want more: I am passionately involved in personal projects of all kinds. — Human existence is thus ambiguous: at once boxed in by borders and yet transcendent and exhilarating.
  • An existentialist who is also phenomenological provides no easy rules for dealing with this condition, but instead concentrates on describing lived experience as it presents itself. — By describing experience well, he or she hopes to understand this existence and awaken us to ways of living more authentic lives.
  • The first part of this is straightforward: a phenomenologist’s job is to describe. This is the activity that Husserl kept reminding his students to do. It meant stripping away distractions, habits, clichés of thought, presumptions and received ideas, in order to return our attention to what he called the ‘things themselves’. We must fix our beady gaze on them and capture them exactly as they appear, rather than as we think they are supposed to be.
  • Husserl therefore says that, to phenomenologically describe a cup of coffee, I should set aside both the abstract suppositions and any intrusive emotional associations. Then I can concentrate on the dark, fragrant, rich phenomenon in front of me now. This ‘setting aside’ or ‘bracketing out’ of speculative add-ons Husserl called epoché — a term borrowed from the ancient Sceptics,
  • The point about rigour is crucial; it brings us back to the first half of the command to describe phenomena. A phenomenologist cannot get away with listening to a piece of music and saying, ‘How lovely!’ He or she must ask: is it plaintive? is it dignified? is it colossal and sublime? The point is to keep coming back to the ‘things themselves’ — phenomena stripped of their conceptual baggage — so as to bail out weak or extraneous material and get to the heart of the experience.
  • Husserlian ‘bracketing out’ or epoché allows the phenomenologist to temporarily ignore the question ‘But is it real?’, in order to ask how a person experiences his or her world. Phenomenology gives a formal mode of access to human experience. It lets philosophers talk about life more or less as non-philosophers do, while still being able to tell themselves they are being methodical and rigorous.
  • Besides claiming to transform the way we think about reality, phenomenologists promised to change how we think about ourselves. They believed that we should not try to find out what the human mind is, as if it were some kind of substance. Instead, we should consider what it does, and how it grasps its experiences.
  • For Brentano, this reaching towards objects is what our minds do all the time. Our thoughts are invariably of or about something, he wrote: in love, something is loved, in hatred, something is hated, in judgement, something is affirmed or denied. Even when I imagine an object that isn’t there, my mental structure is still one of ‘about-ness’ or ‘of-ness’.
  • Except in deepest sleep, my mind is always engaged in this aboutness: it has ‘intentionality’. Having taken the germ of this from Brentano, Husserl made it central to his whole philosophy.
  • Husserl saw in the idea of intentionality a way to sidestep two great unsolved puzzles of philosophical history: the question of what objects ‘really’ are, and the question of what the mind ‘really’ is. By doing the epoché and bracketing out all consideration of reality from both topics, one is freed to concentrate on the relationship in the middle. One can apply one’s descriptive energies to the endless dance of intentionality that takes place in our lives: the whirl of our minds as they seize their intended phenomena one after the other and whisk them around the floor.
  • Understood in this way, the mind hardly is anything at all: it is its aboutness. This makes the human mind (and possibly some animal minds) different from any other naturally occurring entity. Nothing else can be as thoroughly about or of things as the mind is:
  • Some Eastern meditation techniques aim to still this scurrying creature, but the extreme difficulty of this shows how unnatural it is to be mentally inert. Left to itself, the mind reaches out in all directions as long as it is awake — and even carries on doing it in the dreaming phase of its sleep.
  • a mind that is experiencing nothing, imagining nothing, or speculating about nothing can hardly be said to be a mind at all.
  • Three simple ideas — description, phenomenon, intentionality — provided enough inspiration to keep roomfuls of Husserlian assistants busy in Freiburg for decades. With all of human existence awaiting their attention, how could they ever run out of things to do?
  • For Sartre, this gives the mind an immense freedom. If we are nothing but what we think about, then no predefined ‘inner nature’ can hold us back. We are protean.
  • Real, not real; inside, outside; what difference did it make? Reflecting on this, Husserl began turning his phenomenology into a branch of ‘idealism’ — the philosophical tradition which denied external reality and defined everything as a kind of private hallucination.
  • For Sartre, if we try to shut ourselves up inside our own minds, ‘in a nice warm room with the shutters closed’, we cease to exist. We have no cosy home: being out on the dusty road is the very definition of what we are.
  • One might think that, if Heidegger had anything worth saying, he could have communicated it in ordinary language. The fact is that he does not want to be ordinary, and he may not even want to communicate in the usual sense. He wants to make the familiar obscure, and to vex us. George Steiner thought that Heidegger’s purpose was less to be understood than to be experienced through a ‘felt strangeness’.
  • He takes Dasein in its most ordinary moments, then talks about it in the most innovative way he can. For Heidegger, Dasein’s everyday Being is right here: it is Being-in-the-world, or In-der-Welt-sein. The main feature of Dasein’s everyday Being-in-the-world right here is that it is usually busy doing something.
  • Thus, for Heidegger, all Being-in-the-world is also a ‘Being-with’ or Mitsein. We cohabit with others in a ‘with-world’, or Mitwelt. The old philosophical problem of how we prove the existence of other minds has now vanished. Dasein swims in the with-world long before it wonders about other minds.
  • Sometimes the best-educated people were those least inclined to take the Nazis seriously, dismissing them as too absurd to last. Karl Jaspers was one of those who made this mistake, as he later recalled, and Beauvoir observed similar dismissive attitudes among the French students in Berlin.
  • In any case, most of those who disagreed with Hitler’s ideology soon learned to keep their view to themselves. If a Nazi parade passed on the street, they would either slip out of view or give the obligatory salute like everyone else, telling themselves that the gesture meant nothing if they did not believe in it. As the psychologist Bruno Bettelheim later wrote of this period, few people will risk their life for such a small thing as raising an arm — yet that is how one’s powers of resistance are eroded away, and eventually one’s responsibility and integrity go with them.
  • for Arendt, if you do not respond adequately when the times demand it, you show a lack of imagination and attention that is as dangerous as deliberately committing an abuse. It amounts to disobeying the one command she had absorbed from Heidegger in those Marburg days: Think!
  • ‘Everything takes place under a kind of anaesthesia. Objectively dreadful events produce a thin, puny emotional response. Murders are committed like schoolboy pranks. Humiliation and moral decay are accepted like minor incidents.’ Haffner thought modernity itself was partly to blame: people had become yoked to their habits and to mass media, forgetting to stop and think, or to disrupt their routines long enough to question what was going on.
  • Heidegger’s former lover and student Hannah Arendt would argue, in her 1951 study The Origins of Totalitarianism, that totalitarian movements thrived at least partly because of this fragmentation in modern lives, which made people more vulnerable to being swept away by demagogues. Elsewhere, she coined the phrase ‘the banality of evil’ to describe the most extreme failures of personal moral awareness.
  • [Jaspers’s] communicative ideal fed into a whole theory of history: he traced all civilisation to an ‘Axial Period’ in the fifth century BC, during which philosophy and culture exploded simultaneously in Europe, the Middle East and Asia, as though a great bubble of minds had erupted from the earth’s surface. ‘True philosophy needs communion to come into existence,’ he wrote, and added, ‘Uncommunicativeness in a philosopher is virtually a criterion of the untruth of his thinking.’
  • The idea of being called to authenticity became a major theme in later existentialism, the call being interpreted as saying something like ‘Be yourself!’, as opposed to being phony. For Heidegger, the call is more fundamental than that. It is a call to take up a self that you didn’t know you had: to wake up to your Being. Moreover, it is a call to action. It requires you to do something: to take a decision of some sort.
  • Being and Time contained at least one big idea that should have been of use in resisting totalitarianism. Dasein, Heidegger wrote there, tends to fall under the sway of something called das Man or ‘the they’ — an impersonal entity that robs us of the freedom to think for ourselves. To live authentically requires resisting or outwitting this influence, but this is not easy because das Man is so nebulous. Man in German does not mean ‘man’ as in English (that’s der Mann), but a neutral abstraction, something like ‘one’ in the English phrase ‘one doesn’t do that’.
  • for Heidegger, das Man is me. It is everywhere and nowhere; it is nothing definite, but each of us is it. As with Being, it is so ubiquitous that it is difficult to see. If I am not careful, however, das Man takes over the important decisions that should be my own. It drains away my responsibility or ‘answerability’. As Arendt might put it, we slip into banality, failing to think.
  • Jaspers focused on what he called Grenzsituationen — border situations, or limit situations. These are the moments when one finds oneself constrained or boxed in by what is happening, but at the same time pushed by these events towards the limits or outer edge of normal experience. For example, you might have to make a life-or-death choice, or something might remind you suddenly of your mortality.
  • Jaspers’ interest in border situations probably had much to do with his own early confrontation with mortality. From childhood, he had suffered from a heart condition so severe that he always expected to die at any moment. He also had emphysema, which forced him to speak slowly, taking long pauses to catch his breath. Both illnesses meant that he had to budget his energies with care in order to get his work done without endangering his life.
  • If I am to resist das Man, I must become answerable to the call of my ‘voice of conscience’. This call does not come from God, as a traditional Christian definition of the voice of conscience might suppose. It comes from a truly existentialist source: my own authentic self. Alas, this voice is one I do not recognise and may not hear, because it is not the voice of my habitual ‘they-self’. It is an alien or uncanny version of my usual voice. I am familiar with my they-self, but not with my unalienated voice — so, in a weird twist, my real voice is the one that sounds strangest to me.
  • Marcel developed a strongly theological branch of existentialism. His faith distanced him from both Sartre and Heidegger, but he shared a sense of how history makes demands on individuals. In his essay ‘On the Ontological Mystery’, written in 1932 and published in the fateful year of 1933, Marcel wrote of the human tendency to become stuck in habits, received ideas, and a narrow-minded attachment to possessions and familiar scenes. Instead, he urged his readers to develop a capacity for remaining ‘available’ to situations as they arise. Similar ideas of disponibilité or availability had been explored by other writers.
  • Marcel made it his central existential imperative. He was aware of how rare and difficult it was. Most people fall into what he calls ‘crispation’: a tensed, encrusted shape in life — ‘as though each one of us secreted a kind of shell which gradually hardened and imprisoned him’.
  • Bettelheim later observed that, under Nazism, only a few people realised at once that life could not continue unaltered: these were the ones who got away quickly. Bettelheim himself was not among them. Caught in Austria when Hitler annexed it, he was sent first to Dachau and then to Buchenwald, but was then released in a mass amnesty to celebrate Hitler’s birthday in 1939 — an extraordinary reprieve, after which he left at once for America.
  • we are used to reading philosophy as offering a universal message for all times and places — or at least as aiming to do so. But Heidegger disliked the notion of universal truths or universal humanity, which he considered a fantasy. For him, Dasein is not defined by shared faculties of reason and understanding, as the Enlightenment philosophers thought. Still less is it defined by any kind of transcendent eternal soul, as in religious tradition. We do not exist on a higher, eternal plane at all. Dasein’s Being is local: it has a historical situation, and is constituted in time and place.
  • For Marcel, learning to stay open to reality in this way is the philosopher’s prime job. Everyone can do it, but the philosopher is the one who is called on above all to stay awake, so as to be the first to sound the alarm if something seems wrong.
  • If we are temporal beings by our very nature, then authentic existence means accepting, first, that we are finite and mortal. We will die: this all-important realisation is what Heidegger calls authentic ‘Being-towards-Death’, and it is fundamental to his philosophy.
  • Second, it also means understanding that we are historical beings, and grasping the demands our particular historical situation is making on us. In what Heidegger calls ‘anticipatory resoluteness’, Dasein discovers ‘that its uttermost possibility lies in giving itself up’. At that moment, through Being-towards-death and resoluteness in facing up to one’s time, one is freed from the they-self and attains one’s true, authentic self.
  • Hannah Arendt, instead, left early on: she had the benefit of a powerful warning. Just after the Nazi takeover, in spring 1933, she had been arrested while researching materials on anti-Semitism for the German Zionist Organisation at Berlin’s Prussian State Library. Her apartment was searched; both she and her mother were locked up briefly, then released. They fled, without stopping to arrange travel documents. They crossed to Czechoslovakia (then still safe) by a method that sounds almost too fabulous to be true: a sympathetic German family on the border had a house with its front door in Germany and its back door in Czechoslovakia. The family would invite people for dinner, then let them leave through the back door at night.
  • As Sartre argued in his 1943 review of The Stranger, basic phenomenological principles show that experience comes to us already charged with significance. A piano sonata is a melancholy evocation of longing. If I watch a soccer match, I see it as a soccer match, not as a meaningless scene in which a number of people run around taking turns to apply their lower limbs to a spherical object. If the latter is what I’m seeing, then I am not watching some more essential, truer version of soccer; I am failing to watch it properly as soccer at all.
  • Much as they liked Camus personally, neither Sartre nor Beauvoir accepted his vision of absurdity. For them, life is not absurd, even when viewed on a cosmic scale, and nothing can be gained by saying it is. Life for them is full of real meaning, although that meaning emerges differently for each of us.
  • For Sartre, we show bad faith whenever we portray ourselves as passive creations of our race, class, job, history, nation, family, heredity, childhood influences, events, or even hidden drives in our subconscious which we claim are out of our control. It is not that such factors are unimportant: class and race, in particular, he acknowledged as powerful forces in people’s lives, and Simone de Beauvoir would soon add gender to that list.
  • Sartre takes his argument to an extreme point by asserting that even war, imprisonment or the prospect of imminent death cannot take away my existential freedom. They form part of my ‘situation’, and this may be an extreme and intolerable situation, but it still provides only a context for whatever I choose to do next. If I am about to die, I can decide how to face that death. Sartre here resurrects the ancient Stoic idea that I may not choose what happens to me, but I can choose what to make of it, spiritually speaking.
  • But the Stoics cultivated indifference in the face of terrible events, whereas Sartre thought we should remain passionately, even furiously engaged with what happens to us and with what we can achieve. We should not expect freedom to be anything less than fiendishly difficult.
  • Freedom does not mean entirely unconstrained movement, and it certainly does not mean acting randomly. We often mistake the very things that enable us to be free — context, meaning, facticity, situation, a general direction in our lives — for things that define us and take away our freedom. It is only with all of these that we can be free in a real sense.
  • Nor did he mean that privileged groups have the right to pontificate to the poor and downtrodden about the need to ‘take responsibility’ for themselves. That would be a grotesque misreading of Sartre’s point, since his sympathy in any encounter always lay with the more oppressed side. But for each of us — for me — to be in good faith means not making excuses for myself.
  • Camus’ novel gives us a deliberately understated vision of heroism and decisive action compared to those of Sartre and Beauvoir. One can only do so much. It can look like defeatism, but it shows a more realistic perception of what it takes to actually accomplish difficult tasks like liberating one’s country.
  • Camus just kept returning to his core principle: no torture, no killing — at least not with state approval. Beauvoir and Sartre believed they were taking a more subtle and more realistic view. If asked why a couple of innocuous philosophers had suddenly become so harsh, they would have said it was because the war had changed them in profound ways. It had shown them that one’s duties to humanity could be more complicated than they seemed. ‘The war really divided my life in two,’ Sartre said later.
  • Poets and artists ‘let things be’, but they also let things come out and show themselves. They help to ease things into ‘unconcealment’ (Unverborgenheit), which is Heidegger’s rendition of the Greek term alētheia, usually translated as ‘truth’. This is a deeper kind of truth than the mere correspondence of a statement to reality, as when we say ‘The cat is on the mat’ and point to a mat with a cat on it. Long before we can do this, both cat and mat must ‘stand forth out of concealedness’. They must un-hide themselves.
  • Heidegger does not use the word ‘consciousness’ here because — as with his earlier work — he is trying to make us think in a radically different way about ourselves. We are not to think of the mind as an empty cavern, or as a container filled with representations of things. We are not even supposed to think of it as firing off arrows of intentional ‘aboutness’, as in the earlier phenomenology of Brentano. Instead, Heidegger draws us into the depths of his Schwarzwald, and asks us to imagine a gap with sunlight filtering in. We remain in the forest, but we provide a relatively open spot where other beings can bask for a moment. If we did not do this, everything would remain in the thickets, hidden even to itself.
  • The astronomer Carl Sagan began his 1980 television series Cosmos by saying that human beings, though made of the same stuff as the stars, are conscious and are therefore ‘a way for the cosmos to know itself’. Merleau-Ponty similarly quoted his favourite painter Cézanne as saying, ‘The landscape thinks itself in me, and I am its consciousness.’ This is something like what Heidegger thinks humanity contributes to the earth. We are not made of spiritual nothingness; we are part of Being, but we also bring something unique with us. It is not much: a little open space, perhaps with a path and a bench like the one the young Heidegger used to sit on to do his homework. But through us, the miracle occurs.
  • Beauty aside, Heidegger’s late writing can also be troubling, with its increasingly mystical notion of what it is to be human. If one speaks of a human being mainly as an open space or a clearing, or a means of ‘letting beings be’ and dwelling poetically on the earth, then one doesn’t seem to be talking about any recognisable person. The old Dasein has become less human than ever. It is now a forestry feature.
  • Even today, Jaspers, the dedicated communicator, is far less widely read than Heidegger, who has influenced architects, social theorists, critics, psychologists, artists, film-makers, environmental activists, and innumerable students and enthusiasts — including the later deconstructionist and post-structuralist schools, which took their starting point from his late thinking. Having spent the late 1940s as an outsider and then been rehabilitated, Heidegger became the overwhelming presence in university philosophy all over the European continent from then on.
  • As Levinas reflected on this experience, it helped to lead him to a philosophy that was essentially ethical, rather than ontological like Heidegger’s. He developed his ideas from the work of Jewish theologian Martin Buber, whose I and Thou in 1923 had distinguished between my relationship with an impersonal ‘it’ or ‘them’, and the direct personal encounter I have with a ‘you’. Levinas took it further: when I encounter you, we normally meet face-to-face, and it is through your face that you, as another person, can make ethical demands on me. This is very different from Heidegger’s Mitsein or Being-with, which suggests a group of people standing alongside one another, shoulder to shoulder as if in solidarity — perhaps as a unified nation or Volk.
  • For Levinas, we literally face each other, one individual at a time, and that relationship becomes one of communication and moral expectation. We do not merge; we respond to one another. Instead of being co-opted into playing some role in my personal drama of authenticity, you look me in the eyes — and you remain Other. You remain you.
  • This relationship is more fundamental than the self, more fundamental than consciousness, more fundamental even than Being — and it brings an unavoidable ethical obligation. Ever since Husserl, phenomenologists and existentialists had been trying to stretch the definition of existence to incorporate our social lives and relationships. Levinas did more: he turned philosophy around entirely so that these relationships were the foundation of our existence, not an extension of it.
  • Other thinkers took radical ethical turns during the war years. The most extreme was Simone Weil, who actually tried to live by the principle of putting other people’s ethical demands first. Having returned to France after her travels through Germany in 1932, she had worked in a factory so as to experience the degrading nature of such work for herself. When France fell in 1940, her family fled to Marseilles (against her protests), and later to the US and to Britain. Even in exile, Weil made extraordinary sacrifices. If there were people in the world who could not sleep in a bed, she would not do so either, so she slept on the floor.
  • Her last work, The Need for Roots, argues, among other things, that none of us has rights, but each one of us has a near-infinite degree of duty and obligation to the other. Whatever the underlying cause of her death — and anorexia nervosa seems to have been involved — no one could deny that she lived out her philosophy with total commitment. Of all the lives touched on in this book, hers is surely the most profound and challenging application of Iris Murdoch’s notion that a philosophy can be ‘inhabited’.
  • One striking link between these radical ethical thinkers, all on the fringes of our main story, is that they had religious faith. They also granted a special role to the notion of ‘mystery’ — that which cannot be known, calculated or understood, especially when it concerns our relationships with each other. Heidegger was different from them, since he rejected the religion he grew up with and had no real interest in ethics — probably as a consequence of his having no real interest in the human.
  • The mystery tradition had roots in Kierkegaard’s ‘leap of faith’. It owed much to the other great nineteenth-century mystic of the impossible, Dostoevsky, and to older theological notions. But it also grew from the protracted trauma that was the first half of the twentieth century. Since 1914, and especially since 1939, people in Europe and elsewhere had come to the realisation that we cannot fully know or trust ourselves; that we have no excuses or explanations for what we do — and yet that we must ground our existence and relationships on something firm, because otherwise we cannot survive.
  • Meanwhile, the Christian existentialist Gabriel Marcel was also still arguing, as he had since the 1930s, that ethics trumps everything else in philosophy and that our duty to each other is so great as to play the role of a transcendent ‘mystery’. He too had been led to this position partly by a wartime experience: during the First World War he had worked for the Red Cross’ Information Service, with the unenviable job of answering relatives’ inquiries about missing soldiers. Whenever news came, he passed it on, and usually it was not good. As Marcel later said, this task permanently inoculated him against warmongering rhetoric of any kind, and it made him aware of the power of what is unknown in our lives.
  • As the play’s much-quoted and frequently misunderstood final line has it: ‘Hell is other people.’ Sartre later explained that he did not mean to say that other people were hellish in general. He meant that after death we become frozen in their view, unable any longer to fend off their interpretation. In life, we can still do something to manage the impression we make; in death, this freedom goes and we are left entombed in other people’s memories and perceptions.
  • We have to do two near-impossible things at once: understand ourselves as limited by circumstances, and yet continue to pursue our projects as though we are truly in control. In Beauvoir’s view, existentialism is the philosophy that best enables us to do this, because it concerns itself so deeply with both freedom and contingency. It acknowledges the radical and terrifying scope of our freedom in life, but also the concrete influences that other philosophies tend to ignore: history, the body, social relationships and the environment.
  • The aspects of our existence that limit us, Merleau-Ponty says, are the very same ones that bind us to the world and give us scope for action and perception. They make us what we are. Sartre acknowledged the need for this trade-off, but he found it more painful to accept. Everything in him longed to be free of bonds, of impediments and limitations.
  • Of course we have to learn this skill of interpreting and anticipating the world, and this happens in early childhood, which is why Merleau-Ponty thought child psychology was essential to philosophy. This is an extraordinary insight. Apart from Rousseau, very few philosophers before him had taken childhood seriously; most wrote as though all human experience were that of a fully conscious, rational, verbal adult who has been dropped into this world from the sky — perhaps by a stork.
  • For Merleau-Ponty, we cannot understand our experience if we don’t think of ourselves in part as overgrown babies. We fall for optical illusions because we once learned to see the world in terms of shapes, objects and things relevant to our own interests. Our first perceptions came to us in tandem with our first active experiments in observing the world and reaching out to explore it, and are still linked with those experiences.
  • Another factor in all of this, for Merleau-Ponty, is our social existence: we cannot thrive without others, or not for long, and we need this especially in early life. This makes solipsistic speculation about the reality of others ridiculous; we could never engage in such speculation if we hadn’t already been formed by them.
  • As Descartes could have said (but didn’t), ‘I think, therefore other people exist.’ We grow up with people playing with us, pointing things out, talking, listening, and getting us used to reading emotions and movements; this is how we become capable, reflective, smoothly integrated beings.
  • In general, Merleau-Ponty thinks human experience only makes sense if we abandon philosophy’s time-honoured habit of starting with a solitary, capsule-like, immobile adult self, isolated from its body and world, which must then be connected up again — adding each element around it as though adding clothing to a doll. Instead, for him, we slide from the womb to the birth canal to an equally close and total immersion in the world. That immersion continues as long as we live, although we may also cultivate the art of partially withdrawing from time to time when we want to think or daydream.
  • When he looks for his own metaphor to describe how he sees consciousness, he comes up with a beautiful one: consciousness, he suggests, is like a ‘fold’ in the world, as though someone had crumpled a piece of cloth to make a little nest or hollow. It stays for a while, before eventually being unfolded and smoothed away. There is something seductive, even erotic, in this idea of my conscious self as an improvised pouch in the cloth of the world. I still have my privacy — my withdrawing room. But I am part of the world’s fabric, and I remain formed out of it for as long as I am here.
  • By the time of these works, Merleau-Ponty is taking his desire to describe experience to the outer limits of what language can convey. Just as with the late Husserl or Heidegger, or Sartre in his Flaubert book, we see a philosopher venturing so far from shore that we can barely follow. Emmanuel Levinas would head out to the fringes too, eventually becoming incomprehensible to all but his most patient initiates.
  • Sartre once remarked — speaking of a disagreement they had about Husserl in 1941 — that ‘we discovered, astounded, that our conflicts had, at times, stemmed from our childhood, or went back to the elementary differences of our two organisms’. Merleau-Ponty also said in an interview that Sartre’s work seemed strange to him, not because of philosophical differences, but because of a certain ‘register of feeling’, especially in Nausea, that he could not share. Their difference was one of temperament and of the whole way the world presented itself to them.
  • The two also differed in their purpose. When Sartre writes about the body or other aspects of experience, he generally does it in order to make a different point. He expertly evokes the grace of his café waiter, gliding between the tables, bending at an angle just so, steering the drink-laden tray through the air on the tips of his fingers — but he does it all in order to illustrate his ideas about bad faith. When Merleau-Ponty writes about skilled and graceful movement, the movement itself is his point. This is the thing he wants to understand.
  • We can never move definitively from ignorance to certainty, for the thread of the inquiry will constantly lead us back to ignorance again. This is the most attractive description of philosophy I’ve ever read, and the best argument for why it is worth doing, even (or especially) when it takes us no distance at all from our starting point.
  • By prioritising perception, the body, social life and childhood development, Merleau-Ponty gathered up philosophy’s far-flung outsider subjects and brought them in to occupy the centre of his thought.
  • In his inaugural lecture at the Collège de France on 15 January 1953, published as In Praise of Philosophy, he said that philosophers should concern themselves above all with whatever is ambiguous in our experience. At the same time, they should think clearly about these ambiguities, using reason and science. Thus, he said, ‘The philosopher is marked by the distinguishing trait that he possesses inseparably the taste for evidence and the feeling for ambiguity.’ A constant movement is required between these two
  • As Sartre wrote in response to Hiroshima, humanity had now gained the power to wipe itself out, and must decide every single day that it wanted to live. Camus also wrote that humanity faced the task of choosing between collective suicide and a more intelligent use of its technology — ‘between hell and reason’. After 1945, there seemed little reason to trust in humanity’s ability to choose well.
  • Merleau-Ponty observed in a lecture of 1951 that, more than any previous century, the twentieth century had reminded people how ‘contingent’ their lives were — how at the mercy of historical events and other changes that they could not control. This feeling went on long after the war ended. After the A-bombs were dropped on Hiroshima and Nagasaki, many feared that a Third World War would not be long in coming, this time between the Soviet Union and the United States.
Javier E

Andrew Sullivan: Nature, Nurture, and Weight Loss - 0 views

  • In his brilliant encyclopedia of “critical studies,” James Lindsay explains the core argument: “Like disability studies, fat studies draws on the work of Michel Foucault and queer theory to argue that negative attitudes to obesity are socially constructed and the result of systemic power that marginalizes and oppresses fat people (and fat perspectives) and of unjust medicalized narratives in order to justify prejudice against obese people.”
  • Fatness — like race or gender — is not grounded in physical or biological reality. It is a function of systemic power. The task of fat studies is to “interrogate” this oppressive power and then dismantle it.
  • take the polar opposite position: Fatness is an unhealthy lifestyle that can be stopped by people just eating less and better. We haven’t always been this fat, and we should take responsibility for it, and the physical and psychological damage it brings. Some level of stigma is thereby inevitable, and arguably useful. Humans are not healthy when they are badly overweight; and the explosion in obesity in America has become a serious public-health issue.
  • ...7 more annotations...
  • “When did it become taboo in this country to talk about getting healthy?” my friend Bill Maher asked in a recent monologue. “Fat shaming doesn’t need to end; it needs to make a comeback. Some amount of shame is good. We shamed people out of smoking and into wearing seat belts. We shamed them out of littering and most of them out of racism.”
  • On one side are helpless victims, who react to any debate with cries of oppression, and take no responsibility for their own physical destiny; on the other are brutal realists, with a callous touch, often refusing to see the genetic, social, and psychological complexity of fatness, or that serious health issues are not universal among heavier types
  • This is our reality. We are neither angels nor beasts, but we partake of both. We can rarely make the ugly beautiful, and if we do, it’s a moral achievement. However much we try, we will never correct the core natural inequalities and differences of our mammalian existence. But we can hazard a moral middle, seeing beauty in many ways, acknowledging the humanity of all shapes and sizes, while managing our health and weight in ways that are not totally subject to the gaze of others.
  • is to grapple with complexity in a way that can be rigorously empirical and yet also humane.
  • We are all driven by instinctive attraction, but men are particularly subject to fixed and crude notions of hotness. Beauty will thereby always be the source of extraordinary and extraordinarily unfair advantage, even if it captures only a tiny slice of what being human is about.
  • the two stances reflect our two ideological poles — not so much left and right anymore as nurture and nature. One pole argues nature doesn’t independently exist and everything is social; and one blithely asserts that nature determines everything. Both are ruinous attempts to bludgeon uncomfortable reality into satisfying ideology.
  • What we needed, in some ways, for our collective mental health, was a catalyst for greater physical socialization, more human contact, and more meaningful community. What we’re getting, I fear, is the opposite
Javier E

The Constitution of Knowledge - Persuasion - 0 views

  • But ideas in the marketplace do not talk directly to each other, and for the most part neither do individuals.
  • It is a good metaphor as far as it goes, yet woefully incomplete. It conjures up an image of ideas being traded by individuals in a kind of flea market, or of disembodied ideas clashing and competing in some ethereal realm of their own
  • When Americans think about how we find truth amid a world full of discordant viewpoints, we usually turn to a metaphor, that of the marketplace of ideas
  • ...31 more annotations...
  • Rather, our conversations are mediated through institutions like journals and newspapers and social-media platforms. They rely on a dense network of norms and rules, like truthfulness and fact-checking. They depend on the expertise of professionals, like peer reviewers and editors. The entire system rests on a foundation of values: a shared understanding that there are right and wrong ways to make knowledge.
  • Those values and rules and institutions do for knowledge what the U.S. Constitution does for politics: They create a governing structure, forcing social contestation onto peaceful and productive pathways.
  • I call them, collectively, the Constitution of Knowledge. If we want to defend that system from its many persistent attackers, we need to understand it—and its very special notion of reality.
  • What reality really is
  • The question “What is reality?” may seem either too metaphysical to answer meaningfully or too obvious to need answering
  • The whole problem is that humans have no direct access to an objective world independent of our minds and senses, and subjective certainty is no guarantee of truth. Faced with those problems and others, philosophers and practitioners think of reality as a set of propositions (or claims, or statements) that have been validated in some way, and that have thereby been shown to be at least conditionally true—true, that is, unless debunked
  • Some propositions reflect reality as we perceive it in everyday life (“The sky is blue”). Others, like the equations on a quantum physicist’s blackboard, are incomprehensible to intuition. Many fall somewhere in between.
  • a phrase I used a few sentences ago, “validated in some way,” hides a cheat. In epistemology, the whole question is, validated in what way? If we care about knowledge, freedom, and peace, then we need to stake a strong claim: Anyone can believe anything, but liberal science—open-ended, depersonalized checking by an error-seeking social network—is the only legitimate validator of knowledge, at least in the reality-based community.
  • That is a very bold, very broad, very tough claim, and it goes down very badly with lots of people and communities who feel ignored or oppressed by the Constitution of Knowledge: creationists, Christian Scientists, homeopaths, astrologists, flat-earthers, anti-vaxxers, birthers, 9/11 truthers, postmodern professors, political partisans, QAnon followers, and adherents of any number of other belief systems and religions.
  • But, like the U.S. Constitution’s claim to exclusivity in governing (“unconstitutional” means “illegal,” period), the Constitution of Knowledge’s claim to exclusivity is its sine qua non.
  • Rules for reality
  • The specific proposition does not matter. What does matter is that the only way to validate it is to submit it to the reality-based community. Otherwise, you could win dominance for your proposition by, say, brute force, threatening and jailing and torturing and killing those who see things differently—a standard method down through history
  • Say you believe something (X) to be true, and you believe that its acceptance as true by others is important or at least warranted
  • Or you and your like-minded friends could go off and talk only to each other, in which case you would have founded a cult—which is lawful but socially divisive and epistemically worthless.
  • Or you could engage in a social-media campaign to shame and intimidate those who disagree with you—a very common method these days, but one that stifles debate and throttles knowledge (and harms a lot of people).
  • What the reality-based community does is something else again. Its distinctive qualities derive from two core rules: 
  • what counts is the way the rule directs us to behave: You must assume your own and everyone else’s fallibility and you must hunt for your own and others’ errors, even if you are confident you are right. Otherwise, you are not reality-based.
  • The fallibilist rule: No one gets the final say. You may claim that a statement is established as knowledge only if it can be debunked, in principle, and only insofar as it withstands attempts to debunk it.
  • The empirical rule: No one has personal authority. You may claim that a statement has been established as knowledge only insofar as the method used to check it gives the same result regardless of the identity of the checker, and regardless of the source of the statement
  • Who you are does not count; the rules apply to everybody and persons are interchangeable. If your method is valid only for you or your affinity group or people who believe as you do, then you are not reality-based.
  • Whatever you do to check a proposition must be something that anyone can do, at least in principle, and get the same result. Also, no one proposing a hypothesis gets a free pass simply because of who she is or what group she belongs to.
  • Both rules have very profound social implications. “No final say” insists that to be knowledge, a statement must be checked; and it also says that knowledge is always provisional, standing only as long as it withstands checking.
  • “No personal authority” adds a crucial second step by defining what properly counts as checking. The point, as the great American philosopher Charles Sanders Peirce emphasized more than a century ago, is not that I look or you look but that we look; and then we compare, contest, and justify our views. Critically, then, the empirical rule is a social principle that forces us into the same conversation—a requirement that all of us, however different our viewpoints, agree to discuss what is in principle only one reality.
  • By extension, the empirical rule also dictates what does not count as checking: claims to authority by dint of a personally or tribally privileged perspective.
  • In principle, persons and groups are interchangeable. If I claim access to divine revelation, or if I claim the support of miracles that only believers can witness, or if I claim that my class or race or historically dominant status or historically oppressed status allows me to know and say things that others cannot, then I am breaking the empirical rule by exempting my views from contestability by others.
  • Though seemingly simple, the two rules define a style of social learning that prohibits a lot of the rhetorical moves we see every day.
  • Claiming that a conversation is too dangerous or blasphemous or oppressive or traumatizing to tolerate will almost always break the fallibilist rule.
  • Claims which begin “as a Jew,” or “as a queer,” or for that matter “as minister of information” or “as Pope” or “as head of the Supreme Soviet,” can be valid if they provide useful information about context or credentials; but if they claim to settle an argument by appealing to personal or tribal authority, rather than earned authority, they violate the empirical rule. 
  • “No personal authority” says nothing against trying to understand where people are coming from. If we are debating same-sex marriage, I may mention my experience as a gay person, and my experience may (I hope) be relevant.
  • But statements about personal standing and interest inform the conversation; they do not control it, dominate it, or end it. The rule acknowledges, and to an extent accepts, that people’s social positions and histories matter; but it asks its adherents not to burrow into their social identities, and not to play them as rhetorical trump cards, but to bring them to the larger project of knowledge-building and thereby transcend them.
  • the fallibilist and empirical rules are the common basis of science, journalism, law, and all the other branches of today’s reality-based community. For that reason, both rules also attract hostility, defiance, interference, and open warfare from those who would rather manipulate truth than advance it.
sissij

The Year of Conquering Negative Thinking - The New York Times - 1 views

  • All humans have a tendency to be a bit more like Eeyore than Tigger, to ruminate more on bad experiences than positive ones. It’s an evolutionary adaptation that helps us avoid danger and react quickly in a crisis.
  • Thinking styles can be genetic or the result of childhood experiences, said Judith Beck
  • “We were built to overlearn from negative experiences, but underlearn from positive ones,” said Rick Hanson, a psychologist and senior fellow at the Greater Good Science Center at the University of California, Berkeley.
  • ...4 more annotations...
  • The first step to stopping negative thoughts is a surprising one. Don’t try to stop them.
  • You can remind yourself to notice your thoughts in a nonjudgmental manner, without trying to change or alter them right away.
  • A study conducted at Ohio State University found that this method — known as Socratic questioning — was a simple way to reduce depressive symptoms in adults.
  • If you’re ruminating on your financial problems during a run around the track in hopes of finding a solution, then that is useful. But fretting for lap after lap about the president-elect or a foreign crisis is not going to accomplish anything.
  •  
    Negative thinking is the result of the logic of survival. It is sometimes essential and beneficial for us. I always thought that negative things are bad and useless, but even negative things have their reason to exist. I was very surprised to learn that our emotions are like a child in puberty: if we oppress them too much, they will push back. Instead, we should try to be friends with our emotions and accept them, just as parents should stay calm and reasonable when their children do something wrong, talking things through with them and trying to persuade them. --Sissi (1/4/2017)
kushnerha

How 'Empowerment' Became Something for Women to Buy - The New York Times - 0 views

  • The mix of things presumed to transmit and increase female power is without limit yet still depressingly limiting. “Empowerment” wasn’t always so trivialized, or so corporate, or even so clamorously attached to women.
  • Four decades ago, the word had much more in common with Latin American liberation theology than it did with “Lean In.” In 1968, the Brazilian academic Paulo Freire coined the word “conscientization,” empowerment’s precursor, as the process by which an oppressed person perceives the structural conditions of his oppression and is subsequently able to take action against his oppressors.
  • Eight years later, the educator Barbara Bryant Solomon, writing about American black communities, gave this notion a new name, “empowerment.” It was meant as an ethos for social workers in marginalized communities, to discourage paternalism and encourage their clients to solve problems in their own ways. Then in 1981, Julian Rappaport, a psychologist, broadened the concept into a political theory of power that viewed personal competency as fundamentally limitless; it placed faith in the individual and laid at her feet a corresponding amount of responsibility too.
  • ...7 more annotations...
  • Sneakily, empowerment had turned into a theory that applied to the needy while describing a process more realistically applicable to the rich. The word was built on a misaligned foundation; no amount of awareness can change the fact that it’s the already-powerful who tend to experience empowerment at any meaningful rate. Today “empowerment” invokes power while signifying the lack of it. It functions like an explorer staking a claim on new territory with a white flag.
  • highly marketable “women’s empowerment,” neither practice nor praxis, nor really theory, but a glossy, dizzying product instead. Women’s empowerment borrows the virtuous window-dressing of the social worker’s doctrine and kicks its substance to the side. It’s about pleasure, not power; it’s individualistic and subjective, tailored to insecurity and desire.
  • The new empowerment doesn’t increase potential so much as it assures you that your potential is just fine. Even when the thing being described as “empowering” is personal and mildly defiant (not shaving, not breast-feeding, not listening to men, et cetera), what’s being marketed is a certain identity.
  • When consumer purchases aren’t made out to be a path to female empowerment, a branded corporate experience often is. There’s TEDWomen (“about the power of women”), the Forbes Women’s Summit (“#RedefinePower”) and Fortune’s Most Powerful Women Conference (tickets are $10,000).
  • This consumption-and-conference empowerment dilutes the word to pitch-speak, and the concept to something that imitates rather than alters the structures of the world. This version of empowerment can be actively disempowering: It’s a series of objects and experiences you can purchase while the conditions determining who can access and accumulate power stay the same. The ready participation of well-off women in this strategy also points to a deep truth about the word “empowerment”: that it has never been defined by the people who actually need it. People who talk empowerment are, by definition, already there.
  • I have never said “empowerment” sincerely or heard it from a single one of my friends. The formulation has been diluted to something representational and bloodless — an architectural rendering of a building that will never be built.But despite its nonexistence in honest conversation, “empowerment” goes on thriving. It’s uniquely marketable, like the female body, which is where women’s empowerment is forced to live.
  • Like Sandberg, Kardashian is the apotheosis of a particular brand of largely contentless feminism, a celebratory form divorced from material politics, which makes it palatable — maybe irresistible — to the business world. The mistake would be to locate further empowerment in choosing between the two. Corporate empowerment — as well as the lightweight, self-exculpatory feminism it rides on — feeds ravenously on the distracting performance of identity, that buffet of false opposition.
Javier E

Forget the Money, Follow the Sacredness - NYTimes.com - 0 views

  • Despite what you might have learned in Economics 101, people aren’t always selfish. In politics, they’re more often groupish. When people feel that a group they value — be it racial, religious, regional or ideological — is under attack, they rally to its defense, even at some cost to themselves. We evolved to be tribal, and politics is a competition among coalitions of tribes.
  • The key to understanding tribal behavior is not money, it’s sacredness. The great trick that humans developed at some point in the last few hundred thousand years is the ability to circle around a tree, rock, ancestor, flag, book or god, and then treat that thing as sacred. People who worship the same idol can trust one another, work as a team and prevail over less cohesive groups. So if you want to understand politics, and especially our divisive culture wars, you must follow the sacredness.
  • A good way to follow the sacredness is to listen to the stories that each tribe tells about itself and the larger nation.
  • ...3 more annotations...
  • The Notre Dame sociologist Christian Smith once summarized the moral narrative told by the American left like this: “Once upon a time, the vast majority” of people suffered in societies that were “unjust, unhealthy, repressive and oppressive.” These societies were “reprehensible because of their deep-rooted inequality, exploitation and irrational traditionalism — all of which made life very unfair, unpleasant and short. But the noble human aspiration for autonomy, equality and prosperity struggled mightily against the forces of misery and oppression and eventually succeeded in establishing modern, liberal, democratic, capitalist, welfare societies.” Despite our progress, “there is much work to be done to dismantle the powerful vestiges of inequality, exploitation and repression.” This struggle, as Smith put it, “is the one mission truly worth dedicating one’s life to achieving.”This is a heroic liberation narrative. For the American left, African-Americans, women and other victimized groups are the sacred objects at the center of the story. As liberals circle around these groups, they bond together and gain a sense of righteous common purpose.
  • the Reagan narrative like this: “Once upon a time, America was a shining beacon. Then liberals came along and erected an enormous federal bureaucracy that handcuffed the invisible hand of the free market. They subverted our traditional American values and opposed God and faith at every step of the way.” For example, “instead of requiring that people work for a living, they siphoned money from hard-working Americans and gave it to Cadillac-driving drug addicts and welfare queens.” Instead of the “traditional American values of family, fidelity and personal responsibility, they preached promiscuity, premarital sex and the gay lifestyle” and instead of “projecting strength to those who would do evil around the world, they cut military budgets, disrespected our soldiers in uniform and burned our flag.” In response, “Americans decided to take their country back from those who sought to undermine it.”This, too, is a heroic narrative, but it’s a heroism of defense. In this narrative it’s God and country that are sacred — hence the importance in conservative iconography of the Bible, the flag, the military and the founding fathers. But the subtext in this narrative is about moral order. For social conservatives, religion and the traditional family are so important in part because they foster self-control, create moral order and fend off chaos.
  • Part of Reagan’s political genius was that he told a single story about America that rallied libertarians and social conservatives, who are otherwise strange bedfellows. He did this by presenting liberal activist government as the single devil that is eternally bent on destroying two different sets of sacred values — economic liberty and moral order. Only if all nonliberals unite into a coalition of tribes can this devil be defeated.
carolinewren

How Media Bias Is Killing Black America - The Root - 0 views

  • leads to both the erasure and criminalization of marginalized communities, has had dire consequences for both the psyches and lived experiences of black people in the United States since at least the 18th century,
  • “This is the press, an irresponsible press,” he said. “It will make the criminal look like he’s the victim and make the victim look like he’s the criminal. If you aren’t careful, the newspapers will have you hating the people who are being oppressed and loving the people who are doing the oppressing.”
  • Many studies have tackled implicit racial bias in law enforcement, health care and the legal field. In recent years, the phrase has become a buzzword used to broadly frame bigotry and racism as something so entrenched that some people aren’t aware that they subconsciously harbor racist feelings, associating black skin with negative behavior
  • ...8 more annotations...
  • their “conditioning has been conditioned,” and marginalized groups are often left to pick up the pieces in the wake of brutality and/or neglect by those in positions of power, trust and influence.
  • tackles media bias (pdf) and how it indiscriminately pathologizes communities of color for mass consumption.
  • “Implicit bias impacts the way black communities are treated across practically all sectors of life in America, from courtrooms to doctors’ offices,
  • “The media is no different, whether it be the use of pejorative terms like ‘thug’ and ‘animal’ to describe protesters in Ferguson and Baltimore, or the widespread overreporting of crime stories involving black suspects in New York City.”
  • Media bias not only negatively impacts black America’s relationship with law enforcement and the judicial system (pdf) but also extends to how African Americans are perceived in society at large.
  • “Television newsrooms are nearly 80 percent white, according to the Radio and Television News Directors Association, while radio newsrooms are 92 percent white,”
  • “The percentage of minority journalists has remained between 12 and 14 percent for more than a decade.”
  • This lays the groundwork for an intrinsically racist media structure that, according to The Atlantic’s Riva Gold, means “news organizations are losing their ability to empower, represent, and—especially in cases where language ability is crucial—even to report on minority populations in their communities.”
Javier E

Can We Improve? - The New York Times - 1 views

  • are we capable of substantial moral improvement? Could we someday be much better ethically than we are now? Is it likely that members of our species could become, on average, more generous or more honest, less self-deceptive or less self-interested?
  • I’d like to focus here on a more recent moment: 19th-century America, where the great optimism and idealism of a rapidly rising nation was tempered by a withering realism.
  • Emerson thought that “the Spirit who led us hither” would help perfect us; others have believed the agent of improvement to be evolution, or the inevitable progress of civilization. More recent advocates of our perfectibility might focus on genetic or neurological interventions, or — as in Ray Kurzweil’s “The Singularity Is Near” — information technologies.
  • ...10 more annotations...
  • One reason that a profound moral improvement of humankind is hard to envision is that it seems difficult to pull ourselves up morally by our own bootstraps; our attempts at improvement are going to be made by the unimproved
  • People and societies occasionally improve, managing to enfranchise marginalized groups, for example, or reduce violence, but also often degenerate into war, oppression or xenophobia. It is difficult to improve and easy to convince yourself that you have improved, until the next personality crisis, the next bad decision, the next war, the next outbreak of racism, the next “crisis” in education.
  • It’s difficult to teach your children what you yourself do not know, and it’s difficult to be good enough actually to teach your children to be good.
  • Plans for our improvement have resulted in progress here and there, but they’ve also led to many disasters of oppression, many wars and genocides.
  • One thing that Twain is saying is that many forms of evil — envy, for example, or elaborate dishonesty — appear on earth only with human beings and are found wherever we are. Creatures like us can’t see clearly what we’d be making progress toward.
  • His story “The Imp of the Perverse” shows another sort of reason that humans find it difficult to improve. The narrator asserts that a basic human impulse is to act wrongly on purpose, or even to do things because we know they’re wrong: “We act, for the reason that we should not,” the narrator declares. This is one reason that human action tends to undermine itself; our desires are contradictory.
  • Perhaps, then if we cannot improve systematically, we can improve inadvertently — or even by sheer perversity
  • As to evolution, it, too, is as likely to end in our extinction as our flourishing; it has of course extinguished most of the species to which it has given rise, and it does not clearly entail that every or any species gets better in any dimension over time
  • Our technologies may, as Kurzweil believes, allow us to transcend our finitude. On the other hand, they may end in our or even the planet’s total destruction.
  • “I have no faith in human perfectibility. I think that human exertion will have no appreciable effect on humanity. Man is … not more happy — nor more wise, than he was 6,000 years ago.”
Javier E

Julian Assange on Living in a Surveillance Society - NYTimes.com - 0 views

  • Describing the atomic bomb (which had only two months before been used to flatten Hiroshima and Nagasaki) as an “inherently tyrannical weapon,” he predicts that it will concentrate power in the hands of the “two or three monstrous super-states” that have the advanced industrial and research bases necessary to produce it. Suppose, he asks, “that the surviving great nations make a tacit agreement never to use the atomic bomb against one another? Suppose they only use it, or the threat of it, against people who are unable to retaliate?”
  • The likely result, he concludes, will be “an epoch as horribly stable as the slave empires of antiquity.” Inventing the term, he predicts “a permanent state of ‘cold war,’” a “peace that is no peace,” in which “the outlook for subject peoples and oppressed classes is still more hopeless.”
  • the destruction of privacy widens the existing power imbalance between the ruling factions and everyone else, leaving “the outlook for subject peoples and oppressed classes,” as Orwell wrote, “still more hopeless.”
  • ...10 more annotations...
  • At present even those leading the charge against the surveillance state continue to treat the issue as if it were a political scandal that can be blamed on the corrupt policies of a few bad men who must be held accountable. It is widely hoped that all our societies need to do to fix our problems is to pass a few laws.
  • The cancer is much deeper than this. We live not only in a surveillance state, but in a surveillance society. Totalitarian surveillance is not only embodied in our governments; it is embedded in our economy, in our mundane uses of technology and in our everyday interactions.
  • The very concept of the Internet — a single, global, homogenous network that enmeshes the world — is the essence of a surveillance state. The Internet was built in a surveillance-friendly way because governments and serious players in the commercial Internet wanted it that way. There were alternatives at every step of the way. They were ignored.
  • there is an undeniable “tyrannical” side to the Internet. But the Internet is too complex to be unequivocally categorized as a “tyrannical” or a “democratic” phenomenon.
  • At their core, companies like Google and Facebook are in the same business as the U.S. government’s National Security Agency. They collect a vast amount of information about people, store it, integrate it and use it to predict individual and group behavior, which they then sell to advertisers and others. This similarity made them natural partners for the NSA
  • Unlike intelligence agencies, which eavesdrop on international telecommunications lines, the commercial surveillance complex lures billions of human beings with the promise of “free services.” Their business model is the industrial destruction of privacy. And yet even the more strident critics of NSA surveillance do not appear to be calling for an end to Google and Facebook
  • It is possible for more people to communicate and trade with others in more places in a single instant than it ever has been in history. The same developments that make our civilization easier to surveil make it harder to predict. They have made it easier for the larger part of humanity to educate itself, to race to consensus, and to compete with entrenched power groups.
  • If there is a modern analogue to Orwell’s “simple” and “democratic weapon,” which “gives claws to the weak” it is cryptography, the basis for the mathematics behind Bitcoin and the best secure communications programs. It is cheap to produce: cryptographic software can be written on a home computer. It is even cheaper to spread: software can be copied in a way that physical objects cannot. But it is also insuperable — the mathematics at the heart of modern cryptography are sound, and can withstand the might of a superpower. The same technologies that allowed the Allies to encrypt their radio communications against Axis intercepts can now be downloaded over a dial-up Internet connection and deployed with a cheap laptop.
  • It is too early to say whether the “democratizing” or the “tyrannical” side of the Internet will eventually win out. But acknowledging them — and perceiving them as the field of struggle — is the first step toward acting effectively
  • Humanity cannot now reject the Internet, but clearly we cannot surrender it either. Instead, we have to fight for it. Just as the dawn of atomic weapons inaugurated the Cold War, the manifold logic of the Internet is the key to understanding the approaching war for the intellectual center of our civilization
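The Assange excerpt above turns on a technical claim: that strong cryptography is cheap to produce, trivial to copy, and mathematically insuperable. A minimal sketch of that claim in Python, using only the standard library — a one-time pad, whose security is information-theoretic rather than computational: if the key is truly random, as long as the message, and never reused, no amount of computing power recovers the plaintext from the ciphertext alone. The message and helper names are illustrative assumptions, not anything from the op-ed, and real tools solve the key-distribution problem this toy ignores.

    import secrets

    def xor_bytes(data: bytes, key: bytes) -> bytes:
        # XOR each message byte with the corresponding key byte.
        return bytes(d ^ k for d, k in zip(data, key))

    message = b"meet at dawn"                 # hypothetical plaintext
    key = secrets.token_bytes(len(message))   # one random key byte per message byte
    ciphertext = xor_bytes(message, key)      # encryption
    recovered = xor_bytes(ciphertext, key)    # decryption is the same XOR
    assert recovered == message

A few lines on a cheap laptop, exactly as the essay says — which is why the mathematics, unlike the hardware of the atomic age, cannot be monopolized by superpowers.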
Javier E

Andrew Sullivan: Is There a Way to Acknowledge Our Progress? - 0 views

  • A raft of recent books has been full of the need for renewed rage against the oppression of women. The demonization of “white men” has intensified just as many working-class white men face a bleak economic future and as men are disappearing from the workforce. It is as if the less gender discrimination there is, the angrier you should become.
  • You see it in the gay-rights movement too. I get fundraising emails all the time reminding me how we live in a uniquely perilous moment for LGBTQ Americans and that this era, in the words of Human Rights Campaign spokesperson Charlotte Clymer, is one “that has seen unprecedented attacks on LGBTQ people.”
  • Might I suggest some actual precedents: when all gay sex was criminal, when many were left by their government to die of AIDS, when no gay relationships were recognized in the law, when gay service members were hounded out of their mission, when the federal government pursued a purge of anyone suspected of being gay. All but the last one occurred in my adult lifetime. But today we’re under “unprecedented” assault?
  • ...19 more annotations...
  • a recent psychological study suggests a simpler explanation. Its core idea is what you might call “oppression creep” or, more neutrally, “prevalence-induced concept change.” The more progress we observe, the greater the remaining injustices appear.
  • We seem incapable of keeping a concept stable over time when the prevalence of that concept declines.
  • Although modern societies have made extraordinary progress in solving a wide range of social problems, from poverty and illiteracy to violence and infant mortality, the majority of people believe that the world is getting worse. The fact that concepts grow larger when their instances grow smaller may be one source of that pessimism.
  • “In other words, when the prevalence of blue dots decreased, participants’ concept of blue expanded to include dots that it had previously excluded.”
  • When blue dots became rare, purple dots began to look blue; when threatening faces became rare, neutral faces began to appear threatening … This happened even when the change in the prevalence of instances was abrupt, even when participants were explicitly told that the prevalence of instances would change, and even when participants were instructed and paid to ignore these changes.
  • We seem to be wired to assume a given threat remains just as menacing even when its actual prevalence has declined:
  • We see relatively, not absolutely. We change our standards all the time, depending on context.
  • This study may help explain why, in the midst of tremendous gains for gays, women, and racial minorities, we still insist more than ever that we live in a patriarchal, misogynist, white supremacist, homophobic era.
  • We never seem to be able to say: “Okay, we’re done now, we’ve got this, politics has done all it reasonably could, now let’s move on with our lives.” We can only ever say: “It’s worse than ever!”
  • whatever the cause, the result is that we steadfastly refuse to accept the fact of progress, in a cycle of eternal frustration at what injustices will always remain
  • picking someone who has bent the truth so often about so many things — her ancestry, her commitment to serving a full term as senator, the schools her kids went to, the job her father had (according to her brother), or the time she was “fired” for being pregnant — is an unnecessary burden.
  • The Democrat I think is most likely to lose to Trump is Elizabeth Warren. I admire her ambition and grit and aggression, but nominating a woke, preachy Harvard professor plays directly into Trump’s hands
  • Pete Buttigieg’s appeal has waned for me.
  • over time, the combination of his perfect résumé, his actorly ability to change register as he unpacks a sentence, and his smoothness and self-love have begun to worry me. My fear is that his appeal will fade
  • Klobuchar, to my mind, is the better midwestern option. She is an engaging and successful politician. But there’s a reason she seemingly can’t get more traction. She just doesn’t command a room
  • I so want Biden to be ten years younger. I can’t help but be very fond of the man, and he does have a mix of qualities that appeal to both African-Americans and white working-class midwesterners. What I worry about is his constant stumbling in his speech, his muddling of words, those many moments when his eyes close, and his face twitches, as he tries to finish a sentence
  • Sanders has been on the far left all his life, and the oppo research the GOP throws at him could be brutal. He’s a man, after all, who sided with a Marxist-Leninist party that supported Ayatollah Khomeini during the hostage crisis in 1979. He loved the monstrous dictator Fidel Castro and took his 1988 honeymoon in the Soviet Union, no less, where he openly and publicly criticized his own country and praised many aspects of the Soviet system
  • On two key issues, immigration and identity politics, Bernie has sensibilities and instincts that could neutralize these two strong points for Trump. Sanders has always loathed the idea of open borders and the effect they have on domestic wages, and he doesn’t fit well with the entire woke industry. He still believes in class struggle, not the culture war
  • Biden has an advantage because of Obama, his appeal to the midwestern voters (if he wins back Pennsylvania, that would work wonders), and his rapport with African-Americans. But he also seems pretty out of it.
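The “prevalence-induced concept change” finding quoted earlier in this entry — purple dots start to look blue once blue dots become rare — can be made concrete with a toy simulation. This is a minimal sketch under invented assumptions: hues are numbers in [0, 1], and the judge’s criterion for “blue” drifts toward the average of recently seen hues (a relative standard). The hue ranges, adaptation rate, and block sizes are illustrative, not taken from the study.

    import random

    def run_block(n_trials, blue_rate, criterion, adapt=0.05):
        """Show n_trials dots; count 'blue' judgments of objectively
        purple dots (hue < 0.5) while the criterion adapts."""
        false_blues = 0
        for _ in range(n_trials):
            if random.random() < blue_rate:
                hue = random.uniform(0.6, 1.0)   # an objectively blue dot
            else:
                hue = random.uniform(0.0, 0.4)   # an objectively purple dot
            if hue > criterion and hue < 0.5:
                false_blues += 1                 # a purple dot judged blue
            # Relative judgment: the cutoff drifts toward recent stimuli,
            # so when blue becomes rare the cutoff slides down into purple.
            criterion += adapt * (hue - criterion)
        return false_blues, criterion

    random.seed(0)
    criterion = 0.5
    for blue_rate in (0.5, 0.5, 0.25, 0.10, 0.05):  # blue dots become rarer
        false_blues, criterion = run_block(200, blue_rate, criterion)
        print(f"blue rate {blue_rate:.2f}: criterion {criterion:.2f}, "
              f"purple judged blue: {false_blues}")

Nothing about the dots changes across blocks; only the observer’s standard does — which is the mechanism the study, and Sullivan’s “oppression creep,” describes.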
Javier E

He Wants to Save Classics From Whiteness. Can the Field Survive? - The New York Times - 0 views

  • Padilla laid out an indictment of his field. “If one were intentionally to design a discipline whose institutional organs and gatekeeping protocols were explicitly aimed at disavowing the legitimate status of scholars of color,” he said, “one could not do better than what classics has done.”
  • Padilla believes that classics is so entangled with white supremacy as to be inseparable from it. “Far from being extrinsic to the study of Greco-Roman antiquity,” he has written, “the production of whiteness turns on closer examination to reside in the very marrows of classics.”
  • Rather than kowtowing to criticism, Williams said, “maybe we should start defending our discipline.” She protested that it was imperative to stand up for the classics as the political, literary and philosophical foundation of European and American culture: “It’s Western civilization. It matters because it’s the West.” Hadn’t classics given us the concepts of liberty, equality and democracy?
  • ...46 more annotations...
  • “I believe in merit. I don’t look at the color of the author.” She pointed a finger in Padilla’s direction. “You may have got your job because you’re Black,” Williams said, “but I would prefer to think you got your job because of merit.”
  • Williams ceded the microphone, and Padilla was able to speak. “Here’s what I have to say about the vision of classics that you outlined,” he said. “I want nothing to do with it. I hope the field dies that you’ve outlined, and that it dies as swiftly as possible.”
  • What he did find was a slim blue-and-white textbook titled “How People Lived in Ancient Greece and Rome.” “Western civilization was formed from the union of early Greek wisdom and the highly organized legal minds of early Rome,” the book began. “The Greek belief in a person’s ability to use his powers of reason, coupled with Roman faith in military strength, produced a result that has come to us as a legacy, or gift from the past.” Thirty years later, Padilla can still recite those opening lines.
  • In 2017, he published a paper in the journal Classical Antiquity that compared evidence from antiquity and the Black Atlantic to draw a more coherent picture of the religious life of the Roman enslaved. “It will not do merely to adopt a pose of ‘righteous indignation’ at the distortions and gaps in the archive,” he wrote. “There are tools available for the effective recovery of the religious experiences of the enslaved, provided we work with these tools carefully and honestly.”
  • Padilla sensed that his pursuit of classics had displaced other parts of his identity, just as classics and “Western civilization” had displaced other cultures and forms of knowledge. Recovering them would be essential to dismantling the white-supremacist framework in which both he and classics had become trapped. “I had to actively engage in the decolonization of my mind,” he told me.
  • He also gravitated toward contemporary scholars like José Esteban Muñoz, Lorgia García Peña and Saidiya Hartman, who speak of race not as a physical fact but as a ghostly system of power.
  • In response to rising anti-immigrant sentiment in Europe and the United States, Mary Beard, perhaps the most famous classicist alive, wrote in The Wall Street Journal that the Romans “would have been puzzled by our modern problems with migration and asylum,” because the empire was founded on the “principles of incorporation and of the free movement of people.”
  • In November 2015, he wrote an essay for Eidolon, an online classics journal, clarifying that in Rome, as in the United States, paeans to multiculturalism coexisted with hatred of foreigners. Defending a client in court, Cicero argued that “denying foreigners access to our city is patently inhumane,” but ancient authors also recount the expulsions of whole “suspect” populations, including a roundup of Jews in 139 B.C., who were not considered “suitable enough to live alongside Romans.”
  • The job of classicists is not to “point out the howlers,” he said on a 2017 panel. “To simply take the position of the teacher, the qualified classicist who knows things and can point to these mistakes, is not sufficient.”
  • Dismantling structures of power that have been shored up by the classical tradition will require more than fact-checking; it will require writing an entirely new story about antiquity, and about who we are today
  • To find that story, Padilla is advocating reforms that would “explode the canon” and “overhaul the discipline from nuts to bolts,” including doing away with the label “classics” altogether.
  • . “What I want to be thinking about in the next few weeks,” he told them, “is how we can be telling the story of the early Roman Empire not just through a variety of sources but through a variety of persons.” He asked the students to consider the lives behind the identities he had assigned them, and the way those lives had been shaped by the machinery of empire, which, through military conquest, enslavement and trade, creates the conditions for the large-scale movement of human beings.
  • ultimately, he decided that leaving enslaved characters out of the role play was an act of care. “I’m not yet ready to turn to a student and say, ‘You are going to be a slave.’”
  • Privately, even some sympathetic classicists worry that Padilla’s approach will only hasten the field’s decline. “I’ve spoken to undergrad majors who say that they feel ashamed to tell their friends they’re studying classics,”
  • “I very much admire Dan-el’s work, and like him, I deplore the lack of diversity in the classical profession,” Mary Beard told me via email. But “to ‘condemn’ classical culture would be as simplistic as to offer it unconditional admiration.”
  • In a 2019 talk, Beard argued that “although classics may become politicized, it doesn’t actually have a politics,” meaning that, like the Bible, the classical tradition is a language of authority — a vocabulary that can be used for good or ill by would-be emancipators and oppressors alike.
  • Over the centuries, classical civilization has acted as a model for people of many backgrounds, who turned it into a matrix through which they formed and debated ideas about beauty, ethics, power, nature, selfhood, citizenship and, of course, race
  • Anthony Grafton, the great Renaissance scholar, put it this way in his preface to “The Classical Tradition”: “An exhaustive exposition of the ways in which the world has defined itself with regard to Greco-Roman antiquity would be nothing less than a comprehensive history of the world.”
  • Classics as we know it today is a creation of the 18th and 19th centuries. During that period, as European universities emancipated themselves from the control of the church, the study of Greece and Rome gave the Continent its new, secular origin story. Greek and Latin writings emerged as a competitor to the Bible’s moral authority, which lent them a liberatory power
  • Historians stress that such ideas cannot be separated from the discourses of nationalism, colorism and progress that were taking shape during the modern colonial period, as Europeans came into contact with other peoples and their traditions. “The whiter the body is, the more beautiful it is,” Winckelmann wrote.
  • While Renaissance scholars were fascinated by the multiplicity of cultures in the ancient world, Enlightenment thinkers created a hierarchy with Greece and Rome, coded as white, on top, and everything else below.
  • Jefferson, along with most wealthy young men of his time, studied classics at college, where students often spent half their time reading and translating Greek and Roman texts. “Next to Christianity,” writes Caroline Winterer, a historian at Stanford, “the central intellectual project in America before the late 19th century was classicism.”
  • Of the 2.5 million people living in America in 1776, perhaps only 3,000 had gone to college, but that number included many of the founders
  • They saw classical civilization as uniquely educative — a “lamp of experience,” in the words of Patrick Henry, that could light the path to a more perfect union. However true it was, subsequent generations would come to believe, as Hannah Arendt wrote in “On Revolution,” that “without the classical example … none of the men of the Revolution on either side of the Atlantic would have possessed the courage for what then turned out to be unprecedented action.”
  • Comparisons between the United States and the Roman Empire became popular as the country emerged as a global power. Even after Latin and Greek were struck from college-entrance exams, the proliferation of courses on “great books” and Western civilization, in which classical texts were read in translation, helped create a coherent national story after the shocks of industrialization and global warfare.
  • even as the classics were pulled apart, laughed at and transformed, they continued to form the raw material with which many artists shaped their visions of modernity.
  • Over the centuries, thinkers as disparate as John Adams and Simone Weil have likened classical antiquity to a mirror. Generations of intellectuals, among them feminist, queer and Black scholars, have seen something of themselves in classical texts, flashes of recognition that held a kind of liberatory promise
  • The language that is used to describe the presence of classical antiquity in the world today — the classical tradition, legacy or heritage — contains within it the idea of a special, quasi-genetic relationship. In his lecture “There Is No Such Thing as Western Civilization,” Kwame Anthony Appiah (this magazine’s Ethicist columnist) mockingly describes the belief in such a kinship as the belief in a “golden nugget” of insight — a precious birthright and shimmering sign of greatness — that white Americans and Europeans imagine has been passed down to them from the ancients.
  • To see classics the way Padilla sees it means breaking the mirror; it means condemning the classical legacy as one of the most harmful stories we’ve told ourselves
  • Padilla is wary of colleagues who cite the radical uses of classics as a way to forestall change; he believes that such examples have been outmatched by the field’s long alliance with the forces of dominance and oppression.
  • Classics and whiteness are the bones and sinew of the same body; they grew strong together, and they may have to die together. Classics deserves to survive only if it can become “a site of contestation” for the communities who have been denigrated by it in the past.
  • if classics fails his test, Padilla and others are ready to give it up. “I would get rid of classics altogether,” Walter Scheidel, another of Padilla’s former advisers at Stanford, told me. “I don’t think it should exist as an academic field.”
  • One way to get rid of classics would be to dissolve its faculties and reassign their members to history, archaeology and language departments.
  • many classicists are advocating softer approaches to reforming the discipline, placing the emphasis on expanding its borders. Schools including Howard and Emory have integrated classics with Ancient Mediterranean studies, turning to look across the sea at Egypt, Anatolia, the Levant and North Africa. The change is a declaration of purpose: to leave behind the hierarchies of the Enlightenment and to move back toward the Renaissance model of the ancient world as a place of diversity and mixture.
  • Ian Morris put it more bluntly. “Classics is a Euro-American foundation myth,” Morris said to me. “Do we really want that sort of thing?”
  • “There’s a more interesting story to be told about the history of what we call the West, the history of humanity, without valorizing particular cultures in it,” said Josephine Quinn, a professor of ancient history at Oxford. “It seems to me the really crucial mover in history is always the relationship between people, between cultures.”
  • “In some moods, I feel that this is just a moment of despair, and people are trying to find significance even if it only comes from self-accusation,” he told me. “I’m not sure that there is a discipline that is exempt from the fact that it is part of the history of this country. How distinctly wicked is classics? I don’t know that it is.”
  • “One of the dubious successes of my generation is that it did break the canon,” Richlin told me. “I don’t think we could believe at the time that we would be putting ourselves out of business, but we did.” She added: “If they blew up the classics departments, that would really be the end.”
  • Padilla, like Douglass, now sees the moment of absorption into the classical, literary tradition as simultaneous with his apprehension of racial difference; he can no longer find pride or comfort in having used it to bring himself out of poverty.
  • “Claiming dignity within this system of structural oppression,” Padilla has said, “requires full buy-in into its logic of valuation.” He refuses to “praise the architects of that trauma as having done right by you at the end.”
  • Last June, as racial-justice protests unfolded across the nation, Padilla turned his attention to arenas beyond classics. He and his co-authors — the astrophysicist Jenny Greene, the literary theorist Andrew Cole and the poet Tracy K. Smith — began writing their open letter to Princeton with 48 proposals for reform. “Anti-Blackness is foundational to America,” the letter began. “Indifference to the effects of racism on this campus has allowed legitimate demands for institutional support and redress in the face of microaggression and outright racist incidents to go long unmet.”
  • Padilla believes that the uproar over free speech is misguided. “I don’t see things like free speech or the exchange of ideas as ends in themselves,” he told me. “I have to be honest about that. I see them as a means to the end of human flourishing.”
  • “There is a certain kind of classicist who will look on what transpired and say, ‘Oh, that’s not us,’” Padilla said when we spoke recently. “What is of interest to me is why is it so imperative for classicists of a certain stripe to make this discursive move? ‘This is not us.’
  • Joel Christensen, the Brandeis professor, now feels that it is his “moral and ethical and intellectual responsibility” to teach classics in a way that exposes its racist history. “Otherwise we’re just participating in propaganda,”
  • Christensen, who is 42, was in graduate school before he had his “crisis of faith,” and he understands the fear that many classicists may experience at being asked to rewrite the narrative of their life’s work. But, he warned, “that future is coming, with or without Dan-el.”
  • On Jan. 6, Padilla turned on the television minutes after the windows of the Capitol were broken. In the crowd, he saw a man in a Greek helmet with TRUMP 2020 painted in white. He saw a man in a T-shirt bearing a golden eagle on a fasces — symbols of Roman law and governance — below the logo 6MWE, which stands for “Six Million Wasn’t Enough.”
Javier E

Sleight of the 'Invisible Hand' - NYTimes.com - 1 views

  • The wealthy, says Smith, spend their days establishing an “economy of greatness,” one founded on “luxury and caprice” and fueled by “the gratification of their own vain and insatiable desires.” Any broader benefit that accrues from their striving is not the consequence of foresight or benevolence, but “in spite of their natural selfishness and rapacity.” They don’t do good, they are led to it.
  • In other words, the invisible hand did not solve the problem of politics by making politics altogether unnecessary. “We don’t think government can solve all our problems,” President Obama said in his convention address, “But we don’t think that government is the source of all our problems.” Smith would have appreciated this formulation. For him, whether government should get out of the way in any given matter, economic or otherwise, was a question for considered judgment abetted by scientific inquiry.
  • What it did not do, however, was void any proposal outright, much less prove that all government activity was counterproductive. Smith held that the sovereign had a role supporting education, building infrastructure and public institutions, and providing security from foreign and domestic threats — initiatives that should be paid for, in part, by a progressive tax code and duties on luxury goods. He even believed the government had a “duty” to protect citizens from “oppression,” the inevitable tendency of the strong to take advantage of the ignorance and necessity of the weak.
  • Smith described this state of affairs as “the obvious and simple system of natural liberty,” and he knew that it made for the revolutionary implication of his work. It shifted the way we thought about the relationship between government action and economic growth, making “less means more” the rebuttable presumption of policy proposals.
  • politics is a practical venture, and Smith distrusted those statesmen who confused their work with an exercise in speculative philosophy. Their proposals should be judged not by the delusive lights of the imagination, but by the metrics of science and experience, what President Obama described in the first presidential debate as “math, common sense and our history.”
  • John Paul Rollert teaches business ethics at the University of Chicago Booth School of Business and leadership at the Harvard Extension School.  He is the author of a recent paper on President Obama’s “Empathy Standard” for the Yale Law Journal Online.
  • Adam Smith, analytic philosophy, economics, Elections 2012
Javier E

Jonathan Haidt and the Moral Matrix: Breaking Out of Our Righteous Minds | Guest Blog, ... - 2 views

  • What did satisfy Haidt’s natural thirst for understanding human beings was social psychology.
  • Haidt initially found moral psychology “really dull.” He described it to me as “really missing the heart of the matter and too cerebral.” This changed in his second year after he took a course from the anthropologist Alan Fiske and got interested in moral emotions.
  • “The Emotional Dog and Its Rational Tail,” which he describes as “the most important article I’ve ever written.”
  • it helped shift moral psychology away from rationalist models that dominated in the 1980s and 1990s. In its place Haidt offered an understanding of morality from an intuitive and automatic level. As Haidt says on his website, “we are just not very good at thinking open-mindedly about moral issues, so rationalist models end up being poor descriptions of actual moral psychology.”
  • “the mind is divided into parts that sometimes conflict. Like a rider on the back of an elephant, the conscious, reasoning part of the mind has only limited control of what the elephant does.”
  • In the last few decades, psychology began to understand the unconscious mind not as dark and suppressed, as Freud did, but as intuitive, highly intelligent and necessary for good conscious reasoning. “Elephants,” he reminded me, “are really smart, much smarter than horses.”
  • we are 90 percent chimp and 10 percent bee. That is to say, though we are inherently selfish, human nature is also about being what he terms “groupish.” He explained it to me like this:
  • they developed the idea that humans possess six universal moral modules, or moral “foundations,” that get built upon to varying degrees across culture and time. They are: Care/harm, Fairness/cheating, Loyalty/betrayal, Authority/subversion, Sanctity/degradation, and Liberty/oppression. Haidt describes these six modules like a “tongue with six taste receptors.” “In this analogy,” he explains in the book, “the moral matrix of a culture is something like its cuisine: it’s a cultural construction, influenced by accidents of environment and history, but it’s not so flexible that anything goes. You can’t have a cuisine based on grass and tree bark, or even one based primarily on bitter tastes. Cuisines vary, but they all must please tongues equipped with the same five taste receptors. Moral matrices vary, but they all must please righteous minds equipped with the same six social receptors.”
  • The questionnaire eventually grew into the website www.YourMorals.org, and it has since gathered over two hundred thousand data points. Here is what they found:
  • This is the crux of the disagreement between liberals and conservatives. As the graph illustrates, liberals value Care and Fairness much more than the other three moral foundations, whereas conservatives endorse all five more or less equally. This shouldn’t sound too surprising: liberals tend to value universal rights and reject the idea of the United States being superior, while conservatives tend to be less concerned about the latest United Nations declaration and more partial to the United States as a superior nation.
  • Haidt began reading political psychology. Karen Stenner’s The Authoritarian Dynamic, “conveyed some key insights about protecting the group that were particularly insightful,” he said. The work of the French sociologist Emile Durkheim was also vital. In contrast to John Stuart Mill, a Durkheimian society, as Haidt explains in an essay for edge.org, “would value self-control over self-expression, duty over rights, and loyalty to one’s groups over concerns for out-groups.”
  • He was motivated to write The Righteous Mind after Kerry lost the 2004 election: “I thought he did a terrible job of making moral appeals so I began thinking about how I could apply moral psychology to understand political divisions. I started studying the politics of culture and realized how liberals and conservatives lived in their own closed worlds.” Each of these worlds, as Haidt explains in the book, “provides a complete, unified, and emotionally compelling worldview, easily justified by observable evidence and nearly impregnable to attack by arguments from outsiders.” He describes them as “moral matrices,” and thinks that moral psychology can help him understand them.
  • “When I say that human nature is selfish, I mean that our minds contain a variety of mental mechanisms that make us adept at promoting our own interests, in competition with our peers. When I say that human nature is also groupish, I mean that our minds contain a variety of mental mechanisms that make us adept at promoting our group’s interests, in competition with other groups. We are not saints, but we are sometimes good team players.” This is what people who had studied morality had not realized, “that we evolved not just so I can treat you well or compete with you, but at the same time we can compete with them.”
  • On the one hand, Haidt reminds us that we are all trapped in a moral matrix where our “elephants” look only for what confirms their moral intuitions while our “riders” play the role of the lawyer; we team up with people who share similar matrices and become closed-minded; and we forget that morality is diverse. But on the other hand, Haidt is offering us a choice: take the blue pill and remain happily delusional about your worldview, or take the red pill and, as he said in his 2008 TED talk, “learn some moral psychology and step outside your moral matrix.”
  • The great Asian religions, Haidt reminded the crowd at TED, swallowed their pride and took the red pill millennia ago. And by stepping out of their moral matrices, they realized that societies flourish when they value all of the moral foundations to some degree. This is why Yin and Yang aren’t enemies: “they are both necessary, like night and day, for the functioning of the world.” Or, similarly, why two of the high Gods in Hinduism, Vishnu the preserver (who stands for conservative principles) and Shiva the destroyer (who stands for liberal principles), work together.
Javier E

Occupy Language? - NYTimes.com - 0 views

  • It has already succeeded in shifting the terms of the debate, taking phrases like “debt-ceiling” and “budget crisis” out of the limelight and putting terms like “inequality” and “greed” squarely in the center. This discursive shift has made it more difficult for Washington to obscure the spurious reasons for the financial meltdown and the unequal outcomes it has exposed
  • In early September, “occupy” signaled on-going military incursions. Now it signifies progressive political protest. It’s no longer primarily about force of military power; instead it signifies standing up to injustice, inequality and abuse of power. It’s no longer about simply occupying a space; it’s about transforming that space.
  • This is a far cry from some of its earlier meanings. In fact, The Oxford English Dictionary tells us that “occupy” once meant “to have sexual intercourse with.”
  • Occupy Language might also support the campaign to stop the media from using the word “illegal” to refer to “undocumented” immigrants. From the campaign’s perspective, only inanimate objects and actions are labeled illegal in English; therefore the use of “illegals” to refer to human beings is dehumanizing.
  • the F.B.I.’s annual Hate Crime Statistics show that Latinos comprised two thirds of the victims of ethnically motivated hate crimes in 2010. When someone is repeatedly described as something, language has quietly paved the way for violent action.
  • By occupying language, we can expose how educational, political, and social institutions use language to further marginalize oppressed groups; resist colonizing language practices that elevate certain languages over others; resist attempts to define people with terms rooted in negative stereotypes; and begin to reshape the public discourse about our communities, and about the central role of language in racism and discrimination.
Javier E

Jennifer Rosoff's death and the Associated Press's sexist reporting of it. - 0 views

  • the minor details that journalists choose to include or exclude from their reporting are among the many subtle ways that oppressive gender norms are perpetuated
  • the fact that totally irrelevant details about Rosoff’s love life and cigarette habit made it into the lede and nut graf of an ostensibly unbiased news article—and that no editor stopped to ask, “Hmm, why is this information here?”—just goes to show how deeply ingrained sexist attitudes can be.
Javier E

Our Biased Brains - NYTimes.com - 0 views

  • The human brain seems to be wired so that it categorizes people by race in the first one-fifth of a second after seeing a face
  • Racial bias also begins astonishingly early: Even infants often show a preference for their own racial group. In one study, 3-month-old white infants were shown photos of faces of white adults and black adults; they preferred the faces of whites. For 3-month-old black infants living in Africa, it was the reverse.
  • in evolutionary times we became hard-wired to make instantaneous judgments about whether someone is in our “in group” or not — because that could be lifesaving. A child who didn’t prefer his or her own group might have been at risk of being clubbed to death.
  • I encourage you to test yourself at implicit.harvard.edu. It’s sobering to discover that whatever you believe intellectually, you’re biased about race, gender, age or disability.
  • unconscious racial bias turns up in children as soon as they have the verbal skills to be tested for it, at about age 4. The degree of unconscious bias then seems pretty constant: In tests, this unconscious bias turns out to be roughly the same for a 4- or 6-year-old as for a senior citizen who grew up in more racially oppressive times.
  • Many of these experiments on in-group bias have been conducted around the world, and almost every ethnic group shows a bias favoring its own. One exception: African-Americans.
  • in contrast to other groups, African-Americans do not have an unconscious bias toward their own. From young children to adults, they are essentially neutral and favor neither whites nor blacks.
  • even if we humans have evolved to have a penchant for racial preferences from a very young age, this is not destiny. We can resist the legacy that evolution has bequeathed us.
  • “We wouldn’t have survived if our ancestors hadn’t developed bodies that store sugar and fat,” Banaji says. “What made them survive is what kills us.” Yet we fight the battle of the bulge and sometimes win — and, likewise, we can resist a predisposition for bias against other groups.
  • Deep friendships, especially romantic relationships with someone of another race, also seem to mute bias
catbclark

Lila Abu-Lughod: Do Muslim Women Need Saving? | TIME.com - 0 views

  • But it has also reduced Muslim women to a stereotyped singularity, plastering a handy cultural icon over much more complicated historical and political dynamics.
  • they had become such symbols of oppression in the West. But we were confusing veiling with a lack of agency. What most of us didn’t know is that 30 years ago the anthropologist Hanna Papanek described the burqa as “portable seclusion” and noted that many women saw it as a liberating invention because it enabled them to move out of segregated living spaces while still observing the requirements of separating and protecting women from unrelated men. People all over the globe, including Americans, wear the appropriate form of dress for their socially shared standards, religious beliefs and moral ideals.
  • Ultimately, saving Muslim women allows us to ignore the complex entanglements in which we are all implicated and creates a polarization that places feminism only on the side of the West.