Home/ TOK Friends/ Group items tagged book review

Javier E

Liu Cixin's War of the Worlds | The New Yorker

  • he briskly dismissed the idea that fiction could serve as commentary on history or on current affairs. “The whole point is to escape the real world!” he said.
  • Chinese tech entrepreneurs discuss the Hobbesian vision of the trilogy as a metaphor for cutthroat competition in the corporate world; other fans include Barack Obama, who met Liu in Beijing two years ago, and Mark Zuckerberg. Liu’s international career has become a source of national pride. In 2015, China’s then Vice-President, Li Yuanchao, invited Liu to Zhongnanhai—an off-limits complex of government accommodation sometimes compared to the Kremlin—to discuss the books and showed Liu his own copies, which were dense with highlights and annotations.
  • In China, one of his stories has been a set text in the gao kao—the notoriously competitive college-entrance exams that determine the fate of ten million pupils annually; another has appeared in the national seventh-grade-curriculum textbook. When a reporter recently challenged Liu to answer the middle-school questions about the “meaning” and the “central themes” of his story, he didn’t get a single one right. “I’m a writer,” he told me, with a shrug.
  • Liu’s tomes—they tend to be tomes—have been translated into more than twenty languages, and the trilogy has sold some eight million copies worldwide. He has won China’s highest honor for science-fiction writing, the Galaxy Award, nine times, and in 2015 he became the first Asian writer to win the Hugo Award, the most prestigious international science-fiction prize
  • “The Three-Body Problem” takes its title from an analytical problem in orbital mechanics which has to do with the unpredictable motion of three bodies under mutual gravitational pull. Reading an article about the problem, Liu thought, What if the three bodies were three suns? How would intelligent life on a planet in such a solar system develop? From there, a structure gradually took shape that almost resembles a planetary system, with characters orbiting the central conceit like moons. For better or worse, the characters exist to support the framework of the story rather than to live as individuals on the page.
  • Concepts that seemed abstract to others took on, for him, concrete forms; they were like things he could touch, inducing a “druglike euphoria.” Compared with ordinary literature, he came to feel, “the stories of science are far more magnificent, grand, involved, profound, thrilling, strange, terrifying, mysterious, and even emotional
  • Pragmatic choices like this one, or like the decision his grandparents made when their sons were conscripted, recur in his fiction—situations that present equally unconscionable choices on either side of a moral fulcrum
  • The great flourishing of science fiction in the West at the end of the nineteenth century occurred alongside unprecedented technological progress and the proliferation of the popular press—transformations that were fundamental to the development of the genre
  • Joel Martinsen, the translator of the second volume of Liu’s trilogy, sees the series as a continuation of this tradition. “It’s not hard to read parallels between the Trisolarans and imperialist designs on China, driven by hunger for resources and fear of being wiped out,” he told me. Even Liu, unwilling as he is to endorse comparisons between the plot and China’s current face-off with the U.S., did at one point let slip that “the relationship between politics and science fiction cannot be underestimated.”
  • Speculative fiction is the art of imagining alternative worlds, and the same political establishment that permits it to be used as propaganda for the existing regime is also likely to recognize its capacity to interrogate the legitimacy of the status quo.
  • Liu has been criticized for peopling his books with characters who seem like cardboard cutouts installed in magnificent dioramas. Liu readily admits to the charge. “I did not begin writing for love of literature,” he told me. “I did so for love of science.”
  • Liu believes that this trend signals a deeper shift in the Chinese mind-set—that technological advances have spurred a new excitement about the possibilities of cosmic exploration.
  • Liu’s imagination is dauntingly capacious, his narratives conceived on a scale that feels, at times, almost hallucinogenic. The time line of the trilogy spans 18,906,450 years, encompassing ancient Egypt, the Qin dynasty, the Byzantine Empire, the Cultural Revolution, the present, and a time eighteen million years in the future
  • The first book is set on Earth, though some of its scenes take place in virtual reality; by the end of the third book, the scope of the action is interstellar and annihilation unfolds across several dimensions. The London Review of Books has called the trilogy “one of the most ambitious works of science fiction ever written.”
  • Although physics furnishes the novels’ premises, it is politics that drives the plots. At every turn, the characters are forced to make brutal calculations in which moral absolutism is pitted against the greater good
  • In Liu’s fictional universe, idealism is fatal and kindness an exorbitant luxury. As one general says in the trilogy, “In a time of war, we can’t afford to be too scrupulous.” Indeed, it is usually when people do not play by the rules of Realpolitik that the most lives are lost.
  • “I know what you are thinking,” he told me with weary clarity. “What about individual liberty and freedom of governance?” He sighed, as if exhausted by a debate going on in his head. “But that’s not what Chinese people care about. For ordinary folks, it’s the cost of health care, real-estate prices, their children’s education. Not democracy.”
  • Liu closed his eyes for a long moment and then said quietly, “This is why I don’t like to talk about subjects like this. The truth is you don’t really—I mean, can’t truly—understand.”
  • Liu explained to me, the existing regime made the most sense for today’s China, because to change it would be to invite chaos. “If China were to transform into a democracy, it would be hell on earth,”
  • It was an opinion entirely consistent with his systems-level view of human societies, just as mine reflected a belief in democracy and individualism as principles to be upheld regardless of outcomes
  • “I cannot escape and leave behind reality, just like I cannot leave behind my shadow. Reality brands each of us with its indelible mark. Every era puts invisible shackles on those who have lived through it, and I can only dance in my chains.
  • Chinese people of his generation were lucky, he said. The changes they had seen were so huge that they now inhabited a world entirely different from that of their childhood. “China is a futuristic country,” he said. “I realized that the world around me became more and more like science fiction, and this process is speeding up.”
  • “We have statues of a few martyrs, but we never—We don’t memorialize those, the individuals.” He took off his glasses and blinked, peering into the wide expanse of green and concrete. “This is how we Chinese have always been,” he said. “When something happens, it passes, and time buries the stories.”
Javier E

Harold Bloom Is Dead. But His 'Rage for Reading' Is Undiminished. - The New York Times

  • It’s a series of meditations on what Bloom believes to be the most important novels we have, and it takes for granted that its readers already know the books under consideration; in other words, that they have already absorbed “the canon,” and are eager to reconsider it later in their lives.
  • A not atypical, almost throwaway passage for you to test the waters on: “Tolstoy, as befits the writer since Shakespeare who most has the art of the actual, combines in his representational praxis the incompatible powers of Homer and the Yahwist.” This is not Bloom showing off; it’s the way Bloom thinks and proceeds.
  • Apart from his novelists, his frame of reference rests on Shakespeare above all others, Homer, Chaucer, Dante, Montaigne, Emerson, Dr. Johnson (the “shrewdest of all literary critics”), Blake, Wordsworth, Whitman (for him, the central American writer of the 19th century), Wallace Stevens, Freud
  • Among the novelists, Cervantes, Tolstoy (supreme), Melville, Austen, Proust, Joyce.
  • He is inevitably at his strongest when dealing with those writers he cares most about. With Jane Austen, for one. And, above all, with Tolstoy:
  • As for Dickens, whose “David Copperfield” was a direct influence on Tolstoy, to Bloom his greatest achievement is “Bleak House”
  • He pairs it with Dickens’s final complete novel, “Our Mutual Friend,” a book I care for so extravagantly that I’ve read it three times
  • The two works in which Bloom is most fully invested are “Moby-Dick” (40 pages) and “Ulysses” (54)
  • He chooses to give room to not one but two of Le Guin’s novels, “The Left Hand of Darkness” and “The Dispossessed,”
pier-paolo

THE CLOSE READER; Powers of Perception - The New York Times

  • Keller's writing jars the contemporary reader in three ways. First, she composes in the grandiose manner favored by the late-19th-century genteel essayist, with lots of quotations and inverted sentences. Second, she gushes with a girlish gratefulness that registers, in our more cynical time, as more ingratiating than genuine
  • Keller violates a cardinal rule of autobiography, which is to distinguish what you have been told from what you know from experience. She narrates, as if she knew them firsthand, events from very early childhood and the first stages of her education -- neither of which she could possibly remember herself, at least not in such detail.
  • When Keller's book came out in 1903, she was criticized by one reviewer for her constant, un-self-conscious allusions to color and music. ''All her knowledge is hearsay knowledge,'' this critic wrote in The Nation, ''her very sensations are for the most part vicarious, and yet she writes of things beyond her powers of perception with the assurance of one who has verified every word.''
  • Maybe Shattuck is right and we are all like this -- creatures of language, rather than its masters. Much of what we think we know firsthand we probably picked up from books or newspapers or friends or lovers and never checked against the world at all.
  • Her ability to experience what others felt and heard, she said, illustrated the power of imagination, particularly one that had been developed and extended, as hers was, by books.
  • What she knew of her own observation is exactly what we want to know from her. We want to know what it felt like to be Helen Keller. We want to locate the boundaries between what was real to her and what she was forced to imagine. At least in this book, she seems not to have known where that boundary might lie.
  • He tries to remember what he looks like and discovers that he cannot. He asks: ''To what extent is loss of the image of the face connected with loss of the image of the self? Is this one of the reasons why I often feel that I am mere spirit, a ghost, a memory?''
  • Keller, in short, matured, both as a person and a writer. She mastered a lesson that relatively few with all their senses have ever mastered, which is to write about what you know.
Javier E

The Folly of Fools - By Robert Trivers - Book Review - NYTimes.com

  • Fooling others yields obvious benefits, but why do we so often fool ourselves? Trivers provides a couple of answers. First, believing that we’re smarter, sexier and more righteous than we really are — or than others consider us to be — can help us seduce and persuade others and even improve our health, via the placebo effect, for example. And the more we believe our own lies, the more sincerely, and hence effectively, we can lie to others.
  • One intriguing theme running through “The Folly of Fools” is that self-deception can affect our susceptibility to disease, for ill or good.
Javier E

Book Review - The Information - By James Gleick - NYTimes.com

  • Information, he argues, is more than just the contents of our overflowing libraries and Web servers. It is “the blood and the fuel, the vital principle” of the world. Human consciousness, society, life on earth, the cosmos — it’s bits all the way down.
  • Shannon’s paper, published the same year as the invention of the transistor, instantaneously created the field of information theory, with broad applications in engineering and computer science.
  • information theory wound up reshaping fields from economics to philosophy, and heralded a dramatic rethinking of biology and physics.
  • molecular biologists were soon speaking of information, not to mention codes, libraries, alphabets and transcription, without any sense of metaphor. In Gleick’s words, “Genes themselves are made of bits.” At the same time, physicists exploring what Einstein had called the “spooky” paradoxes of quantum mechanics began to see information as the substance from which everything else in the universe derives. As the physicist John Archibald Wheeler put it in a paper title, “It From Bit.”
Javier E

Book Review: 'Life Is Hard,' by Kieran Setiya - The New York Times

  • “Life Is Hard” pushes back against many platitudes of contemporary American self-improvement culture. Setiya is no friend to positive thinking — at best, it requires self-deception, and at worst, such glass-half-full optimism can be cruel to those whose pain we refuse to recognize.
  • Another theory Setiya challenges is the idea that happiness should be life’s primary pursuit. Instead, he argues that we should try to live well within our limits, even if this sometimes means acknowledging difficult truths.
  • If you really consider “happiness” in its everyday sense — a feeling of contentment and pleasure — its desirability is complicated; we can certainly be made to feel good by ignoring injustice, wars, climate change or the hardships of aging. But we cannot live meaningfully that way.
  • what does living well mean in practice? To Setiya, it lies in embracing one of the many possible “good-enough lives” instead of aching for a perfect one
  • Setiya’s approach blends empathy with common sense. True, a person who is blind or lacks full movement may not be able to enjoy certain pleasures — at least, in the typical way. And suffering injury can be traumatic. But none of us can fit everything worth doing into one lifetime. Our possibilities and our choices are always limited, and we can live fully within those limits.
  • he invites the reader to join him as he looks at life’s challenges — loneliness, injustice, grief — and in turning them over to examine every angle. Sometimes these twists make it difficult to grasp his ultimate point; in his discussion of the potential extinction of human beings, for instance, Setiya argues movingly that it is hard to find meaning in our actions without the promise of future societies who will enjoy the result
  • The golden thread running through “Life Is Hard” is Setiya’s belief in the value of well-directed attention. Pain, as much as we wish to avoid it, forces us to remember that we are indelibly connected to our bodies.
  • Listening carefully, whether to good friends or to strangers on a bus, can help us feel less lonely. “Close reading” other people, trying as hard as possible to see them in their full humanity, is a small step toward a more just world. By cultivating our sensitivity to ourselves and to others, we escape another destructive modern myth: that we are separate from other people, and that we can live well without caring for them.
  • Mindfulness is also Setiya’s answer to the threat of personal failure. If we can teach ourselves to notice all the splendid, varied incidents of our lives, he claims, we are much less likely to brand ourselves with a single label, winner or loser.
  • He encourages readers to abandon simple narratives about success over the course of a lifetime. I suspect this is why Setiya so often finds his conclusions in poetry, not in philosophy: The experience of suffering leads to messy, counterintuitive truths.
  • “Life Is Hard” is a humane consolation for challenging times. Reading it is like speaking with a thoughtful friend who never tells you to cheer up, but, by offering gentle companionship and a change of perspective, makes you feel better anyway.
Javier E

'Trespassing on Einstein's Lawn,' by Amanda Gefter - NYTimes.com

  • It all began when Warren Gefter, a radiologist “prone to posing Zen-koan-like questions,” asked his 15-year-old daughter, Amanda, over dinner at a Chinese restaurant near their home just outside Philadelphia: “How would you define nothing?”
  • “I think we should figure it out,” he said. And his teenage daughter — sullen, rebellious, wallowing in existential dread — smiled for the first time “in what felt like years.” The project proved to be a gift from a wise, insightful father. It was Warren Gefter’s way of rescuing his child.
  • Tracking down the meaning of nothing — and, by extension, secrets about the origin of the universe and whether observer-independent reality exists — became the defining project of their lives. They spent hours together working on the puzzle, two dark heads bent over their physics books far into the night.
  • She became a science journalist. At first it was a lark, a way to get free press passes to conferences where she and her father could ask questions of the greatest minds in quantum mechanics, string theory and cosmology. But within a short time, as she started getting assignments, journalism became a calling, and an identity.
  • “If observers create reality, where do the observers come from?” But the great man responded in riddles. “The universe is a self-excited circuit,” Wheeler said. “The boundary of a boundary is zero.” The unraveling of these mysteries propels the next 400 or so pages.
  • she has an epiphany — that for something to be real, it must be invariant — she flies home to share it with her father. They discuss her insight over breakfast at a neighborhood haunt, where they make a list on what they will affectionately call “the IHOP napkin.” They list all the possible “ingredients of ultimate reality,” planning to test each item for whether it is “real,” that is whether it is invariant and can exist in the absence of an observer.
  • their readings and interviews reveal that each item in turn is observer-dependent. Space? Observer-dependent, and therefore not real. Gravity, electromagnetism, angular momentum? No, no, and no. In the end, every putative “ingredient of ultimate reality” is eliminated, including one they hadn’t even bothered to put on the list because it seemed weird to: reality itself
  • What remained was an unsettling and essential insight: that “physics isn’t the machinery behind the workings of the world; physics is the machinery behind the illusion that there is a world.”
  • In the proposal, she clarifies how cosmology and quantum mechanics have evolved as scientists come to grips with the fact that things they had taken to be real — quantum particles, space-time, gravity, dimension — turn out to be observer-dependent.
Javier E

At the Existentialist Café: Freedom, Being, and Apricot Cocktails with Jean-P...

  • The phenomenologists’ leading thinker, Edmund Husserl, provided a rallying cry, ‘To the things themselves!’ It meant: don’t waste time on the interpretations that accrue upon things, and especially don’t waste time wondering whether the things are real. Just look at this that’s presenting itself to you, whatever this may be, and describe it as precisely as possible.
  • You might think you have defined me by some label, but you are wrong, for I am always a work in progress. I create myself constantly through action, and this is so fundamental to my human condition that, for Sartre, it is the human condition, from the moment of first consciousness to the moment when death wipes it out. I am my own freedom: no more, no less.
  • Sartre wrote like a novelist — not surprisingly, since he was one. In his novels, short stories and plays as well as in his philosophical treatises, he wrote about the physical sensations of the world and the structures and moods of human life. Above all, he wrote about one big subject: what it meant to be free. Freedom, for him, lay at the heart of all human experience, and this set humans apart from all other kinds of object.
  • Sartre listened to his problem and said simply, ‘You are free, therefore choose — that is to say, invent.’ No signs are vouchsafed in this world, he said. None of the old authorities can relieve you of the burden of freedom. You can weigh up moral or practical considerations as carefully as you like, but ultimately you must take the plunge and do something, and it’s up to you what that something is.
  • Even if the situation is unbearable — perhaps you are facing execution, or sitting in a Gestapo prison, or about to fall off a cliff — you are still free to decide what to make of it in mind and deed. Starting from where you are now, you choose. And in choosing, you also choose who you will be.
  • The war had made people realise that they and their fellow humans were capable of departing entirely from civilised norms; no wonder the idea of a fixed human nature seemed questionable.
  • If this sounds difficult and unnerving, it’s because it is. Sartre does not deny that the need to keep making decisions brings constant anxiety. He heightens this anxiety by pointing out that what you do really matters. You should make your choices as though you were choosing on behalf of the whole of humanity, taking the entire burden of responsibility for how the human race behaves. If you avoid this responsibility by fooling yourself that you are the victim of circumstance or of someone else’s bad advice, you are failing to meet the demands of human life and choosing a fake existence, cut off from your own ‘authenticity’.
  • Along with the terrifying side of this comes a great promise: Sartre’s existentialism implies that it is possible to be authentic and free, as long as you keep up the effort.
  • almost all agreed that it was, as an article in Les nouvelles littéraires phrased it, a ‘sickening mixture of philosophic pretentiousness, equivocal dreams, physiological technicalities, morbid tastes and hesitant eroticism … an introspective embryo that one would take distinct pleasure in crushing’.
  • he offered a philosophy designed for a species that had just scared the hell out of itself, but that finally felt ready to grow up and take responsibility.
  • In this rebellious world, just as with the Parisian bohemians and Dadaists in earlier generations, everything that was dangerous and provocative was good, and everything that was nice or bourgeois was bad.
  • Such interweaving of ideas and life had a long pedigree, although the existentialists gave it a new twist. Stoic and Epicurean thinkers in the classical world had practised philosophy as a means of living well, rather than of seeking knowledge or wisdom for their own sake. By reflecting on life’s vagaries in philosophical ways, they believed they could become more resilient, more able to rise above circumstances, and better equipped to manage grief, fear, anger, disappointment or anxiety.
  • In the tradition they passed on, philosophy is neither a pure intellectual pursuit nor a collection of cheap self-help tricks, but a discipline for flourishing and living a fully human, responsible life.
  • For Kierkegaard, Descartes had things back to front. In his own view, human existence comes first: it is the starting point for everything we do, not the result of a logical deduction. My existence is active: I live it and choose it, and this precedes any statement I can make about myself.
  • Studying our own moral genealogy cannot help us to escape or transcend ourselves. But it can enable us to see our illusions more clearly and lead a more vital, assertive existence.
  • What was needed, he felt, was not high moral or theological ideals, but a deeply critical form of cultural history or ‘genealogy’ that would uncover the reasons why we humans are as we are, and how we came to be that way. For him, all philosophy could even be redefined as a form of psychology, or history.
  • For those oppressed on grounds of race or class, or for those fighting against colonialism, existentialism offered a change of perspective — literally, as Sartre proposed that all situations be judged according to how they appeared in the eyes of those most oppressed, or those whose suffering was greatest.
  • She observed that we need not expect moral philosophers to ‘live by’ their ideas in a simplistic way, as if they were following a set of rules. But we can expect them to show how their ideas are lived in. We should be able to look in through the windows of a philosophy, as it were, and see how people occupy it, how they move about and how they conduct themselves.
  • the existentialists inhabited their historical and personal world, as they inhabited their ideas. This notion of ‘inhabited philosophy’ is one I’ve borrowed from the English philosopher and novelist Iris Murdoch, who wrote the first full-length book on Sartre and was an early adopter of existentialism
  • What is existentialism anyway?
  • An existentialist who is also phenomenological provides no easy rules for dealing with this condition, but instead concentrates on describing lived experience as it presents itself. — By describing experience well, he or she hopes to understand this existence and awaken us to ways of living more authentic lives.
  • Existentialists concern themselves with individual, concrete human existence. — They consider human existence different from the kind of being other things have. Other entities are what they are, but as a human I am whatever I choose to make of myself at every moment. I am free — — and therefore I’m responsible for everything I do, a dizzying fact which causes — an anxiety inseparable from human existence itself.
  • On the other hand, I am only free within situations, which can include factors in my own biology and psychology as well as physical, historical and social variables of the world into which I have been thrown. — Despite the limitations, I always want more: I am passionately involved in personal projects of all kinds. — Human existence is thus ambiguous: at once boxed in by borders and yet transcendent and exhilarating. —
  • The first part of this is straightforward: a phenomenologist’s job is to describe. This is the activity that Husserl kept reminding his students to do. It meant stripping away distractions, habits, clichés of thought, presumptions and received ideas, in order to return our attention to what he called the ‘things themselves’. We must fix our beady gaze on them and capture them exactly as they appear, rather than as we think they are supposed to be.
  • Husserl therefore says that, to phenomenologically describe a cup of coffee, I should set aside both the abstract suppositions and any intrusive emotional associations. Then I can concentrate on the dark, fragrant, rich phenomenon in front of me now. This ‘setting aside’ or ‘bracketing out’ of speculative add-ons Husserl called epoché — a term borrowed from the ancient Sceptics,
  • The point about rigour is crucial; it brings us back to the first half of the command to describe phenomena. A phenomenologist cannot get away with listening to a piece of music and saying, ‘How lovely!’ He or she must ask: is it plaintive? is it dignified? is it colossal and sublime? The point is to keep coming back to the ‘things themselves’ — phenomena stripped of their conceptual baggage — so as to bail out weak or extraneous material and get to the heart of the experience.
  • Husserlian ‘bracketing out’ or epoché allows the phenomenologist to temporarily ignore the question ‘But is it real?’, in order to ask how a person experiences his or her world. Phenomenology gives a formal mode of access to human experience. It lets philosophers talk about life more or less as non-philosophers do, while still being able to tell themselves they are being methodical and rigorous.
  • Besides claiming to transform the way we think about reality, phenomenologists promised to change how we think about ourselves. They believed that we should not try to find out what the human mind is, as if it were some kind of substance. Instead, we should consider what it does, and how it grasps its experiences.
  • For Brentano, this reaching towards objects is what our minds do all the time. Our thoughts are invariably of or about something, he wrote: in love, something is loved, in hatred, something is hated, in judgement, something is affirmed or denied. Even when I imagine an object that isn’t there, my mental structure is still one of ‘about-ness’ or ‘of-ness’.
  • Except in deepest sleep, my mind is always engaged in this aboutness: it has ‘intentionality’. Having taken the germ of this from Brentano, Husserl made it central to his whole philosophy.
  • Husserl saw in the idea of intentionality a way to sidestep two great unsolved puzzles of philosophical history: the question of what objects ‘really’ are, and the question of what the mind ‘really’ is. By doing the epoché and bracketing out all consideration of reality from both topics, one is freed to concentrate on the relationship in the middle. One can apply one’s descriptive energies to the endless dance of intentionality that takes place in our lives: the whirl of our minds as they seize their intended phenomena one after the other and whisk them around the floor,
  • Understood in this way, the mind hardly is anything at all: it is its aboutness. This makes the human mind (and possibly some animal minds) different from any other naturally occurring entity. Nothing else can be as thoroughly about or of things as the mind is:
  • Some Eastern meditation techniques aim to still this scurrying creature, but the extreme difficulty of this shows how unnatural it is to be mentally inert. Left to itself, the mind reaches out in all directions as long as it is awake — and even carries on doing it in the dreaming phase of its sleep.
  • a mind that is experiencing nothing, imagining nothing, or speculating about nothing can hardly be said to be a mind at all.
  • Three simple ideas — description, phenomenon, intentionality — provided enough inspiration to keep roomfuls of Husserlian assistants busy in Freiburg for decades. With all of human existence awaiting their attention, how could they ever run out of things to do?
  • For Sartre, this gives the mind an immense freedom. If we are nothing but what we think about, then no predefined ‘inner nature’ can hold us back. We are protean.
  • Real, not real; inside, outside; what difference did it make? Reflecting on this, Husserl began turning his phenomenology into a branch of ‘idealism’ — the philosophical tradition which denied external reality and defined everything as a kind of private hallucination.
  • For Sartre, if we try to shut ourselves up inside our own minds, ‘in a nice warm room with the shutters closed’, we cease to exist. We have no cosy home: being out on the dusty road is the very definition of what we are.
  • One might think that, if Heidegger had anything worth saying, he could have communicated it in ordinary language. The fact is that he does not want to be ordinary, and he may not even want to communicate in the usual sense. He wants to make the familiar obscure, and to vex us. George Steiner thought that Heidegger’s purpose was less to be understood than to be experienced through a ‘felt strangeness’.
  • He takes Dasein in its most ordinary moments, then talks about it in the most innovative way he can. For Heidegger, Dasein’s everyday Being is right here: it is Being-in-the-world, or In-der-Welt-sein. The main feature of Dasein’s everyday Being-in-the-world right here is that it is usually busy doing something.
  • Thus, for Heidegger, all Being-in-the-world is also a ‘Being-with’ or Mitsein. We cohabit with others in a ‘with-world’, or Mitwelt. The old philosophical problem of how we prove the existence of other minds has now vanished. Dasein swims in the with-world long before it wonders about other minds.
  • Sometimes the best-educated people were those least inclined to take the Nazis seriously, dismissing them as too absurd to last. Karl Jaspers was one of those who made this mistake, as he later recalled, and Beauvoir observed similar dismissive attitudes among the French students in Berlin.
  • In any case, most of those who disagreed with Hitler’s ideology soon learned to keep their view to themselves. If a Nazi parade passed on the street, they would either slip out of view or give the obligatory salute like everyone else, telling themselves that the gesture meant nothing if they did not believe in it. As the psychologist Bruno Bettelheim later wrote of this period, few people will risk their life for such a small thing as raising an arm — yet that is how one’s powers of resistance are eroded away, and eventually one’s responsibility and integrity go with them.
  • for Arendt, if you do not respond adequately when the times demand it, you show a lack of imagination and attention that is as dangerous as deliberately committing an abuse. It amounts to disobeying the one command she had absorbed from Heidegger in those Marburg days: Think!
  • ‘Everything takes place under a kind of anaesthesia. Objectively dreadful events produce a thin, puny emotional response. Murders are committed like schoolboy pranks. Humiliation and moral decay are accepted like minor incidents.’ Haffner thought modernity itself was partly to blame: people had become yoked to their habits and to mass media, forgetting to stop and think, or to disrupt their routines long enough to question what was going on.
  • Heidegger’s former lover and student Hannah Arendt would argue, in her 1951 study The Origins of Totalitarianism, that totalitarian movements thrived at least partly because of this fragmentation in modern lives, which made people more vulnerable to being swept away by demagogues. Elsewhere, she coined the phrase ‘the banality of evil’ to describe the most extreme failures of personal moral awareness.
  • His communicative ideal fed into a whole theory of history: he traced all civilisation to an ‘Axial Period’ in the fifth century BC, during which philosophy and culture exploded simultaneously in Europe, the Middle East and Asia, as though a great bubble of minds had erupted from the earth’s surface. ‘True philosophy needs communion to come into existence,’ he wrote, and added, ‘Uncommunicativeness in a philosopher is virtually a criterion of the untruth of his thinking.’
  • The idea of being called to authenticity became a major theme in later existentialism, the call being interpreted as saying something like ‘Be yourself!’, as opposed to being phony. For Heidegger, the call is more fundamental than that. It is a call to take up a self that you didn’t know you had: to wake up to your Being. Moreover, it is a call to action. It requires you to do something: to take a decision of some sort.
  • Being and Time contained at least one big idea that should have been of use in resisting totalitarianism. Dasein, Heidegger wrote there, tends to fall under the sway of something called das Man or ‘the they’ — an impersonal entity that robs us of the freedom to think for ourselves. To live authentically requires resisting or outwitting this influence, but this is not easy because das Man is so nebulous. Man in German does not mean ‘man’ as in English (that’s der Mann), but a neutral abstraction, something like ‘one’ in the English phrase ‘one doesn’t do that’,
  • for Heidegger, das Man is me. It is everywhere and nowhere; it is nothing definite, but each of us is it. As with Being, it is so ubiquitous that it is difficult to see. If I am not careful, however, das Man takes over the important decisions that should be my own. It drains away my responsibility or ‘answerability’. As Arendt might put it, we slip into banality, failing to think.
  • Jaspers focused on what he called Grenzsituationen — border situations, or limit situations. These are the moments when one finds oneself constrained or boxed in by what is happening, but at the same time pushed by these events towards the limits or outer edge of normal experience. For example, you might have to make a life-or-death choice, or something might remind you suddenly of your mortality,
  • Jaspers’ interest in border situations probably had much to do with his own early confrontation with mortality. From childhood, he had suffered from a heart condition so severe that he always expected to die at any moment. He also had emphysema, which forced him to speak slowly, taking long pauses to catch his breath. Both illnesses meant that he had to budget his energies with care in order to get his work done without endangering his life.
  • If I am to resist das Man, I must become answerable to the call of my ‘voice of conscience’. This call does not come from God, as a traditional Christian definition of the voice of conscience might suppose. It comes from a truly existentialist source: my own authentic self. Alas, this voice is one I do not recognise and may not hear, because it is not the voice of my habitual ‘they-self’. It is an alien or uncanny version of my usual voice. I am familiar with my they-self, but not with my unalienated voice — so, in a weird twist, my real voice is the one that sounds strangest to me.
  • Marcel developed a strongly theological branch of existentialism. His faith distanced him from both Sartre and Heidegger, but he shared a sense of how history makes demands on individuals. In his essay ‘On the Ontological Mystery’, written in 1932 and published in the fateful year of 1933, Marcel wrote of the human tendency to become stuck in habits, received ideas, and a narrow-minded attachment to possessions and familiar scenes. Instead, he urged his readers to develop a capacity for remaining ‘available’ to situations as they arise. Similar ideas of disponibilité or availability had been explored by other writers,
  • Marcel made it his central existential imperative. He was aware of how rare and difficult it was. Most people fall into what he calls ‘crispation’: a tensed, encrusted shape in life — ‘as though each one of us secreted a kind of shell which gradually hardened and imprisoned him’.
  • Bettelheim later observed that, under Nazism, only a few people realised at once that life could not continue unaltered: these were the ones who got away quickly. Bettelheim himself was not among them. Caught in Austria when Hitler annexed it, he was sent first to Dachau and then to Buchenwald, but was then released in a mass amnesty to celebrate Hitler’s birthday in 1939 — an extraordinary reprieve, after which he left at once for America.
  • we are used to reading philosophy as offering a universal message for all times and places — or at least as aiming to do so. But Heidegger disliked the notion of universal truths or universal humanity, which he considered a fantasy. For him, Dasein is not defined by shared faculties of reason and understanding, as the Enlightenment philosophers thought. Still less is it defined by any kind of transcendent eternal soul, as in religious tradition. We do not exist on a higher, eternal plane at all. Dasein’s Being is local: it has a historical situation, and is constituted in time and place.
  • For Marcel, learning to stay open to reality in this way is the philosopher’s prime job. Everyone can do it, but the philosopher is the one who is called on above all to stay awake, so as to be the first to sound the alarm if something seems wrong.
  • Second, it also means understanding that we are historical beings, and grasping the demands our particular historical situation is making on us. In what Heidegger calls ‘anticipatory resoluteness’, Dasein discovers ‘that its uttermost possibility lies in giving itself up’. At that moment, through Being-towards-death and resoluteness in facing up to one’s time, one is freed from the they-self and attains one’s true, authentic self.
  • If we are temporal beings by our very nature, then authentic existence means accepting, first, that we are finite and mortal. We will die: this all-important realisation is what Heidegger calls authentic ‘Being-towards-Death’, and it is fundamental to his philosophy.
  • Hannah Arendt, instead, left early on: she had the benefit of a powerful warning. Just after the Nazi takeover, in spring 1933, she had been arrested while researching materials on anti-Semitism for the German Zionist Organisation at Berlin’s Prussian State Library. Her apartment was searched; both she and her mother were locked up briefly, then released. They fled, without stopping to arrange travel documents. They crossed to Czechoslovakia (then still safe) by a method that sounds almost too fabulous to be true: a sympathetic German family on the border had a house with its front door in Germany and its back door in Czechoslovakia. The family would invite people for dinner, then let them leave through the back door at night.
  • As Sartre argued in his 1943 review of The Stranger, basic phenomenological principles show that experience comes to us already charged with significance. A piano sonata is a melancholy evocation of longing. If I watch a soccer match, I see it as a soccer match, not as a meaningless scene in which a number of people run around taking turns to apply their lower limbs to a spherical object. If the latter is what I’m seeing, then I am not watching some more essential, truer version of soccer; I am failing to watch it properly as soccer at all.
  • Much as they liked Camus personally, neither Sartre nor Beauvoir accepted his vision of absurdity. For them, life is not absurd, even when viewed on a cosmic scale, and nothing can be gained by saying it is. Life for them is full of real meaning, although that meaning emerges differently for each of us.
  • For Sartre, we show bad faith whenever we portray ourselves as passive creations of our race, class, job, history, nation, family, heredity, childhood influences, events, or even hidden drives in our subconscious which we claim are out of our control. It is not that such factors are unimportant: class and race, in particular, he acknowledged as powerful forces in people’s lives, and Simone de Beauvoir would soon add gender to that list.
  • Sartre takes his argument to an extreme point by asserting that even war, imprisonment or the prospect of imminent death cannot take away my existential freedom. They form part of my ‘situation’, and this may be an extreme and intolerable situation, but it still provides only a context for whatever I choose to do next. If I am about to die, I can decide how to face that death. Sartre here resurrects the ancient Stoic idea that I may not choose what happens to me, but I can choose what to make of it, spiritually speaking.
  • But the Stoics cultivated indifference in the face of terrible events, whereas Sartre thought we should remain passionately, even furiously engaged with what happens to us and with what we can achieve. We should not expect freedom to be anything less than fiendishly difficult.
  • Freedom does not mean entirely unconstrained movement, and it certainly does not mean acting randomly. We often mistake the very things that enable us to be free — context, meaning, facticity, situation, a general direction in our lives — for things that define us and take away our freedom. It is only with all of these that we can be free in a real sense.
  • Nor did he mean that privileged groups have the right to pontificate to the poor and downtrodden about the need to ‘take responsibility’ for themselves. That would be a grotesque misreading of Sartre’s point, since his sympathy in any encounter always lay with the more oppressed side. But for each of us — for me — to be in good faith means not making excuses for myself.
  • Camus’ novel gives us a deliberately understated vision of heroism and decisive action compared to those of Sartre and Beauvoir. One can only do so much. It can look like defeatism, but it shows a more realistic perception of what it takes to actually accomplish difficult tasks like liberating one’s country.
  • Camus just kept returning to his core principle: no torture, no killing — at least not with state approval. Beauvoir and Sartre believed they were taking a more subtle and more realistic view. If asked why a couple of innocuous philosophers had suddenly become so harsh, they would have said it was because the war had changed them in profound ways. It had shown them that one’s duties to humanity could be more complicated than they seemed. ‘The war really divided my life in two,’ Sartre said later.
  • Poets and artists ‘let things be’, but they also let things come out and show themselves. They help to ease things into ‘unconcealment’ (Unverborgenheit), which is Heidegger’s rendition of the Greek term alētheia, usually translated as ‘truth’. This is a deeper kind of truth than the mere correspondence of a statement to reality, as when we say ‘The cat is on the mat’ and point to a mat with a cat on it. Long before we can do this, both cat and mat must ‘stand forth out of concealedness’. They must un-hide themselves.
  • Heidegger does not use the word ‘consciousness’ here because — as with his earlier work — he is trying to make us think in a radically different way about ourselves. We are not to think of the mind as an empty cavern, or as a container filled with representations of things. We are not even supposed to think of it as firing off arrows of intentional ‘aboutness’, as in the earlier phenomenology of Brentano. Instead, Heidegger draws us into the depths of his Schwarzwald, and asks us to imagine a gap with sunlight filtering in. We remain in the forest, but we provide a relatively open spot where other beings can bask for a moment. If we did not do this, everything would remain in the thickets, hidden even to itself.
  • The astronomer Carl Sagan began his 1980 television series Cosmos by saying that human beings, though made of the same stuff as the stars, are conscious and are therefore ‘a way for the cosmos to know itself’. Merleau-Ponty similarly quoted his favourite painter Cézanne as saying, ‘The landscape thinks itself in me, and I am its consciousness.’ This is something like what Heidegger thinks humanity contributes to the earth. We are not made of spiritual nothingness; we are part of Being, but we also bring something unique with us. It is not much: a little open space, perhaps with a path and a bench like the one the young Heidegger used to sit on to do his homework. But through us, the miracle occurs.
  • Beauty aside, Heidegger’s late writing can also be troubling, with its increasingly mystical notion of what it is to be human. If one speaks of a human being mainly as an open space or a clearing, or a means of ‘letting beings be’ and dwelling poetically on the earth, then one doesn’t seem to be talking about any recognisable person. The old Dasein has become less human than ever. It is now a forestry feature.
  • Even today, Jaspers, the dedicated communicator, is far less widely read than Heidegger, who has influenced architects, social theorists, critics, psychologists, artists, film-makers, environmental activists, and innumerable students and enthusiasts — including the later deconstructionist and post-structuralist schools, which took their starting point from his late thinking. Having spent the late 1940s as an outsider and then been rehabilitated, Heidegger became the overwhelming presence in university philosophy all over the European continent from then on.
  • As Levinas reflected on this experience, it helped to lead him to a philosophy that was essentially ethical, rather than ontological like Heidegger’s. He developed his ideas from the work of Jewish theologian Martin Buber, whose I and Thou in 1923 had distinguished between my relationship with an impersonal ‘it’ or ‘them’, and the direct personal encounter I have with a ‘you’. Levinas took it further: when I encounter you, we normally meet face-to-face, and it is through your face that you, as another person, can make ethical demands on me. This is very different from Heidegger’s Mitsein or Being-with, which suggests a group of people standing alongside one another, shoulder to shoulder as if in solidarity — perhaps as a unified nation or Volk.
  • For Levinas, we literally face each other, one individual at a time, and that relationship becomes one of communication and moral expectation. We do not merge; we respond to one another. Instead of being co-opted into playing some role in my personal drama of authenticity, you look me in the eyes — and you remain Other. You remain you.
  • This relationship is more fundamental than the self, more fundamental than consciousness, more fundamental even than Being — and it brings an unavoidable ethical obligation. Ever since Husserl, phenomenologists and existentialists had been trying to stretch the definition of existence to incorporate our social lives and relationships. Levinas did more: he turned philosophy around entirely so that these relationships were the foundation of our existence, not an extension of it.
  • Her last work, The Need for Roots, argues, among other things, that none of us has rights, but each one of us has a near-infinite degree of duty and obligation to the other. Whatever the underlying cause of her death — and anorexia nervosa seems to have been involved — no one could deny that she lived out her philosophy with total commitment. Of all the lives touched on in this book, hers is surely the most profound and challenging application of Iris Murdoch’s notion that a philosophy can be ‘inhabited’.
  • Other thinkers took radical ethical turns during the war years. The most extreme was Simone Weil, who actually tried to live by the principle of putting other people’s ethical demands first. Having returned to France after her travels through Germany in 1932, she had worked in a factory so as to experience the degrading nature of such work for herself. When France fell in 1940, her family fled to Marseilles (against her protests), and later to the US and to Britain. Even in exile, Weil made extraordinary sacrifices. If there were people in the world who could not sleep in a bed, she would not do so either, so she slept on the floor.
  • The mystery tradition had roots in Kierkegaard’s ‘leap of faith’. It owed much to the other great nineteenth-century mystic of the impossible, Dostoevsky, and to older theological notions. But it also grew from the protracted trauma that was the first half of the twentieth century. Since 1914, and especially since 1939, people in Europe and elsewhere had come to the realisation that we cannot fully know or trust ourselves; that we have no excuses or explanations for what we do — and yet that we must ground our existence and relationships on something firm, because otherwise we cannot survive.
  • One striking link between these radical ethical thinkers, all on the fringes of our main story, is that they had religious faith. They also granted a special role to the notion of ‘mystery’ — that which cannot be known, calculated or understood, especially when it concerns our relationships with each other. Heidegger was different from them, since he rejected the religion he grew up with and had no real interest in ethics — probably as a consequence of his having no real interest in the human.
  • Meanwhile, the Christian existentialist Gabriel Marcel was also still arguing, as he had since the 1930s, that ethics trumps everything else in philosophy and that our duty to each other is so great as to play the role of a transcendent ‘mystery’. He too had been led to this position partly by a wartime experience: during the First World War he had worked for the Red Cross’ Information Service, with the unenviable job of answering relatives’ inquiries about missing soldiers. Whenever news came, he passed it on, and usually it was not good. As Marcel later said, this task permanently inoculated him against warmongering rhetoric of any kind, and it made him aware of the power of what is unknown in our lives.
  • As the play’s much-quoted and frequently misunderstood final line has it: ‘Hell is other people.’ Sartre later explained that he did not mean to say that other people were hellish in general. He meant that after death we become frozen in their view, unable any longer to fend off their interpretation. In life, we can still do something to manage the impression we make; in death, this freedom goes and we are left entombed in other people’s memories and perceptions.
  • We have to do two near-impossible things at once: understand ourselves as limited by circumstances, and yet continue to pursue our projects as though we are truly in control. In Beauvoir’s view, existentialism is the philosophy that best enables us to do this, because it concerns itself so deeply with both freedom and contingency. It acknowledges the radical and terrifying scope of our freedom in life, but also the concrete influences that other philosophies tend to ignore: history, the body, social relationships and the environment.
  • The aspects of our existence that limit us, Merleau-Ponty says, are the very same ones that bind us to the world and give us scope for action and perception. They make us what we are. Sartre acknowledged the need for this trade-off, but he found it more painful to accept. Everything in him longed to be free of bonds, of impediments and limitations
  • Of course we have to learn this skill of interpreting and anticipating the world, and this happens in early childhood, which is why Merleau-Ponty thought child psychology was essential to philosophy. This is an extraordinary insight. Apart from Rousseau, very few philosophers before him had taken childhood seriously; most wrote as though all human experience were that of a fully conscious, rational, verbal adult who has been dropped into this world from the sky — perhaps by a stork.
  • For Merleau-Ponty, we cannot understand our experience if we don’t think of ourselves in part as overgrown babies. We fall for optical illusions because we once learned to see the world in terms of shapes, objects and things relevant to our own interests. Our first perceptions came to us in tandem with our first active experiments in observing the world and reaching out to explore it, and are still linked with those experiences.
  • Another factor in all of this, for Merleau-Ponty, is our social existence: we cannot thrive without others, or not for long, and we need this especially in early life. This makes solipsistic speculation about the reality of others ridiculous; we could never engage in such speculation if we hadn’t already been formed by them.
  • As Descartes could have said (but didn’t), ‘I think, therefore other people exist.’ We grow up with people playing with us, pointing things out, talking, listening, and getting us used to reading emotions and movements; this is how we become capable, reflective, smoothly integrated beings.
  • In general, Merleau-Ponty thinks human experience only makes sense if we abandon philosophy’s time-honoured habit of starting with a solitary, capsule-like, immobile adult self, isolated from its body and world, which must then be connected up again — adding each element around it as though adding clothing to a doll. Instead, for him, we slide from the womb to the birth canal to an equally close and total immersion in the world. That immersion continues as long as we live, although we may also cultivate the art of partially withdrawing from time to time when we want to think or daydream.
  • When he looks for his own metaphor to describe how he sees consciousness, he comes up with a beautiful one: consciousness, he suggests, is like a ‘fold’ in the world, as though someone had crumpled a piece of cloth to make a little nest or hollow. It stays for a while, before eventually being unfolded and smoothed away. There is something seductive, even erotic, in this idea of my conscious self as an improvised pouch in the cloth of the world. I still have my privacy — my withdrawing room. But I am part of the world’s fabric, and I remain formed out of it for as long as I am here.
  • By the time of these works, Merleau-Ponty is taking his desire to describe experience to the outer limits of what language can convey. Just as with the late Husserl or Heidegger, or Sartre in his Flaubert book, we see a philosopher venturing so far from shore that we can barely follow. Emmanuel Levinas would head out to the fringes too, eventually becoming incomprehensible to all but his most patient initiates.
  • Sartre once remarked — speaking of a disagreement they had about Husserl in 1941 — that ‘we discovered, astounded, that our conflicts had, at times, stemmed from our childhood, or went back to the elementary differences of our two organisms’. Merleau-Ponty also said in an interview that Sartre’s work seemed strange to him, not because of philosophical differences, but because of a certain ‘register of feeling’, especially in Nausea, that he could not share. Their difference was one of temperament and of the whole way the world presented itself to them.
  • The two also differed in their purpose. When Sartre writes about the body or other aspects of experience, he generally does it in order to make a different point. He expertly evokes the grace of his café waiter, gliding between the tables, bending at an angle just so, steering the drink-laden tray through the air on the tips of his fingers — but he does it all in order to illustrate his ideas about bad faith. When Merleau-Ponty writes about skilled and graceful movement, the movement itself is his point. This is the thing he wants to understand.
  • We can never move definitively from ignorance to certainty, for the thread of the inquiry will constantly lead us back to ignorance again. This is the most attractive description of philosophy I’ve ever read, and the best argument for why it is worth doing, even (or especially) when it takes us no distance at all from our starting point.
  • By prioritising perception, the body, social life and childhood development, Merleau-Ponty gathered up philosophy’s far-flung outsider subjects and brought them in to occupy the centre of his thought.
  • In his inaugural lecture at the Collège de France on 15 January 1953, published as In Praise of Philosophy, he said that philosophers should concern themselves above all with whatever is ambiguous in our experience. At the same time, they should think clearly about these ambiguities, using reason and science. Thus, he said, ‘The philosopher is marked by the distinguishing trait that he possesses inseparably the taste for evidence and the feeling for ambiguity.’ A constant movement is required between these two
  • As Sartre wrote in response to Hiroshima, humanity had now gained the power to wipe itself out, and must decide every single day that it wanted to live. Camus also wrote that humanity faced the task of choosing between collective suicide and a more intelligent use of its technology — ‘between hell and reason’. After 1945, there seemed little reason to trust in humanity’s ability to choose well.
  • Merleau-Ponty observed in a lecture of 1951 that, more than any previous century, the twentieth century had reminded people how ‘contingent’ their lives were — how at the mercy of historical events and other changes that they could not control. This feeling went on long after the war ended. After the A-bombs were dropped on Hiroshima and Nagasaki, many feared that a Third World War would not be long in coming, this time between the Soviet Union and the United States.
clairemann

Robinhood app makes Wall Street feel like a game to win - instead of a place ... - 0 views

  • Wall Street has long been likened to a casino. Robinhood, an investment app that just filed plans for an initial public offering, makes the comparison more apt than ever.
  • Similarly, Robinhood’s slick and easy-to-use app resembles a thrill-inducing video game rather than a sober investment tool
  • Using gamelike features to influence real-life actions can be beneficial, such as when a health app uses rewards and rankings to encourage people to move more or eat healthier food. But there’s a dark side too, and so-called gamification can lead people to forget the real-world consequences of their decisions.
  • ...9 more annotations...
  • sometimes with disastrous consequences, such as last year when a Robinhood user died by suicide after mistakenly believing that he’d lost US$750,000.
  • The reason games are so captivating is that they challenge the mind to learn new things and are generally safe spaces to face and overcome failure.
  • Games also mimic rites of passage similar to religious rituals and draw players into highly focused “flow states” that dramatically alter self-awareness. This sensory blend of flow and mastery is what makes games fun and sometimes addicting: “Just one more turn” thinking can last for hours, and players forget to eat and sleep. Players who barely remember yesterday’s breakfast recall visceral details from games played decades ago.
  • The psychological impact of game play can also be harnessed for profit.
  • For example, many free-to-play video games such as Angry Birds 2 and Fortnite give players the option to spend real money on in-game items such as new and even angrier birds or character skins.
  • This “free-to-play” model is so profitable that it’s grown increasingly popular with video game designers and publishers.
  • Gamification, however, goes one step further and uses gaming elements to influence real-world behavior.
  • Common elements include badges, points, rankings and progress bars that visually encourage players to achieve goals.
  • Many readers likely have experienced this type of gamification to improve personal fitness, get better grades, build savings accounts and even solve major scientific problems. Some initiatives also include offering rewards that can be cashed in for participating in actual civic projects, such as volunteering in a park, commenting on a piece of legislation or visiting a government website.
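The gamification elements the article lists — points for real-world actions, badges at milestones, a progress bar toward a goal — can be sketched in a few lines. This is a hypothetical illustration only (the class name, badge thresholds, and point values are invented, not taken from Robinhood or any real app):

```python
# Minimal sketch of common gamification mechanics: points, badges, progress bars.
# All names and thresholds here are hypothetical, for illustration only.

class GamifiedTracker:
    """Awards points for real-world actions and badges at point thresholds."""

    BADGES = {10: "Bronze", 50: "Silver", 100: "Gold"}  # hypothetical thresholds

    def __init__(self, goal=100):
        self.points = 0
        self.goal = goal
        self.badges = []

    def record_action(self, points):
        """Add points for a completed action and award any newly earned badges."""
        self.points += points
        for threshold, badge in self.BADGES.items():
            if self.points >= threshold and badge not in self.badges:
                self.badges.append(badge)

    def progress_bar(self, width=20):
        """Render progress toward the goal as a text bar."""
        filled = min(width, self.points * width // self.goal)
        return "[" + "#" * filled + "-" * (width - filled) + f"] {self.points}/{self.goal}"


tracker = GamifiedTracker()
tracker.record_action(30)  # e.g. a workout logged in a fitness app
tracker.record_action(25)  # another logged action
print(tracker.badges)        # ['Bronze', 'Silver']
print(tracker.progress_bar())
```

The same feedback loop — visible reward immediately after an action — is what the article argues can obscure real-world stakes when the "action" is a stock trade rather than a workout.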
anonymous

Excellence Runs in the Family. Her Novel's Heroine Wants Something Else. - The New York... - 0 views

  • Excellence Runs in the Family. Her Novel’s Heroine Wants Something Else
  • Kaitlyn Greenidge and her sisters achieved success in their respective fields
  • In her historical novel, “Libertie,” she focuses on a Black woman who doesn’t yearn to be the first or only one of anything.
  • ...23 more annotations...
  • Kaitlyn Greenidge learned about the first Black woman to become a doctor in New York. “I filed it away and thought, if I ever got a chance to write a novel, I would want it to be about this,” she said.
  • Libertie, the rebellious heroine of Kaitlyn Greenidge’s new novel, comes from an extraordinary family, but longs to be ordinary.
  • As a young Black woman growing up in Reconstruction-era Brooklyn, Libertie is expected to follow in the footsteps of her trailblazing mother, a doctor who founded a women’s clinic.
  • “So much of Black history is focused on exceptional people,”
  • I wanted to explore is, what’s the emotional and psychological toll of being an exception, of being exceptional, and also, what about the people who just want to have a regular life and find freedom and achievement in being able to live in peace with their family — which is what Libertie wants?”
  • “If you come from a marginalized community, one of the ways you are marginalized is people telling you that you don’t have any history, or that your history is somehow diminished, or it’s very flat, or it’s not somehow as rich as the dominant history.”
  • “That idea of being the first and the only was a big piece of our experience,”
  • They are engaged in ongoing conversations about their writing, though they draw the line at reading and editing drafts of one another’s work.
  • Libertie
  • The novel has drawn praise from writers like Jacqueline Woodson, Mira Jacob and Garth Greenwell, who wrote in a blurb that Greenidge “adds an indelible new sound to American literature, and confirms her status as one of our most gifted young writers.”
  • raised by a single mother who struggled to support the family on her social worker’s salary,
  • “I’ve always been interested in the histories of things that are lesser known,”
  • “There’s a really powerful lyricism that feels new in this voice,”
  • Greenidge and her sisters developed a reverence for storytelling and history early on, when their parents and grandparents would tell stories about their ancestors and what life was like during the civil rights movement.
  • “That fracture was really formative for me,” she said. “It made me hyper aware of inequality and the doublespeak that goes on in America around the American dream and American exceptionalism, because that was proven to me not to be true.”
  • Greenidge was collecting stories from people whose ancestors had lived there, and tracked down a woman named Ellen Holly, who was the first Black actress to have a lead, recurring role on daytime TV, in “One Life to Live.”
  • Greenidge filed the family’s saga away in her mind, thinking she had the premise for a novel. When she got a writing fellowship, she was able to quit her side jobs and immerse herself in the research the novel required.
  • The resulting story feels both epic and intimate. As she reimagined the lives of the doctor and her daughter, Greenidge wove in other historical figures and events.
  • In one horrific scene, Libertie and her mother tend to Black families who fled Manhattan during the New York City draft riots.
  • Greenidge also drew on her own family history, and her experience of being a new mother.
  • Her daughter, Mavis, was born days after she finished a second draft of the book, and is now 18 months old. She finished revisions while living in a multigenerational household with her own mother and sisters.
  • “Mother-daughter relationships are like the central relationships in my life,”
  • “I cannot think of a greater freedom than raising you,”
caelengrubb

Copernicus, Galileo, and the Church: Science in a Religious World - Inquiries Journal - 0 views

  • During most of the 16th and 17th centuries, fear of heretics spreading teachings and opinions that contradicted the Bible dominated the Catholic Church
  • A type of war between science and religion was in play but there would be more casualties on the side of science.
  • Nicholas Copernicus and Galileo Galilei were two scientists who printed books that later became banned
  • ...8 more annotations...
  • Copernicus faced no persecution when he was alive because he died shortly after publishing his book. Galileo, on the other hand, was tried by the Inquisition after his book was published
  • As the contents of the Bible were taken literally, the publishing of these books proved, to the Church, that Copernicus and Galileo were sinners; they preached, through their writing, that the Bible was wrong.
  • By writing in this fashion, Copernicus would have been able to deny that he himself believed in heliocentrism because he phrased it as nothing more than a hypothesis and as a result, would be able to slip past the Church's dislike of heliocentrism
  • After his death, the Church was heavily involved in the Council of Trent during the years 1545 to 1563, and other matters. Thus, Revolutions escaped prohibition for many years and eventually influenced Galileo Galilei, who read it and wrote on the subject himself
  • In 1616, Galileo was issued an injunction not to “hold, defend, or teach” heliocentrism
  • The Master of the Sacred Palace ordered Galileo to have someone the Master chose review the manuscript to ensure it was fit for publishing.
  • Also, the title with the sea in it might have made the Church feel threatened that Galileo was supporting heliocentrism, which would have resulted in Galileo being charged with heresy.
  • With that decision, it was determined that Galileo would be tried by the Inquisition. The Inquisition did not need to decide if Galileo was innocent or guilty, they already knew he was guilty. The Inquisition wanted to determine what Galileo's intentions were. Galileo tried to delay going to Rome for the trial, most likely due to the Inquisition's infamous methods.
Javier E

There Is More to Us Than Just Our Brains - The New York Times - 0 views

  • we are less like data processing machines and more like soft-bodied mollusks, picking up cues from within and without and transforming ourselves accordingly.
  • Still, we “insist that the brain is the sole locus of thinking, a cordoned-off space where cognition happens, much as the workings of my laptop are sealed inside its aluminum case,”
  • We get constant messages about what’s going on inside our bodies, sensations we can either attend to or ignore. And we belong to tribes that cosset and guide us
  • ...14 more annotations...
  • we’re networked organisms who move around in shifting surroundings, environments that have the power to transform our thinking
  • Annie Murphy Paul’s new book, “The Extended Mind,” which exhorts us to use our entire bodies, our surroundings and our relationships to “think outside the brain.”
  • In 2011, she published “Origins,” which focused on all the ways we are shaped by the environment, before birth and minute to minute thereafter.
  • “In the nature-nurture dynamic, nurture begins at the time of conception. The food the mother eats, the air she breathes, the water she drinks, the stress or trauma she experiences — all may affect her child for better or worse, over the decades to come.”
  • a down-to-earth take on the science of epigenetics — how environmental signals become catalysts for gene expression
  • the parallel to this latest book is that the boundaries we commonly assume to be fixed are actually squishy. The moment of a child’s birth, her I.Q. scores or fMRI snapshots of what’s going on inside her brain — all are encroached upon and influenced by outside forces.
  • awareness of our internal signals, such as exactly when our hearts beat, or how cold and clammy our hands are, can boost our performance at the poker table or in the financial markets, and even improve our pillow talk
  • “Though we typically think of the brain as telling the body what to do, just as much does the body guide the brain with an array of subtle nudges and prods. One psychologist has called this guide our ‘somatic rudder,’
  • The “body scan” aspect of mindfulness meditation that has been deployed by the behavioral medicine pioneer Jon Kabat-Zinn may help people lower their heart rates and blood pressure,
  • techniques that help us pinpoint their signals can foster well-being
  • Tania Singer has shown how the neural circuitry underlying compassion is strengthened by meditation practice
  • our thoughts “are powerfully shaped by the way we move our bodies.” Gestures help us understand spatial concepts; indeed, “without gesture as an aid, students may fail to understand spatial ideas at all,”
  • looking out on grassy expanses near loose clumps of trees and a source of water helps us solve problems. “Passive attention,” she writes, is “effortless: diffuse and unfocused, it floats from object to object, topic to topic. This is the kind of attention evoked by nature, with its murmuring sounds and fluid motions; psychologists working in the tradition of James call this state of mind ‘soft fascination.’”
  • The chapters on the ways natural and built spaces reflect universal preferences and enhance the thinking process felt like a respite
Javier E

Among the Disrupted - The New York Times - 0 views

  • even as technologism, which is not the same as technology, asserts itself over more and more precincts of human life, so too does scientism, which is not the same as science.
  • The notion that the nonmaterial dimensions of life must be explained in terms of the material dimensions, and that nonscientific understandings must be translated into scientific understandings if they are to qualify as knowledge, is increasingly popular inside and outside the university,
  • So, too, does the view that the strongest defense of the humanities lies not in the appeal to their utility — that literature majors may find good jobs, that theaters may economically revitalize neighborhoods
  • ...27 more annotations...
  • The contrary insistence that the glories of art and thought are not evolutionary adaptations, or that the mind is not the brain, or that love is not just biology’s bait for sex, now amounts to a kind of heresy.
  • Greif’s book is a prehistory of our predicament, of our own “crisis of man.” (The “man” is archaic, the “crisis” is not.) It recognizes that the intellectual history of modernity may be written in part as the epic tale of a series of rebellions against humanism
  • We are not becoming transhumanists, obviously. We are too singular for the Singularity. But are we becoming posthumanists?
  • In American culture right now, as I say, the worldview that is ascendant may be described as posthumanism.
  • The posthumanism of the 1970s and 1980s was more insular, an academic affair of “theory,” an insurgency of professors; our posthumanism is a way of life, a social fate.
  • In “The Age of the Crisis of Man: Thought and Fiction in America, 1933-1973,” the gifted essayist Mark Greif, who reveals himself to be also a skillful historian of ideas, charts the history of the 20th-century reckonings with the definition of “man.
  • Here is his conclusion: “Anytime your inquiries lead you to say, ‘At this moment we must ask and decide who we fundamentally are, our solution and salvation must lie in a new picture of ourselves and humanity, this is our profound responsibility and a new opportunity’ — just stop.” Greif seems not to realize that his own book is a lasting monument to precisely such inquiry, and to its grandeur
  • “Answer, rather, the practical matters,” he counsels, in accordance with the current pragmatist orthodoxy. “Find the immediate actions necessary to achieve an aim.” But before an aim is achieved, should it not be justified? And the activity of justification may require a “picture of ourselves.” Don’t just stop. Think harder. Get it right.
  • — but rather in the appeal to their defiantly nonutilitarian character, so that individuals can know more than how things work, and develop their powers of discernment and judgment, their competence in matters of truth and goodness and beauty, to equip themselves adequately for the choices and the crucibles of private and public life.
  • Who has not felt superior to humanism? It is the cheapest target of all: Humanism is sentimental, flabby, bourgeois, hypocritical, complacent, middlebrow, liberal, sanctimonious, constricting and often an alibi for power
  • what is humanism? For a start, humanism is not the antithesis of religion, as Pope Francis is exquisitely demonstrating
  • The worldview takes many forms: a philosophical claim about the centrality of humankind to the universe, and about the irreducibility of the human difference to any aspect of our animality
  • Here is a humanist proposition for the age of Google: The processing of information is not the highest aim to which the human spirit can aspire, and neither is competitiveness in a global economy. The character of our society cannot be determined by engineers.
  • And posthumanism? It elects to understand the world in terms of impersonal forces and structures, and to deny the importance, and even the legitimacy, of human agency.
  • There have been humane posthumanists and there have been inhumane humanists. But the inhumanity of humanists may be refuted on the basis of their own worldview
  • the condemnation of cruelty toward “man the machine,” to borrow the old but enduring notion of an 18th-century French materialist, requires the importation of another framework of judgment. The same is true about universalism, which every critic of humanism has arraigned for its failure to live up to the promise of a perfect inclusiveness
  • there has never been a universalism that did not exclude. Yet the same is plainly the case about every particularism, which is nothing but a doctrine of exclusion; and the correction of particularism, the extension of its concept and its care, cannot be accomplished in its own name. It requires an idea from outside, an idea external to itself, a universalistic idea, a humanistic idea.
  • Asking universalism to keep faith with its own principles is a perennial activity of moral life. Asking particularism to keep faith with its own principles is asking for trouble.
  • there is no more urgent task for American intellectuals and writers than to think critically about the salience, even the tyranny, of technology in individual and collective life
  • a methodological claim about the most illuminating way to explain history and human affairs, and about the essential inability of the natural sciences to offer a satisfactory explanation; a moral claim about the priority, and the universal nature, of certain values, not least tolerance and compassion
  • “Our very mastery seems to escape our mastery,” Michel Serres has anxiously remarked. “How can we dominate our domination; how can we master our own mastery?”
  • universal accessibility is not the end of the story, it is the beginning. The humanistic methods that were practiced before digitalization will be even more urgent after digitalization, because we will need help in navigating the unprecedented welter
  • Searches for keywords will not provide contexts for keywords. Patterns that are revealed by searches will not identify their own causes and reasons
  • The new order will not relieve us of the old burdens, and the old pleasures, of erudition and interpretation.
  • Is all this — is humanism — sentimental? But sentimentality is not always a counterfeit emotion. Sometimes sentiment is warranted by reality.
  • The persistence of humanism through the centuries, in the face of formidable intellectual and social obstacles, has been owed to the truth of its representations of our complexly beating hearts, and to the guidance that it has offered, in its variegated and conflicting versions, for a soulful and sensitive existence
  • a complacent humanist is a humanist who has not read his books closely, since they teach disquiet and difficulty. In a society rife with theories and practices that flatten and shrink and chill the human subject, the humanist is the dissenter.
Javier E

Eric A. Posner Reviews Jim Manzi's "Uncontrolled" | The New Republic - 0 views

  • Most urgent questions of public policy turn on empirical imponderables, and so policymakers fall back on ideological predispositions or muddle through. Is there a better way?
  • The gold standard for empirical research is the randomized field trial (RFT).
  • The RFT works better than most other types of empirical investigation. Most of us use anecdotes or common sense empiricism to make inferences about the future, but psychological biases interfere with the reliability of these methods
  • ...15 more annotations...
  • Serious empiricists frequently use regression analysis.
  • Regression analysis is inferior to RFT because of the difficulty of ruling out confounding factors (for example, that a gene jointly causes baldness and a preference for tight hats) and of establishing causation
  • RFT has its limitations as well. It is enormously expensive because you must (usually) pay a large number of people to participate in an experiment, though one can obtain a discount if one uses prisoners, especially those in a developing country. In addition, one cannot always generalize from RFTs.
  • academic research proceeds in fits and starts, using RFT when it can, but otherwise relying on regression analysis and similar tools, including qualitative case studies,
  • businesses also use RFT whenever they can. A business such as Wal-Mart, with thousands of stores, might try out some innovation like a new display in a random selection of stores, using the remaining stores as a control group
  • Manzi argues that the RFT—or more precisely, the overall approach to empirical investigation that the RFT exemplifies—provides a way of thinking about public policy. Thi
  • the universe is shaky even where, as in the case of physics, “hard science” plays the dominant role. The scientific method cannot establish truths; it can only falsify hypotheses. The hypotheses come from our daily experience, so even when science prunes away intuitions that fail the experimental method, we can never be sure that the theories that remain standing reflect the truth or just haven’t been subject to the right experiment. And even within its domain, the experimental method is not foolproof. When an experiment contradicts received wisdom, it is an open question whether the wisdom is wrong or the experiment was improperly performed.
  • The book is less interested in the RFT than in the limits of empirical knowledge. Given these limits, what attitude should we take toward government?
  • Much of scientific knowledge turns out to depend on norms of scientific behavior, good faith, convention, and other phenomena that in other contexts tend to provide an unreliable basis for knowledge.
  • Under this view of the world, one might be attracted to the cautious conservatism associated with Edmund Burke, the view that we should seek knowledge in traditional norms and customs, which have stood the test of time and presumably some sort of Darwinian competition—a human being is foolish, the species is wise. There are hints of this worldview in Manzi’s book, though he does not explicitly endorse it. He argues, for example, that we should approach social problems with a bias for the status quo; those who seek to change it carry the burden of persuasion. Once a problem is identified, we should try out our ideas on a small scale before implementing them across society
  • Pursuing the theme of federalism, Manzi argues that the federal government should institutionalize policy waivers, so states can opt out from national programs and pursue their own initiatives. A state should be allowed to opt out of federal penalties for drug crimes, for example.
  • It is one thing to say, as he does, that federalism is useful because we can learn as states experiment with different policies. But Manzi takes away much of the force of this observation when he observes, as he must, that the scale of many of our most urgent problems—security, the economy—is at the national level, so policymaking in response to these problems cannot be left to the states. He also worries about social cohesion, which must be maintained at a national level even while states busily experiment. Presumably, this implies national policy of some sort
  • Manzi’s commitment to federalism and his technocratic approach to policy, which relies so heavily on RFT, sit uneasily together. The RFT is a form of planning: the experimenter must design the RFT and then execute it by recruiting subjects, paying them, and measuring and controlling their behavior. By contrast, experimentation by states is not controlled: the critical element of the RFT—randomization—is absent.
  • The right way to go would be for the national government to conduct experiments by implementing policies in different states (or counties or other local units) by randomizing—that is, by ordering some states to be “treatment” states and other states to be “control” states,
  • Manzi’s reasoning reflects the top-down approach to social policy that he is otherwise skeptical of—although, to be sure, he is willing to subject his proposals to RFTs.
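The store-level experiment described in the annotations above (a new display rolled out to a random subset of stores, with the remaining stores as a control group) can be sketched as a simulation. The sales figures, lift, and noise levels below are invented purely for illustration; the point is that randomization, not regression adjustment, is what rules out confounders and licenses a causal reading of the treatment/control difference.

```python
import random

random.seed(42)

def run_store_trial(n_stores=1000, true_lift=5.0, noise=20.0):
    """Simulate a randomized field trial: half the stores get the new
    display (treatment), half do not (control). Returns the estimated
    treatment effect on weekly sales."""
    stores = list(range(n_stores))
    random.shuffle(stores)              # randomization balances confounders
    treated = set(stores[: n_stores // 2])

    def weekly_sales(store):
        base = 100.0 + random.gauss(0, noise)   # store-to-store variation
        return base + (true_lift if store in treated else 0.0)

    sales = {s: weekly_sales(s) for s in stores}
    t_mean = sum(sales[s] for s in treated) / len(treated)
    c_mean = sum(v for s, v in sales.items()
                 if s not in treated) / (n_stores - len(treated))
    return t_mean - c_mean              # difference in means = effect estimate

print(round(run_store_trial(), 2))      # should land near the true lift of 5.0
```

With enough stores per arm, the difference in means recovers the true lift to within sampling error, which is exactly the property that makes the RFT the gold standard the review describes.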
Javier E

Movie Review: Inside Job - Barron's - 0 views

  • On the outsize role of the GSEs and other federal agencies in high-risk mortgages, figures compiled by former Fannie Mae Chief Credit Officer Edward Pinto show that as of mid-2008, more than 70% were accounted for by the federal government in one way or another, with nearly two-thirds of that held by Fannie and Freddie.
  • As has been documented, for example, in a forthcoming book on the GSEs called Guaranteed to Fail, there was a steady increase in affordable housing mandates imposed on these enterprises by Congress, one of several reasons why they were hardly like other capitalist enterprises, but tools and beneficiaries of government.
  • I asked Ferguson why Inside Job made such brief mention of Fannie Mae and Freddie Mac, and even then without noting that they are government-sponsored enterprises, subject to special protection by the federal government—which their creditors clearly appreciated, given the unusually low interest rates their debt commanded.
  • ...7 more annotations...
  • Ferguson replied that their role in subprime mortgages was not very significant, and that in any case their behavior was not much different from that of other capitalist enterprises.
  • We get no inkling that Rajan's views on what made the world riskier, as set forth in his book, veer quite radically from those of Inside Job. They include, as he has written, "the political push for easy housing credit in the United States and the lax monetary policy [by the Federal Reserve] in the years 2003-2005."
  • Rajan, author of Fault Lines, a recent book on the debacle, speaks with special authority to fans of Inside Job. Not only is he in the movie—one of the talking heads speaking wisdom about what occurred—he is accurately presented as having anticipated the meltdown in a 2005 paper called "Has Financial Development Made the World Riskier?" But the things he is quoted as saying in the film are restricted to serving its themes.
  • Yet it's impossible to understand what happened without grasping the proactive role played by government. "The banking sector did not decide out of the goodness of its heart to extend mortgages to poor people," commented University of Chicago Booth School of Business Finance Professor Raghuram Rajan in a telephone interview last week. "Politicians did that, and they would have taken great umbrage if the regulator stood in the way of more housing credit."
  • THE STORY RECOUNTED in Inside Job is that principles like safety and soundness were flouted by greedy Wall Street capitalists who brought down the economy with the help of certain politicians, political appointees and corrupt academicians. Despite the attempts and desires of some, including Barney Frank, to regulate the mania, the juggernaut prevails to this day, under the presidency of Barack Obama.
  • This version of the story contains some elements of truth.
  • "A MASTERPIECE OF INVESTIGATIVE nonfiction moviemaking," wrote the film critic of the Boston Globe. "Rests its outrage on reason, research and careful argument," opined the New York Times. The "masterpiece" referred to was the recently released Inside Job, a documentary film that focuses on the causes of the 2008 financial crisis.
Javier E

Book Review: 'The Maniac,' by Benjamín Labatut - The New York Times - 0 views

  • it quickly becomes clear that what “The Maniac” is really trying to get a lock on is our current age of digital-informational mastery and subjection
  • When von Neumann proclaims that, thanks to his computational advances, “all processes that are stable we shall predict” and “all processes that are unstable we shall control,” we’re being prompted to reflect on today’s ubiquitous predictive-slash-determinative algorithms.
  • When he publishes a paper about the feasibility of a self-reproducing machine — “you need to have a mechanism, not only of copying a being, but of copying the instructions that specify that being” — few contemporary readers will fail to home straight in on the fraught subject of A.I.
  • ...9 more annotations...
  • Haunting von Neumann’s thought experiment is the specter of a construct that, in its very internal perfection, lacks the element that would account for itself as a construct. “If someone succeeded in creating a formal system of axioms that was free of all internal paradoxes and contradictions,” another of von Neumann’s interlocutors, the logician Kurt Gödel, explains, “it would always be incomplete, because it would contain truths and statements that — while being undeniably true — could never be proven within the laws of that system.”
  • its deeper (and, for me, more compelling) theme: the relation between reason and madness.
  • Almost all the scientists populating the book are mad, their desire “to understand, to grasp the core of things” invariably wedded to “an uncontrollable mania”; even their scrupulously observed reason, their mode of logic elevated to religion, is framed as a form of madness. Von Neumann’s response to the detonation of the Trinity bomb, the world’s first nuclear explosion, is “so utterly rational that it bordered on the psychopathic,” his second wife, Klara Dan, muses
  • fanaticism, in the 1930s, “was the norm … even among us mathematicians.”
  • Pondering Gödel’s own descent into mania, the physicist Eugene Wigner claims that “paranoia is logic run amok.” If you’ve convinced yourself that there’s a reason for everything, “it’s a small step to begin to see hidden machinations and agents operating to manipulate the most common, everyday occurrences.”
  • the game theory-derived system of mutually assured destruction he devises in its wake is “perfectly rational insanity,” according to its co-founder Oskar Morgenstern.
  • Labatut has Morgenstern end his MAD deliberations by pointing out that humans are not perfect poker players. They are irrational, a fact that, while instigating “the ungovernable chaos that we see all around us,” is also the “mercy” that saves us, “a strange angel that protects us from the mad dreams of reason.”
  • But does von Neumann really deserve the title “Father of Computers,” granted him here by his first wife, Mariette Kovesi? Doesn’t Ada Lovelace have a prior claim as their mother? Feynman’s description of the Trinity bomb as “a little Frankenstein monster” should remind us that it was Mary Shelley, not von Neumann and his coterie, who first grasped the monumental stakes of modeling the total code of life, its own instructions for self-replication, and that it was Rosalind Franklin — working alongside, not under, Maurice Wilkins — who first carried out this modeling.
  • he at least grants his women broader, more incisive wisdom. Ehrenfest’s lover Nelly Posthumus Meyjes delivers a persuasive lecture on the Pythagorean myth of the irrational, suggesting that while scientists would never accept the fact that “nature cannot be cognized as a whole,” artists, by contrast, “had already fully embraced it.”
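Von Neumann's observation quoted above — that a self-reproducing machine needs "a mechanism, not only of copying a being, but of copying the instructions that specify that being" — has a compact software analogue in the quine, a program whose output is its own source code. The sketch below is an illustration of that idea, not anything from Labatut's book.

```python
# A quine: a program that prints its own source code.
# The string s plays both roles at once -- it is the data to be copied
# and the instructions for the copier, von Neumann's insight in miniature.
s = 's = %r\nprint(s %% s)'
print(s % s)
```

Running it prints exactly the two lines of the program itself: `%r` substitutes in the string's own representation, so the description and the thing described coincide.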
caelengrubb

How History Gets Things Wrong: The Neuroscience of Our Addiction to Stories | Reviews |... - 1 views

  • In this book, Rosenberg elaborates further such arguments to take issue with historical narratives and, more generally, with the way in which we most pervasively make sense of people's actions and motivations through narratives.
  • His main contention is that such narratives are always wrong or, to put it differently, that they can't possibly be right.
  • Rosenberg argues that neuroscience itself, our only reliable method to study psychological capacities, shows that theory of mind's posits do not exist
  • ...12 more annotations...
  • The reason is that the best available evidence on how the brain works shows that the brain does not deal with the kind of things that beliefs and desires are supposed to trade with: contents.
  • When we believe or desire, something is believed or desired: that I have bread, that it rains, that the Parliament passes the bill, etc. Beliefs and desires are about something.
  • After being presented with the assassin and the crime, the book moves on to explain why, even if always wrong, narratives in general, and historical narratives in particular, are so compelling for us. Even if we have no claim to truth or correctness for our narratives, narratives seem to be highly convincing in moving us to act
  • Furthermore, we cannot but think in terms of them. Rosenberg's explanation for this 'addiction to stories' is that it has been entrenched in us by evolutionary processes that took place over the last million years of natural history
  • Narrative explanations emerged out of Darwinian processes of natural selection -- or "environmental filtration", in the less purposive parlance Rosenberg prefers -- that allowed our ancestors to coordinate efforts, collaborate and flourish, moving from the bottom to the top of the Pleistocene's food chain. Rosenberg argues that while the basic mechanisms of mindreading pervasive in the animal kingdom, based on mutual tracking and monitoring of animals' behavior, are a sound method for getting agents to coordinate behavior, these mechanisms' more recent successor, the theory of mind, crafted by the use of co-evolved languages, turned those mindreading abilities into a theory with empirical hypothesis about agents' beliefs and desires but no facts to match them.
  • The error historians allegedly make lies in mistaking stories for real explanations, surmising that behind our behavior there are purposes, rational motivations.
  • Historians -- in particular narrative historians -- make a pervasive use of folk psychological explanations, i.e., explanations that describe events in terms of the beliefs and desires of historical agents, including individuals and groups.
  • In order for folk psychological and historical narratives to be right there have to be facts of the matter about what sentences in such explanation refer to that make them true.
  • Folk psychological explanations of actions in terms of platitudes about beliefs and desires pairings evolved in natural history closely related to mind-reading mechanisms that allowed our ancestors to deal with cooperation and coordination problems.
  • There are no interpretative mechanisms in the brain (at any level of description) that can vindicate the attribution of contents to beliefs and desires.
  • There are no facts of the matter that allow us to select belief/desire pairings as those actually operating 'behind' an agent's behavior.
  • Folk psychological explanations do not track any facts and thus can't be correct.
Javier E

Big Data Is Great, but Don't Forget Intuition - NYTimes.com - 2 views

  • THE problem is that a math model, like a metaphor, is a simplification. This type of modeling came out of the sciences, where the behavior of particles in a fluid, for example, is predictable according to the laws of physics.
  • In so many Big Data applications, a math model attaches a crisp number to human behavior, interests and preferences. The peril of that approach, as in finance, was the subject of a recent book by Emanuel Derman, a former quant at Goldman Sachs and now a professor at Columbia University. Its title is “Models. Behaving. Badly.”
  • A report last year by the McKinsey Global Institute, the research arm of the consulting firm, projected that the United States needed 140,000 to 190,000 more workers with “deep analytical” expertise and 1.5 million more data-literate managers, whether retrained or hired.
  • ...4 more annotations...
  • A major part of managing Big Data projects, he says, is asking the right questions: How do you define the problem? What data do you need? Where does it come from? What are the assumptions behind the model that the data is fed into? How is the model different from reality?
  • Society might be well served if the model makers pondered the ethical dimensions of their work as well as studying the math, according to Rachel Schutt, a senior statistician at Google Research. “Models do not just predict, but they can make things happen,” says Ms. Schutt, who taught a data science course this year at Columbia. “That’s not discussed generally in our field.”
  • the increasing use of software that microscopically tracks and monitors online behavior has raised privacy worries. Will Big Data usher in a digital surveillance state, mainly serving corporate interests?
  • my bigger concern is that the algorithms that are shaping my digital world are too simple-minded, rather than too smart. That was a theme of a book by Eli Pariser, titled “The Filter Bubble: What the Internet Is Hiding From You.”
Javier E

Are the New 'Golden Age' TV Shows the New Novels? - NYTimes.com - 0 views

  • it’s become common to hear variations on the idea that quality cable TV shows are the new novels.
  • Thomas Doherty, writing in The Chronicle of Higher Education, called the new genre “Arc TV” — because its stories follow long, complex arcs of development — and insisted that “at its best, the world of Arc TV is as exquisitely calibrated as the social matrix of a Henry James novel.”
  • Mixed feelings about literature — the desire to annex its virtues while simultaneously belittling them — are typical of our culture today, which doesn’t know quite how to deal with an art form, like the novel, that is both democratic and demanding.
  • comparing even the best TV shows with Dickens, or Henry James, also suggests how much the novel can achieve that TV doesn’t even attempt.
  • Televised evil, for instance, almost always takes melodramatic form: Our anti-heroes are mobsters, meth dealers or terrorists. But this has nothing to do with the way we encounter evil in real life, which is why a character like Gilbert Osmond, in “The Portrait of a Lady,” is more chilling in his bullying egotism than Tony Soprano
  • Spectacle and melodrama remain at the heart of TV, as they do with all arts that must reach a large audience in order to be economically viable. But it is voice, tone, the sense of the author’s mind at work, that are the essence of literature, and they exist in language, not in images.
  • At this point in our technological evolution, to read a novel is to engage in probably the second-largest single act of pleasure-based data transfer that can take place between two human beings, exceeded only by sex. Novels are characterized by their intimacy, which is extreme, by their scale, which is vast, and by their form, which is linguistic and synesthetic. The novel is a kinky beast.
  • Television gives us something that looks like a small world, made by a group of people who are themselves a small world. The novel gives us sounds pinned down by hieroglyphs, refracted flickerings inside an individual.
  • television and the novel travel in opposite directions.