
Home/ TOK Friends/ Group items matching "concrete" in title, tags, annotations or url

Emilio Ergueta

A Stunning Statistic About China and Concrete | Bill Gates - 0 views

  • Smil cites studies that say replacing mud floors with concrete floors in the world’s poorest homes would improve sanitation and cut the incidence of parasitic diseases by nearly 80 percent. Paving streets, he says, “boosts land and rental values, school enrollment, and overall economic activity and also improves access to credit.”
  • In the coming decades, the United States and China alone will need to spend trillions of dollars replacing and disposing of concrete laid down in the past generation. There are also environmental problems, including all the carbon dioxide that’s released during production.
  • I am optimistic that innovation can help reduce the downsides of concrete. For example, mini-sensors embedded inside it could alert engineers when it needs to be replaced.
kirkpatrickry

Time to call an end to free market supremacy | GulfNews.com - 0 views

  • Recently published book ‘Concrete Economics’ advocates a new industrialism that requires some form of activist government
  • If you’re at all concerned about economic policy, this is a book you need to read. It will take you only a couple of hours, and the time will be well spent
  • But despite the inherent limitations of historical analysis, the message of ‘Concrete Economics’ is one that US policymakers need to hear. One reason is that DeLong and Cohen are absolutely right — the American mind has been far too captured by the beguilingly simple and powerful theory of free-market dogma. That theory was oversold, and we need a corrective. We need history.
Javier E

Metaphors: Johnson: The impossibility of being literal | The Economist - 1 views

  • IT IS literally impossible to be literal.
  • Guy Deutscher, a linguist at the University of Manchester, calls all language “a reef of dead metaphor”. Most of the time we do not realise that nearly every word that comes out of our mouths has made some kind of jump from older, concrete meanings to the ones we use today.  This process is simple language change. Yesterday’s metaphors become so common that today we don’t process them as metaphors at all. 
  • if “tree” and “rock” aren’t metaphors, nearly everything else in our vocabulary seems to be. For example, you cannot use “independent” without metaphor, unless you mean “not hanging from”. You can’t use “transpire” unless you mean “to breathe through”. The first English meaning of a book was “a written document”. If we want to avoid all metaphorised language (If we want to be “literal”), we must constantly rush to a historical dictionary and frantically check
  • ...2 more annotations...
  • In every language, pretty much everything is metaphor—even good old “literally”, the battle-axe of those who think that words can always be pinned down precisely.
  • The body of educated English speakers has decided, by voice and by deed, that “literally” does mean something real in the real world. Namely, “not figuratively, allegorically”. Widespread educated usage is ultimately what determines its meaning. And perhaps that is concrete enough.
Javier E

Why Are Hundreds of Harvard Students Studying Ancient Chinese Philosophy? - Christine Gross-Loh - The Atlantic - 0 views

  • Puett's course Classical Chinese Ethical and Political Theory has become the third most popular course at the university. The only classes with higher enrollment are Intro to Economics and Intro to Computer Science.
  • the class fulfills one of Harvard's more challenging core requirements, Ethical Reasoning. It's clear, though, that students are also lured in by Puett's bold promise: “This course will change your life.”
  • Puett uses Chinese philosophy as a way to give undergraduates concrete, counter-intuitive, and even revolutionary ideas, which teach them how to live a better life. 
  • ...18 more annotations...
  • Puett puts a fresh spin on the questions that Chinese scholars grappled with centuries ago. He requires his students to closely read original texts (in translation) such as Confucius’s Analects, the Mencius, and the Daodejing and then actively put the teachings into practice in their daily lives. His lectures use Chinese thought in the context of contemporary American life to help 18- and 19-year-olds who are struggling to find their place in the world figure out how to be good human beings; how to create a good society; how to have a flourishing life. 
  • Puett began offering his course to introduce his students not just to a completely different cultural worldview but also to a different set of tools. He told me he is seeing more students who are “feeling pushed onto a very specific path towards very concrete career goals”
  • Puett tells his students that being calculating and rationally deciding on plans is precisely the wrong way to make any sort of important life decision. The Chinese philosophers they are reading would say that this strategy makes it harder to remain open to other possibilities that don’t fit into that plan.
  • Students who do this “are not paying enough attention to the daily things that actually invigorate and inspire them, out of which could come a really fulfilling, exciting life,” he explains. If what excites a student is not the same as what he has decided is best for him, he becomes trapped on a misguided path, slated to begin an unfulfilling career.
  • He teaches them that: The smallest actions have the most profound ramifications.
  • From a Chinese philosophical point of view, these small daily experiences provide us endless opportunities to understand ourselves. When we notice and understand what makes us tick, react, feel joyful or angry, we develop a better sense of who we are that helps us when approaching new situations. Mencius, a late Confucian thinker (4th century B.C.E.), taught that if you cultivate your better nature in these small ways, you can become an extraordinary person with an incredible influence
  • Decisions are made from the heart. Americans tend to believe that humans are rational creatures who make decisions logically, using our brains. But in Chinese, the word for “mind” and “heart” is the same.
  • If the body leads, the mind will follow. Behaving kindly (even when you are not feeling kindly), or smiling at someone (even if you aren’t feeling particularly friendly at the moment) can cause actual differences in how you end up feeling and behaving, even ultimately changing the outcome of a situation.
  • In the same way that one deliberately practices the piano in order to eventually play it effortlessly, through our everyday activities we train ourselves to become more open to experiences and phenomena so that eventually the right responses and decisions come spontaneously, without angst, from the heart-mind.
  • Whenever we make decisions, from the prosaic to the profound (what to make for dinner; which courses to take next semester; what career path to follow; whom to marry), we will make better ones when we intuit how to integrate heart and mind and let our rational and emotional sides blend into one. 
  • Aristotle said, “We are what we repeatedly do,” a view shared by thinkers such as Confucius, who taught that the importance of rituals lies in how they inculcate a certain sensibility in a person.
  • “The Chinese philosophers we read taught that the way to really change lives for the better is from a very mundane level, changing the way people experience and respond to the world, so what I try to do is to hit them at that level. I’m not trying to give my students really big advice about what to do with their lives. I just want to give them a sense of what they can do daily to transform how they live.”
  • Their assignments are small ones: to first observe how they feel when they smile at a stranger, hold open a door for someone, engage in a hobby. He asks them to take note of what happens next: how every action, gesture, or word dramatically affects how others respond to them. Then Puett asks them to pursue more of the activities that they notice arouse positive, excited feelings.
  • Once they’ve understood themselves better and discovered what they love to do they can then work to become adept at those activities through ample practice and self-cultivation. Self-cultivation is related to another classical Chinese concept: that effort is what counts the most, more than talent or aptitude. We aren’t limited to our innate talents; we all have enormous potential to expand our abilities if we cultivate them
  • The notions that we are interconnected, that we should focus on mundane, everyday practices, and that great things begin with the very smallest of acts are radical ideas for young people living in a society that pressures them to think big and achieve individual excellence.
  • One of Puett’s former students, Adam Mitchell, was a math and science whiz who went to Harvard intending to major in economics. At Harvard specifically and in society in general, he told me, “we’re expected to think of our future in this rational way: to add up the pros and cons and then make a decision. That leads you down the road of ‘Stick with what you’re good at’”—a road with little risk but little reward.
  • after his introduction to Chinese philosophy during his sophomore year, he realized this wasn’t the only way to think about the future. Instead, he tried courses he was drawn to but wasn’t naturally adroit at because he had learned how much value lies in working hard to become better at what you love. He became more aware of the way he was affected by those around him, and how they were affected by his own actions in turn. Mitchell threw himself into foreign language learning, feels his relationships have deepened, and is today working towards a master’s degree in regional studies.
  • “I can happily say that Professor Puett lived up to his promise, that the course did in fact change my life.”
Javier E

At the Existentialist Café: Freedom, Being, and Apricot Cocktails with Jean-Paul Sartre, Simone de Beauvoir, Albert Camus, Martin Heidegger, Maurice Merleau-Ponty and Others (Sarah Bakewell) - 0 views

  • The phenomenologists’ leading thinker, Edmund Husserl, provided a rallying cry, ‘To the things themselves!’ It meant: don’t waste time on the interpretations that accrue upon things, and especially don’t waste time wondering whether the things are real. Just look at this that’s presenting itself to you, whatever this may be, and describe it as precisely as possible.
  • You might think you have defined me by some label, but you are wrong, for I am always a work in progress. I create myself constantly through action, and this is so fundamental to my human condition that, for Sartre, it is the human condition, from the moment of first consciousness to the moment when death wipes it out. I am my own freedom: no more, no less.
  • Sartre wrote like a novelist — not surprisingly, since he was one. In his novels, short stories and plays as well as in his philosophical treatises, he wrote about the physical sensations of the world and the structures and moods of human life. Above all, he wrote about one big subject: what it meant to be free. Freedom, for him, lay at the heart of all human experience, and this set humans apart from all other kinds of object.
  • ...97 more annotations...
  • Sartre listened to his problem and said simply, ‘You are free, therefore choose — that is to say, invent.’ No signs are vouchsafed in this world, he said. None of the old authorities can relieve you of the burden of freedom. You can weigh up moral or practical considerations as carefully as you like, but ultimately you must take the plunge and do something, and it’s up to you what that something is.
  • Even if the situation is unbearable — perhaps you are facing execution, or sitting in a Gestapo prison, or about to fall off a cliff — you are still free to decide what to make of it in mind and deed. Starting from where you are now, you choose. And in choosing, you also choose who you will be.
  • The war had made people realise that they and their fellow humans were capable of departing entirely from civilised norms; no wonder the idea of a fixed human nature seemed questionable.
  • If this sounds difficult and unnerving, it’s because it is. Sartre does not deny that the need to keep making decisions brings constant anxiety. He heightens this anxiety by pointing out that what you do really matters. You should make your choices as though you were choosing on behalf of the whole of humanity, taking the entire burden of responsibility for how the human race behaves. If you avoid this responsibility by fooling yourself that you are the victim of circumstance or of someone else’s bad advice, you are failing to meet the demands of human life and choosing a fake existence, cut off from your own ‘authenticity’.
  • Along with the terrifying side of this comes a great promise: Sartre’s existentialism implies that it is possible to be authentic and free, as long as you keep up the effort.
  • almost all agreed that it was, as an article in Les nouvelles littéraires phrased it, a ‘sickening mixture of philosophic pretentiousness, equivocal dreams, physiological technicalities, morbid tastes and hesitant eroticism … an introspective embryo that one would take distinct pleasure in crushing’.
  • he offered a philosophy designed for a species that had just scared the hell out of itself, but that finally felt ready to grow up and take responsibility.
  • In this rebellious world, just as with the Parisian bohemians and Dadaists in earlier generations, everything that was dangerous and provocative was good, and everything that was nice or bourgeois was bad.
  • Such interweaving of ideas and life had a long pedigree, although the existentialists gave it a new twist. Stoic and Epicurean thinkers in the classical world had practised philosophy as a means of living well, rather than of seeking knowledge or wisdom for their own sake. By reflecting on life’s vagaries in philosophical ways, they believed they could become more resilient, more able to rise above circumstances, and better equipped to manage grief, fear, anger, disappointment or anxiety.
  • In the tradition they passed on, philosophy is neither a pure intellectual pursuit nor a collection of cheap self-help tricks, but a discipline for flourishing and living a fully human, responsible life.
  • For Kierkegaard, Descartes had things back to front. In his own view, human existence comes first: it is the starting point for everything we do, not the result of a logical deduction. My existence is active: I live it and choose it, and this precedes any statement I can make about myself.
  • Studying our own moral genealogy cannot help us to escape or transcend ourselves. But it can enable us to see our illusions more clearly and lead a more vital, assertive existence.
  • What was needed, he felt, was not high moral or theological ideals, but a deeply critical form of cultural history or ‘genealogy’ that would uncover the reasons why we humans are as we are, and how we came to be that way. For him, all philosophy could even be redefined as a form of psychology, or history.
  • For those oppressed on grounds of race or class, or for those fighting against colonialism, existentialism offered a change of perspective — literally, as Sartre proposed that all situations be judged according to how they appeared in the eyes of those most oppressed, or those whose suffering was greatest.
  • She observed that we need not expect moral philosophers to ‘live by’ their ideas in a simplistic way, as if they were following a set of rules. But we can expect them to show how their ideas are lived in. We should be able to look in through the windows of a philosophy, as it were, and see how people occupy it, how they move about and how they conduct themselves.
  • the existentialists inhabited their historical and personal world, as they inhabited their ideas. This notion of ‘inhabited philosophy’ is one I’ve borrowed from the English philosopher and novelist Iris Murdoch, who wrote the first full-length book on Sartre and was an early adopter of existentialism
  • What is existentialism anyway?
  • An existentialist who is also phenomenological provides no easy rules for dealing with this condition, but instead concentrates on describing lived experience as it presents itself. — By describing experience well, he or she hopes to understand this existence and awaken us to ways of living more authentic lives.
  • Existentialists concern themselves with individual, concrete human existence. — They consider human existence different from the kind of being other things have. Other entities are what they are, but as a human I am whatever I choose to make of myself at every moment. I am free — and therefore I’m responsible for everything I do, a dizzying fact which causes an anxiety inseparable from human existence itself.
  • On the other hand, I am only free within situations, which can include factors in my own biology and psychology as well as physical, historical and social variables of the world into which I have been thrown. — Despite the limitations, I always want more: I am passionately involved in personal projects of all kinds. — Human existence is thus ambiguous: at once boxed in by borders and yet transcendent and exhilarating.
  • The first part of this is straightforward: a phenomenologist’s job is to describe. This is the activity that Husserl kept reminding his students to do. It meant stripping away distractions, habits, clichés of thought, presumptions and received ideas, in order to return our attention to what he called the ‘things themselves’. We must fix our beady gaze on them and capture them exactly as they appear, rather than as we think they are supposed to be.
  • Husserl therefore says that, to phenomenologically describe a cup of coffee, I should set aside both the abstract suppositions and any intrusive emotional associations. Then I can concentrate on the dark, fragrant, rich phenomenon in front of me now. This ‘setting aside’ or ‘bracketing out’ of speculative add-ons Husserl called epoché — a term borrowed from the ancient Sceptics,
  • The point about rigour is crucial; it brings us back to the first half of the command to describe phenomena. A phenomenologist cannot get away with listening to a piece of music and saying, ‘How lovely!’ He or she must ask: is it plaintive? is it dignified? is it colossal and sublime? The point is to keep coming back to the ‘things themselves’ — phenomena stripped of their conceptual baggage — so as to bail out weak or extraneous material and get to the heart of the experience.
  • Husserlian ‘bracketing out’ or epoché allows the phenomenologist to temporarily ignore the question ‘But is it real?’, in order to ask how a person experiences his or her world. Phenomenology gives a formal mode of access to human experience. It lets philosophers talk about life more or less as non-philosophers do, while still being able to tell themselves they are being methodical and rigorous.
  • Besides claiming to transform the way we think about reality, phenomenologists promised to change how we think about ourselves. They believed that we should not try to find out what the human mind is, as if it were some kind of substance. Instead, we should consider what it does, and how it grasps its experiences.
  • For Brentano, this reaching towards objects is what our minds do all the time. Our thoughts are invariably of or about something, he wrote: in love, something is loved, in hatred, something is hated, in judgement, something is affirmed or denied. Even when I imagine an object that isn’t there, my mental structure is still one of ‘about-ness’ or ‘of-ness’.
  • Except in deepest sleep, my mind is always engaged in this aboutness: it has ‘intentionality’. Having taken the germ of this from Brentano, Husserl made it central to his whole philosophy.
  • Husserl saw in the idea of intentionality a way to sidestep two great unsolved puzzles of philosophical history: the question of what objects ‘really’ are, and the question of what the mind ‘really’ is. By doing the epoché and bracketing out all consideration of reality from both topics, one is freed to concentrate on the relationship in the middle. One can apply one’s descriptive energies to the endless dance of intentionality that takes place in our lives: the whirl of our minds as they seize their intended phenomena one after the other and whisk them around the floor,
  • Understood in this way, the mind hardly is anything at all: it is its aboutness. This makes the human mind (and possibly some animal minds) different from any other naturally occurring entity. Nothing else can be as thoroughly about or of things as the mind is:
  • Some Eastern meditation techniques aim to still this scurrying creature, but the extreme difficulty of this shows how unnatural it is to be mentally inert. Left to itself, the mind reaches out in all directions as long as it is awake — and even carries on doing it in the dreaming phase of its sleep.
  • a mind that is experiencing nothing, imagining nothing, or speculating about nothing can hardly be said to be a mind at all.
  • Three simple ideas — description, phenomenon, intentionality — provided enough inspiration to keep roomfuls of Husserlian assistants busy in Freiburg for decades. With all of human existence awaiting their attention, how could they ever run out of things to do?
  • For Sartre, this gives the mind an immense freedom. If we are nothing but what we think about, then no predefined ‘inner nature’ can hold us back. We are protean.
  • way of this interpretation. Real, not real; inside, outside; what difference did it make? Reflecting on this, Husserl began turning his phenomenology into a branch of ‘idealism’ — the philosophical tradition which denied external reality and defined everything as a kind of private hallucination.
  • For Sartre, if we try to shut ourselves up inside our own minds, ‘in a nice warm room with the shutters closed’, we cease to exist. We have no cosy home: being out on the dusty road is the very definition of what we are.
  • One might think that, if Heidegger had anything worth saying, he could have communicated it in ordinary language. The fact is that he does not want to be ordinary, and he may not even want to communicate in the usual sense. He wants to make the familiar obscure, and to vex us. George Steiner thought that Heidegger’s purpose was less to be understood than to be experienced through a ‘felt strangeness’.
  • He takes Dasein in its most ordinary moments, then talks about it in the most innovative way he can. For Heidegger, Dasein’s everyday Being is right here: it is Being-in-the-world, or In-der-Welt-sein. The main feature of Dasein’s everyday Being-in-the-world right here is that it is usually busy doing something.
  • Thus, for Heidegger, all Being-in-the-world is also a ‘Being-with’ or Mitsein. We cohabit with others in a ‘with-world’, or Mitwelt. The old philosophical problem of how we prove the existence of other minds has now vanished. Dasein swims in the with-world long before it wonders about other minds.
  • Sometimes the best-educated people were those least inclined to take the Nazis seriously, dismissing them as too absurd to last. Karl Jaspers was one of those who made this mistake, as he later recalled, and Beauvoir observed similar dismissive attitudes among the French students in Berlin.
  • In any case, most of those who disagreed with Hitler’s ideology soon learned to keep their view to themselves. If a Nazi parade passed on the street, they would either slip out of view or give the obligatory salute like everyone else, telling themselves that the gesture meant nothing if they did not believe in it. As the psychologist Bruno Bettelheim later wrote of this period, few people will risk their life for such a small thing as raising an arm — yet that is how one’s powers of resistance are eroded away, and eventually one’s responsibility and integrity go with them.
  • for Arendt, if you do not respond adequately when the times demand it, you show a lack of imagination and attention that is as dangerous as deliberately committing an abuse. It amounts to disobeying the one command she had absorbed from Heidegger in those Marburg days: Think!
  • ‘Everything takes place under a kind of anaesthesia. Objectively dreadful events produce a thin, puny emotional response. Murders are committed like schoolboy pranks. Humiliation and moral decay are accepted like minor incidents.’ Haffner thought modernity itself was partly to blame: people had become yoked to their habits and to mass media, forgetting to stop and think, or to disrupt their routines long enough to question what was going on.
  • Heidegger’s former lover and student Hannah Arendt would argue, in her 1951 study The Origins of Totalitarianism, that totalitarian movements thrived at least partly because of this fragmentation in modern lives, which made people more vulnerable to being swept away by demagogues. Elsewhere, she coined the phrase ‘the banality of evil’ to describe the most extreme failures of personal moral awareness.
  • His communicative ideal fed into a whole theory of history: he traced all civilisation to an ‘Axial Period’ in the fifth century BC, during which philosophy and culture exploded simultaneously in Europe, the Middle East and Asia, as though a great bubble of minds had erupted from the earth’s surface. ‘True philosophy needs communion to come into existence,’ he wrote, and added, ‘Uncommunicativeness in a philosopher is virtually a criterion of the untruth of his thinking.’
  • The idea of being called to authenticity became a major theme in later existentialism, the call being interpreted as saying something like ‘Be yourself!’, as opposed to being phony. For Heidegger, the call is more fundamental than that. It is a call to take up a self that you didn’t know you had: to wake up to your Being. Moreover, it is a call to action. It requires you to do something: to take a decision of some sort.
  • Being and Time contained at least one big idea that should have been of use in resisting totalitarianism. Dasein, Heidegger wrote there, tends to fall under the sway of something called das Man or ‘the they’ — an impersonal entity that robs us of the freedom to think for ourselves. To live authentically requires resisting or outwitting this influence, but this is not easy because das Man is so nebulous. Man in German does not mean ‘man’ as in English (that’s der Mann), but a neutral abstraction, something like ‘one’ in the English phrase ‘one doesn’t do that’,
  • for Heidegger, das Man is me. It is everywhere and nowhere; it is nothing definite, but each of us is it. As with Being, it is so ubiquitous that it is difficult to see. If I am not careful, however, das Man takes over the important decisions that should be my own. It drains away my responsibility or ‘answerability’. As Arendt might put it, we slip into banality, failing to think.
  • Jaspers focused on what he called Grenzsituationen — border situations, or limit situations. These are the moments when one finds oneself constrained or boxed in by what is happening, but at the same time pushed by these events towards the limits or outer edge of normal experience. For example, you might have to make a life-or-death choice, or something might remind you suddenly of your mortality,
  • Jaspers’ interest in border situations probably had much to do with his own early confrontation with mortality. From childhood, he had suffered from a heart condition so severe that he always expected to die at any moment. He also had emphysema, which forced him to speak slowly, taking long pauses to catch his breath. Both illnesses meant that he had to budget his energies with care in order to get his work done without endangering his life.
  • If I am to resist das Man, I must become answerable to the call of my ‘voice of conscience’. This call does not come from God, as a traditional Christian definition of the voice of conscience might suppose. It comes from a truly existentialist source: my own authentic self. Alas, this voice is one I do not recognise and may not hear, because it is not the voice of my habitual ‘they-self’. It is an alien or uncanny version of my usual voice. I am familiar with my they-self, but not with my unalienated voice — so, in a weird twist, my real voice is the one that sounds strangest to me.
  • Marcel developed a strongly theological branch of existentialism. His faith distanced him from both Sartre and Heidegger, but he shared a sense of how history makes demands on individuals. In his essay ‘On the Ontological Mystery’, written in 1932 and published in the fateful year of 1933, Marcel wrote of the human tendency to become stuck in habits, received ideas, and a narrow-minded attachment to possessions and familiar scenes. Instead, he urged his readers to develop a capacity for remaining ‘available’ to situations as they arise. Similar ideas of disponibilité or availability had been explored by other writers,
  • Marcel made it his central existential imperative. He was aware of how rare and difficult it was. Most people fall into what he calls ‘crispation’: a tensed, encrusted shape in life — ‘as though each one of us secreted a kind of shell which gradually hardened and imprisoned him’.
  • Bettelheim later observed that, under Nazism, only a few people realised at once that life could not continue unaltered: these were the ones who got away quickly. Bettelheim himself was not among them. Caught in Austria when Hitler annexed it, he was sent first to Dachau and then to Buchenwald, but was then released in a mass amnesty to celebrate Hitler’s birthday in 1939 — an extraordinary reprieve, after which he left at once for America.
  • we are used to reading philosophy as offering a universal message for all times and places — or at least as aiming to do so. But Heidegger disliked the notion of universal truths or universal humanity, which he considered a fantasy. For him, Dasein is not defined by shared faculties of reason and understanding, as the Enlightenment philosophers thought. Still less is it defined by any kind of transcendent eternal soul, as in religious tradition. We do not exist on a higher, eternal plane at all. Dasein’s Being is local: it has a historical situation, and is constituted in time and place.
  • For Marcel, learning to stay open to reality in this way is the philosopher’s prime job. Everyone can do it, but the philosopher is the one who is called on above all to stay awake, so as to be the first to sound the alarm if something seems wrong.
  • Second, it also means understanding that we are historical beings, and grasping the demands our particular historical situation is making on us. In what Heidegger calls ‘anticipatory resoluteness’, Dasein discovers ‘that its uttermost possibility lies in giving itself up’. At that moment, through Being-towards-death and resoluteness in facing up to one’s time, one is freed from the they-self and attains one’s true, authentic self.
  • If we are temporal beings by our very nature, then authentic existence means accepting, first, that we are finite and mortal. We will die: this all-important realisation is what Heidegger calls authentic ‘Being-towards-Death’, and it is fundamental to his philosophy.
  • Hannah Arendt, instead, left early on: she had the benefit of a powerful warning. Just after the Nazi takeover, in spring 1933, she had been arrested while researching materials on anti-Semitism for the German Zionist Organisation at Berlin’s Prussian State Library. Her apartment was searched; both she and her mother were locked up briefly, then released. They fled, without stopping to arrange travel documents. They crossed to Czechoslovakia (then still safe) by a method that sounds almost too fabulous to be true: a sympathetic German family on the border had a house with its front door in Germany and its back door in Czechoslovakia. The family would invite people for dinner, then let them leave through the back door at night.
  • As Sartre argued in his 1943 review of The Stranger, basic phenomenological principles show that experience comes to us already charged with significance. A piano sonata is a melancholy evocation of longing. If I watch a soccer match, I see it as a soccer match, not as a meaningless scene in which a number of people run around taking turns to apply their lower limbs to a spherical object. If the latter is what I’m seeing, then I am not watching some more essential, truer version of soccer; I am failing to watch it properly as soccer at all.
  • Much as they liked Camus personally, neither Sartre nor Beauvoir accepted his vision of absurdity. For them, life is not absurd, even when viewed on a cosmic scale, and nothing can be gained by saying it is. Life for them is full of real meaning, although that meaning emerges differently for each of us.
  • For Sartre, we show bad faith whenever we portray ourselves as passive creations of our race, class, job, history, nation, family, heredity, childhood influences, events, or even hidden drives in our subconscious which we claim are out of our control. It is not that such factors are unimportant: class and race, in particular, he acknowledged as powerful forces in people’s lives, and Simone de Beauvoir would soon add gender to that list.
  • Sartre takes his argument to an extreme point by asserting that even war, imprisonment or the prospect of imminent death cannot take away my existential freedom. They form part of my ‘situation’, and this may be an extreme and intolerable situation, but it still provides only a context for whatever I choose to do next. If I am about to die, I can decide how to face that death. Sartre here resurrects the ancient Stoic idea that I may not choose what happens to me, but I can choose what to make of it, spiritually speaking.
  • But the Stoics cultivated indifference in the face of terrible events, whereas Sartre thought we should remain passionately, even furiously engaged with what happens to us and with what we can achieve. We should not expect freedom to be anything less than fiendishly difficult.
  • Freedom does not mean entirely unconstrained movement, and it certainly does not mean acting randomly. We often mistake the very things that enable us to be free — context, meaning, facticity, situation, a general direction in our lives — for things that define us and take away our freedom. It is only with all of these that we can be free in a real sense.
  • Nor did he mean that privileged groups have the right to pontificate to the poor and downtrodden about the need to ‘take responsibility’ for themselves. That would be a grotesque misreading of Sartre’s point, since his sympathy in any encounter always lay with the more oppressed side. But for each of us — for me — to be in good faith means not making excuses for myself.
  • Camus’ novel gives us a deliberately understated vision of heroism and decisive action compared to those of Sartre and Beauvoir. One can only do so much. It can look like defeatism, but it shows a more realistic perception of what it takes to actually accomplish difficult tasks like liberating one’s country.
  • Camus just kept returning to his core principle: no torture, no killing — at least not with state approval. Beauvoir and Sartre believed they were taking a more subtle and more realistic view. If asked why a couple of innocuous philosophers had suddenly become so harsh, they would have said it was because the war had changed them in profound ways. It had shown them that one’s duties to humanity could be more complicated than they seemed. ‘The war really divided my life in two,’ Sartre said later.
  • Poets and artists ‘let things be’, but they also let things come out and show themselves. They help to ease things into ‘unconcealment’ (Unverborgenheit), which is Heidegger’s rendition of the Greek term alētheia, usually translated as ‘truth’. This is a deeper kind of truth than the mere correspondence of a statement to reality, as when we say ‘The cat is on the mat’ and point to a mat with a cat on it. Long before we can do this, both cat and mat must ‘stand forth out of concealedness’. They must un-hide themselves.
  • Heidegger does not use the word ‘consciousness’ here because — as with his earlier work — he is trying to make us think in a radically different way about ourselves. We are not to think of the mind as an empty cavern, or as a container filled with representations of things. We are not even supposed to think of it as firing off arrows of intentional ‘aboutness’, as in the earlier phenomenology of Brentano. Instead, Heidegger draws us into the depths of his Schwarzwald, and asks us to imagine a gap with sunlight filtering in. We remain in the forest, but we provide a relatively open spot where other beings can bask for a moment. If we did not do this, everything would remain in the thickets, hidden even to itself.
  • The astronomer Carl Sagan began his 1980 television series Cosmos by saying that human beings, though made of the same stuff as the stars, are conscious and are therefore ‘a way for the cosmos to know itself’. Merleau-Ponty similarly quoted his favourite painter Cézanne as saying, ‘The landscape thinks itself in me, and I am its consciousness.’ This is something like what Heidegger thinks humanity contributes to the earth. We are not made of spiritual nothingness; we are part of Being, but we also bring something unique with us. It is not much: a little open space, perhaps with a path and a bench like the one the young Heidegger used to sit on to do his homework. But through us, the miracle occurs.
  • Beauty aside, Heidegger’s late writing can also be troubling, with its increasingly mystical notion of what it is to be human. If one speaks of a human being mainly as an open space or a clearing, or a means of ‘letting beings be’ and dwelling poetically on the earth, then one doesn’t seem to be talking about any recognisable person. The old Dasein has become less human than ever. It is now a forestry feature.
  • Even today, Jaspers, the dedicated communicator, is far less widely read than Heidegger, who has influenced architects, social theorists, critics, psychologists, artists, film-makers, environmental activists, and innumerable students and enthusiasts — including the later deconstructionist and post-structuralist schools, which took their starting point from his late thinking. Having spent the late 1940s as an outsider and then been rehabilitated, Heidegger became the overwhelming presence in university philosophy all over the European continent from then on.
  • As Levinas reflected on this experience, it helped to lead him to a philosophy that was essentially ethical, rather than ontological like Heidegger’s. He developed his ideas from the work of Jewish theologian Martin Buber, whose I and Thou in 1923 had distinguished between my relationship with an impersonal ‘it’ or ‘them’, and the direct personal encounter I have with a ‘you’. Levinas took it further: when I encounter you, we normally meet face-to-face, and it is through your face that you, as another person, can make ethical demands on me. This is very different from Heidegger’s Mitsein or Being-with, which suggests a group of people standing alongside one another, shoulder to shoulder as if in solidarity — perhaps as a unified nation or Volk.
  • For Levinas, we literally face each other, one individual at a time, and that relationship becomes one of communication and moral expectation. We do not merge; we respond to one another. Instead of being co-opted into playing some role in my personal drama of authenticity, you look me in the eyes — and you remain Other. You remain you.
  • This relationship is more fundamental than the self, more fundamental than consciousness, more fundamental even than Being — and it brings an unavoidable ethical obligation. Ever since Husserl, phenomenologists and existentialists had been trying to stretch the definition of existence to incorporate our social lives and relationships. Levinas did more: he turned philosophy around entirely so that these relationships were the foundation of our existence, not an extension of it.
  • Her last work, The Need for Roots, argues, among other things, that none of us has rights, but each one of us has a near-infinite degree of duty and obligation to the other. Whatever the underlying cause of her death — and anorexia nervosa seems to have been involved — no one could deny that she lived out her philosophy with total commitment. Of all the lives touched on in this book, hers is surely the most profound and challenging application of Iris Murdoch’s notion that a philosophy can be ‘inhabited’.
  • Other thinkers took radical ethical turns during the war years. The most extreme was Simone Weil, who actually tried to live by the principle of putting other people’s ethical demands first. Having returned to France after her travels through Germany in 1932, she had worked in a factory so as to experience the degrading nature of such work for herself. When France fell in 1940, her family fled to Marseilles (against her protests), and later to the US and to Britain. Even in exile, Weil made extraordinary sacrifices. If there were people in the world who could not sleep in a bed, she would not do so either, so she slept on the floor.
  • The mystery tradition had roots in Kierkegaard’s ‘leap of faith’. It owed much to the other great nineteenth-century mystic of the impossible, Dostoevsky, and to older theological notions. But it also grew from the protracted trauma that was the first half of the twentieth century. Since 1914, and especially since 1939, people in Europe and elsewhere had come to the realisation that we cannot fully know or trust ourselves; that we have no excuses or explanations for what we do — and yet that we must ground our existence and relationships on something firm, because otherwise we cannot survive.
  • One striking link between these radical ethical thinkers, all on the fringes of our main story, is that they had religious faith. They also granted a special role to the notion of ‘mystery’ — that which cannot be known, calculated or understood, especially when it concerns our relationships with each other. Heidegger was different from them, since he rejected the religion he grew up with and had no real interest in ethics — probably as a consequence of his having no real interest in the human.
  • Meanwhile, the Christian existentialist Gabriel Marcel was also still arguing, as he had since the 1930s, that ethics trumps everything else in philosophy and that our duty to each other is so great as to play the role of a transcendent ‘mystery’. He too had been led to this position partly by a wartime experience: during the First World War he had worked for the Red Cross’ Information Service, with the unenviable job of answering relatives’ inquiries about missing soldiers. Whenever news came, he passed it on, and usually it was not good. As Marcel later said, this task permanently inoculated him against warmongering rhetoric of any kind, and it made him aware of the power of what is unknown in our lives.
  • As the play’s much-quoted and frequently misunderstood final line has it: ‘Hell is other people.’ Sartre later explained that he did not mean to say that other people were hellish in general. He meant that after death we become frozen in their view, unable any longer to fend off their interpretation. In life, we can still do something to manage the impression we make; in death, this freedom goes and we are left entombed in other people’s memories and perceptions.
  • We have to do two near-impossible things at once: understand ourselves as limited by circumstances, and yet continue to pursue our projects as though we are truly in control. In Beauvoir’s view, existentialism is the philosophy that best enables us to do this, because it concerns itself so deeply with both freedom and contingency. It acknowledges the radical and terrifying scope of our freedom in life, but also the concrete influences that other philosophies tend to ignore: history, the body, social relationships and the environment.
  • The aspects of our existence that limit us, Merleau-Ponty says, are the very same ones that bind us to the world and give us scope for action and perception. They make us what we are. Sartre acknowledged the need for this trade-off, but he found it more painful to accept. Everything in him longed to be free of bonds, of impediments and limitations.
  • Of course we have to learn this skill of interpreting and anticipating the world, and this happens in early childhood, which is why Merleau-Ponty thought child psychology was essential to philosophy. This is an extraordinary insight. Apart from Rousseau, very few philosophers before him had taken childhood seriously; most wrote as though all human experience were that of a fully conscious, rational, verbal adult who has been dropped into this world from the sky — perhaps by a stork.
  • For Merleau-Ponty, we cannot understand our experience if we don’t think of ourselves in part as overgrown babies. We fall for optical illusions because we once learned to see the world in terms of shapes, objects and things relevant to our own interests. Our first perceptions came to us in tandem with our first active experiments in observing the world and reaching out to explore it, and are still linked with those experiences.
  • Another factor in all of this, for Merleau-Ponty, is our social existence: we cannot thrive without others, or not for long, and we need this especially in early life. This makes solipsistic speculation about the reality of others ridiculous; we could never engage in such speculation if we hadn’t already been formed by them.
  • As Descartes could have said (but didn’t), ‘I think, therefore other people exist.’ We grow up with people playing with us, pointing things out, talking, listening, and getting us used to reading emotions and movements; this is how we become capable, reflective, smoothly integrated beings.
  • In general, Merleau-Ponty thinks human experience only makes sense if we abandon philosophy’s time-honoured habit of starting with a solitary, capsule-like, immobile adult self, isolated from its body and world, which must then be connected up again — adding each element around it as though adding clothing to a doll. Instead, for him, we slide from the womb to the birth canal to an equally close and total immersion in the world. That immersion continues as long as we live, although we may also cultivate the art of partially withdrawing from time to time when we want to think or daydream.
  • When he looks for his own metaphor to describe how he sees consciousness, he comes up with a beautiful one: consciousness, he suggests, is like a ‘fold’ in the world, as though someone had crumpled a piece of cloth to make a little nest or hollow. It stays for a while, before eventually being unfolded and smoothed away. There is something seductive, even erotic, in this idea of my conscious self as an improvised pouch in the cloth of the world. I still have my privacy — my withdrawing room. But I am part of the world’s fabric, and I remain formed out of it for as long as I am here.
  • By the time of these works, Merleau-Ponty is taking his desire to describe experience to the outer limits of what language can convey. Just as with the late Husserl or Heidegger, or Sartre in his Flaubert book, we see a philosopher venturing so far from shore that we can barely follow. Emmanuel Levinas would head out to the fringes too, eventually becoming incomprehensible to all but his most patient initiates.
  • Sartre once remarked — speaking of a disagreement they had about Husserl in 1941 — that ‘we discovered, astounded, that our conflicts had, at times, stemmed from our childhood, or went back to the elementary differences of our two organisms’. Merleau-Ponty also said in an interview that Sartre’s work seemed strange to him, not because of philosophical differences, but because of a certain ‘register of feeling’, especially in Nausea, that he could not share. Their difference was one of temperament and of the whole way the world presented itself to them.
  • The two also differed in their purpose. When Sartre writes about the body or other aspects of experience, he generally does it in order to make a different point. He expertly evokes the grace of his café waiter, gliding between the tables, bending at an angle just so, steering the drink-laden tray through the air on the tips of his fingers — but he does it all in order to illustrate his ideas about bad faith. When Merleau-Ponty writes about skilled and graceful movement, the movement itself is his point. This is the thing he wants to understand.
  • We can never move definitively from ignorance to certainty, for the thread of the inquiry will constantly lead us back to ignorance again. This is the most attractive description of philosophy I’ve ever read, and the best argument for why it is worth doing, even (or especially) when it takes us no distance at all from our starting point.
  • By prioritising perception, the body, social life and childhood development, Merleau-Ponty gathered up philosophy’s far-flung outsider subjects and brought them in to occupy the centre of his thought.
  • In his inaugural lecture at the Collège de France on 15 January 1953, published as In Praise of Philosophy, he said that philosophers should concern themselves above all with whatever is ambiguous in our experience. At the same time, they should think clearly about these ambiguities, using reason and science. Thus, he said, ‘The philosopher is marked by the distinguishing trait that he possesses inseparably the taste for evidence and the feeling for ambiguity.’ A constant movement is required between these two.
  • As Sartre wrote in response to Hiroshima, humanity had now gained the power to wipe itself out, and must decide every single day that it wanted to live. Camus also wrote that humanity faced the task of choosing between collective suicide and a more intelligent use of its technology — ‘between hell and reason’. After 1945, there seemed little reason to trust in humanity’s ability to choose well.
  • Merleau-Ponty observed in a lecture of 1951 that, more than any previous century, the twentieth century had reminded people how ‘contingent’ their lives were — how at the mercy of historical events and other changes that they could not control. This feeling went on long after the war ended. After the A-bombs were dropped on Hiroshima and Nagasaki, many feared that a Third World War would not be long in coming, this time between the Soviet Union and the United States.
Javier E

Liu Cixin's War of the Worlds | The New Yorker - 0 views

  • he briskly dismissed the idea that fiction could serve as commentary on history or on current affairs. “The whole point is to escape the real world!” he said.
  • Chinese tech entrepreneurs discuss the Hobbesian vision of the trilogy as a metaphor for cutthroat competition in the corporate world; other fans include Barack Obama, who met Liu in Beijing two years ago, and Mark Zuckerberg. Liu’s international career has become a source of national pride. In 2015, China’s then Vice-President, Li Yuanchao, invited Liu to Zhongnanhai—an off-limits complex of government accommodation sometimes compared to the Kremlin—to discuss the books and showed Liu his own copies, which were dense with highlights and annotations.
  • In China, one of his stories has been a set text in the gao kao—the notoriously competitive college-entrance exams that determine the fate of ten million pupils annually; another has appeared in the national seventh-grade-curriculum textbook. When a reporter recently challenged Liu to answer the middle-school questions about the “meaning” and the “central themes” of his story, he didn’t get a single one right. “I’m a writer,” he told me, with a shrug.
  • Liu’s tomes—they tend to be tomes—have been translated into more than twenty languages, and the trilogy has sold some eight million copies worldwide. He has won China’s highest honor for science-fiction writing, the Galaxy Award, nine times, and in 2015 he became the first Asian writer to win the Hugo Award, the most prestigious international science-fiction prize
  • “The Three-Body Problem” takes its title from an analytical problem in orbital mechanics which has to do with the unpredictable motion of three bodies under mutual gravitational pull. Reading an article about the problem, Liu thought, What if the three bodies were three suns? How would intelligent life on a planet in such a solar system develop? From there, a structure gradually took shape that almost resembles a planetary system, with characters orbiting the central conceit like moons. For better or worse, the characters exist to support the framework of the story rather than to live as individuals on the page.
  • Concepts that seemed abstract to others took on, for him, concrete forms; they were like things he could touch, inducing a “druglike euphoria.” Compared with ordinary literature, he came to feel, “the stories of science are far more magnificent, grand, involved, profound, thrilling, strange, terrifying, mysterious, and even emotional.”
  • Pragmatic choices like this one, or like the decision his grandparents made when their sons were conscripted, recur in his fiction—situations that present equally unconscionable choices on either side of a moral fulcrum
  • The great flourishing of science fiction in the West at the end of the nineteenth century occurred alongside unprecedented technological progress and the proliferation of the popular press—transformations that were fundamental to the development of the genre
  • Joel Martinsen, the translator of the second volume of Liu’s trilogy, sees the series as a continuation of this tradition. “It’s not hard to read parallels between the Trisolarans and imperialist designs on China, driven by hunger for resources and fear of being wiped out,” he told me. Even Liu, unwilling as he is to endorse comparisons between the plot and China’s current face-off with the U.S., did at one point let slip that “the relationship between politics and science fiction cannot be underestimated.”
  • Speculative fiction is the art of imagining alternative worlds, and the same political establishment that permits it to be used as propaganda for the existing regime is also likely to recognize its capacity to interrogate the legitimacy of the status quo.
  • Liu has been criticized for peopling his books with characters who seem like cardboard cutouts installed in magnificent dioramas. Liu readily admits to the charge. “I did not begin writing for love of literature,” he told me. “I did so for love of science.”
  • Liu believes that this trend signals a deeper shift in the Chinese mind-set—that technological advances have spurred a new excitement about the possibilities of cosmic exploration.
  • Liu’s imagination is dauntingly capacious, his narratives conceived on a scale that feels, at times, almost hallucinogenic. The time line of the trilogy spans 18,906,450 years, encompassing ancient Egypt, the Qin dynasty, the Byzantine Empire, the Cultural Revolution, the present, and a time eighteen million years in the future
  • The first book is set on Earth, though some of its scenes take place in virtual reality; by the end of the third book, the scope of the action is interstellar and annihilation unfolds across several dimensions. The London Review of Books has called the trilogy “one of the most ambitious works of science fiction ever written.”
  • Although physics furnishes the novels’ premises, it is politics that drives the plots. At every turn, the characters are forced to make brutal calculations in which moral absolutism is pitted against the greater good
  • In Liu’s fictional universe, idealism is fatal and kindness an exorbitant luxury. As one general says in the trilogy, “In a time of war, we can’t afford to be too scrupulous.” Indeed, it is usually when people do not play by the rules of Realpolitik that the most lives are lost.
  • “I know what you are thinking,” he told me with weary clarity. “What about individual liberty and freedom of governance?” He sighed, as if exhausted by a debate going on in his head. “But that’s not what Chinese people care about. For ordinary folks, it’s the cost of health care, real-estate prices, their children’s education. Not democracy.”
  • Liu closed his eyes for a long moment and then said quietly, “This is why I don’t like to talk about subjects like this. The truth is you don’t really—I mean, can’t truly—understand.”
  • Liu explained to me, the existing regime made the most sense for today’s China, because to change it would be to invite chaos. “If China were to transform into a democracy, it would be hell on earth,”
  • It was an opinion entirely consistent with his systems-level view of human societies, just as mine reflected a belief in democracy and individualism as principles to be upheld regardless of outcomes
  • “I cannot escape and leave behind reality, just like I cannot leave behind my shadow. Reality brands each of us with its indelible mark. Every era puts invisible shackles on those who have lived through it, and I can only dance in my chains.”
  • Chinese people of his generation were lucky, he said. The changes they had seen were so huge that they now inhabited a world entirely different from that of their childhood. “China is a futuristic country,” he said. “I realized that the world around me became more and more like science fiction, and this process is speeding up.”
  • “We have statues of a few martyrs, but we never—We don’t memorialize those, the individuals.” He took off his glasses and blinked, peering into the wide expanse of green and concrete. “This is how we Chinese have always been,” he said. “When something happens, it passes, and time buries the stories.”
pier-paolo

Opinion | The Smile of Reason - The New York Times - 0 views

  • They say Voltaire glowed with the smile of reason, and Friedman did too. And while I never became a libertarian as he was, the encounter was one of the turning points in my life. It opened new ways of seeing the world and was an exhilarating demonstration of the power of ideas.
  • Friedman’s trek from the intellectual wilderness to global influence is one of the most exhilarating exodus stories of our time
  • He was proudest of his contributions to technical economics, but he also possessed that rarest of gifts, a practical imagination, and was a fountain of concrete policy ideas.
  • Friedman roared with approving laughter. He believed in clear language, and as Samuel Brittan has noted, preferred the spoken to the written word.
  • because classical economics is under its greatest threat in a generation. Growing evidence suggests average workers are not seeing the benefits of their productivity gains — that the market is broken and requires heavy government correction. Friedman’s heirs have been avoiding this debate. They’re losing it badly and have offered no concrete remedies to address this problem, if it is one.
Javier E

Opinion | How to be Human - The New York Times - 0 views

  • I have learned something profound along the way. Being openhearted is a prerequisite for being a full, kind and wise human being. But it is not enough. People need social skills
  • The real process of, say, building a friendship or creating a community involves performing a series of small, concrete actions well: being curious about other people; disagreeing without poisoning relationships; revealing vulnerability at an appropriate pace; being a good listener; knowing how to ask for and offer forgiveness; knowing how to host a gathering where everyone feels embraced; knowing how to see things from another’s point of view.
  • People want to connect. Above almost any other need, human beings long to have another person look into their faces with love and acceptance
  • we lack practical knowledge about how to give one another the attention we crave
  • Some days it seems like we have intentionally built a society that gives people little guidance on how to perform the most important activities of life.
  • If I can shine positive attention on others, I can help them to blossom. If I see potential in others, they may come to see potential in themselves. True understanding is one of the most generous gifts any of us can give to another.
  • I see the results, too, in the epidemic of invisibility I encounter as a journalist. I often find myself interviewing people who tell me they feel unseen and disrespected
  • I’ve been working on a book called “How to Know a Person: The Art of Seeing Others Deeply and Being Deeply Seen.” I wanted it to be a practical book — so that I would learn these skills myself, and also, I hope, teach people how to understand others, how to make them feel respected, valued and understood.
  • I wanted to learn these skills for utilitarian reasons
  • If I’m going to work with someone, I don’t just want to see his superficial technical abilities. I want to understand him more deeply — to know whether he is calm in a crisis, comfortable with uncertainty or generous to colleagues.
  • I wanted to learn these skills for moral reasons
  • Many of the most productive researchers were in the habit of having breakfast or lunch with an electrical engineer named Harry Nyquist. Nyquist really listened to their challenges, got inside their heads, brought out the best in them. Nyquist, too, was an illuminator.
  • Finally, I wanted to learn these skills for reasons of national survival
  • We evolved to live with small bands of people like ourselves. Now we live in wonderfully diverse societies, but our social skills are inadequate for the divisions that exist. We live in a brutalizing time.
  • In any collection of humans, there are diminishers and there are illuminators. Diminishers are so into themselves, they make others feel insignificant
  • They stereotype and label. If they learn one thing about you, they proceed to make a series of assumptions about who you must be.
  • Illuminators, on the other hand, have a persistent curiosity about other people.
  • They have been trained or have trained themselves in the craft of understanding others. They know how to ask the right questions at the right times — so that they can see things, at least a bit, from another’s point of view. They shine the brightness of their care on people and make them feel bigger, respected, lit up.
  • A biographer of the novelist E.M. Forster wrote, “To speak with him was to be seduced by an inverse charisma, a sense of being listened to with such intensity that you had to be your most honest, sharpest, and best self.” Imagine how good it would be to offer people that kind of hospitality.
  • There is a social clumsiness I encounter too frequently. I’ll be leaving a party or some gathering and I’ll realize: That whole time, nobody asked me a single question. I estimate that only 30 percent of the people in the world are good question askers. The rest are nice people, but they just don’t ask. I think it’s because they haven’t been taught to and so don’t display basic curiosity about others.
  • Many years ago, patent lawyers at Bell Labs were trying to figure out why some employees were much more productive than others.
  • Illuminators are a joy to be around
  • The gift of attention.
  • Each of us has a characteristic way of showing up in the world. A person who radiates warmth will bring out the glowing sides of the people he meets, while a person who conveys formality can meet the same people and find them stiff and detached. “Attention,” the psychiatrist Iain McGilchrist writes, “is a moral act: It creates, brings aspects of things into being.”
  • When Jimmy sees a person — any person — he is seeing a creature with infinite value and dignity, made in the image of God. He is seeing someone so important that Jesus was willing to die for that person.
  • Accompaniment.
  • Accompaniment is an other-centered way of being with people during the normal routines of life.
  • If we are going to accompany someone well, we need to abandon the efficiency mind-set. We need to take our time and simply delight in another person’s way of being
  • I know a couple who treasure friends who are what they call “lingerable.” These are the sorts of people who are just great company, who turn conversation into a form of play and encourage you to be yourself. It’s a great talent, to be lingerable.
  • Other times, a good accompanist does nothing more than practice the art of presence, just being there.
  • The art of conversation.
  • If you tell me something important and then I paraphrase it back to you, what psychologists call “looping,” we can correct any misimpressions that may exist between us.
  • Be a loud listener. When another person is talking, you want to be listening so actively you’re burning calories.
  • He’s continually responding to my comments with encouraging affirmations, with “amen,” “aha” and “yes!” I love talking to that guy.
  • I no longer ask people: What do you think about that? Instead, I ask: How did you come to believe that? That gets them talking about the people and experiences that shaped their values.
  • Storify whenever possible
  • People are much more revealing and personal when they are telling stories.
  • Do the looping, especially with adolescents
  • If you want to know how the people around you see the world, you have to ask them. Here are a few tips I’ve collected from experts on how to become a better conversationalist:
  • Turn your partner into a narrator
  • People don’t go into enough detail when they tell you a story. If you ask specific follow-up questions — Was your boss screaming or irritated when she said that to you? What was her tone of voice? — then they will revisit the moment in a more concrete way and tell a richer story
  • If somebody tells you he is having trouble with his teenager, don’t turn around and say: “I know exactly what you mean. I’m having incredible problems with my own Susan.” You may think you’re trying to build a shared connection, but what you are really doing is shifting attention back to yourself.
  • Don’t be a topper
  • Big questions.
  • The quality of your conversations will depend on the quality of your questions
  • As adults, we get more inhibited with our questions, if we even ask them at all. I’ve learned we’re generally too cautious. People are dying to tell you their stories. Very often, no one has ever asked about them.
  • So when I first meet people, I tend to ask them where they grew up. People are at their best when talking about their childhoods. Or I ask where they got their names. That gets them talking about their families and ethnic backgrounds.
  • After you’ve established trust with a person, it’s great to ask 30,000-foot questions, ones that lift people out of their daily vantage points and help them see themselves from above.
  • These are questions like: What crossroads are you at? Most people are in the middle of some life transition; this question encourages them to step back and describe theirs
  • I’ve learned it’s best to resist this temptation. My first job in any conversation across difference or inequality is to stand in other people’s standpoint and fully understand how the world looks to them. I’ve found it’s best to ask other people three separate times and in three different ways about what they have just said. “I want to understand as much as possible. What am I missing here?”
  • Can you be yourself where you are and still fit in? And: What would you do if you weren’t afraid? Or: If you died today, what would you regret not doing?
  • “What have you said yes to that you no longer really believe in?
  • “What is the no, or refusal, you keep postponing?”
  • “What is the gift you currently hold in exile?,” meaning, what talent are you not using
  • “Why you?” Why was it you who started that business? Why was it you who ran for school board? She wants to understand why a person felt the call of responsibility. She wants to understand motivation.
  • “How do your ancestors show up in your life?” But it led to a great conversation in which each of us talked about how we’d been formed by our family heritages and cultures. I’ve come to think of questioning as a moral practice. When you’re asking good questions, you’re adopting a posture of humility, and you’re honoring the other person.
  • Stand in their standpoint
  • I used to feel the temptation to get defensive, to say: “You don’t know everything I’m dealing with. You don’t know that I’m one of the good guys here.”
  • If the next five years is a chapter in your life, what is the chapter about?
  • every conversation takes place on two levels
  • The official conversation is represented by the words we are saying on whatever topic we are talking about. The actual conversations occur amid the ebb and flow of emotions that get transmitted as we talk. With every comment I am showing you respect or disrespect, making you feel a little safer or a little more threatened.
  • If we let fear and a sense of threat build our conversation, then very quickly our motivations will deteriorate
  • If, on the other hand, I show persistent curiosity about your viewpoint, I show respect. And as the authors of “Crucial Conversations” observe, in any conversation, respect is like air. When it’s present nobody notices it, and when it’s absent it’s all anybody can think about.
  • the novelist and philosopher Iris Murdoch argued that the essential moral skill is being considerate to others in the complex circumstances of everyday life. Morality is about how we interact with each other minute by minute.
  • I used to think the wise person was a lofty sage who doled out life-altering advice in the manner of Yoda or Dumbledore or Solomon. But now I think the wise person’s essential gift is tender receptivity.
  • The illuminators offer the privilege of witness. They take the anecdotes, rationalizations and episodes we tell and see us in a noble struggle. They see the way we’re navigating the dialectics of life — intimacy versus independence, control versus freedom — and understand that our current selves are just where we are right now on our long continuum of growth.
  • The really good confidants — the people we go to when we are troubled — are more like coaches than philosopher kings.
  • They take in your story, accept it, but prod you to clarify what it is you really want, or to name the baggage you left out of your clean tale.
  • They’re not here to fix you; they are here simply to help you edit your story so that it’s more honest and accurate. They’re here to call you by name, as beloved
  • They see who you are becoming before you do and provide you with a reputation you can then go live into.
  • there has been a comprehensive shift in my posture. I think I’m more approachable, vulnerable. I know more about human psychology than I used to. I have a long way to go, but I’m evidence that people can change, sometimes dramatically, even in middle and older age.
Javier E

Do Scientists Regret Not Sticking to the Science? - WSJ - 0 views

  • In a preregistered large-sample controlled experiment, I randomly assigned participants to receive information about the endorsement of Joe Biden by the scientific journal Nature during the COVID-19 pandemic. The endorsement message caused large reductions in stated trust in Nature among Trump supporters. This distrust lowered the demand for COVID-related information provided by Nature, as evidenced by substantially reduced requests for Nature articles on vaccine efficacy when offered. The endorsement also reduced Trump supporters’ trust in scientists in general. The estimated effects on Biden supporters’ trust in Nature and scientists were positive, small and mostly statistically insignificant. I found little evidence that the endorsement changed views about Biden and Trump.
  • These results suggest that political endorsement by scientific journals can undermine and polarize public confidence in the endorsing journals and the scientific community.
  • ... scientists don’t have any special expertise on questions of values and policy. “Sticking to the science” keeps scientists speaking on issues precisely where they ought to be trusted by the public.
  • In the summer of 2020, “public-health experts” decided that racism is a public-health crisis comparable to the coronavirus pandemic. It was therefore, they claimed, within their purview to express public support for the Black Lives Matter protests following the murder of George Floyd and to argue that the benefits of such protests outweighed the increased risk of spreading the disease. Those supposed experts actually knew nothing about the likely effects of the protests. They made no concrete predictions about whether they would in any way ameliorate racism in America, just as Nature can make no concrete predictions about whether its political endorsements will actually help a preferred candidate without jeopardizing its other important goals. The political action was expressive, not evidence-based...
  • as is often the case, a debate which appears to be about the neutrality of institutions is not really about neutrality at all... Rather, it is about whether there is any room left for soberly weighing our goals and values and thinking in a measured way about the consequences of our actions rather than simply reacting to situations in an impulsive and expressive manner, broadcasting our views to the world so that people know where we stand.
  • Our goals and values might not be “neutral” at all, but they might still be best served by procedures, institutions, and even individuals that follow neutral principles.
Zack Lessner

Fiscal Ultimatum Fatigue - 0 views

  • Congress must act soon to come up with a concrete plan regarding spending cuts.
Javier E

Noam Chomsky on Where Artificial Intelligence Went Wrong - Yarden Katz - The Atlantic - 0 views

  • If you take a look at the progress of science, the sciences are kind of a continuum, but they're broken up into fields. The greatest progress is in the sciences that study the simplest systems. So take, say physics -- greatest progress there. But one of the reasons is that the physicists have an advantage that no other branch of sciences has. If something gets too complicated, they hand it to someone else.
  • If a molecule is too big, you give it to the chemists. The chemists, for them, if the molecule is too big or the system gets too big, you give it to the biologists. And if it gets too big for them, they give it to the psychologists, and finally it ends up in the hands of the literary critic, and so on.
  • neuroscience for the last couple hundred years has been on the wrong track. There's a fairly recent book by the very good cognitive neuroscientist Randy Gallistel, written with King, arguing -- in my view, plausibly -- that neuroscience developed kind of enthralled to associationism and related views of the way humans and animals work. And as a result they've been looking for things that have the properties of associationist psychology.
  • in general what he argues is that if you take a look at animal cognition, human too, it's computational systems. Therefore, you want to look the units of computation. Think about a Turing machine, say, which is the simplest form of computation, you have to find units that have properties like "read", "write" and "address." That's the minimal computational unit, so you got to look in the brain for those. You're never going to find them if you look for strengthening of synaptic connections or field properties, and so on. You've got to start by looking for what's there and what's working and you see that from Marr's highest level.
  • it's basically in the spirit of Marr's analysis. So when you're studying vision, he argues, you first ask what kind of computational tasks is the visual system carrying out. And then you look for an algorithm that might carry out those computations and finally you search for mechanisms of the kind that would make the algorithm work. Otherwise, you may never find anything.
  • "Good Old Fashioned AI," as it's labeled now, made strong use of formalisms in the tradition of Gottlob Frege and Bertrand Russell, mathematical logic for example, or derivatives of it, like nonmonotonic reasoning and so on. It's interesting from a history of science perspective that even very recently, these approaches have been almost wiped out from the mainstream and have been largely replaced -- in the field that calls itself AI now -- by probabilistic and statistical models. My question is, what do you think explains that shift and is it a step in the right direction?
  • AI and robotics got to the point where you could actually do things that were useful, so it turned to the practical applications and somewhat, maybe not abandoned, but put to the side, the more fundamental scientific questions, just caught up in the success of the technology and achieving specific goals.
  • The approximating unanalyzed data kind is sort of a new approach, not totally, there's things like it in the past. It's basically a new approach that has been accelerated by the existence of massive memories, very rapid processing, which enables you to do things like this that you couldn't have done by hand. But I think, myself, that it is leading subjects like computational cognitive science into a direction of maybe some practical applicability... ..in engineering? Chomsky: ...But away from understanding.
  • I was very skeptical about the original work. I thought it was first of all way too optimistic, it was assuming you could achieve things that required real understanding of systems that were barely understood, and you just can't get to that understanding by throwing a complicated machine at it.
  • if success is defined as getting a fair approximation to a mass of chaotic unanalyzed data, then it's way better to do it this way than to do it the way the physicists do, you know, no thought experiments about frictionless planes and so on and so forth. But you won't get the kind of understanding that the sciences have always been aimed at -- what you'll get at is an approximation to what's happening.
  • Suppose you want to predict tomorrow's weather. One way to do it is okay I'll get my statistical priors, if you like, there's a high probability that tomorrow's weather here will be the same as it was yesterday in Cleveland, so I'll stick that in, and where the sun is will have some effect, so I'll stick that in, and you get a bunch of assumptions like that, you run the experiment, you look at it over and over again, you correct it by Bayesian methods, you get better priors. You get a pretty good approximation of what tomorrow's weather is going to be. That's not what meteorologists do -- they want to understand how it's working. And these are just two different concepts of what success means, of what achievement is.
  • if you get more and more data, and better and better statistics, you can get a better and better approximation to some immense corpus of text, like everything in The Wall Street Journal archives -- but you learn nothing about the language.
  • the right approach, is to try to see if you can understand what the fundamental principles are that deal with the core properties, and recognize that in the actual usage, there's going to be a thousand other variables intervening -- kind of like what's happening outside the window, and you'll sort of tack those on later on if you want better approximations, that's a different approach.
  • take a concrete example of a new field in neuroscience, called Connectomics, where the goal is to find the wiring diagram of very complex organisms, find the connectivity of all the neurons in say human cerebral cortex, or mouse cortex. This approach was criticized by Sidney Brenner, who in many ways is [historically] one of the originators of the approach. Advocates of this field don't stop to ask if the wiring diagram is the right level of abstraction -- maybe it's not.
  • if you went to MIT in the 1960s, or now, it's completely different. No matter what engineering field you're in, you learn the same basic science and mathematics. And then maybe you learn a little bit about how to apply it. But that's a very different approach. And it resulted maybe from the fact that really for the first time in history, the basic sciences, like physics, had something really to tell engineers. And besides, technologies began to change very fast, so not very much point in learning the technologies of today if it's going to be different 10 years from now. So you have to learn the fundamental science that's going to be applicable to whatever comes along next. And the same thing pretty much happened in medicine.
  • that's the kind of transition from something like an art, that you learn how to practice -- an analog would be trying to match some data that you don't understand, in some fashion, maybe building something that will work -- to science, what happened in the modern period, roughly Galilean science.
  • it turns out that there actually are neural circuits which are reacting to particular kinds of rhythm, which happen to show up in language, like syllable length and so on. And there's some evidence that that's one of the first things that the infant brain is seeking -- rhythmic structures. And going back to Gallistel and Marr, its got some computational system inside which is saying "okay, here's what I do with these things" and say, by nine months, the typical infant has rejected -- eliminated from its repertoire -- the phonetic distinctions that aren't used in its own language.
  • people like Shimon Ullman discovered some pretty remarkable things like the rigidity principle. You're not going to find that by statistical analysis of data. But he did find it by carefully designed experiments. Then you look for the neurophysiology, and see if you can find something there that carries out these computations. I think it's the same in language, the same in studying our arithmetical capacity, planning, almost anything you look at. Just trying to deal with the unanalyzed chaotic data is unlikely to get you anywhere, just like as it wouldn't have gotten Galileo anywhere.
  • with regard to cognitive science, we're kind of pre-Galilean, just beginning to open up the subject
  • You can invent a world -- I don't think it's our world -- but you can invent a world in which nothing happens except random changes in objects and selection on the basis of external forces. I don't think that's the way our world works, I don't think it's the way any biologist thinks it is. There are all kinds of ways in which natural law imposes channels within which selection can take place, and some things can happen and other things don't happen. Plenty of things that go on in the biology in organisms aren't like this. So take the first step, meiosis. Why do cells split into spheres and not cubes? It's not random mutation and natural selection; it's a law of physics. There's no reason to think that laws of physics stop there, they work all the way through. [Interviewer: Well, they constrain the biology, sure.] Chomsky: Okay, well then it's not just random mutation and selection. It's random mutation, selection, and everything that matters, like laws of physics.
  • What I think is valuable is the history of science. I think we learn a lot of things from the history of science that can be very valuable to the emerging sciences. Particularly when we realize that in say, the emerging cognitive sciences, we really are in a kind of pre-Galilean stage. We don't know what we're looking for any more than Galileo did, and there's a lot to learn from that.
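The “minimal computational unit” Chomsky invokes via Gallistel — “read”, “write”, and “address” — can be made concrete with a toy Turing machine. This is an illustrative sketch only, not anything from the interview itself: the function names and the example program (a binary incrementer over a least-significant-bit-first tape) are invented for the demonstration.

```python
# A minimal Turing machine exposing the three primitives Chomsky cites
# (via Gallistel) as the smallest units of computation:
#   read a symbol, write a symbol, and address (move to) a tape position.
# The example program increments a binary number written LSB-first.

def run_turing_machine(tape, rules, state="start", halt="halt", blank="_"):
    """Execute `rules`: (state, symbol) -> (write, move, next_state)."""
    tape = dict(enumerate(tape))  # sparse tape, addressed by integer position
    head = 0
    while state != halt:
        symbol = tape.get(head, blank)             # read
        write, move, state = rules[(state, symbol)]
        tape[head] = write                         # write
        head += {"R": 1, "L": -1}[move]            # address (move the head)
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

# Increment a binary number stored least-significant-bit first:
# flip trailing 1s to 0s, then the first 0 (or blank cell) to 1.
increment = {
    ("start", "1"): ("0", "R", "start"),
    ("start", "0"): ("1", "R", "halt"),
    ("start", "_"): ("1", "R", "halt"),
}

print(run_turing_machine("111", increment))  # 7 -> 8, LSB-first: "0001"
```

Gallistel’s point, as Chomsky summarizes it, is that a search for mechanisms with these read/write/address properties is a different research program from a search for strengthened synaptic connections.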
oliviaodon

EFFECTIVE USE OF LANGUAGE - 0 views

  • To communicate effectively, it is not enough to have well organized ideas expressed in complete and coherent sentences and paragraphs. One must also think about the style, tone and clarity of his/her writing, and adapt these elements to the reading audience. Again, analyzing one's audience and purpose is the key to writing effectiveness. In order to choose the most effective language, the writer must consider the objective of the document, the context in which it is being written, and who will be reading it.
  • Effective language is: (1) concrete and specific, not vague and abstract; (2) concise, not verbose; (3) familiar, not obscure; (4) precise and clear, not inaccurate or ambiguous; (5) constructive, not destructive; and (6) appropriately formal.
  • This article is informative and extremely helpful if you are preparing for an essay or presentation!
Javier E

Covering politics in a "post-truth" America | Brookings Institution - 0 views

  • The media scandal of 2016 isn’t so much about what reporters failed to tell the American public; it’s about what they did report on, and the fact that it didn’t seem to matter.
  • Facebook and Snapchat and the other social media sites should rightfully be doing a lot of soul-searching about their role as the most efficient distribution network for conspiracy theories, hatred, and outright falsehoods ever invented.
  • I’ve been obsessively looking back over our coverage, too, trying to figure out what we missed along the way to the upset of the century
  • (An early conclusion: while we were late to understand how angry white voters were, a perhaps even more serious lapse was in failing to recognize how many disaffected Democrats there were who would stay home rather than support their party’s flawed candidate.)
  • Stories that would have killed any other politician—truly worrisome revelations about everything from the federal taxes Trump dodged to the charitable donations he lied about, the women he insulted and allegedly assaulted, and the mob ties that have long dogged him—did not stop Trump from thriving in this election year
  • the Oxford Dictionaries announced that “post-truth” had been chosen as the 2016 word of the year, defining it as a condition “in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.”
  • Meantime, Trump personally blacklisted news organizations like Politico and The Washington Post when they published articles he didn’t like during the campaign, has openly mused about rolling back press freedoms enshrined by the U.S. Supreme Court, and has now named Stephen Bannon, until recently the executive chairman of Breitbart—a right-wing fringe website with a penchant for conspiracy theories and anti-Semitic tropes—to serve as one of his top White House advisers.
  • none of this has any modern precedent. And what makes it unique has nothing to do with the outcome of the election. This time, the victor was a right-wing demagogue; next time, it may be a left-wing populist who learns the lessons of Trump’s win.
  • This is no mere academic argument. The election of 2016 showed us that Americans are increasingly choosing to live in a cloud of like-minded spin, surrounded by the partisan political hackery and fake news that poisons their Facebook feeds.
  • To help us understand it all, there were choices, but not that many: three TV networks that mattered, ABC, CBS, and NBC; two papers for serious journalism, The New York Times and The Washington Post; and two giant-circulation weekly newsmagazines, Time and Newsweek. That, plus whatever was your local daily newspaper, pretty much constituted the news.
  • Fake news is thriving In the final three months of the presidential campaign, the 20 top-performing fake election news stories generated more engagement on Facebook than the top stories from major news outlets such as The New York Times.
  • Eventually, I came to think of the major media outlets of that era as something very similar to the big suburban shopping malls we flocked to in the age of shoulder pads and supply-side economics: We could choose among Kmart and Macy’s and Saks Fifth Avenue as our budgets and tastes allowed, but in the end the media were all essentially department stores, selling us sports and stock tables and foreign news alongside our politics, whether we wanted them or not. It may not have been a monopoly, but it was something pretty close.
  • This was still journalism in the scarcity era, and it affected everything from what stories we wrote to how fast we could produce them. Presidents could launch global thermonuclear war with the Russians in a matter of minutes, but news from the American hinterlands often took weeks to reach their sleepy capital. Even information within that capital was virtually unobtainable without a major investment of time and effort. Want to know how much a campaign was raising and spending from the new special-interest PACs that had proliferated? Prepare to spend a day holed up at the Federal Election Commission’s headquarters down on E Street across from the hulking concrete FBI building, and be sure to bring a bunch of quarters for the copy machine.
  • I am writing this in the immediate, shocking aftermath of a 2016 presidential election in which the Pew Research Center found that a higher percentage of Americans got their information about the campaign from late-night TV comedy shows than from a national newspaper. Don Graham sold the Post three years ago and though its online audience has been skyrocketing with new investments from Amazon.com founder Jeff Bezos, it will never be what it was in the ‘80s. That same Pew survey reported that a mere 2 percent of Americans today turned to such newspapers as the “most helpful” guides to the presidential campaign.
  • In 2013, Mark Leibovich wrote a bestselling book called This Town about the party-hopping, lobbyist-enabling nexus between Washington journalists and the political world they cover. A key character was Politico’s Mike Allen, whose morning email newsletter “Playbook” had become a Washington ritual, offering all the news and tidbits a power player might want to read before breakfast—and Politico’s most successful ad franchise to boot. In many ways, even that world of just a few years ago now seems quaint: the notion that anyone could be a single, once-a-day town crier in This Town (or any other) has been utterly exploded by the move to Twitter, Facebook, and all the rest. We are living, as Mark put it to me recently, “in a 24-hour scrolling version of what ‘Playbook’ was.”
  • Whether it was Walter Cronkite or The New York Times, they preached journalistic “objectivity” and spoke with authority when they pronounced on the day’s developments—but not always with the depth and expertise that real competition or deep specialization might have provided. They were great—but they were generalists.
  • I remained convinced that reporting would hold its value, especially as our other advantages—like access to information and the expensive means to distribute it—dwindled. It was all well and good to root for your political team, but when it mattered to your business (or the country, for that matter), I reasoned, you wouldn’t want cheerleading but real reporting about real facts. Besides, the new tools might be coming at us with dizzying speed—remember when that radical new video app Meerkat was going to change absolutely everything about how we cover elections?—but we would still need reporters to find a way inside Washington’s closed doors and back rooms, to figure out what was happening when the cameras weren’t rolling.
  • And if the world was suffering from information overload—well, so much the better for us editors; we would be all the more needed to figure out what to listen to amid the noise.
  • Trump turned out to be more correct than we editors were: the more relevant point of the Access Hollywood tape was not about the censure Trump would now face but the political reality that he, like Bill Clinton, could survive this—or perhaps any scandal. Yes, we were wrong about the Access Hollywood tape, and so much else.
  • These days, Politico has a newsroom of 200-odd journalists, a glossy award-winning magazine, dozens of daily email newsletters, and 16 subscription policy verticals. It’s a major player in coverage not only of Capitol Hill but many other key parts of the capital, and some months during this election year we had well over 30 million unique visitors to our website, a far cry from the controlled congressional circulation of 35,000 that I remember Roll Call touting in our long-ago sales materials.
  • , we journalists were still able to cover the public theater of politics while spending more of our time, resources, and mental energy on really original reporting, on digging up stories you couldn’t read anywhere else. Between Trump’s long and checkered business past, his habit of serial lying, his voluminous and contradictory tweets, and his revision of even his own biography, there was lots to work with. No one can say that Trump was elected without the press telling us all about his checkered past.
  • politics was NEVER more choose-your-own-adventure than in 2016, when entire news ecosystems for partisans existed wholly outside the reach of those who at least aim for truth
  • Pew found that nearly 50 percent of self-described conservatives now rely on a single news source, Fox, for political information they trust.
  • As for the liberals, they trust only that they should never watch Fox, and have MSNBC and Media Matters and the remnants of the big boys to confirm their biases.
  • And then there are the conspiracy-peddling Breitbarts and the overtly fake-news outlets of this overwhelming new world; untethered from even the pretense of fact-based reporting, their version of the campaign got more traffic on Facebook in the race’s final weeks than all the traditional news outlets combined.
  • When we assigned a team of reporters at Politico during the primary season to listen to every single word of Trump’s speeches, we found that he offered a lie, half-truth, or outright exaggeration approximately once every five minutes—for an entire week. And it didn’t hinder him in the least from winning the Republican presidential nomination.
  • when we repeated the exercise this fall, in the midst of the general election campaign, Trump had progressed to fibs of various magnitudes just about once every three minutes!
  • By the time Trump in September issued his half-hearted disavowal of the Obama “birther” whopper he had done so much to create and perpetuate, one national survey found that only 1 in 4 Republicans was sure that Obama was born in the U.S., and various polls found that somewhere between a quarter and a half of Republicans believed he’s Muslim. So not only did Trump think he was entitled to his own facts, so did his supporters. It didn’t stop them at all from voting for him.
  • in part, it’s not just because they disagree with the facts as reporters have presented them but because there’s so damn many reporters, and from such a wide array of outlets, that it’s often impossible to evaluate their standards and practices, biases and preconceptions. Even we journalists are increasingly overwhelmed.
  • So much terrific reporting and writing and digging over the years and … Trump? What happened to consequences? Reporting that matters? Sunlight, they used to tell us, was the best disinfectant for what ails our politics.
  • 2016 suggests a different outcome: We’ve achieved a lot more transparency in today’s Washington—without the accountability that was supposed to come with it.
sissij

With Mass Protests, South Koreans Wield a Familiar Weapon in a New Era - The New York Times - 0 views

  • Then as now, mass protest was a powerful weapon deployed by enraged citizens who felt they had nowhere else to turn but the streets. Thirty years later, it’s clear how far Korean democracy has advanced. Then, South Korea was a dictatorship, protests were outlawed and the threat of torture, imprisonment and martial law ever-present.
  • Students have long been at the vanguard of South Korea’s robust history of protest, drawing on deep-rooted Confucian traditions that elevated scholars as guardians of morality.
  • Yonsei produced its own martyr, 21-year-old Lee Han-yol, who died after a tear-gas canister hit him in the head.
  • ...5 more annotations...
  • I saw an older woman, hair neatly coifed, beat a policeman with her handbag. A young father hoisted his little girl on his shoulder, carefully affixing a surgical mask to her face, an imperfect shield from the gas. A student in Kwangju bit his finger and wrote protest slogans in his own blood.
  • At the trial, his father lunged at the three policemen, small and scared now, protected by more than 50 guards. Screams broke out and a purse flew through the air at the judges as the light sentence was read. Women whose sons were still in jail stormed the bus carrying the policemen, throwing bottles against the windows. Plainclothes police shoved the four of them onto the concrete, where they lay unconscious.
  • Laws are still on the books that can be used as tools to stifle dissent. South Korea remains shadowed by legitimate fears of North Korean aggression and espionage, but those were and are exploited by the government.
  • This is not a tame society, for all the comforts its public has won in the years since. This may be the land of Psy and Gangnam style, a country so wired that some of its children are sent to boot camps to wean them from internet addiction.
  • Politicians buck the popular will at their peril.
  •  
    This article talks about the protests in South Korea. I found that there is deep conflict between the police and the people. It makes me think of the police brutality we talked about recently. I think the police should protect the people, but here they are using martial arts and tear-gas canisters. To what extent should the police yield to protests from the people? I also found it interesting that students are often more eager to attend those protests than adults. They are usually more radical than adults. Is it a good thing to always let out all their complaints? --Sissi (12/11/2016)
kirkpatrickry

How to get out of debt, according to behavioral economics. - 0 views

  • “Our emotions, which drive us so strongly, are inherently not rational,” says Dan Ariely, author of Predictably Irrational and a leader in the field, which examines the place where psychology and economics overlap. “If you think about an environment in which we have to think long-term and abstractly, that’s just not something we’re good at. Saving is about now versus later, it’s about concrete versus abstract, and we don’t do those well.”
  • One aspect of our nature that’s to blame is called present bias—the human tendency to emphasize now over later. “We get the money now, when we take a loan, and we pay it back some time in the future. This temporal component triggers a host of behavioral psychological effects,” says Oren Bar-Gill, a Harvard Law School professor who specializes in law and economics. “One of them is basic myopia. We think more about the present and less about the future, and so we’re more likely to take on debt, because we don’t put enough weight in our decision-making process on the future paybacks.”
  • although in theory you’d like to take care of yourself in the future, when it comes to how you live your life
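The present bias Bar-Gill describes is commonly modeled in behavioral economics as quasi-hyperbolic ("beta-delta") discounting. The sketch below is illustrative only — the function name and all parameter values are assumptions, not taken from the article — but it shows how a borrower who over-weights the present (beta < 1) can prefer a loan even when the total repayment exceeds the amount borrowed:

```python
def discounted_utility(flows, beta=0.7, delta=0.95):
    """Quasi-hyperbolic (beta-delta) discounting.

    flows[0] is utility now; each later period t is discounted by
    beta * delta**t, so the present is over-weighted relative to
    every future period whenever beta < 1.
    """
    return flows[0] + sum(beta * delta**t * u
                          for t, u in enumerate(flows[1:], start=1))

# Borrow 100 now, repay 40 in each of the next three periods (120 total).
loan = [100, -40, -40, -40]

# A present-biased borrower (beta = 0.7) values the loan positively
# and takes it; a time-consistent evaluator (beta = 1) would not.
print(discounted_utility(loan, beta=0.7))  # positive
print(discounted_utility(loan, beta=1.0))  # negative
```

The same flows, evaluated with and without the present-bias parameter, flip sign — which is exactly the "now versus later" failure Ariely describes.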
simoneveale

Why We Remember So Many Things Wrong - The New Yorker - 1 views

  • Two and a half years after the event, she remembered it as if it were yesterday: the TV, the terrible news, the call home. She could say with absolute certainty that that’s precisely how it happened. Except, it turns out, none of what she remembered was accurate.
  • Neisser became fascinated by the concept of flashbulb memories—the times when a shocking, emotional event seems to leave a particularly vivid imprint on the mind.
  • Nicole Harsch, handed out a questionnaire about the event to the hundred and six students in their ten o’clock psychology 101 class, “Personality Development.” Where were the students when they heard the news? Whom were they with? What were they doing? The professor and his assistant carefully filed the responses away.
  • ...16 more annotations...
  • two and a half years later, the questionnaire was given a second time to the same students.
  • It was then that R. T. recalled, with absolute confidence, her dorm-room experience.
  • She didn’t know any details of what had happened,
  • We don’t really remember an uneventful day the way that we remember a fight or a first kiss.
  • Her hope is to understand how, exactly, emotional memories behave at all stages of the remembering process: how we encode them, how we consolidate and store them, how we retrieve them.
  • When it comes to the central details of the event, like that the Challenger exploded, they are clearer and more accurate. But when it comes to peripheral details, they are worse. And our confidence in them, while almost always strong, is often misplaced.
  • Within the brain, memories are formed and consolidated largely due to the help of a small seahorse-like structure called the hippocampus; damage the hippocampus, and you damage the ability to form lasting recollections.
  • A key element of emotional-memory formation is the direct line of communication between the amygdala and the visual cortex.
  • Phelps has combined Neisser’s experiential approach with the neuroscience of emotional memory to explore how such memories work, and why they work the way they do.
  • Memory for the emotional scenes was significantly higher, and the vividness of the recollection was significantly greater.
  • That is, if you were shocked when you saw animals, your memory of the earlier animals was also enhanced. And, more important, the effect only emerged after six or twenty-four hours: the memory needed time to consolidate.
  • So, if memory for events is strengthened at emotional times, why does everyone forget what they were doing when the Challenger exploded?
  • The strength of the central memory seems to make us confident of all of the details when we should only be confident of a few.
  • Our misplaced confidence in recalling dramatic events is troubling when we need to rely on a memory for something important—evidence in court, for instance
  • After reviewing the evidence, the committee made several concrete suggestions to changes in current procedures, including “blinded” eyewitness identification
  • standardized instructions to witnesses, along with extensive police training in vision and memory research as it relates to eyewitness testimony, videotaped identification, expert testimony early on in trials about the issues surrounding eyewitness reliability, and early and clear jury instruction on any prior identifications
Javier E

Choose to Be Grateful. It Will Make You Happier. - The New York Times - 2 views

  • Building the best life does not require fealty to feelings in the name of authenticity, but rather rebelling against negative impulses and acting right even when we don’t feel like it. In a nutshell, acting grateful can actually make you grateful.
  • some people are just naturally more grateful than others. A 2014 article in the journal Social Cognitive and Affective Neuroscience identified a variation in a gene (CD38) associated with gratitude. Some people simply have a heightened genetic tendency to experience, in the researchers’ words, “global relationship satisfaction, perceived partner responsiveness and positive emotions (particularly love).” That is, those relentlessly positive people you know who seem grateful all the time may simply be mutants.
  • Evidence suggests that we can actively choose to practice gratitude — and that doing so raises our happiness.
  • ...11 more annotations...
  • , researchers in one 2003 study randomly assigned one group of study participants to keep a short weekly list of the things they were grateful for, while other groups listed hassles or neutral events. Ten weeks later, the first group enjoyed significantly greater life satisfaction than the others
  • acting happy, regardless of feelings, coaxes one’s brain into processing positive emotions. In one famous 1993 experiment, researchers asked human subjects to smile forcibly for 20 seconds while tensing facial muscles, notably the muscles around the eyes called the orbicularis oculi (which create “crow’s feet”). They found that this action stimulated brain activity associated with positive emotions.
  • gratitude stimulates the hypothalamus (a key part of the brain that regulates stress) and the ventral tegmental area (part of our “reward circuitry” that produces the sensation of pleasure).
  • In the slightly more elegant language of the Stoic philosopher Epictetus, “He is a man of sense who does not grieve for what he has not, but rejoices in what he has.”
  • In addition to building our own happiness, choosing gratitude can also bring out the best in those around us
  • when their competence was questioned, the subjects tended to lash out with aggression and personal denigration. When shown gratitude, however, they reduced the bad behavior. That is, the best way to disarm an angry interlocutor is with a warm “thank you.”
  • A new study in the Journal of Consumer Psychology finds evidence that people begin to crave sweets when they are asked to express gratitude.
  • There are concrete strategies that each of us can adopt. First, start with “interior gratitude,” the practice of giving thanks privately
  • he recommends that readers systematically express gratitude in letters to loved ones and colleagues. A disciplined way to put this into practice is to make it as routine as morning coffee. Write two short emails each morning to friends, family or colleagues, thanking them for what they do.
  • Finally, be grateful for useless things
  • think of the small, useless things you experience — the smell of fall in the air, the fragment of a song that reminds you of when you were a kid. Give thanks.
Javier E

How Did Consciousness Evolve? - The Atlantic - 0 views

  • Theories of consciousness come from religion, from philosophy, from cognitive science, but not so much from evolutionary biology. Maybe that’s why so few theories have been able to tackle basic questions such as: What is the adaptive value of consciousness? When did it evolve and what animals have it?
  • The Attention Schema Theory (AST), developed over the past five years, may be able to answer those questions.
  • The theory suggests that consciousness arises as a solution to one of the most fundamental problems facing any nervous system: Too much information constantly flows in to be fully processed. The brain evolved increasingly sophisticated mechanisms for deeply processing a few select signals at the expense of others, and in the AST, consciousness is the ultimate result of that evolutionary sequence
  • ...23 more annotations...
  • Even before the evolution of a central brain, nervous systems took advantage of a simple computing trick: competition.
  • It coordinates something called overt attention – aiming the satellite dishes of the eyes, ears, and nose toward anything important.
  • Selective enhancement therefore probably evolved sometime between hydras and arthropods—between about 700 and 600 million years ago, close to the beginning of complex, multicellular life
  • The next evolutionary advance was a centralized controller for attention that could coordinate among all senses. In many animals, that central controller is a brain area called the tectum
  • At any moment only a few neurons win that intense competition, their signals rising up above the noise and impacting the animal’s behavior. This process is called selective signal enhancement, and without it, a nervous system can do almost nothing.
  • All vertebrates—fish, reptiles, birds, and mammals—have a tectum. Even lampreys have one, and they appeared so early in evolution that they don’t even have a lower jaw. But as far as anyone knows, the tectum is absent from all invertebrates
  • According to fossil and genetic evidence, vertebrates evolved around 520 million years ago. The tectum and the central control of attention probably evolved around then, during the so-called Cambrian Explosion when vertebrates were tiny wriggling creatures competing with a vast range of invertebrates in the sea.
  • The tectum is a beautiful piece of engineering. To control the head and the eyes efficiently, it constructs something called an internal model, a feature well known to engineers. An internal model is a simulation that keeps track of whatever is being controlled and allows for predictions and planning.
  • The tectum’s internal model is a set of information encoded in the complex pattern of activity of the neurons. That information simulates the current state of the eyes, head, and other major body parts, making predictions about how these body parts will move next and about the consequences of their movement
  • In fish and amphibians, the tectum is the pinnacle of sophistication and the largest part of the brain. A frog has a pretty good simulation of itself.
  • With the evolution of reptiles around 350 to 300 million years ago, a new brain structure began to emerge – the wulst. Birds inherited a wulst from their reptile ancestors. Mammals did too, but our version is usually called the cerebral cortex and has expanded enormously
  • The cortex also takes in sensory signals and coordinates movement, but it has a more flexible repertoire. Depending on context, you might look toward, look away, make a sound, do a dance, or simply store the sensory event in memory in case the information is useful for the future.
  • The most important difference between the cortex and the tectum may be the kind of attention they control. The tectum is the master of overt attention—pointing the sensory apparatus toward anything important. The cortex ups the ante with something called covert attention. You don’t need to look directly at something to covertly attend to it. Even if you’ve turned your back on an object, your cortex can still focus its processing resources on it
  • The cortex needs to control that virtual movement, and therefore like any efficient controller it needs an internal model. Unlike the tectum, which models concrete objects like the eyes and the head, the cortex must model something much more abstract. According to the AST, it does so by constructing an attention schema—a constantly updated set of information that describes what covert attention is doing moment-by-moment and what its consequences are
  • Covert attention isn’t intangible. It has a physical basis, but that physical basis lies in the microscopic details of neurons, synapses, and signals. The brain has no need to know those details. The attention schema is therefore strategically vague. It depicts covert attention in a physically incoherent way, as a non-physical essence
  • this, according to the theory, is the origin of consciousness. We say we have consciousness because deep in the brain, something quite primitive is computing that semi-magical self-description.
  • I’m reminded of Teddy Roosevelt’s famous quote, “Do what you can with what you have where you are.” Evolution is the master of that kind of opportunism. Fins become feet. Gill arches become jaws. And self-models become models of others. In the AST, the attention schema first evolved as a model of one’s own covert attention. But once the basic mechanism was in place, according to the theory, it was further adapted to model the attentional states of others, to allow for social prediction. Not only could the brain attribute consciousness to itself, it began to attribute consciousness to others.
  • In the AST’s evolutionary story, social cognition begins to ramp up shortly after the reptilian wulst evolved. Crocodiles may not be the most socially complex creatures on earth, but they live in large communities, care for their young, and can make loyal if somewhat dangerous pets.
  • If AST is correct, 300 million years of reptilian, avian, and mammalian evolution have allowed the self-model and the social model to evolve in tandem, each influencing the other. We understand other people by projecting ourselves onto them. But we also understand ourselves by considering the way other people might see us.
  • the cortical networks in the human brain that allow us to attribute consciousness to others overlap extensively with the networks that construct our own sense of consciousness.
  • Language is perhaps the most recent big leap in the evolution of consciousness. Nobody knows when human language first evolved. Certainly we had it by 70 thousand years ago when people began to disperse around the world, since all dispersed groups have a sophisticated language. The relationship between language and consciousness is often debated, but we can be sure of at least this much: once we developed language, we could talk about consciousness and compare notes
  • Maybe partly because of language and culture, humans have a hair-trigger tendency to attribute consciousness to everything around us. We attribute consciousness to characters in a story, puppets and dolls, storms, rivers, empty spaces, ghosts and gods. Justin Barrett called it the Hyperactive Agency Detection Device, or HADD
  • the HADD goes way beyond detecting predators. It’s a consequence of our hyper-social nature. Evolution turned up the amplitude on our tendency to model others and now we’re supremely attuned to each other’s mind states. It gives us our adaptive edge. The inevitable side effect is the detection of false positives, or ghosts.
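The "selective signal enhancement" the theory describes — many signals competing, a few winners rising above the noise — can be sketched as a simple winner-take-all competition with lateral inhibition. The update rule and all numbers below are an illustrative assumption, not drawn from the AST literature:

```python
def winner_take_all(signals, inhibition=0.6, rate=0.1, steps=50):
    """Iterative competition among signals.

    Each signal is boosted in proportion to its own strength and
    suppressed in proportion to the combined strength of its rivals.
    Activity is clipped to [0, 1], like a bounded firing rate: the
    strongest input saturates, the rest decay to zero.
    """
    a = list(signals)
    for _ in range(steps):
        total = sum(a)
        a = [min(1.0, max(0.0, x + rate * (x - inhibition * (total - x))))
             for x in a]
    return a

# Five competing sensory signals; only the strongest survives.
final = winner_take_all([0.20, 0.35, 0.90, 0.40, 0.10])
print(final)  # the 0.90 channel saturates; the others are suppressed
```

Without some competition rule like this, every signal would propagate equally — the "too much information" problem the theory starts from.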
nataliedepaulo1

Companies are already lining up to build the U.S.-Mexico border wall - Mar. 2, 2017 - 0 views

  • Businesses will be asked to submit their proposals to design and build prototype wall structures near the United States border with Mexico starting next week, according to the Department of Homeland Security's Customs and Border Protection agency.
  • Trump has said that the entire wall will cost $10 billion, citing an estimate that he received during the campaign from the National Precast Concrete Association. But other estimates have put the cost at as much as $25 billion, according to a report from Bernstein Research, which tracks materials costs.
Javier E

New Statesman - All machine and no ghost? - 0 views

  • More subtly, there are many who insist that consciousness just reduces to brain states - a pang of regret, say, is just a surge of chemicals across a synapse. They are collapsers rather than deniers. Though not avowedly eliminative, this kind of view is tacitly a rejection of the very existence of consciousness
  • it occurred to me that the problem might lie not in nature but in ourselves: we just don't have the faculties of comprehension that would enable us to remove the sense of mystery. Ontologically, matter and consciousness are woven intelligibly together but epistemologically we are precluded from seeing how. I used Noam Chomsky's notion of "mysteries of nature" to describe the situation as I saw it. Soon, I was being labelled (by Owen Flanagan) a "mysterian"
  • Dualism makes the mind too separate, thereby precluding intelligible interaction and dependence.
  • ...11 more annotations...
  • At this point the idealist swooshes in: ladies and gentlemen, there is nothing but mind! There is no problem of interaction with matter because matter is mere illusion
  • idealism has its charms but taking it seriously requires an antipathy to matter bordering on the maniacal. Are we to suppose that material reality is just a dream, a baseless fantasy, and that the Big Bang was nothing but the cosmic spirit having a mental sneezing fit?
  • panpsychism: even the lowliest of material things has a streak of sentience running through it, like veins in marble. Not just parcels of organic matter, such as lizards and worms, but also plants and bacteria and water molecules and even electrons. Everything has its primitive feelings and minute allotment of sensation.
  • The trouble with panpsychism is that there just isn't any evidence of the universal distribution of consciousness in the material world.
  • The dualist, by contrast, freely admits that consciousness exists, as well as matter, holding that reality falls into two giant spheres. There is the physical brain, on the one hand, and the conscious mind, on the other: the twain may meet at some point but they remain distinct entities.
  • The more we know of the brain, the less it looks like a device for creating consciousness: it's just a big collection of biological cells and a blur of electrical activity - all machine and no ghost.
  • mystery is quite pervasive, even in the hardest of sciences. Physics is a hotbed of mystery: space, time, matter and motion - none of it is free of mysterious elements. The puzzles of quantum theory are just a symptom of this widespread lack of understanding
  • The human intellect grasps the natural world obliquely and glancingly, using mathematics to construct abstract representations of concrete phenomena, but what the ultimate nature of things really is remains obscure and hidden. How everything fits together is particularly elusive, perhaps reflecting the disparate cognitive faculties we bring to bear on the world (the senses, introspection, mathematical description). We are far from obtaining a unified theory of all being and there is no guarantee that such a theory is accessible by finite human intelligence.
  • real naturalism begins with a proper perspective on our specifically human intelligence. Palaeoanthropologists have taught us that the human brain gradually evolved from ancestral brains, particularly in concert with practical toolmaking, centring on the anatomy of the human hand. This history shaped and constrained the form of intelligence now housed in our skulls (as the lifestyle of other species form their set of cognitive skills). What chance is there that an intelligence geared to making stone tools and grounded in the contingent peculiarities of the human hand can aspire to uncover all the mysteries of the universe? Can omniscience spring from an opposable thumb? It seems unlikely, so why presume that the mysteries of consciousness will be revealed to a thumb-shaped brain like ours?
  • The "mysterianism" I advocate is really nothing more than the acknowledgment that human intelligence is a local, contingent, temporal, practical and expendable feature of life on earth - an incremental adaptation based on earlier forms of intelligence that no one would reg
  • rd as faintly omniscient. The current state of the philosophy of mind, from my point of view, is just a reflection of one evolutionary time-slice of a particular bipedal species on a particular humid planet at this fleeting moment in cosmic history - as is everything else about the human animal. There is more ignorance in it than knowledge.