
TOK Friends: Group items tagged "alien"


Javier E

Losing Our Touch - NYTimes.com

  • Are we losing our senses? In our increasingly virtual world, are we losing touch with the sense of touch itself? And if so, so what?
  • Tactility is not blind immediacy — not merely sensorial but cognitive, too. Savoring is wisdom; in Latin, wisdom is “sapientia,” from “sapere,” to taste. These carnal senses make us human by keeping us in touch with things, by responding to people’s pain
  • But Aristotle did not win this battle of ideas. The Platonists prevailed and the Western universe became a system governed by “the soul’s eye.” Sight came to dominate the hierarchy of the senses, and was quickly deemed the appropriate ally of theoretical ideas.
  • Western philosophy thus sprang from a dualism between the intellectual senses, crowned by sight, and the lower “animal” senses, stigmatized by touch.
  • opto-centrism prevailed for over 2,000 years, culminating in our contemporary culture of digital simulation and spectacle. The eye continues to rule in what Roland Barthes once called our “civilization of the image.” The world is no longer our oyster, but our screen.
  • our current technology is arguably exacerbating our carnal alienation. While offering us enormous freedoms of fantasy and encounter, digital eros may also be removing us further from the flesh
  • The move toward excarnation is apparent in what is becoming more and more a fleshless society. In medicine, “bedside manner” and hand on pulse has ceded to the anonymous technologies of imaging in diagnosis and treatment. In war, hand-to-hand combat has been replaced by “targeted killing” via remote-controlled drones.
  • certain cyber engineers now envisage implanting transmission codes in brains so that we will not have to move a finger — or come into contact with another human being — to get what we want.
  • We need to return from head to foot, from brain to fingertip, from iCloud to earth. To close the distance, so that eros is more about proximity than proxy. So that soul becomes flesh, where it belongs. Such a move, I submit, would radically alter our “sense” of sex in our digital civilization. It would enhance the role of empathy, vulnerability and sensitivity in the art of carnal love, and ideally, in all of human relations. Because to love or be loved truly is to be able to say, “I have been touched.”
Javier E

The Amazing Trump-Wingnut Policy Conveyor Belt

  • Over the course of just a few days Donald Trump has gone from saying that we might have to close down mosques and create a Muslim registry to saying that not only will we do this but we have to do it and anything less is an utter capitulation.
  • In other words, rapidly evolving from refusing to rule out a draconian policy to affirmatively endorsing it to being its leading advocate.
  • As for his Muslim ID card and database, on Wednesday he said he wouldn't rule out creating such a system. By the end of the day he was telling NBC News he would "absolutely" create such a system.
  • just as we saw in the summer with immigration writ large, the progression doesn't end with Trump. We've had three presidential elections since the 9/11 terror attacks and no presidential candidate has ever proposed shutting down mosques in the United States or creating a special registry and identification cards for Muslims living in the United States.
  • So yesterday Megyn Kelly asked Marco Rubio whether he'd shut down radical mosques like Trump. He tried to deflect the question by saying that it wasn't about mosques but closing down any facility that was promoting radicalism. In other words, Rubio, while clearly not eager to answer the question, pointedly refused to rule out following Trump's lead.
  • It is a very good example of how Trump is not only shaping the debate on the right but rapidly mainstreaming ideas that were as recently as a week ago considered entirely outside the realm of mainstream political discourse.
  • It's particularly effective with the less sophisticated and principled candidates like Rubio. Jeb Bush said flatly this morning that Trump's database proposal is "just wrong." But Ben Carson quickly took Trump's lead comparing Syrian refugees to "mad dogs." The difference is that Marco Rubio could very well be president in 18 months. Jeb Bush won't be.
  • this is no longer a matter of Trump yakking on about building a gilded 100-foot wall along the southern border and having Mexico agree to pay for it. Trump is now proposing things that sound like they put millions of American citizens and resident aliens on a road to something like the internment of Japanese Americans during World War II.
Javier E

Was There a Civilization On Earth Before Humans? - The Atlantic

  • When it comes to direct evidence of an industrial civilization—things like cities, factories, and roads—the geologic record doesn’t go back past what’s called the Quaternary period 2.6 million years ago
  • if we’re going back this far, we’re not talking about human civilizations anymore. Homo sapiens didn’t make their appearance on the planet until just 300,000 years or so ago. That means the question shifts to other species, which is why Gavin Schmidt called the idea the Silurian hypothesis
  • could researchers find clear evidence that an ancient species built a relatively short-lived industrial civilization long before our own? Perhaps, for example, some early mammal rose briefly to civilization building during the Paleocene epoch about 60 million years ago. There are fossils, of course. But the fraction of life that gets fossilized is always minuscule and varies a lot depending on time and habitat. It would be easy, therefore, to miss an industrial civilization that only lasted 100,000 years—which would be 500 times longer than our industrial civilization has made it so far.
  • Given that all direct evidence would be long gone after many millions of years, what kinds of evidence might then still exist? The best way to answer this question is to figure out what evidence we’d leave behind if human civilization collapsed at its current stage of development.
  • Now that our industrial civilization has truly gone global, humanity’s collective activity is laying down a variety of traces that will be detectable by scientists 100 million years in the future. The extensive use of fertilizer, for example
  • And then there’s all that plastic. Studies have shown increasing amounts of plastic “marine litter” are being deposited on the seafloor everywhere from coastal areas to deep basins and even in the Arctic. Wind, sun, and waves grind down large-scale plastic artifacts, leaving the seas full of microscopic plastic particles that will eventually rain down on the ocean floor, creating a layer that could persist for geological timescales.
  • Likewise, our relentless hunger for the rare-earth elements used in electronic gizmos. Far more of these atoms are now wandering around the planet’s surface because of us than would otherwise be the case. They might show up in future sediments, too.
  • Once you realize, through climate change, that you need to find lower-impact energy sources, you begin to leave less of a mark. So the more sustainable your civilization becomes, the smaller the signal you’ll leave for future generations.
  • The more fossil fuels we burn, the more the balance of these carbon isotopes shifts. Atmospheric scientists call this shift the Suess effect, and the change in isotopic ratios of carbon due to fossil-fuel use is easy to see over the last century. Increases in temperature also leave isotopic signals. These shifts should be apparent to any future scientist who chemically analyzes exposed layers of rock from our era. Along with these spikes, this Anthropocene layer might also hold brief peaks in nitrogen, plastic nanoparticles, and even synthetic steroids
  • Fifty-six million years ago, Earth passed through the Paleocene-Eocene Thermal Maximum (PETM). During the PETM, the planet’s average temperature climbed as high as 15 degrees Fahrenheit above what we experience today. It was a world almost without ice, as typical summer temperatures at the poles reached close to a balmy 70 degrees Fahrenheit.
  • While there is evidence that the PETM may have been driven by a massive release of buried fossil carbon into the air, it’s the timescale of these changes that matter. The PETM’s isotope spikes rise and fall over a few hundred thousand years. But what makes the Anthropocene so remarkable in terms of Earth’s history is the speed at which we’re dumping fossil carbon into the atmosphere. There have been geological periods where Earth’s CO2 has been as high or higher than today, but never before in the planet’s multibillion-year history has so much buried carbon been dumped back into the atmosphere so quickly
  • So the isotopic spikes we do see in the geologic record may not be spiky enough to fit the Silurian hypothesis’s bill.
  • Ironically, however, the most promising marker of humanity’s presence as an advanced civilization is a by-product of one activity that may threaten it most.
  • “How do you know we’re the only time there’s been a civilization on our own planet?”
Javier E

At the Existentialist Café: Freedom, Being, and Apricot Cocktails with Jean-P...

  • The phenomenologists’ leading thinker, Edmund Husserl, provided a rallying cry, ‘To the things themselves!’ It meant: don’t waste time on the interpretations that accrue upon things, and especially don’t waste time wondering whether the things are real. Just look at this that’s presenting itself to you, whatever this may be, and describe it as precisely as possible.
  • You might think you have defined me by some label, but you are wrong, for I am always a work in progress. I create myself constantly through action, and this is so fundamental to my human condition that, for Sartre, it is the human condition, from the moment of first consciousness to the moment when death wipes it out. I am my own freedom: no more, no less.
  • Sartre wrote like a novelist — not surprisingly, since he was one. In his novels, short stories and plays as well as in his philosophical treatises, he wrote about the physical sensations of the world and the structures and moods of human life. Above all, he wrote about one big subject: what it meant to be free. Freedom, for him, lay at the heart of all human experience, and this set humans apart from all other kinds of object.
  • Sartre listened to his problem and said simply, ‘You are free, therefore choose — that is to say, invent.’ No signs are vouchsafed in this world, he said. None of the old authorities can relieve you of the burden of freedom. You can weigh up moral or practical considerations as carefully as you like, but ultimately you must take the plunge and do something, and it’s up to you what that something is.
  • Even if the situation is unbearable — perhaps you are facing execution, or sitting in a Gestapo prison, or about to fall off a cliff — you are still free to decide what to make of it in mind and deed. Starting from where you are now, you choose. And in choosing, you also choose who you will be.
  • The war had made people realise that they and their fellow humans were capable of departing entirely from civilised norms; no wonder the idea of a fixed human nature seemed questionable.
  • If this sounds difficult and unnerving, it’s because it is. Sartre does not deny that the need to keep making decisions brings constant anxiety. He heightens this anxiety by pointing out that what you do really matters. You should make your choices as though you were choosing on behalf of the whole of humanity, taking the entire burden of responsibility for how the human race behaves. If you avoid this responsibility by fooling yourself that you are the victim of circumstance or of someone else’s bad advice, you are failing to meet the demands of human life and choosing a fake existence, cut off from your own ‘authenticity’.
  • Along with the terrifying side of this comes a great promise: Sartre’s existentialism implies that it is possible to be authentic and free, as long as you keep up the effort.
  • almost all agreed that it was, as an article in Les nouvelles littéraires phrased it, a ‘sickening mixture of philosophic pretentiousness, equivocal dreams, physiological technicalities, morbid tastes and hesitant eroticism … an introspective embryo that one would take distinct pleasure in crushing’.
  • he offered a philosophy designed for a species that had just scared the hell out of itself, but that finally felt ready to grow up and take responsibility.
  • In this rebellious world, just as with the Parisian bohemians and Dadaists in earlier generations, everything that was dangerous and provocative was good, and everything that was nice or bourgeois was bad.
  • Such interweaving of ideas and life had a long pedigree, although the existentialists gave it a new twist. Stoic and Epicurean thinkers in the classical world had practised philosophy as a means of living well, rather than of seeking knowledge or wisdom for their own sake. By reflecting on life’s vagaries in philosophical ways, they believed they could become more resilient, more able to rise above circumstances, and better equipped to manage grief, fear, anger, disappointment or anxiety.
  • In the tradition they passed on, philosophy is neither a pure intellectual pursuit nor a collection of cheap self-help tricks, but a discipline for flourishing and living a fully human, responsible life.
  • For Kierkegaard, Descartes had things back to front. In his own view, human existence comes first: it is the starting point for everything we do, not the result of a logical deduction. My existence is active: I live it and choose it, and this precedes any statement I can make about myself.
  • Studying our own moral genealogy cannot help us to escape or transcend ourselves. But it can enable us to see our illusions more clearly and lead a more vital, assertive existence.
  • What was needed, he felt, was not high moral or theological ideals, but a deeply critical form of cultural history or ‘genealogy’ that would uncover the reasons why we humans are as we are, and how we came to be that way. For him, all philosophy could even be redefined as a form of psychology, or history.
  • For those oppressed on grounds of race or class, or for those fighting against colonialism, existentialism offered a change of perspective — literally, as Sartre proposed that all situations be judged according to how they appeared in the eyes of those most oppressed, or those whose suffering was greatest.
  • She observed that we need not expect moral philosophers to ‘live by’ their ideas in a simplistic way, as if they were following a set of rules. But we can expect them to show how their ideas are lived in. We should be able to look in through the windows of a philosophy, as it were, and see how people occupy it, how they move about and how they conduct themselves.
  • the existentialists inhabited their historical and personal world, as they inhabited their ideas. This notion of ‘inhabited philosophy’ is one I’ve borrowed from the English philosopher and novelist Iris Murdoch, who wrote the first full-length book on Sartre and was an early adopter of existentialism
  • What is existentialism anyway?
  • An existentialist who is also phenomenological provides no easy rules for dealing with this condition, but instead concentrates on describing lived experience as it presents itself. — By describing experience well, he or she hopes to understand this existence and awaken us to ways of living more authentic lives.
  • Existentialists concern themselves with individual, concrete human existence. — They consider human existence different from the kind of being other things have. Other entities are what they are, but as a human I am whatever I choose to make of myself at every moment. I am free — — and therefore I’m responsible for everything I do, a dizzying fact which causes — an anxiety inseparable from human existence itself.
  • On the other hand, I am only free within situations, which can include factors in my own biology and psychology as well as physical, historical and social variables of the world into which I have been thrown. — Despite the limitations, I always want more: I am passionately involved in personal projects of all kinds. — Human existence is thus ambiguous: at once boxed in by borders and yet transcendent and exhilarating. —
  • The first part of this is straightforward: a phenomenologist’s job is to describe. This is the activity that Husserl kept reminding his students to do. It meant stripping away distractions, habits, clichés of thought, presumptions and received ideas, in order to return our attention to what he called the ‘things themselves’. We must fix our beady gaze on them and capture them exactly as they appear, rather than as we think they are supposed to be.
  • Husserl therefore says that, to phenomenologically describe a cup of coffee, I should set aside both the abstract suppositions and any intrusive emotional associations. Then I can concentrate on the dark, fragrant, rich phenomenon in front of me now. This ‘setting aside’ or ‘bracketing out’ of speculative add-ons Husserl called epoché — a term borrowed from the ancient Sceptics,
  • The point about rigour is crucial; it brings us back to the first half of the command to describe phenomena. A phenomenologist cannot get away with listening to a piece of music and saying, ‘How lovely!’ He or she must ask: is it plaintive? is it dignified? is it colossal and sublime? The point is to keep coming back to the ‘things themselves’ — phenomena stripped of their conceptual baggage — so as to bail out weak or extraneous material and get to the heart of the experience.
  • Husserlian ‘bracketing out’ or epoché allows the phenomenologist to temporarily ignore the question ‘But is it real?’, in order to ask how a person experiences his or her world. Phenomenology gives a formal mode of access to human experience. It lets philosophers talk about life more or less as non-philosophers do, while still being able to tell themselves they are being methodical and rigorous.
  • Besides claiming to transform the way we think about reality, phenomenologists promised to change how we think about ourselves. They believed that we should not try to find out what the human mind is, as if it were some kind of substance. Instead, we should consider what it does, and how it grasps its experiences.
  • For Brentano, this reaching towards objects is what our minds do all the time. Our thoughts are invariably of or about something, he wrote: in love, something is loved, in hatred, something is hated, in judgement, something is affirmed or denied. Even when I imagine an object that isn’t there, my mental structure is still one of ‘about-ness’ or ‘of-ness’.
  • Except in deepest sleep, my mind is always engaged in this aboutness: it has ‘intentionality’. Having taken the germ of this from Brentano, Husserl made it central to his whole philosophy.
  • Husserl saw in the idea of intentionality a way to sidestep two great unsolved puzzles of philosophical history: the question of what objects ‘really’ are, and the question of what the mind ‘really’ is. By doing the epoché and bracketing out all consideration of reality from both topics, one is freed to concentrate on the relationship in the middle. One can apply one’s descriptive energies to the endless dance of intentionality that takes place in our lives: the whirl of our minds as they seize their intended phenomena one after the other and whisk them around the floor,
  • Understood in this way, the mind hardly is anything at all: it is its aboutness. This makes the human mind (and possibly some animal minds) different from any other naturally occurring entity. Nothing else can be as thoroughly about or of things as the mind is:
  • Some Eastern meditation techniques aim to still this scurrying creature, but the extreme difficulty of this shows how unnatural it is to be mentally inert. Left to itself, the mind reaches out in all directions as long as it is awake — and even carries on doing it in the dreaming phase of its sleep.
  • a mind that is experiencing nothing, imagining nothing, or speculating about nothing can hardly be said to be a mind at all.
  • Three simple ideas — description, phenomenon, intentionality — provided enough inspiration to keep roomfuls of Husserlian assistants busy in Freiburg for decades. With all of human existence awaiting their attention, how could they ever run out of things to do?
  • For Sartre, this gives the mind an immense freedom. If we are nothing but what we think about, then no predefined ‘inner nature’ can hold us back. We are protean.
  • Real, not real; inside, outside; what difference did it make? Reflecting on this, Husserl began turning his phenomenology into a branch of ‘idealism’ — the philosophical tradition which denied external reality and defined everything as a kind of private hallucination.
  • For Sartre, if we try to shut ourselves up inside our own minds, ‘in a nice warm room with the shutters closed’, we cease to exist. We have no cosy home: being out on the dusty road is the very definition of what we are.
  • One might think that, if Heidegger had anything worth saying, he could have communicated it in ordinary language. The fact is that he does not want to be ordinary, and he may not even want to communicate in the usual sense. He wants to make the familiar obscure, and to vex us. George Steiner thought that Heidegger’s purpose was less to be understood than to be experienced through a ‘felt strangeness’.
  • He takes Dasein in its most ordinary moments, then talks about it in the most innovative way he can. For Heidegger, Dasein’s everyday Being is right here: it is Being-in-the-world, or In-der-Welt-sein. The main feature of Dasein’s everyday Being-in-the-world right here is that it is usually busy doing something.
  • Thus, for Heidegger, all Being-in-the-world is also a ‘Being-with’ or Mitsein. We cohabit with others in a ‘with-world’, or Mitwelt. The old philosophical problem of how we prove the existence of other minds has now vanished. Dasein swims in the with-world long before it wonders about other minds.
  • Sometimes the best-educated people were those least inclined to take the Nazis seriously, dismissing them as too absurd to last. Karl Jaspers was one of those who made this mistake, as he later recalled, and Beauvoir observed similar dismissive attitudes among the French students in Berlin.
  • In any case, most of those who disagreed with Hitler’s ideology soon learned to keep their view to themselves. If a Nazi parade passed on the street, they would either slip out of view or give the obligatory salute like everyone else, telling themselves that the gesture meant nothing if they did not believe in it. As the psychologist Bruno Bettelheim later wrote of this period, few people will risk their life for such a small thing as raising an arm — yet that is how one’s powers of resistance are eroded away, and eventually one’s responsibility and integrity go with them.
  • for Arendt, if you do not respond adequately when the times demand it, you show a lack of imagination and attention that is as dangerous as deliberately committing an abuse. It amounts to disobeying the one command she had absorbed from Heidegger in those Marburg days: Think!
  • ‘Everything takes place under a kind of anaesthesia. Objectively dreadful events produce a thin, puny emotional response. Murders are committed like schoolboy pranks. Humiliation and moral decay are accepted like minor incidents.’ Haffner thought modernity itself was partly to blame: people had become yoked to their habits and to mass media, forgetting to stop and think, or to disrupt their routines long enough to question what was going on.
  • Heidegger’s former lover and student Hannah Arendt would argue, in her 1951 study The Origins of Totalitarianism, that totalitarian movements thrived at least partly because of this fragmentation in modern lives, which made people more vulnerable to being swept away by demagogues. Elsewhere, she coined the phrase ‘the banality of evil’ to describe the most extreme failures of personal moral awareness.
  • His communicative ideal fed into a whole theory of history: he traced all civilisation to an ‘Axial Period’ in the fifth century BC, during which philosophy and culture exploded simultaneously in Europe, the Middle East and Asia, as though a great bubble of minds had erupted from the earth’s surface. ‘True philosophy needs communion to come into existence,’ he wrote, and added, ‘Uncommunicativeness in a philosopher is virtually a criterion of the untruth of his thinking.’
  • The idea of being called to authenticity became a major theme in later existentialism, the call being interpreted as saying something like ‘Be yourself!’, as opposed to being phony. For Heidegger, the call is more fundamental than that. It is a call to take up a self that you didn’t know you had: to wake up to your Being. Moreover, it is a call to action. It requires you to do something: to take a decision of some sort.
  • Being and Time contained at least one big idea that should have been of use in resisting totalitarianism. Dasein, Heidegger wrote there, tends to fall under the sway of something called das Man or ‘the they’ — an impersonal entity that robs us of the freedom to think for ourselves. To live authentically requires resisting or outwitting this influence, but this is not easy because das Man is so nebulous. Man in German does not mean ‘man’ as in English (that’s der Mann), but a neutral abstraction, something like ‘one’ in the English phrase ‘one doesn’t do that’,
  • for Heidegger, das Man is me. It is everywhere and nowhere; it is nothing definite, but each of us is it. As with Being, it is so ubiquitous that it is difficult to see. If I am not careful, however, das Man takes over the important decisions that should be my own. It drains away my responsibility or ‘answerability’. As Arendt might put it, we slip into banality, failing to think.
  • Jaspers focused on what he called Grenzsituationen — border situations, or limit situations. These are the moments when one finds oneself constrained or boxed in by what is happening, but at the same time pushed by these events towards the limits or outer edge of normal experience. For example, you might have to make a life-or-death choice, or something might remind you suddenly of your mortality,
  • Jaspers’ interest in border situations probably had much to do with his own early confrontation with mortality. From childhood, he had suffered from a heart condition so severe that he always expected to die at any moment. He also had emphysema, which forced him to speak slowly, taking long pauses to catch his breath. Both illnesses meant that he had to budget his energies with care in order to get his work done without endangering his life.
  • If I am to resist das Man, I must become answerable to the call of my ‘voice of conscience’. This call does not come from God, as a traditional Christian definition of the voice of conscience might suppose. It comes from a truly existentialist source: my own authentic self. Alas, this voice is one I do not recognise and may not hear, because it is not the voice of my habitual ‘they-self’. It is an alien or uncanny version of my usual voice. I am familiar with my they-self, but not with my unalienated voice — so, in a weird twist, my real voice is the one that sounds strangest to me.
  • Marcel developed a strongly theological branch of existentialism. His faith distanced him from both Sartre and Heidegger, but he shared a sense of how history makes demands on individuals. In his essay ‘On the Ontological Mystery’, written in 1932 and published in the fateful year of 1933, Marcel wrote of the human tendency to become stuck in habits, received ideas, and a narrow-minded attachment to possessions and familiar scenes. Instead, he urged his readers to develop a capacity for remaining ‘available’ to situations as they arise. Similar ideas of disponibilité or availability had been explored by other writers,
  • Marcel made it his central existential imperative. He was aware of how rare and difficult it was. Most people fall into what he calls ‘crispation’: a tensed, encrusted shape in life — ‘as though each one of us secreted a kind of shell which gradually hardened and imprisoned him’.
  • Bettelheim later observed that, under Nazism, only a few people realised at once that life could not continue unaltered: these were the ones who got away quickly. Bettelheim himself was not among them. Caught in Austria when Hitler annexed it, he was sent first to Dachau and then to Buchenwald, but was then released in a mass amnesty to celebrate Hitler’s birthday in 1939 — an extraordinary reprieve, after which he left at once for America.
  • we are used to reading philosophy as offering a universal message for all times and places — or at least as aiming to do so. But Heidegger disliked the notion of universal truths or universal humanity, which he considered a fantasy. For him, Dasein is not defined by shared faculties of reason and understanding, as the Enlightenment philosophers thought. Still less is it defined by any kind of transcendent eternal soul, as in religious tradition. We do not exist on a higher, eternal plane at all. Dasein’s Being is local: it has a historical situation, and is constituted in time and place.
  • For Marcel, learning to stay open to reality in this way is the philosopher’s prime job. Everyone can do it, but the philosopher is the one who is called on above all to stay awake, so as to be the first to sound the alarm if something seems wrong.
  • Second, it also means understanding that we are historical beings, and grasping the demands our particular historical situation is making on us. In what Heidegger calls ‘anticipatory resoluteness’, Dasein discovers ‘that its uttermost possibility lies in giving itself up’. At that moment, through Being-towards-death and resoluteness in facing up to one’s time, one is freed from the they-self and attains one’s true, authentic self.
  • If we are temporal beings by our very nature, then authentic existence means accepting, first, that we are finite and mortal. We will die: this all-important realisation is what Heidegger calls authentic ‘Being-towards-Death’, and it is fundamental to his philosophy.
  • Hannah Arendt, instead, left early on: she had the benefit of a powerful warning. Just after the Nazi takeover, in spring 1933, she had been arrested while researching materials on anti-Semitism for the German Zionist Organisation at Berlin’s Prussian State Library. Her apartment was searched; both she and her mother were locked up briefly, then released. They fled, without stopping to arrange travel documents. They crossed to Czechoslovakia (then still safe) by a method that sounds almost too fabulous to be true: a sympathetic German family on the border had a house with its front door in Germany and its back door in Czechoslovakia. The family would invite people for dinner, then let them leave through the back door at night.
  • As Sartre argued in his 1943 review of The Stranger, basic phenomenological principles show that experience comes to us already charged with significance. A piano sonata is a melancholy evocation of longing. If I watch a soccer match, I see it as a soccer match, not as a meaningless scene in which a number of people run around taking turns to apply their lower limbs to a spherical object. If the latter is what I’m seeing, then I am not watching some more essential, truer version of soccer; I am failing to watch it properly as soccer at all.
  • Much as they liked Camus personally, neither Sartre nor Beauvoir accepted his vision of absurdity. For them, life is not absurd, even when viewed on a cosmic scale, and nothing can be gained by saying it is. Life for them is full of real meaning, although that meaning emerges differently for each of us.
  • For Sartre, we show bad faith whenever we portray ourselves as passive creations of our race, class, job, history, nation, family, heredity, childhood influences, events, or even hidden drives in our subconscious which we claim are out of our control. It is not that such factors are unimportant: class and race, in particular, he acknowledged as powerful forces in people’s lives, and Simone de Beauvoir would soon add gender to that list.
  • Sartre takes his argument to an extreme point by asserting that even war, imprisonment or the prospect of imminent death cannot take away my existential freedom. They form part of my ‘situation’, and this may be an extreme and intolerable situation, but it still provides only a context for whatever I choose to do next. If I am about to die, I can decide how to face that death. Sartre here resurrects the ancient Stoic idea that I may not choose what happens to me, but I can choose what to make of it, spiritually speaking.
  • But the Stoics cultivated indifference in the face of terrible events, whereas Sartre thought we should remain passionately, even furiously engaged with what happens to us and with what we can achieve. We should not expect freedom to be anything less than fiendishly difficult.
  • Freedom does not mean entirely unconstrained movement, and it certainly does not mean acting randomly. We often mistake the very things that enable us to be free — context, meaning, facticity, situation, a general direction in our lives — for things that define us and take away our freedom. It is only with all of these that we can be free in a real sense.
  • Nor did he mean that privileged groups have the right to pontificate to the poor and downtrodden about the need to ‘take responsibility’ for themselves. That would be a grotesque misreading of Sartre’s point, since his sympathy in any encounter always lay with the more oppressed side. But for each of us — for me — to be in good faith means not making excuses for myself.
  • Camus’ novel gives us a deliberately understated vision of heroism and decisive action compared to those of Sartre and Beauvoir. One can only do so much. It can look like defeatism, but it shows a more realistic perception of what it takes to actually accomplish difficult tasks like liberating one’s country.
  • Camus just kept returning to his core principle: no torture, no killing — at least not with state approval. Beauvoir and Sartre believed they were taking a more subtle and more realistic view. If asked why a couple of innocuous philosophers had suddenly become so harsh, they would have said it was because the war had changed them in profound ways. It had shown them that one’s duties to humanity could be more complicated than they seemed. ‘The war really divided my life in two,’ Sartre said later.
  • Poets and artists ‘let things be’, but they also let things come out and show themselves. They help to ease things into ‘unconcealment’ (Unverborgenheit), which is Heidegger’s rendition of the Greek term alētheia, usually translated as ‘truth’. This is a deeper kind of truth than the mere correspondence of a statement to reality, as when we say ‘The cat is on the mat’ and point to a mat with a cat on it. Long before we can do this, both cat and mat must ‘stand forth out of concealedness’. They must un-hide themselves.
  • Heidegger does not use the word ‘consciousness’ here because — as with his earlier work — he is trying to make us think in a radically different way about ourselves. We are not to think of the mind as an empty cavern, or as a container filled with representations of things. We are not even supposed to think of it as firing off arrows of intentional ‘aboutness’, as in the earlier phenomenology of Brentano. Instead, Heidegger draws us into the depths of his Schwarzwald, and asks us to imagine a gap with sunlight filtering in. We remain in the forest, but we provide a relatively open spot where other beings can bask for a moment. If we did not do this, everything would remain in the thickets, hidden even to itself.
  • The astronomer Carl Sagan began his 1980 television series Cosmos by saying that human beings, though made of the same stuff as the stars, are conscious and are therefore ‘a way for the cosmos to know itself’. Merleau-Ponty similarly quoted his favourite painter Cézanne as saying, ‘The landscape thinks itself in me, and I am its consciousness.’ This is something like what Heidegger thinks humanity contributes to the earth. We are not made of spiritual nothingness; we are part of Being, but we also bring something unique with us. It is not much: a little open space, perhaps with a path and a bench like the one the young Heidegger used to sit on to do his homework. But through us, the miracle occurs.
  • Beauty aside, Heidegger’s late writing can also be troubling, with its increasingly mystical notion of what it is to be human. If one speaks of a human being mainly as an open space or a clearing, or a means of ‘letting beings be’ and dwelling poetically on the earth, then one doesn’t seem to be talking about any recognisable person. The old Dasein has become less human than ever. It is now a forestry feature.
  • Even today, Jaspers, the dedicated communicator, is far less widely read than Heidegger, who has influenced architects, social theorists, critics, psychologists, artists, film-makers, environmental activists, and innumerable students and enthusiasts — including the later deconstructionist and post-structuralist schools, which took their starting point from his late thinking. Having spent the late 1940s as an outsider and then been rehabilitated, Heidegger became the overwhelming presence in university philosophy all over the European continent from then on.
  • As Levinas reflected on this experience, it helped to lead him to a philosophy that was essentially ethical, rather than ontological like Heidegger’s. He developed his ideas from the work of Jewish theologian Martin Buber, whose I and Thou in 1923 had distinguished between my relationship with an impersonal ‘it’ or ‘them’, and the direct personal encounter I have with a ‘you’. Levinas took it further: when I encounter you, we normally meet face-to-face, and it is through your face that you, as another person, can make ethical demands on me. This is very different from Heidegger’s Mitsein or Being-with, which suggests a group of people standing alongside one another, shoulder to shoulder as if in solidarity — perhaps as a unified nation or Volk.
  • For Levinas, we literally face each other, one individual at a time, and that relationship becomes one of communication and moral expectation. We do not merge; we respond to one another. Instead of being co-opted into playing some role in my personal drama of authenticity, you look me in the eyes — and you remain Other. You remain you.
  • This relationship is more fundamental than the self, more fundamental than consciousness, more fundamental even than Being — and it brings an unavoidable ethical obligation. Ever since Husserl, phenomenologists and existentialists had been trying to stretch the definition of existence to incorporate our social lives and relationships. Levinas did more: he turned philosophy around entirely so that these relationships were the foundation of our existence, not an extension of it.
  • Her last work, The Need for Roots, argues, among other things, that none of us has rights, but each one of us has a near-infinite degree of duty and obligation to the other. Whatever the underlying cause of her death — and anorexia nervosa seems to have been involved — no one could deny that she lived out her philosophy with total commitment. Of all the lives touched on in this book, hers is surely the most profound and challenging application of Iris Murdoch’s notion that a philosophy can be ‘inhabited’.
  • Other thinkers took radical ethical turns during the war years. The most extreme was Simone Weil, who actually tried to live by the principle of putting other people’s ethical demands first. Having returned to France after her travels through Germany in 1932, she had worked in a factory so as to experience the degrading nature of such work for herself. When France fell in 1940, her family fled to Marseilles (against her protests), and later to the US and to Britain. Even in exile, Weil made extraordinary sacrifices. If there were people in the world who could not sleep in a bed, she would not do so either, so she slept on the floor.
  • The mystery tradition had roots in Kierkegaard’s ‘leap of faith’. It owed much to the other great nineteenth-century mystic of the impossible, Dostoevsky, and to older theological notions. But it also grew from the protracted trauma that was the first half of the twentieth century. Since 1914, and especially since 1939, people in Europe and elsewhere had come to the realisation that we cannot fully know or trust ourselves; that we have no excuses or explanations for what we do — and yet that we must ground our existence and relationships on something firm, because otherwise we cannot survive.
  • One striking link between these radical ethical thinkers, all on the fringes of our main story, is that they had religious faith. They also granted a special role to the notion of ‘mystery’ — that which cannot be known, calculated or understood, especially when it concerns our relationships with each other. Heidegger was different from them, since he rejected the religion he grew up with and had no real interest in ethics — probably as a consequence of his having no real interest in the human.
  • Meanwhile, the Christian existentialist Gabriel Marcel was also still arguing, as he had since the 1930s, that ethics trumps everything else in philosophy and that our duty to each other is so great as to play the role of a transcendent ‘mystery’. He too had been led to this position partly by a wartime experience: during the First World War he had worked for the Red Cross’ Information Service, with the unenviable job of answering relatives’ inquiries about missing soldiers. Whenever news came, he passed it on, and usually it was not good. As Marcel later said, this task permanently inoculated him against warmongering rhetoric of any kind, and it made him aware of the power of what is unknown in our lives.
  • As the play’s much-quoted and frequently misunderstood final line has it: ‘Hell is other people.’ Sartre later explained that he did not mean to say that other people were hellish in general. He meant that after death we become frozen in their view, unable any longer to fend off their interpretation. In life, we can still do something to manage the impression we make; in death, this freedom goes and we are left entombed in other people’s memories and perceptions.
  • We have to do two near-impossible things at once: understand ourselves as limited by circumstances, and yet continue to pursue our projects as though we are truly in control. In Beauvoir’s view, existentialism is the philosophy that best enables us to do this, because it concerns itself so deeply with both freedom and contingency. It acknowledges the radical and terrifying scope of our freedom in life, but also the concrete influences that other philosophies tend to ignore: history, the body, social relationships and the environment.
  • The aspects of our existence that limit us, Merleau-Ponty says, are the very same ones that bind us to the world and give us scope for action and perception. They make us what we are. Sartre acknowledged the need for this trade-off, but he found it more painful to accept. Everything in him longed to be free of bonds, of impediments and limitations.
  • Of course we have to learn this skill of interpreting and anticipating the world, and this happens in early childhood, which is why Merleau-Ponty thought child psychology was essential to philosophy. This is an extraordinary insight. Apart from Rousseau, very few philosophers before him had taken childhood seriously; most wrote as though all human experience were that of a fully conscious, rational, verbal adult who has been dropped into this world from the sky — perhaps by a stork.
  • For Merleau-Ponty, we cannot understand our experience if we don’t think of ourselves in part as overgrown babies. We fall for optical illusions because we once learned to see the world in terms of shapes, objects and things relevant to our own interests. Our first perceptions came to us in tandem with our first active experiments in observing the world and reaching out to explore it, and are still linked with those experiences.
  • Another factor in all of this, for Merleau-Ponty, is our social existence: we cannot thrive without others, or not for long, and we need this especially in early life. This makes solipsistic speculation about the reality of others ridiculous; we could never engage in such speculation if we hadn’t already been formed by them.
  • As Descartes could have said (but didn’t), ‘I think, therefore other people exist.’ We grow up with people playing with us, pointing things out, talking, listening, and getting us used to reading emotions and movements; this is how we become capable, reflective, smoothly integrated beings.
  • In general, Merleau-Ponty thinks human experience only makes sense if we abandon philosophy’s time-honoured habit of starting with a solitary, capsule-like, immobile adult self, isolated from its body and world, which must then be connected up again — adding each element around it as though adding clothing to a doll. Instead, for him, we slide from the womb to the birth canal to an equally close and total immersion in the world. That immersion continues as long as we live, although we may also cultivate the art of partially withdrawing from time to time when we want to think or daydream.
  • When he looks for his own metaphor to describe how he sees consciousness, he comes up with a beautiful one: consciousness, he suggests, is like a ‘fold’ in the world, as though someone had crumpled a piece of cloth to make a little nest or hollow. It stays for a while, before eventually being unfolded and smoothed away. There is something seductive, even erotic, in this idea of my conscious self as an improvised pouch in the cloth of the world. I still have my privacy — my withdrawing room. But I am part of the world’s fabric, and I remain formed out of it for as long as I am here.
  • By the time of these works, Merleau-Ponty is taking his desire to describe experience to the outer limits of what language can convey. Just as with the late Husserl or Heidegger, or Sartre in his Flaubert book, we see a philosopher venturing so far from shore that we can barely follow. Emmanuel Levinas would head out to the fringes too, eventually becoming incomprehensible to all but his most patient initiates.
  • Sartre once remarked — speaking of a disagreement they had about Husserl in 1941 — that ‘we discovered, astounded, that our conflicts had, at times, stemmed from our childhood, or went back to the elementary differences of our two organisms’. Merleau-Ponty also said in an interview that Sartre’s work seemed strange to him, not because of philosophical differences, but because of a certain ‘register of feeling’, especially in Nausea, that he could not share. Their difference was one of temperament and of the whole way the world presented itself to them.
  • The two also differed in their purpose. When Sartre writes about the body or other aspects of experience, he generally does it in order to make a different point. He expertly evokes the grace of his café waiter, gliding between the tables, bending at an angle just so, steering the drink-laden tray through the air on the tips of his fingers — but he does it all in order to illustrate his ideas about bad faith. When Merleau-Ponty writes about skilled and graceful movement, the movement itself is his point. This is the thing he wants to understand.
  • We can never move definitively from ignorance to certainty, for the thread of the inquiry will constantly lead us back to ignorance again. This is the most attractive description of philosophy I’ve ever read, and the best argument for why it is worth doing, even (or especially) when it takes us no distance at all from our starting point.
  • By prioritising perception, the body, social life and childhood development, Merleau-Ponty gathered up philosophy’s far-flung outsider subjects and brought them in to occupy the centre of his thought.
  • In his inaugural lecture at the Collège de France on 15 January 1953, published as In Praise of Philosophy, he said that philosophers should concern themselves above all with whatever is ambiguous in our experience. At the same time, they should think clearly about these ambiguities, using reason and science. Thus, he said, ‘The philosopher is marked by the distinguishing trait that he possesses inseparably the taste for evidence and the feeling for ambiguity.’ A constant movement is required between these two.
  • As Sartre wrote in response to Hiroshima, humanity had now gained the power to wipe itself out, and must decide every single day that it wanted to live. Camus also wrote that humanity faced the task of choosing between collective suicide and a more intelligent use of its technology — ‘between hell and reason’. After 1945, there seemed little reason to trust in humanity’s ability to choose well.
  • Merleau-Ponty observed in a lecture of 1951 that, more than any previous century, the twentieth century had reminded people how ‘contingent’ their lives were — how at the mercy of historical events and other changes that they could not control. This feeling went on long after the war ended. After the A-bombs were dropped on Hiroshima and Nagasaki, many feared that a Third World War would not be long in coming, this time between the Soviet Union and the United States.
Javier E

Look At Me by Patricia Snow | Articles | First Things - 0 views

  • Maurice stumbles upon what is still the gold standard for the treatment of infantile autism: an intensive course of behavioral therapy called applied behavioral analysis that was developed by psychologist O. Ivar Lovaas at UCLA in the 1970s
  • in a little over a year’s time she recovers her daughter to the point that she is indistinguishable from her peers.
  • Let Me Hear Your Voice is not a particularly religious or pious work. It is not the story of a miracle or a faith healing
  • ...54 more annotations...
  • Maurice discloses her Catholicism, and the reader is aware that prayer undergirds the therapy, but the book is about the therapy, not the prayer. Specifically, it is about the importance of choosing methods of treatment that are supported by scientific data. Applied behavioral analysis is all about data: its daily collection and interpretation. The method is empirical, hard-headed, and results-oriented.
  • on a deeper level, the book is profoundly religious, more religious perhaps than its author intended. In this reading of the book, autism is not only a developmental disorder afflicting particular individuals, but a metaphor for the spiritual condition of fallen man.
  • Maurice’s autistic daughter is indifferent to her mother
  • In this reading of the book, the mother is God, watching a child of his wander away from him into darkness: a heartbroken but also a determined God, determined at any cost to bring the child back
  • the mother doesn’t turn back, concedes nothing to the condition that has overtaken her daughter. There is no political correctness in Maurice’s attitude to autism; no nod to “neurodiversity.” Like the God in Donne’s sonnet, “Batter my heart, three-personed God,” she storms the walls of her daughter’s condition
  • Like God, she sets her sights high, commits both herself and her child to a demanding, sometimes painful therapy (life!), and receives back in the end a fully alive, loving, talking, and laughing child
  • the reader realizes that for God, the harrowing drama of recovery is never a singular, or even a twice-told tale, but a perennial one. Every child of his, every child of Adam and Eve, wanders away from him into darkness
  • we have an epidemic of autism, or “autism spectrum disorder,” which includes classic autism (Maurice’s children’s diagnosis); atypical autism, which exhibits some but not all of the defects of autism; and Asperger’s syndrome, which is much more common in boys than in girls and is characterized by average or above average language skills but impaired social skills.
  • At the same time, all around us, we have an epidemic of something else. On the street and in the office, at the dinner table and on a remote hiking trail, in line at the deli and pushing a stroller through the park, people go about their business bent over a small glowing screen, as if praying.
  • This latter epidemic, or experiment, has been going on long enough that people are beginning to worry about its effects.
  • for a comprehensive survey of the emerging situation on the ground, the interested reader might look at Sherry Turkle’s recent book, Reclaiming Conversation: The Power of Talk in a Digital Age.
  • she also describes in exhaustive, chilling detail the mostly horrifying effects recent technology has had on families and workplaces, educational institutions, friendships and romance.
  • many of the promises of technology have not only not been realized, they have backfired. If technology promised greater connection, it has delivered greater alienation. If it promised greater cohesion, it has led to greater fragmentation, both on a communal and individual level.
  • If thinking that the grass is always greener somewhere else used to be a marker of human foolishness and a temptation to be resisted, today it is simply a possibility to be checked out. The new phones, especially, turn out to be portable Pied Pipers, irresistibly pulling people away from the people in front of them and the tasks at hand.
  • all it takes is a single phone on a table, even if that phone is turned off, for the conversations in the room to fade in number, duration, and emotional depth.
  • an infinitely malleable screen isn’t an invitation to stability, but to restlessness
  • Current media, and the fear of missing out that they foster (a motivator now so common it has its own acronym, FOMO), drive lives of continual interruption and distraction, of virtual rather than real relationships, and of “little” rather than “big” talk
  • if you may be interrupted at any time, it makes sense, as a student explains to Turkle, to “keep things light.”
  • we are reaping deficits in emotional intelligence and empathy; loneliness, but also fears of unrehearsed conversations and intimacy; difficulties forming attachments but also difficulties tolerating solitude and boredom
  • consider the testimony of the faculty at a reputable middle school where Turkle is called in as a consultant
  • The teachers tell Turkle that their students don’t make eye contact or read body language, have trouble listening, and don’t seem interested in each other, all markers of autism spectrum disorder
  • Like much younger children, they engage in parallel play, usually on their phones. Like autistic savants, they can call up endless information on their phones, but have no larger context or overarching narrative in which to situate it
  • Students are so caught up in their phones, one teacher says, “they don’t know how to pay attention to class or to themselves or to another person or to look in each other’s eyes and see what is going on.”
  • “It is as though they all have some signs of being on an Asperger’s spectrum. But that’s impossible. We are talking about a schoolwide problem.”
  • Can technology cause Asperger’s?
  • “It is not necessary to settle this debate to state the obvious. If we don’t look at our children and engage them in conversation, it is not surprising if they grow up awkward and withdrawn.”
  • In the protocols developed by Ivar Lovaas for treating autism spectrum disorder, every discrete trial in the therapy, every drill, every interaction with the child, however seemingly innocuous, is prefaced by this clear command: “Look at me!”
  • If absence of relationship is a defining feature of autism, connecting with the child is both the means and the whole goal of the therapy. Applied behavioral analysis does not concern itself with when exactly, how, or why a child becomes autistic, but tries instead to correct, do over, and even perhaps actually rewire what went wrong, by going back to the beginning
  • Eye contact—which we know is essential for brain development, emotional stability, and social fluency—is the indispensable prerequisite of the therapy, the sine qua non of everything that happens.
  • There are no shortcuts to this method; no medications or apps to speed things up; no machines that can do the work for us. This is work that only human beings can do
  • it must not only be started early and be sufficiently intensive, but it must also be carried out in large part by parents themselves. Parents must be trained and involved, so that the treatment carries over into the home and continues for most of the child’s waking hours.
  • there are foundational relationships that are templates for all other relationships, and for learning itself.
  • Maurice’s book, in other words, is not fundamentally the story of a child acquiring skills, though she acquires them perforce. It is the story of the restoration of a child’s relationship with her parents
  • it is also impossible to overstate the time and commitment that were required to bring it about, especially today, when we have so little time, and such a faltering, diminished capacity for sustained engagement with small children
  • The very qualities that such engagement requires, whether our children are sick or well, are the same qualities being bred out of us by technologies that condition us to crave stimulation and distraction, and by a culture that, through a perverse alchemy, has changed what was supposed to be the freedom to work anywhere into an obligation to work everywhere.
  • In this world of total work (the phrase is Josef Pieper’s), the work of helping another person become fully human may be work that is passing beyond our reach, as our priorities, and the technologies that enable and reinforce them, steadily unfit us for the work of raising our own young.
  • in Turkle’s book, as often as not, it is young people who are distressed because their parents are unreachable. Some of the most painful testimony in Reclaiming Conversation is the testimony of teenagers who hope to do things differently when they have children, who hope someday to learn to have a real conversation, and so on.
  • it was an older generation that first fell under technology’s spell. At the middle school Turkle visits, as at many other schools across the country, it is the grown-ups who decide to give every child a computer and deliver all course content electronically, meaning that they require their students to work from the very medium that distracts them, a decision the grown-ups are unwilling to reverse, even as they lament its consequences.
  • we have approached what Turkle calls the robotic moment, when we will have made ourselves into the kind of people who are ready for what robots have to offer. When people give each other less, machines seem less inhuman.
  • robot babysitters may not seem so bad. The robots, at least, will be reliable!
  • If human conversations are endangered, what of prayer, a conversation like no other? All of the qualities that human conversation requires—patience and commitment, an ability to listen and a tolerance for aridity—prayer requires in greater measure.
  • this conversation—the Church exists to restore. Everything in the traditional Church is there to facilitate and nourish this relationship. Everything breathes, “Look at me!”
  • there is a second path to God, equally enjoined by the Church, and that is the way of charity to the neighbor, but not the neighbor in the abstract.
  • “Who is my neighbor?” a lawyer asks Jesus in the Gospel of Luke. Jesus’s answer is, the one you encounter on the way.
  • Virtue is either concrete or it is nothing. Man’s path to God, like Jesus’s path on the earth, always passes through what the Jesuit Jean Pierre de Caussade called “the sacrament of the present moment,” which we could equally call “the sacrament of the present person,” the way of the Incarnation, the way of humility, or the Way of the Cross.
  • The tradition of Zen Buddhism expresses the same idea in positive terms: Be here now.
  • Both of these privileged paths to God, equally dependent on a quality of undivided attention and real presence, are vulnerable to the distracting eye-candy of our technologies
  • Turkle is at pains to show that multitasking is a myth, that anyone trying to do more than one thing at a time is doing nothing well. We could also call what she was doing multi-relating, another temptation or illusion widespread in the digital age. Turkle’s book is full of people who are online at the same time that they are with friends, who are texting other potential partners while they are on dates, and so on.
  • This is the situation in which many people find themselves today: thinking that they are special to someone because of something that transpired, only to discover that the other person is spread so thin, the interaction was meaningless. There is a new kind of promiscuity in the world, in other words, that turns out to be as hurtful as the old kind.
  • Who can actually multitask and multi-relate? Who can love everyone without diluting or cheapening the quality of love given to each individual? Who can love everyone without fomenting insecurity and jealousy? Only God can do this.
  • When an individual needs to be healed of the effects of screens and machines, it is real presence that he needs: real people in a real world, ideally a world of God’s own making
  • Nature is restorative, but it is conversation itself, unfolding in real time, that strikes these boys with the force of revelation. More even than the physical vistas surrounding them on a wilderness hike, unrehearsed conversation opens up for them new territory, open-ended adventures. “It was like a stream,” one boy says, “very ongoing. It wouldn’t break apart.”
  • in the waters of baptism, the new man is born, restored to his true parent, and a conversation begins that over the course of his whole life reminds man of who he is, that he is loved, and that someone watches over him always.
  • Even if the Church could keep screens out of her sanctuaries, people strongly attached to them would still be people poorly positioned to take advantage of what the Church has to offer. Anxious people, unable to sit alone with their thoughts. Compulsive people, accustomed to checking their phones, on average, every five and a half minutes. As these behaviors increase in the Church, what is at stake is man’s relationship with truth itself.
Javier E

Declaration of Disruption - The New York Times - 0 views

  • A presidency characterized by pandemonium invades and infects that space, leaving people unsettled and on edge.
  • this, in turn, leads to greater polarization, to feelings of alienation and anger, to unrest and even to violence.
  • In short, chaotic leadership can inflict real trauma on political and civic culture.
  • ...4 more annotations...
  • Donald Trump, arguably the most disruptive and transgressive president in American history. He thrives on creating turbulence in every conceivable sphere. The blast radius of his tumultuous acts and chaotic temperament is vast
  • here’s the truly worrisome thing: The disruption is only going to increase, both because he’s facing criticism that seems to trigger him psychologically and because his theory of management involves the cultivation of chaos. He has shown throughout his life a defiant refusal to be disciplined. His disordered personality thrives on mayhem and upheaval, on vicious personal attacks and ceaseless conflict
  • We have as president the closest thing to a nihilist in our history — a man who believes in little or nothing, who has the impulse to burn down rather than to build up. When the president eventually faces a genuine crisis, his ignorance and inflammatory instincts will make everything worse.
  • Republican voters and politicians rallied around Mr. Trump in 2016, believing he was anti-establishment when in fact he was anti-order. He turns out to be an institutional arsonist. It is an irony of American history that the Republican Party, which has historically valued order and institutions, has become the conduit of chaos.
Javier E

HBO's 'Years and Years' and the Numbness of Survival - The Atlantic - 0 views

  • the thing that struck me most about the Lyonses wasn’t that they sighed and did nothing while the world around them disintegrated into disease and disinformation. It was that—mostly—they survived. The more things happened to them, the harder they clung to life and to one another. Most dystopian narratives deal with one kind of unimaginable crisis: a zombie apocalypse, a totalitarian regime, a terrifying disease
  • Years and Years, instead, shows how the alienation and paralysis sparked by a decade-plus of constant calamity are also symptoms of a kind of resilience. Human nature is to panic, to agonize, to fret and lose sleep and weep. Inevitably, though, it’s also to adapt.
  • The cost of getting through crisis after crisis, the show suggests, is numbness. “Emotion is a luxury,” Governor Andrew Cuomo of New York said in his daily press conference on Thursday morning. “We don’t have ... [that] luxury. Let’s just get through it.”
fischerry

Scientists 'wake' microbes, trapped in crystals for thousands of years - 0 views

  • Scientists 'wake' microbes, trapped in crystals for thousands of years
  • The researchers were able to “wake up” the long-dormant microbial lifeforms, which may have been trapped in the crystals between 10,000 and 50,000 years, a NASA researcher announced at the annual meeting of the American Association for the Advancement of Science on Friday, National Geographic reported. 
Javier E

The trouble with atheists: a defence of faith | Books | The Guardian - 1 views

  • My daughter has just turned six. Some time over the next year or so, she will discover that her parents are weird. We're weird because we go to church.
  • This means as she gets older there'll be voices telling her what it means, getting louder and louder until by the time she's a teenager they'll be shouting right in her ear. It means that we believe in a load of bronze-age absurdities. That we fetishise pain and suffering. That we advocate wishy-washy niceness. That we're too stupid to understand the irrationality of our creeds. That we build absurdly complex intellectual structures on the marshmallow foundations of a fantasy. That we're savagely judgmental.
  • that's not the bad news. Those are the objections of people who care enough about religion to object to it. Or to rent a set of recreational objections from Richard Dawkins or Christopher Hitchens. As accusations, they may be a hodge-podge, but at least they assume there's a thing called religion which looms with enough definition and significance to be detested.
  • the really painful message our daughter will receive is that we're embarrassing. For most people who aren't New Atheists, or old atheists, and have no passion invested in the subject, either negative or positive, believers aren't weird because we're wicked. We're weird because we're inexplicable; because, when there's no necessity for it that anyone sensible can see, we've committed ourselves to a set of awkward and absurd attitudes that obtrude, that stick out against the background of modern life, and not in some important or respectworthy or principled way, either.
  • Believers are people who try to insert Jee-zus into conversations at parties; who put themselves down, with writhings of unease, for perfectly normal human behaviour; who are constantly trying to create a solemn hush that invites a fart, a hiccup, a bit of subversion. Believers are people who, on the rare occasions when you have to listen to them, like at a funeral or a wedding, seize the opportunity to pour the liquidised content of a primary-school nativity play into your earhole, apparently not noticing that childhood is over.
  • What goes on inside believers is mysterious. So far as it can be guessed at it appears to be a kind of anxious pretending, a kind of continual, nervous resistance to reality.
  • to me, it's belief that involves the most uncompromising attention to the nature of things of which you are capable. Belief demands that you dispense with illusion after illusion, while contemporary common sense requires continual, fluffy pretending – pretending that might as well be systematic, it's so thoroughly incentivised by our culture.
  • The atheist bus says: "There's probably no God. So stop worrying and enjoy your life."
  • the word that offends against realism here is "enjoy". I'm sorry – enjoy your life?
  • If you based your knowledge of the human species exclusively on adverts, you'd think that the normal condition of humanity was to be a good-looking single person between 20 and 35, with excellent muscle-definition and/or an excellent figure, and a large disposable income. And you'd think the same thing if you got your information exclusively from the atheist bus
  • The implication of the bus slogan is that enjoyment would be your natural state if you weren't being "worried" by us believers and our hellfire preaching. Take away the malignant threat of God-talk, and you would revert to continuous pleasure
  • What's so wrong with this, apart from it being total bollocks? Well, in the first place, that it buys a bill of goods, sight unseen, from modern marketing. Given that human life isn't and can't be made up of enjoyment, it is in effect accepting a picture of human life in which those pieces of living where easy enjoyment is more likely become the only pieces that are visible.
  • But then, like every human being, I am not in the habit of entertaining only those emotions I can prove. I'd be an unrecognisable oddity if I did. Emotions can certainly be misleading: they can fool you into believing stuff that is definitely, demonstrably untrue. Yet emotions are also our indispensable tool for navigating, for feeling our way through, the much larger domain of stuff that isn't susceptible to proof or disproof, that isn't checkable against the physical universe. We dream, hope, wonder, sorrow, rage, grieve, delight, surmise, joke, detest; we form such unprovable conjectures as novels or clarinet concertos; we imagine. And religion is just a part of that, in one sense. It's just one form of imagining, absolutely functional, absolutely human-normal. It would seem perverse, on the face of it, to propose that this one particular manifestation of imagining should be treated as outrageous, should be excised if (which is doubtful) we can manage it.
  • suppose, as the atheist bus goes by, you are poverty-stricken, or desperate for a job, or a drug addict, or social services have just taken away your child. The bus tells you that there's probably no God so you should stop worrying and enjoy your life, and now the slogan is not just bitterly inappropriate in mood. What it means, if it's true, is that anyone who isn't enjoying themselves is entirely on their own. What the bus says is: there's no help coming.
  • Enjoyment is great. The more enjoyment the better. But enjoyment is one emotion. To say that life is to be enjoyed (just enjoyed) is like saying that mountains should only have summits, or that all colours should be purple, or that all plays should be by Shakespeare. This really is a bizarre category error.
  • A consolation you could believe in would be one that wasn't in danger of popping like a soap bubble on contact with the ordinary truths about us. A consolation you could trust would be one that acknowledged the difficult stuff rather than being in flight from it, and then found you grounds for hope in spite of it, or even because of it
  • The novelist Richard Powers has written that the Clarinet Concerto sounds the way mercy would sound, and that's exactly how I experienced it in 1997. Mercy, though, is one of those words that now requires definition. It does not only mean some tyrant's capacity to suspend a punishment he has himself inflicted. It can mean – and does mean in this case – getting something kind instead of the sensible consequences of an action, or as well as the sensible consequences of an action.
  • from outside, belief looks like a series of ideas about the nature of the universe for which a truth-claim is being made, a set of propositions that you sign up to; and when actual believers don't talk about their belief in this way, it looks like slipperiness, like a maddening evasion of the issue.
  • I am a fairly orthodox Christian. Every Sunday I say and do my best to mean the whole of the Creed, which is a series of propositions. But it is still a mistake to suppose that it is assent to the propositions that makes you a believer. It is the feelings that are primary. I assent to the ideas because I have the feelings; I don't have the feelings because I've assented to the ideas.
  • what I felt listening to Mozart in 1997 is not some wishy-washy metaphor for an idea I believe in, and it's not a front behind which the real business of belief is going on: it's the thing itself. My belief is made of, built up from, sustained by, emotions like that. That's what makes it real.
  • I think that Mozart, two centuries earlier, had succeeded in creating a beautiful and accurate report of an aspect of reality. I think that the reason reality is that way – that it is in some ultimate sense merciful as well as being a set of physical processes all running along on their own without hope of appeal, all the way up from quantum mechanics to the relative velocity of galaxies by way of "blundering, low and horridly cruel" biology (Darwin) – is that the universe is sustained by a continual and infinitely patient act of love. I think that love keeps it in being.
  • That's what I think. But it's all secondary. It all comes limping along behind my emotional assurance that there was mercy, and I felt it. And so the argument about whether the ideas are true or not, which is the argument that people mostly expect to have about religion, is also secondary for me.
  • No, I can't prove it. I don't know that any of it is true. I don't know if there's a God. (And neither do you, and neither does Professor Dawkins, and neither does anybody. It isn't the kind of thing you can know. It isn't a knowable item.)
  • let's be clear about the emotional logic of the bus's message. It amounts to a denial of hope or consolation on any but the most chirpy, squeaky, bubble-gummy reading of the human situation
  • It's got itself established in our culture, relatively recently, that the emotions involved in religious belief must be different from the ones involved in all the other kinds of continuous imagining, hoping, dreaming, and so on, that humans do. These emotions must be alien, freakish, sad, embarrassing, humiliating, immature, pathetic. These emotions must be quite separate from commonsensical us. But they aren't
  • The emotions that sustain religious belief are all, in fact, deeply ordinary and deeply recognisable to anybody who has ever made their way across the common ground of human experience as an adult.
  • It's just that the emotions in question are rarely talked about apart from their rationalisation into ideas. This is what I have tried to do in my new book, Unapologetic.
  • You can easily look up what Christians believe in. You can read any number of defences of Christian ideas. This, however, is a defence of Christian emotions – of their intelligibility, of their grown-up dignity.
Javier E

The Coming Software Apocalypse - The Atlantic - 1 views

  • Our standard framework for thinking about engineering failures—reflected, for instance, in regulations for medical devices—was developed shortly after World War II, before the advent of software, for electromechanical systems. The idea was that you make something reliable by making its parts reliable (say, you build your engine to withstand 40,000 takeoff-and-landing cycles) and by planning for the breakdown of those parts (you have two engines). But software doesn’t break. Intrado’s faulty threshold is not like the faulty rivet that leads to the crash of an airliner. The software did exactly what it was told to do. In fact it did it perfectly. The reason it failed is that it was told to do the wrong thing.
  • Software failures are failures of understanding, and of imagination. Intrado actually had a backup router, which, had it been switched to automatically, would have restored 911 service almost immediately. But, as described in a report to the FCC, “the situation occurred at a point in the application logic that was not designed to perform any automated corrective actions.”
  • The introduction of programming languages like Fortran and C, which resemble English, and tools, known as “integrated development environments,” or IDEs, that help correct simple mistakes (like Microsoft Word’s grammar checker but for code), obscured, though did little to actually change, this basic alienation—the fact that the programmer didn’t work on a problem directly, but rather spent their days writing out instructions for a machine.
  • Code is too hard to think about. Before trying to understand the attempts themselves, then, it’s worth understanding why this might be: what it is about code that makes it so foreign to the mind, and so unlike anything that came before it.
  • Technological progress used to change the way the world looked—you could watch the roads getting paved; you could see the skylines rise. Today you can hardly tell when something is remade, because so often it is remade by code.
  • Software has enabled us to make the most intricate machines that have ever existed. And yet we have hardly noticed, because all of that complexity is packed into tiny silicon chips as millions and millions of lines of code
  • The programmer, the renowned Dutch computer scientist Edsger Dijkstra wrote in 1988, “has to be able to think in terms of conceptual hierarchies that are much deeper than a single mind ever needed to face before.” Dijkstra meant this as a warning.
  • As programmers eagerly poured software into critical systems, they became, more and more, the linchpins of the built world—and Dijkstra thought they had perhaps overestimated themselves.
  • What made programming so difficult was that it required you to think like a computer.
  • “The problem is that software engineers don’t understand the problem they’re trying to solve, and don’t care to,” says Leveson, the MIT software-safety expert. The reason is that they’re too wrapped up in getting their code to work.
  • Though he runs a lab that studies the future of computing, he seems less interested in technology per se than in the minds of the people who use it. Like any good toolmaker, he has a way of looking at the world that is equal parts technical and humane. He graduated top of his class at the California Institute of Technology for electrical engineering,
  • “The serious problems that have happened with software have to do with requirements, not coding errors.” When you’re writing code that controls a car’s throttle, for instance, what’s important is the rules about when and how and by how much to open it. But these systems have become so complicated that hardly anyone can keep them straight in their head. “There’s 100 million lines of code in cars now,” Leveson says. “You just cannot anticipate all these things.”
  • a nearly decade-long investigation into claims of so-called unintended acceleration in Toyota cars. Toyota blamed the incidents on poorly designed floor mats, “sticky” pedals, and driver error, but outsiders suspected that faulty software might be responsible
  • software experts spend 18 months with the Toyota code, picking up where NASA left off. Barr described what they found as “spaghetti code,” programmer lingo for software that has become a tangled mess. Code turns to spaghetti when it accretes over many years, with feature after feature piling on top of, and being woven around
  • Using the same model as the Camry involved in the accident, Barr’s team demonstrated that there were actually more than 10 million ways for the onboard computer to cause unintended acceleration. They showed that as little as a single bit flip—a one in the computer’s memory becoming a zero or vice versa—could make a car run out of control. The fail-safe code that Toyota had put in place wasn’t enough to stop it
  • In all, Toyota recalled more than 9 million cars, and paid nearly $3 billion in settlements and fines related to unintended acceleration.
  • The problem is that programmers are having a hard time keeping up with their own creations. Since the 1980s, the way programmers work and the tools they use have changed remarkably little.
  • “Visual Studio is one of the single largest pieces of software in the world,” he said. “It’s over 55 million lines of code. And one of the things that I found out in this study is more than 98 percent of it is completely irrelevant. All this work had been put into this thing, but it missed the fundamental problems that people faced. And the biggest one that I took away from it was that basically people are playing computer inside their head.” Programmers were like chess players trying to play with a blindfold on—so much of their mental energy is spent just trying to picture where the pieces are that there’s hardly any left over to think about the game itself.
  • The fact that the two of them were thinking about the same problem in the same terms, at the same time, was not a coincidence. They had both just seen the same remarkable talk, given to a group of software-engineering students in a Montreal hotel by a computer researcher named Bret Victor. The talk, which went viral when it was posted online in February 2012, seemed to be making two bold claims. The first was that the way we make software is fundamentally broken. The second was that Victor knew how to fix it.
  • This is the trouble with making things out of code, as opposed to something physical. “The complexity,” as Leveson puts it, “is invisible to the eye.”
  • in early 2012, Victor had finally landed upon the principle that seemed to thread through all of his work. (He actually called the talk “Inventing on Principle.”) The principle was this: “Creators need an immediate connection to what they’re creating.” The problem with programming was that it violated the principle. That’s why software systems were so hard to think about, and so rife with bugs: The programmer, staring at a page of text, was abstracted from whatever it was they were actually making.
  • “Our current conception of what a computer program is,” he said, is “derived straight from Fortran and ALGOL in the late ’50s. Those languages were designed for punch cards.”
  • WYSIWYG (pronounced “wizzywig”) came along. It stood for “What You See Is What You Get.”
  • Victor’s point was that programming itself should be like that. For him, the idea that people were doing important work, like designing adaptive cruise-control systems or trying to understand cancer, by staring at a text editor, was appalling.
  • With the right interface, it was almost as if you weren’t working with code at all; you were manipulating the game’s behavior directly.
  • When the audience first saw this in action, they literally gasped. They knew they weren’t looking at a kid’s game, but rather the future of their industry. Most software involved behavior that unfolded, in complex ways, over time, and Victor had shown that if you were imaginative enough, you could develop ways to see that behavior and change it, as if playing with it in your hands. One programmer who saw the talk wrote later: “Suddenly all of my tools feel obsolete.”
  • When John Resig saw the “Inventing on Principle” talk, he scrapped his plans for the Khan Academy programming curriculum. He wanted the site’s programming exercises to work just like Victor’s demos. On the left-hand side you’d have the code, and on the right, the running program: a picture or game or simulation. If you changed the code, it’d instantly change the picture. “In an environment that is truly responsive,” Resig wrote about the approach, “you can completely change the model of how a student learns ... [They] can now immediately see the result and intuit how underlying systems inherently work without ever following an explicit explanation.” Khan Academy has become perhaps the largest computer-programming class in the world, with a million students, on average, actively using the program each month.
  • The ideas spread. The notion of liveness, of being able to see data flowing through your program instantly, made its way into flagship programming tools offered by Google and Apple. The default language for making new iPhone and Mac apps, called Swift, was developed by Apple from the ground up to support an environment, called Playgrounds, that was directly inspired by Light Table.
  • “Typically the main problem with software coding—and I’m a coder myself,” Bantegnie says, “is not the skills of the coders. The people know how to code. The problem is what to code. Because most of the requirements are kind of natural language, ambiguous, and a requirement is never extremely precise, it’s often understood differently by the guy who’s supposed to code.”
  • In a pair of later talks, “Stop Drawing Dead Fish” and “Drawing Dynamic Visualizations,” Victor went one further. He demoed two programs he’d built—the first for animators, the second for scientists trying to visualize their data—each of which took a process that used to involve writing lots of custom code and reduced it to playing around in a WYSIWYG interface.
  • Victor suggested that the same trick could be pulled for nearly every problem where code was being written today. “I’m not sure that programming has to exist at all,” he told me. “Or at least software developers.” In his mind, a software developer’s proper role was to create tools that removed the need for software developers. Only then would people with the most urgent computational problems be able to grasp those problems directly, without the intermediate muck of code.
  • Victor implored professional software developers to stop pouring their talent into tools for building apps like Snapchat and Uber. “The inconveniences of daily life are not the significant problems,” he wrote. Instead, they should focus on scientists and engineers—as he put it to me, “these people that are doing work that actually matters, and critically matters, and using really, really bad tools.”
  • Bantegnie’s company is one of the pioneers in the industrial use of model-based design, in which you no longer write code directly. Instead, you create a kind of flowchart that describes the rules your program should follow (the “model”), and the computer generates code for you based on those rules
  • In a model-based design tool, you’d represent this rule with a small diagram, as though drawing the logic out on a whiteboard, made of boxes that represent different states—like “door open,” “moving,” and “door closed”—and lines that define how you can get from one state to the other. The diagrams make the system’s rules obvious: Just by looking, you can see that the only way to get the elevator moving is to close the door, or that the only way to get the door open is to stop.
  • In traditional programming, your task is to take complex rules and translate them into code; most of your energy is spent doing the translating, rather than thinking about the rules themselves. In the model-based approach, all you have is the rules. So that’s what you spend your time thinking about. It’s a way of focusing less on the machine and more on the problem you’re trying to get it to solve.
  • “Everyone thought I was interested in programming environments,” he said. Really he was interested in how people see and understand systems—as he puts it, in the “visual representation of dynamic behavior.” Although code had increasingly become the tool of choice for creating dynamic behavior, it remained one of the worst tools for understanding it. The point of “Inventing on Principle” was to show that you could mitigate that problem by making the connection between a system’s behavior and its code immediate.
  • On this view, software becomes unruly because the media for describing what software should do—conversations, prose descriptions, drawings on a sheet of paper—are too different from the media describing what software does do, namely, code itself.
  • for this approach to succeed, much of the work has to be done well before the project even begins. Someone first has to build a tool for developing models that are natural for people—that feel just like the notes and drawings they’d make on their own—while still being unambiguous enough for a computer to understand. They have to make a program that turns these models into real code. And finally they have to prove that the generated code will always do what it’s supposed to.
  • This practice brings order and accountability to large codebases. But, Shivappa says, “it’s a very labor-intensive process.” He estimates that before they used model-based design, on a two-year-long project only two to three months was spent writing code—the rest was spent working on the documentation.
  • Much of the benefit of the model-based approach comes from being able to add requirements on the fly while still ensuring that existing ones are met; with every change, the computer can verify that your program still works. You’re free to tweak your blueprint without fear of introducing new bugs. Your code is, in FAA parlance, “correct by construction.”
  • “people are not so easily transitioning to model-based software development: They perceive it as another opportunity to lose control, even more than they have already.”
  • The bias against model-based design, sometimes known as model-driven engineering, or MDE, is in fact so ingrained that according to a recent paper, “Some even argue that there is a stronger need to investigate people’s perception of MDE than to research new MDE technologies.”
  • “Human intuition is poor at estimating the true probability of supposedly ‘extremely rare’ combinations of events in systems operating at a scale of millions of requests per second,” he wrote in a paper. “That human fallibility means that some of the more subtle, dangerous bugs turn out to be errors in design; the code faithfully implements the intended design, but the design fails to correctly handle a particular ‘rare’ scenario.”
  • Newcombe was convinced that the algorithms behind truly critical systems—systems storing a significant portion of the web’s data, for instance—ought to be not just good, but perfect. A single subtle bug could be catastrophic. But he knew how hard bugs were to find, especially as an algorithm grew more complex. You could do all the testing you wanted and you’d never find them all.
  • An algorithm written in TLA+ could in principle be proven correct. In practice, it allowed you to create a realistic model of your problem and test it not just thoroughly, but exhaustively. This was exactly what he’d been looking for: a language for writing perfect algorithms.
  • TLA+, which stands for “Temporal Logic of Actions,” is similar in spirit to model-based design: It’s a language for writing down the requirements—TLA+ calls them “specifications”—of computer programs. These specifications can then be completely verified by a computer. That is, before you write any code, you write a concise outline of your program’s logic, along with the constraints you need it to satisfy
  • Programmers are drawn to the nitty-gritty of coding because code is what makes programs go; spending time on anything else can seem like a distraction. And there is a patient joy, a meditative kind of satisfaction, to be had from puzzling out the micro-mechanics of code. But code, Lamport argues, was never meant to be a medium for thought. “It really does constrain your ability to think when you’re thinking in terms of a programming language,”
  • Code makes you miss the forest for the trees: It draws your attention to the working of individual pieces, rather than to the bigger picture of how your program fits together, or what it’s supposed to do—and whether it actually does what you think. This is why Lamport created TLA+. As with model-based design, TLA+ draws your focus to the high-level structure of a system, its essential logic, rather than to the code that implements it.
  • But TLA+ occupies just a small, far corner of the mainstream, if it can be said to take up any space there at all. Even to a seasoned engineer like Newcombe, the language read at first as bizarre and esoteric—a zoo of symbols.
  • this is a failure of education. Though programming was born in mathematics, it has since largely been divorced from it. Most programmers aren’t very fluent in the kind of math—logic and set theory, mostly—that you need to work with TLA+. “Very few programmers—and including very few teachers of programming—understand the very basic concepts and how they’re applied in practice. And they seem to think that all they need is code,” Lamport says. “The idea that there’s some higher level than the code in which you need to be able to think precisely, and that mathematics actually allows you to think precisely about it, is just completely foreign. Because they never learned it.”
  • “In the 15th century,” he said, “people used to build cathedrals without knowing calculus, and nowadays I don’t think you’d allow anyone to build a cathedral without knowing calculus. And I would hope that after some suitably long period of time, people won’t be allowed to write programs if they don’t understand these simple things.”
  • Programmers, as a species, are relentlessly pragmatic. Tools like TLA+ reek of the ivory tower. When programmers encounter “formal methods” (so called because they involve mathematical, “formally” precise descriptions of programs), their deep-seated instinct is to recoil.
  • Formal methods had an image problem. And the way to fix it wasn’t to implore programmers to change—it was to change yourself. Newcombe realized that to bring tools like TLA+ to the programming mainstream, you had to start speaking their language.
  • he presented TLA+ as a new kind of “pseudocode,” a stepping-stone to real code that allowed you to exhaustively test your algorithms—and that got you thinking precisely early on in the design process. “Engineers think in terms of debugging rather than ‘verification,’” he wrote, so he titled his internal talk on the subject to fellow Amazon engineers “Debugging Designs.” Rather than bemoan the fact that programmers see the world in code, Newcombe embraced it. He knew he’d lose them otherwise. “I’ve had a bunch of people say, ‘Now I get it,’” Newcombe says.
  • In the world of the self-driving car, software can’t be an afterthought. It can’t be built like today’s airline-reservation systems or 911 systems or stock-trading systems. Code will be put in charge of hundreds of millions of lives on the road and it has to work. That is no small task.
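The two threads in these excerpts — the state diagrams of model-based design and the exhaustive verification behind TLA+ — can be combined in a toy sketch. Below is a hypothetical, minimal illustration in Python (not TLA+ itself, and not any tool named in the article): the elevator from the model-based-design example is written as states and transitions, and a breadth-first search checks a safety invariant against every reachable state, the way a model checker would.

```python
from collections import deque

# Toy "specification": an elevator state is (door, moving).
# The transitions mirror the article's diagram: close door, open door, start, stop.
def next_states(state):
    door, moving = state
    succ = set()
    if door == "open":
        succ.add(("closed", moving))   # close the door
    if door == "closed" and not moving:
        succ.add(("open", moving))     # open the door
        succ.add((door, True))         # start moving
    if moving:
        succ.add((door, False))        # stop
    return succ

def check(init, invariant):
    """Exhaustively explore every reachable state (a model checker in miniature)."""
    seen, frontier = {init}, deque([init])
    while frontier:
        s = frontier.popleft()
        if not invariant(s):
            return s                   # counterexample: invariant violated here
        for t in next_states(s) - seen:
            seen.add(t)
            frontier.append(t)
    return None                        # invariant holds in all reachable states

# Safety property: the elevator never moves with the door open.
bad = check(("closed", False), lambda s: not (s[0] == "open" and s[1]))
print(bad)  # → None: the property holds
```

Only three states are reachable here, so the exhaustive check is trivial; the point of real tools is that the same idea scales to state spaces far too large to hold in one head — the "rare combinations of events" Newcombe describes.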
Javier E

Flying Saucers and Other Fairy Tales - The New York Times

  • those of us who remain Christian — and yes, this is a Christmas column, U.F.O.s and all — can be agnostic about all these strange stories, not reflexively dismissive, since Christianity does not require that all paranormal experiences be either divinely sent or demonic or imaginary.
  • the Christian idea is that whatever capricious powers may exist, when the true God enters his creation, he does so honestly, straightforwardly, in a vulnerable and fully human form — and exposes himself publicly, whether in a crowded stable or on an execution hill. So the glamour of U.F.O.s, like the glamour of faerie, is an understandable object of curiosity but a dangerous object for any kind of faith. The only kind of God worth trusting is the kind who does not play tricks.
clairemann

How To Tackle Deforestation? Give Indigenous People Their Land Rights. | HuffPost

  • Deforestation rates are significantly lower in forests protected and governed by Indigenous people, according to a new report.
  • found that, on average, forests in Indigenous and tribal territories have been much better conserved than other forests in the region.
  • Forests are huge carbon sinks and vital tools in holding back the climate crisis as well as stabilizing regional temperatures and rainfall patterns. Indigenous territories hold roughly one-third of all the carbon stored in the forests of Latin America and the Caribbean, and 14% of the carbon in tropical forests around the world. 
  • These communities, with generations of experience successfully protecting nature, have strong track records of guarding the forest, according to the report. They generally favor smaller-scale, more diverse farming, which extracts far less from the land than industrial operations do. 
  • “The forest provides us food, water and gives us a roof. It’s not something alien to us, but another living being,” said Rivas, who’s also been working to promote gender equality in the agroforestry sector. “We only take what we need, nothing more.”
  • “We need to center and take direction from Indigenous and tribal earth defenders in order to protect forests’ biodiversity and even prevent the next pandemic,”
  • “This scientific consensus [in the report] gives our world’s leaders a mandate to defend the rights of Indigenous and tribal communities,” said Recinos. “Otherwise, sensitive biomes and rainforests, such as the Amazon, will remain under threat or, worse, reach an irreversible tipping point.” 
katedriscoll

Is the Schrödinger Equation True? - Scientific American

  • …shaped abstractions called vectors. Pondering Hilbert space makes me feel like a lump of dumb, decrepit flesh trapped in a squalid, 3-D prison. Far from exploring Hilbert space, I can’t even find a window through which to peer into it. I envision it as an immaterial paradise where luminescent cognoscenti glide to and fro, telepathically swapping witticisms about adjoint operators.
  • Reality, great sages have assured us, is essentially mathematical. Plato held that we and other things of this world are mere shadows of the sublime geometric forms that constitute reality. Galileo declared that “the great book of nature is written in mathematics.” We’re part of nature, aren’t we? So why does mathematics, once we get past natural numbers and basic arithmetic, feel so alien to most of us?
  • Physicists’ theories work. They predict the arc of planets and the flutter of electrons, and they have spawned smartphones, H-bombs and—well, what more do we need? But scientists, and especially physicists, aren’t just seeking practical advances. They’re after Truth. They want to believe that their theories are correct—exclusively correct—representations of nature. Physicists share this craving with religious folk, who need to believe that their path to salvation is the One True Path.
anonymous

Why Did the Dean of the Most Diverse Law School in the Country Cancel Herself? - The Ne...

  • Why Did the Dean of the Most Diverse Law School in the Country Cancel Herself?
  • Was it the unfortunate use of a single word? Or something far more complicated?
  • Mary Lu Bilek, who has spent 32 years at the law school at the City University of New York, the past five of them as dean, sent an email to students and faculty with the subject line: “Apology.”
  • Discussing a contentious issue of race and tenure in a committee meeting last fall, she had likened herself to a “slaveholder.”
  • It was a strange, deeply jarring thing to say, but she had been trying to make the point that her position left her responsible for whatever racial inequities might exist institutionally
  • What the dean might have regarded as an admission of culpability, some of her colleagues viewed as an expression of the buried prejudices well-intentioned liberals never think they have.
  • Ms. Bilek quickly realized that she had drawn a terrible — perhaps unforgivable — analogy
  • “begun education and counseling to uncover and overcome my biases.”
  • To colleagues in the field, the circumstances of Ms. Bilek’s departure struck a note that was both ironic and painful.
  • Decades ago, long before it became commonplace, Ms. Bilek railed against the bar exam and other standardized tests for their disparate impact on low-income students
  • She had presented herself and the institution as “anti-racist,” they wrote, while ignoring how her own decisions perpetuated “institutional racism.”
  • On the face of things, it seemed as though Ms. Bilek had been lost to the maw of cancel culture and its relentless appetite for hapless boomer prey.
  • “I regret that my mistake means that I will not be doing that work” — the work of fighting racism — “with my CUNY colleagues,”
  • “Her reputation in the world of deans is that of someone who cares deeply about racial justice,”
  • Prestige in academia begins, of course, with tenure. Ms. Bilek’s troubles started last spring when she argued for granting early tenure, an extremely precious commodity, to someone about to become an administrator — a young white woman named Allie Robbins
  • Without tenure, administrative work in a university is an especially oppressive time suck, robbing an academic of the hours that could be spent on research and writing and conference-going — essentially, what is required for tenure.
  • Beyond that, the risk of alienating people who someday might weigh in on your own tenure case remained high.
  • As the fall progressed, anger continued to foment around Ms. Bilek.
  • The day after Christmas, 22 faculty members wrote a letter denouncing her wish to leapfrog a white junior academic in the promotion process, her “slaveholder” reference, and what they viewed as her resistance to listen to faculty members of color on the personnel committee “as they pointed out the disparate racial impacts” of her conduct.
  • “But I am certain that the work they do within the Law School and in the world will bring us to a more equal, anti-racist society.”
  • Next came a list of demands that included a public apology for her misdeeds, changes to practices in governance and a retreat from any outside roles furthering the perception that she was “an anti-racist dean.”
  • “We intentionally chose not to ask her to step down but to demand instead that she commit to the systemic work that her stated anti-racist principles required,”
  • “Dean Bilek chose to ignore that outstretched hand.”
  • “We said, ‘We don’t want to make a scene — no single action should define any of us. We don’t want to take away from all the work you’ve done at the law school, but we want the accountability,’”
  • “I thought there was a chance for redemption — we do not want to cancel folks; we are not people who think in carceral ways.”
  • Kept under wraps, news of all this turmoil reached the student body only last week, and when they discovered what Ms. Bilek had said and done and how long they had been left oblivious, a large and vocal faction did not feel as generously
caelengrubb

The future's in the past | Culture | The Guardian

  • Whenever the importance of history is discussed, epigrams and homilies come tripping easily off our tongues: How can we understand our present or glimpse our future if we cannot understand our past? How can we know who we are if we don't know who we were?
  • While history may be condemned to repeat itself, historians are condemned to repeat themselves. History is bunk or possibly bunkum.
  • Historians, more than any other class, spend a great deal of time justifying their trade, defining it and aphorising it, seeming to lavish more attention on historiography than history.
  • Historians are no longer grandees at the centre of a fixed civilisation; they are simply journalists writing about celebrities who haven't got the grace to be alive any more
  • There are those who wonder if the whole of history is now valuable only as a politically correct lesson in the stupidity and cruelty of monarchs, aristocrats, industrialists and generals
  • You don't even have to dignify it with ideological abstractions any more; history is really the story of a series of subjugations, oppressions, exploitations and abuses.
  • The biggest challenge facing the great teachers and communicators of history is not to teach history itself, nor even the lessons of history, but why history matters.
  • A history in which historians have to stand on one side of an argument or another, for, in between, they are nothing but dry-as-dust statisticians
  • we measure the exponential growth in the public appetite for history
  • History, then, as one long, grovelling apology or act of self-abasement and self-laceration.
  • After all, isn't that what poetry and novels show, that humanity is best comprehended by understanding humans rather than ideas? But for some, this leads to the worry that history can now only mean witness
  • Certainly, history is popular in grand traditional forms, but new subgenres of history have, for the last 20 years, exploded in popularity, too.
  • We haven't arrived at our own moral and ethical imperatives by each of us working them out from first principles; we have inherited them and they were born out of blood and suffering, as all human things and human beings are.
  • This does not stop us from admiring and praising the progressive heroes who got there early and risked their lives to advance causes that we now take for granted.
  • In the end, I suppose history is all about imagination rather than facts
  • If you cannot feel what our ancestors felt when they cried: 'Wilkes and Liberty!' or, indeed, cried: 'Death to Wilkes!', if you cannot feel with them, then all you can do is judge them and condemn them, or praise them and over-adulate them.
  • History is not the story of strangers, aliens from another realm; it is the story of us had we been born a little earlier
  • History is memory
Javier E

Opinion | The 1619 Chronicles - The New York Times

  • The 1619 Project introduced a date, previously obscure to most Americans, that ought always to have been thought of as seminal — and probably now will. It offered fresh reminders of the extent to which Black freedom was a victory gained by courageous Black Americans, and not just a gift obtained from benevolent whites.
  • in a point missed by many of the 1619 Project’s critics, it does not reject American values. As Nikole Hannah-Jones, its creator and leading voice, concluded in her essay for the project, “I wish, now, that I could go back to the younger me and tell her that her people’s ancestry started here, on these lands, and to boldly, proudly, draw the stars and those stripes of the American flag.” It’s an unabashedly patriotic thought.
  • ambition can be double-edged. Journalists are, most often, in the business of writing the first rough draft of history, not trying to have the last word on it. We are best when we try to tell truths with a lowercase t, following evidence in directions unseen, not the capital-T truth of a pre-established narrative in which inconvenient facts get discarded
  • on these points — and for all of its virtues, buzz, spinoffs and a Pulitzer Prize — the 1619 Project has failed.
  • That doesn’t mean that the project seeks to erase the Declaration of Independence from history. But it does mean that it seeks to dethrone the Fourth of July by treating American history as a story of Black struggle against white supremacy — of which the Declaration is, for all of its high-flown rhetoric, supposed to be merely a part.
  • The deleted assertions went to the core of the project’s most controversial goal, “to reframe American history by considering what it would mean to regard 1619 as our nation’s birth year.”
  • She then challenged me to find any instance in which the project stated that “using 1776 as our country’s birth date is wrong,” that it “should not be taught to schoolchildren,” and that the only one “that should be taught” was 1619. “Good luck unearthing any of us arguing that,” she added.
  • I emailed her to ask if she could point to any instances before this controversy in which she had acknowledged that her claims about 1619 as “our true founding” had been merely metaphorical. Her answer was that the idea of treating the 1619 date metaphorically should have been so obvious that it went without saying.
  • Here is an excerpt from the introductory essay to the project by The New York Times Magazine’s editor, Jake Silverstein, as it appeared in print in August 2019 (italics added):
  • “1619. It is not a year that most Americans know as a notable date in our country’s history. Those who do are at most a tiny fraction of those who can tell you that 1776 is the year of our nation’s birth. What if, however, we were to tell you that this fact, which is taught in our schools and unanimously celebrated every Fourth of July, is wrong, and that the country’s true birth date, the moment that its defining contradictions first came into the world, was in late August of 1619?”
  • In his introduction, Silverstein argues that America’s “defining contradictions” were born in August 1619, when a ship carrying 20 to 30 enslaved Africans from what is present-day Angola arrived in Point Comfort, in the English colony of Virginia. And the title page of Hannah-Jones’s essay for the project insists that “our founding ideals of liberty and equality were false when they were written.”
  • What was surprising was that in 1776 a politically formidable “defining contradiction” — “that all men are created equal” — came into existence through the Declaration of Independence. As Abraham Lincoln wrote in 1859, that foundational document would forever serve as a “rebuke and stumbling block to the very harbingers of reappearing tyranny and oppression.”
  • As for the notion that the Declaration’s principles were “false” in 1776, ideals aren’t false merely because they are unrealized, much less because many of the men who championed them, and the nation they created, hypocritically failed to live up to them.
  • These two flaws led to a third, conceptual, error. “Out of slavery — and the anti-Black racism it required — grew nearly everything that has truly made America exceptional,” writes Silverstein.
  • Nearly everything? What about, say, the ideas contained by the First Amendment? Or the spirit of openness that brought millions of immigrants through places like Ellis Island? Or the enlightened worldview of the Marshall Plan and the Berlin airlift? Or the spirit of scientific genius and discovery exemplified by the polio vaccine and the moon landing?
  • On the opposite side of the moral ledger, to what extent does anti-Black racism figure in American disgraces such as the brutalization of Native Americans, the Chinese Exclusion Act or the internment of Japanese-Americans in World War II?
  • The world is complex. So are people and their motives. The job of journalism is to take account of that complexity, not simplify it out of existence through the adoption of some ideological orthodoxy.
  • This mistake goes far to explain the 1619 Project’s subsequent scholarly and journalistic entanglements. It should have been enough to make strong yet nuanced claims about the role of slavery and racism in American history. Instead, it issued categorical and totalizing assertions that are difficult to defend on close examination.
  • It should have been enough for the project to serve as curator for a range of erudite and interesting voices, with ample room for contrary takes. Instead, virtually every writer in the project seems to sing from the same song sheet, alienating other potential supporters of the project and polarizing national debate.
  • James McPherson, the Pulitzer Prize-winning author of “Battle Cry of Freedom” and a past president of the American Historical Association. He was withering: “Almost from the outset,” McPherson told the World Socialist Web Site, “I was disturbed by what seemed like a very unbalanced, one-sided account, which lacked context and perspective.”
  • In particular, McPherson objected to Hannah-Jones’s suggestion that the struggle against slavery and racism and for civil rights and democracy was, if not exclusively then mostly, a Black one. As she wrote in her essay: “The truth is that as much democracy as this nation has today, it has been borne on the backs of Black resistance.”
  • McPherson demurs: “From the Quakers in the 18th century, on through the abolitionists in the antebellum, to the Radical Republicans in the Civil War and Reconstruction, to the N.A.A.C.P., which was an interracial organization founded in 1909, down through the civil rights movements of the 1950s and 1960s, there have been a lot of whites who have fought against slavery and racial discrimination, and against racism,” he said. “And that’s what’s missing from this perspective.”
  • Wilentz’s catalog of the project’s mistakes is extensive. Hannah-Jones’s essay claimed that by 1776 Britain was “deeply conflicted” over its role in slavery. But despite the landmark Somerset v. Stewart court ruling in 1772, which held that slavery was not supported by English common law, it remained deeply embedded in the practices of the British Empire. The essay claimed that, among Londoners, “there were growing calls to abolish the slave trade” by 1776. But the movement to abolish the British slave trade only began about a decade later — inspired, in part, Wilentz notes, by American antislavery agitation that had started in the 1760s and 1770s.
  • Leslie M. Harris, an expert on pre-Civil War African-American life and slavery. “On Aug. 19 of last year,” Harris wrote, “I listened in stunned silence as Nikole Hannah-Jones … repeated an idea that I had vigorously argued against with her fact checker: that the patriots fought the American Revolution in large part to preserve slavery in North America.”
  • The larger problem is that The Times’s editors, however much background reading they might have done, are not in a position to adjudicate historical disputes. That should have been an additional reason for the 1619 Project to seek input from, and include contributions by, an intellectually diverse range of scholarly voices. Yet not only does the project choose a side, it also brooks no doubt.
  • “It is finally time to tell our story truthfully,” the magazine declares on its 1619 cover page. Finally? Truthfully? Is The Times suggesting that distinguished historians, like the ones who have seriously disputed aspects of the project, had previously been telling half-truths or falsehoods?
  • unlike other dates, 1776 uniquely marries letter and spirit, politics and principle: The declaration that something new is born, combined with the expression of an ideal that — because we continue to believe in it even as we struggle to live up to it — binds us to the date.
  • On the other, the 1619 Project has become, partly by its design and partly because of avoidable mistakes, a focal point of the kind of intense national debate that columnists are supposed to cover, and that is being widely written about outside The Times. To avoid writing about it on account of the first scruple is to be derelict in our responsibility toward the second.
Javier E

Why Study History? (1985) | AHA

  • Isn't there quite enough to learn about the world today? Why add to the burden by looking at the past
  • Historical knowledge is no more and no less than carefully and critically constructed collective memory. As such it can both make us wiser in our public choices and more richly human in our private lives.
  • Without individual memory, a person literally loses his or her identity, and would not know how to act in encounters with others. Imagine waking up one morning unable to tell total strangers from family and friends!
  • Collective memory is similar, though its loss does not immediately paralyze everyday private activity. But ignorance of history-that is, absent or defective collective memory-does deprive us of the best available guide for public action, especially in encounters with outsiders
  • Often it is enough for experts to know about outsiders, if their advice is listened to. But democratic citizenship and effective participation in the determination of public policy require citizens to share a collective memory, organized into historical knowledge and belief
  • This value of historical knowledge obviously justifies teaching and learning about what happened in recent times, for the way things are descends from the way they were yesterday and the day before that
  • in fact, institutions that govern a great deal of our everyday behavior took shape hundreds or even thousands of years ago
  • Only an acquaintance with the entire human adventure on earth allows us to understand these dimensions of contemporary reality.
  • it follows that study of history is essential for every young person.
  • Collective memory is quite the same. Historians are always at work reinterpreting the past, asking new questions, searching new sources and finding new meanings in old documents in order to bring the perspective of new knowledge and experience to bear on the task of understanding the past.
  • what we know and believe about history is always changing. In other words, our collective, codified memory alters with time just as personal memories do, and for the same reasons.
  • skeptics are likely to conclude that history has no right to take student time from other subjects. If what is taught today is not really true, how can it claim space in a crowded school curriculum?
  • what if the world is more complicated and diverse than words can ever tell? What if human minds are incapable of finding neat pigeonholes into which everything that happens will fit?
  • What if we have to learn to live with uncertainty and probabilities, and act on the basis of the best guesswork we are capable of?
  • Then, surely, the changing perspectives of historical understanding are the very best introduction we can have to the practical problems of real life. Then, surely, a serious effort to understand the interplay of change and continuity in human affairs is the only adequate introduction human beings can have to the confusing flow of events that constitutes the actual, adult world.
  • Memory is not something fixed and forever. As time passes, remembered personal experiences take on new meanings.
  • Early in this century, teachers and academic administrators pretty well agreed that two sorts of history courses were needed: a survey of the national history of the United States and a survey of European history.
  • Memory, indeed, makes us human. History, our collective memory, carefully codified and critically revised, makes us social, sharing ideas and ideals with others so as to form all sorts of different human groups
  • The varieties of history are enormous; facts and probabilities about the past are far too numerous for anyone to comprehend them all. Every sort of human group has its own history
  • Where to start? How bring some sort of order to the enormous variety of things known and believed about the past?
  • Systematic sciences are not enough. They discount time, and therefore oversimplify reality, especially human reality.
  • This second course was often broadened into a survey of Western civilization in the 1930s and 1940s
  • But by the 1960s and 1970s these courses were becoming outdated, left behind by the rise of new kinds of social and quantitative history, especially the history of women, of Blacks, and of other formerly overlooked groups within the borders of the United States, and of peoples emerging from colonial status in the world beyond our borders.
  • much harder to combine old with new to make an inclusive, judiciously balanced (and far less novel) introductory course for high school or college students.
  • But abandoning the effort to present a meaningful portrait of the entire national and civilizational past destroyed the original justification for requiring students to study history
  • Competing subjects abounded, and no one could or would decide what mattered most and should take precedence. As this happened, studying history became only one among many possible ways of spending time in school.
  • The costs of this change are now becoming apparent, and many concerned persons agree that returning to a more structured curriculum, in which history ought to play a prominent part, is imperative.
  • three levels of generality seem likely to have the greatest importance for ordinary people.
  • First is family, local, neighborhood history
  • Second is national history, because that is where political power is concentrated in our time.
  • Last is global history, because intensified communications make encounters with all the other peoples of the earth increasingly important.
  • Other pasts are certainly worth attention, but are better studied in the context of a prior acquaintance with personal-local, national, and global history. That is because these three levels are the ones that affect most powerfully what all other groups and segments of society actually do.
  • National history that leaves out Blacks and women and other minorities is no longer acceptable; but American history that leaves out the Founding Fathers and the Constitution is not acceptable either. What is needed is a vision of the whole, warts and all.
  • the study of history does not lead to exact prediction of future events. Though it fosters practical wisdom, knowledge of the past does not permit anyone to know exactly what is going to happen
  • Consequently, the lessons of history, though supremely valuable when wisely formulated, become grossly misleading when oversimplifiers try to transfer them mechanically from one age to another, or from one place to another.
  • Predictable fixity is simply not the human way of behaving. Probabilities and possibilities-together with a few complete surprises-are what we live with and must learn to expect.
  • Second, as acquaintance with the past expands, delight in knowing more and more can and often does become an end in itself.
  • On the other hand, studying alien religious beliefs, strange customs, diverse family patterns and vanished social structures shows how differently various human groups have tried to cope
  • Broadening our humanity and extending our sensibilities by recognizing sameness and difference throughout the recorded past is therefore an important reason for studying history, and especially the history of peoples far away and long ago
  • For we can only know ourselves by knowing how we resemble and how we differ from others. Acquaintance with the human past is the only way to such self knowledge.
ilanaprincilus06

Trump has trashed America's most important alliance. The rift with Europe could take de...

  • The presidency of Donald Trump has left such a wretched stench in Europe that it's hard to see how, even in four years, Joe Biden could possibly get America's most important alliance back on track.
  • Throughout Trump's term, Europeans have been walking a tightrope, trying to balance outright condemnation of the President's most destructive behavior with not alienating the leader of the Western world.
  • Trump went out of his way to "gradually undo a lot of what the EU was working towards on the world stage," pointing specifically to the Iran nuclear deal and the Paris climate accord.
  • "The European relationship has changed and will now be shrouded in skepticism,"
  • Trump's outward aggression affected all aspects of European life, be it trade, defense or even the emotional shared ideas and cultural ties.
  • All those things suddenly seem debased and of less value."
  • "When they did take big stances on things like China or Iran, they chose not to involve anyone, leaving Europeans scrambling for a response,"
  • But he might have to accept that America's role in these relationships has changed."
  • This has led to lots of countries having to think more seriously about their future with a less assertive US,"
  • "In some respects, it was a good thing Trump forced us to think more about diplomatic initiatives, NATO and withdrawal of US troops,"
  • A view many European officials share is that no matter how friendly Biden is, Trump happened once -- and could happen again.
  • In 2024, Ivanka Trump, Donald Trump Jr., Mike Pompeo, or any other of his allies could conceivably pick up the torch and win an election.
  • "We cannot afford to be naive. If you look at the number of votes that Trump got, he wields an influence on American voters.
  • This anti-global, 'America First' undercurrent in American politics is still very much alive and we have to hedge our bets,"
  • For the US, it's unclear whether being downgraded as a diplomatic force is something that its citizens, who've lived through four introspective years of "America First," will even care about.
  • Regardless, the Trump era has left Europeans with little choice but to wait and see how much of a priority Biden places on reclaiming America's place on the world stage.
pier-paolo

The Brain on Love - The New York Times

  • A RELATIVELY new field, called interpersonal neurobiology, draws its vigor from one of the great discoveries of our era: that the brain is constantly rewiring itself based on daily life.
  • All relationships change the brain — but most important are the intimate bonds that foster or fail us, altering the delicate circuits that shape memories, emotions and that ultimate souvenir, the self
  • At birth, the brain starts blazing new neural pathways based on its odyssey in an alien world. An infant is steeped in bright, buzzing, bristling sensations, raw emotions and the curious feelings they unleash, weird objects, a flux of faces, shadowy images and dreams
  • As the most social apes, we inhabit a mirror-world in which every important relationship, whether with spouse, friend or child, shapes the brain, which in turn shapes our relationships.
  • Just consider how much learning happens when you choose a mate. Along with thrilling dependency comes glimpsing the world through another’s eyes; forsaking some habits and adopting others (good or bad); tasting new ideas, rituals, foods or landscapes; a slew of added friends and family; a tapestry of physical intimacy and affection; and many other catalysts, including a tornadic blast of attraction and attachment hormones — all of which revamp the brain.
  • During idylls of safety, when your brain knows you’re with someone you can trust, it needn’t waste precious resources coping with stressors or menace. Instead it may spend its lifeblood learning new things or fine-tuning the process of healing.
mshilling1

A psychologist explains why people believe conspiracy theories - Business Insider - 0 views

  • a personality trait where a person is so "focused on their own interests they will manipulate, deceive, and exploit others to achieve their goals."
  • In terms of cognitive processes, people with stronger conspiracy beliefs are more likely to overestimate the likelihood of co-occurring events, to attribute intentionality where it is unlikely to exist, and to have lower levels of analytic thinking.
  • But once a person starts inventing a narrative out of thin air, you can see very little critical thinking occurring.
  • ...8 more annotations...
  • Lantian et al.'s (2017) research examined the relationship between a person's 'need for uniqueness' and belief in conspiracy theories, and found a correlation.
  • We argue that people high in need for uniqueness should be more likely than others to endorse conspiracy beliefs because conspiracy theories represent the possession of unconventional and potentially scarce information.
  • People who believe in conspiracy theories can feel "special," in a positive sense, because they may feel that they are more informed than others about important social and political events.
  • Our findings can also be connected to recent research demonstrating that individual narcissism, or a grandiose idea of the self, is positively related to belief in conspiracy theories.
  • Due to these individuals feeling alienated from their peers, they may also turn to conspiracist groups for a sense of belonging and community, or to marginalized subcultures in which conspiracy theories are potentially more rife.
  • In this sense, conspiracy theories give a sense of meaning, security and control over an unpredictable and dangerous world.
  • The Internet has amplified the abilities of these like-minded people to come together to share and expand on their conspiracy theories.
  • Save your breath arguing with people who believe in them, as no amount of facts will dissuade them from their false belief.