
TOK Friends / Group items tagged: old


Emily Freilich

50 Cliches Of Gray: In Defense Of Old Truisms : The Protojournalist : NPR - 1 views

  • The word police at Lake Superior State University in Michigan have been trying to strike the phrase from public discourse since 1999.
  • "English is a very dynamic language," says David F. Beer, a retired writing professor at the University of Texas at Austin, "and parts of it are always growing or dropping off. And we don't have an English Academy as the French do to tell us what is right and what is wrong in the language. Thus cliches such as 'at the end of the day' are to be found all over the language."
  • While some hoary sayings occasionally fall by the wayside — for lots of reasons, such as a rise in social awareness — others will be with us from here to eternity.
  • Cliches can cut through claptrap like a knife through butter. We can use them as a kind of societal shorthand.
  • But the very fact that a word or phrase has become a cliche, "through popular use – and overuse," the report continues, "suggests that the phrase has lost originality and ingenuity and, thus, impact."
  • "Avoid cliches ...like the plague," Toastmasters International, a worldwide group that works to improve communication skills, advises. Tongue-in-cheek, of course.
  • A cliche can be as comfortable as an old shoe, as helpful as all get out. A cliche is like a long lost friend,
Javier E

New Thinking and Old Books Revisited - NYTimes.com - 0 views

  • Mark Thoma’s classic crack — “I’ve learned that new economic thinking means reading old books” — has a serious point to it. We’ve had a couple of centuries of economic thought at this point, and quite a few smart people doing the thinking. It’s possible to come up with truly new concepts and approaches, but it takes a lot more than good intentions and casual observation to get there.
  • There is definitely a faction within economics that considers it taboo to introduce anything into its analysis that isn’t grounded in rational behavior and market equilibrium
  • what I do, and what everyone I’ve just named plus many others does, is a more modest, more eclectic form of analysis. You use maximization and equilibrium where it seems reasonably consistent with reality, because of its clarifying power, but you introduce ad hoc deviations where experience seems to demand them — downward rigidity of wages, balance-sheet constraints, bubbles (which are hard to predict, but you can say a lot about their consequences).
  • You may say that what we need is reconstruction from the ground up — an economics with no vestige of equilibrium analysis. Well, show me some results. As it happens, the hybrid, eclectic approach I’ve just described has done pretty well in this crisis, so you had better show me some really superior results before it gets thrown out the window.
  • if you think you’ve found a fundamental logical flaw in one of our workhorse economic models, the odds are very strong that you’ve just made a mistake.
  • it’s quite clear that the teaching of macroeconomics has gone seriously astray. As Saraceno says, the simple models that have proved so useful since 2008 are by and large taught only at the undergrad level — they’re treated as too simple, too ad hoc, whatever, to make it into the grad courses even at places that aren’t very ideological.
  • to temper your modeling with a sense of realism you need to know something about reality — and not just the statistical properties of U.S. time series since 1947. Economic history — global economic history — should be a core part of the curriculum. Nobody should be making pronouncements on macro without knowing a fair bit about the collapse of the gold standard in the 1930s, what actually happened in the stagflation of the 1970s, the Asian financial crisis of the 90s, and, looking forward, the euro crisis.
Javier E

Our Biased Brains - NYTimes.com - 0 views

  • The human brain seems to be wired so that it categorizes people by race in the first one-fifth of a second after seeing a face
  • Racial bias also begins astonishingly early: Even infants often show a preference for their own racial group. In one study, 3-month-old white infants were shown photos of faces of white adults and black adults; they preferred the faces of whites. For 3-month-old black infants living in Africa, it was the reverse.
  • in evolutionary times we became hard-wired to make instantaneous judgments about whether someone is in our “in group” or not — because that could be lifesaving. A child who didn’t prefer his or her own group might have been at risk of being clubbed to death.
  • I encourage you to test yourself at implicit.harvard.edu. It’s sobering to discover that whatever you believe intellectually, you’re biased about race, gender, age or disability.
  • unconscious racial bias turns up in children as soon as they have the verbal skills to be tested for it, at about age 4. The degree of unconscious bias then seems pretty constant: In tests, this unconscious bias turns out to be roughly the same for a 4- or 6-year-old as for a senior citizen who grew up in more racially oppressive times.
  • Many of these experiments on in-group bias have been conducted around the world, and almost every ethnic group shows a bias favoring its own. One exception: African-Americans.
  • in contrast to other groups, African-Americans do not have an unconscious bias toward their own. From young children to adults, they are essentially neutral and favor neither whites nor blacks.
  • even if we humans have evolved to have a penchant for racial preferences from a very young age, this is not destiny. We can resist the legacy that evolution has bequeathed us.
  • “We wouldn’t have survived if our ancestors hadn’t developed bodies that store sugar and fat,” Banaji says. “What made them survive is what kills us.” Yet we fight the battle of the bulge and sometimes win — and, likewise, we can resist a predisposition for bias against other groups.
  • Deep friendships, especially romantic relationships with someone of another race, also seem to mute bias
carolinewren

Whoops! A creationist museum supporter stumbled upon a major fossil find. - The Washington Post - 0 views

  • Adhering to the most extreme form of religious creationism, the exhibits "prove" that the Earth is only around 6,000 years old, and that humans and dinosaurs co-existed.
  • Unfortunately, Nernberg just dug up a 60-million-year-old fish
  • Local outlets report that the man is far from shaken by the bony fish, which he found while excavating a basement in Calgary.
  • He just doesn't believe they're that old. And he's quite the fossil lover.
  • "We all have the same evidence, and it's just a matter of how you interpret it,"
  • “There’s no dates stamped on these things."
  • Just, you know, isotopic dating, basic geology, really shoddy stuff like that.
  • the science of dating fossils is not shaky -- at least not on the order of tens of millions of years of error -- so this fossil and the rocks around it really do give new earth creationism the boot. (A toy decay-law sketch of that arithmetic follows this list.)
  • But this can go down as one of the best examples ever of why it's downright impossible to convince someone who's "opposed" to evolution that it's a basic fact: If you think the very tenets of science are misguided, pretty much any evidence presented to you can be written off as fabricated or misinterpreted.
  • scientific community is thrilled and grateful for the find, and the University of Calgary will unveil the five fossils on Thursday.
  • It's an important point in Earth's evolutionary history, because new species were popping up all over to make up for the ecological niches dinos left behind.
  • Ironically, Nernberg's contributions at the Creation Science Museum are almost certainly what scientists have to thank for the find
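
A hedged aside, not taken from the article: the arithmetic behind an age like "60 million years" is a direct application of the exponential decay law. The sketch below uses simplified potassium-argon-style numbers and an invented isotope ratio purely for illustration; real potassium-argon dating also corrects for the fraction of decays that actually yield argon.

```python
import math

# Decay law: N(t) = N0 * exp(-lam * t)  =>  age = ln(1 + daughter/parent) / lam
half_life_years = 1.25e9                     # approximate half-life of potassium-40
lam = math.log(2) / half_life_years          # decay constant, per year

daughter_to_parent = 0.034                   # hypothetical measured isotope ratio

age_years = math.log(1 + daughter_to_parent) / lam
print(f"estimated age: {age_years / 1e6:.0f} million years")  # ~60 million years
```
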
grayton downing

Oldest Fossil of Ape Discovered | The Scientist Magazine® - 0 views

  • genes of living primates tell us that the ape lineage, which includes humans, diverged from the Old World monkeys such as baboons and macaques during the late Oligocene period, between 25 and 30 million years ago.
  • Now, a team of paleontologists have found two new species in Tanzania’s Rukwa Rift Basin that help to fill this gap
  • species were found in sediments that date to precisely 25.2 million years ago.
  • first new fossil was unearthed in 2011, when the team found a molar belonging to the oldest known Old World monkey or cercopithecoid, which they named Nsungwepithecus gunnelli. A year later and 15 kilometers away, they found the oldest known remains of a hominoid or “ape”—a jawbone and four teeth belonging to a new species that they dubbed Rukwapithecus fleaglei. 
  • “That implies that the hominoid-cercopithecoid divergence was well underway.”
  • fossils from the late Oligocene are rare, partly because there are few deposits of the right age and they tend to be covered in thick vegetation.
  • the new discoveries expand the range of Oligocene primates from a handful of fossils in Kenya and Saudi Arabia, to the more southerly country of Tanzania. This location also suggests that the ape and Old World monkey lineages arose against a background of great geological upheaval. By the time Rukwapithecus and Nsungwepithecus appeared, the climate was warming and the flat Tanzanian landscape had already begun to fragment into mountains, deep rifts, and lakes, creating the beginnings of the Eastern African Rift. These changes could have created many new habitats, and fueled the diversification of the local primates.
  • "were expecting [similar] fossils to occur in a more tectonically stable landscape, and were searching to the north in Kenya, Libya, and Egypt,"
  • “The Rukwa Rift Basin Project has succeeded in destroying those preconceived notions and opens many new possibilities.”
  • "We suspect that by 25 million years ago, there were already several independent lineages of both apes and monkeys in Africa, but paleontologists just haven’t found their fossil remains yet. We are soon headed back to the field to try to find some more!"
carolinewren

Book Review: 'A New History of Life' by Peter Ward and Joe Kirschvink - WSJ - 0 views

  • I imagine that physicists are similarly deluged with revelations about how to build a perpetual-motion machine or about the hitherto secret truth behind relativity. And so I didn’t view the arrival of “A New History of Life” with great enthusiasm.
  • subtitle breathlessly promises “radical new discoveries about the origins and evolution of life on earth,” while the jacket copy avers that “our current paradigm for understanding the history of life on Earth dates back to Charles Darwin’s time, yet scientific advances of the last few decades have radically reshaped that aging picture.”
  • authors Peter Ward and Joe Kirschvink are genuine scientists—paleontologists, to be exact. And they can write.
  • even genuine scientists are human and as such susceptible to the allure of offering up new paradigms (as the historian of science Thomas Kuhn put it)
  • paleontologist Stephen Jay Gould insisted that his conception of “punctuated equilibria” (a kind of Marxist biology that blurred the lines between evolution and revolution), which he developed along with fellow paleontologist Niles Eldredge, upended the traditional Darwinian understanding of how natural selection works.
  • This notion doesn’t constitute a fundamental departure from plain old evolution by natural selection; it simply italicizes that sometimes the process is comparatively rapid, other times slower.
  • In addition, they have long had a peculiar perspective on evolution, because of the limitations of the fossil record
  • Darwin was a pioneering geologist as well as the greatest of all biologists, and his insights were backgrounded by the key concept of uniformitarianism, as advocated by Charles Lyell, his friend and mentor
  • previously regnant paradigm among geologists had been "catastrophism."
  • fossil record was therefore seen as reflecting the creation and extinction of new species by an array of dramatic and “unnatural” dei ex machina.
  • Of late, however, uniformitarianism has been on a losing streak. Catastrophism is back, with a bang . . . or a flood, or a burst of extraterrestrial radiation, or an onslaught of unpleasant, previously submerged chemicals
  • This emphasis on catastrophes is the first of a triad of novelties on which “A New History of Life” is based. The second involves an enhanced role for some common but insufficiently appreciated inorganic molecules, notably carbon dioxide, oxygen and hydrogen sulfide.
  • Life didn’t so much unfold smoothly over hundreds of millions of years as lurch chaotically in response to diverse crises and opportunities: too much oxygen, too little carbon dioxide, too little oxygen, too much carbon dioxide, too hot, too cold
  • So far, so good, except that in their eagerness to emphasize what is new and different, the authors teeter on the verge of the same trap as Gould: exaggerating the novelty of their own ideas.
  • Things begin to unravel when it comes to the third leg of Messrs. Ward and Kirschvink’s purported paradigmatic novelty: a supposed role for ecosystems—rain forests, deserts, rivers, coral reefs, deep-sea vents—as units of evolutionary change
  • “While the history of life may be populated by species,” they write, “it has been the evolution of ecosystems that has been the most influential factor in arriving at the modern-day assemblage of life. . . . [W]e know that on occasion in the deep past entirely new ecosystems appear, populated by new kinds of life.” True enough, but it is those “new kinds of life,” not whole ecosystems, upon which natural selection acts.
  • One of the most common popular misconceptions about evolution is that it proceeds “for the good of the species.”
  • The problem is that smaller, nimbler units are far more likely to reproduce differentially than are larger, clumsier, more heterogeneous ones. Insofar as ecosystems are consequential for evolution—and doubtless they are—it is because, like occasional catastrophes, they provide the immediate environment within which something not-so-new is acted out.
  • This is natural selection doing its same-old, same-old thing: acting by a statistically potent process of variation combined with selective retention and differential reproduction, a process that necessarily operates within the particular ecosystem that a given lineage occupies.
Javier E

Is Science Kind of a Scam? - The New Yorker - 1 views

  • No well-tested scientific concept is more astonishing than the one that gives its name to a new book by the Scientific American contributing editor George Musser, "Spooky Action at a Distance."
  • The ostensible subject is the mechanics of quantum entanglement; the actual subject is the entanglement of its observers.
  • his question isn’t so much how this weird thing can be true as why, given that this weird thing had been known about for so long, so many scientists were so reluctant to confront it. What keeps a scientific truth from spreading?
  • it is as if two magic coins, flipped at different corners of the cosmos, always came up heads or tails together. (The spooky action takes place only in the context of simultaneous measurement. The particles share states, but they don’t send signals.) A minimal coin-flip sketch of this shared-state picture appears after this list.
  • fashion, temperament, zeitgeist, and sheer tenacity affected the debate, along with evidence and argument.
  • The certainty that spooky action at a distance takes place, Musser says, challenges the very notion of “locality,” our intuitive sense that some stuff happens only here, and some stuff over there. What’s happening isn’t really spooky action at a distance; it’s spooky distance, revealed through an action.
  • Why, then, did Einstein’s question get excluded for so long from reputable theoretical physics? The reasons, unfolding through generations of physicists, have several notable social aspects,
  • What started out as a reductio ad absurdum became proof that the cosmos is in certain ways absurd. What began as a bug became a feature and is now a fact.
  • “If poetry is emotion recollected in tranquility, then science is tranquility recollected in emotion.” The seemingly neutral order of the natural world becomes the sounding board for every passionate feeling the physicist possesses.
  • Musser explains that the big issue was settled mainly by being pushed aside. Generational imperatives trumped evidentiary ones. The things that made Einstein the lovable genius of popular imagination were also the things that made him an easy object of condescension. The hot younger theorists patronized him,
  • There was never a decisive debate, never a hallowed crucial experiment, never even a winning argument to settle the case, with one physicist admitting, “Most physicists (including me) accept that Bohr won the debate, although like most physicists I am hard pressed to put into words just how it was done.”
  • Arguing about non-locality went out of fashion, in this account, almost the way “Rock Around the Clock” displaced Sinatra from the top of the charts.
  • The same pattern of avoidance and talking-past and taking on the temper of the times turns up in the contemporary science that has returned to the possibility of non-locality.
  • the revival of “non-locality” as a topic in physics may be due to our finding the metaphor of non-locality ever more palatable: “Modern communications technology may not technically be non-local but it sure feels that it is.”
  • Living among distant connections, where what happens in Bangalore happens in Boston, we are more receptive to the idea of such a strange order in the universe.
  • The “indeterminacy” of the atom was, for younger European physicists, “a lesson of modernity, an antidote to a misplaced Enlightenment trust in reason, which German intellectuals in the 1920’s widely held responsible for their country’s defeat in the First World War.” The tonal and temperamental difference between the scientists was as great as the evidence they called on.
  • Science isn’t a slot machine, where you drop in facts and get out truths. But it is a special kind of social activity, one where lots of different human traits—obstinacy, curiosity, resentment of authority, sheer cussedness, and a grudging readiness to submit pet notions to popular scrutiny—end by producing reliable knowledge
  • What was magic became mathematical and then mundane. “Magical” explanations, like spooky action, are constantly being revived and rebuffed, until, at last, they are reinterpreted and accepted. Instead of a neat line between science and magic, then, we see a jumpy, shifting boundary that keeps getting redrawn
  • Real-world demarcations between science and magic, Musser’s story suggests, are like Bugs’s: made on the move and as much a trap as a teaching aid.
  • In the past several decades, certainly, the old lines between the history of astrology and astronomy, and between alchemy and chemistry, have been blurred; historians of the scientific revolution no longer insist on a clean break between science and earlier forms of magic.
  • Where once logical criteria between science and non-science (or pseudo-science) were sought and taken seriously—Karl Popper’s criterion of “falsifiability” was perhaps the most famous, insisting that a sound theory could, in principle, be proved wrong by one test or another—many historians and philosophers of science have come to think that this is a naïve view of how the scientific enterprise actually works.
  • They see a muddle of coercion, old magical ideas, occasional experiment, hushed-up failures—all coming together in a social practice that gets results but rarely follows a definable logic.
  • Yet the old notion of a scientific revolution that was really a revolution is regaining some credibility.
  • David Wootton, in his new, encyclopedic history, “The Invention of Science” (Harper), recognizes the blurred lines between magic and science but insists that the revolution lay in the public nature of the new approach.
  • What killed alchemy was the insistence that experiments must be openly reported in publications which presented a clear account of what had happened, and they must then be replicated, preferably before independent witnesses.
  • Wootton, while making little of Popper’s criterion of falsifiability, makes it up to him by borrowing a criterion from his political philosophy. Scientific societies are open societies. One day the lunar tides are occult, the next day they are science, and what changes is the way in which we choose to talk about them.
  • Wootton also insists, against the grain of contemporary academia, that single observed facts, what he calls “killer facts,” really did polish off antique authorities
  • once we agree that the facts are facts, they can do amazing work. Traditional Ptolemaic astronomy, in place for more than a millennium, was destroyed by what Galileo discovered about the phases of Venus. That killer fact "serves as a single, solid, and strong argument to establish its revolution around the Sun, such that no room whatsoever remains for doubt," Galileo wrote, and Wootton adds, "No one was so foolish as to dispute these claims."
  • Several things flow from Wootton’s view. One is that “group think” in the sciences is often true think. Science has always been made in a cloud of social networks.
  • There has been much talk in the pop-sci world of “memes”—ideas that somehow manage to replicate themselves in our heads. But perhaps the real memes are not ideas or tunes or artifacts but ways of making them—habits of mind rather than products of mind
  • Is science, then, a club like any other, with fetishes and fashions, with schemers, dreamers, and blackballed applicants? Is there a real demarcation to be made between science and every other kind of social activity?
  • The claim that basic research is valuable because it leads to applied technology may be true but perhaps is not at the heart of the social use of the enterprise. The way scientists do think makes us aware of how we can think
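
Purely as an illustration, not from the article: the "magic coins" analogy above can be sketched as a shared state fixed when the pair is created, so the two distant results always match without any signal passing between them. The function name and outcomes below are invented for the example.

```python
import random

def make_entangled_pair():
    # Hypothetical shared state, decided at creation time; nothing is
    # transmitted when the two sides are later "measured".
    shared_outcome = random.choice(["heads", "tails"])
    return shared_outcome, shared_outcome

alice_result, bob_result = make_entangled_pair()
print(alice_result, bob_result)
assert alice_result == bob_result  # always agree, however far apart

# Caveat: this pre-agreed recipe only mimics the perfect same-setting
# correlation described above; measurements at different settings yield
# quantum correlations that no such local recipe can reproduce (Bell's
# theorem), which is why physicists call them non-local.
```
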
Javier E

At the Existentialist Café: Freedom, Being, and Apricot Cocktails with Jean-P... - 0 views

  • The phenomenologists’ leading thinker, Edmund Husserl, provided a rallying cry, ‘To the things themselves!’ It meant: don’t waste time on the interpretations that accrue upon things, and especially don’t waste time wondering whether the things are real. Just look at this that’s presenting itself to you, whatever this may be, and describe it as precisely as possible.
  • You might think you have defined me by some label, but you are wrong, for I am always a work in progress. I create myself constantly through action, and this is so fundamental to my human condition that, for Sartre, it is the human condition, from the moment of first consciousness to the moment when death wipes it out. I am my own freedom: no more, no less.
  • Sartre wrote like a novelist — not surprisingly, since he was one. In his novels, short stories and plays as well as in his philosophical treatises, he wrote about the physical sensations of the world and the structures and moods of human life. Above all, he wrote about one big subject: what it meant to be free. Freedom, for him, lay at the heart of all human experience, and this set humans apart from all other kinds of object.
  • Sartre listened to his problem and said simply, ‘You are free, therefore choose — that is to say, invent.’ No signs are vouchsafed in this world, he said. None of the old authorities can relieve you of the burden of freedom. You can weigh up moral or practical considerations as carefully as you like, but ultimately you must take the plunge and do something, and it’s up to you what that something is.
  • Even if the situation is unbearable — perhaps you are facing execution, or sitting in a Gestapo prison, or about to fall off a cliff — you are still free to decide what to make of it in mind and deed. Starting from where you are now, you choose. And in choosing, you also choose who you will be.
  • The war had made people realise that they and their fellow humans were capable of departing entirely from civilised norms; no wonder the idea of a fixed human nature seemed questionable.
  • If this sounds difficult and unnerving, it’s because it is. Sartre does not deny that the need to keep making decisions brings constant anxiety. He heightens this anxiety by pointing out that what you do really matters. You should make your choices as though you were choosing on behalf of the whole of humanity, taking the entire burden of responsibility for how the human race behaves. If you avoid this responsibility by fooling yourself that you are the victim of circumstance or of someone else’s bad advice, you are failing to meet the demands of human life and choosing a fake existence, cut off from your own ‘authenticity’.
  • Along with the terrifying side of this comes a great promise: Sartre’s existentialism implies that it is possible to be authentic and free, as long as you keep up the effort.
  • almost all agreed that it was, as an article in Les nouvelles littéraires phrased it, a ‘sickening mixture of philosophic pretentiousness, equivocal dreams, physiological technicalities, morbid tastes and hesitant eroticism … an introspective embryo that one would take distinct pleasure in crushing’.
  • he offered a philosophy designed for a species that had just scared the hell out of itself, but that finally felt ready to grow up and take responsibility.
  • In this rebellious world, just as with the Parisian bohemians and Dadaists in earlier generations, everything that was dangerous and provocative was good, and everything that was nice or bourgeois was bad.
  • Such interweaving of ideas and life had a long pedigree, although the existentialists gave it a new twist. Stoic and Epicurean thinkers in the classical world had practised philosophy as a means of living well, rather than of seeking knowledge or wisdom for their own sake. By reflecting on life’s vagaries in philosophical ways, they believed they could become more resilient, more able to rise above circumstances, and better equipped to manage grief, fear, anger, disappointment or anxiety.
  • In the tradition they passed on, philosophy is neither a pure intellectual pursuit nor a collection of cheap self-help tricks, but a discipline for flourishing and living a fully human, responsible life.
  • For Kierkegaard, Descartes had things back to front. In his own view, human existence comes first: it is the starting point for everything we do, not the result of a logical deduction. My existence is active: I live it and choose it, and this precedes any statement I can make about myself.
  • Studying our own moral genealogy cannot help us to escape or transcend ourselves. But it can enable us to see our illusions more clearly and lead a more vital, assertive existence.
  • What was needed, he felt, was not high moral or theological ideals, but a deeply critical form of cultural history or ‘genealogy’ that would uncover the reasons why we humans are as we are, and how we came to be that way. For him, all philosophy could even be redefined as a form of psychology, or history.
  • For those oppressed on grounds of race or class, or for those fighting against colonialism, existentialism offered a change of perspective — literally, as Sartre proposed that all situations be judged according to how they appeared in the eyes of those most oppressed, or those whose suffering was greatest.
  • She observed that we need not expect moral philosophers to ‘live by’ their ideas in a simplistic way, as if they were following a set of rules. But we can expect them to show how their ideas are lived in. We should be able to look in through the windows of a philosophy, as it were, and see how people occupy it, how they move about and how they conduct themselves.
  • the existentialists inhabited their historical and personal world, as they inhabited their ideas. This notion of ‘inhabited philosophy’ is one I’ve borrowed from the English philosopher and novelist Iris Murdoch, who wrote the first full-length book on Sartre and was an early adopter of existentialism
  • What is existentialism anyway?
  • An existentialist who is also phenomenological provides no easy rules for dealing with this condition, but instead concentrates on describing lived experience as it presents itself. — By describing experience well, he or she hopes to understand this existence and awaken us to ways of living more authentic lives.
  • Existentialists concern themselves with individual, concrete human existence. — They consider human existence different from the kind of being other things have. Other entities are what they are, but as a human I am whatever I choose to make of myself at every moment. I am free — — and therefore I’m responsible for everything I do, a dizzying fact which causes — an anxiety inseparable from human existence itself.
  • On the other hand, I am only free within situations, which can include factors in my own biology and psychology as well as physical, historical and social variables of the world into which I have been thrown. — Despite the limitations, I always want more: I am passionately involved in personal projects of all kinds. — Human existence is thus ambiguous: at once boxed in by borders and yet transcendent and exhilarating. —
  • The first part of this is straightforward: a phenomenologist’s job is to describe. This is the activity that Husserl kept reminding his students to do. It meant stripping away distractions, habits, clichés of thought, presumptions and received ideas, in order to return our attention to what he called the ‘things themselves’. We must fix our beady gaze on them and capture them exactly as they appear, rather than as we think they are supposed to be.
  • Husserl therefore says that, to phenomenologically describe a cup of coffee, I should set aside both the abstract suppositions and any intrusive emotional associations. Then I can concentrate on the dark, fragrant, rich phenomenon in front of me now. This ‘setting aside’ or ‘bracketing out’ of speculative add-ons Husserl called epoché — a term borrowed from the ancient Sceptics,
  • The point about rigour is crucial; it brings us back to the first half of the command to describe phenomena. A phenomenologist cannot get away with listening to a piece of music and saying, ‘How lovely!’ He or she must ask: is it plaintive? is it dignified? is it colossal and sublime? The point is to keep coming back to the ‘things themselves’ — phenomena stripped of their conceptual baggage — so as to bail out weak or extraneous material and get to the heart of the experience.
  • Husserlian ‘bracketing out’ or epoché allows the phenomenologist to temporarily ignore the question ‘But is it real?’, in order to ask how a person experiences his or her world. Phenomenology gives a formal mode of access to human experience. It lets philosophers talk about life more or less as non-philosophers do, while still being able to tell themselves they are being methodical and rigorous.
  • Besides claiming to transform the way we think about reality, phenomenologists promised to change how we think about ourselves. They believed that we should not try to find out what the human mind is, as if it were some kind of substance. Instead, we should consider what it does, and how it grasps its experiences.
  • For Brentano, this reaching towards objects is what our minds do all the time. Our thoughts are invariably of or about something, he wrote: in love, something is loved, in hatred, something is hated, in judgement, something is affirmed or denied. Even when I imagine an object that isn’t there, my mental structure is still one of ‘about-ness’ or ‘of-ness’.
  • Except in deepest sleep, my mind is always engaged in this aboutness: it has ‘intentionality’. Having taken the germ of this from Brentano, Husserl made it central to his whole philosophy.
  • Husserl saw in the idea of intentionality a way to sidestep two great unsolved puzzles of philosophical history: the question of what objects ‘really’ are, and the question of what the mind ‘really’ is. By doing the epoché and bracketing out all consideration of reality from both topics, one is freed to concentrate on the relationship in the middle. One can apply one’s descriptive energies to the endless dance of intentionality that takes place in our lives: the whirl of our minds as they seize their intended phenomena one after the other and whisk them around the floor,
  • Understood in this way, the mind hardly is anything at all: it is its aboutness. This makes the human mind (and possibly some animal minds) different from any other naturally occurring entity. Nothing else can be as thoroughly about or of things as the mind is:
  • Some Eastern meditation techniques aim to still this scurrying creature, but the extreme difficulty of this shows how unnatural it is to be mentally inert. Left to itself, the mind reaches out in all directions as long as it is awake — and even carries on doing it in the dreaming phase of its sleep.
  • a mind that is experiencing nothing, imagining nothing, or speculating about nothing can hardly be said to be a mind at all.
  • Three simple ideas — description, phenomenon, intentionality — provided enough inspiration to keep roomfuls of Husserlian assistants busy in Freiburg for decades. With all of human existence awaiting their attention, how could they ever run out of things to do?
  • For Sartre, this gives the mind an immense freedom. If we are nothing but what we think about, then no predefined ‘inner nature’ can hold us back. We are protean.
  • way of this interpretation. Real, not real; inside, outside; what difference did it make? Reflecting on this, Husserl began turning his phenomenology into a branch of ‘idealism’ — the philosophical tradition which denied external reality and defined everything as a kind of private hallucination.
  • For Sartre, if we try to shut ourselves up inside our own minds, ‘in a nice warm room with the shutters closed’, we cease to exist. We have no cosy home: being out on the dusty road is the very definition of what we are.
  • One might think that, if Heidegger had anything worth saying, he could have communicated it in ordinary language. The fact is that he does not want to be ordinary, and he may not even want to communicate in the usual sense. He wants to make the familiar obscure, and to vex us. George Steiner thought that Heidegger’s purpose was less to be understood than to be experienced through a ‘felt strangeness’.
  • He takes Dasein in its most ordinary moments, then talks about it in the most innovative way he can. For Heidegger, Dasein’s everyday Being is right here: it is Being-in-the-world, or In-der-Welt-sein. The main feature of Dasein’s everyday Being-in-the-world right here is that it is usually busy doing something.
  • Thus, for Heidegger, all Being-in-the-world is also a ‘Being-with’ or Mitsein. We cohabit with others in a ‘with-world’, or Mitwelt. The old philosophical problem of how we prove the existence of other minds has now vanished. Dasein swims in the with-world long before it wonders about other minds.
  • Sometimes the best-educated people were those least inclined to take the Nazis seriously, dismissing them as too absurd to last. Karl Jaspers was one of those who made this mistake, as he later recalled, and Beauvoir observed similar dismissive attitudes among the French students in Berlin.
  • In any case, most of those who disagreed with Hitler’s ideology soon learned to keep their view to themselves. If a Nazi parade passed on the street, they would either slip out of view or give the obligatory salute like everyone else, telling themselves that the gesture meant nothing if they did not believe in it. As the psychologist Bruno Bettelheim later wrote of this period, few people will risk their life for such a small thing as raising an arm — yet that is how one’s powers of resistance are eroded away, and eventually one’s responsibility and integrity go with them.
  • for Arendt, if you do not respond adequately when the times demand it, you show a lack of imagination and attention that is as dangerous as deliberately committing an abuse. It amounts to disobeying the one command she had absorbed from Heidegger in those Marburg days: Think!
  • ‘Everything takes place under a kind of anaesthesia. Objectively dreadful events produce a thin, puny emotional response. Murders are committed like schoolboy pranks. Humiliation and moral decay are accepted like minor incidents.’ Haffner thought modernity itself was partly to blame: people had become yoked to their habits and to mass media, forgetting to stop and think, or to disrupt their routines long enough to question what was going on.
  • Heidegger’s former lover and student Hannah Arendt would argue, in her 1951 study The Origins of Totalitarianism, that totalitarian movements thrived at least partly because of this fragmentation in modern lives, which made people more vulnerable to being swept away by demagogues. Elsewhere, she coined the phrase ‘the banality of evil’ to describe the most extreme failures of personal moral awareness.
  • His communicative ideal fed into a whole theory of history: he traced all civilisation to an ‘Axial Period’ in the fifth century BC, during which philosophy and culture exploded simultaneously in Europe, the Middle East and Asia, as though a great bubble of minds had erupted from the earth’s surface. ‘True philosophy needs communion to come into existence,’ he wrote, and added, ‘Uncommunicativeness in a philosopher is virtually a criterion of the untruth of his thinking.’
  • The idea of being called to authenticity became a major theme in later existentialism, the call being interpreted as saying something like ‘Be yourself!’, as opposed to being phony. For Heidegger, the call is more fundamental than that. It is a call to take up a self that you didn’t know you had: to wake up to your Being. Moreover, it is a call to action. It requires you to do something: to take a decision of some sort.
  • Being and Time contained at least one big idea that should have been of use in resisting totalitarianism. Dasein, Heidegger wrote there, tends to fall under the sway of something called das Man or ‘the they’ — an impersonal entity that robs us of the freedom to think for ourselves. To live authentically requires resisting or outwitting this influence, but this is not easy because das Man is so nebulous. Man in German does not mean ‘man’ as in English (that’s der Mann), but a neutral abstraction, something like ‘one’ in the English phrase ‘one doesn’t do that’,
  • for Heidegger, das Man is me. It is everywhere and nowhere; it is nothing definite, but each of us is it. As with Being, it is so ubiquitous that it is difficult to see. If I am not careful, however, das Man takes over the important decisions that should be my own. It drains away my responsibility or ‘answerability’. As Arendt might put it, we slip into banality, failing to think.
  • Jaspers focused on what he called Grenzsituationen — border situations, or limit situations. These are the moments when one finds oneself constrained or boxed in by what is happening, but at the same time pushed by these events towards the limits or outer edge of normal experience. For example, you might have to make a life-or-death choice, or something might remind you suddenly of your mortality,
  • Jaspers’ interest in border situations probably had much to do with his own early confrontation with mortality. From childhood, he had suffered from a heart condition so severe that he always expected to die at any moment. He also had emphysema, which forced him to speak slowly, taking long pauses to catch his breath. Both illnesses meant that he had to budget his energies with care in order to get his work done without endangering his life.
  • If I am to resist das Man, I must become answerable to the call of my ‘voice of conscience’. This call does not come from God, as a traditional Christian definition of the voice of conscience might suppose. It comes from a truly existentialist source: my own authentic self. Alas, this voice is one I do not recognise and may not hear, because it is not the voice of my habitual ‘they-self’. It is an alien or uncanny version of my usual voice. I am familiar with my they-self, but not with my unalienated voice — so, in a weird twist, my real voice is the one that sounds strangest to me.
  • Marcel developed a strongly theological branch of existentialism. His faith distanced him from both Sartre and Heidegger, but he shared a sense of how history makes demands on individuals. In his essay ‘On the Ontological Mystery’, written in 1932 and published in the fateful year of 1933, Marcel wrote of the human tendency to become stuck in habits, received ideas, and a narrow-minded attachment to possessions and familiar scenes. Instead, he urged his readers to develop a capacity for remaining ‘available’ to situations as they arise. Similar ideas of disponibilité or availability had been explored by other writers,
  • Marcel made it his central existential imperative. He was aware of how rare and difficult it was. Most people fall into what he calls ‘crispation’: a tensed, encrusted shape in life — ‘as though each one of us secreted a kind of shell which gradually hardened and imprisoned him’.
  • Bettelheim later observed that, under Nazism, only a few people realised at once that life could not continue unaltered: these were the ones who got away quickly. Bettelheim himself was not among them. Caught in Austria when Hitler annexed it, he was sent first to Dachau and then to Buchenwald, but was then released in a mass amnesty to celebrate Hitler’s birthday in 1939 — an extraordinary reprieve, after which he left at once for America.
  • we are used to reading philosophy as offering a universal message for all times and places — or at least as aiming to do so. But Heidegger disliked the notion of universal truths or universal humanity, which he considered a fantasy. For him, Dasein is not defined by shared faculties of reason and understanding, as the Enlightenment philosophers thought. Still less is it defined by any kind of transcendent eternal soul, as in religious tradition. We do not exist on a higher, eternal plane at all. Dasein’s Being is local: it has a historical situation, and is constituted in time and place.
  • For Marcel, learning to stay open to reality in this way is the philosopher’s prime job. Everyone can do it, but the philosopher is the one who is called on above all to stay awake, so as to be the first to sound the alarm if something seems wrong.
  • Second, it also means understanding that we are historical beings, and grasping the demands our particular historical situation is making on us. In what Heidegger calls ‘anticipatory resoluteness’, Dasein discovers ‘that its uttermost possibility lies in giving itself up’. At that moment, through Being-towards-death and resoluteness in facing up to one’s time, one is freed from the they-self and attains one’s true, authentic self.
  • If we are temporal beings by our very nature, then authentic existence means accepting, first, that we are finite and mortal. We will die: this all-important realisation is what Heidegger calls authentic ‘Being-towards-Death’, and it is fundamental to his philosophy.
  • Hannah Arendt, instead, left early on: she had the benefit of a powerful warning. Just after the Nazi takeover, in spring 1933, she had been arrested while researching materials on anti-Semitism for the German Zionist Organisation at Berlin’s Prussian State Library. Her apartment was searched; both she and her mother were locked up briefly, then released. They fled, without stopping to arrange travel documents. They crossed to Czechoslovakia (then still safe) by a method that sounds almost too fabulous to be true: a sympathetic German family on the border had a house with its front door in Germany and its back door in Czechoslovakia. The family would invite people for dinner, then let them leave through the back door at night.
  • As Sartre argued in his 1943 review of The Stranger, basic phenomenological principles show that experience comes to us already charged with significance. A piano sonata is a melancholy evocation of longing. If I watch a soccer match, I see it as a soccer match, not as a meaningless scene in which a number of people run around taking turns to apply their lower limbs to a spherical object. If the latter is what I’m seeing, then I am not watching some more essential, truer version of soccer; I am failing to watch it properly as soccer at all.
  • Much as they liked Camus personally, neither Sartre nor Beauvoir accepted his vision of absurdity. For them, life is not absurd, even when viewed on a cosmic scale, and nothing can be gained by saying it is. Life for them is full of real meaning, although that meaning emerges differently for each of us.
  • For Sartre, we show bad faith whenever we portray ourselves as passive creations of our race, class, job, history, nation, family, heredity, childhood influences, events, or even hidden drives in our subconscious which we claim are out of our control. It is not that such factors are unimportant: class and race, in particular, he acknowledged as powerful forces in people’s lives, and Simone de Beauvoir would soon add gender to that list.
  • Sartre takes his argument to an extreme point by asserting that even war, imprisonment or the prospect of imminent death cannot take away my existential freedom. They form part of my ‘situation’, and this may be an extreme and intolerable situation, but it still provides only a context for whatever I choose to do next. If I am about to die, I can decide how to face that death. Sartre here resurrects the ancient Stoic idea that I may not choose what happens to me, but I can choose what to make of it, spiritually speaking.
  • But the Stoics cultivated indifference in the face of terrible events, whereas Sartre thought we should remain passionately, even furiously engaged with what happens to us and with what we can achieve. We should not expect freedom to be anything less than fiendishly difficult.
  • Freedom does not mean entirely unconstrained movement, and it certainly does not mean acting randomly. We often mistake the very things that enable us to be free — context, meaning, facticity, situation, a general direction in our lives — for things that define us and take away our freedom. It is only with all of these that we can be free in a real sense.
  • Nor did he mean that privileged groups have the right to pontificate to the poor and downtrodden about the need to ‘take responsibility’ for themselves. That would be a grotesque misreading of Sartre’s point, since his sympathy in any encounter always lay with the more oppressed side. But for each of us — for me — to be in good faith means not making excuses for myself.
  • Camus’ novel gives us a deliberately understated vision of heroism and decisive action compared to those of Sartre and Beauvoir. One can only do so much. It can look like defeatism, but it shows a more realistic perception of what it takes to actually accomplish difficult tasks like liberating one’s country.
  • Camus just kept returning to his core principle: no torture, no killing — at least not with state approval. Beauvoir and Sartre believed they were taking a more subtle and more realistic view. If asked why a couple of innocuous philosophers had suddenly become so harsh, they would have said it was because the war had changed them in profound ways. It had shown them that one’s duties to humanity could be more complicated than they seemed. ‘The war really divided my life in two,’ Sartre said later.
  • Poets and artists ‘let things be’, but they also let things come out and show themselves. They help to ease things into ‘unconcealment’ (Unverborgenheit), which is Heidegger’s rendition of the Greek term alētheia, usually translated as ‘truth’. This is a deeper kind of truth than the mere correspondence of a statement to reality, as when we say ‘The cat is on the mat’ and point to a mat with a cat on it. Long before we can do this, both cat and mat must ‘stand forth out of concealedness’. They must un-hide themselves.
  • Heidegger does not use the word ‘consciousness’ here because — as with his earlier work — he is trying to make us think in a radically different way about ourselves. We are not to think of the mind as an empty cavern, or as a container filled with representations of things. We are not even supposed to think of it as firing off arrows of intentional ‘aboutness’, as in the earlier phenomenology of Brentano. Instead, Heidegger draws us into the depths of his Schwarzwald, and asks us to imagine a gap with sunlight filtering in. We remain in the forest, but we provide a relatively open spot where other beings can bask for a moment. If we did not do this, everything would remain in the thickets, hidden even to itself.
  • The astronomer Carl Sagan began his 1980 television series Cosmos by saying that human beings, though made of the same stuff as the stars, are conscious and are therefore ‘a way for the cosmos to know itself’. Merleau-Ponty similarly quoted his favourite painter Cézanne as saying, ‘The landscape thinks itself in me, and I am its consciousness.’ This is something like what Heidegger thinks humanity contributes to the earth. We are not made of spiritual nothingness; we are part of Being, but we also bring something unique with us. It is not much: a little open space, perhaps with a path and a bench like the one the young Heidegger used to sit on to do his homework. But through us, the miracle occurs.
  • Beauty aside, Heidegger’s late writing can also be troubling, with its increasingly mystical notion of what it is to be human. If one speaks of a human being mainly as an open space or a clearing, or a means of ‘letting beings be’ and dwelling poetically on the earth, then one doesn’t seem to be talking about any recognisable person. The old Dasein has become less human than ever. It is now a forestry feature.
  • Even today, Jaspers, the dedicated communicator, is far less widely read than Heidegger, who has influenced architects, social theorists, critics, psychologists, artists, film-makers, environmental activists, and innumerable students and enthusiasts — including the later deconstructionist and post-structuralist schools, which took their starting point from his late thinking. Having spent the late 1940s as an outsider and then been rehabilitated, Heidegger became the overwhelming presence in university philosophy all over the European continent from then on.
  • As Levinas reflected on this experience, it helped to lead him to a philosophy that was essentially ethical, rather than ontological like Heidegger’s. He developed his ideas from the work of Jewish theologian Martin Buber, whose I and Thou in 1923 had distinguished between my relationship with an impersonal ‘it’ or ‘them’, and the direct personal encounter I have with a ‘you’. Levinas took it further: when I encounter you, we normally meet face-to-face, and it is through your face that you, as another person, can make ethical demands on me. This is very different from Heidegger’s Mitsein or Being-with, which suggests a group of people standing alongside one another, shoulder to shoulder as if in solidarity — perhaps as a unified nation or Volk.
  • For Levinas, we literally face each other, one individual at a time, and that relationship becomes one of communication and moral expectation. We do not merge; we respond to one another. Instead of being co-opted into playing some role in my personal drama of authenticity, you look me in the eyes — and you remain Other. You remain you.
  • This relationship is more fundamental than the self, more fundamental than consciousness, more fundamental even than Being — and it brings an unavoidable ethical obligation. Ever since Husserl, phenomenologists and existentialists had been trying to stretch the definition of existence to incorporate our social lives and relationships. Levinas did more: he turned philosophy around entirely so that these relationships were the foundation of our existence, not an extension of it.
  • Her last work, The Need for Roots, argues, among other things, that none of us has rights, but each one of us has a near-infinite degree of duty and obligation to the other. Whatever the underlying cause of her death — and anorexia nervosa seems to have been involved — no one could deny that she lived out her philosophy with total commitment. Of all the lives touched on in this book, hers is surely the most profound and challenging application of Iris Murdoch’s notion that a philosophy can be ‘inhabited’.
  • Other thinkers took radical ethical turns during the war years. The most extreme was Simone Weil, who actually tried to live by the principle of putting other people’s ethical demands first. Having returned to France after her travels through Germany in 1932, she had worked in a factory so as to experience the degrading nature of such work for herself. When France fell in 1940, her family fled to Marseilles (against her protests), and later to the US and to Britain. Even in exile, Weil made extraordinary sacrifices. If there were people in the world who could not sleep in a bed, she would not do so either, so she slept on the floor.
  • The mystery tradition had roots in Kierkegaard’s ‘leap of faith’. It owed much to the other great nineteenth-century mystic of the impossible, Dostoevsky, and to older theological notions. But it also grew from the protracted trauma that was the first half of the twentieth century. Since 1914, and especially since 1939, people in Europe and elsewhere had come to the realisation that we cannot fully know or trust ourselves; that we have no excuses or explanations for what we do — and yet that we must ground our existence and relationships on something firm, because otherwise we cannot survive.
  • One striking link between these radical ethical thinkers, all on the fringes of our main story, is that they had religious faith. They also granted a special role to the notion of ‘mystery’ — that which cannot be known, calculated or understood, especially when it concerns our relationships with each other. Heidegger was different from them, since he rejected the religion he grew up with and had no real interest in ethics — probably as a consequence of his having no real interest in the human.
  • Meanwhile, the Christian existentialist Gabriel Marcel was also still arguing, as he had since the 1930s, that ethics trumps everything else in philosophy and that our duty to each other is so great as to play the role of a transcendent ‘mystery’. He too had been led to this position partly by a wartime experience: during the First World War he had worked for the Red Cross’ Information Service, with the unenviable job of answering relatives’ inquiries about missing soldiers. Whenever news came, he passed it on, and usually it was not good. As Marcel later said, this task permanently inoculated him against warmongering rhetoric of any kind, and it made him aware of the power of what is unknown in our lives.
  • As the play’s much-quoted and frequently misunderstood final line has it: ‘Hell is other people.’ Sartre later explained that he did not mean to say that other people were hellish in general. He meant that after death we become frozen in their view, unable any longer to fend off their interpretation. In life, we can still do something to manage the impression we make; in death, this freedom goes and we are left entombed in other people’s memories and perceptions.
  • We have to do two near-impossible things at once: understand ourselves as limited by circumstances, and yet continue to pursue our projects as though we are truly in control. In Beauvoir’s view, existentialism is the philosophy that best enables us to do this, because it concerns itself so deeply with both freedom and contingency. It acknowledges the radical and terrifying scope of our freedom in life, but also the concrete influences that other philosophies tend to ignore: history, the body, social relationships and the environment.
  • The aspects of our existence that limit us, Merleau-Ponty says, are the very same ones that bind us to the world and give us scope for action and perception. They make us what we are. Sartre acknowledged the need for this trade-off, but he found it more painful to accept. Everything in him longed to be free of bonds, of impediments and limitations
  • Of course we have to learn this skill of interpreting and anticipating the world, and this happens in early childhood, which is why Merleau-Ponty thought child psychology was essential to philosophy. This is an extraordinary insight. Apart from Rousseau, very few philosophers before him had taken childhood seriously; most wrote as though all human experience were that of a fully conscious, rational, verbal adult who has been dropped into this world from the sky — perhaps by a stork.
  • For Merleau-Ponty, we cannot understand our experience if we don’t think of ourselves in part as overgrown babies. We fall for optical illusions because we once learned to see the world in terms of shapes, objects and things relevant to our own interests. Our first perceptions came to us in tandem with our first active experiments in observing the world and reaching out to explore it, and are still linked with those experiences.
  • Another factor in all of this, for Merleau-Ponty, is our social existence: we cannot thrive without others, or not for long, and we need this especially in early life. This makes solipsistic speculation about the reality of others ridiculous; we could never engage in such speculation if we hadn’t already been formed by them.
  • As Descartes could have said (but didn’t), ‘I think, therefore other people exist.’ We grow up with people playing with us, pointing things out, talking, listening, and getting us used to reading emotions and movements; this is how we become capable, reflective, smoothly integrated beings.
  • In general, Merleau-Ponty thinks human experience only makes sense if we abandon philosophy’s time-honoured habit of starting with a solitary, capsule-like, immobile adult self, isolated from its body and world, which must then be connected up again — adding each element around it as though adding clothing to a doll. Instead, for him, we slide from the womb to the birth canal to an equally close and total immersion in the world. That immersion continues as long as we live, although we may also cultivate the art of partially withdrawing from time to time when we want to think or daydream.
  • When he looks for his own metaphor to describe how he sees consciousness, he comes up with a beautiful one: consciousness, he suggests, is like a ‘fold’ in the world, as though someone had crumpled a piece of cloth to make a little nest or hollow. It stays for a while, before eventually being unfolded and smoothed away. There is something seductive, even erotic, in this idea of my conscious self as an improvised pouch in the cloth of the world. I still have my privacy — my withdrawing room. But I am part of the world’s fabric, and I remain formed out of it for as long as I am here.
  • By the time of these works, Merleau-Ponty is taking his desire to describe experience to the outer limits of what language can convey. Just as with the late Husserl or Heidegger, or Sartre in his Flaubert book, we see a philosopher venturing so far from shore that we can barely follow. Emmanuel Levinas would head out to the fringes too, eventually becoming incomprehensible to all but his most patient initiates.
  • Sartre once remarked — speaking of a disagreement they had about Husserl in 1941 — that ‘we discovered, astounded, that our conflicts had, at times, stemmed from our childhood, or went back to the elementary differences of our two organisms’. Merleau-Ponty also said in an interview that Sartre’s work seemed strange to him, not because of philosophical differences, but because of a certain ‘register of feeling’, especially in Nausea, that he could not share. Their difference was one of temperament and of the whole way the world presented itself to them.
  • The two also differed in their purpose. When Sartre writes about the body or other aspects of experience, he generally does it in order to make a different point. He expertly evokes the grace of his café waiter, gliding between the tables, bending at an angle just so, steering the drink-laden tray through the air on the tips of his fingers — but he does it all in order to illustrate his ideas about bad faith. When Merleau-Ponty writes about skilled and graceful movement, the movement itself is his point. This is the thing he wants to understand.
  • We can never move definitively from ignorance to certainty, for the thread of the inquiry will constantly lead us back to ignorance again. This is the most attractive description of philosophy I’ve ever read, and the best argument for why it is worth doing, even (or especially) when it takes us no distance at all from our starting point.
  • By prioritising perception, the body, social life and childhood development, Merleau-Ponty gathered up philosophy’s far-flung outsider subjects and brought them in to occupy the centre of his thought.
  • In his inaugural lecture at the Collège de France on 15 January 1953, published as In Praise of Philosophy, he said that philosophers should concern themselves above all with whatever is ambiguous in our experience. At the same time, they should think clearly about these ambiguities, using reason and science. Thus, he said, ‘The philosopher is marked by the distinguishing trait that he possesses inseparably the taste for evidence and the feeling for ambiguity.’ A constant movement is required between these two
  • As Sartre wrote in response to Hiroshima, humanity had now gained the power to wipe itself out, and must decide every single day that it wanted to live. Camus also wrote that humanity faced the task of choosing between collective suicide and a more intelligent use of its technology — ‘between hell and reason’. After 1945, there seemed little reason to trust in humanity’s ability to choose well.
  • Merleau-Ponty observed in a lecture of 1951 that, more than any previous century, the twentieth century had reminded people how ‘contingent’ their lives were — how at the mercy of historical events and other changes that they could not control. This feeling went on long after the war ended. After the A-bombs were dropped on Hiroshima and Nagasaki, many feared that a Third World War would not be long in coming, this time between the Soviet Union and the United States.
anonymous

Saudi Arabia's crown prince is making a lot of enemies (opinion) - CNN - 0 views

  • Saudi Arabia's Prince Mohammed bin Salman, first in line to inherit the throne from his 81-year-old father, is not a patient man. The 32-year-old is driving a frenetic pace of change in pursuit of three goals: securing his hold on power, transforming Saudi Arabia into a very different country, and pushing back against Iran.
  • In the two years since his father ascended the throne, this favorite son of King Salman bin Abdulaziz has been spectacularly successful at achieving the first item on his agenda. He has become so powerful so fast that observers can hardly believe how brazenly he is dismantling the old sedate system of family consensus, shared privilege and rigid ultraconservatism.
  • He has vowed to improve the status of women, announcing that the ban on women driving will be lifted next year, and limiting the scope of the execrable "guardianship" system, which treats women like children, requiring permission from male guardians for basic activities. He has also restrained the despised religious police. And just last month he called for a return to a "moderate Islam open to the world and all religions," combating extremism and empowering its citizens.
  • ...1 more annotation...
  • With so many enemies, the crown prince needs to produce more than a vision; he needs to show tangible results. The days of a quiet, patient Saudi Arabia are now over.
anonymous

Opinion | I Survived 18 Years in Solitary Confinement - The New York Times - 0 views

  • I Survived 18 Years in Solitary Confinement
  • Mr. Manuel is an author, activist and poet. When he was 14 years old, he was sentenced to life in prison with no parole and spent 18 years in solitary confinement.
  • Imagine living alone in a room the size of a freight elevator for almost two decades.
  • ...33 more annotations...
  • As a 15-year-old, I was condemned to long-term solitary confinement in the Florida prison system, which ultimately lasted for 18 consecutive years
  • From age 15 to 33.
  • For 18 years I didn’t have a window in my room to distract myself from the intensity of my confinement
  • I wasn’t permitted to talk to my fellow prisoners or even to myself. I didn’t have healthy, nutritious food; I was given just enough to not die
  • These circumstances made me think about how I ended up in solitary confinement.
  • United Nations standards on the treatment of prisoners prohibit solitary confinement for more than 15 days, declaring it “cruel, inhuman or degrading.”
  • For this I was arrested and charged as an adult with armed robbery and attempted murder.
  • My court-appointed lawyer advised me to plead guilty, telling me that the maximum sentence would be 15 years. So I did. But my sentence wasn’t 15 years — it was life imprisonment without the possibility of parole.
  • But a year and a half later, at age 15, I was put back into solitary confinement after being written up for a few minor infractions.
  • Florida has different levels of solitary confinement; I spent the majority of that time in one of the most restrictive
  • I was finally released from prison in 2016 thanks to my lawyer, Bryan Stevenson
  • Researchers have long concluded that solitary confinement causes post-traumatic stress disorder and impairs prisoners’ ability to adjust to society long after they leave their cell.
  • In the summer of 1990, shortly after finishing seventh grade, I was directed by a few older kids to commit a robbery. During the botched attempt, I shot a woman. She suffered serious injuries to her jaw and mouth but survived. It was reckless and foolish on my part, the act of a 13-year-old in crisis, and I’m simply grateful no one died.
  • More aggressive change is needed in state prison systems
  • In 2016, the Obama administration banned juvenile solitary confinement in federal prisons, and a handful of states have advanced similar reforms for both children and adults.
  • Yet the practice, even for minors, is still common in the United States, and efforts to end it have been spotty
  • Because solitary confinement is hidden from public view and the broader prison population, egregious abuses are left unchecked
  • I watched a corrections officer spray a blind prisoner in the face with chemicals simply because he was standing by the door of his cell as a female nurse walked by. The prisoner later told me that to justify the spraying, the officer claimed the prisoner masturbated in front of the nurse.
  • I also witnessed the human consequences of the harshness of solitary firsthand: Some people would resort to cutting their stomachs open with a razor and sticking a plastic spork inside their intestines just so they could spend a week in the comfort of a hospital room with a television
  • On occasion, I purposely overdosed on Tylenol so that I could spend a night in the hospital. For even one night, it was worth the pain.
  • Another time, I was told I’d be switching dorms, and I politely asked to remain where I was because a guard in the new area had been overly aggressive with me. In response, four or five officers handcuffed me, picked me up by my feet and shoulders, and marched with me to my new dorm — using my head to ram the four steel doors on the way there.
  • The punishments were wholly disproportionate to the infractions. Before I knew it, months in solitary bled into years, years into almost two decades.
  • As a child, I survived these conditions by conjuring up stories of what I’d do when I was finally released. My mind was the only place I found freedom from my reality
  • the only place I could play basketball with my brother or video games with my friends, and eat my mother’s warm cherry pie on the porch.
  • No child should have to use their imagination this way — to survive.
  • It is difficult to know the exact number of children in solitary confinement today. The Liman Center at Yale Law School estimated that 61,000 Americans (adults and children) were in solitary confinement in the fall of 2017
  • No matter the count, I witnessed too many people lose their minds while isolated. They’d involuntarily cross a line and simply never return to sanity. Perhaps they didn’t want to. Staying in their mind was the better, safer, more humane option.
  • Solitary confinement is cruel and unusual punishment, something prohibited by the Eighth Amendment, yet prisons continue to practice it.
  • When it comes to children, elimination is the only moral option. And if ending solitary confinement for adults isn’t politically viable, public officials should at least limit the length of confinement to 15 days or fewer, in compliance with the U.N. standards
  • As I try to reintegrate into society, small things often awaken painful memories from solitary. Sometimes relationships feel constraining. It’s difficult to maintain the attention span required for a rigid 9-to-5 job. At first, crossing the street and seeing cars and bikes racing toward me felt terrifying.
  • I will face PTSD and challenges big and small for the rest of my life because of what I was subjected to.
  • And some things I never will — most of all, that this country can treat human beings, especially children, as cruelly as I was treated.
  • Sadly, solitary confinement for juveniles is still permissible in many states. But we have the power to change that — to ensure that the harrowing injustice I suffered as a young boy never happens to another child in America.
  •  
    A very eye-opening article and story told by a victim about young children facing solitary confinement.
caelengrubb

Why Is Memory So Good and So Bad? - Scientific American - 0 views

  • Memories of visual images (e.g., dinner plates) are stored in what is called visual memory.
  • Our minds use visual memory to perform even the simplest of computations; from remembering the face of someone we’ve just met, to remembering what time it was last we checked. Without visual memory, we wouldn’t be able to store—and later retrieve—anything we see.
  • Just as a computer’s memory capacity constrains its abilities, visual memory capacity has been correlated with a number of higher cognitive abilities, including academic success, fluid intelligence (the ability to solve novel problems), and general comprehension.
  • ...13 more annotations...
  • For many reasons, then, it would be very useful to understand how visual memory facilitates these mental operations, as well as constrains our ability to perform them
  • Visual working memory is where visual images are temporarily stored while your mind works away at other tasks—like a whiteboard on which things are briefly written and then wiped away. We rely on visual working memory when remembering things over brief intervals, such as when copying lecture notes to a notebook.
  • UC Davis psychologists Weiwei Zhang and Steven Luck have shed some light on this problem. In their experiment, participants briefly saw three colored squares flashed on a computer screen, and were asked to remember the colors of each square. Then, after 1, 4 or 10 seconds the squares re-appeared, except this time their colors were missing, so that all that was visible were black squares outlined in white.
  • The participants had a simple task: to recall the color of one particular square, not knowing in advance which square they would be asked to recall. The psychologists assumed that measuring how visual working memory behaves over increasing demands (i.e., the increasing durations of 1, 4 or 10 seconds) would reveal something about how the system works.
  • If short-term visual memories fade away gradually—if they are slowly wiped from the whiteboard—then as the interval grows, participants’ responses should become steadily less precise, deviating more and more from the square’s original color. But if these memories are wiped out all at once—if the whiteboard is left untouched until, all at once, it is scrubbed clean—then participants should make either very precise responses (corresponding to instances when the memories are still intact) or, once the interval grows too long, completely random guesses.
  • Which is exactly what happened: Zhang & Luck found that participants were either very precise, or they completely guessed; that is, they either remembered the square’s color with great accuracy, or forgot it completely
  • But this, it turns out, is not true of all memories
  • In a recent paper, researchers at MIT and Harvard found that, if a memory can survive long enough to make it into what is called “visual long-term memory,” then it doesn’t have to be wiped out at all.
  • Talia Konkle and colleagues showed participants a stream of three thousand images of different scenes, such as ocean waves, golf courses or amusement parks. Then, participants were shown two hundred pairs of images—an old one they had seen in the first task, and a completely new one—and asked to indicate which was the old one.
  • Participants were remarkably accurate at spotting differences between the new and old images—96 percent
  • In a recent review, researchers at Harvard and MIT argue that the critical factor is how meaningful the remembered images are—whether the content of the images you see connects to pre-existing knowledge about them
  • This prior knowledge changes how these images are processed, allowing thousands of them to be transferred from the whiteboard of short-term memory into the bank vault of long-term memory, where they are stored with remarkable detail.
  • Together, these experiments suggest why memories are not eliminated equally—indeed, some don’t seem to be eliminated at all. This might also explain why we’re so hopeless at remembering some things, and yet so awesome at remembering others.
aprossi

Kazuyoshi Miura: 53-year-old signs contract extension to play in his 36th professional ... - 0 views

  • Kazuyoshi Miura just can't give up playing football
  • 53-year-old
  • 36th season as a professional.
  • ...7 more annotations...
  • passion for soccer have kept growing.
  • debut back in 1986 as a 19-year-old
  • first Japanese player to play in the Italian first division.
  • prolific international career with Japan.
  • the second-most career goals in Japanese national team history.
  • oldest player to score in a professional match
  • "This is legendary... Unbelievable - 36 seasons!!!!"
caelengrubb

The Linguistic Evolution of 'Like' - The Atlantic - 0 views

  • In our mouths or in print, in villages or in cities, in buildings or in caves, a language doesn’t sit still. It can’t. Language change has proceeded apace even in places known for preserving a language in amber
  • Because we think of like as meaning “akin to” or “similar to,” kids decorating every sentence or two with it seems like overuse. After all, how often should a coherently minded person need to note that something is similar to something rather than just being that something?
  • First, let’s take like in just its traditional, accepted forms. Even in its dictionary definition, like is the product of stark changes in meaning that no one would ever guess.
  • ...19 more annotations...
  • To an Old English speaker, the word that later became like was the word for, of all things, “body.”
  • The word was lic, and lic was part of a word, gelic, that meant “with the body,” as in “with the body of,” which was a way of saying “similar to”—as in like
  • It was just that, step by step, the syllable lic, which to an Old English speaker meant “body,” came to mean, when uttered by people centuries later, “similar to”—and life went on.
  • Like has become a piece of grammar: It is the source of the suffix -ly.
  • Like has become a part of compounds. Likewise began as like plus a word, wise, which was different from the one meaning “smart when either a child or getting old.”
  • Dictionaries tell us it’s pronounced “like-MINE-did,” but I, for one, say “LIKE-minded” and have heard many others do so
  • Therefore, like is ever so much more than some isolated thing clinically described in a dictionary with a definition like “(preposition) ‘having the same characteristics or qualities as; similar to.’”
  • What we are seeing in like’s transformations today are just the latest chapters in a story that began with an ancient word that was supposed to mean “body.”
  • It’s under this view of language—as something becoming rather than being, a film rather than a photo, in motion rather than at rest—that we should consider the way young people use (drum roll, please) like
  • The new like, then, is associated with hesitation.
  • So today’s like did not spring mysteriously from a crowd on the margins of unusual mind-set and then somehow jump the rails from them into the general population.
  • The problem with the hesitation analysis is that this was a thoroughly confident speaker.
  • It’s real-life usage of this kind—to linguists it is data, just like climate patterns are to meteorologists—that suggests that the idea of like as the linguistic equivalent to slumped shoulders is off.
  • Understandably so, of course—the meaning of like suggests that people are claiming that everything is “like” itself rather than itself.
  • The new like acknowledges unspoken objection while underlining one’s own point (the factuality). Like grandparents translates here as “There were, despite what you might think, actually grandparents.”
  • Then there is a second new like, which is closer to what people tend to think of all its new uses: it is indeed a hedge.
  • Then, the two likes I have mentioned must be distinguished from yet a third usage, the quotative like—as in “And she was like, ‘I didn’t even invite him.’”
  • This is yet another way that like has become grammar. The meaning “similar to” is as natural a source here as it was for -ly: mimicking people’s utterances is talking similarly to, as in “like,” them.
  • Thus the modern American English speaker has mastered not just two, but actually three different new usages of like.
anonymous

Pandemic-Proof Your Habits - The New York Times - 1 views

  • The good news is that much of what we miss about our routines and customs, and what makes them beneficial to us as a species, has more to do with their comforting regularity than the actual behaviors
    • anonymous
       
      Our brains have that much power over our emotions, and can change how we feel about the world when they experience a change in routine.
  • The key to coping during this, or any, time of upheaval is to quickly establish new routines so that, even if the world is uncertain, there are still things you can count on.
    • anonymous
       
      I haven't really thought of this, since I'm so set on getting back to old routines.
  • Human beings are prediction machines.
  • ...28 more annotations...
  • Our brains are statistical organs that are built simply to predict what will happen next
    • anonymous
       
      I don't know if we've talked about this specifically; it's more that we like to make up patterns to "predict" the future and reassure ourselves, even though those predictions aren't real.
  • This makes sense because, in prehistoric times, faulty predictions could lead to some very unpleasant surprises — like a tiger eating you or sinking in quicksand.
  • So-called prediction errors (like finding salmon instead of turkey on your plate on Thanksgiving) send us into a tizzy because our brains interpret them as a potential threat.
    • anonymous
       
      We have talked about this- the survival aspect of this reaction to change.
  • Keep doing what you’ve been doing, because you did it before, and you didn’t die.
    • anonymous
       
      A good way of putting it.
  • all essentially subconscious efforts to make your world more predictable, orderly and safe.
  • Routines and rituals also conserve precious brainpower
  • It turns out our brains are incredibly greedy when it comes to energy consumption, sucking up 20 percent of calories while accounting for only 2 percent of overall body weight.
  • Our brains are literally overburdened with all the uncertainty caused by the pandemic.
  • Not only is there the seeming capriciousness of the virus, but we no longer have the routines that served as the familiar scaffolding of our lives
  • “It’s counterintuitive because we think of meaning in life as coming from these grandiose experiences
    • anonymous
       
      I've definitely felt this way.
  • Of course, you can always take routines and rituals too far, such as the extremely controlled and repetitive behaviors indicative of addiction, obsessive-compulsive disorder and various eating disorders.
  • it’s mundane routines that give us structure to help us pare things down and better navigate the world, which helps us make sense of things and feel that life has meaning.”
  • In the coronavirus era, people may resort to obsessive cleaning, hoarding toilet paper, stockpiling food or neurotically wearing masks when driving alone in their cars. On the other end of the spectrum are those who stubbornly adhere to their old routines because stopping feels more threatening than the virus.
  • You’re much better off establishing a new routine within the limited environment that we find ourselves in
  • Luckily, there is a vast repertoire of habits you can adopt and routines you can establish to structure your days no matter what crises are unfolding around you
  • The point is to find what works for you. It just needs to be regular and help you achieve your goals, whether intellectually, emotionally, socially or professionally. The best habits not only provide structure and order but also give you a sense of pleasure, accomplishment or confidence upon completion.
  • It could be as simple as making your bed as soon as you get up in the morning or committing to working the same hours in the same spot.
  • Pandemic-proof routines might include weekly phone or video calls with friends, Taco Tuesdays with the family, hiking with your spouse on weekends, regularly filling a bird feeder, set times for prayer or meditation, front yard happy hours with the neighbors or listening to an audiobook every night before bed.
  • The truth is that you cannot control what happens in life. But you can create a routine that gives your life a predictable rhythm and secure mooring.
    • anonymous
       
      It's all about changing your thoughts: not exactly tricking your brain, but helping it.
  • This frees your brain to develop perspective so you’re better able to take life’s surprises in stride.
  • I attended a Thanksgiving dinner several years ago where the hostess, without warning family and friends, broke with tradition and served salmon instead of turkey, roasted potatoes instead of mashed, raspberry coulis instead of cranberry sauce and … you get the idea.
  • Too many people are still longing for their old routines. Get some new ones instead.
  • It wasn’t that the meal itself was bad. In fact, the meal was outstanding. The problem was that it wasn’t the meal everyone was expecting.
  • When there are discrepancies between expectations and reality, all kinds of distress signals go off in the brain.
  • It doesn’t matter if it’s a holiday ritual or more mundane habit like how you tie your shoes; if you can’t do it the way you normally do it, you’re biologically engineered to get upset.
  • This in part explains people’s grief and longing for the routines that were the background melodies of their lives before the pandemic
ilanaprincilus06

Meet the neuroscientist shattering the myth of the gendered brain | Science | The Guardian - 0 views

  • Whatever its sex, this baby’s future is predetermined by the entrenched belief that males and females do all kinds of things differently, better or worse, because they have different brains.
  • how vital it is, how life-changing, that we finally unpack – and discard – the sexist stereotypes and binary coding that limit and harm us.
  • she is out in the world, debunking the “pernicious” sex differences myth: the idea that you can “sex” a brain or that there is such a thing as a male brain and a female brain.
  • ...18 more annotations...
  • since the 18th century “when people were happy to spout off about what men and women’s brains were like – before you could even look at them. They came up with these nice ideas and metaphors that fitted the status quo and society, and gave rise to different education for men and women.”
  • she couldn’t find any beyond the negligible, and other research was also starting to question the very existence of such differences. For example, once any differences in brain size were accounted for, “well-known” sex differences in key structures disappeared.
  • Are there any significant differences based on sex alone? The answer, she says, is no.
  • “The idea of the male brain and the female brain suggests that each is a characteristically homogenous thing and that whoever has got a male brain, say, will have the same kind of aptitudes, preferences and personalities as everyone else with that ‘type’ of brain. We now know that is not the case.
  • ‘Forget the male and female brain; it’s a distraction, it’s inaccurate.’ It’s possibly harmful, too, because it’s used as a hook to say, well, there’s no point girls doing science because they haven’t got a science brain, or boys shouldn’t be emotional or should want to lead.”
  • The next question was, what then is driving the differences in behaviour between girls and boys, men and women?
  • “that the brain is moulded from birth onwards and continues to be moulded through to the ‘cognitive cliff’ in old age when our grey cells start disappearing.
  • The rules will change how the brain works and how someone behaves.” The upshot of gendered rules? “The ‘gender gap’ becomes a self-fulfilling prophecy.”
  • The brain is also predictive and forward-thinking in a way we had never previously realised.
  • the brain is much more a function of experiences. If you learn a skill your brain will change, and it will carry on changing.”
  • The brain is a biological organ. Sex is a biological factor. But it is not the sole factor; it intersects with so many variables.”
  • Letting go of age-old certainties is frightening, concedes Rippon, who is both optimistic about the future, and fearful for it.
  • On the plus side, our plastic brains are good learners. All we need to do is change the life lessons.
  • One major breakthrough in recent years has been the realisation that, even in adulthood, our brains are continually being changed, not just by the education we receive, but also by the jobs we do, the hobbies we have, the sports we play.
  • Once we acknowledge that our brains are plastic and mouldable, then the power of gender stereotypes becomes evident.
  • Beliefs about sex differences (even if ill-founded) inform stereotypes, which commonly provide just two labels – girl or boy, female or male – which, in turn, historically carry with them huge amounts of “contents assured” information and save us having to judge each individual on their own merits
  • With input from exciting breakthroughs in neuroscience, the neat, binary distinctiveness of these labels is being challenged – we are coming to realise that nature is inextricably entangled with nurture.
  • The 21st century is not just challenging the old answers – it is challenging the question itself.
anonymous

Why Childhood Memories Disappear - The Atlantic - 0 views

  • Most adults can’t remember much of what happened to them before age 3 or so. What happens to the memories formed in those earliest years?
  • When I talk about my first memory, what I really mean is my first retained memory. Carole Peterson, a professor of psychology at Memorial University of Newfoundland, studies children’s memories. Her research has found that small children can recall events from when they were as young as 20 months old, but these memories typically fade by the time they’re between 4 and 7 years old.
  • “People used to think that the reason that we didn’t have early memories was because children didn’t have a memory system or they were unable to remember things, but it turns out that’s not the case,” Peterson said. “Children have a very good memory system. But whether or not something hangs around long-term depends on several other factors.”
  • ...8 more annotations...
  • Two of the most important factors, Peterson explained, are whether the memory “has emotion infused in it,” and whether the memory is coherent: Does the story our memory tells us actually hang together and make sense when we recall it later?
  • A professor at the University of North Carolina-Chapel Hill, Reznick explained that shortly after birth, infants can start forming impressions of faces and react when they see those faces again; this is recognition memory. The ability to understand words and learn language relies on working memory, which kicks in at around six months old. More sophisticated forms of memory develop in the child’s second year, as semantic memory allows children to retain understanding of concepts and general knowledge about the world.
  • I formed earlier memories using more rudimentary, pre-verbal means, and that made those memories unreachable as the acquisition of language reshaped how my mind works, as it does for everyone.
  • False memories do exist, but their construction appears to begin much later in life
  • A study by Peterson presented young children with fictitious events to see if they could be misled into remembering these non-existent events, yet the children almost universally avoided the bait. As for why older children and adults begin to fill in gaps in their memories with invented details, she pointed out that memory is a fundamentally constructive activity: We use it to build understanding of the world, and that sometimes requires more complete narratives than our memories can recall by themselves.
  • as people get older, it becomes easier to conflate actual memories with other stimuli.
  • He explained that recognition memory is our most pervasive system, and that associations with my hometown I formed as an infant could well have endured more than 20 years later, however vaguely.
  • give me enough time, and I’m sure that detail will be added to my memory. It’s just too perfect a story.
  •  
    Alasdair Wilkins talks about her childhood memories, false memory and how children remember moments of their early years.
manhefnawi

Human brains make new nerve cells - and lots of them - well into old age | Science News - 0 views

  • Your brain might make new nerve cells well into old age.
  • Understanding how healthy brains change over time is important for researchers untangling the ways that conditions like depression, stress and memory loss affect older brains.
  • When it comes to studying neurogenesis in humans, “the devil is in the details,” says Jonas Frisén, a neuroscientist at the Karolinska Institute in Stockholm who was not involved in the new research. Small differences in methodology — such as the way brains are preserved or how neurons are counted — can have a big impact on the results, which could explain the conflicting findings. The new paper “is the most rigorous study yet,” he says.
melnikju

Opinion | The New Science of Mind - The New York Times - 0 views

    • melnikju
       
      SCARY
    • melnikju
       
      Old-fashioned thinking, the brain is an organ, therefore it can have issues that need to be treated.
  • The problem for many people is that we cannot point to the underlying biological bases of most psychiatric disorders. In fact, we are nowhere near understanding them as well as we understand disorders of the liver or the heart.
  • ...3 more annotations...
  • All of these regions can be disturbed in depressive illnesses.
    • melnikju
       
      This is highly interesting as someone with depression
  • The second finding is de novo point mutations, which arise spontaneously in the sperm of adult men. Sperm divide every 15 days. This continuous division and copying of DNA leads to errors, and the rate of error increases significantly with age: a 20-year-old will have an average of 25 de novo point mutations in his sperm, whereas a 40-year-old will have 65. These mutations are one reason older fathers are more likely to have children with autism and schizophrenia.
Javier E

Opinion | Trump, Musk and Kanye Are Twitter Poisoned - The New York Times - 0 views

  • By Jaron Lanier. Mr. Lanier is a computer scientist and an author of several books on technology’s impact on people.
  • I have observed a change, or really a narrowing, in the public behavior of people who use Twitter or other social media a lot.
  • When I compare Mr. Musk, Mr. Trump and Ye, I see a convergence of personalities that were once distinct. The garish celebrity playboy, the obsessive engineer and the young artist, as different from one another as they could be, have all veered not in the direction of becoming grumpy old men, but into being bratty little boys in a schoolyard. Maybe we should look at what social media has done to these men.
  • ...20 more annotations...
  • I believe “Twitter poisoning” is a real thing. It is a side effect that appears when people are acting under an algorithmic system that is designed to engage them to the max. It’s a symptom of being part of a behavior-modification scheme.
  • The same could be said about any number of other figures, including on the left. Examples are found in the excesses of cancel culture and joyless orthodoxies in fandom, in vain attention competitions and senseless online bullying.
  • The human brain did not evolve to handle modern chemicals or modern media technology and is vulnerable to addiction. That is true for me and for us all.
  • Behavioral changes occur as a side effect of something called operant conditioning, which is the underlying mechanism of social media addiction. This is the core mechanism analogous to the role alcohol plays in alcoholism.
  • In the case of digital platforms, the purpose is usually “engagement,” a concept that is hard to distinguish from addiction. People receive little positive and negative jolts of social feedback — getting followed or liked, or being ignored or even humiliated.
  • Before social media, that kind of tight feedback loop had rarely been present in human communications outside of laboratories or marriages. (This is part of why marriage can be hard, I suspect.)  
  • I was around when Google and other companies that operate on the personalized advertising model were created, and I can say that at least in the early days, operant conditioning was not part of the plan.
  • What happened was that the algorithms that optimized the individualized advertising model found their way into it automatically, unintentionally rediscovering methods that had been tested on dogs and pigeons.
  • There is a childish insecurity, where before there was pride. Instead of being above it all, like traditional strongmen throughout history, the modern social media-poisoned alpha male whines and frets.
  • What do I think are the symptoms of Twitter poisoning?
  • To be clear, whiners are much better than Stalins. And yet there have been plenty of more mature and gracious leaders who are better than either
  • When we were children, we all had to negotiate our way through the jungle of human power relationships at the playground
  • When we feel those old humiliations, anxieties and sadisms again as adults — over and over, because the algorithm has settled on that pattern as a powerful way to engage us — habit formation restimulates old patterns that had been dormant. We become children again, not in a positive, imaginative sense, but in a pathetic way.
  • Twitter poisoning makes sufferers feel more oppressed than is reasonable in response to reasonable rules. The scope of fun is constricted to transgressions.
  • Unfortunately, scale changes everything. Taunts become dangerous hate when amplified. A Twitter-poisoned soul will often complain of a loss of fun when someone succeeds at moderating the spew of hate.
  • the afflicted lose all sense of proportion about their own powers. They can come to believe they have almost supernatural abilities
  • The degree of narcissism becomes almost absolute. Everything is about what someone else thinks of you.
  • These observations should inform our concerns about TikTok. The most devastating way China might use TikTok is not to misdirect our elections or to prefer pro-China posts, but to generally ramp up social media disease, so as to make Americans more divided, less able to talk to one another and less able to put up a coordinated, unified front.
  • guide society. Whether that idea appeals or not, when technology degrades the minds of those same engineers, then the result can only be dysfunction.
  • Jaron Lanier is a computer scientist who pioneered research in virtual reality and whose books include “Ten Arguments for Deleting Your Social Media Accounts Right Now.” He is Microsoft’s “prime unifying scientist” but does not speak for the company.
Javier E

Scientists Can No Longer Ignore Ancient Flooding Tales - The Atlantic - 0 views

  • It wasn’t long after Henry David Inglis arrived on the island of Jersey, just northwest of France, that he heard the old story. Locals eagerly told the 19th-century Scottish travel writer how, in a bygone age, their island had been much more substantial, and that folks used to walk to the French coast. The only hurdle to their journey was a river—one easily crossed using a short bridge.
  • there had been a flood. A big one. Between roughly 15,000 and 6,000 years ago, massive flooding caused by melting glaciers raised sea levels around Europe. That flooding is what eventually turned Jersey into an island.
  • Rather than being a ridiculous claim not worthy of examination, perhaps the old story was true—a whisper from ancestors who really did walk through now-vanished lands
  • ...8 more annotations...
  • That’s exactly what the geographer Patrick Nunn and the historian Margaret Cook at the University of the Sunshine Coast in Australia have proposed in a recent paper.
  • In their work, the pair describe colorful legends from northern Europe and Australia that depict rising waters, peninsulas becoming islands, and receding coastlines during that period of deglaciation thousands of years ago. Some of these stories, the researchers say, capture historical sea-level rise that actually happened—often several thousand years ago. For scholars of oral history, that makes them geomyths.
  • “The first time I read an Aboriginal story from Australia that seemed to recall the rise of sea levels after the last ice age, I thought, No, I don’t think this is correct,” Nunn says. “But then I read another story that recalled the same thing.
  • For Jo Brendryen, a paleoclimatologist at the University of Bergen in Norway who has studied the effects of deglaciation in Europe following the end of the last ice age, the idea that traditional oral histories preserve real accounts of sea-level rise is perfectly plausible.
  • During the last ice age, he says, the sudden melting of ice sheets induced catastrophic events known as meltwater pulses, which caused sudden and extreme sea-level rise. Along some coastlines in Europe, the ocean may have risen as much as 10 meters in just 200 years. At such a pace, it would have been noticeable to people across just a few human generations.
  • “These are stories based in trauma, based in catastrophe.”
  • That, he suggests, is why it may have made sense for successive generations to pass on tales of geological upheaval. Ancient societies may have sought to broadcast their warning: Beware, these things can happen!
  • the old stories still have things to teach us. As Nunn says, “The fact that our ancestors have survived those periods gives us hope that we can survive this.”