TOK Friends: Group items tagged Environment

Scientists identify vast underground ecosystem containing billions of micro-organisms |...

  • Despite extreme heat, no light, minuscule nutrition and intense pressure, scientists estimate this subterranean biosphere is teeming with between 15bn and 23bn tonnes of micro-organisms, hundreds of times the combined weight of every human on the planet.
  • Researchers at the Deep Carbon Observatory say the diversity of underworld species bears comparison to the Amazon or the Galápagos Islands, but unlike those places the environment is still largely pristine because people have yet to probe most of the subsurface.
  • “It’s like finding a whole new reservoir of life on Earth,” said Karen Lloyd, an associate professor at the University of Tennessee in Knoxville. “We are discovering new types of life all the time. So much of life is within the Earth rather than on top of it.”
  • The team combines 1,200 scientists from 52 countries in disciplines ranging from geology and microbiology to chemistry and physics.
  • The results suggest 70% of Earth’s bacteria and archaea exist in the subsurface
  • “The strangest thing for me is that some organisms can exist for millennia. They are metabolically active but in stasis, with less energy than we thought possible of supporting life.”
  • Some microorganisms have been alive for thousands of years, barely moving except with shifts in the tectonic plates, earthquakes or eruptions.
  • these organisms are part of slow, persistent cycles on geological timescales.”
  • Underworld biospheres vary depending on geology and geography. Their combined size is estimated to be more than 2bn cubic kilometres, but this could be expanded further in the future
  • The researchers said their discoveries were made possible by two technical advances: drills that can penetrate far deeper below the Earth’s crust, and improvements in microscopes that allow life to be detected at increasingly minute levels.
  • The scientists have been trying to find a lower limit beyond which life cannot exist, but the deeper they dig the more life they find. There is a temperature maximum – currently 122C – but the researchers believe this record will be broken if they keep exploring and developing more sophisticated instruments.
  • Robert Hazen, a mineralogist at the Carnegie Institution for Science, said: “We must ask ourselves: if life on Earth can be this different from what experience has led us to expect, then what strangeness might await as we probe for life on other worlds?”

Dan Crenshaw: I made amends with Pete Davidson on SNL. But that's only the beginning. -...

  • As a country, we still have a lot of work to do. We need to agree on some basic rules for civil discourse.
  • many of the ultimate goals — economic prosperity, better health care and education, etc. — are the same. We just don’t share the same vision of how to achieve them.
  • How, then, do we live together in this world of differing ideas? For starters, let’s agree that the ideas are fair game. If you think my idea is awful, you should say as much
  • But there is a difference between attacking an idea and attacking the person behind that idea. Labeling someone as an “-ist” who believes in an “-ism” because of the person’s policy preference is just a shortcut to playground-style name-calling, cloaked in political terminology
  • Similarly, people too often attack not just an idea but also the supposed intent behind an idea. That raises the emotional level of the debate and might seem like it strengthens the attacker’s side, but it’s a terrible way to make a point.
  • Assuming the worst about your opponents’ intentions has the effect of demonizing their ideas, removing the need for sound counter-reasoning and fact-based argument. That’s not a good environment for the exchange of ideas.
  • When all else fails, try asking for forgiveness, or granting it.

Opinion | Is There Such a Thing as an Authoritarian Voter? - The New York Times

  • Jonathan Weiler, a political scientist at the University of North Carolina at Chapel Hill, has spent much of his career studying the appeal of authoritarian figures: politicians who preach xenophobia, beat up on the press and place themselves above the law while extolling “law and order” for everyone else.
  • He is one of many scholars who believe that deep-seated psychological traits help explain voters’ attraction to such leaders. “These days,” he told me, “audiences are more receptive to the idea” than they used to be.
  • “In 2018, the sense of fear and panic — the disorientation about how people who are not like us could see the world the way they do — it’s so elemental,” Mr. Weiler said. “People understand how deeply divided we are, and they are looking for explanations that match the depth of that division.”
  • Moreover, using the child-rearing questionnaire, African-Americans score as far more authoritarian than whites
  • what, exactly, is an “authoritarian” personality? How do you measure it?
  • for more than half a century — social scientists have tried to figure out why some seemingly mild-mannered people gravitate toward a strongman
  • the philosopher (and German refugee) Theodor Adorno collaborated with social scientists at the University of California at Berkeley to investigate why ordinary people supported fascist, anti-Semitic ideology during the war. They used a questionnaire called the F-scale (F is for fascism) and follow-up interviews to analyze the “total personality” of the “potentially antidemocratic individual.”
  • The resulting 1,000-page tome, “The Authoritarian Personality,” published in 1950, found that subjects who scored high on the F-scale disdained the weak and marginalized. They fixated on sexual deviance, embraced conspiracy theories and aligned themselves with domineering leaders “to serve powerful interests and so participate in their power,”
  • “Globalized free trade has shafted American workers and left us looking for a strong male leader, a ‘real man,’” he wrote. “Trump offers exactly what my maladapted unconscious most craves.”
  • one of the F-scale’s prompts: “Obedience and respect for authority are the most important virtues children should learn.” Today’s researchers often diagnose latent authoritarians through a set of questions about preferred traits in children: Would you rather your child be independent or have respect for elders? Have curiosity or good manners? Be self-reliant or obedient? Be well behaved or considerate?
  • a glance at the Christian group Focus on the Family’s “biblical principles for spanking” reminds us that your approach to child rearing is not pre-political; it is shorthand for your stance in the culture wars.
  • “All the social sciences are brought to bear to try to explain all the evil that persists in the world, even though the liberal Enlightenment worldview says that we should be able to perfect things,” said Mr. Strouse, the Trump voter
  • what should have been obvious:
  • “Trump’s electoral strength — and his staying power — have been buoyed, above all, by Americans with authoritarian inclinations,” wrote Matthew MacWilliams, a political consultant who surveyed voters during the 2016 election
  • The child-trait test, then, is a tool to identify white people who are anxious about their decline in status and power.
  • In their new book, “Prius or Pickup?,” Mr. Weiler and his co-author ditch the charged term “authoritarian.” Instead, they divide people into three temperamental camps: fixed (people who are wary of change and “set in their ways”), fluid (those who are more open to new experiences and people) and mixed (those who are ambivalent).
  • “The term ‘authoritarian’ connotes a fringe perspective, and the perspective we’re describing is far from fringe,” Mr. Weiler said. “It’s central to American public opinion, especially on cultural issues like immigration and race.”
  • Other scholars apply a typology based on the “Big Five” personality traits identified by psychologists in the mid-20th century: extroversion, agreeableness, conscientiousness, neuroticism and openness to experience. (It seems that liberals are open but possibly neurotic, while conservatives are more conscientious.)
  • Historical context matters — it shapes who we are and how we debate politics. “Reason moves slowly,” William English, a political economist at Georgetown, told me. “It’s constituted sociologically, by deep community attachments, things that change over generations.”
  • “it is a deep-seated aspiration of many social scientists — sometimes conscious and sometimes unconscious — to get past wishy-washy culture and belief. Discourses that can’t be scientifically reduced are problematic” for researchers who want to provide “a universal account of behavior.”
  • in our current environment, where polarization is so unyielding, the apparent clarity of psychological and biological explanations becomes seductive
  • Attitudes toward parenting vary across cultures, and for centuries African-Americans have seen the consequences of a social and political hierarchy arrayed against them, so they can hardly be expected to favor it — no matter what they think about child rearing
  • — we know that’s not going to happen. People have wicked tendencies.”
  • as the social scientific portrait of humanity grows more psychological and irrational, it comes closer and closer to approximating the old Adam of traditional Christianity: a fallen, depraved creature, unable to see himself clearly except with the aid of a higher power
  • The conclusions of political scientists should inspire humility rather than hubris. In the end, they have confirmed what so many observers of our species have long suspected: None of us are particularly free or rational creatures.
  • Allen Strouse is not the archetypal Trump voter whom journalists discover in Rust Belt diners. He is a queer Catholic poet and scholar of medieval literature who teaches at the New School in New York City. He voted for Mr. Trump “as a protest against the Democrats’ failures on economic issues,” but the psychological dimensions of his vote intrigue him. “Having studied Freudian analysis, and being in therapy for 10 years, I couldn’t not reflexively ask myself, ‘How does this decision have to do with my psychology?’” he told me.
  • their preoccupation with childhood and “primitive and irrational wishes and fears” have influenced the study of authoritarianism ever since.

Can truth survive this president? An honest investigation. - The Washington Post

  • in the summer of 2002, long before “fake news” or “post-truth” infected the vernacular, one of President George W. Bush’s top advisers mocked a journalist for being part of the “reality-based community.” Seeking answers in reality was for suckers, the unnamed adviser explained. “We’re an empire now, and when we act, we create our own reality.”
  • This was the hubris and idealism of a post-Cold War, pre-Iraq War superpower: If you exert enough pressure, events will bend to your will.
  • the deceit emanating from the White House today is lazier, more cynical. It is not born of grand strategy or ideology; it is impulsive and self-serving. It is not arrogant, but shameless.
  • Bush wanted to remake the world. President Trump, by contrast, just wants to make it up as he goes along
  • Through all their debates over who is to blame for imperiling truth (whether Trump, postmodernism, social media or Fox News), as well as the consequences (invariably dire) and the solutions (usually vague), a few conclusions materialize, should you choose to believe them.
  • There is a pattern and logic behind the dishonesty of Trump and his surrogates; however, it’s less multidimensional chess than the simple subordination of reality to political and personal ambition
  • Trump’s untruth sells best precisely when feelings and instincts overpower facts, when America becomes a safe space for fabrication.
  • Rand Corp. scholars Jennifer Kavanagh and Michael D. Rich point to the Gilded Age, the Roaring Twenties and the rise of television in the mid-20th century as recent periods of what they call “Truth Decay” — marked by growing disagreement over facts and interpretation of data; a blurring of lines between opinion, fact and personal experience; and diminishing trust in once-respected sources of information.
  • In eras of truth decay, “competing narratives emerge, tribalism within the U.S. electorate increases, and political paralysis and dysfunction grow,”
  • Once you add the silos of social media as well as deeply polarized politics and deteriorating civic education, it becomes “nearly impossible to have the types of meaningful policy debates that form the foundation of democracy.”
  • To interpret our era’s debasement of language, Kakutani reflects perceptively on the World War II-era works of Victor Klemperer, who showed how the Nazis used “words as ‘tiny doses of arsenic’ to poison and subvert the German culture,” and of Stefan Zweig, whose memoir “The World of Yesterday” highlights how ordinary Germans failed to grasp the sudden erosion of their freedoms.
  • Kakutani calls out lefty academics who for decades preached postmodernism and social constructivism, which argued that truth is not universal but a reflection of relative power, structural forces and personal vantage points.
  • postmodernists rejected Enlightenment ideals as “vestiges of old patriarchal and imperialist thinking,” Kakutani writes, paving the way for today’s violence against fact in politics and science.
  • “dumbed-down corollaries” of postmodernist thought have been hijacked by Trump’s defenders, who use them to explain away his lies, inconsistencies and broken promises.
  • intelligent-design proponents and later climate deniers drew from postmodernism to undermine public perceptions of evolution and climate change. “Even if right-wing politicians and other science deniers were not reading Derrida and Foucault, the germ of the idea made its way to them: science does not have a monopoly on the truth,
  • McIntyre quotes at length from mea culpas by postmodernist and social constructivist writers agonizing over what their theories have wrought, shocked that conservatives would use them for nefarious purposes
  • pro-Trump troll and conspiracy theorist Mike Cernovich, who helped popularize the “Pizzagate” lie, has forthrightly cited his unlikely influences. “Look, I read postmodernist theory in college,” Cernovich told the New Yorker in 2016. “If everything is a narrative, then we need alternatives to the dominant narrative. I don’t seem like a guy who reads [Jacques] Lacan, do I?”
  • When truth becomes malleable and contestable regardless of evidence, a mere tussle of manufactured narratives, it becomes less about conveying facts than about picking sides, particularly in politics.
  • In “On Truth,” Cambridge University philosopher Simon Blackburn writes that truth is attainable, if at all, “only at the vanishing end points of enquiry,” adding that, “instead of ‘facts first’ we may do better if we think of ‘enquiry first,’ with the notion of fact modestly waiting to be invited to the feast afterward.”
  • He is concerned, but not overwhelmingly so, about the survival of truth under Trump. “Outside the fevered world of politics, truth has a secure enough foothold,” Blackburn writes. “Perjury is still a serious crime, and we still hope that our pilots and surgeons know their way about.”
  • Kavanagh and Rich offer similar consolation: “Facts and data have become more important in most other fields, with political and civil discourse being striking exceptions. Thus, it is hard to argue that the world is truly ‘post-fact.’ ”
  • McIntyre argues persuasively that our methods of ascertaining truth — not just the facts themselves — are under attack, too, and that this assault is especially dangerous.
  • Ideologues don’t just disregard facts they disagree with, he explains, but willingly embrace any information, however dubious, that fits their agenda. “This is not the abandonment of facts, but a corruption of the process by which facts are credibly gathered and reliably used to shape one’s beliefs about reality. Indeed, the rejection of this undermines the idea that some things are true irrespective of how we feel about them.”
  • “It is hardly a depressing new phenomenon that people’s beliefs are capable of being moved by their hopes, grievances and fears,” Blackburn writes. “In order to move people, objective facts must become personal beliefs.” But it can’t work — or shouldn’t work — in reverse.
  • More than fearing a post-truth world, Blackburn is concerned by a “post-shame environment,” in which politicians easily brush off their open disregard for truth.
  • it is human nature to rationalize away the dissonance. “Why get upset by his lies, when all politicians lie?” Kakutani asks, distilling the mind-set. “Why get upset by his venality, when the law of the jungle rules?”
  • So any opposition is deemed a witch hunt, or fake news, rigged or just so unfair. Trump is not killing the truth. But he is vandalizing it, constantly and indiscriminately, diminishing its prestige and appeal, coaxing us to look away from it.
  • the collateral damage includes the American experiment.
  • “One of the most important ways to fight back against post-truth is to fight it within ourselves,” he writes, whatever our particular politics may be. “It is easy to identify a truth that someone else does not want to see. But how many of us are prepared to do this with our own beliefs? To doubt something that we want to believe, even though a little piece of us whispers that we do not have all the facts?”

Geology's Timekeepers Are Feuding - The Atlantic

  • In 2000, the Nobel Prize-winning chemist Paul Crutzen won permanent fame for stratigraphy. He proposed that humans had so thoroughly altered the fundamental processes of the planet—through agriculture, climate change, nuclear testing, and other phenomena—that a new geological epoch had commenced: the Anthropocene, the age of humans.
  • Zalasiewicz should know. He is the chair of the Anthropocene working group, which the ICS established in 2009 to investigate whether the new epoch deserved a place in stratigraphic time.
  • In 2015, the group announced that the Anthropocene was a plausible new layer and that it should likely follow the Holocene. But the team has yet to propose a “golden spike” for the epoch: a boundary in the sedimentary rock record where the Anthropocene clearly begins.
  • Officially, the Holocene is still running today. You have lived your entire life in the Holocene, and the Holocene has constituted the geological “present” for as long as there have been geologists. But if we now live in a new epoch, the Anthropocene, then the ICS will have to chop the Holocene somewhere. It will have to choose when the Holocene ended, and it will move some amount of time out of the purview of the Holocene working group and into that of the Anthropocene working group.
  • This is politically difficult. And right now, the Anthropocene working group seems intent on not carving too deep into the Holocene. In a paper published earlier this year in Earth-Science Reviews, the Anthropocene working group’s members strongly imply that they will propose starting the new epoch in the mid-20th century.
  • Some geologists argue that the Anthropocene started even earlier: perhaps 4,000 or 6,000 years ago, as farmers began to remake the land surface. “Most of the world’s forests that were going to be converted to cropland and agriculture were already cleared well before 1950,” says Bill Ruddiman, a geology professor at the University of Virginia and an advocate of this extremely early Anthropocene.
  • “Most of the world’s prairies and steppes that were going to be cleared for crops were already gone, by then. How can you argue the Anthropocene started in 1950 when all of the major things that affect Earth’s surface were already over?” Van der Pluijm agreed that the Anthropocene working group was picking 1950 for “not very good reasons.” “Agriculture was the revolution that allowed society to develop,” he said. “That was really when people started to force the land to work for them. That massive land movement—it’s like a landslide, except it’s a humanslide. And it is not, of course, as dramatic as today’s motion of land, but it starts the clock.”
  • This muddle had to stop. The Holocene comes up constantly in discussions of modern global warming. Geologists and climate scientists did not make their jobs any easier by slicing it in different ways and telling contradictory stories about it.
  • This process started almost 10 years ago. For this reason, Zalasiewicz, the chair of the Anthropocene working group, said he wasn’t blindsided by the new subdivisions at all. In fact, he voted to adopt them as a member of the Quaternary working group. “Whether the Anthropocene works with a unified Holocene or one that’s in three parts makes for very little difference,” he told me. In fact, it had made the Anthropocene group’s work easier. “It has been useful to compare the scale of the two climate events that mark the new boundaries [within the Holocene] with the kind of changes that we’re assessing in the Anthropocene. It has been quite useful to have the compare and contrast,” he said. “Our view is that some of the changes in the Anthropocene are rather bigger.”
  • Zalasiewicz said that he and his colleagues were going as fast as they could. When the working group began its work in 2009, it was “really starting from scratch,” he told me. While other working groups have a large body of stratigraphic research to consider, the Anthropocene working group had nothing. “We had to spend a fair bit of time deciding whether the Anthropocene was geology at all,” he said. Then they had to decide where its signal could show up. Now, they’re looking for evidence that shows it.
  • This cycle of “glacials” and “interglacials” has played out about 50 times over the last several million years. When the Holocene began, it was only another interglacial—albeit the one we live in. Until recently, glaciers were still on schedule to descend in another 30,000 years or so. Yet geologists still call the Holocene an epoch, even though they do not bestow this term on any of the previous 49 interglacials. It gets special treatment because we live in it.
  • Much of this science is now moot. Humanity’s vast emissions of greenhouse gas have now so warmed the climate that they have offset the next glaciation. They may even knock us out of the ongoing cycle of Ice Ages, sending the Earth hurtling back toward a “greenhouse” climate after the more amenable “icehouse” climate during which humans evolved. For this reason, van der Pluijm wants the Anthropocene to supplant the Holocene entirely. Humans made their first great change to the environment at the close of the last glaciation, when they seem to have hunted the world’s largest mammals—the woolly mammoth, the saber-toothed tiger—to extinction. Why not start the Anthropocene then? He would even rename the pre-1800 period “the Holocene Age” as a consolation prize:
  • Zalasiewicz said he would not start the Anthropocene too early in time, as it would be too work-intensive for the field to rename such a vast swath of time. “The early-Anthropocene idea would crosscut against the Holocene as it’s seen by Holocene workers,” he said. If other academics didn’t like this, they could create their own timescales and start the Anthropocene Epoch where they choose. “We have no jurisdiction over the word Anthropocene,” he said.
  • Ruddiman, the University of Virginia professor who first argued for a very early Anthropocene, now makes an even broader case. He’s not sure it makes sense to formally define the Anthropocene at all. In a paper published this week, he objects to designating the Anthropocene as starting in the 1950s—and then he objects to delineating the Anthropocene, or indeed any new geological epoch, by name. “Keep the use of the term informal,” he told me. “Don’t make it rigid. Keep it informal so people can say the early-agricultural Anthropocene, or the industrial-era Anthropocene.”
  • “This is the age of geochemical dating,” he said. Geologists have stopped looking to the ICS to place each rock sample into the rock sequence. Instead, field geologists use laboratory techniques to get a precise year or century of origin for each rock sample. “The community just doesn’t care about these definitions,” he said.

A smarter way to think about willpower - The Washington Post

  • in a self-report questionnaire completed by more than 80,000 American adults, self-control ranked lowest among 24 strengths of character.
  • three out of four parents said they thought self-control has declined in the past half-century.
  • Without a time machine that allows us to travel backward and compare Americans from different decades on the same self-control measures, we can’t be sure. Indeed, the scant scientific evidence on the question suggests that if anything, the capacity to delay gratification may be increasing.
  • there are plenty of behaviors that require self-control that have held steady or even improved in recent decades
  • Cigarette smoking has fallen sharply since the Mad Men days.
  • Alcohol consumption peaked in 1980 and has fallen back to the same level as 1960
  • Seat belts are now used by 9 out of 10 motorists.
  • the ratio of household consumption to household net worth just hit a postwar low: In 2018 consumption was 13.2 percent of net worth, down from 16.3 percent in 1946.
  • it isn’t clear that savings habits have worsened since World War II.
  • Nevertheless, like every generation before us, we crave more self-control.
  • science shows that helping people do better in the internal tug-of-war of self-control depends on creating the right external environment.
  • some temptations require hard paternalism
  • some choices are not in our best interest. Taxing, regulating, restricting or even banning especially addictive drugs may lead to more freedom
  • Cellphones and soda
  • the benefits of constraining access may, in some cases, justify the costs
  • we recommend nudges — subtle changes in how choices are framed that make doing what’s in our long-term interest more obvious, easier or more attractive
  • deploy science-backed strategies that make self-control easier.
  • putting temptations out of sight and out of reach:
  • disabling apps that, upon reflection, do more harm than good.
  • Anything you can do to put time and effort between you and indulgence makes self-control easier.

Living Another Day, Thanks to Grandparents Who Couldn't Sleep - The New York Times

  • A new study, published Tuesday in Proceedings of the Royal Society B, suggests that the way sleep patterns change with age may be an evolutionary adaptation that helped our ancestors survive the night by ensuring one person in a community was awake at all times. The researchers called this phenomenon the “poorly sleeping grandparent hypothesis,” suggesting that an older member of a community who woke before dawn might have been crucial to spotting the threat of a hungry predator while younger people were still asleep. It may explain why people slept in mixed-age groups through much of human history.
  • The Hadza sleeping environment may have similarities to that of earlier humans, researchers said. They sleep outdoors or in grass huts in groups of 20 to 30 people without artificially regulating temperature or light. These conditions provide a suitable window to study the evolutionary aspects of sleep.
  • more than 220 total hours of sleep observation, researchers found only 18 minutes when all adults were sound asleep simultaneously. Typically, older participants in their 50s and 60s went to bed earlier and woke up earlier than those in their 20s and 30s. On average, more than a third of the group was alert, or lightly dozing, at any given time.
  • “We have a propensity to overcategorize things as disorders in the West,” said David Samson, an author of the study and an assistant professor of anthropology at the University of Toronto. “It might help elderly individuals to know changes they’re experiencing have an evolutionary reason.”
  • “The variation may be partially explained by genetics,” she said, “but there are environmental conditions too.” As people age, their social needs and level of activity change, potentially affecting their sleep patterns.
  • there is evidence of a genetic link, she added, pointing out that sleep quality declined among the older Hadza even while they remained active hunters and gatherers.

At the Existentialist Café: Freedom, Being, and Apricot Cocktails with Jean-P...

  • The phenomenologists’ leading thinker, Edmund Husserl, provided a rallying cry, ‘To the things themselves!’ It meant: don’t waste time on the interpretations that accrue upon things, and especially don’t waste time wondering whether the things are real. Just look at this that’s presenting itself to you, whatever this may be, and describe it as precisely as possible.
  • You might think you have defined me by some label, but you are wrong, for I am always a work in progress. I create myself constantly through action, and this is so fundamental to my human condition that, for Sartre, it is the human condition, from the moment of first consciousness to the moment when death wipes it out. I am my own freedom: no more, no less.
  • Sartre wrote like a novelist — not surprisingly, since he was one. In his novels, short stories and plays as well as in his philosophical treatises, he wrote about the physical sensations of the world and the structures and moods of human life. Above all, he wrote about one big subject: what it meant to be free. Freedom, for him, lay at the heart of all human experience, and this set humans apart from all other kinds of object.
  • Sartre listened to his problem and said simply, ‘You are free, therefore choose — that is to say, invent.’ No signs are vouchsafed in this world, he said. None of the old authorities can relieve you of the burden of freedom. You can weigh up moral or practical considerations as carefully as you like, but ultimately you must take the plunge and do something, and it’s up to you what that something is.
  • Even if the situation is unbearable — perhaps you are facing execution, or sitting in a Gestapo prison, or about to fall off a cliff — you are still free to decide what to make of it in mind and deed. Starting from where you are now, you choose. And in choosing, you also choose who you will be.
  • The war had made people realise that they and their fellow humans were capable of departing entirely from civilised norms; no wonder the idea of a fixed human nature seemed questionable.
  • If this sounds difficult and unnerving, it’s because it is. Sartre does not deny that the need to keep making decisions brings constant anxiety. He heightens this anxiety by pointing out that what you do really matters. You should make your choices as though you were choosing on behalf of the whole of humanity, taking the entire burden of responsibility for how the human race behaves. If you avoid this responsibility by fooling yourself that you are the victim of circumstance or of someone else’s bad advice, you are failing to meet the demands of human life and choosing a fake existence, cut off from your own ‘authenticity’.
  • Along with the terrifying side of this comes a great promise: Sartre’s existentialism implies that it is possible to be authentic and free, as long as you keep up the effort.
  • almost all agreed that it was, as an article in Les nouvelles littéraires phrased it, a ‘sickening mixture of philosophic pretentiousness, equivocal dreams, physiological technicalities, morbid tastes and hesitant eroticism … an introspective embryo that one would take distinct pleasure in crushing’.
  • he offered a philosophy designed for a species that had just scared the hell out of itself, but that finally felt ready to grow up and take responsibility.
  • In this rebellious world, just as with the Parisian bohemians and Dadaists in earlier generations, everything that was dangerous and provocative was good, and everything that was nice or bourgeois was bad.
  • Such interweaving of ideas and life had a long pedigree, although the existentialists gave it a new twist. Stoic and Epicurean thinkers in the classical world had practised philosophy as a means of living well, rather than of seeking knowledge or wisdom for their own sake. By reflecting on life’s vagaries in philosophical ways, they believed they could become more resilient, more able to rise above circumstances, and better equipped to manage grief, fear, anger, disappointment or anxiety.
  • In the tradition they passed on, philosophy is neither a pure intellectual pursuit nor a collection of cheap self-help tricks, but a discipline for flourishing and living a fully human, responsible life.
  • For Kierkegaard, Descartes had things back to front. In his own view, human existence comes first: it is the starting point for everything we do, not the result of a logical deduction. My existence is active: I live it and choose it, and this precedes any statement I can make about myself.
  • Studying our own moral genealogy cannot help us to escape or transcend ourselves. But it can enable us to see our illusions more clearly and lead a more vital, assertive existence.
  • What was needed, he felt, was not high moral or theological ideals, but a deeply critical form of cultural history or ‘genealogy’ that would uncover the reasons why we humans are as we are, and how we came to be that way. For him, all philosophy could even be redefined as a form of psychology, or history.
  • For those oppressed on grounds of race or class, or for those fighting against colonialism, existentialism offered a change of perspective — literally, as Sartre proposed that all situations be judged according to how they appeared in the eyes of those most oppressed, or those whose suffering was greatest.
  • She observed that we need not expect moral philosophers to ‘live by’ their ideas in a simplistic way, as if they were following a set of rules. But we can expect them to show how their ideas are lived in. We should be able to look in through the windows of a philosophy, as it were, and see how people occupy it, how they move about and how they conduct themselves.
  • the existentialists inhabited their historical and personal world, as they inhabited their ideas. This notion of ‘inhabited philosophy’ is one I’ve borrowed from the English philosopher and novelist Iris Murdoch, who wrote the first full-length book on Sartre and was an early adopter of existentialism
  • What is existentialism anyway?
  • An existentialist who is also phenomenological provides no easy rules for dealing with this condition, but instead concentrates on describing lived experience as it presents itself. — By describing experience well, he or she hopes to understand this existence and awaken us to ways of living more authentic lives.
  • Existentialists concern themselves with individual, concrete human existence. — They consider human existence different from the kind of being other things have. Other entities are what they are, but as a human I am whatever I choose to make of myself at every moment. I am free — and therefore I’m responsible for everything I do, a dizzying fact which causes — an anxiety inseparable from human existence itself.
  • On the other hand, I am only free within situations, which can include factors in my own biology and psychology as well as physical, historical and social variables of the world into which I have been thrown. — Despite the limitations, I always want more: I am passionately involved in personal projects of all kinds. — Human existence is thus ambiguous: at once boxed in by borders and yet transcendent and exhilarating. —
  • The first part of this is straightforward: a phenomenologist’s job is to describe. This is the activity that Husserl kept reminding his students to do. It meant stripping away distractions, habits, clichés of thought, presumptions and received ideas, in order to return our attention to what he called the ‘things themselves’. We must fix our beady gaze on them and capture them exactly as they appear, rather than as we think they are supposed to be.
  • Husserl therefore says that, to phenomenologically describe a cup of coffee, I should set aside both the abstract suppositions and any intrusive emotional associations. Then I can concentrate on the dark, fragrant, rich phenomenon in front of me now. This ‘setting aside’ or ‘bracketing out’ of speculative add-ons Husserl called epoché — a term borrowed from the ancient Sceptics,
  • The point about rigour is crucial; it brings us back to the first half of the command to describe phenomena. A phenomenologist cannot get away with listening to a piece of music and saying, ‘How lovely!’ He or she must ask: is it plaintive? is it dignified? is it colossal and sublime? The point is to keep coming back to the ‘things themselves’ — phenomena stripped of their conceptual baggage — so as to bail out weak or extraneous material and get to the heart of the experience.
  • Husserlian ‘bracketing out’ or epoché allows the phenomenologist to temporarily ignore the question ‘But is it real?’, in order to ask how a person experiences his or her world. Phenomenology gives a formal mode of access to human experience. It lets philosophers talk about life more or less as non-philosophers do, while still being able to tell themselves they are being methodical and rigorous.
  • Besides claiming to transform the way we think about reality, phenomenologists promised to change how we think about ourselves. They believed that we should not try to find out what the human mind is, as if it were some kind of substance. Instead, we should consider what it does, and how it grasps its experiences.
  • For Brentano, this reaching towards objects is what our minds do all the time. Our thoughts are invariably of or about something, he wrote: in love, something is loved, in hatred, something is hated, in judgement, something is affirmed or denied. Even when I imagine an object that isn’t there, my mental structure is still one of ‘about-ness’ or ‘of-ness’.
  • Except in deepest sleep, my mind is always engaged in this aboutness: it has ‘intentionality’. Having taken the germ of this from Brentano, Husserl made it central to his whole philosophy.
  • Husserl saw in the idea of intentionality a way to sidestep two great unsolved puzzles of philosophical history: the question of what objects ‘really’ are, and the question of what the mind ‘really’ is. By doing the epoché and bracketing out all consideration of reality from both topics, one is freed to concentrate on the relationship in the middle. One can apply one’s descriptive energies to the endless dance of intentionality that takes place in our lives: the whirl of our minds as they seize their intended phenomena one after the other and whisk them around the floor,
  • Understood in this way, the mind hardly is anything at all: it is its aboutness. This makes the human mind (and possibly some animal minds) different from any other naturally occurring entity. Nothing else can be as thoroughly about or of things as the mind is:
  • Some Eastern meditation techniques aim to still this scurrying creature, but the extreme difficulty of this shows how unnatural it is to be mentally inert. Left to itself, the mind reaches out in all directions as long as it is awake — and even carries on doing it in the dreaming phase of its sleep.
  • a mind that is experiencing nothing, imagining nothing, or speculating about nothing can hardly be said to be a mind at all.
  • Three simple ideas — description, phenomenon, intentionality — provided enough inspiration to keep roomfuls of Husserlian assistants busy in Freiburg for decades. With all of human existence awaiting their attention, how could they ever run out of things to do?
  • For Sartre, this gives the mind an immense freedom. If we are nothing but what we think about, then no predefined ‘inner nature’ can hold us back. We are protean.
  • way of this interpretation. Real, not real; inside, outside; what difference did it make? Reflecting on this, Husserl began turning his phenomenology into a branch of ‘idealism’ — the philosophical tradition which denied external reality and defined everything as a kind of private hallucination.
  • For Sartre, if we try to shut ourselves up inside our own minds, ‘in a nice warm room with the shutters closed’, we cease to exist. We have no cosy home: being out on the dusty road is the very definition of what we are.
  • One might think that, if Heidegger had anything worth saying, he could have communicated it in ordinary language. The fact is that he does not want to be ordinary, and he may not even want to communicate in the usual sense. He wants to make the familiar obscure, and to vex us. George Steiner thought that Heidegger’s purpose was less to be understood than to be experienced through a ‘felt strangeness’.
  • He takes Dasein in its most ordinary moments, then talks about it in the most innovative way he can. For Heidegger, Dasein’s everyday Being is right here: it is Being-in-the-world, or In-der-Welt-sein. The main feature of Dasein’s everyday Being-in-the-world right here is that it is usually busy doing something.
  • Thus, for Heidegger, all Being-in-the-world is also a ‘Being-with’ or Mitsein. We cohabit with others in a ‘with-world’, or Mitwelt. The old philosophical problem of how we prove the existence of other minds has now vanished. Dasein swims in the with-world long before it wonders about other minds.
  • Sometimes the best-educated people were those least inclined to take the Nazis seriously, dismissing them as too absurd to last. Karl Jaspers was one of those who made this mistake, as he later recalled, and Beauvoir observed similar dismissive attitudes among the French students in Berlin.
  • In any case, most of those who disagreed with Hitler’s ideology soon learned to keep their view to themselves. If a Nazi parade passed on the street, they would either slip out of view or give the obligatory salute like everyone else, telling themselves that the gesture meant nothing if they did not believe in it. As the psychologist Bruno Bettelheim later wrote of this period, few people will risk their life for such a small thing as raising an arm — yet that is how one’s powers of resistance are eroded away, and eventually one’s responsibility and integrity go with them.
  • for Arendt, if you do not respond adequately when the times demand it, you show a lack of imagination and attention that is as dangerous as deliberately committing an abuse. It amounts to disobeying the one command she had absorbed from Heidegger in those Marburg days: Think!
  • ‘Everything takes place under a kind of anaesthesia. Objectively dreadful events produce a thin, puny emotional response. Murders are committed like schoolboy pranks. Humiliation and moral decay are accepted like minor incidents.’ Haffner thought modernity itself was partly to blame: people had become yoked to their habits and to mass media, forgetting to stop and think, or to disrupt their routines long enough to question what was going on.
  • Heidegger’s former lover and student Hannah Arendt would argue, in her 1951 study The Origins of Totalitarianism, that totalitarian movements thrived at least partly because of this fragmentation in modern lives, which made people more vulnerable to being swept away by demagogues. Elsewhere, she coined the phrase ‘the banality of evil’ to describe the most extreme failures of personal moral awareness.
  • His communicative ideal fed into a whole theory of history: he traced all civilisation to an ‘Axial Period’ in the fifth century BC, during which philosophy and culture exploded simultaneously in Europe, the Middle East and Asia, as though a great bubble of minds had erupted from the earth’s surface. ‘True philosophy needs communion to come into existence,’ he wrote, and added, ‘Uncommunicativeness in a philosopher is virtually a criterion of the untruth of his thinking.’
  • The idea of being called to authenticity became a major theme in later existentialism, the call being interpreted as saying something like ‘Be yourself!’, as opposed to being phony. For Heidegger, the call is more fundamental than that. It is a call to take up a self that you didn’t know you had: to wake up to your Being. Moreover, it is a call to action. It requires you to do something: to take a decision of some sort.
  • Being and Time contained at least one big idea that should have been of use in resisting totalitarianism. Dasein, Heidegger wrote there, tends to fall under the sway of something called das Man or ‘the they’ — an impersonal entity that robs us of the freedom to think for ourselves. To live authentically requires resisting or outwitting this influence, but this is not easy because das Man is so nebulous. Man in German does not mean ‘man’ as in English (that’s der Mann), but a neutral abstraction, something like ‘one’ in the English phrase ‘one doesn’t do that’,
  • for Heidegger, das Man is me. It is everywhere and nowhere; it is nothing definite, but each of us is it. As with Being, it is so ubiquitous that it is difficult to see. If I am not careful, however, das Man takes over the important decisions that should be my own. It drains away my responsibility or ‘answerability’. As Arendt might put it, we slip into banality, failing to think.
  • Jaspers focused on what he called Grenzsituationen — border situations, or limit situations. These are the moments when one finds oneself constrained or boxed in by what is happening, but at the same time pushed by these events towards the limits or outer edge of normal experience. For example, you might have to make a life-or-death choice, or something might remind you suddenly of your mortality,
  • Jaspers’ interest in border situations probably had much to do with his own early confrontation with mortality. From childhood, he had suffered from a heart condition so severe that he always expected to die at any moment. He also had emphysema, which forced him to speak slowly, taking long pauses to catch his breath. Both illnesses meant that he had to budget his energies with care in order to get his work done without endangering his life.
  • If I am to resist das Man, I must become answerable to the call of my ‘voice of conscience’. This call does not come from God, as a traditional Christian definition of the voice of conscience might suppose. It comes from a truly existentialist source: my own authentic self. Alas, this voice is one I do not recognise and may not hear, because it is not the voice of my habitual ‘they-self’. It is an alien or uncanny version of my usual voice. I am familiar with my they-self, but not with my unalienated voice — so, in a weird twist, my real voice is the one that sounds strangest to me.
  • Marcel developed a strongly theological branch of existentialism. His faith distanced him from both Sartre and Heidegger, but he shared a sense of how history makes demands on individuals. In his essay ‘On the Ontological Mystery’, written in 1932 and published in the fateful year of 1933, Marcel wrote of the human tendency to become stuck in habits, received ideas, and a narrow-minded attachment to possessions and familiar scenes. Instead, he urged his readers to develop a capacity for remaining ‘available’ to situations as they arise. Similar ideas of disponibilité or availability had been explored by other writers,
  • Marcel made it his central existential imperative. He was aware of how rare and difficult it was. Most people fall into what he calls ‘crispation’: a tensed, encrusted shape in life — ‘as though each one of us secreted a kind of shell which gradually hardened and imprisoned him’.
  • Bettelheim later observed that, under Nazism, only a few people realised at once that life could not continue unaltered: these were the ones who got away quickly. Bettelheim himself was not among them. Caught in Austria when Hitler annexed it, he was sent first to Dachau and then to Buchenwald, but was then released in a mass amnesty to celebrate Hitler’s birthday in 1939 — an extraordinary reprieve, after which he left at once for America.
  • we are used to reading philosophy as offering a universal message for all times and places — or at least as aiming to do so. But Heidegger disliked the notion of universal truths or universal humanity, which he considered a fantasy. For him, Dasein is not defined by shared faculties of reason and understanding, as the Enlightenment philosophers thought. Still less is it defined by any kind of transcendent eternal soul, as in religious tradition. We do not exist on a higher, eternal plane at all. Dasein’s Being is local: it has a historical situation, and is constituted in time and place.
  • For Marcel, learning to stay open to reality in this way is the philosopher’s prime job. Everyone can do it, but the philosopher is the one who is called on above all to stay awake, so as to be the first to sound the alarm if something seems wrong.
  • Second, it also means understanding that we are historical beings, and grasping the demands our particular historical situation is making on us. In what Heidegger calls ‘anticipatory resoluteness’, Dasein discovers ‘that its uttermost possibility lies in giving itself up’. At that moment, through Being-towards-death and resoluteness in facing up to one’s time, one is freed from the they-self and attains one’s true, authentic self.
  • If we are temporal beings by our very nature, then authentic existence means accepting, first, that we are finite and mortal. We will die: this all-important realisation is what Heidegger calls authentic ‘Being-towards-Death’, and it is fundamental to his philosophy.
  • Hannah Arendt, instead, left early on: she had the benefit of a powerful warning. Just after the Nazi takeover, in spring 1933, she had been arrested while researching materials on anti-Semitism for the German Zionist Organisation at Berlin’s Prussian State Library. Her apartment was searched; both she and her mother were locked up briefly, then released. They fled, without stopping to arrange travel documents. They crossed to Czechoslovakia (then still safe) by a method that sounds almost too fabulous to be true: a sympathetic German family on the border had a house with its front door in Germany and its back door in Czechoslovakia. The family would invite people for dinner, then let them leave through the back door at night.
  • As Sartre argued in his 1943 review of The Stranger, basic phenomenological principles show that experience comes to us already charged with significance. A piano sonata is a melancholy evocation of longing. If I watch a soccer match, I see it as a soccer match, not as a meaningless scene in which a number of people run around taking turns to apply their lower limbs to a spherical object. If the latter is what I’m seeing, then I am not watching some more essential, truer version of soccer; I am failing to watch it properly as soccer at all.
  • Much as they liked Camus personally, neither Sartre nor Beauvoir accepted his vision of absurdity. For them, life is not absurd, even when viewed on a cosmic scale, and nothing can be gained by saying it is. Life for them is full of real meaning, although that meaning emerges differently for each of us.
  • For Sartre, we show bad faith whenever we portray ourselves as passive creations of our race, class, job, history, nation, family, heredity, childhood influences, events, or even hidden drives in our subconscious which we claim are out of our control. It is not that such factors are unimportant: class and race, in particular, he acknowledged as powerful forces in people’s lives, and Simone de Beauvoir would soon add gender to that list.
  • Sartre takes his argument to an extreme point by asserting that even war, imprisonment or the prospect of imminent death cannot take away my existential freedom. They form part of my ‘situation’, and this may be an extreme and intolerable situation, but it still provides only a context for whatever I choose to do next. If I am about to die, I can decide how to face that death. Sartre here resurrects the ancient Stoic idea that I may not choose what happens to me, but I can choose what to make of it, spiritually speaking.
  • But the Stoics cultivated indifference in the face of terrible events, whereas Sartre thought we should remain passionately, even furiously engaged with what happens to us and with what we can achieve. We should not expect freedom to be anything less than fiendishly difficult.
  • Freedom does not mean entirely unconstrained movement, and it certainly does not mean acting randomly. We often mistake the very things that enable us to be free — context, meaning, facticity, situation, a general direction in our lives — for things that define us and take away our freedom. It is only with all of these that we can be free in a real sense.
  • Nor did he mean that privileged groups have the right to pontificate to the poor and downtrodden about the need to ‘take responsibility’ for themselves. That would be a grotesque misreading of Sartre’s point, since his sympathy in any encounter always lay with the more oppressed side. But for each of us — for me — to be in good faith means not making excuses for myself.
  • Camus’ novel gives us a deliberately understated vision of heroism and decisive action compared to those of Sartre and Beauvoir. One can only do so much. It can look like defeatism, but it shows a more realistic perception of what it takes to actually accomplish difficult tasks like liberating one’s country.
  • Camus just kept returning to his core principle: no torture, no killing — at least not with state approval. Beauvoir and Sartre believed they were taking a more subtle and more realistic view. If asked why a couple of innocuous philosophers had suddenly become so harsh, they would have said it was because the war had changed them in profound ways. It had shown them that one’s duties to humanity could be more complicated than they seemed. ‘The war really divided my life in two,’ Sartre said later.
  • Poets and artists ‘let things be’, but they also let things come out and show themselves. They help to ease things into ‘unconcealment’ (Unverborgenheit), which is Heidegger’s rendition of the Greek term alētheia, usually translated as ‘truth’. This is a deeper kind of truth than the mere correspondence of a statement to reality, as when we say ‘The cat is on the mat’ and point to a mat with a cat on it. Long before we can do this, both cat and mat must ‘stand forth out of concealedness’. They must un-hide themselves.
  • Heidegger does not use the word ‘consciousness’ here because — as with his earlier work — he is trying to make us think in a radically different way about ourselves. We are not to think of the mind as an empty cavern, or as a container filled with representations of things. We are not even supposed to think of it as firing off arrows of intentional ‘aboutness’, as in the earlier phenomenology of Brentano. Instead, Heidegger draws us into the depths of his Schwarzwald, and asks us to imagine a gap with sunlight filtering in. We remain in the forest, but we provide a relatively open spot where other beings can bask for a moment. If we did not do this, everything would remain in the thickets, hidden even to itself.
  • The astronomer Carl Sagan began his 1980 television series Cosmos by saying that human beings, though made of the same stuff as the stars, are conscious and are therefore ‘a way for the cosmos to know itself’. Merleau-Ponty similarly quoted his favourite painter Cézanne as saying, ‘The landscape thinks itself in me, and I am its consciousness.’ This is something like what Heidegger thinks humanity contributes to the earth. We are not made of spiritual nothingness; we are part of Being, but we also bring something unique with us. It is not much: a little open space, perhaps with a path and a bench like the one the young Heidegger used to sit on to do his homework. But through us, the miracle occurs.
  • Beauty aside, Heidegger’s late writing can also be troubling, with its increasingly mystical notion of what it is to be human. If one speaks of a human being mainly as an open space or a clearing, or a means of ‘letting beings be’ and dwelling poetically on the earth, then one doesn’t seem to be talking about any recognisable person. The old Dasein has become less human than ever. It is now a forestry feature.
  • Even today, Jaspers, the dedicated communicator, is far less widely read than Heidegger, who has influenced architects, social theorists, critics, psychologists, artists, film-makers, environmental activists, and innumerable students and enthusiasts — including the later deconstructionist and post-structuralist schools, which took their starting point from his late thinking. Having spent the late 1940s as an outsider and then been rehabilitated, Heidegger became the overwhelming presence in university philosophy all over the European continent from then on.
  • As Levinas reflected on this experience, it helped to lead him to a philosophy that was essentially ethical, rather than ontological like Heidegger’s. He developed his ideas from the work of Jewish theologian Martin Buber, whose I and Thou in 1923 had distinguished between my relationship with an impersonal ‘it’ or ‘them’, and the direct personal encounter I have with a ‘you’. Levinas took it further: when I encounter you, we normally meet face-to-face, and it is through your face that you, as another person, can make ethical demands on me. This is very different from Heidegger’s Mitsein or Being-with, which suggests a group of people standing alongside one another, shoulder to shoulder as if in solidarity — perhaps as a unified nation or Volk.
  • For Levinas, we literally face each other, one individual at a time, and that relationship becomes one of communication and moral expectation. We do not merge; we respond to one another. Instead of being co-opted into playing some role in my personal drama of authenticity, you look me in the eyes — and you remain Other. You remain you.
  • This relationship is more fundamental than the self, more fundamental than consciousness, more fundamental even than Being — and it brings an unavoidable ethical obligation. Ever since Husserl, phenomenologists and existentialists had been trying to stretch the definition of existence to incorporate our social lives and relationships. Levinas did more: he turned philosophy around entirely so that these relationships were the foundation of our existence, not an extension of it.
  • Her last work, The Need for Roots, argues, among other things, that none of us has rights, but each one of us has a near-infinite degree of duty and obligation to the other. Whatever the underlying cause of her death — and anorexia nervosa seems to have been involved — no one could deny that she lived out her philosophy with total commitment. Of all the lives touched on in this book, hers is surely the most profound and challenging application of Iris Murdoch’s notion that a philosophy can be ‘inhabited’.
  • Other thinkers took radical ethical turns during the war years. The most extreme was Simone Weil, who actually tried to live by the principle of putting other people’s ethical demands first. Having returned to France after her travels through Germany in 1932, she had worked in a factory so as to experience the degrading nature of such work for herself. When France fell in 1940, her family fled to Marseilles (against her protests), and later to the US and to Britain. Even in exile, Weil made extraordinary sacrifices. If there were people in the world who could not sleep in a bed, she would not do so either, so she slept on the floor.
  • The mystery tradition had roots in Kierkegaard’s ‘leap of faith’. It owed much to the other great nineteenth-century mystic of the impossible, Dostoevsky, and to older theological notions. But it also grew from the protracted trauma that was the first half of the twentieth century. Since 1914, and especially since 1939, people in Europe and elsewhere had come to the realisation that we cannot fully know or trust ourselves; that we have no excuses or explanations for what we do — and yet that we must ground our existence and relationships on something firm, because otherwise we cannot survive.
  • One striking link between these radical ethical thinkers, all on the fringes of our main story, is that they had religious faith. They also granted a special role to the notion of ‘mystery’ — that which cannot be known, calculated or understood, especially when it concerns our relationships with each other. Heidegger was different from them, since he rejected the religion he grew up with and had no real interest in ethics — probably as a consequence of his having no real interest in the human.
  • Meanwhile, the Christian existentialist Gabriel Marcel was also still arguing, as he had since the 1930s, that ethics trumps everything else in philosophy and that our duty to each other is so great as to play the role of a transcendent ‘mystery’. He too had been led to this position partly by a wartime experience: during the First World War he had worked for the Red Cross’ Information Service, with the unenviable job of answering relatives’ inquiries about missing soldiers. Whenever news came, he passed it on, and usually it was not good. As Marcel later said, this task permanently inoculated him against warmongering rhetoric of any kind, and it made him aware of the power of what is unknown in our lives.
  • As the play’s much-quoted and frequently misunderstood final line has it: ‘Hell is other people.’ Sartre later explained that he did not mean to say that other people were hellish in general. He meant that after death we become frozen in their view, unable any longer to fend off their interpretation. In life, we can still do something to manage the impression we make; in death, this freedom goes and we are left entombed in other people’s memories and perceptions.
  • We have to do two near-impossible things at once: understand ourselves as limited by circumstances, and yet continue to pursue our projects as though we are truly in control. In Beauvoir’s view, existentialism is the philosophy that best enables us to do this, because it concerns itself so deeply with both freedom and contingency. It acknowledges the radical and terrifying scope of our freedom in life, but also the concrete influences that other philosophies tend to ignore: history, the body, social relationships and the environment.
  • The aspects of our existence that limit us, Merleau-Ponty says, are the very same ones that bind us to the world and give us scope for action and perception. They make us what we are. Sartre acknowledged the need for this trade-off, but he found it more painful to accept. Everything in him longed to be free of bonds, of impediments and limitations
  • Of course we have to learn this skill of interpreting and anticipating the world, and this happens in early childhood, which is why Merleau-Ponty thought child psychology was essential to philosophy. This is an extraordinary insight. Apart from Rousseau, very few philosophers before him had taken childhood seriously; most wrote as though all human experience were that of a fully conscious, rational, verbal adult who has been dropped into this world from the sky — perhaps by a stork.
  • For Merleau-Ponty, we cannot understand our experience if we don’t think of ourselves in part as overgrown babies. We fall for optical illusions because we once learned to see the world in terms of shapes, objects and things relevant to our own interests. Our first perceptions came to us in tandem with our first active experiments in observing the world and reaching out to explore it, and are still linked with those experiences.
  • Another factor in all of this, for Merleau-Ponty, is our social existence: we cannot thrive without others, or not for long, and we need this especially in early life. This makes solipsistic speculation about the reality of others ridiculous; we could never engage in such speculation if we hadn’t already been formed by them.
  • As Descartes could have said (but didn’t), ‘I think, therefore other people exist.’ We grow up with people playing with us, pointing things out, talking, listening, and getting us used to reading emotions and movements; this is how we become capable, reflective, smoothly integrated beings.
  • In general, Merleau-Ponty thinks human experience only makes sense if we abandon philosophy’s time-honoured habit of starting with a solitary, capsule-like, immobile adult self, isolated from its body and world, which must then be connected up again — adding each element around it as though adding clothing to a doll. Instead, for him, we slide from the womb to the birth canal to an equally close and total immersion in the world. That immersion continues as long as we live, although we may also cultivate the art of partially withdrawing from time to time when we want to think or daydream.
  • When he looks for his own metaphor to describe how he sees consciousness, he comes up with a beautiful one: consciousness, he suggests, is like a ‘fold’ in the world, as though someone had crumpled a piece of cloth to make a little nest or hollow. It stays for a while, before eventually being unfolded and smoothed away. There is something seductive, even erotic, in this idea of my conscious self as an improvised pouch in the cloth of the world. I still have my privacy — my withdrawing room. But I am part of the world’s fabric, and I remain formed out of it for as long as I am here.
  • By the time of these works, Merleau-Ponty is taking his desire to describe experience to the outer limits of what language can convey. Just as with the late Husserl or Heidegger, or Sartre in his Flaubert book, we see a philosopher venturing so far from shore that we can barely follow. Emmanuel Levinas would head out to the fringes too, eventually becoming incomprehensible to all but his most patient initiates.
  • Sartre once remarked — speaking of a disagreement they had about Husserl in 1941 — that ‘we discovered, astounded, that our conflicts had, at times, stemmed from our childhood, or went back to the elementary differences of our two organisms’. Merleau-Ponty also said in an interview that Sartre’s work seemed strange to him, not because of philosophical differences, but because of a certain ‘register of feeling’, especially in Nausea, that he could not share. Their difference was one of temperament and of the whole way the world presented itself to them.
  • The two also differed in their purpose. When Sartre writes about the body or other aspects of experience, he generally does it in order to make a different point. He expertly evokes the grace of his café waiter, gliding between the tables, bending at an angle just so, steering the drink-laden tray through the air on the tips of his fingers — but he does it all in order to illustrate his ideas about bad faith. When Merleau-Ponty writes about skilled and graceful movement, the movement itself is his point. This is the thing he wants to understand.
  • We can never move definitively from ignorance to certainty, for the thread of the inquiry will constantly lead us back to ignorance again. This is the most attractive description of philosophy I’ve ever read, and the best argument for why it is worth doing, even (or especially) when it takes us no distance at all from our starting point.
  • By prioritising perception, the body, social life and childhood development, Merleau-Ponty gathered up philosophy’s far-flung outsider subjects and brought them in to occupy the centre of his thought.
  • In his inaugural lecture at the Collège de France on 15 January 1953, published as In Praise of Philosophy, he said that philosophers should concern themselves above all with whatever is ambiguous in our experience. At the same time, they should think clearly about these ambiguities, using reason and science. Thus, he said, ‘The philosopher is marked by the distinguishing trait that he possesses inseparably the taste for evidence and the feeling for ambiguity.’ A constant movement is required between these two
  • As Sartre wrote in response to Hiroshima, humanity had now gained the power to wipe itself out, and must decide every single day that it wanted to live. Camus also wrote that humanity faced the task of choosing between collective suicide and a more intelligent use of its technology — ‘between hell and reason’. After 1945, there seemed little reason to trust in humanity’s ability to choose well.
  • Merleau-Ponty observed in a lecture of 1951 that, more than any previous century, the twentieth century had reminded people how ‘contingent’ their lives were — how at the mercy of historical events and other changes that they could not control. This feeling went on long after the war ended. After the A-bombs were dropped on Hiroshima and Nagasaki, many feared that a Third World War would not be long in coming, this time between the Soviet Union and the United States.
5More

A reason to believe - 2 views

  • They’re finding that religion may, in fact, be a byproduct of the way our brains work, growing from cognitive tendencies to seek order from chaos, to anthropomorphize our environment and to believe the world around us was created for our use.
  • “Religion is one of the big ways that human societies have hit on as a solution to induce unrelated individuals to be nice to each other,” says Norenzayan.
  • “The problem is, the more you look inward toward your religious group and its claims of virtue, the less you look outward and the more distrustful you are of others,” he says.
  • ...1 more annotation...
  • That distrust causes much of the world’s strife and violence and is one of the reasons the “new atheists,” including British evolutionary biologist Richard Dawkins, PhD, and neuroscientist Sam Harris, PhD, want to see religion disappear.
  •  
    I find this article interesting because it analyzes the origins of religion from the perspective of basic human nature.
17More

Is our world a simulation? Why some scientists say it's more likely than not | Technolo... - 3 views

  • Musk is just one of the people in Silicon Valley to take a keen interest in the “simulation hypothesis”, which argues that what we experience as reality is actually a giant computer simulation created by a more sophisticated intelligence
  • The hypothesis was formalized by Oxford University’s Nick Bostrom in 2003 (although the idea dates back as far as the 17th-century philosopher René Descartes). In a paper titled “Are You Living In a Simulation?”, Bostrom suggested that members of an advanced “posthuman” civilization with vast computing power might choose to run simulations of their ancestors in the universe.
  • If we believe that there is nothing supernatural about what causes consciousness and it’s merely the product of a very complex architecture in the human brain, we’ll be able to reproduce it. “Soon there will be nothing technical standing in the way to making machines that have their own consciousness.”
  • ...14 more annotations...
  • At the same time, videogames are becoming more and more sophisticated and in the future we’ll be able to have simulations of conscious entities inside them.
  • “Forty years ago we had Pong – two rectangles and a dot. That’s where we were. Now 40 years later, we have photorealistic, 3D simulations with millions of people playing simultaneously and it’s getting better every year. And soon we’ll have virtual reality, we’ll have augmented reality,” said Musk. “If you assume any rate of improvement at all, then the games will become indistinguishable from reality.”
  • “If one progresses at the current rate of technology a few decades into the future, very quickly we will be a society where there are artificial entities living in simulations that are much more abundant than human beings.
  • If there are many more simulated minds than organic ones, then the chances of us being among the real minds start to look more and more unlikely (a minimal sketch of this arithmetic follows the list). As Terrile puts it: “If in the future there are more digital people living in simulated environments than there are today, then what is to say we are not part of that already?”
  • Reasons to believe that the universe is a simulation include the fact that it behaves mathematically and is broken up into pieces (subatomic particles) like a pixelated video game. “Even things that we think of as continuous – time, energy, space, volume – all have a finite limit to their size. If that’s the case, then our universe is both computable and finite. Those properties allow the universe to be simulated,” Terrile said
  • “Is it logically possible that we are in a simulation? Yes. Are we probably in a simulation? I would say no,” said Max Tegmark, a professor of physics at MIT.
  • “In order to make the argument in the first place, we need to know what the fundamental laws of physics are where the simulations are being made. And if we are in a simulation then we have no clue what the laws of physics are. What I teach at MIT would be the simulated laws of physics,”
  • Terrile believes that recognizing that we are probably living in a simulation is as game-changing as Copernicus realizing that the Earth was not the center of the universe. “It was such a profound idea that it wasn’t even thought of as an assumption,”
  • That we might be in a simulation is, Terrile argues, a simpler explanation for our existence than the idea that we are the first generation to rise up from primordial ooze and evolve into molecules, biology and eventually intelligence and self-awareness. The simulation hypothesis also accounts for peculiarities in quantum mechanics, particularly the measurement problem, whereby things only become defined when they are observed.
  • “For decades it’s been a problem. Scientists have bent over backwards to eliminate the idea that we need a conscious observer. Maybe the real solution is you do need a conscious entity like a conscious player of a video game,
  • How can the hypothesis be put to the test?
  • scientists can look for hallmarks of simulation. “Suppose someone is simulating our universe – it would be very tempting to cut corners in ways that makes the simulation cheaper to run. You could look for evidence of that in an experiment,” said Tegmark
  • First, it provides a scientific basis for some kind of afterlife or larger domain of reality above our world. “You don’t need a miracle, faith or anything special to believe it. It comes naturally out of the laws of physics,”
  • it means we will soon have the same ability to create our own simulations. “We will have the power of mind and matter to be able to create whatever we want and occupy those worlds.”
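A note on the arithmetic behind the bullet above about simulated versus organic minds: the conclusion follows from a simple ratio, not from any physical measurement. Below is a minimal sketch of that ratio in Python; the population figures are invented purely for illustration and come from none of the people quoted.

```python
# Minimal sketch of the ratio step in the simulation argument.
# The population figures are invented for demonstration only.

def probability_unsimulated(organic_minds: float, simulated_minds: float) -> float:
    """Chance that a randomly chosen mind is organic rather than simulated."""
    return organic_minds / (organic_minds + simulated_minds)

organic = 1e11  # assumed rough count of humans ever born
for sims_per_human in (0, 10, 1_000, 1_000_000):
    simulated = organic * sims_per_human
    p = probability_unsimulated(organic, simulated)
    print(f"{sims_per_human:>9} simulated minds per organic one -> P(organic) = {p:.6f}")
```

Everything in the sketch hinges on the assumed ratio of simulated to organic minds, which is a premise of the argument rather than an observation.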
8More

More College Students Seem to Be Majoring in Perfectionism - The New York Times - 0 views

  • New data from American, Canadian and British college students indicates that perfectionism, especially when influenced by social media, has increased by 33 percent since 1989.
  • Thinking that others in their social network expect a lot of them is even more important to young adults than the expectations of parents and professors.
  • “Meritocracy places a strong need for young people to strive, perform and achieve,” he said. They have “increasingly unrealistic educational and professional expectations for themselves.”
  • ...5 more annotations...
  • Parents in my practice say they’re noticing how often their kids come away from Facebook and Instagram feeling depressed, ashamed and anxious, and how vulnerable they are to criticism and judgment, even from strangers, on their social media feeds.
  • Perfectionism is a personality trait or characteristic that is innate in many people. It is nurtured in some environments, notably in families where personal accomplishment, academic or otherwise, is rewarded:
  • The researchers looked at more than 41,000 students’ responses on the Multidimensional Perfectionism Scale, which not only measures degrees of perfectionism but also distinguishes among its three aspects: self-oriented, other-oriented and socially prescribed
  • the questions posed by the Multidimensional Perfectionism Scale, used in the study, may be informative. Among its 44 items, scaled from 1 (disagree) to 7 (agree), are statements like these: “When I am working on something, I can’t relax unless it’s perfect”; “The people around me expect me to succeed at everything I do”; “The better I do, the better I am expected to do.” The scale is not a clinical instrument, but the questions might be a good starting point for discussion (a toy scoring sketch follows this list).
  • It’s hard to tell how much social media is affecting your child’s self-image; many feel enormous pressure to be perfect off line, too. And it’s difficult to know what, if anything, parents can do about it, beyond offering empathy, reassurance and emotional support.
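The Multidimensional Perfectionism Scale described above is, mechanically, a set of 1-to-7 Likert items grouped into three subscales. The toy scoring sketch below shows that mechanism in miniature; the item-to-subscale grouping and the responses are invented for illustration and are not the instrument's published scoring key.

```python
# Toy scoring sketch for a Likert-style perfectionism questionnaire.
# The groupings and responses are invented; the real 44-item scale has its own key.
from statistics import mean

SUBSCALES = {
    "self-oriented": [
        "When I am working on something, I can't relax unless it's perfect.",
    ],
    "socially prescribed": [
        "The people around me expect me to succeed at everything I do.",
        "The better I do, the better I am expected to do.",
    ],
}

def score(responses):
    """Average the 1-7 ratings for the items in each subscale."""
    return {name: mean(responses[item] for item in items)
            for name, items in SUBSCALES.items()}

example_responses = {
    "When I am working on something, I can't relax unless it's perfect.": 6,
    "The people around me expect me to succeed at everything I do.": 5,
    "The better I do, the better I am expected to do.": 7,
}
print(score(example_responses))  # {'self-oriented': 6, 'socially prescribed': 6}
```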
17More

Facebook will now ask users to rank news organizations they trust - The Washington Post - 0 views

  • Zuckerberg wrote Facebook is not “comfortable” deciding which news sources are the most trustworthy in a “world with so much division."
  • "We decided that having the community determine which sources are broadly trusted would be most objective," he wrote.
  • The new trust rankings will emerge from surveys the company is conducting. "Broadly trusted" outlets that are affirmed by a significant cross-section of users may see a boost in readership, while less known organizations or start-ups receiving poor ratings could see their web traffic decline
  • ...14 more annotations...
  • The company's changes include an effort to boost the content of local news outlets, which have suffered sizable subscription and readership declines
  • The changes follow another major News Feed redesign, announced last week, in which Facebook said users would begin to see less content from news organizations and brands in favor of "meaningful" posts from friends and family.
  • Currently, 5 percent of Facebook posts are generated by news organizations; that number is expected to drop to 4 percent after the redesign, Zuckerberg said.
  • On Friday, Google announced it would cancel a two-month-old experiment, called Knowledge Panel, that informed its users that a news article had been disputed by independent fact-checking organizations. Conservatives had complained the feature unfairly targeted a right-leaning outlet.
  • More than two-thirds of Americans now get some of their news from social media, according to Pew Research Center.
  • That shift has empowered Facebook and Google, putting them in an uncomfortable position of deciding what news they should distribute to their global audiences. But it also has led to questions about whether these corporations should be considered media companies.
  • "Just by putting things out to a vote in terms of what the community would find trustworthy undermines the role for any serious institutionalized process to determine what’s quality and what’s not,” he said.
  • further criticism that the social network had become vulnerable to bad actors seeking to spread disinformation.
  • Jay Rosen, a journalism professor at New York University, said that Facebook learned the wrong lesson from Trending Topics, which was to try to avoid politics at all costs
  • “One of the things that can happen if you are determined to avoid politics at all costs is you are driven to illusory solutions,” he said. “I don’t think there is any alternative to using your judgement. But Facebook is convinced that there is. This idea that they can avoid judgement is part of their problem.”
  • Facebook revealed few details about how it is conducting its trust surveys (one hypothetical way such survey responses could be turned into a score is sketched after this list).
  • "The hard question we've struggled with is how to decide what news sources are broadly trusted," Zuckerberg wrote. "We could try to make that decision ourselves, but that's not something we're comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask you -- the community -- and have your feedback determine the ranking."
  • Some experts wondered whether Facebook's latest effort could be gamed.
  • "This seems like a positive step toward improving the news environment on Facebook," Diresta said. "That said, the potential downside is that the survey approach unfairly penalizes emerging publications."
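Facebook did not disclose how survey answers become a "broadly trusted" ranking, so the following is purely a hypothetical sketch of one way such a score could be computed: a source counts as broadly trusted only if it is trusted across different respondent groups, not just heavily within one. It is not Facebook's method, and the group labels and data are invented.

```python
# Hypothetical sketch only: NOT Facebook's undisclosed method.
# A source scores well only if even its most skeptical respondent group trusts it.
from collections import defaultdict

def broad_trust_scores(responses):
    """responses: iterable of (respondent_group, outlet, trusts_it: bool)."""
    by_outlet = defaultdict(lambda: defaultdict(list))
    for group, outlet, trusts in responses:
        by_outlet[outlet][group].append(trusts)
    return {
        outlet: min(sum(votes) / len(votes) for votes in groups.values())
        for outlet, groups in by_outlet.items()
    }

survey = [  # invented data
    ("liberal", "outlet_a", True), ("conservative", "outlet_a", True),
    ("liberal", "outlet_b", True), ("conservative", "outlet_b", False),
]
print(broad_trust_scores(survey))  # {'outlet_a': 1.0, 'outlet_b': 0.0}
```

The criticisms quoted above apply whatever the aggregation rule: a rule like this one, for example, penalizes any emerging outlet that one group simply does not recognize.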
3More

Jared Diamond: We Could Be Living in a New Stone Age by 2114 - Mother Jones - 0 views

  • Either by the year 2050 we’ve succeeded in developing a sustainable economy, in which case we can then ask your question about 100 years from now, because there will be 100 years from now; or by 2050 we’ve failed to develop a sustainable economy, which means that there will no longer be first world living conditions, and there either won’t be humans 100 years from now, or those humans 100 years from now will have lifestyles similar to those of Cro-Magnons 40,000 years ago, because we’ve already stripped away the surface copper and the surface iron. If we knock ourselves out of the first world, we’re not going to be able to rebuild a first world.
  • It all depends, he says, on where we are at 2050:
  • Not everybody agrees with Diamond that we’re in such a perilous state, of course. But there is perhaps no more celebrated chronicler of why civilizations rise, and why they fall. That is, after all, why we read him. So when Diamond says we’ve got maybe 50 years to turn it around, we should at least consider the possibility that he might actually be right. For if he is, the consequences are so intolerable that anything possible should be done to avert them.
6More

Greta Thunberg tells U.N. climate summit to take action on climate change - The Washing... - 0 views

  • “You have stolen my dreams and my childhood with your empty words,”
  • “For more than 30 years, the science has been crystal clear,” Thunberg added. “How dare you continue to look away and come here and say you’re doing enough when the politics and solutions needed are still nowhere in sight?
  • We are in the beginning of a mass extinction, and all you can talk about is money and fairy tales of eternal economic growth. How dare you?”
  • ...3 more annotations...
  • “People are suffering. People are dying. Entire ecosystems are collapsing.
  • You say you hear us and that you understand the urgency, but no matter how sad and angry I am, I do not want to believe that, because if you really understood the situation and still kept on failing to act, then you would be evil, and that I refuse to believe.”
  • “You’re failing us, but the young people are starting to understand your betrayal. The eyes of all future generations are upon you. And if you choose to fail us, I say, we will never forgive you.”
11More

How Does Expectation Affect Perception - 3 views

  • One important fact is that the brain works in some ways like television transmission, in that it processes stable backgrounds without much attention and moving parts more intensely and differently.
  • Recent research in babies shows that they respond most to unexpected events and use these to evaluate the environment and learn.
  • But the overarching analysis of visual signals depends on what is expected.
  • ...7 more annotations...
  • A picture of a bright light causes the eye’s pupils to react as if it were a real light.
  • Good hitters in baseball view the ball as larger.
  • Large people judge the absolute measurement of a doorway as narrower than others do.
  • Words and thoughts alter sensory information:
  • “She kicked the ball” or “grasped the subject” stimulates the leg or arm brain regions related to kicking or grasping.
  • Experienced observers of ballet or classical Indian dance who have never danced themselves show stimulation in the specific muscles of the dance when they watch it.
  • The brain has many interacting pathways and loops that create expectations with different probabilities from our previous experiences.
  •  
    I found this article very interesting because it explains some of the ways our expectations can influence our perception. The article also mentions language: different vocabulary can alter our perception. I think this relates to the definition of words we talked about recently. The article suggests that the definition of a word is partly the result of our expectations, since we often define things in our own favor when no clear definition is stated. The relationship can also be reversed: we use definitions to describe and organize our expectations. --Sissi (11/16/2016)
17More

How to avoid covid-19 hoax stories? - The Washington Post - 1 views

  • How good are people at sifting out fake news?
  • we’ve been investigating whether ordinary individuals who encounter news when it first appears online — before fact-checkers like Snopes and PolitiFacts have an opportunity to issue reports about an article’s veracity — are able to identify whether articles contain true or false information.
  • Unfortunately, it seems quite difficult for people to identify false or misleading news, and the limited number of coronavirus news stories in our collection are no exception
  • ...14 more annotations...
  • Over a 13-week period, our study allowed us to capture people’s assessments of fresh news articles in real time. Each day of the study, we relied on a fixed, pre-registered process to select five popular articles published within the previous 24 hours
  • The five articles were balanced between conservative, liberal and non-partisan sources, as well as between mainstream news websites and websites known to produce fake news. In total, we sent 150 articles, each to 90 survey respondents.
  • We also sent these articles separately to six independent fact checkers, and treated their most common response — true, false/misleading, or cannot determine — for each article as the “correct’’ answer for that article.
  • When shown an article that was rated “true” by the professional fact checkers, respondents correctly identified the article as true 62 percent of the time. When the source of the true news story was a mainstream news source, respondents correctly identified the article as true 73 percent of the time.
  • However, for each article the professional fact checkers rated “false/misleading,” the study participants were as likely to say it was true as they were to say it was false or misleading. And roughly one-third of the time they told us they were unable to determine the veracity of the article. In other words, people on the whole were unable to correctly classify false or misleading news (a sketch of this kind of accuracy tally follows the list).
  • four of the articles in our study that fact checkers rated as false or misleading were related to the coronavirus.
  • All four articles promoted the unfounded rumor that the virus was intentionally developed in a laboratory. Although accidental releases of pathogens from labs have previously caused significant morbidity and mortality, in the current pandemic multiple pieces of evidence suggest this virus is of natural origin. There’s little evidence that the virus was manufactured or altered.
  • Only 30 percent of participants correctly classified them as false or misleading.
  • respondents seemed to have more trouble deciding what to think about false covid-19 stories, leading to a higher proportion of “could not determine” responses than we saw for the stories on other topics our professional fact checkers rated as “false/misleading.” This finding suggests that it may be particularly difficult to identify misinformation in newly emerging topics
  • Study participants with higher levels of education did better on identifying both fake news overall and coronavirus-related fake news — but were far from being able to correctly weed out misinformation all of the time
  • In fact, no group, regardless of education level, was able to correctly identify the stories that the professional fact checkers had labeled as false or misleading more than 40 percent of the time.
  • Taken together, our findings suggest that there is widespread potential for vulnerability to misinformation when it first appears online. This is especially worrying during the current pandemic
  • In the current environment, misinformation has the potential to undermine social distancing efforts, to lead people to hoard supplies, or to promote the adoption of potentially dangerous fake cures.
  • our findings suggest that non-trivial numbers of people will believe false information to be true when they first encounter it. And it suggests that efforts to remove coronavirus-related misinformation will need to be swift — and implemented early in an article’s life-cycle — to stop the spread of something else that’s dangerous: misinformation.
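The accuracy figures in this study come from comparing each respondent's rating with the fact checkers' majority label. A minimal sketch of that tally follows; the response data are invented and only the scoring logic is illustrated.

```python
# Minimal sketch of the accuracy tally described above; the data are invented.
from collections import Counter

def consensus(fact_checker_labels):
    """Most common label among the fact checkers ('true', 'false/misleading', 'cannot determine')."""
    return Counter(fact_checker_labels).most_common(1)[0][0]

def share_matching(respondent_labels, truth):
    """Fraction of respondents whose rating matches the fact-checker consensus."""
    return sum(label == truth for label in respondent_labels) / len(respondent_labels)

fact_checkers = ["false/misleading"] * 5 + ["cannot determine"]
respondents = ["true"] * 30 + ["false/misleading"] * 30 + ["cannot determine"] * 30

truth = consensus(fact_checkers)
print(truth, round(share_matching(respondents, truth), 2))  # false/misleading 0.33
```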
11More

How to Defeat Those Who Are Waging War on Science - Scientific American Blog Network - 0 views

  • new language of this war—a subtle, yet potentially damaging form of science skepticism
  • The systematic use of so-called “uncertainty” surrounding well-established scientific ideas has proven to be a reliable method for manipulating public perception and stalling political action.
  • Make no mistake: the War on Science is going to affect you, whether you are a scientist or not. It is going to affect everything—ranging from the safety of the food we eat, the water we drink, the air we breathe, and the kind of planet we live on.
  • ...8 more annotations...
  • The reality is that science touches everything we do, and everyone we love
  • Do we want to be the America that embraces science and the pursuit of knowledge to advance our health, safety, prosperity, and security, making America the leader of the civilized world? Or do we want America to mimic failed regimes of the past, where knowledge and science were deliberately suppressed to benefit a few, to funnel more profits into dying industries, and placate the prejudices of a mob?
  • Traditionally, scientists have been coached to steer clear of the political fray. But if the past few weeks have taught us anything, it’s that now is the time for a quantum leap of political relevance.
  • You cannot isolate science from politics, or politics from science
  • That is precisely why scientists shouldn’t shy away from engaging in political conversations. Now more than ever, it is necessary to be participating in them
  • At the very least, we all share a deeply-held fascination with our natural world. The search for meaning, the understanding of something bigger than ourselves, is of universal significance.
  • In today’s world, facts alone are not enough to win debates, let alone people’s hearts and minds. Research shows that increasing scientific knowledge can often deepen the divide between people on polarizing issues. “Individuals subconsciously resist factual information that threatens their defining values,” a recent study points out
  • America has a choice to make. A choice between advancing civilization or bringing it down. A choice between knowledge and chaos. Now, everyone must choose which side they are on.
8More

Manterruption is a Thing, and Now There is an App to Detect it in Daily Conversation | ... - 0 views

  • Introducing our word of the day – “manterruption”. It’s a pretty self-explanatory term, describing a behavior when men interrupt women unnecessarily, which leads to a pretty serious imbalance in the amount of female vs. male contributions in a conversation. (A toy counting sketch follows this item.)
  • A 2004 study on gender issues at Harvard Law School found that men were 50% more likely than women to volunteer at least one comment during class and 144% more likely to volunteer three or more comments. 
  • which as a consequence leaves decision-making mostly to men.
  • ...4 more annotations...
  • Meaning, women’s voices bring a different and valuable perspective in a conversation and should be heard more.
  • Here's the thing, though: while fighting for the cause of hearing the female perspective equally in all matters of business, government, and life is definitely worthwhile, blaming it all on interrupting men doesn’t seem fair. Because it is not just men who interrupt women, women do it too. As a matter of fact, a study done in a tech company showed that 87% of the time that women interrupt, they are interrupting other women.
  • There are also other dynamics at play, for example, seniority. It is still more likely that men will hold a more senior position in a professional environment and, generally, people with a higher rank tend to interrupt more and be interrupted less.
  • Hearing the voices and perspectives of both genders equally is incredibly important, but we should make sure we are addressing the right root causes and are not antagonizing those who need to be on the same side for progress to be made. 
  •  
    I think this app is very interesting. There is obvious gender inequality in society, in that men are often more accustomed to taking the lead than women. Counting how many times a woman is interrupted by a man is an interesting way to show how society is still dominated by men. I also really like that the author discusses other possible factors behind why women are more likely to be interrupted; arguing only one side wouldn't make a strong argument. Gender inequality is a big and heavy label, and we should give it more thought before applying it to any phenomenon. --Sissi (3/14/2017)
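The article does not explain how the app actually detects interruptions, so the sketch below is a hypothetical stand-in: given a diarized transcript with speaker genders, it counts a turn as an interruption whenever it starts before the previous speaker has finished. The speaker names, genders and overlap rule are all assumptions made for illustration.

```python
# Hypothetical interruption tally from a diarized transcript.
# The "starts before the previous turn ends" rule is an assumption, not the app's method.
from collections import Counter

def count_interruptions(turns, genders):
    """turns: chronological list of (speaker, start_sec, end_sec)."""
    tally = Counter()
    for (prev_speaker, _, prev_end), (cur_speaker, cur_start, _) in zip(turns, turns[1:]):
        if cur_speaker != prev_speaker and cur_start < prev_end:
            tally[(genders[cur_speaker], genders[prev_speaker])] += 1
    return tally  # keys are (interrupter_gender, interrupted_gender)

turns = [("alice", 0.0, 5.0), ("bob", 4.2, 9.0), ("alice", 8.5, 12.0), ("bob", 12.5, 15.0)]
genders = {"alice": "f", "bob": "m"}
print(count_interruptions(turns, genders))  # Counter({('m', 'f'): 1, ('f', 'm'): 1})
```

A tally like this cannot, by itself, separate gender effects from the seniority effects the article also mentions.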
10More

Do You and Your Partner Fight Too Much, or Not Enough? Turns Out There's a "Magic Ratio... - 0 views

  • Everyone knows couples break up when they fight too much. But what if they don't fight enough?
  • the “magic ratio” of positive and negative interactions in successful relationships is about 5 to 1.
  • So, too much fighting leads to breakups. That’s obvious. But what’s interesting about the theory is it implies that one sign of a doomed relationship could be not enough negativity.
  • ...6 more annotations...
  • The idea is that because people and environments are always changing, partners must provide one another with enough corrective feedback so they can be “on the same page.” 
  • Gottman and his colleagues found that couples who remained stoic during conflicts actually tended to fare worse than couples that were more “volatile".
  • These couples exert a healthy amount of influence on one another, both positively and negatively. But as long as their interactions favor the positive, they tend to enjoy relatively stable relationships over the long term.
  • The 5:1 ratio also seems to ring true in the business world.
  • The results showed that the most successful teams made an average of 5.6 positive comments per every negative one, while the average ratio among the lowest performing teams was just 0.36 to 1.
  • Negative feedback can prevent you from driving off a cliff.
  •  
    I find it very interesting that something negative can sometimes lead to a positive result. In TV series or books, we often see a scene where two people are arguing and a third person says, "Wow, you guys have such a good relationship!" and they reply "No" together. Now there is research on this, and from the logic of evolution we can see that a human community needs correction and advice from others to adjust itself. I think arguing may sometimes shorten the distance between two people, since they both show each other their worst sides and there isn't much left hidden between them. --Sissi (4/26/2017)
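For what the "magic ratio" amounts to in practice, here is a toy tally that assumes interactions have already been coded as positive or negative; the coded data are invented.

```python
# Toy tally of the positive-to-negative interaction ratio; the codes are invented.
def interaction_ratio(codes):
    """codes: list of '+' (positive) or '-' (negative) interaction labels."""
    positives, negatives = codes.count("+"), codes.count("-")
    return positives / negatives if negatives else float("inf")

observed = ["+", "+", "-", "+", "+", "+", "-", "+", "+", "+", "+"]
print(f"{interaction_ratio(observed):.1f} : 1")  # 4.5 : 1, just under the ~5 : 1 benchmark
```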