Book Review: The Moral Lives of Animals - WSJ.com

  • There is often less to such accounts than meets the eye. What appear on the surface to be instances of insight, reflection, empathy or higher purpose frequently turn out to be fairly simple learned behaviors, of a kind that every sentient species from humans to earthworms exhibits all the time.
  • The deeper problem, as Mr. Peterson more frankly acknowledges, is that it is the height of anthropomorphic absurdity to project human values and behaviors onto other species—and then to judge them by their similarity to us
  • Recognizing the difficulty of boosting animals, his approach is instead to deflate humans: in particular, to suggest that there is much less to even so vaunted a human trait as morality than we like to believe. Rather than a sophisticated system of language-based laws, philosophical arguments and abstract values that sets mankind apart, morality is, in his view, a set of largely primitive psychological instincts.
  • And Mr. Peterson simply ignores several decades worth of recent studies in cognitive science by researchers such as David Povinelli, Bruce Hood, Michael Tomasello and Elisabetta Visalberghi, which have elucidated very real differences between human and nonhuman minds in the realm of conceptual reasoning, particularly with respect to what has been termed "theory of mind." This is the uniquely human ability to have thoughts about thoughts and to perceive that other minds exist and that they can hold ideas and beliefs different from one's own. While human and animal minds share a broadly similar ability to learn from experience, formulate intentions and store memories, careful experiments have repeatedly come up empty when attempting to establish the existence of a theory of mind in nonhumans.
  • This not only detracts from the argument Mr. Peterson seeks to make but reinforces the sense of intellectual parochialism that is the book's chief flaw. Modern evolutionary psychology and cognitive science have done much to illuminate the evolutionary instincts that animate complex human mental processes. Unfortunately, in his determination to level the playing field between human and nonhuman minds, Mr. Peterson has ignored at least half his story.

Narcissus Regards a Book - The Chronicle Review - The Chronicle of Higher Education

  • Common readers—which is to say the great majority of people who continue to read—read for one purpose and one purpose only. They read for pleasure. They read to be entertained. They read to be diverted, assuaged, comforted, and tickled.
  • Reading, where it exists at all, has largely become an unprofitable wing of the diversion industry.
  • it's not only the division of experience between hard labor and empty leisure that now makes reading for something like mortal stakes a very remote possibility.
  • when life is not work, it is play. That's not hard to understand. People are tired, stressed, drained: They want to kick back a little.
  • But entertainment culture suffers no such difficulty. Its rationale is simple, clear, potent: The products of the culture industry are good because they make you feel good.
  • Though the arts interest them, though they read this and they read that—there is one thing that makes them very nervous indeed about what they do. They are not comfortable with judgments of quality. They are not at ease with "the whole evaluation thing."
  • They may sense that Blake's Songs of Innocence and Experience are in some manner more valuable, more worth pondering, more worth preserving than The Simpsons. They may sense as much. But they do not have the terminology to explain why. They never heard the arguments. The professors who should have been providing the arguments when the No More Western Culture marches were going on never made a significant peep.
  • Now the kids who were kids when the Western canon went on trial and received summary justice are working the levers of culture. They are the editors and the reviewers and the arts writers and the ones who interview the novelists and the poets
  • So the arbiters of culture—our former students—went the logical way. They said: If it makes you feel good, it must be good. If Stephen King and John Grisham bring pleasure, why then, let us applaud them.
  • What's not asked in the review and the interview and the profile is whether a King book is worth writing or worth reading. It seems that no one anymore has the wherewithal to say that reading a King novel is a major waste of time.
  • Media no longer seek to shape taste. They do not try to educate the public. And this is so in part because no one seems to know what literary and cultural education would consist of. What does make a book great, anyway? And the media have another reason for not trying to shape taste: It pisses off the readers. They feel insulted, condescended to; they feel dumb.
  • Even the most august publications and broadcasts no longer attempt to shape taste. They merely seek to reflect it. They hold the cultural mirror up to the reader—what the reader likes, the writer and the editor like. They hold the mirror up and the reader—what else can he do?—falls in love. The common reader today is someone who has fallen in love, with himself.
  • Reading in pursuit of influence—that, I think, is the desired thing. It takes a strange mixture of humility and confidence to do as much.
  • The desire to be influenced is always bound up with some measure of self-dislike, or at least with a dose of discontent. While the culture tells us to love ourselves as we are—or as we will be after we've acquired the proper products and services—the true common reader does not find himself adequate at all.

How To Repel Tourism « The Dish

  • In short: Demanding a visa from a country’s travelers in advance is associated with a 70 percent lower level of tourist entries than from a similar country where there is no visa requirement. The U.S. requires an advance visa from citizens of 81 percent of the world’s countries; if it waived that requirement, the researchers estimate, inbound tourism arrivals would more than double, and tourism expenditure would climb by $123 billion.
  • what it is like to enter the US as a non-citizen. It’s a grueling, off-putting, frightening, and often brutal process. Compared with entering a European country, it’s like entering a police state. When you add the sheer difficulty of getting a visa, the brusque, rude and contemptuous treatment you routinely get from immigration officials at the border, the sense that all visitors are criminals and potential terrorists unless proven otherwise, the US remains one of the most unpleasant places for anyone in the world to try and get access to.
  • And this, of course, is a function not only of a vast and all-powerful bureaucracy. It’s a function of this country’s paranoia and increasing insularity. It’s a thoroughly democratic decision to keep foreigners out as much as possible. And it’s getting worse and worse.

People Prefer Electric Shocks to Being Alone With Their Thoughts - Matthew Hutson - The...

  • The researchers ran 11 experiments. In most, they asked participants to put away any distractions and entertain themselves with their own thoughts for 6 to 15 minutes
  • Over the first six studies, 58 percent of participants rated the difficulty at or above the midpoint on a scale (“somewhat”), and 42 percent rated their enjoyment below the midpoint
  • Participants rated the task of entertaining themselves with their own thoughts as far less enjoyable and more conducive to mind-wandering than other mellow activities such as reading magazines or doing crossword puzzles.
  • In the most, ahem, shocking study, subjects were wired up and given the chance to shock themselves during the thinking period if they desired. They’d all had a chance to try out the device to see how painful it was. And yet, even among those who said they would pay money not to feel the shock again, a quarter of the women and two thirds of the men gave themselves a zap when left with their own thoughts. (One outlier pressed the button 190 times in the 15 minutes.)
  • Maybe subjects just couldn’t decide where to steer their thoughts? Nope. In several studies, some were offered topics to fantasize about (going on a beautiful hike, etc.), but that tweak had no effect
  • Maybe modern technology is rotting our brains? Nope. Enjoyment was unrelated to age or the use of smart phones or social media. Wilson says if anything, use of technology is more a symptom than a cause
  • Wilson favors the “scanner hypothesis”: Mammals have evolved to monitor their environments for dangers and opportunities, and so focusing completely internally for several minutes is unnatural.
  • Wilson noted that the researchers don’t yet have strong evidence, but, he said, “I’m convinced it’s correct.”

How movies influence perceptions of brain disorders - The Globe and Mail

  • Blockbusters, from the 2002 action thriller The Bourne Identity to last year’s Scarlett Johansson vehicle Lucy, reinforce pervading misconceptions about how the brain works.
  • “Watching movies about neurological disorders, if they’re done well, I think gives people an appreciation for what the characters may go through,” she says, while films that promote stereotypes “can actually be a little bit more hurtful to people who have those disorders.”
  • this 2003 Disney film offers surprisingly solid insight about a neurological disorder.
  • Dory, voiced by comedian Ellen DeGeneres, suffers classic symptoms of anterograde amnesia, which is typically associated with damage to the hippocampus, the area of the brain involved in encoding memories.
  • the portrayal of the condition is spot on. She has difficulty remembering names and retaining new information, but her condition doesn’t affect her sense of identity.
  • Jason Bourne, the amnesiac main character of this action flick, exhibits no trouble with short-term memories, but wakes up after suffering an unspecified injury to the brain with no recollection of who he is.
  • it’s just a perfect example of the neuromyth
  • explaining that the “double conk” myth – the idea that someone can lose their identity after being hit in the head and regain it after a second blow or psychological trigger – is actually a conflation of two ideas.
  • Identity loss is more closely associated with psychogenic amnesia, an extremely rare and controversial diagnosis, whose origins, some experts believe, may be influenced by culture.
  • This sci-fi action film relies on the conceit that humans only use 10 per cent of their brains.
  • Sure, filmmakers may take artistic licence, she says, but the trouble is many people actually believe we only use a portion of our brains.
  • she notes that due to its success, the film may have inadvertently contributed to the stereotype of the autistic savant – the notion that people with autism excel in a specific area, which, in the case of Hoffman’s character, involved dealing in numbers. In reality, Spiers says, this is very rare.

The Dark Knight of the Soul - Tomas Rocha - The Atlantic

  • Her investigation of this phenomenon, called "The Dark Night Project," is an effort to document, analyze, and publicize accounts of the adverse effects of contemplative practices.
  • According to a survey by the National Institutes of Health, 10 percent of respondents—representing more than 20 million adult Americans—tried meditating between 2006 and 2007, a 1.8 percent increase from a similar survey in 2002. At that rate, by 2017, there may be more than 27 million American adults with a recent meditation experience.
  • "We're not being thorough or honest in our study of contemplative practice," says Britton, a critique she extends to the entire field of researchers studying meditation, including herself.
  • this widespread assumption—that meditation exists only for stress reduction and labor productivity, "because that's what Americans value"—narrows the scope of the scientific lens. When the time comes to develop hypotheses around the effects of meditation, the only acceptable—and fundable—research questions are the ones that promise to deliver the answers we want to hear.
  • the oscillations of spiritual life parallel the experience of learning to walk, very similar to the metaphor Saint John of the Cross uses in terms of a mother weaning a child … first you are held up by a parent and it is exhilarating and wonderful, and then they take their hands away and it is terrifying and the child feels abandoned."
  • while meditators can better avoid difficult experiences under the guidance of seasoned teachers, there are cases where such experiences are useful signs of progress in contemplative development. Distinguishing between the two, however, remains a challenge.
  • One of her team's preliminary tasks—a sort of archeological literature review—was to pore through the written canons of Theravadin, Tibetan, and Zen Buddhism, as well as texts within Christianity, Judaism, and Sufism. "Not every text makes clear reference to a period of difficulty on the contemplative path," Britton says, "but many did."
  • "Does it promote good relationships? Does it reduce cortisol? Does it help me work harder?" asks Britton, referencing these more lucrative questions. Because studies have shown that meditation does satisfy such interests, the results, she says, are vigorously reported to the public. "But," she cautions, "what about when meditation plays a role in creating an experience that then leads to a breakup, a psychotic break, or an inability to focus at work?"
  • Given the juggernaut—economic and otherwise—behind the mindfulness movement, there is a lot at stake in exploring a shadow side of meditation. Upton Sinclair once observed how difficult it is to get a man to understand something when his salary depends on his not understanding it.
  • Among the nearly 40 dark night subjects her team has formally interviewed over the past few years, she says most were "fairly out of commission, fairly impaired for between six months [and] more than 20 years."
  • The Dark Night Project is young, and still very much in progress. Researchers in the field are just beginning to carefully collect and sort through the narratives of difficult meditation-related experiences. Britton has presented her findings at major Buddhist and scientific conferences, prominent retreat centers, and even to the Dalai Lama at the 24th Mind and Life Dialogue in 2012.
  • "There are parts of me that just want meditation to be all good. I find myself in denial sometimes, where I just want to forget all that I've learned and go back to being happy about mindfulness and promoting it, but then I get another phone call and meet someone who's in distress, and I see the devastation in their eyes, and I can't deny that this is happening. As much as I want to investigate and promote contemplative practices and contribute to the well-being of humanity through that, I feel a deeper commitment to what's actually true."
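The "more than 27 million by 2017" projection in the NIH-survey annotation above follows from a straight-line extrapolation of the quoted figures. A minimal sketch, assuming a constant adult population inferred from the excerpt's own numbers (10 percent corresponding to roughly 20 million adults); the ~200 million base is an inference, not a figure the article states:

```python
# Reproducing the "more than 27 million by 2017" projection via
# straight-line extrapolation of the survey figures quoted above.
# The ~200 million adult base is inferred from 10% ~= 20 million.

adults = 20_000_000 / 0.10        # implied U.S. adult population
rate_2007 = 0.10                  # share who meditated, 2006-07 survey
step = 0.018                      # percentage-point rise per 5 years (2002 -> 2007)

rate_2017 = rate_2007 + 2 * step  # two more 5-year steps: 2012, 2017
meditators_2017 = adults * rate_2017
print(f"{meditators_2017 / 1e6:.1f} million")  # -> 27.2 million
```

At that growth rate the 2017 share is 13.6 percent, which against the implied 200 million adults lands just above the article's 27 million.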

Why Teenagers Act Crazy - NYTimes.com

  • there is a darker side to adolescence that, until now, was poorly understood: a surge during teenage years in anxiety and fearfulness. Largely because of a quirk of brain development, adolescents, on average, experience more anxiety and fear and have a harder time learning how not to be afraid than either children or adults.
  • the brain circuit for processing fear — the amygdala — is precocious and develops way ahead of the prefrontal cortex, the seat of reasoning and executive control. This means that adolescents have a brain that is wired with an enhanced capacity for fear and anxiety, but is relatively underdeveloped when it comes to calm reasoning.
  • the brain’s reward center, just like its fear circuit, matures earlier than the prefrontal cortex. That reward center drives much of teenagers’ risky behavior. This behavioral paradox also helps explain why adolescents are particularly prone to injury and trauma. The top three killers of teenagers are accidents, homicide and suicide.
  • The brain-development lag has huge implications for how we think about anxiety and how we treat it. It suggests that anxious adolescents may not be very responsive to psychotherapy that attempts to teach them to be unafraid, like cognitive behavior therapy
  • should also make us think twice — and then some — about the ever rising use of stimulants in young people, because these drugs may worsen anxiety and make it harder for teenagers to do what they are developmentally supposed to do: learn to be unafraid when it is appropriate
  • up to 20 percent of adolescents in the United States experience a diagnosable anxiety disorder, like generalized anxiety or panic attacks, probably resulting from a mix of genetic factors and environmental influences.
  • This isn’t to say that cognitive therapy is ineffective for teenagers, but that because of their relative difficulty in learning to be unafraid, it may not be the most effective treatment when used on its own.
  • Fear learning lies at the heart of anxiety and anxiety disorders. This primitive form of learning allows us to form associations between events and specific cues and environments that may predict danger.
  • once previously threatening cues or situations become safe, we have to be able to re-evaluate them and suppress our learned fear associations. People with anxiety disorders have trouble doing this and experience persistent fear in the absence of threat — better known as anxiety.
  • Dr. Casey discovered that adolescents had a much harder time “unlearning” the link between the colored square and the noise than children or adults did.
  • adolescents had trouble learning that a cue that was previously linked to something aversive was now neutral and “safe.” If you consider that adolescence is a time of exploration when young people develop greater autonomy, an enhanced capacity for fear and a more tenacious memory for threatening situations are adaptive and would confer survival advantage. In fact, the developmental gap between the amygdala and the prefrontal cortex that is described in humans has been found across mammalian species, suggesting that this is an evolutionary advantage.
  • As a psychiatrist, I’ve treated many adults with various anxiety disorders, nearly all of whom trace the origin of the problem to their teenage years. They typically report an uneventful childhood rudely interrupted by adolescent anxiety. For many, the anxiety was inexplicable and came out of nowhere.
  • prescription sales for stimulants increased more than fivefold between 2002 and 2012. This is of potential concern because it is well known from both human and animal studies that stimulants enhance learning and, in particular, fear conditioning.

On David Frum, The New York Times, and the Non-Faked 'Fake' Gaza Photos - The Atlantic

  • Erik Wemple argues in a very tough critique of Frum's claims for the Washington Post that imbalanced, one-sided skepticism was the main problem with Frum's apology. He was willing to believe the worst about the motives and standards of the nation's leading news organization, while accepting at face value some Pallywood-style fantasies about all-fronts fakery.
  • For all their blind spots and flaws, reporters on the scene are trying to see, so they can tell, and the photographic and video reporters take greater risks than all the rest, since they must be closer to the action. For people on the other side of the world to casually assert that they're just making things up—this could and would drive them crazy. I'm sure that fakery has occurred. But the claim that it has is as serious as they come in journalism. It goes at our ultimate source of self-respect. As when saying that a doctor is deliberately mis-diagnosing patients, that a pilot is drunk in the cockpit, that a lifeguard is purposely letting people drown, you might be right, but you had better be very, very sure before making the claim.

Learning How Little We Know About the Brain - NYTimes.com

  • So many large and small questions remain unanswered. How is information encoded and transferred from cell to cell or from network to network of cells?
  • Science found a genetic code but there is no brain-wide neural code; no electrical or chemical alphabet exists that can be recombined to say “red” or “fear” or “wink” or “run.” And no one knows whether information is encoded differently in various parts of the brain.
  • A decade ago, he moved from Brandeis to Columbia, which now has one of the biggest groups of theoretical neuroscientists in the world, he says, and which has a new university-wide focus on integrating brain science with other disciplines.
  • Single neurons, he said, are fairly well understood, as are small circuits of neurons. The question now on his mind, and that of many neuroscientists, is how larger groups, thousands of neurons, work together — whether to produce an action, like reaching for a cup, or to perceive something, like a flower.
  • “We’ve looked at the nervous system from the two ends in,” Dr. Abbott said, meaning sensations that flow into the brain and actions that are initiated there. “Somewhere in the middle is really intelligence, right? That’s where the action is.”
  • the goal is to discover the physiological mechanism in the data.
  • For example, he asks why does one pattern of neurons firing “make you jump off the couch and run out the door and others make you just sit there and do nothing?” It could be, Dr. Abbott says, that simultaneous firing of all the neurons causes you to take action. Or it could be that it is the number of neurons firing that prompts an action.
  • a “pioneer of computational neuroscience.” Dr. Abbott brought the mathematical skills of a physicist to the field, but he is able to plunge right into the difficulties of dealing with actual brain experiments
  • In the brain, somehow, stored memories and desires like hunger or thirst are added to information about the world, and actions are the result. This is the case for all sorts of animals, not just humans. It is thinking, at the most basic level.

How to Be French - NYTimes.com

  • I’m pursuing French citizenship. The whole procedure can take years. Amid repeated requests for new documents, some would-be French people just give up.
  • This may be by design. “The difficulty of the ordeal seems a means of testing the authenticity of his/her commitment to the project of becoming French,” the sociologists Didier Fassin and Sarah Mazouz concluded in their 2009 paper “What Is It to Become French?” Officials can reject an applicant because he hasn’t adopted French values, or merely because his request isn’t “opportune.”
  • There’s a long tradition of Frenchification here. Napoleon Bonaparte was born Napoleone di Buonaparte and spoke French with a thick Corsican accent. He and others spent the 19th century transforming France from a nation with a patchwork of regional languages and dialects to one where practically everyone spoke proper French.
  • Schools were their main instrument. French schools follow a national curriculum that includes arduous surveys of French philosophy and literature. Frenchmen then spend the rest of their lives quoting Proust to one another, with hardly anyone else catching the references.
  • Even the rituals of friendship are different here. The Canadian writer Jean-Benoît Nadeau, who just spent a year in Paris, says there are clues that a French person wants to befriend you: She tells you about her family; she uses self-deprecating humor; and she admits that she likes her job. There’s also the fact that she speaks to you at all. Unlike North Americans, “the French have no compunction about not talking to you.”
  • Apparently, being a Parisian woman has its own requirements. The new book “How to Be Parisian Wherever You Are” says Parisiennes are “imperfect, vague, unreliable and full of paradoxes” and have “that typically French enthusiasm for transforming life into fiction.” I need to cultivate an “air of fragility,” too.
  • Apparently nobody expects me to achieve a state of inner Frenchness. At a naturalization ceremony that the two sociologists observed, an official told new citizens that they were granted French nationality because they had assimilated “not to the point where you entirely resemble native French people, yet enough so that you feel at ease among us.”

Journalists debunk vaccine science denial

  • extra difficulties imposed irrationally by antiscience.
  • Large outbreaks in the U.S. of the highly infectious disease have become more common in the past two years, even though measles hasn’t been indigenous since 2000, according to the Centers for Disease Control and Prevention.
  • difficult because concerns about a possible link between vaccines and autism—now debunked by science—have expanded to more general, and equally groundless, worries about the effects of multiple shots on a child’s immune system, vaccine experts and doctors say.
  • It summarized and condemned the scientific and medical fraud that the British researcher Andrew Wakefield perpetrated. Years earlier, he had falsely linked the measles, mumps, and rubella (MMR) vaccine to autism. The editorial lamented that “the damage to public health continues, fuelled by unbalanced media reporting and an ineffective response from government, researchers, journals, and the medical profession.”
  • Reporters also seek to ensure that viewers, listeners, or readers understand that measles can afflict a victim more powerfully than does a mere passing ailment.
  • Measles doesn’t spread in most U.S. communities because people are protected by “herd immunity,” meaning that 92% to 94% of the population is vaccinated or immune. That level of protection makes it hard for one case of measles to spread even from one unvaccinated person to another without direct contact.
  • a study that “found that only 51 percent of Americans were confident that vaccines are safe and effective, which is similar to the proportion who believe that houses can be haunted by ghosts.”
  • In some parts of California, resistance to vaccinations including the MMR shot is stronger than ever, despite cases of measles hitting five US states.
  • “Vaccines are a great idea, but they are poisoning us, adding things that kick in later in life so they can sell us more drugs.”
  • Health professionals say those claims are unfounded or vastly overstated.
  • “the anti-vaccination movement is fueled by an over-privileged group of rich people grouped together who swear they won’t put any chemicals in their kids (food or vaccines or whatever else), either because it’s trendy to be all-natural or they don’t understand or accept the science of vaccinations. Their science denying has been propelled further by celebrities
  • the outbreak “should worry and enrage the public.” It indicted the anti-vaxxers’ “ignorant and self-absorbed rejection of science” and declared, “Getting vaccinated is good for the health of the inoculated person and also part of one’s public responsibility to help protect the health of others.”
  • “It’s wrong,” the editors emphasized, “to allow public health to be threatened while everyone else waits for these science-denying parents to open their eyes.”
  • “It’s because these people are highly educated and they get on the Internet and read things and think they can figure things out better than their physician.”
  • linked vaccination opposition to the “political left, which has long been suspicious of the lobbying power of the pharmaceutical industry and its influence on government regulators, and also the fringe political right, which has at different times seen vaccination, fluoridisation and other public-health initiatives as attempts by big government to impose tyrannical limits on personal freedom.”
  • Attempts to increase concerns about communicable diseases or correct false claims about vaccines may be especially likely to be counterproductive.
  • “attempting balance by giving vaccine skeptics and pro-vaccine advocates equal weight in news stories leads people to believe the evidence for and against vaccination is equally strong.”
  • A recent edition of the Washington Post carried a letter defending anti-vaxxers as “people who generally are pro-science and highly educated, who have high incomes and who have studied this issue carefully before coming to the conclusion that the risk to their children is greater than the slim possibility of contracting a childhood disease that [in many cases leaves] little or no residual consequences.”
  • anecdotal evidence suggests that some journalists, rather than omitting anti-vaxxers’ views, prefer to expose them and then oppose them.
  • “unwarranted fear . . . an assault on one of the greatest public-health inventions in world history.”
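The 92-to-94-percent herd-immunity figure quoted above is consistent with the standard threshold formula, 1 − 1/R0. A minimal sketch, assuming the commonly cited measles basic reproduction number of roughly 12 to 18; the excerpt states only the resulting percentages, so the R0 range is an assumption here:

```python
# Herd-immunity threshold: the immune fraction above which each case
# infects, on average, fewer than one new person: 1 - 1/R0.
# The measles R0 range of 12-18 is a commonly cited assumption;
# the excerpt states only the 92%-94% result.

def herd_immunity_threshold(r0: float) -> float:
    """Minimum immune fraction needed to block sustained spread."""
    return 1.0 - 1.0 / r0

for r0 in (12, 18):
    print(f"R0 = {r0}: {herd_immunity_threshold(r0):.0%}")
# -> R0 = 12: 92%
# -> R0 = 18: 94%
```

The two ends of the assumed R0 range reproduce the article's 92 and 94 percent coverage levels.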

How Meditation Changes the Brain and Body - The New York Times

  • a study published in Biological Psychiatry brings scientific thoroughness to mindfulness meditation and for the first time shows that, unlike a placebo, it can change the brains of ordinary people and potentially improve their health.
  • One difficulty of investigating meditation has been the placebo problem. In rigorous studies, some participants receive treatment while others get a placebo: They believe they are getting the same treatment when they are not. But people can usually tell if they are meditating. Dr. Creswell, working with scientists from a number of other universities, managed to fake mindfulness.
  • Half the subjects were then taught formal mindfulness meditation at a residential retreat center; the rest completed a kind of sham mindfulness meditation that was focused on relaxation and distracting oneself from worries and stress.
  • Dr. Creswell and his colleagues believe that the changes in the brain contributed to the subsequent reduction in inflammation, although precisely how remains unknown.
  • follow-up brain scans showed differences in only those who underwent mindfulness meditation. There was more activity, or communication, among the portions of their brains that process stress-related reactions and other areas related to focus and calm. Four months later, those who had practiced mindfulness showed much lower levels in their blood of a marker of unhealthy inflammation than the relaxation group, even though few were still meditating.
  • When it comes to how much mindfulness is needed to improve health, Dr. Creswell says, ‘‘we still have no idea about the ideal dose.”

Addicted to Distraction - The New York Times

  • ONE evening early this summer, I opened a book and found myself reading the same paragraph over and over, a half dozen times before concluding that it was hopeless to continue. I simply couldn’t marshal the necessary focus.
  • All my life, reading books has been a deep and consistent source of pleasure, learning and solace. Now the books I regularly purchased were piling up ever higher on my bedside table, staring at me in silent rebuke.
  • Instead of reading them, I was spending too many hours online,
  • “The net is designed to be an interruption system, a machine geared to dividing attention,” Nicholas Carr explains in his book “The Shallows: What the Internet Is Doing to Our Brains.” “We willingly accept the loss of concentration and focus, the division of our attention and the fragmentation of our thoughts, in return for the wealth of compelling or at least diverting information we receive.”
  • Addiction is the relentless pull to a substance or an activity that becomes so compulsive it ultimately interferes with everyday life
  • Denial is any addict’s first defense. No obstacle to recovery is greater than the infinite capacity to rationalize our compulsive behaviors
  • According to one recent survey, the average white-collar worker spends about six hours a day on email.
  • The brain’s craving for novelty, constant stimulation and immediate gratification creates something called a “compulsion loop.” Like lab rats and drug addicts, we need more and more to get the same effect.
  • Endless access to new information also easily overloads our working memory. When we reach cognitive overload, our ability to transfer learning to long-term memory significantly deteriorates.
  • By that definition, nearly everyone I know is addicted in some measure to the Internet. It has arguably replaced work itself as our most socially sanctioned addiction.
  • We humans have a very limited reservoir of will and discipline. We’re far more likely to succeed by trying to change one behavior at a time, ideally at the same time each day, so that it becomes a habit, requiring less and less energy to sustain.
  • Now it was time to detox. I interpreted the traditional second step — belief that a higher power could help restore my sanity — in a more secular way. The higher power became my 30-year-old daughter, who disconnected my phone and laptop from both my email and the Web.
  • During those first few days, I did suffer withdrawal pangs, most of all the hunger to call up Google and search for an answer to some question that arose. But with each passing day offline, I felt more relaxed, less anxious, more able to focus and less hungry for the next shot of instant but short-lived stimulation. What happened to my brain is exactly what I hoped would happen: It began to quiet down.
  • I had brought more than a dozen books of varying difficulty and length on my vacation. I started with short nonfiction, and then moved to longer nonfiction as I began to feel calmer and my focus got stronger. I eventually worked my way up to “The Emperor of All Maladies.”
  • I am back at work now, and of course I am back online. The Internet isn’t going away, and it will continue to consume a lot of my attention. My aim now is to find the best possible balance between time online and time off
  • I also make it my business now to take on more fully absorbing activities as part of my days. Above all, I’ve kept up reading books, not just because I love them, but also as a continuing attention-building practice.
  • I’ve retained my longtime ritual of deciding the night before on the most important thing I can accomplish the next morning. That’s my first work activity most days, for 60 to 90 minutes without interruption. Afterward, I take a 10- to 15-minute break to quiet my mind and renew my energy.
  • If I have other work during the day that requires sustained focus, I go completely offline for designated periods, repeating my morning ritual. In the evening, when I go up to my bedroom, I nearly always leave my digital devices downstairs.

Opinion | Knowledge, Ignorance and Climate Change - The New York Times - 1 views

  • the value of being aware of our ignorance has been a recurring theme in Western thought: René Descartes said it’s necessary to doubt all things to build a solid foundation for science; and Ludwig Wittgenstein, reflecting on the limits of language, said that “the difficulty in philosophy is to say no more than we know.”
  • Sometimes, when it appears that someone is expressing doubt, what he is really doing is recommending a course of action. For example, if I tell you that I don’t know whether there is milk in the fridge, I’m not exhibiting philosophical wisdom — I’m simply recommending that you check the fridge before you go shopping.
  • According to NASA, at least 97 percent of actively publishing climate scientists think that “climate-warming trends over the past century are extremely likely caused by human activities.”
  • As a philosopher, I have nothing to add to the scientific evidence of global warming, but I can tell you how it’s possible to get ourselves to sincerely doubt things, despite abundant evidence to the contrary
  • scenarios suggest that it’s possible to feel as though you don’t know something even when possessing enormous evidence in its favor. Philosophers call scenarios like these “skeptical pressure” cases
  • In general, a skeptical pressure case is a thought experiment in which the protagonist has good evidence for something that he or she believes, but the reader is reminded that the protagonist could have made a mistake
  • If the story is set up in the right way, the reader will be tempted to think that the protagonist’s belief isn’t genuine knowledge
  • When presented with these thought experiments, some philosophy students conclude that what these examples show is that knowledge requires full-blown certainty. In these skeptical pressure cases, the evidence is overwhelming, but not 100 percent. It’s an attractive idea, but it doesn’t sit well with the fact that we ordinarily say we know lots of things with much lower probability.
  • Although there is no consensus about how it arises, a promising idea defended by the philosopher David Lewis is that skeptical pressure cases often involve focusing on the possibility of error. Once we start worrying and ruminating about this possibility, no matter how far-fetched, something in our brains causes us to doubt. The philosopher Jennifer Nagel aptly calls this type of effect “epistemic anxiety.”
  • In my own work, I have speculated that an extreme version of this phenomenon is operative in obsessive compulsive disorder
  • The standard response by climate skeptics is a lot like our reaction to skeptical pressure cases. Climate skeptics understand that 97 percent of scientists disagree with them, but they focus on the very tiny fraction of holdouts. As in the lottery case, this focus might be enough to sustain their skepticism.
  • Anti-vaccine proponents, for example, aware that medical professionals disagree with their position, focus on any bit of fringe research that might say otherwise.
  • Skeptical allure can be gripping. Piling on more evidence does not typically shake you out of it, just as making it even more probable that you will lose the lottery does not all of a sudden make you feel like you know your ticket is a loser.
  • One way to counter the effects of skepticism is to stop talking about “knowledge” and switch to talking about probabilities. Instead of saying that you don’t know some claim, try to estimate the probability that it is true. As hedge fund managers, economists, policy researchers, doctors and bookmakers have long been aware, the way to make decisions while managing risk is through probabilities.
  • Once we switch to this perspective, claims to “not know,” like those made by Trump, lose their force and we are pushed to think more carefully about the existing data and engage in cost-benefit analyses.
  • It’s easy to say you don’t know, but it’s harder to commit to an actual low probability estimate in the face of overwhelming contrary evidence.
  • Socrates was correct that awareness of one’s ignorance is virtuous, but philosophers have subsequently uncovered many pitfalls associated with claims of ignorance. An appreciation of these issues can help elevate public discourse on important topics, including the future of our planet.
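The switch from “knowing” to probability estimates that the article recommends can be made concrete as a small expected-cost calculation. The sketch below is purely illustrative: the cost figures are hypothetical numbers invented for the example, not values from the article; only the 0.97 consensus probability comes from the text.

```python
# Illustrative sketch: deciding by expected cost under uncertainty,
# rather than by a binary "know / don't know" standard.
# The cost numbers below are hypothetical, chosen only for illustration.

def expected_cost(p_true: float, cost_if_true: float, cost_if_false: float) -> float:
    """Expected cost of a choice, given the probability the claim is true."""
    return p_true * cost_if_true + (1 - p_true) * cost_if_false

# Suppose the claim is true with probability 0.97 (the cited consensus figure).
p = 0.97

# Hypothetical costs (arbitrary units): acting costs 10 either way;
# doing nothing costs 100 if the claim is true, 0 if it is false.
act = expected_cost(p, cost_if_true=10, cost_if_false=10)
do_nothing = expected_cost(p, cost_if_true=100, cost_if_false=0)

print(round(act, 2), round(do_nothing, 2))
# Even without claiming "knowledge", the probability estimate settles the
# cost-benefit comparison: acting is far cheaper in expectation.
```

The point of the sketch is the one the article makes: once you commit to a probability, saying “I don’t know” no longer blocks the analysis.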

There Is No Theory of Everything - The New York Times - 3 views

  • the kinds of explanation offered by science and the kinds of humanistic description we find, say, in the novels of Dickens or Dostoevsky, or in the sociological writings of Erving Goffman and David Riesman. His quest was to try and clarify the occasions when a scientific explanation was appropriate and when we need instead a humanistic remark.
  • His conviction was that our confusions about science and the humanities had wide-ranging and malign societal consequences.
  • However efficacious the blue pill might be, in this instance the doctor’s causal diagnosis is the wrong one. What is required is for you to be able to talk, to feel that someone understands your problems and perhaps can offer some insight or even suggestions on how you might move forward in your life. This, one imagines, is why people go into therapy.
  • On a ferry you want a blue pill that is going to alleviate the symptoms of seasickness and make you feel better.
  • Frank’s point is that our society is deeply confused about when a blue pill is required and when it is not, or when we need a causal explanation and when we need a further description, clarification or elucidation. We tend to get muddled and imagine that one kind of explanation (usually the causal one) is appropriate on all occasions, even when it is not.
  • What is in play here is the classical distinction made by Max Weber between explanation and clarification, between causal or causal-sounding hypotheses and interpretation. Weber’s idea is that natural phenomena require causal explanation, of the kind given by physics, say, whereas social phenomena require elucidation — richer, more expressive descriptions.
  • In Frank’s view, one major task of philosophy is to help us get clear on this distinction and to provide the right response at the right time. This, of course, requires judgment, which is no easy thing to teach.
  • Frank’s point, which is still hugely important, is that there is no theory of everything, nor should there be. There is a gap between nature and society. The mistake, for which scientism is the name, is the belief that this gap can or should be filled.
  • One huge problem with scientism is that it invites, as an almost allergic reaction, the total rejection of science. As we know to our cost, we witness this every day with climate change deniers, flat-earthers and religious fundamentalists
  • in order to confront the challenge of obscurantism, we do not simply need to run into the arms of scientism. What is needed is a clearer overview of the occasions when a scientific remark is appropriate and when we need something else, the kind of elucidation we find in stories, poetry or indeed when we watch a movie
  • People often wonder why there appears to be no progress in philosophy, unlike in natural science, and why, after some three millenniums of philosophical activity, no dramatic changes seem to have been made to the questions philosophers ask. The reason is that people keep asking the same questions and remain perplexed by the same difficulties.
  • Wittgenstein puts the point rather directly: “Philosophy hasn’t made any progress? If somebody scratches the spot where he has an itch, do we have to see some progress?”
  • Philosophy scratches at the various itches we have, not in order that we might find some cure for what ails us, but in order to scratch in the right place and begin to understand why we engage in such apparently irritating activity.
  • This is one way of approaching the question of life’s meaning. Human beings have been asking the same kinds of questions for millenniums, and this is not an error. It testifies to the fact that human beings are rightly perplexed by their lives. The mistake is to believe that there is an answer to the question of life’s meaning.
  • The point, then, is not to seek an answer to the meaning of life, but to continue to ask the question.
  • We don’t need an answer to the question of life’s meaning, just as we don’t need a theory of everything. What we need are multifarious descriptions of many things, further descriptions of phenomena that change the aspect under which they are seen, that light them up and let us see them anew

Here's what the government's dietary guidelines should really say - The Washington Post - 0 views

  • If I were writing the dietary guidelines, I would give them a radical overhaul. I’d go so far as to radically overhaul the way we evaluate diet. Here’s why and how.
  • Lately, as scientists try, and fail, to reproduce results, all of science is taking a hard look at funding biases, statistical shenanigans and groupthink. All that criticism, and then some, applies to nutrition.
  • Prominent in the charge to change the way we do science is John Ioannidis, professor of health research and policy at Stanford University. In 2005, he published “Why Most Published Research Findings Are False” in the journal PLOS Medicine.
  • He came down hard on nutrition in a pull-no-punches 2013 British Medical Journal editorial titled, “Implausible results in human nutrition research,” in which he noted, “Almost every single nutrient imaginable has peer reviewed publications associating it with almost any outcome.”
  • Ioannidis told me that sussing out the connection between diet and health — nutritional epidemiology — is enormously challenging, and “the tools that we’re throwing at the problem are not commensurate with the complexity and difficulty of the problem.” The biggest of those tools is observational research, in which we collect data on what people eat, and track what happens to them.
  • He lists plant-based foods — fruit, veg, whole grains, legumes — but acknowledges that we don’t understand enough to prescribe specific combinations or numbers of servings.
  • funding bias isn’t the only kind. “Fanatical opinions abound in nutrition,” Ioannidis wrote in 2013, and those have bias power too.
  • “Definitive solutions won’t come from another million observational papers or small randomized trials,” reads the subtitle of Ioannidis’s paper. His is a burn-down-the-house ethos.
  • When it comes to actual dietary recommendations, the disagreement is stark. “Ioannidis and others say we have no clue, the science is so bad that we don’t know anything,” Hu told me. “I think that’s completely bogus. We know a lot about the basic elements of a healthy diet.”
  • Give tens of thousands of people that FFQ, and you end up with a ginormous repository of possible correlations. You can zero in on a vitamin, macronutrient or food, and go to town. But not only are you starting with flawed data, you’ve got a zillion possible confounding variables — dietary, demographic, socioeconomic. I’ve heard statisticians call it “noise mining,” and Ioannidis is equally skeptical. “With this type of data, you can get any result you want,” he said. “You can align it to your beliefs.”
  • Big differences in what people eat track with other differences. Heavy plant-eaters are different from, say, heavy meat-eaters in all kinds of ways (income, education, physical activity, BMI). Red meat consumption correlates with increased risk of dying in an accident as much as dying from heart disease. The amount of faith we put in observational studies is a judgment call.
  • I find myself in Ioannidis’s camp. What have we learned, unequivocally enough to build a consensus in the nutrition community, about how diet affects health? Well, trans-fats are bad.
  • Over and over, large population studies get sliced and diced, and it’s all but impossible to figure out what’s signal and what’s noise. Researchers try to do that with controlled trials to test the connections, but those have issues too. They’re expensive, so they’re usually small and short-term. People have trouble sticking to the diet being studied. And scientists are generally looking for what they call “surrogate endpoints,” like increased cholesterol rather than death from heart disease, since it’s impractical to keep a trial going until people die.
  • So, what do we do? Hu and Ioannidis actually have similar suggestions. For starters, they both think we should be looking at dietary patterns rather than single foods or nutrients. They also both want to look across the data sets. Ioannidis emphasizes transparency. He wants to open data to the world and analyze all the data sets in the same way to see if “any signals survive.” Hu is more cautious (partly to safeguard confidentiality).
  • I have a suggestion. Let’s give up on evidence-based eating. It’s given us nothing but trouble and strife. Our tools can’t find any but the most obvious links between food and health, and we’ve found those already.
  • Instead, let’s acknowledge the uncertainty and eat to hedge against what we don’t know
  • We’ve got two excellent hedges: variety and foods with nutrients intact (which describes such diets as the Mediterranean, touted by researchers). If you severely limit your foods (vegan, keto), you might miss out on something. Ditto if you eat foods with little nutritional value (sugar, refined grains). Oh, and pay attention to the two things we can say with certainty: Keep your weight down, and exercise.
  • I used to say I could tell you everything important about diet in 60 seconds. Over the years, my spiel got shorter and shorter as truisms fell by the wayside, and my confidence waned in a field where we know less, rather than more, over time. I’m down to five seconds now: Eat a wide variety of foods with their nutrients intact, keep your weight down and get some exercise.
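The “noise mining” worry described above can be demonstrated with a toy simulation: generate a purely random health outcome and hundreds of purely random “food” variables, test every correlation, and a predictable fraction will look statistically significant by chance alone. Everything here is synthetic, a minimal sketch of the multiple-comparisons problem rather than an analysis of any real questionnaire data.

```python
# Toy demonstration of "noise mining": with enough candidate variables,
# pure random data yields apparently significant correlations.
# All data are synthetic; no real dietary variables are involved.
import math
import random

random.seed(0)
n_people = 200
n_foods = 500  # many candidate "exposures", as on a food questionnaire

# A random health "outcome" with no real relationship to anything.
outcome = [random.gauss(0, 1) for _ in range(n_people)]

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Rough two-sided "p < 0.05" threshold for r with n = 200: |r| > 1.96 / sqrt(n)
threshold = 1.96 / math.sqrt(n_people)

hits = 0
for _ in range(n_foods):
    food = [random.gauss(0, 1) for _ in range(n_people)]  # pure noise
    if abs(pearson_r(food, outcome)) > threshold:
        hits += 1

print(hits)  # roughly 5% of 500, i.e. on the order of 25 spurious "findings"
```

With 500 noise variables and a 5 percent significance cutoff, a few dozen spurious “associations” are expected, which is the mechanism behind “you can get any result you want.”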

At the Existentialist Café: Freedom, Being, and Apricot Cocktails with Jean-P... - 0 views

  • The phenomenologists’ leading thinker, Edmund Husserl, provided a rallying cry, ‘To the things themselves!’ It meant: don’t waste time on the interpretations that accrue upon things, and especially don’t waste time wondering whether the things are real. Just look at this that’s presenting itself to you, whatever this may be, and describe it as precisely as possible.
  • You might think you have defined me by some label, but you are wrong, for I am always a work in progress. I create myself constantly through action, and this is so fundamental to my human condition that, for Sartre, it is the human condition, from the moment of first consciousness to the moment when death wipes it out. I am my own freedom: no more, no less.
  • Sartre wrote like a novelist — not surprisingly, since he was one. In his novels, short stories and plays as well as in his philosophical treatises, he wrote about the physical sensations of the world and the structures and moods of human life. Above all, he wrote about one big subject: what it meant to be free. Freedom, for him, lay at the heart of all human experience, and this set humans apart from all other kinds of object.
  • Sartre listened to his problem and said simply, ‘You are free, therefore choose — that is to say, invent.’ No signs are vouchsafed in this world, he said. None of the old authorities can relieve you of the burden of freedom. You can weigh up moral or practical considerations as carefully as you like, but ultimately you must take the plunge and do something, and it’s up to you what that something is.
  • Even if the situation is unbearable — perhaps you are facing execution, or sitting in a Gestapo prison, or about to fall off a cliff — you are still free to decide what to make of it in mind and deed. Starting from where you are now, you choose. And in choosing, you also choose who you will be.
  • The war had made people realise that they and their fellow humans were capable of departing entirely from civilised norms; no wonder the idea of a fixed human nature seemed questionable.
  • If this sounds difficult and unnerving, it’s because it is. Sartre does not deny that the need to keep making decisions brings constant anxiety. He heightens this anxiety by pointing out that what you do really matters. You should make your choices as though you were choosing on behalf of the whole of humanity, taking the entire burden of responsibility for how the human race behaves. If you avoid this responsibility by fooling yourself that you are the victim of circumstance or of someone else’s bad advice, you are failing to meet the demands of human life and choosing a fake existence, cut off from your own ‘authenticity’.
  • Along with the terrifying side of this comes a great promise: Sartre’s existentialism implies that it is possible to be authentic and free, as long as you keep up the effort.
  • almost all agreed that it was, as an article in Les nouvelles littéraires phrased it, a ‘sickening mixture of philosophic pretentiousness, equivocal dreams, physiological technicalities, morbid tastes and hesitant eroticism … an introspective embryo that one would take distinct pleasure in crushing’.
  • he offered a philosophy designed for a species that had just scared the hell out of itself, but that finally felt ready to grow up and take responsibility.
  • In this rebellious world, just as with the Parisian bohemians and Dadaists in earlier generations, everything that was dangerous and provocative was good, and everything that was nice or bourgeois was bad.
  • Such interweaving of ideas and life had a long pedigree, although the existentialists gave it a new twist. Stoic and Epicurean thinkers in the classical world had practised philosophy as a means of living well, rather than of seeking knowledge or wisdom for their own sake. By reflecting on life’s vagaries in philosophical ways, they believed they could become more resilient, more able to rise above circumstances, and better equipped to manage grief, fear, anger, disappointment or anxiety.
  • In the tradition they passed on, philosophy is neither a pure intellectual pursuit nor a collection of cheap self-help tricks, but a discipline for flourishing and living a fully human, responsible life.
  • For Kierkegaard, Descartes had things back to front. In his own view, human existence comes first: it is the starting point for everything we do, not the result of a logical deduction. My existence is active: I live it and choose it, and this precedes any statement I can make about myself.
  • Studying our own moral genealogy cannot help us to escape or transcend ourselves. But it can enable us to see our illusions more clearly and lead a more vital, assertive existence.
  • What was needed, he felt, was not high moral or theological ideals, but a deeply critical form of cultural history or ‘genealogy’ that would uncover the reasons why we humans are as we are, and how we came to be that way. For him, all philosophy could even be redefined as a form of psychology, or history.
  • For those oppressed on grounds of race or class, or for those fighting against colonialism, existentialism offered a change of perspective — literally, as Sartre proposed that all situations be judged according to how they appeared in the eyes of those most oppressed, or those whose suffering was greatest.
  • She observed that we need not expect moral philosophers to ‘live by’ their ideas in a simplistic way, as if they were following a set of rules. But we can expect them to show how their ideas are lived in. We should be able to look in through the windows of a philosophy, as it were, and see how people occupy it, how they move about and how they conduct themselves.
  • the existentialists inhabited their historical and personal world, as they inhabited their ideas. This notion of ‘inhabited philosophy’ is one I’ve borrowed from the English philosopher and novelist Iris Murdoch, who wrote the first full-length book on Sartre and was an early adopter of existentialism
  • What is existentialism anyway?
  • An existentialist who is also phenomenological provides no easy rules for dealing with this condition, but instead concentrates on describing lived experience as it presents itself. — By describing experience well, he or she hopes to understand this existence and awaken us to ways of living more authentic lives.
  • Existentialists concern themselves with individual, concrete human existence. — They consider human existence different from the kind of being other things have. Other entities are what they are, but as a human I am whatever I choose to make of myself at every moment. I am free — — and therefore I’m responsible for everything I do, a dizzying fact which causes — an anxiety inseparable from human existence itself.
  • On the other hand, I am only free within situations, which can include factors in my own biology and psychology as well as physical, historical and social variables of the world into which I have been thrown. — Despite the limitations, I always want more: I am passionately involved in personal projects of all kinds. — Human existence is thus ambiguous: at once boxed in by borders and yet transcendent and exhilarating. —
  • The first part of this is straightforward: a phenomenologist’s job is to describe. This is the activity that Husserl kept reminding his students to do. It meant stripping away distractions, habits, clichés of thought, presumptions and received ideas, in order to return our attention to what he called the ‘things themselves’. We must fix our beady gaze on them and capture them exactly as they appear, rather than as we think they are supposed to be.
  • Husserl therefore says that, to phenomenologically describe a cup of coffee, I should set aside both the abstract suppositions and any intrusive emotional associations. Then I can concentrate on the dark, fragrant, rich phenomenon in front of me now. This ‘setting aside’ or ‘bracketing out’ of speculative add-ons Husserl called epoché — a term borrowed from the ancient Sceptics,
  • The point about rigour is crucial; it brings us back to the first half of the command to describe phenomena. A phenomenologist cannot get away with listening to a piece of music and saying, ‘How lovely!’ He or she must ask: is it plaintive? is it dignified? is it colossal and sublime? The point is to keep coming back to the ‘things themselves’ — phenomena stripped of their conceptual baggage — so as to bail out weak or extraneous material and get to the heart of the experience.
  • Husserlian ‘bracketing out’ or epoché allows the phenomenologist to temporarily ignore the question ‘But is it real?’, in order to ask how a person experiences his or her world. Phenomenology gives a formal mode of access to human experience. It lets philosophers talk about life more or less as non-philosophers do, while still being able to tell themselves they are being methodical and rigorous.
  • Besides claiming to transform the way we think about reality, phenomenologists promised to change how we think about ourselves. They believed that we should not try to find out what the human mind is, as if it were some kind of substance. Instead, we should consider what it does, and how it grasps its experiences.
  • For Brentano, this reaching towards objects is what our minds do all the time. Our thoughts are invariably of or about something, he wrote: in love, something is loved, in hatred, something is hated, in judgement, something is affirmed or denied. Even when I imagine an object that isn’t there, my mental structure is still one of ‘about-ness’ or ‘of-ness’.
  • Except in deepest sleep, my mind is always engaged in this aboutness: it has ‘intentionality’. Having taken the germ of this from Brentano, Husserl made it central to his whole philosophy.
  • Husserl saw in the idea of intentionality a way to sidestep two great unsolved puzzles of philosophical history: the question of what objects ‘really’ are, and the question of what the mind ‘really’ is. By doing the epoché and bracketing out all consideration of reality from both topics, one is freed to concentrate on the relationship in the middle. One can apply one’s descriptive energies to the endless dance of intentionality that takes place in our lives: the whirl of our minds as they seize their intended phenomena one after the other and whisk them around the floor,
  • Understood in this way, the mind hardly is anything at all: it is its aboutness. This makes the human mind (and possibly some animal minds) different from any other naturally occurring entity. Nothing else can be as thoroughly about or of things as the mind is:
  • Some Eastern meditation techniques aim to still this scurrying creature, but the extreme difficulty of this shows how unnatural it is to be mentally inert. Left to itself, the mind reaches out in all directions as long as it is awake — and even carries on doing it in the dreaming phase of its sleep.
  • a mind that is experiencing nothing, imagining nothing, or speculating about nothing can hardly be said to be a mind at all.
  • Three simple ideas — description, phenomenon, intentionality — provided enough inspiration to keep roomfuls of Husserlian assistants busy in Freiburg for decades. With all of human existence awaiting their attention, how could they ever run out of things to do?
  • For Sartre, this gives the mind an immense freedom. If we are nothing but what we think about, then no predefined ‘inner nature’ can hold us back. We are protean.
  • Real, not real; inside, outside; what difference did it make? Reflecting on this, Husserl began turning his phenomenology into a branch of ‘idealism’ — the philosophical tradition which denied external reality and defined everything as a kind of private hallucination.
  • For Sartre, if we try to shut ourselves up inside our own minds, ‘in a nice warm room with the shutters closed’, we cease to exist. We have no cosy home: being out on the dusty road is the very definition of what we are.
  • One might think that, if Heidegger had anything worth saying, he could have communicated it in ordinary language. The fact is that he does not want to be ordinary, and he may not even want to communicate in the usual sense. He wants to make the familiar obscure, and to vex us. George Steiner thought that Heidegger’s purpose was less to be understood than to be experienced through a ‘felt strangeness’.
  • He takes Dasein in its most ordinary moments, then talks about it in the most innovative way he can. For Heidegger, Dasein’s everyday Being is right here: it is Being-in-the-world, or In-der-Welt-sein. The main feature of Dasein’s everyday Being-in-the-world right here is that it is usually busy doing something.
  • Thus, for Heidegger, all Being-in-the-world is also a ‘Being-with’ or Mitsein. We cohabit with others in a ‘with-world’, or Mitwelt. The old philosophical problem of how we prove the existence of other minds has now vanished. Dasein swims in the with-world long before it wonders about other minds.
  • Sometimes the best-educated people were those least inclined to take the Nazis seriously, dismissing them as too absurd to last. Karl Jaspers was one of those who made this mistake, as he later recalled, and Beauvoir observed similar dismissive attitudes among the French students in Berlin.
  • In any case, most of those who disagreed with Hitler’s ideology soon learned to keep their view to themselves. If a Nazi parade passed on the street, they would either slip out of view or give the obligatory salute like everyone else, telling themselves that the gesture meant nothing if they did not believe in it. As the psychologist Bruno Bettelheim later wrote of this period, few people will risk their life for such a small thing as raising an arm — yet that is how one’s powers of resistance are eroded away, and eventually one’s responsibility and integrity go with them.
  • for Arendt, if you do not respond adequately when the times demand it, you show a lack of imagination and attention that is as dangerous as deliberately committing an abuse. It amounts to disobeying the one command she had absorbed from Heidegger in those Marburg days: Think!
  • ‘Everything takes place under a kind of anaesthesia. Objectively dreadful events produce a thin, puny emotional response. Murders are committed like schoolboy pranks. Humiliation and moral decay are accepted like minor incidents.’ Haffner thought modernity itself was partly to blame: people had become yoked to their habits and to mass media, forgetting to stop and think, or to disrupt their routines long enough to question what was going on.
  • Heidegger’s former lover and student Hannah Arendt would argue, in her 1951 study The Origins of Totalitarianism, that totalitarian movements thrived at least partly because of this fragmentation in modern lives, which made people more vulnerable to being swept away by demagogues. Elsewhere, she coined the phrase ‘the banality of evil’ to describe the most extreme failures of personal moral awareness.
  • His communicative ideal fed into a whole theory of history: he traced all civilisation to an ‘Axial Period’ in the fifth century BC, during which philosophy and culture exploded simultaneously in Europe, the Middle East and Asia, as though a great bubble of minds had erupted from the earth’s surface. ‘True philosophy needs communion to come into existence,’ he wrote, and added, ‘Uncommunicativeness in a philosopher is virtually a criterion of the untruth of his thinking.’
  • The idea of being called to authenticity became a major theme in later existentialism, the call being interpreted as saying something like ‘Be yourself!’, as opposed to being phony. For Heidegger, the call is more fundamental than that. It is a call to take up a self that you didn’t know you had: to wake up to your Being. Moreover, it is a call to action. It requires you to do something: to take a decision of some sort.
  • Being and Time contained at least one big idea that should have been of use in resisting totalitarianism. Dasein, Heidegger wrote there, tends to fall under the sway of something called das Man or ‘the they’ — an impersonal entity that robs us of the freedom to think for ourselves. To live authentically requires resisting or outwitting this influence, but this is not easy because das Man is so nebulous. Man in German does not mean ‘man’ as in English (that’s der Mann), but a neutral abstraction, something like ‘one’ in the English phrase ‘one doesn’t do that’,
  • for Heidegger, das Man is me. It is everywhere and nowhere; it is nothing definite, but each of us is it. As with Being, it is so ubiquitous that it is difficult to see. If I am not careful, however, das Man takes over the important decisions that should be my own. It drains away my responsibility or ‘answerability’. As Arendt might put it, we slip into banality, failing to think.
  • Jaspers focused on what he called Grenzsituationen — border situations, or limit situations. These are the moments when one finds oneself constrained or boxed in by what is happening, but at the same time pushed by these events towards the limits or outer edge of normal experience. For example, you might have to make a life-or-death choice, or something might remind you suddenly of your mortality,
  • Jaspers’ interest in border situations probably had much to do with his own early confrontation with mortality. From childhood, he had suffered from a heart condition so severe that he always expected to die at any moment. He also had emphysema, which forced him to speak slowly, taking long pauses to catch his breath. Both illnesses meant that he had to budget his energies with care in order to get his work done without endangering his life.
  • If I am to resist das Man, I must become answerable to the call of my ‘voice of conscience’. This call does not come from God, as a traditional Christian definition of the voice of conscience might suppose. It comes from a truly existentialist source: my own authentic self. Alas, this voice is one I do not recognise and may not hear, because it is not the voice of my habitual ‘they-self’. It is an alien or uncanny version of my usual voice. I am familiar with my they-self, but not with my unalienated voice — so, in a weird twist, my real voice is the one that sounds strangest to me.
  • Marcel developed a strongly theological branch of existentialism. His faith distanced him from both Sartre and Heidegger, but he shared a sense of how history makes demands on individuals. In his essay ‘On the Ontological Mystery’, written in 1932 and published in the fateful year of 1933, Marcel wrote of the human tendency to become stuck in habits, received ideas, and a narrow-minded attachment to possessions and familiar scenes. Instead, he urged his readers to develop a capacity for remaining ‘available’ to situations as they arise. Similar ideas of disponibilité or availability had been explored by other writers,
  • Marcel made it his central existential imperative. He was aware of how rare and difficult it was. Most people fall into what he calls ‘crispation’: a tensed, encrusted shape in life — ‘as though each one of us secreted a kind of shell which gradually hardened and imprisoned him’.
  • Bettelheim later observed that, under Nazism, only a few people realised at once that life could not continue unaltered: these were the ones who got away quickly. Bettelheim himself was not among them. Caught in Austria when Hitler annexed it, he was sent first to Dachau and then to Buchenwald, but was then released in a mass amnesty to celebrate Hitler’s birthday in 1939 — an extraordinary reprieve, after which he left at once for America.
  • we are used to reading philosophy as offering a universal message for all times and places — or at least as aiming to do so. But Heidegger disliked the notion of universal truths or universal humanity, which he considered a fantasy. For him, Dasein is not defined by shared faculties of reason and understanding, as the Enlightenment philosophers thought. Still less is it defined by any kind of transcendent eternal soul, as in religious tradition. We do not exist on a higher, eternal plane at all. Dasein’s Being is local: it has a historical situation, and is constituted in time and place.
  • For Marcel, learning to stay open to reality in this way is the philosopher’s prime job. Everyone can do it, but the philosopher is the one who is called on above all to stay awake, so as to be the first to sound the alarm if something seems wrong.
  • Second, it also means understanding that we are historical beings, and grasping the demands our particular historical situation is making on us. In what Heidegger calls ‘anticipatory resoluteness’, Dasein discovers ‘that its uttermost possibility lies in giving itself up’. At that moment, through Being-towards-death and resoluteness in facing up to one’s time, one is freed from the they-self and attains one’s true, authentic self.
  • If we are temporal beings by our very nature, then authentic existence means accepting, first, that we are finite and mortal. We will die: this all-important realisation is what Heidegger calls authentic ‘Being-towards-Death’, and it is fundamental to his philosophy.
  • Hannah Arendt, instead, left early on: she had the benefit of a powerful warning. Just after the Nazi takeover, in spring 1933, she had been arrested while researching materials on anti-Semitism for the German Zionist Organisation at Berlin’s Prussian State Library. Her apartment was searched; both she and her mother were locked up briefly, then released. They fled, without stopping to arrange travel documents. They crossed to Czechoslovakia (then still safe) by a method that sounds almost too fabulous to be true: a sympathetic German family on the border had a house with its front door in Germany and its back door in Czechoslovakia. The family would invite people for dinner, then let them leave through the back door at night.
  • As Sartre argued in his 1943 review of The Stranger, basic phenomenological principles show that experience comes to us already charged with significance. A piano sonata is a melancholy evocation of longing. If I watch a soccer match, I see it as a soccer match, not as a meaningless scene in which a number of people run around taking turns to apply their lower limbs to a spherical object. If the latter is what I’m seeing, then I am not watching some more essential, truer version of soccer; I am failing to watch it properly as soccer at all.
  • Much as they liked Camus personally, neither Sartre nor Beauvoir accepted his vision of absurdity. For them, life is not absurd, even when viewed on a cosmic scale, and nothing can be gained by saying it is. Life for them is full of real meaning, although that meaning emerges differently for each of us.
  • For Sartre, we show bad faith whenever we portray ourselves as passive creations of our race, class, job, history, nation, family, heredity, childhood influences, events, or even hidden drives in our subconscious which we claim are out of our control. It is not that such factors are unimportant: class and race, in particular, he acknowledged as powerful forces in people’s lives, and Simone de Beauvoir would soon add gender to that list.
  • Sartre takes his argument to an extreme point by asserting that even war, imprisonment or the prospect of imminent death cannot take away my existential freedom. They form part of my ‘situation’, and this may be an extreme and intolerable situation, but it still provides only a context for whatever I choose to do next. If I am about to die, I can decide how to face that death. Sartre here resurrects the ancient Stoic idea that I may not choose what happens to me, but I can choose what to make of it, spiritually speaking.
  • But the Stoics cultivated indifference in the face of terrible events, whereas Sartre thought we should remain passionately, even furiously engaged with what happens to us and with what we can achieve. We should not expect freedom to be anything less than fiendishly difficult.
  • Freedom does not mean entirely unconstrained movement, and it certainly does not mean acting randomly. We often mistake the very things that enable us to be free — context, meaning, facticity, situation, a general direction in our lives — for things that define us and take away our freedom. It is only with all of these that we can be free in a real sense.
  • Nor did he mean that privileged groups have the right to pontificate to the poor and downtrodden about the need to ‘take responsibility’ for themselves. That would be a grotesque misreading of Sartre’s point, since his sympathy in any encounter always lay with the more oppressed side. But for each of us — for me — to be in good faith means not making excuses for myself.
  • Camus’ novel gives us a deliberately understated vision of heroism and decisive action compared to those of Sartre and Beauvoir. One can only do so much. It can look like defeatism, but it shows a more realistic perception of what it takes to actually accomplish difficult tasks like liberating one’s country.
  • Camus just kept returning to his core principle: no torture, no killing — at least not with state approval. Beauvoir and Sartre believed they were taking a more subtle and more realistic view. If asked why a couple of innocuous philosophers had suddenly become so harsh, they would have said it was because the war had changed them in profound ways. It had shown them that one’s duties to humanity could be more complicated than they seemed. ‘The war really divided my life in two,’ Sartre said later.
  • Poets and artists ‘let things be’, but they also let things come out and show themselves. They help to ease things into ‘unconcealment’ (Unverborgenheit), which is Heidegger’s rendition of the Greek term alētheia, usually translated as ‘truth’. This is a deeper kind of truth than the mere correspondence of a statement to reality, as when we say ‘The cat is on the mat’ and point to a mat with a cat on it. Long before we can do this, both cat and mat must ‘stand forth out of concealedness’. They must un-hide themselves.
  • Heidegger does not use the word ‘consciousness’ here because — as with his earlier work — he is trying to make us think in a radically different way about ourselves. We are not to think of the mind as an empty cavern, or as a container filled with representations of things. We are not even supposed to think of it as firing off arrows of intentional ‘aboutness’, as in the earlier phenomenology of Brentano. Instead, Heidegger draws us into the depths of his Schwarzwald, and asks us to imagine a gap with sunlight filtering in. We remain in the forest, but we provide a relatively open spot where other beings can bask for a moment. If we did not do this, everything would remain in the thickets, hidden even to itself.
  • The astronomer Carl Sagan began his 1980 television series Cosmos by saying that human beings, though made of the same stuff as the stars, are conscious and are therefore ‘a way for the cosmos to know itself’. Merleau-Ponty similarly quoted his favourite painter Cézanne as saying, ‘The landscape thinks itself in me, and I am its consciousness.’ This is something like what Heidegger thinks humanity contributes to the earth. We are not made of spiritual nothingness; we are part of Being, but we also bring something unique with us. It is not much: a little open space, perhaps with a path and a bench like the one the young Heidegger used to sit on to do his homework. But through us, the miracle occurs.
  • Beauty aside, Heidegger’s late writing can also be troubling, with its increasingly mystical notion of what it is to be human. If one speaks of a human being mainly as an open space or a clearing, or a means of ‘letting beings be’ and dwelling poetically on the earth, then one doesn’t seem to be talking about any recognisable person. The old Dasein has become less human than ever. It is now a forestry feature.
  • Even today, Jaspers, the dedicated communicator, is far less widely read than Heidegger, who has influenced architects, social theorists, critics, psychologists, artists, film-makers, environmental activists, and innumerable students and enthusiasts — including the later deconstructionist and post-structuralist schools, which took their starting point from his late thinking. Having spent the late 1940s as an outsider and then been rehabilitated, Heidegger became the overwhelming presence in university philosophy all over the European continent from then on.
  • As Levinas reflected on this experience, it helped to lead him to a philosophy that was essentially ethical, rather than ontological like Heidegger’s. He developed his ideas from the work of Jewish theologian Martin Buber, whose I and Thou in 1923 had distinguished between my relationship with an impersonal ‘it’ or ‘them’, and the direct personal encounter I have with a ‘you’. Levinas took it further: when I encounter you, we normally meet face-to-face, and it is through your face that you, as another person, can make ethical demands on me. This is very different from Heidegger’s Mitsein or Being-with, which suggests a group of people standing alongside one another, shoulder to shoulder as if in solidarity — perhaps as a unified nation or Volk.
  • For Levinas, we literally face each other, one individual at a time, and that relationship becomes one of communication and moral expectation. We do not merge; we respond to one another. Instead of being co-opted into playing some role in my personal drama of authenticity, you look me in the eyes — and you remain Other. You remain you.
  • This relationship is more fundamental than the self, more fundamental than consciousness, more fundamental even than Being — and it brings an unavoidable ethical obligation. Ever since Husserl, phenomenologists and existentialists had been trying to stretch the definition of existence to incorporate our social lives and relationships. Levinas did more: he turned philosophy around entirely so that these relationships were the foundation of our existence, not an extension of it.
  • Her last work, The Need for Roots, argues, among other things, that none of us has rights, but each one of us has a near-infinite degree of duty and obligation to the other. Whatever the underlying cause of her death — and anorexia nervosa seems to have been involved — no one could deny that she lived out her philosophy with total commitment. Of all the lives touched on in this book, hers is surely the most profound and challenging application of Iris Murdoch’s notion that a philosophy can be ‘inhabited’.
  • Other thinkers took radical ethical turns during the war years. The most extreme was Simone Weil, who actually tried to live by the principle of putting other people’s ethical demands first. Having returned to France after her travels through Germany in 1932, she had worked in a factory so as to experience the degrading nature of such work for herself. When France fell in 1940, her family fled to Marseilles (against her protests), and later to the US and to Britain. Even in exile, Weil made extraordinary sacrifices. If there were people in the world who could not sleep in a bed, she would not do so either, so she slept on the floor.
  • The mystery tradition had roots in Kierkegaard’s ‘leap of faith’. It owed much to the other great nineteenth-century mystic of the impossible, Dostoevsky, and to older theological notions. But it also grew from the protracted trauma that was the first half of the twentieth century. Since 1914, and especially since 1939, people in Europe and elsewhere had come to the realisation that we cannot fully know or trust ourselves; that we have no excuses or explanations for what we do — and yet that we must ground our existence and relationships on something firm, because otherwise we cannot survive.
  • One striking link between these radical ethical thinkers, all on the fringes of our main story, is that they had religious faith. They also granted a special role to the notion of ‘mystery’ — that which cannot be known, calculated or understood, especially when it concerns our relationships with each other. Heidegger was different from them, since he rejected the religion he grew up with and had no real interest in ethics — probably as a consequence of his having no real interest in the human.
  • Meanwhile, the Christian existentialist Gabriel Marcel was also still arguing, as he had since the 1930s, that ethics trumps everything else in philosophy and that our duty to each other is so great as to play the role of a transcendent ‘mystery’. He too had been led to this position partly by a wartime experience: during the First World War he had worked for the Red Cross’ Information Service, with the unenviable job of answering relatives’ inquiries about missing soldiers. Whenever news came, he passed it on, and usually it was not good. As Marcel later said, this task permanently inoculated him against warmongering rhetoric of any kind, and it made him aware of the power of what is unknown in our lives.
  • As the play’s much-quoted and frequently misunderstood final line has it: ‘Hell is other people.’ Sartre later explained that he did not mean to say that other people were hellish in general. He meant that after death we become frozen in their view, unable any longer to fend off their interpretation. In life, we can still do something to manage the impression we make; in death, this freedom goes and we are left entombed in other people’s memories and perceptions.
  • We have to do two near-impossible things at once: understand ourselves as limited by circumstances, and yet continue to pursue our projects as though we are truly in control. In Beauvoir’s view, existentialism is the philosophy that best enables us to do this, because it concerns itself so deeply with both freedom and contingency. It acknowledges the radical and terrifying scope of our freedom in life, but also the concrete influences that other philosophies tend to ignore: history, the body, social relationships and the environment.
  • The aspects of our existence that limit us, Merleau-Ponty says, are the very same ones that bind us to the world and give us scope for action and perception. They make us what we are. Sartre acknowledged the need for this trade-off, but he found it more painful to accept. Everything in him longed to be free of bonds, of impediments and limitations
  • Of course we have to learn this skill of interpreting and anticipating the world, and this happens in early childhood, which is why Merleau-Ponty thought child psychology was essential to philosophy. This is an extraordinary insight. Apart from Rousseau, very few philosophers before him had taken childhood seriously; most wrote as though all human experience were that of a fully conscious, rational, verbal adult who has been dropped into this world from the sky — perhaps by a stork.
  • For Merleau-Ponty, we cannot understand our experience if we don’t think of ourselves in part as overgrown babies. We fall for optical illusions because we once learned to see the world in terms of shapes, objects and things relevant to our own interests. Our first perceptions came to us in tandem with our first active experiments in observing the world and reaching out to explore it, and are still linked with those experiences.
  • Another factor in all of this, for Merleau-Ponty, is our social existence: we cannot thrive without others, or not for long, and we need this especially in early life. This makes solipsistic speculation about the reality of others ridiculous; we could never engage in such speculation if we hadn’t already been formed by them.
  • As Descartes could have said (but didn’t), ‘I think, therefore other people exist.’ We grow up with people playing with us, pointing things out, talking, listening, and getting us used to reading emotions and movements; this is how we become capable, reflective, smoothly integrated beings.
  • In general, Merleau-Ponty thinks human experience only makes sense if we abandon philosophy’s time-honoured habit of starting with a solitary, capsule-like, immobile adult self, isolated from its body and world, which must then be connected up again — adding each element around it as though adding clothing to a doll. Instead, for him, we slide from the womb to the birth canal to an equally close and total immersion in the world. That immersion continues as long as we live, although we may also cultivate the art of partially withdrawing from time to time when we want to think or daydream.
  • When he looks for his own metaphor to describe how he sees consciousness, he comes up with a beautiful one: consciousness, he suggests, is like a ‘fold’ in the world, as though someone had crumpled a piece of cloth to make a little nest or hollow. It stays for a while, before eventually being unfolded and smoothed away. There is something seductive, even erotic, in this idea of my conscious self as an improvised pouch in the cloth of the world. I still have my privacy — my withdrawing room. But I am part of the world’s fabric, and I remain formed out of it for as long as I am here.
  • By the time of these works, Merleau-Ponty is taking his desire to describe experience to the outer limits of what language can convey. Just as with the late Husserl or Heidegger, or Sartre in his Flaubert book, we see a philosopher venturing so far from shore that we can barely follow. Emmanuel Levinas would head out to the fringes too, eventually becoming incomprehensible to all but his most patient initiates.
  • Sartre once remarked — speaking of a disagreement they had about Husserl in 1941 — that ‘we discovered, astounded, that our conflicts had, at times, stemmed from our childhood, or went back to the elementary differences of our two organisms’. Merleau-Ponty also said in an interview that Sartre’s work seemed strange to him, not because of philosophical differences, but because of a certain ‘register of feeling’, especially in Nausea, that he could not share. Their difference was one of temperament and of the whole way the world presented itself to them.
  • The two also differed in their purpose. When Sartre writes about the body or other aspects of experience, he generally does it in order to make a different point. He expertly evokes the grace of his café waiter, gliding between the tables, bending at an angle just so, steering the drink-laden tray through the air on the tips of his fingers — but he does it all in order to illustrate his ideas about bad faith. When Merleau-Ponty writes about skilled and graceful movement, the movement itself is his point. This is the thing he wants to understand.
  • We can never move definitively from ignorance to certainty, for the thread of the inquiry will constantly lead us back to ignorance again. This is the most attractive description of philosophy I’ve ever read, and the best argument for why it is worth doing, even (or especially) when it takes us no distance at all from our starting point.
  • By prioritising perception, the body, social life and childhood development, Merleau-Ponty gathered up philosophy’s far-flung outsider subjects and brought them in to occupy the centre of his thought.
  • In his inaugural lecture at the Collège de France on 15 January 1953, published as In Praise of Philosophy, he said that philosophers should concern themselves above all with whatever is ambiguous in our experience. At the same time, they should think clearly about these ambiguities, using reason and science. Thus, he said, ‘The philosopher is marked by the distinguishing trait that he possesses inseparably the taste for evidence and the feeling for ambiguity.’ A constant movement is required between these two
  • As Sartre wrote in response to Hiroshima, humanity had now gained the power to wipe itself out, and must decide every single day that it wanted to live. Camus also wrote that humanity faced the task of choosing between collective suicide and a more intelligent use of its technology — ‘between hell and reason’. After 1945, there seemed little reason to trust in humanity’s ability to choose well.
  • Merleau-Ponty observed in a lecture of 1951 that, more than any previous century, the twentieth century had reminded people how ‘contingent’ their lives were — how at the mercy of historical events and other changes that they could not control. This feeling went on long after the war ended. After the A-bombs were dropped on Hiroshima and Nagasaki, many feared that a Third World War would not be long in coming, this time between the Soviet Union and the United States.

This Is Not a Market | Dissent Magazine - 0 views

  • Given how ordinary people use the term, it’s not surprising that academic economists are a little vague about it—but you’ll be glad to hear that they know they’re being vague. A generation of economists have criticized their colleagues’ inability to specify what a “market” actually is. George Stigler, back in 1967, thought it “a source of embarrassment that so little attention has been paid to the theory of markets.” Sociologists agree: according to Harrison White, there is no “neoclassical theory of the market—[only] a pure theory of exchange.” And Wayne Baker found that the idea of the market is “typically assumed—not studied” by most economists, who “implicitly characterize ‘market’ as a ‘featureless plane.’”
  • When we say “market” now, we mean nothing particularly specific, and, at the same time, everything—the entire economy, of course, but also our lives in general. If you can name it, there’s a market in it: housing, education, the law, dating. Maybe even love is “just an economy based on resource scarcity.”
  • The use of markets to describe everything is odd, because talking about “markets” doesn’t even help us understand how the economy works—let alone the rest of our lives. Even though nobody seems to know what it means, we use the metaphor freely, even unthinkingly. Let the market decide. The markets are volatile. The markets responded poorly. Obvious facts—that the economy hasn’t rebounded after the recession—are hidden or ignored, because “the market” is booming, and what is the economy other than “the market”? Well, it’s lots of other things. We might see that if we talked about it a bit differently.
  • For instance, we might choose a different metaphor—like, say, the traffic system. Sounds ridiculous? No more so than the market metaphor. After all, we already talk about one important aspect of economic life in terms of traffic: online activity. We could describe it in market terms (the market demands Trump memes!), but we use a different metaphor, because it’s just intuitively more suitable. That last Trump meme is generating a lot of traffic. Redirect your attention as required.
  • We don’t know much about markets, because we don’t deal with them very often. But most of us know plenty about traffic systems: drivers will know the frustration of trying to turn left onto a major road, of ceaseless, pointless lane-switching on a stalled rush-hour freeway, but also the joys of clear highways.
  • We know the traffic system because, whether we like it or not, we are always involved in it, from birth
  • As of birth, Jean is in the economy—even if s/he rarely goes to a market. You can’t not be an economic actor; you can’t not be part of the transport system.
  • Consider also the composition of the traffic system and the economy. A market, whatever else it is, is always essentially the same thing: a place where people can come together to buy and sell things. We could set up a market right now, with a few fences and a sign announcing that people could buy and sell. We don’t even really need the fences. A traffic system, however, is far more complex. To begin with, the system includes publicly and privately run elements: most cars are privately owned, as are most airlines
  • If we don’t evaluate traffic systems based on their size, or their growth, how do we evaluate them? Mostly, by how well they help people get where they want to go. The market metaphor encourages us to think that all economic activity is motivated by the search for profit, and pursued in the same fashion everywhere. In a market, everyone’s desires are perfectly interchangeable. But, while everybody engages in the transport system, we have no difficulty remembering that we all want to go to different places, in different ways, at different times, at different speeds, for different reasons
  • Deciding how to improve the traffic system, how to expand people’s opportunities, is obviously a question of resource allocation and prioritization on a scale that private individuals—even traders—cannot influence on their own. That’s why governments have not historically trusted the “magic of the markets” to produce better opportunities for transport. We intuitively understand that these decisions are made at the level of mass society and public policy. And, whether you like it or not, this is true for decisions about the economy as well.
  • Thinking of the economy in terms of the market—a featureless plane, with no entry or exit costs, little need for regulation, and equal opportunity for all—obscures this basic insight. And this underlying misconception creates a lot of problems: we’ve fetishized economic growth, we’ve come to distrust government regulation, and we imagine that the inequalities in our country, and our world, are natural or justified. If we imagine the economy otherwise—as a traffic system, for example—we see more clearly how the economy actually works.
  • We see that our economic life looks a lot less like going to “market” for fun and profit than it does sitting in traffic on our morning commute, hoping against hope that we’ll get where we want to go, and on time.

George Soros: Facebook and Google a menace to society | Business | The Guardian - 0 views

  • Facebook and Google have become “obstacles to innovation” and are a “menace” to society whose “days are numbered”
  • “Mining and oil companies exploit the physical environment; social media companies exploit the social environment,” said the Hungarian-American businessman, according to a transcript of his speech.
  • “This is particularly nefarious because social media companies influence how people think and behave without them even being aware of it. This has far-reaching adverse consequences on the functioning of democracy, particularly on the integrity of elections.”
  • In addition to skewing democracy, social media companies “deceive their users by manipulating their attention and directing it towards their own commercial purposes” and “deliberately engineer addiction to the services they provide”. The latter, he said, “can be very harmful, particularly for adolescents”
  • “There is a possibility that once lost, people who grow up in the digital age will have difficulty in regaining it. This may have far-reaching political consequences.”
  • Soros warned of an “even more alarming prospect” on the horizon if data-rich internet companies such as Facebook and Google paired their corporate surveillance systems with state-sponsored surveillance – a trend that’s already emerging in places such as the Philippines.
  • “This may well result in a web of totalitarian control the likes of which not even Aldous Huxley or George Orwell could have imagined.”
  • “The internet monopolies have neither the will nor the inclination to protect society against the consequences of their actions. That turns them into a menace and it falls to the regulatory authorities to protect society against them.”
  • He also echoed the words of world wide web inventor Sir Tim Berners-Lee when he said the tech giants had become “obstacles to innovation” that need to be regulated as public utilities “aimed at preserving competition, innovation and fair and open universal access”.
  • Earlier this week, Salesforce’s chief executive, Marc Benioff, said that Facebook should be regulated like a cigarette company because it’s addictive and harmful.
  • In November, Roger McNamee, who was an early investor in Facebook, described Facebook and Google as threats to public health.

A Real 'Very Stable Genius' Doesn't Call Himself One - The Atlantic - 2 views

  • the Dunning-Kruger effect: the more limited someone is in reality, the more talented the person imagines himself to be.
  • “Unskilled and unaware of it: how difficulties in recognizing one's own incompetence lead to inflated self-assessments.”
  • During a brief stint of actually working at a tech company, I learned that some of the engineers and coders were viewed as just operating on a different plane: The code they wrote was better, tighter, and more elegant than other people’s, and they could write it much more quickly.
  • If you report long enough on politics and public life, even there you will see examples of exceptional strategic, analytic, and bargaining intelligence, along with a lot of clownishness.
  • They know what they don’t know. This to me is the most consistent marker of real intelligence. The more acute someone’s ability to perceive and assess, the more likely that person is to recognize his or her limits. These include the unevenness of any one person’s talents; the specific areas of weakness—social awkwardness, musical tin ear, being stronger with numbers than with words, or vice versa; and the incomparable vastness of what any individual person can never know. To read books seriously is to be staggered by the knowledge of how many more books will remain beyond your ken. It’s like looking up at the star-filled sky.