
Home/ TOK Friends/ Group items tagged Materialism


Javier E

New Statesman - The Joy of Secularism: 11 Essays for How We Live Now - 0 views

  • The Joy of Secularism: 11 Essays for How We Live Now. By George Levine. Reviewed by Terry Eagleton, 22 June 2011. Misunderstanding what it means to be secular.
  • Societies become truly secular not when they dispense with religion but when they are no longer greatly agitated by it. It is when religious faith ceases to be a vital part of the public sphere
  • Christianity is certainly other-worldly, and so is any reasonably sensitive soul who has been reading the newspapers. The Christian gospel looks to a future transformation of the appalling mess we see around us into a community of justice and friendship, a change so deep-seated and indescribable as to make Lenin look like a Lib Dem. “This [world] is our home,” Levine comments. If he really feels at home in this crucifying set-up, one might humbly suggest that he shouldn't. Christians and political radicals certainly don't.
  • ...9 more annotations...
  • he suspects that Christian faith is other-worldly in the sense of despising material things. Material reality, in his view, is what art celebrates but religion does not. This is to forget that Gerard Manley Hopkins was a Jesuit. It is also to misunderstand the doctrine of Creation
  • Adam Phillips writes suggestively of human helplessness as opposed to the sense of protectedness that religious faith supposedly brings us, without noticing that the signifier of God for the New Testament is the tortured and executed corpse of a suspected political criminal.
  • None of these writers points out that if Christianity is true, then it is all up with us. We would then have to face the deeply disagreeable truth that the only authentic life is one that springs from a self-dispossession so extreme that it is probably beyond our power.
  • What exactly," he enquires, "does the invocation of some supernatural being add?" A Christian might reply that it adds the obligations to give up everything one has, including one's life, if necessary, for the sake of others. And this, to say the least, is highly inconvenient.
  • The Christian paradigm of love, by contrast, is the love of strangers and enemies, not of those we find agreeable. Civilised notions such as mutual sympathy, more's the pity, won't deliver us the world we need.
  • Secularisation is a lot harder than people tend to imagine. The history of modernity is, among other things, the history of substitutes for God. Art, culture, nation, Geist, humanity, society: all these, along with a clutch of other hopeful aspirants, have been tried from time to time. The most successful candidate currently on offer is sport, which, short of providing funeral rites for its spectators, fulfils almost every religious function in the book.
  • If Friedrich Nietzsche was the first sincere atheist, it is because he saw that the Almighty is exceedingly good at disguising Himself as something else, and that much so-called secularisation is accordingly bogus.
  • Postmodernism is perhaps best seen as Nietzsche shorn of the metaphysical baggage. Whereas modernism is still haunted by a God-shaped absence, postmodern culture is too young to remember a time when men and women were anguished by the fading spectres of truth, reality, nature, value, meaning, foundations and the like. For postmodern theory, there never was any truth or meaning in the first place
  • Postmodernism is properly secular, but it pays an immense price for this coming of age - if coming of age it is. It means shelving all the other big questions, too, as hopelessly passé. It also involves the grave error of imagining that all faith or passionate conviction is incipiently dogmatic. It is not only religious belief to which postmodernism is allergic, but belief as such. Advanced capitalism sees no need for the stuff. It is both politically divisive and commercially unnecessary.
carolinewren

'God made science': Louisiana teachers are literally using the Bible as science textboo... - 0 views

  • students in Louisiana literally use the Bible as their science textbook, according to recently obtained records
  • State law permits teachers to promote classroom discussion on evolution, but critics say the Louisiana Science Education Act allows creationism to be taught in public schools.
  • students read the Book of Genesis to learn creationism in biology class.
  • ...6 more annotations...
  • “We will read in Genesis and them [sic] some supplemental material debunking various aspects of evolution from which the students will present,”
  • A teacher at Caddo Parish schools wrote a newspaper column saying that her job is to present both evolution and creationism.
  • “God made science,” wrote fifth-grade teacher Charlotte Hinson.
  • While one commenter accused a teacher of “pushing her twisted religious beliefs onto the class,” another praised biology teacher Michael Stacy because he “discussed evolution and creationism in a full spectrum of thought.”
  • state law, passed in 2008, allows science teachers to introduce supplemental materials to “critique” scientific theories – lessons on creationism are still illegal under federal law.
  • schools are also violating prohibitions on teacher-led prayer in school.
Richard Monari

Yakko's Universe Song - 0 views

  • This song, released on the Animaniacs in the '90s, first instilled in me a sense of relativity. Later, I was impressed that an animated show would include material discussing the vastness of the universe for small children. It also depressed me when they said "we're all really puny."
Javier E

Deflating Our Bonus Culture - The Dish | By Andrew Sullivan - The Daily Beast - 0 views

  • The idea that people are solely self-interested and materially oriented has been thrown overboard by leading scholars. Empirical research, in particular experimental research, has shown that under suitable conditions human beings care for the wellbeing of other persons. Above all, they are not solely interested in material gains. Recognition by co-workers is greatly important. Many workers are intrinsically motivated.
  • When a remote authority sets incentives, people respond by manipulating the system. This fact is poorly understood by education reformers who are fond of pay-for-performance and national standards
  • The Hayekian story here is that effective compensation practices require local knowledge and tacit knowledge.
Javier E

How Social Status Affects Your Health - NYTimes.com - 0 views

  • If you want to see how status affects health, you have to isolate status from material wealth. How to do that? The easiest way is to observe a society in which there is minimal material wealth to contest and where there are limited avenues for status competition.
  • For several years, we studied the Tsimane forager-horticulturalists of Amazonian Bolivia, a small, preindustrial, politically egalitarian society in which status confers no formal privileges (such as coercive authority).
  • we found that even among the Tsimane, higher status was associated with lower levels of stress and better health.
  • ...12 more annotations...
  • Along the banks of the Maniqui River and in adjacent forests, the Tsimane people hunt, fish and plant plantains, rice and sweet manioc. They live in villages that range in size from 30 to 700 people. During village meetings, decision making is consensus-based. No individual has the right to coerce anyone else.
  • that doesn’t mean there are no status distinctions. When you attend a Tsimane village meeting, you soon notice that the opinions of certain men are more influential during the consensus-building process. These same men are often solicited to mediate disputes or to represent villagers’ interests with outsiders.
  • My colleagues and I measured the social status of all the men from four Tsimane villages (nearly 200 men between the ages of 18 and 83), by asking them to evaluate one another on their informal political influence. The men also provided urine samples and received medical examinations from physicians
  • We found that Tsimane men with less political influence had higher levels of the stress hormone cortisol, which has many important physiological functions. This result persisted after controlling for other factors that might affect stress levels, including age, body size and personality.
  • In addition, we found that the less influential Tsimane men had a higher risk of respiratory infection, the most common cause of sickness and death in their society. Stress may contribute to this disparity in infection risk; when chronic, stress can dampen immune function.
  • Studying the same individuals over a four-year period, we also found that for men whose influence declined over time, greater declines were correlated with higher levels of cortisol and respiratory illness. Downward mobility is harmful, it seems, even in an egalitarian society.
  • Why might low status cause such stress for the Tsimane? One possibility is that status offers a greater sense of control.
  • Another is that status acts as a form of social insurance. Influential Tsimane men have more allies and food-production partners, who can be helpful in mitigating conflict, sickness and food shortage. The relative lack of such support may cause psychosocial stress.
  • It is interesting that even in industrialized societies, the status comparisons most consequential for psychosocial stress are often among individuals who live near one another or occupy the same social network, not individuals at opposite ends of the socioeconomic spectrum.
  • Those living just above the poverty line may resent welfare for those living just below it, and a millionaire may envy a multimillionaire more than he envies a billionaire.
  • The importance of relative status perceptions may have its roots in the small-scale societies of our ancestors, which were similar to that of the Tsimane. In such societies, both our political competitors and our cooperative partners were likely individuals with whom we interacted regularly.
  • As our society debates the effects of wealth inequality, the Tsimane help us understand why we care so deeply about relative social position — and why our health depends on it.
Javier E

History News Network | How the NCSS Sold Out Social Studies and History - 0 views

  • As a historian, I was influenced by E. H. Carr’s thinking about the past and present as part of a continuum that stretches into the future. In What is History? (1961), Carr argued that concern with the future is what really motivates the study of the past. As a teacher and political activist, I also strongly believe that schools should promote active citizenship as essential for maintaining a democratic society.
  • in an effort to survive, the NCSS has largely abandoned its commitment to these ideas, twisting itself into a pretzel to adapt to national Common Core standards and to satisfy influential conservative organizations that it is not radical, or even liberal. I suspect, but cannot document, that the organization’s membership has precipitously declined during the past two decades and that it has increasingly depended on financial support for its conferences and publications from deep-pocketed traditional and rightwing groups who advertise and have display booths.
  • No Child Left Behind (NCLB). Since the introduction of NCLB, there has been a steady reduction in the amount of time spent in the teaching of social studies, with the most profound decline noticed in the elementary grades.”
  • ...10 more annotations...
  • In an effort to counter the Common Core push for decontextualized, skill-based instruction and assessment that has further marginalized social studies education, the NCSS is promoting what it calls the “College, Career, and Civic Life (C3) Framework.”
  • through its choice of partners, its rigid adherence to Common Core lesson guidelines, and the sample material it is promoting, the NCSS has virtually abandoned not just meaningful social studies education, but education for democracy and citizenship as well.
  • My biggest problem with the C3 Framework as presented in this new document on instruction is its attempt to adopt a fundamentally flawed Common Core approach to social studies and history based on the close reading of text without exploring historical context
  • how Common Core as it is being implemented will mean the end of history.
  • In the C3 Framework inquiry approach, students start in Dimension 1 by “developing questions and planning inquiries”; however, the inquiry is really already “planned,” because the material they use is pre-selected. It is also not clear what their questions will be based on, since they do not necessarily have any background on the subject. In Dimension 3, students evaluate sources using evidence, but again, devoid of historical context. Dimension 4 is supposed to be C3’s chief addition to Common Core: students are supposed to plan activities and become involved in civic life, although of course their options have already been prescribed.
  • In Dimension 2, as they read the text, which sixth graders and many eighth graders will find difficult, students discuss “How was citizenship revolutionary in 1776?” The question requires them to assume that colonists had already formulated a concept of citizenship, which I do not believe they had, a concept of nation, which they definitely did not, and an understanding that somehow what they were doing was “revolutionary,” which was still being debated.
  • Some of the organizations involved in writing the C3 Frameworks have positions so politically skewed to the right that NCSS should be embarrassed about including them in the project. In this category I include Gilder Lehrman, The Bill of Rights Institute, and The Center for Economic Education and Entrepreneurship (CEEE).
  • Conspicuously missing from the group of contributors is the Zinn Education Project which would have provided a radically different point of view.
  • What we have in the C3 Framework is standard teaching at best but a lot of poor teaching and propaganda as well.
  • Instead of challenging Common Core, the NCSS begs to be included. Instead of presenting multiple perspectives, it sells advertising in the form of lessons to its corporate and foundation sponsors. But worst of all, in its own terms: at a time of mass protest against police brutality by high school and college students across the United States, active citizenship in a democratic society is stripped of meaning and becomes little more than idle discussion and telling students to vote when they are eighteen.
kushnerha

There's nothing wrong with grade inflation - The Washington Post - 0 views

  • By the early ’90s, so long as one had the good sense to major in the humanities — all bets were off in the STEM fields — it was nearly impossible to get a final grade below a B-minus at an elite college. According to a 2012 study, the average college GPA, which in the 1930s was a C-plus, had risen to a B at public universities and a B-plus at private schools. At Duke, Pomona and Harvard, D’s and F’s combine for just 2 percent of all grades. A Yale report found that 62 percent of all Yale grades are A or A-minus. According to a 2013 article in the Harvard Crimson, the median grade at Harvard was an A-minus, while the most common grade was an A.
  • The result is widespread panic about grade inflation at elite schools. (The phenomenon is not as prevalent at community colleges and less-selective universities.) Some blame students’ consumer mentality, a few see a correlation with small class sizes (departments with falling enrollments want to keep students happy), and many cite a general loss of rigor in a touchy-feely age.
  • Yet whenever elite schools have tried to fight grade inflation, it’s been a mess. Princeton instituted strict caps on the number of high grades awarded, then abandoned the plan, saying the caps dissuaded applicants and made students miserable. At Wellesley, grade-inflated humanities departments mandated that the average result in their introductory and intermediate classes not exceed a B-plus. According to one study, enrollment fell by one-fifth, and students were 30 percent less likely to major in one of these subjects.
  • ...12 more annotations...
  • I liked the joy my students found when they actually earned a grade they’d been reaching for. But whereas I once thought we needed to contain grades, I now see that we may as well let them float skyward. If grade inflation is bad, fighting it is worse. Our goal should be ending the centrality of grades altogether. For years, I feared that a world of only A’s would mean the end of meaningful grades; today, I’m certain of it. But what’s so bad about that?
  • It’s easy to see why schools want to fight grade inflation. Grades should motivate certain students: those afraid of the stigma of a bad grade or those ambitious, by temperament or conditioning, to succeed in measurable ways. Periodic grading during a term, on quizzes, tests or papers, provides feedback to students, which should enable them to do better. And grades theoretically signal to others, such as potential employers or graduate schools, how well the student did. (Grade-point averages are also used for prizes and class rankings, though that doesn’t strike me as an important feature.)
  • But it’s not clear that grades work well as motivators. Although recent research on the effects of grades is limited, several studies in the 1970s, 1980s and 1990s measured how students related to a task or a class when it was graded compared to when it was ungraded. Overall, graded students are less interested in the topic at hand and — for obvious, common-sense reasons — more inclined to pick the easiest possible task when given the chance. In the words of progressive-education theorist Alfie Kohn, author of “The Homework Myth,” “the quality of learning declines” when grades are introduced, becoming “shallower and more superficial when the point is to get a grade.”
  • Even where grades can be useful, as in describing what material a student has mastered, they are remarkably crude instruments. Yes, the student who gets a 100 on a calculus exam probably grasps the material better than the student with a 60 — but only if she retains the knowledge, which grades can’t show.
  • I still can’t say very well what separates a B from an A. What’s more, I never see the kind of incompetence or impudence that would merit a D or an F. And now, in our grade-inflated world, it’s even harder to use grades to motivate, or give feedback, or send a signal to future employers or graduate schools.
  • According to a 2012 study by the Chronicle of Higher Education, GPA was seventh out of eight factors employers considered in hiring, behind internships, extracurricular activities and previous employment. Last year, Stanford’s registrar told the Chronicle about “a clamor” from employers “for something more meaningful” than the traditional transcript. The Lumina Foundation gave a $1.27 million grant to two organizations for college administrators working to develop better student records, with grades only one part of a student’s final profile.
  • Some graduate schools, too, have basically ditched grades. “As long as you don’t bomb and flunk out, grades don’t matter very much in M.F.A. programs,” the director of one creative-writing program told the New York Times. To top humanities PhD programs, letters of reference and writing samples matter more than overall GPA (although students are surely expected to have received good grades in their intended areas of study). In fact, it’s impossible to get into good graduate or professional schools without multiple letters of reference, which have come to function as the kind of rich, descriptive comments that could go on transcripts in place of grades.
  • suggests that GPAs serve not to validate students from elite schools but to keep out those from less-prestigious schools and large public universities, where grades are less inflated. Grades at community colleges “have actually dropped” over the years, according to Stuart Rojstaczer, a co-author of the 2012 grade-inflation study. That means we have two systems: one for students at elite schools, who get jobs based on references, prestige and connections, and another for students everywhere else, who had better maintain a 3.0. Grades are a tool increasingly deployed against students without prestige.
  • The trouble is that, while it’s relatively easy for smaller colleges to go grade-free, with their low student-to-teacher ratios, it’s tough for professors at larger schools, who must evaluate more students, more quickly, with fewer resources. And adjuncts teaching five classes for poverty wages can’t write substantial term-end comments, so grades are a necessity if they want to give any feedback at all.
  • It would mean hiring more teachers and paying them better (which schools should do anyway). And if transcripts become more textured, graduate-school admission offices and employers will have to devote more resources to reading them, and to getting to know applicants through interviews and letters of reference — a salutary trend that is underway already.
  • When I think about getting rid of grades, I think of happier students, with whom I have more open, democratic relationships. I think about being forced to pay more attention to the quiet ones, since I’ll have to write something truthful about them, too. I’ve begun to wonder if a world without grades may be one of those states of affairs (like open marriages, bicycle lanes and single-payer health care) that Americans resist precisely because they seem too good, suspiciously good. Nothing worth doing is supposed to come easy.
  • Alfie Kohn, too, sees ideology at work in the grade-inflation panic. “Most of what powers the arguments against grade inflation is a very right-wing idea that excellence consists in beating everyone else around you,” he says. “Even when you have sorted them — even when they get to Harvard! — we have to sort them again.” In other words, we can trust only a system in which there are clear winners and losers.
Javier E

Enter the Age of the Outsiders - The New York Times - 0 views

  • In the 1990s, the central political institutions radiated confidence, derived from an assumed vision of the post-Cold War world. History would be a slow march toward democratic capitalism. Nations would be bound in peaceful associations like the European Union. The United States would oversee a basic international order.
  • This vision was materialistic and individualistic. Nations should pursue economic growth and a decent distribution of wealth. If you give individuals access to education and opportunity, they will pursue affluence and personal happiness. They will grow more temperate and “reasonable.”
  • The uncertain Republican establishment cannot govern its own marginal members, while those on the edge burn with conviction. Jeb Bush looks wan but Donald Trump radiates confidence.
  • ...7 more annotations...
  • the deeper problem was spiritual. Many people around the world rejected democratic capitalism’s vision of a secular life built around materialism and individual happiness. They sought more intense forms of meaning. Some of them sought meaning in the fanaticisms of sect, tribe, nation, or some stronger and more brutal ideology
  • In case after case, “reasonableness” has been trampled by behavior and creed that is stronger, darker and less temperate.
  • Since 2000, this vision of the post-Cold War world has received blow after blow. Some of these blows were self-inflicted. Democracy, especially in the United States, has grown dysfunctional. Mass stupidity and greed led to a financial collapse and deprived capitalism of its moral swagger.
  • The Democratic establishment no longer determines party positions; it is pulled along by formerly marginal players like Bernie Sanders.
  • Republicans blame Obama for hesitant and halting policies, but it’s not clear the foreign policy and defense apparatus believes anymore in its own abilities to establish order, or that the American public has any confidence in U.S. effectiveness as a global actor.
  • the primary problem is mental and spiritual. Some leader has to be able to digest the lessons of the last 15 years and offer a revised charismatic and persuasive sense of America’s historic mission.
  • This mission, both nationalist and universal, would be less individualistic than the gospel of the 1990s, and more realistic about depravity and the way barbarism can spread. It would offer a goal more profound than material comfort.
Javier E

ROUGH TYPE | Nicholas Carr's blog - 0 views

  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • ...39 more annotations...
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Social skills and relationships seem to suffer as well.
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
Javier E

At the Existentialist Café: Freedom, Being, and Apricot Cocktails with Jean-P...

  • The phenomenologists’ leading thinker, Edmund Husserl, provided a rallying cry, ‘To the things themselves!’ It meant: don’t waste time on the interpretations that accrue upon things, and especially don’t waste time wondering whether the things are real. Just look at this that’s presenting itself to you, whatever this may be, and describe it as precisely as possible.
  • You might think you have defined me by some label, but you are wrong, for I am always a work in progress. I create myself constantly through action, and this is so fundamental to my human condition that, for Sartre, it is the human condition, from the moment of first consciousness to the moment when death wipes it out. I am my own freedom: no more, no less.
  • Sartre wrote like a novelist — not surprisingly, since he was one. In his novels, short stories and plays as well as in his philosophical treatises, he wrote about the physical sensations of the world and the structures and moods of human life. Above all, he wrote about one big subject: what it meant to be free. Freedom, for him, lay at the heart of all human experience, and this set humans apart from all other kinds of object.
  • Sartre listened to his problem and said simply, ‘You are free, therefore choose — that is to say, invent.’ No signs are vouchsafed in this world, he said. None of the old authorities can relieve you of the burden of freedom. You can weigh up moral or practical considerations as carefully as you like, but ultimately you must take the plunge and do something, and it’s up to you what that something is.
  • Even if the situation is unbearable — perhaps you are facing execution, or sitting in a Gestapo prison, or about to fall off a cliff — you are still free to decide what to make of it in mind and deed. Starting from where you are now, you choose. And in choosing, you also choose who you will be.
  • The war had made people realise that they and their fellow humans were capable of departing entirely from civilised norms; no wonder the idea of a fixed human nature seemed questionable.
  • If this sounds difficult and unnerving, it’s because it is. Sartre does not deny that the need to keep making decisions brings constant anxiety. He heightens this anxiety by pointing out that what you do really matters. You should make your choices as though you were choosing on behalf of the whole of humanity, taking the entire burden of responsibility for how the human race behaves. If you avoid this responsibility by fooling yourself that you are the victim of circumstance or of someone else’s bad advice, you are failing to meet the demands of human life and choosing a fake existence, cut off from your own ‘authenticity’.
  • Along with the terrifying side of this comes a great promise: Sartre’s existentialism implies that it is possible to be authentic and free, as long as you keep up the effort.
  • almost all agreed that it was, as an article in Les nouvelles littéraires phrased it, a ‘sickening mixture of philosophic pretentiousness, equivocal dreams, physiological technicalities, morbid tastes and hesitant eroticism … an introspective embryo that one would take distinct pleasure in crushing’.
  • he offered a philosophy designed for a species that had just scared the hell out of itself, but that finally felt ready to grow up and take responsibility.
  • In this rebellious world, just as with the Parisian bohemians and Dadaists in earlier generations, everything that was dangerous and provocative was good, and everything that was nice or bourgeois was bad.
  • Such interweaving of ideas and life had a long pedigree, although the existentialists gave it a new twist. Stoic and Epicurean thinkers in the classical world had practised philosophy as a means of living well, rather than of seeking knowledge or wisdom for their own sake. By reflecting on life’s vagaries in philosophical ways, they believed they could become more resilient, more able to rise above circumstances, and better equipped to manage grief, fear, anger, disappointment or anxiety.
  • In the tradition they passed on, philosophy is neither a pure intellectual pursuit nor a collection of cheap self-help tricks, but a discipline for flourishing and living a fully human, responsible life.
  • For Kierkegaard, Descartes had things back to front. In his own view, human existence comes first: it is the starting point for everything we do, not the result of a logical deduction. My existence is active: I live it and choose it, and this precedes any statement I can make about myself.
  • Studying our own moral genealogy cannot help us to escape or transcend ourselves. But it can enable us to see our illusions more clearly and lead a more vital, assertive existence.
  • What was needed, he felt, was not high moral or theological ideals, but a deeply critical form of cultural history or ‘genealogy’ that would uncover the reasons why we humans are as we are, and how we came to be that way. For him, all philosophy could even be redefined as a form of psychology, or history.
  • For those oppressed on grounds of race or class, or for those fighting against colonialism, existentialism offered a change of perspective — literally, as Sartre proposed that all situations be judged according to how they appeared in the eyes of those most oppressed, or those whose suffering was greatest.
  • She observed that we need not expect moral philosophers to ‘live by’ their ideas in a simplistic way, as if they were following a set of rules. But we can expect them to show how their ideas are lived in. We should be able to look in through the windows of a philosophy, as it were, and see how people occupy it, how they move about and how they conduct themselves.
  • the existentialists inhabited their historical and personal world, as they inhabited their ideas. This notion of ‘inhabited philosophy’ is one I’ve borrowed from the English philosopher and novelist Iris Murdoch, who wrote the first full-length book on Sartre and was an early adopter of existentialism
  • What is existentialism anyway?
  • An existentialist who is also phenomenological provides no easy rules for dealing with this condition, but instead concentrates on describing lived experience as it presents itself. By describing experience well, he or she hopes to understand this existence and awaken us to ways of living more authentic lives.
  • Existentialists concern themselves with individual, concrete human existence. They consider human existence different from the kind of being other things have. Other entities are what they are, but as a human I am whatever I choose to make of myself at every moment. I am free, and therefore I’m responsible for everything I do, a dizzying fact which causes an anxiety inseparable from human existence itself.
  • On the other hand, I am only free within situations, which can include factors in my own biology and psychology as well as physical, historical and social variables of the world into which I have been thrown. Despite the limitations, I always want more: I am passionately involved in personal projects of all kinds. Human existence is thus ambiguous: at once boxed in by borders and yet transcendent and exhilarating.
  • The first part of this is straightforward: a phenomenologist’s job is to describe. This is the activity that Husserl kept reminding his students to do. It meant stripping away distractions, habits, clichés of thought, presumptions and received ideas, in order to return our attention to what he called the ‘things themselves’. We must fix our beady gaze on them and capture them exactly as they appear, rather than as we think they are supposed to be.
  • Husserl therefore says that, to phenomenologically describe a cup of coffee, I should set aside both the abstract suppositions and any intrusive emotional associations. Then I can concentrate on the dark, fragrant, rich phenomenon in front of me now. This ‘setting aside’ or ‘bracketing out’ of speculative add-ons Husserl called epoché — a term borrowed from the ancient Sceptics,
  • The point about rigour is crucial; it brings us back to the first half of the command to describe phenomena. A phenomenologist cannot get away with listening to a piece of music and saying, ‘How lovely!’ He or she must ask: is it plaintive? is it dignified? is it colossal and sublime? The point is to keep coming back to the ‘things themselves’ — phenomena stripped of their conceptual baggage — so as to bail out weak or extraneous material and get to the heart of the experience.
  • Husserlian ‘bracketing out’ or epoché allows the phenomenologist to temporarily ignore the question ‘But is it real?’, in order to ask how a person experiences his or her world. Phenomenology gives a formal mode of access to human experience. It lets philosophers talk about life more or less as non-philosophers do, while still being able to tell themselves they are being methodical and rigorous.
  • Besides claiming to transform the way we think about reality, phenomenologists promised to change how we think about ourselves. They believed that we should not try to find out what the human mind is, as if it were some kind of substance. Instead, we should consider what it does, and how it grasps its experiences.
  • For Brentano, this reaching towards objects is what our minds do all the time. Our thoughts are invariably of or about something, he wrote: in love, something is loved, in hatred, something is hated, in judgement, something is affirmed or denied. Even when I imagine an object that isn’t there, my mental structure is still one of ‘about-ness’ or ‘of-ness’.
  • Except in deepest sleep, my mind is always engaged in this aboutness: it has ‘intentionality’. Having taken the germ of this from Brentano, Husserl made it central to his whole philosophy.
  • Husserl saw in the idea of intentionality a way to sidestep two great unsolved puzzles of philosophical history: the question of what objects ‘really’ are, and the question of what the mind ‘really’ is. By doing the epoché and bracketing out all consideration of reality from both topics, one is freed to concentrate on the relationship in the middle. One can apply one’s descriptive energies to the endless dance of intentionality that takes place in our lives: the whirl of our minds as they seize their intended phenomena one after the other and whisk them around the floor,
  • Understood in this way, the mind hardly is anything at all: it is its aboutness. This makes the human mind (and possibly some animal minds) different from any other naturally occurring entity. Nothing else can be as thoroughly about or of things as the mind is:
  • Some Eastern meditation techniques aim to still this scurrying creature, but the extreme difficulty of this shows how unnatural it is to be mentally inert. Left to itself, the mind reaches out in all directions as long as it is awake — and even carries on doing it in the dreaming phase of its sleep.
  • a mind that is experiencing nothing, imagining nothing, or speculating about nothing can hardly be said to be a mind at all.
  • Three simple ideas — description, phenomenon, intentionality — provided enough inspiration to keep roomfuls of Husserlian assistants busy in Freiburg for decades. With all of human existence awaiting their attention, how could they ever run out of things to do?
  • For Sartre, this gives the mind an immense freedom. If we are nothing but what we think about, then no predefined ‘inner nature’ can hold us back. We are protean.
  • way of this interpretation. Real, not real; inside, outside; what difference did it make? Reflecting on this, Husserl began turning his phenomenology into a branch of ‘idealism’ — the philosophical tradition which denied external reality and defined everything as a kind of private hallucination.
  • For Sartre, if we try to shut ourselves up inside our own minds, ‘in a nice warm room with the shutters closed’, we cease to exist. We have no cosy home: being out on the dusty road is the very definition of what we are.
  • One might think that, if Heidegger had anything worth saying, he could have communicated it in ordinary language. The fact is that he does not want to be ordinary, and he may not even want to communicate in the usual sense. He wants to make the familiar obscure, and to vex us. George Steiner thought that Heidegger’s purpose was less to be understood than to be experienced through a ‘felt strangeness’.
  • He takes Dasein in its most ordinary moments, then talks about it in the most innovative way he can. For Heidegger, Dasein’s everyday Being is right here: it is Being-in-the-world, or In-der-Welt-sein. The main feature of Dasein’s everyday Being-in-the-world right here is that it is usually busy doing something.
  • Thus, for Heidegger, all Being-in-the-world is also a ‘Being-with’ or Mitsein. We cohabit with others in a ‘with-world’, or Mitwelt. The old philosophical problem of how we prove the existence of other minds has now vanished. Dasein swims in the with-world long before it wonders about other minds.
  • Sometimes the best-educated people were those least inclined to take the Nazis seriously, dismissing them as too absurd to last. Karl Jaspers was one of those who made this mistake, as he later recalled, and Beauvoir observed similar dismissive attitudes among the French students in Berlin.
  • In any case, most of those who disagreed with Hitler’s ideology soon learned to keep their view to themselves. If a Nazi parade passed on the street, they would either slip out of view or give the obligatory salute like everyone else, telling themselves that the gesture meant nothing if they did not believe in it. As the psychologist Bruno Bettelheim later wrote of this period, few people will risk their life for such a small thing as raising an arm — yet that is how one’s powers of resistance are eroded away, and eventually one’s responsibility and integrity go with them.
  • for Arendt, if you do not respond adequately when the times demand it, you show a lack of imagination and attention that is as dangerous as deliberately committing an abuse. It amounts to disobeying the one command she had absorbed from Heidegger in those Marburg days: Think!
  • ‘Everything takes place under a kind of anaesthesia. Objectively dreadful events produce a thin, puny emotional response. Murders are committed like schoolboy pranks. Humiliation and moral decay are accepted like minor incidents.’ Haffner thought modernity itself was partly to blame: people had become yoked to their habits and to mass media, forgetting to stop and think, or to disrupt their routines long enough to question what was going on.
  • Heidegger’s former lover and student Hannah Arendt would argue, in her 1951 study The Origins of Totalitarianism, that totalitarian movements thrived at least partly because of this fragmentation in modern lives, which made people more vulnerable to being swept away by demagogues. Elsewhere, she coined the phrase ‘the banality of evil’ to describe the most extreme failures of personal moral awareness.
  • His communicative ideal fed into a whole theory of history: he traced all civilisation to an ‘Axial Period’ in the fifth century BC, during which philosophy and culture exploded simultaneously in Europe, the Middle East and Asia, as though a great bubble of minds had erupted from the earth’s surface. ‘True philosophy needs communion to come into existence,’ he wrote, and added, ‘Uncommunicativeness in a philosopher is virtually a criterion of the untruth of his thinking.’
  • The idea of being called to authenticity became a major theme in later existentialism, the call being interpreted as saying something like ‘Be yourself!’, as opposed to being phony. For Heidegger, the call is more fundamental than that. It is a call to take up a self that you didn’t know you had: to wake up to your Being. Moreover, it is a call to action. It requires you to do something: to take a decision of some sort.
  • Being and Time contained at least one big idea that should have been of use in resisting totalitarianism. Dasein, Heidegger wrote there, tends to fall under the sway of something called das Man or ‘the they’ — an impersonal entity that robs us of the freedom to think for ourselves. To live authentically requires resisting or outwitting this influence, but this is not easy because das Man is so nebulous. Man in German does not mean ‘man’ as in English (that’s der Mann), but a neutral abstraction, something like ‘one’ in the English phrase ‘one doesn’t do that’,
  • for Heidegger, das Man is me. It is everywhere and nowhere; it is nothing definite, but each of us is it. As with Being, it is so ubiquitous that it is difficult to see. If I am not careful, however, das Man takes over the important decisions that should be my own. It drains away my responsibility or ‘answerability’. As Arendt might put it, we slip into banality, failing to think.
  • Jaspers focused on what he called Grenzsituationen — border situations, or limit situations. These are the moments when one finds oneself constrained or boxed in by what is happening, but at the same time pushed by these events towards the limits or outer edge of normal experience. For example, you might have to make a life-or-death choice, or something might remind you suddenly of your mortality,
  • Jaspers’ interest in border situations probably had much to do with his own early confrontation with mortality. From childhood, he had suffered from a heart condition so severe that he always expected to die at any moment. He also had emphysema, which forced him to speak slowly, taking long pauses to catch his breath. Both illnesses meant that he had to budget his energies with care in order to get his work done without endangering his life.
  • If I am to resist das Man, I must become answerable to the call of my ‘voice of conscience’. This call does not come from God, as a traditional Christian definition of the voice of conscience might suppose. It comes from a truly existentialist source: my own authentic self. Alas, this voice is one I do not recognise and may not hear, because it is not the voice of my habitual ‘they-self’. It is an alien or uncanny version of my usual voice. I am familiar with my they-self, but not with my unalienated voice — so, in a weird twist, my real voice is the one that sounds strangest to me.
  • Marcel developed a strongly theological branch of existentialism. His faith distanced him from both Sartre and Heidegger, but he shared a sense of how history makes demands on individuals. In his essay ‘On the Ontological Mystery’, written in 1932 and published in the fateful year of 1933, Marcel wrote of the human tendency to become stuck in habits, received ideas, and a narrow-minded attachment to possessions and familiar scenes. Instead, he urged his readers to develop a capacity for remaining ‘available’ to situations as they arise. Similar ideas of disponibilité or availability had been explored by other writers,
  • Marcel made it his central existential imperative. He was aware of how rare and difficult it was. Most people fall into what he calls ‘crispation’: a tensed, encrusted shape in life — ‘as though each one of us secreted a kind of shell which gradually hardened and imprisoned him’.
  • Bettelheim later observed that, under Nazism, only a few people realised at once that life could not continue unaltered: these were the ones who got away quickly. Bettelheim himself was not among them. Caught in Austria when Hitler annexed it, he was sent first to Dachau and then to Buchenwald, but was then released in a mass amnesty to celebrate Hitler’s birthday in 1939 — an extraordinary reprieve, after which he left at once for America.
  • we are used to reading philosophy as offering a universal message for all times and places — or at least as aiming to do so. But Heidegger disliked the notion of universal truths or universal humanity, which he considered a fantasy. For him, Dasein is not defined by shared faculties of reason and understanding, as the Enlightenment philosophers thought. Still less is it defined by any kind of transcendent eternal soul, as in religious tradition. We do not exist on a higher, eternal plane at all. Dasein’s Being is local: it has a historical situation, and is constituted in time and place.
  • For Marcel, learning to stay open to reality in this way is the philosopher’s prime job. Everyone can do it, but the philosopher is the one who is called on above all to stay awake, so as to be the first to sound the alarm if something seems wrong.
  • Second, it also means understanding that we are historical beings, and grasping the demands our particular historical situation is making on us. In what Heidegger calls ‘anticipatory resoluteness’, Dasein discovers ‘that its uttermost possibility lies in giving itself up’. At that moment, through Being-towards-death and resoluteness in facing up to one’s time, one is freed from the they-self and attains one’s true, authentic self.
  • If we are temporal beings by our very nature, then authentic existence means accepting, first, that we are finite and mortal. We will die: this all-important realisation is what Heidegger calls authentic ‘Being-towards-Death’, and it is fundamental to his philosophy.
  • Hannah Arendt, instead, left early on: she had the benefit of a powerful warning. Just after the Nazi takeover, in spring 1933, she had been arrested while researching materials on anti-Semitism for the German Zionist Organisation at Berlin’s Prussian State Library. Her apartment was searched; both she and her mother were locked up briefly, then released. They fled, without stopping to arrange travel documents. They crossed to Czechoslovakia (then still safe) by a method that sounds almost too fabulous to be true: a sympathetic German family on the border had a house with its front door in Germany and its back door in Czechoslovakia. The family would invite people for dinner, then let them leave through the back door at night.
  • As Sartre argued in his 1943 review of The Stranger, basic phenomenological principles show that experience comes to us already charged with significance. A piano sonata is a melancholy evocation of longing. If I watch a soccer match, I see it as a soccer match, not as a meaningless scene in which a number of people run around taking turns to apply their lower limbs to a spherical object. If the latter is what I’m seeing, then I am not watching some more essential, truer version of soccer; I am failing to watch it properly as soccer at all.
  • Much as they liked Camus personally, neither Sartre nor Beauvoir accepted his vision of absurdity. For them, life is not absurd, even when viewed on a cosmic scale, and nothing can be gained by saying it is. Life for them is full of real meaning, although that meaning emerges differently for each of us.
  • For Sartre, we show bad faith whenever we portray ourselves as passive creations of our race, class, job, history, nation, family, heredity, childhood influences, events, or even hidden drives in our subconscious which we claim are out of our control. It is not that such factors are unimportant: class and race, in particular, he acknowledged as powerful forces in people’s lives, and Simone de Beauvoir would soon add gender to that list.
  • Sartre takes his argument to an extreme point by asserting that even war, imprisonment or the prospect of imminent death cannot take away my existential freedom. They form part of my ‘situation’, and this may be an extreme and intolerable situation, but it still provides only a context for whatever I choose to do next. If I am about to die, I can decide how to face that death. Sartre here resurrects the ancient Stoic idea that I may not choose what happens to me, but I can choose what to make of it, spiritually speaking.
  • But the Stoics cultivated indifference in the face of terrible events, whereas Sartre thought we should remain passionately, even furiously engaged with what happens to us and with what we can achieve. We should not expect freedom to be anything less than fiendishly difficult.
  • Freedom does not mean entirely unconstrained movement, and it certainly does not mean acting randomly. We often mistake the very things that enable us to be free — context, meaning, facticity, situation, a general direction in our lives — for things that define us and take away our freedom. It is only with all of these that we can be free in a real sense.
  • Nor did he mean that privileged groups have the right to pontificate to the poor and downtrodden about the need to ‘take responsibility’ for themselves. That would be a grotesque misreading of Sartre’s point, since his sympathy in any encounter always lay with the more oppressed side. But for each of us — for me — to be in good faith means not making excuses for myself.
  • Camus’ novel gives us a deliberately understated vision of heroism and decisive action compared to those of Sartre and Beauvoir. One can only do so much. It can look like defeatism, but it shows a more realistic perception of what it takes to actually accomplish difficult tasks like liberating one’s country.
  • Camus just kept returning to his core principle: no torture, no killing — at least not with state approval. Beauvoir and Sartre believed they were taking a more subtle and more realistic view. If asked why a couple of innocuous philosophers had suddenly become so harsh, they would have said it was because the war had changed them in profound ways. It had shown them that one’s duties to humanity could be more complicated than they seemed. ‘The war really divided my life in two,’ Sartre said later.
  • Poets and artists ‘let things be’, but they also let things come out and show themselves. They help to ease things into ‘unconcealment’ (Unverborgenheit), which is Heidegger’s rendition of the Greek term alētheia, usually translated as ‘truth’. This is a deeper kind of truth than the mere correspondence of a statement to reality, as when we say ‘The cat is on the mat’ and point to a mat with a cat on it. Long before we can do this, both cat and mat must ‘stand forth out of concealedness’. They must un-hide themselves.
  • Heidegger does not use the word ‘consciousness’ here because — as with his earlier work — he is trying to make us think in a radically different way about ourselves. We are not to think of the mind as an empty cavern, or as a container filled with representations of things. We are not even supposed to think of it as firing off arrows of intentional ‘aboutness’, as in the earlier phenomenology of Brentano. Instead, Heidegger draws us into the depths of his Schwarzwald, and asks us to imagine a gap with sunlight filtering in. We remain in the forest, but we provide a relatively open spot where other beings can bask for a moment. If we did not do this, everything would remain in the thickets, hidden even to itself.
  • The astronomer Carl Sagan began his 1980 television series Cosmos by saying that human beings, though made of the same stuff as the stars, are conscious and are therefore ‘a way for the cosmos to know itself’. Merleau-Ponty similarly quoted his favourite painter Cézanne as saying, ‘The landscape thinks itself in me, and I am its consciousness.’ This is something like what Heidegger thinks humanity contributes to the earth. We are not made of spiritual nothingness; we are part of Being, but we also bring something unique with us. It is not much: a little open space, perhaps with a path and a bench like the one the young Heidegger used to sit on to do his homework. But through us, the miracle occurs.
  • Beauty aside, Heidegger’s late writing can also be troubling, with its increasingly mystical notion of what it is to be human. If one speaks of a human being mainly as an open space or a clearing, or a means of ‘letting beings be’ and dwelling poetically on the earth, then one doesn’t seem to be talking about any recognisable person. The old Dasein has become less human than ever. It is now a forestry feature.
  • Even today, Jaspers, the dedicated communicator, is far less widely read than Heidegger, who has influenced architects, social theorists, critics, psychologists, artists, film-makers, environmental activists, and innumerable students and enthusiasts — including the later deconstructionist and post-structuralist schools, which took their starting point from his late thinking. Having spent the late 1940s as an outsider and then been rehabilitated, Heidegger became the overwhelming presence in university philosophy all over the European continent from then on.
  • As Levinas reflected on this experience, it helped to lead him to a philosophy that was essentially ethical, rather than ontological like Heidegger’s. He developed his ideas from the work of Jewish theologian Martin Buber, whose I and Thou in 1923 had distinguished between my relationship with an impersonal ‘it’ or ‘them’, and the direct personal encounter I have with a ‘you’. Levinas took it further: when I encounter you, we normally meet face-to-face, and it is through your face that you, as another person, can make ethical demands on me. This is very different from Heidegger’s Mitsein or Being-with, which suggests a group of people standing alongside one another, shoulder to shoulder as if in solidarity — perhaps as a unified nation or Volk.
  • For Levinas, we literally face each other, one individual at a time, and that relationship becomes one of communication and moral expectation. We do not merge; we respond to one another. Instead of being co-opted into playing some role in my personal drama of authenticity, you look me in the eyes — and you remain Other. You remain you.
  • This relationship is more fundamental than the self, more fundamental than consciousness, more fundamental even than Being — and it brings an unavoidable ethical obligation. Ever since Husserl, phenomenologists and existentialists had been trying to stretch the definition of existence to incorporate our social lives and relationships. Levinas did more: he turned philosophy around entirely so that these relationships were the foundation of our existence, not an extension of it.
  • Her last work, The Need for Roots, argues, among other things, that none of us has rights, but each one of us has a near-infinite degree of duty and obligation to the other. Whatever the underlying cause of her death — and anorexia nervosa seems to have been involved — no one could deny that she lived out her philosophy with total commitment. Of all the lives touched on in this book, hers is surely the most profound and challenging application of Iris Murdoch’s notion that a philosophy can be ‘inhabited’.
  • Other thinkers took radical ethical turns during the war years. The most extreme was Simone Weil, who actually tried to live by the principle of putting other people’s ethical demands first. Having returned to France after her travels through Germany in 1932, she had worked in a factory so as to experience the degrading nature of such work for herself. When France fell in 1940, her family fled to Marseilles (against her protests), and later to the US and to Britain. Even in exile, Weil made extraordinary sacrifices. If there were people in the world who could not sleep in a bed, she would not do so either, so she slept on the floor.
  • The mystery tradition had roots in Kierkegaard’s ‘leap of faith’. It owed much to the other great nineteenth-century mystic of the impossible, Dostoevsky, and to older theological notions. But it also grew from the protracted trauma that was the first half of the twentieth century. Since 1914, and especially since 1939, people in Europe and elsewhere had come to the realisation that we cannot fully know or trust ourselves; that we have no excuses or explanations for what we do — and yet that we must ground our existence and relationships on something firm, because otherwise we cannot survive.
  • One striking link between these radical ethical thinkers, all on the fringes of our main story, is that they had religious faith. They also granted a special role to the notion of ‘mystery’ — that which cannot be known, calculated or understood, especially when it concerns our relationships with each other. Heidegger was different from them, since he rejected the religion he grew up with and had no real interest in ethics — probably as a consequence of his having no real interest in the human.
  • Meanwhile, the Christian existentialist Gabriel Marcel was also still arguing, as he had since the 1930s, that ethics trumps everything else in philosophy and that our duty to each other is so great as to play the role of a transcendent ‘mystery’. He too had been led to this position partly by a wartime experience: during the First World War he had worked for the Red Cross’ Information Service, with the unenviable job of answering relatives’ inquiries about missing soldiers. Whenever news came, he passed it on, and usually it was not good. As Marcel later said, this task permanently inoculated him against warmongering rhetoric of any kind, and it made him aware of the power of what is unknown in our lives.
  • As the play’s much-quoted and frequently misunderstood final line has it: ‘Hell is other people.’ Sartre later explained that he did not mean to say that other people were hellish in general. He meant that after death we become frozen in their view, unable any longer to fend off their interpretation. In life, we can still do something to manage the impression we make; in death, this freedom goes and we are left entombed in other people’s memories and perceptions.
  • We have to do two near-impossible things at once: understand ourselves as limited by circumstances, and yet continue to pursue our projects as though we are truly in control. In Beauvoir’s view, existentialism is the philosophy that best enables us to do this, because it concerns itself so deeply with both freedom and contingency. It acknowledges the radical and terrifying scope of our freedom in life, but also the concrete influences that other philosophies tend to ignore: history, the body, social relationships and the environment.
  • The aspects of our existence that limit us, Merleau-Ponty says, are the very same ones that bind us to the world and give us scope for action and perception. They make us what we are. Sartre acknowledged the need for this trade-off, but he found it more painful to accept. Everything in him longed to be free of bonds, of impediments and limitations
  • Of course we have to learn this skill of interpreting and anticipating the world, and this happens in early childhood, which is why Merleau-Ponty thought child psychology was essential to philosophy. This is an extraordinary insight. Apart from Rousseau, very few philosophers before him had taken childhood seriously; most wrote as though all human experience were that of a fully conscious, rational, verbal adult who has been dropped into this world from the sky — perhaps by a stork.
  • For Merleau-Ponty, we cannot understand our experience if we don’t think of ourselves in part as overgrown babies. We fall for optical illusions because we once learned to see the world in terms of shapes, objects and things relevant to our own interests. Our first perceptions came to us in tandem with our first active experiments in observing the world and reaching out to explore it, and are still linked with those experiences.
  • Another factor in all of this, for Merleau-Ponty, is our social existence: we cannot thrive without others, or not for long, and we need this especially in early life. This makes solipsistic speculation about the reality of others ridiculous; we could never engage in such speculation if we hadn’t already been formed by them.
  • As Descartes could have said (but didn’t), ‘I think, therefore other people exist.’ We grow up with people playing with us, pointing things out, talking, listening, and getting us used to reading emotions and movements; this is how we become capable, reflective, smoothly integrated beings.
  • In general, Merleau-Ponty thinks human experience only makes sense if we abandon philosophy’s time-honoured habit of starting with a solitary, capsule-like, immobile adult self, isolated from its body and world, which must then be connected up again — adding each element around it as though adding clothing to a doll. Instead, for him, we slide from the womb to the birth canal to an equally close and total immersion in the world. That immersion continues as long as we live, although we may also cultivate the art of partially withdrawing from time to time when we want to think or daydream.
  • When he looks for his own metaphor to describe how he sees consciousness, he comes up with a beautiful one: consciousness, he suggests, is like a ‘fold’ in the world, as though someone had crumpled a piece of cloth to make a little nest or hollow. It stays for a while, before eventually being unfolded and smoothed away. There is something seductive, even erotic, in this idea of my conscious self as an improvised pouch in the cloth of the world. I still have my privacy — my withdrawing room. But I am part of the world’s fabric, and I remain formed out of it for as long as I am here.
  • By the time of these works, Merleau-Ponty is taking his desire to describe experience to the outer limits of what language can convey. Just as with the late Husserl or Heidegger, or Sartre in his Flaubert book, we see a philosopher venturing so far from shore that we can barely follow. Emmanuel Levinas would head out to the fringes too, eventually becoming incomprehensible to all but his most patient initiates.
  • Sartre once remarked — speaking of a disagreement they had about Husserl in 1941 — that ‘we discovered, astounded, that our conflicts had, at times, stemmed from our childhood, or went back to the elementary differences of our two organisms’. Merleau-Ponty also said in an interview that Sartre’s work seemed strange to him, not because of philosophical differences, but because of a certain ‘register of feeling’, especially in Nausea, that he could not share. Their difference was one of temperament and of the whole way the world presented itself to them.
  • The two also differed in their purpose. When Sartre writes about the body or other aspects of experience, he generally does it in order to make a different point. He expertly evokes the grace of his café waiter, gliding between the tables, bending at an angle just so, steering the drink-laden tray through the air on the tips of his fingers — but he does it all in order to illustrate his ideas about bad faith. When Merleau-Ponty writes about skilled and graceful movement, the movement itself is his point. This is the thing he wants to understand.
  • We can never move definitively from ignorance to certainty, for the thread of the inquiry will constantly lead us back to ignorance again. This is the most attractive description of philosophy I’ve ever read, and the best argument for why it is worth doing, even (or especially) when it takes us no distance at all from our starting point.
  • By prioritising perception, the body, social life and childhood development, Merleau-Ponty gathered up philosophy’s far-flung outsider subjects and brought them in to occupy the centre of his thought.
  • In his inaugural lecture at the Collège de France on 15 January 1953, published as In Praise of Philosophy, he said that philosophers should concern themselves above all with whatever is ambiguous in our experience. At the same time, they should think clearly about these ambiguities, using reason and science. Thus, he said, ‘The philosopher is marked by the distinguishing trait that he possesses inseparably the taste for evidence and the feeling for ambiguity.’ A constant movement is required between these two
  • As Sartre wrote in response to Hiroshima, humanity had now gained the power to wipe itself out, and must decide every single day that it wanted to live. Camus also wrote that humanity faced the task of choosing between collective suicide and a more intelligent use of its technology — ‘between hell and reason’. After 1945, there seemed little reason to trust in humanity’s ability to choose well.
  • Merleau-Ponty observed in a lecture of 1951 that, more than any previous century, the twentieth century had reminded people how ‘contingent’ their lives were — how at the mercy of historical events and other changes that they could not control. This feeling went on long after the war ended. After the A-bombs were dropped on Hiroshima and Nagasaki, many feared that a Third World War would not be long in coming, this time between the Soviet Union and the United States.
Javier E

Philosophy isn't dead yet | Raymond Tallis | Comment is free | The Guardian - 1 views

  • Fundamental physics is in a metaphysical mess and needs help. The attempt to reconcile its two big theories, general relativity and quantum mechanics, has stalled for nearly 40 years. Endeavours to unite them, such as string theory, are mathematically ingenious but incomprehensible even to many who work with them. This is well known.
  • A better-kept secret is that at the heart of quantum mechanics is a disturbing paradox – the so-called measurement problem, arising ultimately out of the Uncertainty Principle – which apparently demonstrates that the very measurements that have established and confirmed quantum theory should be impossible. Oxford philosopher of physics David Wallace has argued that this threatens to make quantum mechanics incoherent, a problem which can be remedied only by vastly multiplying worlds.
  • there is the failure of physics to accommodate conscious beings. The attempt to fit consciousness into the material world, usually by identifying it with activity in the brain, has failed dismally, if only because there is no way of accounting for the fact that certain nerve impulses are supposed to be conscious (of themselves or of the world) while the overwhelming majority (physically essentially the same) are not. In short, physics does not allow for the strange fact that matter reveals itself to material objects (such as physicists).
  • ...3 more annotations...
  • then there is the mishandling of time. The physicist Lee Smolin's recent book, Time Reborn, links the crisis in physics with its failure to acknowledge the fundamental reality of time. Physics is predisposed to lose time because its mathematical gaze freezes change. Tensed time, the difference between a remembered or regretted past and an anticipated or feared future, is particularly elusive. This worried Einstein: in a famous conversation, he mourned the fact that the present tense, "now", lay "just outside of the realm of science".
  • Recent attempts to explain how the universe came out of nothing, which rely on questionable notions such as spontaneous fluctuations in a quantum vacuum, the notion of gravity as negative energy, and the inexplicable free gift of the laws of nature waiting in the wings for the moment of creation, reveal conceptual confusion beneath mathematical sophistication. They demonstrate the urgent need for a radical re-examination of the invisible frameworks within which scientific investigations are conducted.
  • we should reflect on how a scientific image of the world that relies on up to 10 dimensions of space and rests on ideas, such as fundamental particles, that have neither identity nor location, connects with our everyday experience. This should open up larger questions, such as the extent to which mathematical portraits capture the reality of our world – and what we mean by "reality".
Javier E

What Does Coronavirus Do to the Body? - The New York Times - 0 views

  • the virus appears to start in peripheral areas on both sides of the lung and can take a while to reach the upper respiratory tract, the trachea and other central airways.
  • that pattern helps explain why in Wuhan, where the outbreak began, many of the earliest cases were not identified immediately.
  • The initial testing regimen in many Chinese hospitals did not always detect infection in the peripheral lungs, so some people with symptoms were sent home without treatment.
  • ...18 more annotations...
  • “The virus will actually land on organs like the heart, the kidney, the liver, and may cause some direct damage to those organs,
  • the infection can spread through the mucous membranes, from the nose down to the rectum.
  • while the virus appears to zero in on the lungs, it may also be able to infect cells in the gastrointestinal system, experts say. This may be why some patients have symptoms like diarrhea or indigestion
  • it’s unclear whether infectious virus can persist in blood or stool
  • some patients in China recovered but got sick again, apparently because they had damaged and vulnerable lung tissue that was subsequently attacked by bacteria in their body.
  • As the body’s immune system shifts into high gear to battle the infection, the resulting inflammation may cause those organs to malfunction, he said.
  • About 80 percent of people infected with the new coronavirus have relatively mild symptoms. But about 20 percent of people become more seriously ill and in about 2 percent of patients in China, which has had the most cases, the disease has been fatal.
  • more than half of 121 patients in China had normal CT scans early in their disease.
  • the course a patient’s infection will take is not yet fully understood.
  • Some patients can remain stable for over a week and then suddenly develop pneumonia, Dr. Diaz said. Some patients seem to recover but then develop symptoms again.
  • the effects appear to depend on how robust or weakened a person’s immune system is. Older people or those with underlying health issues, like diabetes or another chronic illness, are more likely to develop severe symptoms
  • Coronavirus particles have spiked proteins sticking out from their surfaces, and these spikes hook onto cell membranes, allowing the virus’s genetic material to enter the human cell.
  • That genetic material proceeds to “hijack the metabolism of the cell and say, in effect, ‘Don’t do your usual job. Your job now is to help me multiply and make the virus,’
  • As copies of the virus multiply, they burst out and infect neighboring cells. The symptoms often start in the back of the throat with a sore throat and a dry cough.
  • The virus then “crawls progressively down the bronchial tubes,”
  • That can damage the alveoli or lung sacs and they have to work harder to carry out their function of supplying oxygen to the blood
  • The swelling and the impaired flow of oxygen can cause those areas in the lungs to fill with fluid, pus and dead cells. Pneumonia, an infection in the lung, can occur
  • Some people have so much trouble breathing they need to be put on a ventilator
Javier E

The Story Behind the SAT Overhaul - NYTimes.com - 2 views

  • “When you cover too many topics,” Coleman said, “the assessments designed to measure those standards are inevitably superficial.” He pointed to research showing that more students entering college weren’t prepared and were forced into “remediation programs from which they never escape.” In math, for example, if you examined data from top-performing countries, you found an approach that emphasized “far fewer topics, far deeper,” the opposite of the curriculums he found in the United States, which he described as “a mile wide and an inch deep.”
  • The lessons he brought with him from thinking about the Common Core were evident — that American education needed to be more focused and less superficial, and that it should be possible to test the success of the newly defined standards through an exam that reflected the material being taught in the classroom.
  • she and her team had extensive conversations with students, teachers, parents, counselors, admissions officers and college instructors, asking each group to tell them in detail what they wanted from the test. What they arrived at above all was that a test should reflect the most important skills that were imparted by the best teachers
  • ...12 more annotations...
  • for example, a good instructor would teach Martin Luther King Jr.’s “I Have a Dream” speech by encouraging a conversation that involved analyzing the text and identifying the evidence, both factual and rhetorical, that makes it persuasive. “The opposite of what we’d want is a classroom where a teacher might ask only: ‘What was the year the speech was given? Where was it given?’ ”
  • in the past, assembling the SAT focused on making sure the questions performed on technical grounds, meaning: Were they appropriately easy or difficult among a wide range of students, and were they free of bias when tested across ethnic, racial and religious subgroups? The goal was “maximizing differentiation” among kids, which meant finding items that were answered correctly by those students who were expected to get them right and incorrectly by the weaker students. A simple way of achieving this, Coleman said, was to test the kind of obscure vocabulary words for which the SAT was famous
  • In redesigning the test, the College Board shifted its emphasis. It prioritized content, measuring each question against a set of specifications that reflect the kind of reading and math that students would encounter in college and their work lives. Schmeiser and others then spent much of early last year watching students as they answered a set of 20 or so problems, discussing the questions with the students afterward. “The predictive validity is going to come out the same,” she said of the redesigned test. “But in the new test, we have much more control over the content and skills that are being measured.”
  • Evidence-based reading and writing, he said, will replace the current sections on reading and writing. It will use as its source materials pieces of writing — from science articles to historical documents to literature excerpts — which research suggests are important for educated Americans to know and understand deeply. “The Declaration of Independence, the Constitution, the Bill of Rights and the Federalist Papers,” Coleman said, “have managed to inspire an enduring great conversation about freedom, justice, human dignity in this country and the world” — therefore every SAT will contain a passage from either a founding document or from a text (like Lincoln’s Gettysburg Address) that is part of the “great global conversation” the founding documents inspired.
  • The Barbara Jordan vocabulary question would have a follow-up — “How do you know your answer is correct?” — to which students would respond by identifying lines in the passage that supported their answer.
  • The idea is that the test will emphasize words students should be encountering, like “synthesis,” which can have several meanings depending on their context. Instead of encouraging students to memorize flashcards, the test should promote the idea that they must read widely throughout their high-school years.
  • . No longer will it be good enough to focus on tricks and trying to eliminate answer choices. We are not interested in students just picking an answer, but justifying their answers.”
  • the essay portion of the test will also be reformulated so that it will always be the same, some version of: “As you read the passage in front of you, consider how the author uses evidence such as facts or examples; reasoning to develop ideas and to connect claims and evidence; and stylistic or persuasive elements to add power to the ideas expressed. Write an essay in which you explain how the author builds an argument to persuade an audience.”
  • The math section, too, will be predicated on research that shows that there are “a few areas of math that are a prerequisite for a wide range of college courses” and careers. Coleman conceded that some might treat the news that they were shifting away from more obscure math problems to these fewer fundamental skills as a dumbing-down the test, but he was adamant that this was not the case. He explained that there will be three areas of focus: problem solving and data analysis, which will include ratios and percentages and other mathematical reasoning used to solve problems in the real world; the “heart of algebra,” which will test how well students can work with linear equations (“a powerful set of tools that echo throughout many fields of study”); and what will be called the “passport to advanced math,” which will focus on the student’s familiarity with complex equations and their applications in science and social science.
  • “Sometimes in the past, there’s been a feeling that tests were measuring some sort of ineffable entity such as intelligence, whatever that might mean. Or ability, whatever that might mean. What this is is a clear message that good hard work is going to pay off and achievement is going to pay off. This is one of the most significant developments that I have seen in the 40-plus years that I’ve been working in admissions in higher education.”
  • The idea of creating a transparent test and then providing a free website that any student could use — not to learn gimmicks but to get a better grounding and additional practice in the core knowledge that would be tested — was appealing to Coleman.
  • (The College Board won’t pay Khan Academy.) They talked about a hypothetical test-prep experience in which students would log on to a personal dashboard, indicate that they wanted to prepare for the SAT and then work through a series of preliminary questions to demonstrate their initial skill level and identify the gaps in their knowledge. Khan said he could foresee a way to estimate the amount of time it would take to achieve certain benchmarks. “It might go something like, ‘O.K., we think you’ll be able to get to this level within the next month and this level within the next two months if you put in 30 minutes a day,’ ” he said. And he saw no reason the site couldn’t predict for anyone, anywhere the score he or she might hope to achieve with a commitment to a prescribed amount of work.
caelengrubb

Free Market - Econlib - 0 views

  • “Free market” is a summary term for an array of exchanges that take place in society.
  • Each exchange is undertaken as a voluntary agreement between two people or between groups of people represented by agents. These two individuals (or agents) exchange two economic goods, either tangible commodities or nontangible services
  • Both parties undertake the exchange because each expects to gain from it. Also, each will repeat the exchange next time (or refuse to) because his expectation has proved correct (or incorrect) in the recent past.
  • ...25 more annotations...
  • Trade, or exchange, is engaged in precisely because both parties benefit; if they did not expect to gain, they would not agree to the exchange.
  • This simple reasoning refutes the argument against free trade typical of the “mercantilist” period of sixteenth- to eighteenth-century Europe and classically expounded by the famed sixteenth-century French essayist Montaigne.
  • At each stage of production from natural resource to consumer good, money is voluntarily exchanged for capital goods, labor services, and land resources. At each step of the way, terms of exchanges, or prices, are determined by the voluntary interactions of suppliers and demanders. This market is “free” because choices, at each step, are made freely and voluntarily.
  • We can immediately see the fallacy in this still-popular viewpoint: the willingness and even eagerness to trade means that both parties benefit. In modern game-theory jargon, trade is a win-win situation, a “positive-sum” rather than a “zero-sum” or “negative-sum” game.
  • Each one values the two goods or services differently, and these differences set the scene for an exchange.
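The point above — that differing subjective valuations are what make exchange positive-sum — can be sketched numerically. The traders, goods, and numbers below are hypothetical illustrations, not drawn from the article:

```python
# Two hypothetical traders value the same pair of goods differently.
# Each can give up the good they value less in exchange for the one
# they value more, so both come out ahead: a positive-sum game.
valuations = {
    "alice": {"bread": 2, "cards": 9},   # Alice prizes the baseball cards
    "bob":   {"bread": 8, "cards": 3},   # Bob prizes the bread
}

def gain_from_swap(person, gives, gets):
    """Change in subjective value when `person` trades `gives` for `gets`."""
    v = valuations[person]
    return v[gets] - v[gives]

alice_gain = gain_from_swap("alice", gives="bread", gets="cards")  # 9 - 2 = 7
bob_gain = gain_from_swap("bob", gives="cards", gets="bread")      # 8 - 3 = 5

# Both gains are positive, so neither party is "exploited" -- contra
# the mercantilist zero-sum view described above.
assert alice_gain > 0 and bob_gain > 0
```

If either trader's valuations matched the other's exactly, one of the two gains would be zero or negative and that trader would simply refuse the exchange — which is the excerpt's point that voluntary trade only happens when both sides expect to benefit.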
  • Two factors determine the terms of any agreement: how much each participant values each good in question, and each participant’s bargaining skills.
  • the market in relation to how favorably buyers evaluate these goods—in shorthand, by the interaction of their supply with the demand for them.
  • On the other hand, given the buyers’ evaluation, or demand, for a good, if the supply increases, each unit of supply—each baseball card or loaf of bread—will fall in value, and therefore the price of the good will fall. The reverse occurs if the supply of the good decreases.
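The supply-and-demand mechanism in the excerpt above can be made concrete with textbook linear curves. The functional forms and coefficients here are illustrative assumptions, not from the article:

```python
# Standard linear demand and supply (hypothetical coefficients):
#   demand:  q = a - b*p   (buyers demand less as price rises)
#   supply:  q = c + d*p   (sellers supply more as price rises)
# The market clears where the two quantities meet:
#   a - b*p = c + d*p  =>  p = (a - c) / (b + d)

def equilibrium_price(a, b, c, d):
    """Price at which quantity demanded equals quantity supplied."""
    return (a - c) / (b + d)

# Holding demand fixed while supply increases (c rises, shifting the
# supply curve outward) lowers the market-clearing price:
p_before = equilibrium_price(a=100, b=2, c=10, d=3)  # baseline supply
p_after = equilibrium_price(a=100, b=2, c=40, d=3)   # increased supply

assert p_after < p_before  # more supply, same demand -> lower price
```

Running the reverse experiment (lowering `c`) raises the equilibrium price, matching the excerpt's claim that the price rises when the supply of a good decreases.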
  • The market, then, is not simply an array; it is a highly complex, interacting latticework of exchanges.
  • Production begins with natural resources, and then various forms of machines and capital goods, until finally, goods are sold to the consumer.
  • The mercantilists argued that in any trade, one party can benefit only at the expense of the other—that in every transaction there is a winner and a loser, an “exploiter” and an “exploited.”
  • A common charge against the free-market society is that it institutes “the law of the jungle,” of “dog eat dog,” that it spurns human cooperation for competition and exalts material success as opposed to spiritual values, philosophy, or leisure activities.
  • Saving and investment can then develop capital goods and increase the productivity and wages of workers, thereby increasing their standard of living.
  • The free competitive market also rewards and stimulates technological innovation that allows the innovator to get a head start in satisfying consumer wants in new and creative ways.
  • Government, in every society, is the only lawful system of coercion. Taxation is a coerced exchange, and the heavier the burden of taxation on production, the more likely it is that economic growth will falter and decline
  • The ultimate in government coercion is socialism.
  • Under socialist central planning the socialist planning board lacks a price system for land or capital goods.
  • Market socialism is, in fact, a contradiction in terms.
  • The fashionable discussion of market socialism often overlooks one crucial aspect of the market: When two goods are exchanged, what is really exchanged is the property titles in those goods.
  • This means that the key to the existence and flourishing of the free market is a society in which the rights and titles of private property are respected, defended, and kept secure.
  • The key to socialism, on the other hand, is government ownership of the means of production, land, and capital goods.
  • Under socialism, therefore, there can be no market in land or capital goods worthy of the name.
  • Some critics of the free market argue that property rights are in conflict with “human” rights. But the critics fail to realize that in a free-market system, every person has a property right over his own person and his own labor and can make free contracts for those services.
  • The free market and the free price system make goods from around the world available to consumers.
  • It is the coercive countries with little or no market activity—the notable examples in the last half of the twentieth century were the communist countries—where the grind of daily existence not only impoverishes people materially but also deadens their spirit.
Javier E

Thieves of experience: On the rise of surveillance capitalism - 1 views

  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • ...72 more annotations...
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
  • Social skills and relationships seem to suffer as well.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Google’s once-patient investors grew restive, demanding that the founders figure out a way to make money, preferably lots of it.
  • Under pressure, Page and Brin authorized the launch of an auction system for selling advertisements tied to search queries. The system was designed so that the company would get paid by an advertiser only when a user clicked on an ad. This feature gave Google a huge financial incentive to make accurate predictions about how users would respond to ads and other online content. Even tiny increases in click rates would bring big gains in income. And so the company began deploying its stores of behavioral data not for the benefit of users but to aid advertisers — and to juice its own profits. Surveillance capitalism had arrived.
  • Google’s business now hinged on what Zuboff calls “the extraction imperative.” To improve its predictions, it had to mine as much information as possible from web users. It aggressively expanded its online services to widen the scope of its surveillance.
  • Through Gmail, it secured access to the contents of people’s emails and address books. Through Google Maps, it gained a bead on people’s whereabouts and movements. Through Google Calendar, it learned what people were doing at different moments during the day and whom they were doing it with. Through Google News, it got a readout of people’s interests and political leanings. Through Google Shopping, it opened a window onto people’s wish lists.
  • The company gave all these services away for free to ensure they’d be used by as many people as possible. It knew the money lay in the data.
  • the organization grew insular and secretive. Seeking to keep the true nature of its work from the public, it adopted what its CEO at the time, Eric Schmidt, called a “hiding strategy” — a kind of corporate omerta backed up by stringent nondisclosure agreements.
  • Page and Brin further shielded themselves from outside oversight by establishing a stock structure that guaranteed their power could never be challenged, neither by investors nor by directors.
  • What’s most remarkable about the birth of surveillance capitalism is the speed and audacity with which Google overturned social conventions and norms about data and privacy. Without permission, without compensation, and with little in the way of resistance, the company seized and declared ownership over everyone’s information
  • The companies that followed Google presumed that they too had an unfettered right to collect, parse, and sell personal data in pretty much any way they pleased. In the smart homes being built today, it’s understood that any and all data will be beamed up to corporate clouds.
  • Google conducted its great data heist under the cover of novelty. The web was an exciting frontier — something new in the world — and few people understood or cared about what they were revealing as they searched and surfed. In those innocent days, data was there for the taking, and Google took it
  • Google also benefited from decisions made by lawmakers, regulators, and judges — decisions that granted internet companies free use of a vast taxpayer-funded communication infrastructure, relieved them of legal and ethical responsibility for the information and messages they distributed, and gave them carte blanche to collect and exploit user data.
  • Consider the terms-of-service agreements that govern the division of rights and the delegation of ownership online. Non-negotiable, subject to emendation and extension at the company’s whim, and requiring only a casual click to bind the user, TOS agreements are parodies of contracts, yet they have been granted legal legitimacy by the courts
  • Law professors, writes Zuboff, “call these ‘contracts of adhesion’ because they impose take-it-or-leave-it conditions on users that stick to them whether they like it or not.” Fundamentally undemocratic, the ubiquitous agreements helped Google and other firms commandeer personal data as if by fiat.
  • In the choices we make as consumers and private citizens, we have always traded some of our autonomy to gain other rewards. Many people, it seems clear, experience surveillance capitalism less as a prison, where their agency is restricted in a noxious way, than as an all-inclusive resort, where their agency is restricted in a pleasing way
  • Zuboff makes a convincing case that this is a short-sighted and dangerous view — that the bargain we’ve struck with the internet giants is a Faustian one
  • but her case would have been stronger still had she more fully addressed the benefits side of the ledger.
  • there’s a piece missing. While Zuboff’s assessment of the costs that people incur under surveillance capitalism is exhaustive, she largely ignores the benefits people receive in return — convenience, customization, savings, entertainment, social connection, and so on
  • What the industries of the future will seek to manufacture is the self.
  • Behavior modification is the thread that ties today’s search engines, social networks, and smartphone trackers to tomorrow’s facial-recognition systems, emotion-detection sensors, and artificial-intelligence bots.
  • All of Facebook’s information wrangling and algorithmic fine-tuning, she writes, “is aimed at solving one problem: how and when to intervene in the state of play that is your daily life in order to modify your behavior and thus sharply increase the predictability of your actions now, soon, and later.”
  • “The goal of everything we do is to change people’s actual behavior at scale,” a top Silicon Valley data scientist told her in an interview. “We can test how actionable our cues are for them and how profitable certain behaviors are for us.”
  • This goal, she suggests, is not limited to Facebook. It is coming to guide much of the economy, as financial and social power shifts to the surveillance capitalists
  • Combining rich information on individuals’ behavioral triggers with the ability to deliver precisely tailored and timed messages turns out to be a recipe for behavior modification on an unprecedented scale.
  • it was Facebook, with its incredibly detailed data on people’s social lives, that grasped digital media’s full potential for behavior modification. By using what it called its “social graph” to map the intentions, desires, and interactions of literally billions of individuals, it saw that it could turn its network into a worldwide Skinner box, employing psychological triggers and rewards to program not only what people see but how they react.
  • spying on the populace is not the end game. The real prize lies in figuring out ways to use the data to shape how people think and act. “The best way to predict the future is to invent it,” the computer scientist Alan Kay once observed. And the best way to predict behavior is to script it.
  • competition for personal data intensified. It was no longer enough to monitor people online; making better predictions required that surveillance be extended into homes, stores, schools, workplaces, and the public squares of cities and towns. Much of the recent innovation in the tech industry has entailed the creation of products and services designed to vacuum up data from every corner of our lives
  • “The typical complaint is that privacy is eroded, but that is misleading,” Zuboff writes. “In the larger societal pattern, privacy is not eroded but redistributed . . . . Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism.” The transfer of decision rights is also a transfer of autonomy and agency, from the citizen to the corporation.
  • What we lose under this regime is something more fundamental than privacy. It’s the right to make our own decisions about privacy — to draw our own lines between those aspects of our lives we are comfortable sharing and those we are not
  • Other possible ways of organizing online markets, such as through paid subscriptions for apps and services, never even got a chance to be tested.
  • Online surveillance came to be viewed as normal and even necessary by politicians, government bureaucrats, and the general public
  • Google and other Silicon Valley companies benefited directly from the government’s new stress on digital surveillance. They earned millions through contracts to share their data collection and analysis techniques with the National Security Agency.
  • As much as the dot-com crash, the horrors of 9/11 set the stage for the rise of surveillance capitalism. Zuboff notes that, in 2000, members of the Federal Trade Commission, frustrated by internet companies’ lack of progress in adopting privacy protections, began formulating legislation to secure people’s control over their online information and severely restrict the companies’ ability to collect and store it. It seemed obvious to the regulators that ownership of personal data should by default lie in the hands of private citizens, not corporations.
  • The 9/11 attacks changed the calculus. The centralized collection and analysis of online data, on a vast scale, came to be seen as essential to national security. “The privacy provisions debated just months earlier vanished from the conversation more or less overnight,”
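The pay-per-click mechanism described in these highlights has a simple logic: since the advertiser pays only on a click, the value of showing an ad is its bid weighted by the predicted click rate — which is why better behavioral predictions translate directly into more revenue. A minimal sketch of that ranking idea, with invented advertiser names, bids, and click rates (this illustrates the general principle, not Google's actual auction):

```python
# Hedged sketch: ranking ads in a pay-per-click auction by expected
# revenue per impression. All names and numbers are illustrative
# assumptions, not any real company's system.

def expected_revenue(bid, predicted_ctr):
    """Advertiser pays only on a click, so expected revenue per
    impression is the bid weighted by the predicted click rate."""
    return bid * predicted_ctr

ads = [
    {"advertiser": "A", "bid": 2.00, "predicted_ctr": 0.01},
    {"advertiser": "B", "bid": 0.50, "predicted_ctr": 0.08},
    {"advertiser": "C", "bid": 1.00, "predicted_ctr": 0.03},
]

# Sort by expected revenue: a low bid with a high predicted click
# rate can outrank a high bid, so even tiny gains in prediction
# accuracy are worth money.
ranked = sorted(
    ads,
    key=lambda ad: expected_revenue(ad["bid"], ad["predicted_ctr"]),
    reverse=True,
)
print([ad["advertiser"] for ad in ranked])  # → ['B', 'C', 'A']
```

Note how the highest bidder (A) comes last: with a 1% predicted click rate, its expected value per impression is only $0.02, versus $0.04 for B's low bid at 8%.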
ilanaprincilus06

Opinion | Beyond the Brain - The New York Times - 0 views

  • Human beings are nothing but neurons, they assert. Once we understand the brain well enough, we will be able to understand behavior.
  • We will see that people don’t really possess free will; their actions are caused by material processes emerging directly out of nature.
  • Neuroscience will replace psychology and other fields as the way to understand action.
  • ...8 more annotations...
  • there is the problem that one action can arise out of many different brain states and the same event can trigger many different brain reactions.
  • we have this useful concept, “working memory,” but the activity described by this concept is widely distributed across at least 30 regions of the brain.
  • It is probably impossible to look at a map of brain activity and predict or even understand the emotions, reactions, hopes and desires of the mind.
  • you may order the same salad, but your brain activity will look different, depending on whether you are drunk or sober, alert or tired.
  • there is the problem of meaning. A glass of water may be more meaningful to you when you are dying of thirst than when you are not.
  • People can change their brains in unique and unpredictable ways by shifting the patterns of their attention.
  • we are compelled to rely on different disciplines to try to understand behavior on multiple levels, with inherent tensions between them.
  • They want to eliminate the confusing ambiguity of human freedom by reducing everything to material determinism.
lucieperloff

Secrets of the 'uncrushable' beetle revealed - BBC News - 0 views

  • Equipped with super-tough body armour, the insect can survive being stamped on or even run over by a car.
    • lucieperloff
       
      Oh no... Where does this thing live?
  • And the findings could give clues to building tougher materials for use in construction and aeronautics.
    • lucieperloff
       
      Can actually have a positive impact on society
  • ...2 more annotations...
  • which allow the beetle to withstand forces of up to 149 Newtons (approximately 39,000 times the creature's body weight).
  • To test the potential of this type of structure as a way of joining different materials, such as plastics and metal, the scientists made a series of joints from metal and composites based on those seen in the beetle.
    • lucieperloff
       
      Able to improve science and technology from something found in nature
mmckenziejr01

Does Photographic Memory Exist? - Scientific American - 0 views

  • The intuitive notion of a “photographic” memory is that it is just like a photograph: you can retrieve it from your memory at will and examine it in detail, zooming in on different parts. But a true photographic memory in this sense has never been proved to exist.
  • Most of us do have a kind of photographic memory, in that most people's memory for visual material is much better and more detailed than our recall of most other kinds of material.
  • Sorry to disappoint further, but even an amazing memory in one domain, such as vision, is not a guarantee of great memory across the board.
  • ...1 more annotation...
  • A winner of the memory Olympics, for instance, still had to keep sticky notes on the refrigerator to remember what she had to do during the day.
  •  
    I was researching photographic memory to go off of our discussion last class. Its more scientific name is eidetic memory (this is also more useful to google if you're looking for information).
manhefnawi

William James on Science and Spirituality, the Limits of Materialism, and the Existenti... - 0 views

  • At bottom the whole concern of both morality and religion is with the manner of our acceptance of the universe. Do we accept it only in part and grudgingly, or heartily and altogether?
  • “I live my life with the idea that the universe can be described by a set of physical laws that are quantifiable and knowable, and that they apply anywhere in the universe, and that’s an assumption,” NASA astrophysicist Natalie Batalha — a modern-day Carl Sagan — reflected in our On Being conversation. Assumption is a species of belief, or rather the genome of all belief — which is why Sagan himself asserted in his superb meditation on science and religion, based on his 1985 Gifford Lectures in Scotland, that “if we ever reach the point where we think we thoroughly understand who we are and where we came from, we will have failed.”
blythewallick

The quiet loss of knowledge threatens indigenous communities -- ScienceDaily - 0 views

  • Plants play an important role for most indigenous communities in South America, and not merely as a source of food. They also provide the raw material for building materials, tools, medicine, and much more. The extinction of a plant species therefore also endangers the very foundation of these people's way of life.
  • The problem is that this is not written down. Passed down as a cultural inheritance, it exists only in the minds of the people -- and could therefore vanish almost unnoticed. "Very little is known about how vulnerable this knowledge is in the context of current global change," says Jordi Bascompte, professor of ecology at the University of Zurich.
  • "There is therefore an urgent need to find out how biological and cultural factors interact with each other in determining the services provided by biodiversity."
  • ...4 more annotations...
  • For their study, they analyzed knowledge held by 57 indigenous communities in the Amazon basin, the Andes and the Chocó region to collate their knowledge of palm trees. The researchers then depicted the different palm species and their uses in graphical form in a network, from which they could identify the local and regional links between the knowledge of indigenous communities.
  • "In this context, cultural diversity is just as important as biological diversity," says Jordi Bascompte. "In particular, the simultaneous loss of plant species and cultural inheritance leads to a much faster disintegration of the indigenous knowledge network."
  • However, the irreplaceable knowledge that is gradually disappearing from indigenous communities is equally important for the service that an ecosystem provides."
  • The study also highlights the value of transdisciplinary collaboration between ecology and social science: "The relationship established between biological and cultural diversity can help strengthen the resilience of indigenous communities in the face of global change."
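The study's core method, as described in these highlights, is to represent knowledge as a network linking communities, palm species, and uses, so that the consequences of losing a species (or a community's knowledge of it) can be traced. A minimal sketch of that idea in plain Python, with invented community names and uses (the species names are real Amazonian palms, but the links here are illustrative assumptions, not the study's data):

```python
# Hedged sketch: indigenous knowledge as a network mapping
# (community, palm species) pairs to the uses they know.
# The links below are invented for illustration.

knowledge = {
    ("community_1", "Euterpe precatoria"): {"food", "construction"},
    ("community_2", "Euterpe precatoria"): {"medicine"},
    ("community_2", "Mauritia flexuosa"): {"food", "tools"},
}

def uses_lost_if_species_vanishes(species):
    """Uses that disappear entirely when a species goes extinct:
    a use survives only if some other species still provides it."""
    lost = set()
    surviving = set()
    for (community, sp), uses in knowledge.items():
        if sp == species:
            lost |= uses
        else:
            surviving |= uses
    return lost - surviving

print(sorted(uses_lost_if_species_vanishes("Mauritia flexuosa")))  # → ['tools']
```

In this toy network, losing Mauritia flexuosa eliminates "tools" entirely (no other species provides it), while "food" survives through Euterpe precatoria — illustrating why the simultaneous loss of species and cultural knowledge disintegrates the network faster than either alone.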