Home/ TOK Friends/ Group items tagged obscurity


Javier E

G.M. Lawyers Hid Fatal Flaw, From Critics and One Another - NYTimes.com - 0 views

  • An internal investigation released on Thursday into the company’s failure to recall millions of defective small cars found no evidence of a cover-up. But interviews with victims, their lawyers and current and former G.M. employees, as well as evidence in the report itself, paint a more complete picture: The automaker’s legal department took actions that obscured the deadly flaw, both inside and outside the company.
  • “That says to me that the G.M. lawyers were involved in keeping the ignition failure secret case by case,” said Mr. Zitrin, who has helped draft new federal legislation that would make it difficult for corporations to enter into confidential settlements.
  • The secrecy factor extended to how some employees kept or discarded old emails. According to two former G.M. officials, company lawyers conducted annual audits of some employees’ emails that could be used as evidence in lawsuits against the company.
Javier E

Enlightenment's Evil Twin - The Atlantic - 0 views

  • The first time I can remember feeling like I didn’t exist, I was 15. I was sitting on a train and all of a sudden I felt like I’d been dropped into someone else’s body. My memories, experiences, and feelings—the things that make up my intrinsic sense of “me-ness”—projected across my mind like phantasmagoria, but I felt like they belonged to someone else. Like I was experiencing life in the third person.
  • It’s characterized by a pervasive and disturbing sense of unreality in both the experience of self (called “depersonalization”) and one’s surroundings (known as “derealization”); accounts of it are surreal, obscure, shrouded in terms like “unreality” and “dream,” but they’re often conveyed with an almost incongruous lucidity.
  • It’s not a psychotic condition; the sufferers are aware that what they’re perceiving is unusual. “We call it an ‘as if’ disorder. People say they feel as if they’re in a movie, as if they’re a robot,” Medford says.
  • ...13 more annotations...
  • Studies carried out with college students have found that brief episodes are common in young people, with a prevalence ranging from 30 to 70 percent. It can happen when you’re jet-lagged, hungover, or stressed. But for roughly 1 to 2 percent of the population, it becomes persistent and distressing.
  • Research suggests that areas of the brain that are key to emotional and physical sensations, such as the amygdala and the insula, appear to be less responsive in chronic depersonalization sufferers. You might become less empathetic; your pain threshold might increase. These numbing effects mean that it’s commonly conceived as a defense mechanism; Hunter calls it a “psychological trip switch” which can be triggered in times of stress.
  • Have you ever played that game where you repeat a word over and over again until it loses all meaning? It’s called semantic satiation. Like words, can a sense of self be broken down into arbitrary, socially constructed components?
  • That question may be why the phenomenon has attracted a lot of interest from philosophers. In a sense, the experience presupposes certain notions of how the self is meant to feel. We think of a self as an essential thing—a soul or an ego that everyone has and is aware of—but scientists and philosophers have been telling us for a while now that the self isn’t quite as it seems
  • there is no center in the brain where the self is generated. “What we experience is a powerful depiction generated by our brains for our benefit,” he writes. Brains make sense of data that would otherwise be overwhelming. “Experiences are fragmented episodes unless they are woven together in a meaningful narrative,” he writes, with the self being the story that “pulls it all together.”
  • “The unity [of self that] we experience, which allows us legitimately to talk of ‘I,’ is a result of the Ego Trick—the remarkable way in which a complicated bundle of mental events, made possible by the brain, creates a singular self, without there being a singular thing underlying it,”
  • “depersonalization is both a burden, a horrible burden—but it’s in some strange way a blessing, to reach some depths, some meaning which somehow comes only in the broken mirror,” Bezzubova says. “It’s a Dostoyevsky-style illumination—where clarity cannot be distinguished from pain.”
  • for her, the experience is pleasant. “It’s helped me in my life,” she says. Over the past few years, she has learned to interpret her experiences in a Buddhist context, and she describes depersonalization as a “deconditioning” of sorts: “The significance I place on the world is all in my mind,”
  • “I believe I am on the path to enlightenment,” she says.
  • The crossover between dark mental states and Buddhist practices is being investigated
  • Mindfulness has become increasingly popular in the West over the past few years, but as Britton told The Atlantic, the practice in its original form isn’t just about relaxation: It’s about the often painstaking process of coming to terms with three specific insights of the Theravadin Buddhist tradition, which are anicca, or impermanence; dukkha, or dissatisfaction; and anatta, or not-self.
  • depersonalization must cause the patient distress and have an impact on her daily functioning for it to be classified as clinically significant. In this sense, it seems inappropriate to call Alice’s experiences pathological. “We have ways of measuring disorders, but you have to ask if it’s meaningful. It’s an open question,”
  • “I think calling it a loss of self is maybe a convenient shorthand for something that’s hard to capture,” he says. “I prefer to talk about experience—because that’s what’s important in psychiatry.”
Javier E

OKCupid Publishes Findings of User Experiments - NYTimes.com - 0 views

  • “If you use the Internet, you’re the subject of hundreds of experiments at any given time, on every site,” Christian Rudder, president of OKCupid, wrote on the company’s blog. “That’s how websites work.”
  • Ms. Harris said, however, that her expectations for online dating were low regardless of percentages displayed. If the experiment was short-lived and produced better matchmaking, she said, “It’s not that big a deal.”
  • OKCupid’s user agreement says that when a person signs up for the site, personal data may be used in research and analysis.
  • ...2 more annotations...
  • “We told users something that wasn’t true. I’m definitely not hiding from that fact,” said Mr. Rudder, OKCupid’s president. But he said the tests were done to determine how people can get the most from the site. “People come to us because they want the website to work, and we want the website to work.”
  • when the site obscured all profile photos one day, users engaged in more meaningful conversations, exchanged more contact details and responded to first messages more often. They got to know each other. But when pictures were reintroduced on the site, many of those conversations stopped cold.
carolinewren

Politicians, others on right, left challenge scientific consensus on some issues | The ... - 0 views

  • Often, pronouncements about either subject are accompanied by the politician’s mea culpa: “I’m not a scientist, but ... ”
  • It’s the “but” that has caused heartburn among scientists, many of whom say such skepticism has an impact on public policy.
  • “They’ve been using it as if they can dismiss the view of scientists, which doesn’t make any sense,”
  • ...31 more annotations...
  • ‘Well, I’m not an engineer, but I think the bridge will stand up.’  ”
  • “Not just as a public figure, but as a human being, your fidelity should be to reality and to the truth,”
  • Among those agreeing that climate change is both real and a man-made threat are the Intergovernmental Panel on Climate Change, NASA, the National Academy of Sciences, the Defense Department, the American Association for the Advancement of Science and the American Meteorological Society
  • giving parents a “measure of choice” on vaccination is “the balance that government has to decide.”
  • murkiness of those comments caused alarm among public-health officials, who say the impact of the anti-vaccination movement is being seen in a measles outbreak in a number of states and Washington, D.C.
  • Climate change also sparks tension.
  • He said he was galled by U.S. Sen. Rand Paul’s recent assertion that the government should not require parents to vaccinate their children because it’s an issue of “freedom.”
  • “There is an unwritten litmus test for GOP officeholders” to express some form of skepticism about the phenomenon, he said.
  • However, Cruz, Rubio, Portman and Paul all voted against another amendment that said human activity contributes “significantly” to the threat. Cruz has asserted to the National Journal that climate change is “a theory that can’t be proven or disproven.”
  • In a separate vote, 98 senators — including Cruz, Rubio, Portman and Paul — acknowledged that climate change is “real and not a hoax.”
  • The group that denied climate change is occurring has pivoted, acknowledging that it exists. Still, the group questions whether it is a man-made phenomenon.
  • As for the caveat “I’m not a scientist,” “What they’re saying they implicitly think is that scientists don’t even know about climate change,”
  • Conservatives felt more negative emotions when they read scientific studies that challenged their views on climate change and evolution than liberals did in reading about nuclear power and fracking, but researchers believe that’s because climate change and evolution are more national in scope than the issues picked for liberals.
  • “The point is, to a very high level, scientists do know.”
  • Even those who agree that climate change is real and is man-made might not support government action
  • He said the disconnect between the public and scientists isn’t necessarily a bad thing
  • Such a slowdown “gives the science time to mature on some of these issues.”
  • most would-be candidates want to appeal to as many people as possible.
  • “And if you can sort of try to obscure your actual position but not offend anyone, that’s what I think they try to do,”
  • But it’s possible that their comments reflect a growing disconnect between the views of the public and the scientific community.
  • 86 percent of scientists who are members of the American Association for the Advancement of Science said childhood vaccines such as the one for measles-mumps-rubella should be required, while 68 percent of U.S. adults agreed.
  • larger gap on the subject of climate change: 87 percent of the scientists said climate change is caused mostly by human activity, while 50 percent of U.S. adults did.
  • The divide is not necessarily a conservative one
  • For example, while 88 percent of scientists said it is generally safe to eat genetically modified foods, only 37 percent of U.S. adults agreed.
  • And the vaccine issue is one that has united some liberals, the religious right and libertarians.
  • The study found that conservatives tend to distrust science on issues such as climate change and evolution. For liberals, it is fracking and nuclear power.
  • didn’t stop 39 Republicans — including GOP presidential contenders Sens. Ted Cruz, R-Texas, and Marco Rubio, R-Fla. — from opposing an amendment last month that blamed changing global temperatures on human activity.
  • liberals showed some distrust about science when they read about climate change and evolution
  • “Liberals can be just as biased as conservatives,” he said.
  • Rosenberg said the Internet can provide affirmation of pre-existing beliefs rather than encouraging people to find objective sources of information, such as peer-reviewed journals.
  • Often, attacking science is the easiest way to justify inaction, Rosenberg said.
Javier E

A New Report Argues Inequality Is Causing Slower Growth. Here's Why It Matters. - NYTim... - 0 views

  • Is income inequality holding back the United States economy? A new report argues that it is, that an unequal distribution in incomes is making it harder for the nation to recover from the recession
  • The fact that S.&P., an apolitical organization that aims to produce reliable research for bond investors and others, is raising alarms about the risks that emerge from income inequality is a small but important sign of how a debate that has been largely confined to the academic world and left-of-center political circles is becoming more mainstream.
  • “Our review of the data, as well as a wealth of research on this matter, leads us to conclude that the current level of income inequality in the U.S. is dampening G.D.P. growth,” the S.&P. researchers write
  • ...7 more annotations...
  • To understand why this matters, you have to know a little bit about the many tribes within the world of economics.
  • There are the academic economists who study the forces shaping the modern economy. Their work is rigorous but often obscure. Some of them end up in important policy jobs (See: Bernanke, B.) or write books for a mass audience (Piketty, T.), but many labor in the halls of academia for decades writing carefully vetted articles for academic journals that are rigorous as can be but are read by, to a first approximation, no one.
  • Then there are the economists in what can broadly be called the business forecasting community. They wear nicer suits than the academics, and are better at offering a glib, confident analysis of the latest jobs numbers delivered on CNBC or in front of a room full of executives who are their clients. They work for ratings firms like S.&P., forecasting firms like Macroeconomic Advisers and the economics research departments of all the big banks.
  • they are trying to do the practical work of explaining to clients — companies trying to forecast future demand, investors trying to allocate assets — how the economy is likely to evolve. They’re not really driven by ideology, or by models that are rigorous enough in their theoretical underpinnings to pass academic peer review. Rather, their success or failure hinges on whether they’re successful at giving those clients an accurate picture of where the economy is heading.
  • the worry that income inequality is a factor behind subpar economic growth over the last five years (and really the last 15 years) is going from an idiosyncratic argument made mainly by left-of-center economists to something that even the tribe of business forecasters needs to wrestle with.
  • Because the affluent tend to save more of what they earn rather than spend it, as more and more of the nation’s income goes to people at the top income brackets, there isn’t enough demand for goods and services to maintain strong growth, and attempts to bridge that gap with debt feed a boom-bust cycle of crises, the report argues. High inequality can feed on itself, as the wealthy use their resources to influence the political system toward policies that help maintain that advantage, like low tax rates on high incomes and low estate taxes, and underinvestment in education and infrastructure
  • The report itself does not break any major new analytical or empirical ground. It spends many pages summarizing the findings of various academic and government economists who have studied inequality and its discontents, and stops short of recommending any radical policy changes
Javier E

Who Wins in the Name Game? - The Atlantic - 2 views

  • Not being able to pronounce a name spells a death sentence for relationships. That’s because the ability to pronounce someone’s name is directly related to how close you feel to that person. Our brains tend to believe that if something is difficult to understand, it must also be high-risk.
  • companies with names that are simple and easy to pronounce see significantly higher investments than more complexly named stocks,
  • People with easier to pronounce names are also judged more positively and tend to be hired and promoted more often than their more obscurely named peers. 
  • ...7 more annotations...
  • In competitive fields that have classically been dominated by men, such as law and engineering, women with sexually ambiguous names tend to be more successful.
  • Our names can even influence what cities we live in, who we befriend, and what products we buy since, we’re attracted to things and places that share similarities to our names.
  • Teachers tend to hold lower expectations for students with typically black-sounding names while they set high expectations for students with typically white- and Asian-sounding names. And this early assessment of students’ abilities could influence students’ expectations for themselves.
  • A first name can imply race, age, socioeconomic status, and sometimes religion, so it’s an easy—or lazy—way to judge someone’s background, character, and intelligence.
  • On this year’s French baccalaureate, an exam that determines university placement for high school students, test-takers named Thomas (for boys) and Marie (for girls) tended to score highest. These are, you will note, typically white, French, middle- or upper-class names.
  • A 2004 study showed that, all else being equal, employers selected candidates with names like Emily Walsh and Greg Baker for callbacks almost 50 percent more often than candidates with names like Lakisha Washington and Jamal Jones.
  • The researchers concluded that there was a great advantage to having a white-sounding name, so much so that having a white-sounding name is worth about eight years of work experience. “Jamal” would have to work in an industry for eight years longer than “Greg” for them to have equal chances of being hired,
silveiragu

College Scorecard Sandbags Equity in Higher Education | Patricia McGuire - 0 views

  • the "haves" in higher education have quite a lot; the "have nots" struggle mightily. And this economic chasm is seriously influenced by gender, race and social class -- issues on which the College Scorecard is silent, but which affect just about every factoid presented
  • The reality is that even smart wonks educated at some of the best of the "haves" can be blind to social reality; their monument to algorithmic gymnastics in the College Scorecard obscures some of the most important and painful facts about college life and American society today.
  • The administration presents the collegiate earnings data as if it were value-neutral, not only with no reference to the mission of institutions that may have different values from those the administration apparently exalts, but even more devastatingly, with no reference to the pernicious effects of gender and race discrimination on career opportunities and earnings.
  • ...6 more annotations...
  • I am not a wonk, but I did prepare this chart based on data in the College Scorecard and the federal data system IPEDS
  • The value-neutral approach to the collegiate earnings data ignores the facts of life about women and families.
  • 74% of all undergraduates have at least one "non-traditional" characteristic, and more than 55% have two or more non-traditional characteristics such as having children, being a caregiver, delaying college enrollment, attending part-time, working full-time.
  • But the College Scorecard completely ignores the increasingly non-traditional nature of the nation's undergraduate student body today, and instead, presents data as if most college students are privileged children whiling away four years in some grove of academic luxury
  • The Obama administration claims that the new College Scorecard will provide more "transparent" data to students and families trying to decide which college to attend. Unfortunately, by presenting some data in value-neutral or misleading ways, and ignoring other truly important questions in the college choice process
  • the administration presents a data mashup with limited utility for consumers but large potential for misrepresentation of social realities.
Javier E

Bile, venom and lies: How I was trolled on the Internet - The Washington Post - 0 views

  • Thomas Jefferson often argued that an educated public was crucial for the survival of self-government
  • We now live in an age in which that education takes place mostly through relatively new platforms. Social networks — Facebook, Twitter, Instagram, etc. — are the main mechanisms by which people receive and share facts, ideas and opinions. But what if they encourage misinformation, rumors and lies?
  • In a comprehensive new study of Facebook that analyzed posts made between 2010 and 2014, a group of scholars found that people mainly shared information that confirmed their prejudices, paying little attention to facts and veracity. (Hat tip to Cass Sunstein, the leading expert on this topic.) The result, the report says, is the “proliferation of biased narratives fomented by unsubstantiated rumors, mistrust and paranoia.”
  • ...7 more annotations...
  • The authors specifically studied trolling — the creation of highly provocative, often false information, with the hope of spreading it widely. The report says that “many mechanisms cause false information to gain acceptance, which in turn generate false beliefs that, once adopted by an individual, are highly resistant to correction.”
  • in recent weeks I was the target of a trolling campaign and saw exactly how it works. It started when an obscure website published a post titled “CNN host Fareed Zakaria calls for jihad rape of white women.” The story claimed that in my “private blog” I had urged the use of American women as “sex slaves” to depopulate the white race. The post further claimed that on my Twitter account, I had written the following line: “Every death of a white person brings tears of joy to my eyes.”
  • Disgusting. So much so that the item would collapse from its own weightlessness, right? Wrong. Here is what happened next: Hundreds of people began linking to it, tweeting and retweeting it, and adding their comments, which are too vulgar or racist to repeat. A few ultra-right-wing websites reprinted the story as fact. With each new cycle, the levels of hysteria rose, and people started demanding that I be fired, deported or killed. For a few days, the digital intimidation veered out into the real world. Some people called my house late one night and woke up and threatened my daughters, who are 7 and 12.
  • The people spreading this story were not interested in the facts; they were interested in feeding prejudice. The original story was cleverly written to provide conspiracy theorists with enough ammunition to ignore evidence. It claimed that I had taken down the post after a few hours when I realized it “receive[d] negative attention.” So, when the occasional debunker would point out that there was no evidence of the post anywhere, it made little difference. When confronted with evidence that the story was utterly false, it only convinced many that there was a conspiracy and coverup.
  • conversations on Facebook are somewhat more civil, because people generally have to reveal their identities. But on Twitter and in other places — the online comments section of The Post, for example — people can be anonymous or have pseudonyms. And that is where bile and venom flow freely.
  • an experiment performed by two psychologists in 1970. They divided students into two groups based on their answers to a questionnaire: high prejudice and low prejudice. Each group was told to discuss controversial issues such as school busing and integrated housing. Then the questions were asked again. “The surveys revealed a striking pattern,” Kolbert noted. “Simply by talking to one another, the bigoted students had become more bigoted and the tolerant more tolerant.”
  • This “group polarization” is now taking place at hyper speed, around the world. It is how radicalization happens and extremism spreads.
Javier E

The National Book Awards Haul Translators Out of Obscurity - The Atlantic - 0 views

  • In 2018, American literature no longer means literature written by Americans, for Americans, about America. It means literature that, wherever it comes from, whatever nation it describes, American readers recognize as relevant to them, as familiar. Foreign is no longer foreign
  • the question of how “foreign” a translation should “feel” provokes fierce disagreement. When you open a translated novel from overseas, do you want to sense its author’s French, German, Swedish, Spanish or Italian sensibility, even if that breaks the spell of your reading experience? Or do you want to feel as if the book had magically converted itself into flawless, easeful English, attuned to your own idiom? (This is called the “foreignization vs. domestication” debate.)
  • And should a translation hew closely to the language and structure of the original, or should it recraft the language to appeal to the target audience? (This is the “faithfulness” question.) Hardly anyone agrees—not editors, not scholars, not translators, and not readers.
Javier E

When bias beats logic: why the US can't have a reasoned gun debate | US news | The Guar... - 1 views

  • Jon Stokes, a writer and software developer, said he is frustrated after each mass shooting by “the sentiment among very smart people, who are used to detail and nuance and doing a lot of research, that this is cut and dried, this is black and white”.
  • Stokes has lived on both sides of America’s gun culture war, growing up in rural Louisiana, where he got his first gun at age nine, and later studying at Harvard and the University of Chicago, where he adopted some of a big-city resident’s skepticism about guns. He’s written articles about the gun geek culture behind the popularity of the AR-15, why he owns a military-style rifle, and why gun owners are so skeptical of tech-enhanced “smart guns”.
  • Even to suggest that the debate is more complicated – that learning something about guns, by taking a course on how to safely carry a concealed weapon, or learning how to fire a gun, might shift their perspective on whichever solution they have just heard about on TV – “just upsets them, and they basically say you’re trying to obscure the issue”.
  • ...8 more annotations...
  • In early 2013, a few months after the mass shooting at Sandy Hook elementary school, a Yale psychologist created an experiment to test how political bias affects our reasoning skills. Dan Kahan was attempting to understand why public debates over social problems remain deadlocked, even when good scientific evidence is available. He decided to test a question about gun control.
  • Then Kahan ran the same test again. This time, instead of evaluating skin cream trials, participants were asked to evaluate whether a law banning citizens from carrying concealed firearms in public made crime go up or down. The result: when liberals and conservatives were confronted with a set of results that contradicted their political assumptions, the smartest people were barely more likely to arrive at the correct answer than the people with no math skills at all. Political bias had erased the advantages of stronger reasoning skills.
  • The reason that measurable facts were sidelined in political debates was not that people have poor reasoning skills, Kahan concluded. Presented with a conflict between holding to their beliefs or finding the correct answer to a problem, people simply went with their tribe.
  • It was a reasonable strategy on the individual level — and a “disastrous” one for tackling social change, he concluded.
  • But the biggest distortion in the gun control debate is the dramatic empathy gap between different kinds of victims. It’s striking how puritanical the American imagination is, how narrow its range of sympathy. Mass shootings, in which the perpetrator kills complete strangers at random in a public place, prompt an outpouring of grief for the innocent lives lost. These shootings are undoubtedly horrifying, but they account for a tiny percentage of America’s overall gun deaths each year.
  • The roughly 60 gun suicides each day, the 19 black men and boys lost each day to homicide, do not inspire the same reaction, even though they represent the majority of gun violence victims. Yet there are meaningful measures which could save lives here – targeted interventions by frontline workers in neighborhoods where the gun homicide rate is 400 times higher than other developed countries, awareness campaigns to help gun owners in rural states learn about how to identify suicide risk and intervene with friends in trouble.
  • When it comes to suicide, “there is so much shame about that conversation … and where there is shame there is also denial,”
  • When young men of color are killed, “you have disdain and aggression,” fueled by the type of white supremacist argument which equates blackness with criminality.
Javier E


Professors like me can't stay silent about this extremist moment on campuses - The Wash... - 0 views

  • At Reed College in Oregon, where I work, a group of students began protesting the required first-year humanities course a year ago. Three times a week, students sat in the lecture space holding signs — many too obscene to be printed here — condemning the course and its faculty as white supremacists, as anti-black, as not open to dialogue and criticism, on the grounds that we continue to teach, among many other things, Aristotle and Plato.
  • In the interest of supporting dissent and the free exchange of ideas, the faculty and administration allowed this.
  • Those who felt able to do so lectured surrounded by those signs for the better part of a year. I lectured, but dealt with physical anxiety — lack of sleep, nausea, loss of appetite, inability to focus — in the weeks leading up to my lecture. Instead of walking around or standing at the lectern, as I typically do, I sat as I tried to teach students how to read the poetry of Sappho. Inadvertently, I spoke more quietly, more timidly.
  • ...4 more annotations...
  • Some colleagues, including people of color, immigrants and those without tenure, found it impossible to work under these conditions. The signs intimidated faculty into silence, just as intended, and these silenced professors’ lectures were quietly replaced by talks from people willing and able to carry on teaching in the face of these demonstrations.
  • I think obscuring these acts of silencing was a mistake that resulted in an escalation of the protesters’ tactics.
  • This academic year, the first lecture was to be a panel introduction of the course: Along with two colleagues, I was going to offer my thoughts on the course, the study of the humanities and the importance of students’ knowing the history of the education they were beginning.
  • We introduced ourselves and took our seats. But as we were about to begin, the protesters seized our microphones, stood in front of us and shut down the lecture.
Javier E

Writing, Typing, and Economics - The Atlantic - 0 views

  • The first lesson would have to do with the all-important issue of inspiration. All writers know that on some golden mornings they are touched by the wand — are on intimate terms with poetry and cosmic truth. I have experienced those moments myself. Their lesson is simple: It's a total illusion.
  • And the danger in the illusion is that you will wait for those moments. Such is the horror of having to face the typewriter that you will spend all your time waiting. I am persuaded that most writers, like most shoemakers, are about as good one day as the next (a point which Trollope made), hangovers apart. The difference is the result of euphoria, alcohol, or imagination. The meaning is that one had better go to his or her typewriter every morning and stay there regardless of the seeming result. It will be much the same.
  • Writers, in contrast, do nothing because they are waiting for inspiration. In my own case there are days when the result is so bad that no fewer than five revisions are required. However, when I'm greatly inspired, only four revisions are needed before, as I've often said, I put in that note of spontaneity which even my meanest critics concede
  • ...13 more annotations...
  • It helps greatly in the avoidance of work to be in the company of others who are also waiting for the golden moment. The best place to write is by yourself, because writing becomes an escape from the terrible boredom of your own personality. It's the reason that for years I've favored Switzerland, where I look at the telephone and yearn to hear it ring.
  • There may be inspired writers for whom the first draft is just right. But anyone who is not certifiably a Milton had better assume that the first draft is a very primitive thing. The reason is simple: Writing is difficult work. Ralph Paine, who managed Fortune in my time, used to say that anyone who said writing was easy was either a bad writer or an unregenerate liar
  • Thinking, as Voltaire avowed, is also a very tedious thing which men—or women—will do anything to avoid. So all first drafts are deeply flawed by the need to combine composition with thought. Each later draft is less demanding in this regard. Hence the writing can be better
  • There does come a time when revision is for the sake of change—when one has become so bored with the words that anything that is different looks better. But even then it may be better.
  • the lesson of Harry Luce. No one who worked for him ever again escaped the feeling that he was there looking over one's shoulder. In his hand was a pencil; down on each page one could expect, any moment, a long swishing wiggle accompanied by the comment: "This can go." Invariably it could. It was written to please the author and not the reader. Or to fill in the space. The gains from brevity are obvious; in most efforts to achieve brevity, it is the worst and dullest that goes. It is the worst and dullest that spoils the rest.
  • Economics is an example, and within the field of economics the subject of money, with the history of which I have been much concerned, is an especially good case. Any specialist who ventures to write on money with a view to making himself intelligible works under a grave moral hazard. He will be accused of oversimplification. The charge will be made by his fellow professionals, however obtuse or incompetent
  • Reluctantly, but from a long and terrible experience, I would urge my young writers to avoid all attempts at humor
  • Only a very foolish man will use a form of language that is wholly uncertain in its effect. That is the nature of humor
  • Finally, I would come to a matter of much personal interest, intensely self-serving. It concerns the peculiar pitfalls of the writer who is dealing with presumptively difficult or technical matters
  • as he grew older, he became less and less interested in theory, more and more interested in information.
  • In the case of economics there are no important propositions that cannot be stated in plain language
  • Additionally, and especially in the social sciences, much unclear writing is based on unclear or incomplete thought
  • It is possible with safety to be technically obscure about something you haven't thought out. It is impossible to be wholly clear on something you do not understand. Clarity thus exposes flaws in the thought
Javier E

Ivy League Schools Are Overrated. Send Your Kids Elsewhere. | New Republic - 1 views

  • a blizzard of admissions jargon that I had to pick up on the fly. “Good rig”: the transcript exhibits a good degree of academic rigor. “Ed level 1”: parents have an educational level no higher than high school, indicating a genuine hardship case. “MUSD”: a musician in the highest category of promise. Kids who had five or six items on their list of extracurriculars—the “brag”—were already in trouble, because that wasn’t nearly enough.
  • With so many accomplished applicants to choose from, we were looking for kids with something special, “PQs”—personal qualities—that were often revealed by the letters or essays. Kids who only had the numbers and the résumé were usually rejected: “no spark,” “not a team-builder,” “this is pretty much in the middle of the fairway for us.” One young person, who had piled up a truly insane quantity of extracurriculars and who submitted nine letters of recommendation, was felt to be “too intense.”
  • On the other hand, the numbers and the résumé were clearly indispensable. I’d been told that successful applicants could either be “well-rounded” or “pointy”—outstanding in one particular way—but if they were pointy, they had to be really pointy: a musician whose audition tape had impressed the music department, a scientist who had won a national award.
  • ...52 more annotations...
  • When I speak of elite education, I mean prestigious institutions like Harvard or Stanford or Williams as well as the larger universe of second-tier selective schools, but I also mean everything that leads up to and away from them—the private and affluent public high schools; the ever-growing industry of tutors and consultants and test-prep courses; the admissions process itself, squatting like a dragon at the entrance to adulthood; the brand-name graduate schools and employment opportunities that come after the B.A.; and the parents and communities, largely upper-middle class, who push their children into the maw of this machine.
  • Our system of elite education manufactures young people who are smart and talented and driven, yes, but also anxious, timid, and lost, with little intellectual curiosity and a stunted sense of purpose: trapped in a bubble of privilege, heading meekly in the same direction, great at what they’re doing but with no idea why they’re doing it.
  • “Super People,” the writer James Atlas has called them—the stereotypical ultra-high-achieving elite college students of today. A double major, a sport, a musical instrument, a couple of foreign languages, service work in distant corners of the globe, a few hobbies thrown in for good measure: They have mastered them all, and with a serene self-assurance
  • Like so many kids today, I went off to college like a sleepwalker. You chose the most prestigious place that let you in; up ahead were vaguely understood objectives: status, wealth—“success.” What it meant to actually get an education and why you might want one—all this was off the table.
  • It was only after 24 years in the Ivy League—college and a Ph.D. at Columbia, ten years on the faculty at Yale—that I started to think about what this system does to kids and how they can escape from it, what it does to our society and how we can dismantle it.
  • I taught many wonderful young people during my years in the Ivy League—bright, thoughtful, creative kids whom it was a pleasure to talk with and learn from. But most of them seemed content to color within the lines that their education had marked out for them. Very few were passionate about ideas. Very few saw college as part of a larger project of intellectual discovery and development. Everyone dressed as if they were ready to be interviewed at a moment’s notice.
  • Look beneath the façade of seamless well-adjustment, and what you often find are toxic levels of fear, anxiety, and depression, of emptiness and aimlessness and isolation. A large-scale survey of college freshmen recently found that self-reports of emotional well-being have fallen to their lowest level in the study’s 25-year history.
  • So extreme are the admission standards now that kids who manage to get into elite colleges have, by definition, never experienced anything but success. The prospect of not being successful terrifies them, disorients them. The cost of falling short, even temporarily, becomes not merely practical, but existential. The result is a violent aversion to risk.
  • There are exceptions, kids who insist, against all odds, on trying to get a real education. But their experience tends to make them feel like freaks. One student told me that a friend of hers had left Yale because she found the school “stifling to the parts of yourself that you’d call a soul.”
  • What no one seems to ask is what the “return” is supposed to be. Is it just about earning more money? Is the only purpose of an education to enable you to get a job? What, in short, is college for?
  • The first thing that college is for is to teach you to think.
  • College is an opportunity to stand outside the world for a few years, between the orthodoxy of your family and the exigencies of career, and contemplate things from a distance.
  • it is only through the act of establishing communication between the mind and the heart, the mind and experience, that you become an individual, a unique being—a soul. The job of college is to assist you to begin to do that. Books, ideas, works of art and thought, the pressure of the minds around you that are looking for their own answers in their own ways.
  • College is not the only chance to learn to think, but it is the best. One thing is certain: If you haven’t started by the time you finish your B.A., there’s little likelihood you’ll do it later. That is why an undergraduate experience devoted exclusively to career preparation is four years largely wasted.
  • Elite schools like to boast that they teach their students how to think, but all they mean is that they train them in the analytic and rhetorical skills that are necessary for success in business and the professions.
  • Everything is technocratic—the development of expertise—and everything is ultimately justified in technocratic terms.
  • Religious colleges—even obscure, regional schools that no one has ever heard of on the coasts—often do a much better job in that respect.
  • At least the classes at elite schools are academically rigorous, demanding on their own terms, no? Not necessarily. In the sciences, usually; in other disciplines, not so much
  • professors and students have largely entered into what one observer called a “nonaggression pact.”
  • higher marks for shoddier work.
  • today’s young people appear to be more socially engaged than kids have been for several decades and that they are more apt to harbor creative or entrepreneurial impulses
  • they tend to be played out within the same narrow conception of what constitutes a valid life: affluence, credentials, prestige.
  • Experience itself has been reduced to instrumental function, via the college essay. From learning to commodify your experiences for the application, the next step has been to seek out experiences in order to have them to commodify
  • there is now a thriving sector devoted to producing essay-ready summers
  • To be a high-achieving student is to constantly be urged to think of yourself as a future leader of society.
  • what these institutions mean by leadership is nothing more than getting to the top. Making partner at a major law firm or becoming a chief executive, climbing the greasy pole of whatever hierarchy you decide to attach yourself to. I don’t think it occurs to the people in charge of elite colleges that the concept of leadership ought to have a higher meaning, or, really, any meaning.
  • The irony is that elite students are told that they can be whatever they want, but most of them end up choosing to be one of a few very similar things
  • As of 2010, about a third of graduates went into finance or consulting at a number of top schools, including Harvard, Princeton, and Cornell.
  • Whole fields have disappeared from view: the clergy, the military, electoral politics, even academia itself, for the most part, including basic science
  • It’s considered glamorous to drop out of a selective college if you want to become the next Mark Zuckerberg, but ludicrous to stay in to become a social worker. “What Wall Street figured out,” as Ezra Klein has put it, “is that colleges are producing a large number of very smart, completely confused graduates. Kids who have ample mental horsepower, an incredible work ethic and no idea what to do next.”
  • It almost feels ridiculous to have to insist that colleges like Harvard are bastions of privilege, where the rich send their children to learn to walk, talk, and think like the rich. Don’t we already know this? They aren’t called elite colleges for nothing. But apparently we like pretending otherwise. We live in a meritocracy, after all.
  • Visit any elite campus across our great nation, and you can thrill to the heart-warming spectacle of the children of white businesspeople and professionals studying and playing alongside the children of black, Asian, and Latino businesspeople and professionals
  • That doesn’t mean there aren’t a few exceptions, but that is all they are. In fact, the group that is most disadvantaged by our current admissions policies are working-class and rural whites, who are hardly present
  • The college admissions game is not primarily about the lower and middle classes seeking to rise, or even about the upper-middle class attempting to maintain its position. It is about determining the exact hierarchy of status within the upper-middle class itself.
  • This system is exacerbating inequality, retarding social mobility, perpetuating privilege, and creating an elite that is isolated from the society that it’s supposed to lead. The numbers are undeniable. In 1985, 46 percent of incoming freshmen at the 250 most selective colleges came from the top quarter of the income distribution. By 2000, it was 55 percent
  • The major reason for the trend is clear. Not increasing tuition, though that is a factor, but the ever-growing cost of manufacturing children who are fit to compete in the college admissions game
  • Wealthy families start buying their children’s way into elite colleges almost from the moment they are born: music lessons, sports equipment, foreign travel (“enrichment” programs, to use the all-too-perfect term)—most important, of course, private-school tuition or the costs of living in a place with top-tier public schools.
  • Is there anything that I can do, a lot of young people have written to ask me, to avoid becoming an out-of-touch, entitled little shit? I don’t have a satisfying answer, short of telling them to transfer to a public university. You cannot cogitate your way to sympathy with people of different backgrounds, still less to knowledge of them. You need to interact with them directly, and it has to be on an equal footing
  • Elite private colleges will never allow their students’ economic profile to mirror that of society as a whole. They can’t afford to—they need a critical mass of full payers and they need to tend to their donor base—and it’s not even clear that they’d want to.
  • Elite colleges are not just powerless to reverse the movement toward a more unequal society; their policies actively promote it.
  • The SAT is supposed to measure aptitude, but what it actually measures is parental income, which it tracks quite closely
  • U.S. News and World Report supplies the percentage of freshmen at each college who finished in the highest 10 percent of their high school class. Among the top 20 universities, the number is usually above 90 percent. I’d be wary of attending schools like that. Students determine the level of classroom discussion; they shape your values and expectations, for good and ill. It’s partly because of the students that I’d warn kids away from the Ivies and their ilk. Kids at less prestigious schools are apt to be more interesting, more curious, more open, and far less entitled and competitive.
  • The best option of all may be the second-tier—not second-rate—colleges, like Reed, Kenyon, Wesleyan, Sewanee, Mount Holyoke, and others. Instead of trying to compete with Harvard and Yale, these schools have retained their allegiance to real educational values.
  • Not being an entitled little shit is an admirable goal. But in the end, the deeper issue is the situation that makes it so hard to be anything else. The time has come, not simply to reform that system top to bottom, but to plot our exit to another kind of society altogether.
  • The education system has to act to mitigate the class system, not reproduce it. Affirmative action should be based on class instead of race, a change that many have been advocating for years. Preferences for legacies and athletes ought to be discarded. SAT scores should be weighted to account for socioeconomic factors. Colleges should put an end to résumé-stuffing by imposing a limit on the number of extracurriculars that kids can list on their applications. They ought to place more value on the kind of service jobs that lower-income students often take in high school and that high achievers almost never do. They should refuse to be impressed by any opportunity that was enabled by parental wealth
  • More broadly, they need to rethink their conception of merit. If schools are going to train a better class of leaders than the ones we have today, they’re going to have to ask themselves what kinds of qualities they need to promote. Selecting students by GPA or the number of extracurriculars more often benefits the faithful drudge than the original mind.
  • reforming the admissions process. That might address the problem of mediocrity, but it won’t address the greater one of inequality
  • The problem is the Ivy League itself. We have contracted the training of our leadership class to a set of private institutions. However much they claim to act for the common good, they will always place their interests first.
  • I’ve come to see that what we really need is to create one where you don’t have to go to the Ivy League, or any private college, to get a first-rate education.
  • High-quality public education, financed with public money, for the benefit of all
  • Everybody gets an equal chance to go as far as their hard work and talent will take them—you know, the American dream. Everyone who wants it gets to have the kind of mind-expanding, soul-enriching experience that a liberal arts education provides.
  • We recognize that free, quality K–12 education is a right of citizenship. We also need to recognize—as we once did and as many countries still do—that the same is true of higher education. We have tried aristocracy. We have tried meritocracy. Now it’s time to try democracy.
Javier E

At the Existentialist Café: Freedom, Being, and Apricot Cocktails with Jean-P... - 0 views

  • The phenomenologists’ leading thinker, Edmund Husserl, provided a rallying cry, ‘To the things themselves!’ It meant: don’t waste time on the interpretations that accrue upon things, and especially don’t waste time wondering whether the things are real. Just look at this that’s presenting itself to you, whatever this may be, and describe it as precisely as possible.
  • You might think you have defined me by some label, but you are wrong, for I am always a work in progress. I create myself constantly through action, and this is so fundamental to my human condition that, for Sartre, it is the human condition, from the moment of first consciousness to the moment when death wipes it out. I am my own freedom: no more, no less.
  • Sartre wrote like a novelist — not surprisingly, since he was one. In his novels, short stories and plays as well as in his philosophical treatises, he wrote about the physical sensations of the world and the structures and moods of human life. Above all, he wrote about one big subject: what it meant to be free. Freedom, for him, lay at the heart of all human experience, and this set humans apart from all other kinds of object.
  • ...97 more annotations...
  • Sartre listened to his problem and said simply, ‘You are free, therefore choose — that is to say, invent.’ No signs are vouchsafed in this world, he said. None of the old authorities can relieve you of the burden of freedom. You can weigh up moral or practical considerations as carefully as you like, but ultimately you must take the plunge and do something, and it’s up to you what that something is.
  • Even if the situation is unbearable — perhaps you are facing execution, or sitting in a Gestapo prison, or about to fall off a cliff — you are still free to decide what to make of it in mind and deed. Starting from where you are now, you choose. And in choosing, you also choose who you will be.
  • The war had made people realise that they and their fellow humans were capable of departing entirely from civilised norms; no wonder the idea of a fixed human nature seemed questionable.
  • If this sounds difficult and unnerving, it’s because it is. Sartre does not deny that the need to keep making decisions brings constant anxiety. He heightens this anxiety by pointing out that what you do really matters. You should make your choices as though you were choosing on behalf of the whole of humanity, taking the entire burden of responsibility for how the human race behaves. If you avoid this responsibility by fooling yourself that you are the victim of circumstance or of someone else’s bad advice, you are failing to meet the demands of human life and choosing a fake existence, cut off from your own ‘authenticity’.
  • Along with the terrifying side of this comes a great promise: Sartre’s existentialism implies that it is possible to be authentic and free, as long as you keep up the effort.
  • almost all agreed that it was, as an article in Les nouvelles littéraires phrased it, a ‘sickening mixture of philosophic pretentiousness, equivocal dreams, physiological technicalities, morbid tastes and hesitant eroticism … an introspective embryo that one would take distinct pleasure in crushing’.
  • he offered a philosophy designed for a species that had just scared the hell out of itself, but that finally felt ready to grow up and take responsibility.
  • In this rebellious world, just as with the Parisian bohemians and Dadaists in earlier generations, everything that was dangerous and provocative was good, and everything that was nice or bourgeois was bad.
  • Such interweaving of ideas and life had a long pedigree, although the existentialists gave it a new twist. Stoic and Epicurean thinkers in the classical world had practised philosophy as a means of living well, rather than of seeking knowledge or wisdom for their own sake. By reflecting on life’s vagaries in philosophical ways, they believed they could become more resilient, more able to rise above circumstances, and better equipped to manage grief, fear, anger, disappointment or anxiety.
  • In the tradition they passed on, philosophy is neither a pure intellectual pursuit nor a collection of cheap self-help tricks, but a discipline for flourishing and living a fully human, responsible life.
  • For Kierkegaard, Descartes had things back to front. In his own view, human existence comes first: it is the starting point for everything we do, not the result of a logical deduction. My existence is active: I live it and choose it, and this precedes any statement I can make about myself.
  • Studying our own moral genealogy cannot help us to escape or transcend ourselves. But it can enable us to see our illusions more clearly and lead a more vital, assertive existence.
  • What was needed, he felt, was not high moral or theological ideals, but a deeply critical form of cultural history or ‘genealogy’ that would uncover the reasons why we humans are as we are, and how we came to be that way. For him, all philosophy could even be redefined as a form of psychology, or history.
  • For those oppressed on grounds of race or class, or for those fighting against colonialism, existentialism offered a change of perspective — literally, as Sartre proposed that all situations be judged according to how they appeared in the eyes of those most oppressed, or those whose suffering was greatest.
  • She observed that we need not expect moral philosophers to ‘live by’ their ideas in a simplistic way, as if they were following a set of rules. But we can expect them to show how their ideas are lived in. We should be able to look in through the windows of a philosophy, as it were, and see how people occupy it, how they move about and how they conduct themselves.
  • the existentialists inhabited their historical and personal world, as they inhabited their ideas. This notion of ‘inhabited philosophy’ is one I’ve borrowed from the English philosopher and novelist Iris Murdoch, who wrote the first full-length book on Sartre and was an early adopter of existentialism
  • What is existentialism anyway?
  • An existentialist who is also phenomenological provides no easy rules for dealing with this condition, but instead concentrates on describing lived experience as it presents itself. By describing experience well, he or she hopes to understand this existence and awaken us to ways of living more authentic lives.
  • Existentialists concern themselves with individual, concrete human existence. They consider human existence different from the kind of being other things have. Other entities are what they are, but as a human I am whatever I choose to make of myself at every moment. I am free, and therefore I'm responsible for everything I do, a dizzying fact which causes an anxiety inseparable from human existence itself.
  • On the other hand, I am only free within situations, which can include factors in my own biology and psychology as well as physical, historical and social variables of the world into which I have been thrown. Despite the limitations, I always want more: I am passionately involved in personal projects of all kinds. Human existence is thus ambiguous: at once boxed in by borders and yet transcendent and exhilarating.
  • The first part of this is straightforward: a phenomenologist’s job is to describe. This is the activity that Husserl kept reminding his students to do. It meant stripping away distractions, habits, clichés of thought, presumptions and received ideas, in order to return our attention to what he called the ‘things themselves’. We must fix our beady gaze on them and capture them exactly as they appear, rather than as we think they are supposed to be.
  • Husserl therefore says that, to phenomenologically describe a cup of coffee, I should set aside both the abstract suppositions and any intrusive emotional associations. Then I can concentrate on the dark, fragrant, rich phenomenon in front of me now. This ‘setting aside’ or ‘bracketing out’ of speculative add-ons Husserl called epoché — a term borrowed from the ancient Sceptics,
  • The point about rigour is crucial; it brings us back to the first half of the command to describe phenomena. A phenomenologist cannot get away with listening to a piece of music and saying, ‘How lovely!’ He or she must ask: is it plaintive? is it dignified? is it colossal and sublime? The point is to keep coming back to the ‘things themselves’ — phenomena stripped of their conceptual baggage — so as to bail out weak or extraneous material and get to the heart of the experience.
  • Husserlian ‘bracketing out’ or epoché allows the phenomenologist to temporarily ignore the question ‘But is it real?’, in order to ask how a person experiences his or her world. Phenomenology gives a formal mode of access to human experience. It lets philosophers talk about life more or less as non-philosophers do, while still being able to tell themselves they are being methodical and rigorous.
  • Besides claiming to transform the way we think about reality, phenomenologists promised to change how we think about ourselves. They believed that we should not try to find out what the human mind is, as if it were some kind of substance. Instead, we should consider what it does, and how it grasps its experiences.
  • For Brentano, this reaching towards objects is what our minds do all the time. Our thoughts are invariably of or about something, he wrote: in love, something is loved, in hatred, something is hated, in judgement, something is affirmed or denied. Even when I imagine an object that isn’t there, my mental structure is still one of ‘about-ness’ or ‘of-ness’.
  • Except in deepest sleep, my mind is always engaged in this aboutness: it has ‘intentionality’. Having taken the germ of this from Brentano, Husserl made it central to his whole philosophy.
  • Husserl saw in the idea of intentionality a way to sidestep two great unsolved puzzles of philosophical history: the question of what objects ‘really’ are, and the question of what the mind ‘really’ is. By doing the epoché and bracketing out all consideration of reality from both topics, one is freed to concentrate on the relationship in the middle. One can apply one’s descriptive energies to the endless dance of intentionality that takes place in our lives: the whirl of our minds as they seize their intended phenomena one after the other and whisk them around the floor,
  • Understood in this way, the mind hardly is anything at all: it is its aboutness. This makes the human mind (and possibly some animal minds) different from any other naturally occurring entity. Nothing else can be as thoroughly about or of things as the mind is:
  • Some Eastern meditation techniques aim to still this scurrying creature, but the extreme difficulty of this shows how unnatural it is to be mentally inert. Left to itself, the mind reaches out in all directions as long as it is awake — and even carries on doing it in the dreaming phase of its sleep.
  • a mind that is experiencing nothing, imagining nothing, or speculating about nothing can hardly be said to be a mind at all.
  • Three simple ideas — description, phenomenon, intentionality — provided enough inspiration to keep roomfuls of Husserlian assistants busy in Freiburg for decades. With all of human existence awaiting their attention, how could they ever run out of things to do?
  • For Sartre, this gives the mind an immense freedom. If we are nothing but what we think about, then no predefined ‘inner nature’ can hold us back. We are protean.
  • way of this interpretation. Real, not real; inside, outside; what difference did it make? Reflecting on this, Husserl began turning his phenomenology into a branch of ‘idealism’ — the philosophical tradition which denied external reality and defined everything as a kind of private hallucination.
  • For Sartre, if we try to shut ourselves up inside our own minds, ‘in a nice warm room with the shutters closed’, we cease to exist. We have no cosy home: being out on the dusty road is the very definition of what we are.
  • One might think that, if Heidegger had anything worth saying, he could have communicated it in ordinary language. The fact is that he does not want to be ordinary, and he may not even want to communicate in the usual sense. He wants to make the familiar obscure, and to vex us. George Steiner thought that Heidegger’s purpose was less to be understood than to be experienced through a ‘felt strangeness’.
  • He takes Dasein in its most ordinary moments, then talks about it in the most innovative way he can. For Heidegger, Dasein’s everyday Being is right here: it is Being-in-the-world, or In-der-Welt-sein. The main feature of Dasein’s everyday Being-in-the-world right here is that it is usually busy doing something.
  • Thus, for Heidegger, all Being-in-the-world is also a ‘Being-with’ or Mitsein. We cohabit with others in a ‘with-world’, or Mitwelt. The old philosophical problem of how we prove the existence of other minds has now vanished. Dasein swims in the with-world long before it wonders about other minds.
  • Sometimes the best-educated people were those least inclined to take the Nazis seriously, dismissing them as too absurd to last. Karl Jaspers was one of those who made this mistake, as he later recalled, and Beauvoir observed similar dismissive attitudes among the French students in Berlin.
  • In any case, most of those who disagreed with Hitler’s ideology soon learned to keep their views to themselves. If a Nazi parade passed on the street, they would either slip out of view or give the obligatory salute like everyone else, telling themselves that the gesture meant nothing if they did not believe in it. As the psychologist Bruno Bettelheim later wrote of this period, few people will risk their life for such a small thing as raising an arm — yet that is how one’s powers of resistance are eroded away, and eventually one’s responsibility and integrity go with them.
  • for Arendt, if you do not respond adequately when the times demand it, you show a lack of imagination and attention that is as dangerous as deliberately committing an abuse. It amounts to disobeying the one command she had absorbed from Heidegger in those Marburg days: Think!
  • ‘Everything takes place under a kind of anaesthesia. Objectively dreadful events produce a thin, puny emotional response. Murders are committed like schoolboy pranks. Humiliation and moral decay are accepted like minor incidents.’ Haffner thought modernity itself was partly to blame: people had become yoked to their habits and to mass media, forgetting to stop and think, or to disrupt their routines long enough to question what was going on.
  • Heidegger’s former lover and student Hannah Arendt would argue, in her 1951 study The Origins of Totalitarianism, that totalitarian movements thrived at least partly because of this fragmentation in modern lives, which made people more vulnerable to being swept away by demagogues. Elsewhere, she coined the phrase ‘the banality of evil’ to describe the most extreme failures of personal moral awareness.
  • His communicative ideal fed into a whole theory of history: he traced all civilisation to an ‘Axial Period’ in the fifth century BC, during which philosophy and culture exploded simultaneously in Europe, the Middle East and Asia, as though a great bubble of minds had erupted from the earth’s surface. ‘True philosophy needs communion to come into existence,’ he wrote, and added, ‘Uncommunicativeness in a philosopher is virtually a criterion of the untruth of his thinking.’
  • The idea of being called to authenticity became a major theme in later existentialism, the call being interpreted as saying something like ‘Be yourself!’, as opposed to being phony. For Heidegger, the call is more fundamental than that. It is a call to take up a self that you didn’t know you had: to wake up to your Being. Moreover, it is a call to action. It requires you to do something: to take a decision of some sort.
  • Being and Time contained at least one big idea that should have been of use in resisting totalitarianism. Dasein, Heidegger wrote there, tends to fall under the sway of something called das Man or ‘the they’ — an impersonal entity that robs us of the freedom to think for ourselves. To live authentically requires resisting or outwitting this influence, but this is not easy because das Man is so nebulous. Man in German does not mean ‘man’ as in English (that’s der Mann), but a neutral abstraction, something like ‘one’ in the English phrase ‘one doesn’t do that’,
  • for Heidegger, das Man is me. It is everywhere and nowhere; it is nothing definite, but each of us is it. As with Being, it is so ubiquitous that it is difficult to see. If I am not careful, however, das Man takes over the important decisions that should be my own. It drains away my responsibility or ‘answerability’. As Arendt might put it, we slip into banality, failing to think.
  • Jaspers focused on what he called Grenzsituationen — border situations, or limit situations. These are the moments when one finds oneself constrained or boxed in by what is happening, but at the same time pushed by these events towards the limits or outer edge of normal experience. For example, you might have to make a life-or-death choice, or something might remind you suddenly of your mortality,
  • Jaspers’ interest in border situations probably had much to do with his own early confrontation with mortality. From childhood, he had suffered from a heart condition so severe that he always expected to die at any moment. He also had emphysema, which forced him to speak slowly, taking long pauses to catch his breath. Both illnesses meant that he had to budget his energies with care in order to get his work done without endangering his life.
  • If I am to resist das Man, I must become answerable to the call of my ‘voice of conscience’. This call does not come from God, as a traditional Christian definition of the voice of conscience might suppose. It comes from a truly existentialist source: my own authentic self. Alas, this voice is one I do not recognise and may not hear, because it is not the voice of my habitual ‘they-self’. It is an alien or uncanny version of my usual voice. I am familiar with my they-self, but not with my unalienated voice — so, in a weird twist, my real voice is the one that sounds strangest to me.
  • Marcel developed a strongly theological branch of existentialism. His faith distanced him from both Sartre and Heidegger, but he shared a sense of how history makes demands on individuals. In his essay ‘On the Ontological Mystery’, written in 1932 and published in the fateful year of 1933, Marcel wrote of the human tendency to become stuck in habits, received ideas, and a narrow-minded attachment to possessions and familiar scenes. Instead, he urged his readers to develop a capacity for remaining ‘available’ to situations as they arise. Similar ideas of disponibilité or availability had been explored by other writers,
  • Marcel made it his central existential imperative. He was aware of how rare and difficult it was. Most people fall into what he calls ‘crispation’: a tensed, encrusted shape in life — ‘as though each one of us secreted a kind of shell which gradually hardened and imprisoned him’.
  • Bettelheim later observed that, under Nazism, only a few people realised at once that life could not continue unaltered: these were the ones who got away quickly. Bettelheim himself was not among them. Caught in Austria when Hitler annexed it, he was sent first to Dachau and then to Buchenwald, but was then released in a mass amnesty to celebrate Hitler’s birthday in 1939 — an extraordinary reprieve, after which he left at once for America.
  • we are used to reading philosophy as offering a universal message for all times and places — or at least as aiming to do so. But Heidegger disliked the notion of universal truths or universal humanity, which he considered a fantasy. For him, Dasein is not defined by shared faculties of reason and understanding, as the Enlightenment philosophers thought. Still less is it defined by any kind of transcendent eternal soul, as in religious tradition. We do not exist on a higher, eternal plane at all. Dasein’s Being is local: it has a historical situation, and is constituted in time and place.
  • For Marcel, learning to stay open to reality in this way is the philosopher’s prime job. Everyone can do it, but the philosopher is the one who is called on above all to stay awake, so as to be the first to sound the alarm if something seems wrong.
  • Second, it also means understanding that we are historical beings, and grasping the demands our particular historical situation is making on us. In what Heidegger calls ‘anticipatory resoluteness’, Dasein discovers ‘that its uttermost possibility lies in giving itself up’. At that moment, through Being-towards-death and resoluteness in facing up to one’s time, one is freed from the they-self and attains one’s true, authentic self.
  • If we are temporal beings by our very nature, then authentic existence means accepting, first, that we are finite and mortal. We will die: this all-important realisation is what Heidegger calls authentic ‘Being-towards-Death’, and it is fundamental to his philosophy.
  • Hannah Arendt, instead, left early on: she had the benefit of a powerful warning. Just after the Nazi takeover, in spring 1933, she had been arrested while researching materials on anti-Semitism for the German Zionist Organisation at Berlin’s Prussian State Library. Her apartment was searched; both she and her mother were locked up briefly, then released. They fled, without stopping to arrange travel documents. They crossed to Czechoslovakia (then still safe) by a method that sounds almost too fabulous to be true: a sympathetic German family on the border had a house with its front door in Germany and its back door in Czechoslovakia. The family would invite people for dinner, then let them leave through the back door at night.
  • As Sartre argued in his 1943 review of The Stranger, basic phenomenological principles show that experience comes to us already charged with significance. A piano sonata is a melancholy evocation of longing. If I watch a soccer match, I see it as a soccer match, not as a meaningless scene in which a number of people run around taking turns to apply their lower limbs to a spherical object. If the latter is what I’m seeing, then I am not watching some more essential, truer version of soccer; I am failing to watch it properly as soccer at all.
  • Much as they liked Camus personally, neither Sartre nor Beauvoir accepted his vision of absurdity. For them, life is not absurd, even when viewed on a cosmic scale, and nothing can be gained by saying it is. Life for them is full of real meaning, although that meaning emerges differently for each of us.
  • For Sartre, we show bad faith whenever we portray ourselves as passive creations of our race, class, job, history, nation, family, heredity, childhood influences, events, or even hidden drives in our subconscious which we claim are out of our control. It is not that such factors are unimportant: class and race, in particular, he acknowledged as powerful forces in people’s lives, and Simone de Beauvoir would soon add gender to that list.
  • Sartre takes his argument to an extreme point by asserting that even war, imprisonment or the prospect of imminent death cannot take away my existential freedom. They form part of my ‘situation’, and this may be an extreme and intolerable situation, but it still provides only a context for whatever I choose to do next. If I am about to die, I can decide how to face that death. Sartre here resurrects the ancient Stoic idea that I may not choose what happens to me, but I can choose what to make of it, spiritually speaking.
  • But the Stoics cultivated indifference in the face of terrible events, whereas Sartre thought we should remain passionately, even furiously engaged with what happens to us and with what we can achieve. We should not expect freedom to be anything less than fiendishly difficult.
  • Freedom does not mean entirely unconstrained movement, and it certainly does not mean acting randomly. We often mistake the very things that enable us to be free — context, meaning, facticity, situation, a general direction in our lives — for things that define us and take away our freedom. It is only with all of these that we can be free in a real sense.
  • Nor did he mean that privileged groups have the right to pontificate to the poor and downtrodden about the need to ‘take responsibility’ for themselves. That would be a grotesque misreading of Sartre’s point, since his sympathy in any encounter always lay with the more oppressed side. But for each of us — for me — to be in good faith means not making excuses for myself.
  • Camus’ novel gives us a deliberately understated vision of heroism and decisive action compared to those of Sartre and Beauvoir. One can only do so much. It can look like defeatism, but it shows a more realistic perception of what it takes to actually accomplish difficult tasks like liberating one’s country.
  • Camus just kept returning to his core principle: no torture, no killing — at least not with state approval. Beauvoir and Sartre believed they were taking a more subtle and more realistic view. If asked why a couple of innocuous philosophers had suddenly become so harsh, they would have said it was because the war had changed them in profound ways. It had shown them that one’s duties to humanity could be more complicated than they seemed. ‘The war really divided my life in two,’ Sartre said later.
  • Poets and artists ‘let things be’, but they also let things come out and show themselves. They help to ease things into ‘unconcealment’ (Unverborgenheit), which is Heidegger’s rendition of the Greek term alētheia, usually translated as ‘truth’. This is a deeper kind of truth than the mere correspondence of a statement to reality, as when we say ‘The cat is on the mat’ and point to a mat with a cat on it. Long before we can do this, both cat and mat must ‘stand forth out of concealedness’. They must un-hide themselves.
  • Heidegger does not use the word ‘consciousness’ here because — as with his earlier work — he is trying to make us think in a radically different way about ourselves. We are not to think of the mind as an empty cavern, or as a container filled with representations of things. We are not even supposed to think of it as firing off arrows of intentional ‘aboutness’, as in the earlier phenomenology of Brentano. Instead, Heidegger draws us into the depths of his Schwarzwald, and asks us to imagine a gap with sunlight filtering in. We remain in the forest, but we provide a relatively open spot where other beings can bask for a moment. If we did not do this, everything would remain in the thickets, hidden even to itself.
  • The astronomer Carl Sagan began his 1980 television series Cosmos by saying that human beings, though made of the same stuff as the stars, are conscious and are therefore ‘a way for the cosmos to know itself’. Merleau-Ponty similarly quoted his favourite painter Cézanne as saying, ‘The landscape thinks itself in me, and I am its consciousness.’ This is something like what Heidegger thinks humanity contributes to the earth. We are not made of spiritual nothingness; we are part of Being, but we also bring something unique with us. It is not much: a little open space, perhaps with a path and a bench like the one the young Heidegger used to sit on to do his homework. But through us, the miracle occurs.
  • Beauty aside, Heidegger’s late writing can also be troubling, with its increasingly mystical notion of what it is to be human. If one speaks of a human being mainly as an open space or a clearing, or a means of ‘letting beings be’ and dwelling poetically on the earth, then one doesn’t seem to be talking about any recognisable person. The old Dasein has become less human than ever. It is now a forestry feature.
  • Even today, Jaspers, the dedicated communicator, is far less widely read than Heidegger, who has influenced architects, social theorists, critics, psychologists, artists, film-makers, environmental activists, and innumerable students and enthusiasts — including the later deconstructionist and post-structuralist schools, which took their starting point from his late thinking. Having spent the late 1940s as an outsider and then been rehabilitated, Heidegger became the overwhelming presence in university philosophy all over the European continent from then on.
  • As Levinas reflected on this experience, it helped to lead him to a philosophy that was essentially ethical, rather than ontological like Heidegger’s. He developed his ideas from the work of Jewish theologian Martin Buber, whose I and Thou in 1923 had distinguished between my relationship with an impersonal ‘it’ or ‘them’, and the direct personal encounter I have with a ‘you’. Levinas took it further: when I encounter you, we normally meet face-to-face, and it is through your face that you, as another person, can make ethical demands on me. This is very different from Heidegger’s Mitsein or Being-with, which suggests a group of people standing alongside one another, shoulder to shoulder as if in solidarity — perhaps as a unified nation or Volk.
  • For Levinas, we literally face each other, one individual at a time, and that relationship becomes one of communication and moral expectation. We do not merge; we respond to one another. Instead of being co-opted into playing some role in my personal drama of authenticity, you look me in the eyes — and you remain Other. You remain you.
  • This relationship is more fundamental than the self, more fundamental than consciousness, more fundamental even than Being — and it brings an unavoidable ethical obligation. Ever since Husserl, phenomenologists and existentialists had been trying to stretch the definition of existence to incorporate our social lives and relationships. Levinas did more: he turned philosophy around entirely so that these relationships were the foundation of our existence, not an extension of it.
  • Her last work, The Need for Roots, argues, among other things, that none of us has rights, but each one of us has a near-infinite degree of duty and obligation to the other. Whatever the underlying cause of her death — and anorexia nervosa seems to have been involved — no one could deny that she lived out her philosophy with total commitment. Of all the lives touched on in this book, hers is surely the most profound and challenging application of Iris Murdoch’s notion that a philosophy can be ‘inhabited’.
  • Other thinkers took radical ethical turns during the war years. The most extreme was Simone Weil, who actually tried to live by the principle of putting other people’s ethical demands first. Having returned to France after her travels through Germany in 1932, she had worked in a factory so as to experience the degrading nature of such work for herself. When France fell in 1940, her family fled to Marseilles (against her protests), and later to the US and to Britain. Even in exile, Weil made extraordinary sacrifices. If there were people in the world who could not sleep in a bed, she would not do so either, so she slept on the floor.
  • The mystery tradition had roots in Kierkegaard’s ‘leap of faith’. It owed much to the other great nineteenth-century mystic of the impossible, Dostoevsky, and to older theological notions. But it also grew from the protracted trauma that was the first half of the twentieth century. Since 1914, and especially since 1939, people in Europe and elsewhere had come to the realisation that we cannot fully know or trust ourselves; that we have no excuses or explanations for what we do — and yet that we must ground our existence and relationships on something firm, because otherwise we cannot survive.
  • One striking link between these radical ethical thinkers, all on the fringes of our main story, is that they had religious faith. They also granted a special role to the notion of ‘mystery’ — that which cannot be known, calculated or understood, especially when it concerns our relationships with each other. Heidegger was different from them, since he rejected the religion he grew up with and had no real interest in ethics — probably as a consequence of his having no real interest in the human.
  • Meanwhile, the Christian existentialist Gabriel Marcel was also still arguing, as he had since the 1930s, that ethics trumps everything else in philosophy and that our duty to each other is so great as to play the role of a transcendent ‘mystery’. He too had been led to this position partly by a wartime experience: during the First World War he had worked for the Red Cross’ Information Service, with the unenviable job of answering relatives’ inquiries about missing soldiers. Whenever news came, he passed it on, and usually it was not good. As Marcel later said, this task permanently inoculated him against warmongering rhetoric of any kind, and it made him aware of the power of what is unknown in our lives.
  • As the play’s much-quoted and frequently misunderstood final line has it: ‘Hell is other people.’ Sartre later explained that he did not mean to say that other people were hellish in general. He meant that after death we become frozen in their view, unable any longer to fend off their interpretation. In life, we can still do something to manage the impression we make; in death, this freedom goes and we are left entombed in other people’s memories and perceptions.
  • We have to do two near-impossible things at once: understand ourselves as limited by circumstances, and yet continue to pursue our projects as though we are truly in control. In Beauvoir’s view, existentialism is the philosophy that best enables us to do this, because it concerns itself so deeply with both freedom and contingency. It acknowledges the radical and terrifying scope of our freedom in life, but also the concrete influences that other philosophies tend to ignore: history, the body, social relationships and the environment.
  • The aspects of our existence that limit us, Merleau-Ponty says, are the very same ones that bind us to the world and give us scope for action and perception. They make us what we are. Sartre acknowledged the need for this trade-off, but he found it more painful to accept. Everything in him longed to be free of bonds, of impediments and limitations
  • Of course we have to learn this skill of interpreting and anticipating the world, and this happens in early childhood, which is why Merleau-Ponty thought child psychology was essential to philosophy. This is an extraordinary insight. Apart from Rousseau, very few philosophers before him had taken childhood seriously; most wrote as though all human experience were that of a fully conscious, rational, verbal adult who has been dropped into this world from the sky — perhaps by a stork.
  • For Merleau-Ponty, we cannot understand our experience if we don’t think of ourselves in part as overgrown babies. We fall for optical illusions because we once learned to see the world in terms of shapes, objects and things relevant to our own interests. Our first perceptions came to us in tandem with our first active experiments in observing the world and reaching out to explore it, and are still linked with those experiences.
  • Another factor in all of this, for Merleau-Ponty, is our social existence: we cannot thrive without others, or not for long, and we need this especially in early life. This makes solipsistic speculation about the reality of others ridiculous; we could never engage in such speculation if we hadn’t already been formed by them.
  • As Descartes could have said (but didn’t), ‘I think, therefore other people exist.’ We grow up with people playing with us, pointing things out, talking, listening, and getting us used to reading emotions and movements; this is how we become capable, reflective, smoothly integrated beings.
  • In general, Merleau-Ponty thinks human experience only makes sense if we abandon philosophy’s time-honoured habit of starting with a solitary, capsule-like, immobile adult self, isolated from its body and world, which must then be connected up again — adding each element around it as though adding clothing to a doll. Instead, for him, we slide from the womb to the birth canal to an equally close and total immersion in the world. That immersion continues as long as we live, although we may also cultivate the art of partially withdrawing from time to time when we want to think or daydream.
  • When he looks for his own metaphor to describe how he sees consciousness, he comes up with a beautiful one: consciousness, he suggests, is like a ‘fold’ in the world, as though someone had crumpled a piece of cloth to make a little nest or hollow. It stays for a while, before eventually being unfolded and smoothed away. There is something seductive, even erotic, in this idea of my conscious self as an improvised pouch in the cloth of the world. I still have my privacy — my withdrawing room. But I am part of the world’s fabric, and I remain formed out of it for as long as I am here.
  • By the time of these works, Merleau-Ponty is taking his desire to describe experience to the outer limits of what language can convey. Just as with the late Husserl or Heidegger, or Sartre in his Flaubert book, we see a philosopher venturing so far from shore that we can barely follow. Emmanuel Levinas would head out to the fringes too, eventually becoming incomprehensible to all but his most patient initiates.
  • Sartre once remarked — speaking of a disagreement they had about Husserl in 1941 — that ‘we discovered, astounded, that our conflicts had, at times, stemmed from our childhood, or went back to the elementary differences of our two organisms’. Merleau-Ponty also said in an interview that Sartre’s work seemed strange to him, not because of philosophical differences, but because of a certain ‘register of feeling’, especially in Nausea, that he could not share. Their difference was one of temperament and of the whole way the world presented itself to them.
  • The two also differed in their purpose. When Sartre writes about the body or other aspects of experience, he generally does it in order to make a different point. He expertly evokes the grace of his café waiter, gliding between the tables, bending at an angle just so, steering the drink-laden tray through the air on the tips of his fingers — but he does it all in order to illustrate his ideas about bad faith. When Merleau-Ponty writes about skilled and graceful movement, the movement itself is his point. This is the thing he wants to understand.
  • We can never move definitively from ignorance to certainty, for the thread of the inquiry will constantly lead us back to ignorance again. This is the most attractive description of philosophy I’ve ever read, and the best argument for why it is worth doing, even (or especially) when it takes us no distance at all from our starting point.
  • By prioritising perception, the body, social life and childhood development, Merleau-Ponty gathered up philosophy’s far-flung outsider subjects and brought them in to occupy the centre of his thought.
  • In his inaugural lecture at the Collège de France on 15 January 1953, published as In Praise of Philosophy, he said that philosophers should concern themselves above all with whatever is ambiguous in our experience. At the same time, they should think clearly about these ambiguities, using reason and science. Thus, he said, ‘The philosopher is marked by the distinguishing trait that he possesses inseparably the taste for evidence and the feeling for ambiguity.’ A constant movement is required between these two
  • As Sartre wrote in response to Hiroshima, humanity had now gained the power to wipe itself out, and must decide every single day that it wanted to live. Camus also wrote that humanity faced the task of choosing between collective suicide and a more intelligent use of its technology — ‘between hell and reason’. After 1945, there seemed little reason to trust in humanity’s ability to choose well.
  • Merleau-Ponty observed in a lecture of 1951 that, more than any previous century, the twentieth century had reminded people how ‘contingent’ their lives were — how at the mercy of historical events and other changes that they could not control. This feeling went on long after the war ended. After the A-bombs were dropped on Hiroshima and Nagasaki, many feared that a Third World War would not be long in coming, this time between the Soviet Union and the United States.
Javier E

This Is Not a Market | Dissent Magazine - 0 views

  • Given how ordinary people use the term, it’s not surprising that academic economists are a little vague about it—but you’ll be glad to hear that they know they’re being vague. A generation of economists have criticized their colleagues’ inability to specify what a “market” actually is. George Stigler, back in 1967, thought it “a source of embarrassment that so little attention has been paid to the theory of markets.” Sociologists agree: according to Harrison White, there is no “neoclassical theory of the market—[only] a pure theory of exchange.” And Wayne Baker found that the idea of the market is “typically assumed—not studied” by most economists, who “implicitly characterize ‘market’ as a ‘featureless plane.’”
  • When we say “market” now, we mean nothing particularly specific, and, at the same time, everything—the entire economy, of course, but also our lives in general. If you can name it, there’s a market in it: housing, education, the law, dating. Maybe even love is “just an economy based on resource scarcity.”
  • The use of markets to describe everything is odd, because talking about “markets” doesn’t even help us understand how the economy works—let alone the rest of our lives. Even though nobody seems to know what it means, we use the metaphor freely, even unthinkingly. Let the market decide. The markets are volatile. The markets responded poorly. Obvious facts—that the economy hasn’t rebounded after the recession—are hidden or ignored, because “the market” is booming, and what is the economy other than “the market”? Well, it’s lots of other things. We might see that if we talked about it a bit differently.
  • For instance, we might choose a different metaphor—like, say, the traffic system. Sounds ridiculous? No more so than the market metaphor. After all, we already talk about one important aspect of economic life in terms of traffic: online activity. We could describe it in market terms (the market demands Trump memes!), but we use a different metaphor, because it’s just intuitively more suitable. That last Trump meme is generating a lot of traffic. Redirect your attention as required.
  • We don’t know much about markets, because we don’t deal with them very often. But most of us know plenty about traffic systems: drivers will know the frustration of trying to turn left onto a major road, of ceaseless, pointless lane-switching on a stalled rush-hour freeway, but also the joys of clear highways.
  • We know the traffic system because, whether we like it or not, we are always involved in it, from birth
  • As of birth, Jean is in the economy—even if s/he rarely goes to a market. You can’t not be an economic actor; you can’t not be part of the transport system.
  • Consider also the composition of the traffic system and the economy. A market, whatever else it is, is always essentially the same thing: a place where people can come together to buy and sell things. We could set up a market right now, with a few fences and a sign announcing that people could buy and sell. We don’t even really need the fences. A traffic system, however, is far more complex. To begin with, the system includes publicly and privately run elements: most cars are privately owned, as are most airlines
  • If we don’t evaluate traffic systems based on their size, or their growth, how do we evaluate them? Mostly, by how well they help people get where they want to go. The market metaphor encourages us to think that all economic activity is motivated by the search for profit, and pursued in the same fashion everywhere. In a market, everyone’s desires are perfectly interchangeable. But, while everybody engages in the transport system, we have no difficulty remembering that we all want to go to different places, in different ways, at different times, at different speeds, for different reasons
  • Deciding how to improve the traffic system, how to expand people’s opportunities, is obviously a question of resource allocation and prioritization on a scale that private individuals—even traders—cannot influence on their own. That’s why governments have not historically trusted the “magic of the markets” to produce better transport opportunities. We intuitively understand that these decisions are made at the level of mass society and public policy. And, whether you like it or not, this is true for decisions about the economy as well.
  • Thinking of the economy in terms of the market—a featureless plane, with no entry or exit costs, little need for regulation, and equal opportunity for all—obscures this basic insight. And this underlying misconception creates a lot of problems: we’ve fetishized economic growth, we’ve come to distrust government regulation, and we imagine that the inequalities in our country, and our world, are natural or justified. If we imagine the economy otherwise—as a traffic system, for example—we see more clearly how the economy actually works.
  • We see that our economic life looks a lot less like going to “market” for fun and profit than it does sitting in traffic on our morning commute, hoping against hope that we’ll get where we want to go, and on time.
Emilio Ergueta

What is Art? and/or What is Beauty? | Issue 108 | Philosophy Now - 1 views

  • Art is something we do, a verb. Art is an expression of our thoughts, emotions, intuitions, and desires, but it is even more personal than that: it’s about sharing the way we experience the world, which for many is an extension of personality. It is the communication of intimate concepts that cannot be faithfully portrayed by words alone.
  • Beauty is much more than cosmetic: it is not about prettiness. There are plenty of pretty pictures available at the neighborhood home-furnishing store, but these we might not call beautiful; and it is not difficult to find works of artistic expression that we might agree are beautiful but are not necessarily pretty.
  • Works of art may elicit a sense of wonder or cynicism, hope or despair, adoration or spite; the work of art may be direct or complex, subtle or explicit, intelligible or obscure; and the subjects and approaches to the creation of art are bounded only by the imagination of the artist.
  • The game changers – the square pegs, so to speak – are those who saw traditional standards of beauty and decided specifically to go against them, perhaps just to prove a point. Take Picasso, Munch, Schoenberg, to name just three. They have made a stand against these norms in their art. Otherwise their art is like all other art: its only function is to be experienced, appraised, and understood (or not).
  • art is not necessarily positive: it can be deliberately hurtful or displeasing; it can make you think about or consider things that you would rather not. But if it evokes an emotion in you, then it is art.
  • art cannot be simply defined on the basis of concrete tests like ‘fidelity of representation’ or vague abstract concepts like ‘beauty’. So how can we define art in terms applying to both cave-dwellers and modern city sophisticates? To do this we need to ask: What does art do? And the answer is surely that it provokes an emotional, rather than a simply cognitive response. One way of approaching the problem of defining art, then, could be to say: Art consists of shareable ideas that have a shareable emotional impact
  • A work of art is that which asks a question which a non-art object such as a wall does not: What am I? What am I communicating? The responses, both of the creator artist and of the recipient audience, vary, but they invariably involve a judgement, a response to the invitation to answer. The answer, too, goes towards deciphering that deeper question – the ‘Who am I?’ that goes towards defining humanity.
sissij

Why Silicon Valley Titans Train Their Brains with Philosophy | Big Think - 0 views

  • To alleviate the stresses and open their minds, the execs have been known to experiment with microdosing on psychedelics, taking brain-stimulating nootropics, and sleeping in phases. What’s their latest greatest brain hack? Philosophy.
  • The guidance focuses on using reason and logic to unmask illusions about your life or work.
  • He thinks that approach can obscure the true understanding of human life. In an interview with Quartz, he says that rather than ask “How can I be more successful?” it’s actually more important to ask, “Why be successful?”
  • introduces thought and balance to people’s lives.
  • Thomas S. Kuhn
  •  
    I found it very interesting that philosophy can be linked to modern Silicon Valley. Silicon Valley always gives me a modern impression, as if it is the leader of human technology and a believer in science. I am surprised that many people in Silicon Valley are actually interested in philosophy, something I had considered not practical at all. I think this shows the importance of being cross-disciplinary. --Sissi (5/23/2017)
Javier E

The Coming Software Apocalypse - The Atlantic - 1 views

  • Our standard framework for thinking about engineering failures—reflected, for instance, in regulations for medical devices—was developed shortly after World War II, before the advent of software, for electromechanical systems. The idea was that you make something reliable by making its parts reliable (say, you build your engine to withstand 40,000 takeoff-and-landing cycles) and by planning for the breakdown of those parts (you have two engines). But software doesn’t break. Intrado’s faulty threshold is not like the faulty rivet that leads to the crash of an airliner. The software did exactly what it was told to do. In fact it did it perfectly. The reason it failed is that it was told to do the wrong thing.
  • Software failures are failures of understanding, and of imagination. Intrado actually had a backup router, which, had it been switched to automatically, would have restored 911 service almost immediately. But, as described in a report to the FCC, “the situation occurred at a point in the application logic that was not designed to perform any automated corrective actions.”
  • The introduction of programming languages like Fortran and C, which resemble English, and tools, known as “integrated development environments,” or IDEs, that help correct simple mistakes (like Microsoft Word’s grammar checker but for code), obscured, though did little to actually change, this basic alienation—the fact that the programmer didn’t work on a problem directly, but rather spent their days writing out instructions for a machine.
  • Code is too hard to think about. Before trying to understand the attempts themselves, then, it’s worth understanding why this might be: what it is about code that makes it so foreign to the mind, and so unlike anything that came before it.
  • Technological progress used to change the way the world looked—you could watch the roads getting paved; you could see the skylines rise. Today you can hardly tell when something is remade, because so often it is remade by code.
  • Software has enabled us to make the most intricate machines that have ever existed. And yet we have hardly noticed, because all of that complexity is packed into tiny silicon chips as millions and millions of lines of code.
  • The programmer, the renowned Dutch computer scientist Edsger Dijkstra wrote in 1988, “has to be able to think in terms of conceptual hierarchies that are much deeper than a single mind ever needed to face before.” Dijkstra meant this as a warning.
  • As programmers eagerly poured software into critical systems, they became, more and more, the linchpins of the built world—and Dijkstra thought they had perhaps overestimated themselves.
  • What made programming so difficult was that it required you to think like a computer.
  • “The problem is that software engineers don’t understand the problem they’re trying to solve, and don’t care to,” says Leveson, the MIT software-safety expert. The reason is that they’re too wrapped up in getting their code to work.
  • Though he runs a lab that studies the future of computing, he seems less interested in technology per se than in the minds of the people who use it. Like any good toolmaker, he has a way of looking at the world that is equal parts technical and humane. He graduated top of his class at the California Institute of Technology for electrical engineering,
  • “The serious problems that have happened with software have to do with requirements, not coding errors.” When you’re writing code that controls a car’s throttle, for instance, what’s important is the rules about when and how and by how much to open it. But these systems have become so complicated that hardly anyone can keep them straight in their head. “There’s 100 million lines of code in cars now,” Leveson says. “You just cannot anticipate all these things.”
  • a nearly decade-long investigation into claims of so-called unintended acceleration in Toyota cars. Toyota blamed the incidents on poorly designed floor mats, “sticky” pedals, and driver error, but outsiders suspected that faulty software might be responsible
  • software experts spent 18 months with the Toyota code, picking up where NASA left off. Barr described what they found as “spaghetti code,” programmer lingo for software that has become a tangled mess. Code turns to spaghetti when it accretes over many years, with feature after feature piling on top of, and being woven around, the code that came before
  • Using the same model as the Camry involved in the accident, Barr’s team demonstrated that there were actually more than 10 million ways for the onboard computer to cause unintended acceleration. They showed that as little as a single bit flip—a one in the computer’s memory becoming a zero or vice versa—could make a car run out of control. The fail-safe code that Toyota had put in place wasn’t enough to stop it
  • In all, Toyota recalled more than 9 million cars, and paid nearly $3 billion in settlements and fines related to unintended acceleration.
  • The problem is that programmers are having a hard time keeping up with their own creations. Since the 1980s, the way programmers work and the tools they use have changed remarkably little.
  • “Visual Studio is one of the single largest pieces of software in the world,” he said. “It’s over 55 million lines of code. And one of the things that I found out in this study is more than 98 percent of it is completely irrelevant. All this work had been put into this thing, but it missed the fundamental problems that people faced. And the biggest one that I took away from it was that basically people are playing computer inside their head.” Programmers were like chess players trying to play with a blindfold on—so much of their mental energy is spent just trying to picture where the pieces are that there’s hardly any left over to think about the game itself.
  • The fact that the two of them were thinking about the same problem in the same terms, at the same time, was not a coincidence. They had both just seen the same remarkable talk, given to a group of software-engineering students in a Montreal hotel by a computer researcher named Bret Victor. The talk, which went viral when it was posted online in February 2012, seemed to be making two bold claims. The first was that the way we make software is fundamentally broken. The second was that Victor knew how to fix it.
  • This is the trouble with making things out of code, as opposed to something physical. “The complexity,” as Leveson puts it, “is invisible to the eye.”
  • in early 2012, Victor had finally landed upon the principle that seemed to thread through all of his work. (He actually called the talk “Inventing on Principle.”) The principle was this: “Creators need an immediate connection to what they’re creating.” The problem with programming was that it violated the principle. That’s why software systems were so hard to think about, and so rife with bugs: The programmer, staring at a page of text, was abstracted from whatever it was they were actually making.
  • “Our current conception of what a computer program is,” he said, is “derived straight from Fortran and ALGOL in the late ’50s. Those languages were designed for punch cards.”
  • WYSIWYG (pronounced “wizzywig”) came along. It stood for “What You See Is What You Get.”
  • Victor’s point was that programming itself should be like that. For him, the idea that people were doing important work, like designing adaptive cruise-control systems or trying to understand cancer, by staring at a text editor, was appalling.
  • With the right interface, it was almost as if you weren’t working with code at all; you were manipulating the game’s behavior directly.
  • When the audience first saw this in action, they literally gasped. They knew they weren’t looking at a kid’s game, but rather the future of their industry. Most software involved behavior that unfolded, in complex ways, over time, and Victor had shown that if you were imaginative enough, you could develop ways to see that behavior and change it, as if playing with it in your hands. One programmer who saw the talk wrote later: “Suddenly all of my tools feel obsolete.”
  • When John Resig saw the “Inventing on Principle” talk, he scrapped his plans for the Khan Academy programming curriculum. He wanted the site’s programming exercises to work just like Victor’s demos. On the left-hand side you’d have the code, and on the right, the running program: a picture or game or simulation. If you changed the code, it’d instantly change the picture. “In an environment that is truly responsive,” Resig wrote about the approach, “you can completely change the model of how a student learns ... [They] can now immediately see the result and intuit how underlying systems inherently work without ever following an explicit explanation.” Khan Academy has become perhaps the largest computer-programming class in the world, with a million students, on average, actively using the program each month.
  • The ideas spread. The notion of liveness, of being able to see data flowing through your program instantly, made its way into flagship programming tools offered by Google and Apple. The default language for making new iPhone and Mac apps, called Swift, was developed by Apple from the ground up to support an environment, called Playgrounds, that was directly inspired by Light Table.
  • “Typically the main problem with software coding—and I’m a coder myself,” Bantegnie says, “is not the skills of the coders. The people know how to code. The problem is what to code. Because most of the requirements are kind of natural language, ambiguous, and a requirement is never extremely precise, it’s often understood differently by the guy who’s supposed to code.”
  • In a pair of later talks, “Stop Drawing Dead Fish” and “Drawing Dynamic Visualizations,” Victor went one further. He demoed two programs he’d built—the first for animators, the second for scientists trying to visualize their data—each of which took a process that used to involve writing lots of custom code and reduced it to playing around in a WYSIWYG interface.
  • Victor suggested that the same trick could be pulled for nearly every problem where code was being written today. “I’m not sure that programming has to exist at all,” he told me. “Or at least software developers.” In his mind, a software developer’s proper role was to create tools that removed the need for software developers. Only then would people with the most urgent computational problems be able to grasp those problems directly, without the intermediate muck of code.
  • Victor implored professional software developers to stop pouring their talent into tools for building apps like Snapchat and Uber. “The inconveniences of daily life are not the significant problems,” he wrote. Instead, they should focus on scientists and engineers—as he put it to me, “these people that are doing work that actually matters, and critically matters, and using really, really bad tools.”
  • Bantegnie’s company is one of the pioneers in the industrial use of model-based design, in which you no longer write code directly. Instead, you create a kind of flowchart that describes the rules your program should follow (the “model”), and the computer generates code for you based on those rules
  • In a model-based design tool, you’d represent this rule with a small diagram, as though drawing the logic out on a whiteboard, made of boxes that represent different states—like “door open,” “moving,” and “door closed”—and lines that define how you can get from one state to the other. The diagrams make the system’s rules obvious: Just by looking, you can see that the only way to get the elevator moving is to close the door, or that the only way to get the door open is to stop.
  • . In traditional programming, your task is to take complex rules and translate them into code; most of your energy is spent doing the translating, rather than thinking about the rules themselves. In the model-based approach, all you have is the rules. So that’s what you spend your time thinking about. It’s a way of focusing less on the machine and more on the problem you’re trying to get it to solve.
  • “Everyone thought I was interested in programming environments,” he said. Really he was interested in how people see and understand systems—as he puts it, in the “visual representation of dynamic behavior.” Although code had increasingly become the tool of choice for creating dynamic behavior, it remained one of the worst tools for understanding it. The point of “Inventing on Principle” was to show that you could mitigate that problem by making the connection between a system’s behavior and its code immediate.
  • On this view, software becomes unruly because the media for describing what software should do—conversations, prose descriptions, drawings on a sheet of paper—are too different from the media describing what software does do, namely, code itself.
  • for this approach to succeed, much of the work has to be done well before the project even begins. Someone first has to build a tool for developing models that are natural for people—that feel just like the notes and drawings they’d make on their own—while still being unambiguous enough for a computer to understand. They have to make a program that turns these models into real code. And finally they have to prove that the generated code will always do what it’s supposed to.
  • This practice brings order and accountability to large codebases. But, Shivappa says, “it’s a very labor-intensive process.” He estimates that before they used model-based design, on a two-year-long project only two to three months was spent writing code—the rest was spent working on the documentation.
  • Much of the benefit of the model-based approach comes from being able to add requirements on the fly while still ensuring that existing ones are met; with every change, the computer can verify that your program still works. You’re free to tweak your blueprint without fear of introducing new bugs. Your code is, in FAA parlance, “correct by construction.”
  • “people are not so easily transitioning to model-based software development: They perceive it as another opportunity to lose control, even more than they have already.”
  • The bias against model-based design, sometimes known as model-driven engineering, or MDE, is in fact so ingrained that according to a recent paper, “Some even argue that there is a stronger need to investigate people’s perception of MDE than to research new MDE technologies.”
  • “Human intuition is poor at estimating the true probability of supposedly ‘extremely rare’ combinations of events in systems operating at a scale of millions of requests per second,” he wrote in a paper. “That human fallibility means that some of the more subtle, dangerous bugs turn out to be errors in design; the code faithfully implements the intended design, but the design fails to correctly handle a particular ‘rare’ scenario.”
  • Newcombe was convinced that the algorithms behind truly critical systems—systems storing a significant portion of the web’s data, for instance—ought to be not just good, but perfect. A single subtle bug could be catastrophic. But he knew how hard bugs were to find, especially as an algorithm grew more complex. You could do all the testing you wanted and you’d never find them all.
  • An algorithm written in TLA+ could in principle be proven correct. In practice, it allowed you to create a realistic model of your problem and test it not just thoroughly, but exhaustively. This was exactly what he’d been looking for: a language for writing perfect algorithms.
  • TLA+, which stands for “Temporal Logic of Actions,” is similar in spirit to model-based design: It’s a language for writing down the requirements—TLA+ calls them “specifications”—of computer programs. These specifications can then be completely verified by a computer. That is, before you write any code, you write a concise outline of your program’s logic, along with the constraints you need it to satisfy
  • Programmers are drawn to the nitty-gritty of coding because code is what makes programs go; spending time on anything else can seem like a distraction. And there is a patient joy, a meditative kind of satisfaction, to be had from puzzling out the micro-mechanics of code. But code, Lamport argues, was never meant to be a medium for thought. “It really does constrain your ability to think when you’re thinking in terms of a programming language,”
  • Code makes you miss the forest for the trees: It draws your attention to the working of individual pieces, rather than to the bigger picture of how your program fits together, or what it’s supposed to do—and whether it actually does what you think. This is why Lamport created TLA+. As with model-based design, TLA+ draws your focus to the high-level structure of a system, its essential logic, rather than to the code that implements it.
  • But TLA+ occupies just a small, far corner of the mainstream, if it can be said to take up any space there at all. Even to a seasoned engineer like Newcombe, the language read at first as bizarre and esoteric—a zoo of symbols.
  • this is a failure of education. Though programming was born in mathematics, it has since largely been divorced from it. Most programmers aren’t very fluent in the kind of math—logic and set theory, mostly—that you need to work with TLA+. “Very few programmers—and including very few teachers of programming—understand the very basic concepts and how they’re applied in practice. And they seem to think that all they need is code,” Lamport says. “The idea that there’s some higher level than the code in which you need to be able to think precisely, and that mathematics actually allows you to think precisely about it, is just completely foreign. Because they never learned it.”
  • “In the 15th century,” he said, “people used to build cathedrals without knowing calculus, and nowadays I don’t think you’d allow anyone to build a cathedral without knowing calculus. And I would hope that after some suitably long period of time, people won’t be allowed to write programs if they don’t understand these simple things.”
  • Programmers, as a species, are relentlessly pragmatic. Tools like TLA+ reek of the ivory tower. When programmers encounter “formal methods” (so called because they involve mathematical, “formally” precise descriptions of programs), their deep-seated instinct is to recoil.
  • Formal methods had an image problem. And the way to fix it wasn’t to implore programmers to change—it was to change yourself. Newcombe realized that to bring tools like TLA+ to the programming mainstream, you had to start speaking their language.
  • he presented TLA+ as a new kind of “pseudocode,” a stepping-stone to real code that allowed you to exhaustively test your algorithms—and that got you thinking precisely early on in the design process. “Engineers think in terms of debugging rather than ‘verification,’” he wrote, so he titled his internal talk on the subject to fellow Amazon engineers “Debugging Designs.” Rather than bemoan the fact that programmers see the world in code, Newcombe embraced it. He knew he’d lose them otherwise. “I’ve had a bunch of people say, ‘Now I get it,’” Newcombe says.
  • In the world of the self-driving car, software can’t be an afterthought. It can’t be built like today’s airline-reservation systems or 911 systems or stock-trading systems. Code will be put in charge of hundreds of millions of lives on the road and it has to work. That is no small task.
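The annotations above describe model-based design with an elevator whose rules are a small state diagram (“door open,” “moving,” “door closed”), and TLA+-style exhaustive checking of a design. The core idea can be sketched in a few lines of Python; this is a hypothetical illustration (the state names, events, and `reachable_states` helper are invented for the sketch), not the notation of any actual tool such as SCADE or TLA+:

```python
# Sketch of the model-based-design idea: the elevator's rules are stated
# as data (a transition table), not as scattered imperative code, so
# properties of the design can be checked exhaustively before any
# production code is written.

TRANSITIONS = {
    ("door_open", "close_door"): "door_closed",
    ("door_closed", "open_door"): "door_open",
    ("door_closed", "start"): "moving",
    ("moving", "stop"): "door_closed",
}

def reachable_states(start="door_open"):
    """Exhaustively explore every state reachable from `start`."""
    seen, frontier = set(), [start]
    while frontier:
        state = frontier.pop()
        if state in seen:
            continue
        seen.add(state)
        for (src, _event), dst in TRANSITIONS.items():
            if src == state:
                frontier.append(dst)
    return seen

# Safety property from the annotations: the only way to get the elevator
# moving is to close the door first. No transition maps door_open to moving.
assert ("door_open", "start") not in TRANSITIONS
print(sorted(reachable_states()))  # ['door_closed', 'door_open', 'moving']
```

Because the whole design is a finite table, “testing” it means enumerating every state and transition, which is the spirit of the exhaustive verification TLA+ offers for far larger specifications.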
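The Toyota annotations also note that a single bit flip in memory could make a car run out of control. A minimal, hypothetical sketch of why one flipped bit matters, and of the redundant-copy cross-check that safety-critical code often uses to detect such corruption (`flip_bit` and the shadow-copy scheme here are invented for illustration, not Toyota’s actual fail-safe code):

```python
# One flipped bit can change a control value drastically.
def flip_bit(value: int, bit: int) -> int:
    """Return `value` with a single bit inverted, as a stray memory fault might."""
    return value ^ (1 << bit)

throttle = 5                        # commanded throttle, percent
corrupted = flip_bit(throttle, 6)   # bit 6 flips: 5 becomes 69
print(throttle, corrupted)          # 5 69

# A common mitigation: store an inverted shadow copy and cross-check it,
# so no single copy of the state is trusted on its own.
shadow = ~throttle & 0xFF
assert (throttle ^ 0xFF) == shadow   # intact value passes the check
print((corrupted ^ 0xFF) == shadow)  # False: corruption is detected
```

The point is not the particular scheme but the design stance: code that is “correct” line by line still fails if the design never asked what happens when memory itself lies.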
clairemann

Democrats Move To Scrap Trump-Era Rollback Of Methane Rules | HuffPost - 1 views

  • Democrats have turned to an obscure procedural tool in an attempt to undo the Trump administration’s rollback of an Obama-era regulation on methane, a potent greenhouse gas released by oil and gas operations.
  • “Methane standards are one of the most important ways to address an important source of greenhouse gas emissions that contribute significantly to climate change, and the Trump administration’s weakening of those standards was a dagger in the heart of efforts to address the climate crisis,”
  • The Congressional Review Act is a 1996 law that gives Congress the power to nullify any major regulation finalized within the final 60 legislative days of a president’s term.
  • The move is “critical to confronting the climate crisis and reducing harmful air pollution,” Heinrich tweeted.
  • The Obama administration’s methane rule sought to rein in methane pollution from fossil fuel operations by requiring operators to monitor and prevent leaks from wells, pipelines and other facilities.
  • “We have over 100,000 wellheads that are not capped, leaking methane,” Biden said. “What are we doing? And by the way, we can put as many pipe fitters and miners to work capping those wells at the same price that they would charge to dig those wells.”