TOK Friends: Group items matching "requirements" in title, tags, annotations, or URL

What Is College For? (Part 2) - NYTimes.com

  • How, exactly, does college prepare students for the workplace? For most jobs, it provides basic intellectual skills: the ability to understand relatively complex instructions, to write and speak clearly and cogently, to evaluate options critically. Beyond these intellectual skills, earning a college degree shows that you have the “moral qualities” needed for most jobs: you have (to put it a bit cynically), for a period of four years and with relatively little supervision, deferred to authority, met deadlines and carried out difficult tasks even when you found them pointless and boring.
  • This sort of intellectual and moral training, however, does not require studying with experts doing cutting-edge work on, say, Homeric poetry, elementary particle theory or the philosophy of Kant. It does not, that is, require the immersion in the world of intellectual culture that a college faculty is designed to provide. It is, rather, the sort of training that ought to result from good elementary and high school education.
  • students graduating from high school should, to cite one plausible model, be able to read with understanding classic literature (from, say, Austen and Browning to Whitman and Hemingway) and write well-organized and grammatically sound essays; they should know the basic outlines of American and European history, have a good beginner’s grasp of at least two natural sciences as well as pre-calculus mathematics, along with a grounding in a foreign language.
  • Is it really possible to improve grade school and high school teaching to the level I’m suggesting? Yes, provided we employ the same sort of selection criteria for pre-college teachers as we do for other professionals such as doctors, lawyers and college professors. In contrast to other professions, teaching is not now the domain of the most successful students — quite the contrary. I’ve known many very bright students who had an initial interest in such teaching but soon realized that there is no comparison in terms of salary, prestige and working conditions.
  • Given this transformation in pre-college education, we could expect it to provide basic job-training for most students. At that point, we would still face a fundamental choice regarding higher education. We could see it as a highly restricted enterprise, educating only professionals who require advanced specialized skills. Correspondingly, only such professionals would have access to higher education as a locus of intellectual culture.
  • On the other hand, we could — as I would urge — see college as the entrée to intellectual culture for everyone who is capable of and interested in working at that level of intellectual engagement
  • Raising high school to the level I am proposing and opening college to everyone who will profit from it would be an expensive enterprise. We would need significant government support to ensure that all students receive an education commensurate with their abilities and aspirations, regardless of family resources. But the intellectual culture of our citizens should be a primary part of our national well-being, not just the predilection of an eccentric elite. As such, it should be among our highest priorities.

Does Thinking Really Hard Burn More Calories?: Scientific American

  • Just as vigorous exercise tires our bodies, intellectual exertion should drain the brain. What the latest science reveals, however, is that the popular notion of mental exhaustion is too simplistic. The brain continuously slurps up huge amounts of energy for an organ of its size, regardless of whether we are tackling integral calculus or clicking through the week's top 10 LOLcats. Although firing neurons summon extra blood, oxygen and glucose, any local increases in energy consumption are tiny compared with the brain's gluttonous baseline intake. So, in most cases, short periods of additional mental effort require a little more brainpower than usual, but not much more.
  • something must explain the feeling of mental exhaustion, even if its physiology differs from physical fatigue. Simply believing that our brains have expended a lot of effort might be enough to make us lethargic.
  • a typical adult human brain runs on around 12 watts—a fifth of the power required by a standard 60 watt lightbulb. Compared with most other organs, the brain is greedy; pitted against man-made electronics, it is astoundingly efficient. IBM's Watson, the supercomputer that defeated Jeopardy! champions, depends on ninety IBM Power 750 servers, each of which requires around one thousand watts.
  • people routinely enjoy intellectually invigorating activities without suffering mental exhaustion.
  • Such fatigue seems much more likely to follow sustained mental effort that we do not seek for pleasure—such as the obligatory SAT—especially when we expect that the ordeal will drain our brains. If we think an exam or puzzle will be difficult, it often will be.
  • Studies have shown that something similar happens when people exercise and play sports: a large component of physical exhaustion is in our heads. In related research, volunteers that cycled on an exercise bike following a 90-minute computerized test of sustained attention quit pedaling from exhaustion sooner than participants that watched emotionally neutral documentaries before exercising
  • In the specific case of the SAT, something beyond pure mental effort likely contributes to post-exam stupor: stress. After all, the brain does not function in a vacuum. Other organs burn up energy, too. Taking an exam that partially determines where one will spend the next four years is nerve-racking enough to send stress hormones swimming through the blood stream, induce sweating, quicken heart rates and encourage fidgeting and contorted body postures. The SAT and similar trials are not just mentally taxing—they are physically exhausting, too.

When A MOOC Exploits Its Learners: A Coursera Case Study | NeoAcademic

  • To facilitate a 50,000:1 teacher-student ratio, they rely on an instructional model requiring minimal instructor involvement, potentially to the detriment of learners.
  • The only real change in the year following “the year of the MOOC” is that these companies have now begun to strike deals with private organizations to funnel in high-performing students. To me, this seems like a terrifically clever way to circumvent labor laws. Instead of paying new employees during an onboarding and training period, businesses can now require employees to take a “free course” before paying them a dime.
  • why not reach out to an audience ready and eager to learn just because they are intrinsically motivated to develop their skills? This is what has motivated me to look into producing an I/O Psychology MOOC
  • in Week 4, the assignment was to complete this research study, which was not linked with any learning objectives in that week (at least in any way indicated to students).  If you didn’t complete the research study, you earned a zero for the assignment.  There was no apparent way around it.
  • I can tell you emphatically that this would not be considered an ethical course design choice in a real college classroom. Research participation must be voluntary and non-mandatory. If an instructor does require research participation (common in Psychology to build a subject pool), there must always be an alternative non-data-collection-oriented assignment in order to obtain the same credit. Anyone that doesn’t want to be the subject of research must always have a way to do exactly that – skip research and still get course credit.
  • I will not be completing this MOOC, and I can only wonder how many others dropped because they, too, felt exploited by their instructors.

Love People, Not Pleasure - NYTimes.com

  • Fame, riches and pleasure beyond imagination. Sound great? He went on to write: “I have diligently numbered the days of pure and genuine happiness which have fallen to my lot: They amount to 14.” Abd al-Rahman’s problem wasn’t happiness, as he believed — it was unhappiness.
  • Happiness and unhappiness are certainly related, but they are not actually opposites.
  • Circumstances are certainly important. No doubt Abd al-Rahman could point to a few in his life. But paradoxically, a better explanation for his unhappiness may have been his own search for well-being. And the same might go for you.
  • As strange as it seems, being happier than average does not mean that one can’t also be unhappier than average.
  • In 2009, researchers from the University of Rochester conducted a study tracking the success of 147 recent graduates in reaching their stated goals after graduation. Some had “intrinsic” goals, such as deep, enduring relationships. Others had “extrinsic” goals, such as achieving reputation or fame. The scholars found that intrinsic goals were associated with happier lives. But the people who pursued extrinsic goals experienced more negative emotions, such as shame and fear. They even suffered more physical maladies.
  • the paradox of fame. Just like drugs and alcohol, once you become addicted, you can’t live without it. But you can’t live with it, either.
  • That impulse to fame by everyday people has generated some astonishing innovations.
  • Today, each of us can build a personal little fan base, thanks to Facebook, YouTube, Twitter and the like. We can broadcast the details of our lives to friends and strangers in an astonishingly efficient way. That’s good for staying in touch with friends, but it also puts a minor form of fame-seeking within each person’s reach. And several studies show that it can make us unhappy.
  • It makes sense. What do you post to Facebook? Pictures of yourself yelling at your kids, or having a hard time at work? No, you post smiling photos of a hiking trip with friends. You build a fake life — or at least an incomplete one — and share it. Furthermore, you consume almost exclusively the fake lives of your social media “friends.” Unless you are extraordinarily self-aware, how could it not make you feel worse to spend part of your time pretending to be happier than you are, and the other part of your time seeing how much happier others seem to be than you?
  • the bulk of the studies point toward the same important conclusion: People who rate materialistic goals like wealth as top personal priorities are significantly likelier to be more anxious, more depressed and more frequent drug users, and even to have more physical ailments than those who set their sights on more intrinsic values.
  • as the Dalai Lama pithily suggests, it is better to want what you have than to have what you want.
  • In 2004, two economists looked into whether more sexual variety led to greater well-being. They looked at data from about 16,000 adult Americans who were asked confidentially how many sex partners they had had in the preceding year, and about their happiness. Across men and women alike, the data show that the optimal number of partners is one.
  • This might seem totally counterintuitive. After all, we are unambiguously driven to accumulate material goods, to seek fame, to look for pleasure. How can it be that these very things can give us unhappiness instead of happiness? There are two explanations, one biological and the other philosophical.
  • From an evolutionary perspective, it makes sense that we are wired to seek fame, wealth and sexual variety. These things make us more likely to pass on our DNA.
  • here’s where the evolutionary cables have crossed: We assume that things we are attracted to will relieve our suffering and raise our happiness.
  • that is Mother Nature’s cruel hoax. She doesn’t really care either way whether you are unhappy — she just wants you to want to pass on your genetic material. If you conflate intergenerational survival with well-being, that’s your problem, not nature’s.
  • More philosophically, the problem stems from dissatisfaction — the sense that nothing has full flavor, and we want more. We can’t quite pin down what it is that we seek. Without a great deal of reflection and spiritual hard work, the likely candidates seem to be material things, physical pleasures or favor among friends and strangers.
  • We look for these things to fill an inner emptiness. They may bring a brief satisfaction, but it never lasts, and it is never enough. And so we crave more.
  • This search for fame, the lust for material things and the objectification of others — that is, the cycle of grasping and craving — follows a formula that is elegant, simple and deadly: Love things, use people.
  • This was Abd al-Rahman’s formula as he sleepwalked through life. It is the worldly snake oil peddled by the culture makers from Hollywood to Madison Avenue.
  • Simply invert the deadly formula and render it virtuous: Love people, use things.
  • It requires the courage to repudiate pride and the strength to love others — family, friends, colleagues, acquaintances, God and even strangers and enemies. Only deny love to things that actually are objects. The practice that achieves this is charity. Few things are as liberating as giving away to others that which we hold dear.
  • This also requires a condemnation of materialism.
  • Finally, it requires a deep skepticism of our own basic desires. Of course you are driven to seek admiration, splendor and physical license.
  • Declaring war on these destructive impulses is not about asceticism or Puritanism. It is about being a prudent person who seeks to avoid unnecessary suffering.

Read this if you want to be happy in 2014 - The Washington Post

  • people usually experience the sensation of happiness whenever they have both health and freedom. It’s a simple formula: Happiness = Health + Freedom
  • I’m talking about the everyday freedom of being able to do what you want when you want to do it, at work and elsewhere. For happiness, timing is as important as the thing you’re doing
  • Matching your mood to your activity is a baseline requirement for happiness
  • The good news is that timing is relatively controllable, especially in the long run.
  • If you’re just starting out in your career, it won’t be easy to find a job that gives you a flexible schedule. The best approach is a strategy of moving toward more flexibility over the course of your life.
  • There isn’t one formula for finding schedule flexibility. Just make sure all of your important decisions are consistent with an end game of a more flexible schedule. Otherwise you are shutting yourself off from the most accessible lever for happiness — timing.
  • if you knew that pasta is far lower on the glycemic index than a white potato, you would make a far healthier choice that requires no willpower at all. All it took was knowledge.
  • The most important thing to know about staying fit is this: If it takes willpower, you’re doing it wrong. Anything that requires willpower is unsustainable in the long run.
  • studies show that using willpower in one area diminishes how much willpower you have in reserve for other areas. You need to get willpower out of the system
  • My observation is that you can usually replace willpower with knowledge.
  • the trick for avoiding unhealthy foods is to make sure you always have access to healthy options that you enjoy eating. Your knowledge of this trick, assuming you use it, makes willpower far less necessary.
  • don’t give up too much income potential just to get a flexible schedule. There’s no point in having a flexible schedule if you can’t afford to do anything.
  • the fittest people have systems, not goals, unless they are training for something specific. A sensible system is to continuously learn more about the science of diet and the methods for making healthy food taste great. With that system, weight management will feel automatic. Goals aren’t needed.
  • Did you know that sleepiness causes you to feel hungry?
  • Did you know that eating peanuts is a great way to suppress appetite?
  • Did you know that eating mostly protein instead of simple carbs for lunch will help you avoid the afternoon energy slump?
  • Did you know that eating simple carbs can make you hungrier?
  • Did you know that exercise has only a small impact on your weight?
  • after I started noticing how drained and useless I felt after eating simple carbs, french fries became easy to resist.
  • I also learned that I can remove problem foods from my diet if I target them for extinction one at a time. It was easy to stop eating three large Snickers every day (which I was doing) when I realized I could eat anything else I wanted whenever I wanted
  • If you’re on a diet, you’re probably trying to avoid certain types of food, but you’re also trying to limit your portions. Instead of waging war on two fronts, try allowing yourself to eat as much as you want of anything that is healthy.
  • healthier food is almost self-regulating in the sense that you don’t have an insatiable desire to keep eating it the way you might with junk food. With healthy food, you tend to stop when you feel full
  • One of the biggest obstacles to healthy eating is the impression that healthy food generally tastes like cardboard. So consider making it a lifelong system to learn how to season and prepare healthy foods
  • Cheese adds calories, but the fat content will help suppress your appetite, so you probably come out ahead. If you didn’t already know that, you might end up using willpower to avoid cheese at dinner and willpower again later that night to resist snacking. A little knowledge replaces a lot of willpower.
  • I’m limiting my portion size. You only need to do that if you are eating the wrong foods. Eating half of your cake still keeps you addicted to cake. And portion control takes a lot of willpower. You’ll find that healthy food satisfies you sooner, so you don’t crave large portions.
  • No one can exercise enough to overcome a bad diet. Diet is the right button to push for losing weight, so long as you are active. People who eat right and stay active usually have no problems with weight.
  • I’m about to share with you the simplest and potentially most effective exercise plan in the world. Here it is: Be active every day.
  • When you’re active, and you don’t overdo it, you’ll find yourself in a good mood afterward. That reward becomes addictive over time.
  • After a few months of being moderately active every day, you’ll discover that it is harder to sit and do nothing than it is to get up and do something. That’s the frame of mind you want. You want exercise to become a habit with a reward so it evolves into a useful addiction
  • the intensity of your workout has a surprisingly small impact on your weight unless you’re running half-marathons every week. If your diet is right, moderate exercise is all you need.
  • When your body is feeling good, and you have some flexibility in your schedule, you’ll find that the petty annoyances that plague your life become nothing but background noise. And that’s a great launch pad for happiness.
  • As you find yourself getting healthier and happier, the people in your life will view you differently too. Healthy-looking people generally earn more money, get more offers and enjoy a better social life. All of that will help your happiness.
  • Keep in mind that happiness is a directional phenomenon. We feel happy when things are moving in the right direction no matter where we are at the moment.

E.D. Hirsch Jr.'s 'Cultural Literacy' in the 21st Century - The Atlantic

  • much of this angst can be interpreted as part of a noisy but inexorable endgame: the end of white supremacy. From this vantage point, Americanness and whiteness are fitfully, achingly, but finally becoming delinked—and like it or not, over the course of this generation, Americans are all going to have to learn a new way to be American.
  • What is the story of “us” when “us” is no longer by default “white”? The answer, of course, will depend on how aware Americans are of what they are, of what their culture already (and always) has been.
  • The thing about the list, though, was that it was—by design—heavy on the deeds and words of the “dead white males” who had formed the foundations of American culture but who had by then begun to fall out of academic fashion.
  • Conservatives thus embraced Hirsch eagerly and breathlessly. He was a stout defender of the patrimony. Liberals eagerly and breathlessly attacked him with equal vigor. He was retrograde, Eurocentric, racist, sexist.
  • Lost in all the crossfire, however, were two facts: First, Hirsch, a lifelong Democrat who considered himself progressive, believed his enterprise to be in service of social justice and equality. Cultural illiteracy, he argued, is most common among the poor and power-illiterate, and compounds both their poverty and powerlessness. Second: He was right.
  • A generation of hindsight now enables Americans to see that it is indeed necessary for a nation as far-flung and entropic as the United States, one where rising economic inequality begets worsening civic inequality, to cultivate continuously a shared cultural core. A vocabulary. A set of shared referents and symbols.
  • So, first of all, Americans do need a list. But second, it should not be Hirsch’s list. And third, it should not be made the way he made his. In the balance of this essay, I want to unpack and explain each of those three statements.
  • If you take the time to read the book attached to Hirsch’s appendix, you’ll find a rather effective argument about the nature of background knowledge and public culture. Literacy is not just a matter of decoding the strings of letters that make up words or the meaning of each word in sequence. It is a matter of decoding context: the surrounding matrix of things referred to in the text and things implied by it
  • That means understanding what’s being said in public, in the media, in colloquial conversation. It means understanding what’s not being said. Literacy in the culture confers power, or at least access to power. Illiteracy, whether willful or unwitting, creates isolation from power.
  • his point about background knowledge and the content of shared public culture extends well beyond schoolbooks. They are applicable to the “texts” of everyday life, in commercial culture, in sports talk, in religious language, in politics. In all cases, people become literate in patterns—“schema” is the academic word Hirsch uses. They come to recognize bundles of concept and connotation like “Party of Lincoln.” They perceive those patterns of meaning the same way a chess master reads an in-game chessboard or the way a great baseball manager reads an at bat. And in all cases, pattern recognition requires literacy in particulars.
  • Lots and lots of particulars. This isn’t, or at least shouldn’t be, an ideologically controversial point. After all, parents on both left and right have come to accept recent research that shows that the more spoken words an infant or toddler hears, the more rapidly she will learn and advance in school. Volume and variety matter. And what is true about the vocabulary of spoken or written English is also true, one fractal scale up, about the vocabulary of American culture.
  • those who demonized Hirsch as a right-winger missed the point. Just because an endeavor requires fluency in the past does not make it worshipful of tradition or hostile to change.
  • radicalism is made more powerful when garbed in traditionalism. As Hirsch put it: “To be conservative in the means of communication is the road to effectiveness in modern life, in whatever direction one wishes to be effective.”
  • Hence, he argued, an education that in the name of progressivism disdains past forms, schema, concepts, figures, and symbols is an education that is in fact anti-progressive and “helps preserve the political and economic status quo.” This is true. And it is made more urgently true by the changes in American demography since Hirsch gave us his list in 1987.
  • If you are an immigrant to the United States—or, if you were born here but are the first in your family to go to college, and thus a socioeconomic new arrival; or, say, a black citizen in Ferguson, Missouri deciding for the first time to participate in a municipal election, and thus a civic neophyte—you have a single overriding objective shared by all immigrants at the moment of arrival: figure out how stuff really gets done here.
  • So, for instance, a statement like “One hundred and fifty years after Appomattox, our house remains deeply divided” assumes that the reader knows that Appomattox is both a place and an event; that the event signified the end of a war; that the war was the Civil War and had begun during the presidency of a man, Abraham Lincoln, who earlier had famously declared that “a house divided against itself cannot stand”; that the divisions then were in large part about slavery; and that the divisions today are over the political, social, and economic legacies of slavery and how or whether we are to respond to those legacies.
  • But why a list, one might ask? Aren’t lists just the very worst form of rote learning and standardized, mechanized education? Well, yes and no.
  • it’s not just newcomers who need greater command of common knowledge. People whose families have been here ten generations are often as ignorant about American traditions, mores, history, and idioms as someone “fresh off the boat.”
  • The more serious challenge, for Americans new and old, is to make a common culture that’s greater than the sum of our increasingly diverse parts. It’s not enough for the United States to be a neutral zone where a million little niches of identity might flourish; in order to make our diversity a true asset, Americans need those niches to be able to share a vocabulary. Americans need to be able to have a broad base of common knowledge so that diversity can be most fully activated.
  • as the pool of potential culture-makers has widened, the modes of culture creation have similarly shifted away from hierarchies and institutions to webs and networks. Wikipedia is the prime embodiment of this reality, both in how the online encyclopedia is crowd-created and how every crowd-created entry contains links to other entries.
  • so any endeavor that makes it easier for those who do not know the memes and themes of American civic life to attain them closes the opportunity gap. It is inherently progressive.
  • since I started writing this essay, dipping into the list has become a game my high-school-age daughter and I play together.
  • I’ll name each of those entries, she’ll describe what she thinks to be its meaning. If she doesn’t know, I’ll explain it and give some back story. If I don’t know, we’ll look it up together. This of course is not a good way for her teachers to teach the main content of American history or English. But it is definitely a good way for us both to supplement what school should be giving her.
  • And however long we end up playing this game, it is already teaching her a meta-lesson about the importance of cultural literacy. Now anytime a reference we’ve discussed comes up in the news or on TV or in dinner conversation, she can claim ownership. Sometimes she does so proudly, sometimes with a knowing look. My bet is that the satisfaction of that ownership, and the value of it, will compound as the years and her education progress.
  • The trouble is, there are also many items on Hirsch’s list that don’t seem particularly necessary for entry into today’s civic and economic mainstream.
  • Which brings us back to why diversity matters. The same diversity that makes it necessary to have and to sustain a unifying cultural core demands that Americans make the core less monochromatic, more inclusive, and continuously relevant for contemporary life
  • it’s worth unpacking the baseline assumption of both Hirsch’s original argument and the battles that erupted around it. The assumption was that multiculturalism sits in polar opposition to a traditional common culture, that the fight between multiculturalism and the common culture was zero-sum.
  • As scholars like Ronald Takaki made clear in books like A Different Mirror, the dichotomy made sense only to the extent that one imagined that nonwhite people had had no part in shaping America until they started speaking up in the second half of the twentieth century.
  • The truth, of course, is that since well before the formation of the United States, the United States has been shaped by nonwhites in its mores, political structures, aesthetics, slang, economic practices, cuisine, dress, song, and sensibility.
  • In its serious forms, multiculturalism never asserted that every racial group should have its own sealed and separate history or that each group’s history was equally salient to the formation of the American experience. It simply claimed that the omni-American story—of diversity and hybridity—was the legitimate American story.
  • as Nathan Glazer has put it (somewhat ruefully), “We are all multiculturalists now.” Americans have come to see—have chosen to see—that multiculturalism is not at odds with a single common culture; it is a single common culture.
  • it is true that in a finite school year, say, with finite class time and books of finite heft, not everything about everyone can be taught. There are necessary trade-offs. But in practice, recognizing the true and longstanding diversity of American identity is not an either-or. Learning about the internment of Japanese Americans does not block out knowledge of D-Day or Midway. It is additive.
  • As more diverse voices attain ever more forms of reach and power we need to re-integrate and reimagine Hirsch’s list of what literate Americans ought to know.
  • To be clear: A 21st-century omni-American approach to cultural literacy is not about crowding out “real” history with the perishable stuff of contemporary life. It’s about drawing lines of descent from the old forms of cultural expression, however formal, to their progeny, however colloquial.
  • Nor is Omni-American cultural literacy about raising the “self-esteem” of the poor, nonwhite, and marginalized. It’s about raising the collective knowledge of all—and recognizing that the wealthy, white, and powerful also have blind spots and swaths of ignorance
  • What, then, would be on your list? It’s not an idle question. It turns out to be the key to rethinking how a list should even get made.
  • the Internet has transformed who makes culture and how. As barriers to culture creation have fallen, orders of magnitude more citizens—amateurs—are able to shape the culture in which we must all be literate. Cat videos and Star Trek fan fiction may not hold up long beside Toni Morrison. But the entry of new creators leads to new claims of right: The right to be recognized. The right to be counted. The right to make the means of recognition and accounting.
  • It is true that lists alone, with no teaching to bring them to life and no expectation that they be connected to a broader education, are somewhere between useless and harmful.
  • This will be a list of nodes and nested networks. It will be a fractal of associations, which reflects far more than a linear list how our brains work and how we learn and create. Hirsch himself nodded to this reality in Cultural Literacy when he described the process he and his colleagues used for collecting items for their list, though he raised it by way of pointing out the danger of infinite regress.
  • His conclusion, appropriate to his times, was that you had to draw boundaries somewhere with the help of experts. My take, appropriate to our times, is that Americans can draw not boundaries so much as circles and linkages, concept sets and pathways among them.
  • Because 5,000 or even 500 items is too daunting a place to start, I ask here only for your top ten. What are ten things every American—newcomer or native born, affluent or indigent—should know? What ten things do you feel are both required knowledge and illuminating gateways to those unenlightened about American life? Here are my entries: Whiteness; The Federalist Papers; The Almighty Dollar; Organized labor; Reconstruction; Nativism; The American Dream; The Reagan Revolution; DARPA; A sucker born every minute.

Regulating Sex - The New York Times - 0 views

  • THIS is a strange moment for sex in America. We’ve detached it from pregnancy, matrimony and, in some circles, romance. At least, we no longer assume that intercourse signals the start of a relationship.
  • But the more casual sex becomes, the more we demand that our institutions and government police the line between what’s consensual and what isn’t. And we wonder how to define rape. Is it a violent assault or a violation of personal autonomy? Is a person guilty of sexual misconduct if he fails to get a clear “yes” through every step of seduction and consummation?
  • According to the doctrine of affirmative consent — the “yes means yes” rule — the answer is, well, yes, he is.
  • if one person can think he’s hooking up while the other feels she’s being raped, it makes sense to have a law that eliminates the possibility of misunderstanding. “You shouldn’t be allowed to make the assumption that if you find someone lying on a bed, they’re free for sexual pleasure,”
  • About a quarter of all states, and the District of Columbia, now say sex isn’t legal without positive agreement,
  • And though most people think of “yes means yes” as strictly for college students, it is actually poised to become the law of the land.
  • But criminal law is a very powerful instrument for reshaping sexual mores.
  • Should we really put people in jail for not doing what most people aren’t doing? (Or at least, not yet?)
  • It’s one thing to teach college students to talk frankly about sex and not to have it without demonstrable pre-coital assent. Colleges are entitled to uphold their own standards of comportment, even if enforcement of that behavior is spotty or indifferent to the rights of the accused. It’s another thing to make sex a crime under conditions of poor communication.
  • Most people just aren’t very talkative during the delicate tango that precedes sex, and the re-education required to make them more forthcoming would be a very big project. Nor are people unerringly good at decoding sexual signals. If they were, we wouldn’t have romantic comedies.
  • “If there’s no social consensus about what the lines are,” says Nancy Gertner, a senior lecturer at Harvard Law School and a retired judge, then affirmative consent “has no business being in the criminal law.”
  • The example points to a trend evident both on campuses and in courts: the criminalization of what we think of as ordinary sex and of sex previously considered unsavory but not illegal.
  • Some new crimes outlined in the proposed code, for example, assume consent to be meaningless under conditions of unequal power. Consensual sex between professionals (therapists, lawyers and the like) and their patients and clients, for instance, would be a fourth-degree felony, punishable by significant time in prison.
  • most of these occupations already have codes of professional conduct, and victims also have recourse in the civil courts. Miscreants, she says, “should be drummed out of the profession or sued for malpractice.”
  • It’s important to remember that people convicted of sex crimes may not only go to jail, they can wind up on a sex-offender registry, with dire and lasting consequences.
  • We shouldn’t forget the harm done to American communities by the national passion for incarceration, either. In a letter to the American Law Institute, Ms. Smith listed several disturbing statistics: roughly one person in 100 behind bars, one in 31 under correctional supervision
  • the case for affirmative consent is “compelling,” he says. Mr. Schulhofer has argued that being raped is much worse than having to endure that awkward moment when one stops to confirm that one’s partner is happy to continue. Silence or inertia, often interpreted as agreement, may actually reflect confusion, drunkenness or “frozen fright,” a documented physiological response in which a person under sexual threat is paralyzed by terror
  • To critics who object that millions of people are having sex without getting unqualified assent and aren’t likely to change their ways, he’d reply that millions of people drive 65 miles per hour despite a 55-mile-per-hour speed limit, but the law still saves lives. As long as “people know what the rules of the road are,” he says, “the overwhelming majority will comply with them.”
  • He understands that the law will have to bring a light touch to the refashioning of sexual norms, which is why the current draft of the model code suggests classifying penetration without consent as a misdemeanor, a much lesser crime than a felony.
  • This may all sound reasonable, but even a misdemeanor conviction goes on the record as a sexual offense and can lead to registration
  • An affirmative consent standard also shifts the burden of proof from the accuser to the accused, which represents a real departure from the traditions of criminal law in the United States. Affirmative consent effectively means that the accused has to show that he got the go-ahead
  • if the law requires a “no,” then the jury will likely perceive any uncertainty about that “no” as a weakness in the prosecution’s case and not convict. But if the law requires a “yes,” then ambiguity will bolster the prosecutor’s argument: The guy didn’t get unequivocal consent, therefore he must be guilty of rape.
  • “It’s an unworkable standard,” says the Harvard law professor Jeannie C. Suk. “It’s only workable if we assume it’s not going to be enforced, by and large.” But that’s worrisome too. Selectively enforced laws have a nasty history of being used to harass people deemed to be undesirable, because of their politics, race or other reasons.
  • it’s probably just a matter of time before “yes means yes” becomes the law in most states. Ms. Suk told me that she and her colleagues have noticed a generational divide between them and their students. As undergraduates, they’re learning affirmative consent in their mandatory sexual-respect training sessions, and they come to “believe that this really is the best way to define consent, as positive agreement,” she says. When they graduate and enter the legal profession, they’ll probably reshape the law to reflect that belief.
  • Sex may become safer for some, but it will be a whole lot more anxiety-producing for others.

Polling's Secrecy Problem - NYTimes.com - 0 views

  • In the polling world, no survey firm releases its microdata in a timely manner. When pollsters release it at all — usually months after publication, to an archive that requires a paid subscription for access — they seldom provide the detailed methodological explanations necessary to replicate the survey results.
  • Critics have raised charges of full-scale fabrication, like that alleged in the LaCour study, about a handful of pollsters in recent years, and such a wholly fraudulent poll might well be able to evade detection.
  • But a bigger problem is that pollsters may be using questionable means to ensure that their results end up in a certain place. Any pollster is required to make judgments about exactly how to weight the sample or decide who is likely to vote. But such choices can swing the results by several percentage points, and these decisions can be made with the results in mind.
  • In particular, there is reason to think that pollsters engage in a behavior known as herding, in which they announce results that are similar to those of other recent polls
  • There is strong evidence that some firms have engaged in herding.
  • The firm has since switched to more conventional weighting techniques, bringing the wild swings in the racial composition of the electorate to an end. Its findings now resemble those of other pollsters: offering no obvious signs of herding, but also not providing the data that would rule it out.
  • the association recently began a transparency initiative intended to address some of the concerns. But few polling organizations have signed up so far, and the transparency initiative’s disclosure standards generally fall short of what is needed to allow other researchers to identify or deter dubious practices.
  • It’s hard to say what’s more telling: that the transparency standards wouldn’t be enough to preclude bad practices, or that so few pollsters seem willing to adhere to even those requirements.
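The herding described in these excerpts can be made concrete with a rough statistical check: if late-cycle polls cluster more tightly than pure sampling error would allow, something other than chance is shrinking the spread. A minimal Python sketch of that comparison (the function name and the example numbers are illustrative, not taken from the article):

```python
import math

def herding_check(polls, n):
    """Compare the observed spread of poll results (vote shares, 0-1)
    against the spread expected from sampling error alone.
    polls: list of shares for the same candidate; n: sample size per poll.
    An observed sd far below the expected sd is consistent with herding."""
    mean = sum(polls) / len(polls)
    observed_var = sum((p - mean) ** 2 for p in polls) / (len(polls) - 1)
    expected_sd = math.sqrt(mean * (1 - mean) / n)  # binomial sampling error
    return math.sqrt(observed_var), expected_sd

# Five hypothetical late-cycle polls of 1,000 respondents, suspiciously tight:
obs, exp = herding_check([0.51, 0.51, 0.52, 0.51, 0.52], n=1000)
```

With n = 1,000 and a share near 50 percent, sampling error alone predicts a standard deviation of roughly 1.6 points; an observed spread of a fraction of that, as in the example, is the statistical fingerprint that herding analyses look for.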

Getting It Right - NYTimes.com - 1 views

  • What is it to truly know something?
  • In the complacent 1950s, it was received wisdom that we know a given proposition to be true if, and only if, it is true, we believe it to be true, and we are justified in so believing.
  • This consensus was exploded in a brief 1963 note by Edmund Gettier in the journal Analysis.
  • Suppose you have every reason to believe that you own a Bentley, since you have had it in your possession for many years, and you parked it that morning at its usual spot. However, it has just been destroyed by a bomb, so that you own no Bentley, despite your well justified belief that you do. As you sit in a cafe having your morning latte, you muse that someone in that cafe owns a Bentley (since after all you do). And it turns out you are right, but only because the other person in the cafe, the barista, owns a Bentley, which you have no reason to suspect. So you here have a well justified true belief that is not knowledge.
  • After many failed attempts to fix the justified-true-belief account with minor modifications, philosophers tried more radical departures. One promising approach suggests that knowledge is a form of action, comparable to an archer’s success when he consciously aims to hit a target.
  • An archer’s shot can be assessed in several ways. It can be accurate (successful in hitting the target). It can also be adroit (skillful or competent). An archery shot is adroit only if, as the arrow leaves the bow, it is oriented well and powerfully enough.
  • A shot’s aptness requires that its success be attained not just by luck (such as the luck of that second gust). The success must rather be a result of competence.
  • we can generalize from this example, to give an account of a fully successful attempt of any sort. Any attempt will have a distinctive aim and will thus be fully successful only if it succeeds not only adroitly but also aptly.
  • We need people to be willing to affirm things publicly. And we need them to be sincere (by and large) in doing so, by aligning public affirmation with private judgment. Finally, we need people whose assertions express what they actually know.
  • Aristotle in his “Nicomachean Ethics” developed an AAA account of attempts to lead a flourishing life in accord with fundamental human virtues (for example, justice or courage). Such an approach is called virtue ethics.
  • a fully successful attempt is good overall only if the agent’s goal is good enough. An attempt to murder an innocent person is not good even if it fully succeeds.
  • Virtue epistemology begins by recognizing assertions or affirmations.
  • A particularly important sort of affirmation is one aimed at attaining truth, at getting it right
  • All it takes for an affirmation to be alethic is that one of its aims be: getting it right.
  • Humans perform acts of public affirmation in the endeavor to speak the truth, acts with crucial importance to a linguistic species. We need such affirmations for activities of the greatest import for life in society: for collective deliberation and coordination, and for the sharing of information.
  • Since there is much truth that must be grasped if one is to flourish, some philosophers have begun to treat truth’s apt attainment as virtuous in the Aristotelian sense, and have developed a virtue epistemology
  • Virtue epistemology gives an AAA account of knowledge: to know affirmatively is to make an affirmation that is accurate (true) and adroit (which requires taking proper account of the evidence). But in addition, the affirmation must be apt; that is, its accuracy must be attributable to competence rather than luck.
  • Requiring knowledge to be apt (in addition to accurate and adroit) reconfigures epistemology as the ethics of belief.
  • as a bonus, it allows contemporary virtue epistemology to solve our Gettier problem. We now have an explanation for why you fail to know that someone in the cafe owns a Bentley, when your own Bentley has been destroyed by a bomb, but the barista happens to own one. Your belief in that case falls short of knowledge for the reason that it fails to be apt. You are right that someone in the cafe owns a Bentley, but the correctness of your belief does not manifest your cognitive or epistemic competence. You are right only because by epistemic luck the barista happens to own one.
  • When in your musings you affirm to yourself that someone in the cafe owns a Bentley, therefore, your affirmation is not an apt alethic affirmation, and hence falls short of knowledge.

The Spoils of Happiness - NYTimes.com - 0 views

  • Happiness is more like knowledge than like belief. There are lots of things we believe but don’t know. Knowledge is not just up to you, it requires the cooperation of the world beyond you — you might be mistaken. Still, even if you’re mistaken, you believe what you believe. Pleasure is like belief that way. But happiness isn’t just up to you. It also requires the cooperation of the world beyond you. Happiness, like knowledge, and unlike belief and pleasure, is not a state of mind.
  • If happiness is not a state of mind, if happiness is a kind of tango between your feelings on one hand and events and things in the world around you on the other, then there’s the possibility of error about whether you’re happy. If you believe you’re experiencing pleasure or, perhaps especially, pain, then, presumably, you are. But the view of happiness here allows that “you may think you’re happy, but you’re not.”
  • One especially apt way of thinking about happiness — a way that’s found already in the thought of Aristotle — is in terms of “flourishing.” Take someone really flourishing in their new career, or really flourishing now that they’re off in college. The sense of the expression is not just that they feel good, but that they’re, for example, accomplishing some things and taking appropriate pleasure in those accomplishments. If they were simply sitting at home playing video games all day, even if this seemed to give them a great deal of pleasure, and even if they were not frustrated, we wouldn’t say they were flourishing. Such a life could not in the long term constitute a happy life. To live a happy life is to flourish.
  • Happiness is harder to get. It’s enjoyed after you’ve worked for something, or in the presence of people you love, or upon experiencing a magnificent work of art or performance — the kind of state that requires us to engage in real activities of certain sorts, to confront real objects and respond to them.
  • viewing happiness as subject to external influence limits our control — not just in the sense that whether you get to live happily might depend on how things go, but also in the sense that what happiness is is partly a matter of how things beyond you are. We might do everything we can to live happily — and have everything it takes on our part to be happy, all the right thoughts and feelings — and yet fall short, even unbeknownst to us.

Kung Fu for Philosophers - NYTimes.com - 0 views

  • any ability resulting from practice and cultivation could accurately be said to embody kung fu.
  • the predominant orientation of traditional Chinese philosophy is the concern about how to live one’s life, rather than finding out the truth about reality.
  • Confucius’s call for “rectification of names” — one must use words appropriately — is more a kung fu method for securing sociopolitical order than for capturing the essence of things, as “names,” or words, are placeholders for expectations of how the bearer of the names should behave and be treated. This points to a realization of what J. L. Austin calls the “performative” function of language.
  • Instead of leading to a search for certainty, as Descartes’s dream did, Zhuangzi came to the realization that he had perceived “the transformation of things,” indicating that one should go along with this transformation rather than trying in vain to search for what is real.
  • the views of Mencius and his later opponent Xunzi about human nature are more recommendations of how one should view oneself in order to become a better person than metaphysical assertions about whether humans are by nature good or bad. Though the two men’s assertions about human nature are incompatible, they may still function inside the Confucian tradition as alternative ways of cultivation.
  • The Buddhist doctrine of no-self surely looks metaphysical, but its real aim is to free one from suffering, since according to Buddhism suffering comes ultimately from attachment to the self. Buddhist meditations are kung fu practices to shake off one’s attachment, and not just intellectual inquiries for getting propositional truth.
  • The essence of kung fu — various arts and instructions about how to cultivate the person and conduct one’s life — is often hard to digest for those who are used to the flavor and texture of mainstream Western philosophy. It is understandable that, even after sincere willingness to try, one is often still turned away by the lack of clear definitions of key terms and the absence of linear arguments in classic Chinese texts. This, however, is not a weakness, but rather a requirement of the kung fu orientation — not unlike the way that learning how to swim requires one to focus on practice and not on conceptual understanding.
  • It even expands epistemology into the non-conceptual realm in which the accessibility of knowledge is dependent on the cultivation of cognitive abilities, and not simply on whatever is “publicly observable” to everyone. It also shows that cultivation of the person is not confined to “knowing how.” An exemplary person may well have the great charisma to affect others but does not necessarily know how to affect others.
  • Western philosophy at its origin is similar to classic Chinese philosophy. The significance of this point is not merely in revealing historical facts. It calls our attention to a dimension that has been eclipsed by the obsession with the search for eternal, universal truth and the way it is practiced, namely through rational arguments.
  • One might well consider the Chinese kung fu perspective a form of pragmatism.  The proximity between the two is probably why the latter was well received in China early last century when John Dewey toured the country. What the kung fu perspective adds to the pragmatic approach, however, is its clear emphasis on the cultivation and transformation of the person, a dimension that is already in Dewey and William James but that often gets neglected
  • A kung fu master does not simply make good choices and use effective instruments to satisfy whatever preferences a person happens to have. In fact the subject is never simply accepted as a given. While an efficacious action may be the result of a sound rational decision, a good action that demonstrates kung fu has to be rooted in the entire person, including one’s bodily dispositions and sentiments, and its goodness is displayed not only through its consequences but also in the artistic style with which one performs it. It also brings forward what Charles Taylor calls the “background” — elements such as tradition and community — in our understanding of the formation of a person’s beliefs and attitudes. Through the kung fu approach, classic Chinese philosophy displays a holistic vision that brings together these marginalized dimensions and thereby forces one to pay close attention to the ways they affect each other.
  • This kung fu approach shares a lot of insights with the Aristotelian virtue ethics, which focuses on the cultivation of the agent instead of on the formulation of rules of conduct. Yet unlike Aristotelian ethics, the kung fu approach to ethics does not rely on any metaphysics for justification.
  • This approach opens up the possibility of allowing multiple competing visions of excellence, including the metaphysics or religious beliefs by which they are understood and guided, and justification of these beliefs is then left to the concrete human experiences.
  • it is more appropriate to consider kung fu as a form of art. Art is not ultimately measured by its dominance of the market. In addition, the function of art is not accurate reflection of the real world; its expression is not constrained to the form of universal principles and logical reasoning, and it requires cultivation of the artist, embodiment of virtues/virtuosities, and imagination and creativity.
  • If philosophy is “a way of life,” as Pierre Hadot puts it, the kung fu approach suggests that we take philosophy as the pursuit of the art of living well, and not just as a narrowly defined rational way of life.

To Justify Every 'A,' Some Professors Hand Over Grading Power to Outsiders - Technology... - 0 views

  • The best way to eliminate grade inflation is to take professors out of the grading process: Replace them with professional evaluators who never meet the students, and who don't worry that students will punish harsh grades with poor reviews. That's the argument made by leaders of Western Governors University, which has hired 300 adjunct professors who do nothing but grade student work.
  • These efforts raise the question: What if professors aren't that good at grading? What if the model of giving instructors full control over grades is fundamentally flawed? As more observers call for evidence of college value in an era of ever-rising tuition costs, game-changing models like these are getting serious consideration.
  • Professors do score poorly when it comes to fair grading, according to a study published in July in the journal Teachers College Record. After crunching the numbers on decades' worth of grade reports from about 135 colleges, the researchers found that average grades have risen for 30 years, and that A is now the most common grade given at most colleges. The authors, Stuart Rojstaczer and Christopher Healy, argue that a "consumer-based approach" to higher education has created subtle incentives for professors to give higher marks than deserved. "The standard practice of allowing professors free rein in grading has resulted in grades that bear little relation to actual performance," the two professors concluded.
  • Western Governors is entirely online, for one thing. Technically it doesn't offer courses; instead it provides mentors who help students prepare for a series of high-stakes homework assignments. Those assignments are designed by a team of professional test-makers to prove competence in various subject areas. The idea is that as long as students can leap all of those hurdles, they deserve degrees, whether or not they've ever entered a classroom, watched a lecture video, or participated in any other traditional teaching experience. The model is called "competency-based education."
  • Ms. Johnson explains that Western Governors essentially splits the role of the traditional professor into two jobs. Instructional duties fall to a group the university calls "course mentors," who help students master material. The graders, or evaluators, step in once the homework is filed, with the mind-set of, "OK, the teaching's done, now our job is to find out how much you know," says Ms. Johnson. They log on to a Web site called TaskStream and pluck the first assignment they see. The institution promises that every assignment will be graded within two days of submission.
  • Western Governors requires all evaluators to hold at least a master's degree in the subject they're grading.
  • Evaluators are required to write extensive comments on each task, explaining why the student passed or failed to prove competence in the requisite skill. No letter grades are given—students either pass or fail each task.
  • Another selling point is the software's fast response rate. It can grade a batch of 1,000 essay tests in minutes. Professors can set the software to return the grade immediately and can give students the option of making revisions and resubmitting their work on the spot.
  • The graders must regularly participate in "calibration exercises," in which they grade a simulated assignment to make sure they are all scoring consistently. As the phrase suggests, the process is designed to run like a well-oiled machine.
  • Other evaluators want to push talented students to do more than the university's requirements for a task, or to allow a struggling student to pass if he or she is just under the bar. "Some people just can't acclimate to a competency-based environment," says Ms. Johnson. "I tell them, If they don't buy this, they need to not be here."
  • She and some teaching assistants scored the tests by hand and compared their performance with the computer's.
  • The graduate students became fatigued and made mistakes after grading several tests in a row, she told me, "but the machine was right-on every time."
  • He argues that students like the idea that their tests are being evaluated in a consistent way.
  • All evaluators initially receive a month of training, conducted online, about how to follow each task's grading guidelines, which lay out characteristics of a passing score.
  • He said once students get essays back instantly, they start to view essay tests differently. "It's almost like a big math problem. You don't expect to get everything right the first time, but you work through it."
  • robot grading is the hottest trend in testing circles, says Jacqueline Leighton, a professor of educational psychology at the University of Alberta who edits the journal Educational Measurement: Issues and Practice. Companies building essay-grading robots include the Educational Testing Service, which sells e-rater, and Pearson Education, which makes Intelligent Essay Assessor. "The research is promising, but they're still very much in their infancy," Ms. Leighton says.

How To Repel Tourism « The Dish - 0 views

  • In short: Demanding a visa from a country’s travelers in advance is associated with a 70 percent lower level of tourist entries than from a similar country where there is no visa requirement. The U.S. requires an advance visa from citizens of 81 percent of the world’s countries; if it waived that requirement, the researchers estimate, inbound tourism arrivals would more than double, and tourism expenditure would climb by $123 billion.
  • what it is like to enter the US as a non-citizen. It’s a grueling, off-putting, frightening, and often brutal process. Compared with entering a European country, it’s like entering a police state. When you add the sheer difficulty of getting a visa, the brusque, rude and contemptuous treatment you routinely get from immigration officials at the border, the sense that all visitors are criminals and potential terrorists unless proven otherwise, the US remains one of the most unpleasant places for anyone in the world to try and get access to.
  • And this, of course, is a function not only of a vast and all-powerful bureaucracy. It’s a function of this country’s paranoia and increasing insularity. It’s a thoroughly democratic decision to keep foreigners out as much as possible. And it’s getting worse and worse.

Brainlike Computers, Learning From Experience - NYTimes.com - 0 views

  • Computers have entered the age when they are able to learn from their own mistakes, a development that is about to turn the digital world on its head.
  • Not only can it automate tasks that now require painstaking programming — for example, moving a robot’s arm smoothly and efficiently — but it can also sidestep and even tolerate errors, potentially making the term “computer crash” obsolete.
  • The new computing approach, already in use by some large technology companies, is based on the biological nervous system, specifically on how neurons react to stimuli and connect with other neurons to interpret information.
  • In coming years, the approach will make possible a new generation of artificial intelligence systems that will perform some functions that humans do with ease: see, speak, listen, navigate, manipulate and control.
  • “We’re moving from engineering computing systems to something that has many of the characteristics of biological computing,” said Larry Smarr
  • The new approach, used in both hardware and software, is being driven by the explosion of scientific knowledge about the brain. Kwabena Boahen, a computer scientist who leads Stanford’s Brains in Silicon research program, said that is also its limitation, as scientists are far from fully understanding how brains function.
  • They are not “programmed.” Rather the connections between the circuits are “weighted” according to correlations in data that the processor has already “learned.” Those weights are then altered as data flows in to the chip, causing them to change their values and to “spike.” That generates a signal that travels to other components and, in reaction, changes the neural network, in essence programming the next actions much the same way that information alters human thoughts and actions.
  • Traditional computers are also remarkably energy inefficient, especially when compared to actual brains, which the new neurons are built to mimic. I.B.M. announced last year that it had built a supercomputer simulation of the brain that encompassed roughly 10 billion neurons — more than 10 percent of a human brain. It ran about 1,500 times more slowly than an actual brain. Further, it required several megawatts of power, compared with just 20 watts of power used by the biological brain.
  • Running the program, known as Compass, which attempts to simulate a brain, at the speed of a human brain would require a flow of electricity in a conventional computer that is equivalent to what is needed to power both San Francisco and New York, Dr. Modha said.
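The weighting-and-spiking mechanism these excerpts describe can be sketched in a few lines. The following is a deliberately crude illustration of a leaky integrate-and-fire neuron with a Hebbian-style weight update — not IBM's Compass or any vendor's actual design; all class names and constants are invented for the example:

```python
# Illustrative sketch: a leaky integrate-and-fire neuron whose input
# weights strengthen when an input coincides with an output spike,
# a simple Hebbian rule standing in for "learning from data."
class SpikingNeuron:
    def __init__(self, n_inputs, threshold=1.0, leak=0.9, lr=0.05):
        self.w = [0.5] * n_inputs   # connection weights, initially equal
        self.v = 0.0                # membrane potential
        self.threshold, self.leak, self.lr = threshold, leak, lr

    def step(self, inputs):
        """inputs: list of 0/1 spikes. Returns 1 if the neuron fires."""
        # potential decays ("leaks") and accumulates weighted input
        self.v = self.v * self.leak + sum(w * x for w, x in zip(self.w, inputs))
        if self.v < self.threshold:
            return 0
        self.v = 0.0                # reset after firing
        # Hebbian update: inputs active at firing time are reinforced
        self.w = [min(1.0, w + self.lr) if x else w
                  for w, x in zip(self.w, inputs)]
        return 1

neuron = SpikingNeuron(n_inputs=3)
for _ in range(5):                  # inputs 0 and 1 repeatedly co-activate
    neuron.step([1, 1, 0])
# weights for inputs 0 and 1 grow with each spike; unused input 2 stays put
```

The point of the sketch is the contrast the article draws with conventional programming: nothing here encodes a rule about which inputs matter; the weights drift toward the correlations present in the incoming data.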

A Meditation on the Art of Not Trying - NYTimes.com - 0 views

  • It’s the default prescription for any tense situation: a blind date, a speech, a job interview, the first dinner with the potential in-laws. Relax. Act natural. Just be yourself. But when you’re nervous, how can you be yourself?
  • Edward Slingerland. He has developed, quite deliberately, a theory of spontaneity based on millenniums of Asian philosophy and decades of research by psychologists and neuroscientists.
  • He calls it the paradox of wu wei, the Chinese term for “effortless action.”
  • Wu wei is integral to romance, religion, politics and commerce. It’s why some leaders have charisma and why business executives insist on a drunken dinner before sealing a deal.
  • the quest for wu wei has been going on ever since humans began living in groups larger than hunter-gathering clans. Unable to rely on the bonds of kinship, the first urban settlements survived by developing shared values, typically through religion, that enabled people to trust one another’s virtue and to cooperate for the common good.
  • But there was always the danger that someone was faking it and would make a perfectly rational decision to put his own interest first if he had a chance to shirk his duty.
  • To be trusted, it wasn’t enough just to be a sensible, law-abiding citizen, and it wasn’t even enough to dutifully strive to be virtuous. You had to demonstrate that your virtue was so intrinsic that it came to you effortlessly.
  • the discovery in 1993 of bamboo strips in a tomb in the village of Guodian in central China. The texts on the bamboo, composed more than three centuries before Christ, emphasize that following rules and fulfilling obligations are not enough to maintain social order.
  • These texts tell aspiring politicians that they must have an instinctive sense of their duties to their superiors: “If you try to be filial, this is not true filiality; if you try to be obedient, this is not true obedience. You cannot try, but you also cannot not try.”
  • is that authentic wu wei? Not according to the rival school of Taoists that arose around the same time as Confucianism, in the fifth century B.C. It was guided by the Tao Te Ching, “The Classic of the Way and Virtue,” which took a direct shot at Confucius: “The worst kind of Virtue never stops striving for Virtue, and so never achieves Virtue.”
  • Through willpower and the rigorous adherence to rules, traditions and rituals, the Confucian “gentleman” was supposed to learn proper behavior so thoroughly that it would eventually become second nature to him.
  • Taoists did not strive. Instead of following the rigid training and rituals required by Confucius, they sought to liberate the natural virtue within. They went with the flow. They disdained traditional music in favor of a funkier new style with a beat. They emphasized personal meditation instead of formal scholarship.
  • Variations of this debate would take place among Zen Buddhist, Hindu and Christian philosophers, and continue today among psychologists and neuroscientists arguing how much of morality and behavior is guided by rational choices or by unconscious feelings.
  • “Psychological science suggests that the ancient Chinese philosophers were genuinely on to something,” says Jonathan Schooler, a psychologist at the University of California, Santa Barbara. “Particularly when one has developed proficiency in an area, it is often better to simply go with the flow. Paralysis through analysis and overthinking are very real pitfalls that the art of wu wei was designed to avoid.”
  • Before signing a big deal, businesspeople often insist on getting to know potential partners at a boozy meal because alcohol makes it difficult to fake feelings.
  • Some people, like politicians and salespeople, can get pretty good at faking spontaneity, but we’re constantly looking for ways to expose them.
  • However wu wei is attained, there’s no debate about the charismatic effect it creates. It conveys an authenticity that makes you attractive, whether you’re addressing a crowd or talking to one person.
  • what’s the best strategy for wu wei — trying or not trying? Dr. Slingerland recommends a combination. Conscious effort is necessary to learn a skill, and the Confucian emphasis on following rituals is in accord with psychological research showing we have a limited amount of willpower. Training yourself to follow rules automatically can be liberating, because it conserves cognitive energy for other tasks.
  • He likes the compromise approach of Mencius, a Chinese philosopher in the fourth century B.C. who combined the Confucian and Taoist approaches: Try, but not too hard.
  • “But in many domains actual success requires the ability to transcend our training and relax completely into what we are doing, or simply forget ourselves as agents.”
  • The sprouts were Mencius’ conception of wu wei: Something natural that requires gentle cultivation. You plant the seeds and water the sprouts, but at some point you need to let nature take its course. Just let the sprouts be themselves.

The Cost of Relativism - NYTimes.com - 0 views

  • One of America’s leading political scientists, Robert Putnam, has just come out with a book called “Our Kids” about the growing chasm between those who live in college-educated America and those who live in high-school-educated America
  • Reintroducing norms will require, first, a moral vocabulary. These norms weren’t destroyed because of people with bad values. They were destroyed by a plague of nonjudgmentalism, which refused to assert that one way of behaving was better than another. People got out of the habit of setting standards or understanding how they were set.
  • sympathy is not enough. It’s not only money and better policy that are missing in these circles; it’s norms.
  • The health of society is primarily determined by the habits and virtues of its citizens.
  • In many parts of America there are no minimally agreed upon standards for what it means to be a father. There are no basic codes and rules woven into daily life, which people can absorb unconsciously and follow automatically.
  • Roughly 10 percent of the children born to college grads grow up in single-parent households. Nearly 70 percent of children born to high school grads do. There are a bunch of charts that look like open scissors. In the 1960s or 1970s, college-educated and noncollege-educated families behaved roughly the same. But since then, behavior patterns have ever more sharply diverged. High-school-educated parents dine with their children less than college-educated parents, read to them less, talk to them less, take them to church less, encourage them less and spend less time engaging in developmental activity.
  • Next it will require holding people responsible. People born into the most chaotic situations can still be asked the same questions: Are you living for short-term pleasure or long-term good? Are you living for yourself or for your children? Do you have the freedom of self-control or are you in bondage to your desires?
  • Next it will require holding everybody responsible. America is obviously not a country in which the less educated are behaving irresponsibly and the more educated are beacons of virtue. America is a country in which privileged people suffer from their own characteristic forms of self-indulgence: the tendency to self-segregate, the comprehensive failures of leadership in government and industry.
  • People sometimes wonder why I’ve taken this column in a spiritual and moral direction of late. It’s in part because we won’t have social repair unless we are more morally articulate, unless we have clearer definitions of how we should be behaving at all levels.
  • History is full of examples of moral revival, when social chaos was reversed, when behavior was tightened and norms reasserted. It happened in England in the 1830s and in the U.S. amid economic stress in the 1930s.

Great Scientists Don't Need Math - WSJ - 0 views

  • Without advanced math, how can you do serious work in the sciences? Well, I have a professional secret to share: Many of the most successful scientists in the world today are mathematically no more than semiliterate.
  • I was reassured by the discovery that superior mathematical ability is similar to fluency in foreign languages. I might have become fluent with more effort and sessions talking with the natives, but being swept up with field and laboratory research, I advanced only by a small amount.
  • Far more important throughout the rest of science is the ability to form concepts, during which the researcher conjures images and processes by intuition.
  • exceptional mathematical fluency is required in only a few disciplines, such as particle physics, astrophysics and information theory
  • When something new is encountered, the follow-up steps usually require mathematical and statistical methods to move the analysis forward. If that step proves too technically difficult for the person who made the discovery, a mathematician or statistician can be added as a collaborator
  • Ideas in science emerge most readily when some part of the world is studied for its own sake. They follow from thorough, well-organized knowledge of all that is known or can be imagined of real entities and processes within that fragment of existence
  • Ramped up and disciplined, fantasies are the fountainhead of all creative thinking. Newton dreamed, Darwin dreamed, you dream. The images evoked are at first vague. They may shift in form and fade in and out. They grow a bit firmer when sketched as diagrams on pads of paper, and they take on life as real examples are sought and found.
  • Over the years, I have co-written many papers with mathematicians and statisticians, so I can offer the following principle with confidence. Call it Wilson's Principle No. 1: It is far easier for scientists to acquire needed collaboration from mathematicians and statisticians than it is for mathematicians and statisticians to find scientists able to make use of their equations.
  • If your level of mathematical competence is low, plan to raise it, but meanwhile, know that you can do outstanding scientific work with what you have. Think twice, though, about specializing in fields that require a close alternation of experiment and quantitative analysis. These include most of physics and chemistry, as well as a few specialties in molecular biology.
  • Newton invented calculus in order to give substance to his imagination
  • Darwin had little or no mathematical ability, but with the masses of information he had accumulated, he was able to conceive a process to which mathematics was later applied.
  • For aspiring scientists, a key first step is to find a subject that interests them deeply and focus on it. In doing so, they should keep in mind Wilson's Principle No. 2: For every scientist, there exists a discipline for which his or her level of mathematical competence is enough to achieve excellence.

Philosophy's True Home - The New York Times - 0 views

  • We’ve all heard the argument that philosophy is isolated, an “ivory tower” discipline cut off from virtually every other progress-making pursuit of knowledge, including math and the sciences, as well as from the actual concerns of daily life. The reasons given for this are many. In a widely read essay in this series, “When Philosophy Lost Its Way,” Robert Frodeman and Adam Briggle claim that it was philosophy’s institutionalization in the university in the late 19th century that separated it from the study of humanity and nature, now the province of social and natural sciences.
  • This institutionalization, the authors claim, led it to betray its central aim of articulating the knowledge needed to live virtuous and rewarding lives. I have a different view: Philosophy isn’t separated from the social, natural or mathematical sciences, nor is it neglecting the study of goodness, justice and virtue, which was never its central aim.
  • identified philosophy with informal linguistic analysis. Fortunately, this narrow view didn’t stop them from contributing to the science of language and the study of law. Now long gone, neither movement defined the philosophy of its day and neither arose from locating it in universities.
  • The authors claim that philosophy abandoned its relationship to other disciplines by creating its own purified domain, accessible only to credentialed professionals. It is true that from roughly 1930 to 1950, some philosophers — logical empiricists, in particular — did speak of philosophy having its own exclusive subject matter. But since that subject matter was logical analysis aimed at unifying all of science, interdisciplinarity was front and center.
  • Philosophy also played a role in 20th-century physics, influencing the great physicists Albert Einstein, Niels Bohr and Werner Heisenberg. The philosophers Moritz Schlick and Hans Reichenbach reciprocated that interest by assimilating the new physics into their philosophies.
  • developed ideas relating logic to linguistic meaning that provided a framework for studying meaning in all human languages. Others, including Paul Grice and J.L. Austin, explained how linguistic meaning mixes with contextual information to enrich communicative contents and how certain linguistic performances change social facts. Today a new philosophical conception of the relationship between meaning and cognition adds a further dimension to linguistic science.
  • Decision theory — the science of rational norms governing action, belief and decision under uncertainty — was developed by the 20th-century philosophers Frank Ramsey, Rudolf Carnap, Richard Jeffrey and others. It plays a foundational role in political science and economics by telling us what rationality requires, given our evidence, priorities and the strength of our beliefs. Today, no area of philosophy is more successful in attracting top young minds.
  • Philosophy also assisted psychology in its long march away from narrow behaviorism and speculative Freudianism. The mid-20th-century functionalist perspective pioneered by Hilary Putnam was particularly important. According to it, pain, pleasure and belief are neither behavioral dispositions nor bare neurological states. They are interacting internal causes, capable of very different physical realizations, that serve the goals of individuals in specific ways. This view is now embedded in cognitive psychology and neuroscience.
  • philosopher-mathematicians Gottlob Frege, Bertrand Russell, Kurt Gödel, Alonzo Church and Alan Turing invented symbolic logic, helped establish the set-theoretic foundations of mathematics, and gave us the formal theory of computation that ushered in the digital age
  • Philosophy of biology is following a similar path. Today’s philosophy of science is less accessible than Aristotle’s natural philosophy chiefly because it systematizes a larger, more technically sophisticated body of knowledge.
  • Philosophy’s interaction with mathematics, linguistics, economics, political science, psychology and physics requires specialization. Far from fostering isolation, this specialization makes communication and cooperation among disciplines possible. This has always been so.
  • Nor did scientific progress rob philosophy of its former scientific subject matter, leaving it to concentrate on the broadly moral. In fact, philosophy thrives when enough is known to make progress conceivable, but it remains unachieved because of methodological confusion. Philosophy helps break the impasse by articulating new questions, posing possible solutions and forging new conceptual tools.
  • Our knowledge of the universe and ourselves expands like a ripple surrounding a pebble dropped in a pool. As we move away from the center of the spreading circle, its area, representing our secure knowledge, grows. But so does its circumference, representing the border where knowledge blurs into uncertainty and speculation, and methodological confusion returns. Philosophy patrols the border, trying to understand how we got there and to conceptualize our next move.  Its job is unending.
  • Although progress in ethics, political philosophy and the illumination of life’s meaning has been less impressive than advances in some other areas, it is accelerating.
  • the advances in our understanding because of careful formulation and critical evaluation of theories of goodness, rightness, justice and human flourishing by philosophers since 1970 compare well to the advances made by philosophers from Aristotle to 1970
  • The knowledge required to maintain philosophy’s continuing task, including its vital connection to other disciplines, is too vast to be held in one mind. Despite the often-repeated idea that philosophy’s true calling can only be fulfilled in the public square, philosophers actually function best in universities, where they acquire and share knowledge with their colleagues in other disciplines. It is also vital for philosophers to engage students — both those who major in the subject, and those who do not. Although philosophy has never had a mass audience, it remains remarkably accessible to the average student; unlike the natural sciences, its frontiers can be reached in a few undergraduate courses.

After the Fact - The New Yorker - 1 views

  • newish is the rhetoric of unreality, the insistence, chiefly by Democrats, that some politicians are incapable of perceiving the truth because they have an epistemological deficit: they no longer believe in evidence, or even in objective reality.
  • the past of proof is strange and, on its uncertain future, much in public life turns. In the end, it comes down to this: the history of truth is cockamamie, and lately it’s been getting cockamamier.
  • Michael P. Lynch is a philosopher of truth. His fascinating new book, “The Internet of Us: Knowing More and Understanding Less in the Age of Big Data,” begins with a thought experiment: “Imagine a society where smartphones are miniaturized and hooked directly into a person’s brain.” As thought experiments go, this one isn’t much of a stretch. (“Eventually, you’ll have an implant,” Google’s Larry Page has promised, “where if you think about a fact it will just tell you the answer.”) Now imagine that, after living with these implants for generations, people grow to rely on them, to know what they know and forget how people used to learn—by observation, inquiry, and reason. Then picture this: overnight, an environmental disaster destroys so much of the planet’s electronic-communications grid that everyone’s implant crashes. It would be, Lynch says, as if the whole world had suddenly gone blind. There would be no immediate basis on which to establish the truth of a fact. No one would really know anything anymore, because no one would know how to know. I Google, therefore I am not.
  • In England, the abolition of trial by ordeal led to the adoption of trial by jury for criminal cases. This required a new doctrine of evidence and a new method of inquiry, and led to what the historian Barbara Shapiro has called “the culture of fact”: the idea that an observed or witnessed act or thing—the substance, the matter, of fact—is the basis of truth and the only kind of evidence that’s admissible not only in court but also in other realms where truth is arbitrated. Between the thirteenth century and the nineteenth, the fact spread from law outward to science, history, and journalism.
  • Lynch isn’t terribly interested in how we got here. He begins at the arrival gate. But altering the flight plan would seem to require going back to the gate of departure.
  • Lynch thinks we are frighteningly close to this point: blind to proof, no longer able to know. After all, we’re already no longer able to agree about how to know. (See: climate change, above.)
  • Empiricists believed they had deduced a method by which they could discover a universe of truth: impartial, verifiable knowledge. But the movement of judgment from God to man wreaked epistemological havoc.
  • For the length of the eighteenth century and much of the nineteenth, truth seemed more knowable, but after that it got murkier. Somewhere in the middle of the twentieth century, fundamentalism and postmodernism, the religious right and the academic left, met up: either the only truth is the truth of the divine or there is no truth; for both, empiricism is an error.
  • That epistemological havoc has never ended: much of contemporary discourse and pretty much all of American politics is a dispute over evidence. An American Presidential debate has a lot more in common with trial by combat than with trial by jury,
  • came the Internet. The era of the fact is coming to an end: the place once held by “facts” is being taken over by “data.” This is making for more epistemological mayhem, not least because the collection and weighing of facts require investigation, discernment, and judgment, while the collection and analysis of data are outsourced to machines
  • “Most knowing now is Google-knowing—knowledge acquired online,”
  • We now only rarely discover facts, Lynch observes; instead, we download them.
  • “The Internet didn’t create this problem, but it is exaggerating it,”
  • nothing could be less well settled in the twenty-first century than whether people know what they know from faith or from facts, or whether anything, in the end, can really be said to be fully proved.
  • In his 2012 book, “In Praise of Reason,” Lynch identified three sources of skepticism about reason: the suspicion that all reasoning is rationalization, the idea that science is just another faith, and the notion that objectivity is an illusion. These ideas have a specific intellectual history, and none of them are on the wane.
  • Their consequences, he believes, are dire: “Without a common background of standards against which we measure what counts as a reliable source of information, or a reliable method of inquiry, and what doesn’t, we won’t be able to agree on the facts, let alone values.
  • When we Google-know, Lynch argues, we no longer take responsibility for our own beliefs, and we lack the capacity to see how bits of facts fit into a larger whole
  • Essentially, we forfeit our reason and, in a republic, our citizenship. You can see how this works every time you try to get to the bottom of a story by reading the news on your smartphone.
  • what you see when you Google “Polish workers” is a function of, among other things, your language, your location, and your personal Web history. Reason can’t defend itself. Neither can Google.
  • Trump doesn’t reason. He’s a lot like that kid who stole my bat. He wants combat. Cruz’s appeal is to the judgment of God. “Father God, please . . . awaken the body of Christ, that we might pull back from the abyss,” he preached on the campaign trail. Rubio’s appeal is to Google.
  • Is there another appeal? People who care about civil society have two choices: find some epistemic principles other than empiricism on which everyone can agree or else find some method other than reason with which to defend empiricism
  • Lynch suspects that doing the first of these things is not possible, but that the second might be. He thinks the best defense of reason is a common practical and ethical commitment.
  • That, anyway, is what Alexander Hamilton meant in the Federalist Papers, when he explained that the United States is an act of empirical inquiry: “It seems to have been reserved to the people of this country, by their conduct and example, to decide the important question, whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force.”

At the Existentialist Café: Freedom, Being, and Apricot Cocktails with Jean-P... - 0 views

  • The phenomenologists’ leading thinker, Edmund Husserl, provided a rallying cry, ‘To the things themselves!’ It meant: don’t waste time on the interpretations that accrue upon things, and especially don’t waste time wondering whether the things are real. Just look at this that’s presenting itself to you, whatever this may be, and describe it as precisely as possible.
  • You might think you have defined me by some label, but you are wrong, for I am always a work in progress. I create myself constantly through action, and this is so fundamental to my human condition that, for Sartre, it is the human condition, from the moment of first consciousness to the moment when death wipes it out. I am my own freedom: no more, no less.
  • Sartre wrote like a novelist — not surprisingly, since he was one. In his novels, short stories and plays as well as in his philosophical treatises, he wrote about the physical sensations of the world and the structures and moods of human life. Above all, he wrote about one big subject: what it meant to be free. Freedom, for him, lay at the heart of all human experience, and this set humans apart from all other kinds of object.
  • Sartre listened to his problem and said simply, ‘You are free, therefore choose — that is to say, invent.’ No signs are vouchsafed in this world, he said. None of the old authorities can relieve you of the burden of freedom. You can weigh up moral or practical considerations as carefully as you like, but ultimately you must take the plunge and do something, and it’s up to you what that something is.
  • Even if the situation is unbearable — perhaps you are facing execution, or sitting in a Gestapo prison, or about to fall off a cliff — you are still free to decide what to make of it in mind and deed. Starting from where you are now, you choose. And in choosing, you also choose who you will be.
  • The war had made people realise that they and their fellow humans were capable of departing entirely from civilised norms; no wonder the idea of a fixed human nature seemed questionable.
  • If this sounds difficult and unnerving, it’s because it is. Sartre does not deny that the need to keep making decisions brings constant anxiety. He heightens this anxiety by pointing out that what you do really matters. You should make your choices as though you were choosing on behalf of the whole of humanity, taking the entire burden of responsibility for how the human race behaves. If you avoid this responsibility by fooling yourself that you are the victim of circumstance or of someone else’s bad advice, you are failing to meet the demands of human life and choosing a fake existence, cut off from your own ‘authenticity’.
  • Along with the terrifying side of this comes a great promise: Sartre’s existentialism implies that it is possible to be authentic and free, as long as you keep up the effort.
  • almost all agreed that it was, as an article in Les nouvelles littéraires phrased it, a ‘sickening mixture of philosophic pretentiousness, equivocal dreams, physiological technicalities, morbid tastes and hesitant eroticism … an introspective embryo that one would take distinct pleasure in crushing’.
  • he offered a philosophy designed for a species that had just scared the hell out of itself, but that finally felt ready to grow up and take responsibility.
  • In this rebellious world, just as with the Parisian bohemians and Dadaists in earlier generations, everything that was dangerous and provocative was good, and everything that was nice or bourgeois was bad.
  • Such interweaving of ideas and life had a long pedigree, although the existentialists gave it a new twist. Stoic and Epicurean thinkers in the classical world had practised philosophy as a means of living well, rather than of seeking knowledge or wisdom for their own sake. By reflecting on life’s vagaries in philosophical ways, they believed they could become more resilient, more able to rise above circumstances, and better equipped to manage grief, fear, anger, disappointment or anxiety.
  • In the tradition they passed on, philosophy is neither a pure intellectual pursuit nor a collection of cheap self-help tricks, but a discipline for flourishing and living a fully human, responsible life.
  • For Kierkegaard, Descartes had things back to front. In his own view, human existence comes first: it is the starting point for everything we do, not the result of a logical deduction. My existence is active: I live it and choose it, and this precedes any statement I can make about myself.
  • Studying our own moral genealogy cannot help us to escape or transcend ourselves. But it can enable us to see our illusions more clearly and lead a more vital, assertive existence.
  • What was needed, he felt, was not high moral or theological ideals, but a deeply critical form of cultural history or ‘genealogy’ that would uncover the reasons why we humans are as we are, and how we came to be that way. For him, all philosophy could even be redefined as a form of psychology, or history.
  • For those oppressed on grounds of race or class, or for those fighting against colonialism, existentialism offered a change of perspective — literally, as Sartre proposed that all situations be judged according to how they appeared in the eyes of those most oppressed, or those whose suffering was greatest.
  • She observed that we need not expect moral philosophers to ‘live by’ their ideas in a simplistic way, as if they were following a set of rules. But we can expect them to show how their ideas are lived in. We should be able to look in through the windows of a philosophy, as it were, and see how people occupy it, how they move about and how they conduct themselves.
  • the existentialists inhabited their historical and personal world, as they inhabited their ideas. This notion of ‘inhabited philosophy’ is one I’ve borrowed from the English philosopher and novelist Iris Murdoch, who wrote the first full-length book on Sartre and was an early adopter of existentialism
  • What is existentialism anyway?
  • An existentialist who is also phenomenological provides no easy rules for dealing with this condition, but instead concentrates on describing lived experience as it presents itself. — By describing experience well, he or she hopes to understand this existence and awaken us to ways of living more authentic lives.
  • Existentialists concern themselves with individual, concrete human existence. — They consider human existence different from the kind of being other things have. Other entities are what they are, but as a human I am whatever I choose to make of myself at every moment. I am free — — and therefore I’m responsible for everything I do, a dizzying fact which causes — an anxiety inseparable from human existence itself.
  • On the other hand, I am only free within situations, which can include factors in my own biology and psychology as well as physical, historical and social variables of the world into which I have been thrown. — Despite the limitations, I always want more: I am passionately involved in personal projects of all kinds. — Human existence is thus ambiguous: at once boxed in by borders and yet transcendent and exhilarating. —
  • The first part of this is straightforward: a phenomenologist’s job is to describe. This is the activity that Husserl kept reminding his students to do. It meant stripping away distractions, habits, clichés of thought, presumptions and received ideas, in order to return our attention to what he called the ‘things themselves’. We must fix our beady gaze on them and capture them exactly as they appear, rather than as we think they are supposed to be.
  • Husserl therefore says that, to phenomenologically describe a cup of coffee, I should set aside both the abstract suppositions and any intrusive emotional associations. Then I can concentrate on the dark, fragrant, rich phenomenon in front of me now. This ‘setting aside’ or ‘bracketing out’ of speculative add-ons Husserl called epoché — a term borrowed from the ancient Sceptics,
  • The point about rigour is crucial; it brings us back to the first half of the command to describe phenomena. A phenomenologist cannot get away with listening to a piece of music and saying, ‘How lovely!’ He or she must ask: is it plaintive? is it dignified? is it colossal and sublime? The point is to keep coming back to the ‘things themselves’ — phenomena stripped of their conceptual baggage — so as to bail out weak or extraneous material and get to the heart of the experience.
  • Husserlian ‘bracketing out’ or epoché allows the phenomenologist to temporarily ignore the question ‘But is it real?’, in order to ask how a person experiences his or her world. Phenomenology gives a formal mode of access to human experience. It lets philosophers talk about life more or less as non-philosophers do, while still being able to tell themselves they are being methodical and rigorous.
  • Besides claiming to transform the way we think about reality, phenomenologists promised to change how we think about ourselves. They believed that we should not try to find out what the human mind is, as if it were some kind of substance. Instead, we should consider what it does, and how it grasps its experiences.
  • For Brentano, this reaching towards objects is what our minds do all the time. Our thoughts are invariably of or about something, he wrote: in love, something is loved, in hatred, something is hated, in judgement, something is affirmed or denied. Even when I imagine an object that isn’t there, my mental structure is still one of ‘about-ness’ or ‘of-ness’.
  • Except in deepest sleep, my mind is always engaged in this aboutness: it has ‘intentionality’. Having taken the germ of this from Brentano, Husserl made it central to his whole philosophy.
  • Husserl saw in the idea of intentionality a way to sidestep two great unsolved puzzles of philosophical history: the question of what objects ‘really’ are, and the question of what the mind ‘really’ is. By doing the epoché and bracketing out all consideration of reality from both topics, one is freed to concentrate on the relationship in the middle. One can apply one’s descriptive energies to the endless dance of intentionality that takes place in our lives: the whirl of our minds as they seize their intended phenomena one after the other and whisk them around the floor.
  • Understood in this way, the mind is hardly anything at all: it is its aboutness. This makes the human mind (and possibly some animal minds) different from any other naturally occurring entity. Nothing else can be as thoroughly about or of things as the mind is:
  • Some Eastern meditation techniques aim to still this scurrying creature, but the extreme difficulty of this shows how unnatural it is to be mentally inert. Left to itself, the mind reaches out in all directions as long as it is awake — and even carries on doing it in the dreaming phase of its sleep.
  • a mind that is experiencing nothing, imagining nothing, or speculating about nothing can hardly be said to be a mind at all.
  • Three simple ideas — description, phenomenon, intentionality — provided enough inspiration to keep roomfuls of Husserlian assistants busy in Freiburg for decades. With all of human existence awaiting their attention, how could they ever run out of things to do?
  • For Sartre, this gives the mind an immense freedom. If we are nothing but what we think about, then no predefined ‘inner nature’ can hold us back. We are protean.
  • Real, not real; inside, outside; what difference did it make? Reflecting on this, Husserl began turning his phenomenology into a branch of ‘idealism’ — the philosophical tradition which denied external reality and defined everything as a kind of private hallucination.
  • For Sartre, if we try to shut ourselves up inside our own minds, ‘in a nice warm room with the shutters closed’, we cease to exist. We have no cosy home: being out on the dusty road is the very definition of what we are.
  • One might think that, if Heidegger had anything worth saying, he could have communicated it in ordinary language. The fact is that he does not want to be ordinary, and he may not even want to communicate in the usual sense. He wants to make the familiar obscure, and to vex us. George Steiner thought that Heidegger’s purpose was less to be understood than to be experienced through a ‘felt strangeness’.
  • He takes Dasein in its most ordinary moments, then talks about it in the most innovative way he can. For Heidegger, Dasein’s everyday Being is right here: it is Being-in-the-world, or In-der-Welt-sein. The main feature of Dasein’s everyday Being-in-the-world right here is that it is usually busy doing something.
  • Thus, for Heidegger, all Being-in-the-world is also a ‘Being-with’ or Mitsein. We cohabit with others in a ‘with-world’, or Mitwelt. The old philosophical problem of how we prove the existence of other minds has now vanished. Dasein swims in the with-world long before it wonders about other minds.
  • Sometimes the best-educated people were those least inclined to take the Nazis seriously, dismissing them as too absurd to last. Karl Jaspers was one of those who made this mistake, as he later recalled, and Beauvoir observed similar dismissive attitudes among the French students in Berlin.
  • In any case, most of those who disagreed with Hitler’s ideology soon learned to keep their views to themselves. If a Nazi parade passed on the street, they would either slip out of view or give the obligatory salute like everyone else, telling themselves that the gesture meant nothing if they did not believe in it. As the psychologist Bruno Bettelheim later wrote of this period, few people will risk their life for such a small thing as raising an arm — yet that is how one’s powers of resistance are eroded away, and eventually one’s responsibility and integrity go with them.
  • for Arendt, if you do not respond adequately when the times demand it, you show a lack of imagination and attention that is as dangerous as deliberately committing an abuse. It amounts to disobeying the one command she had absorbed from Heidegger in those Marburg days: Think!
  • ‘Everything takes place under a kind of anaesthesia. Objectively dreadful events produce a thin, puny emotional response. Murders are committed like schoolboy pranks. Humiliation and moral decay are accepted like minor incidents.’ Haffner thought modernity itself was partly to blame: people had become yoked to their habits and to mass media, forgetting to stop and think, or to disrupt their routines long enough to question what was going on.
  • Heidegger’s former lover and student Hannah Arendt would argue, in her 1951 study The Origins of Totalitarianism, that totalitarian movements thrived at least partly because of this fragmentation in modern lives, which made people more vulnerable to being swept away by demagogues. Elsewhere, she coined the phrase ‘the banality of evil’ to describe the most extreme failures of personal moral awareness.
  • His communicative ideal fed into a whole theory of history: he traced all civilisation to an ‘Axial Period’ in the fifth century BC, during which philosophy and culture exploded simultaneously in Europe, the Middle East and Asia, as though a great bubble of minds had erupted from the earth’s surface. ‘True philosophy needs communion to come into existence,’ he wrote, and added, ‘Uncommunicativeness in a philosopher is virtually a criterion of the untruth of his thinking.’
  • The idea of being called to authenticity became a major theme in later existentialism, the call being interpreted as saying something like ‘Be yourself!’, as opposed to being phony. For Heidegger, the call is more fundamental than that. It is a call to take up a self that you didn’t know you had: to wake up to your Being. Moreover, it is a call to action. It requires you to do something: to take a decision of some sort.
  • Being and Time contained at least one big idea that should have been of use in resisting totalitarianism. Dasein, Heidegger wrote there, tends to fall under the sway of something called das Man or ‘the they’ — an impersonal entity that robs us of the freedom to think for ourselves. To live authentically requires resisting or outwitting this influence, but this is not easy because das Man is so nebulous. Man in German does not mean ‘man’ as in English (that’s der Mann), but a neutral abstraction, something like ‘one’ in the English phrase ‘one doesn’t do that’,
  • for Heidegger, das Man is me. It is everywhere and nowhere; it is nothing definite, but each of us is it. As with Being, it is so ubiquitous that it is difficult to see. If I am not careful, however, das Man takes over the important decisions that should be my own. It drains away my responsibility or ‘answerability’. As Arendt might put it, we slip into banality, failing to think.
  • Jaspers focused on what he called Grenzsituationen — border situations, or limit situations. These are the moments when one finds oneself constrained or boxed in by what is happening, but at the same time pushed by these events towards the limits or outer edge of normal experience. For example, you might have to make a life-or-death choice, or something might remind you suddenly of your mortality,
  • Jaspers’ interest in border situations probably had much to do with his own early confrontation with mortality. From childhood, he had suffered from a heart condition so severe that he always expected to die at any moment. He also had emphysema, which forced him to speak slowly, taking long pauses to catch his breath. Both illnesses meant that he had to budget his energies with care in order to get his work done without endangering his life.
  • If I am to resist das Man, I must become answerable to the call of my ‘voice of conscience’. This call does not come from God, as a traditional Christian definition of the voice of conscience might suppose. It comes from a truly existentialist source: my own authentic self. Alas, this voice is one I do not recognise and may not hear, because it is not the voice of my habitual ‘they-self’. It is an alien or uncanny version of my usual voice. I am familiar with my they-self, but not with my unalienated voice — so, in a weird twist, my real voice is the one that sounds strangest to me.
  • Marcel developed a strongly theological branch of existentialism. His faith distanced him from both Sartre and Heidegger, but he shared a sense of how history makes demands on individuals. In his essay ‘On the Ontological Mystery’, written in 1932 and published in the fateful year of 1933, Marcel wrote of the human tendency to become stuck in habits, received ideas, and a narrow-minded attachment to possessions and familiar scenes. Instead, he urged his readers to develop a capacity for remaining ‘available’ to situations as they arise. Similar ideas of disponibilité or availability had been explored by other writers,
  • Marcel made it his central existential imperative. He was aware of how rare and difficult it was. Most people fall into what he calls ‘crispation’: a tensed, encrusted shape in life — ‘as though each one of us secreted a kind of shell which gradually hardened and imprisoned him’.
  • Bettelheim later observed that, under Nazism, only a few people realised at once that life could not continue unaltered: these were the ones who got away quickly. Bettelheim himself was not among them. Caught in Austria when Hitler annexed it, he was sent first to Dachau and then to Buchenwald, but was then released in a mass amnesty to celebrate Hitler’s birthday in 1939 — an extraordinary reprieve, after which he left at once for America.
  • we are used to reading philosophy as offering a universal message for all times and places — or at least as aiming to do so. But Heidegger disliked the notion of universal truths or universal humanity, which he considered a fantasy. For him, Dasein is not defined by shared faculties of reason and understanding, as the Enlightenment philosophers thought. Still less is it defined by any kind of transcendent eternal soul, as in religious tradition. We do not exist on a higher, eternal plane at all. Dasein’s Being is local: it has a historical situation, and is constituted in time and place.
  • For Marcel, learning to stay open to reality in this way is the philosopher’s prime job. Everyone can do it, but the philosopher is the one who is called on above all to stay awake, so as to be the first to sound the alarm if something seems wrong.
  • Second, it also means understanding that we are historical beings, and grasping the demands our particular historical situation is making on us. In what Heidegger calls ‘anticipatory resoluteness’, Dasein discovers ‘that its uttermost possibility lies in giving itself up’. At that moment, through Being-towards-death and resoluteness in facing up to one’s time, one is freed from the they-self and attains one’s true, authentic self.
  • If we are temporal beings by our very nature, then authentic existence means accepting, first, that we are finite and mortal. We will die: this all-important realisation is what Heidegger calls authentic ‘Being-towards-Death’, and it is fundamental to his philosophy.
  • Hannah Arendt, instead, left early on: she had the benefit of a powerful warning. Just after the Nazi takeover, in spring 1933, she had been arrested while researching materials on anti-Semitism for the German Zionist Organisation at Berlin’s Prussian State Library. Her apartment was searched; both she and her mother were locked up briefly, then released. They fled, without stopping to arrange travel documents. They crossed to Czechoslovakia (then still safe) by a method that sounds almost too fabulous to be true: a sympathetic German family on the border had a house with its front door in Germany and its back door in Czechoslovakia. The family would invite people for dinner, then let them leave through the back door at night.
  • As Sartre argued in his 1943 review of The Stranger, basic phenomenological principles show that experience comes to us already charged with significance. A piano sonata is a melancholy evocation of longing. If I watch a soccer match, I see it as a soccer match, not as a meaningless scene in which a number of people run around taking turns to apply their lower limbs to a spherical object. If the latter is what I’m seeing, then I am not watching some more essential, truer version of soccer; I am failing to watch it properly as soccer at all.
  • Much as they liked Camus personally, neither Sartre nor Beauvoir accepted his vision of absurdity. For them, life is not absurd, even when viewed on a cosmic scale, and nothing can be gained by saying it is. Life for them is full of real meaning, although that meaning emerges differently for each of us.
  • For Sartre, we show bad faith whenever we portray ourselves as passive creations of our race, class, job, history, nation, family, heredity, childhood influences, events, or even hidden drives in our subconscious which we claim are out of our control. It is not that such factors are unimportant: class and race, in particular, he acknowledged as powerful forces in people’s lives, and Simone de Beauvoir would soon add gender to that list.
  • Sartre takes his argument to an extreme point by asserting that even war, imprisonment or the prospect of imminent death cannot take away my existential freedom. They form part of my ‘situation’, and this may be an extreme and intolerable situation, but it still provides only a context for whatever I choose to do next. If I am about to die, I can decide how to face that death. Sartre here resurrects the ancient Stoic idea that I may not choose what happens to me, but I can choose what to make of it, spiritually speaking.
  • But the Stoics cultivated indifference in the face of terrible events, whereas Sartre thought we should remain passionately, even furiously engaged with what happens to us and with what we can achieve. We should not expect freedom to be anything less than fiendishly difficult.
  • Freedom does not mean entirely unconstrained movement, and it certainly does not mean acting randomly. We often mistake the very things that enable us to be free — context, meaning, facticity, situation, a general direction in our lives — for things that define us and take away our freedom. It is only with all of these that we can be free in a real sense.
  • Nor did he mean that privileged groups have the right to pontificate to the poor and downtrodden about the need to ‘take responsibility’ for themselves. That would be a grotesque misreading of Sartre’s point, since his sympathy in any encounter always lay with the more oppressed side. But for each of us — for me — to be in good faith means not making excuses for myself.
  • Camus’ novel gives us a deliberately understated vision of heroism and decisive action compared to those of Sartre and Beauvoir. One can only do so much. It can look like defeatism, but it shows a more realistic perception of what it takes to actually accomplish difficult tasks like liberating one’s country.
  • Camus just kept returning to his core principle: no torture, no killing — at least not with state approval. Beauvoir and Sartre believed they were taking a more subtle and more realistic view. If asked why a couple of innocuous philosophers had suddenly become so harsh, they would have said it was because the war had changed them in profound ways. It had shown them that one’s duties to humanity could be more complicated than they seemed. ‘The war really divided my life in two,’ Sartre said later.
  • Poets and artists ‘let things be’, but they also let things come out and show themselves. They help to ease things into ‘unconcealment’ (Unverborgenheit), which is Heidegger’s rendition of the Greek term alētheia, usually translated as ‘truth’. This is a deeper kind of truth than the mere correspondence of a statement to reality, as when we say ‘The cat is on the mat’ and point to a mat with a cat on it. Long before we can do this, both cat and mat must ‘stand forth out of concealedness’. They must un-hide themselves.
  • Heidegger does not use the word ‘consciousness’ here because — as with his earlier work — he is trying to make us think in a radically different way about ourselves. We are not to think of the mind as an empty cavern, or as a container filled with representations of things. We are not even supposed to think of it as firing off arrows of intentional ‘aboutness’, as in the earlier phenomenology of Brentano. Instead, Heidegger draws us into the depths of his Schwarzwald, and asks us to imagine a gap with sunlight filtering in. We remain in the forest, but we provide a relatively open spot where other beings can bask for a moment. If we did not do this, everything would remain in the thickets, hidden even to itself.
  • The astronomer Carl Sagan began his 1980 television series Cosmos by saying that human beings, though made of the same stuff as the stars, are conscious and are therefore ‘a way for the cosmos to know itself’. Merleau-Ponty similarly quoted his favourite painter Cézanne as saying, ‘The landscape thinks itself in me, and I am its consciousness.’ This is something like what Heidegger thinks humanity contributes to the earth. We are not made of spiritual nothingness; we are part of Being, but we also bring something unique with us. It is not much: a little open space, perhaps with a path and a bench like the one the young Heidegger used to sit on to do his homework. But through us, the miracle occurs.
  • Beauty aside, Heidegger’s late writing can also be troubling, with its increasingly mystical notion of what it is to be human. If one speaks of a human being mainly as an open space or a clearing, or a means of ‘letting beings be’ and dwelling poetically on the earth, then one doesn’t seem to be talking about any recognisable person. The old Dasein has become less human than ever. It is now a forestry feature.
  • Even today, Jaspers, the dedicated communicator, is far less widely read than Heidegger, who has influenced architects, social theorists, critics, psychologists, artists, film-makers, environmental activists, and innumerable students and enthusiasts — including the later deconstructionist and post-structuralist schools, which took their starting point from his late thinking. Having spent the late 1940s as an outsider and then been rehabilitated, Heidegger became the overwhelming presence in university philosophy all over the European continent from then on.
  • As Levinas reflected on this experience, it helped to lead him to a philosophy that was essentially ethical, rather than ontological like Heidegger’s. He developed his ideas from the work of Jewish theologian Martin Buber, whose I and Thou in 1923 had distinguished between my relationship with an impersonal ‘it’ or ‘them’, and the direct personal encounter I have with a ‘you’. Levinas took it further: when I encounter you, we normally meet face-to-face, and it is through your face that you, as another person, can make ethical demands on me. This is very different from Heidegger’s Mitsein or Being-with, which suggests a group of people standing alongside one another, shoulder to shoulder as if in solidarity — perhaps as a unified nation or Volk.
  • For Levinas, we literally face each other, one individual at a time, and that relationship becomes one of communication and moral expectation. We do not merge; we respond to one another. Instead of being co-opted into playing some role in my personal drama of authenticity, you look me in the eyes — and you remain Other. You remain you.
  • This relationship is more fundamental than the self, more fundamental than consciousness, more fundamental even than Being — and it brings an unavoidable ethical obligation. Ever since Husserl, phenomenologists and existentialists had been trying to stretch the definition of existence to incorporate our social lives and relationships. Levinas did more: he turned philosophy around entirely so that these relationships were the foundation of our existence, not an extension of it.
  • Her last work, The Need for Roots, argues, among other things, that none of us has rights, but each one of us has a near-infinite degree of duty and obligation to the other. Whatever the underlying cause of her death — and anorexia nervosa seems to have been involved — no one could deny that she lived out her philosophy with total commitment. Of all the lives touched on in this book, hers is surely the most profound and challenging application of Iris Murdoch’s notion that a philosophy can be ‘inhabited’.
  • Other thinkers took radical ethical turns during the war years. The most extreme was Simone Weil, who actually tried to live by the principle of putting other people’s ethical demands first. Having returned to France after her travels through Germany in 1932, she had worked in a factory so as to experience the degrading nature of such work for herself. When France fell in 1940, her family fled to Marseilles (against her protests), and later to the US and to Britain. Even in exile, Weil made extraordinary sacrifices. If there were people in the world who could not sleep in a bed, she would not do so either, so she slept on the floor.
  • The mystery tradition had roots in Kierkegaard’s ‘leap of faith’. It owed much to the other great nineteenth-century mystic of the impossible, Dostoevsky, and to older theological notions. But it also grew from the protracted trauma that was the first half of the twentieth century. Since 1914, and especially since 1939, people in Europe and elsewhere had come to the realisation that we cannot fully know or trust ourselves; that we have no excuses or explanations for what we do — and yet that we must ground our existence and relationships on something firm, because otherwise we cannot survive.
  • One striking link between these radical ethical thinkers, all on the fringes of our main story, is that they had religious faith. They also granted a special role to the notion of ‘mystery’ — that which cannot be known, calculated or understood, especially when it concerns our relationships with each other. Heidegger was different from them, since he rejected the religion he grew up with and had no real interest in ethics — probably as a consequence of his having no real interest in the human.
  • Meanwhile, the Christian existentialist Gabriel Marcel was also still arguing, as he had since the 1930s, that ethics trumps everything else in philosophy and that our duty to each other is so great as to play the role of a transcendent ‘mystery’. He too had been led to this position partly by a wartime experience: during the First World War he had worked for the Red Cross’ Information Service, with the unenviable job of answering relatives’ inquiries about missing soldiers. Whenever news came, he passed it on, and usually it was not good. As Marcel later said, this task permanently inoculated him against warmongering rhetoric of any kind, and it made him aware of the power of what is unknown in our lives.
  • As the play’s much-quoted and frequently misunderstood final line has it: ‘Hell is other people.’ Sartre later explained that he did not mean to say that other people were hellish in general. He meant that after death we become frozen in their view, unable any longer to fend off their interpretation. In life, we can still do something to manage the impression we make; in death, this freedom goes and we are left entombed in other people’s memories and perceptions.
  • We have to do two near-impossible things at once: understand ourselves as limited by circumstances, and yet continue to pursue our projects as though we are truly in control. In Beauvoir’s view, existentialism is the philosophy that best enables us to do this, because it concerns itself so deeply with both freedom and contingency. It acknowledges the radical and terrifying scope of our freedom in life, but also the concrete influences that other philosophies tend to ignore: history, the body, social relationships and the environment.
  • The aspects of our existence that limit us, Merleau-Ponty says, are the very same ones that bind us to the world and give us scope for action and perception. They make us what we are. Sartre acknowledged the need for this trade-off, but he found it more painful to accept. Everything in him longed to be free of bonds, of impediments and limitations
  • Of course we have to learn this skill of interpreting and anticipating the world, and this happens in early childhood, which is why Merleau-Ponty thought child psychology was essential to philosophy. This is an extraordinary insight. Apart from Rousseau, very few philosophers before him had taken childhood seriously; most wrote as though all human experience were that of a fully conscious, rational, verbal adult who has been dropped into this world from the sky — perhaps by a stork.
  • For Merleau-Ponty, we cannot understand our experience if we don’t think of ourselves in part as overgrown babies. We fall for optical illusions because we once learned to see the world in terms of shapes, objects and things relevant to our own interests. Our first perceptions came to us in tandem with our first active experiments in observing the world and reaching out to explore it, and are still linked with those experiences.
  • Another factor in all of this, for Merleau-Ponty, is our social existence: we cannot thrive without others, or not for long, and we need this especially in early life. This makes solipsistic speculation about the reality of others ridiculous; we could never engage in such speculation if we hadn’t already been formed by them.
  • As Descartes could have said (but didn’t), ‘I think, therefore other people exist.’ We grow up with people playing with us, pointing things out, talking, listening, and getting us used to reading emotions and movements; this is how we become capable, reflective, smoothly integrated beings.
  • In general, Merleau-Ponty thinks human experience only makes sense if we abandon philosophy’s time-honoured habit of starting with a solitary, capsule-like, immobile adult self, isolated from its body and world, which must then be connected up again — adding each element around it as though adding clothing to a doll. Instead, for him, we slide from the womb to the birth canal to an equally close and total immersion in the world. That immersion continues as long as we live, although we may also cultivate the art of partially withdrawing from time to time when we want to think or daydream.
  • When he looks for his own metaphor to describe how he sees consciousness, he comes up with a beautiful one: consciousness, he suggests, is like a ‘fold’ in the world, as though someone had crumpled a piece of cloth to make a little nest or hollow. It stays for a while, before eventually being unfolded and smoothed away. There is something seductive, even erotic, in this idea of my conscious self as an improvised pouch in the cloth of the world. I still have my privacy — my withdrawing room. But I am part of the world’s fabric, and I remain formed out of it for as long as I am here.
  • By the time of these works, Merleau-Ponty is taking his desire to describe experience to the outer limits of what language can convey. Just as with the late Husserl or Heidegger, or Sartre in his Flaubert book, we see a philosopher venturing so far from shore that we can barely follow. Emmanuel Levinas would head out to the fringes too, eventually becoming incomprehensible to all but his most patient initiates.
  • Sartre once remarked — speaking of a disagreement they had about Husserl in 1941 — that ‘we discovered, astounded, that our conflicts had, at times, stemmed from our childhood, or went back to the elementary differences of our two organisms’. Merleau-Ponty also said in an interview that Sartre’s work seemed strange to him, not because of philosophical differences, but because of a certain ‘register of feeling’, especially in Nausea, that he could not share. Their difference was one of temperament and of the whole way the world presented itself to them.
  • The two also differed in their purpose. When Sartre writes about the body or other aspects of experience, he generally does it in order to make a different point. He expertly evokes the grace of his café waiter, gliding between the tables, bending at an angle just so, steering the drink-laden tray through the air on the tips of his fingers — but he does it all in order to illustrate his ideas about bad faith. When Merleau-Ponty writes about skilled and graceful movement, the movement itself is his point. This is the thing he wants to understand.
  • We can never move definitively from ignorance to certainty, for the thread of the inquiry will constantly lead us back to ignorance again. This is the most attractive description of philosophy I’ve ever read, and the best argument for why it is worth doing, even (or especially) when it takes us no distance at all from our starting point.
  • By prioritising perception, the body, social life and childhood development, Merleau-Ponty gathered up philosophy’s far-flung outsider subjects and brought them in to occupy the centre of his thought.
  • In his inaugural lecture at the Collège de France on 15 January 1953, published as In Praise of Philosophy, he said that philosophers should concern themselves above all with whatever is ambiguous in our experience. At the same time, they should think clearly about these ambiguities, using reason and science. Thus, he said, ‘The philosopher is marked by the distinguishing trait that he possesses inseparably the taste for evidence and the feeling for ambiguity.’ A constant movement is required between these two
  • As Sartre wrote in response to Hiroshima, humanity had now gained the power to wipe itself out, and must decide every single day that it wanted to live. Camus also wrote that humanity faced the task of choosing between collective suicide and a more intelligent use of its technology — ‘between hell and reason’. After 1945, there seemed little reason to trust in humanity’s ability to choose well.
  • Merleau-Ponty observed in a lecture of 1951 that, more than any previous century, the twentieth century had reminded people how ‘contingent’ their lives were — how at the mercy of historical events and other changes that they could not control. This feeling went on long after the war ended. After the A-bombs were dropped on Hiroshima and Nagasaki, many feared that a Third World War would not be long in coming, this time between the Soviet Union and the United States.