
TOK Friends / Group items tagged: Past


Javier E

What Have We Learned, If Anything? by Tony Judt | The New York Review of Books - 0 views

  • During the Nineties, and again in the wake of September 11, 2001, I was struck more than once by a perverse contemporary insistence on not understanding the context of our present dilemmas, at home and abroad; on not listening with greater care to some of the wiser heads of earlier decades; on seeking actively to forget rather than remember, to deny continuity and proclaim novelty on every possible occasion. We have become stridently insistent that the past has little of interest to teach us. Ours, we assert, is a new world; its risks and opportunities are without precedent.
  • the twentieth century that we have chosen to commemorate is curiously out of focus. The overwhelming majority of places of official twentieth-century memory are either avowedly nostalgo-triumphalist—praising famous men and celebrating famous victories—or else, and increasingly, they are opportunities for the recollection of selective suffering.
  • The problem with this lapidary representation of the last century as a uniquely horrible time from which we have now, thankfully, emerged is not the description—it was in many ways a truly awful era, an age of brutality and mass suffering perhaps unequaled in the historical record. The problem is the message: that all of that is now behind us, that its meaning is clear, and that we may now advance—unencumbered by past errors—into a different and better era.
  • ...19 more annotations...
  • Today, the “common” interpretation of the recent past is thus composed of the manifold fragments of separate pasts, each of them (Jewish, Polish, Serb, Armenian, German, Asian-American, Palestinian, Irish, homosexual…) marked by its own distinctive and assertive victimhood.
  • The resulting mosaic does not bind us to a shared past, it separates us from it. Whatever the shortcomings of the national narratives once taught in school, however selective their focus and instrumental their message, they had at least the advantage of providing a nation with past references for present experience. Traditional history, as taught to generations of schoolchildren and college students, gave the present a meaning by reference to the past: today’s names, places, inscriptions, ideas, and allusions could be slotted into a memorized narrative of yesterday. In our time, however, this process has gone into reverse. The past now acquires meaning only by reference to our many and often contrasting present concerns.
  • the United States thus has no modern memory of combat or loss remotely comparable to that of the armed forces of other countries. But it is civilian casualties that leave the most enduring mark on national memory, and here the contrast is piquant indeed.
  • Today, the opposite applies. Most people in the world outside of sub-Saharan Africa have access to a near infinity of data. But in the absence of any common culture beyond a small elite, and not always even there, the fragmented information and ideas that people select or encounter are determined by a multiplicity of tastes, affinities, and interests. As the years pass, each one of us has less in common with the fast-multiplying worlds of our contemporaries, not to speak of the world of our forebears.
  • What is significant about the present age of transformations is the unique insouciance with which we have abandoned not merely the practices of the past but their very memory. A world just recently lost is already half forgotten.
  • In the US, at least, we have forgotten the meaning of war. There is a reason for this.
  • Until the last decades of the twentieth century most people in the world had limited access to information; but—thanks to national education, state-controlled radio and television, and a common print culture—within any one state or nation or community people were all likely to know many of the same things.
  • it was precisely that claim, that “it’s torture, and therefore it’s no good,” which until very recently distinguished democracies from dictatorships. We pride ourselves on having defeated the “evil empire” of the Soviets. Indeed so. But perhaps we should read again the memoirs of those who suffered at the hands of that empire—the memoirs of Eugen Loebl, Artur London, Jo Langer, Lena Constante, and countless others—and then compare the degrading abuses they suffered with the treatments approved and authorized by President Bush and the US Congress. Are they so very different?
  • As a consequence, the United States today is the only advanced democracy where public figures glorify and exalt the military, a sentiment familiar in Europe before 1945 but quite unknown today.
  • the complacent neoconservative claim that war and conflict are things Americans understand—in contrast to naive Europeans with their pacifistic fantasies—seems to me exactly wrong: it is Europeans (along with Asians and Africans) who understand war all too well. Most Americans have been fortunate enough to live in blissful ignorance of its true significance.
  • That same contrast may account for the distinctive quality of much American writing on the cold war and its outcome. In European accounts of the fall of communism, from both sides of the former Iron Curtain, the dominant sentiment is one of relief at the closing of a long, unhappy chapter. Here in the US, however, the story is typically recorded in a triumphalist key.
  • For many American commentators and policymakers the message of the twentieth century is that war works. Hence the widespread enthusiasm for our war on Iraq in 2003 (despite strong opposition to it in most other countries). For Washington, war remains an option—on that occasion the first option. For the rest of the developed world it has become a last resort.
  • Ignorance of twentieth-century history does not just contribute to a regrettable enthusiasm for armed conflict. It also leads to a misidentification of the enemy.
  • This abstracting of foes and threats from their context—this ease with which we have talked ourselves into believing that we are at war with “Islamofascists,” “extremists” from a strange culture, who dwell in some distant “Islamistan,” who hate us for who we are and seek to destroy “our way of life”—is a sure sign that we have forgotten the lesson of the twentieth century: the ease with which war and fear and dogma can bring us to demonize others, deny them a common humanity or the protection of our laws, and do unspeakable things to them.
  • How else are we to explain our present indulgence for the practice of torture? For indulge it we assuredly do.
  • “But what would I have achieved by proclaiming my opposition to torture?” he replied. “I have never met anyone who is in favor of torture.” Well, times have changed. In the US today there are many respectable, thinking people who favor torture—under the appropriate circumstances and when applied to those who merit it.
  • American civilian losses (excluding the merchant navy) in both world wars amounted to less than 2,000 dead.
  • We are slipping down a slope. The sophistic distinctions we draw today in our war on terror—between the rule of law and “exceptional” circumstances, between citizens (who have rights and legal protections) and noncitizens to whom anything can be done, between normal people and “terrorists,” between “us” and “them”—are not new. The twentieth century saw them all invoked. They are the selfsame distinctions that licensed the worst horrors of the recent past: internment camps, deportation, torture, and murder—those very crimes that prompt us to murmur “never again.” So what exactly is it that we think we have learned from the past? Of what possible use is our self-righteous cult of memory and memorials if the United States can build its very own internment camp and torture people there?
  • We need to learn again—or perhaps for the first time—how war brutalizes and degrades winners and losers alike and what happens to us when, having heedlessly waged war for no good reason, we are encouraged to inflate and demonize our enemies in order to justify that war’s indefinite continuance.
Javier E

Why Study History? (1985) | AHA - 0 views

  • Isn't there quite enough to learn about the world today? Why add to the burden by looking at the past?
  • Historical knowledge is no more and no less than carefully and critically constructed collective memory. As such it can both make us wiser in our public choices and more richly human in our private lives.
  • Without individual memory, a person literally loses his or her identity, and would not know how to act in encounters with others. Imagine waking up one morning unable to tell total strangers from family and friends!
  • ...37 more annotations...
  • Collective memory is similar, though its loss does not immediately paralyze everyday private activity. But ignorance of history-that is, absent or defective collective memory-does deprive us of the best available guide for public action, especially in encounters with outsiders.
  • Often it is enough for experts to know about outsiders, if their advice is listened to. But democratic citizenship and effective participation in the determination of public policy require citizens to share a collective memory, organized into historical knowledge and belief.
  • This value of historical knowledge obviously justifies teaching and learning about what happened in recent times, for the way things are descends from the way they were yesterday and the day before that.
  • in fact, institutions that govern a great deal of our everyday behavior took shape hundreds or even thousands of years ago.
  • Only an acquaintance with the entire human adventure on earth allows us to understand these dimensions of contemporary reality.
  • it follows that study of history is essential for every young person.
  • Collective memory is quite the same. Historians are always at work reinterpreting the past, asking new questions, searching new sources and finding new meanings in old documents in order to bring the perspective of new knowledge and experience to bear on the task of understanding the past.
  • what we know and believe about history is always changing. In other words, our collective, codified memory alters with time just as personal memories do, and for the same reasons.
  • skeptics are likely to conclude that history has no right to take student time from other subjects. If what is taught today is not really true, how can it claim space in a crowded school curriculum?
  • what if the world is more complicated and diverse than words can ever tell? What if human minds are incapable of finding neat pigeonholes into which everything that happens will fit?
  • What if we have to learn to live with uncertainty and probabilities, and act on the basis of the best guesswork we are capable of?
  • Then, surely, the changing perspectives of historical understanding are the very best introduction we can have to the practical problems of real life. Then, surely, a serious effort to understand the interplay of change and continuity in human affairs is the only adequate introduction human beings can have to the confusing flow of events that constitutes the actual, adult world.
  • Memory is not something fixed and forever. As time passes, remembered personal experiences take on new meanings.
  • Early in this century, teachers and academic administrators pretty well agreed that two sorts of history courses were needed: a survey of the national history of the United States and a survey of European history.
  • Memory, indeed, makes us human. History, our collective memory, carefully codified and critically revised, makes us social, sharing ideas and ideals with others so as to form all sorts of different human groups.
  • The varieties of history are enormous; facts and probabilities about the past are far too numerous for anyone to comprehend them all. Every sort of human group has its own history.
  • Where to start? How to bring some sort of order to the enormous variety of things known and believed about the past?
  • Systematic sciences are not enough. They discount time, and therefore oversimplify reality, especially human reality.
  • This second course was often broadened into a survey of Western civilization in the 1930s and 1940s.
  • But by the 1960s and 1970s these courses were becoming outdated, left behind by the rise of new kinds of social and quantitative history, especially the history of women, of Blacks, and of other formerly overlooked groups within the borders of the United States, and of peoples emerging from colonial status in the world beyond our borders.
  • much harder to combine old with new to make an inclusive, judiciously balanced (and far less novel) introductory course for high school or college students.
  • But abandoning the effort to present a meaningful portrait of the entire national and civilizational past destroyed the original justification for requiring students to study history.
  • Competing subjects abounded, and no one could or would decide what mattered most and should take precedence. As this happened, studying history became only one among many possible ways of spending time in school.
  • The costs of this change are now becoming apparent, and many concerned persons agree that returning to a more structured curriculum, in which history ought to play a prominent part, is imperative.
  • three levels of generality seem likely to have the greatest importance for ordinary people.
  • First is family, local, neighborhood history.
  • Second is national history, because that is where political power is concentrated in our time.
  • Last is global history, because intensified communications make encounters with all the other peoples of the earth increasingly important.
  • Other pasts are certainly worth attention, but are better studied in the context of a prior acquaintance with personal-local, national, and global history. That is because these three levels are the ones that affect most powerfully what all other groups and segments of society actually do.
  • National history that leaves out Blacks and women and other minorities is no longer acceptable; but American history that leaves out the Founding Fathers and the Constitution is not acceptable either. What is needed is a vision of the whole, warts and all.
  • the study of history does not lead to exact prediction of future events. Though it fosters practical wisdom, knowledge of the past does not permit anyone to know exactly what is going to happen.
  • Consequently, the lessons of history, though supremely valuable when wisely formulated, become grossly misleading when oversimplifiers try to transfer them mechanically from one age to another, or from one place to another.
  • Predictable fixity is simply not the human way of behaving. Probabilities and possibilities-together with a few complete surprises-are what we live with and must learn to expect.
  • Second, as acquaintance with the past expands, delight in knowing more and more can and often does become an end in itself.
  • On the other hand, studying alien religious beliefs, strange customs, diverse family patterns and vanished social structures shows how differently various human groups have tried to cope.
  • Broadening our humanity and extending our sensibilities by recognizing sameness and difference throughout the recorded past is therefore an important reason for studying history, and especially the history of peoples far away and long ago.
  • For we can only know ourselves by knowing how we resemble and how we differ from others. Acquaintance with the human past is the only way to such self knowledge.
Javier E

How Memory Works: Interview with Psychologist Daniel L. Schacter | History News Network - 2 views

  • scientific knowledge of how human memory works can be instructive to historians.
  • Memory is much more than a simple retrieval system, as Dr. Schacter has demonstrated in his research. Rather, the nature of memory is constructive and influenced by a person’s current state as well as intervening emotions, beliefs, events and other factors since a recalled event.
  • Dr. Schacter is William R. Kenan, Jr. Professor of Psychology at Harvard University. His books include Searching for Memory: The Brain, The Mind, and The Past, and The Seven Sins of Memory: How the Mind Forgets and Remembers, both winners of the American Psychological Association’s William James Book Award, and Forgotten Ideas, Neglected Pioneers: Richard Semon and the Story of Memory. He also has written hundreds of articles on memory and related matters. He was elected a Fellow of the American Academy of Arts and Sciences in 1996 and the National Academy of Sciences in 2013.
  • ...16 more annotations...
  • that memory is not a video recorder [but that] it’s a constructive activity that is in many ways accurate but prone to interesting errors and distortions. It’s the constructive side of memory that is most relevant to historians.
  • Is it the case then that our memories constantly change every time we access them?
  • That certainly can happen depending on how you recount a memory. What you emphasize. What you exaggerate. What you don’t talk about. All of those things will shape and sculpt the memory for future use. Certainly the potential is there. [A toy simulation of this retrieval-driven updating appears after these annotations.]
  • Research on memory shows that the more distant in time the event, the more prone to inaccuracy the memory. There are several experiments in which subjects recorded impressions of an event soon afterward, then a year later and then a few years later, and the memory changed. Yes. It’s not that the information is lost but, as the memory weakens, you become more prone to incorporating other kinds of information or mixing up elements of other events. This has been seen, for example, in the study of flashbulb memories. Where were you when Kennedy was shot? Where were you when you heard about 9/11?
  • Isn’t there a tendency to add details or information that may make the story more convincing or interesting later?Yes. That’s more a social function of memory. It may be that you draw on your general knowledge and probable information from your memory in a social context where there may be social demands that lead you distort the memory.
  • What are the different memory systems?
  • What is the difference between working memory and permanent memory? Working memory is really a temporary memory buffer where you hold onto information, manipulate information, use it, and it’s partly a gateway to long-term memory and also a buffer that you use when you’re retrieving information from long-term memory and that information temporarily resides in working memory, so to speak.
  • Your discussion of the testimony of White House Counsel John Dean about Watergate is illuminating. There was a perception that Dean had a photographic memory and he testified in rich detail about events. Yet later studies of White House tape recordings revealed that he was often inaccurate.
  • Because of all the detail with which he reported events, and the great confidence with which he reported them, he was perceived to be something analogous to a human tape recorder. Yet there was interesting work done by psychologist Ulric Neisser, who went back and analyzed what Dean said at the hearings as compared to available information on the White House taping system and basically found many significant discrepancies between what Dean remembered and what was actually said. He usually had the gist and the meaning and overall significance right, but the exact details were often quite different in his memory from what actually was said.
  • That seems to get into the area of false memories and how they present problems in the legal system. We know from DNA exonerations of people wrongfully convicted of crimes that a large majority of those cases involved faulty eyewitness memory: one of the more recent estimates is that, among the first 250 DNA exonerations as of 2011, roughly 70 to 75 percent of those individuals were convicted on the basis of faulty eyewitness memory.
  • One of the interesting recent lines of research that my lab has been involved in over the past few years has been looking at similarities between what goes on between the brain and mind when we remember past events on the one hand and imagine events that might occur in the future or might have occurred in the past. What we have found, particularly with brain scanning studies, is that you get very similar brain networks coming online when you remember past events and imagine future events, for example. Many of the same brain regions or network of structures come online, and this has helped us understand more why, for example, imagining events that might have occurred can be so harmful to memory accuracy because when you imagine, you’re recruiting many of the same brain regions as accessed when you actually remember. So it’s not surprising that some of these imagined events can actually turn into false memories under the right circumstances.
  • One reasonably well accepted distinction involves episodic memory, the memory for personal experience; semantic memory, the memory for general knowledge; and procedural memory, the memory for skills and unconscious forms of memory. Those are three of the major kinds of memory and they all have different neural substrates.
  • One of the points from that Ross Perot study is that his supporters often misremembered what they felt like at the time he reported he had dropped out of the race. The nature of that misremembering depended on their state at the time they were remembering, and what decisions they had made about Perot in the interim affected how they reconstructed their earlier memories. Again, that makes nicely the point that our current emotions and current appraisals of a situation can feed back into our reconstruction of the past and sometimes lead us to distort our memories so that they better support our current emotions and our current selves. We’re often using memories to justify what we currently know, believe and feel.
  • memory doesn’t work like a video camera or tape recorder. That is the main point. Our latest thinking on this is the idea that one of the major functions of memory is to support our ability to plan for the future, to imagine the future, and to use our past experiences in a flexible way to simulate different outcomes of events.
  • flexibility of memory is something that makes it useful to support this very important ability to run simulations of future events. But that very flexibility might be something that contributes to some of the memory distortion we talked about. That has been prominent in the last few years in my thinking about the constructive nature of memory.
  • The historian Daniel Aaron told his students “we remember what’s important.” What do you think of that comment? I think that generally holds true. Certainly, again, more important memories tend to be more significant with more emotional arousal and may elicit “deeper processing”, as we call it in cognitive psychology.
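
A toy illustration of the constructive, retrieval-driven updating Schacter describes above. The sketch below is my own minimal model, not anything from Schacter's research, and every number in it is an invented assumption: a memory is treated as a small feature vector that, on each act of remembering, is nudged toward the rememberer's current state and picks up a little noise, so the trace drifts further from the original encoding with every retrieval.

    import random

    def recall(trace, current_state, bias=0.15, noise=0.05):
        """Retrieve a memory trace, re-encoding it slightly shifted
        toward the rememberer's current state, plus random noise."""
        return [t + bias * (c - t) + random.gauss(0, noise)
                for t, c in zip(trace, current_state)]

    random.seed(42)
    original = [0.9, 0.1, 0.5]        # how the event was first encoded
    memory = list(original)
    current_state = [0.2, 0.8, 0.5]   # today's emotions and appraisals (assumed)

    for retrieval in range(1, 6):
        memory = recall(memory, current_state)
        drift = sum(abs(m - o) for m, o in zip(memory, original))
        print(f"retrieval {retrieval}: drift from original = {drift:.2f}")

Run it and the printed drift grows with each retrieval: the more often the memory is used, the more it comes to resemble the rememberer's present rather than the past event, which is the pattern behind the John Dean and Ross Perot examples.
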
Javier E

The New History Wars - The Atlantic - 0 views

  • Critical historians who thought they were winning the fight for control within the academy now face dire retaliation from outside the academy. The dizzying turn from seeming triumph in 2020 to imminent threat in 2022 has unnerved many practitioners of the new history. Against this background, they did not welcome it when their association’s president suggested that maybe their opponents had a smidgen of a point.
  • a background reality of the humanities in the contemporary academy: a struggle over who is entitled to speak about what. Nowhere does this struggle rage more fiercely than in anything to do with the continent of Africa. Who should speak? What may be said? Who will be hired?
  • One obvious escape route from the generational divide in the academy—and the way the different approaches to history, presentist and antiquarian, tend to map onto it—is for some people, especially those on the older and whiter side of the divide, to keep their mouths shut about sensitive issues.
  • ...15 more annotations...
  • The political and methodological stresses within the historical profession are intensified by economic troubles. For a long time, but especially since the economic crisis of 2008, university students have turned away from the humanities, preferring to major in fields that seem to offer more certain and lucrative employment. Consequently, academic jobs in the humanities and especially in history have become radically more precarious for younger faculty—even as universities have sought to meet diversity goals in their next-generation hiring by expanding offerings in history-adjacent specialties, such as gender and ethnic studies.
  • The result has produced a generational divide. Younger scholars feel oppressed and exploited by universities pressing them to do more labor for worse pay with less security than their elders; older scholars feel that overeager juniors are poised to pounce on the least infraction as an occasion to end an elder’s career and seize a job opening for themselves. Add racial difference as an accelerant, and what was intended as an interesting methodological discussion in a faculty newsletter can explode into a national culture war.
  • One of the greatest American Africanists was the late Philip Curtin. He wrote one of the first attempts to tally the exact number of persons trafficked by the transatlantic slave trade. Upon publication in 1972, his book was acclaimed as a truly pioneering work of history. By 1995, however, he was moved to protest against trends in the discipline at that time in an article in the Chronicle of Higher Education: “I am troubled by increasing evidence of the use of racial criteria in filling faculty posts in the field of African history … This form of intellectual apartheid has been around for several decades, but it appears to have become much more serious in the past few years, to the extent that white scholars trained in African history now have a hard time finding jobs.”
  • Much of academia is governed these days by a joke from the Soviet Union: “If you think it, don’t speak it. If you speak it, don’t write it. If you write it, don’t sign it. But if you do think it, speak it, write it, and sign it—don’t be surprised.”
  • Yet this silence has consequences, too. One of the most unsettling is the displacement of history by mythmaking.
  • mythmaking is spreading from “just the movies” to more formal and institutional forms of public memory. If old heroes “must fall,” their disappearance opens voids for new heroes to be inserted in their place—and that insertion sometimes requires that new history be fabricated altogether, the “bad history” that Sweet tried to warn against.
  • If it is not the job of the president of the American Historical Association to confront those questions, then whose is it?
  • Sweet used a play on words—“Is History History?”—for the title of his complacency-shaking essay. But he was asking not whether history is finished, done with, but Is history still history? Is it continuing to do what history is supposed to do? Or is it being annexed for other purposes, ideological rather than historical ones?
  • Advocates of studying the more distant past to disturb and challenge our ideas about the present may accuse their academic rivals of “presentism.”
  • In real life, of course, almost everybody who cares about history believes in a little of each option. But how much of each? What’s the right balance? That’s the kind of thing that historians do argue about, and in the arguing, they have developed some dismissive labels for one another.
  • Those who look to the more recent past to guide the future may accuse the other camp of “antiquarianism.”
  • The accusation of presentism hurts because it implies that the historian is sacrificing scholarly objectivity for ideological or political purposes. The accusation of antiquarianism stings because it implies that the historian is burrowing into the dust for no useful purpose at all.
  • In his mind, he was merely reopening one of the most familiar debates in professional history: the debate over why? What is the value of studying the past? To reduce the many available answers to a stark choice: Should we study the more distant past to explore its strangeness—and thereby jolt ourselves out of easy assumptions that the world we know is the only possible one?
  • Or should we study the more recent past to understand how our world came into being—and thereby learn some lessons for shaping the future?
  • The August edition of the association’s monthly magazine featured, as usual, a short essay by the association’s president, James H. Sweet, a professor at the University of Wisconsin at Madison. Within hours of its publication, an outrage volcano erupted on social media. A professor at Cornell vented about the author’s “white gaze.”
Javier E

A Leading Memory Researcher Explains How to Make Precious Moments Last - The New York Times - 0 views

  • Our memories form the bedrock of who we are. Those recollections, in turn, are built on one very simple assumption: This happened. But things are not quite so simple.
  • “We update our memories through the act of remembering,” says Charan Ranganath, a professor of psychology and neuroscience at the University of California, Davis, and the author of the illuminating new book “Why We Remember.” “So it creates all these weird biases and infiltrates our decision making. It affects our sense of who we are.”
  • Rather than being photo-accurate repositories of past experience, Ranganath argues, our memories function more like active interpreters, working to help us navigate the present and future. The implication is that who we are, and the memories we draw on to determine that, are far less fixed than you might think. “Our identities,” Ranganath says, “are built on shifting sand.”
  • ...24 more annotations...
  • People believe that memory should be effortless, but their expectations for how much they should remember are totally out of whack with how much they’re capable of remembering.
  • What is the most common misconception about memory?
  • Another misconception is that memory is supposed to be an archive of the past. We expect that we should be able to replay the past like a movie in our heads.
  • we don’t replay the past as it happened; we do it through a lens of interpretation and imagination.
  • How much are we capable of remembering, from both an episodic standpoint (episodic memory is the term for the memory of life experiences) and a semantic one (semantic memory is the term for the memory of facts and knowledge about the world)?
  • I would argue that we’re all everyday-memory experts, because we have this exceptional semantic memory, which is the scaffold for episodic memory.
  • If what we’re remembering, or the emotional tenor of what we’re remembering, is dictated by how we’re thinking in a present moment, what can we really say about the truth of a memory?
  • But if memories are malleable, what are the implications for how we understand our “true” selves?
  • your question gets to a major purpose of memory, which is to give us an illusion of stability in a world that is always changing. Because if we look for memories, we’ll reshape them into our beliefs of what’s happening right now. We’ll be biased in terms of how we sample the past. We have these illusions of stability, but we are always changing.
  • And depending on what memories we draw upon, those life narratives can change.
  • we have this illusion that much of the world is cause and effect. But the reason, in my opinion, that we have that illusion is that our brain is constantly trying to find the patterns.
  • One thing that makes the human brain so sophisticated is that we have a longer timeline in which we can integrate information than many other species. That gives us the ability to say: “Hey, I’m walking up and giving money to the cashier at the cafe. The barista is going to hand me a cup of coffee in about a minute or two.”
  • There is this illusion that we know exactly what’s going to happen, but the fact is we don’t. Memory can overdo it: Somebody lied to us once, so they are a liar; somebody shoplifted once, they are a thief.
  • If people have a vivid memory of something that sticks out, that will overshadow all their knowledge about the way things work. So there’s kind of an illusion.
  • I know it sounds squirmy to say, “Well, I can’t answer the question of how much we remember,” but I don’t want readers to walk away thinking memory is all made up.
  • I think of memory more like a painting than a photograph. There’s often photorealistic aspects of a painting, but there’s also interpretation. As a painter evolves, they could revisit the same subject over and over and paint differently based on who they are now. We’re capable of remembering things in extraordinary detail, but we infuse meaning into what we remember. We’re designed to extract meaning from the past, and that meaning should have truth in it. But it also has knowledge and imagination and, sometimes, wisdom.
  • memory, often, is educated guesses by the brain about what’s important. So what’s important? Things that are scary, things that get your desire going, things that are surprising. Maybe you were attracted to this person, and your eyes dilated, your pulse went up. Maybe you were working on something in this high state of excitement, and your dopamine was up.
  • It could be any of those things, but they’re all important in some way, because if you’re a brain, you want to take what’s surprising, you want to take what’s motivationally important for survival, what’s new.
  • On the more intentional side, are there things that we might be able to do in the moment to make events last in our memories? In some sense, it’s about being mindful. If we want to form a new memory, focus on aspects of the experience you want to take with you.
  • If you’re with your kid, you’re at a park, focus on the parts of it that are great, not the parts that are kind of annoying. Then you want to focus on the sights, the sounds, the smells, because those will give you rich detail later on.
  • Another part of it, too, is that we kill ourselves by inducing distractions in our world. We have alerts on our phones. We check email habitually.
  • When we go on trips, I take candid shots. These are the things that bring you back to moments. If you capture the feelings and the sights and the sounds that bring you to the moment, as opposed to the facts of what happened, that is a huge part of getting the best of memory.
  • this goes back to the question of whether the factual truth of a memory matters to how we interpret it. I think it matters to have some truth, but then again, many of the truths we cling to depend on our own perspective.
  • There’s a great experiment on this. These researchers had people read this story about a house (the study was “Recall of Previously Unrecallable Information Following a Shift in Perspective,” by Richard C. Anderson and James W. Pichert). One group of subjects is told, I want you to read this story from the perspective of a prospective home buyer. When they remember it, they remember all the features of the house that are described in the thing. Another group is told, I want you to remember this from the perspective of a burglar. Those people tend to remember the valuables in the house and things that you would want to take. But what was interesting was then they switched the groups around. All of a sudden, people could pull up a number of details that they didn’t pull up before. It was always there, but they just didn’t approach it from that mind-set. So we do have a lot of information that we can get if we change our perspective, and this ability to change our perspective is exceptionally important for being accurate. It’s exceptionally important for being able to grow and modify our beliefs. [A toy version of this perspective effect is sketched below.]
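
The Anderson and Pichert result described in the last annotation lends itself to a short sketch. This is a toy version with invented house details, not the actual study materials: every detail is stored, but recall is cued by whichever perspective is currently active, so switching perspectives surfaces details that were always there yet missed on the first attempt.

    # Each stored detail is tagged with the schema(s) it is relevant to
    # (details and tags are invented for illustration).
    HOUSE_DETAILS = {
        "leaky roof": {"home buyer"},
        "new furnace": {"home buyer"},
        "spacious kitchen": {"home buyer"},
        "rare coin collection": {"burglar"},
        "color TV": {"burglar"},
        "side door left unlocked": {"burglar"},
    }

    def recall(perspective):
        """Return only the stored details whose tags match the active perspective."""
        return sorted(d for d, tags in HOUSE_DETAILS.items() if perspective in tags)

    print("as a home buyer:", recall("home buyer"))
    print("as a burglar:  ", recall("burglar"))
    # All six details were encoded; the retrieval cue (the perspective)
    # determines which ones come back.

The all-or-none filter is a caricature: in the study, shifting perspective changed recall probabilities rather than gating items absolutely, but the mechanism illustrated (retrieval cues selecting among intact stored details) is the same.
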
Javier E

How to Raise a University's Profile: Pricing and Packaging - NYTimes.com - 0 views

  • I talked to a half-dozen of Hugh Moren’s fellow students. A highly indebted senior who was terrified of the weak job market described George Washington, where he had invested considerable time getting and doing internships, as “the world’s most expensive trade school.” Another mentioned the abundance of rich students whose parents were giving them a fancy-sounding diploma the way they might a new car. There are serious students here, he acknowledged, but: “You can go to G.W. and essentially buy a degree.”
  • A recent study from the Organization for Economic Cooperation and Development found that, on average, American college graduates score well below college graduates from most other industrialized countries in mathematics. In literacy (“understanding, evaluating, using and engaging with written text”), scores are just average. This comes on the heels of Richard Arum and Josipa Roksa’s “Academically Adrift,” a study that found “limited or no learning” among many college students.
  • Instead of focusing on undergraduate learning, colleges have been engaged in the kind of building spree I saw at George Washington. Recreation centers with world-class workout facilities and lazy rivers rise out of construction pits even as students and parents are handed staggeringly large tuition bills. Colleges compete to hire famous professors even as undergraduates wander through academic programs that often lack rigor or coherence. Campuses vie to become the next Harvard — or at least the next George Washington — while ignoring the growing cost and suspect quality of undergraduate education.
  • ...58 more annotations...
  • John Silber embarked on a huge building campaign while bringing luminaries like Saul Bellow and Elie Wiesel on board to teach and lend their prestige to the B.U. name, creating a bigger, more famous and much more costly institution. He had helped write a game plan for the aspiring college president.
  • the American research university had evolved into a complicated and somewhat peculiar organization. It was built to be all things to all people: to teach undergraduates, produce knowledge, socialize young men and women, train workers for jobs, anchor local economies, even put on weekend sports events. And excellence was defined by similarity to old, elite institutions. Universities were judged by the quality of their scholars, the size of their endowments, the beauty of their buildings and the test scores of their incoming students.
  • Mr. Trachtenberg understood the centrality of the university as a physical place. New structures were a visceral sign of progress. They told visitors, donors and civic leaders that the institution was, like beams and scaffolding rising from the earth, ascending. He added new programs, recruited more students, and followed the dictate of constant expansion.
  • GWU is, for all intents and purposes, a for-profit organization. Best example: study abroad. Their top program, a partnering with Sciences Po, costs each student (30 of them, on a program with 'prestige' status?) a full semester's tuition. It costs GW, according to Sciences Po's website, €1000. A neat $20,000 profit per student (who is digging her/himself deeper and deeper in debt). Moreover, the school takes a $500 admin fee for the study abroad application! With no guarantee that all credits transfer. Students often lose a partial semester, GW profits again. Nor does GW offer help with an antiquated, one-shot/no transfers, tricky registration process. It's tough luck in gay Paris. Just one of many examples. Dorms with extreme mold, off-campus housing impossible for freshmen and sophomores. Required meal plan: Chick-o-Filet etc. Classes with over 300 students (required). This is not Harvard, but costs the same. Emotional problems? Counselors too few. Suicides continue and are not appropriately addressed. Caring environment? Extension so and so, please hold. It's an impressive campus, I'm an alum. If you apply, make sure the DC experience is worth the price: good are internships, a few colleges like the Elliott School, post-grad. GWU uses undergrad $$ directly for building projects, like the medical center to which students have NO access. (Student health facility is underfunded, outsourced.) Outstanding professors still make a difference. But is that enough?
  • Mr. Trachtenberg, however, understood something crucial about the modern university. It had come to inhabit a market for luxury goods. People don’t buy Gucci bags merely for their beauty and functionality. They buy them because other people will know they can afford the price of purchase. The great virtue of a luxury good, from the manufacturer’s standpoint, isn’t just that people will pay extra money for the feeling associated with a name brand. It’s that the high price is, in and of itself, a crucial part of what people are buying.
  • Mr. Trachtenberg convinced people that George Washington was worth a lot more money by charging a lot more money. Unlike most college presidents, he was surprisingly candid about his strategy. College is like vodka, he liked to explain.
  • The Absolut Rolex plan worked. The number of applicants surged from some 6,000 to 20,000, the average SAT score of students rose by nearly 200 points, and the endowment jumped from $200 million to almost $1 billion.
  • The university became a magnet for the children of new money who didn’t quite have the SATs or family connections required for admission to Stanford or Yale. It also aggressively recruited international students, rich families from Asia and the Middle East who believed, as nearly everyone did, that American universities were the best in the world.
  • U.S. News & World Report now ranks the university at No. 54 nationwide, just outside the “first tier.”
  • The watch and vodka analogies are correct. Personally, I used car analogies when discussing college choices with my kids. We were in the fortunate position of being able to comfortably send our kids to any college in the country and have them leave debt free. Notwithstanding, I told them that they would be going to a state school unless they were able to get into one of about 40 schools that I felt, in whatever arbitrary manner I decided, that was worth the extra cost. They both ended up going to state schools.College is by and large a commodity and you get out of it what you put into it. Both of my kids worked hard in college and were involved in school life. They both left the schools better people and the schools better schools for them being there. They are both now successful adults.I believe too many people look for the prestige of a named school and that is not what college should be primarily about.
  • In 2013, only 14 percent of the university’s 10,000 undergraduates received a grant — a figure on a par with elite schools but far below the national average. The average undergraduate borrower leaves with about $30,800 in debt.
  • When I talk to the best high school students in my state I always stress the benefits of the honors college experience at an affordable public university. For students who won't qualify for a public honors college, the regular public university experience is far preferable to the huge debt of places like GW.
  • Carey would do well to look beyond high ticket private universities (which after all are still private enterprises) and what he describes as the Olympian heights of higher education (which for some reason seems also to embitter him) and look at the system overall. The withdrawal of public support was never a policy choice; it was a political choice, "packaged and branded" as some tax cutting palaver all wrapped up in the argument that a free market should decide how much college should cost and how many seats we need. In such an environment, trustees at private universities are no more solely responsible for turning their degrees into commodities than the administrations of state universities are for raising the number of out-of-state students in order to offset the loss of support from their legislatures. No doubt, we will hear more about market based solutions and technology from Mr. Carey.
  • I went to GW back in the 60s. It was affordable and it got me away from home in New York. While I was there, Newsweek famously published an article about the DC Universities - GW, Georgetown, American and Catholic - dubbing them the Pony league, the schools for the children of wealthy middle class New Yorkers who couldn't get into the Ivy League. Nobody really complained. But that wasn't me. I went because I wanted to be where the action was in the 60s, and as we used to say - "GW was literally a stone's throw from the White House. And we could prove it." Back then, the two biggest alumni names were Jackie Kennedy, who'd taken some classes there, and J. Edgar Hoover. Now, according to the glossy magazine they send me each month, it's the actress Kerry Washington. There's some sort of progress there, but I'm a GW alum and not properly trained to understand it.
  • More and more, I notice what my debt-financed contributions to the revenue streams of my vendors earn them, not me. My banks earned enough to pay ridiculous bonuses to employees for reckless risk-taking. My satellite tv operator earned enough to overpay ESPN for sports programming that I never watch--and that, in turn, overpays these idiotic pro athletes and college sports administrators. My health insurer earned enough to defeat one-payor insurance; to enable the opaque, inefficient billing practices of hospitals and other providers; and to feed the behemoth pharmaceutical industry. My church earned enough to buy the silence of sex abuse victims and oppose progressive political candidates. And my govt earned enough to continue ag subsidies, inefficient defense spending, and obsolete transportation and energy policies.
  • I attended nearby Georgetown University and graduated in 1985. Relative to state schools and elite schools, it was expensive then. I took out loans. I had Pell grants. I had work-study and GSL. I paid my debt of $15,000 off in ten years. Would I have done it differently? Yes: I would have continued on to graduate school and not worried about paying off those big loans right after college. My career worked out and I am grateful for the education I received and paid for. But I would not recommend to my nieces and nephews debts north of $100,000 for a BA in liberal arts. Go community. Then go state. Then punch your ticket to Harvard, Yale or Stanford — if you are good enough.
  • American universities appear to have more and more drifted away from educating individuals and citizens to becoming high priced trade schools and purveyors of occupational licenses. Lost in the process is the concept of expanding a student's ability to appreciate broadly and deeply, as well as the belief that a republican democracy needs an educated citizenry, not a trained citizenry, to function well.Both the Heisman Trophy winner and the producer of a successful tech I.P.O. likely have much in common, a college education whose rewards are limited to the financial. I don't know if I find this more sad on the individual level or more worrisome for the future of America.
  • This is now a consumer world for everything, including institutions once thought to float above the Shakespearean briars of the work-a-day world such as higher education, law and medicine. Students get this. Parents get this. Everything is negotiable: financial aid, a spot in the nicest dorm, tix to the big game. But through all this, there are faculty - lots of 'em - who work away from the fluff to link the ambitions of the students with the reality and rigor of the 21st century. The job of the student is to get beyond the visible hype of the surroundings and find those faculty members. They will make sure your investment is worth it.
  • My experience in managing or working with GW alumni in their 20's or 30's has not been good. Virtually all have been mentally lazy and/or had a stunning sense of entitlement. Basically they've been all talk and no results. That's been quite a contrast to the graduates from VA/MD state universities.
  • This explains a lot of the modern, emerging mentality. It encompasses the culture of enforced grade inflation, cheating and anti-intellectualism in much of higher education. It is consistent with our culture of misleading statistics and information, cronyism and fake quality, the "best and the brightest" being only schemers and glad handers. The wisdom and creativity engendered by an honest, rigorous academic education are replaced by the disingenuous quick fix, the winner-take-all mentality that neglects the common good.
  • as the parent of GWU freshman I am grateful for every opportunity afforded her. She has a generous merit scholarship, is in the honors program with some small classes, and has access to internships that can be done while at school. GWU also gave her AP credits to advance her to sophomore status. Had she attended the state flagship school (where she was accepted into that exclusive honors program) she would have a great education but little else. It's not possible to do a foreign affairs related internship far from D.C. or Manhattan. She went to a very competitive high school where for the one or two ivy league schools in which she was interested, she didn't have the same level of connections or wealth as many of her peers. Whether because of the Common Application or other factors, getting into a good school with financial help is difficult for a middle class student like my daughter who had a 4.0 GPA and 2300 on the SAT. She also worked after school. The bottom line - GWU offered more money than perceived "higher tier" universities, and brought tuition to almost that of our state school system. And by the way, I think she is also getting a very good education.
  • This article reinforces something I have learned during my daughter's college application process. Most students choose a school based on emotion (reputation) and not value. This luxury good analogy holds up.
  • The entire education problem can be solved by MOOCs, lots and lots of them, plus a few closely monitored tests and personal interviews with people. Of course many many people make MONEY off of our entirely inefficient way of "educating" -- are we even really doing that? Getting a degree does NOT mean one is actually educated.
  • As a first-generation college graduate I entered GW ambitious but left saddled with debt, and crestfallen at the hard-hitting realization that my four undergraduate years were an aberration from what life is actually like post-college: not as simple as getting an [unpaid] internship with a fancy titled institution, as most Colonials do. I knew how to get in to college, but what do you do after the recess of life ends?I learned more about networking, resume plumping (designated responses to constituents...errr....replied to emails), and elevator pitches than actual theory, economic principles, strong writing skills, critical thinking, analysis, and philosophy. While relatively easy to get a job after graduating (for many with a GW degree this is sadly not the case) sustaining one and excelling in it is much harder. It's never enough just to be able to open a new door, you also need to be prepared to navigate your way through that next opportunity.
  • this is a very telling article. Aimless and directionless high school graduates are matched only by aimless and directionless institutes of higher learning. Each child and each parent should start with a goal - before handing over their hard earned tuition dollars, and/or leaving a trail of broken debt in the aftermath of a substandard, unfocused education.
  • it is no longer the most expensive university in America. It is the 46th. Others have been implementing the Absolut Rolex Plan. John Sexton turned New York University into a global higher-education player by selling the dream of downtown living to students raised on “Sex and the City.” Northeastern followed Boston University up the ladder. Under Steven B. Sample, the University of Southern California became a U.S. News top-25 university. Washington University in St. Louis did the same.
  • I currently attend GW, and I have to say, this article completely misrepresents the situation. I have yet to meet a single person who is paying the full $60k tuition - I myself am paying $30k, because the school gave me $30k in grants. As for the quality of education, Foreign Policy rated GW the #8 best school in the world for undergraduate education in international affairs, Princeton Review ranks it as one of the best schools for political science, and U.S. News ranks the law school #20. The author also ignores the role that an expanding research profile plays in growing a university's prestige and educational power.
  • it makes me very sad to see how expensive some public schools have become. Used to be you could work your way through a public school without loans, but not any more. Like you, I had the advantage of a largely-scholarship paid undergraduate education at a top private college. However, I was also offered a virtually free spot in my state university's (then new) honors college
  • this is the same playbook used by hospitals the past 30 years or so. It is how Hackensack Hospital became Hackensack Medical Center and McComb Hospital became Southwest Mississippi Regional Medical Center. No wonder the results have been the same in healthcare and higher education; both have priced themselves out of reach for average Americans.
  • a world where a college is rated not by the quality of its output, but instead, by the quality of its inputs. A world where there is practically no work to be done by the administration because the college's reputation is made before the first class even begins! This is insanity! But this is the swill that the mammoth college marketing departments nationwide have shoved down America's throat. Colleges are ranked not by the quality of their graduates, but rather, by the test scores of their incoming students!
  • The Pew Foundation has been doing surveys on what students learn, how much homework they do, how much time they spend with professors etc. All good stuff to know before a student chooses a school. It is called the National Survey of Student Engagement (NSSE - called Nessy). It turns out that the higher ranked schools do NOT allow their information to be released to the public. It is SECRET. Why do you think that is?
  • The article blames "the standard university organizational model left teaching responsibilities to autonomous academic departments and individual faculty members, each of which taught and tested in its own way." This is the view of someone who has never taught at a university, nor thought much about how education there actually happens. Once undergraduates get beyond the general requirements, their educations _have_ to depend on "autonomous departments" because only those departments know what the requirements for a given degree can be, and can grant the necessary accreditation of a given student. The idea that some administrator could know what's necessary for degrees in everything from engineering to fiction writing is nonsense, except that's what the people who only know the theory of education (but not its practice) actually seem to think. In the classroom itself, you have tremendously talented people, who nevertheless have their own particular strengths and approaches. Don't you think it's a good idea to let them do what they do best rather than trying to make everyone teach the same way? Don't you think supervision of young teachers by older colleagues, who actually know their field and its pedagogy, rather than some administrator, who knows nothing of the subject, is a good idea?
  • And in hundreds of regional universities and community colleges, presidents and deans and department chairmen have watched this spectacle of ascension and said to themselves, “That could be me.” Agricultural schools and technical institutes are lobbying state legislatures for tuition increases and Ph.D. programs, fitness centers and arenas for sport. Presidents and boards are drawing up plans to raise tuition, recruit “better” students and add academic programs. They all want to go in one direction — up! — and they are all moving with a single vision of what they want to be.
  • My daughter attended a good community college for a couple of classes during her senior year of high school and I could immediately see how such places are laboratories for failure. They seem like high schools in atmosphere and appearance. Students rush in by car and rush out again when the class is over.The four year residency college creates a completely different feel. On arrival, you get the sense that you are engaging in something important, something apart and one that will require your full attention. I don't say this is for everyone or that the model is not flawed in some ways (students actually only spend 2 1/2 yrs. on campus to get the four yr. degree). College is supposed to be a 60 hour per week job. Anything less than that and the student is seeking himself or herself
  • This. Is. STUNNING. I have always wondered, especially as my kids have approached college age, why American colleges have felt justified in raising tuition at a rate that has well exceeded inflation, year after year after year. (Nobody needs a dorm with luxury suites and a lazy river pool at college!) And as it turns out, they did it to become luxury brands. Just that simple. Incredible. I don't even blame this guy at GWU for doing what he did. He wasn't made responsible for all of American higher ed. But I do think we all need to realize what happened, and why. This is front page stuff.
  • I agree with you, but, unfortunately, given the choice between low tuition, primitive dorms, and no athletic center on the one hand and expensive and luxurious on the other, the customers (and their parents) are choosing the latter. As long as this is the case, there is little incentive to provide a bare-bones and cheap education.
  • Wesleyan University in CT is one school that is moving down the rankings. Syracuse University is another. Reed College is a third. Why? Because these schools try hard to stay out of the marketing game. (With its new president, Syracuse has jumped back into the game.) Bryn Mawr College, outside Philadelphia, hasn't fared well over the past few decades in the rankings, as is true of practically every women's college. Wellesley is by far the highest-ranked women's college, but even there the acceptance rate is significantly higher than one finds at comparable coed liberal arts colleges like Amherst & Williams. University of Chicago is another fascinating case for Mr. Carey to study (I'm sure he does in his forthcoming book, which I look forward to reading). Although it has always enjoyed an illustrious academic reputation, until recently Chicago's undergraduate reputation paled in comparison to peer institutions on the two coasts. A few years ago, Chicago changed its game plan to more closely resemble Harvard and Stanford in undergraduate amenities, and lo and behold, its rankings shot up. It was a very cynical move on the president's part to reassemble the football team, but it was a shrewd move because athletics draw more money than academics ever can (except at engineering schools like Cal Tech & MIT), and more money draws richer students from fancier secondary schools with higher test scores, which lead to higher rankings - and the beat goes on.
  • College INDUSTRY is out of control. Sorry, NYU, GW, BU are not worth the price. Are state schools any better? We have the University of Michigan, which is really not a state school, but a university that gives a discount to people who live in Michigan. Why do I say that? When you have an undergraduate body 40+% out-of-state that pays tuition of over $50K/year, you tell me. Perhaps the solution is two years of community college followed by two at places like U of M or Michigan State - get the same diploma at the end for much less and beat the system.
  • In one recent year, the majority of undergrad professors at Harvard, according to Boston.com, were adjuncts. That means low pay, no benefits, no office, temp workers. Harvard. Easily available student loans fueled this arms race of amenities and frills in which colleges now engage. They moved the cost of education onto the backs of people, kids, who don't understand what they are doing. Students in colleges these days are customers, and the customers must be able to get through. If it requires dumbing things down, so be it. On top of tuition, GWU is known by its students as the land of added fees on top of added fees. The joke around campus was that they would soon be installing pay toilets in the student union. No one was laughing.
  • You could have written the same story about my alma mater, American University. The place reeked of ambition and upward mobility decades ago and still does. Whoever's running it now must look at its measly half-billion-dollar endowment, compare it to GWU's $1.5 billion, and seethe with envy, while GWU's president sets his sights on an Ivy League-size endowment. And both get back to their real jobs: 24/7 fundraising, which is what university presidents are all about these days. Money - including million-dollar salaries for themselves (GWU's president made more than Harvard's in 2011) - pride, cachet, power, a mansion, first-class all the way. They should just be honest about it and change their university's motto to Ostende mihi pecuniam! (please excuse my questionable Latin). Whether the students are actually learning anything is up to them, I guess - if they do, it's thanks to the professors, adjuncts and the administrative staff, who do the actual work of educating and keep the school running.
  • When I was in HS (70s), many of my richer friends went to GW and I was then of the impression that GW was a 'good' school. As I age, I have come to realize that this place is just another façade over the emptiness that has become America. All too often we are faced with a dilemma: damned if we do, damned if we don't. Yep, 'education' has become a trap for all too many of our citizens.
  • I transferred to GWU from a state school. I am forever grateful that I did. I wanted to get a good, rigorous education and go to one of the best International Affairs schools in the world. Even though the state school I went to was dirt-cheap, the education and the faculty were awful. I transferred to GW and was amazed at the professors at that university. An ambassador or a prominent IA scholar taught every class. GW is an expensive school, but that is the free market. If you want a good education you need to be willing to pay for it or join the military. I did the latter, and my school was completely free with no debt, and I received an amazing education. If young people aren't willing to make some sort of sacrifice to get ahead, or just expect everything to be given to them, then our country is in a sad state. We need to stop blaming universities like GWU that strive to attract better students, better professors, and better infrastructure. They are doing what is expected in America: trying to better themselves.
  • "Whether the students are actually learning anything is up to them, I guess." How could it possibly be otherwise??? I am glad that you are willing to give credit to teachers and administrators, but it is not they who "do the actual work of educating." From this fallacy comes its corollary, that we should blame teachers first for "under-performing schools". This long-running show of scapegoating may suit the wallets and vanity of American parents, but it is utterly senseless. When, if ever, American culture stops reeking of arrogance, greed and anti-intellectualism, things may improve, and we may resume the habit of bothering to learn. Until then, nothing doing.
  • Universities sell knowledge and grade students on how much they have learned. Fundamentally, there is a conflict of interest in this setup. Moreover, students who are poorly educated, even if they know this, will not criticize their school, because doing so would make it harder for them to have a career. As such, many problems with higher education remain unexposed to the public.
  • I've lectured and taught in at least five different countries on three continents, and the shortest perusal of what goes on abroad would totally undermine most of these speculations. For one thing, American universities are unique in their dedication to a broad-based, liberal-arts-type education. In France, Italy or Germany, for example, you select a major like mathematics or physics, and then in your four years you will not take even one course in another subject. The amount of work you do that is critically evaluated by an instructor is a tiny fraction of what is done in an American university. While half-educated critics write criticism like this based on profoundly incomplete research, universities in Germany, Italy, the Netherlands, South Korea and Japan, as well as France, have appointed committees and made studies to explain why the American system of higher education so drastically outperforms their own. Elsewhere students do get a rather nice dose of general education, but it ends in secondary school, and it has the narrowness and formulaic quality that we would normally associate with that. The character who wrote this article probably never set foot on a "campus" of the University of Paris or Rome.
  • This is the inevitable result of privatizing higher education. In the not-so-distant past, we paid for great state universities through our taxes, not tuition. Then the states shifted funding to prisons, and the Federal government radically cut research support and the GI Bill. Instead, today we expect universities to support themselves through tuition, and to the extent that we offer students support, it is through non-dischargeable loans. To make matters worse, the interest rates on those loans are far above the government's cost of funds -- so in effect the loans are an excise tax on education (most of which is used to support a handful of for-profit institutions that account for the most student defaults). This "consumer sovereignty" privatized model of funding education works no better than privatizing California's electrical system did in the era of Enron, or our privatized funding of medical services, or our increasingly privatized prison system: it drives up costs at the same time that it replaces quality with marketing.
  • The university is part of a complex economic system, and it is responding to the demands of that system. For example, students and parents choose universities that have beautiful campuses and buildings, so universities build beautiful campuses. State support of universities has greatly declined, and this decline in funding is the greatest cause of increased tuition. Therefore universities must compete for dollars and must build to attract students and parents. Also, universities are not ranked based on how well they educate students -- that's difficult to measure, so it is not measured. Instead universities are ranked on research publications. So while universities certainly put much effort into teaching, research has to have priority in order for the university to survive. Also, universities do not force students and parents to attend high-priced institutions; reasonably priced state institutions and community colleges are available to every student. Community colleges have an advantage because they are funded by property taxes. Finally, learning requires good teaching, but it also requires students who come to the university funded, prepared, and engaged. This often does not happen. Conclusion: universities have to participate in profile-raising actions in order to survive. The day that funding is provided for college, ranking is based on education, and students choose campuses with simple buildings, then things will change at the university.
  • There are data in some instances on student learning, but the deeper problem, as I suspect the author already knows, is that there is nothing like a consensus on how to measure that learning, or even on what the proper end point to emphasize is (a lot of what I teach -- I know this from what students have told me -- tends to come into sharp focus years after graduation).
  • Michael (Baltimore) has hit the nail on the head. Universities are increasingly corporatized institutions in the credentialing business. Knowledge, for those few who care about it (often not those paying for the credentials), is available freely because there's no profit in it. Like many corporate entities, universities are increasingly run by highly paid administrators, not faculty.
  • GWU has not defined itself in any unique way; it has merely embraced the bland, but very expensive, accoutrements of American private education: luxury dorms, food courts, spa-like gyms, endless extracurricular activities, etc. But the real culprit for this bloat that students have to bear financially is the college ranking system of US News, Princeton Review, etc. - an ultimately meaningless exercise in competition that has nevertheless pushed colleges and universities to be more like one another. A sad state of affairs, and an extremely expensive one for students.
  • It is long past time to realize the failure of the Reaganomics-neoliberal program of private profits over public good. In education, we need to return to public institutions publicly funded. Just as we need to recognize that Medicare, Social Security, the post office, public utilities, fire departments, the interstate highway system, Veterans Administration hospitals and the GI Bill are models to be improved and expanded, not destroyed.
  • George Washington is actually not a Rolex watch; it is a counterfeit Rolex. The real Rolexes of higher education -- places like Hopkins, Georgetown, Duke, the Ivies etc. -- have real endowments and real financial aid. No middle-class kid is required to borrow $100,000 to get a degree from those schools, because they offer generous need-based financial aid in the form of grants, not loans. The tuition at the real Rolexes is really a sticker price that only the wealthy pay -- everybody else is on a sliding scale. For middle-class kids who are fortunate enough to get in, Penn actually ends up costing considerably less than a state university. The fake Rolexes -- BU, NYU, Drexel in Philadelphia -- don't have the sliding scale. They bury middle-class students in debt. And really, though it is foolish to borrow $100,000 or $120,000 for an undergraduate degree, I don't find the transaction morally wrong. What is morally wrong is our federal government making that loan non-dischargeable in bankruptcy, so many of these kids will be having their wages garnished for the REST OF THEIR LIVES. There is a very simple solution to this, by the way: cap the amount of non-dischargeable student loan debt at, say, $50,000.
  • The slant of this article is critical of the growth of research universities. Couldn't disagree more. Modern research universities are incredible engines of economic opportunity, not only for the students (who pay the bills) but also for the community, via the creation of blue- and white-collar jobs. Large research universities employ tens of thousands of locals, from custodial and food service workers right up to high-level administrators and specialists in finance, computer services, buildings and facilities management, etc. Johns Hopkins University and the University of Maryland system employ more people than any other industry in Maryland -- including the government. Research universities typically have hospitals providing cutting-edge medical care to the community. Local businesses (from cafes to property rental companies) benefit from a built-in, long-term client base as well as an educated workforce. And of course they are the foundry of new knowledge, which is critical for the future growth of our country. Check out the work of famed economist Dr. Julia Lane on modeling the economic value of the research university. In a nutshell, there are few better investments America can make in herself than research universities. We are the envy of the world in that regard -- and with good reason. How many *industries* (let alone jobs) has Stanford University alone catalyzed?
  • What universities have a monopoly on is the credential. Anyone can learn, from books, from free lectures on the internet, from this newspaper, etc. But only universities can endow you with the cherished degree. For some reason, people are willing to pay more for one of these pieces of paper with a certain name on it -- Ivy League, Stanford, even GW -- than another -- Generic State U -- though there is no evidence one is actually worth more in the marketplace of reality than the other. But, by the laws of economics, these places are actually underpriced: after all, something like 20 times more people are trying to buy a Harvard education than are allowed to purchase one. Usually that means you raise your price.
  • Overall a good article, except for: "This comes on the heels of Richard Arum and Josipa Roksa's 'Academically Adrift,' a study that found 'limited or no learning' among many college students." The measure of learning you report was a general thinking-skills exam. That's not a good measure of college gains. Most psychologists and cognitive scientists worth their salt would tell you that improvement in critical thinking skills is going to be limited to specific areas. In other words, learning critical thinking skills in math will make little change in critical thinking about political science or biology. Thus we should not expect huge improvements in general critical thinking skills, but rather improvements in a student's major and other areas of focus, such as a minor. Although who has time for a minor when it is universally acknowledged that the purpose of a university is to please and profit an employer or, if one is lucky, an investor? Finally, improved critical thinking skills are not the be-all and end-all of a college education, even given this profit-centered perspective. Learning and mastering the cumulative knowledge of past generations is arguably the most important thing to be gained, and most universities still tend to excel at that, even with the increasing mandate to run education like a business and cultivate and cull the college "consumer".
  • As for community colleges, there was an article in the Times several years ago that said it much better than I could have said it myself: community colleges are places where dreams are put on hold. Without making the full commitment to study, without leaving the home environment, many, if not most, community college students are caught betwixt and between, trying to balance work responsibilities, caring for a young child or baby and attending classes. For males, the classic "end of the road" in community college is to get a car, a job and a girlfriend, one who is not in college, and that is the end of the dream. Some can make it, but most cannot.
  • As a scientist, I disagree with the claim that undergrad tuition subsidizes basic research. Nearly all lab equipment and research personnel (grad students, technicians, anyone with the title "research scientist" or similar) on campus are paid for through federal grants. Professors often spend all their time outside teaching and administration writing grant proposals, as the limited federal grant funds mean ~85% of proposals must be rejected. What is more, out of each successful grant the university levies a "tax", called "overhead", of 30-40%, nominally to pay for basic operations (utilities, office space, administrators). So in fact one might say research helps fund the university rather than the other way around.
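    A minimal sketch of the arithmetic behind that overhead claim, assuming the quoted rate is applied to direct costs (the usual base for US federal grants, though exact bases vary by institution): if a grant's direct costs are $D$ and the overhead rate is $r$, then $\text{total award} = D(1+r)$ and the university's share is $rD = \frac{r}{1+r}\times\text{total award}$. At $r = 0.35$, the university keeps $0.35/1.35 \approx 26\%$ of each grant dollar.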
  • It's certainly overrated as a research and graduate-level university. Whether it is good for getting an undergraduate education is unclear, but a big part of the appeal is getting to live in D.C. while attending college instead of living in some small college town in the corn fields.
Javier E

The Problem With History Classes - The Atlantic - 3 views

  • The passion and urgency with which these battles are fought reflect the misguided way history is taught in schools. Currently, most students learn history as a set narrative—a process that reinforces the mistaken idea that the past can be synthesized into a single, standardized chronicle of several hundred pages. This teaching pretends that there is a uniform collective story, which is akin to saying everyone remembers events the same way.
  • Yet, history is anything but agreeable. It is not a collection of facts deemed to be "official" by scholars on high. It is a collection of historians exchanging different, often conflicting analyses.
  • rather than vainly seeking to transcend the inevitable clash of memories, American students would be better served by descending into the bog of conflict and learning the many "histories" that compose the American national story.
  • ...18 more annotations...
  • Perhaps Fisher offers the nation an opportunity to divorce, once and for all, memory from history. History may be an attempt to memorialize and preserve the past, but it is not memory; memories can serve as primary sources, but they do not stand alone as history. A history is essentially a collection of memories, analyzed and reduced into meaningful conclusions—but that collection depends on the memories chosen.
  • Memories make for a risky foundation: As events recede further into the past, the facts are distorted or augmented by entirely new details
  • people construct unique memories while informing perfectly valid histories. Just as there is a plurality of memories, so, too, is there a plurality of histories.
  • Scholars who read a diverse set of historians who are all focused on the same specific period or event are engaging in historiography
  • This approach exposes textbooks as nothing more than a compilation of histories that the authors deemed to be most relevant and useful.
  • In historiography, the barrier between historian and student is dropped, exposing a conflict-ridden landscape. A diplomatic historian approaches an event from the perspective of the most influential statesmen (who are most often white males), analyzing the context, motives, and consequences of their decisions. A cultural historian peels back the objects, sights, and sounds of a period to uncover humanity’s underlying emotions and anxieties. A Marxist historian adopts the lens of class conflict to explain the progression of events. There are intellectual historians, social historians, and gender historians, among many others. Historians studying the same topic will draw different interpretations—sometimes radically so, depending on the sources they draw from
  • Jacoba Urist points out that history is "about explaining and interpreting past events analytically." If students are really to learn and master these analytical tools, then it is absolutely essential that they read a diverse set of historians and learn how brilliant men and women who are scrutinizing the same topic can reach different conclusions
  • Rather than constructing a curriculum based on the muddled consensus of boards, legislatures, and think tanks, schools should teach students history through historiography. The shortcomings of one historian become apparent after reading the work of another one on the list.
  • Although, as Urist notes, the AP course is "designed to teach students to think like historians," my own experience in that class suggests that it fails to achieve that goal.
  • The course’s framework has always served as an outline of important concepts aiming to allow educators flexibility in how to teach; it makes no reference to historiographical conflicts. Historiography was an epiphany for me because I had never before come face-to-face with how historians think and reason
  • When I took AP U.S. History, I jumbled these diverse histories into one indistinct narrative. Although the test involved open-ended essay questions, I was taught that graders were looking for a firm thesis—forcing students to adopt a side. The AP test also, unsurprisingly, rewards students who cite a wealth of supporting details
  • By the time I took the test in 2009, I was a master at "checking boxes," weighing political factors equally against those involving socioeconomics and ensuring that previously neglected populations like women and ethnic minorities received their due. I did not know that I was pulling ideas from different historiographical traditions. I still subscribed to the idea of a prevailing national narrative and served as an unwitting sponsor of synthesis, oblivious to the academic battles that made such synthesis impossible.
  • Although there may be an inclination to seek to establish order where there is chaos, that urge must be resisted in teaching history. Public controversies over memory are hardly new. Students must be prepared to confront divisiveness, not conditioned to shoehorn agreement into situations where none is possible
  • When conflict is accepted rather than resisted, it becomes possible for different conceptions of American history to co-exist. There is no longer a need to appoint a victor.
  • More importantly, the historiographical approach avoids pursuing truth for the sake of satisfying a national myth
  • The country’s founding fathers crafted some of the finest expressions of personal liberty and representative government the world has ever seen; many of them also held fellow humans in bondage. This paradox is only a problem if the goal is to view the founding fathers as faultless, perfect individuals. If multiple histories are embraced, no one needs to fear that one history will be lost.
  • History is not indoctrination. It is a wrestling match. For too long, the emphasis has been on pinning the opponent. It is time to shift the focus to the struggle itself
  • There is no better way to use the past to inform the present than by accepting the impossibility of a definitive history—and by ensuring that current students are equipped to grapple with the contested memories in their midst.
Javier E

You Won't Stay the Same, Study Finds - NYTimes.com - 0 views

  • When we remember our past selves, they seem quite different. We know how much our personalities and tastes have changed over the years.
  • when we look ahead, somehow we expect ourselves to stay the same, a team of psychologists said Thursday, describing research they conducted of people’s self-perceptions.
  • They called this phenomenon the “end of history illusion,” in which people tend to “underestimate how much they will change in the future.” According to their research, which involved more than 19,000 people ages 18 to 68, the illusion persists from teenage years into retirement
  • ...5 more annotations...
  • the discrepancy did not seem to be because of faulty memories, because the personality changes recalled by people jibed quite well with independent research charting how personality traits shift with age. People seemed to be much better at recalling their former selves than at imagining how much they would change in the future.
  • At every age we think we’re having the last laugh, and at every age we’re wrong.”
  • a few theories, starting with the well-documented tendency of people to overestimate their own wonderfulness. “Believing that we just reached the peak of our personal evolution makes us feel good,”
  • Or maybe the explanation has more to do with mental energy: predicting the future requires more work than simply recalling the past. “People may confuse the difficulty of imagining personal change with the unlikelihood of change itself,”
  • “The end-of-history effect may represent a failure in personal imagination,” said Dan P. McAdams, a psychologist at Northwestern who has done separate research into the stories people construct about their past and future lives. He has often heard people tell complex, dynamic stories about the past but then make vague, prosaic projections of a future in which things stay pretty much the same.
Javier E

The Importance of Doing Recent History | History News Network - 1 views

  • We argue that writing contemporary history is different from the role historians might play as public intellectuals who draw on their expertise to comment on recent events in the media. Instead, the writing of recent history shifts the boundaries of what might be considered a legitimate topic of historical study. The very definition of “history” has hinged on the sense of a break between past and present that allows for critical perspective. The historians’ traditional task has been to bring a “dead,” absent past back into the present. However, those doing recent history recognize that their subject matter is not fully past, or as Renee Romano puts it in our edited collection about recent history, it’s “not dead yet.”
  • studying the recent past presents real methodological challenges. It untethers the academic historian from the aspects of our practice that give us all, regardless of field or political bent, a sense of common enterprise: objectivity, perspective, a defined archive, and a secondary literature that is there to be argued with, corrected and leaned upon.
Sophia C

Thomas Kuhn: Revolution Against Scientific Realism* - 1 views

  • as such a complex system that nobody believed that it corresponded to the physical reality of the universe. Although the Ptolemaic system accounted for observations-"saved the appearances"-its epicycles and deferents were never intended to be anything more than a mathematical model to use in predicting the position of heavenly bodies. [3]
  • Galileo was told that he was free to continue his work with Copernican theory if he agreed that the theory did not describe physical reality but was merely one of the many potential mathematical models. [10] Galileo continued to work, and while he "formally claimed to prove nothing," [11] he passed his mathematical advances and his observational data to Newton, who would not only invent a new mathematics but would solve the remaining problems posed by Copernicus. [12]
  • Thus without pretending that his method could find the underlying causes of things such as gravity, Newton believed that his method produced theory, based upon empirical evidence, that was a close approximation of physical reality.
  • ...27 more annotations...
  • Medieval science was guided by "logical consistency."
  • The logical empiricist's conception of scientific progress was thus a continuous one; more comprehensive theory replaced compatible, older theory
  • Hempel also believed that science evolved in a continuous manner. New theory did not contradict past theory: "theory does not simply refute the earlier empirical generalizations in its field; rather, it shows that within a certain limited range defined by qualifying conditions, the generalizations hold true in fairly close approximation." [21]
  • New theory is more comprehensive; the old theory can be derived from the newer one and is "one special manifestation" [22] of the more comprehensive new theory.
  • movement combined induction, based on empiricism, and deduction in the form of logic
  • It was the truth, and the prediction and control that came with it, that was the goal of logical-empirical science.
  • Each successive theory's explanation was closer to the truth than the theory before.
  • The notion of scientific realism held by Newton led to the evolutionary view of the progress of science
  • The entities and processes of theory were believed to exist in nature, and science should discover those entities and processes
  • Particularly disturbing discoveries were made in the area of atomic physics. For instance, Heisenberg's indeterminacy principle, according to historian of science Cecil Schneer, yielded the conclusion that "the world of nature is indeterminate."
  • "even the fundamental principle of causality fail[ed] ."
  • It was not until the second half of the twentieth century that the preservers of the evolutionary idea of scientific progress, the logical empiricists, were seriously challenged
  • revolutionary model of scientific change and examined the role of the scientific community in preventing and then accepting change. Kuhn's conception of scientific change occurring through revolutions undermined the traditional scientific goal, finding "truth" in nature
  • Textbooks inform scientists-to-be about this common body of knowledge and understanding.
  • for the world is too huge and complex to be explored randomly.
  • a scientist knows what facts are relevant and can build on past research
  • Normal science, as defined by Kuhn, is cumulative. New knowledge fills a gap of ignorance
  • One standard product of the scientific enterprise is missing: "Normal science does not aim at novelties of fact or theory and, when successful, finds none."
  • Normal science does, however, contain a mechanism that uncovers anomaly: inconsistencies within the paradigm.
  • eventually, details arise that are inconsistent with the current paradigm
  • These inconsistencies are eventually resolved or are ignored.
  • When they concern a topic of central importance, a crisis occurs and normal science comes to a halt
  • that the scientists re-examine the foundations of their science that they had been taking for granted
  • it resolves the crisis better than the others, it offers promise for future research, and it is more aesthetic than its competitors. The reasons for converting to a new paradigm are never completely rational.
  • Unlike evolutionary science, in which new knowledge fills a gap of ignorance, in Kuhn's model new knowledge replaces incompatible knowledge.
  • Thus science is not a continuous or cumulative endeavor: when a paradigm shift occurs there is a revolution similar to a political revolution, with fundamental and pervasive changes in method and understanding. Each successive vision about the nature of the universe makes the past vision obsolete; predictions, though more precise, remain similar to the predictions of the past paradigm in their general orientation, but the new explanations do not accommodate the old
  • In a sense, we have circled back to the ancient and medieval practice of separating scientific theory from physical reality; both medieval scientists and Kuhn would agree that no theory corresponds to reality and therefore any number of theories might equally well explain a natural phenomenon. [36] Neither twentieth-century atomic theorists nor medieval astronomers are able to claim that their theories accurately describe physical phenomena. The inability to return to scientific realism suggests a tripartite division of the history of science, with a period of scientific realism fitting between two periods in which there is no insistence that theory correspond to reality. Although both scientific realism and the evolutionary idea of scientific progress appeal to common sense, both existed for only a few hundred years.
Javier E

How Poor Are the Poor? - NYTimes.com - 0 views

  • “Anyone who studies the issue seriously understands that material poverty has continued to fall in the U.S. in recent decades, primarily due to the success of anti-poverty programs” and the declining cost of “food, air-conditioning, communications, transportation, and entertainment,”
  • Despite the rising optimism, there are disagreements over how many poor people there are and the conditions they live under. There are also questions about the problem of relative poverty, what we are now calling inequality
  • Jencks argues that the actual poverty rate has dropped over the past five decades – far below the official government level — if poverty estimates are adjusted for food and housing benefits, refundable tax credits and a better method of determining inflation rates. In Jencks’s view, the war on poverty worked.
  • ...15 more annotations...
  • Democratic supporters of safety net programs can use Jencks’s finding that poverty has dropped below 5 percent as evidence that the war on poverty has been successful.
  • At the same time liberals are wary of positive news because, as Jencks notes: It is easier to rally support for such an agenda by saying that the problem in question is getting worse
  • The plus side for conservatives of Jencks’s low estimate of the poverty rate is the implication that severe poverty has largely abated, which then provides justification for allowing enemies of government entitlement programs to further cut social spending.
  • At the same time, however, Jencks’s data undermines Republican claims that the war on poverty has been a failure – a claim exemplified by Ronald Reagan’s famous 1987 quip: “In the sixties we waged a war on poverty, and poverty won.”
  • Jencks’s conclusion: “The absolute poverty rate has declined dramatically since President Johnson launched his war on poverty in 1964.” At 4.8 percent, Jencks’s calculation is the lowest poverty estimate by a credible expert in the field.
  • his conclusion — that instead of the official count of 45.3 million people living in poverty, the number of poor people in America is just under 15 million — understates the scope of hardship in this country.
  • There are strong theoretical justifications for the use of a relative poverty measure. The Organization for Economic Cooperation and Development puts it this way: In order to participate fully in the social life of a community, individuals may need a level of resources that is not too inferior to the norms of a community. For example, the clothing budget that allows a child not to feel ashamed of his school attire is much more related to national living standards than to strict requirements for physical survival
  • using a relative measure shows that the United States lags well behind other developed countries: If you use the O.E.C.D. standard of 50 percent of median income as a poverty line, the United States looks pretty bad in cross-national relief. We have a relative poverty rate exceeded only by Chile, Turkey, Mexico and Israel (which has seen a big increase in inequality in recent years). And that rate in 2010 was essentially where it was in 1995
  • While the United States “has achieved real progress in reducing absolute poverty over the past 50 years,” according to Burtless, “the country may have made no progress at all in reducing the relative economic deprivation of folks at the bottom.”
  • the heart of the dispute: How severe is the problem of poverty?
  • Kathryn Edin, a professor of sociology at Johns Hopkins, and Luke Schaefer, a professor of social work at the University of Michigan, contend that the poverty debate overlooks crucial changes that have taken place within the population of the poor.
  • welfare reform, signed into law by President Clinton in 1996 (the Personal Responsibility and Work Opportunity Act), which limited eligibility for welfare benefits to five years. The limitation has forced many of the poor off welfare: over the past 19 years, the percentage of families falling under the official poverty line who receive welfare benefits has fallen to 26 percent from 68 percent. Currently, three-quarters of those in poverty, under the official definition, receive no welfare payments.
  • The enactment of expanded benefits for the working poor through the earned-income tax credit and the child tax credit. According to Edin and Schaefer, the consequence of these changes, taken together, has been to divide the poor who no longer receive welfare into two groups. The first group is made up of those who have gone to work and have qualified for tax credits. Expanded tax credits lifted about 3.2 million children out of poverty in 2013
  • The second group, though, has really suffered. These are the very poor who are without work, part of a population that is struggling desperately. Edin and Schaefer write that among the losers are an estimated 3.4 million "children who over the course of a year live for at least three months under a $2 per person per day threshold."
  • Focusing on these findings, Mishel argues, diverts attention from the more serious problem of "the failure of the labor market to adequately reward low-wage workers." To support his case, Mishel points out that hourly pay for those in the bottom fifth grew only 7.7 percent from 1979 to 2007, while productivity grew by 64 percent, and education levels among workers in this quintile substantially improved.
sissij

A Transgender Student Won Her Battle. Now It's War. - The New York Times - 0 views

  • But just being allowed to set foot in that locker room was a huge victory for the girl. She is transgender.
  • but the war over how to accommodate transgender students is far from over in her Chicago suburb.
  • on the grounds that they are “the opposite biological sex.” Their presence, the opponents argue, violates community standards of decency.
  • ...5 more annotations...
  • But that agreement, essentially a contract between the district and the federal government, applies only to one person: Student A.
  • “We didn’t choose to be black,” he said, “and no matter what choice we make in the future, guess what? We’re still going to be black.”
  • A letter from the church says: “God created two distinct and complementary sexes in the very biology of the human race. A biological male is never female or vice versa.”
  • Like many of the opponents, Mr. Harrington and his daughter have not met the transgender students they are talking about. Sarah suspects, but is not sure, that one of the younger transgender boys was in her Girl Scout troop.
  • Yes, he said, he knows Student A. He has since middle school, where she was bullied. And, he said with a smile, they attend the same church. “At least among my friend group,” he said, “it’s pretty well accepted that nobody really cares.”
  •  
    Transgender identity has always been a sensitive topic that people hold controversial views on. Religion is a big obstacle in the way. Transgender identity is obviously a rather new thing socially, and it takes time for the whole society to accept it as normal. I have always been interested in how transgender people lived in the past. Transgender people are not a thing that only exists now. What about their history in the past? Why is the issue of transgender identity only starting to emerge and become widely discussed now? --Sissi (4/2/2017)
Javier E

MacIntyre | Internet Encyclopedia of Philosophy - 0 views

  • For MacIntyre, “rationality” comprises all the intellectual resources, both formal and substantive, that we use to judge truth and falsity in propositions, and to determine choice-worthiness in courses of action
  • Rationality in this sense is not universal; it differs from community to community and from person to person, and may both develop and regress over the course of a person’s life or a community’s history.
  • So rationality itself, whether theoretical or practical, is a concept with a history: indeed, since there are also a diversity of traditions of enquiry, with histories, there are, so it will turn out, rationalities rather than rationality, just as it will also turn out that there are justices rather than justice
  • ...164 more annotations...
  • Rationality is the collection of theories, beliefs, principles, and facts that the human subject uses to judge the world, and a person’s rationality is, to a large extent, the product of that person’s education and moral formation.
  • To the extent that a person accepts what is handed down from the moral and intellectual traditions of her or his community in learning to judge truth and falsity, good and evil, that person’s rationality is “tradition-constituted.” Tradition-constituted rationality provides the schemata by which we interpret, understand, and judge the world we live in
  • The apparent problem of relativism in MacIntyre’s theory of rationality is much like the problem of relativism in the philosophy of science. Scientific claims develop within larger theoretical frameworks, so that the apparent truth of a scientific claim depends on one’s judgment of the larger framework. The resolution of the problem of relativism therefore appears to hang on the possibility of judging frameworks or rationalities, or judging between frameworks or rationalities from a position that does not presuppose the truth of the framework or rationality, but no such theoretical standpoint is humanly possible.
  • MacIntyre finds that the world itself provides the criterion for the testing of rationalities, and he finds that there is no criterion except the world itself that can stand as the measure of the truth of any philosophical theory.
  • MacIntyre’s philosophy is indebted to the philosophy of science, which recognizes the historicism of scientific enquiry even as it seeks a truthful understanding of the world. MacIntyre’s philosophy does not offer a priori certainty about any theory or principle; it examines the ways in which reflection upon experience supports, challenges, or falsifies theories that have appeared to be the best theories so far to the people who have accepted them so far. MacIntyre’s ideal enquirers remain Hamlets, not Emmas.
  • history shows us that individuals, communities, and even whole nations may commit themselves militantly over long periods of their histories to doctrines that their ideological adversaries find irrational. This qualified relativism of appearances has troublesome implications for anyone who believes that philosophical enquiry can easily provide certain knowledge of the world
  • According to MacIntyre, theories govern the ways that we interpret the world and no theory is ever more than “the best standards so far” (3RV, p. 65). Our theories always remain open to improvement, and when our theories change, the appearances of our world—the apparent truths of claims judged within those theoretical frameworks—change with them.
  • From the subjective standpoint of the human enquirer, MacIntyre finds that theories, concepts, and facts all have histories, and they are all liable to change—for better or for worse.
  • MacIntyre holds that the rationality of individuals is not only tradition-constituted, it is also tradition constitutive, as individuals make their own contributions to their own rationality, and to the rationalities of their communities. Rationality is not fixed, within either the history of a community or the life of a person
  • The modern account of first principles justifies an approach to philosophy that rejects tradition. The modern liberal individualist approach is anti-traditional. It denies that our understanding is tradition-constituted and it denies that different cultures may differ in their standards of rationality and justice:
  • Modernity does not see tradition as the key that unlocks moral and political understanding, but as a superfluous accumulation of opinions that tend to prejudice moral and political reasoning.
  • Although modernity rejects tradition as a method of moral and political enquiry, MacIntyre finds that it nevertheless bears all the characteristics of a moral and political tradition.
  • If historical narratives are only projections of the interests of historians, then it is difficult to see how this historical narrative can claim to be truthful
  • For these post-modern theorists, “if the Enlightenment conceptions of truth and rationality cannot be sustained,” either relativism or perspectivism “is the only possible alternative” (p. 353). MacIntyre rejects both challenges by developing his theory of tradition-constituted and tradition-constitutive rationality on pp. 354-369
  • How, then, is one to settle challenges between two traditions? It depends on whether the adherents of either take the challenges of the other tradition seriously. It depends on whether the adherents of either tradition, on seeing a failure in their own tradition, are willing to consider an answer offered by their rival (p. 355)
  • how a person with no traditional affiliation is to deal with the conflicting claims of rival traditions: “The initial answer is: that will depend upon who you are and how you understand yourself. This is not the kind of answer which we have been educated to expect in philosophy”
  • MacIntyre focuses the critique of modernity on the question of rational justification. Modern epistemology stands or falls on the possibility of Cartesian epistemological first principles. MacIntyre’s history exposes that notion of first principle as a fiction, and at the same time demonstrates that rational enquiry advances (or declines) only through tradition
  • MacIntyre cites Foucault’s 1966 book, Les Mots et les choses (The Order of Things, 1970) as an example of the self-subverting character of Genealogical enquiry
  • Foucault’s book reduces history to a procession of “incommensurable ordered schemes of classification and representation” none of which has any greater claim to truth than any other, yet this book “is itself organized as a scheme of classification and representation.”
  • From MacIntyre’s perspective, there is no question of deciding whether or not to work within a tradition; everyone who struggles with practical, moral, and political questions simply does. “There is no standing ground, no place for enquiry . . . apart from that which is provided by some particular tradition or other”
  • Three Rival Versions of Moral Enquiry (1990). The central idea of the Gifford Lectures is that philosophers make progress by addressing the shortcomings of traditional narratives about the world, shortcomings that become visible either through the failure of traditional narratives to make sense of experience, or through the introduction of contradictory narratives that prove impossible to dismiss
  • MacIntyre compares three traditions exemplified by three literary works published near the end of Adam Gifford’s life (1820–1887)
  • The Ninth Edition of the Encyclopaedia Britannica (1875–1889) represents the modern tradition of trying to understand the world objectively without the influence of tradition.
  • The Genealogy of Morals (1887), by Friedrich Nietzsche embodies the post-modern tradition of interpreting all traditions as arbitrary impositions of power.
  • The encyclical letter Aeterni Patris (1879) of Pope Leo XIII exemplifies the approach of acknowledging one’s predecessors within one’s own tradition of enquiry and working to advance or improve that tradition in the pursuit of objective truth. 
  • Of the three versions of moral enquiry treated in 3RV, only tradition, exemplified in 3RV by the Aristotelian, Thomistic tradition, understands itself as a tradition that looks backward to predecessors in order to understand present questions and move forward
  • Encyclopaedia obscures the role of tradition by presenting the most current conclusions and convictions of a tradition as if they had no history, and as if they represented the final discovery of unalterable truth
  • Encyclopaedists focus on the present and ignore the past.
  • Genealogists, on the other hand, focus on the past in order to undermine the claims of the present.
  • In short, Genealogy denies the teleology of human enquiry by denying (1) that historical enquiry has been fruitful, (2) that the enquiring person has a real identity, and (3) that enquiry has a real goal. MacIntyre finds this mode of enquiry incoherent.
  • Genealogy is self-deceiving insofar as it ignores the traditional and teleological character of its enquiry.
  • Genealogical moral enquiry must make similar exceptions to its treatments of the unity of the enquiring subject and the teleology of moral enquiry; thus “it seems to be the case that the intelligibility of genealogy requires beliefs and allegiances of a kind precluded by the genealogical stance” (3RV, p. 54-55)
  • MacIntyre uses Thomism because it applies the traditional mode of enquiry in a self-conscious manner. Thomistic students learn the work of philosophical enquiry as apprentices in a craft (3RV, p. 61), and maintain the principles of the tradition in their work to extend the understanding of the tradition, even as they remain open to the criticism of those principles.
  • 3RV uses Thomism as its example of tradition, but this use should not suggest that MacIntyre identifies “tradition” with Thomism or Thomism-as-a-name-for-the-Western-tradition. As noted above, WJWR distinguished four traditions of enquiry within the Western European world alone
  • MacIntyre’s emphasis on the temporality of rationality in traditional enquiry makes tradition incompatible with the epistemological projects of modern philosophy
  • Tradition is not merely conservative; it remains open to improvement,
  • Tradition differs from both encyclopaedia and genealogy in the way it understands the place of its theories in the history of human enquiry. The adherent of a tradition must understand that “the rationality of a craft is justified by its history so far,” thus it “is inseparable from the tradition through which it was achieved”
  • MacIntyre uses Thomas Aquinas to illustrate the revolutionary potential of traditional enquiry. Thomas was educated in Augustinian theology and Aristotelian philosophy, and through this education he began to see not only the contradictions between the two traditions, but also the strengths and weaknesses that each tradition revealed in the other. His education also helped him to discover a host of questions and problems that had to be answered and solved. Many of Thomas Aquinas’ responses to these concerns took the form of disputed questions. “Yet to each question the answer produced by Aquinas as a conclusion is no more than and, given Aquinas’s method, cannot but be no more than, the best answer reached so far. And hence derives the essential incompleteness”
  • argue that the virtues are essential to the practice of independent practical reason. The book is relentlessly practical; its arguments appeal only to experience and to purposes, and to the logic of practical reasoning.
  • Like other intelligent animals, human beings enter life vulnerable, weak, untrained, and unknowing, and face the likelihood of infirmity in sickness and in old age. Like other social animals, humans flourish in groups. We learn to regulate our passions, and to act effectively alone and in concert with others through an education provided within a community. MacIntyre’s position allows him to look to the animal world to find analogies to the role of social relationships in the moral formation of human beings
  • The task for the human child is to make “the transition from the infantile exercise of animal intelligence to the exercise of independent practical reasoning” (DRA, p. 87). For a child to make this transition is “to redirect and transform her or his desires, and subsequently to direct them consistently towards the goods of different stages of her or his life” (DRA, p. 87). The development of independent practical reason in the human agent requires the moral virtues in at least three ways.
  • DRA presents moral knowledge as a “knowing how,” rather than as a “knowing that.” Knowledge of moral rules is not sufficient for a moral life; prudence is required to enable the agent to apply the rules well.
  • “Knowing how to act virtuously always involves more than rule-following” (DRA, p. 93). The prudent person can judge what must be done in the absence of a rule and can also judge when general norms cannot be applied to particular cases.
  • Flourishing as an independent practical reasoner requires the virtues in a second way, simply because sometimes we need our friends to tell us who we really are. Independent practical reasoning also requires self-knowledge, but self-knowledge is impossible without the input of others whose judgment provides a reliable touchstone to test our beliefs about ourselves. Self-knowledge therefore requires the virtues that enable an agent to sustain formative relationships and to accept the criticism of trusted friends
  • Human flourishing requires the virtues in a third way, by making it possible to participate in social and political action. They enable us to “protect ourselves and others against neglect, defective sympathies, stupidity, acquisitiveness, and malice” (DRA, p. 98) by enabling us to form and sustain social relationships through which we may care for one another in our infirmities, and pursue common goods with and for the other members of our societies.
  • MacIntyre argues that it is impossible to find an external standpoint, because rational enquiry is an essentially social work (DRA, p. 156-7). Because it is social, shared rational enquiry requires moral commitment to, and practice of, the virtues to prevent the more complacent members of communities from closing off critical reflection upon “shared politically effective beliefs and concepts”
  • MacIntyre finds himself compelled to answer what may be called the question of moral provincialism: If one is to seek the truth about morality and justice, it seems necessary to “find a standpoint that is sufficiently external to the evaluative attitudes and practices that are to be put to the question.” If it is impossible for the agent to take such an external standpoint, if the agent’s commitments preclude radical criticism of the virtues of the community, does that leave the agent “a prisoner of shared prejudices” (DRA, p. 154)?
  • The book moves from MacIntyre’s assessment of human needs for the virtues to the political implications of that assessment. Social and political institutions that form and enable independent practical reasoning must “satisfy three conditions.” (1) They must enable their members to participate in shared deliberations about the communities’ actions. (2) They must establish norms of justice “consistent with exercise of” the virtue of justice. (3) They must enable the strong “to stand proxy” as advocates for the needs of the weak and the disabled.
  • The social and political institutions that MacIntyre recommends cannot be identified with the modern nation state or the modern nuclear family
  • The political structures necessary for human flourishing are essentially local
  • Yet local communities support human flourishing only when they actively support “the virtues of just generosity and shared deliberation”
  • MacIntyre rejects individualism and insists that we view human beings as members of communities who bear specific debts and responsibilities because of our social identities. The responsibilities one may inherit as a member of a community include debts to one’s forebears that one can only repay to people in the present and future
  • The constructive argument of the second half of the book begins with traditional accounts of the excellences or virtues of practical reasoning and practical rationality rather than virtues of moral reasoning or morality. These traditional accounts define virtue as aretē, as excellence
  • Practices are supported by institutions like chess clubs, hospitals, universities, industrial corporations, sports leagues, and political organizations.
  • Practices exist in tension with these institutions, since the institutions tend to be oriented to goods external to practices. Universities, hospitals, and scholarly societies may value prestige, profitability, or relations with political interest groups above excellence in the practices they are said to support.
  • Personal desires and institutional pressures to pursue external goods may threaten to derail practitioners’ pursuits of the goods internal to practices. MacIntyre defines virtue initially as the quality of character that enables an agent to overcome these temptations:
  • “A virtue is an acquired human quality the possession and exercise of which tends to enable us to achieve those goods which are internal to practices and the lack of which effectively prevents us from achieving any such goods” (AV, p. 191).
  • Excellence as a human agent cannot be reduced to excellence in a particular practice (See AV, pp. 204–)
  • The virtues therefore are to be understood as those dispositions which will not only sustain practices and enable us to achieve the goods internal to practices, but which will also sustain us in the relevant kind of quest for the good, by enabling us to overcome the harms, dangers, temptations, and distractions which we encounter, and which will furnish us with increasing self-knowledge and increasing knowledge of the good (AV, p. 219).
  • The excellent human agent has the moral qualities to seek what is good and best both in practices and in life as a whole.
  • The virtues find their point and purpose not only in sustaining those relationships necessary if the variety of goods internal to practices are to be achieved and not only in sustaining the form of an individual life in which that individual may seek out his or her good as the good of his or her whole life, but also in sustaining those traditions which provide both practices and individual lives with their necessary historical context (AV, p. 223)
  • Since “goods, and with them the only grounds for the authority of laws and virtues, can only be discovered by entering into those relationships which constitute communities whose central bond is a shared vision of and understanding of goods” (AV, p. 258), any hope for the transformation and renewal of society depends on the development and maintenance of such communities.
  • MacIntyre’s Aristotelian approach to ethics as a study of human action distinguishes him from post-Kantian moral philosophers who approach ethics as a means of determining the demands of objective, impersonal, universal morality
  • This modern approach may be described as moral epistemology. Modern moral philosophy pretends to free the individual to determine for her- or himself what she or he must do in a given situation, irrespective of her or his own desires; it pretends to give knowledge of universal moral laws
  • Aristotelian metaphysicians, particularly Thomists who define virtue in terms of the perfection of nature, rejected MacIntyre’s contention that an adequate Aristotelian account of virtue as excellence in practical reasoning and human action need not appeal to Aristotelian metaphysics
  • One group of critics rejects MacIntyre’s Aristotelianism because they hold that any Aristotelian account of the virtues must first account for the truth about virtue in terms of Aristotle’s philosophy of nature, which MacIntyre had dismissed in AV as “metaphysical biology”
  • Many of those who rejected MacIntyre’s turn to Aristotle define “virtue” primarily along moral lines, as obedience to law or adherence to some kind of natural norm. For these critics, “virtuous” appears synonymous with “morally correct;” their resistance to MacIntyre’s appeal to virtue stems from their difficulties either with what they take to be the shortcomings of MacIntyre’s account of moral correctness or with the notion of moral correctness altogether
  • MacIntyre continues to argue from the experience of practical reasoning to the demands of moral education.
  • Descartes and his successors, by contrast, along with certain “notable Thomists of the last hundred years” (p. 175), have proposed that philosophy begins from knowledge of some “set of necessarily true first principles which any truly rational person is able to evaluate as true” (p. 175). Thus for the moderns, philosophy is a technical rather than moral endeavor
  • MacIntyre distinguishes two related challenges to his position, the “relativist challenge” and the “perspectivist challenge.” These two challenges both acknowledge that the goals of the Enlightenment cannot be met and that “the only available standards of rationality are those made available by and within traditions” (p. 252); they conclude that nothing can be known to be true or false
  • MacIntyre follows the progress of the Western tradition through “three distinct traditions:” from Homer and Aristotle to Thomas Aquinas, from Augustine to Thomas Aquinas and from Augustine through Calvin to Hume
  • Chapter 17 examines the modern liberal denial of tradition, and the ironic transformation of liberalism into the fourth tradition to be treated in the book.
  • MacIntyre credits John Stuart Mill and Thomas Aquinas as “two philosophers of the kind who by their writing send us beyond philosophy into immediate encounter with the ends of life”
  • First, both were engaged by questions about the ends of life as questioning human beings and not just as philosophers. . . .
  • Secondly, both Mill and Aquinas understood their speaking and writing as contributing to an ongoing philosophical conversation. . . .
  • Thirdly, it matters that both the end of the conversation and the good of those who participate in it is truth and that the nature of truth, of good, of rational justification, and of meaning therefore have to be central topics of that conversation (Tasks, pp. 130-1).
  • Without these three characteristics, philosophy is first reduced to “the exercise of a set of analytic and argumentative skills. . . . Secondly, philosophy may thereby become a diversion from asking questions about the ends of life with any seriousness”
  • Neither Rosenzweig nor Lukács made philosophical progress because both failed to relate “their questions about the ends of life to the ends of their philosophical writing”
  • MacIntyre’s recognition of the connection between an author’s pursuit of the ends of life and the same author’s work as a philosophical writer prompts him to finish the essay by demanding three things of philosophical historians and biographers
  • First, any adequate philosophical history or biography must determine whether the authors studied remain engaged with the questions that philosophy studies, or set the questions aside in favor of the answers. Second, any adequate philosophical history or biography must determine whether the authors studied insulated themselves from contact with conflicting worldviews or remained open to learning from every available philosophical approach. Third, any adequate philosophical history or biography must place the authors studied into a broader context that shows what traditions they come from and “whose projects” they are “carrying forward”
  • Philosophy is not just a study; it is a practice. Excellence in this practice demands that an author bring her or his struggles with the questions of the ends of philosophy into dialogue with historic and contemporary texts and authors in the hope of making progress in answering those questions
  • MacIntyre defends Thomistic realism as rational enquiry directed to the discovery of truth.
  • The three Thomistic essays in this book challenge those caricatures by presenting Thomism in a way that people outside of contemporary Thomistic scholarship may find surprisingly flexible and open
  • To be a moral agent, (1) one must understand one’s individual identity as transcending all the roles that one fills; (2) one must see oneself as a practically rational individual who can judge and reject unjust social standards; and (3) one must understand oneself “as accountable to others in respect of the human virtues and not just in respect of [one’s] role-performances”
  • MacIntyre considers “the case of J” (J, for Jemand, the German word for “someone”), a train controller who learned, as a standard for his social role, to take no interest in what his trains carried, even during wartime when they carried “munitions and . . . Jews on their way to extermination camps”
  • J had learned to do his work for the railroad according to one set of standards and to live other parts of his life according to other standards, so that this compliant participant in “the final solution” could contend, “You cannot charge me with moral failure” (E&P, p. 187).
  • For MacIntyre, J is guilty because he complacently accepted social structures that he should have questioned, structures that undermined his moral agency. This essay shows that MacIntyre’s ethics of human agency is not just a descriptive narrative about the manner of moral education; it is a standard-laden account of the demands of moral agency.
  • The epistemological theories of modern moral philosophy were supposed to provide rational justification for rules, policies, and practical determinations according to abstract universal standards, but MacIntyre has dismissed those theories.
  • Modern metaethics is supposed to enable its practitioners to step away from the conflicting demands of contending moral traditions and to judge those conflicts from a neutral position, but MacIntyre has rejected this project as well
  • In his ethical writings, MacIntyre seeks only to understand how to liberate the human agent from blindness and stupidity, to prepare the human agent to recognize what is good and best to do in the concrete circumstances of that agent’s own life, and to strengthen the agent to follow through on that judgment.
  • In his political writings, MacIntyre investigates the role of communities in the formation of effective rational agents, and the impact of political institutions on the lives of communities. This kind of ethics and politics is appropriately named the ethics of human agency.
  • The purpose of the modern moral philosophy of authors like Kant and Mill was to determine, rationally and universally, what kinds of behavior ought to be performed—not in terms of the agent’s desires or goals, but in terms of universal, rational duties. Those theories purported to let agents know what they ought to do by providing knowledge of duties and obligations; thus they could be described as theories of moral epistemology.
  • Contemporary virtue ethics purports to let agents know what qualities human beings ought to have, and the reasons that we ought to have them, not in terms of our fitness for human agency, but in the same universal, disinterested, non-teleological terms that it inherits from Kant and Mill.
  • For MacIntyre, moral knowledge remains a “knowing how” rather than a “knowing that;” MacIntyre seeks to identify those moral and intellectual excellences that make human beings more effective in our pursuit of the human good.
  • MacIntyre’s purpose in his ethics of human agency is to consider what it means to seek one’s good, what it takes to pursue one’s good, and what kind of a person one must become if one wants to pursue that good effectively as a human agent.
  • As a philosophy of human agency, MacIntyre’s work belongs to the traditions of Aristotle and Thomas Aquinas.
  • In keeping with the insight of Marx’s third thesis on Feuerbach, MacIntyre’s work maintained the common condition of theorists and people as peers in the pursuit of the good life.
  • He holds that the human good plays a role in our practical reasoning whether we recognize it or not, so that some people may do well without understanding why (E&P, p. 25). He also reads Aristotle as teaching that knowledge of the good can make us better agents
  • AV defines virtue in terms of the practical requirements for excellence in human agency, in an agent’s participation in practices (AV, ch. 14), in an agent’s whole life, and in an agent’s involvement in the life of her or his community
  • MacIntyre’s Aristotelian concept of “human action” opposes the notion of “human behavior” that prevailed among mid-twentieth-century determinist social scientists. Human actions, as MacIntyre understands them, are acts freely chosen by human agents in order to accomplish goals that those agents pursue
  • Human behavior, according to mid-twentieth-century determinist social scientists, is the outward activity of a subject, which is said to be caused entirely by environmental influences beyond the control of the subject.
  • Rejecting crude determinism in social science, and approaches to government and public policy rooted in determinism, MacIntyre sees the renewal of human agency and the liberation of the human agent as central goals for ethics and politics.
  • MacIntyre’s Aristotelian account of “human action” examines the habits that an agent must develop in order to judge and act most effectively in the pursuit of truly choice-worthy ends
  • MacIntyre seeks to understand what it takes for the human person to become the kind of agent who has the practical wisdom to recognize what is good and best to do and the moral freedom to act on her or his best judgment.
  • MacIntyre rejected the determinism of modern social science early in his career (“Determinism,” 1957), yet he recognizes that the ability to judge well and act freely is not simply given; excellence in judgment and action must be developed, and it is the task of moral philosophy to discover how these excellences or virtues of the human agent are established, maintained, and strengthened
  • MacIntyre’s Aristotelian philosophy investigates the conditions that support free and deliberate human action in order to propose a path to the liberation of the human agent through participation in the life of a political community that seeks its common goods through the shared deliberation and action of its members
  • As a classics major at Queen Mary College in the University of London (1945-1949), MacIntyre read the Greek texts of Plato and Aristotle, but his studies were not limited to the grammars of ancient languages. He also examined the ethical theories of Immanuel Kant and John Stuart Mill. He attended the lectures of analytic philosopher A. J. Ayer and of philosopher of science Karl Popper. He read Ludwig Wittgenstein’s Tractatus Logico-Philosophicus, Jean-Paul Sartre’s L'existentialisme est un humanisme, and Marx’s Eighteenth Brumaire of Napoleon Bonaparte (What happened, pp. 17-18). MacIntyre met the sociologist Franz Steiner, who helped direct him toward approaching moralities substantively
  • Alasdair MacIntyre’s philosophy builds on an unusual foundation. His early life was shaped by two conflicting systems of values. One was “a Gaelic oral culture of farmers and fishermen, poets and storytellers.” The other was modernity: “The modern world was a culture of theories rather than stories” (MacIntyre Reader, p. 255). MacIntyre embraced both value systems
  • From Marxism, MacIntyre learned to see liberalism as a destructive ideology that undermines communities in the name of individual liberty and consequently undermines the moral formation of human agents
  • For MacIntyre, Marx’s way of seeing through the empty justifications of arbitrary choices to consider the real goals and consequences of political actions in economic and social terms would remain the principal insight of Marxism
  • Since his retirement from teaching, MacIntyre has continued his work of promoting a renewal of human agency through an examination of the virtues demanded by practices, integrated human lives, and responsible engagement with community life. He is currently affiliated with the Centre for Contemporary Aristotelian Studies in Ethics and Politics (CASEP) at London Metropolitan University.
  • The second half of AV proposes a conception of practice and practical reasoning and the notion of excellence as a human agent as an alternative to modern moral philosophy
  • AV rejects the view of “modern liberal individualism” in which autonomous individuals use abstract moral principles to determine what they ought to do. The critique of modern normative ethics in the first half of AV rejects modern moral reasoning for its failure to justify its premises, and criticizes the frequent use of the rhetoric of objective morality and scientific necessity to manipulate people to accept arbitrary decisions
  • MacIntyre uses “modern liberal individualism” to name a much broader category that includes both liberals and conservatives in contemporary American political parlance, as well as some Marxists and anarchists (See ASIA, pp. 280-284). Conservatism, liberalism, Marxism, and anarchism all present the autonomous individual as the unit of civil society
  • The sources of modern liberal individualism—Hobbes, Locke, and Rousseau—assert that human life is solitary by nature and social by habituation and convention. MacIntyre’s Aristotelian tradition holds, on the contrary, that human life is social by nature.
  • MacIntyre identifies moral excellence with effective human agency, and seeks a political environment that will help to liberate human agents to recognize and seek their own goods, as components of the common goods of their communities, more effectively. For MacIntyre therefore, ethics and politics are bound together.
  • For MacIntyre ethics is not an application of principles to facts, but a study of moral action. Moral action, free human action, involves decisions to do things in pursuit of goals, and it involves the understanding of the implications of one’s actions for the whole variety of goals that human agents seek
  • In this sense, “To act morally is to know how to act” (SMJ, p. 56). “Morality is not a ‘knowing that’ but a ‘knowing how’”
  • If human action is a ‘knowing how,’ then ethics must also consider how one learns ‘how.’ As with other forms of ‘knowing how,’ MacIntyre finds that one learns to act morally within a community whose language and shared standards shape our judgment
  • MacIntyre had concluded that ethics is not an abstract exercise in the assessment of facts; it is a study of free human action and of the conditions that enable rational human agency.
  • MacIntyre gives Marx credit for concluding, in the third of the Theses on Feuerbach, that the only way to change society is to change ourselves, and that “The coincidence of the changing of circumstances and of human activity or self-changing can only be comprehended and rationally understood as revolutionary practice”
  • MacIntyre distinguishes “religion which is an opiate for the people from religion which is not” (MI, p. 83). He condemns forms of religion that justify social inequities and encourage passivity. He argues that authentic Christian teaching criticizes social structures and encourages action
  • While “moral philosophy textbooks” discuss the kinds of maxims that should guide “promise-keeping, truth-telling, and the like,” MacIntyre holds that moral maxims do not guide real agents in real life at all. “They do not guide us because we do not need to be guided. We know what to do” (ASIA, p. 106). Sometimes we do this without any maxims at all, or even against all the maxims we know. MacIntyre illustrates his point with Huckleberry Finn’s decision to help Jim, Miss Watson’s escaped slave, to make his way to freedom
  • MacIntyre develops the ideas that morality emerges from history, and that morality organizes the common life of a community
  • The book concludes that the concepts of morality are neither timeless nor ahistorical, and that understanding the historical development of ethical concepts can liberate us “from any false absolutist claims” (SHE, p. 269). Yet this conclusion need not imply that morality is essentially arbitrary or that one could achieve freedom by liberating oneself from the morality of one’s society.
  • From this “Aristotelian point of view,” “modern morality” begins to go awry when moral norms are separated from the pursuit of human goods and moral behavior is treated as an end in itself. This separation characterizes Christian divine command ethics since the fourteenth century and has remained essential to secularized modern morality since the eighteenth century
  • From MacIntyre’s “Aristotelian point of view,” the autonomy granted to the human agent by modern moral philosophy breaks down natural human communities and isolates the individual from the kinds of formative relationships that are necessary to shape the agent into an independent practical reasoner.
  • the 1977 essay “Epistemological Crises, Dramatic Narrative, and the Philosophy of Science” (hereafter EC). This essay, MacIntyre reports, “marks a major turning-point in my thought in the 1970s” (The Tasks of Philosophy, p. vii). EC may be described fairly as MacIntyre’s discourse on method
  • it presents three general points on the method for philosophy.
  • First, philosophy makes progress through the resolution of problems. These problems arise when the theories, histories, doctrines and other narratives that help us to organize our experience of the world fail us, leaving us in “epistemological crises.” Epistemological crises are the aftermath of events that undermine the ways that we interpret our world
  • To live in an epistemological crisis is to be aware that one does not know what one thought one knew about some particular subject and to be anxious to recover certainty about that subject.
  • To resolve an epistemological crisis it is not enough to impose some new way of interpreting our experience; we also need to understand why we were wrong before: “When an epistemological crisis is resolved, it is by the construction of a new narrative which enables the agent to understand both how he or she could intelligibly have held his or her original beliefs and how he or she could have been so drastically misled by them”
  • To illustrate his position on the open-endedness of enquiry, MacIntyre compares the title characters of Shakespeare’s Hamlet and Jane Austen’s Emma. When Emma finds that she is deeply misled in her beliefs about the other characters in her story, Mr. Knightley helps her to learn the truth and the story comes to a happy ending (p. 6). Hamlet, by contrast, finds no pat answers to his questions; rival interpretations remain throughout the play, so that directors who would stage the play have to impose their own interpretations on the script
  • MacIntyre notes, “Philosophers have customarily been Emmas and not Hamlets” (p. 6); that is, philosophers have treated their conclusions as accomplished truths, rather than as “more adequate narratives” (p. 7) that remain open to further improvement.
  • Another approach to education is the method of Descartes, who begins by rejecting everything that is not clearly and distinctly true as unreliable and false in order to rebuild his understanding of the world on a foundation of undeniable truth.
  • Descartes presents himself as willfully rejecting everything he had believed, and ignores his obvious debts to the Scholastic tradition, even as he argues his case in French and Latin. For MacIntyre, seeking epistemological certainty through universal doubt as a precondition for enquiry is a mistake: “it is an invitation not to philosophy but to mental breakdown, or rather to philosophy as a means of mental breakdown.”
  • MacIntyre contrasts Descartes’ descent into mythical isolation with Galileo, who was able to make progress in astronomy and physics by struggling with the apparently insoluble questions of late medieval astronomy and physics, and radically reinterpreting the issues that constituted those questions
  • To make progress in philosophy one must sort through the narratives that inform one’s understanding, struggle with the questions that those narratives raise, and on occasion, reject, replace, or reinterpret portions of those narratives and propose those changes to the rest of one’s community for assessment. Human enquiry is always situated within the history and life of a community.
  • The third point of EC is that we can learn about progress in philosophy from the philosophy of science
  • Kuhn’s “paradigm shifts,” however, are unlike MacIntyre’s resolutions of epistemological crises in two ways.
  • First, they are not rational responses to specific problems. Kuhn compares paradigm shifts to religious conversions (pp. 150, 151, 158), stressing that they are not guided by rational norms, and he claims that the “mopping up” phase of a paradigm shift is a matter of convention in the training of new scientists and attrition among the holdouts of the previous paradigm
  • Second, the new paradigm is treated as a closed system of belief that regulates a new period of “normal science”; Kuhn’s revolutionary scientists are Emmas, not Hamlets
  • MacIntyre proposes elements of Imre Lakatos’ philosophy of science as correctives to Kuhn’s. While Lakatos has his own shortcomings, his general account of the methodologies of scientific research programs recognizes the role of reason in the transitions between theories and between research programs (Lakatos’ analog to Kuhn’s paradigms or disciplinary matrices). Lakatos presents science as an open-ended enquiry, in which every theory may eventually be replaced by more adequate theories. For Lakatos, unlike Kuhn, rational scientific progress occurs when a new theory can account both for the apparent promise and for the actual failure of the theory it replaces.
  • The third conclusion of MacIntyre’s essay is that decisions to support some theories over others may be justified rationally to the extent that those theories allow us to understand our experience and our history, including the history of the failures of inadequate theories
  • For Aristotle, moral philosophy is a study of practical reasoning, and the excellences or virtues that Aristotle recommends in the Nicomachean Ethics are the intellectual and moral excellences that make a moral agent effective as an independent practical reasoner.
  • AV begins by examining the current condition of secular moral and political discourse. MacIntyre finds contending parties defending their decisions by appealing to abstract moral principles, but he finds their appeals eclectic, inconsistent, and incoherent.
  • MacIntyre also finds that the contending parties have little interest in the rational justification of the principles they use. The language of moral philosophy has become a kind of moral rhetoric to be used to manipulate others in defense of the arbitrary choices of its users
  • The secular moral philosophers of the eighteenth and nineteenth centuries shared strong and extensive agreements about the content of morality (AV, p. 51) and believed that their moral philosophy could justify the demands of their morality rationally, free from religious authority.
  • MacIntyre traces the lineage of the culture of emotivism to the secularized Protestant cultures of northern Europe
  • Modern moral philosophy had thus set for itself an incoherent goal. It was to vindicate both the moral autonomy of the individual and the objectivity, necessity, and categorical character of the rules of morality
  • MacIntyre turns to an apparent alternative, the pragmatic expertise of professional managers. Managers are expected to appeal to the facts to make their decisions on the objective basis of effectiveness, and their authority to do this is based on their knowledge of the social sciences
  • An examination of the social sciences reveals, however, that many of the facts to which managers appeal depend on sociological theories that lack scientific status. Thus, the predictions and demands of bureaucratic managers are no less liable to ideological manipulation than the determinations of modern moral philosophers.
  • Modern moral philosophy separates moral reasoning about duties and obligations from practical reasoning about ends and practical deliberation about the means to one’s ends, and in doing so it separates morality from practice.
  • Many Europeans also lost the practical justifications for their moral norms as they approached modernity; for these Europeans, claiming that certain practices are “immoral,” and invoking Kant’s categorical imperative or Mill’s principle of utility to explain why those practices are immoral, seems no more adequate than the Polynesian appeal to taboo.
  • MacIntyre sifts these definitions and then gives his own definition of virtue, as excellence in human agency, in terms of practices, whole human lives, and traditions in chapters 14 and 15 of AV.
  • In the most often quoted sentence of AV, MacIntyre defines a practice as (1) a complex social activity that (2) enables participants to gain goods internal to the practice. (3) Participants achieve excellence in practices by gaining the internal goods. When participants achieve excellence, (4) the social understandings of excellence in the practice, of the goods of the practice, and of the possibility of achieving excellence in the practice “are systematically extended”
  • Practices, like chess, medicine, architecture, mechanical engineering, football, or politics, offer their practitioners a variety of goods both internal and external to these practices. The goods internal to practices include forms of understanding or physical abilities that can be acquired only by pursuing excellence in the associated practice
  • Goods external to practices include wealth, fame, prestige, and power; there are many ways to gain these external goods. They can be earned or purchased, either honestly or through deception; thus the pursuit of these external goods may conflict with the pursuit of the goods internal to practices.
  • An intelligent child is given the opportunity to win candy by learning to play chess. As long as the child plays chess only to win candy, he has every reason to cheat if by doing so he can win more candy. If the child begins to desire and pursue the goods internal to chess, however, cheating becomes irrational, because it is impossible to gain the goods internal to chess or any other practice except through an honest pursuit of excellence. Goods external to practices may nevertheless remain tempting to the practitioner.
  • Since MacIntyre finds social identity necessary for the individual, MacIntyre’s definition of the excellence or virtue of the human agent needs a social dimension.
  • These responsibilities also include debts incurred by the unjust actions of one’s predecessors.
  • The enslavement and oppression of black Americans, the subjugation of Ireland, and the genocide of the Jews in Europe remained quite relevant to the responsibilities of citizens of the United States, England, and Germany in 1981, as they still do today.
  • Thus an American who said “I never owned any slaves,” “the Englishman who says ‘I never did any wrong to Ireland,’” or “the young German who believes that being born after 1945 means that what Nazis did to Jews has no moral relevance to his relationship to his Jewish contemporaries” all exhibit a kind of intellectual and moral failure.
  • “I am born with a past, and to cut myself off from that past in the individualist mode, is to deform my present relationships” (p. 221).  For MacIntyre, there is no moral identity for the abstract individual; “The self has to find its moral identity in and through its membership in communities” (p. 221).
ilanaprincilus06

Humans need to become smarter thinkers to beat climate denial | Dana Nuccitelli | Envir... - 0 views

  • using ‘misconception-based learning’ to dislodge climate myths from people’s brains and replace them with facts, and beating denial by inoculating people against misinformers’ tricks.
  • The idea is that when people are faced with a myth and a competing fact, the fact will more easily win out if the fallacy underpinning the myth is revealed.
  • If people can learn to implement a simple six-step critical thinking process, they’ll be able to evaluate whether climate-related claims are valid (a rough code sketch of the process follows this list).
  • Identify the claim being made
  • the most popular contrarian argument: “Earth’s climate has changed naturally in the past, so current climate change is natural.”
  • Construct the argument by identifying the premises leading to that conclusion.
  • Determine whether the argument is deductive, meaning that it starts out with a general statement and reaches a definitive conclusion.
  • the first premise is that Earth’s climate has changed in the past through natural processes, and the second premise is that the climate is currently changing.
  • Not all climate change is equal
  • Identify hidden premises. By adding an extra premise to make an invalid argument valid, we can gain a deeper understanding of why the argument is flawed.
  • the hidden assumption is “if nature caused climate change in the past, it must always be the cause of climate change.”
  • Check to see if the argument relies on ambiguity.
  • Check the argument for validity; does the conclusion follow from the premises?
  • Therefore, human activity is necessary to explain current climate change.
  • If the argument hasn’t yet been ruled out, determine the truth of its premises.
  • the argument that “if something was the cause in the past, it will be the cause in the future” is invalid if the effect has multiple plausible causes or mechanisms
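
A minimal sketch of how the six-step check plays out on the “past natural change” argument above, assuming a toy Python representation; the Argument class, the premise strings, and the evaluate helper are illustrative inventions for this summary, not code from the article:

    from dataclasses import dataclass

    @dataclass
    class Argument:
        premises: list      # stated premises, plus any hidden premise made explicit
        conclusion: str

    def evaluate(arg, premise_is_true):
        # Steps 4-6: a deductive argument is sound only if it is valid and
        # every premise (including the formerly hidden one) is true.
        for p in arg.premises:
            if not premise_is_true(p):
                return "unsound: false premise -> " + p
        return "sound"

    contrarian = Argument(
        premises=[
            "Earth's climate has changed in the past through natural processes",
            "The climate is currently changing",
            # Step 5: the hidden premise, stated explicitly to make the argument valid
            "If nature caused climate change in the past, it must always be the cause",
        ],
        conclusion="Current climate change is natural",
    )

    # Step 6: the hidden premise is false, because an effect with multiple
    # plausible causes or mechanisms cannot be attributed to one by default.
    false_premises = {"If nature caused climate change in the past, it must always be the cause"}
    print(evaluate(contrarian, lambda p: p not in false_premises))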
knudsenlu

Imagining the Future Is Just Another Form of Memory - The Atlantic - 0 views

  • humans predict what the future will be like by using their memories
  • “When somebody’s preparing for a date with someone they’ve never been on a date with before, or a job interview—these situations where we don’t have past experience, that’s where we think this ability to imagine the future really matters,” says Karl Szpunar, a professor of psychology at the University of Illinois at Chicago. People “can take bits and pieces, like who’s going to be there, where it’s going to be, and try to put all that together into a novel simulation of events.”
  • The first clue that memory and imagining the future might go hand in hand came from amnesia patients. When they lost their pasts, it seemed, they lost their futures as well.
  • Since then, functional MRI scans have allowed researchers to determine that many of the same brain structures are indeed involved in both remembering and forecasting
  • And just as memories are more detailed the more recent they are, imagined future scenes are more detailed the nearer in the future they are
  • It’s not hard to see how this ability to imagine the future gives humans an evolutionary advantage. If you can plan for the future, you’re more likely to survive it. But there are limitations as well. Your accumulated experiences—and your cultural life script—are the only building blocks you have to construct a vision of the future. This can make it hard to expect the unexpected, and it means people often expect the future to be more like the past, or the present, than it will be.
  • There’s also an “optimistic, extreme positivity bias toward the future,” Bohn says. To the point that people “always say future events are more important to their identity and life story than the past events.” Talk about being nostalgic for the future.
Javier E

The Brain Has a Special Kind of Memory for Past Infections - Scientific American - 0 views

  • immune cells from the periphery routinely patrol the central nervous system and support its function. In a new study, researchers showed for the first time that—just as the brain remembers people, places, smells, and so on—it also stores what they call “memory traces” of the body’s past infections. Reactivating the same brain cells that encode this information is enough to swiftly summon the peripheral immune system to defend at-risk tissues.
  • It is clear the peripheral immune system is capable of retaining information about past infections to fight off future ones—otherwise, vaccines would not work. But Asya Rolls, a neuroimmunologist at Technion–Israel Institute of Technology and the paper’s senior author, says the study expands this concept of classical immunologic memory. Initially, she was taken aback that the brain could store traces of immune activity and use them to trigger such a precise response. “I was amazed,” she says.
  • After the infection and immune response dissipated, the researchers injected the mice with a drug that artificially reactivated those same groups of brain cells. They were stunned by what they saw: upon reactivation, the insular cortex directed the immune system to mount a targeted response in the gut at the site of the original inflammation—even though, by that time, there was no infection, tissue damage or pathogen-initiated local inflammation to be found. The brain had retained some sort of memory of the infection and was prepared to reinitiate the fight.
  • The new study provides “unassailable” evidence that the central nervous system can control the peripheral immune system, Tracey says. “It’s an incredibly important contribution to the fields of neuroscience and immunology.”
  • Just as researchers have traced sensory and motor processing to specific brain regions, Tracey suspects that a similar neurological “map” of immunologic information also exists. This new study, he says, is the first direct evidence of that map. “It’s going to be really exciting to see what comes next,” he adds.
lenaurick

Why time seems to speed up as we get older - Vox - 0 views

  • As part of a lifelong experiment on circadian rhythms, Sothern, now 69, is trying to confirm or reject a widely held belief: Many people feel that time flies by more quickly as they age.
  • So far, Sothern's results are inconclusive
  • "I'm tending now to overestimate the minute more than I used to," he tells me. But then again, he had detected a similar pattern — more overestimates — in the 1990s, only to have his estimates fall in the 2000s. "Time estimation isn't a perfect science," he says.
  • There's very little scientific evidence to suggest our perception of time changes as we age. And yet, we consistently report that the past felt longer — that time is flying by faster as we age. What's going on?
  • Scientists can look at time estimation, or our ability to estimate how long a minute passes, compared with a clock. (This is what Sothern is doing.) They can also look at time awareness, or the broad feeling that time is moving quickly or slowly. Finally there's time perspective, the sense of a past, present, and future as constructed by our memories.
  • What researchers have found out is that while time estimation and time awareness don't change much as we age, time perspective does. In other words: Our memories create the illusion time is accelerating.
  • There weren't many differences between the old and the young. "[C]hronological age showed no systematic influence on the perception of these brief intervals of time up," the authors wrote. (That said, the researchers did find that males overestimate time while females underestimate it, perhaps due to having slightly different circadian clocks and therefore slightly different metabolic rates
  • Here, too, age seemed not to matter. Older people didn't seem to be aware of time passing any faster than younger people. The only question that yielded a statistically significant difference was, "How fast did the last decade pass?" Even there, the reported differences were tiny, and the effect appeared to plateau around age 50.
  • psychologists William Friedman and Steve Janssen found scant evidence that the subjective experience of time speeds up with age. They write in their 2009 paper, "We can conclude that when adults report on their general impressions of the speed of time, age differences are very small."
  • One possibility is that participants were simply biased by the (incorrect) conventional wisdom — they reported their later years as flying by more quickly because that's what everyday lore says should happen.
  • When people reflect back on their own life, they feel like their early years went by very slowly and their later years go by more quickly. This could be the source of the belief that time goes more quickly as they age.
  •  "Most people feel that time is currently passing faster for them than it did in the past," Janssen writes me in an email. "They have forgotten how they experienced the passage of time when they were younger."
  • We use significant events as signposts to gauge the passage of time. The fewer events, the faster time seems to go by.
  • Childhood is full of big, memorable moments like learning to ride a bike or making first friends. By contrast, adult life becomes ordinary and mechanized, and ambles along.
  • Each passing year converts some of this experience into automatic routine which we hardly notice at all, the days and weeks smooth themselves out in recollection, and the years grow hollow and collapse.
  • Each new minute represents a smaller fraction of our lives. One day as a 10-year-old represents about .027 percent of the kid's life. A day for a 60-year-old? .0045 percent. The kid's life is just... bigger. (A quick check of this arithmetic follows the list.)
  • Also, our ability to recall events declines with age. If we can't remember a time, it didn't happen.
  • "[F]inding that there is insufficient time to get things done may be reinterpreted as the feeling that time is passing quickly," they write. Deadlines always come sooner than we'd like.
  • Psychologists have long understood the phenomenon called "forward telescoping" — i.e., our tendency to underestimate how long ago very memorable events occurred. "Because we know that memories fade over time, we use the clarity of a memory as a guide to its recency," science writer Claudia Hammond writes in her book Time Warped. "So if a memory seems unclear we assumed it happened longer ago." But very clear memories are assumed to be more recent.
  • If our memories can trick us into thinking time is moving quickly, then maybe there are ways to trick our brains into thinking that time is slowing down — such as committing to breaking routines and learning new things. You're more likely to remember learning how to skydive than watching another hour of mindless television.
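
A quick back-of-envelope check of the proportion claim flagged in the list above (assuming flat 365-day years and exact ages; the article's .0045 figure truncates where this rounds):

    # One day as a share of a life lived so far, ignoring leap years
    for age in (10, 60):
        share = 1 / (age * 365) * 100
        print("One day at age %d: %.4f%% of life so far" % (age, share))
    # One day at age 10: 0.0274% of life so far
    # One day at age 60: 0.0046% of life so far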
Javier E

What We Are Hearing About Clinton and Trump - The New York Times - 0 views

  • With a few exceptions, namely Mr. Trump’s views on immigration, Americans have little recall of reading, hearing or seeing information about the policies of the presidential candidates or their positions on issues. Our research shows instead that in the case of Mr. Trump, Americans monitor his statements, his accusations, his travel and his events, and in the case of Mrs. Clinton they report mainly hearing about her past behavior, her character and, most recently, her health.
  • The continuing research, conducted by Gallup in conjunction with the University of Michigan and Georgetown, found that since early July more than seven in 10 Americans read, saw or heard something about at least one of the presidential candidates in the days before the daily interviews. On some days that number rises to over 80 percent and has never, even on weekends, fallen below 60 percent
  • Since July we have asked more than 30,000 Americans to say exactly what it was they read, saw or heard about the two major party candidates over the past several days. The type of information getting through to Americans varies significantly depending on whether the candidate in question is Mr. Trump or Mrs. Clinton
  • If Mr. Trump talks about Muslim parents and their son who was killed in action, that’s what the public remembers. If he goes to Mexico or Louisiana, that’s what they recall reading or hearing about him. If Mr. Trump calls President Obama the founder of the Islamic State, “ISIS” moves to the top of the list of what Americans tell us they are hearing about the Republican candidate.
  • it may not matter exactly why Americans are so likely to recall reading about Mrs. Clinton’s email situation week after week. Its looming prominence in the public’s mind has become a reality, and it has the effect of superseding public awareness of her policy speeches and statements about issues.
  • we can assume he wants his statements and actions to be seen and heard, to attract attention. The evidence is clear that they are. The public may be getting no more than a superficial understanding of Mr. Trump’s positions on key issues or how he would implement them as laws if he is elected, but the public clearly is repeating back to us what he intends for it to hear.
  • By contrast, it’s clear that Mrs. Clinton and her campaign team have not wanted her handling of emails to dominate what Americans have been taking away from her campaign over the past two months
  • What Americans recall hearing about Mrs. Clinton is significantly less varied. Specifically — and to an extraordinary degree — Americans have consistently told us that they are reading and hearing about her handling of emails
  • the public may be learning about the candidates’ temperament, character, personality and health issues, but from what they tell us, Americans aren’t getting much in the way of real substance.
  • The moderators of the coming series of debates will most likely focus directly on the candidates’ positions on issues. This may shift what Americans tell us they are learning about the candidates, and if so, it could signal a significant upgrade in the way the process is working. But that also means that a lot still depends on the candidates themselves and how they end up shaping the contours of the debates
sissij

The Future of Privacy - The New York Times - 0 views

  • Turning Point: Apple resists the F.B.I. in unlocking an iPhone in the San Bernardino terrorism case.
  • Privacy confuses me, beyond my simplest understanding, which is that individuals prefer, to different degrees, that information about them not be freely available to others.
  • What does it mean, in an ostensible democracy, for the state to keep secrets from its citizens? The idea of the secret state seems antithetical to democracy, since its citizens, the voters, can’t know what their government is doing.
  • If you have nothing to hide and you trust your government, what can you possibly have to fear? Except that one can just as readily ask: If you have nothing to hide, what do you really have, aside from the panoptic attention of a state, which itself keeps secrets?
  • Are individual privacy and state privacy the same thing?
  • In the short term, the span of a lifetime, many of us would argue for privacy, and therefore against transparency.
  • But history, the long term, is transparency; it is the absence of secrets.
  • The past, our own past, which our descendants will see us as having emerged from, will not be the past from which we now see ourselves emerging, but a reinterpretation of it, based on subsequently available information, greater transparency and fewer secrets.
  • our species is the poorer for every secret faithfully kept. Any permanently unbreakable encryption seems counter to that.
  • So perhaps that desire is as much a part of us, as a species, as our need to build these memory palaces.
  • I found this article very interesting because it talked about the dilemma in the definition of privacy. Our idea of privacy is complicated: everybody wants to keep their own secrets, yet everyone is curious about others' secrets at the same time. The article also relates privacy to democratic society. I think it shows one of the weaknesses of a democratic society: it tries to fulfill the desires of everybody, and sometimes those desires contradict each other. I think privacy is just like freedom, something very theoretical that does not exist in reality. --Sissi (12/7/2016)
kushnerha

Our Natural History, Endangered - The New York Times - 0 views

  • Worse, this rumored dustiness reinforces the widespread notion that natural history museums are about the past — just a place to display bugs and brontosaurs. Visitors may go there to be entertained, or even awe-struck, but they are often completely unaware that curators behind the scenes are conducting research into climate change, species extinction and other pressing concerns of our day. That lack of awareness is one reason these museums are now routinely being pushed to the brink. Even the National Science Foundation, long a stalwart of federal support for these museums, announced this month that it was suspending funding for natural history collections as it conducts a yearlong budget review.
  • It gets worse: A new Republican governor last year shut down the renowned Illinois State Museum, ostensibly to save the state $4.8 million a year. The museum pointed out that this would actually cost $33 million a year in lost tourism revenue and an untold amount in grants. But the closing went through, endangering a trove of 10 million artifacts, from mastodon bones to Native American tools, collected over 138 years, and now just languishing in the shuttered building. Eric Grimm, the museum’s director of science, characterized it as an act of “political corruption and malevolent anti-intellectualism.”
  • Other museums have survived by shifting their focus from research to something like entertainment.
  • The pandering can be insidious, too. The Perot Museum of Nature and Science in Dallas, which treats visitors to a virtual ride down a hydraulic fracturing well, recently made headlines for avoiding explicit references to climate change. Other museums omit scientific information on evolution. “We don’t need people to come in here and reject us,”
  • Even the best natural history museums have been obliged to reduce their scientific staff in the face of government cutbacks and the decline in donations following the 2008 economic crash. They still have their collections, and their public still comes through the door. But they no longer employ enough scientists to interpret those collections adequately for visitors or the world at large. Hence the journal Nature last year characterized natural history collections as “the endangered dead.”
  • these collections are less about the past than about our world and how it is changing. Sediment cores like the ones at the Illinois State Museum, for instance, may not sound terribly important, but the pollen in them reveals how past climates changed, what species lived and died as a result, and thus how our own future may be rapidly unfolding.
  • Natural history museums are so focused on the future that they have for centuries routinely preserved such specimens to answer questions they didn’t yet know how to ask, requiring methodologies that had not yet been invented, to make discoveries that would have been, for the original collectors, inconceivable.
  • The people who first put gigantic mammoth and mastodon specimens in museums, for instance, did so mainly out of dumb wonderment. But those specimens soon led to the stunning 18th-century recognition that parts of God’s creation could become extinct. The heretical idea of extinction then became an essential preamble to Darwin, whose understanding of evolution by natural selection depended in turn on the detailed study of barnacle specimens collected and preserved over long periods and for no particular reason. Today, those same specimens continue to answer new questions with the help of genome sequencing, CT scans, stable isotope analysis and other technologies.
  • These museums also play a critical role in protecting what’s left of the natural world, in part because they often combine biological and botanical knowledge with broad anthropological experience.
  • “You have no nationality. You are scientists. You speak for nature.” Just since 1999, according to the Field Museum, inventories by its curators and their collaborators have been a key factor in the protection of 26.6 million acres of wilderness, mainly in the headwaters of the Amazon.
  • It may be optimistic to say that natural history museums have saved the world. It may even be too late for that. But they provide one other critical service that can save us, and our sense of wonder: Almost everybody in this country — even children in Denver who have never been to the Rocky Mountains, or people in San Francisco who have never walked on a Pacific Ocean beach — goes to a natural history museum at some point in his life, and these visits influence us in deep and unpredictable ways.
  • we dimly begin to understand the passage of time and cultures, and how our own species fits amid millions of others. We start to understand the strangeness and splendor of the only planet where we will ever have the great pleasure of living.