
Home/ TOK@ISPrague/ Group items tagged standards


markfrankel18

When are you dead? - 2011 SPRING - Stanford Medicine Magazine - Stanford University Sch... - 0 views

  • A little more than 40 years ago, a partially functioning brain would not have gotten in the way of organ donation; irreversible cardiopulmonary failure was still the only standard for determining death. But during the 1970s, that began to change, and by the early 1980s, the cessation of all brain activity — brain death — had become a widely accepted standard. In the transplant community, brain death was attractive for one particular reason: The bodies of such donors could remain on respirators to keep their organs healthy, even during much of the organ-removal surgery. Today, the medical establishment, facing a huge shortage of organs, needs new sources for transplantation. One solution has been a return to procuring organs from patients who die of heart failure. Before dying, these patients are likely to have been in a coma, sustained by a ventilator, with very minimal brain function — a hopeless distance from what we mean by consciousness. Still, many people, including some physicians, consider this type of organ donation, known as “donation after cardiac death” or DCD, as akin to murder.
Lawrence Hrubes

Brain Games are Bogus | GarethCook - 0 views

  • "A pair of scientists in Europe recently gathered all of the best research (twenty-three investigations of memory training by teams around the world) and employed a standard statistical technique (called meta-analysis) to settle this controversial issue. The conclusion: the games may yield improvements in the narrow task being trained, but this does not transfer to broader skills like the ability to read or do arithmetic, or to other measures of intelligence."
Lawrence Hrubes

Executing Them Softly - NYTimes.com - 0 views

  • Since the late 19th century in the United States, critical responses to the spectacle of pain in executions have continued to spur ardent calls for the improvement of killing technology. One of the most prolific legal theorists of capital punishment, Austin Sarat, has concisely referred to this history: “The movement from hanging to electrocution, from electrocution to the gas chamber, from gas to lethal injection, reads like someone’s version of the triumph of progress, with each new technique enthusiastically embraced as the latest and best way to kill without imposing pain.” Recent debates over the administration of midazolam and pentobarbital, and in what dosage, seamlessly integrate themselves into Sarat’s grim progress narrative. The inexhaustible impulse to seek out less painful killing technologies puts a series of questions in sharp relief: What is, and should be, the role of pain in retributive justice? And how has the law come to rationalize the condemned’s experience of pain during an execution? While the Eighth Amendment stipulates the necessity of avoiding “cruel and unusual punishment,” in 1890 the Supreme Court decided this clause could mean that no method of execution should impose “something more than the mere extinguishment of life.” And then, in 1958, the court also determined that the amendment should reflect the “evolving standards of decency that mark the progress of a maturing society.” If we were to consider the “standard of decency” in our society today, we would be pushed to ask: By what moral order have we continued to establish the “extinguishment of life” as something “mere,” and the pain of the condemned as excessive? In other words, how has the pain experienced during an execution become considered cruel and unconstitutional but not the very act of killing itself? We should dial back to older histories of law to tap into pain’s perennially vexed role in retributive theories of justice.
markfrankel18

The Moral Instinct - New York Times - 3 views

  • It seems we may all be vulnerable to moral illusions, the ethical equivalent of the bending lines that trick the eye on cereal boxes and in psychology textbooks. Illusions are a favorite tool of perception scientists for exposing the workings of the five senses, and of philosophers for shaking people out of the naïve belief that our minds give us a transparent window onto the world (since if our eyes can be fooled by an illusion, why should we trust them at other times?). Today, a new field is using illusions to unmask a sixth sense, the moral sense.
  • The first hallmark of moralization is that the rules it invokes are felt to be universal. Prohibitions of rape and murder, for example, are felt not to be matters of local custom but to be universally and objectively warranted. One can easily say, “I don’t like brussels sprouts, but I don’t care if you eat them,” but no one would say, “I don’t like killing, but I don’t care if you murder someone.” The other hallmark is that people feel that those who commit immoral acts deserve to be punished.
  • Until recently, it was understood that some people didn’t enjoy smoking or avoided it because it was hazardous to their health. But with the discovery of the harmful effects of secondhand smoke, smoking is now treated as immoral. Smokers are ostracized; images of people smoking are censored; and entities touched by smoke are felt to be contaminated (so hotels have not only nonsmoking rooms but nonsmoking floors). The desire for retribution has been visited on tobacco companies, who have been slapped with staggering “punitive damages.” At the same time, many behaviors have been amoralized, switched from moral failings to lifestyle choices.
  • ...10 more annotations...
  • But whether an activity flips our mental switches to the “moral” setting isn’t just a matter of how much harm it does. We don’t show contempt to the man who fails to change the batteries in his smoke alarms or takes his family on a driving vacation, both of which multiply the risk they will die in an accident. Driving a gas-guzzling Hummer is reprehensible, but driving a gas-guzzling old Volvo is not; eating a Big Mac is unconscionable, but not imported cheese or crème brûlée. The reason for these double standards is obvious: people tend to align their moralization with their own lifestyles.
  • People don’t generally engage in moral reasoning, Haidt argues, but moral rationalization: they begin with the conclusion, coughed up by an unconscious emotion, and then work backward to a plausible justification.
  • Together, the findings corroborate Greene’s theory that our nonutilitarian intuitions come from the victory of an emotional impulse over a cost-benefit analysis.
  • The psychologist Philip Tetlock has shown that the mentality of taboo — a conviction that some thoughts are sinful to think — is not just a superstition of Polynesians but a mind-set that can easily be triggered in college-educated Americans. Just ask them to think about applying the sphere of reciprocity to relationships customarily governed by community or authority. When Tetlock asked subjects for their opinions on whether adoption agencies should place children with the couples willing to pay the most, whether people should have the right to sell their organs and whether they should be able to buy their way out of jury duty, the subjects not only disagreed but felt personally insulted and were outraged that anyone would raise the question.
  • The moral sense, then, may be rooted in the design of the normal human brain. Yet for all the awe that may fill our minds when we reflect on an innate moral law within, the idea is at best incomplete. Consider this moral dilemma: A runaway trolley is about to kill a schoolteacher. You can divert the trolley onto a sidetrack, but the trolley would trip a switch sending a signal to a class of 6-year-olds, giving them permission to name a teddy bear Muhammad. Is it permissible to pull the lever? This is no joke. Last month a British woman teaching in a private school in Sudan allowed her class to name a teddy bear after the most popular boy in the class, who bore the name of the founder of Islam. She was jailed for blasphemy and threatened with a public flogging, while a mob outside the prison demanded her death. To the protesters, the woman’s life clearly had less value than maximizing the dignity of their religion, and their judgment on whether it is right to divert the hypothetical trolley would have differed from ours. Whatever grammar guides people’s moral judgments can’t be all that universal. Anyone who stayed awake through Anthropology 101 can offer many other examples.
  • The impulse to avoid harm, which gives trolley ponderers the willies when they consider throwing a man off a bridge, can also be found in rhesus monkeys, who go hungry rather than pull a chain that delivers food to them and a shock to another monkey. Respect for authority is clearly related to the pecking orders of dominance and appeasement that are widespread in the animal kingdom. The purity-defilement contrast taps the emotion of disgust that is triggered by potential disease vectors like bodily effluvia, decaying flesh and unconventional forms of meat, and by risky sexual practices like incest.
  • All this brings us to a theory of how the moral sense can be universal and variable at the same time. The five moral spheres are universal, a legacy of evolution. But how they are ranked in importance, and which is brought in to moralize which area of social life — sex, government, commerce, religion, diet and so on — depends on the culture.
  • By analogy, we are born with a universal moral grammar that forces us to analyze human action in terms of its moral structure, with just as little awareness. The idea that the moral sense is an innate part of human nature is not far-fetched. A list of human universals collected by the anthropologist Donald E. Brown includes many moral concepts and emotions, including a distinction between right and wrong; empathy; fairness; admiration of generosity; rights and obligations; proscription of murder, rape and other forms of violence; redress of wrongs; sanctions for wrongs against the community; shame; and taboos.
  • Here is the worry. The scientific outlook has taught us that some parts of our subjective experience are products of our biological makeup and have no objective counterpart in the world. The qualitative difference between red and green, the tastiness of fruit and foulness of carrion, the scariness of heights and prettiness of flowers are design features of our common nervous system, and if our species had evolved in a different ecosystem or if we were missing a few genes, our reactions could go the other way. Now, if the distinction between right and wrong is also a product of brain wiring, why should we believe it is any more real than the distinction between red and green? And if it is just a collective hallucination, how could we argue that evils like genocide and slavery are wrong for everyone, rather than just distasteful to us?
  • Putting God in charge of morality is one way to solve the problem, of course, but Plato made short work of it 2,400 years ago. Does God have a good reason for designating certain acts as moral and others as immoral? If not — if his dictates are divine whims — why should we take them seriously? Suppose that God commanded us to torture a child. Would that make it all right, or would some other standard give us reasons to resist? And if, on the other hand, God was forced by moral reasons to issue some dictates and not others — if a command to torture a child was never an option — then why not appeal to those reasons directly?
markfrankel18

The Price of Denialism - NYTimes.com - 1 views

  • In other words, we need to be able to tell when we believe or disbelieve in something based on high standards of evidence and when we are just engaging in a bit of motivated reasoning and letting our opinions take over. When we withhold belief because the evidence does not live up to the standards of science, we are skeptical. When we refuse to believe something, even in the face of what most others would take to be compelling evidence, we are engaging in denial. In most cases, we do this because at some level it upsets us to think that the theory is true.
  • So how to tell a fact from an opinion? By the time we sit down to evaluate the evidence for a scientific theory, it is probably too late. If we take the easy path in our thinking, it eventually becomes a habit. If we lie to others, sooner or later we may believe the lie ourselves. The real battle comes in training ourselves to embrace the right attitudes about belief formation in the first place, and for this we need to do a little philosophy.
markfrankel18

Why Americans Are the Weirdest People in the World - 1 views

  • Henrich’s work with the ultimatum game was an example of a small but growing countertrend in the social sciences, one in which researchers look straight at the question of how deeply culture shapes human cognition. His new colleagues in the psychology department, Heine and Norenzayan, were also part of this trend. Heine focused on the different ways people in Western and Eastern cultures perceived the world, reasoned, and understood themselves in relationship to others. Norenzayan’s research focused on the ways religious belief influenced bonding and behavior. The three began to compile examples of cross-cultural research that, like Henrich’s work with the Machiguenga, challenged long-held assumptions of human psychological universality.
  • As Heine, Norenzayan, and Henrich furthered their search, they began to find research suggesting wide cultural differences almost everywhere they looked: in spatial reasoning, the way we infer the motivations of others, categorization, moral reasoning, the boundaries between the self and others, and other arenas. These differences, they believed, were not genetic. The distinct ways Americans and Machiguengans played the ultimatum game, for instance, weren’t because they had differently evolved brains. Rather, Americans, without fully realizing it, were manifesting a psychological tendency shared with people in other industrialized countries that had been refined and handed down through thousands of generations in ever more complex market economies. When people are constantly doing business with strangers, it helps when they have the desire to go out of their way (with a lawsuit, a call to the Better Business Bureau, or a bad Yelp review) when they feel cheated. Because Machiguengan culture had a different history, their gut feeling about what was fair was distinctly their own. In the small-scale societies with a strong culture of gift-giving, yet another conception of fairness prevailed. There, generous financial offers were turned down because people’s minds had been shaped by a cultural norm that taught them that the acceptance of generous gifts brought burdensome obligations. Our economies hadn’t been shaped by our sense of fairness; it was the other way around.
  • Studies show that Western urban children grow up so closed off in man-made environments that their brains never form a deep or complex connection to the natural world. While studying children from the U.S., researchers have suggested a developmental timeline for what is called “folkbiological reasoning.” These studies posit that it is not until children are around 7 years old that they stop projecting human qualities onto animals and begin to understand that humans are one animal among many. Compared to Yucatec Maya communities in Mexico, however, Western urban children appear to be developmentally delayed in this regard. Children who grow up constantly interacting with the natural world are much less likely to anthropomorphize other living things into late childhood.
Lawrence Hrubes

You're an Adult. Your Brain, Not So Much. - The New York Times - 0 views

  • Leah H. Somerville, a Harvard neuroscientist, sometimes finds herself in front of an audience of judges. They come to hear her speak about how the brain develops. It’s a subject on which many legal questions depend. How old does someone have to be to be sentenced to death? When should someone get to vote? Can an 18-year-old give informed consent?
  • Eventually this reshaping slows, a sign that the brain is maturing. But it happens at different rates in different parts of the brain. The pruning in the occipital lobe, at the back of the brain, tapers off by age 20. In the frontal lobe, in the front of the brain, new links are still forming at age 30, if not beyond. “It challenges the notion of what ‘done’ really means,” Dr. Somerville said.
Lawrence Hrubes

The Responsibility of Knowledge: Developing Holocaust Education for the Third... - 2 views

  • In a radio address in 1966 the prominent German philosopher, Theodor Adorno, declared his dissatisfaction with the state of Holocaust consciousness. He claimed that ignorance of the barbarity of the Holocaust is “itself a symptom of the continuing potential for its recurrence as far as peoples’ conscious and unconscious is concerned.” (Adorno, Education After Auschwitz). It is for this reason that he envisioned education as the institution which would be most responsible for instilling values in the masses so that they have the agency to oppose barbarism. Adorno spoke not only of education in childhood, but “then the general enlightenment that provides an intellectual, cultural, and social climate in which a recurrence would no longer be possible.” Almost 40 years later, Holocaust education is still important, not only to combat another genocide but also to provide a consciousness of human rights necessary in a world where such standards are becoming commonplace. Holocaust education is in a state of constant evolution. As generations grow up and new ones are born, as distance from the Holocaust increases, it is necessary to reform the methods by which its history is taught. As survivors die and the third generation slowly drifts out of the Holocaust’s shadow, education must be buttressed with an understanding of the applicable lessons and principles that may derive from the Holocaust. For this education to have any meaning, those mechanisms that allowed the Holocaust to take place must be fully understood. History must empower pupils with the understanding of various choices they must make and their ultimate impact on society.
Lawrence Hrubes

Muslim Boys at a Swiss School Must Shake Teachers' Hands, Even Female Ones - The New Yo... - 0 views

  • When two Syrian immigrant brothers refused to shake their female teacher’s hand this year, it ignited national outrage in Switzerland. This week, the authorities in the northern canton of Basel-Landschaft ruled that the students, who studied at a public school in the small town of Therwil, could not refuse to shake their teacher’s hand on religious grounds. They said that parents whose children refused to obey the longstanding tradition could be fined up to 5,000 Swiss francs, about $5,050. Shaking a teacher’s hand before and after class is part of Switzerland’s social fabric, and is considered an important sign of politeness and respect.
Lawrence Hrubes

Do Honeybees Feel? Scientists Are Entertaining the Idea - The New York Times - 0 views

  • Bees find nectar and tell their hive-mates; flies evade the swatter; and cockroaches seem to do whatever they like wherever they like. But who would believe that insects are conscious, that they are aware of what’s going on, not just little biobots? Neuroscientists and philosophers apparently. As scientists lean increasingly toward recognizing that nonhuman animals are conscious in one way or another, the question becomes: Where does consciousness end? Andrew B. Barron, a cognitive scientist, and Colin Klein, a philosopher, at Macquarie University in Sydney, Australia, propose in Proceedings of the National Academy of Sciences that insects have the capacity for consciousness.
Lawrence Hrubes

A New Form of Stem-Cell Engineering Raises Ethical Questions - The New York Times - 0 views

  • As biological research races forward, ethical quandaries are piling up. In a report published Tuesday in the journal eLife, researchers at Harvard Medical School said it was time to ponder a startling new prospect: synthetic embryos. In recent years, scientists have moved beyond in vitro fertilization. They are starting to assemble stem cells that can organize themselves into embryolike structures. Soon, experts predict, they will learn how to engineer these cells into new kinds of tissues and organs. Eventually, they may take on features of a mature human being.
markfrankel18

Scientific Pride and Prejudice - NYTimes.com - 0 views

  • The natural sciences often offer themselves as a model to other disciplines. But this time science might look for help to the humanities, and to literary criticism in particular. A major root of the crisis is selective use of data. Scientists, eager to make striking new claims, focus only on evidence that supports their preconceptions.
  • Despite the popular belief that anything goes in literary criticism, the field has real standards of scholarly validity.
  • In his 1960 book “Truth and Method,” the influential German philosopher Hans-Georg Gadamer argues that an interpreter of a text must first question “the validity of the fore-meanings dwelling within him.” However, “this kind of sensitivity involves neither ‘neutrality’ with respect to content nor the extinction of one’s self.” Rather, “the important thing is to be aware of one’s own bias.” To deal with the problem of selective use of data, the scientific community must become self-aware and realize that it has a problem. In literary criticism, the question of how one’s arguments are influenced by one’s prejudgments has been a central methodological issue for decades.
  • ...1 more annotation...
  • Perhaps because of its self-awareness about what Austen would call the “whims and caprices” of human reasoning, the field of psychology has been most aggressive in dealing with doubts about the validity of its research.
markfrankel18

English Has a New Preposition, Because Internet - Megan Garber - The Atlantic - 0 views

  • Linguists are recognizing the delightful evolution of the word "because." 
  • The word "because," in standard English usage, is a subordinating conjunction, which means that it connects two parts of a sentence in which one (the subordinate) explains the other. In that capacity, "because" has two distinct forms. It can be followed either by a finite clause (I'm reading this because [I saw it on the web]) or by a prepositional phrase (I'm reading this because [of the web]). These two forms are, traditionally, the only ones to which "because" lends itself. I mention all that ... because language. Because evolution. Because there is another way to use "because." Linguists are calling it the "prepositional-because." Or the "because-noun." You probably know it better, however, as explanation by way of Internet—explanation that maximizes efficiency and irony in equal measure. I'm late because YouTube. You're reading this because procrastination. As the language writer Stan Carey delightfully sums it up: "'Because' has become a preposition, because grammar."
Michael Peters

Increasing Number Of Men Pressured To Accept Realistic Standards Of Female Beauty | The... - 0 views

  • Satire, but totally on-point.
Lawrence Hrubes

Louis C.K. Against the Common Core : The New Yorker - 1 views

  • “Students who already believe they are not as academically successful as their more affluent peers will further internalize defeat,” Carol Burris, a principal from Rockville Centre, wrote in the Washington Post last summer, calling on policymakers to “re-examine their belief that college readiness is achieved by attaining a score on a test, and its corollary—that it is possible to create college readiness score thresholds for eight year olds.” This week, teachers at International High School at Prospect Heights, which serves a population of recently arrived immigrants from non-English-speaking countries, announced that they would not administer an assessment required by the city. A pre-test in the fall “was a traumatic and demoralizing experience for students,” a statement issued by the teachers said. “Many students, after asking for help that teachers were not allowed to give, simply put their heads down for the duration. Some students even cried.”
Lawrence Hrubes

BBC News - Caesium: A brief history of timekeeping - 1 views

  • The answer is that whenever you have a network operating over distance, accurate timekeeping is essential for synchronisation. And the faster the speed of travel, the more accurate the timekeeping must be. Hence in the modern world, where information travels at almost the speed of light down wires or through the air, accuracy is more important than ever. What caesium has done is to raise the standards for the measurement of time exponentially.
  • As the electron moves out into the wider orbit it absorbs energy, and as it jumps back in it releases it in the form of light, fluorescing very slightly. That means you can tell when you've hit the sweet spot of 9,192,631,770 Hz. It's because this transition frequency is so much higher than the resonant frequency of quartz that a caesium clock is so much more accurate. The caesium fountain at NPL, Szymaniec tells me proudly, is accurate to one second in every 158 million years. That means it would only be a second out if it had started keeping time back in the peak of the Jurassic Period when diplodocus were lumbering around and pterodactyls wheeling in the sky.
  • Now, if such insane levels of accuracy seem pointless, then think again. Without the caesium clock, for example, satellite navigation would be impossible. GPS satellites carry synchronised caesium clocks that enable them collectively to triangulate your position and work out where on earth you are. And the practical applications do not end there. Just ask Leon Lobo - he's in charge of time "dissemination" at NPL. His job is to tell the time to the UK. For a fee. NPL has just begun offering businesses standardised timekeeping accurate to the nearest microsecond - a millionth of a second. Mr Lobo is targeting a wide range of clients that all have one thing in common - they need to synchronise a network that operates at speeds far faster than any trains.
  • ...3 more annotations...
  • Mr Lobo's biggest target is the financial markets, which these days are dominated by computers programmed to place thousands of trades per second, transmitted down wires at almost the speed of light. In this world, the equivalent of a train crash would be ill-timed bets that rack up millions of dollars in losses, and might even briefly sink the market in the process. Unsurprisingly, financial regulators increasingly require a super-accurate timestamp on every transaction.
  • The switch to atomic time was for good reason. The rotation of the earth, it turned out, was not such a reliable measure of time. No day or year is exactly the same length.
  • Due to the earth's elliptical orbit, the sun can be as much as 16 minutes out of line with mean solar time. Add the distortion of time zones, which average time across huge regions, and the difference is far greater. China, which is almost 5,000km wide, has a single time zone spanning 1h40 of solar time. The decision of some countries to adjust the clocks twice a year as a "daylight saving" measure exaggerates the issue yet further.
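The headline figures in these annotations can be sanity-checked with quick arithmetic. A back-of-envelope sketch (the speed of light is a standard constant; everything else is a rough estimate, not an NPL figure):

```python
# Sanity checks on the article's timekeeping figures.
SECONDS_PER_YEAR = 365.25 * 24 * 3600        # Julian year, ~3.156e7 s

# "Accurate to one second in every 158 million years", as a fractional error:
frac_error = 1 / (158e6 * SECONDS_PER_YEAR)  # ~2e-16

# Why GPS needs atomic clocks: a timing error in the satellite's signal
# becomes a ranging error at the speed of light.
C = 299_792_458                               # speed of light, m/s
error_per_microsecond = C * 1e-6              # metres of range error per us

print(f"fractional accuracy ~ {frac_error:.1e}")
print(f"1 us of clock error ~ {error_per_microsecond:.0f} m of range error")
```

So a caesium standard is stable to about two parts in ten quadrillion, and even a microsecond of satellite clock error would throw a GPS position off by roughly 300 metres, which is why "insane levels of accuracy" are anything but pointless.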
markfrankel18

Jaron Lanier on Lack of Transparency in Facebook Study - NYTimes.com - 0 views

  • SHOULD we worry that technology companies can secretly influence our emotions? Apparently so.
  • Research with human subjects is generally governed by strict ethical standards, including the informed consent of the people who are studied. Facebook’s generic click-through agreement, which almost no one reads and which doesn’t mention this kind of experimentation, was the only form of consent cited in the paper. The subjects in the study still, to this day, have not been informed that they were in the study. If there had been federal funding, such a complacent notion of informed consent would probably have been considered a crime. Subjects would most likely have been screened so that those at special risk would be excluded or handled with extra care.
  • It is unimaginable that a pharmaceutical firm would be allowed to randomly, secretly sneak an experimental drug, no matter how mild, into the drinks of hundreds of thousands of people, just to see what happens, without ever telling those people. Imagine a pharmaceutical researcher saying, “I was only looking at a narrow research question, so I don’t know if my drug harmed anyone, and I haven’t bothered to find out.” Unfortunately, this seems to be an acceptable attitude when it comes to experimenting with people over social networks. It needs to change.
  • ...1 more annotation...
  • Stealth emotional manipulation could be channeled to sell things (you suddenly find that you feel better after buying from a particular store, for instance), but it might also be used to exert influence in a multitude of other ways.
Lawrence Hrubes

Lawsuit Filed Today on Behalf of Chimpanzee Seeking Legal Personhood : The Nonhuman Rig... - 0 views

  • The Nonhuman Rights Project is the only organization working toward actual LEGAL rights for members of species other than our own. Our mission is to change the common law status of at least some nonhuman animals from mere “things,” which lack the capacity to possess any legal right, to “persons,” who possess such fundamental rights as bodily integrity and bodily liberty, and those other legal rights to which evolving standards of morality, scientific discovery, and human experience entitle them. Our first cases were filed in 2013.
  • The legal cause of action that we are using is the common law writ of habeas corpus, through which somebody who is being held captive, for example in prison, seeks relief by having a judge call upon his captors to show cause as to why they have the right to hold him. More specifically, our suits are based on a case that was fought in England in 1772, when an American slave, James Somerset, who had been taken to London by his owner, escaped, was recaptured and was being held in chains on a ship that was about to set sail for the slave markets of Jamaica. With help from a group of abolitionist attorneys, Somerset’s godparents filed a writ of habeas corpus on Somerset’s behalf in order to challenge Somerset’s classification as a legal thing, and the case went before the Chief Justice of the Court of King’s Bench, Lord Mansfield. In what became one of the most important trials in Anglo-American history, Lord Mansfield ruled that Somerset was not a piece of property, but instead a legal person, and he set him free.
Lawrence Hrubes

FDA Ponders Putting Homeopathy To A Tougher Test : Shots - Health News : NPR - 0 views

  • In 1988, the Food and Drug Administration decided not to require homeopathic remedies to go through the same drug-approval process as standard medical treatments. Now the FDA is revisiting that decision. It will hold two days of hearings this week to decide if homeopathic remedies should have to be proven safe and effective.
markfrankel18

The Problem With History Classes - Atlantic Mobile - 1 views

  • Currently, most students learn history as a set narrative—a process that reinforces the mistaken idea that the past can be synthesized into a single, standardized chronicle of several hundred pages. This teaching pretends that there is a uniform collective story, which is akin to saying everyone remembers events the same. Yet, history is anything but agreeable. It is not a collection of facts deemed to be "official" by scholars on high. It is a collection of historians exchanging different, often conflicting analyses. And rather than vainly seeking to transcend the inevitable clash of memories, American students would be better served by descending into the bog of conflict and learning the many "histories" that compose the American national story.
  • History may be an attempt to memorialize and preserve the past, but it is not memory; memories can serve as primary sources, but they do not stand alone as history. A history is essentially a collection of memories, analyzed and reduced into meaningful conclusions—but that collection depends on the memories chosen.
  • Although, as Urist notes, the AP course is "designed to teach students to think like historians," my own experience in that class suggests that it fails to achieve that goal. The course’s framework has always served as an outline of important concepts aiming to allow educators flexibility in how to teach; it makes no reference to historiographical conflicts. Historiography was an epiphany for me because I had never before come face-to-face with how historians think and reason—how they construct an argument, what sources animate that approach, and how their position responds to other historians. When I took AP U.S. History, I jumbled these diverse histories into one indistinct narrative. Although the test involved open-ended essay questions, I was taught that graders were looking for a firm thesis—forcing students to adopt a side. The AP test also, unsurprisingly, rewards students who cite a wealth of supporting details. By the time I took the test in 2009, I was a master at "checking boxes," weighing political factors equally against those involving socioeconomics and ensuring that previously neglected populations like women and ethnic minorities received their due. I did not know that I was pulling ideas from different historiographical traditions. I still subscribed to the idea of a prevailing national narrative and served as an unwitting sponsor of synthesis, oblivious to the academic battles that made such synthesis impossible.  