
Home/ TOK Friends/ Group items tagged history


Javier E

Opinion | The 1619 Chronicles - The New York Times - 0 views

  • The 1619 Project introduced a date, previously obscure to most Americans, that ought always to have been thought of as seminal — and probably now will. It offered fresh reminders of the extent to which Black freedom was a victory gained by courageous Black Americans, and not just a gift obtained from benevolent whites.
  • in a point missed by many of the 1619 Project’s critics, it does not reject American values. As Nikole Hannah-Jones, its creator and leading voice, concluded in her essay for the project, “I wish, now, that I could go back to the younger me and tell her that her people’s ancestry started here, on these lands, and to boldly, proudly, draw the stars and those stripes of the American flag.” It’s an unabashedly patriotic thought.
  • ambition can be double-edged. Journalists are, most often, in the business of writing the first rough draft of history, not trying to have the last word on it. We are best when we try to tell truths with a lowercase t, following evidence in directions unseen, not the capital-T truth of a pre-established narrative in which inconvenient facts get discarded
  • on these points — and for all of its virtues, buzz, spinoffs and a Pulitzer Prize — the 1619 Project has failed.
  • That doesn’t mean that the project seeks to erase the Declaration of Independence from history. But it does mean that it seeks to dethrone the Fourth of July by treating American history as a story of Black struggle against white supremacy — of which the Declaration is, for all of its high-flown rhetoric, supposed to be merely a part.
  • The deleted assertions went to the core of the project’s most controversial goal, “to reframe American history by considering what it would mean to regard 1619 as our nation’s birth year.”
  • She then challenged me to find any instance in which the project stated that “using 1776 as our country’s birth date is wrong,” that it “should not be taught to schoolchildren,” and that the only one “that should be taught” was 1619. “Good luck unearthing any of us arguing that,” she added.
  • I emailed her to ask if she could point to any instances before this controversy in which she had acknowledged that her claims about 1619 as “our true founding” had been merely metaphorical. Her answer was that the idea of treating the 1619 date metaphorically should have been so obvious that it went without saying.
  • “1619. It is not a year that most Americans know as a notable date in our country’s history. Those who do are at most a tiny fraction of those who can tell you that 1776 is the year of our nation’s birth. What if, however, we were to tell you that this fact, which is taught in our schools and unanimously celebrated every Fourth of July, is wrong, and that the country’s true birth date, the moment that its defining contradictions first came into the world, was in late August of 1619?”
  • Here is an excerpt from the introductory essay to the project by The New York Times Magazine’s editor, Jake Silverstein, as it appeared in print in August 2019 (italics added):
  • In his introduction, Silverstein argues that America’s “defining contradictions” were born in August 1619, when a ship carrying 20 to 30 enslaved Africans from what is present-day Angola arrived in Point Comfort, in the English colony of Virginia. And the title page of Hannah-Jones’s essay for the project insists that “our founding ideals of liberty and equality were false when they were written.”
  • What was surprising was that in 1776 a politically formidable “defining contradiction” — “that all men are created equal” — came into existence through the Declaration of Independence. As Abraham Lincoln wrote in 1859, that foundational document would forever serve as a “rebuke and stumbling block to the very harbingers of reappearing tyranny and oppression.”
  • As for the notion that the Declaration’s principles were “false” in 1776, ideals aren’t false merely because they are unrealized, much less because many of the men who championed them, and the nation they created, hypocritically failed to live up to them.
  • These two flaws led to a third, conceptual, error. “Out of slavery — and the anti-Black racism it required — grew nearly everything that has truly made America exceptional,” writes Silverstein.
  • Nearly everything? What about, say, the ideas contained by the First Amendment? Or the spirit of openness that brought millions of immigrants through places like Ellis Island? Or the enlightened worldview of the Marshall Plan and the Berlin airlift? Or the spirit of scientific genius and discovery exemplified by the polio vaccine and the moon landing?
  • On the opposite side of the moral ledger, to what extent does anti-Black racism figure in American disgraces such as the brutalization of Native Americans, the Chinese Exclusion Act or the internment of Japanese-Americans in World War II?
  • The world is complex. So are people and their motives. The job of journalism is to take account of that complexity, not simplify it out of existence through the adoption of some ideological orthodoxy.
  • This mistake goes far to explain the 1619 Project’s subsequent scholarly and journalistic entanglements. It should have been enough to make strong yet nuanced claims about the role of slavery and racism in American history. Instead, it issued categorical and totalizing assertions that are difficult to defend on close examination.
  • It should have been enough for the project to serve as curator for a range of erudite and interesting voices, with ample room for contrary takes. Instead, virtually every writer in the project seems to sing from the same song sheet, alienating other potential supporters of the project and polarizing national debate.
  • James McPherson, the Pulitzer Prize-winning author of “Battle Cry of Freedom” and a past president of the American Historical Association. He was withering: “Almost from the outset,” McPherson told the World Socialist Web Site, “I was disturbed by what seemed like a very unbalanced, one-sided account, which lacked context and perspective.”
  • In particular, McPherson objected to Hannah-Jones’s suggestion that the struggle against slavery and racism and for civil rights and democracy was, if not exclusively then mostly, a Black one. As she wrote in her essay: “The truth is that as much democracy as this nation has today, it has been borne on the backs of Black resistance.”
  • McPherson demurs: “From the Quakers in the 18th century, on through the abolitionists in the antebellum, to the Radical Republicans in the Civil War and Reconstruction, to the N.A.A.C.P., which was an interracial organization founded in 1909, down through the civil rights movements of the 1950s and 1960s, there have been a lot of whites who have fought against slavery and racial discrimination, and against racism,” he said. “And that’s what’s missing from this perspective.”
  • Wilentz’s catalog of the project’s mistakes is extensive. Hannah-Jones’s essay claimed that by 1776 Britain was “deeply conflicted” over its role in slavery. But despite the landmark Somerset v. Stewart court ruling in 1772, which held that slavery was not supported by English common law, it remained deeply embedded in the practices of the British Empire. The essay claimed that, among Londoners, “there were growing calls to abolish the slave trade” by 1776. But the movement to abolish the British slave trade only began about a decade later — inspired, in part, Wilentz notes, by American antislavery agitation that had started in the 1760s and 1770s.
  • Leslie M. Harris, an expert on pre-Civil War African-American life and slavery. “On Aug. 19 of last year,” Harris wrote, “I listened in stunned silence as Nikole Hannah-Jones … repeated an idea that I had vigorously argued against with her fact checker: that the patriots fought the American Revolution in large part to preserve slavery in North America.”
  • The larger problem is that The Times’s editors, however much background reading they might have done, are not in a position to adjudicate historical disputes. That should have been an additional reason for the 1619 Project to seek input from, and include contributions by, an intellectually diverse range of scholarly voices. Yet not only does the project choose a side, it also brooks no doubt.
  • “It is finally time to tell our story truthfully,” the magazine declares on its 1619 cover page. Finally? Truthfully? Is The Times suggesting that distinguished historians, like the ones who have seriously disputed aspects of the project, had previously been telling half-truths or falsehoods?
  • unlike other dates, 1776 uniquely marries letter and spirit, politics and principle: The declaration that something new is born, combined with the expression of an ideal that — because we continue to believe in it even as we struggle to live up to it — binds us to the date.
  • On the other, the 1619 Project has become, partly by its design and partly because of avoidable mistakes, a focal point of the kind of intense national debate that columnists are supposed to cover, and that is being widely written about outside The Times. To avoid writing about it on account of the first scruple is to be derelict in our responsibility toward the second.
Javier E

Book review - The Dawn of Everything: A New History of Humanity | The Inquisitive Biolo... - 0 views

  • Every few years, it seems, there is a new bestselling Big History book. And not infrequently, they have rather grandiose titles.
  • I hope to convince you why I think this book will stand the test of time better.
  • First, rather than one author’s pet theory, The Dawn of Everything is the brainchild of two outspoken writers: anthropologist David Graeber (a figurehead in the Occupy Wall Street movement and author of e.g. Bullshit Jobs) and archaeologist David Wengrow (author of e.g. What Makes Civilization?). I expect a large part of their decade-long collaboration consisted of shooting holes in each other’s arguments
  • Colonisation exposed us to new ideas that shocked and confused us. Graeber & Wengrow focus on the French coming into contact with Native Americans in Canada, and in particular on Wendat Confederacy philosopher–statesman Kandiaronk, as an example of European traders, missionaries, and intellectuals debating with, and being criticized by, indigenous people. Historians have downplayed how much these encounters shaped Enlightenment ideas.
  • this thought-provoking book is armed to the teeth with fascinating ideas and interpretations that go against mainstream thinking
  • Rather than yet another history book telling you how humanity got here, they take their respective disciplines to task for dealing in myths.
  • Its legacy, shaped via several iterations, is the modern textbook narrative: hunter-gathering was replaced by pastoralism and then farming; the agricultural revolution resulted in larger populations producing material surpluses; these allowed for specialist occupations but also needed bureaucracies to share and administer them to everyone; and this top-down control led to today’s nation states. Ta-daa!
  • this simplistic tale of progress ignores and downplays that there was nothing linear or inevitable about where we have ended up.
  • Take agriculture. Rather than humans enthusiastically entering into what Harari in Sapiens called a Faustian bargain with crops, there were many pathways and responses
  • Experiments show that plant domestication could have been achieved in as little as 20–30 years, so the fact that cereal domestication here took some 3,000 years questions the notion of an agricultural “revolution”. Lastly, this book includes many examples of areas where agriculture was purposefully rejected. Designating such times and places as “pre-agricultural” is misleading, write the authors, they were anti-agricultural.
  • The idea that agriculture led to large states similarly needs revision
  • correlation is not causation, and some 15–20 additional centres of domestication have since been identified that followed different paths. Some cities have previously remained hidden in the sediments of ancient river deltas until revealed by modern remote-sensing technology.
  • “extensive agriculture may thus have been an outcome, not a cause, of urbanization”
  • And cities did not automatically imply social stratification. The Dawn of Everything fascinates with its numerous examples of large settlements without ruling classes, such as Ukrainian mega-sites, the Harappan civilization, or Mexican city-states.
  • These instead relied on collective decision-making through assemblies or councils, which questions some of the assumptions of evolutionary psychology about scale: that larger human groups require complex (i.e. hierarchical) systems to organize them.
  • see what is staring them in the face
  • humans have always been very capable of consciously experimenting with different social arrangements. And—this is rarely acknowledged—they did so on a seasonal basis, spending e.g. part of the year settled in large communal groups under a leader, and another part as small, independently roving bands.
  • Throughout, Graeber & Wengrow convincingly argue that the only thing we can say about our ancestors is that “there is no single pattern. The only consistent phenomenon is the very fact of alteration […] If human beings, through most of our history, have moved back and forth fluidly between different social arrangements […] maybe the real question should be ‘how did we get stuck?’”
  • Next to criticism, the authors put out some interesting ideas of their own, of which I want to quickly highlight two.
  • The first is that some of the observed variations in social arrangements resulted from schismogenesis. Anthropologist Gregory Bateson coined this term in the 1930s to describe how people define themselves against or in opposition to others, adopting behaviours and attitudes that are different.
  • The second idea is that states can be described in terms of three elementary forms of domination: control of violence, control of information, and individual charisma, which express themselves as sovereignty, administration, and competitive politics.
  • Our current states combine these three, and thus we have state-endorsed violence in the form of law enforcement and armies, bureaucracy, and the popularity contests we call elections in some countries, and monarchs, oligarchs, or tyrants in other countries. But looking at history, there is no reason why this should be and the authors provide examples of societies that showed only one or two such forms of control
  • Asking which past society most resembles today’s is the wrong question to ask. It risks slipping into an exercise in retrofitting, “which makes us scour the ancient world for embryonic versions of our modern nation states”
  • I have left unmentioned several other topics: the overlooked role of women, the legacy of Rousseau’s and Hobbes’s ideas, the origins of inequality and the flawed assumptions hiding behind that question
  • There are so many historical details and delights hiding between these covers that I was thoroughly enthralled.
  • If you have any interest in big history, archaeology, or anthropology, this book is indispensable. I am confident that the questions and critiques raised here will remain relevant for a long time to come.
  • I was particularly impressed by the in-depth critique (so far five videos) by worbsintowords on his YouTube channel What is Politics?
Emily Freilich

The trouble with teaching history | Nick Shepley | Comment is free | theguardian.com - 1 views

  • it became quickly clear that the students didn't know what the Holocaust was
  • Is it possible that poor teaching has played a part in the fall in standards? I think it is less to do with poor teaching and more the result of a teaching culture in which humanities subjects
  • If humanities do matter at all, they are always of secondary importance to scientific subjects, and we tell students that if they master sciences, good jobs and future success will be theirs.
  • Teaching factual knowledge is one thing, but empowering children to interpret the facts is another altogether – there is, after all, not one history but many.
  • In an increasingly multicultural classroom, it is surely anachronistic at best, if one is teaching about the history of the British empire, to present a fixed, monolithic interpretation of the past, one written by the victors
  • Pupils who simply have to listen, copy, repeat and memorise quickly find tasks meaningless and the subject of no relevance to them
  • Get them to write history themselves.
  • This year, I have encouraged a dozen or so pupils to write their own history of the second world war. Using print-on-demand publishing, at the end of the summer term they will have their own book – a physical, tangible product of their labours.
sissij

Millicent Fawcett Is First Woman to Get Statue in London's Parliament Square - The New ... - 0 views

  • LONDON — Britain, which has its second female prime minister and a queen who is the world’s longest-reigning monarch, is getting its first statue of a woman in Parliament Square in London, where there are 11 statues of men.
  • It is right and proper that she is honored in Parliament Square alongside former leaders who changed our country. Her statue will stand as a reminder of how politics only has value if it works for everyone in society.”
  • Mrs. Fawcett considered herself a suffragist, a moderate opposed to the sometimes violent protests of campaigners like Emmeline and Christabel Pankhurst, a mother and daughter who were known as suffragettes.
  • Mrs. Fawcett inspired a women’s rights charity, the Fawcett Society. Its chief executive, Sam Smethers, said of plans for a statue in her honor: “Her contribution was great, but she has been overlooked and unrecognized until now. By honoring her, we also honor the wider suffrage movement.”
  • As we are now learning about history in TOK, I think one value of history is that it can give us courage. For example, in this article we can see how women in the past withstood the strong pressure imposed on them and insisted on being themselves rather than the belongings and property of their husbands. I think this woman's history is very encouraging and valuable for feminism today. History gives us inspiration. --Sissi (4/2/2017)
Javier E

He Wants to Save Classics From Whiteness. Can the Field Survive? - The New York Times - 0 views

  • Padilla laid out an indictment of his field. “If one were intentionally to design a discipline whose institutional organs and gatekeeping protocols were explicitly aimed at disavowing the legitimate status of scholars of color,” he said, “one could not do better than what classics has done.”
  • Padilla believes that classics is so entangled with white supremacy as to be inseparable from it. “Far from being extrinsic to the study of Greco-Roman antiquity,” he has written, “the production of whiteness turns on closer examination to reside in the very marrows of classics.”
  • Rather than kowtowing to criticism, Williams said, “maybe we should start defending our discipline.” She protested that it was imperative to stand up for the classics as the political, literary and philosophical foundation of European and American culture: “It’s Western civilization. It matters because it’s the West.” Hadn’t classics given us the concepts of liberty, equality and democracy?
  • Williams ceded the microphone, and Padilla was able to speak. “Here’s what I have to say about the vision of classics that you outlined,” he said. “I want nothing to do with it. I hope the field dies that you’ve outlined, and that it dies as swiftly as possible.”
  • “I believe in merit. I don’t look at the color of the author.” She pointed a finger in Padilla’s direction. “You may have got your job because you’re Black,” Williams said, “but I would prefer to think you got your job because of merit.”
  • What he did find was a slim blue-and-white textbook titled “How People Lived in Ancient Greece and Rome.” “Western civilization was formed from the union of early Greek wisdom and the highly organized legal minds of early Rome,” the book began. “The Greek belief in a person’s ability to use his powers of reason, coupled with Roman faith in military strength, produced a result that has come to us as a legacy, or gift from the past.” Thirty years later, Padilla can still recite those opening lines.
  • In 2017, he published a paper in the journal Classical Antiquity that compared evidence from antiquity and the Black Atlantic to draw a more coherent picture of the religious life of the Roman enslaved. “It will not do merely to adopt a pose of ‘righteous indignation’ at the distortions and gaps in the archive,” he wrote. “There are tools available for the effective recovery of the religious experiences of the enslaved, provided we work with these tools carefully and honestly.”
  • Padilla sensed that his pursuit of classics had displaced other parts of his identity, just as classics and “Western civilization” had displaced other cultures and forms of knowledge. Recovering them would be essential to dismantling the white-supremacist framework in which both he and classics had become trapped. “I had to actively engage in the decolonization of my mind,” he told me.
  • He also gravitated toward contemporary scholars like José Esteban Muñoz, Lorgia García Peña and Saidiya Hartman, who speak of race not as a physical fact but as a ghostly system of power relations
  • In response to rising anti-immigrant sentiment in Europe and the United States, Mary Beard, perhaps the most famous classicist alive, wrote in The Wall Street Journal that the Romans “would have been puzzled by our modern problems with migration and asylum,” because the empire was founded on the “principles of incorporation and of the free movement of people.”
  • In November 2015, he wrote an essay for Eidolon, an online classics journal, clarifying that in Rome, as in the United States, paeans to multiculturalism coexisted with hatred of foreigners. Defending a client in court, Cicero argued that “denying foreigners access to our city is patently inhumane,” but ancient authors also recount the expulsions of whole “suspect” populations, including a roundup of Jews in 139 B.C., who were not considered “suitable enough to live alongside Romans.”
  • The job of classicists is not to “point out the howlers,” he said on a 2017 panel. “To simply take the position of the teacher, the qualified classicist who knows things and can point to these mistakes, is not sufficient.”
  • Dismantling structures of power that have been shored up by the classical tradition will require more than fact-checking; it will require writing an entirely new story about antiquity, and about who we are today
  • To find that story, Padilla is advocating reforms that would “explode the canon” and “overhaul the discipline from nuts to bolts,” including doing away with the label “classics” altogether.
  • “What I want to be thinking about in the next few weeks,” he told them, “is how we can be telling the story of the early Roman Empire not just through a variety of sources but through a variety of persons.” He asked the students to consider the lives behind the identities he had assigned them, and the way those lives had been shaped by the machinery of empire, which, through military conquest, enslavement and trade, creates the conditions for the large-scale movement of human beings.
  • ultimately, he decided that leaving enslaved characters out of the role play was an act of care. “I’m not yet ready to turn to a student and say, ‘You are going to be a slave.’”
  • Privately, even some sympathetic classicists worry that Padilla’s approach will only hasten the field’s decline. “I’ve spoken to undergrad majors who say that they feel ashamed to tell their friends they’re studying classics,”
  • “I very much admire Dan-el’s work, and like him, I deplore the lack of diversity in the classical profession,” Mary Beard told me via email. But “to ‘condemn’ classical culture would be as simplistic as to offer it unconditional admiration.”
  • In a 2019 talk, Beard argued that “although classics may become politicized, it doesn’t actually have a politics,” meaning that, like the Bible, the classical tradition is a language of authority — a vocabulary that can be used for good or ill by would-be emancipators and oppressors alike.
  • Over the centuries, classical civilization has acted as a model for people of many backgrounds, who turned it into a matrix through which they formed and debated ideas about beauty, ethics, power, nature, selfhood, citizenship and, of course, race
  • Anthony Grafton, the great Renaissance scholar, put it this way in his preface to “The Classical Tradition”: “An exhaustive exposition of the ways in which the world has defined itself with regard to Greco-Roman antiquity would be nothing less than a comprehensive history of the world.”
  • Classics as we know it today is a creation of the 18th and 19th centuries. During that period, as European universities emancipated themselves from the control of the church, the study of Greece and Rome gave the Continent its new, secular origin story. Greek and Latin writings emerged as a competitor to the Bible’s moral authority, which lent them a liberatory power
  • Historians stress that such ideas cannot be separated from the discourses of nationalism, colorism and progress that were taking shape during the modern colonial period, as Europeans came into contact with other peoples and their traditions. “The whiter the body is, the more beautiful it is,” Winckelmann wrote.
  • While Renaissance scholars were fascinated by the multiplicity of cultures in the ancient world, Enlightenment thinkers created a hierarchy with Greece and Rome, coded as white, on top, and everything else below.
  • Jefferson, along with most wealthy young men of his time, studied classics at college, where students often spent half their time reading and translating Greek and Roman texts. “Next to Christianity,” writes Caroline Winterer, a historian at Stanford, “the central intellectual project in America before the late 19th century was classicism.”
  • Of the 2.5 million people living in America in 1776, perhaps only 3,000 had gone to college, but that number included many of the founders
  • They saw classical civilization as uniquely educative — a “lamp of experience,” in the words of Patrick Henry, that could light the path to a more perfect union. However true it was, subsequent generations would come to believe, as Hannah Arendt wrote in “On Revolution,” that “without the classical example … none of the men of the Revolution on either side of the Atlantic would have possessed the courage for what then turned out to be unprecedented action.”
  • Comparisons between the United States and the Roman Empire became popular as the country emerged as a global power. Even after Latin and Greek were struck from college-entrance exams, the proliferation of courses on “great books” and Western civilization, in which classical texts were read in translation, helped create a coherent national story after the shocks of industrialization and global warfare.
  • even as the classics were pulled apart, laughed at and transformed, they continued to form the raw material with which many artists shaped their visions of modernity.
  • Over the centuries, thinkers as disparate as John Adams and Simone Weil have likened classical antiquity to a mirror. Generations of intellectuals, among them feminist, queer and Black scholars, have seen something of themselves in classical texts, flashes of recognition that held a kind of liberatory promise
  • The language that is used to describe the presence of classical antiquity in the world today — the classical tradition, legacy or heritage — contains within it the idea of a special, quasi-genetic relationship. In his lecture “There Is No Such Thing as Western Civilization,” Kwame Anthony Appiah (this magazine’s Ethicist columnist) mockingly describes the belief in such a kinship as the belief in a “golden nugget” of insight — a precious birthright and shimmering sign of greatness — that white Americans and Europeans imagine has been passed down to them from the ancients.
  • To see classics the way Padilla sees it means breaking the mirror; it means condemning the classical legacy as one of the most harmful stories we’ve told ourselves
  • Padilla is wary of colleagues who cite the radical uses of classics as a way to forestall change; he believes that such examples have been outmatched by the field’s long alliance with the forces of dominance and oppression.
  • Classics and whiteness are the bones and sinew of the same body; they grew strong together, and they may have to die together. Classics deserves to survive only if it can become “a site of contestation” for the communities who have been denigrated by it in the past.
  • if classics fails his test, Padilla and others are ready to give it up. “I would get rid of classics altogether,” Walter Scheidel, another of Padilla’s former advisers at Stanford, told me. “I don’t think it should exist as an academic field.”
  • One way to get rid of classics would be to dissolve its faculties and reassign their members to history, archaeology and language departments.
  • many classicists are advocating softer approaches to reforming the discipline, placing the emphasis on expanding its borders. Schools including Howard and Emory have integrated classics with Ancient Mediterranean studies, turning to look across the sea at Egypt, Anatolia, the Levant and North Africa. The change is a declaration of purpose: to leave behind the hierarchies of the Enlightenment and to move back toward the Renaissance model of the ancient world as a place of diversity and mixture.
  • Ian Morris put it more bluntly. “Classics is a Euro-American foundation myth,” Morris said to me. “Do we really want that sort of thing?”
  • “There’s a more interesting story to be told about the history of what we call the West, the history of humanity, without valorizing particular cultures in it,” said Josephine Quinn, a professor of ancient history at Oxford. “It seems to me the really crucial mover in history is always the relationship between people, between cultures.”
  • “In some moods, I feel that this is just a moment of despair, and people are trying to find significance even if it only comes from self-accusation,” he told me. “I’m not sure that there is a discipline that is exempt from the fact that it is part of the history of this country. How distinctly wicked is classics? I don’t know that it is.”
  • “One of the dubious successes of my generation is that it did break the canon,” Richlin told me. “I don’t think we could believe at the time that we would be putting ourselves out of business, but we did.” She added: “If they blew up the classics departments, that would really be the end.”
  • Padilla, like Douglass, now sees the moment of absorption into the classical, literary tradition as simultaneous with his apprehension of racial difference; he can no longer find pride or comfort in having used it to bring himself out of poverty.
  • “Claiming dignity within this system of structural oppression,” Padilla has said, “requires full buy-in into its logic of valuation.” He refuses to “praise the architects of that trauma as having done right by you at the end.”
  • Last June, as racial-justice protests unfolded across the nation, Padilla turned his attention to arenas beyond classics. He and his co-authors — the astrophysicist Jenny Greene, the literary theorist Andrew Cole and the poet Tracy K. Smith — began writing their open letter to Princeton with 48 proposals for reform. “Anti-Blackness is foundational to America,” the letter began. “Indifference to the effects of racism on this campus has allowed legitimate demands for institutional support and redress in the face of microaggression and outright racist incidents to go long unmet.”
  • Padilla believes that the uproar over free speech is misguided. “I don’t see things like free speech or the exchange of ideas as ends in themselves,” he told me. “I have to be honest about that. I see them as a means to the end of human flourishing.”
  • “There is a certain kind of classicist who will look on what transpired and say, ‘Oh, that’s not us,’” Padilla said when we spoke recently. “What is of interest to me is why is it so imperative for classicists of a certain stripe to make this discursive move? ‘This is not us.’
  • Joel Christensen, the Brandeis professor, now feels that it is his “moral and ethical and intellectual responsibility” to teach classics in a way that exposes its racist history. “Otherwise we’re just participating in propaganda,”
  • Christensen, who is 42, was in graduate school before he had his “crisis of faith,” and he understands the fear that many classicists may experience at being asked to rewrite the narrative of their life’s work. But, he warned, “that future is coming, with or without Dan-el.”
  • On Jan. 6, Padilla turned on the television minutes after the windows of the Capitol were broken. In the crowd, he saw a man in a Greek helmet with TRUMP 2020 painted in white. He saw a man in a T-shirt bearing a golden eagle on a fasces — symbols of Roman law and governance — below the logo 6MWE, which stands for “Six Million Wasn’t Enough,
Javier E

History News Network | How the NCSS Sold Out Social Studies and History - 0 views

  • As a historian, I was influenced by E. H. Carr’s thinking about the past and present as part of a continuum that stretches into the future. In What is History? (1961), Carr argued that concern with the future is what really motivates the study of the past. As a teacher and political activist, I also strongly believe that schools should promote active citizenship as essential for maintaining a democratic society.
  • in an effort to survive, the NCSS has largely abandoned its commitment to these ideas, twisting itself into a pretzel to adapt to national Common Core standards and to satisfy influential conservative organizations that they are not radical, or even liberal. I suspect, but cannot document, that the organization’s membership has precipitously declined during the past two decades and that it has increasingly depended on financial support for its conferences and publications from deep-pocketed traditional and rightwing groups who advertise and have display booths.
  • No Child Left Behind (NCLB). Since the introduction of NCLB, there has been a steady reduction in the amount of time spent in the teaching of social studies, with the most profound decline noticed in the elementary grades.”
  • In an effort to counter the Common Core push for decontextualized skill-based instruction and assessment that has further marginalized social studies education, the NCSS is promoting what it calls the “College, Career, and Civic Life (C3) Framework,”
  • through its choice of partners, its rigid adherence to Common Core lesson guidelines, and the sample material it is promoting, the NCSS has virtually abandoned not just meaningful social studies education, but education for democracy and citizenship as well.
  • My biggest problem with the C3 Framework as presented in this new document on instruction is its attempt to adopt a fundamentally flawed Common Core approach to social studies and history based on the close reading of text without exploring historical context
  • how Common Core as it is being implemented will mean the end of history.
  • In the C3 Framework inquiry approach, students start in Dimension 1 by “developing questions and planning inquiries,” however the inquiry is really already “planned” because material they use is pre-selected. It is also not clear what their questions will be based on since they do not necessarily have any background on the subject. In Dimension 3 students evaluate sources using evidence, but again, devoid of historical context. Dimension 4 is supposed to be C3’s chief addition to Common Core. In Dimension 4 students are supposed to plan activities and become involved in civic life, although of course their options have again already been pre-prescribed.
  • In Dimension 2, as they read the text, which sixth graders and many eighth graders will find difficult, students discuss “How was citizenship revolutionary in 1776?” The question requires them to assume that colonists had already formulated a concept of citizenship, which I do not believe they had, a concept of nation, which they definitely did not, and an understanding that somehow what they were doing was “revolutionary,” which was still being debated.
  • Some of the organizations involved in writing the C3 Frameworks have positions so politically skewed to the right that NCSS should be embarrassed about including them in the project. In this category I include Gilder Lehrman, The Bill of Rights Institute, and The Center for Economic Education and Entrepreneurship (CEEE).
  • Conspicuously missing from the group of contributors is the Zinn Education Project which would have provided a radically different point of view.
  • What we have in the C3 Framework is standard teaching at best but a lot of poor teaching and propaganda as well.
  • Instead of challenging Common Core, the NCSS begs to be included. Instead of presenting multiple perspectives, it sells advertising in the form of lessons to its corporate and foundation sponsors. But worst in their own terms, in a time of mass protest against police brutality by high school and college students across the United States, active citizenship in a democratic society is stripped of meaning and becomes little more than idle discussion and telling students to vote when they are eighteen.
Javier E

At the Existentialist Café: Freedom, Being, and Apricot Cocktails with Jean-P... - 0 views

  • The phenomenologists’ leading thinker, Edmund Husserl, provided a rallying cry, ‘To the things themselves!’ It meant: don’t waste time on the interpretations that accrue upon things, and especially don’t waste time wondering whether the things are real. Just look at this that’s presenting itself to you, whatever this may be, and describe it as precisely as possible.
  • You might think you have defined me by some label, but you are wrong, for I am always a work in progress. I create myself constantly through action, and this is so fundamental to my human condition that, for Sartre, it is the human condition, from the moment of first consciousness to the moment when death wipes it out. I am my own freedom: no more, no less.
  • Sartre wrote like a novelist — not surprisingly, since he was one. In his novels, short stories and plays as well as in his philosophical treatises, he wrote about the physical sensations of the world and the structures and moods of human life. Above all, he wrote about one big subject: what it meant to be free. Freedom, for him, lay at the heart of all human experience, and this set humans apart from all other kinds of object.
  • Sartre listened to his problem and said simply, ‘You are free, therefore choose — that is to say, invent.’ No signs are vouchsafed in this world, he said. None of the old authorities can relieve you of the burden of freedom. You can weigh up moral or practical considerations as carefully as you like, but ultimately you must take the plunge and do something, and it’s up to you what that something is.
  • Even if the situation is unbearable — perhaps you are facing execution, or sitting in a Gestapo prison, or about to fall off a cliff — you are still free to decide what to make of it in mind and deed. Starting from where you are now, you choose. And in choosing, you also choose who you will be.
  • The war had made people realise that they and their fellow humans were capable of departing entirely from civilised norms; no wonder the idea of a fixed human nature seemed questionable.
  • If this sounds difficult and unnerving, it’s because it is. Sartre does not deny that the need to keep making decisions brings constant anxiety. He heightens this anxiety by pointing out that what you do really matters. You should make your choices as though you were choosing on behalf of the whole of humanity, taking the entire burden of responsibility for how the human race behaves. If you avoid this responsibility by fooling yourself that you are the victim of circumstance or of someone else’s bad advice, you are failing to meet the demands of human life and choosing a fake existence, cut off from your own ‘authenticity’.
  • Along with the terrifying side of this comes a great promise: Sartre’s existentialism implies that it is possible to be authentic and free, as long as you keep up the effort.
  • almost all agreed that it was, as an article in Les nouvelles littéraires phrased it, a ‘sickening mixture of philosophic pretentiousness, equivocal dreams, physiological technicalities, morbid tastes and hesitant eroticism … an introspective embryo that one would take distinct pleasure in crushing’.
  • he offered a philosophy designed for a species that had just scared the hell out of itself, but that finally felt ready to grow up and take responsibility.
  • In this rebellious world, just as with the Parisian bohemians and Dadaists in earlier generations, everything that was dangerous and provocative was good, and everything that was nice or bourgeois was bad.
  • Such interweaving of ideas and life had a long pedigree, although the existentialists gave it a new twist. Stoic and Epicurean thinkers in the classical world had practised philosophy as a means of living well, rather than of seeking knowledge or wisdom for their own sake. By reflecting on life’s vagaries in philosophical ways, they believed they could become more resilient, more able to rise above circumstances, and better equipped to manage grief, fear, anger, disappointment or anxiety.
  • In the tradition they passed on, philosophy is neither a pure intellectual pursuit nor a collection of cheap self-help tricks, but a discipline for flourishing and living a fully human, responsible life.
  • For Kierkegaard, Descartes had things back to front. In his own view, human existence comes first: it is the starting point for everything we do, not the result of a logical deduction. My existence is active: I live it and choose it, and this precedes any statement I can make about myself.
  • Studying our own moral genealogy cannot help us to escape or transcend ourselves. But it can enable us to see our illusions more clearly and lead a more vital, assertive existence.
  • What was needed, he felt, was not high moral or theological ideals, but a deeply critical form of cultural history or ‘genealogy’ that would uncover the reasons why we humans are as we are, and how we came to be that way. For him, all philosophy could even be redefined as a form of psychology, or history.
  • For those oppressed on grounds of race or class, or for those fighting against colonialism, existentialism offered a change of perspective — literally, as Sartre proposed that all situations be judged according to how they appeared in the eyes of those most oppressed, or those whose suffering was greatest.
  • She observed that we need not expect moral philosophers to ‘live by’ their ideas in a simplistic way, as if they were following a set of rules. But we can expect them to show how their ideas are lived in. We should be able to look in through the windows of a philosophy, as it were, and see how people occupy it, how they move about and how they conduct themselves.
  • the existentialists inhabited their historical and personal world, as they inhabited their ideas. This notion of ‘inhabited philosophy’ is one I’ve borrowed from the English philosopher and novelist Iris Murdoch, who wrote the first full-length book on Sartre and was an early adopter of existentialism
  • What is existentialism anyway?
  • An existentialist who is also phenomenological provides no easy rules for dealing with this condition, but instead concentrates on describing lived experience as it presents itself. — By describing experience well, he or she hopes to understand this existence and awaken us to ways of living more authentic lives.
  • Existentialists concern themselves with individual, concrete human existence. — They consider human existence different from the kind of being other things have. Other entities are what they are, but as a human I am whatever I choose to make of myself at every moment. I am free — and therefore I’m responsible for everything I do, a dizzying fact which causes an anxiety inseparable from human existence itself.
  • On the other hand, I am only free within situations, which can include factors in my own biology and psychology as well as physical, historical and social variables of the world into which I have been thrown. — Despite the limitations, I always want more: I am passionately involved in personal projects of all kinds. — Human existence is thus ambiguous: at once boxed in by borders and yet transcendent and exhilarating. —
  • The first part of this is straightforward: a phenomenologist’s job is to describe. This is the activity that Husserl kept reminding his students to do. It meant stripping away distractions, habits, clichés of thought, presumptions and received ideas, in order to return our attention to what he called the ‘things themselves’. We must fix our beady gaze on them and capture them exactly as they appear, rather than as we think they are supposed to be.
  • Husserl therefore says that, to phenomenologically describe a cup of coffee, I should set aside both the abstract suppositions and any intrusive emotional associations. Then I can concentrate on the dark, fragrant, rich phenomenon in front of me now. This ‘setting aside’ or ‘bracketing out’ of speculative add-ons Husserl called epoché — a term borrowed from the ancient Sceptics,
  • The point about rigour is crucial; it brings us back to the first half of the command to describe phenomena. A phenomenologist cannot get away with listening to a piece of music and saying, ‘How lovely!’ He or she must ask: is it plaintive? is it dignified? is it colossal and sublime? The point is to keep coming back to the ‘things themselves’ — phenomena stripped of their conceptual baggage — so as to bail out weak or extraneous material and get to the heart of the experience.
  • Husserlian ‘bracketing out’ or epoché allows the phenomenologist to temporarily ignore the question ‘But is it real?’, in order to ask how a person experiences his or her world. Phenomenology gives a formal mode of access to human experience. It lets philosophers talk about life more or less as non-philosophers do, while still being able to tell themselves they are being methodical and rigorous.
  • Besides claiming to transform the way we think about reality, phenomenologists promised to change how we think about ourselves. They believed that we should not try to find out what the human mind is, as if it were some kind of substance. Instead, we should consider what it does, and how it grasps its experiences.
  • For Brentano, this reaching towards objects is what our minds do all the time. Our thoughts are invariably of or about something, he wrote: in love, something is loved, in hatred, something is hated, in judgement, something is affirmed or denied. Even when I imagine an object that isn’t there, my mental structure is still one of ‘about-ness’ or ‘of-ness’.
  • Except in deepest sleep, my mind is always engaged in this aboutness: it has ‘intentionality’. Having taken the germ of this from Brentano, Husserl made it central to his whole philosophy.
  • Husserl saw in the idea of intentionality a way to sidestep two great unsolved puzzles of philosophical history: the question of what objects ‘really’ are, and the question of what the mind ‘really’ is. By doing the epoché and bracketing out all consideration of reality from both topics, one is freed to concentrate on the relationship in the middle. One can apply one’s descriptive energies to the endless dance of intentionality that takes place in our lives: the whirl of our minds as they seize their intended phenomena one after the other and whisk them around the floor,
  • Understood in this way, the mind hardly is anything at all: it is its aboutness. This makes the human mind (and possibly some animal minds) different from any other naturally occurring entity. Nothing else can be as thoroughly about or of things as the mind is:
  • Some Eastern meditation techniques aim to still this scurrying creature, but the extreme difficulty of this shows how unnatural it is to be mentally inert. Left to itself, the mind reaches out in all directions as long as it is awake — and even carries on doing it in the dreaming phase of its sleep.
  • a mind that is experiencing nothing, imagining nothing, or speculating about nothing can hardly be said to be a mind at all.
  • Three simple ideas — description, phenomenon, intentionality — provided enough inspiration to keep roomfuls of Husserlian assistants busy in Freiburg for decades. With all of human existence awaiting their attention, how could they ever run out of things to do?
  • For Sartre, this gives the mind an immense freedom. If we are nothing but what we think about, then no predefined ‘inner nature’ can hold us back. We are protean.
  • way of this interpretation. Real, not real; inside, outside; what difference did it make? Reflecting on this, Husserl began turning his phenomenology into a branch of ‘idealism’ — the philosophical tradition which denied external reality and defined everything as a kind of private hallucination.
  • For Sartre, if we try to shut ourselves up inside our own minds, ‘in a nice warm room with the shutters closed’, we cease to exist. We have no cosy home: being out on the dusty road is the very definition of what we are.
  • One might think that, if Heidegger had anything worth saying, he could have communicated it in ordinary language. The fact is that he does not want to be ordinary, and he may not even want to communicate in the usual sense. He wants to make the familiar obscure, and to vex us. George Steiner thought that Heidegger’s purpose was less to be understood than to be experienced through a ‘felt strangeness’.
  • He takes Dasein in its most ordinary moments, then talks about it in the most innovative way he can. For Heidegger, Dasein’s everyday Being is right here: it is Being-in-the-world, or In-der-Welt-sein. The main feature of Dasein’s everyday Being-in-the-world right here is that it is usually busy doing something.
  • Thus, for Heidegger, all Being-in-the-world is also a ‘Being-with’ or Mitsein. We cohabit with others in a ‘with-world’, or Mitwelt. The old philosophical problem of how we prove the existence of other minds has now vanished. Dasein swims in the with-world long before it wonders about other minds.
  • Sometimes the best-educated people were those least inclined to take the Nazis seriously, dismissing them as too absurd to last. Karl Jaspers was one of those who made this mistake, as he later recalled, and Beauvoir observed similar dismissive attitudes among the French students in Berlin.
  • In any case, most of those who disagreed with Hitler’s ideology soon learned to keep their view to themselves. If a Nazi parade passed on the street, they would either slip out of view or give the obligatory salute like everyone else, telling themselves that the gesture meant nothing if they did not believe in it. As the psychologist Bruno Bettelheim later wrote of this period, few people will risk their life for such a small thing as raising an arm — yet that is how one’s powers of resistance are eroded away, and eventually one’s responsibility and integrity go with them.
  • for Arendt, if you do not respond adequately when the times demand it, you show a lack of imagination and attention that is as dangerous as deliberately committing an abuse. It amounts to disobeying the one command she had absorbed from Heidegger in those Marburg days: Think!
  • ‘Everything takes place under a kind of anaesthesia. Objectively dreadful events produce a thin, puny emotional response. Murders are committed like schoolboy pranks. Humiliation and moral decay are accepted like minor incidents.’ Haffner thought modernity itself was partly to blame: people had become yoked to their habits and to mass media, forgetting to stop and think, or to disrupt their routines long enough to question what was going on.
  • Heidegger’s former lover and student Hannah Arendt would argue, in her 1951 study The Origins of Totalitarianism, that totalitarian movements thrived at least partly because of this fragmentation in modern lives, which made people more vulnerable to being swept away by demagogues. Elsewhere, she coined the phrase ‘the banality of evil’ to describe the most extreme failures of personal moral awareness.
  • His communicative ideal fed into a whole theory of history: he traced all civilisation to an ‘Axial Period’ in the fifth century BC, during which philosophy and culture exploded simultaneously in Europe, the Middle East and Asia, as though a great bubble of minds had erupted from the earth’s surface. ‘True philosophy needs communion to come into existence,’ he wrote, and added, ‘Uncommunicativeness in a philosopher is virtually a criterion of the untruth of his thinking.’
  • The idea of being called to authenticity became a major theme in later existentialism, the call being interpreted as saying something like ‘Be yourself!’, as opposed to being phony. For Heidegger, the call is more fundamental than that. It is a call to take up a self that you didn’t know you had: to wake up to your Being. Moreover, it is a call to action. It requires you to do something: to take a decision of some sort.
  • Being and Time contained at least one big idea that should have been of use in resisting totalitarianism. Dasein, Heidegger wrote there, tends to fall under the sway of something called das Man or ‘the they’ — an impersonal entity that robs us of the freedom to think for ourselves. To live authentically requires resisting or outwitting this influence, but this is not easy because das Man is so nebulous. Man in German does not mean ‘man’ as in English (that’s der Mann), but a neutral abstraction, something like ‘one’ in the English phrase ‘one doesn’t do that’,
  • for Heidegger, das Man is me. It is everywhere and nowhere; it is nothing definite, but each of us is it. As with Being, it is so ubiquitous that it is difficult to see. If I am not careful, however, das Man takes over the important decisions that should be my own. It drains away my responsibility or ‘answerability’. As Arendt might put it, we slip into banality, failing to think.
  • Jaspers focused on what he called Grenzsituationen — border situations, or limit situations. These are the moments when one finds oneself constrained or boxed in by what is happening, but at the same time pushed by these events towards the limits or outer edge of normal experience. For example, you might have to make a life-or-death choice, or something might remind you suddenly of your mortality,
  • Jaspers’ interest in border situations probably had much to do with his own early confrontation with mortality. From childhood, he had suffered from a heart condition so severe that he always expected to die at any moment. He also had emphysema, which forced him to speak slowly, taking long pauses to catch his breath. Both illnesses meant that he had to budget his energies with care in order to get his work done without endangering his life.
  • If I am to resist das Man, I must become answerable to the call of my ‘voice of conscience’. This call does not come from God, as a traditional Christian definition of the voice of conscience might suppose. It comes from a truly existentialist source: my own authentic self. Alas, this voice is one I do not recognise and may not hear, because it is not the voice of my habitual ‘they-self’. It is an alien or uncanny version of my usual voice. I am familiar with my they-self, but not with my unalienated voice — so, in a weird twist, my real voice is the one that sounds strangest to me.
  • Marcel developed a strongly theological branch of existentialism. His faith distanced him from both Sartre and Heidegger, but he shared a sense of how history makes demands on individuals. In his essay ‘On the Ontological Mystery’, written in 1932 and published in the fateful year of 1933, Marcel wrote of the human tendency to become stuck in habits, received ideas, and a narrow-minded attachment to possessions and familiar scenes. Instead, he urged his readers to develop a capacity for remaining ‘available’ to situations as they arise. Similar ideas of disponibilité or availability had been explored by other writers,
  • Marcel made it his central existential imperative. He was aware of how rare and difficult it was. Most people fall into what he calls ‘crispation’: a tensed, encrusted shape in life — ‘as though each one of us secreted a kind of shell which gradually hardened and imprisoned him’.
  • Bettelheim later observed that, under Nazism, only a few people realised at once that life could not continue unaltered: these were the ones who got away quickly. Bettelheim himself was not among them. Caught in Austria when Hitler annexed it, he was sent first to Dachau and then to Buchenwald, but was then released in a mass amnesty to celebrate Hitler’s birthday in 1939 — an extraordinary reprieve, after which he left at once for America.
  • we are used to reading philosophy as offering a universal message for all times and places — or at least as aiming to do so. But Heidegger disliked the notion of universal truths or universal humanity, which he considered a fantasy. For him, Dasein is not defined by shared faculties of reason and understanding, as the Enlightenment philosophers thought. Still less is it defined by any kind of transcendent eternal soul, as in religious tradition. We do not exist on a higher, eternal plane at all. Dasein’s Being is local: it has a historical situation, and is constituted in time and place.
  • For Marcel, learning to stay open to reality in this way is the philosopher’s prime job. Everyone can do it, but the philosopher is the one who is called on above all to stay awake, so as to be the first to sound the alarm if something seems wrong.
  • Second, it also means understanding that we are historical beings, and grasping the demands our particular historical situation is making on us. In what Heidegger calls ‘anticipatory resoluteness’, Dasein discovers ‘that its uttermost possibility lies in giving itself up’. At that moment, through Being-towards-death and resoluteness in facing up to one’s time, one is freed from the they-self and attains one’s true, authentic self.
  • If we are temporal beings by our very nature, then authentic existence means accepting, first, that we are finite and mortal. We will die: this all-important realisation is what Heidegger calls authentic ‘Being-towards-Death’, and it is fundamental to his philosophy.
  • Hannah Arendt, instead, left early on: she had the benefit of a powerful warning. Just after the Nazi takeover, in spring 1933, she had been arrested while researching materials on anti-Semitism for the German Zionist Organisation at Berlin’s Prussian State Library. Her apartment was searched; both she and her mother were locked up briefly, then released. They fled, without stopping to arrange travel documents. They crossed to Czechoslovakia (then still safe) by a method that sounds almost too fabulous to be true: a sympathetic German family on the border had a house with its front door in Germany and its back door in Czechoslovakia. The family would invite people for dinner, then let them leave through the back door at night.
  • As Sartre argued in his 1943 review of The Stranger, basic phenomenological principles show that experience comes to us already charged with significance. A piano sonata is a melancholy evocation of longing. If I watch a soccer match, I see it as a soccer match, not as a meaningless scene in which a number of people run around taking turns to apply their lower limbs to a spherical object. If the latter is what I’m seeing, then I am not watching some more essential, truer version of soccer; I am failing to watch it properly as soccer at all.
  • Much as they liked Camus personally, neither Sartre nor Beauvoir accepted his vision of absurdity. For them, life is not absurd, even when viewed on a cosmic scale, and nothing can be gained by saying it is. Life for them is full of real meaning, although that meaning emerges differently for each of us.
  • For Sartre, we show bad faith whenever we portray ourselves as passive creations of our race, class, job, history, nation, family, heredity, childhood influences, events, or even hidden drives in our subconscious which we claim are out of our control. It is not that such factors are unimportant: class and race, in particular, he acknowledged as powerful forces in people’s lives, and Simone de Beauvoir would soon add gender to that list.
  • Sartre takes his argument to an extreme point by asserting that even war, imprisonment or the prospect of imminent death cannot take away my existential freedom. They form part of my ‘situation’, and this may be an extreme and intolerable situation, but it still provides only a context for whatever I choose to do next. If I am about to die, I can decide how to face that death. Sartre here resurrects the ancient Stoic idea that I may not choose what happens to me, but I can choose what to make of it, spiritually speaking.
  • But the Stoics cultivated indifference in the face of terrible events, whereas Sartre thought we should remain passionately, even furiously engaged with what happens to us and with what we can achieve. We should not expect freedom to be anything less than fiendishly difficult.
  • Freedom does not mean entirely unconstrained movement, and it certainly does not mean acting randomly. We often mistake the very things that enable us to be free — context, meaning, facticity, situation, a general direction in our lives — for things that define us and take away our freedom. It is only with all of these that we can be free in a real sense.
  • Nor did he mean that privileged groups have the right to pontificate to the poor and downtrodden about the need to ‘take responsibility’ for themselves. That would be a grotesque misreading of Sartre’s point, since his sympathy in any encounter always lay with the more oppressed side. But for each of us — for me — to be in good faith means not making excuses for myself.
  • Camus’ novel gives us a deliberately understated vision of heroism and decisive action compared to those of Sartre and Beauvoir. One can only do so much. It can look like defeatism, but it shows a more realistic perception of what it takes to actually accomplish difficult tasks like liberating one’s country.
  • Camus just kept returning to his core principle: no torture, no killing — at least not with state approval. Beauvoir and Sartre believed they were taking a more subtle and more realistic view. If asked why a couple of innocuous philosophers had suddenly become so harsh, they would have said it was because the war had changed them in profound ways. It had shown them that one’s duties to humanity could be more complicated than they seemed. ‘The war really divided my life in two,’ Sartre said later.
  • Poets and artists ‘let things be’, but they also let things come out and show themselves. They help to ease things into ‘unconcealment’ (Unverborgenheit), which is Heidegger’s rendition of the Greek term alētheia, usually translated as ‘truth’. This is a deeper kind of truth than the mere correspondence of a statement to reality, as when we say ‘The cat is on the mat’ and point to a mat with a cat on it. Long before we can do this, both cat and mat must ‘stand forth out of concealedness’. They must un-hide themselves.
  • Heidegger does not use the word ‘consciousness’ here because — as with his earlier work — he is trying to make us think in a radically different way about ourselves. We are not to think of the mind as an empty cavern, or as a container filled with representations of things. We are not even supposed to think of it as firing off arrows of intentional ‘aboutness’, as in the earlier phenomenology of Brentano. Instead, Heidegger draws us into the depths of his Schwarzwald, and asks us to imagine a gap with sunlight filtering in. We remain in the forest, but we provide a relatively open spot where other beings can bask for a moment. If we did not do this, everything would remain in the thickets, hidden even to itself.
  • The astronomer Carl Sagan began his 1980 television series Cosmos by saying that human beings, though made of the same stuff as the stars, are conscious and are therefore ‘a way for the cosmos to know itself’. Merleau-Ponty similarly quoted his favourite painter Cézanne as saying, ‘The landscape thinks itself in me, and I am its consciousness.’ This is something like what Heidegger thinks humanity contributes to the earth. We are not made of spiritual nothingness; we are part of Being, but we also bring something unique with us. It is not much: a little open space, perhaps with a path and a bench like the one the young Heidegger used to sit on to do his homework. But through us, the miracle occurs.
  • Beauty aside, Heidegger’s late writing can also be troubling, with its increasingly mystical notion of what it is to be human. If one speaks of a human being mainly as an open space or a clearing, or a means of ‘letting beings be’ and dwelling poetically on the earth, then one doesn’t seem to be talking about any recognisable person. The old Dasein has become less human than ever. It is now a forestry feature.
  • Even today, Jaspers, the dedicated communicator, is far less widely read than Heidegger, who has influenced architects, social theorists, critics, psychologists, artists, film-makers, environmental activists, and innumerable students and enthusiasts — including the later deconstructionist and post-structuralist schools, which took their starting point from his late thinking. Having spent the late 1940s as an outsider and then been rehabilitated, Heidegger became the overwhelming presence in university philosophy all over the European continent from then on.
  • As Levinas reflected on this experience, it helped to lead him to a philosophy that was essentially ethical, rather than ontological like Heidegger’s. He developed his ideas from the work of Jewish theologian Martin Buber, whose I and Thou in 1923 had distinguished between my relationship with an impersonal ‘it’ or ‘them’, and the direct personal encounter I have with a ‘you’. Levinas took it further: when I encounter you, we normally meet face-to-face, and it is through your face that you, as another person, can make ethical demands on me. This is very different from Heidegger’s Mitsein or Being-with, which suggests a group of people standing alongside one another, shoulder to shoulder as if in solidarity — perhaps as a unified nation or Volk.
  • For Levinas, we literally face each other, one individual at a time, and that relationship becomes one of communication and moral expectation. We do not merge; we respond to one another. Instead of being co-opted into playing some role in my personal drama of authenticity, you look me in the eyes — and you remain Other. You remain you.
  • This relationship is more fundamental than the self, more fundamental than consciousness, more fundamental even than Being — and it brings an unavoidable ethical obligation. Ever since Husserl, phenomenologists and existentialists had been trying to stretch the definition of existence to incorporate our social lives and relationships. Levinas did more: he turned philosophy around entirely so that these relationships were the foundation of our existence, not an extension of it.
  • Her last work, The Need for Roots, argues, among other things, that none of us has rights, but each one of us has a near-infinite degree of duty and obligation to the other. Whatever the underlying cause of her death — and anorexia nervosa seems to have been involved — no one could deny that she lived out her philosophy with total commitment. Of all the lives touched on in this book, hers is surely the most profound and challenging application of Iris Murdoch’s notion that a philosophy can be ‘inhabited’.
  • Other thinkers took radical ethical turns during the war years. The most extreme was Simone Weil, who actually tried to live by the principle of putting other people’s ethical demands first. Having returned to France after her travels through Germany in 1932, she had worked in a factory so as to experience the degrading nature of such work for herself. When France fell in 1940, her family fled to Marseilles (against her protests), and later to the US and to Britain. Even in exile, Weil made extraordinary sacrifices. If there were people in the world who could not sleep in a bed, she would not do so either, so she slept on the floor.
  • The mystery tradition had roots in Kierkegaard’s ‘leap of faith’. It owed much to the other great nineteenth-century mystic of the impossible, Dostoevsky, and to older theological notions. But it also grew from the protracted trauma that was the first half of the twentieth century. Since 1914, and especially since 1939, people in Europe and elsewhere had come to the realisation that we cannot fully know or trust ourselves; that we have no excuses or explanations for what we do — and yet that we must ground our existence and relationships on something firm, because otherwise we cannot survive.
  • One striking link between these radical ethical thinkers, all on the fringes of our main story, is that they had religious faith. They also granted a special role to the notion of ‘mystery’ — that which cannot be known, calculated or understood, especially when it concerns our relationships with each other. Heidegger was different from them, since he rejected the religion he grew up with and had no real interest in ethics — probably as a consequence of his having no real interest in the human.
  • Meanwhile, the Christian existentialist Gabriel Marcel was also still arguing, as he had since the 1930s, that ethics trumps everything else in philosophy and that our duty to each other is so great as to play the role of a transcendent ‘mystery’. He too had been led to this position partly by a wartime experience: during the First World War he had worked for the Red Cross’ Information Service, with the unenviable job of answering relatives’ inquiries about missing soldiers. Whenever news came, he passed it on, and usually it was not good. As Marcel later said, this task permanently inoculated him against warmongering rhetoric of any kind, and it made him aware of the power of what is unknown in our lives.
  • As the play’s much-quoted and frequently misunderstood final line has it: ‘Hell is other people.’ Sartre later explained that he did not mean to say that other people were hellish in general. He meant that after death we become frozen in their view, unable any longer to fend off their interpretation. In life, we can still do something to manage the impression we make; in death, this freedom goes and we are left entombed in other people’s memories and perceptions.
  • We have to do two near-impossible things at once: understand ourselves as limited by circumstances, and yet continue to pursue our projects as though we are truly in control. In Beauvoir’s view, existentialism is the philosophy that best enables us to do this, because it concerns itself so deeply with both freedom and contingency. It acknowledges the radical and terrifying scope of our freedom in life, but also the concrete influences that other philosophies tend to ignore: history, the body, social relationships and the environment.
  • The aspects of our existence that limit us, Merleau-Ponty says, are the very same ones that bind us to the world and give us scope for action and perception. They make us what we are. Sartre acknowledged the need for this trade-off, but he found it more painful to accept. Everything in him longed to be free of bonds, of impediments and limitations
  • Of course we have to learn this skill of interpreting and anticipating the world, and this happens in early childhood, which is why Merleau-Ponty thought child psychology was essential to philosophy. This is an extraordinary insight. Apart from Rousseau, very few philosophers before him had taken childhood seriously; most wrote as though all human experience were that of a fully conscious, rational, verbal adult who has been dropped into this world from the sky — perhaps by a stork.
  • For Merleau-Ponty, we cannot understand our experience if we don’t think of ourselves in part as overgrown babies. We fall for optical illusions because we once learned to see the world in terms of shapes, objects and things relevant to our own interests. Our first perceptions came to us in tandem with our first active experiments in observing the world and reaching out to explore it, and are still linked with those experiences.
  • Another factor in all of this, for Merleau-Ponty, is our social existence: we cannot thrive without others, or not for long, and we need this especially in early life. This makes solipsistic speculation about the reality of others ridiculous; we could never engage in such speculation if we hadn’t already been formed by them.
  • As Descartes could have said (but didn’t), ‘I think, therefore other people exist.’ We grow up with people playing with us, pointing things out, talking, listening, and getting us used to reading emotions and movements; this is how we become capable, reflective, smoothly integrated beings.
  • In general, Merleau-Ponty thinks human experience only makes sense if we abandon philosophy’s time-honoured habit of starting with a solitary, capsule-like, immobile adult self, isolated from its body and world, which must then be connected up again — adding each element around it as though adding clothing to a doll. Instead, for him, we slide from the womb to the birth canal to an equally close and total immersion in the world. That immersion continues as long as we live, although we may also cultivate the art of partially withdrawing from time to time when we want to think or daydream.
  • When he looks for his own metaphor to describe how he sees consciousness, he comes up with a beautiful one: consciousness, he suggests, is like a ‘fold’ in the world, as though someone had crumpled a piece of cloth to make a little nest or hollow. It stays for a while, before eventually being unfolded and smoothed away. There is something seductive, even erotic, in this idea of my conscious self as an improvised pouch in the cloth of the world. I still have my privacy — my withdrawing room. But I am part of the world’s fabric, and I remain formed out of it for as long as I am here.
  • By the time of these works, Merleau-Ponty is taking his desire to describe experience to the outer limits of what language can convey. Just as with the late Husserl or Heidegger, or Sartre in his Flaubert book, we see a philosopher venturing so far from shore that we can barely follow. Emmanuel Levinas would head out to the fringes too, eventually becoming incomprehensible to all but his most patient initiates.
  • Sartre once remarked — speaking of a disagreement they had about Husserl in 1941 — that ‘we discovered, astounded, that our conflicts had, at times, stemmed from our childhood, or went back to the elementary differences of our two organisms’. Merleau-Ponty also said in an interview that Sartre’s work seemed strange to him, not because of philosophical differences, but because of a certain ‘register of feeling’, especially in Nausea, that he could not share. Their difference was one of temperament and of the whole way the world presented itself to them.
  • The two also differed in their purpose. When Sartre writes about the body or other aspects of experience, he generally does it in order to make a different point. He expertly evokes the grace of his café waiter, gliding between the tables, bending at an angle just so, steering the drink-laden tray through the air on the tips of his fingers — but he does it all in order to illustrate his ideas about bad faith. When Merleau-Ponty writes about skilled and graceful movement, the movement itself is his point. This is the thing he wants to understand.
  • We can never move definitively from ignorance to certainty, for the thread of the inquiry will constantly lead us back to ignorance again. This is the most attractive description of philosophy I’ve ever read, and the best argument for why it is worth doing, even (or especially) when it takes us no distance at all from our starting point.
  • By prioritising perception, the body, social life and childhood development, Merleau-Ponty gathered up philosophy’s far-flung outsider subjects and brought them in to occupy the centre of his thought.
  • In his inaugural lecture at the Collège de France on 15 January 1953, published as In Praise of Philosophy, he said that philosophers should concern themselves above all with whatever is ambiguous in our experience. At the same time, they should think clearly about these ambiguities, using reason and science. Thus, he said, ‘The philosopher is marked by the distinguishing trait that he possesses inseparably the taste for evidence and the feeling for ambiguity.’ A constant movement is required between these two
  • As Sartre wrote in response to Hiroshima, humanity had now gained the power to wipe itself out, and must decide every single day that it wanted to live. Camus also wrote that humanity faced the task of choosing between collective suicide and a more intelligent use of its technology — ‘between hell and reason’. After 1945, there seemed little reason to trust in humanity’s ability to choose well.
  • Merleau-Ponty observed in a lecture of 1951 that, more than any previous century, the twentieth century had reminded people how ‘contingent’ their lives were — how at the mercy of historical events and other changes that they could not control. This feeling went on long after the war ended. After the A-bombs were dropped on Hiroshima and Nagasaki, many feared that a Third World War would not be long in coming, this time between the Soviet Union and the United States.
caelengrubb

Why economics needs economic history | VOX, CEPR Policy Portal - 1 views

  • The current economic and financial crisis has given rise to a vigorous debate about the state of economics, and the training which graduate and undergraduate economics students are receiving.
  • Employers are increasingly complaining that young economists don’t understand how the financial system actually works, and are ill-prepared to think about appropriate policies at a time of crisis.
  • Knowledge of economic and financial history is crucial in thinking about the economy in several ways.
  • ...6 more annotations...
  • A second, related point is that economic history teaches students the importance of context
  • Third, economic history is an unapologetically empirical field, exclusively dedicated to understanding the real world.
  • Fourth, economic history is a rich source of informal theorising about the real world, which can help motivate more formal theoretical work later on (Wren-Lewis 2013).
  • Fifth, even once the current economic and financial crisis has passed, the major long-run challenges facing the world will still remain.
  • Sixth, economic theory itself has been emphasising – for well over 20 years now – that path dependence is ubiquitous (David 1985)
  • Finally, and perhaps most importantly from the perspective of an undergraduate economics instructor, economic history is a great way of convincing undergraduates that the theory they are learning in their micro and macro classes is useful in helping them make sense of the real world.
anonymous

Excellence Runs in the Family. Her Novel's Heroine Wants Something Else. - The New York... - 0 views

  • Excellence Runs in the Family. Her Novel’s Heroine Wants Something Else
  • Kaitlyn Greenidge and her sisters achieved success in their respective fields
  • In her historical novel, “Libertie,” she focuses on a Black woman who doesn’t yearn to be the first or only one of anything.
  • ...23 more annotations...
  • Kaitlyn Greenidge learned about the first Black woman to become a doctor in New York. “I filed it away and thought, if I ever got a chance to write a novel, I would want it to be about this,” she said.
  • Libertie, the rebellious heroine of Kaitlyn Greenidge’s new novel, comes from an extraordinary family, but longs to be ordinary.
  • As a young Black woman growing up in Reconstruction-era Brooklyn, Libertie is expected to follow in the footsteps of her trailblazing mother, a doctor who founded a women’s clinic.
  • “So much of Black history is focused on exceptional people,”
  • “What I wanted to explore is, what’s the emotional and psychological toll of being an exception, of being exceptional, and also, what about the people who just want to have a regular life and find freedom and achievement in being able to live in peace with their family — which is what Libertie wants?”
  • “If you come from a marginalized community, one of the ways you are marginalized is people telling you that you don’t have any history, or that your history is somehow diminished, or it’s very flat, or it’s not somehow as rich as the dominant history.”
  • “That idea of being the first and the only was a big piece of our experience,”
  • They are engaged in ongoing conversations about their writing, though they draw the line at reading and editing drafts of one another’s work.
  • Libertie
  • The novel has drawn praise from writers like Jacqueline Woodson, Mira Jacob and Garth Greenwell, who wrote in a blurb that Greenidge “adds an indelible new sound to American literature, and confirms her status as one of our most gifted young writers.”
  • raised by a single mother who struggled to support the family on her social worker’s salary,
  • “I’ve always been interested in the histories of things that are lesser known,”
  • “There’s a really powerful lyricism that feels new in this voice,”
  • Greenidge and her sisters developed a reverence for storytelling and history early on, when their parents and grandparents would tell stories about their ancestors and what life was like during the civil rights movement.
  • “That fracture was really formative for me,” she said. “It made me hyper aware of inequality and the doublespeak that goes on in America around the American dream and American exceptionalism, because that was proven to me not to be true.”
  • Greenidge was collecting stories from people whose ancestors had lived there, and tracked down a woman named Ellen Holly, who was the first Black actress to have a lead, recurring role on daytime TV, in “One Life to Live.”
  • Greenidge filed the family’s saga away in her mind, thinking she had the premise for a novel. When she got a writing fellowship, she was able to quit her side jobs and immerse herself in the research the novel required.
  • The resulting story feels both epic and intimate. As she reimagined the lives of the doctor and her daughter, Greenidge wove in other historical figures and events.
  • In one horrific scene, Libertie and her mother tend to Black families who fled Manhattan during the New York City draft riots.
  • Greenidge also drew on her own family history, and her experience of being a new mother.
  • Her daughter, Mavis, was born days after she finished a second draft of the book, and is now 18 months old. She finished revisions while living in a multigenerational household with her own mother and sisters.
  • “Mother-daughter relationships are like the central relationships in my life,”
  • “I cannot think of a greater freedom than raising you,”
marleen_ueberall

Knowledge formation and its history - Research themes - CLUE+ Research Institute for Cu... - 0 views

  • Knowledge is a key concept and an instrument of high value in contemporary society.
  • The ambition illustrates the idea that knowledge is conceived as a goal and an ideal for education and development of individuals and groups, including academic education and research, as well as a profitable instrument for researchers, employers, entrepreneurs, and government institutions in our society.
  • And these discussions of knowledge
  • ...5 more annotations...
  • preconditions for attributing these functions to knowledge include critical reflections on the nature of knowledge, its possibilities and limitations.
  • cover the sources of knowledge, its history, and justification narratives, its strengths, limitations, and the distinctive levels of knowledge,
  • The formation of knowledge is taken to refer to the construction of normative orders
  • These normative orders are formed by distinctions such as true-untrue, good-bad, justified-unjustified, beautiful-ugly, as in science, art, law, history, philosophy, economy, sociology, anthropology, and religion.
  • The discussions of these normative orders cover the analyses of their formation, embodiment, application and education, their histories and their constituents.
Javier E

History News Network | Why Historians Need Imagination - 2 views

  • There are two types of imagination: Fantasy-directed imagination, and Reality-directed imagination.
  • Fantasy-directed imagination is aimed at depicting a scenario that goes beyond reality. An example of fantasy-directed imagination would be the creation of Mickey Mouse.
  • Reality-directed imagination, on the other hand, is aimed at depicting a scenario that reflects reality, whether as it is known at present or as it is known to have existed in the past. An example of reality-directed imagination would be the study of Napoleon.
  • ...7 more annotations...
  • Reality-directed imagination, for its part, endeavours to re-create, in the intellectual realm, actions and events that have existed or have taken place, which we may have plenty or partial information about.
  • Reality-directed imagination is thus a means to retain a solid sense of reality rather than to submerge into the everlasting landscape of fantasy. We imagine what was and try to afford it life.  
  • In the study of history we make use of reality-directed imagination as we depict in our minds the characters of individuals or the nature of events. We even try to fill the gaps by resorting to our imagination, ever vigilant not to lose sight of reality as it was. In other words, we attempt to imagine the unknown by resorting to the known.
  • By resorting to reality-directed imagination we are able intellectually to disconnect ourselves from the present; to visualize, like a landscape gradually making its appearance as we move backwards in time, the setting in which an event occurred or the personal features of an individual we follow. We are able emotionally to connect ourselves to the prevailing conditions or to a person's thoughts.
  • Without reality-directed imagination, on the other hand, the study of history would be well-nigh impossible.
  • Without imagination as a study-device, the learning of history becomes well-nigh impossible, for the information furnished to us is rendered unintelligible. We are unable to relate to it in any meaningful manner. We assess it in a mechanical way, devoid of image, sound and feel. Our attempt to understand it leads to a dead-end for we cannot leap forward from the stale fact before us and relate it to other facts beyond it.
  • Without imagination we cannot compare, distinguish and separate; we cannot know the difference between the particular and the general. In order to study history we need to avoid the mechanical, on the one hand, and the fantastic, on the other. In other words, we ought to eschew both lack of imagination and fantasy-directed imagination; the first does not allow us to proceed forward while the latter leads us to the realm of the unreal.
charlottedonoho

Beware Eurosceptic versions of history and science| Rebekah Higgitt | Science | The Gua... - 1 views

  • Readers of the Guardian Science pages may not have noticed the group called Historians for Britain, or a recent piece in History Today by David Abulafia asserting their belief “that Britain’s unique history sets it apart from the rest of Europe”.
  • It requires critical scrutiny from everyone with an interest in Britain’s relationship with the rest of the world, and in evidence-based political discussion.
  • Abulafia’s article is a classic example of an old-fashioned “Whiggish” narrative. It claims a uniquely moderate and progressive advance toward the development of British institutions, traced continuously from Magna Carta and isolated from the rages and radicalism of the Continent.
  • ...3 more annotations...
  • The answer is not “because Britain is better and unique” but “because I am British and these are the stories I have been brought up on” at school, university, on TV and elsewhere. Go to another country and you will see that they have their own, equally admirable, pantheon of greats.
  • The area that I have been working on, the eighteenth-century search for longitude, likewise reveals the need to challenge nationalistic assumptions.
  • Historians and readers of history both need to be aware of the biases of our education and literature. Accounts of British exceptionalism, especially those that lump the rest of Europe or the world into an amorphous group of also-rans, are more the result of national tradition and wishful thinking than a careful reading of the sources.
markfrankel18

Twitter histories of events are vanishing - Salon.com - 1 views

  • As the Technology Review reported: A significant proportion of the websites that this social media [around the Arab Spring] points to has disappeared. And the same pattern occurs for other culturally significant events, such as the H1N1 virus outbreak, Michael Jackson’s death and the Syrian uprising. In other words, our history, as recorded by social media, is slowly leaking away.
  • So it seems that social media sites like Twitter do not remain as fecund a resource over time as they do in real time. But no historian has ever worked on the assumption that all, or even most, information about an event is preserved, let alone even recorded. Not even Twitter has changed that.
  • Nowadays, we’re very good at telling history in real time. Live-tweeting, livestreaming, Instagraming, link sharing, instant commenting — everyday lives and major events are recorded and narrated from every angle as they happen. A new study has found, however, that these minutes-old histories may not be built to last.
qkirkpatrick

Jim Harrington: Beware Revising American History Education - The Patriot Post - 0 views

  • The College Board, a nonprofit corporation founded in 1926 to make higher education accessible to more Americans, introduced the SAT (Scholastic Aptitude Test) and, after World War II, the advanced placement (AP) test.
  • Students who score three of five points on the AP test are credited with completing a two-semester introductory college course in subjects like English, math and history. But revisions are coming, and they’re not going to result in better education.
  • Periodically, the College Board publishes new frameworks to alert schools to changes in AP tests. The latest American history framework has raised a raging controversy over what should be taught in AP U.S. history classes.
  • ...1 more annotation...
  • “By obscuring this nation’s founding principles and promise, the College Board’s U.S. history guidelines will erode the next generation’s disposition to preserve what is best in the American political tradition.”
  •  
    A look at how standardized teaching methods are shaping what is taught and how it is taught in schools.
dpittenger

Why Oklahoma Lawmakers Want to Ban AP US History -- NYMag - 0 views

  •  
    Oklahoma lawmakers want to ban AP US History because it takes too negative an outlook on the history of the US.
carolinewren

Book Review: 'A New History of Life' by Peter Ward and Joe Kirschvink - WSJ - 0 views

  • I imagine that physicists are similarly deluged with revelations about how to build a perpetual-motion machine or about the hitherto secret truth behind relativity. And so I didn’t view the arrival of “A New History of Life” with great enthusiasm.
  • subtitle breathlessly promises “radical new discoveries about the origins and evolution of life on earth,” while the jacket copy avers that “our current paradigm for understanding the history of life on Earth dates back to Charles Darwin’s time, yet scientific advances of the last few decades have radically reshaped that aging picture.”
  • authors Peter Ward and Joe Kirschvink are genuine scientists—paleontologists, to be exact. And they can write.
  • ...16 more annotations...
  • even genuine scientists are human and as such susceptible to the allure of offering up new paradigms (as the historian of science Thomas Kuhn put it)
  • paleontologist Stephen Jay Gould insisted that his conception of “punctuated equilibria” (a kind of Marxist biology that blurred the lines between evolution and revolution), which he developed along with fellow paleontologist Niles Eldredge, upended the traditional Darwinian understanding of how natural selection works.
  • This notion doesn’t constitute a fundamental departure from plain old evolution by natural selection; it simply italicizes that sometimes the process is comparatively rapid, other times slower.
  • In addition, they have long had a peculiar perspective on evolution, because of the limitations of the fossil record
  • Darwin was a pioneering geologist as well as the greatest of all biologists, and his insights were backgrounded by the key concept of uniformitarianism, as advocated by Charles Lyell, his friend and mentor
  • previously regnant paradigm among geologists had been “catastrophism”
  • fossil record was therefore seen as reflecting the creation and extinction of new species by an array of dramatic and “unnatural” dei ex machina.
  • Of late, however, uniformitarianism has been on a losing streak. Catastrophism is back, with a bang . . . or a flood, or a burst of extraterrestrial radiation, or an onslaught of unpleasant, previously submerged chemicals
  • This emphasis on catastrophes is the first of a triad of novelties on which “A New History of Life” is based. The second involves an enhanced role for some common but insufficiently appreciated inorganic molecules, notably carbon dioxide, oxygen and hydrogen sulfide.
  • Life didn’t so much unfold smoothly over hundreds of millions of years as lurch chaotically in response to diverse crises and opportunities: too much oxygen, too little carbon dioxide, too little oxygen, too much carbon dioxide, too hot, too cold
  • So far, so good, except that in their eagerness to emphasize what is new and different, the authors teeter on the verge of the same trap as Gould: exaggerating the novelty of their own ideas.
  • Things begin to unravel when it comes to the third leg of Messrs. Ward and Kirschvink’s purported paradigmatic novelty: a supposed role for ecosystems—rain forests, deserts, rivers, coral reefs, deep-sea vents—as units of evolutionary change
  • “While the history of life may be populated by species,” they write, “it has been the evolution of ecosystems that has been the most influential factor in arriving at the modern-day assemblage of life. . . . [W]e know that on occasion in the deep past entirely new ecosystems appear, populated by new kinds of life.” True enough, but it is those “new kinds of life,” not whole ecosystems, upon which natural selection acts.
  • One of the most common popular misconceptions about evolution is that it proceeds “for the good of the species.”
  • The problem is that smaller, nimbler units are far more likely to reproduce differentially than are larger, clumsier, more heterogeneous ones. Insofar as ecosystems are consequential for evolution—and doubtless they are—it is because, like occasional catastrophes, they provide the immediate environment within which something not-so-new is acted out.
  • This is natural selection doing its same-old, same-old thing: acting by a statistically potent process of variation combined with selective retention and differential reproduction, a process that necessarily operates within the particular ecosystem that a given lineage occupies.
Javier E

Economics: Economic history should be at the heart of economics instruction - 0 views

  • ONE of the stranger myths about the recent financial crisis is that no one saw it coming. In fact quite a lot of economists saw it coming, and for years had been writing with dread about the growing global imbalances and the necessary financial adjustments.
  • It didn't require a very sophisticated understanding of economics, just some knowledge of history. Every previous globalisation cycle except one (the one cut short in 1914) ended that way, and nothing in the current cycle seemed fundamentally different from what had happened before.
  • while many economists, bankers and policymakers were caught flatfooted by the crisis, most economists with real knowledge of economic and financial history—and by history I do not mean the last twenty years or thirty years—thought a crisis almost inevitable and broadly understood how it was going to occur.
  • ...1 more annotation...
  • So how should the teaching of economics change? That's easy. While mathematical fluency is very useful, it should not be at the heart of economics instruction. That place should be reserved for economic history.
Javier E

Economic history: When did globalisation start? | The Economist - 0 views

  • economic historians reckon the question of whether the benefits of globalisation outweigh the downsides is more complicated than this. For them, the answer depends on when you say the process of globalisation started.
  • it is impossible to say how much of a “good thing” a process is in history without first defining for how long it has been going on.
  • Although Adam Smith himself never used the word, globalisation is a key theme in the Wealth of Nations. His description of economic development has as its underlying principle the integration of markets over time. As the division of labour enables output to expand, the search for specialisation expands trade, and gradually, brings communities from disparate parts of the world together
  • ...6 more annotations...
  • Smith had a particular example in mind when he talked about market integration between continents: Europe and America.
  • Kevin O’Rourke and Jeffrey Williamson argued in a 2002 paper that globalisation only really began in the nineteenth century when a sudden drop in transport costs allowed the prices of commodities in Europe and Asia to converge
  • But there is one important market that Messrs O’Rourke and Williamson ignore in their analysis: that for silver. As European currencies were generally based on the value of silver, any change in its value would have had big effects on the European price level.
  • The impact of what historians have called the resulting “price revolution” dramatically changed the face of Europe. Historians attribute everything from the dominance of the Spanish Empire in Europe to the sudden increase in witch hunts around the sixteenth century to the destabilising effects of inflation on European society. And if it were not for the sudden increase in the flow of silver from Europe to China and India during this period, European inflation would have been much worse than it was. Price rises only stopped in about 1650 when the price of silver coinage in Europe fell to such a low level that it was no longer profitable to import it from the Americas.
  • The German historical economist Andre Gunder Frank has argued that the start of globalisation can be traced back to the growth of trade and market integration between the Sumer and Indus civilisations of the third millennium BC. Trade links between China and Europe first grew during the Hellenistic Age, with further increases in global market convergence occurring when transport costs dropped in the sixteenth century and more rapidly in the modern era of globalisation, which Messrs O’Rourke and Williamson describe as after 1750.
  • it is clear that globalisation is not simply a process that started in the last two decades or even the last two centuries. It has a history that stretches thousands of years, starting with Smith’s primitive hunter-gatherers trading with the next village, and eventually developing into the globally interconnected societies of today. Whether you think globalisation is a “good thing” or not, it appears to be an essential element of the economic history of mankind.
Javier E

Donald Trump's despicable words - The Washington Post - 0 views

  • “We condemn in the strongest possible terms this egregious display of hatred, bigotry and violence on many sides. On many sides,” he said Saturday.
  • It is important when you consider the situation of a man whose face has been crushed by a boot to wonder if any damage might have been done to the boot.
  • At what point can we stop giving people the benefit of the doubt? “Gotta Hear Both Sides” is carved over the entrance to Hell. How long must we continue to hear from idiots who are wrong? I don’t want to hear debate unless there is something legitimately to be debated, and people’s rights to life, liberty and the pursuit of happiness are not among those things. They are self-evident, or used to seem so.
  • ...9 more annotations...
  • Of course they gathered with torches, because the only liberty they have lost is the liberty to gather with torches and decide whose house to visit with terror. That is the right that is denied them: the right to other people’s possessions, the right to be the only person in the room, the right to be the only person that the world is made for. (These are not rights. They are wrongs.)
  • You are sad because your toys have been taken, but they were never toys to begin with. They were people. It is the ending of the fairy tale; because you were a beast, you did not see that the things around you were people and not objects that existed purely for your pleasure. You should not weep that the curse is broken and you can see that your footstool was a human being.
  • so little good is unmixed. History contains heroes, but no one is a hero entirely, and no one is a hero for very long. You can be brilliant in some ways and despicable in others. You can be a clean, upright, moral individual in your private life who never swears, treats women with respect, and speaks highly of duty and honor – and go out every day and dedicate yourself to a cause that makes the world worse.
  • A truth that murder mysteries get right about human nature is that even when you find a man stabbed before the soup course, someone always wants to finish the soup.
  • All right: You are not a murderer. You are a good person. But that does not mean that what you have was not ill-gotten. That does not mean that you deserve everything you have. You have to look at your history and see it, all of it.
  • We must cherish our history. (Somewhere, a dog whimpers.) Can we be a little more specific about what history? Can we be a little more specific about any of this? The specifics are where the principles are. What will we cherish, and what will we disavow? What are we putting on a pedestal, and what are we putting in a museum? Not all history is created equal.
  • we have always been a country where things like this can happen. It is just harder not to notice now. And it is possible, sometimes, to be angrier at the person who makes you notice than at the thing you are seeing.
  • Who would stand over the body of someone who died protesting a hateful, violent, racist ideology and say that “we have to come together”? That we have to find common ground? I am sure there is common ground to be found with the people who say that some are not fit to be people. The man who thinks I ought not to exist — maybe we can compromise and agree that I will get to exist on alternate Thursdays. Let us only burn some of the villagers at the stake. We can eat just three of the children. All ideas deserve a fair hearing. Maybe we can agree that some people are only three-fifths of people, while we are at it. As long as we are giving a hearing to all views.
  • Only someone with no principles would think that such a compromise was possible. Only someone with no principles would think that such a compromise was desirable. At some point you have to judge more than just the act of fighting. You have to judge what the fighting is for. Some principles are worth fighting for, and others are not
sissij

Have we got Machiavelli all wrong? | Books | The Guardian - 0 views

  • The quickest way, it says, is to have fortune on your side from the outset, with plenty of inherited money and a leg up through family connections.
  • Make the people your best friend. Promise to protect their interests against predatory elites and foreigners.
  • he recommended them – that he himself is the original Machiavellian, the first honest teacher of dishonest politics.
  • ...4 more annotations...
  • But what if we’re overlooking Machiavelli’s less obvious messages, his deeper insights into politics?
  • Yes, he made sinister excuses for violence and hypocrisy. But his reasons were patriotic, well-meaning, human.
  • But he also says – in a passage most scholars pass over – that “victories are never secure without some respect, especially for justice”. For every cynical Machiavellian precept, I found two or three others that clashed with it.
  • If we look again at how he lived his life and how that life shaped his thoughts, it looks as if we’ve got Machiavelli all wrong.
  •  
    Overinterpretation is very tricky because we can never be certain of what other people are trying to say, especially when we are studying the words of people in history. People are never consistent with themselves, and I think that's why life is so interesting. People love coming-of-age stories because you never know where the character will go. The same goes for figures in history: they are not black or white; they are a mix. But confirmation bias always makes us unconsciously select the words of theirs that support our own opinion. Although we always say we can learn from history, what we are actually doing is just agreeing with what we already believed. This is not real learning. --Sissi (3/4/2017)