Group items matching "Civil War" in title, tags, annotations or url

Javier E

The Philosopher Redefining Equality | The New Yorker - 0 views

  • The bank experience showed how you could be oppressed by hierarchy, working in an environment where you were neither free nor equal. But this implied that freedom and equality were bound together in some way beyond the basic state of being unenslaved, which was an unorthodox notion. Much social thought is rooted in the idea of a conflict between the two.
  • If individuals exercise freedoms, conservatives like to say, some inequalities will naturally result. Those on the left basically agree—and thus allow constraints on personal freedom in order to reduce inequality. The philosopher Isaiah Berlin called the opposition between equality and freedom an “intrinsic, irremovable element in human life.” It is our fate as a society, he believed, to haggle toward a balance between them.
  • What if they weren’t opposed, Anderson wondered, but, like the sugar-phosphate chains in DNA, interlaced in a structure that we might not yet understand?
  • At fifty-nine, Anderson is the chair of the University of Michigan’s department of philosophy and a champion of the view that equality and freedom are mutually dependent, enmeshed in changing conditions through time.
  • She has built a case, elaborated across decades, that equality is the basis for a free society
  • Because she brings together ideas from both the left and the right to battle increasing inequality, Anderson may be the philosopher best suited to this awkward moment in American life. She builds a democratic frame for a society in which people come from different places and are predisposed to disagree.
  • she sketched out the entry-level idea that one basic way to expand equality is by expanding the range of valued fields within a society.
  • The ability not to have an identity that one carries from sphere to sphere but, rather, to be able to slip in and adopt whatever values and norms are appropriate while retaining one’s identities in other domains?” She paused. “That is what it is to be free.”
  • How do you move from a basic model of egalitarian variety, in which everybody gets a crack at being a star at something, to figuring out how to respond to a complex one, where people, with different allotments of talent and virtue, get unequal starts, and often meet with different constraints along the way?
  • The problem, she proposed, was that contemporary egalitarian thinkers had grown fixated on distribution: moving resources from lucky-seeming people to unlucky-seeming people, as if trying to spread the luck around.
  • Egalitarians should agree about clear cases of blameless misfortune: the quadriplegic child, the cognitively impaired adult, the teen-ager born into poverty with junkie parents. But Anderson balked there, too. By categorizing people as lucky or unlucky, she argued, these egalitarians set up a moralizing hierarchy.
  • In Anderson’s view, the way forward was to shift from distributive equality to what she called relational, or democratic, equality: meeting as equals, regardless of where you were coming from or going to.
  • By letting the lucky class go on reaping the market’s chancy rewards while asking others to concede inferior status in order to receive a drip-drip-drip of redistributive aid, these egalitarians were actually entrenching people’s status as superior or subordinate.
  • To the ugly and socially awkward: . . . Maybe you won’t be such a loser in love once potential dates see how rich you are.
  • To the stupid and untalented: Unfortunately, other people don’t value what little you have to offer in the system of production. . . . Because of the misfortune that you were born so poorly endowed with talents, we productive ones will make it up to you: we’ll let you share in the bounty of what we have produced with our vastly superior and highly valued abilities. . . 
  • she imagined some citizens getting a state check and a bureaucratic letter:
  • This was, at heart, an exercise of freedom. The trouble was that many people, picking up on libertarian misconceptions, thought of freedom only in the frame of their own actions.
  • To be truly free, in Anderson’s assessment, members of a society had to be able to function as human beings (requiring food, shelter, medical care), to participate in production (education, fair-value pay, entrepreneurial opportunity), to execute their role as citizens (freedom to speak and to vote), and to move through civil society (parks, restaurants, workplaces, markets, and all the rest).
  • Anderson’s democratic model shifted the remit of egalitarianism from the idea of equalizing wealth to the idea that people should be equally free, regardless of their differences.
  • A society in which everyone had the same material benefits could still be unequal, in this crucial sense; democratic equality, being predicated on equal respect, wasn’t something you could simply tax into existence. “People, not nature, are responsible for turning the natural diversity of human beings into oppressive hierarchies,”
  • Her first book, “Value in Ethics and Economics,” appeared that year, announcing one of her major projects: reconciling value (an amorphous ascription of worth that is a keystone of ethics and economics) with pluralism (the fact that people seem to value things in different ways).
  • Philosophers have often assumed that pluralistic value reflects human fuzziness—we’re loose, we’re confused, and we mix rational thought with sentimental responses.
  • She offered an “expressive” theory: in her view, each person’s values could be various because they were socially expressed, and thus shaped by the range of contexts and relationships at play in a life. Instead of positing value as a basic, abstract quality across society (the way “utility” functioned for economists), she saw value as something determined by the details of an individual’s history.
  • Like her idea of relational equality, this model resisted the temptation to flatten human variety toward a unifying standard. In doing so, it helped expand the realm of free and reasoned economic choice.
  • Anderson’s model unseated the premises of rational-choice theory, in which individuals invariably make utility-maximizing decisions, occasionally in heartless-seeming ways. It ran with, rather than against, moral intuition. Because values were plural, it was perfectly rational to choose to spend evenings with your family, say, and have guilt toward the people you left in the lurch at work.
  • The theory also pointed out the limits on free-market ideologies, such as libertarianism.
  • In ethics, it broke across old factional debates. The core idea “has been picked up on by people across quite a range of positions,” Peter Railton, one of Anderson’s longtime colleagues, says. “Kantians and consequentialists alike”—people who viewed morality in terms of duties and obligations, and those who measured the morality of actions by their effects in the world—“could look at it and see something important.”
  • Traditionally, the discipline is taught through a-priori thought—you start with basic principles and reason forward. Anderson, by contrast, sought to work empirically, using information gathered from the world, identifying problems to be solved not abstractly but through the experienced problems of real people.
  • “Dewey argued that the primary problems for ethics in the modern world concerned the ways society ought to be organized, rather than personal decisions of the individual,”
  • In 2004, the Stanford Encyclopedia of Philosophy asked Anderson to compose its entry on the moral philosophy of John Dewey, who helped carry pragmatist methods into the social realm. Dewey had an idea of democracy as a system of good habits that began in civil life. He was an anti-ideologue with an eye for pluralism.
  • She started working with historians, trying to hone her understanding of ideas by studying them in the context of their creation. Take Rousseau’s apparent support of direct democracy. It’s rarely mentioned that, at the moment when he made that argument, his home town of Geneva had been taken over by oligarchs who claimed to represent the public. Pragmatism said that an idea was an instrument, which naturally gave rise to such questions as: an instrument for what, and where, and when?
  • In “What Is the Point of Equality?,” Anderson had already started to drift away from what philosophers, following Rawls, call ideal theory, based on an end vision for a perfectly just society. As Anderson began a serious study of race in America, though, she found herself losing faith in that approach entirely.
  • Broadly, there’s a culturally right and a culturally left ideal theory for race and society. The rightist version calls for color blindness. Instead of making a fuss about skin and ethnicity, its advocates say, society should treat people as people, and let the best and the hardest working rise.
  • The leftist theory envisions identity communities: for once, give black people (or women, or members of other historically oppressed groups) the resources and opportunities they need, including, if they want it, civil infrastructure for themselves.
  • In “The Imperative of Integration,” published in 2010, Anderson tore apart both of these models. Sure, it might be nice to live in a color-blind society, she wrote, but that’s nothing like the one that exists.
  • But the case for self-segregation was also weak. Affinity groups provided welcome comfort, yet that wasn’t the same as power or equality, Anderson pointed out. And there was a goose-and-gander problem. Either you let only certain groups self-segregate (certifying their subordinate status) or you also permitted, say, white men to do it,
  • Anderson’s solution was “integration,” a concept that, especially in progressive circles, had been uncool since the late sixties. Integration, by her lights, meant mixing on the basis of equality.
  • in attending to these empirical findings over doctrine, she announced herself as a non-ideal theorist: a philosopher with no end vision of society. The approach recalls E. L. Doctorow’s description of driving at night: “You can see only as far as the headlights, but you can make the whole trip that way.”
  • For others, though, a white woman making recommendations on race policy raised questions of perspective. She was engaging through a mostly white Anglo-American tradition. She worked from the premise that, because she drew on folders full of studies, the limits of her own perspective were not constraining.
  • Some philosophers of color welcomed the book. “She’s taking the need for racial justice seriously, and you could hardly find another white political philosopher over a period of decades doing that,”
  • Recently, Anderson changed the way she assigns undergraduate essays: instead of requiring students to argue a position and fend off objections, doubling down on their original beliefs, she asks them to discuss their position with someone who disagrees, and to explain how and why, if at all, the discussion changed their views.
  • The challenge of pluralism is the challenge of modern society: maintaining equality amid difference in a culture given to constant and unpredictable change.
  • Rather than fighting for the ascendancy of certain positions, Anderson suggests, citizens should fight to bolster healthy institutions and systems—those which insure that all views and experiences will be heard. Today’s righteous projects, after all, will inevitably seem fatuous and blinkered from the vantage of another age.
  • Smith saw the markets as an escape from that order. Their “most important” function, he explained, was to bring “liberty and security” to those “who had before lived almost in a continual state of war with their neighbours, and of servile dependency upon their superiors.”
  • Anderson zeroed in on Adam Smith, whose “The Wealth of Nations,” published in 1776, is taken as a keystone of free-market ideology. At the time, English labor was subject to uncompensated apprenticeships, domestic servitude, and some measure of clerical dominion.
  • Smith, in other words, was an egalitarian. He had written “The Wealth of Nations” in no small part to be a solution to what we’d now call structural inequality—the intractable, compounding privileges of an arbitrary hierarchy.
  • It was a historical irony that, a century later, writers such as Marx pointed to the market as a structure of dominion over workers; in truth, Smith and Marx had shared a socioeconomic project. And yet Marx had not been wrong to trash Smith’s ideas, because, during the time between them, the world around Smith’s model had changed, and it was no longer a useful tool.
  • Images of free market society that made sense prior to the Industrial Revolution continue to circulate today as ideals, blind to the gross mismatch between the background social assumptions reigning in the seventeenth and eighteenth centuries, and today’s institutional realities. We are told that our choice is between free markets and state control, when most adults live their working lives under a third thing entirely: private government.
  • Today, people still try to use, variously, both Smith’s and Marx’s tools on a different, postindustrial world:
  • The unnaturalness of this top-heavy arrangement, combined with growing evidence of power abuses, has given many people reason to believe that something is fishy about the structure of American equality. Socialist and anti-capitalist models are again in vogue.
  • Anderson offers a different corrective path. She thinks it’s fine for some people to earn more than others. If you’re a brilliant potter, and people want to pay you more than the next guy for your pottery, great!
  • The problem isn’t that talent and income are distributed in unequal parcels. The problem is that Jeff Bezos earns more than a hundred thousand dollars a minute, while Amazon warehouse employees, many talented and hardworking, have reportedly resorted to urinating in bottles in lieu of a bathroom break. That circumstance reflects some structure of hierarchical oppression. It is a rip in the democratic fabric, and it’s increasingly the norm.
  • Andersonism holds that we don’t have to give up on market society if we can recognize and correct for its limitations—it may even be our best hope, because it’s friendlier to pluralism than most alternatives are.
  • we must be flexible. We must remain alert. We must solve problems collaboratively, in the moment, using society’s ears and eyes and the best tools that we can find.
  • “You can see that, from about 1950 to 1970, the typical American’s wages kept up with productivity growth,” she said. Then, around 1974, she went on, hourly compensation stagnated. American wages have been effectively flat for the past few decades, with the gains of productivity increasingly going to shareholders and to salaries for big bosses.
  • What changed? Anderson rattled off a constellation of factors, from strengthened intellectual-property law to winnowed antitrust law. Financialization, deregulation. Plummeting taxes on capital alongside rising payroll taxes. Privatization, which exchanged modest public-sector salaries for C.E.O. paydays. She gazed into the audience and blinked. “So now we have to ask: What has been used to justify this rather dramatic shift of labor-share of income?”
  • It was no wonder that industrial-age thinking was riddled with contradictions: it reflected what Anderson called “the plutocratic reversal” of classical liberal ideas. Those perversely reversed ideas about freedom were the ones that found a home in U.S. policy, and, well, here we were.
Javier E

The GOP civil war is coming, and Trump will continue to destroy the party - The Washington Post - 1 views

  • the battle lines will roughly divide between GOP leaders, party strategists, and establishment figures who are urging one set of lessons to be drawn from the defeat (that the party needs to make peace with cultural and demographic change), and Trump supporters who are urging that a very different set of lessons be drawn (that the party must embrace Trump’s species of ethno-nationalism and xenophobic, America First populism).
kushnerha

This is why the Paris attacks have gotten more news coverage than other terrorist attacks - The Washington Post - 1 views

  • probably bias in the coverage. People are more likely to be concerned about victims they can identify with. Research tells us that U.S. media outlets are more likely to cover terrorist attacks with U.S. victims. The news media are more likely to cover disasters in wealthier countries. And tragedies that are physically closer to the United States are more likely to appear in U.S. news
  • First, “news” is generally considered to be something especially unusual. The journalism truism is that “dog bites man” is not a story, but “man bites dog” is. That’s not a judgment on whether dog bites matter; it’s a judgment about what’s surprising.
  • news outlets are influenced by their consumers. Human beings are especially interested in events that might affect them personally.
  • One reason the attack drew so much international attention was that France doesn’t experience nearly as much terrorism as countries with comparable recent attacks, such as Lebanon or Kenya.
  • attacks were unusually terrifying precisely because they did not target a particular class of people — such as only Christians, university students, or government officials. They targeted anyone and everyone. A life lost in this manner is not “more tragic” than a life lost in a civil war. However, it might be more newsworthy, because it’s unusual
  • Terrorists, of course, seek out such targets. Attacking tourism hot spots is excellent for drawing attention to their cause.
  • The attack on Paris also shocked observers around the world because many have been there, or plan to visit. France is the most visited country in the world. This creates an “it could happen to me” factor, and also suggests that terrorism could affect someone we know.
  • The Paris attack also stands out for the tactics used by the perpetrators. This attack played out over time in multiple public locations. It also seemed to target everyone, instead of a specific group.
  • drawing international attention because it suggests a new outward turn for the Islamic State
  • The Islamic State leadership apparently directed the attack, according to French officials. That would set it apart from attacks that were only inspired by the group, such as the few killings that have occurred in Western countries in the past year.
  • One reason why we often see lone-actor attacks in high-capability states is that organized terror is difficult to accomplish in these countries
  • realization that the Islamic State is apparently willing and able to carry out complex, coordinated attacks in developed countries outside of its home region has European security services worried. Beyond Europe, what other targets might be next? This further adds to the global interest
  • Paris attack shocked the world for many reasons. It’s true that terrorism in less-developed countries is worth our attention as well. Crises, such as the Syrian civil war, deserve much more media coverage and policy focus. But the Paris attack continues to draw interest because of the relative rarity of terrorism in France, the fact that the country receives visitors from around the globe, the shocking nature of the attack, and the potential implications for the Islamic State’s future plans.
Javier E

Predicting the Future Is Easier Than It Looks - By Michael D. Ward and Nils Metternich | Foreign Policy - 0 views

  • The same statistical revolution that changed baseball has now entered American politics, and no one has been more successful in popularizing a statistical approach to political analysis than New York Times blogger Nate Silver, who of course cut his teeth as a young sabermetrician. And on Nov. 6, after having faced a torrent of criticism from old-school political pundits -- Washington's rough equivalent of statistically illiterate tobacco chewing baseball scouts -- the results of the presidential election vindicated Silver's approach, which correctly predicted the electoral outcome in all 50 states.
  • Today, there are several dozen ongoing, public projects that aim to in one way or another forecast the kinds of things foreign policymakers desperately want to be able to predict: various forms of state failure, famines, mass atrocities, coups d'état, interstate and civil war, and ethnic and religious conflict. So while U.S. elections might occupy the front page of the New York Times, the ability to predict instances of extreme violence and upheaval represent the holy grail of statistical forecasting -- and researchers are now getting close to doing just that.
  • In 2010 scholars from the Political Instability Task Force published a report that demonstrated the ability to correctly predict onsets of instability two years in advance in 18 of 21 instances (about 85%)
  • Let's consider a case in which Ulfelder argues there is insufficient data to render a prediction -- North Korea. There is no official data on North Korean GDP, so what can we do? It turns out that the same data science approaches that were used to aggregate polls have other uses as well. One is the imputation of missing data. Yes, even when it is all missing. The basic idea is to use the general correlations among data that you do have to provide an aggregate way of estimating information that we don't have.
  • As it turned out, in this month's election public opinion polls were considerably more precise than the fundamentals. The fundamentals were not always providing bad predictions, but better is better.
  • In 2012 there were two types of models: one type based on fundamentals such as economic growth and unemployment and another based on public opinion surveys
  • There is a tradition in world politics of going back either to the Congress of Vienna (when there were fewer than two dozen independent countries) or to the early 1950s, after the end of the Second World War. But in reality, there is no need to do this for most studies.
  • Ulfelder tells us that "when it comes to predicting major political crises like wars, coups, and popular uprisings, there are many plausible predictors for which we don't have any data at all, and much of what we do have is too sparse or too noisy to incorporate into carefully designed forecasting models." But this is true only for the old style of models based on annual data for countries. If we are willing to face data that are collected in rhythm with the phenomena we are studying, this is not the case
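The annotation above about North Korean GDP describes imputing missing values from the correlations among indicators that are observed. As a purely illustrative sketch (not drawn from the article, with invented column names and numbers, and using scikit-learn's IterativeImputer as one possible tool), here is how that kind of correlation-based imputation can be done:

```python
# Purely illustrative: impute a missing indicator (e.g., an unreported GDP
# figure) from correlated indicators that are observed. Column names and
# numbers are invented for the example, not real data.
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

data = pd.DataFrame({
    "gdp_per_capita":  [1200.0, 3400.0, np.nan, 800.0],   # missing for one country
    "trade_volume":    [2.1, 5.8, 0.9, 1.4],
    "electricity_use": [350.0, 900.0, 120.0, 240.0],
})

# IterativeImputer models each column with missing entries as a function of
# the other columns, so the observed correlations drive the estimate.
imputer = IterativeImputer(random_state=0)
imputed = pd.DataFrame(imputer.fit_transform(data), columns=data.columns)
print(imputed)
```

Real forecasting models use many more indicators and richer specifications, but the principle is the one the authors describe: the data you do have stand in for the data you don't.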
Javier E

The American Scholar: The Decline of the English Department - William M. Chace - 1 views

  • The number of young men and women majoring in English has dropped dramatically; the same is true of philosophy, foreign languages, art history, and kindred fields, including history. As someone who has taught in four university English departments over the last 40 years, I am dismayed by this shift, as are my colleagues here and there across the land. And because it is probably irreversible, it is important to attempt to sort out the reasons—the many reasons—for what has happened.
  • English: from 7.6 percent of the majors to 3.9 percent
  • In one generation, then, the numbers of those majoring in the humanities dropped from a total of 30 percent to a total of less than 16 percent; during that same generation, business majors climbed from 14 percent to 22 percent.
  • History: from 18.5 percent to 10.7 percent
  • But the deeper explanation resides not in something that has happened to it, but in what it has done to itself. English has become less and less coherent as a discipline and, worse, has come near exhaustion as a scholarly pursuit.
  • The twin focus, then, was on the philological nature of the enterprise and the canon of great works to be studied in their historical evolution.
  • Studying English taught us how to write and think better, and to make articulate many of the inchoate impulses and confusions of our post-adolescent minds. We began to see, as we had not before, how such books could shape and refine our thinking. We began to understand why generations of people coming before us had kept them in libraries and bookstores and in classes such as ours. There was, we got to know, a tradition, a historical culture, that had been assembled around these books. Shakespeare had indeed made a difference—to people before us, now to us, and forever to the language of English-speaking people.
  • today there are stunning changes in the student population: there are more and more gifted and enterprising students coming from immigrant backgrounds, students with only slender connections to Western culture and to the assumption that the “great books” of England and the United States should enjoy a fixed centrality in the world. What was once the heart of the matter now seems provincial. Why throw yourself into a study of something not emblematic of the world but representative of a special national interest? As the campus reflects the cultural, racial, and religious complexities of the world around it, reading British and American literature looks more and more marginal. From a global perspective, the books look smaller.
  • With the cost of a college degree surging upward during the last quarter century—tuition itself increasing far beyond any measure of inflation—and with consequent growth in loan debt after graduation, parents have become anxious about the relative earning power of a humanities degree. Their college-age children doubtless share such anxiety. When college costs were lower, anxiety could be kept at bay. (Berkeley in the early ’60s cost me about $100 a year, about $700 in today’s dollars.)
  • Economists, chemists, biologists, psychologists, computer scientists, and almost everyone in the medical sciences win sponsored research, grants, and federal dollars. By and large, humanists don’t, and so they find themselves as direct employees of the institution, consuming money in salaries, pensions, and operating needs—not external money but institutional money.
  • These, then, are some of the external causes of the decline of English: the rise of public education; the relative youth and instability (despite its apparent mature solidity) of English as a discipline; the impact of money; and the pressures upon departments within the modern university to attract financial resources rather than simply use them up.
  • several of my colleagues around the country have called for a return to the aesthetic wellsprings of literature, the rock-solid fact, often neglected, that it can indeed amuse, delight, and educate. They urge the teaching of English, or French, or Russian literature, and the like, in terms of the intrinsic value of the works themselves, in all their range and multiplicity, as well-crafted and appealing artifacts of human wisdom. Second, we should redefine our own standards for granting tenure, placing more emphasis on the classroom and less on published research, and we should prepare to contest our decisions with administrators whose science-based model is not an appropriate means of evaluation.
  • “It may be that what has happened to the profession is not the consequence of social or philosophical changes, but simply the consequence of a tank now empty.” His homely metaphor pointed to the absence of genuinely new frontiers of knowledge and understanding for English professors to explore.
  • In this country and in England, the study of English literature began in the latter part of the 19th century as an exercise in the scientific pursuit of philological research, and those who taught it subscribed to the notion that literature was best understood as a product of language.
  • no one has come forward in years to assert that the study of English (or comparative literature or similar undertakings in other languages) is coherent, does have self-limiting boundaries, and can be described as this but not that.
  • to teach English today is to do, intellectually, what one pleases. No sense of duty remains toward works of English or American literature; amateur sociology or anthropology or philosophy or comic books or studies of trauma among soldiers or survivors of the Holocaust will do. You need not even believe that works of literature have intelligible meaning; you can announce that they bear no relationship at all to the world beyond the text.
  • With everything on the table, and with foundational principles abandoned, everyone is free, in the classroom or in prose, to exercise intellectual laissez-faire in the largest possible way—I won’t interfere with what you do and am happy to see that you will return the favor
  • Consider the English department at Harvard University. It has now agreed to remove its survey of English literature for undergraduates, replacing it and much else with four new “affinity groups”
  • there would be no one book, or family of books, that every English major at Harvard would have read by the time he or she graduates. The direction to which Harvard would lead its students in this “clean slate” or “trickle down” experiment is to suspend literary history, thrusting into the hands of undergraduates the job of cobbling together intellectual coherence for themselves
  • Those who once strove to give order to the curriculum will have learned, from Harvard, that terms like core knowledge and foundational experience only trigger acrimony, turf protection, and faculty mutinies. No one has the stomach anymore to refight the Western culture wars. Let the students find their own way to knowledge.
  • In English, the average number of years spent earning a doctoral degree is almost 11. After passing that milestone, only half of new Ph.D.’s find teaching jobs, the number of new positions having declined over the last year by more than 20 percent; many of those jobs are part-time or come with no possibility of tenure. News like that, moving through student networks, can be matched against, at least until recently, the reputed earning power of recent graduates of business schools, law schools, and medical schools. The comparison is akin to what young people growing up in Rust Belt cities are forced to see: the work isn’t here anymore; our technology is obsolete.
  • unlike other members of the university community, they might well have been plying their trade without proper credentials: “Whereas economists or physicists, geologists or climatologists, physicians or lawyers must master a body of knowledge before they can even think of being licensed to practice,” she said, “we literary scholars, it is tacitly assumed, have no definable expertise.”
  • English departments need not refight the Western culture wars. But they need to fight their own book wars. They must agree on which texts to teach and argue out the choices and the principles of making them if they are to claim the respect due a department of study.
  • They can teach their students to write well, to use rhetoric. They should place their courses in composition and rhetoric at the forefront of their activities. They should announce that the teaching of composition is a skill their instructors have mastered and that students majoring in English will be certified, upon graduation, as possessing rigorously tested competence in prose expression.
  • The study of literature will then take on the profile now held, with moderate dignity, by the study of the classics, Greek and Latin.
  • But we can, we must, do better. At stake are the books themselves and what they can mean to the young. Yes, it is just a literary tradition. That’s all. But without such traditions, civil societies have no compass to guide them.
Javier E

E.D. Hirsch Jr.'s 'Cultural Literacy' in the 21st Century - The Atlantic - 0 views

  • much of this angst can be interpreted as part of a noisy but inexorable endgame: the end of white supremacy. From this vantage point, Americanness and whiteness are fitfully, achingly, but finally becoming delinked—and like it or not, over the course of this generation, Americans are all going to have to learn a new way to be American.
  • What is the story of “us” when “us” is no longer by default “white”? The answer, of course, will depend on how aware Americans are of what they are, of what their culture already (and always) has been.
  • The thing about the list, though, was that it was—by design—heavy on the deeds and words of the “dead white males” who had formed the foundations of American culture but who had by then begun to fall out of academic fashion.
  • Conservatives thus embraced Hirsch eagerly and breathlessly. He was a stout defender of the patrimony. Liberals eagerly and breathlessly attacked him with equal vigor. He was retrograde, Eurocentric, racist, sexist.
  • Lost in all the crossfire, however, were two facts: First, Hirsch, a lifelong Democrat who considered himself progressive, believed his enterprise to be in service of social justice and equality. Cultural illiteracy, he argued, is most common among the poor and power-illiterate, and compounds both their poverty and powerlessness. Second: He was right.
  • A generation of hindsight now enables Americans to see that it is indeed necessary for a nation as far-flung and entropic as the United States, one where rising economic inequality begets worsening civic inequality, to cultivate continuously a shared cultural core. A vocabulary. A set of shared referents and symbols.
  • So, first of all, Americans do need a list. But second, it should not be Hirsch’s list. And third, it should not be made the way he made his. In the balance of this essay, I want to unpack and explain each of those three statements.
  • If you take the time to read the book attached to Hirsch’s appendix, you’ll find a rather effective argument about the nature of background knowledge and public culture. Literacy is not just a matter of decoding the strings of letters that make up words or the meaning of each word in sequence. It is a matter of decoding context: the surrounding matrix of things referred to in the text and things implied by it
  • That means understanding what’s being said in public, in the media, in colloquial conversation. It means understanding what’s not being said. Literacy in the culture confers power, or at least access to power. Illiteracy, whether willful or unwitting, creates isolation from power.
  • his point about background knowledge and the content of shared public culture extends well beyond schoolbooks. They are applicable to the “texts” of everyday life, in commercial culture, in sports talk, in religious language, in politics. In all cases, people become literate in patterns—“schema” is the academic word Hirsch uses. They come to recognize bundles of concept and connotation like “Party of Lincoln.” They perceive those patterns of meaning the same way a chess master reads an in-game chessboard or the way a great baseball manager reads an at bat. And in all cases, pattern recognition requires literacy in particulars.
  • Lots and lots of particulars. This isn’t, or at least shouldn’t be, an ideologically controversial point. After all, parents on both left and right have come to accept recent research that shows that the more spoken words an infant or toddler hears, the more rapidly she will learn and advance in school. Volume and variety matter. And what is true about the vocabulary of spoken or written English is also true, one fractal scale up, about the vocabulary of American culture.
  • those who demonized Hirsch as a right-winger missed the point. Just because an endeavor requires fluency in the past does not make it worshipful of tradition or hostile to change.
  • radicalism is made more powerful when garbed in traditionalism. As Hirsch put it: “To be conservative in the means of communication is the road to effectiveness in modern life, in whatever direction one wishes to be effective.”
  • Hence, he argued, an education that in the name of progressivism disdains past forms, schema, concepts, figures, and symbols is an education that is in fact anti-progressive and “helps preserve the political and economic status quo.” This is true. And it is made more urgently true by the changes in American demography since Hirsch gave us his list in 1987.
  • If you are an immigrant to the United States—or, if you were born here but are the first in your family to go to college, and thus a socioeconomic new arrival; or, say, a black citizen in Ferguson, Missouri deciding for the first time to participate in a municipal election, and thus a civic neophyte—you have a single overriding objective shared by all immigrants at the moment of arrival: figure out how stuff really gets done here.
  • So, for instance, a statement like “One hundred and fifty years after Appomattox, our house remains deeply divided” assumes that the reader knows that Appomattox is both a place and an event; that the event signified the end of a war; that the war was the Civil War and had begun during the presidency of a man, Abraham Lincoln, who earlier had famously declared that “a house divided against itself cannot stand”; that the divisions then were in large part about slavery; and that the divisions today are over the political, social, and economic legacies of slavery and how or whether we are to respond to those legacies.
  • But why a list, one might ask? Aren’t lists just the very worst form of rote learning and standardized, mechanized education? Well, yes and no.
  • it’s not just newcomers who need greater command of common knowledge. People whose families have been here ten generations are often as ignorant about American traditions, mores, history, and idioms as someone “fresh off the boat.”
  • The more serious challenge, for Americans new and old, is to make a common culture that’s greater than the sum of our increasingly diverse parts. It’s not enough for the United States to be a neutral zone where a million little niches of identity might flourish; in order to make our diversity a true asset, Americans need those niches to be able to share a vocabulary. Americans need to be able to have a broad base of common knowledge so that diversity can be most fully activated.
  • as the pool of potential culture-makers has widened, the modes of culture creation have similarly shifted away from hierarchies and institutions to webs and networks. Wikipedia is the prime embodiment of this reality, both in how the online encyclopedia is crowd-created and how every crowd-created entry contains links to other entries.
  • so any endeavor that makes it easier for those who do not know the memes and themes of American civic life to attain them closes the opportunity gap. It is inherently progressive.
  • since I started writing this essay, dipping into the list has become a game my high-school-age daughter and I play together.
  • I’ll name each of those entries, she’ll describe what she thinks to be its meaning. If she doesn’t know, I’ll explain it and give some back story. If I don’t know, we’ll look it up together. This of course is not a good way for her teachers to teach the main content of American history or English. But it is definitely a good way for us both to supplement what school should be giving her.
  • And however long we end up playing this game, it is already teaching her a meta-lesson about the importance of cultural literacy. Now anytime a reference we’ve discussed comes up in the news or on TV or in dinner conversation, she can claim ownership. Sometimes she does so proudly, sometimes with a knowing look. My bet is that the satisfaction of that ownership, and the value of it, will compound as the years and her education progress.
  • The trouble is, there are also many items on Hirsch’s list that don’t seem particularly necessary for entry into today’s civic and economic mainstream.
  • Which brings us back to why diversity matters. The same diversity that makes it necessary to have and to sustain a unifying cultural core demands that Americans make the core less monochromatic, more inclusive, and continuously relevant for contemporary life
  • it’s worth unpacking the baseline assumption of both Hirsch’s original argument and the battles that erupted around it. The assumption was that multiculturalism sits in polar opposition to a traditional common culture, that the fight between multiculturalism and the common culture was zero-sum.
  • As scholars like Ronald Takaki made clear in books like A Different Mirror, the dichotomy made sense only to the extent that one imagined that nonwhite people had had no part in shaping America until they started speaking up in the second half of the twentieth century.
  • The truth, of course, is that since well before the formation of the United States, the United States has been shaped by nonwhites in its mores, political structures, aesthetics, slang, economic practices, cuisine, dress, song, and sensibility.
  • In its serious forms, multiculturalism never asserted that every racial group should have its own sealed and separate history or that each group’s history was equally salient to the formation of the American experience. It simply claimed that the omni-American story—of diversity and hybridity—was the legitimate American story.
  • as Nathan Glazer has put it (somewhat ruefully), “We are all multiculturalists now.” Americans have come to see—have chosen to see—that multiculturalism is not at odds with a single common culture; it is a single common culture.
  • it is true that in a finite school year, say, with finite class time and books of finite heft, not everything about everyone can be taught. There are necessary trade-offs. But in practice, recognizing the true and longstanding diversity of American identity is not an either-or. Learning about the internment of Japanese Americans does not block out knowledge of D-Day or Midway. It is additive.
  • As more diverse voices attain ever more forms of reach and power we need to re-integrate and reimagine Hirsch’s list of what literate Americans ought to know.
  • To be clear: A 21st-century omni-American approach to cultural literacy is not about crowding out “real” history with the perishable stuff of contemporary life. It’s about drawing lines of descent from the old forms of cultural expression, however formal, to their progeny, however colloquial.
  • Nor is Omni-American cultural literacy about raising the “self-esteem” of the poor, nonwhite, and marginalized. It’s about raising the collective knowledge of all—and recognizing that the wealthy, white, and powerful also have blind spots and swaths of ignorance
  • What, then, would be on your list? It’s not an idle question. It turns out to be the key to rethinking how a list should even get made.
  • the Internet has transformed who makes culture and how. As barriers to culture creation have fallen, orders of magnitude more citizens—amateurs—are able to shape the culture in which we must all be literate. Cat videos and Star Trek fan fiction may not hold up long beside Toni Morrison. But the entry of new creators leads to new claims of right: The right to be recognized. The right to be counted. The right to make the means of recognition and accounting.
  • It is true that lists alone, with no teaching to bring them to life and no expectation that they be connected to a broader education, are somewhere between useless and harmful.
  • This will be a list of nodes and nested networks. It will be a fractal of associations, which reflects far more than a linear list how our brains work and how we learn and create. Hirsch himself nodded to this reality in Cultural Literacy when he described the process he and his colleagues used for collecting items for their list, though he raised it by way of pointing out the danger of infinite regress.
  • His conclusion, appropriate to his times, was that you had to draw boundaries somewhere with the help of experts. My take, appropriate to our times, is that Americans can draw not boundaries so much as circles and linkages, concept sets and pathways among them.
  • Because 5,000 or even 500 items is too daunting a place to start, I ask here only for your top ten. What are ten things every American—newcomer or native born, affluent or indigent—should know? What ten things do you feel are both required knowledge and illuminating gateways to those unenlightened about American life? Here are my entries: Whiteness; The Federalist Papers; The Almighty Dollar; Organized labor; Reconstruction; Nativism; The American Dream; The Reagan Revolution; DARPA; A sucker born every minute.
Javier E

Can We Improve? - The New York Times - 1 views

  • are we capable of substantial moral improvement? Could we someday be much better ethically than we are now? Is it likely that members of our species could become, on average, more generous or more honest, less self-deceptive or less self-interested?
  • I’d like to focus here on a more recent moment: 19th-century America, where the great optimism and idealism of a rapidly rising nation was tempered by a withering realism.
  • Emerson thought that “the Spirit who led us hither” would help perfect us; others have believed the agent of improvement to be evolution, or the inevitable progress of civilization. More recent advocates of our perfectibility might focus on genetic or neurological interventions, or — as in Ray Kurzweil’s “The Singularity Is Near” — information technologies.
  • One reason that a profound moral improvement of humankind is hard to envision is that it seems difficult to pull ourselves up morally by our own bootstraps; our attempts at improvement are going to be made by the unimproved
  • People and societies occasionally improve, managing to enfranchise marginalized groups, for example, or reduce violence, but also often degenerate into war, oppression or xenophobia. It is difficult to improve and easy to convince yourself that you have improved, until the next personality crisis, the next bad decision, the next war, the next outbreak of racism, the next “crisis” in education.
  • It’s difficult to teach your children what you yourself do not know, and it’s difficult to be good enough actually to teach your children to be good.
  • Plans for our improvement have resulted in progress here and there, but they’ve also led to many disasters of oppression, many wars and genocides.
  • One thing that Twain is saying is that many forms of evil — envy, for example, or elaborate dishonesty — appear on earth only with human beings and are found wherever we are. Creatures like us can’t see clearly what we’d be making progress toward.
  • His story “The Imp of the Perverse” shows another sort of reason that humans find it difficult to improve. The narrator asserts that a basic human impulse is to act wrongly on purpose, or even to do things because we know they’re wrong: “We act, for the reason that we should not,” the narrator declares. This is one reason that human action tends to undermine itself; our desires are contradictory.
  • Perhaps, then, if we cannot improve systematically, we can improve inadvertently — or even by sheer perversity
  • As to evolution, it, too, is as likely to end in our extinction as our flourishing; it has of course extinguished most of the species to which it has given rise, and it does not clearly entail that every or any species gets better in any dimension over time
  • Our technologies may, as Kurzweil believes, allow us to transcend our finitude. On the other hand, they may end in our or even the planet’s total destruction.
  • “I have no faith in human perfectibility. I think that human exertion will have no appreciable effect on humanity. Man is … not more happy — nor more wise, than he was 6,000 years ago.”
catbclark

How We Learned to Kill - NYTimes.com - 0 views

  • “There are two people digging by the side of the road. Can we shoot them?”
  • In war, of course, there are many ways to kill. I did so by giving orders. I never fired my weapon in combat, but I ordered countless others to
  • My initial reaction was to ask the question to someone higher up the chain of command.
  • I wanted confirmation from a higher authority to do the abhorrent, something I’d spent my entire life believing was evil.
  • I realized it was my role as an officer to provide that validation to the Marine on the other end who would pull the trigger.
  • I also received affirmation to a more sinister question: Yes, I could kill.
  • The primary factors that affect an individual’s ability to kill are the demands of authority, group absolution, the predisposition of the killer, the distance from the victim and the target attractiveness of the victim.
  • Were the men in their sights irrigating their farmland or planting a roadside bomb?
  • Before killing the first time there’s a reluctance that tempers the desire to know whether you are capable of doing it
  • Despite the rhetoric I internalized from the newspapers back home about why we were in Afghanistan, I ended up fighting for different reasons once I got on the ground — a mix of loyalty to my Marines, habit and the urge to survive.
  • The more I thought about the enemy, the harder it was to view them as evil or subhuman. But killing requires a motivation
  • If someone is shooting at me, I have a right to fire back
  • Until that moment, our deployment in Afghanistan had been exhilarating because we felt invulnerable. This invulnerability in an environment of death was the most powerful sensation I’d ever experienced.
  • The fog of war doesn’t just limit what you can know; it creates doubt about everything you’re certain that you know.
  • The madness of war is that while this system is in place to kill people, it may actually be necessary for the greater good. We live in a dangerous world where killing and torture exist and where the persecution of the weak by the powerful is closer to the norm than the civil society where we get our Starbucks.
Javier E

Can truth survive this president? An honest investigation. - The Washington Post - 0 views

  • in the summer of 2002, long before “fake news” or “post-truth” infected the vernacular, one of President George W. Bush’s top advisers mocked a journalist for being part of the “reality-based community.” Seeking answers in reality was for suckers, the unnamed adviser explained. “We’re an empire now, and when we act, we create our own reality.”
  • This was the hubris and idealism of a post-Cold War, pre-Iraq War superpower: If you exert enough pressure, events will bend to your will.
  • the deceit emanating from the White House today is lazier, more cynical. It is not born of grand strategy or ideology; it is impulsive and self-serving. It is not arrogant, but shameless.
  • Bush wanted to remake the world. President Trump, by contrast, just wants to make it up as he goes along
  • Through all their debates over who is to blame for imperiling truth (whether Trump, postmodernism, social media or Fox News), as well as the consequences (invariably dire) and the solutions (usually vague), a few conclusions materialize, should you choose to believe them.
  • There is a pattern and logic behind the dishonesty of Trump and his surrogates; however, it’s less multidimensional chess than the simple subordination of reality to political and personal ambition
  • Trump’s untruth sells best precisely when feelings and instincts overpower facts, when America becomes a safe space for fabrication.
  • Rand Corp. scholars Jennifer Kavanagh and Michael D. Rich point to the Gilded Age, the Roaring Twenties and the rise of television in the mid-20th century as recent periods of what they call “Truth Decay” — marked by growing disagreement over facts and interpretation of data; a blurring of lines between opinion, fact and personal experience; and diminishing trust in once-respected sources of information.
  • In eras of truth decay, “competing narratives emerge, tribalism within the U.S. electorate increases, and political paralysis and dysfunction grow,”
  • Once you add the silos of social media as well as deeply polarized politics and deteriorating civic education, it becomes “nearly impossible to have the types of meaningful policy debates that form the foundation of democracy.”
  • To interpret our era’s debasement of language, Kakutani reflects perceptively on the World War II-era works of Victor Klemperer, who showed how the Nazis used “words as ‘tiny doses of arsenic’ to poison and subvert the German culture,” and of Stefan Zweig, whose memoir “The World of Yesterday” highlights how ordinary Germans failed to grasp the sudden erosion of their freedoms.
  • Kakutani calls out lefty academics who for decades preached postmodernism and social constructivism, which argued that truth is not universal but a reflection of relative power, structural forces and personal vantage points.
  • postmodernists rejected Enlightenment ideals as “vestiges of old patriarchal and imperialist thinking,” Kakutani writes, paving the way for today’s violence against fact in politics and science.
  • “dumbed-down corollaries” of postmodernist thought have been hijacked by Trump’s defenders, who use them to explain away his lies, inconsistencies and broken promises.
  • intelligent-design proponents and later climate deniers drew from postmodernism to undermine public perceptions of evolution and climate change. “Even if right-wing politicians and other science deniers were not reading Derrida and Foucault, the germ of the idea made its way to them: science does not have a monopoly on the truth,
  • McIntyre quotes at length from mea culpas by postmodernist and social constructivist writers agonizing over what their theories have wrought, shocked that conservatives would use them for nefarious purposes
  • pro-Trump troll and conspiracy theorist Mike Cernovich , who helped popularize the “Pizzagate” lie, has forthrightly cited his unlikely influences. “Look, I read postmodernist theory in college,” Cernovich told the New Yorker in 2016. “If everything is a narrative, then we need alternatives to the dominant narrative. I don’t seem like a guy who reads [Jacques] Lacan, do I?
  • When truth becomes malleable and contestable regardless of evidence, a mere tussle of manufactured narratives, it becomes less about conveying facts than about picking sides, particularly in politics.
  • In “On Truth,” Cambridge University philosopher Simon Blackburn writes that truth is attainable, if at all, “only at the vanishing end points of enquiry,” adding that, “instead of ‘facts first’ we may do better if we think of ‘enquiry first,’ with the notion of fact modestly waiting to be invited to the feast afterward.
  • He is concerned, but not overwhelmingly so, about the survival of truth under Trump. “Outside the fevered world of politics, truth has a secure enough foothold,” Blackburn writes. “Perjury is still a serious crime, and we still hope that our pilots and surgeons know their way about.
  • Kavanagh and Rich offer similar consolation: “Facts and data have become more important in most other fields, with political and civil discourse being striking exceptions. Thus, it is hard to argue that the world is truly ‘post-fact.’ ”
  • McIntyre argues persuasively that our methods of ascertaining truth — not just the facts themselves — are under attack, too, and that this assault is especially dangerous.
  • Ideologues don’t just disregard facts they disagree with, he explains, but willingly embrace any information, however dubious, that fits their agenda. “This is not the abandonment of facts, but a corruption of the process by which facts are credibly gathered and reliably used to shape one’s beliefs about reality. Indeed, the rejection of this undermines the idea that some things are true irrespective of how we feel about them.”
  • “It is hardly a depressing new phenomenon that people’s beliefs are capable of being moved by their hopes, grievances and fears,” Blackburn writes. “In order to move people, objective facts must become personal beliefs.” But it can’t work — or shouldn’t work — in reverse.
  • More than fearing a post-truth world, Blackburn is concerned by a “post-shame environment,” in which politicians easily brush off their open disregard for truth.
  • it is human nature to rationalize away the dissonance. “Why get upset by his lies, when all politicians lie?” Kakutani asks, distilling the mind-set. “Why get upset by his venality, when the law of the jungle rules?”
  • So any opposition is deemed a witch hunt, or fake news, rigged or just so unfair. Trump is not killing the truth. But he is vandalizing it, constantly and indiscriminately, diminishing its prestige and appeal, coaxing us to look away from it.
  • the collateral damage includes the American experiment.
  • “One of the most important ways to fight back against post-truth is to fight it within ourselves,” he writes, whatever our particular politics may be. “It is easy to identify a truth that someone else does not want to see. But how many of us are prepared to do this with our own beliefs? To doubt something that we want to believe, even though a little piece of us whispers that we do not have all the facts?”
Javier E

Guns, Germs, and The Future of Us - Wyatt Edward Gates - Medium - 0 views

  • Jared Diamond’s seminal work Guns, Germs, and Steel has many flaws, but it provides some useful anecdotes about how narrative and consciousness shape the way human organization progresses
  • Past critical transformations of thought can help us see how we need to transform ourselves now in order to survive the future.
  • something both ancient and immediate: the way we define who is in our tribe plays a critical role in what kind of social organization we can build and maintain
  • ...25 more annotations...
  • You can’t have a blood family of 300 million, nor even a large enough one to do things like build an agrarian society
  • In order to have large cities built on agrarianism it was necessary not only to innovate technology, but to transform our very consciousness as it related to how we defined what a person was, both ourselves and others
  • Instead of needing to have real, flowing blood with common DNA from birth, it was merely necessary to be among the same abstract family organized under a king of some kind — a kind of stand-in for the father or patriarch. We developed law and law enforcement as abstract disembodied voices of the father. This allowed total strangers without any family ties to interact in the same society in a constructive and organized way. Thus: civilization as we know it
  • Those ancient polities have developed finally into the Nation, a kind of tribe so fully abstracted that you can be of any blood and language and religion and still function within it.
  • So, too, are all other forms of human separation — and the opposition and conflicts they spawn — illusory in nature. We moved beyond blood, but then it was language or religion or fealty that made it impossible to work together, and we warred over that
  • we’re told these borders mean everything, that they are real and urgent and demand constant sacrifice to maintain.
  • why is that border there? Why borders?
  • We’re stuck in a mode of thinking that’s no longer sensible. There isn’t a reason for borders. There never really was, but now more than ever we have no utility for them, no need for them
  • What humanity has to do is wake up to the reality of post-tribalism. This means seeing through all these invented borders to the truth that we are all people, we are all fundamentally the same, and we can all learn to live with one another.
  • It was the idea of necessary conflict based on blood that preceded the fights that appeared to justify the belief in that blood-based conflict.
  • Nations have saturated the entire globe. There are no more frontiers. It’s all Nations butting up against one another.
  • We are all people of a similar nature and we do have the option to relate to one another as people for the sake of saving our shared homes and futures. We all hunger and thirst and become lonely, we all laugh and weep in the same language. Stripped of confounding symbols we are undivided.
  • There are a lot of people upset about the illusion of borders. They want a different reality, one in which there are Good Tribes (their tribe) and Bad Tribes (all the other ones).
  • but the world is already so mixed together they can’t draw those borders anymore. Hence: fascism.
  • There are no firm foundations for defining this tribe, however, so he’s left to cobble together some kind of ad hoc notion of in- and out-group. Like a magpie he collects ways of dividing people as appeals to his caprice: race, sex, Nation, etc., but there’s no greater sense to it, so it’s all arbitrary, all a mess.
  • No amount of magical thinking from conservatives can change the reality of globalism, however; what one Nation does to pollute will affect us all, and that is according to the laws of physics. No political movement can change those physics. We have to adapt or perish.
  • a key part of it is a simple lack of imagination. He just doesn’t realize there’s an option to not have borders, because his entire consciousness is married to the idea of of-me and not-of-me, Us and Them, and if there is no Them there can’t be an Us, and therefore life stops making sense
  • What has to be true if there are no tribes? We have no need to discriminate among who we may love. Loving and caring for all people as if they were blood family is the path forward
  • There needs to be a new story for us to share. It’s not enough to stop believing in the old way of borders, we have to actively seek out a new way of thinking and speaking and living that reflects the world as it is and as it can be.
  • there are others who have more tangible investments in borders: Those who have grown fat off the conflicts driven by these invented borders don’t want us to see how pointless it all is. These billionaires and presidents and kings want us to keep fighting against one another over the borders they so lazily define because it gives them a means of power and control.
  • We have to be ready for their opposition, however. They’ll do what they can to force us to act as if their borders are real. We don’t need to listen, though we do need to be ready to sacrifice.
  • Without a globally-coordinated response we can’t resolve a globally-driven problem such as climate change. If we can grant the humanity of all people we can start to imagine ways of relating to one another that aren’t opposed and antagonistic, but which are cooperative and aimed at harmony.
  • This transformation of consciousness must happen in our own hearts and minds before it can happen in concert.
  • the Nation has already been shown to be unnecessary because of social globalism. Pick a major city on earth and you’ll find every kind of person living together in peace! Not perfect peace, but not constant and unavoidable war, and that is what counts.
  • We can’t keep pretending as if borders matter when we can so clearly see that they don’t, but we can’t just have no story at all; there must be a way of contextualizing a future without borders. I don’t know what that story is, exactly, but I believe it is something like love writ large. Once we’re ready to start telling it we can start living it.
Javier E

They're Watching You at Work - Don Peck - The Atlantic - 2 views

  • Predictive statistical analysis, harnessed to big data, appears poised to alter the way millions of people are hired and assessed.
  • By one estimate, more than 98 percent of the world’s information is now stored digitally, and the volume of that data has quadrupled since 2007.
  • The application of predictive analytics to people’s careers—an emerging field sometimes called “people analytics”—is enormously challenging, not to mention ethically fraught
  • ...52 more annotations...
  • By the end of World War II, however, American corporations were facing severe talent shortages. Their senior executives were growing old, and a dearth of hiring from the Depression through the war had resulted in a shortfall of able, well-trained managers. Finding people who had the potential to rise quickly through the ranks became an overriding preoccupation of American businesses. They began to devise a formal hiring-and-management system based in part on new studies of human behavior, and in part on military techniques developed during both world wars, when huge mobilization efforts and mass casualties created the need to get the right people into the right roles as efficiently as possible. By the 1950s, it was not unusual for companies to spend days with young applicants for professional jobs, conducting a battery of tests, all with an eye toward corner-office potential.
  • But companies abandoned their hard-edged practices for another important reason: many of their methods of evaluation turned out not to be very scientific.
  • this regime, so widespread in corporate America at mid-century, had almost disappeared by 1990. “I think an HR person from the late 1970s would be stunned to see how casually companies hire now,”
  • Many factors explain the change, he said, and then he ticked off a number of them: Increased job-switching has made it less important and less economical for companies to test so thoroughly. A heightened focus on short-term financial results has led to deep cuts in corporate functions that bear fruit only in the long term. The Civil Rights Act of 1964, which exposed companies to legal liability for discriminatory hiring practices, has made HR departments wary of any broadly applied and clearly scored test that might later be shown to be systematically biased.
  • about a quarter of the country’s corporations were using similar tests to evaluate managers and junior executives, usually to assess whether they were ready for bigger roles.
  • He has encouraged the company’s HR executives to think about applying the games to the recruitment and evaluation of all professional workers.
  • Knack makes app-based video games, among them Dungeon Scrawl, a quest game requiring the player to navigate a maze and solve puzzles, and Wasabi Waiter, which involves delivering the right sushi to the right customer at an increasingly crowded happy hour. These games aren’t just for play: they’ve been designed by a team of neuroscientists, psychologists, and data scientists to suss out human potential. Play one of them for just 20 minutes, says Guy Halfteck, Knack’s founder, and you’ll generate several megabytes of data, exponentially more than what’s collected by the SAT or a personality test. How long you hesitate before taking every action, the sequence of actions you take, how you solve problems—all of these factors and many more are logged as you play, and then are used to analyze your creativity, your persistence, your capacity to learn quickly from mistakes, your ability to prioritize, and even your social intelligence and personality. The end result, Halfteck says, is a high-resolution portrait of your psyche and intellect, and an assessment of your potential as a leader or an innovator.
  • When the results came back, Haringa recalled, his heart began to beat a little faster. Without ever seeing the ideas, without meeting or interviewing the people who’d proposed them, without knowing their title or background or academic pedigree, Knack’s algorithm had identified the people whose ideas had panned out. The top 10 percent of the idea generators as predicted by Knack were in fact those who’d gone furthest in the process.
  • What Knack is doing, Haringa told me, “is almost like a paradigm shift.” It offers a way for his GameChanger unit to avoid wasting time on the 80 people out of 100—nearly all of whom look smart, well-trained, and plausible on paper—whose ideas just aren’t likely to work out.
  • Aptitude, skills, personal history, psychological stability, discretion, loyalty—companies at the time felt they had a need (and the right) to look into them all. That ambit is expanding once again, and this is undeniably unsettling. Should the ideas of scientists be dismissed because of the way they play a game? Should job candidates be ranked by what their Web habits say about them? Should the “data signature” of natural leaders play a role in promotion? These are all live questions today, and they prompt heavy concerns: that we will cede one of the most subtle and human of skills, the evaluation of the gifts and promise of other people, to machines; that the models will get it wrong; that some people will never get a shot in the new workforce.
  • scoring distance from work could violate equal-employment-opportunity standards. Marital status? Motherhood? Church membership? “Stuff like that,” Meyerle said, “we just don’t touch”—at least not in the U.S., where the legal environment is strict. Meyerle told me that Evolv has looked into these sorts of factors in its work for clients abroad, and that some of them produce “startling results.”
  • consider the alternative. A mountain of scholarly literature has shown that the intuitive way we now judge professional potential is rife with snap judgments and hidden biases, rooted in our upbringing or in deep neurological connections that doubtless served us well on the savanna but would seem to have less bearing on the world of work.
  • We may like to think that society has become more enlightened since those days, and in many ways it has, but our biases are mostly unconscious, and they can run surprisingly deep. Consider race. For a 2004 study called “Are Emily and Greg More Employable Than Lakisha and Jamal?,” the economists Sendhil Mullainathan and Marianne Bertrand put white-sounding names (Emily Walsh, Greg Baker) or black-sounding names (Lakisha Washington, Jamal Jones) on similar fictitious résumés, which they then sent out to a variety of companies in Boston and Chicago. To get the same number of callbacks, they learned, they needed to either send out half again as many résumés with black names as those with white names, or add eight extra years of relevant work experience to the résumés with black names.
  • a sociologist at Northwestern, spent parts of the three years from 2006 to 2008 interviewing professionals from elite investment banks, consultancies, and law firms about how they recruited, interviewed, and evaluated candidates, and concluded that among the most important factors driving their hiring recommendations were—wait for it—shared leisure interests.
  • Lacking “reliable predictors of future performance,” Rivera writes, “assessors purposefully used their own experiences as models of merit.” Former college athletes “typically prized participation in varsity sports above all other types of involvement.” People who’d majored in engineering gave engineers a leg up, believing they were better prepared.
  • the prevailing system of hiring and management in this country involves a level of dysfunction that should be inconceivable in an economy as sophisticated as ours. Recent survey data collected by the Corporate Executive Board, for example, indicate that nearly a quarter of all new hires leave their company within a year of their start date, and that hiring managers wish they’d never extended an offer to one out of every five members on their team
  • In the late 1990s, as these assessments shifted from paper to digital formats and proliferated, data scientists started doing massive tests of what makes for a successful customer-support technician or salesperson. This has unquestionably improved the quality of the workers at many firms.
  • In 2010, however, Xerox switched to an online evaluation that incorporates personality testing, cognitive-skill assessment, and multiple-choice questions about how the applicant would handle specific scenarios that he or she might encounter on the job. An algorithm behind the evaluation analyzes the responses, along with factual information gleaned from the candidate’s application, and spits out a color-coded rating: red (poor candidate), yellow (middling), or green (hire away). Those candidates who score best, I learned, tend to exhibit a creative but not overly inquisitive personality, and participate in at least one but not more than four social networks, among many other factors. (Previous experience, one of the few criteria that Xerox had explicitly screened for in the past, turns out to have no bearing on either productivity or retention.) A schematic sketch of this kind of scoring model appears after this list.
  • When Xerox started using the score in its hiring decisions, the quality of its hires immediately improved. The rate of attrition fell by 20 percent in the initial pilot period, and over time, the number of promotions rose. Xerox still interviews all candidates in person before deciding to hire them, Morse told me, but, she added, “We’re getting to the point where some of our hiring managers don’t even want to interview anymore”
  • Gone are the days, Ostberg told me, when, say, a small survey of college students would be used to predict the statistical validity of an evaluation tool. “We’ve got a data set of 347,000 actual employees who have gone through these different types of assessments or tools,” he told me, “and now we have performance-outcome data, and we can split those and slice and dice by industry and location.”
  • Evolv’s tests allow companies to capture data about everybody who applies for work, and everybody who gets hired—a complete data set from which sample bias, long a major vexation for industrial-organization psychologists, simply disappears. The sheer number of observations that this approach makes possible allows Evolv to say with precision which attributes matter more to the success of retail-sales workers (decisiveness, spatial orientation, persuasiveness) or customer-service personnel at call centers (rapport-building)
  • There are some data that Evolv simply won’t use, out of a concern that the information might lead to systematic bias against whole classes of people
  • the idea that hiring was a science fell out of favor. But now it’s coming back, thanks to new technologies and methods of analysis that are cheaper, faster, and much-wider-ranging than what we had before
  • what most excites him are the possibilities that arise from monitoring the entire life cycle of a worker at any given company.
  • Now the two companies are working together to marry pre-hire assessments to an increasing array of post-hire data: about not only performance and duration of service but also who trained the employees; who has managed them; whether they were promoted to a supervisory role, and how quickly; how they performed in that role; and why they eventually left.
  • What begins with an online screening test for entry-level workers ends with the transformation of nearly every aspect of hiring, performance assessment, and management.
  • I turned to Sandy Pentland, the director of the Human Dynamics Laboratory at MIT. In recent years, Pentland has pioneered the use of specialized electronic “badges” that transmit data about employees’ interactions as they go about their days. The badges capture all sorts of information about formal and informal conversations: their length; the tone of voice and gestures of the people involved; how much those people talk, listen, and interrupt; the degree to which they demonstrate empathy and extroversion; and more. Each badge generates about 100 data points a minute.
  • he tried the badges out on about 2,500 people, in 21 different organizations, and learned a number of interesting lessons. About a third of team performance, he discovered, can usually be predicted merely by the number of face-to-face exchanges among team members. (Too many is as much of a problem as too few.) Using data gathered by the badges, he was able to predict which teams would win a business-plan contest, and which workers would (rightly) say they’d had a “productive” or “creative” day. Not only that, but he claimed that his researchers had discovered the “data signature” of natural leaders, whom he called “charismatic connectors” and all of whom, he reported, circulate actively, give their time democratically to others, engage in brief but energetic conversations, and listen at least as much as they talk.
  • His group is developing apps to allow team members to view their own metrics more or less in real time, so that they can see, relative to the benchmarks of highly successful employees, whether they’re getting out of their offices enough, or listening enough, or spending enough time with people outside their own team.
  • Torrents of data are routinely collected by American companies and now sit on corporate servers, or in the cloud, awaiting analysis. Bloomberg reportedly logs every keystroke of every employee, along with their comings and goings in the office. The Las Vegas casino Harrah’s tracks the smiles of the card dealers and waitstaff on the floor (its analytics team has quantified the impact of smiling on customer satisfaction). E‑mail, of course, presents an especially rich vein to be mined for insights about our productivity, our treatment of co-workers, our willingness to collaborate or lend a hand, our patterns of written language, and what those patterns reveal about our intelligence, social skills, and behavior.
  • people analytics will ultimately have a vastly larger impact on the economy than the algorithms that now trade on Wall Street or figure out which ads to show us. He reminded me that we’ve witnessed this kind of transformation before in the history of management science. Near the turn of the 20th century, both Frederick Taylor and Henry Ford famously paced the factory floor with stopwatches, to improve worker efficiency.
  • “The quantities of data that those earlier generations were working with,” he said, “were infinitesimal compared to what’s available now. There’s been a real sea change in the past five years, where the quantities have just grown so large—petabytes, exabytes, zetta—that you start to be able to do things you never could before.”
  • People analytics will unquestionably provide many workers with more options and more power. Gild, for example, helps companies find undervalued software programmers, working indirectly to raise those people’s pay. Other companies are doing similar work. One called Entelo, for instance, specializes in using algorithms to identify potentially unhappy programmers who might be receptive to a phone call
  • He sees it not only as a boon to a business’s productivity and overall health but also as an important new tool that individual employees can use for self-improvement: a sort of radically expanded The 7 Habits of Highly Effective People, custom-written for each of us, or at least each type of job, in the workforce.
  • the most exotic development in people analytics today is the creation of algorithms to assess the potential of all workers, across all companies, all the time.
  • The way Gild arrives at these scores is not simple. The company’s algorithms begin by scouring the Web for any and all open-source code, and for the coders who wrote it. They evaluate the code for its simplicity, elegance, documentation, and several other factors, including the frequency with which it’s been adopted by other programmers. For code that was written for paid projects, they look at completion times and other measures of productivity. Then they look at questions and answers on social forums such as Stack Overflow, a popular destination for programmers seeking advice on challenging projects. They consider how popular a given coder’s advice is, and how widely that advice ranges.
  • The algorithms go further still. They assess the way coders use language on social networks from LinkedIn to Twitter; the company has determined that certain phrases and words used in association with one another can distinguish expert programmers from less skilled ones. Gild knows these phrases and words are associated with good coding because it can correlate them with its evaluation of open-source code, and with the language and online behavior of programmers in good positions at prestigious companies.
  • having made those correlations, Gild can then score programmers who haven’t written open-source code at all, by analyzing the host of clues embedded in their online histories. They’re not all obvious, or easy to explain. Vivienne Ming, Gild’s chief scientist, told me that one solid predictor of strong coding is an affinity for a particular Japanese manga site.
  • Gild’s CEO, Sheeroy Desai, told me he believes his company’s approach can be applied to any occupation characterized by large, active online communities, where people post and cite individual work, ask and answer professional questions, and get feedback on projects. Graphic design is one field that the company is now looking at, and many scientific, technical, and engineering roles might also fit the bill. Regardless of their occupation, most people leave “data exhaust” in their wake, a kind of digital aura that can reveal a lot about a potential hire.
  • professionally relevant personality traits can be judged effectively merely by scanning Facebook feeds and photos. LinkedIn, of course, captures an enormous amount of professional data and network information, across just about every profession. A controversial start-up called Klout has made its mission the measurement and public scoring of people’s online social influence.
  • Mullainathan expressed amazement at how little most creative and professional workers (himself included) know about what makes them effective or ineffective in the office. Most of us can’t even say with any certainty how long we’ve spent gathering information for a given project, or our pattern of information-gathering, never mind know which parts of the pattern should be reinforced, and which jettisoned. As Mullainathan put it, we don’t know our own “production function.”
  • Over time, better job-matching technologies are likely to begin serving people directly, helping them see more clearly which jobs might suit them and which companies could use their skills. In the future, Gild plans to let programmers see their own profiles and take skills challenges to try to improve their scores. It intends to show them its estimates of their market value, too, and to recommend coursework that might allow them to raise their scores even more. Not least, it plans to make accessible the scores of typical hires at specific companies, so that software engineers can better see the profile they’d need to land a particular job
  • Knack, for its part, is making some of its video games available to anyone with a smartphone, so people can get a better sense of their strengths, and of the fields in which their strengths would be most valued. (Palo Alto High School recently adopted the games to help students assess careers.) Ultimately, the company hopes to act as matchmaker between a large network of people who play its games (or have ever played its games) and a widening roster of corporate clients, each with its own specific profile for any given type of job.
  • When I began my reporting for this story, I was worried that people analytics, if it worked at all, would only widen the divergent arcs of our professional lives, further gilding the path of the meritocratic elite from cradle to grave, and shutting out some workers more definitively. But I now believe the opposite is likely to happen, and that we’re headed toward a labor market that’s fairer to people at every stage of their careers
  • For decades, as we’ve assessed people’s potential in the professional workforce, the most important piece of data—the one that launches careers or keeps them grounded—has been educational background: typically, whether and where people went to college, and how they did there. Over the past couple of generations, colleges and universities have become the gatekeepers to a prosperous life. A degree has become a signal of intelligence and conscientiousness, one that grows stronger the more selective the school and the higher a student’s GPA, that is easily understood by employers, and that, until the advent of people analytics, was probably unrivaled in its predictive powers.
  • the limitations of that signal—the way it degrades with age, its overall imprecision, its many inherent biases, its extraordinary cost—are obvious. “Academic environments are artificial environments,” Laszlo Bock, Google’s senior vice president of people operations, told The New York Times in June. “People who succeed there are sort of finely trained, they’re conditioned to succeed in that environment,” which is often quite different from the workplace.
  • because one’s college history is such a crucial signal in our labor market, perfectly able people who simply couldn’t sit still in a classroom at the age of 16, or who didn’t have their act together at 18, or who chose not to go to graduate school at 22, routinely get left behind for good. That such early factors so profoundly affect career arcs and hiring decisions made two or three decades later is, on its face, absurd.
  • I spoke with managers at a lot of companies who are using advanced analytics to reevaluate and reshape their hiring, and nearly all of them told me that their research is leading them toward pools of candidates who didn’t attend college—for tech jobs, for high-end sales positions, for some managerial roles. In some limited cases, this is because their analytics revealed no benefit whatsoever to hiring people with college degrees; in other cases, and more often, it’s because they revealed signals that function far better than college history,
  • Google, too, is hiring a growing number of nongraduates. Many of the people I talked with reported that when it comes to high-paying and fast-track jobs, they’re reducing their preference for Ivy Leaguers and graduates of other highly selective schools.
  • This process is just beginning. Online courses are proliferating, and so are online markets that involve crowd-sourcing. Both arenas offer new opportunities for workers to build skills and showcase competence. Neither produces the kind of instantly recognizable signals of potential that a degree from a selective college, or a first job at a prestigious firm, might. That’s a problem for traditional hiring managers, because sifting through lots of small signals is so difficult and time-consuming.
  • all of these new developments raise philosophical questions. As professional performance becomes easier to measure and see, will we become slaves to our own status and potential, ever-focused on the metrics that tell us how and whether we are measuring up? Will too much knowledge about our limitations hinder achievement and stifle our dreams? All I can offer in response to these questions, ironically, is my own gut sense, which leads me to feel cautiously optimistic.
  • Google’s understanding of the promise of analytics is probably better than anybody else’s, and the company has been changing its hiring and management practices as a result of its ongoing analyses. (Brainteasers are no longer used in interviews, because they do not correlate with job success; GPA is not considered for anyone more than two years out of school, for the same reason—the list goes on.) But for all of Google’s technological enthusiasm, these same practices are still deeply human. A real, live person looks at every résumé the company receives. Hiring decisions are made by committee and are based in no small part on opinions formed during structured interviews.
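
  The Xerox and Evolv screening described in the excerpts above follows a familiar supervised-learning pattern: fit a model on past hires’ assessment features and retention outcomes, then bucket new applicants by predicted score. The sketch below is a minimal, hypothetical illustration of that pattern; it is not any vendor’s actual system, and the feature names, thresholds, and synthetic training data are all invented for the example.

    # A minimal, hypothetical sketch of a color-coded applicant score,
    # in the spirit of the red/yellow/green ratings described above.
    # None of this reflects any vendor's real model or data.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 1000

    # Invented assessment features for past hires:
    # cognitive-skill score, personality-fit score, number of social networks.
    X = np.column_stack([
        rng.normal(0.0, 1.0, n),
        rng.normal(0.0, 1.0, n),
        rng.integers(0, 7, n).astype(float),
    ])

    # Invented outcome: 1 = stayed at least a year, 0 = left early.
    # (Real systems would learn these relationships from HR records.)
    logit = 0.8 * X[:, 0] + 0.5 * X[:, 1] - 0.3 * np.abs(X[:, 2] - 2.0)
    y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

    model = LogisticRegression().fit(X, y)

    def rate_applicant(cognitive, personality, networks):
        """Map a predicted retention probability to a color-coded rating."""
        p = model.predict_proba([[cognitive, personality, networks]])[0, 1]
        if p >= 0.65:
            return "green", round(p, 2)
        if p >= 0.45:
            return "yellow", round(p, 2)
        return "red", round(p, 2)

    print(rate_applicant(1.2, 0.4, 2))    # likely green
    print(rate_applicant(-1.5, -0.6, 6))  # likely red

  Real deployments differ mainly in scale (hundreds of thousands of labeled hires, far richer feature sets) and in how carefully protected attributes are excluded, but the basic scoring loop is this simple.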
Javier E

A Nasty New World « The Dish - 0 views

  • Bernard Bailyn’s The Barbarous Years, which details the “little-remembered” brutality of life in the American colonies during the 17th century:
  • Bailyn has not painted a pretty picture. Little wonder he calls it The Barbarous Years and spares us no details of the terror, desperation, degradation and widespread torture—do you really know what being “flayed alive” means?
  • yet somehow amid the merciless massacres were elements that gave birth to the rudiments of civilization—or in Bailyn’s evocative phrase, the fragile “integument of civility”—that would evolve 100 years later into a virtual Renaissance culture,
  • ...1 more annotation...
  • It’s a grand drama in which the glimmers of enlightenment barely survive the savagery, what Yeats called “the blood-dimmed tide,” the brutal establishment of slavery, the race wars with the original inhabitants that Bailyn is not afraid to call “genocidal,” the full, horrifying details of which have virtually been erased.
Duncan H

Severe Conservative Syndrome - NYTimes.com - 0 views

  • Mr. Romney “described conservatism as if it were a disease.” Indeed. Mark Liberman, a linguistics professor at the University of Pennsylvania, provided a list of words that most commonly follow the adverb “severely”; the top five, in frequency of use, are disabled, depressed, ill, limited and injured.
  • That’s clearly not what Mr. Romney meant to convey. Yet if you look at the race for the G.O.P. presidential nomination, you have to wonder whether it was a Freudian slip.
  • Rick Santorum, who, according to Public Policy Polling, is the clear current favorite among usual Republican primary voters, running 15 points ahead of Mr. Romney. Anyone with an Internet connection is aware that Mr. Santorum is best known for 2003 remarks about homosexuality, incest and bestiality. But his strangeness runs deeper than that.
  • ...4 more annotations...
  • last year Mr. Santorum made a point of defending the medieval Crusades against the “American left who hates Christendom.” Historical issues aside (hey, what are a few massacres of infidels and Jews among friends?), what was this doing in a 21st-century campaign?
  • Nor is this only about sex and religion: he has also declared that climate change is a hoax, part of a “beautifully concocted scheme” on the part of “the left” to provide “an excuse for more government control of your life.” You may say that such conspiracy-theorizing is hardly unique to Mr. Santorum, but that’s the point: tinfoil hats have become a common, if not mandatory, G.O.P. fashion accessory.
  • Then there’s Ron Paul, who came in a strong second in Maine’s caucuses despite widespread publicity over such matters as the racist (and conspiracy-minded) newsletters published under his name in the 1990s and his declarations that both the Civil War and the Civil Rights Act were mistakes. Clearly, a large segment of his party’s base is comfortable with views one might have thought were on the extreme fringe.
  • Finally, there’s Mr. Romney, who will probably get the nomination despite his evident failure to make an emotional connection with, well, anyone. The truth, of course, is that he was not a “severely conservative” governor. His signature achievement was a health reform identical in all important respects to the national reform signed into law by President Obama four years later. And in a rational political world, his campaign would be centered on that achievement.
Javier E

But What Would the End of Humanity Mean for Me? - James Hamblin - The Atlantic - 0 views

  • Tegmark is more worried about much more immediate threats, which he calls existential risks. That’s a term borrowed from physicist Nick Bostrom, director of Oxford University’s Future of Humanity Institute, a research collective modeling the potential range of human expansion into the cosmos
  • "I am finding it increasingly plausible that existential risk is the biggest moral issue in the world, even if it hasn’t gone mainstream yet,"
  • Existential risks, as Tegmark describes them, are things that are “not just a little bit bad, like a parking ticket, but really bad. Things that could really mess up or wipe out human civilization.”
  • ...17 more annotations...
  • The single existential risk that Tegmark worries about most is unfriendly artificial intelligence. That is, when computers are able to start improving themselves, there will be a rapid increase in their capacities, and then, Tegmark says, it’s very difficult to predict what will happen.
  • Tegmark told Lex Berko at Motherboard earlier this year, "I would guess there’s about a 60 percent chance that I’m not going to die of old age, but from some kind of human-caused calamity. Which would suggest that I should spend a significant portion of my time actually worrying about this. We should in society, too."
  • "Longer term—and this might mean 10 years, it might mean 50 or 100 years, depending on who you ask—when computers can do everything we can do," Tegmark said, “after that they will probably very rapidly get vastly better than us at everything, and we’ll face this question we talked about in the Huffington Post article: whether there’s really a place for us after that, or not.”
  • "This is very near-term stuff. Anyone who’s thinking about what their kids should study in high school or college should care a lot about this.”
  • Tegmark and his op-ed co-author Frank Wilczek, the Nobel laureate, draw examples of cold-war automated systems that assessed threats and resulted in false alarms and near misses. “In those instances some human intervened at the last moment and saved us from horrible consequences,” Wilczek told me earlier that day. “That might not happen in the future.”
  • there are still enough nuclear weapons in existence to incinerate all of Earth’s dense population centers, but that wouldn't kill everyone immediately. The smoldering cities would send sun-blocking soot into the stratosphere that would trigger a crop-killing climate shift, and that’s what would kill us all
  • “We are very reckless with this planet, with civilization,” Tegmark said. “We basically play Russian roulette.” The key is to think more long term, “not just about the next election cycle or the next Justin Bieber album.”
  • “There are several issues that arise, ranging from climate change to artificial intelligence to biological warfare to asteroids that might collide with the earth,” Wilczek said of the group’s launch. “They are very serious risks that don’t get much attention.
  • a widely perceived issue is when intelligent entities start to take on a life of their own. They revolutionized the way we understand chess, for instance. That’s pretty harmless. But one can imagine if they revolutionized the way we think about warfare or finance, either those entities themselves or the people that control them. It could pose some disquieting perturbations on the rest of our lives.”
  • Wilczek’s particularly concerned about a subset of artificial intelligence: drone warriors. “Not necessarily robots,” Wilczek told me, “although robot warriors could be a big issue, too. It could just be superintelligence that’s in a cloud. It doesn’t have to be embodied in the usual sense.”
  • it’s important not to anthropomorphize artificial intelligence. It's best to think of it as a primordial force of nature—strong and indifferent. In the case of chess, an A.I. models chess moves, predicts outcomes, and moves accordingly. If winning at chess meant destroying humanity, it might do that.
  • Even if programmers tried to program an A.I. to be benevolent, it could destroy us inadvertently. Andersen’s example in Aeon is that an A.I. designed to try and maximize human happiness might think that flooding your bloodstream with heroin is the best way to do that.
  • “It’s not clear how big the storm will be, or how long it’s going to take to get here. I don’t know. It might be 10 years before there’s a real problem. It might be 20, it might be 30. It might be five. But it’s certainly not too early to think about it, because the issues to address are only going to get more complex as the systems get more self-willed.”
  • Even within A.I. research, Tegmark admits, “There is absolutely not a consensus that we should be concerned about this.” But there is a lot of concern, and sense of lack of power. Because, concretely, what can you do? “The thing we should worry about is that we’re not worried.”
  • Tegmark brings it to Earth with a case-example about purchasing a stroller: If you could spend more for a good one or less for one that “sometimes collapses and crushes the baby, but nobody’s been able to prove that it is caused by any design flaw. But it’s 10 percent off! So which one are you going to buy?”
  • “There are seven billion of us on this little spinning ball in space. And we have so much opportunity," Tegmark said. "We have all the resources in this enormous cosmos. At the same time, we have the technology to wipe ourselves out.”
  • Ninety-nine percent of the species that have lived on Earth have gone extinct; why should we not? Seeing the biggest picture of humanity and the planet is the heart of this. It’s not meant to be about inspiring terror or doom. Sometimes that is what it takes to draw us out of the little things, where in the day-to-day we lose sight of enormous potentials.
Javier E

The Foolish, Historically Illiterate, Incredible Response to Obama's Prayer Breakfast Speech - The Atlantic - 0 views

  • Inveighing against the barbarism of ISIS, the president pointed out that it would be foolish to blame Islam, at large, for its atrocities. To make this point he noted that using religion to brutalize other people is neither a Muslim invention nor, in America, a foreign one: Lest we get on our high horse and think this is unique to some other place, remember that during the Crusades and the Inquisition, people committed terrible deeds in the name of Christ. In our home country, slavery and Jim Crow all too often was justified in the name of Christ.
  • The "all too often" could just as well be "almost always." There were a fair number of pretexts given for slavery and Jim Crow, but Christianity provided the moral justification
  • Christianity did not "cause" slavery, any more than Christianity "caused" the civil-rights movement. The interest in power is almost always accompanied by the need to sanctify that power. That is what the Muslim terrorists in ISIS are seeking to do today, and that is what Christian enslavers and Christian terrorists did for the lion's share of American history.
  • ...3 more annotations...
  • Stephens went on to argue that the "Christianization of the barbarous tribes of Africa" could only be accomplished through enslavement. And enslavement was not made possible through Robert's Rules of Order, but through a 250-year reign of mass torture, industrialized murder, and normalized rape—tactics which ISIS would find familiar. Its moral justification was not "because I said so," it was "Providence," "the curse against Canaan," "the Creator," "and Christianization." In just five years, 750,000 Americans died because of this peculiar mission of "Christianization." Many more died before, and many more died after. In his "Segregation Now" speech, George Wallace invokes God 27 times and calls the federal government opposing him "a system that is the very opposite of Christ."
  • That this relatively mild, and correct, point cannot be made without the comments being dubbed, "the most offensive I’ve ever heard a president make in my lifetime,” by a former Virginia governor gives you some sense of the limited tolerance for any honest conversation around racism in our politics.
  • related to that is the need to infantilize and deify our history. Pointing out that Americans have done, on their own soil, in the name of their own God, something similar to what ISIS is doing now does not make ISIS any less barbaric, or any more correct.
Javier E

Losing Our Touch - NYTimes.com - 0 views

  • Are we losing our senses? In our increasingly virtual world, are we losing touch with the sense of touch itself? And if so, so what?
  • Tactility is not blind immediacy — not merely sensorial but cognitive, too. Savoring is wisdom; in Latin, wisdom is “sapientia,” from “sapere,” to taste. These carnal senses make us human by keeping us in touch with things, by responding to people’s pain
  • But Aristotle did not win this battle of ideas. The Platonists prevailed and the Western universe became a system governed by “the soul’s eye.” Sight came to dominate the hierarchy of the senses, and was quickly deemed the appropriate ally of theoretical ideas.
  • ...6 more annotations...
  • Western philosophy thus sprang from a dualism between the intellectual senses, crowned by sight, and the lower “animal” senses, stigmatized by touch.
  • opto-centrism prevailed for over 2,000 years, culminating in our contemporary culture of digital simulation and spectacle. The eye continues to rule in what Roland Barthes once called our “civilization of the image.” The world is no longer our oyster, but our screen.
  • our current technology is arguably exacerbating our carnal alienation. While offering us enormous freedoms of fantasy and encounter, digital eros may also be removing us further from the flesh
  • The move toward excarnation is apparent in what is becoming more and more a fleshless society. In medicine, “bedside manner” and hand on pulse has ceded to the anonymous technologies of imaging in diagnosis and treatment. In war, hand-to-hand combat has been replaced by “targeted killing” via remote-controlled drones.
  • certain cyber engineers now envisage implanting transmission codes in brains so that we will not have to move a finger — or come into contact with another human being — to get what we want.
  • We need to return from head to foot, from brain to fingertip, from iCloud to earth. To close the distance, so that eros is more about proximity than proxy. So that soul becomes flesh, where it belongs. Such a move, I submit, would radically alter our “sense” of sex in our digital civilization. It would enhance the role of empathy, vulnerability and sensitivity in the art of carnal love, and ideally, in all of human relations. Because to love or be loved truly is to be able to say, “I have been touched.”
runlai_jiang

Black Representation in Government - 0 views

  • Although the 15th Amendment, passed in 1870, legally prohibited denying black men the right to vote, ongoing efforts to disenfranchise black voters prompted the passage of the Voting Rights Act in 1965. Prior to its passage, black voters were subject to literacy tests, false voting dates, and physical violence.
  • Constance Baker Motley was born in New Haven, Connecticut, in 1921. Motley became interested in civil rights matters after she was banned from a public beach for being black. She sought to understand the laws that were being used to oppress her. At an early age, Motley became a civil rights advocate and was motivated to improve the treatment received by black Americans. Soon after, she became president of the local NAACP youth council.
  • Harold Washington was born on April 15, 1922, in Chicago, Illinois. Washington attended DuSable High School but did not receive his diploma until after World War II, during which he served as a first sergeant in the Army Air Corps. He was honorably discharged in 1946 and went on to graduate from Roosevelt College (now Roosevelt University) in 1949, and from Northwestern University School of Law in 1952
Javier E

Chief Rabbi: atheism has failed. Only religion can defeat the new barbarians » The Spectator - 0 views

  • reading the new atheists.
  • Where is there the remotest sense that they have grappled with the real issues, which have nothing to do with science and the literal meaning of scripture and everything to do with the meaningfulness or otherwise of human life, the existence or non-existence of an objective moral order, the truth or falsity of the idea of human freedom, and the ability or inability of society to survive without the rituals, narratives and shared practices that create and sustain the social bond?
  • religion has social, cultural and political consequences, and you cannot expect the foundations of western civilisation to crumble and leave the rest of the building intact. That is what the greatest of all atheists, Nietzsche, understood
  • ...10 more annotations...
  • The history of Europe since the 18th century has been the story of successive attempts to find alternatives to God as an object of worship, among them the nation state, race and the Communist Manifesto. After this cost humanity two world wars, a Cold War and a hundred million lives, we have turned to more pacific forms of idolatry, among them the market, the liberal democratic state and the consumer society,
  • This is what a society built on materialism, individualism and moral relativism looks like. It maximises personal freedom but at a cost.
  • This freedom, energising and exciting as it is, is also profoundly disintegrative, making it very difficult for individuals to find any stable communal support, very difficult for any community to count on the responsible participation of its individual members. It opens solitary men and women to the impact of a lowest common denominator, commercial culture.’
  • In one respect the new atheists are right. The threat to western freedom in the 21st century is not from fascism or communism but from a religious fundamentalism combining hatred of the other, the pursuit of power and contempt for human rights.
  • But the idea that this can be defeated by individualism and relativism is naive almost beyond belief. Humanity has been here before.
  • The barbarians win. They always do.
  • The new barbarians are the fundamentalists who seek to impose a single truth on a plural world. Though many of them claim to be religious, they are actually devotees of the will to power. Defeating them will take the strongest possible defence of freedom, and strong societies are always moral societies
  • That does not mean that they need be religious. It is just that, in the words of historian Will Durant, ‘There is no significant example in history, before our time, of a society successfully maintaining moral life without the aid of religion.’
  • I have not yet found a secular ethic capable of sustaining in the long run a society of strong communities and families on the one hand, altruism, virtue, self-restraint, honour, obligation and trust on the other
  • A century after a civilisation loses its soul it loses its freedom also. That should concern all of us, believers and non-believers alike.
proudsa

An Open Letter to My Friends Who Support Donald Trump - 0 views

  • But I can't understand why you would support someone as hateful, sexist, racist and ignorant as Donald Trump.
    • proudsa
       
      Is it logic or instinct that leads people to vote for Trump?
  • It's not okay to marginalize an entire race of people, saying things like all the Mexicans are lazy, that they are all stealing our jobs and bringing drugs into our country.
  • We're all human. Some humans are really bad people. Some are really good. And it doesn't matter what color they are, it makes no difference whatsoever
    • proudsa
       
      Doesn't just have to do with Trump - an important overall lesson
  • ...10 more annotations...
  • Trump's supporters are angry, and anger is infectious.
  • We need the kind of leader that seeks to bring us together, not tear us apart.
  • Lucky for us, this isn't Grandma's house, so feel free to punch him in the mouth in the form of getting out and making your vote count.
  • but racism isn't one of them, neither is hate, neither is the belittling of women or the judgment of others based on their appearance or their disability, or their sexual preference.
  • Do you think empowered women will suddenly quit their jobs and go back to the kitchen? Because electing Trump won't make any of that come true. We're past that as a nation, or at least I thought we were.
  • Whatever led you to believe that racism is okay can be unlearned if you open your mind. I'm sorry that you were raised to believe that you deserve better treatment than the rest of the people on the planet that have different views than yours, worship different gods than you and have skin that isn't white.
  • I implore you to get out and vote against him. Don't let the progress of this great nation be halted. We've come too far.
  • The idea that certain religions are more dangerous than others and the idea that people should be judged based on the color of their skin rather than the content of their character.
  • And then there are just the plainly insane people who finally snap and go on shooting rampages for no discernible reason at all. They just went mad.
  • We're still healing from the damage inflicted by the Civil War, WWI, WWII, Vietnam, Iraq and the War on Terror. And it isn't just ISIS or Al-Qaeda.
proudsa

UN: World's Refugees And Displaced To Exceed 60 Million This Year - 0 views

  • The number of people forcibly displaced worldwide is likely to have "far surpassed" a record 60 million this year, mainly driven by the Syrian war and other protracted conflicts, the United Nations said on Friday.
  • Nearly 2.5 million asylum seekers have requests pending, with Germany, Russia and the United States receiving the highest numbers of the nearly one million new claims lodged in the first half of the year, it said.
  • Developing countries bordering conflict zones still host the lion's share of the refugees, the report said, warning about growing "resentment" and "politicization of refugees."
  • ...2 more annotations...
  • Syria's civil war that began in 2011 has been the main driver of mass displacement, with more than 4.2 million Syrian refugees having fled abroad and 7.6 million uprooted within their shattered homeland as of mid-year, UNHCR said.
  • Many refugees will live in exile for years to come, it said. "In effect, if you become a refugee today your chances of going home are lower than at any time in more than 30 years."