TOK Friends: Group items tagged "history"

sissij

Teachers: What kind of students do you remember (if any) and how long for? | Yahoo Answers - 1 views

  • I remember the troublemakers, of course. Those are hard to forget. I also remember the exceptionally good ones. I teach English, so I also remember those who had something very interesting to say, regardless of how well-said, either in class or in their papers.
  • To be honest, it's the mediocre ones that I might not remember. The ones who did well, but were very quiet and had sort of normal or typical ideas.
  • I remember the most troublesome students the most because they challenged me to reflect on and learn from my teaching. I always remember them every time I run into others of the same nature in the classroom. :)
  • I found this very interesting because it connects with our memory. Our memory has a predilection for extremes: what we tend to remember most are the outliers. It feels unfair that the worst people get remembered the most, while the ones who are hardworking but quiet are forgotten and lost to history. For example, although Trump's election was random, out of bounds, and in a way ridiculous, he will still be remembered and recorded in history. I feel sorry for the quiet contributors in this country who just get lost in history. --Sissi (11/21/2016)
Javier E

Some on the Left Turn Against the Label 'Progressive' - The New York Times - 0 views

  • Christopher Lasch, the historian and social critic, posed a banger of a question in his 1991 book, “The True and Only Heaven: Progress and Its Critics.”
  • “How does it happen,” Lasch asked, “that serious people continue to believe in progress, in the face of massive evidence that might have been expected to refute the idea of progress once and for all?”
  • A review in The New York Times Book Review by William Julius Wilson, a professor at Harvard, was titled: “Where Has Progress Got Us?”
  • ...17 more annotations...
  • Essentially, Lasch was attacking the notion, fashionable as Americans basked in their seeming victory over the Soviet Union in the Cold War, that history had a direction — and that one would be wise to stay on the “right side” of it.
  • Francis Fukuyama expressed a version of this triumphalist idea in his famous 1992 book, “The End of History and the Last Man,” in which he celebrated the notion that History with a capital “H,” in the sense of a battle between competing ideas, was ending with communism left to smolder on Ronald Reagan’s famous ash heap.
  • One of Martin Luther King Jr.’s most frequently quoted lines speaks to a similar thought, albeit in a different context: “The arc of the moral universe is long, but it bends toward justice.” Though he had read Lasch, Obama quoted that line often, just as he liked to say that so-and-so would be “on the wrong side of history” if they didn’t live up to his ideals — whether the issue was same-sex marriage, health policy or the Russian occupation of Crimea.
  • The memo goes on to list two sets of words: “Optimistic Positive Governing Words” and “Contrasting Words,” which carried negative connotations. One of the latter group was the word “liberal,” sandwiched between “intolerant” and “lie.”
  • So what’s the difference between a progressive and a liberal? To vastly oversimplify matters, liberal usually refers to someone on the center-left of a one-dimensional political spectrum, while progressive refers to someone further left.
  • But “liberal” has taken a beating in recent decades — from both left and right.
  • In the late 1980s and 1990s, Republicans successfully demonized the word “liberal,” to the point where many Democrats shied away from it in favor of labels like “conservative Democrat” or, more recently, “progressive.”
  • “Is the story of the 20th century about the defeat of the Soviet Union, or was it about two world wars and a Holocaust?” asked Matthew Sitman, the co-host of the “Know Your Enemy” podcast, which recently hosted a discussion on Lasch and the fascination many conservatives have with his ideas. “It really depends on how you look at it.”
  • None of this was an accident. In 1990, Representative Newt Gingrich of Georgia circulated a now-famous memo called “Language: A Key Mechanism of Control.”
  • The authors urged their readers: “The words and phrases are powerful. Read them. Memorize as many as possible.”
  • Republicans subsequently had a great deal of success in associating the term “liberal” with other words and phrases voters found uncongenial: wasteful spending, high rates of taxation and libertinism that repelled socially conservative voters.
  • Many on the left began identifying themselves as “progressive” — which had the added benefit of harking back to movements of the late 19th and early 20th centuries that fought against corruption, opposed corporate monopolies, pushed for good-government reforms and food safety and labor laws and established women’s right to vote.
  • Allies of Bill Clinton founded the Progressive Policy Institute, a think tank associated with so-called Blue Dog Democrats from the South.
  • Now, scrambling the terminology, groups like the Progressive Change Campaign Committee agitate on behalf of proudly left-wing candidates
  • In 2014, Charles Murray, the polarizing conservative scholar, urged readers of The Wall Street Journal’s staunchly right-wing editorial section to “start using ‘liberal’ to designate the good guys on the left, reserving ‘progressive’ for those who are enthusiastic about an unrestrained regulatory state.”
  • As Bernie Sanders and acolytes like Representative Alexandria Ocasio-Cortez of New York have gained prominence over the last few election cycles, many on the left-wing end of the spectrum have begun proudly applying other labels to themselves, such as “democratic socialist.”
  • To little avail so far, Michael Kazin, the Georgetown historian, has been urging them to call themselves “social democrats” instead — as many mainstream parties do in Europe. “It’s not a good way to win elections in this country, to call yourself a socialist,” he said.
Javier E

A Brief History of Media and Audiences and Twitter and The Bulwark - 0 views

  • In the old days—and here I mean even as recently as 2000 or 2004—audiences were built around media institutions. The New York Times had an audience. The New Yorker had an audience. The Weekly Standard had an audience.
  • If you were a writer, you got access to these audiences by contributing to the institutions. No one cared if you, John Smith, wrote a piece about Al Gore. But if your piece about Al Gore appeared in Washington Monthly, then suddenly you had an audience.
  • There were a handful of star writers for whom this wasn’t true: Maureen Dowd, Tom Wolfe, Joan Didion. Readers would follow these stars wherever they appeared. But they were the exceptions to the rule. And the only way to ascend to such exalted status was by writing a lot of great pieces for established institutions and slowly assembling your audience from theirs.
  • ...16 more annotations...
  • The internet stripped institutions of their gatekeeping powers, thus making it possible for anyone to publish—and making it inevitable that many writers would create audiences independent of media institutions.
  • The internet destroyed the apprenticeship system that had dominated American journalism for generations. Under the old system, an aspiring writer took a low-level job at a media institution and worked her way up the ladder until she was trusted enough to write.
  • Under the new system, people started their careers writing outside of institutions—on personal blogs—and then were hired by institutions on the strength of their work.
  • In practice, these outsiders were primarily hired not on the merits of their work, but because of the size of their audience.
  • what it really did was transform the nature of audiences. Once the internet existed it became inevitable that institutions would see their power to hold audiences wane while individual writers would have their power to build personal audiences explode.
  • this meant that institutions would begin to hire based on the size of a writer’s audience. Which meant that writers’ overriding professional imperative was to build an audience, since that was the key to advancement.
  • Twitter killed the blog and lowered the barrier to entry for new writers from “Must have a laptop, the ability to navigate WordPress, and the capacity to write paragraphs” to “Do you have an iPhone and the ability to string 20 words together? With or without punctuation?”
  • If you were able to build a big enough audience on Twitter, then media institutions fell all over themselves trying to hire you—because they believed that you would then bring your audience to them.
  • If you were a writer for the Washington Post, or Wired, or the Saginaw Express, you had to build your own audience not to advance, but to avoid being replaced.
  • For journalists, audience wasn’t just status—it was professional capital. In fact, it was the most valuable professional capital.
  • Everything we just talked about was driven by the advertising model of media, which prized pageviews and unique users above all else. About a decade ago, that model started to fray around the edges, which caused a shift to the subscription model.
  • Today, if you’re a subscription publication, what Twitter gives you is growth opportunity. Twitter’s not the only channel for growth—there are lots of others, from TikTok to LinkedIn to YouTube to podcasts to search. But it’s an important one.
  • Twitter’s attack on Substack was an attack on the subscription model of journalism itself.
  • since media has already seen the ad-based model fall apart, it’s not clear what the alternative will be if the subscription model dies, too.
  • All of which is why having a major social media platform run by a capricious bad actor is suboptimal.
  • And why I think anyone else who’s concerned about the future of media ought to start hedging against Twitter. None of the direct hedges—Post, Mastodon, etc.—are viable yet. But tech history shows that these shifts can happen fairly quickly.
markfrankel18

Erasing History in the Internet Era - NYTimes.com - 1 views

  • Lorraine Martin, a nurse in Greenwich, was arrested in 2010 with her two grown sons when police raided her home and found a small stash of marijuana, scales and plastic bags. The case against her was tossed out when she agreed to take some drug classes, and the official record was automatically purged. It was, the law seemed to assure her, as if it had never happened.
  • Defamation is the publication of information that is both damaging and false. The arrest story was obviously true when it was first published. But Connecticut’s erasure law has already established that truth can be fungible. Martin, her suit says, was “deemed never to have been arrested.” And therefore the news story had metamorphosed into a falsehood.
  • They debate the difference between “historical fact” and “legal fact.” They dispute whether something that was true when it happened can become not just private but actually untrue, so untrue you can swear an oath that it never happened and, in the eyes of the law, you’ll be telling the truth.
  • ...7 more annotations...
  • Google’s latest transparency report shows a sharp rise in requests from governments and courts to take down potentially damaging material.
  • In Europe, where press freedoms are less sacred and the right to privacy is more ensconced, the idea has taken hold that individuals have a “right to be forgotten,” and those who want their online particulars expunged tend to have the government on their side. In Germany or Spain, Lorraine Martin might have a winning case.
  • The Connecticut case is just one manifestation of an anxious backlash against the invasive power of the Internet, a world of Big Data and ever more powerful search engines, in which it seems almost everything is permanently recorded and accessible to almost anyone — potential employers, landlords, dates, predators
  • The Times’s policy is not to censor history, because it’s history. The paper will update an arrest story if presented with evidence of an acquittal or dismissal, completing the story but not deleting the story.
  • Owen Tripp, a co-founder of Reputation.com, which has made a business out of helping clients manage their digital profile, advocated a “right to be forgotten” in a YouTube video. Tripp said everyone is entitled to a bit of space to grow up, to experiment, to make mistakes.
  • “This is not just a privacy problem,” said Viktor Mayer-Schönberger, a professor at the Oxford Internet Institute, and author of “Delete: The Virtue of Forgetting in the Digital Age.” “If we are continually reminded about people’s mistakes, we are not able to judge them for who they are in the present. We need some way to put a speed-brake on the omnipresence of the past.”
  • He would like to see search engine companies — the parties that benefit the most financially from amassing our information — offer the kind of reputation-protecting tools that are now available only to those who can afford paid services like those of Reputation.com. Google, he points out, already takes down five million items a week because of claims that they violate copyrights. Why shouldn’t we expect Google to give users an option — and a simple process — to have news stories about them down-ranked or omitted from future search results? Good question. What’s so sacred about a search algorithm, anyway?
Javier E

Noam Chomsky on Where Artificial Intelligence Went Wrong - Yarden Katz - The Atlantic - 0 views

  • If you take a look at the progress of science, the sciences are kind of a continuum, but they're broken up into fields. The greatest progress is in the sciences that study the simplest systems. So take, say physics -- greatest progress there. But one of the reasons is that the physicists have an advantage that no other branch of sciences has. If something gets too complicated, they hand it to someone else.
  • If a molecule is too big, you give it to the chemists. The chemists, for them, if the molecule is too big or the system gets too big, you give it to the biologists. And if it gets too big for them, they give it to the psychologists, and finally it ends up in the hands of the literary critic, and so on.
  • neuroscience for the last couple hundred years has been on the wrong track. There's a fairly recent book by Randy Gallistel, a very good cognitive neuroscientist, and Adam King, arguing -- in my view, plausibly -- that neuroscience developed kind of enthralled to associationism and related views of the way humans and animals work. And as a result they've been looking for things that have the properties of associationist psychology.
  • ...19 more annotations...
  • in general what he argues is that if you take a look at animal cognition, human too, it's computational systems. Therefore, you want to look for the units of computation. Think about a Turing machine, say, which is the simplest form of computation, you have to find units that have properties like "read", "write" and "address." That's the minimal computational unit, so you got to look in the brain for those. You're never going to find them if you look for strengthening of synaptic connections or field properties, and so on. You've got to start by looking for what's there and what's working and you see that from Marr's highest level. (See the Turing-machine sketch after these annotations.)
  • it's basically in the spirit of Marr's analysis. So when you're studying vision, he argues, you first ask what kind of computational tasks is the visual system carrying out. And then you look for an algorithm that might carry out those computations and finally you search for mechanisms of the kind that would make the algorithm work. Otherwise, you may never find anything.
  • "Good Old Fashioned AI," as it's labeled now, made strong use of formalisms in the tradition of Gottlob Frege and Bertrand Russell, mathematical logic for example, or derivatives of it, like nonmonotonic reasoning and so on. It's interesting from a history of science perspective that even very recently, these approaches have been almost wiped out from the mainstream and have been largely replaced -- in the field that calls itself AI now -- by probabilistic and statistical models. My question is, what do you think explains that shift and is it a step in the right direction?
  • AI and robotics got to the point where you could actually do things that were useful, so it turned to the practical applications and somewhat, maybe not abandoned, but put to the side, the more fundamental scientific questions, just caught up in the success of the technology and achieving specific goals.
  • The approximating unanalyzed data kind is sort of a new approach, not totally, there's things like it in the past. It's basically a new approach that has been accelerated by the existence of massive memories, very rapid processing, which enables you to do things like this that you couldn't have done by hand. But I think, myself, that it is leading subjects like computational cognitive science into a direction of maybe some practical applicability... [Interviewer: ...in engineering?] Chomsky: ...But away from understanding.
  • I was very skeptical about the original work. I thought it was first of all way too optimistic, it was assuming you could achieve things that required real understanding of systems that were barely understood, and you just can't get to that understanding by throwing a complicated machine at it.
  • if success is defined as getting a fair approximation to a mass of chaotic unanalyzed data, then it's way better to do it this way than to do it the way the physicists do, you know, no thought experiments about frictionless planes and so on and so forth. But you won't get the kind of understanding that the sciences have always been aimed at -- what you'll get at is an approximation to what's happening.
  • Suppose you want to predict tomorrow's weather. One way to do it is okay I'll get my statistical priors, if you like, there's a high probability that tomorrow's weather here will be the same as it was yesterday in Cleveland, so I'll stick that in, and where the sun is will have some effect, so I'll stick that in, and you get a bunch of assumptions like that, you run the experiment, you look at it over and over again, you correct it by Bayesian methods, you get better priors. You get a pretty good approximation of what tomorrow's weather is going to be. That's not what meteorologists do -- they want to understand how it's working. And these are just two different concepts of what success means, of what achievement is.
  • if you get more and more data, and better and better statistics, you can get a better and better approximation to some immense corpus of text, like everything in The Wall Street Journal archives -- but you learn nothing about the language. (A toy bigram sketch of this point appears after these annotations.)
  • the right approach, is to try to see if you can understand what the fundamental principles are that deal with the core properties, and recognize that in the actual usage, there's going to be a thousand other variables intervening -- kind of like what's happening outside the window, and you'll sort of tack those on later on if you want better approximations, that's a different approach.
  • take a concrete example of a new field in neuroscience, called Connectomics, where the goal is to find the wiring diagram of very complex organisms, find the connectivity of all the neurons in say human cerebral cortex, or mouse cortex. This approach was criticized by Sidney Brenner, who in many ways is [historically] one of the originators of the approach. Advocates of this field don't stop to ask if the wiring diagram is the right level of abstraction -- maybe it's not.
  • if you went to MIT in the 1960s, or now, it's completely different. No matter what engineering field you're in, you learn the same basic science and mathematics. And then maybe you learn a little bit about how to apply it. But that's a very different approach. And it resulted maybe from the fact that really for the first time in history, the basic sciences, like physics, had something really to tell engineers. And besides, technologies began to change very fast, so not very much point in learning the technologies of today if it's going to be different 10 years from now. So you have to learn the fundamental science that's going to be applicable to whatever comes along next. And the same thing pretty much happened in medicine.
  • that's the kind of transition from something like an art, that you learn how to practice -- an analog would be trying to match some data that you don't understand, in some fashion, maybe building something that will work -- to science, what happened in the modern period, roughly Galilean science.
  • it turns out that there actually are neural circuits which are reacting to particular kinds of rhythm, which happen to show up in language, like syllable length and so on. And there's some evidence that that's one of the first things that the infant brain is seeking -- rhythmic structures. And going back to Gallistel and Marr, it's got some computational system inside which is saying "okay, here's what I do with these things" and say, by nine months, the typical infant has rejected -- eliminated from its repertoire -- the phonetic distinctions that aren't used in its own language.
  • people like Shimon Ullman discovered some pretty remarkable things like the rigidity principle. You're not going to find that by statistical analysis of data. But he did find it by carefully designed experiments. Then you look for the neurophysiology, and see if you can find something there that carries out these computations. I think it's the same in language, the same in studying our arithmetical capacity, planning, almost anything you look at. Just trying to deal with the unanalyzed chaotic data is unlikely to get you anywhere, just as it wouldn't have gotten Galileo anywhere.
  • with regard to cognitive science, we're kind of pre-Galilean, just beginning to open up the subject
  • You can invent a world -- I don't think it's our world -- but you can invent a world in which nothing happens except random changes in objects and selection on the basis of external forces. I don't think that's the way our world works, I don't think it's the way any biologist thinks it is. There are all kinds of ways in which natural law imposes channels within which selection can take place, and some things can happen and other things don't happen. Plenty of things that go on in the biology of organisms aren't like this. So take the first step, meiosis. Why do cells split into spheres and not cubes? It's not random mutation and natural selection; it's a law of physics. There's no reason to think that laws of physics stop there, they work all the way through. [Interviewer: Well, they constrain the biology, sure.] Chomsky: Okay, well then it's not just random mutation and selection. It's random mutation, selection, and everything that matters, like laws of physics.
  • What I think is valuable is the history of science. I think we learn a lot of things from the history of science that can be very valuable to the emerging sciences. Particularly when we realize that in, say, the emerging cognitive sciences, we really are in a kind of pre-Galilean stage. We don't know what we're looking for any more than Galileo did, and there's a lot to learn from that.
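A minimal sketch of the computational primitives Gallistel is quoted as describing above: reading a symbol at an address, writing one back, and moving the address pointer. The toy machine below and its unary-incrementer program are invented for illustration; they follow the interview's ideas, not its text.

```python
# Minimal Turing machine: "read", "write", and an "address" (the head)
# are the only memory primitives, per the Gallistel point quoted above.
# The program format and the unary incrementer are hypothetical examples.

def run_turing_machine(program, tape, state="start", head=0, max_steps=1000):
    """program maps (state, symbol) -> (next_state, symbol_to_write, move)."""
    cells = dict(enumerate(tape))            # addressable store: index -> symbol
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, "_")        # READ at the current address
        state, write, move = program[(state, symbol)]
        cells[head] = write                  # WRITE at the current address
        head += 1 if move == "R" else -1     # move the ADDRESS pointer
    return "".join(cells[i] for i in sorted(cells))

# Unary incrementer: skip over 1s, write a 1 in the first blank cell.
increment = {
    ("start", "1"): ("start", "1", "R"),
    ("start", "_"): ("halt", "1", "R"),
}

print(run_turing_machine(increment, "111"))  # prints "1111"
```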
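By contrast, here is a toy sketch of the data-driven approach Chomsky criticizes: a bigram model whose samples approximate a corpus's surface statistics ever more closely as the corpus grows, while encoding nothing one could call a principle of the language. The training text is an invented stand-in, not a real archive.

```python
# Toy bigram language model: it "approximates the corpus" with no grammar.
import random
from collections import defaultdict

corpus = ("the arc of history is long but the arc of the story is short "
          "and the story of history is the story of us").split()

transitions = defaultdict(list)              # word -> observed next words
for prev, nxt in zip(corpus, corpus[1:]):
    transitions[prev].append(nxt)

def generate(start, length=10):
    """Sample a sequence by replaying observed word-to-word transitions."""
    words = [start]
    while len(words) < length:
        followers = transitions.get(words[-1])
        if not followers:                    # dead end: no observed successor
            break
        words.append(random.choice(followers))
    return " ".join(words)

print(generate("the"))   # locally fluent, globally meaningless
```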
Javier E

Beating History: Why Today's Rising Powers Can't Copy the West - Heather Horn - Interna... - 0 views

  • For the BRIC rising economies -- Brazil, Russia, India, and China -- what can be learned by looking at the rise of powers throughout history?
  • production in "all organic economies was set by the annual cycle of plant growth" -- it limits food, fuel, fiber, and building materials. Coal changed all that. By digging into the earth to get minerals instead of growing fuel on the earth, you get a vastly more efficient source of fuel and completely change the rules of the game. You've shifted from an organic economy, as he calls it, to an energy-rich economy. But the economic reprieve the fossil fuels offered could be nearing an end, as global supply becomes more competitive.
  • Historians still debate the nature and causes of the Industrial Revolution, but one thing they seem to agree on is that it wasn't just industrial -- it was demographic and agricultural as well. Prior to the Industrial Revolution, populations all over the globe had non-negotiable checks on their growth: too many people and you get ecological crises and famines to push the number back down. In the 18th and 19th centuries, England managed to solve this problem, with tremendous leaps in population and urbanization as a result.
  • ...3 more annotations...
  • What the rise of the BRICs symbolizes to both panicked individuals in the West and optimistic ones elsewhere is a radical shift in the geography of power -- a catch-up or reversal of the Western global dominance that was established with the Industrial Revolution.
  • developing countries won't be able to follow the West's path to becoming rich, because that path required certain things that were largely particular to that one period in history.
  • The challenge ahead for the BRICs, then, is to figure out how to maintain growth in a world where the vast new frontier opened up by the Industrial Revolution appears to be closing. The BRICs can play the West's game better than the West, both through technological innovation and population growth, but only for so long. The whole world has to figure out a way of dealing with energy and agriculture.
Javier E

The American Scholar: The Decline of the English Department - William M. Chace - 1 views

  • The number of young men and women majoring in English has dropped dramatically; the same is true of philosophy, foreign languages, art history, and kindred fields, including history. As someone who has taught in four university English departments over the last 40 years, I am dismayed by this shift, as are my colleagues here and there across the land. And because it is probably irreversible, it is important to attempt to sort out the reasons—the many reasons—for what has happened.
  • English: from 7.6 percent of the majors to 3.9 percent
  • In one generation, then, the numbers of those majoring in the humanities dropped from a total of 30 percent to a total of less than 16 percent; during that same generation, business majors climbed from 14 percent to 22 percent.
  • ...23 more annotations...
  • History: from 18.5 percent to 10.7 percent
  • But the deeper explanation resides not in something that has happened to it, but in what it has done to itself. English has become less and less coherent as a discipline and, worse, has come near exhaustion as a scholarly pursuit.
  • The twin focus, then, was on the philological nature of the enterprise and the canon of great works to be studied in their historical evolution.
  • Studying English taught us how to write and think better, and to make articulate many of the inchoate impulses and confusions of our post-adolescent minds. We began to see, as we had not before, how such books could shape and refine our thinking. We began to understand why generations of people coming before us had kept them in libraries and bookstores and in classes such as ours. There was, we got to know, a tradition, a historical culture, that had been assembled around these books. Shakespeare had indeed made a difference—to people before us, now to us, and forever to the language of English-speaking people.
  • today there are stunning changes in the student population: there are more and more gifted and enterprising students coming from immigrant backgrounds, students with only slender connections to Western culture and to the assumption that the “great books” of England and the United States should enjoy a fixed centrality in the world. What was once the heart of the matter now seems provincial. Why throw yourself into a study of something not emblematic of the world but representative of a special national interest? As the campus reflects the cultural, racial, and religious complexities of the world around it, reading British and American literature looks more and more marginal. From a global perspective, the books look smaller.
  • With the cost of a college degree surging upward during the last quarter century—tuition itself increasing far beyond any measure of inflation—and with consequent growth in loan debt after graduation, parents have become anxious about the relative earning power of a humanities degree. Their college-age children doubtless share such anxiety. When college costs were lower, anxiety could be kept at bay. (Berkeley in the early ’60s cost me about $100 a year, about $700 in today’s dollars.)
  • Economists, chemists, biologists, psychologists, computer scientists, and almost everyone in the medical sciences win sponsored research, grants, and federal dollars. By and large, humanists don’t, and so they find themselves as direct employees of the institution, consuming money in salaries, pensions, and operating needs—not external money but institutional money.
  • These, then, are some of the external causes of the decline of English: the rise of public education; the relative youth and instability (despite its apparent mature solidity) of English as a discipline; the impact of money; and the pressures upon departments within the modern university to attract financial resources rather than simply use them up.
  • several of my colleagues around the country have called for a return to the aesthetic wellsprings of literature, the rock-solid fact, often neglected, that it can indeed amuse, delight, and educate. They urge the teaching of English, or French, or Russian literature, and the like, in terms of the intrinsic value of the works themselves, in all their range and multiplicity, as well-crafted and appealing artifacts of human wisdom. Second, we should redefine our own standards for granting tenure, placing more emphasis on the classroom and less on published research, and we should prepare to contest our decisions with administrators whose science-based model is not an appropriate means of evaluation.
  • “It may be that what has happened to the profession is not the consequence of social or philosophical changes, but simply the consequence of a tank now empty.” His homely metaphor pointed to the absence of genuinely new frontiers of knowledge and understanding for English professors to explore.
  • In this country and in England, the study of English literature began in the latter part of the 19th century as an exercise in the scientific pursuit of philological research, and those who taught it subscribed to the notion that literature was best understood as a product of language.
  • no one has come forward in years to assert that the study of English (or comparative literature or similar undertakings in other languages) is coherent, does have self-limiting boundaries, and can be described as this but not that.
  • to teach English today is to do, intellectually, what one pleases. No sense of duty remains toward works of English or American literature; amateur sociology or anthropology or philosophy or comic books or studies of trauma among soldiers or survivors of the Holocaust will do. You need not even believe that works of literature have intelligible meaning; you can announce that they bear no relationship at all to the world beyond the text.
  • With everything on the table, and with foundational principles abandoned, everyone is free, in the classroom or in prose, to exercise intellectual laissez-faire in the largest possible way—I won’t interfere with what you do and am happy to see that you will return the favor
  • Consider the English department at Harvard University. It has now agreed to remove its survey of English literature for undergraduates, replacing it and much else with four new “affinity groups”
  • there would be no one book, or family of books, that every English major at Harvard would have read by the time he or she graduates. The direction to which Harvard would lead its students in this “clean slate” or “trickle down” experiment is to suspend literary history, thrusting into the hands of undergraduates the job of cobbling together intellectual coherence for themselves
  • Those who once strove to give order to the curriculum will have learned, from Harvard, that terms like core knowledge and foundational experience only trigger acrimony, turf protection, and faculty mutinies. No one has the stomach anymore to refight the Western culture wars. Let the students find their own way to knowledge.
  • In English, the average number of years spent earning a doctoral degree is almost 11. After passing that milestone, only half of new Ph.D.’s find teaching jobs, the number of new positions having declined over the last year by more than 20 percent; many of those jobs are part-time or come with no possibility of tenure. News like that, moving through student networks, can be matched against, at least until recently, the reputed earning power of recent graduates of business schools, law schools, and medical schools. The comparison is akin to what young people growing up in Rust Belt cities are forced to see: the work isn’t here anymore; our technology is obsolete.
  • unlike other members of the university community, they might well have been plying their trade without proper credentials: “Whereas economists or physicists, geologists or climatologists, physicians or lawyers must master a body of knowledge before they can even think of being licensed to practice,” she said, “we literary scholars, it is tacitly assumed, have no definable expertise.”
  • English departments need not refight the Western culture wars. But they need to fight their own book wars. They must agree on which texts to teach and argue out the choices and the principles of making them if they are to claim the respect due a department of study.
  • They can teach their students to write well, to use rhetoric. They should place their courses in composition and rhetoric at the forefront of their activities. They should announce that the teaching of composition is a skill their instructors have mastered and that students majoring in English will be certified, upon graduation, as possessing rigorously tested competence in prose expression.
  • The study of literature will then take on the profile now held, with moderate dignity, by the study of the classics, Greek and Latin.
  • But we can, we must, do better. At stake are the books themselves and what they can mean to the young. Yes, it is just a literary tradition. That’s all. But without such traditions, civil societies have no compass to guide them.
Javier E

In New Textbook, the Story of Singapore Begins 500 Years Earlier - NYTimes.com - 0 views

  • Why did it take 30 years to change the story? “It takes overwhelming evidence to shift the mind-set of a people from one image of its past to another,”
  • Professor Miksic gives credit for the new history lesson to former students who have reached positions of authority in academia and in the Ministry of Education.
  • Professor Heng surmised that one reason it had taken so long to change the narrative may have been the government’s fears of communal conflict in the 1960s and ’70s.
  • ...4 more annotations...
  • “If Singapore before 1800 was a sleepy backwater, the Chinese majority could say, ‘We built Singapore; before it was a blank slate,”’ he said.
  • Other factors also may help explain the timing of the rewrite. “Now is a good time,” Professor Heng said. “There’s a need to develop a collective social memory. It’s become a political issue.”
  • “Every generation has to rewrite its history,” he said. While it used to suit Singapore to see itself as a city-state with a British heritage, modern Singapore needs a different interpretation of history to reinforce a more global perspective, he suggested.
  • Professor Miksic goes a step further. “A short history puts a nation on shaky ground; a shallowly rooted place could be overturned quickly,” he said. “If you can show a long cohabitation between the Malays and the Chinese, it proves you have a pretty stable arrangement.”
Javier E

The Glut of Unqualified History Teachers - 0 views

  • The most recent data from the National Center for Education Statistics (1998) show that 55.6 percent of those who teach two or more classes of social studies in grades 9-12 do not have either a major or a minor in history. (There is no reason to believe that the situation has improved since these data were collected.)
  • it takes a well-educated teacher to bring these developments to life, to enable students to understand the connection among events, and to truly grasp the importance of the post-Civil War constitutional amendments.
  • there is no getting away from the fact that teachers will not be able to bring the study of history to life unless they know enough history to get beyond whatever the textbook says.
Javier E

E.D. Hirsch Jr.'s 'Cultural Literacy' in the 21st Century - The Atlantic - 0 views

  • much of this angst can be interpreted as part of a noisy but inexorable endgame: the end of white supremacy. From this vantage point, Americanness and whiteness are fitfully, achingly, but finally becoming delinked—and like it or not, over the course of this generation, Americans are all going to have to learn a new way to be American.
  • What is the story of “us” when “us” is no longer by default “white”? The answer, of course, will depend on how aware Americans are of what they are, of what their culture already (and always) has been.
  • The thing about the list, though, was that it was—by design—heavy on the deeds and words of the “dead white males” who had formed the foundations of American culture but who had by then begun to fall out of academic fashion.
  • ...38 more annotations...
  • Conservatives thus embraced Hirsch eagerly and breathlessly. He was a stout defender of the patrimony. Liberals eagerly and breathlessly attacked him with equal vigor. He was retrograde, Eurocentric, racist, sexist.
  • Lost in all the crossfire, however, were two facts: First, Hirsch, a lifelong Democrat who considered himself progressive, believed his enterprise to be in service of social justice and equality. Cultural illiteracy, he argued, is most common among the poor and power-illiterate, and compounds both their poverty and powerlessness. Second: He was right.
  • A generation of hindsight now enables Americans to see that it is indeed necessary for a nation as far-flung and entropic as the United States, one where rising economic inequality begets worsening civic inequality, to cultivate continuously a shared cultural core. A vocabulary. A set of shared referents and symbols.
  • So, first of all, Americans do need a list. But second, it should not be Hirsch’s list. And third, it should not made the way he made his. In the balance of this essay, I want to unpack and explain each of those three statements.
  • If you take the time to read the book attached to Hirsch’s appendix, you’ll find a rather effective argument about the nature of background knowledge and public culture. Literacy is not just a matter of decoding the strings of letters that make up words or the meaning of each word in sequence. It is a matter of decoding context: the surrounding matrix of things referred to in the text and things implied by it
  • That means understanding what’s being said in public, in the media, in colloquial conversation. It means understanding what’s not being said. Literacy in the culture confers power, or at least access to power. Illiteracy, whether willful or unwitting, creates isolation from power.
  • his point about background knowledge and the content of shared public culture extends well beyond schoolbooks. They are applicable to the “texts” of everyday life, in commercial culture, in sports talk, in religious language, in politics. In all cases, people become literate in patterns—“schema” is the academic word Hirsch uses. They come to recognize bundles of concept and connotation like “Party of Lincoln.” They perceive those patterns of meaning the same way a chess master reads an in-game chessboard or the way a great baseball manager reads an at bat. And in all cases, pattern recognition requires literacy in particulars.
  • Lots and lots of particulars. This isn’t, or at least shouldn’t be, an ideologically controversial point. After all, parents on both left and right have come to accept recent research that shows that the more spoken words an infant or toddler hears, the more rapidly she will learn and advance in school. Volume and variety matter. And what is true about the vocabulary of spoken or written English is also true, one fractal scale up, about the vocabulary of American culture.
  • those who demonized Hirsch as a right-winger missed the point. Just because an endeavor requires fluency in the past does not make it worshipful of tradition or hostile to change.
  • radicalism is made more powerful when garbed in traditionalism. As Hirsch put it: “To be conservative in the means of communication is the road to effectiveness in modern life, in whatever direction one wishes to be effective.”
  • Hence, he argued, an education that in the name of progressivism disdains past forms, schema, concepts, figures, and symbols is an education that is in fact anti-progressive and “helps preserve the political and economic status quo.” This is true. And it is made more urgently true by the changes in American demography since Hirsch gave us his list in 1987.
  • If you are an immigrant to the United States—or, if you were born here but are the first in your family to go to college, and thus a socioeconomic new arrival; or, say, a black citizen in Ferguson, Missouri deciding for the first time to participate in a municipal election, and thus a civic neophyte—you have a single overriding objective shared by all immigrants at the moment of arrival: figure out how stuff really gets done here.
  • So, for instance, a statement like “One hundred and fifty years after Appomattox, our house remains deeply divided” assumes that the reader knows that Appomattox is both a place and an event; that the event signified the end of a war; that the war was the Civil War and had begun during the presidency of a man, Abraham Lincoln, who earlier had famously declared that “a house divided against itself cannot stand”; that the divisions then were in large part about slavery; and that the divisions today are over the political, social, and economic legacies of slavery and how or whether we are to respond to those legacies.
  • But why a list, one might ask? Aren’t lists just the very worst form of rote learning and standardized, mechanized education? Well, yes and no.
  • it’s not just newcomers who need greater command of common knowledge. People whose families have been here ten generations are often as ignorant about American traditions, mores, history, and idioms as someone “fresh off the boat.”
  • The more serious challenge, for Americans new and old, is to make a common culture that’s greater than the sum of our increasingly diverse parts. It’s not enough for the United States to be a neutral zone where a million little niches of identity might flourish; in order to make our diversity a true asset, Americans need those niches to be able to share a vocabulary. Americans need to be able to have a broad base of common knowledge so that diversity can be most fully activated.
  • as the pool of potential culture-makers has widened, the modes of culture creation have similarly shifted away from hierarchies and institutions to webs and networks. Wikipedia is the prime embodiment of this reality, both in how the online encyclopedia is crowd-created and how every crowd-created entry contains links to other entries.
  • so any endeavor that makes it easier for those who do not know the memes and themes of American civic life to attain them closes the opportunity gap. It is inherently progressive.
  • since I started writing this essay, dipping into the list has become a game my high-school-age daughter and I play together.
  • I’ll name each of those entries, she’ll describe what she thinks to be its meaning. If she doesn’t know, I’ll explain it and give some back story. If I don’t know, we’ll look it up together. This of course is not a good way for her teachers to teach the main content of American history or English. But it is definitely a good way for us both to supplement what school should be giving her.
  • And however long we end up playing this game, it is already teaching her a meta-lesson about the importance of cultural literacy. Now anytime a reference we’ve discussed comes up in the news or on TV or in dinner conversation, she can claim ownership. Sometimes she does so proudly, sometimes with a knowing look. My bet is that the satisfaction of that ownership, and the value of it, will compound as the years and her education progress.
  • The trouble is, there are also many items on Hirsch’s list that don’t seem particularly necessary for entry into today’s civic and economic mainstream.
  • Which brings us back to why diversity matters. The same diversity that makes it necessary to have and to sustain a unifying cultural core demands that Americans make the core less monochromatic, more inclusive, and continuously relevant for contemporary life
  • it’s worth unpacking the baseline assumption of both Hirsch’s original argument and the battles that erupted around it. The assumption was that multiculturalism sits in polar opposition to a traditional common culture, that the fight between multiculturalism and the common culture was zero-sum.
  • As scholars like Ronald Takaki made clear in books like A Different Mirror, the dichotomy made sense only to the extent that one imagined that nonwhite people had had no part in shaping America until they started speaking up in the second half of the twentieth century.
  • The truth, of course, is that since well before the formation of the United States, the United States has been shaped by nonwhites in its mores, political structures, aesthetics, slang, economic practices, cuisine, dress, song, and sensibility.
  • In its serious forms, multiculturalism never asserted that every racial group should have its own sealed and separate history or that each group’s history was equally salient to the formation of the American experience. It simply claimed that the omni-American story—of diversity and hybridity—was the legitimate American story.
  • as Nathan Glazer has put it (somewhat ruefully), “We are all multiculturalists now.” Americans have come to see—have chosen to see—that multiculturalism is not at odds with a single common culture; it is a single common culture.
  • it is true that in a finite school year, say, with finite class time and books of finite heft, not everything about everyone can be taught. There are necessary trade-offs. But in practice, recognizing the true and longstanding diversity of American identity is not an either-or. Learning about the internment of Japanese Americans does not block out knowledge of D-Day or Midway. It is additive.
  • As more diverse voices attain ever more forms of reach and power we need to re-integrate and reimagine Hirsch’s list of what literate Americans ought to know.
  • To be clear: A 21st-century omni-American approach to cultural literacy is not about crowding out “real” history with the perishable stuff of contemporary life. It’s about drawing lines of descent from the old forms of cultural expression, however formal, to their progeny, however colloquial.
  • Nor is Omni-American cultural literacy about raising the “self-esteem” of the poor, nonwhite, and marginalized. It’s about raising the collective knowledge of all—and recognizing that the wealthy, white, and powerful also have blind spots and swaths of ignorance
  • What, then, would be on your list? It’s not an idle question. It turns out to be the key to rethinking how a list should even get made.
  • the Internet has transformed who makes culture and how. As barriers to culture creation have fallen, orders of magnitude more citizens—amateurs—are able to shape the culture in which we must all be literate. Cat videos and Star Trek fan fiction may not hold up long beside Toni Morrison. But the entry of new creators leads to new claims of right: The right to be recognized. The right to be counted. The right to make the means of recognition and accounting.
  • It is true that lists alone, with no teaching to bring them to life and no expectation that they be connected to a broader education, are somewhere between useless and harmful.
  • This will be a list of nodes and nested networks. It will be a fractal of associations, which reflects far more than a linear list how our brains work and how we learn and create. Hirsch himself nodded to this reality in Cultural Literacy when he described the process he and his colleagues used for collecting items for their list, though he raised it by way of pointing out the danger of infinite regress.
  • His conclusion, appropriate to his times, was that you had to draw boundaries somewhere with the help of experts. My take, appropriate to our times, is that Americans can draw not boundaries so much as circles and linkages, concept sets and pathways among them.
  • Because 5,000 or even 500 items is too daunting a place to start, I ask here only for your top ten. What are ten things every American—newcomer or native born, affluent or indigent—should know? What ten things do you feel are both required knowledge and illuminating gateways to those unenlightened about American life? Here are my entries:
    1. Whiteness
    2. The Federalist Papers
    3. The Almighty Dollar
    4. Organized labor
    5. Reconstruction
    6. Nativism
    7. The American Dream
    8. The Reagan Revolution
    9. DARPA
    10. A sucker born every minute
Javier E

History of Lynchings in the South Documents Nearly 4,000 Names - NYTimes.com - 0 views

  • The authors of the report compiled an inventory of 3,959 victims of “racial terror lynchings” in 12 Southern states from 1877 to 1950.
  • Next comes the process of selecting lynching sites where the organization plans to erect markers and memorials, which will involve significant fund-raising, negotiations with distrustful landowners and, almost undoubtedly, intense controversy. The process is intended, Mr. Stevenson said, to force people to reckon with the narrative through-line of the country’s vicious racial history, rather than thinking of that history in a short-range, piecemeal way.
  • Around the country, there are only a few markers noting the sites of lynchings. In several of those places, like Newnan, Ga., attempts to erect markers were met with local resistance. But in most places, no one has tried to put up a marker.
  • ...3 more annotations...
  • Among Professor Beck’s findings was that the number of lynchings did not rise or fall in proportion to the number of state-sanctioned executions, underscoring what Mr. Stevenson said was a crucial point: that these brutal deaths were not about administering popular justice, but terrorizing a community.
  • “Many of these lynchings were not executing people for crimes but executing people for violating the racial hierarchy,” he said, meaning offenses such as bumping up against a white woman or wearing an Army uniform.
  • But, he continued, even when a major crime was alleged, the refusal to grant a black man a trial — despite the justice system’s near certain outcome — and the public extravagance of a lynching were clearly intended as a message to other African-Americans.
Javier E

They're Watching You at Work - Don Peck - The Atlantic - 2 views

  • Predictive statistical analysis, harnessed to big data, appears poised to alter the way millions of people are hired and assessed.
  • By one estimate, more than 98 percent of the world’s information is now stored digitally, and the volume of that data has quadrupled since 2007.
  • The application of predictive analytics to people’s careers—an emerging field sometimes called “people analytics”—is enormously challenging, not to mention ethically fraught
  • ...52 more annotations...
  • By the end of World War II, however, American corporations were facing severe talent shortages. Their senior executives were growing old, and a dearth of hiring from the Depression through the war had resulted in a shortfall of able, well-trained managers. Finding people who had the potential to rise quickly through the ranks became an overriding preoccupation of American businesses. They began to devise a formal hiring-and-management system based in part on new studies of human behavior, and in part on military techniques developed during both world wars, when huge mobilization efforts and mass casualties created the need to get the right people into the right roles as efficiently as possible. By the 1950s, it was not unusual for companies to spend days with young applicants for professional jobs, conducting a battery of tests, all with an eye toward corner-office potential.
  • But companies abandoned their hard-edged practices for another important reason: many of their methods of evaluation turned out not to be very scientific.
  • this regime, so widespread in corporate America at mid-century, had almost disappeared by 1990. “I think an HR person from the late 1970s would be stunned to see how casually companies hire now,”
  • Many factors explain the change, he said, and then he ticked off a number of them: Increased job-switching has made it less important and less economical for companies to test so thoroughly. A heightened focus on short-term financial results has led to deep cuts in corporate functions that bear fruit only in the long term. The Civil Rights Act of 1964, which exposed companies to legal liability for discriminatory hiring practices, has made HR departments wary of any broadly applied and clearly scored test that might later be shown to be systematically biased.
  • about a quarter of the country’s corporations were using similar tests to evaluate managers and junior executives, usually to assess whether they were ready for bigger roles.
  • He has encouraged the company’s HR executives to think about applying the games to the recruitment and evaluation of all professional workers.
  • Knack makes app-based video games, among them Dungeon Scrawl, a quest game requiring the player to navigate a maze and solve puzzles, and Wasabi Waiter, which involves delivering the right sushi to the right customer at an increasingly crowded happy hour. These games aren’t just for play: they’ve been designed by a team of neuroscientists, psychologists, and data scientists to suss out human potential. Play one of them for just 20 minutes, says Guy Halfteck, Knack’s founder, and you’ll generate several megabytes of data, exponentially more than what’s collected by the SAT or a personality test. How long you hesitate before taking every action, the sequence of actions you take, how you solve problems—all of these factors and many more are logged as you play, and then are used to analyze your creativity, your persistence, your capacity to learn quickly from mistakes, your ability to prioritize, and even your social intelligence and personality. The end result, Halfteck says, is a high-resolution portrait of your psyche and intellect, and an assessment of your potential as a leader or an innovator.
  • When the results came back, Haringa recalled, his heart began to beat a little faster. Without ever seeing the ideas, without meeting or interviewing the people who’d proposed them, without knowing their title or background or academic pedigree, Knack’s algorithm had identified the people whose ideas had panned out. The top 10 percent of the idea generators as predicted by Knack were in fact those who’d gone furthest in the process.
  • What Knack is doing, Haringa told me, “is almost like a paradigm shift.” It offers a way for his GameChanger unit to avoid wasting time on the 80 people out of 100—nearly all of whom look smart, well-trained, and plausible on paper—whose ideas just aren’t likely to work out.
  • Aptitude, skills, personal history, psychological stability, discretion, loyalty—companies at the time felt they had a need (and the right) to look into them all. That ambit is expanding once again, and this is undeniably unsettling. Should the ideas of scientists be dismissed because of the way they play a game? Should job candidates be ranked by what their Web habits say about them? Should the “data signature” of natural leaders play a role in promotion? These are all live questions today, and they prompt heavy concerns: that we will cede one of the most subtle and human of skills, the evaluation of the gifts and promise of other people, to machines; that the models will get it wrong; that some people will never get a shot in the new workforce.
  • scoring distance from work could violate equal-employment-opportunity standards. Marital status? Motherhood? Church membership? “Stuff like that,” Meyerle said, “we just don’t touch”—at least not in the U.S., where the legal environment is strict. Meyerle told me that Evolv has looked into these sorts of factors in its work for clients abroad, and that some of them produce “startling results.”
  • consider the alternative. A mountain of scholarly literature has shown that the intuitive way we now judge professional potential is rife with snap judgments and hidden biases, rooted in our upbringing or in deep neurological connections that doubtless served us well on the savanna but would seem to have less bearing on the world of work.
  • We may like to think that society has become more enlightened since those days, and in many ways it has, but our biases are mostly unconscious, and they can run surprisingly deep. Consider race. For a 2004 study called “Are Emily and Greg More Employable Than Lakisha and Jamal?,” the economists Sendhil Mullainathan and Marianne Bertrand put white-sounding names (Emily Walsh, Greg Baker) or black-sounding names (Lakisha Washington, Jamal Jones) on similar fictitious résumés, which they then sent out to a variety of companies in Boston and Chicago. To get the same number of callbacks, they learned, they needed to either send out half again as many résumés with black names as those with white names, or add eight extra years of relevant work experience to the résumés with black names.
  • Lauren Rivera, a sociologist at Northwestern, spent parts of the three years from 2006 to 2008 interviewing professionals from elite investment banks, consultancies, and law firms about how they recruited, interviewed, and evaluated candidates, and concluded that among the most important factors driving their hiring recommendations were—wait for it—shared leisure interests.
  • Lacking “reliable predictors of future performance,” Rivera writes, “assessors purposefully used their own experiences as models of merit.” Former college athletes “typically prized participation in varsity sports above all other types of involvement.” People who’d majored in engineering gave engineers a leg up, believing they were better prepared.
  • the prevailing system of hiring and management in this country involves a level of dysfunction that should be inconceivable in an economy as sophisticated as ours. Recent survey data collected by the Corporate Executive Board, for example, indicate that nearly a quarter of all new hires leave their company within a year of their start date, and that hiring managers wish they’d never extended an offer to one out of every five members on their team
  • In the late 1990s, as these assessments shifted from paper to digital formats and proliferated, data scientists started doing massive tests of what makes for a successful customer-support technician or salesperson. This has unquestionably improved the quality of the workers at many firms.
  • In 2010, however, Xerox switched to an online evaluation that incorporates personality testing, cognitive-skill assessment, and multiple-choice questions about how the applicant would handle specific scenarios that he or she might encounter on the job. An algorithm behind the evaluation analyzes the responses, along with factual information gleaned from the candidate’s application, and spits out a color-coded rating: red (poor candidate), yellow (middling), or green (hire away). Those candidates who score best, I learned, tend to exhibit a creative but not overly inquisitive personality, and participate in at least one but not more than four social networks, among many other factors. (Previous experience, one of the few criteria that Xerox had explicitly screened for in the past, turns out to have no bearing on either productivity or retention.)
  • When Xerox started using the score in its hiring decisions, the quality of its hires immediately improved. The rate of attrition fell by 20 percent in the initial pilot period, and over time, the number of promotions rose. Xerox still interviews all candidates in person before deciding to hire them, Morse told me, but, she added, “We’re getting to the point where some of our hiring managers don’t even want to interview anymore”
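A toy illustration of a color-coded screen of this kind, built only from the signals the excerpt mentions (creativity, inquisitiveness, number of social networks). The weights and thresholds are invented; Xerox's real model is proprietary and learned from data rather than hand-tuned.

```python
# Hypothetical color-coded screen; weights and thresholds are invented,
# not Xerox's proprietary model.
def screen(candidate):
    score = 2.0 * candidate["creativity"] - 1.0 * candidate["inquisitiveness"]
    # Membership in one to four social networks was a positive signal.
    score += 1.0 if 1 <= candidate["social_networks"] <= 4 else -1.0
    # Previous experience had no bearing, so it carries zero weight.
    if score >= 2.0:
        return "green"   # hire away
    return "yellow" if score >= 0.5 else "red"

print(screen({"creativity": 0.9, "inquisitiveness": 0.3, "social_networks": 2}))
# -> green
```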
  • Gone are the days, Ostberg told me, when, say, a small survey of college students would be used to predict the statistical validity of an evaluation tool. “We’ve got a data set of 347,000 actual employees who have gone through these different types of assessments or tools,” he told me, “and now we have performance-outcome data, and we can split those and slice and dice by industry and location.”
  • Evolv’s tests allow companies to capture data about everybody who applies for work, and everybody who gets hired—a complete data set from which sample bias, long a major vexation for industrial-organization psychologists, simply disappears. The sheer number of observations that this approach makes possible allows Evolv to say with precision which attributes matter more to the success of retail-sales workers (decisiveness, spatial orientation, persuasiveness) or customer-service personnel at call centers (rapport-building)
  • There are some data that Evolv simply won’t use, out of a concern that the information might lead to systematic bias against whole classes of people
  • the idea that hiring was a science fell out of favor. But now it’s coming back, thanks to new technologies and methods of analysis that are cheaper, faster, and much-wider-ranging than what we had before
  • what most excites him are the possibilities that arise from monitoring the entire life cycle of a worker at any given company.
  • Now the two companies are working together to marry pre-hire assessments to an increasing array of post-hire data: about not only performance and duration of service but also who trained the employees; who has managed them; whether they were promoted to a supervisory role, and how quickly; how they performed in that role; and why they eventually left.
  • What begins with an online screening test for entry-level workers ends with the transformation of nearly every aspect of hiring, performance assessment, and management.
  • I turned to Sandy Pentland, the director of the Human Dynamics Laboratory at MIT. In recent years, Pentland has pioneered the use of specialized electronic “badges” that transmit data about employees’ interactions as they go about their days. The badges capture all sorts of information about formal and informal conversations: their length; the tone of voice and gestures of the people involved; how much those people talk, listen, and interrupt; the degree to which they demonstrate empathy and extroversion; and more. Each badge generates about 100 data points a minute.
  • he tried the badges out on about 2,500 people, in 21 different organizations, and learned a number of interesting lessons. About a third of team performance, he discovered, can usually be predicted merely by the number of face-to-face exchanges among team members. (Too many is as much of a problem as too few.) Using data gathered by the badges, he was able to predict which teams would win a business-plan contest, and which workers would (rightly) say they’d had a “productive” or “creative” day. Not only that, but he claimed that his researchers had discovered the “data signature” of natural leaders, whom he called “charismatic connectors” and all of whom, he reported, circulate actively, give their time democratically to others, engage in brief but energetic conversations, and listen at least as much as they talk.
  • His group is developing apps to allow team members to view their own metrics more or less in real time, so that they can see, relative to the benchmarks of highly successful employees, whether they’re getting out of their offices enough, or listening enough, or spending enough time with people outside their own team.
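Pentland's finding that too many face-to-face exchanges hurt as much as too few suggests an inverted-U relationship, which a toy model can make visible. The optimum and width below are assumptions, not his fitted values.

```python
# Toy inverted-U model of team performance vs. face-to-face exchanges;
# the optimum and width are assumptions, not Pentland's fitted values.
def predicted_performance(exchanges_per_day, optimum=40.0, width=25.0):
    # Peaks at `optimum` and falls off symmetrically: too many exchanges
    # hurt as much as too few.
    return max(0.0, 1.0 - ((exchanges_per_day - optimum) / width) ** 2)

for n in (5, 40, 90):
    print(n, round(predicted_performance(n), 2))  # 0.0, 1.0, 0.0
```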
  • Torrents of data are routinely collected by American companies and now sit on corporate servers, or in the cloud, awaiting analysis. Bloomberg reportedly logs every keystroke of every employee, along with their comings and goings in the office. The Las Vegas casino Harrah’s tracks the smiles of the card dealers and waitstaff on the floor (its analytics team has quantified the impact of smiling on customer satisfaction). E‑mail, of course, presents an especially rich vein to be mined for insights about our productivity, our treatment of co-workers, our willingness to collaborate or lend a hand, our patterns of written language, and what those patterns reveal about our intelligence, social skills, and behavior.
  • people analytics will ultimately have a vastly larger impact on the economy than the algorithms that now trade on Wall Street or figure out which ads to show us. He reminded me that we’ve witnessed this kind of transformation before in the history of management science. Near the turn of the 20th century, both Frederick Taylor and Henry Ford famously paced the factory floor with stopwatches, to improve worker efficiency.
  • “The quantities of data that those earlier generations were working with,” he said, “were infinitesimal compared to what’s available now. There’s been a real sea change in the past five years, where the quantities have just grown so large—petabytes, exabytes, zetta—that you start to be able to do things you never could before.”
  • People analytics will unquestionably provide many workers with more options and more power. Gild, for example, helps companies find undervalued software programmers, working indirectly to raise those people’s pay. Other companies are doing similar work. One called Entelo, for instance, specializes in using algorithms to identify potentially unhappy programmers who might be receptive to a phone call.
  • He sees it not only as a boon to a business’s productivity and overall health but also as an important new tool that individual employees can use for self-improvement: a sort of radically expanded The 7 Habits of Highly Effective People, custom-written for each of us, or at least each type of job, in the workforce.
  • the most exotic development in people analytics today is the creation of algorithms to assess the potential of all workers, across all companies, all the time.
  • The way Gild arrives at these scores is not simple. The company’s algorithms begin by scouring the Web for any and all open-source code, and for the coders who wrote it. They evaluate the code for its simplicity, elegance, documentation, and several other factors, including the frequency with which it’s been adopted by other programmers. For code that was written for paid projects, they look at completion times and other measures of productivity. Then they look at questions and answers on social forums such as Stack Overflow, a popular destination for programmers seeking advice on challenging projects. They consider how popular a given coder’s advice is, and how widely that advice ranges.
  • The algorithms go further still. They assess the way coders use language on social networks from LinkedIn to Twitter; the company has determined that certain phrases and words used in association with one another can distinguish expert programmers from less skilled ones. Gild knows these phrases and words are associated with good coding because it can correlate them with its evaluation of open-source code, and with the language and online behavior of programmers in good positions at prestigious companies.
  • having made those correlations, Gild can then score programmers who haven’t written open-source code at all, by analyzing the host of clues embedded in their online histories. They’re not all obvious, or easy to explain. Vivienne Ming, Gild’s chief scientist, told me that one solid predictor of strong coding is an affinity for a particular Japanese manga site.
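One way to picture a composite score of this kind is as a weighted sum of normalized signals: code quality, adoption by other programmers, Q&A reputation, and language cues. The field names and weights below are assumptions for illustration; Gild's actual model is proprietary.

```python
# Hypothetical Gild-style composite: a weighted sum of normalized signals.
# Signal names and weights are assumptions; the real model is proprietary.
WEIGHTS = {
    "code_quality": 0.5,    # simplicity, elegance, documentation
    "adoption": 0.2,        # how often others reuse the code
    "qa_reputation": 0.2,   # e.g. Stack Overflow answer popularity
    "language_cues": 0.1,   # phrasing correlated with strong coders
}

def composite_score(signals):
    """signals: dict of values normalized to the [0, 1] range."""
    return sum(w * signals.get(name, 0.0) for name, w in WEIGHTS.items())

coder = {"code_quality": 0.8, "adoption": 0.6,
         "qa_reputation": 0.9, "language_cues": 0.7}
print(round(composite_score(coder), 2))  # 0.77
```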
  • Gild’s CEO, Sheeroy Desai, told me he believes his company’s approach can be applied to any occupation characterized by large, active online communities, where people post and cite individual work, ask and answer professional questions, and get feedback on projects. Graphic design is one field that the company is now looking at, and many scientific, technical, and engineering roles might also fit the bill. Regardless of their occupation, most people leave “data exhaust” in their wake, a kind of digital aura that can reveal a lot about a potential hire.
  • professionally relevant personality traits can be judged effectively merely by scanning Facebook feeds and photos. LinkedIn, of course, captures an enormous amount of professional data and network information, across just about every profession. A controversial start-up called Klout has made its mission the measurement and public scoring of people’s online social influence.
  • Mullainathan expressed amazement at how little most creative and professional workers (himself included) know about what makes them effective or ineffective in the office. Most of us can’t even say with any certainty how long we’ve spent gathering information for a given project, or our pattern of information-gathering, never mind know which parts of the pattern should be reinforced, and which jettisoned. As Mullainathan put it, we don’t know our own “production function.”
  • Over time, better job-matching technologies are likely to begin serving people directly, helping them see more clearly which jobs might suit them and which companies could use their skills. In the future, Gild plans to let programmers see their own profiles and take skills challenges to try to improve their scores. It intends to show them its estimates of their market value, too, and to recommend coursework that might allow them to raise their scores even more. Not least, it plans to make accessible the scores of typical hires at specific companies, so that software engineers can better see the profile they’d need to land a particular job
  • Knack, for its part, is making some of its video games available to anyone with a smartphone, so people can get a better sense of their strengths, and of the fields in which their strengths would be most valued. (Palo Alto High School recently adopted the games to help students assess careers.) Ultimately, the company hopes to act as matchmaker between a large network of people who play its games (or have ever played its games) and a widening roster of corporate clients, each with its own specific profile for any given type of job.
  • When I began my reporting for this story, I was worried that people analytics, if it worked at all, would only widen the divergent arcs of our professional lives, further gilding the path of the meritocratic elite from cradle to grave, and shutting out some workers more definitively. But I now believe the opposite is likely to happen, and that we’re headed toward a labor market that’s fairer to people at every stage of their careers
  • For decades, as we’ve assessed people’s potential in the professional workforce, the most important piece of data—the one that launches careers or keeps them grounded—has been educational background: typically, whether and where people went to college, and how they did there. Over the past couple of generations, colleges and universities have become the gatekeepers to a prosperous life. A degree has become a signal of intelligence and conscientiousness, one that grows stronger the more selective the school and the higher a student’s GPA, that is easily understood by employers, and that, until the advent of people analytics, was probably unrivaled in its predictive powers.
  • the limitations of that signal—the way it degrades with age, its overall imprecision, its many inherent biases, its extraordinary cost—are obvious. “Academic environments are artificial environments,” Laszlo Bock, Google’s senior vice president of people operations, told The New York Times in June. “People who succeed there are sort of finely trained, they’re conditioned to succeed in that environment,” which is often quite different from the workplace.
  • because one’s college history is such a crucial signal in our labor market, perfectly able people who simply couldn’t sit still in a classroom at the age of 16, or who didn’t have their act together at 18, or who chose not to go to graduate school at 22, routinely get left behind for good. That such early factors so profoundly affect career arcs and hiring decisions made two or three decades later is, on its face, absurd.
  • I spoke with managers at a lot of companies who are using advanced analytics to reevaluate and reshape their hiring, and nearly all of them told me that their research is leading them toward pools of candidates who didn’t attend college—for tech jobs, for high-end sales positions, for some managerial roles. In some limited cases, this is because their analytics revealed no benefit whatsoever to hiring people with college degrees; in other cases, and more often, it’s because they revealed signals that function far better than college history,
  • Google, too, is hiring a growing number of nongraduates. Many of the people I talked with reported that when it comes to high-paying and fast-track jobs, they’re reducing their preference for Ivy Leaguers and graduates of other highly selective schools.
  • This process is just beginning. Online courses are proliferating, and so are online markets that involve crowd-sourcing. Both arenas offer new opportunities for workers to build skills and showcase competence. Neither produces the kind of instantly recognizable signals of potential that a degree from a selective college, or a first job at a prestigious firm, might. That’s a problem for traditional hiring managers, because sifting through lots of small signals is so difficult and time-consuming.
  • all of these new developments raise philosophical questions. As professional performance becomes easier to measure and see, will we become slaves to our own status and potential, ever-focused on the metrics that tell us how and whether we are measuring up? Will too much knowledge about our limitations hinder achievement and stifle our dreams? All I can offer in response to these questions, ironically, is my own gut sense, which leads me to feel cautiously optimistic.
  • Google’s understanding of the promise of analytics is probably better than anybody else’s, and the company has been changing its hiring and management practices as a result of its ongoing analyses. (Brainteasers are no longer used in interviews, because they do not correlate with job success; GPA is not considered for anyone more than two years out of school, for the same reason—the list goes on.) But for all of Google’s technological enthusiasm, these same practices are still deeply human. A real, live person looks at every résumé the company receives. Hiring decisions are made by committee and are based in no small part on opinions formed during structured interviews.
katedriscoll

History | TOKTalk.net - 0 views

  • Linking the different Areas of Knowledge (AOK) with different Ways of Knowing (WOK) can be quite challenging at times. I now attempted to link History with Language, Logic, Emotion and Sense Perception.
  • Does the way (the language) that certain historical events are presented in history books influence the way that the reader understands these events? What role does loaded language play when talking about historical events? What role do connotation and denotation play when talking about historical events? How can language introduce bias into historical accounts? How does language help or hinder the interpretation of historical facts?
  • I recently read an interesting poem by the German poet and playwright Bertolt Brecht – a poem which got me thinking. You see, this is one of the TOK illnesses, you start to see TOK everywhere, and also in poetry.
caelengrubb

History Is About Stories. Here's Why We Get Them Wrong | Time - 1 views

  • Science comes hard to most of us because it can’t really take that form. Instead it’s equations, models, theories and the data that support them. But ironically, science offers an explanation of why we love stories.
  • It starts with a challenge posed in human evolution — but the more we come to understand about that subject, the more we see that our storytelling instinct can lead us astray, especially when it comes to how most of us understand history.
  • Many animals have a highly developed mind-reading instinct, a sort of tracking-device technique shared with creatures that have no language, not even a language of thought.
  • It’s what they use to track prey and avoid predation.
  • The theory of mind is so obvious it’s nearly invisible: it tells us that behavior is the result of the joint operation of pairs of beliefs and desires.
  • The desires are about the ways we want things to turn out in the future. The beliefs are about the way things are now.
  • The theory of mind turns what people do into a story with a plot by pairing up the content of beliefs and desires, what they are about.
  • Psycholinguistics has shown that the theory of mind is necessary for learning language and almost anything else our parents teach us.
  • Imitating others requires using the theory to figure out what they want us to do and in what order. Without it, you can’t learn much beyond what other higher primates can.
  • The theory of mind makes us construct stories obsessively, and thus encourages us to see the past as a set of them.
  • When popular historians seek to know why Hitler declared war on the U.S. (when he didn’t have to), they put the theory of mind to work: What did he believe and what was it that he wanted that made him do such a foolish thing?
  • The trouble is that the theory of mind is completely wrong about the way the mind, i.e. the brain, actually works. We can’t help but use it to guess what is going on in other people’s minds, and historians rely on it, but the evidence from neuroscience shows that in fact what’s “going on” in anyone’s mind is not a decision about what to do in the light of beliefs and desires, but rather a series of neural circuitry firings.
  • The wrongness of the theory of mind is so profound it makes false all the stories we know and love, in narrative history (and in historical novels).
  • Neuroscience reveals that the brain is not organized even remotely to work the way the theory of mind says it does. The fact that narrative histories give persistently different answers to questions historians have been asking for centuries should be evidence that storytelling is not where the real answers can be found.
  • Crucially, they discovered that while different parts of the brain control different things, the neurons’ electrical signals don’t differ in “content”; they are not about different subjects. They are not about anything at all. Each neuron is just in a different part of the mid-brain, doing its job in exactly the same way all other neurons do, sending the same electrochemical oscillations.
  • There is nothing in our brains to vindicate the theory’s description of how anyone ever makes up his or her mind. And that explains a lot about how bad the theory of mind is at predicting anything much about the future, or explaining anything much about the past.
  • If we really want historical knowledge we’ll need to use the same tools scientists use — models and theories we can quantify and test. Guessing what was going through Hitler’s mind, and weaving it into a story is no substitute for empirical science.
Javier E

How Do You Know When Society Is About to Fall Apart? - The New York Times - 1 views

  • Tainter seemed calm. He walked me through the arguments of the book that made his reputation, “The Collapse of Complex Societies,” which has for years been the seminal text in the study of societal collapse, an academic subdiscipline that arguably was born with its publication in 1988
  • It is only a mild overstatement to suggest that before Tainter, collapse was simply not a thing.
  • His own research has moved on; these days, he focuses on “sustainability.”
  • He writes with disarming composure about the factors that have led to the disintegration of empires and the abandonment of cities and about the mechanism that, in his view, makes it nearly certain that all states that rise will one day fall
  • societal collapse and its associated terms — “fragility” and “resilience,” “risk” and “sustainability” — have become the objects of extensive scholarly inquiry and infrastructure.
  • Princeton has a research program in Global Systemic Risk, Cambridge a Center for the Study of Existential Risk
  • even Tainter, for all his caution and reserve, was willing to allow that contemporary society has built-in vulnerabilities that could allow things to go very badly indeed — probably not right now, maybe not for a few decades still, but possibly sooner. In fact, he worried, it could begin before the year was over.
  • Plato, in “The Republic,” compared cities to animals and plants, subject to growth and senescence like any living thing. The metaphor would hold: In the early 20th century, the German historian Oswald Spengler proposed that all cultures have souls, vital essences that begin falling into decay the moment they adopt the trappings of civilization.
  • that theory, which became the heart of “The Collapse of Complex Societies.” Tainter’s argument rests on two proposals. The first is that human societies develop complexity, i.e. specialized roles and the institutional structures that coordinate them, in order to solve problems
  • All history since then has been “characterized by a seemingly inexorable trend toward higher levels of complexity, specialization and sociopolitical control.”
  • Eventually, societies we would recognize as similar to our own would emerge, “large, heterogeneous, internally differentiated, class structured, controlled societies in which the resources that sustain life are not equally available to all.”
  • Something more than the threat of violence would be necessary to hold them together, a delicate balance of symbolic and material benefits that Tainter calls “legitimacy,” the maintenance of which would itself require ever more complex structures, which would become ever less flexible, and more vulnerable, the more they piled up.
  • Social complexity, he argues, is inevitably subject to diminishing marginal returns. It costs more and more, in other words, while producing smaller and smaller profits.
  • Take Rome, which, in Tainter's telling, was able to win significant wealth by sacking its neighbors but was thereafter required to maintain an ever larger and more expensive military just to keep the imperial machine from stalling — until it couldn’t anymore.
  • This is how it goes. As the benefits of ever-increasing complexity — the loot shipped home by the Roman armies or the gentler agricultural symbiosis of the San Juan Basin — begin to dwindle, Tainter writes, societies “become vulnerable to collapse.”
  • haven’t countless societies weathered military defeats, invasions, even occupations and lengthy civil wars, or rebuilt themselves after earthquakes, floods and famines?
  • Only complexity, Tainter argues, provides an explanation that applies in every instance of collapse.
  • Complexity builds and builds, usually incrementally, without anyone noticing how brittle it has all become. Then some little push arrives, and the society begins to fracture.
  • A disaster — even a severe one like a deadly pandemic, mass social unrest or a rapidly changing climate — can, in Tainter’s view, never be enough by itself to cause collapse
  • The only precedent Tainter could think of, in which pandemic coincided with mass social unrest, was the Black Death of the 14th century. That crisis reduced the population of Europe by as much as 60 percent.
  • Whether any existing society is close to collapsing depends on where it falls on the curve of diminishing returns.
  • The United States hardly feels like a confident empire on the rise these days. But how far along are we?
  • Scholars of collapse tend to fall into two loose camps. The first, dominated by Tainter, looks for grand narratives and one-size-fits-all explanations
  • The second is more interested in the particulars of the societies they study
  • Patricia McAnany, who teaches at the University of North Carolina at Chapel Hill, has questioned the usefulness of the very concept of collapse — she was an editor of a 2010 volume titled “Questioning Collapse” — but admits to being “very, very worried” about the lack, in the United States, of the “nimbleness” that crises require of governments.
  • We’re too vested and tied to places.” Without the possibility of dispersal, or of real structural change to more equitably distribute resources, “at some point the whole thing blows. It has to.”
  • In Turchin’s case the key is the loss of “social resilience,” a society’s ability to cooperate and act collectively for common goals. By that measure, Turchin judges that the United States was collapsing well before Covid-19 hit. For the last 40 years, he argues, the population has been growing poorer and more unhealthy as elites accumulate more and more wealth and institutional legitimacy founders. “The United States is basically eating itself from the inside out.”
  • Inequality and “popular immiseration” have left the country extremely vulnerable to external shocks like the pandemic, and to internal triggers like the killings of George Floyd
  • Societies evolve complexity, he argues, precisely to meet such challenges.
  • Eric H. Cline, who teaches at the George Washington University, argued in “1177 B.C.: The Year Civilization Collapsed” that Late Bronze Age societies across Europe and western Asia crumbled under a concatenation of stresses, including natural disasters — earthquakes and drought — famine, political strife, mass migration and the closure of trade routes. On their own, none of those factors would have been capable of causing such widespread disintegration, but together they formed a “perfect storm” capable of toppling multiple societies all at once.
  • Collapse “really is a matter of when,” he told me, “and I’m concerned that this may be the time.”
  • In “The Collapse of Complex Societies,” Tainter makes a point that echoes the concern that Patricia McAnany raised. “The world today is full,” Tainter writes. Complex societies occupy every inhabitable region of the planet. There is no escaping. This also means, he writes, that collapse, “if and when it comes again, will this time be global.” Our fates are interlinked. “No longer can any individual nation collapse. World civilization will disintegrate as a whole.”
  • If it happens, he says, it would be “the worst catastrophe in history.”
  • The quest for efficiency, he wrote recently, has brought on unprecedented levels of complexity: “an elaborate global system of production, shipping, manufacturing and retailing” in which goods are manufactured in one part of the world to meet immediate demands in another, and delivered only when they’re needed. The system’s speed is dizzying, but so are its vulnerabilities.
  • A more comprehensive failure of fragile supply chains could mean that fuel, food and other essentials would no longer flow to cities. “There would be billions of deaths within a very short period,” Tainter says.
  • If we sink “into a severe recession or a depression,” Tainter says, “then it will probably cascade. It will simply reinforce itself.”
  • Tainter tells me, he has seen “a definite uptick” in calls from journalists: The study of societal collapse suddenly no longer seems like a purely academic pursuit
  • Turchin is keenly aware of the essential instability of even the sturdiest-seeming systems. “Very severe events, while not terribly likely, are quite possible,” he says. When he emigrated from the U.S.S.R. in 1977, he adds, no one imagined the country would splinter into its constituent parts. “But it did.”
  • He writes of visions of “bloated bureaucracies” becoming the basis of “entire political careers.” Arms races, he observes, presented a “classic example” of spiraling complexity that provides “no tangible benefit for much of the population” and “usually no competitive advantage” either.
  • It is hard not to read the book through the lens of the last 40 years of American history, as a prediction of how the country might deteriorate if resources continued to be slashed from nearly every sector but the military, prisons and police.
  • The more a population is squeezed, Tainter warns, the larger the share that “must be allocated to legitimization or coercion.”
  • And so it was: As U.S. military spending skyrocketed — to, by some estimates, a total of more than $1 trillion today from $138 billion in 1980 — the government would try both tactics, ingratiating itself with the wealthy by cutting taxes while dismantling public-assistance programs and incarcerating the poor in ever-greater numbers.
  • “As resources committed to benefits decline,” Tainter wrote in 1988, “resources committed to control must increase.”
  • The overall picture drawn by Tainter’s work is a tragic one. It is our very creativity, our extraordinary ability as a species to organize ourselves to solve problems collectively, that leads us into a trap from which there is no escaping
  • Complexity is “insidious,” in Tainter’s words. “It grows by small steps, each of which seems reasonable at the time.” And then the world starts to fall apart, and you wonder how you got there.
  • Perhaps collapse is not, actually, a thing. Perhaps, as an idea, it was a product of its time, a Cold War hangover that has outlived its usefulness, or an academic ripple effect of climate-change anxiety, or a feedback loop produced by some combination of the two
  • if you pay attention to people’s lived experience, and not just to the abstractions imposed by a highly fragmented archaeological record, a different kind of picture emerges.
  • Tainter’s understanding of societies as problem-solving entities can obscure as much as it reveals
  • Plantation slavery arose in order to solve a problem faced by the white landowning class: The production of agricultural commodities like sugar and cotton requires a great deal of backbreaking labor. That problem, however, has nothing to do with the problems of the people they enslaved. Which of them counts as “society”?
  • Since the beginning of the pandemic, the total net worth of America’s billionaires, all 686 of them, has jumped by close to a trillion dollars.
  • If societies are not in fact unitary, problem-solving entities but heaving contradictions and sites of constant struggle, then their existence is not an all-or-nothing game.
  • Collapse appears not as an ending, but a reality that some have already suffered — in the hold of a slave ship, say, or on a long, forced march from their ancestral lands to reservations faraway — and survived.
  • The current pandemic has already given many of us a taste of what happens when a society fails to meet the challenges that face it, when the factions that rule over it tend solely to their own problems
  • the real danger comes from imagining that we can keep living the way we always have, and that the past is any more stable than the present.
  • If you close your eyes and open them again, the periodic disintegrations that punctuate our history — all those crumbling ruins — begin to fade, and something else comes into focus: wiliness, stubbornness and, perhaps the strongest and most essential human trait, adaptability.
  • When one system fails, we build another. We struggle to do things differently, and we push on. As always, we have no other choice.
runlai_jiang

What If Martin Luther King Jr. Was Never Assassinated? - 0 views

  • King’s contemporaries’ real lives can bridge the gap between reality and fiction—especially the life of Coretta Scott King, his wife and fellow activist. “The scope of her activism and the breadth of the issues she was working on are an indication of where [Martin Luther King] would be,” suggests Jeanne Theoharis, a Brooklyn College political science professor and author of A More Beautiful and Terrible History: The Uses and Misuses of Civil Rights History.
  • King, if he had lived, would very likely have taken up those same banners—perhaps even marching them into the White House, speculates Komozi Woodard, a professor of history, public policy, and Africana studies at Sarah Lawrence College. “Hopefully, White America would have matured to the point of decriminalizing Dr. King as time went on” just as Nelson Mandela’s image shifted from terrorist to savior in South Africa, Woodard tells National Geographic. “Dr. King may have successfully run for president as Mandela did.”
  • A Living King: One difference is a little easier to imagine, though it speaks volumes: we might not celebrate Martin Luther King Day in a world where he wasn’t assassinated.
  • King was vastly unpopular among political leaders and white Americans at the time of his death. “FBI leadership at that time saw King’s stature as a ‘black messiah’ in criminal terms,” Woodard says, describing how King was “alienated, isolated and eliminated” by the Johnson administration.
  • Woodard describes how King’s views were becoming more aligned with those of the more radical Malcolm X. “That doesn’t mean that King would have abandoned nonviolent protest,” he says, “but it means that King was increasingly militant in his anti-poverty agenda.”
  • a ladder of legacies Woodard traces from King through Stokely Carmichael, to whose youthful Black Power movement Woodard imagines King would have added “a core of stability.”
  • If King had lived, his presumed connection to—or involvement with—today’s polarizing racial justice activists would counter, as Theoharis puts it, America’s “fable that we’ve gotten past the race problem.”
tongoscar

Yes, America needs a National Women's History Museum - 0 views

  • With a solidly bipartisan vote of 374-37, the US House of Representatives this month passed a bill to establish a National Women’s History Museum. Here’s hoping the Senate follows suit.
  • After all, this year marks the 100th anniversary of the 19th Amendment, guaranteeing women’s right to vote — a constitutional change that was the culmination of decades of work by the suffragist movement, which famously dates to the 1848 women’s rights convention in New York’s own Seneca Falls.
  • “For too long, women’s history has been left out of the telling of our nation’s history,” she and her fellow lead co-sponsors note. “Representation matters. Let’s make sure that every child can see themselves in their heroes and role models.”
  • The bill would establish a council to make recommendations to the Board of Regents of the Smithsonian Museum, tasking it with designating a site for the museum on or near the National Mall. Getting anything passed into law in a bitter election year is tricky. Let’s hope the same bipartisan spirit will move the Senate to get this done.
Javier E

'The Fourth Turning' Enters Pop Culture - The New York Times - 0 views

  • According to “fourth turning” proponents, American history goes through recurring cycles. Each one, which lasts about 80 to 100 years, consists of four generation-long seasons, or “turnings.” The winter season is a time of upheaval and reconstruction — a fourth turning.
  • The theory first appeared in “The Fourth Turning,” a work of pop political science that has had a cult following more or less since it was published in 1997. In the last few years of political turmoil, the book and its ideas have bubbled into the mainstream.
  • According to “The Fourth Turning,” previous crisis periods include the American Revolution, the Civil War and World War II. America entered its latest fourth turning in the mid-2000s. It will culminate in a crisis sometime in the 2020s — i.e., now.
  • One of the book’s authors, Neil Howe, 71, has become a frequent podcast guest. A follow-up, “The Fourth Turning Is Here,” comes out this month.
  • The play’s author, Will Arbery, 33, said he heard about “The Fourth Turning” while researching Stephen K. Bannon, the right-wing firebrand and former adviser to President Donald J. Trump, who is a longtime fan of the book and directed a 2010 documentary based on its ideas.
  • He described it as “this almost fun theory about history,” but added: “And yet there’s something deeply menacing about it.”
  • Mr. Arbery, who said he does not subscribe to the theory, sees parallels between the fourth turning and other nonscientific beliefs. “I modeled the way that Teresa talks about the fourth turning on the way that young liberals talk about astrology,” he said.
  • The book’s outlook on the near future has made it appealing to macro traders and crypto enthusiasts, and it is frequently cited on the podcasts “Macro Voices,” “Wealthion” and “On the Margin.”
  • In the new book, he describes what a coming civil war or geopolitical conflict might look like — though he shies away from casting himself as a modern-day Nostradamus.
  • “The Fourth Turning” captured a mood of decline in recent American life. “I remember feeling safe in the ’90s, and then as soon as 9/11 hit, the world went topsy-turvy,” he said. “Every time my cohort got to the point where we were optimistic, another crisis happened. When I read the book, I was like, ‘That makes sense.’”
  • “The Fourth Turning” was conceived during a period of relative calm. In the late 1980s, Mr. Howe, a Washington, D.C., policy analyst, teamed with William Strauss, a founder of the political satire troupe the Capitol Steps.
  • Their first book, “Generations,” told a story of American history through generational profiles going back to the 1600s. The book was said to have influenced Bill Clinton to choose a fellow baby boomer, Al Gore, as his running mate
  • when the 2008 financial crisis hit at almost exactly the point when the start of the fourth turning was predicted, it seemed to many that the authors might have been onto something. Recent events — the pandemic, the storming of the Capitol — have seemingly provided more evidence for the book’s fans.
  • Historically, a fourth turning crisis has always translated into a civil war, a war of great nations, or both, according to the book. Either is possible over the next decade, Mr. Howe said. But he is a doomsayer with an optimistic streak: Each fourth turning, in his telling, kicks off a renaissance in civic life.
  • “I’ve read ‘The Fourth Turning,’ and indeed found it useful from a macroeconomic investing perspective,” Lyn Alden, 35, an investment analyst, wrote in an email. “History doesn’t repeat, but it kind of gives us a loose framework to work with.”
  • “This big tidal shift is arriving,” Mr. Howe said. “But if you’re asking me which wave is going to knock down the lighthouse, I can’t do that. I can just tell you that this is the time period. It gives you a good idea of what to watch for.”
criscimagnael

6 Surprising Discoveries From Medieval Times - HISTORY - 0 views

  • According to the Israel Antiquities Authority, the weapon is 900 years old, and belonged to a knight who came to the Middle East to fight in the Crusades, in which European Christian armies fought Muslims over control of Jerusalem and other sites.
  • “First, it dates to just before—or possibly around the very start of—Christianization in Ireland. St. Patrick, writing about a hundred years after the idol was made, in the fifth century, condemned “pagan” figures like this one. Second, it was found in a bog; bogs were special sites, neither water nor land, where people dumped sacrifices and the bodies of executed victims. This figure was found with animal remains and a dagger, thus clearly part of a ritual. Third, all this suggests something about religious practices in Ireland before people turned Christian.”
Javier E

No rides, but lots of rows: 'reactionary' French theme park plots expansion | France | ... - 0 views

  • Nicolas de Villiers said the theme park – whose subject matter includes Clovis, king of the Franks, and a new €20m (£17m) show about the birth of modern cinema – was not about politics. He said: “What we want when an audience leaves our shows – which are works of art and were never history lessons – is to feel better and bigger, because the hero has brought some light into their hearts … Puy du Fou is more about legends than a history book.”
  • He said the park’s trademark high-drama historical extravaganzas worked because, at a time of global crisis, people had a hunger to understand their roots and traditions. “The artistic language we invented corresponds to the era we live in. People have a thirst for their roots, a thirst to understand what made them what they are today, which means their civilisation. They want to understand what went before them.” He called it a “profound desire to rediscover who we are”.
  • He added: “People who come here don’t have an ideology, they come here and say it’s beautiful, it’s good, I liked it.”
  • Guillaume Lancereau, Max Weber fellow at the European University Institute in Florence, was part of a group of historians who published the book Puy du Faux (Puy of Fakes), analysing the park’s take on history. They viewed the park as having a Catholic slant, questionable depictions of nobility and a presentation of rural peasants as unchanged through the ages.
  • Lancereau did not question the park’s entertainment value. But he said: “Professional historians have repeatedly criticised the park for taking liberties with historical events and characters and, more importantly, for distorting the past to serve a nationalistic, religious and conservative political agenda. This raises important questions about the contemporary entanglement between entertainment, collective memory and politically oriented historical production …
  • “At a time when increasing numbers of undergraduates are acquiring their historical knowledge from popular culture and historical reenactments, the Puy du Fou’s considerable expansion calls for further investigation of a phenomenon that appears to be influencing the making of historical memory in contemporary Europe.”
  • Outside the park’s musketeers show, André, 76, had driven 650km (400 miles) from Burgundy with his wife and grandson. “We came because we’re interested in history,” he said. “The shows are technically brilliant and really make you think. You can tell it’s a bit on the right – the focus on war, warriors and anti-revolution – but I don’t think that matters.”