Home / TOK Friends / Group items tagged slave

Javier E

What Would Plato Tweet? - NYTimes.com - 1 views

  • In a mere couple of centuries, Greek speakers went from anomie and illiteracy, lacking even an alphabet, to Aeschylus and Aristotle. They invented not only the discipline of philosophy, but also science, mathematics, the study of history (as opposed to mere chronicles) and that special form of government they called democracy — literally rule of the people (though it goes without saying that “the people” didn’t include women and slaves). They also produced timeless art, architecture, poetry and drama.
  • The more outstanding you were, the more mental replication of you there would be, and the more replication, the more you mattered.
  • Kleos lay very near the core of the Greek value system. Their value system was at least partly motivated, as perhaps all value systems are partly motivated, by the human need to feel as if our lives matter
  • what they wanted was the attention of other mortals. All that we can do to enlarge our lives, they concluded, is to strive to make of them things worth the telling
  • Greek philosophy also represented a departure from its own culture. Mattering wasn’t acquired by gathering attention of any kind, mortal or immortal. Acquiring mattering was something people had to do for themselves, cultivating such virtuous qualities of character as justice and wisdom. They had to put their own souls in order.
  • the Ivrim, the Hebrews, apparently from their word for “over,” since they were over on the other side of the Jordan
  • the one and only God, the Master of the Universe, providing the foundation for both the physical world without and the moral world within. From his position of remotest transcendence, this god nevertheless maintains a rapt interest in human concerns, harboring many intentions directed at us, his creations, who embody nothing less than his reasons for going to the trouble of creating the world ex nihilo
  • what the Greeks had called kleos. The word comes from the old Homeric word for “I hear,” and it meant a kind of auditory renown. Vulgarly speaking, it was fame. But it also could mean the glorious deed that merited the fame, as well as the poem that sang of the deed and so produced the fame.
  • Over the centuries, philosophy, perhaps aided by religion, learned to abandon entirely the flawed Greek presumption that only extraordinary lives matter. This was progress of the philosophical variety, subtler than the dazzling triumphs of science, but nevertheless real. Philosophy has laboriously put forth arguments that have ever widened the sphere of mattering.
  • We’ve come a long way from the kleos of Greeks, with its unexamined presumption that mattering is inequitably distributed among us, with the multireplicated among us mattering more.
  • our culture has, with the dwindling of theism, returned to the answer to the problem of mattering that Socrates and Plato judged woefully inadequate.
Javier E

Common Core and the End of History | Alan Singer - 0 views

  • On Monday October 20, 2014, the Regents, as part of their effort to promote new national Common Core standards and mystically prepare students for non-existent 21st century technological careers, voted unanimously that students did not have to pass both United States and Global History exams in order to graduate from high school and maintained that they were actually raising academic standards.
  • The Global History exam will also be modified so that students will only be tested on events after 1750, essentially eliminating topics like the early development of civilizations, ancient empires, the rise of universal religions, the Columbian Exchange, and the trans-Atlantic slave trade from the test.
  • As a result, social studies is no longer taught in the elementary school grades
  • Students will be able to substitute a tech sequence and local test for one of the history exams; however, the Regents did not present, design, or even describe what the tech alternative will look like. Although it will be implemented immediately, the Regents left all the details completely up to local initiative.
  • Under the proposal, students can substitute career-focused courses in subjects such as carpentry, advertising or hospitality management for one of the two history Regents exams that are now required
  • In June 2010 the Regents eliminated 5th and 8th grade social studies, history, and geography assessments so teachers and schools could concentrate on preparing students for high-stakes Common Core standardized reading and math assessments.
  • Mace reports his middle school students have no idea which were the original thirteen colonies, where they were located, or who were the founders and settlers. The students in his honors class report that all they studied in elementary school was English and math. Morning was math; afternoon was ELA. He added, "Teachers were worried that this would happen, and it has."
  • Debate over the importance of teaching history and social studies is definitely not new. During World War I, many Americans worried that new immigrants did not understand and value the history and government of the United States, so new high school classes and tests, the forerunners of the current ones, were put in place.
  • Mace describes his students as the "common core kids, inundated with common core, but they do not know the history of the United States." The cardinal rule of public education in the 21st Century seems to be that which gets tested is important and that which does not is dropped.
  • "By making state social studies exams optional, we have come to a point where our nation's own history has been marginalized in the classroom and, with it, the means to understand ourselves and the world around us. America's heritage is being eliminated as a requirement for graduation.
  • I am biased. I am a historian, a former social studies teacher, and I help to prepare the next generation of social studies teachers.
  • But these decisions by the Regents are politically motivated, lower graduation standards, and are outright dangerous.
  • The city is under a lot of pressure to support the revised and lower academic standards because in the next few weeks it is required to present plans to the state for turning around as many as 250 schools that are labeled as "failing."
  • Merryl Tisch, Chancellor of the State Board of Regents, described the change as an effort to "back-fill opportunities for students with different interests, with different opportunities, with different choice."
  • The need to educate immigrants and to understand global issues like ISIS and Ebola remains pressing, but I guess not for New York State high school students. Right now, it looks like social studies advocates have lost the battle and we are finally witnessing the end of history.
demetriar

How Pattern Recognition Gives You an Edge | Anna Clark - 0 views

  • Although pattern recognition is commonly associated with computer science and engineering, it also applies to nature, people and social systems. In fact, even animals and babies are born with the ability to recognize patterns. Sharpening our pattern recognition ability helps us cultivate vision, which is crucial for gaining an edge in a rapidly changing world.
  • (Unfortunately, technology also allows powerful interests to recognize patterns in big data to manipulate voters and consumers, but that's another story.)
  • We can become slaves to patterns. Extrapolate this tendency broadly and you can see how a society becomes fixed in its ways.
  • Pattern recognition only serves as an edge when you know how to use it to your advantage.
  • A kaleidoscope of perspectives also adds luster to life, which sometimes gets dulled by the force of our own habits.
Javier E

Is Huckleberry Finn's ending really lacking? Not if you're talking psychology. | Litera... - 0 views

  • What is it exactly that critics of the novel’s final chapters object to?
  • As Leo Marx put it in a 1953 essay, when Tom enters the picture, Huck falls “almost completely under his sway once more, and we are asked to believe that the boy who felt pity for the rogues is now capable of making Jim’s capture the occasion for a game. He becomes Tom’s helpless accomplice, submissive and gullible.” And to Marx, this regressive transformation is as unforgiveable as it is unbelievable.
  • psychologically, the reversion is as sound as it gets, despite the fury that it inspires. Before we rush to judge Huck—and to criticize Twain for veering so seemingly off course—we’d do well to consider a few key elements of the situation.
  • Huck is a thirteen (or thereabouts)-year-old boy. He is, in other words, a teenager. What’s more, he is a teenager from the antebellum South. Add to that the disparity between his social standing and education and Tom Sawyer’s, and you get a picture of someone who is quite different from a righteous fifty-something (or even thirty-something) literary critic who is writing in the twentieth century for a literary audience. And that someone has to be judged appropriately for his age, background, and social context—and his creator, evaluated accordingly.
  • There are a few important issues at play. Huck is not an adult. Tom Sawyer is not a stranger. The South is not a psychology lab. And slavery is not a bunch of lines projected on a screen. Each one of these factors on its own is enough to complicate the situation immensely—and together, they create one big complicated mess that makes it increasingly likely that Huck will act just as he does, by conforming to Tom’s wishes and reverting to their old group dynamic.
  • Tom is a part of Huck’s past, and there is nothing like context to cue us back to past habitual behavior in a matter of minutes. (That’s one of the reasons, incidentally, that drug addicts often revert to old habits when back in old environments.)
  • Jim is an adult—and an adult who has become a whole lot like a parent to Huck throughout their adventures, protecting him and taking care of him (and later, of Tom as well) much as a parent would. And the behavior that he wants from Huck, when he wants anything at all, is prosocial in the extreme (an apology, to take the most famous example, for playing a trick on him in the fog; not much of an ask, it seems, unless you stop to consider that it’s a slave asking a white boy to acknowledge that he was in the wrong). Tom, on the other hand, is a peer. And his demands are far closer to the anti-social side of the scale. Is it so surprising, then, that Huck sides with his old mate?
  • Another crucial caveat to Huck’s apparent metamorphosis: we tend to behave differently in private versus public spheres.
  • behavior is highly contextual—especially when it comes to behaviors that may not be as socially acceptable as one might hope. Huck and Jim’s raft is akin to a private sphere. It is just them, alone on the river, social context flowing away. And when does Huck’s behavior start to shift? The moment that he returns to a social environment, when he joins the Grangerfords in their family feud.
  • When the researchers looked at conformity to parents, they found a steady decrease in conforming behavior. Indeed, for the majority of measures, peer and parental conformity were negatively correlated. And what’s more, the sharpest decline was in conformity to pro-social behaviors.
  • On the raft, Jim was in a new environment, where old rules need not apply—especially given its private nature. But how quickly old ways kick back in, irrespective of whether you were a Huck or a Jim in that prior context.
  • there is a chasm, she points out, between Huck’s stated affection for Jim and his willingness to then act on it, especially in these final episodes. She blames the divide on Twain’s racism. But wouldn’t it be more correct to blame Huck’s only too real humanity?
  • Twain doesn’t make Huck a hero. He makes him real. Can we blame the book for telling it like it is?
Javier E

Julian Assange on Living in a Surveillance Society - NYTimes.com - 0 views

  • Describing the atomic bomb (which had only two months before been used to flatten Hiroshima and Nagasaki) as an “inherently tyrannical weapon,” he predicts that it will concentrate power in the hands of the “two or three monstrous super-states” that have the advanced industrial and research bases necessary to produce it. Suppose, he asks, “that the surviving great nations make a tacit agreement never to use the atomic bomb against one another? Suppose they only use it, or the threat of it, against people who are unable to retaliate?”
  • The likely result, he concludes, will be “an epoch as horribly stable as the slave empires of antiquity.” Inventing the term, he predicts “a permanent state of ‘cold war,’” a “peace that is no peace,” in which “the outlook for subject peoples and oppressed classes is still more hopeless.”
  • the destruction of privacy widens the existing power imbalance between the ruling factions and everyone else, leaving “the outlook for subject peoples and oppressed classes,” as Orwell wrote, “still more hopeless.”
  • At present even those leading the charge against the surveillance state continue to treat the issue as if it were a political scandal that can be blamed on the corrupt policies of a few bad men who must be held accountable. It is widely hoped that all our societies need to do to fix our problems is to pass a few laws.
  • The cancer is much deeper than this. We live not only in a surveillance state, but in a surveillance society. Totalitarian surveillance is not only embodied in our governments; it is embedded in our economy, in our mundane uses of technology and in our everyday interactions.
  • The very concept of the Internet — a single, global, homogenous network that enmeshes the world — is the essence of a surveillance state. The Internet was built in a surveillance-friendly way because governments and serious players in the commercial Internet wanted it that way. There were alternatives at every step of the way. They were ignored.
  • Unlike intelligence agencies, which eavesdrop on international telecommunications lines, the commercial surveillance complex lures billions of human beings with the promise of “free services.” Their business model is the industrial destruction of privacy. And yet even the more strident critics of NSA surveillance do not appear to be calling for an end to Google and Facebook
  • At their core, companies like Google and Facebook are in the same business as the U.S. government’s National Security Agency. They collect a vast amount of information about people, store it, integrate it and use it to predict individual and group behavior, which they then sell to advertisers and others. This similarity made them natural partners for the NSA
  • there is an undeniable “tyrannical” side to the Internet. But the Internet is too complex to be unequivocally categorized as a “tyrannical” or a “democratic” phenomenon.
  • It is possible for more people to communicate and trade with others in more places in a single instant than it ever has been in history. The same developments that make our civilization easier to surveil make it harder to predict. They have made it easier for the larger part of humanity to educate itself, to race to consensus, and to compete with entrenched power groups.
  • If there is a modern analogue to Orwell’s “simple” and “democratic weapon,” which “gives claws to the weak” it is cryptography, the basis for the mathematics behind Bitcoin and the best secure communications programs. It is cheap to produce: cryptographic software can be written on a home computer. It is even cheaper to spread: software can be copied in a way that physical objects cannot. But it is also insuperable — the mathematics at the heart of modern cryptography are sound, and can withstand the might of a superpower. The same technologies that allowed the Allies to encrypt their radio communications against Axis intercepts can now be downloaded over a dial-up Internet connection and deployed with a cheap laptop.
  • It is too early to say whether the “democratizing” or the “tyrannical” side of the Internet will eventually win out. But acknowledging them — and perceiving them as the field of struggle — is the first step toward acting effectively
  • Humanity cannot now reject the Internet, but clearly we cannot surrender it either. Instead, we have to fight for it. Just as the dawn of atomic weapons inaugurated the Cold War, the manifold logic of the Internet is the key to understanding the approaching war for the intellectual center of our civilization
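The annotations above lean on the claim that cryptography is “cheap to produce” and that its mathematics “can withstand the might of a superpower.” The textbook illustration of both points is the one-time pad, which is provably unbreakable when the key is truly random, as long as the message, and never reused, and which really can be written in a few lines on a home computer. A minimal sketch (a toy, not a usable secure-communications program; key distribution and reuse are the hard parts it ignores):

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """One-time pad: XOR each byte with a fresh random key byte."""
    key = secrets.token_bytes(len(plaintext))  # key as long as the message
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    """XOR is its own inverse, so decryption repeats the same operation."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

msg = b"attack at dawn"
ct, key = otp_encrypt(msg)
assert otp_decrypt(ct, key) == msg
```

Without the key, every plaintext of the same length is equally consistent with the ciphertext, which is why no amount of computing power helps an eavesdropper.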
Javier E

Ta-Nehisi Coates defines a new race beat - Columbia Journalism Review - 0 views

  • “The Case for Reparations,” Coates’ 16,000-word cover story for The Atlantic, where he is a national correspondent. Published online in May, it was a close look at housing discrimination, such as redlining, that was really about the need for America to take a brutally honest look in the mirror and acknowledge its deep racial divisions.
  • The story broke a single-day traffic record for a magazine story on The Atlantic’s website, and in its wake, Politico named him to its list of 50 thinkers changing American politics
  • Coates believes that if there is an answer to contemporary racism, it lies in confronting the past.
  • For Coates, true equality means “black people in this country have the right to be as mediocre as white people,” he says. “Not that individual black people will be as excellent, or more excellent, than other white people.”
  • he came to see black respectability—the idea that, to succeed, African-Americans must stoically prevail against the odds and be “twice as good” as white people to get the same rights—as deeply immoral.
  • He is no soothsayer, telling people what to think from on high, but rather is refreshingly open about what he doesn’t know, inviting readers to learn with him. Coates is not merely an ivory-tower pontificator or a shiny Web 2.0 brand. He is a public intellectual for the digital age.
  • we miss the real question of why there is a systemic, historical difference in the way police treat blacks versus whites.
  • Another term for that road is “white supremacy.” This refers not so much to hate groups, but, as Coates defines it, a system of policies and beliefs that aims to keep African-Americans as “a peon class.”
  • To be “white” in this sense does not refer merely to skin color but to the degree that someone qualifies as “normal,” and thus worthy of the same rights as all Americans
  • The pool where all these ideas eventually arrive is a question: “How big-hearted can democracy be?” he says. “How many people can it actually include and sustain itself? That is the question I’m asking over and over again.”
  • it is a question of empathy. Are humans capable of forming a society where everyone can flourish?
  • there was the coverage of Michael Brown (or Jordan Davis, or Renisha McBride, or Eric Garner): unarmed African-Americans killed by police or others under controversial circumstances. In each case, the storyline was that these horrific encounters were caused either by genuine provocation, or by race-fueled fear or hatred. Either way, they were stories of personal failings.
  • When an event becomes news, there is often an implication that it is an exception—that the world is mostly working as it should and this event is newsworthy because it’s an aberration. If the race-related stories we see most often in the media are about personal bigotry, then our conception of racism is limited to the bigoted remarks or actions—racism becomes little more than uttering the n-word.
  • he cites research that in 1860 slaves were the largest asset in the US economy. “It is almost impossible to think of democracy, as it was formed in America, without the enslavement of African-Americans,” he says. “Not that these things were bumps in the road along the way, but that they were the road.”
  • a lack of historical perspective in the media’s approach to race. “Journalism privileges what’s happening now over the long reasons for things happening,” he says. “And for African-Americans, that has a particular effect.”
  • Even the very existence of racism is questioned: A recent study published by the Association of Psychological Science has shown that whites think they are discriminated against due to race as much as, if not more than, blacks.
  • “So when you’re talking about something like institutional racism and prejudice, how do you talk about that as an objective reality?”
  • Coates’ strength is in connecting contemporary problems to historical scholarship. “I think if I bring anything to the table it’s the ability to synthesize all of that into something that people find emotionally moving,” he says. The irony of the reparations piece, as unoriginal as it may have been to scholars, is that it was news to many people.
  • Reporting on race requires simultaneously understanding multiple, contradictory worlds, with contradictory narratives. Widespread black poverty exists; so do a black middle class and a black president
  • Progress is key to the myth of American Exceptionalism, and the notion that America is built on slavery and on freedom are discordant ideas that threaten any simple storyline. Coates, together with others who join him, is trying to claim the frontier of a new narrative.
  • reading Coates is like building a worldview, piece by piece, on an area of contemporary life that’s otherwise difficult to grasp.
  • “To come and tell someone may not be as effective in convincing them as allowing them to learn on their own. If you believe you come to a conclusion on your own, you’re more likely to agree.”
  • It’s brave to bare yourself intellectually on the Web, and to acknowledge mistakes, especially when the capital that public intellectuals appear to have is their ability to be “right.”
  • Coates is equally demanding of his followers. Online he is blunt, and willing to call people out. He cares enough to be rigorous
  • despite being a master of online engagement, Coates insists he does not write for others, an idea he explained in a recent post: “I have long believed that the best part of writing is not the communication of knowledge to other people, but the acquisition and synthesizing of knowledge for oneself. The best thing I can say about the reparations piece is that I now understand.”
  • To him, it’s an open question whether or not America will ever be capable of fostering true equality. “How big-hearted can democracy be? It points to a very ugly answer: maybe not that big-hearted at all. That in fact America is not exceptional. That it’s just like every other country. That it passes its democracy and it passes all these allegedly big-hearted programs [the New Deal, the G.I. Bill] but still excludes other people.”
  • In a 2010 post about antebellum America, Coates mentioned feminist and abolitionist Angelina Grimke. “Suffice to say that much like Abe Lincoln, and Ulysses Grant, Angelina Grimke was a Walker,” he wrote. “What was the Walker reference?” Rosemartian asked in the comments section. “Just someone who spends their life evolving, or, walking,” Coates replied. “Grant and Lincoln fit in there for me. Malcolm X was another Walker. Walkers tend to be sometimes—even often—wrong. But they are rarely bigots, in the sense of nakedly clinging to ignorance.”
kushnerha

Buying begets buying: how stuff has consumed the average American's life | Life and sty... - 0 views

  • Our addiction to consuming things is a vicious cycle, and buying a bigger house to store it all isn’t the answer.
  • personal storage industry rakes in $22bn each year, and it’s only getting bigger. Why?
  • We shop because we’re bored, anxious, depressed or angry, and we make the mistake of buying material goods and thinking they are treats which will fill the hole, soothe the wound, make us feel better. The problem is, they’re not treats, they’re responsibilities and what we own very quickly begins to own us.
  • because of our stuff. What kind of stuff? Who cares!
  • don’t do anything but take up space and look pretty for a season or two before being replaced by other, newer things – equally pretty and equally useless.
  • if you have more stuff than you do space to easily store it, your life will be spent as a slave to your possessions.
  • So, if our houses have tripled in size while the number of people living in them has shrunk, what, exactly, are we doing with all of this extra space? And why the billions of dollars tossed to an industry that was virtually nonexistent a generation or two ago?
  • when you buy something, you’re also taking on the task of disposing of it (responsibly or not) when you’re done with it. Our addiction to consumption is a vicious one, and it’s stressing us out.
  • A study published by UCLA showed that women’s stress hormones peaked during the times they were dealing with their possessions and material goods.
  • Our current solution to having too much stuff is as short-sighted as it is ineffective: when we run out of space, we simply buy a bigger house.
  • So if bigger homes aren’t the solution, what is? I suggest heading in the exact opposite direction: deliberately choose a life with less. Buy less and instantly you have less to store; you use less space. Eventually you can work less to pay for all of this stuff. Soon you will stress less too and, above all, your life will involve less waste.
  • wondering where to begin? Don’t. You know exactly where this journey starts. It starts with the stuff that makes you feel guilty, stressed or overwhelmed when you look at it.
  • Because when it comes to stuff, I promise you, you don’t need more labels or better systems or complicated Pinterest tutorials – all you need is less.
Javier E

Bile, venom and lies: How I was trolled on the Internet - The Washington Post - 0 views

  • Thomas Jefferson often argued that an educated public was crucial for the survival of self-government
  • We now live in an age in which that education takes place mostly through relatively new platforms. Social networks — Facebook, Twitter, Instagram, etc. — are the main mechanisms by which people receive and share facts, ideas and opinions. But what if they encourage misinformation, rumors and lies?
  • In a comprehensive new study of Facebook that analyzed posts made between 2010 and 2014, a group of scholars found that people mainly shared information that confirmed their prejudices, paying little attention to facts and veracity. (Hat tip to Cass Sunstein, the leading expert on this topic.) The result, the report says, is the “proliferation of biased narratives fomented by unsubstantiated rumors, mistrust and paranoia.”
  • The authors specifically studied trolling — the creation of highly provocative, often false information, with the hope of spreading it widely. The report says that “many mechanisms cause false information to gain acceptance, which in turn generate false beliefs that, once adopted by an individual, are highly resistant to correction.”
  • in recent weeks I was the target of a trolling campaign and saw exactly how it works. It started when an obscure website published a post titled “CNN host Fareed Zakaria calls for jihad rape of white women.” The story claimed that in my “private blog” I had urged the use of American women as “sex slaves” to depopulate the white race. The post further claimed that on my Twitter account, I had written the following line: “Every death of a white person brings tears of joy to my eyes.”
  • Disgusting. So much so that the item would collapse from its own weightlessness, right? Wrong. Here is what happened next: Hundreds of people began linking to it, tweeting and retweeting it, and adding their comments, which are too vulgar or racist to repeat. A few ultra-right-wing websites reprinted the story as fact. With each new cycle, the levels of hysteria rose, and people started demanding that I be fired, deported or killed. For a few days, the digital intimidation veered out into the real world. Some people called my house late one night, waking and threatening my daughters, who are 7 and 12.
  • The people spreading this story were not interested in the facts; they were interested in feeding prejudice. The original story was cleverly written to provide conspiracy theorists with enough ammunition to ignore evidence. It claimed that I had taken down the post after a few hours when I realized it “receive[d] negative attention.” So, when the occasional debunker would point out that there was no evidence of the post anywhere, it made little difference. When confronted with evidence that the story was utterly false, it only convinced many that there was a conspiracy and coverup.
  • conversations on Facebook are somewhat more civil, because people generally have to reveal their identities. But on Twitter and in other places — the online comments section of The Post, for example — people can be anonymous or have pseudonyms. And that is where bile and venom flow freely.
  • an experiment performed by two psychologists in 1970. They divided students into two groups based on their answers to a questionnaire: high prejudice and low prejudice. Each group was told to discuss controversial issues such as school busing and integrated housing. Then the questions were asked again. “The surveys revealed a striking pattern,” Kolbert noted. “Simply by talking to one another, the bigoted students had become more bigoted and the tolerant more tolerant.”
  • This “group polarization” is now taking place at hyper speed, around the world. It is how radicalization happens and extremism spreads.
kushnerha

BBC - Future - The secret "anti-languages" you're not supposed to know - 2 views

  • speak an English “anti-language”. Since at least Tudor times, secret argots have been used in the underworld of prisoners, escaped slaves and criminal gangs as a way of confusing and befuddling the authorities.Thieves’ Cant, Polari, and Gobbledygook (yes, it’s a real form of slang) are just a few of the examples from the past – but anti-languages are mercurial beasts that are forever evolving into new and more vibrant forms.
  • A modern anti-language could very well be spoken on the street outside your house. Unless you yourself are a member of the “anti-society”, the strange terms would sound like nonsense. Yet those words may have nevertheless influenced your swear words, the comedy you enjoy and the music on your iPod – without you even realising the shady interactions that shaped them.
  • One of the first detailed records of an anti-language comes from a 16th Century magistrate called Thomas Harman. Standing at his front door, he offered food and money to passing beggars in return for nothing more than words. “He would say 'either I throw you in prison or you give me your Cant,'”
  • “Slang may not represent us at our best, or our most admirable, but it represents us as human beings with anger, fear, self-aggrandisement, and our obsession with sex and bodily parts.”
  • This clever, playful use of metaphor would come to define anti-languages for Halliday. As you could see from the dialogue between the two Elizabethan ruffians, the strange, nonsensical words render a sentence almost impossible to comprehend for outsiders, and the more terms you have, the harder it is for an outsider to learn the code. It is the reason that selling words to the police can be heavily punished among underworld gangs.
  • All borrow the grammar of the mother language but replace words (“London”, “purse”, “money”, “alehouse”) with another, elliptical term (“Rome”, “bounge”, “lower”, “bowsing ken”). Often, the anti-language may employ dozens of terms that have blossomed from a single concept – a feature known as “over-lexicalisation”. Halliday points to at least 20 terms that Elizabethan criminals used to describe fellow thieves, for instance
  • Similarly, the Kolkata underworld had 41 words for police and more than 20 for bomb. Each anti-society may have its own way of generating new terms; often the terms are playful metaphors (such as “bawdy basket”), but they can also be formed from existing words by swapping around or inserting syllables – “face” might become “ecaf”, for instance.
  • striking similarities in the patois spoken by all three underground groups and the ways it shaped their interactions.
  • “The better you are, the higher the status between those users,” explains Martin Montgomery, author of An Introduction to Language and Society.
  • Halliday doubted that secrecy was the only motive for building an anti-language, though; he found that it also helps define a hierarchy within the “anti-society”. Among the Polish prisoners, refusing to speak the lingo could relegate you to the lowest possible rung of the social ladder, the so-called “suckers”.
  • The concept of an anti-language throws light on many of the vibrant slangs at the edges of society, from Cockney rhyming slang and Victorian “Gobbledygook” to the “Mobspeak” of the Mafia and “Boobslang” found uniquely in New Zealand prisons. The breadth and range of the terms can be astonishing; a lexicography of Boobslang reaches more than 200 pages, with 3,000 entries covering many areas of life.
  • Consider Polari. Incorporating elements of criminal cants, the gypsy Romani language, and Italian words, it was eventually adopted by the gay community of early 20th Century Britain, when homosexuality was still illegal. (Taking a “vada” at a “bona omi” for instance, means take a look at the good-looking man). Dropping an innocent term into a conversation would have been a way of identifying another gay man, without the risk of incriminating yourself among people who were not in the know.
  • His success is a startling illustration of the power of an anti-language to subvert – using the establishment's prudish "Auntie" to broadcast shocking scenes of gay culture, two years before the Sexual Offences Act decriminalised homosexuality. The show may have only got the green light thanks to the fact that the radio commissioners either didn’t understand the connotations
  • the song Girl Loves Me on David Bowie’s latest album was written as a combination of Polari and Nadsat, the fictional anti-language in Anthony Burgess’s A Clockwork Orange.
  • Montgomery thinks we can see a similar process in the lyrics of hip-hop music. As with the other anti-languages, you can witness the blossoming of words for the illegal activities that might accompany gang culture. “There are so many words for firearm, for different kinds of drug, for money,”
  • Again, the imaginative terms lend themselves to artistic use. “There’s quite often a playful element – you elaborate new terms for old,” Montgomery says. “To use broccoli as a word for a drug – you take a word from the mainstream and convert it to new use and it has a semi-humorous twist to it.”
  • He thinks that the web will only encourage the creation of slang that shares some of the qualities of anti-languages; you just need to look at the rich online vocabulary that has emerged to describe prostitution;
  • new, metaphorical forms of speech will also proliferate in areas threatened by state censorship; already, you can see a dozen euphemisms flourishing in place of every term that is blocked from a search engine or social network.  If we can learn anything from this rich history of criminal cants, it is the enormous resilience of human expression in the face of oppression.
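The word-formation tricks described in the excerpts above – swapping ordinary words for cant terms (“purse” → “bounge”) and back-slang reversal (“face” → “ecaf”) – are mechanical enough to sketch in code. This toy Python example is purely illustrative; the tiny lexicon is taken from the Elizabethan terms quoted above, and everything else is invented for the sketch:

```python
# Toy model of two anti-language tricks from the article:
# (1) relexicalisation - replace ordinary words with cant terms;
# (2) back-slang - reverse a word's letters ("face" -> "ecaf").
CANT = {
    "london": "rome",
    "purse": "bounge",
    "money": "lower",
    "alehouse": "bowsing ken",
}

def backslang(word):
    """Reverse a word's letters, as in "face" -> "ecaf"."""
    return word[::-1]

def encode(sentence):
    """Relexicalise known words; back-slang everything else."""
    out = []
    for word in sentence.lower().split():
        out.append(CANT.get(word, backslang(word)))
    return " ".join(out)

print(encode("face"))         # -> ecaf
print(encode("purse money"))  # -> bounge lower
```

A real anti-language would, as Halliday notes, pile up dozens of over-lexicalised synonyms per concept rather than a one-to-one lookup table; the dictionary here just makes the substitution mechanism visible.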
Javier E

They're Watching You at Work - Don Peck - The Atlantic - 2 views

  • Predictive statistical analysis, harnessed to big data, appears poised to alter the way millions of people are hired and assessed.
  • By one estimate, more than 98 percent of the world’s information is now stored digitally, and the volume of that data has quadrupled since 2007.
  • The application of predictive analytics to people’s careers—an emerging field sometimes called “people analytics”—is enormously challenging, not to mention ethically fraught
  • ...52 more annotations...
  • By the end of World War II, however, American corporations were facing severe talent shortages. Their senior executives were growing old, and a dearth of hiring from the Depression through the war had resulted in a shortfall of able, well-trained managers. Finding people who had the potential to rise quickly through the ranks became an overriding preoccupation of American businesses. They began to devise a formal hiring-and-management system based in part on new studies of human behavior, and in part on military techniques developed during both world wars, when huge mobilization efforts and mass casualties created the need to get the right people into the right roles as efficiently as possible. By the 1950s, it was not unusual for companies to spend days with young applicants for professional jobs, conducting a battery of tests, all with an eye toward corner-office potential.
  • But companies abandoned their hard-edged practices for another important reason: many of their methods of evaluation turned out not to be very scientific.
  • this regime, so widespread in corporate America at mid-century, had almost disappeared by 1990. “I think an HR person from the late 1970s would be stunned to see how casually companies hire now,”
  • Many factors explain the change, he said, and then he ticked off a number of them: Increased job-switching has made it less important and less economical for companies to test so thoroughly. A heightened focus on short-term financial results has led to deep cuts in corporate functions that bear fruit only in the long term. The Civil Rights Act of 1964, which exposed companies to legal liability for discriminatory hiring practices, has made HR departments wary of any broadly applied and clearly scored test that might later be shown to be systematically biased.
  • about a quarter of the country’s corporations were using similar tests to evaluate managers and junior executives, usually to assess whether they were ready for bigger roles.
  • He has encouraged the company’s HR executives to think about applying the games to the recruitment and evaluation of all professional workers.
  • Knack makes app-based video games, among them Dungeon Scrawl, a quest game requiring the player to navigate a maze and solve puzzles, and Wasabi Waiter, which involves delivering the right sushi to the right customer at an increasingly crowded happy hour. These games aren’t just for play: they’ve been designed by a team of neuroscientists, psychologists, and data scientists to suss out human potential. Play one of them for just 20 minutes, says Guy Halfteck, Knack’s founder, and you’ll generate several megabytes of data, exponentially more than what’s collected by the SAT or a personality test. How long you hesitate before taking every action, the sequence of actions you take, how you solve problems—all of these factors and many more are logged as you play, and then are used to analyze your creativity, your persistence, your capacity to learn quickly from mistakes, your ability to prioritize, and even your social intelligence and personality. The end result, Halfteck says, is a high-resolution portrait of your psyche and intellect, and an assessment of your potential as a leader or an innovator.
  • When the results came back, Haringa recalled, his heart began to beat a little faster. Without ever seeing the ideas, without meeting or interviewing the people who’d proposed them, without knowing their title or background or academic pedigree, Knack’s algorithm had identified the people whose ideas had panned out. The top 10 percent of the idea generators as predicted by Knack were in fact those who’d gone furthest in the process.
  • What Knack is doing, Haringa told me, “is almost like a paradigm shift.” It offers a way for his GameChanger unit to avoid wasting time on the 80 people out of 100—nearly all of whom look smart, well-trained, and plausible on paper—whose ideas just aren’t likely to work out.
  • Aptitude, skills, personal history, psychological stability, discretion, loyalty—companies at the time felt they had a need (and the right) to look into them all. That ambit is expanding once again, and this is undeniably unsettling. Should the ideas of scientists be dismissed because of the way they play a game? Should job candidates be ranked by what their Web habits say about them? Should the “data signature” of natural leaders play a role in promotion? These are all live questions today, and they prompt heavy concerns: that we will cede one of the most subtle and human of skills, the evaluation of the gifts and promise of other people, to machines; that the models will get it wrong; that some people will never get a shot in the new workforce.
  • scoring distance from work could violate equal-employment-opportunity standards. Marital status? Motherhood? Church membership? “Stuff like that,” Meyerle said, “we just don’t touch”—at least not in the U.S., where the legal environment is strict. Meyerle told me that Evolv has looked into these sorts of factors in its work for clients abroad, and that some of them produce “startling results.”
  • consider the alternative. A mountain of scholarly literature has shown that the intuitive way we now judge professional potential is rife with snap judgments and hidden biases, rooted in our upbringing or in deep neurological connections that doubtless served us well on the savanna but would seem to have less bearing on the world of work.
  • We may like to think that society has become more enlightened since those days, and in many ways it has, but our biases are mostly unconscious, and they can run surprisingly deep. Consider race. For a 2004 study called “Are Emily and Greg More Employable Than Lakisha and Jamal?,” the economists Sendhil Mullainathan and Marianne Bertrand put white-sounding names (Emily Walsh, Greg Baker) or black-sounding names (Lakisha Washington, Jamal Jones) on similar fictitious résumés, which they then sent out to a variety of companies in Boston and Chicago. To get the same number of callbacks, they learned, they needed to either send out half again as many résumés with black names as those with white names, or add eight extra years of relevant work experience to the résumés with black names.
  • a sociologist at Northwestern, spent parts of the three years from 2006 to 2008 interviewing professionals from elite investment banks, consultancies, and law firms about how they recruited, interviewed, and evaluated candidates, and concluded that among the most important factors driving their hiring recommendations were—wait for it—shared leisure interests.
  • Lacking “reliable predictors of future performance,” Rivera writes, “assessors purposefully used their own experiences as models of merit.” Former college athletes “typically prized participation in varsity sports above all other types of involvement.” People who’d majored in engineering gave engineers a leg up, believing they were better prepared.
  • the prevailing system of hiring and management in this country involves a level of dysfunction that should be inconceivable in an economy as sophisticated as ours. Recent survey data collected by the Corporate Executive Board, for example, indicate that nearly a quarter of all new hires leave their company within a year of their start date, and that hiring managers wish they’d never extended an offer to one out of every five members on their team
  • In the late 1990s, as these assessments shifted from paper to digital formats and proliferated, data scientists started doing massive tests of what makes for a successful customer-support technician or salesperson. This has unquestionably improved the quality of the workers at many firms.
  • In 2010, however, Xerox switched to an online evaluation that incorporates personality testing, cognitive-skill assessment, and multiple-choice questions about how the applicant would handle specific scenarios that he or she might encounter on the job. An algorithm behind the evaluation analyzes the responses, along with factual information gleaned from the candidate’s application, and spits out a color-coded rating: red (poor candidate), yellow (middling), or green (hire away). Those candidates who score best, I learned, tend to exhibit a creative but not overly inquisitive personality, and participate in at least one but not more than four social networks, among many other factors. (Previous experience, one of the few criteria that Xerox had explicitly screened for in the past, turns out to have no bearing on either productivity or retention
  • When Xerox started using the score in its hiring decisions, the quality of its hires immediately improved. The rate of attrition fell by 20 percent in the initial pilot period, and over time, the number of promotions rose. Xerox still interviews all candidates in person before deciding to hire them, Morse told me, but, she added, “We’re getting to the point where some of our hiring managers don’t even want to interview anymore”
  • Gone are the days, Ostberg told me, when, say, a small survey of college students would be used to predict the statistical validity of an evaluation tool. “We’ve got a data set of 347,000 actual employees who have gone through these different types of assessments or tools,” he told me, “and now we have performance-outcome data, and we can split those and slice and dice by industry and location.”
  • Evolv’s tests allow companies to capture data about everybody who applies for work, and everybody who gets hired—a complete data set from which sample bias, long a major vexation for industrial-organization psychologists, simply disappears. The sheer number of observations that this approach makes possible allows Evolv to say with precision which attributes matter more to the success of retail-sales workers (decisiveness, spatial orientation, persuasiveness) or customer-service personnel at call centers (rapport-building)
  • There are some data that Evolv simply won’t use, out of a concern that the information might lead to systematic bias against whole classes of people
  • the idea that hiring was a science fell out of favor. But now it’s coming back, thanks to new technologies and methods of analysis that are cheaper, faster, and much-wider-ranging than what we had before
  • what most excites him are the possibilities that arise from monitoring the entire life cycle of a worker at any given company.
  • Now the two companies are working together to marry pre-hire assessments to an increasing array of post-hire data: about not only performance and duration of service but also who trained the employees; who has managed them; whether they were promoted to a supervisory role, and how quickly; how they performed in that role; and why they eventually left.
  • What begins with an online screening test for entry-level workers ends with the transformation of nearly every aspect of hiring, performance assessment, and management.
  • I turned to Sandy Pentland, the director of the Human Dynamics Laboratory at MIT. In recent years, Pentland has pioneered the use of specialized electronic “badges” that transmit data about employees’ interactions as they go about their days. The badges capture all sorts of information about formal and informal conversations: their length; the tone of voice and gestures of the people involved; how much those people talk, listen, and interrupt; the degree to which they demonstrate empathy and extroversion; and more. Each badge generates about 100 data points a minute.
  • he tried the badges out on about 2,500 people, in 21 different organizations, and learned a number of interesting lessons. About a third of team performance, he discovered, can usually be predicted merely by the number of face-to-face exchanges among team members. (Too many is as much of a problem as too few.) Using data gathered by the badges, he was able to predict which teams would win a business-plan contest, and which workers would (rightly) say they’d had a “productive” or “creative” day. Not only that, but he claimed that his researchers had discovered the “data signature” of natural leaders, whom he called “charismatic connectors” and all of whom, he reported, circulate actively, give their time democratically to others, engage in brief but energetic conversations, and listen at least as much as they talk.
  • His group is developing apps to allow team members to view their own metrics more or less in real time, so that they can see, relative to the benchmarks of highly successful employees, whether they’re getting out of their offices enough, or listening enough, or spending enough time with people outside their own team.
  • Torrents of data are routinely collected by American companies and now sit on corporate servers, or in the cloud, awaiting analysis. Bloomberg reportedly logs every keystroke of every employee, along with their comings and goings in the office. The Las Vegas casino Harrah’s tracks the smiles of the card dealers and waitstaff on the floor (its analytics team has quantified the impact of smiling on customer satisfaction). E‑mail, of course, presents an especially rich vein to be mined for insights about our productivity, our treatment of co-workers, our willingness to collaborate or lend a hand, our patterns of written language, and what those patterns reveal about our intelligence, social skills, and behavior.
  • people analytics will ultimately have a vastly larger impact on the economy than the algorithms that now trade on Wall Street or figure out which ads to show us. He reminded me that we’ve witnessed this kind of transformation before in the history of management science. Near the turn of the 20th century, both Frederick Taylor and Henry Ford famously paced the factory floor with stopwatches, to improve worker efficiency.
  • “The quantities of data that those earlier generations were working with,” he said, “were infinitesimal compared to what’s available now. There’s been a real sea change in the past five years, where the quantities have just grown so large—petabytes, exabytes, zetta—that you start to be able to do things you never could before.”
  • People analytics will unquestionably provide many workers with more options and more power. Gild, for example, helps companies find undervalued software programmers, working indirectly to raise those people’s pay. Other companies are doing similar work. One called Entelo, for instance, specializes in using algorithms to identify potentially unhappy programmers who might be receptive to a phone call
  • He sees it not only as a boon to a business’s productivity and overall health but also as an important new tool that individual employees can use for self-improvement: a sort of radically expanded The 7 Habits of Highly Effective People, custom-written for each of us, or at least each type of job, in the workforce.
  • the most exotic development in people analytics today is the creation of algorithms to assess the potential of all workers, across all companies, all the time.
  • The way Gild arrives at these scores is not simple. The company’s algorithms begin by scouring the Web for any and all open-source code, and for the coders who wrote it. They evaluate the code for its simplicity, elegance, documentation, and several other factors, including the frequency with which it’s been adopted by other programmers. For code that was written for paid projects, they look at completion times and other measures of productivity. Then they look at questions and answers on social forums such as Stack Overflow, a popular destination for programmers seeking advice on challenging projects. They consider how popular a given coder’s advice is, and how widely that advice ranges.
  • The algorithms go further still. They assess the way coders use language on social networks from LinkedIn to Twitter; the company has determined that certain phrases and words used in association with one another can distinguish expert programmers from less skilled ones. Gild knows these phrases and words are associated with good coding because it can correlate them with its evaluation of open-source code, and with the language and online behavior of programmers in good positions at prestigious companies.
  • having made those correlations, Gild can then score programmers who haven’t written open-source code at all, by analyzing the host of clues embedded in their online histories. They’re not all obvious, or easy to explain. Vivienne Ming, Gild’s chief scientist, told me that one solid predictor of strong coding is an affinity for a particular Japanese manga site.
  • Gild’s CEO, Sheeroy Desai, told me he believes his company’s approach can be applied to any occupation characterized by large, active online communities, where people post and cite individual work, ask and answer professional questions, and get feedback on projects. Graphic design is one field that the company is now looking at, and many scientific, technical, and engineering roles might also fit the bill. Regardless of their occupation, most people leave “data exhaust” in their wake, a kind of digital aura that can reveal a lot about a potential hire.
  • professionally relevant personality traits can be judged effectively merely by scanning Facebook feeds and photos. LinkedIn, of course, captures an enormous amount of professional data and network information, across just about every profession. A controversial start-up called Klout has made its mission the measurement and public scoring of people’s online social influence.
  • Mullainathan expressed amazement at how little most creative and professional workers (himself included) know about what makes them effective or ineffective in the office. Most of us can’t even say with any certainty how long we’ve spent gathering information for a given project, or our pattern of information-gathering, never mind know which parts of the pattern should be reinforced, and which jettisoned. As Mullainathan put it, we don’t know our own “production function.”
  • Over time, better job-matching technologies are likely to begin serving people directly, helping them see more clearly which jobs might suit them and which companies could use their skills. In the future, Gild plans to let programmers see their own profiles and take skills challenges to try to improve their scores. It intends to show them its estimates of their market value, too, and to recommend coursework that might allow them to raise their scores even more. Not least, it plans to make accessible the scores of typical hires at specific companies, so that software engineers can better see the profile they’d need to land a particular job
  • Knack, for its part, is making some of its video games available to anyone with a smartphone, so people can get a better sense of their strengths, and of the fields in which their strengths would be most valued. (Palo Alto High School recently adopted the games to help students assess careers.) Ultimately, the company hopes to act as matchmaker between a large network of people who play its games (or have ever played its games) and a widening roster of corporate clients, each with its own specific profile for any given type of job.
  • When I began my reporting for this story, I was worried that people analytics, if it worked at all, would only widen the divergent arcs of our professional lives, further gilding the path of the meritocratic elite from cradle to grave, and shutting out some workers more definitively. But I now believe the opposite is likely to happen, and that we’re headed toward a labor market that’s fairer to people at every stage of their careers
  • For decades, as we’ve assessed people’s potential in the professional workforce, the most important piece of data—the one that launches careers or keeps them grounded—has been educational background: typically, whether and where people went to college, and how they did there. Over the past couple of generations, colleges and universities have become the gatekeepers to a prosperous life. A degree has become a signal of intelligence and conscientiousness, one that grows stronger the more selective the school and the higher a student’s GPA, that is easily understood by employers, and that, until the advent of people analytics, was probably unrivaled in its predictive powers.
  • the limitations of that signal—the way it degrades with age, its overall imprecision, its many inherent biases, its extraordinary cost—are obvious. “Academic environments are artificial environments,” Laszlo Bock, Google’s senior vice president of people operations, told The New York Times in June. “People who succeed there are sort of finely trained, they’re conditioned to succeed in that environment,” which is often quite different from the workplace.
  • because one’s college history is such a crucial signal in our labor market, perfectly able people who simply couldn’t sit still in a classroom at the age of 16, or who didn’t have their act together at 18, or who chose not to go to graduate school at 22, routinely get left behind for good. That such early factors so profoundly affect career arcs and hiring decisions made two or three decades later is, on its face, absurd.
  • I spoke with managers at a lot of companies who are using advanced analytics to reevaluate and reshape their hiring, and nearly all of them told me that their research is leading them toward pools of candidates who didn’t attend college—for tech jobs, for high-end sales positions, for some managerial roles. In some limited cases, this is because their analytics revealed no benefit whatsoever to hiring people with college degrees; in other cases, and more often, it’s because they revealed signals that function far better than college history,
  • Google, too, is hiring a growing number of nongraduates. Many of the people I talked with reported that when it comes to high-paying and fast-track jobs, they’re reducing their preference for Ivy Leaguers and graduates of other highly selective schools.
  • This process is just beginning. Online courses are proliferating, and so are online markets that involve crowd-sourcing. Both arenas offer new opportunities for workers to build skills and showcase competence. Neither produces the kind of instantly recognizable signals of potential that a degree from a selective college, or a first job at a prestigious firm, might. That’s a problem for traditional hiring managers, because sifting through lots of small signals is so difficult and time-consuming.
  • all of these new developments raise philosophical questions. As professional performance becomes easier to measure and see, will we become slaves to our own status and potential, ever-focused on the metrics that tell us how and whether we are measuring up? Will too much knowledge about our limitations hinder achievement and stifle our dreams? All I can offer in response to these questions, ironically, is my own gut sense, which leads me to feel cautiously optimistic.
  • Google’s understanding of the promise of analytics is probably better than anybody else’s, and the company has been changing its hiring and management practices as a result of its ongoing analyses. (Brainteasers are no longer used in interviews, because they do not correlate with job success; GPA is not considered for anyone more than two years out of school, for the same reason—the list goes on.) But for all of Google’s technological enthusiasm, these same practices are still deeply human. A real, live person looks at every résumé the company receives. Hiring decisions are made by committee and are based in no small part on opinions formed during structured interviews.
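The color-coded screening described for Xerox – an algorithm weighing answers and application facts, then emitting red, yellow, or green – can be sketched as a simple weighted-threshold model. This is a purely hypothetical illustration: Xerox's actual features, weights, and cutoffs are not public, so every number and feature name below is invented, keeping only the article's detail that being in one to four social networks counts in a candidate's favor:

```python
# Hypothetical sketch of a color-coded applicant score; the features,
# weights, and thresholds are invented for illustration only.
def score_applicant(features):
    """Combine normalized (0-1) trait scores into one number."""
    weights = {"persistence": 0.4, "rapport": 0.35, "network_fit": 0.25}
    # The article says strong candidates belong to 1-4 social networks.
    networks = features.get("social_networks", 0)
    network_fit = 1.0 if 1 <= networks <= 4 else 0.0
    return (weights["persistence"] * features.get("persistence", 0)
            + weights["rapport"] * features.get("rapport", 0)
            + weights["network_fit"] * network_fit)

def rating(score):
    """Map a score to the article's three-color verdict."""
    if score >= 0.7:
        return "green"   # hire away
    if score >= 0.4:
        return "yellow"  # middling
    return "red"         # poor candidate

applicant = {"persistence": 0.9, "rapport": 0.8, "social_networks": 2}
print(rating(score_applicant(applicant)))  # -> green
```

The point of the sketch is structural: once traits are quantified, hiring reduces to a threshold decision, which is exactly what makes both the efficiency gains and the bias worries in the surrounding excerpts possible.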
Javier E

Opinion | Humans Are Animals. Let's Get Over It. - The New York Times - 0 views

  • The separation of people from, and the superiority of people to, members of other species is a good candidate for the originating idea of Western thought. And a good candidate for the worst.
  • Like Plato, Hobbes associates anarchy with animality and civilization with the state, which gives to our merely animal motion moral content for the first time and orders us into a definite hierarchy.
  • It is rationality that gives us dignity, that makes a claim to moral respect that no mere animal can deserve. “The moral law reveals to me a life independent of animality,” writes Immanuel Kant in “Critique of Practical Reason.” In this assertion, at least, the Western intellectual tradition has been remarkably consistent.
  • ...15 more annotations...
  • the devaluation of animals and disconnection of us from them reflect a deeper devaluation of the material universe in general
  • In this scheme of things, we owe nature nothing; it is to yield us everything. This is the ideology of species annihilation and environmental destruction, and also of technological development.
  • Further trouble is caused when the distinctions between humans and animals are then used to draw distinctions among human beings
  • Some of us, in short, are animals — and some of us are better than that. This, it turns out, is a useful justification for colonialism, slavery and racism.
  • The classical source for this distinction is certainly Aristotle. In the “Politics,” he writes, “Where then there is such a difference as that between soul and body, or between men and animals (as in the case of those whose business is to use their body, and who can do nothing better), the lower sort are by nature slaves.”
  • Every human hierarchy, insofar as it can be justified philosophically, is treated by Aristotle by analogy to the relation of people to animals.
  • One difficult thing to face about our animality is that it entails our deaths; being an animal is associated throughout philosophy with dying purposelessly, and so with living meaninglessly.
  • this line of thought also happens to justify colonizing or even extirpating the “savage,” the beast in human form.
  • Our supposed fundamental distinction from “beasts,” “brutes” and “savages” is used to divide us from nature, from one another and, finally, from ourselves
  • In Plato’s “Republic,” Socrates divides the human soul into two parts. The soul of the thirsty person, he says, “wishes for nothing else than to drink.” But we can restrain ourselves. “That which inhibits such actions,” he concludes, “arises from the calculations of reason.” When we restrain or control ourselves, Plato argues, a rational being restrains an animal.
  • In this view, each of us is both a beast and a person — and the point of human life is to constrain our desires with rationality and purify ourselves of animality
  • These sorts of systematic self-divisions come to be refigured in Cartesian dualism, which separates the mind from the body, or in Sigmund Freud’s distinction between id and ego, or in the neurological contrast between the functions of the amygdala and the prefrontal cortex.
  • I don’t know how to refute it, exactly, except to say that I don’t feel myself to be a logic program running on an animal body; I’d like to consider myself a lot more integrated than that.
  • And I’d like to repudiate every political and environmental conclusion ever drawn by our supposed transcendence of the order of nature
  • There is no doubt that human beings are distinct from other animals, though not necessarily more distinct than other animals are from one another. But maybe we’ve been too focused on the differences for too long. Maybe we should emphasize what all us animals have in common.
Javier E

Dengue Mosquitoes Can Be Tamed by a Common Microbe - The Atlantic - 0 views

  • Dengue fever is caused by a virus that infects an estimated 390 million people every year, and kills about 25,000; the World Health Organization has described it as one of the top 10 threats to global health.
  • It spreads through the bites of mosquitoes, particularly the species Aedes aegypti. Utarini and her colleagues have spent the past decade turning these insects from highways of dengue into cul-de-sacs. They’ve loaded the mosquitoes with a bacterium called Wolbachia, which prevents them from being infected by dengue viruses. Wolbachia spreads very quickly: If a small number of carrier mosquitoes are released into a neighborhood, almost all of the local insects should be dengue-free within a few months
  • Aedes aegypti was once a forest insect confined to sub-Saharan Africa, where it drank blood from a wide variety of animals. But at some point, one lineage evolved into an urban creature that prefers towns over forests, and humans over other animals.
  • ...10 more annotations...
  • The World Mosquito Program (WMP), a nonprofit that pioneered this technique, had run small pilot studies in Australia that suggested it could work. Utarini, who co-leads WMP Yogyakarta, has now shown conclusively that it does.
  • Carried around the world aboard slave ships, Aedes aegypti has thrived. It is now arguably the most effective human-hunter on the planet, its senses acutely attuned to the carbon dioxide in our breath, the warmth of our bodies, and the odors of our skin.
  • Wolbachia was first discovered in 1924, in a different species of mosquito. At first, it seemed so unremarkable that scientists ignored it for decades. But starting in the 1980s, they realized that it has an extraordinary knack for spreading. It passes down mainly from insect mothers to their children, and it uses many tricks to ensure that infected individuals are better at reproducing than uninfected ones. To date, it exists in at least 40 percent of all insect species, making it one of the most successful microbes on the planet.
  • The team divided a large portion of the city into 24 zones and released Wolbachia-infected mosquitoes in half of them. Almost 10,000 volunteers helped distribute egg-filled containers to local backyards. Within a year, about 95 percent of the Aedes mosquitoes in the 12 release zones harbored Wolbachia.
  • The team found that just 2.3 percent of feverish people who lived in the Wolbachia release zones had dengue, compared with 9.4 percent in the control areas. Wolbachia also seemed to work against all four dengue serotypes, and reduced the number of dengue hospitalizations by 86 percent.
  • Even then, these already remarkable numbers are likely to be underestimates. The mosquitoes moved around, carrying Wolbachia into the 12 control zones where no mosquitoes were released. And people also move: They might live in a Wolbachia release zone but be bitten and infected with dengue elsewhere. Both of these factors would have worked against the trial, weakening its results
  • The Wolbachia method does have a few limitations. The bacterium takes months to establish itself, so it can’t be “deployed to contain an outbreak today,” Vazquez-Prokopec told me. As the Yogyakarta trial showed, it works only when Wolbachia reaches a prevalence of at least 80 percent, which requires a lot of work and strong community support
  • The method has other benefits too. It is self-amplifying and self-perpetuating: If enough Wolbachia-infected mosquitoes are released initially, the bacterium should naturally come to dominate the local population, and stay that way. Unlike insecticides, Wolbachia isn’t toxic, it doesn’t kill beneficial insects (or even mosquitoes), and it doesn’t need to be reapplied, which makes it very cost-effective.
  • An analysis by Brady’s team showed that it actually saves money by preventing infections
  • Wolbachia also seems to work against the other diseases that Aedes aegypti carries, including Zika and yellow fever. It could transform this mosquito from one of the most dangerous species to humans into just another biting nuisance.
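The trial figures quoted above (2.3 percent dengue among feverish residents in release zones versus 9.4 percent in control zones) imply a protective efficacy that can be checked with the standard relative-risk-reduction formula. This is a minimal sketch using only the rounded percentages from the article; the trial's own statistical model is not given here, so the exact published figure may differ slightly.

```python
# Protective efficacy implied by the quoted Yogyakarta trial rates.
# Inputs are the rounded percentages from the article, not the raw trial data.

def relative_risk_reduction(treated_rate: float, control_rate: float) -> float:
    """Return 1 - RR: the fraction of cases averted relative to control."""
    return 1.0 - treated_rate / control_rate

efficacy = relative_risk_reduction(0.023, 0.094)
print(f"Implied protective efficacy: {efficacy:.0%}")  # roughly 76%
```

Note that this headline efficacy is, as the article argues, likely an underestimate, since mosquito and human movement between zones diluted the contrast between release and control areas.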
Javier E

How Do You Know When Society Is About to Fall Apart? - The New York Times - 1 views

  • Tainter seemed calm. He walked me through the arguments of the book that made his reputation, “The Collapse of Complex Societies,” which has for years been the seminal text in the study of societal collapse, an academic subdiscipline that arguably was born with its publication in 1988
  • It is only a mild overstatement to suggest that before Tainter, collapse was simply not a thing.
  • His own research has moved on; these days, he focuses on “sustainability.”
  • ...53 more annotations...
  • He writes with disarming composure about the factors that have led to the disintegration of empires and the abandonment of cities and about the mechanism that, in his view, makes it nearly certain that all states that rise will one day fall
  • societal collapse and its associated terms — “fragility” and “resilience,” “risk” and “sustainability” — have become the objects of extensive scholarly inquiry and infrastructure.
  • Princeton has a research program in Global Systemic Risk, Cambridge a Center for the Study of Existential Risk
  • even Tainter, for all his caution and reserve, was willing to allow that contemporary society has built-in vulnerabilities that could allow things to go very badly indeed — probably not right now, maybe not for a few decades still, but possibly sooner. In fact, he worried, it could begin before the year was over.
  • Plato, in “The Republic,” compared cities to animals and plants, subject to growth and senescence like any living thing. The metaphor would hold: In the early 20th century, the German historian Oswald Spengler proposed that all cultures have souls, vital essences that begin falling into decay the moment they adopt the trappings of civilization.
  • that theory, which became the heart of “The Collapse of Complex Societies.” Tainter’s argument rests on two proposals. The first is that human societies develop complexity, i.e. specialized roles and the institutional structures that coordinate them, in order to solve problems
  • All history since then has been “characterized by a seemingly inexorable trend toward higher levels of complexity, specialization and sociopolitical control.”
  • Eventually, societies we would recognize as similar to our own would emerge, “large, heterogeneous, internally differentiated, class structured, controlled societies in which the resources that sustain life are not equally available to all.”
  • Something more than the threat of violence would be necessary to hold them together, a delicate balance of symbolic and material benefits that Tainter calls “legitimacy,” the maintenance of which would itself require ever more complex structures, which would become ever less flexible, and more vulnerable, the more they piled up.
  • Social complexity, he argues, is inevitably subject to diminishing marginal returns. It costs more and more, in other words, while producing smaller and smaller profits.
  • Take Rome, which, in Tainter’s telling, was able to win significant wealth by sacking its neighbors but was thereafter required to maintain an ever larger and more expensive military just to keep the imperial machine from stalling — until it couldn’t anymore.
  • This is how it goes. As the benefits of ever-increasing complexity — the loot shipped home by the Roman armies or the gentler agricultural symbiosis of the San Juan Basin — begin to dwindle, Tainter writes, societies “become vulnerable to collapse.”
  • haven’t countless societies weathered military defeats, invasions, even occupations and lengthy civil wars, or rebuilt themselves after earthquakes, floods and famines?
  • Only complexity, Tainter argues, provides an explanation that applies in every instance of collapse.
  • Complexity builds and builds, usually incrementally, without anyone noticing how brittle it has all become. Then some little push arrives, and the society begins to fracture.
  • A disaster — even a severe one like a deadly pandemic, mass social unrest or a rapidly changing climate — can, in Tainter’s view, never be enough by itself to cause collapse
  • The only precedent Tainter could think of, in which pandemic coincided with mass social unrest, was the Black Death of the 14th century. That crisis reduced the population of Europe by as much as 60 percent.
  • Whether any existing society is close to collapsing depends on where it falls on the curve of diminishing returns.
  • The United States hardly feels like a confident empire on the rise these days. But how far along are we?
  • Scholars of collapse tend to fall into two loose camps. The first, dominated by Tainter, looks for grand narratives and one-size-fits-all explanations
  • The second is more interested in the particulars of the societies they study
  • Patricia McAnany, who teaches at the University of North Carolina at Chapel Hill, has questioned the usefulness of the very concept of collapse — she was an editor of a 2010 volume titled “Questioning Collapse” — but admits to being “very, very worried” about the lack, in the United States, of the “nimbleness” that crises require of governments.
  • We’re too vested and tied to places.” Without the possibility of dispersal, or of real structural change to more equitably distribute resources, “at some point the whole thing blows. It has to.”
  • In Turchin’s case the key is the loss of “social resilience,” a society’s ability to cooperate and act collectively for common goals. By that measure, Turchin judges that the United States was collapsing well before Covid-19 hit. For the last 40 years, he argues, the population has been growing poorer and more unhealthy as elites accumulate more and more wealth and institutional legitimacy founders. “The United States is basically eating itself from the inside out,
  • Inequality and “popular immiseration” have left the country extremely vulnerable to external shocks like the pandemic, and to internal triggers like the killings of George Floyd
  • Societies evolve complexity, he argues, precisely to meet such challenges.
  • Eric H. Cline, who teaches at the George Washington University, argued in “1177 B.C.: The Year Civilization Collapsed” that Late Bronze Age societies across Europe and western Asia crumbled under a concatenation of stresses, including natural disasters — earthquakes and drought — famine, political strife, mass migration and the closure of trade routes. On their own, none of those factors would have been capable of causing such widespread disintegration, but together they formed a “perfect storm” capable of toppling multiple societies all at once.
  • Collapse “really is a matter of when,” he told me, “and I’m concerned that this may be the time.”
  • In “The Collapse of Complex Societies,” Tainter makes a point that echoes the concern that Patricia McAnany raised. “The world today is full,” Tainter writes. Complex societies occupy every inhabitable region of the planet. There is no escaping. This also means, he writes, that collapse, “if and when it comes again, will this time be global.” Our fates are interlinked. “No longer can any individual nation collapse. World civilization will disintegrate as a whole.”
  • If it happens, he says, it would be “the worst catastrophe in history.”
  • The quest for efficiency, he wrote recently, has brought on unprecedented levels of complexity: “an elaborate global system of production, shipping, manufacturing and retailing” in which goods are manufactured in one part of the world to meet immediate demands in another, and delivered only when they’re needed. The system’s speed is dizzying, but so are its vulnerabilities.
  • A more comprehensive failure of fragile supply chains could mean that fuel, food and other essentials would no longer flow to cities. “There would be billions of deaths within a very short period,” Tainter says.
  • If we sink “into a severe recession or a depression,” Tainter says, “then it will probably cascade. It will simply reinforce itself.”
  • Tainter tells me, he has seen “a definite uptick” in calls from journalists: The study of societal collapse suddenly no longer seems like a purely academic pursuit
  • Turchin is keenly aware of the essential instability of even the sturdiest-seeming systems. “Very severe events, while not terribly likely, are quite possible,” he says. When he emigrated from the U.S.S.R. in 1977, he adds, no one imagined the country would splinter into its constituent parts. “But it did.”
  • He writes of visions of “bloated bureaucracies” becoming the basis of “entire political careers.” Arms races, he observes, presented a “classic example” of spiraling complexity that provides “no tangible benefit for much of the population” and “usually no competitive advantage” either.
  • It is hard not to read the book through the lens of the last 40 years of American history, as a prediction of how the country might deteriorate if resources continued to be slashed from nearly every sector but the military, prisons and police.
  • The more a population is squeezed, Tainter warns, the larger the share that “must be allocated to legitimization or coercion.
  • And so it was: As U.S. military spending skyrocketed — to, by some estimates, a total of more than $1 trillion today from $138 billion in 1980 — the government would try both tactics, ingratiating itself with the wealthy by cutting taxes while dismantling public-assistance programs and incarcerating the poor in ever-greater numbers.
  • “As resources committed to benefits decline,” Tainter wrote in 1988, “resources committed to control must increase.”
  • The overall picture drawn by Tainter’s work is a tragic one. It is our very creativity, our extraordinary ability as a species to organize ourselves to solve problems collectively, that leads us into a trap from which there is no escaping
  • Complexity is “insidious,” in Tainter’s words. “It grows by small steps, each of which seems reasonable at the time.” And then the world starts to fall apart, and you wonder how you got there.
  • Perhaps collapse is not, actually, a thing. Perhaps, as an idea, it was a product of its time, a Cold War hangover that has outlived its usefulness, or an academic ripple effect of climate-change anxiety, or a feedback loop produced by some combination of the two
  • if you pay attention to people’s lived experience, and not just to the abstractions imposed by a highly fragmented archaeological record, a different kind of picture emerges.
  • Tainter’s understanding of societies as problem-solving entities can obscure as much as it reveals
  • Plantation slavery arose in order to solve a problem faced by the white landowning class: The production of agricultural commodities like sugar and cotton requires a great deal of backbreaking labor. That problem, however, has nothing to do with the problems of the people they enslaved. Which of them counts as “society”?
  • Since the beginning of the pandemic, the total net worth of America’s billionaires, all 686 of them, has jumped by close to a trillion dollars.
  • If societies are not in fact unitary, problem-solving entities but heaving contradictions and sites of constant struggle, then their existence is not an all-or-nothing game.
  • Collapse appears not as an ending, but a reality that some have already suffered — in the hold of a slave ship, say, or on a long, forced march from their ancestral lands to reservations far away — and survived.
  • The current pandemic has already given many of us a taste of what happens when a society fails to meet the challenges that face it, when the factions that rule over it tend solely to their own problems
  • the real danger comes from imagining that we can keep living the way we always have, and that the past is any more stable than the present.
  • If you close your eyes and open them again, the periodic disintegrations that punctuate our history — all those crumbling ruins — begin to fade, and something else comes into focus: wiliness, stubbornness and, perhaps the strongest and most essential human trait, adaptability.
  • When one system fails, we build another. We struggle to do things differently, and we push on. As always, we have no other choice.
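Tainter's core argument above — that complexity keeps costing more while yielding smaller profits — can be made concrete with a toy model. The functional forms below (a concave benefit curve, a linear maintenance cost) are illustrative assumptions of mine, not Tainter's; the point is only to show how net returns peak and then decline as complexity accumulates.

```python
# Toy illustration of diminishing marginal returns on social complexity.
# The sqrt benefit and linear cost are assumptions for illustration only.
import math

def net_return(complexity: float) -> float:
    benefit = math.sqrt(complexity)   # concave: each increment yields less
    cost = 0.1 * complexity           # maintenance cost grows steadily
    return benefit - cost

levels = range(0, 101, 10)
returns = [net_return(c) for c in levels]
peak = max(range(len(levels)), key=lambda i: returns[i])
print(f"Net return peaks at complexity {levels[peak]}, then declines")
```

Past the peak, every further increment of complexity makes the society worse off even as it keeps paying for it — the region of the curve where, in Tainter's terms, a society "becomes vulnerable to collapse."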
johnsonel7

Can Brain Science Help Us Break Bad Habits? | The New Yorker - 0 views

  • Texts arrived with the tones of a French horn and were similarly dispatched. Soon, I was reaching for the device every time it made a sound, like Pavlov’s dog salivating when it heard a bell. This started to interfere with work and conversations. The machine had seemed like a miraculous servant, but gradually I became its slave.
  • Habits, good and bad, have long fascinated philosophers and policymakers. Aristotle, in the Nicomachean Ethics, surveyed existing notions of virtue and offered this summary: “Some thinkers hold that it is by nature that people become good, others that it is by habit, and others that it is by instruction.” He concluded that habits were responsible. Cicero called habit “second nature,” a phrase that we still use.
  • Our minds, Wood explains, have “multiple separate but interconnected mechanisms that guide behavior.” But we are aware only of our decision-making ability—a phenomenon known as the “introspection illusion”—and that may be why we overestimate its power.
  • ...4 more annotations...
  • In Mischel’s marshmallow experiment, only a quarter of the subjects were able to resist eating the marshmallow for fifteen minutes. This implies that a large majority of us lack the self-control required to succeed in life.
  • They were most successful at adopting productive behaviors not when they resolved to do better, or distracted themselves from temptation, but when they altered their environment. Instead of studying on a couch in a dorm, with a TV close by, they went to the library. They ate better when they removed junk food from the dorm refrigerator. “Successful self-control,” Wood writes, “came from essentially covering up the marshmallow.”
  • Both, too, emphasize the role of conscious effort—not in resisting habit but in analyzing it, the better to formulate a strategy for reform.
  • Wood advises us to come up with new rewards as substitutes for the ones the phone provided. I listened to music on the car radio. In the evening, instead of scrolling through tweets and e-mails, I sought out authors I’d never read. At the end of each day, I felt calmer, and free.
tongoscar

'I AM the Voice of the Voiceless' seeks to share overlooked history in women's right to... - 0 views

  • It was Aug. 18, 1920 when the 19th Amendment to the Constitution was ratified, giving American women the right to vote. Later that year, on Nov. 2, more than 8 million women in our country voted in elections for the first time.
  • “From the very beginning — from Harriet Tubman, who was a slave, to major suffragette activists, there were black women who were millionaires, abolitionists, educators… who used their influence, they used their money, they used their time, they suffered,” said Executive Director Vivian Shipe. “They fought for the rights for all women to be able to have that right to vote.”
  • The “I AM the Voice of the Voiceless” organization is planning an event to honor what it’s calling “The Shades of The Suffrage Movement.”
tongoscar

Why MLK Is America's Last Founding Father - 0 views

  • The United States at large had a race problem that traced back to when the first African slaves were imported to Virginia in 1619.
  • King’s efforts to improve conditions for blacks in the American South first attracted national attention in December 1955 when King’s Montgomery Improvement Association organized a boycott of Montgomery, Alabama’s segregated busing system to protest the arrest of Rosa Parks. She had refused to join three other blacks in giving up her seat for one white man. 
  • Hating and resenting one’s enemies was unbecoming of the organization’s members, who were activists “guided by Christian love” in their efforts to attain justice from a system of color-conscious laws. 
  • ...3 more annotations...
  • King reconciled militancy and moderation by 1963 in his “Letter from Birmingham Jail.”
  • As the 1950s came to an end, New York Times correspondent Michael Clark was covering raucous gatherings of black nationalists in Harlem, New York. He reported one encounter at an event organized in November 1959 by James Lawson, president of the black nationalist organization called the United African Nationalist. 
  • King understood the temptation to fight identity politics with identity politics, but refused. He preached that any form of race nationalism defied the “edicts of the Almighty God himself.” 
katherineharron

What my Florida town can teach us about racist policing (opinion) - CNN - 0 views

  • Nine days before George Floyd died an agonizing death under the knee of a white Minneapolis police officer while others watched, law enforcement officials broke up what has been described as a massive block party in my Florida hometown of DeLand and the surrounding unincorporated Volusia County.
  • this local example has lessons for all of us looking for ways to facilitate effective community policing of African American communities during the Covid-19 pandemic.
  • The mostly African American neighborhood known as Spring Hill is one of five historically underserved communities in the DeLand area where freed slaves settled to live separately after the Civil War. My elementary school — once heralded as a sign of this area's progress toward racial reconciliation when in the 1970s white students from the suburbs were bused there to implement the 1954 Brown v. Board of Education desegregation order — is still a neighborhood school for mostly black and brown students.
  • ...8 more annotations...
  • Figuring out exactly what happened that Saturday night will take time and require generous listening to reveal important details about exactly what events took place, how law enforcement became involved and whether permitting and operational procedures were followed.
  • I'm convinced that the depiction of the event and the actions of law enforcement is contrary to what was initially reported. This was not a pop-up Spring Hill block party that spontaneously became massive, disruptive and violent. Instead, it involved groups gathered for a series of events (including, among others, a car show, a concert and memorial for a former Spring Hill resident who in 2008 was a victim of gun violence) that were promoted successfully enough to attract attendees from as far away as Orlando, Tampa and Jacksonville.
  • And instead of becoming yet another incident where unarmed African Americans were shot by law enforcement officers who felt threatened based on preconceived fears and racist assumptions, there have been no reports or claims that these law enforcement officers shot, killed or inflicted life-threatening injury on any residents or visitors
  • law enforcement officers claim they were hit and injured that night by a sucker punch and the hurling of bottles, a bar stool and a mason jar; that they recovered one loaded Ruger 9 mm and other guns, some narcotics and $3,840 in cash; that they made seven arrests and issued five traffic citations. It remains the subject of further investigation and reporting to resolve community complaints in social media posts about undue provocation, escalation and unlawful business interruption. Videos of the incident shed some light but do not capture all aspects of a crowd this large -- the Volusia sheriff's office estimated it at 3,000 -- moving across multiple locations.
  • To facilitate effective community policing during this pandemic crisis, law enforcement leaders and African American leaders and residents need to further discuss and endeavor to reach consensus on four practical steps: suspending plans for any large gatherings until public health officials say they are safe; advocating for national and state leaders to put health over politics by warning about the continuing risks of asymptomatic virus transmission as the economy reopens; using social media to promote a consistent message about the danger of asymptomatic spread, especially given that the African American community is experiencing a disproportionate number of Covid-19 deaths, and ensuring that when large events are permissible organizers comply with local permitting requirements, which should be consistently enforced in ALL communities, not just in African American neighborhoods.
  • Ironically, on the same morning as the Spring Hill neighborhood events in question, I was part of a group of 19 racially, politically and socially diverse individuals from eight states and 11 cities gathered for a virtual "Color Line Roundtable."
  • participants thoughtfully discussed what values, beliefs and principles would guide their votes -- or abstentions -- in the November election. Each of us had a slightly different way of articulating those foundational beliefs, but, as one first-time participant emailed me after the discussion, it was "affirming to hear the commonality of beliefs and principles amongst a group of people who obviously also have some significant differences in opinions and positions."
  • upon further reflection, I have come to appreciate the value of our community's years-long series of roundtable discussions. Covid-19 restrictions and Floyd's murder might have complicated relations with law enforcement officials, but they offer yet another opportunity for us to talk candidly about the complex issues of effective community policing, racial diversity, equity and inclusion.
katherineharron

Ben & Jerry's statement on white supremacy is so extraordinary. Here's why - CNN - 0 views

  • The ice cream maker has called on Americans to "dismantle white supremacy" and "grapple with the sins of our past" as nationwide protests against racial injustice stretch into their eighth day.
  • Ben & Jerry's describes the death of George Floyd, an unarmed black man, at the hands of a white police officer as the result of "inhumane police brutality that is perpetuated by a culture of white supremacy."
  • "What happened to George Floyd was not the result of a bad apple; it was the predictable consequence of a racist and prejudiced system and culture that has treated Black bodies as the enemy from the beginning," said the brand, which is owned by Unilever (UL).
  • ...5 more annotations...
  • the statement from Ben & Jerry's is unusually comprehensive and direct, addressing the historical roots of discrimination in the United States and calling out systemic racism, while advocating specific policies to prevent further police abuses and redress racial inequality.
  • Ben & Jerry's, which also publicly supported the Black Lives Matter movement, called on President Donald Trump to disavow white supremacists and nationalist groups that "overtly support him."
  • The ice cream maker also called for the US Department of Justice to reinvigorate its Civil Rights Division, and for Congress to pass H.R. 40, a bill that would create a commission to study the effects of discrimination since African slaves first arrived in North America in 1619 and recommend remedies.
  • The company's sale to British-Dutch consumer goods giant Unilever (UL) in 2000 has not prevented it from speaking out on issues such as racial injustice, climate change and refugee rights. As part of the deal, Ben & Jerry's kept an independent board of directors. "We're a wholly owned subsidiary [of Unilever], but we still act according to Ben & Jerry's mission, vision and values," a spokesperson told CNN Business.
  • "Unless and until white America is willing to collectively acknowledge its privilege, take responsibility for its past and the impact it has on the present, and commit to creating a future steeped in justice, the list of names that George Floyd has been added to will never end. We have to use this moment to accelerate our nation's long journey towards justice and a more perfect union," the statement concluded.
Javier E

The New History Wars - The Atlantic - 0 views

  • Critical historians who thought they were winning the fight for control within the academy now face dire retaliation from outside the academy. The dizzying turn from seeming triumph in 2020 to imminent threat in 2022 has unnerved many practitioners of the new history. Against this background, they did not welcome it when their association’s president suggested that maybe their opponents had a smidgen of a point.
  • a background reality of the humanities in the contemporary academy: a struggle over who is entitled to speak about what. Nowhere does this struggle rage more fiercely than in anything to do with the continent of Africa. Who should speak? What may be said? Who will be hired?
  • One obvious escape route from the generational divide in the academy — and the way the different approaches to history, presentist and antiquarian, tend to map onto it — is for some people, especially those on the older and whiter side of the divide, to keep their mouths shut about sensitive issues
  • ...15 more annotations...
  • The political and methodological stresses within the historical profession are intensified by economic troubles. For a long time, but especially since the economic crisis of 2008, university students have turned away from the humanities, preferring to major in fields that seem to offer more certain and lucrative employment. Consequently, academic jobs in the humanities and especially in history have become radically more precarious for younger faculty—even as universities have sought to meet diversity goals in their next-generation hiring by expanding offerings in history-adjacent specialties, such as gender and ethnic studies.
  • The result has produced a generational divide. Younger scholars feel oppressed and exploited by universities pressing them to do more labor for worse pay with less security than their elders; older scholars feel that overeager juniors are poised to pounce on the least infraction as an occasion to end an elder’s career and seize a job opening for themselves. Add racial difference as an accelerant, and what was intended as an interesting methodological discussion in a faculty newsletter can explode into a national culture war.
  • One of the greatest American Africanists was the late Philip Curtin. He wrote one of the first attempts to tally the exact number of persons trafficked by the transatlantic slave trade. Upon publication in 1972, his book was acclaimed as a truly pioneering work of history. By 1995, however, he was moved to protest against trends in the discipline at that time in an article in the Chronicle of Higher Education:I am troubled by increasing evidence of the use of racial criteria in filling faculty posts in the field of African history … This form of intellectual apartheid has been around for several decades, but it appears to have become much more serious in the past few years, to the extent that white scholars trained in African history now have a hard time finding jobs.
  • Much of academia is governed these days by a joke from the Soviet Union: “If you think it, don’t speak it. If you speak it, don’t write it. If you write it, don’t sign it. But if you do think it, speak it, write it, and sign it—don’t be surprised.”
  • Yet this silence has consequences, too. One of the most unsettling is the displacement of history by mythmaking
  • mythmaking is spreading from “just the movies” to more formal and institutional forms of public memory. If old heroes “must fall,” their disappearance opens voids for new heroes to be inserted in their place—and that insertion sometimes requires that new history be fabricated altogether, the “bad history” that Sweet tried to warn against.
  • If it is not the job of the president of the American Historical Association to confront those questions, then whose is it?
  • Sweet used a play on words—“Is History History?”—for the title of his complacency-shaking essay. But he was asking not whether history is finished, done with, but Is history still history? Is it continuing to do what history is supposed to do? Or is it being annexed for other purposes, ideological rather than historical ones?
  • Advocates of studying the more distant past to disturb and challenge our ideas about the present may accuse their academic rivals of “presentism.”
  • In real life, of course, almost everybody who cares about history believes in a little of each option. But how much of each? What’s the right balance? That’s the kind of thing that historians do argue about, and in the arguing, they have developed some dismissive labels for one another
  • Those who look to the more recent past to guide the future may accuse the other camp of “antiquarianism.”
  • The accusation of presentism hurts because it implies that the historian is sacrificing scholarly objectivity for ideological or political purposes. The accusation of antiquarianism stings because it implies that the historian is burrowing into the dust for no useful purpose at all.
  • In his mind, he was merely reopening one of the most familiar debates in professional history: the debate over why? What is the value of studying the past? To reduce the many available answers to a stark choice: Should we study the more distant past to explore its strangeness—and thereby jolt ourselves out of easy assumptions that the world we know is the only possible one?
  • Or should we study the more recent past to understand how our world came into being—and thereby learn some lessons for shaping the future?
  • The August edition of the association’s monthly magazine featured, as usual, a short essay by the association’s president, James H. Sweet, a professor at the University of Wisconsin at Madison. Within hours of its publication, an outrage volcano erupted on social media. A professor at Cornell vented about the author’s “white gaze.”