Group items matching "practical" in title, tags, annotations or url

g-dragon

Caste System of Nepal

  • Nepalese are known by castes amongst themselves, essentially for their identity. (A caste is an elaborate and complex social system that combines elements of occupation, endogamy, culture, social class, tribal affiliation and political power. Discrimination based on caste, as perceived by UNICEF, is prevalent mainly in parts of Asia (India, Sri Lanka, Bangladesh, Nepal, Japan) and Africa.) It affects their family life, food, dress, occupations and culture. Basically, it determines their way of life. On the whole, the caste system has an important role in social stratification in Nepal.
  • The communities living in the high mountains do not follow the caste system. They are Tibetan migrants (people from Tibet who migrated to the north of Nepal), and they practice communal ownership.
  • The caste system, which is the basis of a feudalistic economic structure with individual ownership (feudalism was a set of political and military customs that flourished in medieval Europe between the ninth and fifteenth centuries), did not exist prior to the arrival of Indians and their culture in Nepal.
  • The indigenous ethnic Nepalese do not have a caste system even today because they practice Buddhism (a religion and philosophy encompassing a variety of traditions, beliefs and practices, largely based on teachings attributed to Siddhartha Gautama, commonly known as the Buddha, "the awakened one," who was born in Lumbini in southern Nepal). Only the Indian migrants who practice Hinduism (the predominant and indigenous religious tradition of South Asia, often referred to by its adherents as Sanatana Dharma, a Sanskrit phrase meaning "the eternal law"; it is formed of diverse traditions and has no single founder) follow this system.
  • Violating these rules makes one liable to punishments such as social boycott. Despite the fact that castes were based on various professions, untouchability evolved later.
  • The caste of an individual basically determines his ritual status, purity, and pollution.
  • Likewise, pollution means that members of a lower caste are considered polluted and thus not allowed to touch or stay close to higher-caste people. They are also barred from entering temples, funeral places, restaurants, shops and other public places.
  • The caste system in Nepal was earlier incorporated into national law in order to bring people of different origins together under one umbrella. Each caste has its set of family names given to the members of its community according to their professions.
Javier E

How America Went Haywire - The Atlantic

  • You are entitled to your own opinion, but you are not entitled to your own facts.
  • Why are we like this? The short answer is because we’re Americans—because being American means we can believe anything we want; that our beliefs are equal or superior to anyone else’s, experts be damned.
  • The word mainstream has recently become a pejorative, shorthand for bias, lies, oppression by the elites.
  • Yet the institutions and forces that once kept us from indulging the flagrantly untrue or absurd—media, academia, government, corporate America, professional associations, respectable opinion in the aggregate—have enabled and encouraged every species of fantasy over the past few decades.
  • Our whole social environment and each of its overlapping parts—cultural, religious, political, intellectual, psychological—have become conducive to spectacular fallacy and truthiness and make-believe. There are many slippery slopes, leading in various directions to other exciting nonsense. During the past several decades, those naturally slippery slopes have been turned into a colossal and permanent complex of interconnected, crisscrossing bobsled tracks, which Donald Trump slid down right into the White House.
  • Esalen is a mother church of a new American religion for people who think they don’t like churches or religions but who still want to believe in the supernatural. The institute wholly reinvented psychology, medicine, and philosophy, driven by a suspicion of science and reason and an embrace of magical thinking
  • The great unbalancing and descent into full Fantasyland was the product of two momentous changes. The first was a profound shift in thinking that swelled up in the ’60s; since then, Americans have had a new rule written into their mental operating systems: Do your own thing, find your own reality, it’s all relative.
  • The second change was the onset of the new era of information. Digital technology empowers real-seeming fictions of the ideological and religious and scientific kinds. Among the web’s 1 billion sites, believers in anything and everything can find thousands of fellow fantasists, with collages of facts and “facts” to support them
  • Today, each of us is freer than ever to custom-make reality, to believe whatever and pretend to be whoever we wish. Which makes all the lines between actual and fictional blur and disappear more easily. Truth in general becomes flexible, personal, subjective. And we like this new ultra-freedom, insist on it, even as we fear and loathe the ways so many of our wrongheaded fellow Americans use it.
  • we are the global crucible and epicenter. We invented the fantasy-industrial complex; almost nowhere outside poor or otherwise miserable countries are flamboyant supernatural beliefs so central to the identities of so many people.
  • We’re still rich and free, still more influential and powerful than any other nation, practically a synonym for developed country. But our drift toward credulity, toward doing our own thing, toward denying facts and having an altogether uncertain grip on reality, has overwhelmed our other exceptional national traits and turned us into a less developed country.
  • For most of our history, the impulses existed in a rough balance, a dynamic equilibrium between fantasy and reality, mania and moderation, credulity and skepticism.
  • It was a headquarters for a new religion of no religion, and for “science” containing next to no science. The idea was to be radically tolerant of therapeutic approaches and understandings of reality, especially if they came from Asian traditions or from American Indian or other shamanistic traditions. Invisible energies, past lives, astral projection, whatever—the more exotic and wondrous and unfalsifiable, the better.
  • These influential critiques helped make popular and respectable the idea that much of science is a sinister scheme concocted by a despotic conspiracy to oppress people. Mental illness, both Szasz and Laing said, is “a theory not a fact.”
  • The Greening of America may have been the mainstream’s single greatest act of pandering to the vanity and self-righteousness of the new youth. Its underlying theoretical scheme was simple and perfectly pitched to flatter young readers: There are three types of American “consciousness,” each of which “makes up an individual’s perception of reality … his ‘head,’ his way of life.” Consciousness I people were old-fashioned, self-reliant individualists rendered obsolete by the new “Corporate State”—essentially, your grandparents. Consciousness IIs were the fearful and conformist organization men and women whose rationalism was a tyrannizing trap laid by the Corporate State—your parents.
  • And then there was Consciousness III, which had “made its first appearance among the youth of America,” “spreading rapidly among wider and wider segments of youth, and by degrees to older people.” If you opposed the Vietnam War and dressed down and smoked pot, you were almost certainly a III. Simply by being young and casual and undisciplined, you were ushering in a new utopia.
  • Reich was half-right. An epochal change in American thinking was under way and “not, as far as anybody knows, reversible … There is no returning to an earlier consciousness.” His wishful error was believing that once the tidal surge of new sensibility brought down the flood walls, the waters would flow in only one direction, carving out a peaceful, cooperative, groovy new continental utopia, hearts and minds changed like his, all of America Berkeleyized and Vermontified. Instead, Consciousness III was just one early iteration of the anything-goes, post-reason, post-factual America enabled by the tsunami.
  • During the ’60s, large swaths of academia made a turn away from reason and rationalism as they’d been understood. Many of the pioneers were thoughtful, their work fine antidotes to postwar complacency. The problem was the nature and extent of their influence at that particular time, when all premises and paradigms seemed up for grabs. That is, they inspired half-baked and perverse followers in the academy, whose arguments filtered out into the world at large: All approximations of truth, science as much as any fable or religion, are mere stories devised to serve people’s needs or interests. Reality itself is a purely social construction, a tableau of useful or wishful myths that members of a society or tribe have been persuaded to believe. The borders between fiction and nonfiction are permeable, maybe nonexistent.
  • The delusions of the insane, superstitions, and magical thinking? Any of those may be as legitimate as the supposed truths contrived by Western reason and science. The takeaway: Believe whatever you want, because pretty much everything is equally true and false.
  • over in sociology, in 1966 a pair of professors published The Social Construction of Reality, one of the most influential works in their field. Not only were sanity and insanity and scientific truth somewhat dubious concoctions by elites, Peter Berger and Thomas Luckmann explained—so was everything else. The rulers of any tribe or society do not just dictate customs and laws; they are the masters of everyone’s perceptions, defining reality itself
  • Over in anthropology, where the exotic magical beliefs of traditional cultures were a main subject, the new paradigm took over completely—don’t judge, don’t disbelieve, don’t point your professorial finger.
  • then isn’t everyone able—no, isn’t everyone obliged—to construct their own reality? The book was timed perfectly to become a foundational text in academia and beyond.
  • To create the all-encompassing stage sets that everyone inhabits, rulers first use crude mythology, then more elaborate religion, and finally the “extreme step” of modern science. “Reality”? “Knowledge”? “If we were going to be meticulous,” Berger and Luckmann wrote, “we would put quotation marks around the two aforementioned terms every time we used them.” “What is ‘real’ to a Tibetan monk may not be ‘real’ to an American businessman.”
  • In the ’60s, anthropology decided that oracles, diviners, incantations, and magical objects should be not just respected, but considered equivalent to reason and science. If all understandings of reality are socially constructed, those of Kalabari tribesmen in Nigeria are no more arbitrary or faith-based than those of college professors.
  • Even the social critic Paul Goodman, beloved by young leftists in the ’60s, was flabbergasted by his own students by 1969. “There was no knowledge,” he wrote, “only the sociology of knowledge. They had so well learned that … research is subsidized and conducted for the benefit of the ruling class that they did not believe there was such a thing as simple truth.”
  • Ever since, the American right has insistently decried the spread of relativism, the idea that nothing is any more correct or true than anything else. Conservatives hated how relativism undercut various venerable and comfortable ruling ideas—certain notions of entitlement (according to race and gender) and aesthetic beauty and metaphysical and moral certainty.
  • Conservatives are correct that the anything-goes relativism of college campuses wasn’t sequestered there, but when it flowed out across America it helped enable extreme Christianities and lunacies on the right—gun-rights hysteria, black-helicopter conspiracism, climate-change denial, and more.
  • Elaborate paranoia was an established tic of the Bircherite far right, but the left needed a little time to catch up. In 1964, a left-wing American writer published the first book about a JFK conspiracy, claiming that a Texas oilman had been the mastermind, and soon many books were arguing that the official government inquiry had ignored the hidden conspiracies.
  • Conspiracy became the high-end Hollywood dramatic premise—Chinatown, The Conversation, The Parallax View, and Three Days of the Condor came out in the same two-year period. Of course, real life made such stories plausible. The infiltration by the FBI and intelligence agencies of left-wing groups was then being revealed, and the Watergate break-in and its cover-up were an actual criminal conspiracy. Within a few decades, the belief that a web of villainous elites was covertly seeking to impose a malevolent global regime made its way from the lunatic right to the mainstream.
  • More and more people on both sides would come to believe that an extraordinarily powerful cabal—international organizations and think tanks and big businesses and politicians—secretly ran America.
  • Each camp, conspiracists on the right and on the left, was ostensibly the enemy of the other, but they began operating as de facto allies. Relativist professors enabled science-denying Christians, and the antipsychiatry craze in the ’60s appealed simultaneously to left-wingers and libertarians (as well as to Scientologists). Conspiracy theories were more of a modern right-wing habit before people on the left signed on. However, the belief that the federal government had secret plans to open detention camps for dissidents sprouted in the ’70s on the paranoid left before it became a fixture on the right.
  • Extreme religious and quasi-religious beliefs and practices, Christian and New Age and otherwise, didn’t subside, but grew and thrived—and came to seem unexceptional.
  • Until we’d passed through the ’60s and half of the ’70s, I’m pretty sure we wouldn’t have given the presidency to some dude, especially a born-again Christian, who said he’d recently seen a huge, color-shifting, luminescent UFO hovering near him.
  • Starting in the ’80s, loving America and making money and having a family were no longer unfashionable. The sense of cultural and political upheaval and chaos dissipated—which lulled us into ignoring all the ways that everything had changed, that Fantasyland was now scaling and spreading and becoming the new normal. What had seemed strange and amazing in 1967 or 1972 became normal and ubiquitous.
  • For most of the 20th century, national news media had felt obliged to pursue and present some rough approximation of the truth rather than to promote a truth, let alone fictions. With the elimination of the Fairness Doctrine, a new American laissez-faire had been officially declared. If lots more incorrect and preposterous assertions circulated in our mass media, that was a price of freedom. If splenetic commentators could now, as never before, keep believers perpetually riled up and feeling the excitement of being in a mob, so be it.
  • Relativism became entrenched in academia—tenured, you could say
  • as he wrote in 1986, “the secret of theory”—this whole intellectual realm now called itself simply “theory”—“is that truth does not exist.”
  • After the ’60s, truth was relative, criticizing was equal to victimizing, individual liberty became absolute, and everyone was permitted to believe or disbelieve whatever they wished. The distinction between opinion and fact was crumbling on many fronts.
  • America didn’t seem as weird and crazy as it had around 1970. But that’s because Americans had stopped noticing the weirdness and craziness. We had defined every sort of deviancy down. And as the cultural critic Neil Postman put it in his 1985 jeremiad about how TV was replacing meaningful public discourse with entertainment, we were in the process of amusing ourselves to death.
  • In 1998, as soon as we learned that President Bill Clinton had been fellated by an intern in the West Wing, his popularity spiked. Which was baffling only to those who still thought of politics as an autonomous realm, existing apart from entertainment
  • Just before the Clintons arrived in Washington, the right had managed to do away with the federal Fairness Doctrine, which had been enacted to keep radio and TV shows from being ideologically one-sided. Until then, big-time conservative opinion media had consisted of two magazines, William F. Buckley Jr.’s biweekly National Review and the monthly American Spectator, both with small circulations. But absent a Fairness Doctrine, Rush Limbaugh’s national right-wing radio show, launched in 1988, was free to thrive, and others promptly appeared.
  • I’m pretty certain that the unprecedented surge of UFO reports in the ’70s was not evidence of extraterrestrials’ increasing presence but a symptom of Americans’ credulity and magical thinking suddenly unloosed. We wanted to believe in extraterrestrials, so we did.
  • Limbaugh’s virtuosic three hours of daily talk started bringing a sociopolitical alternate reality to a huge national audience. Instead of relying on an occasional magazine or newsletter to confirm your gnarly view of the world, now you had talk radio drilling it into your head for hours every day.
  • Fox News brought the Limbaughvian talk-radio version of the world to national TV, offering viewers an unending and immersive propaganda experience of a kind that had never existed before.
  • Over the course of the century, electronic mass media had come to serve an important democratic function: presenting Americans with a single shared set of facts. Now TV and radio were enabling a reversion to the narrower, factional, partisan discourse that had been normal in America’s earlier centuries.
  • there was also the internet, which eventually would have mooted the Fairness Doctrine anyhow. In 1994, the first modern spam message was sent, visible to everyone on Usenet: global alert for all: jesus is coming soon. Over the next year or two, the masses learned of the World Wide Web. The tinder had been gathered and stacked since the ’60s, and now the match was lit and thrown
  • After the ’60s and ’70s happened as they happened, the internet may have broken America’s dynamic balance between rational thinking and magical thinking for good.
  • Before the web, cockamamy ideas and outright falsehoods could not spread nearly as fast or as widely, so it was much easier for reason and reasonableness to prevail. Before the web, institutionalizing any one alternate reality required the long, hard work of hundreds of full-time militants. In the digital age, however, every tribe and fiefdom and principality and region of Fantasyland—every screwball with a computer and an internet connection—suddenly had an unprecedented way to instruct and rile up and mobilize believers
  • Why did Senator Daniel Patrick Moynihan begin remarking frequently during the ’80s and ’90s that people were entitled to their own opinions but not to their own facts? Because until then, that had not been necessary to say
  • Reason remains free to combat unreason, but the internet entitles and equips all the proponents of unreason and error to a previously unimaginable degree. Particularly for a people with our history and propensities, the downside of the internet seems at least as profound as the upside.
  • On the internet, the prominence granted to any factual assertion or belief or theory depends on the preferences of billions of individual searchers. Each click on a link is effectively a vote pushing that version of the truth toward the top of the pile of results.
  • Exciting falsehoods tend to do well in the perpetual referenda, and become self-validating. A search for almost any “alternative” theory or belief seems to generate more links to true believers’ pages and sites than to legitimate or skeptical ones, and those tend to dominate the first few pages of results.
  • If more and more of a political party’s members hold more and more extreme and extravagantly supernatural beliefs, doesn’t it make sense that the party will be more and more open to make-believe in its politics?
  • an individual who enters the communications system pursuing one interest soon becomes aware of stigmatized material on a broad range of subjects. As a result, those who come across one form of stigmatized knowledge will learn of others, in connections that imply that stigmatized knowledge is a unified domain, an alternative worldview, rather than a collection of unrelated ideas.
  • Academic research shows that religious and supernatural thinking leads people to believe that almost no big life events are accidental or random. As the authors of some recent cognitive-science studies at Yale put it, “Individuals’ explicit religious and paranormal beliefs” are the best predictors of their “perception of purpose in life events”—their tendency “to view the world in terms of agency, purpose, and design.”
  • Americans have believed for centuries that the country was inspired and guided by an omniscient, omnipotent planner and interventionist manager. Since the ’60s, that exceptional religiosity has fed the tendency to believe in conspiracies.
  • Oliver and Wood found the single strongest driver of conspiracy belief to be belief in end-times prophecies.
  • People on the left are by no means all scrupulously reasonable. Many give themselves over to the appealingly dubious and the untrue. But fantastical politics have become highly asymmetrical. Starting in the 1990s, America’s unhinged right became much larger and more influential than its unhinged left. There is no real left-wing equivalent of Sean Hannity, let alone Alex Jones. Moreover, the far right now has unprecedented political power; it controls much of the U.S. government.
  • Why did the grown-ups and designated drivers on the political left manage to remain basically in charge of their followers, while the reality-based right lost out to fantasy-prone true believers?
  • One reason, I think, is religion. The GOP is now quite explicitly Christian
  • As the Syracuse University professor Michael Barkun saw back in 2003 in A Culture of Conspiracy, “such subject-specific areas as crank science, conspiracist politics, and occultism are not isolated from one another,” but rather they are interconnected. Someone seeking information on UFOs, for example, can quickly find material on antigravity, free energy, Atlantis studies, alternative cancer cures, and conspiracy.
  • Religion aside, America simply has many more fervid conspiracists on the right, as research about belief in particular conspiracies confirms again and again. Only the American right has had a large and organized faction based on paranoid conspiracism for the past six decades.
  • The right has had three generations to steep in this, its taboo vapors wafting more and more into the main chambers of conservatism, becoming familiar, seeming less outlandish. Do you believe that “a secretive power elite with a globalist agenda is conspiring to eventually rule the world through an authoritarian world government”? Yes, say 34 percent of Republican voters, according to Public Policy Polling.
  • starting in the ’90s, the farthest-right quarter of Americans, let’s say, couldn’t and wouldn’t adjust their beliefs to comport with their side’s victories and the dramatically new and improved realities. They’d made a god out of Reagan, but they ignored or didn’t register that he was practical and reasonable, that he didn’t completely buy his own antigovernment rhetoric.
  • Another way the GOP got loopy was by overdoing libertarianism
  • Republicans are very selective, cherry-picking libertarians: Let business do whatever it wants and don’t spoil poor people with government handouts; let individuals have gun arsenals but not abortions or recreational drugs or marriage with whomever they wish
  • For a while, Republican leaders effectively encouraged and exploited the predispositions of their variously fantastical and extreme partisans
  • Karl Rove was stone-cold cynical, the Wizard of Oz’s evil twin coming out from behind the curtain for a candid chat shortly before he won a second term for George W. Bush, about how “judicious study of discernible reality [is] … not the way the world really works anymore.” These leaders were rational people who understood that a large fraction of citizens don’t bother with rationality when they vote, that a lot of voters resent the judicious study of discernible reality. Keeping those people angry and frightened won them elections.
  • But over the past few decades, a lot of the rabble they roused came to believe all the untruths. “The problem is that Republicans have purposefully torn down the validating institutions,”
  • “They have convinced voters that the media cannot be trusted; they have gotten them used to ignoring inconvenient facts about policy; and they have abolished standards of discourse.”
  • What had been the party’s fantastical fringe became its middle. Reasonable Republicanism was replaced by absolutism: no new taxes, virtually no regulation, abolish the EPA and the IRS and the Federal Reserve.
  • The Christian takeover happened gradually, but then quickly in the end, like a phase change from liquid to gas. In 2008, three-quarters of the major GOP presidential candidates said they believed in evolution, but in 2012 it was down to a third, and then in 2016, just one did
  • A two-to-one majority of Republicans say they “support establishing Christianity as the national religion,” according to Public Policy Polling.
  • Although constitutionally the U.S. can have no state religion, faith of some kind has always bordered on mandatory for politicians.
  • What connects them all, of course, is the new, total American embrace of admixtures of reality and fiction and of fame for fame’s sake. His reality was a reality show before that genre or term existed
  • When he entered political show business, after threatening to do so for most of his adult life, the character he created was unprecedented—presidential candidate as insult comic with an artificial tan and ridiculous hair, shamelessly unreal and whipped into shape as if by a pâtissier.
  • Republicans hated Trump’s ideological incoherence—they didn’t yet understand that his campaign logic was a new kind, blending exciting tales with a showmanship that transcends ideology.
  • Trump waited to run for president until he sensed that a critical mass of Americans had decided politics were all a show and a sham. If the whole thing is rigged, Trump’s brilliance was calling that out in the most impolitic ways possible, deriding his straight-arrow competitors as fakers and losers and liars—because that bullshit-calling was uniquely candid and authentic in the age of fake.
  • Trump took a key piece of cynical wisdom about show business—the most important thing is sincerity, and once you can fake that, you’ve got it made—to a new level: His actual thuggish sincerity is the opposite of the old-fashioned, goody-goody sanctimony that people hate in politicians.
  • Trump’s genius was to exploit the skeptical disillusion with politics—there’s too much equivocating; democracy’s a charade—but also to pander to Americans’ magical thinking about national greatness. Extreme credulity is a fraternal twin of extreme skepticism.
  • Trump launched his political career by embracing a brand-new conspiracy theory twisted around two American taproots—fear and loathing of foreigners and of nonwhites.
  • The fact-checking website PolitiFact looked at more than 400 of his statements as a candidate and as president and found that almost 50 percent were false and another 20 percent were mostly false.
  • He gets away with this as he wouldn’t have in the 1980s or ’90s, when he first talked about running for president, because now factual truth really is just one option. After Trump won the election, he began referring to all unflattering or inconvenient journalism as “fake news.”
  • indeed, their most honest defense of his false statements has been to cast them practically as matters of religious conviction—he deeply believes them, so … there. When White House Press Secretary Sean Spicer was asked at a press conference about the millions of people who the president insists voted illegally, he earnestly reminded reporters that Trump “has believed that for a while” and “does believe that” and it’s “been a long-standing belief that he’s maintained” and “it’s a belief that he has maintained for a while.”
  • Which is why nearly half of Americans subscribe to that preposterous belief themselves. And in Trump’s view, that overrides any requirement for facts.
  • The idea that progress has some kind of unstoppable momentum, as if powered by a Newtonian law, was always a very American belief. However, it’s really an article of faith, the Christian fantasy about history’s happy ending reconfigured during and after the Enlightenment as a set of modern secular fantasies.
  • I really can imagine, for the first time in my life, that America has permanently tipped into irreversible decline, heading deeper into Fantasyland. I wonder whether it’s only America’s destiny, exceptional as ever, to unravel in this way. Or maybe we’re just early adopters, the canaries in the global mine
  • I do despair of our devolution into unreason and magical thinking, but not everything has gone wrong.
  • I think we can slow the flood, repair the levees, and maybe stop things from getting any worse. If we’re splitting into two different cultures, we in reality-based America—whether the blue part or the smaller red part—must try to keep our zone as large and robust and attractive as possible for ourselves and for future generations
  • We need to firmly commit to Moynihan’s aphorism about opinions versus facts. We must call out the dangerously untrue and unreal
  • do not give acquaintances and friends and family members free passes. If you have children or grandchildren, teach them to distinguish between true and untrue as fiercely as you do between right and wrong and between wise and foolish.
  • How many Americans now inhabit alternate realities?
  • reams of survey research from the past 20 years reveal a rough, useful census of American credulity and delusion. By my reckoning, the solidly reality-based are a minority, maybe a third of us but almost certainly fewer than half.
  • Only a third of us, for instance, don’t believe that the tale of creation in Genesis is the word of God. Only a third strongly disbelieve in telepathy and ghosts. Two-thirds of Americans believe that “angels and demons are active in the world.”
  • A third of us believe not only that global warming is no big deal but that it’s a hoax perpetrated by scientists, the government, and journalists. A third believe that our earliest ancestors were humans just like us; that the government has, in league with the pharmaceutical industry, hidden evidence of natural cancer cures; that extraterrestrials have visited or are visiting Earth.
Javier E

Were American Indians the Victims of Genocide? | History News Network

  • It is a firmly established fact that a mere 250,000 native Americans were still alive in the territory of the United States at the end of the 19th century
  • Still in scholarly contention, however, is the number of Indians alive at the time of first contact with Europeans.
  • The disparity in estimates is enormous. In 1928, the ethnologist James Mooney proposed a total count of 1,152,950 Indians in all tribal areas north of Mexico at the time of the European arrival. By 1987, in American Indian Holocaust and Survival, Russell Thornton was giving a figure of well over 5 million, nearly five times as high as Mooney’s, while Lenore Stiffarm and Phil Lane, Jr. suggested a total of 12 million. That figure rested in turn on the work of the anthropologist Henry Dobyns, who in 1983 had estimated the aboriginal population of North America as a whole at 18 million and of the present territory of the United States at about 10 million.
  • About all this there is no essential disagreement. The most hideous enemy of native Americans was not the white man and his weaponry, concludes Alfred Crosby, "but the invisible killers which those men brought in their blood and breath." It is thought that between 75 and 90 percent of all Indian deaths resulted from these killers.
  • As an example of actual genocidal conditions, Stannard points to Franciscan missions in California as "furnaces of death."
  • The missionaries had a poor understanding of the causes of the diseases that afflicted their charges, and medically there was little they could do for them. By contrast, the Nazis knew exactly what was happening in the ghettos, and quite deliberately deprived the inmates of both food and medicine; unlike in Stannard's "furnaces of death," the deaths that occurred there were meant to occur.
  • True, too, some colonists later welcomed the high mortality among Indians, seeing it as a sign of divine providence; that, however, does not alter the basic fact that Europeans did not come to the New World in order to infect the natives with deadly diseases.
  • But Chardon's journal manifestly does not suggest that the U.S. Army distributed infected blankets, instead blaming the epidemic on the inadvertent spread of disease by a ship's passenger. And as for the "100,000 fatalities," not only does Thornton fail to allege such obviously absurd numbers, but he too points to infected passengers on the steamboat St. Peter's as the cause. Another scholar, drawing on newly discovered source material, has also refuted the idea of a conspiracy to harm the Indians.
  • Similarly at odds with any such idea is the effort of the United States government at this time to vaccinate the native population. Smallpox vaccination, a procedure developed by the English country doctor Edward Jenner in 1796, was first ordered in 1801 by President Jefferson; the program continued in force for three decades, though its implementation was slowed both by the resistance of the Indians, who suspected a trick, and by lack of interest on the part of some officials. Still, as Thornton writes: "Vaccination of American Indians did eventually succeed in reducing mortality from smallpox."
  • To sum up, European settlers came to the New World for a variety of reasons, but the thought of infecting the Indians with deadly pathogens was not one of them. As for the charge that the U.S. government should itself be held responsible for the demographic disaster that overtook the American-Indian population, it is unsupported by evidence or legitimate argument.
  • Still, even if up to 90 percent of the reduction in Indian population was the result of disease, that leaves a sizable death toll caused by mistreatment and violence. Should some or all of these deaths be considered instances of genocide?
  • Despite the colonists' own resort to torture in order to extract confessions, the cruelty of these practices strengthened the belief that the natives were savages who deserved no quarter
  • A second famous example from the colonial period is King Philip’s War (1675-76).
  • The war was also merciless, on both sides. At its outset, a colonial council in Boston had declared "that none be Killed or Wounded that are Willing to surrender themselves into Custody."
  • But these rules were soon abandoned on the grounds that the Indians themselves, failing to adhere either to the laws of war or to the law of nature, would "skulk" behind trees, rocks, and bushes rather than appear openly to do "civilized" battle. Similarly creating a desire for retribution were the cruelties perpetrated by Indians when ambushing English troops or overrunning strongholds housing women and children.
  • Before long, both colonists and Indians were dismembering corpses and displaying body parts and heads on poles. (Nevertheless, Indians could not be killed with impunity. In the summer of 1676, four men were tried in Boston for the brutal murder of three squaws and three Indian children; all were found guilty and two were executed.)
  • In 1704, this was amended in the direction of "Christian practice" by means of a scale of rewards graduated by age and sex; bounty was proscribed in the case of children under the age of ten, subsequently raised to twelve (sixteen in Connecticut, fifteen in New Jersey). Here, too, genocidal intent was far from evident; the practices were justified on grounds of self-preservation and revenge, and in reprisal for the extensive scalping carried out by Indians.
  • As the United States expanded westward, such conflicts multiplied. So far had things progressed by 1784 that, according to one British traveler, "white Americans have the most rancorous antipathy to the whole race of Indians; and nothing is more common than to hear them talk of extirpating them totally from the face of the earth, men, women, and children."
  • To force the natives into submission, Generals Sherman and Sheridan, who for two decades after the Civil War commanded the Indian-fighting army units on the Plains, applied the same strategy they had used so successfully in their marches across Georgia and in the Shenandoah Valley. Unable to defeat the Indians on the open prairie, they pursued them to their winter camps, where numbing cold and heavy snows limited their mobility. There they destroyed the lodges and stores of food, a tactic that inevitably resulted in the deaths of women and children.
  • Genocide? These actions were almost certainly in conformity with the laws of war accepted at the time. The principles of limited war and of noncombatant immunity had been codified in Francis Lieber's General Order No. 100, issued for the Union Army on April 24, 1863. But the villages of warring Indians who refused to surrender were considered legitimate military objectives.
  • According to Article II of the convention, the crime of genocide consists of a series of acts "committed with intent to destroy, in whole or in part, a national, ethnical, racial, or religious group as such" (emphases added). Practically all legal scholars accept the centrality of this clause.
  • During the deliberations over the convention, some argued for a clear specification of the reasons, or motives, for the destruction of a group. In the end, instead of a list of such motives, the issue was resolved by adding the words "as such"—i.e., the motive or reason for the destruction must be the ending of the group as a national, ethnic, racial, or religious entity. Evidence of such a motive, as one legal scholar put it, "will constitute an integral part of the proof of a genocidal plan, and therefore of genocidal intent."
  • The crucial role played by intentionality in the Genocide Convention means that under its terms the huge number of Indian deaths from epidemics cannot be considered genocide.
  • By contrast, some of the massacres in California, where both the perpetrators and their supporters openly acknowledged a desire to destroy the Indians as an ethnic entity, might indeed be regarded under the terms of the convention as exhibiting genocidal intent.
  • The convention does not address the question of what percentage of a group must be affected in order to qualify as genocide. As a benchmark, the prosecutor of the International Criminal Tribunal for the Former Yugoslavia has suggested "a reasonably significant number, relative to the total of the group as a whole," adding that the actual or attempted destruction should also relate to "the factual opportunity of the accused to destroy a group in a specific geographic area within the sphere of his control, and not in relation to the entire population of the group in a wider geographic sense."
  • If this principle were adopted, an atrocity like the Sand Creek massacre, limited to one group in a specific single locality, might also be considered an act of genocide.
  • Applying today’s standards to events of the past raises still other questions, legal and moral alike. While history has no statute of limitations, our legal system rejects the idea of retroactivity (ex post facto laws).
  • No doubt, the 19th-century idea of America’s "manifest destiny" was in part a rationalization for acquisitiveness, but the resulting dispossession of the Indians was as unstoppable as other great population movements of the past. The U.S. government could not have prevented the westward movement even if it had wanted to.
  • Morally, even if we accept the idea of universal principles transcending particular cultures and periods, we must exercise caution in condemning, say, the conduct of war during America’s colonial period, which for the most part conformed to then-prevailing notions of right and wrong.
  • The real task, then, is to ascertain the context of a specific situation and the options it presented. Given the circumstances and the moral standards of the day, did the people on whose conduct we are sitting in judgment have a choice to act differently?
  • Finally, even if some episodes can be considered genocidal—that is, tending toward genocide—they certainly do not justify condemning an entire society
  • Guilt is personal, and for good reason the Genocide Convention provides that only "persons" can be charged with the crime, probably even ruling out legal proceedings against governments.
  • …noncombatants incidentally and accidentally, not purposefully." As for the larger society, even if some elements in the white population, mainly in the West, at times advocated extermination, no official of the U.S. government ever seriously proposed it. Genocide was never American policy, nor was it the result of policy.
  • The violent collision between whites and America's native population was probably unavoidable.
  • To understand all is hardly to forgive all, but historical judgment, as the scholar Gordon Leff has correctly stressed, "must always be contextual: it is no more reprehensible for an age to have lacked our values than to have lacked forks."
  • In the end, the sad fate of America's Indians represents not a crime but a tragedy, involving an irreconcilable collision of cultures and values.
  • Despite the efforts of well-meaning people in both camps, there existed no good solution to this clash. The Indians were not prepared to give up the nomadic life of the hunter for the sedentary life of the farmer. The new Americans, convinced of their cultural and racial superiority, were unwilling to grant the original inhabitants of the continent the vast preserve of land required by the Indians’ way of life.
  • To fling the charge of genocide at an entire society serves neither the interests of the Indians nor those of history.
Javier E

Op-Ed Columnist - The Genteel Nation - NYTimes.com

  • sometime around 1800, economic growth took off — in Britain first, then elsewhere. How did this growth start? In his book “The Enlightened Economy,” Joel Mokyr of Northwestern University argues that the crucial change happened in people’s minds. Because of a series of cultural shifts, technicians started taking scientific knowledge and putting it to practical use.
  • Britain soon dominated the world. But then it declined. Again, the crucial change was in people’s minds. As the historian Correlli Barnett chronicled, the great-great-grandchildren of the empire builders withdrew from commerce, tried to rise above practical knowledge and had more genteel attitudes about how to live.
  • 65 percent of Americans believe their nation is now in decline, according to this week’s NBC/Wall Street Journal poll. And it is true
  • The first lesson from the economic historians is that we should try to understand our situation by looking for shifts in ideas and values, not just material changes.
  • After decades of affluence, the U.S. has drifted away from the hardheaded practical mentality that built the nation’s wealth in the first place. The shift is evident at all levels of society. First, the elites. America’s brightest minds have been abandoning industry and technical enterprise in favor of more prestigious but less productive fields like law, finance, consulting and nonprofit activism.
  • Then there’s the middle class. The emergence of a service economy created a large population of junior and midlevel office workers. These white-collar workers absorbed their lifestyle standards from the Huxtable family of “The Cosby Show,” not the Kramden family of “The Honeymooners.” As these information workers tried to build lifestyles that fit their station, consumption and debt levels soared. The trade deficit exploded. The economy adjusted to meet their demand — underinvesting in manufacturing and tradable goods and overinvesting in retail and housing.
  • Finally, there’s the lower class. The problem here is social breakdown. Something like a quarter to a third of American children are living with one or no parents, in chaotic neighborhoods with failing schools. A gigantic slice of America’s human capital is vastly underused, and it has been that way for a generation.
  • Most people who lived in the year 1800 were scarcely richer than people who lived in the year 100,000 B.C. Their diets were no better. They were no taller, and they did not live longer.
  • the value shifts are real. Up and down society, people are moving away from commercial, productive activities and toward pleasant, enlightened but less productive ones.
Javier E

The Three Degrees of Racism in America - The Atlantic

  • Thirty years ago, my dad gave me his playbook to put racism to rest, and it inspired me to dedicate my career to executing his vision
  • Dad’s playbook included one insight that all Americans should hear, at least those who hope that when it comes to addressing racism, we can do better. As an economist, he told me that we have to “increase the cost of racist behavior.” Doing so, he said, would create the conditions for black people to harness the economic power essential to changing the narrative in white America’s mind about race.
  • To be outed as a racist is to be convicted of America’s highest moral crime. Once we align on what racist behavior looks like, we can make those behaviors costly.
  • The first step is to clarify what constitutes racist behavior. Defining it makes denying it or calling it something else that much harder.
  • Then there is opposing or turning one’s back on anti-racism efforts, often justified by the demonization of the people courageously tackling racist behavior. I call this racism in the second degree, akin to aiding and abetting
  • This is racism in the first degree.
  • The final, most pernicious category undergirds the everyday black experience. When employers, educational institutions, and governmental entities do not unwind practices that disadvantage people of color in the competition with whites for economic and career mobility, that is fundamentally racist
  • Companies that sign on will be recognized and celebrated. Senior management teams that decline to take these basic steps will no longer be able to hide, and they will struggle to recruit and retain top talent of all colors who will prefer firms that have signed on
  • Organizations cannot be meritocracies if their small number of black employees spend a third of their mental bandwidth in every meeting of every day distracted by questions of race and outcomes. Why are there not more people like me? Am I being treated differently?
  • This dimension of racism is particularly hard to root out, because many of our most enlightened white leaders do not even realize what they are doing. This is racism in the third degree, akin to involuntary manslaughter: We are not trying to hurt anyone, but we create the conditions that shatter somebody else’s future aspirations.
  • Eliminating third-degree racism is the catalyst to expanding economic power for people of color, so it merits focus at the most senior levels of education, government, and business.
  • I have not seen 10 diversity plans that have the foundational elements that organizations require everywhere else: a fact-based diagnosis of the underlying problems, quantifiable goals, prioritized areas for investment, interim progress metrics, and clear accountability for execution
  • We can increase the cost of this behavior by calling on major employers to sign on to basic practices that demonstrate that black lives matter to them. These include: (1) acknowledging what constitutes third-degree racism so there is no hiding behind a lack of understanding or fuzzy math, (2) committing to developing and executing diversity plans that meet a carefully considered and externally defined standard of rigor, and (3) delivering outcomes in which people of color have the same opportunities to advance.
  • The most well-understood dimension involves taking actions that people of color view as overtly prejudiced—policing black citizens much differently than whites, calling the police on a black bird-watcher.
  • Rooting out third-degree racism is what will ultimately change the narrative about race. When white people see more black people on the same path as they are, when white people are working in diverse organizations, and when they are proximate to black leaders beyond athletes and entertainers, only then will they stop fearing and feeling superior to the black people they don’t know.
  • When these executives are challenged on hiring practices, their first excuse is always “The pipeline of qualified candidates is too small, so we can only do so much right now.” Over the past 20 years, I have not once heard an executive follow up the “pipeline is too small” defense with a quantitative analysis of that pipeline
  • Employers whose efforts to increase diversity lack the same analytical and executional rigor that is taken for granted in every other part of their business engage in practices that disadvantage black people in the competition for economic opportunity. By default, this behavior protects white people’s positions of power.
Javier E

The average doctor in the U.S. makes $350,000 a year. Why? - The Washington Post

  • The average U.S. physician earns $350,000 a year. Top doctors pull in 10 times that.
  • The figures are nigh-on unimpeachable. They come from a working paper, newly updated, that analyzes more than 10 million tax records from 965,000 physicians over 13 years. The talented economist-authors also went to extreme lengths to protect filers’ privacy, as is standard for this type of research.
  • By accounting for all streams of income, they revealed that doctors make more than anyone thought — and more than any other occupation we’ve measured. In the prime earning years of 40 to 55, the average physician made $405,000 in 2017 — almost all of it (94 percent) from wages
  • Doctors in the top 10 percent averaged $1.3 million
  • And those in the top 1 percent averaged an astounding $4 million, though most of that (85 percent) came from business income or capital gains.
  • In certain specialties, doctors see substantially more in their peak earning years: Neurosurgeons (about $920,000), orthopedic surgeons ($789,000) and radiation oncologists ($709,000) all did especially well for themselves. Specialty incomes cover 2005 to 2017 and are expressed in 2017 dollars.
  • family-practice physicians made around $230,000 a year. General practice ($225,000) and preventive-medicine ($224,000) doctors earned even less — though that’s still enough to put them at the top of the heap among all U.S. earners.
  • “There is this sense of, well, if you show that physician incomes put them at the top of the income distribution, then you’re somehow implying that they’re instead going into medicine because they want to make money. And that narrative is uncomfortable to people.”
  • why did those figures ruffle so many physician feathers?
  • “You can want to help people and you can simultaneously want to earn money and have a nicer lifestyle and demand compensation for long hours and long training. That’s totally normal behavior in the labor market.”
  • Yale University economist Jason Abaluck notes that when he asks the doctors and future doctors in his health economics classes why they earn so much, answers revolve around the brutal training required to enter the profession. “Until they finish their residency, they’re working an enormous number of hours and their lifestyle is not the lifestyle of a rich person,” Abaluck told us.
  • why do physicians make that much?
  • On average, doctors — much like anyone else — behave in ways that just happen to drive up their income. For example, the economists found that graduates from the top medical schools, who can presumably write their own ticket to any field they want, tend to choose those that pay the most.
  • “Our analysis shows that certainly physicians respond to earnings when choosing specialties,” Polyakova told us. “And there’s nothing wrong with that, in my opinion.”
  • “In general, U.S. physicians are making about 50 percent more than German physicians and more than twice as much as U.K. physicians,”
  • Grover said the widest gaps were “really driven by surgeons and a handful of procedural specialties,” doctors who perform procedures with clear outcomes, rather than preventing disease or treating chronic conditions.
  • “we’re not about prevention, you know?” he said, noting that his own PhD is in public health. “I wish it was different, but it ain’t!”
  • The United States has fewer doctors per person than 27 out of 31 member countries tracked by the Organization for Economic Cooperation and Development
  • In 1970, based on a slightly different measure that’s been tracked for longer, America had more licensed physicians per person than all but two of the 10 countries for which we have data. What caused the collapse?
  • the United States has far fewer residency slots than qualified med school graduates, which means thousands of qualified future physicians are annually shut out of the residency pipeline, denied their chosen career and stuck with no way to pay back those quarter-million-dollar loans.
  • “I’d like to see an in-depth analysis of the effect of the government capping the number of residency spots and how it’s created an artificial ‘physician shortage’ even though we have thousands of talented and graduated doctors that can’t practice due to not enough residency spots,”
  • Such an analysis would begin with a deeply influential 1980 report,
  • That report, by a federal advisory committee tasked with ensuring the nation had neither too few nor too many doctors, concluded that America was barreling toward a massive physician surplus. It came out just before President Ronald Reagan took office, and the new administration seemed only too eager to cut back on federal spending on doctor-training systems.
  • The Association of American Medical Colleges (AAMC), a coalition of MD-granting medical schools and affiliated teaching hospitals, slammed the brakes on a long expansion. From 1980 to around 2004, the number of medical grads flatlined, even as the American population rose 29 percent.
  • Federal support for residencies was also ratcheted down, making it expensive or impossible for hospitals to provide enough slots for all the medical school graduates hitting the market each year. That effort peaked with the 1997 Balanced Budget Act, which, among other things, froze funding for residencies, partially under the flawed assumption that HMOs would forever reduce the need for medical care in America, Orr writes. That freeze has yet to fully unwind.
  • For decades, many policymakers believed more doctors caused higher medical spending. Orr says that’s partly true, but “the early studies failed to differentiate between increased availability of valuable medical services and unnecessary treatment and services.”
  • “In reality, the greater utilization in places with more doctors represented greater availability, both in terms of expanded access to primary care and an ever-growing array of new and more advanced medical services,” he writes. “The impact of physician supply on levels of excessive treatment appears to be either small or nonexistent.”
  • “People have a narrative that physician earnings is one of the main drivers of high health-care costs in the U.S.,” Polyakova told us. “It is kind of hard to support this narrative if ultimately physicians earn less than 10 percent of national health-care expenditures.”
  • Polyakova and her collaborators find doctor pay consumes only 8.6 percent of overall health spending. It grew a bit faster than inflation over the time period studied, but much slower than overall health-care costs.
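  • A quick back-of-the-envelope check makes that point concrete (an editorial sketch using the 8.6 percent figure above; the 20 percent pay cut is a hypothetical, not something the study models). If physician pay is 8.6 percent of total health spending, then even a steep across-the-board cut barely moves the total:

$$\Delta\,\text{spending} \approx 0.086 \times 0.20 \approx 0.017$$

That is, cutting every doctor's pay by 20 percent would trim national health expenditures by only about 1.7 percent.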
  • Regardless, the dramatic limits on medical school enrollment and residencies enjoyed strong support from the AAMC and the AMA. We were surprised to hear both organizations now sound the alarm about a doctor shortage. MD-granting medical schools started expanding again in 2005.
  • it’s because states have responded to the shortage by empowering nurse practitioners and physician assistants to perform tasks that once were the sole province of physicians. Over the past 20 years, the number of registered nurses grew almost twice as quickly as the number of doctors, and the number of physician assistants grew almost three times as rapidly, our analysis showed.
  • While there still aren’t enough residency positions, we’re getting more thanks in part to recent federal spending bills that will fund 1,200 more slots over the next few years.
Javier E

What's Left for Tech? - Freddie deBoer - 0 views

  • I gave a talk to a class at Northeastern University earlier this month, concerning technology, journalism, and the cultural professions. The students were bright and inquisitive, though they also reflected the current dynamic in higher ed overall - three quarters of the students who showed up were women, and the men who were there almost all sat moodily in the back and didn’t engage at all while their female peers took notes and asked questions. I know there’s a lot of criticism of the “crisis for boys” narrative, but it’s often hard not to believe in it.
  • we’re actually living in a period of serious technological stagnation - that despite our vague assumption that we’re entitled to constant remarkable scientific progress, humanity has been living with real and valuable but decidedly small-scale technological growth for the past 50 or 60 or 70 years, after a hundred or so years of incredible growth from 1860ish to 1960ish, give or take a decade or two on either side
  • I will recommend Robert J. Gordon’s The Rise & Fall of American Growth for an exhaustive academic (and primarily economic) argument to this effect. Gordon persuasively demonstrates that from the mid-19th to mid-20th century, humanity leveraged several unique advancements that had remarkably outsized consequences for how we live and changed our basic existence in a way that never happened before and hasn’t since. Principal among these advances were the process of refining fossil fuels and using them to power all manner of devices and vehicles, the ability to harness electricity and use it to safely provide energy to homes (which practically speaking required the first development), and a revolution in medicine that came from the confluence of long-overdue acceptance of germ theory and basic hygienic principles, the discovery and refinement of antibiotics, and the modernization of vaccines.
  • ...24 more annotations...
  • The complication that Gordon and other internet-skeptical researchers like Ha-Joon Chang have introduced is to question just how meaningful those digital technologies have been for a) economic growth and b) the daily experience of human life. It can be hard for people who stare at their phones all day to consider the possibility that digital technology just isn’t that important. But ask yourself: if you were forced to live either without your iPhone or without indoor plumbing, could you really choose the latter?
  • Certainly the improvements in medical care in the past half-century feel very important to me as someone living now, and one saved life has immense emotional and practical importance for many people. What’s more, advances in communication sciences and computer technology genuinely have been revolutionary; going from the Apple II to the iPhone in 30 years is remarkable.
  • we can always debate what constitutes major or revolutionary change
  • Why is Apple going so hard on TITANIUM? Well, where else does smartphone development have to go?
  • continued improvements in worldwide mortality in the past 75 years have been a matter of spreading existing treatments and practices to the developing world, rather than the result of new science.
  • When you got your first smartphone, and you thought about what the future would hold, were your first thoughts about more durable casing? I doubt it. I know mine weren’t.
  • The question is, who in 2023 ever says to themselves “smartphone cameras just aren’t good enough”?
  • The elephant in the room, obviously, is AI.
  • The processors will get faster. They’ll add more RAM. They’ll generally have more power. But for what? To run what? To do what? To run the games that we were once told would replace our PlayStation and Xbox games, but didn’t?
  • Smartphone development has been a good object lesson in the reality that cool ideas aren’t always practical or worthwhile
  • There was, in those breathless early days, a lot of talk about how people simply wouldn’t own laptops anymore, how your phone would do everything. But it turns out that, for one thing, the keyboard remains an input device of unparalleled convenience and versatility.
  • We developed this technology for typewriters and terminals and desktops, it Just Works, and there’s no reason to try and “disrupt” it
  • Instead of one device to rule them all, we developed a norm of syncing across devices and cloud storage, which works well. (I always thought it was pretty funny, and very cynical, how Apple went from calling the iPhone an everything device to later marketing the iPad and iWatch.) In other words, we developed a software solution rather than a hardware one
  • I will always give it up to Google Maps and portable GPS technology; that’s genuinely life-altering, probably the best argument for smartphones as a transformative technology. But let me ask you, honestly: do you still go out looking for apps, with the assumption that you’re going to find something that really changes your life in a significant way?
  • some people are big VR partisans. I’m deeply skeptical. The brutal failure of Meta’s new “metaverse” is just one new example of a decades-long resistance to the technology among consumers
  • maybe I just don’t want VR to become popular, given the potential ugly social consequences. If you thought we had an incel problem now….
  • And as impressive as some new developments in medicine have been, there’s no question that in simple terms of reducing preventable deaths, the advances seen from 1900 to 1950 dwarf those seen since.
  • It’s not artificial intelligence. It thinks nothing like a human thinks. There is no reason whatsoever to believe that it has evolved sentience or consciousness. There is nothing at present that these systems can do that human beings simply can’t. But they can potentially do some things in the world of bits faster and cheaper than human beings, and that might have some meaningful consequences. But there is no reasonable, responsible claim to be made that these systems are imminent threats to conventional human life as currently lived, whether for good or for bad. IMO.
  • Let’s mutually agree to consider immediate plausible human technological progress outside of AI or “AI.” What’s coming? What’s plausible?
  • The most consequential will be our efforts to address climate change, and we have the potential to radically change how we generate electricity, although electrifying heating and transportation are going to be harder than many seem to think, while solar and wind power have greater ecological costs than people want to admit. But, yes, that’s potentially very very meaningful
  • It’s another example of how technological growth will still leave us with continuity rather than with meaningful change.
  • What I kept thinking was: privatizing space… to do what? A manned Mars mission might happen in my lifetime, which is cool. But a Mars colony is a distant dream
  • This is why I say we live in the Big Normal, the Big Boring, the Forever Now. We are tragic people: we were born just too late to experience the greatest flowering of human development the world has ever seen. We do, however, enjoy the rather hefty consolation prize that we get to live with the affordances of that period, such as not dying of smallpox.
  • I think we all need to learn to appreciate what we have now, in the world as it exists, at the time in which we actually live. Frankly, I don’t think we have any other choice.
Javier E

The Orthodox Surge - NYTimes.com - 0 views

  • In the New York City area, for example, the Orthodox make up 32 percent of Jews over all. But the Orthodox make up 61 percent of Jewish children. Because the Orthodox are so fertile, in a few years, they will be the dominant group in New York Jewry.
  • For the people who shop at Pomegranate, the collective covenant with God is the primary reality and obedience to the laws is the primary obligation. They go shopping like the rest of us, but their shopping is minutely governed by an external moral order.
  • The laws, in this view, make for a decent society. They give structure to everyday life. They infuse everyday acts with spiritual significance. They build community. They regulate desires. They moderate religious zeal, making religion an everyday practical reality.
  • ...4 more annotations...
  • The laws are gradually internalized through a system of lifelong study, argument and practice. The external laws may seem, at first, like an imposition, but then they become welcome and finally seem like a person’s natural way of being.
  • At first piano practice seems like drudgery, like self-limitation, but mastering the technique gives you the freedom to play well and create new songs. Life is less a journey than it is mastering a discipline or craft.
  • there are still obligations that precede choice. For example, a young person in mainstream America can choose to marry or not. In Orthodox society, young adults have an obligation to marry and perpetuate the covenant and it is a source of deep sadness when they cannot.
  • “Marriage is about love, but it is not first and foremost about love,” Soloveichik says. “First and foremost, marriage is about continuity and transmission.”
Javier E

U.S. Practiced Torture After 9/11, Nonpartisan Review Concludes - NYTimes.com - 0 views

  • A nonpartisan, independent review of interrogation and detention programs in the years after the Sept. 11, 2001, terrorist attacks concludes that “it is indisputable that the United States engaged in the practice of torture” and that the nation’s highest officials bore ultimate responsibility for it.
martinde24

Justice Dept. report finds 'pattern or practice' of excessive force by Chicago police - 0 views

  • Chicago police violated the Fourth Amendment through a pattern or practice of use of excessive force, Attorney General Loretta Lynch announced Friday, revealing the results of a wide-ranging investigation that the city's former top cop called biased from the start.
Javier E

Pressure Builds to Finish Volcker Rule on Wall St. Oversight - NYTimes.com - 0 views

  • From the outset, the Volcker Rule was the product of compromise. The Obama administration declined to favor legislation forcing banks to spin off their turbulent Wall Street operations from their deposit-taking businesses. At the same time, it did not want regulated banks, which enjoy deposit insurance and other forms of government support, trading for their own profit. That business, known as proprietary trading, had long been lucrative, albeit risky, for Wall Street banks.
  • Paul A. Volcker, a former chairman of the Federal Reserve who served as an adviser to President Obama, urged that Dodd-Frank outlaw proprietary trading. And over the objections of Wall Street, the administration inserted into Dodd-Frank what became known as the Volcker Rule.
  • The rule, however, does not ban types of trading that are thought to be part of a bank’s basic business. Banks can still buy stocks and bonds for their clients — a practice called market making — and place trades that are meant to hedge their risks.
  • ...4 more annotations...
  • For regulators, the headache comes with finding practical ways to distinguish proprietary trading from the more legitimate practices. If they wrote the exemptions for market making and hedging too loosely, the banks might find loopholes. If they made them too strict, banks might not be able to engage in activities that Congress had said were permissible.
  • The final version is expected to contain a provision that requires bank chief executives to attest that they are not doing proprietary trading, officials say,  a victory for the rule’s supporters. The tougher version of this provision would have a chief executive make this certification in the bank’s public securities filings, which are audited and are expected to have a high degree of accuracy. A more modest version would have the executive attest to a bank’s board of directors.
  • The Volcker Rule also addresses traders’ compensation. The final wording is likely to require that traders engaged in market making and hedging not be paid on the basis of simply how much money their units made. Instead, the risks involved in taking positions would also have to be considered.
  • Since the Volcker Rule was first proposed in 2011, regulators have had to contend with a fierce lobbying campaign by the banks. But that effort lost momentum last year, after JPMorgan’s trading debacle revealed that its traders were placing enormous speculative bets under the guise of hedging.
grayton downing

BBC News - Merkel calls Obama about 'US spying on her phone' - 0 views

  • German Chancellor Angela Merkel has called US President Barack Obama after receiving information that the US may have spied on her mobile phone.
  • "views such practices... as completely unacceptable".
  • The White House said President Obama had told Chancellor Merkel the US was not snooping on her communications.
  • ...6 more annotations...
  • Mr Carney told reporters that Washington was examining concerns from Germany as well as France and other American allies over US intelligence practices.
  • Berlin demanded "an immediate and comprehensive explanation" from Washington about what it said "would be a serious breach of trust".
  • The statement also said that Mrs Merkel had told Mr Obama: "Such practices must be prevented immediately."
  • The German government would not elaborate on how it received the tip about the alleged US spying.
  • A number of US allies have expressed anger over the Snowden-based spying allegations
  • The Mexican government has described the alleged spying on the emails of two presidents, Enrique Pena Nieto - the incumbent - and Felipe Calderon, as "unacceptable".
Javier E

Google's Steely Foe in Europe - NYTimes.com - 0 views

  • Of her approach to her new job, she added: “Consumers depend on us to make sure that competition is fair and open, and it’s my responsibility to make that happen.”
  • The overarching issue is whether Google abused its market dominance. In some countries in Europe, Google has a 90 percent or larger market share, giving it greater dominance than in the United States.
  • Ms. Vestager’s predecessor, Joaquín Almunia, had pursued a wide-ranging investigation into the company’s practices. But he tried and failed three times to reach a settlement with Google.
  • ...5 more annotations...
  • “It was obvious that a negotiated solution was not a possibility,” Ms. Vestager said in the phone interview. “So I felt we should go in another direction.”
  • That direction was filing formal charges, called a statement of objection, accusing Google of favoring its own comparison shopping service, called Google Shopping. In practical terms, the commission found that when a consumer used Google to search for shopping-related information, the site systematically displayed the company’s own comparison product at the top of the search results — “irrespective of whether it is the most relevant response to the query,” Ms. Vestager said in a commission-issued statement about the charges.
  • In addition to the formal complaint related to Google Shopping, Ms. Vestager said her office was still looking into accusations that Google had restricted its advertising partners from using rival platforms and that it scraped online content from competitors. She also announced a separate “in-depth investigation” into accusations of anticompetitive company practices regarding Google’s relationships with device manufacturers that rely on its Android operating system.
  • Longtime observers of Ms. Vestager theorized that she had chosen to initially pursue a narrow case in which she had the most confidence, while keeping pressure on her adversary to settle by opening parallel lines of inquiry.
  • “It’s about power. Any deal she makes, it’s about how much power she has and how much power her adversary has,” says Martin Krasnik, the host of a late-night current affairs show on Danish national television, who describes Ms. Vestager as the most impenetrable politician he has ever interviewed.
lenaurick

Why Republicans are debating bringing back torture - Vox - 0 views

  • Several Republicans have suggested that they'd be open to torturing suspected terrorists if elected — especially New Hampshire primary winner Donald Trump.
  • "Waterboarding is fine, and much tougher than that is fine," Trump said at a Monday campaign event in New Hampshire. "When we're with these animals, we can't be soft and weak, like our politicians."
  • Previously, Trump promised to "bring back" types of torture "a hell of a lot worse than waterboarding" during Saturday's Republican debate. The rest of the GOP field took a somewhat more nuanced position. Marco Rubio categorically refused to rule out any torture techniques, for fear of helping terrorists "practice how to evade us."
  • ...13 more annotations...
  • This debate doesn't have much to do with the merits of torture as an intelligence-gathering mechanism: The evidence that torture doesn't work is overwhelming. Rather, the debate among four leading Republicans over the practice is all about politics, both inside the Republican Party and more broadly.
  • Cruz, for example, has said that waterboarding does not constitute torture, but also that he would not "bring it back in any sort of widespread use" and has co-sponsored legislation limiting its use.
  • Well, under the definition of torture, no, it's not. Under the law, torture is excruciating pain that is equivalent to losing organs and systems, so under the definition of torture, it is not. It is enhanced interrogation, it is vigorous interrogation, but it does not meet the generally recognized definition of torture.
  • international law, under both the UN Convention Against Torture and the Geneva Conventions, considers waterboarding a form of torture and thus illegal.
  • A January 2005 Gallup poll found that 82 percent of Americans believed "strapping prisoners on boards and forcing their heads underwater until they think they are drowning" was an immoral interrogation tactic.
  • In 2007, 40 percent of Americans favored waterboarding suspected terrorists in a CNN poll, while 58 percent opposed. By 2014, 49 percent told CBS that they believed waterboarding could be at least sometimes justified, while only 36 percent said it never could be.
  • Today, 73 percent of Republicans support torturing suspected terrorists, according to Pew.
  • Any Republican who took a strong stance against waterboarding or other torture techniques could be pegged as weak on terrorism — a damning charge in a Republican primary that's been preoccupied with ISIS.
  • Reminder: Torture is morally abhorrent and also doesn't work
  • Some proponents will claim that while morally regrettable, torture is nonetheless necessary to keep us safe. But the best evidence suggests that this is a false choice: Waterboarding and other forms of torture do not work.
  • In most cases, torture is used by authoritarian states to force false confessions
  • The evidence that torture did not aid the hunt for Osama bin Laden is particularly compelling.
  • In other words, some GOP candidates' pro-torture sentiment isn't just a relic of Bush-era partisan debates — it's also totally out of whack with everything we know about the practice of torture today.
maddieireland334

Money Given to Kenya, Since Stolen, Puts Nike in Spotlight - The New York Times - 0 views

  • When a Chinese clothing company swooped in and offered to sponsor Kenya’s famed runners, Nike panicked, Kenyan officials say.
  • What followed — according to email exchanges, letters, bank records and invoices, provided by a former employee of Kenya’s athletics federation — has led to a major scandal in Kenya, a country in the midst of its biggest war against corruption in years.
  • In a contract signed several years ago, Nike agreed to pay hundreds of thousands of dollars in honorariums and a one-time $500,000 “commitment bonus,” which the former employee called a bribe.
  • ...20 more annotations...
  • money was supposed to be used to help train and support poor Kenyan athletes who dream of running their way out of poverty.
  • immediately sucked out of the federation’s bank account by a handful of Kenyan officials and kept off the books.
  • does not appear to be under investigation by the United States authorities.
  • all three Kenyan athletics officials accused of taking money from Nike have been suspended
  • For more than 20 years, Nike Inc. has been paying the Kenyan national runners’ association millions of dollars in exchange for the Kenyans wearing Nike’s signature swoosh, superb advertising in the running world.
  • Ethiopian runners, who also excel at middle- and long-distance races, have a sponsorship agreement with Adidas, but an official there said their contract contained no commitment bonus
  • In a sworn statement provided to Kenyan investigators, the former assistant said the $500,000 commitment bonus was “bribe money from Nike” so that the top officials could pay back the $200,000 from the scuttled deal with the Chinese company and then make even more by agreeing to sign up again with Nike.
  • Nearly every day there seems to be allegations of some new scandal: a government ministry buying plastic pens for $85 apiece, a Supreme Court judge taking a $2 million bribe, questions about what exactly happened to the proceeds of a multibillion-dollar bond deal.
  • Kenyan athletes were so outraged when they learned in November that hundreds of thousands of dollars from Nike had been stolen by their bigwigs that they staged a protest at their headquarters in Nairobi, with elite athletes camped out in the grass and holding up signs that read “blood sucers.” (Some of the runners never finished school.)
  • But those complaints may have been a ruse by Kenyan officials to get out of the Nike contract so they could receive a bribe from another company, said a member of the executive board of Kenya’s track and field federation, known as Athletics Kenya.
  • The sports-marketing agent who made the payment, Papa Massata Diack, was recently banned for life by the International Association of Athletics Federations, a global governing body for track and field.
  • After they received a letter from a Nike lawyer saying there were no legal grounds to terminate the contract, the Kenyan officials abruptly changed course.
  • They negotiated a new contract in which Nike agreed to pay Athletics Kenya an annual sponsorship fee of $1.3 million to $1.5 million — plus $100,000 honorariums each year and a one-time $500,000 “commitment bonus.”
  • Nike executives refused to discuss the contract, issuing a short statement that the money paid to Athletics Kenya was supposed to support the athletes. It said that Nike conducted business with integrity and that “we are cooperating with the local authorities in their investigation,” a point the Kenyan detectives dispute.
  • Western nations have threatened sanctions, and the United States government has been especially vocal about corruption, with White House officials unveiling a 29-point plan to root it out.
  • He said corruption in the athletics federation was so ingrained and so brazen that officials routinely extorted money from athletes who failed drug tests. He also said the organization’s chairman, Isaiah Kiplagat, had asked Nike to wire the bonus directly to his personal account, a request that Nike refused.
  • Within days, according to bank records, the $500,000 was withdrawn by Athletics Kenya’s top officials. There were no major track and field activities going on at the time, and the board member and the former administrative assistant said just about all of the money had been concealed from Athletics Kenya’s executive committee, including $200,000 sent to a bank account in Hong Kong.
  • Analysts said this case was especially tricky because it did not appear to fall under the Foreign Corrupt Practices Act, the American law that covers crimes involving American companies and foreign government officials.
  • The Kenyan running association, while it receives some government money, is not a Kenyan government agency.
  • He noted that sports federations, like Athletics Kenya and FIFA, international soccer’s governing body, which is embroiled in its own corruption saga, often fell between the cracks of the rules that governed businesses, public agencies and traditional nonprofit organizations, even though sports federations have qualities of all three.
rachelramirez

Money Given to Kenya, Since Stolen, Puts Nike in Spotlight - The New York Times - 0 views

  • Money Given to Kenya, Since Stolen, Puts Nike in Spotlight
  • What followed — according to email exchanges, letters, bank records and invoices, provided by a former employee of Kenya’s athletics federation — has led to a major scandal in Kenya, a country in the midst of its biggest war against corruption in years.
  • In a contract signed several years ago, Nike agreed to pay hundreds of thousands of dollars in honorariums and a one-time $500,000 “commitment bonus,” which the former employee called a bribe.
  • ...7 more annotations...
  • John Githongo, one of Kenya’s leading voices against corruption, said the American government should pick up this case and “run with it.”
  • For more than 20 years, Nike Inc. has been paying the Kenyan national runners’ association millions of dollars in exchange for the Kenyans wearing Nike’s signature swoosh
  • Several professional runners said they had heard of signing bonuses for individual athletes, but never such a large one-time bonus for a national federation.
  • The fallout from the Nike deal hit just as Western embassies were coming down hard on Kenya for corruption.
  • Several analysts said Nike could not afford to lose the Kenyans. Running is integral to Nike’s brand
  • Several analysts said the chairman’s asking for the money to be wired to his personal account and then sending a follow-up email labeled “Urgent!!” should have been a tipoff to Nike that something was untoward.
  • Analysts said this case was especially tricky because it did not appear to fall under the Foreign Corrupt Practices Act, the American law that covers crimes involving American companies and foreign government officials.
Javier E

Andrew Sullivan: America's New Religions - 0 views

  • Everyone has a religion. It is, in fact, impossible not to have a religion if you are a human being. It’s in our genes and has expressed itself in every culture, in every age, including our own secularized husk of a society.
  • By religion, I mean something quite specific: a practice not a theory; a way of life that gives meaning, a meaning that cannot really be defended without recourse to some transcendent value, undying “Truth” or God (or gods).
  • Which is to say, even today’s atheists are expressing an attenuated form of religion. Their denial of any God is as absolute as others’ faith in God, and entails just as much a set of values to live by — including, for some, daily rituals like meditation, a form of prayer.
  • ...38 more annotations...
  • “Religion is an attempt to find meaning in events, not a theory that tries to explain the universe.” It exists because we humans are the only species, so far as we can know, who have evolved to know explicitly that, one day in the future, we will die. And this existential fact requires some way of reconciling us to it while we are alive.
  • This is why science cannot replace it. Science does not tell you how to live, or what life is about; it can provide hypotheses and tentative explanations, but no ultimate meaning
  • appreciating great art or music is ultimately an act of wonder and contemplation, and has almost nothing to say about morality and life.
  • Here’s Mill describing the nature of what he called “A Crisis in My Mental History”:
  • It is perfectly possible to see and record the absurdities and abuses of man-made institutions and rituals, especially religious ones, while embracing a way of life that these evil or deluded people preached but didn’t practice
  • Seduced by scientism, distracted by materialism, insulated, like no humans before us, from the vicissitudes of sickness and the ubiquity of early death, the post-Christian West believes instead in something we have called progress — a gradual ascent of mankind toward reason, peace, and prosperity — as a substitute in many ways for our previous monotheism
  • We have constructed a capitalist system that turns individual selfishness into a collective asset and showers us with earthly goods; we have leveraged science for our own health and comfort. Our ability to extend this material bonanza to more and more people is how we define progress; and progress is what we call meaning
  • But none of this material progress beckons humans to a way of life beyond mere satisfaction of our wants and needs. And this matters. We are a meaning-seeking species
  • Ditto history
  • So what happens when this religious rampart of the entire system is removed? I think what happens is illiberal politics. The need for meaning hasn’t gone away, but without Christianity, this yearning looks to politics for satisfaction.
  • Russell, for his part, abandoned Christianity at the age of 18, for the usual modern reasons, but the question of ultimate meaning still nagged at him. One day, while visiting the sick wife of a colleague, he described what happened: “Suddenly the ground seemed to give way beneath me, and I found myself in quite another region. Within five minutes I went through some such reflections as the following: the loneliness of the human soul is unendurable; nothing can penetrate it except the highest intensity of the sort of love that religious teachers have preached; whatever does not spring from this motive is harmful, or at best useless.”
  • Our modern world tries extremely hard to protect us from the sort of existential moments experienced by Mill and Russell
  • Netflix, air-conditioning, sex apps, Alexa, kale, Pilates, Spotify, Twitter … they’re all designed to create a world in which we rarely get a second to confront ultimate meaning — until a tragedy occurs, a death happens, or a diagnosis strikes
  • Liberalism is a set of procedures, with an empty center, not a manifestation of truth, let alone a reconciliation to mortality. But, critically, it has long been complemented and supported in America by a religion distinctly separate from politics, a tamed Christianity
  • religious impulses, once anchored in and tamed by Christianity, find expression in various political cults. These political manifestations of religion are new and crude
  • Will the house still stand when its ramparts are taken away? I’m beginning to suspect it can’t.  And won’t.
  • like almost all new cultish impulses, they demand a total and immediate commitment to save the world.
  • We have the cult of Trump on the right, a demigod who, among his worshippers, can do no wrong. And we have the cult of social justice on the left, a religion whose followers show the same zeal as any born-again Evangelical
  • They are filling the void that Christianity once owned, without any of the wisdom and culture and restraint that Christianity once provided.
  • social-justice ideology does everything a religion should. It offers an account of the whole: that human life and society and any kind of truth must be seen entirely as a function of social power structures, in which various groups have spent all of human existence oppressing other groups
  • it provides a set of practices to resist and reverse this interlocking web of oppression — from regulating the workplace and policing the classroom to checking your own sin and even seeking to control language itself.
  • “Social justice” theory requires the admission of white privilege in ways that are strikingly like the admission of original sin
  • To the belief in human progress unfolding through history — itself a remnant of Christian eschatology — it adds the Leninist twist of a cadre of heroes who jump-start the revolution.
  • many Evangelicals are among the holiest and most quietly devoted people out there. Some have bravely resisted the cult. But their leaders have turned Christianity into a political and social identity, not a lived faith, and much of their flock — a staggering 81 percent voted for Trump — has signed on. They have tribalized a religion explicitly built by Jesus as anti-tribal.
  • The terrible truth of the last three years is that the fresh appeal of a leader-cult has overwhelmed the fading truths of Christianity.
  • This is why they are so hard to reach or to persuade and why nothing that Trump does or could do changes their minds. You cannot argue logically with a religion
  • — which is why you cannot really argue with social-justice activists either
  • so we’re mistaken if we believe that the collapse of Christianity in America has led to a decline in religion. It has merely led to religious impulses being expressed by political cults.
  • both cults really do minimize the importance of the individual in favor of either the oppressed group or the leader
  • They demonstrate, to my mind, how profoundly liberal democracy has actually depended on the complement of a tolerant Christianity to sustain itself — as many earlier liberals (Tocqueville, for example) understood.
  • It is Christianity that came to champion the individual conscience against the collective, which paved the way for individual rights. It is in Christianity that the seeds of Western religious toleration were first sown. Christianity is the only monotheism that seeks no sway over Caesar, that is content with the ultimate truth over the immediate satisfaction of power. It was Christianity that gave us successive social movements, which enabled more people to be included in the liberal project, thus renewing it.
  • The question we face in contemporary times is whether a political system built upon such a religion can endure when belief in that religion has become a shadow of its former self.
  • it occurred to me to put the question directly to myself: ‘Suppose that all your objects in life were realized; that all the changes in institutions and opinions that you are looking forward to, could be completely effected at this very instant; would this be a great joy and happiness to you?’ And an irrepressible self-consciousness distinctly answered: ‘No!’”
  • I think it was mainly about how the people of Britain shook off the moral decadence of the foreign policy of the 1930s, how, beneath the surface, there were depths of feeling and determination that we never saw until an existential crisis hit, and an extraordinary figure seized the moment.
  • how profoundly I yearn for something like that to reappear in America. The toll of Trump is so deep. In so many ways, he has come close to delegitimizing this country and the entire West, aroused the worst instincts within us, fed fear rather than confronting it, and has been rewarded for his depravity in the most depressing way by everything that is foul on the right and nothing that is noble.
  • I want to believe in America again, its decency and freedom, its hostility, bred in its bones, toward tyranny of any kind, its kindness and generosity. I need what someone once called the audacity of hope.
  • I’ve witnessed this America ever since I arrived — especially its embrace of immigrants — which is why it is hard to see Trump tearing migrant children from their parents
  • But who, one wonders, is our Churchill? And when will he or she emerge?
Javier E

Japanese Culture: 4th Edition (Updated and Expanded) (Kindle version) (Studies of the Weatherhead East Asian Institute) (Paul Varley) - 0 views

  • It is fitting that Japan’s earliest remaining works, composed at a time when the country was so strongly under the civilizing influence of China, should be of a historical character. In the Confucian tradition, the writing of history has always been held in the highest esteem, since Confucianists believe that the lessons of the past provide the best guide for ethical rule in the present and future. In contrast to the Indians, who have always been absorbed with metaphysical and religious speculation and scarcely at all with history, the Chinese are among the world’s greatest record-keepers.
  • he wrote that it is precisely because life and nature are changeable and uncertain that things have the power to move us.
  • The turbulent centuries of the medieval age produced many new cultural pursuits that catered to the tastes of various classes of society, including warriors, merchants, and even peasants. Yet, coloring nearly all these pursuits was miyabi, reflected in a fundamental preference on the part of the Japanese for the elegant, the restrained, and the subtly suggestive.
  • ...65 more annotations...
  • “Nothing in the West can compare with the role which aesthetics has played in Japanese life and history since the Heian period”; and “the miyabi spirit of refined sensibility is still very much in evidence” in modern aesthetic criticism.
  • there has run through history the idea that the Japanese are, in terms of their original nature (that is, their nature before the introduction from the outside of such systems of thought and religion as Confucianism and Buddhism), essentially an emotional people. And in stressing the emotional side of human nature, the Japanese have always assigned high value to sincerity (makoto) as the ethic of the emotions.
  • If the life of the emotions thus had an ethic in makoto, the evolution of mono no aware in the Heian period provided it also with an aesthetic.
  • Tsurayuki said, in effect, that people are emotional entities and will intuitively and spontaneously respond in song and verse when they perceive things and are moved. The most basic sense of mono no aware is the capacity to be moved by things, whether they are the beauties of nature or the feelings of people,
  • One of the finest artistic achievements of the middle and late Heian period was the evolution of a native style of essentially secular painting that reached its apex in the narrative picture scrolls of the twelfth century. The products of this style of painting are called “Yamato [that is, Japanese] pictures” to distinguish them from works categorized as “Chinese pictures.”
  • The Fujiwara epoch, in literature as well as the visual arts, was soft, approachable, and “feminine.” By contrast, the earlier Jōgan epoch had been forbidding, secretive (esoteric), and “masculine.”
  • Despite the apparent lust of the samurai for armed combat and martial renown, much romanticized in later centuries, the underlying tone of the medieval age in Japan was from the beginning somber, pessimistic, and despairing. In The Tale of Genji the mood shifted from satisfaction with the perfections of Heian courtier society to uncertainty about this life and a craving for salvation in the next.
  • Despite political woes and territorial losses, the Sung was a time of great advancement in Chinese civilization. Some scholars, impressed by the extensive growth in cities, commerce, maritime trade, and governmental bureaucratization in the late T’ang and Sung, have even asserted that this was the age when China entered its “early modern” phase. The Sung was also a brilliant period culturally.
  • the fortuitous combination of desire on the part of the Sung to increase its foreign trade with Japan and the vigorous initiative taken in maritime activity by the Taira greatly speeded the process of transmission.
  • The Sung period in China, on the other hand, was an exceptional age for scholarship, most notably perhaps in history and in the compilation of encyclopedias and catalogs of art works. This scholarly activity was greatly facilitated by the development of printing, invented by the Chinese several centuries earlier.
  • In addition to reviving interest in Japanese poetry, the use of kana also made possible the evolution of a native prose literature.
  • peasantry, who formed the nucleus of what came to be known as the True Sect of Pure Land Buddhism. Through the centuries, this sect has attracted one of the largest followings among the Japanese, and its founder, Shinran, has been canonized as one of his country’s most original religious thinkers.
  • True genre art, picturing all classes at work and play, did not appear in Japan until the sixteenth century. The oldest extant genre painting of the sixteenth century is a work, dating from about 1525, called “Views Inside and Outside Kyoto” (rakuchū-rakugai zu).
  • the aesthetic principles that were largely to dictate the tastes of the medieval era. We have just remarked the use of sabi. Another major term of the new medieval aesthetics was yūgen, which can be translated as “mystery and depth.”
  • One of the basic values in the Japanese aesthetic tradition—along with such things as perishability, naturalness, and simplicity—is suggestion. The Japanese have from earliest times shown a distinct preference for the subtleties of suggestion, intimation, and nuance, and have characteristically sought to achieve artistic effect by means of “resonances” (yojō).
  • Amidism was not established as a separate sect until the time of the evangelist Hōnen (1133–1212).
  • But even in Chōmei we can observe a tendency to transform what is supposed to be a mean hovel into something of beauty based on an aesthetic taste for “deprivation” (to be discussed later in this chapter) that evolved during medieval times.
  • Apart from the proponents of Pure Land Buddhism, the person who most forcefully propagated the idea of universal salvation through faith was Nichiren (1222–82).
  • Nichiren held that ultimate religious truth lay solely in the Lotus Sutra, the basic text of the Greater Vehicle of Buddhism in which Gautama had revealed that all beings possess the potentiality for buddhahood.
  • At the time of its founding in Japan by Saichō in the early ninth century, the Tendai sect had been based primarily on the Lotus Sutra; but, in the intervening centuries, Tendai had deviated from the Sutra’s teachings and had even spawned new sects, like those of Pure Land Buddhism, that encouraged practices entirely at variance with these teachings.
  • Declaring himself “the pillar of Japan, the eye of the nation, and the vessel of the country,” Nichiren seems even to have equated himself with Japan and its fate.
  • The kōan is especially favored by what the Japanese call the Rinzai sect of Zen, which is also known as the school of “sudden enlightenment” because of its belief that satori, if it is attained, will come to the individual in an instantaneous flash of insight or awareness. The other major sect of Zen, Sōtō, rejects this idea of sudden enlightenment and instead holds that satori is a gradual process to be attained primarily through seated meditation.
  • Fought largely in Kyoto and its environs, the Ōnin War dragged on for more than ten years, and after the last armies withdrew in 1477 the once lovely capital lay in ruins. There was no clear-cut victor in the Ōnin War. The daimyos had simply fought themselves into exhaustion,
  • Yoshimasa was perhaps even more noteworthy as a patron of the arts than his grandfather, Yoshimitsu. In any case, his name is just as inseparably linked with the flourishing of culture in the Higashiyama epoch (usually taken to mean approximately the last half of the fifteenth century) as Yoshimitsu’s is with that of Kitayama.
  • The tea room, as a variant of the shoin room, evolved primarily in the sixteenth century.
  • Shukō’s admonition about taking care to “harmonize Japanese and Chinese tastes” has traditionally been taken to mean that he stood, in the late fifteenth century, at a point of transition from the elegant and “aristocratic” kind of Higashiyama chanoyu just described, which featured imported Chinese articles, to a new, Japanese form of the ceremony that used native ceramics,
  • the new kind of tea ceremony originated by Shukō is called wabicha, or “tea based on wabi.” Developed primarily by Shukō’s successors during the sixteenth century, wabicha is a subject for the next chapter.
  • The Japanese, on the other hand, have never dealt with nature in their art in the universalistic sense of trying to discern any grand order or structure; much less have they tried to associate the ideal of order in human society with the harmonies of nature. Rather,
  • The Chinese Sung-style master may have admired a mountain, for example, for its enduring, fixed quality, but the typical Japanese artist (of the fifteenth century or any other age) has been more interested in a mountain for its changing aspects:
  • Zen culture of Muromachi Japan was essentially a secular culture. This seems to be strong evidence, in fact, of the degree to which medieval Zen had become secularized: its view of nature was pantheistic and its concern with man was largely psychological.
  • Nobunaga’s castle at Azuchi and Hideyoshi’s at Momoyama have given their names to the cultural epoch of the age of unification. The designation of this epoch as Azuchi-Momoyama (or, for the sake of convenience, simply Momoyama) is quite appropriate in view of the significance of castles—as represented by these two historically famous structures—in the general progress, cultural and otherwise, of these exciting years.
  • Along with architecture, painting was the art that most fully captured the vigorous and expansive spirit of the Momoyama epoch of domestic culture during the age of unification. It was a time when many styles of painting and groups of painters flourished. Of the latter, by far the best known and most successful were the Kanō,
  • Motonobu also made free use of the colorful Yamato style of native art that had evolved during the Heian period and had reached its pinnacle in the great narrative picture scrolls of the twelfth and thirteenth centuries.
  • what screen painting really called for was color, and it was this that the Kanō artists, drawing on the native Yamato tradition, added to their work with great gusto during the Momoyama epoch. The color that these artists particularly favored was gold, and compositions done in ink and rich pigments on gold-leaf backgrounds became the most characteristic works of Momoyama art.
  • there could hardly be a more striking contrast between the spirits of two ages than the one reflected in the transition from the subdued monochromatic art of Japan’s medieval era to the blazing use of color by Momoyama artists, who stood on the threshold of early modern times.
  • aware, which, as we saw in Chapter 3, connotes the capacity to be moved by things. In the period of the Shinkokinshū, when Saigyō lived, this sentiment was particularly linked with the aesthetic of sabi or “loneliness” (and, by association, sadness). The human condition was essentially one of loneliness;
  • During the sixteenth century the ceremony was further developed as wabicha, or tea (cha) based on the aesthetic of wabi. Haga Kōshirō defines wabi as comprising three kinds of beauty: a simple, unpretentious beauty; an imperfect, irregular beauty; and an austere, stark beauty.
  • The alternate attendance system also had important consequences in the cultural realm, contributing to the development for the first time of a truly national culture. Thus, for example, the daimyos and their followers from throughout the country who regularly visited Edo were the disseminators of what became a national dialect or “lingua franca” and, ultimately, the standard language of modern Japan.
  • They also fostered the spread of customs, rules of etiquette, standards of taste, fashions, and the like that gave to Japanese everywhere a common lifestyle.
  • “[Tokugawa-period] statesmen thought highly of agriculture, but not of agriculturalists.” The life of the average peasant was one of much toil and little joy. Organized into villages that were largely self-governing, the peasants were obliged to render a substantial portion of their farming yields—on average, perhaps 50 percent or more—to the samurai, who provided few services in return. The resentment of peasants toward samurai grew steadily throughout the Tokugawa period and was manifested in countless peasant rebellions
  • Although in the long run the seclusion policy undeniably limited the economic growth of Tokugawa Japan by its severe restrictions both on foreign trade and on the inflow of technology from overseas, it also ensured a lasting peace that made possible a great upsurge in the domestic economy, especially during the first century of shogunate rule.
  • Both samurai and peasants were dependent almost solely on income from agriculture and constantly suffered declines in real income as the result of endemic inflation; only the townsmen, who as commercialists could adjust to price fluctuations, were in a position to profit significantly from the economic growth of the age.
  • We should not be surprised, therefore, to find this class giving rise to a lively and exuberant culture that reached its finest flowering in the Genroku epoch at the end of the seventeenth and the beginning of the eighteenth centuries. The mainstays of Genroku culture were the theatre, painting (chiefly in the form of the woodblock print), and prose fiction,
  • The Japanese had, of course, absorbed Confucian thinking from the earliest centuries of contact with China, but for more than a millennium Buddhism had drawn most of their intellectual attention. Not until the Tokugawa period did they come to study Confucianism with any great zeal.
  • One of the most conspicuous features of the transition from medieval to early modern times in Japan was the precipitous decline in the vigor of Buddhism and the rise of a secular spirit.
  • The military potential and much of the remaining landed wealth of the medieval Buddhist sects had been destroyed during the advance toward unification in the late sixteenth century. And although Buddhism remained very much part of the daily lives of the people, it not only ceased to hold appeal for many Japanese intellectuals but indeed even drew the outright scorn and enmity of some.
  • it was the Buddhist church—and especially the Zen sect—that paved the way for the upsurge in Confucian studies during Tokugawa times. Japanese Zen priests had from at least the fourteenth century on assiduously investigated the tenets of Sung Neo-Confucianism, and in ensuing centuries had produced a corpus of research upon which the Neo-Confucian scholarship of the Tokugawa period was ultimately built.
  • Yamaga Sokō is generally credited as the formulator of the code of bushidō, or the “way of the warrior.” Certainly he was a pioneer in analyzing the role of the samurai as a member of a true ruling elite and not simply as a rough, and frequently illiterate, participant in the endless civil struggles of the medieval age.
  • The fundamental purpose of Neo-Confucian practice is to calm one’s turbid ki to allow one’s nature (ri) to shine forth. The person who achieves this purpose becomes a sage, his ri seen as one with the universal principle, known as the “supreme ultimate” (taikyoku), that governs all things.
  • Neo-Confucianism proposed two main courses to clarify ri, one objective and the other subjective. The objective course was through the acquisition of knowledge by means of the “investigation of things,” a phrase taken by Chu Hsi from the Chinese classic The Great Learning (Ta hsüeh). At the heart of things to investigate was history,
  • Quite apart from any practical guidance to good rulership it may have provided, this Neo-Confucian stress on historical research proved to be a tremendous spur to scholarship and learning in general during the Tokugawa period; and, as we will see in the next chapter, it also facilitated the development of other, heterodox lines of intellectual inquiry.
  • the subjective course appeared to have been taken almost directly from Buddhism, and in particular Zen. It was the course of “preserving one’s heart by holding fast to seriousness,” which called for the clarification of ri by means remarkably similar to Zen meditation.
  • The calendrical era of Genroku lasted from 1688 until 1703, but the Genroku cultural epoch is usually taken to mean the span of approximately a half-century from, say, 1675 until 1725. Setting the stage for this rise of a townsman-oriented culture was nearly a century of peace and steady commercial growth.
  • places of diversion and assignation, these quarters were the famous “floating worlds” (ukiyo) of Tokugawa fact and legend. Ukiyo, although used specifically from about this time to designate such demimondes, meant in the broadest sense the insubstantial and ever-changing existence in which man is enmeshed.
  • ukiyo always carried the connotation that life is fundamentally sad; but, in Genroku times, the term was more commonly taken to mean a world that was pleasurable precisely because it was constantly changing, exciting, and up-to-date.
  • the Tokugawa period was not at all like the humanism that emerged in the West from the Renaissance on. Whereas modern Western humanism became absorbed with people as individuals, with all their personal peculiarities, feelings, and ways, Japanese humanism of the Tokugawa period scarcely conceived of the existence of true individuals at all; rather, it focused on “the people” and regarded them as comprising essentially types, such as samurai, farmers, and courtesans.
  • there is little in the literature as a whole of that quality—character development—that is probably the single most important feature of the modern Western novel.
  • Although shogunate authorities and Tokugawa-period intellectuals in general had relatively little interest in the purely metaphysical side of Chu Hsi’s teachings, they found his philosophy to be enormously useful in justifying or ideologically legitimizing the feudal structure of state and society that had emerged in Japan by the seventeenth century.
  • With its radical advocacy of violent irrationality—to the point of psychosis—Hagakure has shocked many people. But during Japan’s militarist years of the 1930s and World War II, soldiers and others hailed it as something of a bible of samurai behavior, and the postwar nationalist writer Mishima Yukio was even inspired to write a book in praise of its values.
  • It is significant that many of the leading prose writers, poets, and critics of the most prominent journal of Japanese romanticism, Bungakukai (The Literary World, published from 1893 until 1898), were either converts to or strongly influenced by Protestant Christianity, the only creed in late Meiji Japan that gave primacy to the freedom and spiritual independence of the individual. The absolutism embodied in the Meiji Constitution demanded strict subordination of the interests of the individual to those of the state;
  • The feeling of frustration engendered by a society that placed such preponderant stress upon obedience to the group, especially in the form of filial piety toward one’s parents and loyalty to the state, no doubt accounts for much of the sense of alienation observable in the works of so many modern Japanese writers.
  • These writers have been absorbed to an unusual degree with the individual, the world of his personal psychology, and his essential loneliness. In line with this preoccupation, novelists have perennially turned to the diary-like, confessional tale—the so-called I-novel—as their preferred medium of expression.
  • In intellectual and emotional terms, the military came increasingly to be viewed as the highest repository of the traditional Japanese spirit that was the sole hope for unifying the nation to act in a time of dire emergency.
  • The enemy that had led the people astray was identified as those sociopolitical doctrines and ideologies that had been introduced to Japan from the West during the preceding half-century or so along with the material tools of modernization.
  • If there is a central theme to this book, it is that the Japanese, within the context of a history of abundant cultural borrowing from China in premodern times and the West in the modern age, have nevertheless retained a hard core of native social, ethical, and cultural values by means of which they have almost invariably molded and adapted foreign borrowing to suit their own tastes and purposes.
knudsenlu

Who Was Recy Taylor? - The Atlantic - 0 views

  • Recy Taylor died 10 days ago, just shy of her 98th birthday. She lived as we all have lived, too many years in a culture broken by brutally powerful men. For too long, women have not been heard or believed if they dare speak the truth to the power of those men. But their time is up. Their time is up.
  • If we know that enslaved women were used for their productive and reproductive labor—if they were raped with impunity in the system of slavery—then what happened after Emancipation? Did those practices and the institutions that upheld them—the men and their sons and their cousins—simply end because of Emancipation?
  • So I started looking for cases, which were hard to find because marginalized people are hard to find in the archives. Their stories are not remembered, they’re not saved, and so often they’re not considered worthy of being archived. Those stories were hard to find, but the black press actually printed a lot of black women’s testimonies about sexual violence at the time.
  • ...3 more annotations...
  • Newkirk: Having met and spent time with her, what’s your sense of how Recy Taylor fought all this, and how she processed what happened to her? Did she see herself as an activist? McGuire: No. She was not an activist. After she was assaulted, she immediately told what happened. She told her father, her husband, the sheriff, and then she went home. And then the family was terrorized, and her house was firebombed.
  • If someone threatens to kill you in Alabama in 1944, that’s real. There are no consequences for that. The threat is very real. Her speaking out was just incredibly brave. And when I asked her in 2009 why she spoke out—why did she say anything, wasn’t she scared?—she looked me right in the eye and said ‘I just didn’t think that I deserved what they did to me.’ I just thought that she had an incredible sense of self-worth and dignity.
  • I was raised to believe, like too many people are today, that Rosa Parks was a tired old lady who tiptoed into history, because she had an ‘emotional response’ to her exhaustion and it changed the world. But in 1998 I was working on my master’s thesis, and I listened to an NPR story about Montgomery Bus Boycott veterans. The editor of The Montgomery Advertiser, Joe Azbell, was talking about the boycott, and he said that Gertrude Perkins had never been mentioned in history, but she was the most important person in the boycott. It took my breath away, and I didn’t know who that was. So I went looking in the newspaper’s microfilm, and I found her story. She was a black woman who was kidnapped by the police in Montgomery and raped.
Javier E

Trump is no longer the worst person in government. Pence is. - The Washington Post - 0 views

  • Donald Trump, with his feral cunning, knew. The oleaginous Mike Pence, with his talent for toadyism and appetite for obsequiousness, could, Trump knew, become America’s most repulsive public figure.
  • Because his is the authentic voice of today’s lickspittle Republican Party, he clarifies this year’s elections: Vote Republican to ratify groveling as governing.
  • Henry Adams said that “practical politics consists in ignoring facts,
  • ...6 more annotations...
  • It is said that one cannot blame people who applaud Arpaio and support his rehabilitators (Trump, Pence, et al.), because, well, globalization or health-care costs or something. Actually, one must either blame them or condescend to them as lacking moral agency.
  • So, “let reverence for the laws . . . become the political religion of the nation.”
  • Lincoln lamented that throughout America, “so lately famed for love of law and order,” there was a “mobocratic spirit” among “the vicious portion of [the] population.”
  • ballots cast this November will be most important as validations or repudiations of the harmonizing voices of Trump, Pence, Arpaio and the like.
  • Trump is what he is, a floundering, inarticulate jumble of gnawing insecurities and not-at-all compensating vanities, which is pathetic.
  • Pence is what he has chosen to be, which is horrifying.