Group items matching "world war ii" in title, tags, annotations or url

Ellie McGinnis

A Weapon Seen as Too Horrible, Even in War - NYTimes.com - 0 views

  • “If you could hear, at every jolt, the blood / Come gargling from the froth-corrupted lungs.”
  • Germany is recognized as the first to use chemical weapons on a mass scale, on April 22, 1915, at Ypres, Belgium, where 6,000 British and French troops succumbed
  • once again emerged as an issue after the massacre in Syria last month, in which the United States says nearly 1,500 people, men, women and children, were killed, many as they slept.
  • Why, it is fair to ask, does the killing of 100,000 or more with conventional weapons elicit little more than a concerned shrug, while the killing of a relative few from poison gas is enough to trigger an intervention
  • 16 million people died and 20 million were wounded during World War I
  • 2 percent of the casualties and fewer than 1 percent of the deaths are estimated to have resulted from chemical warfare
  • 1925 Geneva Protocol, which banned the use, though not the possession, of chemical and biological weapons
  • almost universally accepted and become an international norm. Syria, too, is a signatory
  • No Western army used gas on the battlefield during the global slaughter of World War II
  • Nazis were to gas noncombatant Jews, Gypsies and others.
  • Franklin D. Roosevelt stepped in and, in quiet diplomacy, “told the Japanese that we knew of the use and that there would be consequences.”
  • general revulsion against the use of poisons against human beings in warfare, going back to the Greeks,”
  • 1675, when France and the Holy Roman Empire agreed in Strasbourg not to use poisoned bullets
  • 1899 not to use “projectiles the sole objective of which is the diffusion of asphyxiating or deleterious gases,
  • few known instances of poison gas being used since 1925
  • first two cases, gas was used by authoritarian regimes against those they considered lesser races.
  • 1935-36, Mussolini used several hundred tons of mustard gas in Abyssinia, now Ethiopia
  • 1940-41, the Japanese used chemical and biological weapons widely in China
  • chemical weapons have been categorized as “weapons of mass destruction,”
  • American use of Agent Orange in Vietnam was widely criticized
g-dragon

Tibet and China: Early History - 0 views

  • For at least 1500 years, the nation of Tibet has had a complex relationship with its large and powerful neighbor to the east, China. The political history of Tibet and China reveals that the relationship has not always been as one-sided as it now appears.
  • Indeed, as with China’s relations with the Mongols and the Japanese, the balance of power between China and Tibet has shifted back and forth over the centuries.
  • The first known interaction between the two states came in 640 A.D., when the Tibetan King Songtsan Gampo married the Princess Wencheng, a niece of the Tang Emperor Taizong. He also married a Nepalese princess.
  • Tibet and China signed a peace treaty in 821 or 822, which delineated the border between the two empires. The Tibetan Empire would concentrate on its Central Asian holdings for the next several decades, before splitting into several small, fractious kingdoms.
  • Canny politicians, the Tibetans befriended Genghis Khan just as the Mongol leader was conquering the known world in the early 13th century. As a result, though the Tibetans paid tribute to the Mongols after the Hordes had conquered China, they were allowed much greater autonomy than the other Mongol-conquered lands.
  • Over time, Tibet came to be considered one of the thirteen provinces of the Mongolian-ruled nation of Yuan China.
  • The Tibetans transmitted their Buddhist faith to the eastern Mongols; Kublai Khan himself studied Tibetan beliefs with the great teacher Drogon Chogyal Phagpa.
  • When the Mongols' Yuan Empire fell in 1368 to the ethnic-Han Chinese Ming, Tibet reasserted its independence and refused to pay tribute to the new Emperor.
  • After their lifetimes, the two men were called the First and Second Dalai Lamas. Their sect, the Gelug or "Yellow Hats," became the dominant form of Tibetan Buddhism.
  • The Third Dalai Lama, Sonam Gyatso (1543-1588), was the first to be so named during his life. He was responsible for converting the Mongols to Gelug Tibetan Buddhism, and it was the Mongol ruler Altan Khan who probably gave the title “Dalai Lama” to Sonam Gyatso.
  • The Fourth Dalai Lama, Yonten Gyatso (1589-1616), was a Mongolian prince and the grandson of Altan Khan.
  • During the 1630s, China was embroiled in power struggles between the Mongols, Han Chinese of the fading Ming Dynasty, and the Manchu people of north-eastern China (Manchuria). The Manchus would eventually defeat the Han in 1644, and establish China's final imperial dynasty, the Qing (1644-1912).
  • The Dalai Lama made a state visit to the Qing Dynasty's second Emperor, Shunzhi, in 1653. The two leaders greeted one another as equals; the Dalai Lama did not kowtow. Each man bestowed honors and titles upon the other, and the Dalai Lama was recognized as the spiritual authority of the Qing Empire.
  • The Imperial Army then defeated the rebels, but the Emperor recognized that he would have to rule through the Dalai Lama rather than directly. Day-to-day decisions would be made on the local level.
  • China took advantage of this period of instability in Tibet to seize the regions of Amdo and Kham, making them into the Chinese province of Qinghai in 1724.
  • Three years later, the Chinese and Tibetans signed a treaty that laid out the boundary line between the two nations. It would remain in force until 1910.
  • In 1788, the Regent of Nepal sent Gurkha forces to invade Tibet. The Qing Emperor responded in strength, and the Nepalese retreated. The Gurkhas returned three years later, plundering and destroying some famous Tibetan monasteries. The Chinese sent a force of 17,000 which, along with Tibetan troops, drove the Gurkhas out of Tibet and south to within 20 miles of Kathmandu.
  • The Simla Convention granted China secular control over "Inner Tibet," (also known as Qinghai Province) while recognizing the autonomy of "Outer Tibet" under the Dalai Lama's rule. Both China and Britain promised to "respect the territorial integrity of [Tibet], and abstain from interference in the administration of Outer Tibet."
  • Despite this sort of assistance from the Chinese Empire, the people of Tibet chafed under increasingly meddlesome Qing rule.
  • when the Eighth Dalai Lama died, and 1895, when the Thirteenth Dalai Lama
  • none of the incumbent incarnations of the Dalai Lama lived to see their nineteenth birthdays
  • If the Chinese found a certain incarnation too hard to control, they would poison him. If the Tibetans thought an incarnation was controlled by the Chinese, then they would poison him themselves.
  • Throughout this period, Russia and Britain were engaged in the "Great Game," a struggle for influence and control in Central Asia.
  • Russia pushed south of its borders, seeking access to warm-water sea ports and a buffer zone between Russia proper and the advancing British. The British pushed northward from India, trying to expand their empire and protect the Raj, the "Crown Jewel of the British Empire," from the expansionist Russians.
  • Tibet was an important playing piece in this game.
  • the British in India concluded a trade and border treaty with Beijing concerning the boundary between Sikkim and Tibet. However, the Tibetans flatly rejected the treaty terms.
  • The British invaded Tibet in 1903 with 10,000 men, and took Lhasa the following year. Thereupon, they concluded another treaty with the Tibetans, as well as Chinese, Nepalese and Bhutanese representatives, which gave the British themselves some control over Tibet’s affairs.
  • The 13th Dalai Lama, Thubten Gyatso, fled the country in 1904 at the urging of his Russian disciple, Agvan Dorzhiev. He went first to Mongolia, then made his way to Beijing.
  • The Chinese declared that the Dalai Lama had been deposed as soon as he left Tibet, and claimed full sovereignty over not only Tibet but also Nepal and Bhutan. The Dalai Lama went to Beijing to discuss the situation with the Emperor Guangxu, but he flatly refused to kowtow to the Emperor.
  • He returned to Lhasa in 1909, disappointed by Chinese policies towards Tibet. China sent a force of 6,000 troops into Tibet, and the Dalai Lama fled to Darjeeling, India later that same year.
  • China's new revolutionary government issued a formal apology to the Dalai Lama for the Qing Dynasty's insults, and offered to reinstate him. Thubten Gyatso refused, stating that he had no interest in the Chinese offer.
  • He then issued a proclamation that was distributed across Tibet, rejecting Chinese control and stating that "We are a small, religious, and independent nation." The Dalai Lama took control of Tibet's internal and external governance in 1913, negotiating directly with foreign powers, and reforming Tibet's judicial, penal, and educational systems.
  • Representatives of Great Britain, China, and Tibet met in 1914 to negotiate a treaty marking out the boundary lines between India and its northern neighbors.
  • According to Tibet, the "priest/patron" relationship established at this time between the Dalai Lama and Qing China continued throughout the Qing Era, but it had no bearing on Tibet's status as an independent nation. China, naturally, disagrees.
  • China walked out of the conference without signing the treaty after Britain laid claim to the Tawang area of southern Tibet, which is now part of the Indian state of Arunachal Pradesh. Tibet and Britain both signed the treaty.
  • As a result, China has never agreed to India's rights in northern Arunachal Pradesh (Tawang), and the two nations went to war over the area in 1962. The boundary dispute still has not been resolved.
  • China also claims sovereignty over all of Tibet, while the Tibetan government-in-exile points to the Chinese failure to sign the Simla Convention as proof that both Inner and Outer Tibet legally remain under the Dalai Lama's jurisdiction.
  • Soon, China would be too distracted to concern itself with the issue of Tibet.
  • China would see near-continuous civil war up to the Communist victory in 1949, and this era of conflict was exacerbated by the Japanese Occupation and World War II. Under such circumstances, the Chinese showed little interest in Tibet. The 13th Dalai Lama ruled independent Tibet in peace until his death in 1933.
  • Tenzin Gyatso, the current Dalai Lama, was taken to Lhasa in 1937 to begin training for his duties as the leader of Tibet. He would remain there until 1959, when the Chinese forced him into exile in India.
  • In 1950, the People's Liberation Army (PLA) of the newly-formed People's Republic of China invaded Tibet. With stability reestablished in Beijing for the first time in decades, Mao Zedong sought to assert China's right to rule over Tibet as well.
  • The PLA inflicted a swift and total defeat on Tibet's small army, and China drafted the "Seventeen Point Agreement" incorporating Tibet as an autonomous region of the People's Republic of China. Representatives of the Dalai Lama's government signed the agreement under protest, and the Tibetans repudiated the agreement nine years later.
  • On March 1, 1959, the Dalai Lama received an odd invitation to attend a theater performance at PLA headquarters near Lhasa.
  • The guards immediately publicized this rather ham-handed attempted abduction, and the following day an estimated crowd of 300,000 Tibetans surrounded Potala Palace to protect their leader.
  • Tibetan troops were able to secure a route for the Dalai Lama to escape into India on March 17. Actual fighting began on March 19, and lasted only two days before the Tibetan troops were defeated.
  • An estimated 800 artillery shells had pummeled Norbulingka, and Lhasa's three largest monasteries were essentially leveled. The Chinese rounded up thousands of monks, executing many of them. Monasteries and temples all over Lhasa were ransacked.
  • In the days after the 1959 Uprising, the Chinese government revoked most aspects of Tibet's autonomy, and initiated resettlement and land distribution across the country. The Dalai Lama has remained in exile ever since.
  • China's central government, in a bid to dilute the Tibetan population and provide jobs for Han Chinese, initiated a "Western China Development Program" in 1978. As many as 300,000 Han now live in Tibet, two-thirds of them in the capital city. The Tibetan population of Lhasa, in contrast, is only 100,000. Ethnic Chinese hold the vast majority of government posts.
  • On May 1, 1998, Chinese officials at Drapchi Prison in Tibet ordered hundreds of prisoners, both criminals and political detainees, to participate in a Chinese flag-raising ceremony. Some of the prisoners began to shout anti-Chinese and pro-Dalai Lama slogans, and prison guards fired shots into the air before returning all the prisoners to their cells.
  • The prisoners were then severely beaten with belt buckles, rifle butts, and plastic batons, and some were put into solitary confinement for months at a time, according to one young nun who was released from the prison a year later.
  • Three days later, the prison administration decided to hold the flag-raising ceremony again. Once more, some of the prisoners began to shout slogans. Prison officials reacted with even more brutality, and five nuns, three monks, and one male criminal were killed by the guards. One man was shot; the rest were beaten to death.
  • On March 10, 2008, Tibetans marked the 49th anniversary of the 1959 uprising by peacefully protesting for the release of imprisoned monks and nuns. Chinese police then broke up the protest with tear gas and gunfire. The protest resumed for several more days, finally turning into a riot. Tibetan anger was fueled by reports that imprisoned monks and nuns were being mistreated or killed in prison as a reaction to the street demonstrations.
  • China immediately cut off access to Tibet for foreign media and tourists.
  • The unrest came at a sensitive time for China, which was gearing up for the 2008 Summer Olympics in Beijing. The situation in Tibet caused increased international scrutiny of Beijing's entire human rights record, leading some foreign leaders to boycott the Olympic Opening Ceremonies. Olympic torch-bearers around the world were met by thousands of human rights protestors.
  • Tibet and China have had a long relationship, fraught with difficulty and change. At times, the two nations have worked closely together. At other times, they have been at war.
  • Today, the nation of Tibet does not exist; not one foreign government officially recognizes the Tibetan government-in-exile.
anonymous

Queen Elizabeth II recalls WWII evacuations during coronavirus speech - The Washington Post - 0 views

  • She also harked back to her first-ever public speech, when she was only 14 and still a princess. “It reminds me of the very first broadcast I made, in 1940, helped by my sister,” she said, as an archive photo of the girls appeared on-screen. “We as children spoke from here at Windsor [Castle] to children who had been evacuated from their homes and sent away for their own safety.”
  • The wave of child evacuations had begun the year before, on Sept. 1, 1939 — the same day Nazi Germany invaded Poland and only two days before Britain’s prime minister declared war. Fearing civilian casualties if British cities were bombed, officials urged parents to send their children to the countryside to live with strangers who volunteered to provide space for them.
  • Evacuation of children was voluntary, according to the Imperial War Museum, but since urban schools had been shut down, the decision was made easier.
  • In the first wave, nearly 1 million children, hundreds of thousands of teachers and half-a-million mothers with babies were evacuated. The teachers were assigned groups of kids to find spaces for when their trains arrived in smaller towns and villages.
  • In September 1940, the predicted Nazi bombing campaign known as “the Blitz” began, and the last wave of child evacuations took place. Many well-to-do families also arranged for their children to be sent overseas to countries such as Canada, Australia and the United States.
  • For others, the evacuation was a nightmare. Their food rations from the government were confiscated by the families they ended up with; they were put to work in fields; many were physically and sexually abused. John Abbott told the BBC he was whipped by his host family whenever he spoke and was eventually rescued by local police, bruised and bleeding.
  • by January 1940, nearly half of parents had brought their children home, the museum said. The health ministry put up threatening posters to discourage this. One poster depicts a mother visiting her children in the country with a ghostly Adolf Hitler over her shoulder, tempting her like Satan to “Take them back! Take them back!”
  • Accommodations varied wildly. Some children were virtually adopted by host families and given love and good care. Some lived in large manors housing dozens of children and run by teachers. Many of the urban children were seeing the countryside, agriculture and farm animals for the first time, finding it both inspiring and boring.
  • It was after this last wave, in October 1940, that Princess Elizabeth addressed the children of Britain.
  • When Elizabeth turned 18, she joined the Auxiliary Territorial Service in early 1945, where she trained as a truck mechanic and driver. To this day, she is the only female member of the royal family to have served in the military.
  • In 1940, she told the children — her contemporaries — “When peace comes, remember it will be for us, the children of today, to make the world of tomorrow a better and happier place.” Now 93, she said Sunday: “I hope, in the years to come, everyone will be able to take pride in how they responded to this challenge. And those who come after us will say the Britons of this generation were as strong as any.” “Today, once again, many will feel a painful sense of separation from their loved ones,” she closed. “But now, as then, we know deep down that it is the right thing to do.”
Javier E

Japanese Culture: 4th Edition (Updated and Expanded) (Kindle version) (Studies of the Weatherhead East Asian Institute) (Paul Varley) - 0 views

  • It is fitting that Japan’s earliest remaining works, composed at a time when the country was so strongly under the civilizing influence of China, should be of a historical character. In the Confucian tradition, the writing of history has always been held in the highest esteem, since Confucianists believe that the lessons of the past provide the best guide for ethical rule in the present and future. In contrast to the Indians, who have always been absorbed with metaphysical and religious speculation and scarcely at all with history, the Chinese are among the world’s greatest record-keepers.
  • he wrote that it is precisely because life and nature are changeable and uncertain that things have the power to move us.
  • The turbulent centuries of the medieval age produced many new cultural pursuits that catered to the tastes of various classes of society, including warriors, merchants, and even peasants. Yet, coloring nearly all these pursuits was miyabi, reflected in a fundamental preference on the part of the Japanese for the elegant, the restrained, and the subtly suggestive.
  • “Nothing in the West can compare with the role which aesthetics has played in Japanese life and history since the Heian period”; and “the miyabi spirit of refined sensibility is still very much in evidence” in modern aesthetic criticism.
  • there has run through history the idea that the Japanese are, in terms of their original nature (that is, their nature before the introduction from the outside of such systems of thought and religion as Confucianism and Buddhism), essentially an emotional people. And in stressing the emotional side of human nature, the Japanese have always assigned high value to sincerity (makoto) as the ethic of the emotions.
  • If the life of the emotions thus had an ethic in makoto, the evolution of mono no aware in the Heian period provided it also with an aesthetic.
  • Tsurayuki said, in effect, that people are emotional entities and will intuitively and spontaneously respond in song and verse when they perceive things and are moved. The most basic sense of mono no aware is the capacity to be moved by things, whether they are the beauties of nature or the feelings of people,
  • One of the finest artistic achievements of the middle and late Heian period was the evolution of a native style of essentially secular painting that reached its apex in the narrative picture scrolls of the twelfth century. The products of this style of painting are called “Yamato [that is, Japanese] pictures” to distinguish them from works categorized as “Chinese pictures.”
  • The Fujiwara epoch, in literature as well as the visual arts, was soft, approachable, and “feminine.” By contrast, the earlier Jōgan epoch had been forbidding, secretive (esoteric), and “masculine.”
  • Despite the apparent lust of the samurai for armed combat and martial renown, much romanticized in later centuries, the underlying tone of the medieval age in Japan was from the beginning somber, pessimistic, and despairing. In The Tale of Genji the mood shifted from satisfaction with the perfections of Heian courtier society to uncertainty about this life and a craving for salvation in the next.
  • Despite political woes and territorial losses, the Sung was a time of great advancement in Chinese civilization. Some scholars, impressed by the extensive growth in cities, commerce, maritime trade, and governmental bureaucratization in the late T’ang and Sung, have even asserted that this was the age when China entered its “early modern” phase. The Sung was also a brilliant period culturally.
  • the fortuitous combination of desire on the part of the Sung to increase its foreign trade with Japan and the vigorous initiative taken in maritime activity by the Taira greatly speeded the process of transmission.
  • The Sung period in China, on the other hand, was an exceptional age for scholarship, most notably perhaps in history and in the compilation of encyclopedias and catalogs of art works. This scholarly activity was greatly facilitated by the development of printing, invented by the Chinese several centuries earlier.
  • In addition to reviving interest in Japanese poetry, the use of kana also made possible the evolution of a native prose literature.
  • peasantry, who formed the nucleus of what came to be known as the True Sect of Pure Land Buddhism. Through the centuries, this sect has attracted one of the largest followings among the Japanese, and its founder, Shinran, has been canonized as one of his country’s most original religious thinkers.
  • True genre art, picturing all classes at work and play, did not appear in Japan until the sixteenth century. The oldest extant genre painting of the sixteenth century is a work, dating from about 1525, called “Views Inside and Outside Kyoto” (rakuchū-rakugai zu).
  • the aesthetic principles that were largely to dictate the tastes of the medieval era. We have just remarked the use of sabi. Another major term of the new medieval aesthetics was yūgen, which can be translated as "mystery and depth."
  • One of the basic values in the Japanese aesthetic tradition—along with such things as perishability, naturalness, and simplicity—is suggestion. The Japanese have from earliest times shown a distinct preference for the subtleties of suggestion, intimation, and nuance, and have characteristically sought to achieve artistic effect by means of “resonances” (yojō).
  • Amidism was not established as a separate sect until the time of the evangelist Hōnen (1133–1212).
  • But even in Chōmei we can observe a tendency to transform what is supposed to be a mean hovel into something of beauty based on an aesthetic taste for “deprivation” (to be discussed later in this chapter) that evolved during medieval times.
  • Apart from the proponents of Pure Land Buddhism, the person who most forcefully propagated the idea of universal salvation through faith was Nichiren (1222–82).
  • Nichiren held that ultimate religious truth lay solely in the Lotus Sutra, the basic text of the Greater Vehicle of Buddhism in which Gautama had revealed that all beings possess the potentiality for buddhahood.
  • At the time of its founding in Japan by Saichō in the early ninth century, the Tendai sect had been based primarily on the Lotus Sutra; but, in the intervening centuries, Tendai had deviated from the Sutra’s teachings and had even spawned new sects, like those of Pure Land Buddhism, that encouraged practices entirely at variance with these teachings.
  • Declaring himself “the pillar of Japan, the eye of the nation, and the vessel of the country,” Nichiren seems even to have equated himself with Japan and its fate.
  • The kōan is especially favored by what the Japanese call the Rinzai sect of Zen, which is also known as the school of “sudden enlightenment” because of its belief that satori, if it is attained, will come to the individual in an instantaneous flash of insight or awareness. The other major sect of Zen, Sōtō, rejects this idea of sudden enlightenment and instead holds that satori is a gradual process to be attained primarily through seated meditation.
  • Fought largely in Kyoto and its environs, the Ōnin War dragged on for more than ten years, and after the last armies withdrew in 1477 the once lovely capital lay in ruins. There was no clear-cut victor in the Ōnin War. The daimyos had simply fought themselves into exhaustion,
  • Yoshimasa was perhaps even more noteworthy as a patron of the arts than his grandfather, Yoshimitsu. In any case, his name is just as inseparably linked with the flourishing of culture in the Higashiyama epoch (usually taken to mean approximately the last half of the fifteenth century) as Yoshimitsu’s is with that of Kitayama.
  • The tea room, as a variant of the shoin room, evolved primarily in the sixteenth century.
  • Shukō’s admonition about taking care to “harmonize Japanese and Chinese tastes” has traditionally been taken to mean that he stood, in the late fifteenth century, at a point of transition from the elegant and “aristocratic” kind of Higashiyama chanoyu just described, which featured imported Chinese articles, to a new, Japanese form of the ceremony that used native ceramics,
  • the new kind of tea ceremony originated by Shukō is called wabicha, or “tea based on wabi.” Developed primarily by Shukō’s successors during the sixteenth century, wabicha is a subject for the next chapter.
  • The Japanese, on the other hand, have never dealt with nature in their art in the universalistic sense of trying to discern any grand order or structure; much less have they tried to associate the ideal of order in human society with the harmonies of nature. Rather,
  • The Chinese Sung-style master may have admired a mountain, for example, for its enduring, fixed quality, but the typical Japanese artist (of the fifteenth century or any other age) has been more interested in a mountain for its changing aspects:
  • Zen culture of Muromachi Japan was essentially a secular culture. This seems to be strong evidence, in fact, of the degree to which medieval Zen had become secularized: its view of nature was pantheistic and its concern with man was largely psychological.
  • Nobunaga’s castle at Azuchi and Hideyoshi’s at Momoyama have given their names to the cultural epoch of the age of unification. The designation of this epoch as Azuchi-Momoyama (or, for the sake of convenience, simply Momoyama) is quite appropriate in view of the significance of castles—as represented by these two historically famous structures—in the general progress, cultural and otherwise, of these exciting years.
  • Along with architecture, painting was the art that most fully captured the vigorous and expansive spirit of the Momoyama epoch of domestic culture during the age of unification. It was a time when many styles of painting and groups of painters flourished. Of the latter, by far the best known and most successful were the Kanō,
  • Motonobu also made free use of the colorful Yamato style of native art that had evolved during the Heian period and had reached its pinnacle in the great narrative picture scrolls of the twelfth and thirteenth centuries.
  • what screen painting really called for was color, and it was this that the Kanō artists, drawing on the native Yamato tradition, added to their work with great gusto during the Momoyama epoch. The color that these artists particularly favored was gold, and compositions done in ink and rich pigments on gold-leaf backgrounds became the most characteristic works of Momoyama art.
  • there could hardly be a more striking contrast between the spirits of two ages than the one reflected in the transition from the subdued monochromatic art of Japan’s medieval era to the blazing use of color by Momoyama artists, who stood on the threshold of early modern times.
  • aware, which, as we saw in Chapter 3, connotes the capacity to be moved by things. In the period of the Shinkokinshū, when Saigyō lived, this sentiment was particularly linked with the aesthetic of sabi or “loneliness” (and, by association, sadness). The human condition was essentially one of loneliness;
  • During the sixteenth century the ceremony was further developed as wabicha, or tea (cha) based on the aesthetic of wabi. Haga Kōshirō defines wabi as comprising three kinds of beauty: a simple, unpretentious beauty; an imperfect, irregular beauty; and an austere, stark beauty.
  • The alternate attendance system also had important consequences in the cultural realm, contributing to the development for the first time of a truly national culture. Thus, for example, the daimyos and their followers from throughout the country who regularly visited Edo were the disseminators of what became a national dialect or “lingua franca” and, ultimately, the standard language of modern Japan.
  • They also fostered the spread of customs, rules of etiquette, standards of taste, fashions, and the like that gave to Japanese everywhere a common lifestyle.
  • “[Tokugawa-period] statesmen thought highly of agriculture, but not of agriculturalists.” The life of the average peasant was one of much toil and little joy. Organized into villages that were largely self-governing, the peasants were obliged to render a substantial portion of their farming yields—on average, perhaps 50 percent or more—to the samurai, who provided few services in return. The resentment of peasants toward samurai grew steadily throughout the Tokugawa period and was manifested in countless peasant rebellions
  • Although in the long run the seclusion policy undeniably limited the economic growth of Tokugawa Japan by its severe restrictions both on foreign trade and on the inflow of technology from overseas, it also ensured a lasting peace that made possible a great upsurge in the domestic economy, especially during the first century of shogunate rule.
  • Both samurai and peasants were dependent almost solely on income from agriculture and constantly suffered declines in real income as the result of endemic inflation; only the townsmen, who as commercialists could adjust to price fluctuations, were in a position to profit significantly from the economic growth of the age.
  • We should not be surprised, therefore, to find this class giving rise to a lively and exuberant culture that reached its finest flowering in the Genroku epoch at the end of the seventeenth and the beginning of the eighteenth centuries. The mainstays of Genroku culture were the theatre, painting (chiefly in the form of the woodblock print), and prose fiction,
  • The Japanese had, of course, absorbed Confucian thinking from the earliest centuries of contact with China, but for more than a millennium Buddhism had drawn most of their intellectual attention. Not until the Tokugawa period did they come to study Confucianism with any great zeal.
  • One of the most conspicuous features of the transition from medieval to early modern times in Japan was the precipitous decline in the vigor of Buddhism and the rise of a secular spirit.
  • The military potential and much of the remaining landed wealth of the medieval Buddhist sects had been destroyed during the advance toward unification in the late sixteenth century. And although Buddhism remained very much part of the daily lives of the people, it not only ceased to hold appeal for many Japanese intellectuals but indeed even drew the outright scorn and enmity of some.
  • it was the Buddhist church—and especially the Zen sect—that paved the way for the upsurge in Confucian studies during Tokugawa times. Japanese Zen priests had from at least the fourteenth century on assiduously investigated the tenets of Sung Neo-Confucianism, and in ensuing centuries had produced a corpus of research upon which the Neo-Confucian scholarship of the Tokugawa period was ultimately built.
  • Yamaga Sokō is generally credited as the formulator of the code of bushidō, or the "way of the warrior." Certainly he was a pioneer in analyzing the role of the samurai as a member of a true ruling elite and not simply as a rough, and frequently illiterate, participant in the endless civil struggles of the medieval age.
  • The fundamental purpose of Neo-Confucian practice is to calm one’s turbid ki to allow one’s nature (ri) to shine forth. The person who achieves this purpose becomes a sage, his ri seen as one with the universal principle, known as the “supreme ultimate” (taikyoku), that governs all things.
  • Neo-Confucianism proposed two main courses to clarify ri, one objective and the other subjective. The objective course was through the acquisition of knowledge by means of the “investigation of things,” a phrase taken by Chu Hsi from the Chinese classic The Great Learning (Ta hsüeh). At the heart of things to investigate was history,
  • Quite apart from any practical guidance to good rulership it may have provided, this Neo-Confucian stress on historical research proved to be a tremendous spur to scholarship and learning in general during the Tokugawa period; and, as we will see in the next chapter, it also facilitated the development of other, heterodox lines of intellectual inquiry.
  • the subjective course appeared to have been taken almost directly from Buddhism, and in particular Zen. It was the course of “preserving one’s heart by holding fast to seriousness,” which called for the clarification of ri by means remarkably similar to Zen meditation.
  • The calendrical era of Genroku lasted from 1688 until 1703, but the Genroku cultural epoch is usually taken to mean the span of approximately a half-century from, say, 1675 until 1725. Setting the stage for this rise of a townsman-oriented culture was nearly a century of peace and steady commercial growth.
  • places of diversion and assignation, these quarters were the famous “floating worlds” (ukiyo) of Tokugawa fact and legend. Ukiyo, although used specifically from about this time to designate such demimondes, meant in the broadest sense the insubstantial and ever-changing existence in which man is enmeshed.
  • ukiyo always carried the connotation that life is fundamentally sad; but, in Genroku times, the term was more commonly taken to mean a world that was pleasurable precisely because it was constantly changing, exciting, and up-to-date.
  • the Tokugawa period was not at all like the humanism that emerged in the West from the Renaissance on. Whereas modern Western humanism became absorbed with people as individuals, with all their personal peculiarities, feelings, and ways, Japanese humanism of the Tokugawa period scarcely conceived of the existence of true individuals at all; rather, it focused on “the people” and regarded them as comprising essentially types, such as samurai, farmers, and courtesans.
  • there is little in the literature as a whole of that quality—character development—that is probably the single most important feature of the modern Western novel.
  • Although shogunate authorities and Tokugawa-period intellectuals in general had relatively little interest in the purely metaphysical side of Chu Hsi’s teachings, they found his philosophy to be enormously useful in justifying or ideologically legitimizing the feudal structure of state and society that had emerged in Japan by the seventeenth century.
  • With its radical advocacy of violent irrationality—to the point of psychosis—Hagakure has shocked many people. But during Japan’s militarist years of the 1930s and World War II, soldiers and others hailed it as something of a bible of samurai behavior, and the postwar nationalist writer Mishima Yukio was even inspired to write a book in praise of its values.
  • It is significant that many of the leading prose writers, poets, and critics of the most prominent journal of Japanese romanticism, Bungakukai (The Literary World, published from 1893 until 1898), were either converts to or strongly influenced by Protestant Christianity, the only creed in late Meiji Japan that gave primacy to the freedom and spiritual independence of the individual. The absolutism embodied in the Meiji Constitution demanded strict subordination of the interests of the individual to those of the state;
  • The feeling of frustration engendered by a society that placed such preponderant stress upon obedience to the group, especially in the form of filial piety toward one’s parents and loyalty to the state, no doubt accounts for much of the sense of alienation observable in the works of so many modern Japanese writers.
  • These writers have been absorbed to an unusual degree with the individual, the world of his personal psychology, and his essential loneliness. In line with this preoccupation, novelists have perennially turned to the diary-like, confessional tale—the so-called I-novel—as their preferred medium of expression.
  • In intellectual and emotional terms, the military came increasingly to be viewed as the highest repository of the traditional Japanese spirit that was the sole hope for unifying the nation to act in a time of dire emergency.
  • The enemy that had led the people astray was identified as those sociopolitical doctrines and ideologies that had been introduced to Japan from the West during the preceding half-century or so along with the material tools of modernization.
  • If there is a central theme to this book, it is that the Japanese, within the context of a history of abundant cultural borrowing from China in premodern times and the West in the modern age, have nevertheless retained a hard core of native social, ethical, and cultural values by means of which they have almost invariably molded and adapted foreign borrowing to suit their own tastes and purposes.
Javier E

Opinion | Pound for Pound, Taiwan Is the Most Important Place in the World - The New York Times - 0 views

  • The new Cold War, between the United States and China, is increasingly focused on access to just one industry in one place: computer chips made in Taiwan.
  • Over the past year, Taiwan has taken a lead in the race to build thinner, faster and more powerful chips, or semiconductors. Its fastest chips are the critical building blocks of rapidly evolving digital industries like artificial intelligence and high-speed computing.
  • As of now, any country looking to dominate the digital future has to buy these superfast, ultrathin chips from either Taiwan or South Korea. And Taiwan has the edge in both technology and market power.
  • It is a small island of just 24 million people, but it is at the center of the battle for global technological supremacy. Pound for pound, it is the most important place in the world. As the Cold War between China and the United States intensifies, that importance will only continue to grow.
  • After World War II, only two major emerging economies managed to grow faster than 5 percent for five decades in a row and to rise from poverty into the ranks of developed economies. One was Taiwan, the other South Korea
  • They kept advancing up the industrial ladder by investing more heavily in research and development than did any of their rivals among emerging economies. Now they are among the research leaders of the developed economic world as well.
  • How did they accomplish this feat? Competent governments played a major role. South Korea nurtured giant conglomerates like Samsung and Hyundai, which exported consumer products under their own brand names. Taiwan cultivated smaller companies focused on making parts or assembling finished products for foreign brands
  • Today the flexibility goes a long way toward explaining its success.
  • Mixing overseas experience with young local graduates, the science parks became hothouses for entrepreneurial start-ups.
  • Going forward, many tech analysts predict that Taiwan’s business model gives it a clear edge. Most customers prefer a pure foundry that does not compete with them to design chips or build devices, and only Taiwan offers this service.
  • That is a big reason Apple has been switching from Samsung to T.S.M.C. for the processing chips in the iPhone and why Intel is expected to outsource production of its most advanced chips mainly to T.S.M.C.
  • Taiwan has tried to position itself as the “Switzerland” of chips, a neutral supplier, but it increasingly finds itself at the center of the jousting between China and the United States
  • Historically, the importance of Taiwan was calculated in geopolitical terms. A small democracy thriving in the shadow of a Communist giant stirred sympathy and support in Washington. Now, as a byproduct of its successful economic model, Taiwan has become a critical link in the global tech supply chain, adding economic weight to the geopolitical calculations.
alexdeltufo

Republicans ignore the lessons of World War II - The Washington Post - 0 views

  • President Obama addressed the nation Sunday night from the Oval Office, a rare use of the sacred symbols of the presidency to reassure Americans about their security while steeling them for a long and complex struggle against the Islamic State.
  • Donald Trump answered Obama’s call for tolerance by declaring that no Muslims should be permitted to enter the United States:
  • “Repackaged half measures . . . Tone deaf . . . sales pitch for the status quo . . . President Obama is riding the bench at T-ball today.”
  • But the only thing this reflexive complaining does is divide the country further and make a coherent response to Islamic State more difficult.
  • There, I met Dale “Red” Robinson, a Pearl Harbor survivor and a staff sergeant in the infantry who was later part of the Normandy landing. He recalled the national unity of that war
  • “Nothing like this, these days. It’s sad, kind of sad, my friend.”
  • Robinson said he feels “sorry for all of the soldiers” serving today, let down by their political leaders.
  • That’s what Monday afternoon’s ceremony was about. A sailor rang a bell at 1:57 p.m. Eastern time, the moment 74 years earlier when Japanese planes struck
  • Later, while the band played “America the Beautiful,” the veterans, some wheeled, some walking with support, made a slow procession around the memorial’s pool to place wreaths
  • We live all of us today with the prosperity and the security built on the shoulders of these heroes,”
  • That’s our challenge — and we’re failing.
  • “The difference is the uncertainty of today, and it’s a big difference,”
  • “I thought they were more loyal, more concerned about the nation than their position,” he said.
  • Mays complained about the sharp partisan divisions in Congress. “How can we bring unity when you have that?”
  • Now, our representatives can’t even manage to come up with a resolution authorizing the use of military force against the Islamic State — and they’ve been at it for a year.
  • “just a half-hearted attempt to defend and distract from a failing policy.” Ryan said we are “one step behind our enemy.”
  • and the constant sniping at each other does nothing to defeat the Islamic State. As those old soldiers on the Mall taught us, victory comes from unity.
  • Dana Milbank
Javier E

Readers on Guns: The Lynching Parallel - James Fallows - The Atlantic - 0 views

  • Let's begin with a comparison to a previous "uncontrollable" phenomenon of mass American violence: the wave of lynchings in the early 20th century
  • you'll find many parallels between lynchings and mass killings. First and foremost is the irrationality of the violence, the notion that it's an uncontrollable condition that comes over the killer or killers. Both are a subset of violence in a violent culture carried out by people not considered professional criminals.
  • lynchings had common catalysts, just like mass shootings do. And in each, individual incidents seem to seed the air and feed each other psychologically. Each new lynching or shooting increases, or increased, the odds of the next one, it seems to me. And lynchings were considered just as inevitable and eternal as mass shootings are in America's modern gun culture.
  • lynchings and mass shootings are almost photo negatives of each other. An individual doing to an anonymous crowd what an anonymous crowd does to an individual....
  • 1) Clean the air: We need "responsible" gun owners to help police the
  • Legislators and courts and law enforcement had clear and major roles -- once they decided to play them -- in suppressing the mob. It's much less clear, obviously, what will work on mass shootings.
  • I have some thoughts:
  • From the time that serious pieces of the establishment began to condemn and try to systematically stop lynchings (around World War I) to the time that real progress was made -- after World War II -- was 30 years. This is a long-term project.
  • bullshit bravado that dominates right wing gun culture. All of that bravado, like almost all bravado, is based on an irrational fear. The government is going to take my gun. Bullshit. And we all need to attack it as bullshit. It is the same level of bullshit as "That negro is gonna rape my daughter." It's the exact same irrational fear of the other.
  • While the angry gun culture may not carry out most shootings, they are willing to tolerate them, just as much of America was long willing to tolerate lynchings, because of this primal/tribal fear. The NRA, like the Klan before it, is kind of a shiny object. It's hugely important, but it's the wider bullying gun culture that is the core of the problem.
  • If we can reduce the gun rhetoric pollution in the air somewhat through shaming, that may make these explosions less common.
  • 2) Licensing v. Bans: Along those lines, your gun safety vs. gun control distinction is precisely correct. This is about meaningful licensing measures. Ways to assess the intersection of people and guns. Use the gun culture's own language. If you think people kill people, not guns, why do you object to closer monitoring of people? And I'd suggest working through concealed carry expansion. I would absolutely trade concealed carry expansion for stricter licensing and background measures
  • 3) The Drug War: So much of our gun culture and violence organizes itself around drug prohibition -- both through tools of business and means of enforcement -- that any act of violence, especially gun violence, is inseparable from it
  • We'll never fully clean the air today without ratcheting down the drug war and moving toward as much legalization as possible
  • 4) Seize this moment: One place where I think this moment is different than others is the growing sense in the country -- even among conservatives -- that right wing cultural nihilism is the greatest short and long-term challenge we face
  • The resistance to any effort to combat gun violence with something other than guns needs to be understood -- the NRA needs to be understood -- as a subsidiary of right-wing nihilism.
  • Over and over again, we ask what constructive suggestion do you offer to help govern your country to the benefit of all its people? And we get, fuck you, 47 percent. Arm yourself.
qkirkpatrick

How WWI was waged at sea - Washington Times - 0 views

  • In “Dreadnought,” the reader is reminded that the author is a biographer at heart: His portraits of the young Winston Churchill; of the intense, eccentric Adm. Sir John “Jacky” Fisher; and of Germany’s ambitious Adm. Alfred von Tirpitz make for a riveting narrative.
  • The two great fleets had vastly different origins. The German fleet was the creation of an unstable monarch, Kaiser Wilhelm II, who chose to compete for naval supremacy with the Royal Navy.
  • At sea as on land, World War I confounded the planners on both sides. The Germans anticipated a sea blockade, but thought that an enemy blockade close to their own coast would be vulnerable to sorties by their own warships.
  • Instead, Britain instituted a remote blockade, declaring most of the North Sea a war zone and enforcing the blockade from a distance. Much of the war was passed in attempts by each side to lure the other into a naval ambush.
  • The climax of World War I at sea was the Battle of Jutland on May 31, 1916. There, in the North Sea off Denmark, a sortie by German battle cruisers led to the greatest naval battle of the last century, involving at one time or another more than 200 vessels, including 44 battleships.
  • The British lost more ships and men than their enemies in the resulting melee, in part because of the rugged construction of the German ships.
Javier E

Opinion | White Riot - The New York Times - 0 views

  • how important is the frustration among what pollsters call non-college white men at not being able to compete with those higher up on the socioeconomic ladder because of educational disadvantage?
  • How critical is declining value in marriage — or mating — markets?
  • How toxic is the combination of pessimism and anger that stems from a deterioration in standing and authority? What might engender existential despair, this sense of irretrievable loss?
  • How hard is it for any group, whether it is racial, political or ethnic, to come to terms with losing power and status? What encourages desperate behavior and a willingness to believe a pack of lies?
  • I posed these questions to a wide range of experts. This column explores their replies.
  • While most acute among those possessing high status and power, Anderson said: People in general are sensitive to status threats and to any potential losses of social standing, and they respond to those threats with stress, anxiety, anger, and sometimes even violence
  • White supremacy and frank racism are prime motivators, and they combined with other elements to fuel the insurrection: a groundswell of anger directed specifically at elites and an addictive lust for revenge against those they see as the agents of their disempowerment.
  • It is this admixture of factors that makes the insurgency that wrested control of the House and Senate so dangerous — and is likely to spark new forms of violence in the future.
  • The population of U.S. citizens who've lost the most power in the past 40 years, who aren't competing well to get into college or get high paying jobs, whose marital prospects have dimmed, and who are outraged, are those I believe were most likely to be in on the attack.
  • The terrorist attacks on 9/11, the Weatherman bombings in protest of the Vietnam War, ethnic cleansing in Bosnia, or the assassination of abortion providers, may be motivated by different ideological beliefs but nonetheless share a common theme: The people who did these things appear to be motivated by strong moral conviction. Although some argue that engaging in behaviors like these requires moral disengagement, we find instead that they require maximum moral engagement and justification.
  • “lower class individuals experience greater vigilance to threat, relative to high status individuals, leading them to perceive greater hostility in their environment.”
  • This increased vigilance, Brinke and Keltner continue, creates a bias such that relatively low socio-economic status individuals perceive the powerful as dominant and threatening — endorsing a coercive theory of power
  • there is evidence that individuals of lower social class are more cynical than those occupying higher classes, and that this cynicism is directed toward out-group members — that is, those that occupy higher classes.
  • Before Trump, many of those who became his supporters suffered from what Carol Graham, a senior fellow at Brookings, describes as pervasive “unhappiness, stress and lack of hope” without a narrative to legitimate their condition:
  • When the jobs went away, families fell apart. There was no narrative other than the classic American dream that everyone who works hard can get ahead, and the implicit correlate was that those who fall behind and are on welfare are losers, lazy, and often minorities.
  • What, however, could prompt a mob — including not only members of the Proud Boys and the Boogaloo Bois but also many seemingly ordinary Americans drawn to Trump — to break into the Capitol?
  • One possible answer: a mutated form of moral certitude based on the belief that one’s decline in social and economic status is the result of unfair, if not corrupt, decisions by others, especially by so-called elites.
  • There is evidence that many non-college white Americans who have been undergoing what psychiatrists call “involuntary subordination” or “involuntary defeat” both resent and mourn their loss of centrality and what they perceive as their growing invisibility.
  • violence is: considered to be the essence of evil. It is the prototype of immorality. But an examination of violent acts and practices across cultures and throughout history shows just the opposite. When people hurt or kill someone, they usually do it because they feel they ought to: they feel that it is morally right or even obligatory to be violent.
  • “Most violence,” Fiske and Rai contend, “is morally motivated.”
  • A key factor working in concert to aggravate the anomie and disgruntlement in many members of Trump’s white working-class base is their inability to obtain a college education, a limitation that blocks access to higher paying jobs and lowers their supposed “value” in marriage markets.
  • In their paper “Trends in Educational Assortative Marriage From 1940 to 2003,” Christine R. Schwartz and Robert D. Mare, professors of sociology at the University of Wisconsin and the University of California-Los Angeles, wrote that the “most striking” data in their research, “is the decline in odds that those with very low levels of education marry up.”
  • there is very consistent and compelling evidence to suggest that some of what we have witnessed this past week is a reflection of the angst, anger, and refusal to accept an "America" in which White (Christian) Americans are losing dominance, be it political, material, and/or cultural. And, I use the term dominance here, because it is not simply a loss of status. It is a loss of power. A more racially, ethnically, religiously diverse US that is also a democracy requires White Americans to acquiesce to the interests and concerns of racial/ethnic and religious minorities.
  • In this new world, Federico argues, “promises of broad-based economic security” were replaced by a job market where you can have dignity, but it must be earned through market or entrepreneurial success (as the Reagan/Thatcher center-right would have it) or the meritocratic attainment of professional status (as the center-left would have it). But obviously, these are not avenues available to all, simply because society has only so many positions for captains of industry and educated professionals.
  • The result, Federico notes, is that “group consciousness is likely to emerge on the basis of education and training” and when “those with less education see themselves as being culturally very different from an educated stratum of the population that is more socially liberal and cosmopolitan, then the sense of group conflict is deepened.”
  • A major development since the end of the “Great Compression” of the 30 years or so after World War II, when there was less inequality and relatively greater job security, at least for white male workers, is that the differential rate of return on education and training is now much higher.
  • Trump, Richeson continued, leaned into the underlying White nationalist sentiments that had been on the fringe in his campaign for the presidency and made his campaign about re-centering Whiteness as what it actually means to be American and, by implication, delegitimizing claims for greater racial equity, be it in policing or any other important domain of American life.
  • Whites in the last 60 years have seen minoritized folks gain more political power, economic and educational opportunity. Even though these gains are grossly exaggerated, Whites experience them as a loss in group status.
  • all the rights revolutions — civil rights, women's rights, gay rights — have been key to the emergence of the contemporary right wing: As the voices of women, people of color, and other traditionally marginalized communities grow louder, the frame of reference from which we tell the story of America is expanding
  • The white male story is not irrelevant but it’s insufficient, and when you have a group of people that are accustomed to the spotlight see the camera lens pan away, it’s a threat to their sense of self. It’s not surprising that QAnon support started to soar in the weeks after B.L.M. QAnon offers a way for white evangelicals to place blame on (fictional) bad people instead of a broken system. It’s an organization that validates the source of Q-Anoners insecurity — irrelevance — and in its place offers a steady source of self-righteousness and acceptance.
  • “compared to other advanced countries caught up in the transition to knowledge society, the United States appears to be in a much more vulnerable position to a strong right-wing populist challenge.”
  • First, Kitschelt noted: The difference between economic winners and losers, captured by income inequality, poverty, and illiteracy rates within the dominant white ethnicity, is much greater than in most other Western countries, and there is no dense welfare state safety net to buffer the fall of people into unemployment and poverty.
  • Another key factor, Kitschelt pointed out, is that the decline of male status in the family is more sharply articulated than in Europe, hastened in the U.S. by economic inequality (men fall further under changing economic circumstances) and religiosity (leading to pockets of greater male resistance to the redefinition of gender roles).
  • More religious and less well-educated whites see Donald Trump as one of their own despite his being so obviously a child of privilege. He defends America as a Christian nation. He defends English as our national language. He is unashamed in stating that the loyalty of any government should be to its own citizens — both in terms of how we should deal with noncitizens here and how our foreign policy should be based on the doctrine of “America First.”
  • On top of that, in the United States, many lines of conflict mutually reinforce each other rather than crosscut: Less educated whites tend to be more Evangelical and more racist, and they live in geographical spaces with less economic momentum.
  • for the moment the nation faces, for all intents and purposes, the makings of a civil insurgency. What makes this insurgency unusual in American history is that it is based on Trump’s false claim that he, not Joe Biden, won the presidency, that the election was stolen by malefactors in both parties, and that majorities in both branches of Congress no longer represent the true will of the people.
  • We would not have Trump as president if the Democrats had remained the party of the working class. The decline of labor unions proceeded at the same rate when Democrats were president as when Republicans were president; the same is, I believe, true of loss of manufacturing jobs as plants moved overseas.
  • President Obama, Grofman wrote, responded to the housing crisis with bailouts of the lenders and interlinked financial institutions, not of the folks losing their homes. And the stagnation of wages and income for the middle and bottom of the income distribution continued under Obama. And the various Covid aid packages, while they include payments to the unemployed, are also helping big businesses more than the small businesses that have been and will be permanently going out of business due to the lockdowns (and they include various forms of pork).
  • “white less well-educated voters didn’t desert the Democratic Party, the Democratic Party deserted them.”
  • Unlike most European countries, Kitschelt wrote, the United States had a civil war over slavery in the 19th century and a continuous history of structural racism and white oligarchical rule until the 1960s, and in many aspects until the present. Europe lacks this legacy.
  • He speaks in a language that ordinary people can understand. He makes fun of the elites who look down on his supporters as a “basket of deplorables” and who think it is a good idea to defund the police who protect them and to prioritize snail darters over jobs. He appoints judges and justices who are true conservatives. He believes more in gun rights than in gay rights. He rejects political correctness and the language-police and woke ideology as un-American. And he promises to reclaim the jobs that previous presidents (of both parties) allowed to be shipped abroad. In sum, he offers a relatively coherent set of beliefs and policies that are attractive to many voters and which he has been better at seeing implemented than any previous Republican president.
  • What Trump supporters who rioted in D.C. share are the beliefs that Trump is their hero, regardless of his flaws, and that defeating Democrats is a holy war to be waged by any means necessary.
  • In the end, Grofman said: Trying to explain the violence on the Hill by only talking about what the demonstrators believe is to miss the point. They are guilty, but they wouldn't be there were it not for the Republican politicians and the Republican attorneys general, and most of all the president, who cynically exaggerate and lie and create fake conspiracy theories and demonize the opposition. It is the enablers of the mob who truly deserve the blame and the shame.
anonymous

Spike Lee and the Battlefield of American History - The New York Times - 0 views

  • This is Lee at a strange and singular moment in his career. He has spent nearly four decades and more than 30 films reckoning with the jagged and brutal course of history. Now, in the middle of a global calamity, and with a new film, “Da 5 Bloods,” that revisits the Vietnam War, he is its witness once again — older, more contemplative and as insatiable as ever, despite a legacy as solid as exists in American cinema.
  • If front-line workers are the heroes of this story, it’s clear who Lee thinks is the villain. The director, an outspoken antagonist of Donald J. Trump since the 1980s, lamented the president’s “pathetic lack of leadership,” singling out his widely condemned public musings on crackpot treatments for the virus.
  • “Telling people to use ultraviolet lights? Drinking bleach and whatnot?” Lee said, leaning into a chuckle. He squinted, as if he still couldn’t believe it himself. “People will go to the hospital because they believe” that stuff, he said. “Get out of here with that!”
  • ...5 more annotations...
  • Trump is a significant figure in “Da 5 Bloods,” an action-adventure tale about four black veterans who return to Vietnam more than 40 years after the war. A central character, Paul, played by the longtime Lee collaborator Delroy Lindo, is an avowed Trump supporter and spends much of the film in a red “Make America Great Again” hat.
  • Though Paul’s vocal defense of the president may come as a surprise to some, Lee has a long track record portraying complicated black characters without sanitizing them. Exit polls show that while the vast majority of black voters overall supported Hillary Clinton in the 2016 presidential election, 13 percent of black men supported Trump.
  • The two were particularly interested in the psychology of black soldiers who fought for freedoms abroad that they’d been denied at home, a subject Lee previously explored in his World War II film “Miracle at St. Anna” (2008).
  • “Da 5 Bloods,” which, in addition to footage of antiwar protests, is intercut with some extremely graphic documentary images of the war, including a haunting photo from the My Lai massacre, reaffirms Lee’s capacity for outrage at his country. That capacity was tested again recently, when footage showing the killing of Ahmaud Arbery was released earlier this month.
  • “It’s 2020, and black and brown people are being shot like animals,” Lee said, his voice climbing to a new register. “Tell me in what world can two brothers with a handgun and a shotgun follow a white jogger in a pickup truck, kill him, and it takes two months for them to get arrested?”
Javier E

FC95: The Age of Louis XIV, the "Sun King" (1643-1715) - The Flow of History - 2 views

  • Introduction From 1643 to 1815 France dominated much of Europe's political history and culture.  Foreigners came to France, preferring it to the charms of their own homeland.  Even today, many still consider it the place to visit in Europe and the world.  In the 1600's and 1700's there was a good reason for this dominance: population.  France had 23,000,000 people in a strongly unified state compared to 5,000,000 in Spain and England, and 2,000,000 in the Dutch Republic and the largest of the German states.  This reservoir of humanity first reached for and nearly attained the dominance of Europe under Louis XIV, the "Sun King".
  • Louis' early life and reign (1643-61) Louis was born in 1638 and succeeded his father, Louis XIII, as king in 1643 at the age of five.  Luckily, another able minister and Richelieu's successor, Cardinal Mazarin, continued to run the government.  In 1648, encroachment by the government on the nobles' power, poor harvests, high taxes, and unemployed mercenaries plundering the countryside after the Thirty Years War led to a serious revolt known as the Fronde, named after the slingshot used by French boys.  Louis and the court barely escaped from Paris with their lives.  Although Mazarin and his allies crushed the rebels after five hard years of fighting (1648-53), Louis never forgot the fear and humiliation of having to run from the Parisian mob and fight for his life and throne against the nobles.  This bitter experience would heavily influence Louis' policies when he ruled on his own.
  • Louis XIV may not have said, "I am the state", but he ruled as if he had said it.  Louis was the supreme example of the absolute monarch, and other rulers in Europe could do no better than follow his example.  Although Louis wished to be remembered as a great conqueror, his first decade of active rule was largely taken up with building France's internal strength.  There are two main areas of Louis' rule we will look at here: finances and the army. Louis' finance minister, Jean Baptiste Colbert, was an astute businessman of modest lineage, being the son of a draper.  Colbert's goal was to build France's industries and reduce foreign imports.  This seventeenth century policy where a country tried to export more goods and import more gold and silver was known as mercantilism.  While its purpose was to generate revenue for the king, it also showed the growing power of the emerging nation state.  Colbert declared his intention to reform the whole financial structure of the French state, and he did succeed in reducing the royal debt by cutting down on the number of tax farms he sold and freeing royal lands from mortgage.  Colbert especially concentrated on developing France's economy in three ways.
  • ...7 more annotations...
  • Versailles Louis' religious faith was largely a superficial one attached to the elaborate ritual of the Catholic mass.  This love of ritual also showed itself in how Louis ran his court at his magnificent palace of Versailles, several miles outside of Paris.  Much of the reason for building Versailles goes back to the Fronde that had driven Louis from Paris as a young boy.  Ever since then, Louis had distrusted the volatile Paris mob and was determined to move the court away from the influence of that city.  Versailles was also the showpiece of Louis' reign, glorifying him as the Sun King with its magnificent halls and gardens.
  • Religion was one aspect of Louis' reign that illustrated the absolute nature of his monarchy quite well.  Louis himself was quite a pious Catholic, learning that trait from his mother.  However, in the spirit of the day, he saw religion as a department of state subordinate to the will of the king.  By the same token, not adhering to the Catholic faith was seen as treason. As a result, Louis gradually restricted the rights of the French Huguenots and finally, in 1685, revoked the Edict of Nantes, which had given them religious freedom since the end of the French Wars of Religion in 1598.  This drove 200,000 Huguenots out of France, depriving it of some of its most skilled labor.  Thus Louis let his political and religious biases ruin a large sector of France's economy.
  • Results of Louis' reign The age of Louis XIV was important to European history for several reasons.  First of all, it saw the triumph of absolutism in France and continental Europe.  Versailles was a glittering symbol and example for other European rulers to follow.  Any number of German and East European monarchs modeled their states and courts after Louis XIV, sometimes to the point of financial ruin.  Second, Louis' wars showed the system of Balance of Power politics working better than ever.  French aggression was contained and the status quo was maintained.  All this had its price, since the larger sizes of the armies and the final replacement of the pike with the musket took European warfare to a new level of destruction.  Finally, Louis' reign definitely established France as the dominant power in Europe.  However, the cost was immense and left his successors a huge debt.  Ironically, the problems caused by Louis XIV's reign would help lead to the French Revolution in 1789 and the spread of democratic principles across Europe and eventually the world.
  • Louis' main goals were to expand France to its "natural borders": the Rhine, the Alps, and the Pyrenees.  This, of course, would make him enemies among the Dutch, Germans, Austrians, Spanish, and English.  Therefore, Louis' diplomacy had to clear the way to make sure he did not fight everyone at once.  For this purpose he skillfully used money to neutralize potential enemies (such as Charles II of England in the Secret Treaty of Dover) and extracted favorable terms from stalemate or losing situations.  But Louis could also make some fateful blunders to hurt his cause.  His obsessive hatred of the Dutch dominated his policy too much, as did his own self-confidence and arrogance in trying to publicly humiliate his enemies.  However, this just alarmed Louis' enemies more, especially the Dutch, Austrians, and English, who allied against Louis to preserve the balance of power.
  • Exhaustion on both sides finally led to the Treaty of Utrecht in 1713.  Louis' grandson took the throne of Spain and its American empire, but the French and Spanish thrones could not be united under one ruler.  Austria got the Spanish Netherlands to contain French aggression to the north.  Just as the Treaty of Westphalia in 1648 had contained Hapsburg aggression, the Treaty of Utrecht contained French expansion.  Two years later Louis XIV was dead, with little to show for his vaunted ambitions as a conqueror except an exhausted economy and dissatisfied populace.
  • Just as Louis's palace at Versailles dominated European culture during the late 1600's and early 1700's, his diplomacy and wars dominated European political history.  As Louis himself put it: "The character of a conqueror is regarded as the noblest and highest of titles."  Interestingly enough, he never led his troops in battle except for overseeing a few sieges from a safe distance.
  • I am the state. — Voltaire, incorrectly quoting Louis XIV
Javier E

Opinion | Therapy Culture Has Undermined Our Maturity - The New York Times - 0 views

  • to trace the decline of the American psyche, I suppose I would go to a set of cultural changes that started directly after World War II and built over the next few decades, when writers as diverse as Philip Rieff, Christopher Lasch and Tom Wolfe noticed the emergence of what came to be known as the therapeutic culture.
  • many writers noticed that this ethos often turned people into fragile narcissists. It cut them off from moral traditions and the normal sources of meaning and identity. It pushed them in on themselves, made them self-absorbed, craving public affirmation so they could feel good about themselves
  • in a therapeutic culture people’s sense of self-worth depends on their subjective feelings about themselves. Do I feel good about myself? Do I like me?
  • ...32 more annotations...
  • In earlier cultural epochs, many people derived their self-worth from their relationship with God, or from their ability to be a winner in the commercial marketplace
  • As Lasch wrote in his 1979 book, “The Culture of Narcissism,” such people are plagued by an insecurity that can be “overcome only by seeing his ‘grandiose self’ reflected in the attentions of others.”
  • “Plagued by anxiety, depression, vague discontents, a sense of inner emptiness, the ‘psychological man’ of the 20th century seeks neither individual self-aggrandizement nor spiritual transcendence but peace of mind, under conditions that increasingly militate against it.”
  • Fast forward a few decades, and the sense of lostness and insecurity, which Lasch and many others had seen in nascent form, had transmogrified into a roaring epidemic of psychic pain. By, say, 2010, it began to be clear that we were in the middle of a mental health crisis, with rising depression and suicide rates, an epidemic of hopelessness and despair among the young.
  • Before long, safetyism was on the march. This is the assumption that people are so fragile they need to be protected from social harm. Slate magazine proclaimed 2013 “the year of the trigger warning.” Concepts like “microaggression” and “safe spaces” couldn’t have lagged far behind.
  • Social media became a place where people went begging for attention, validation and affirmation — even if they often found rejection instead.
  • the elephantiasis of trauma
  • Once, the word “trauma” referred to brutal physical wounding one might endure in war or through abuse. But usage of the word spread so that it was applied across a range of upsetting experiences.
  • A mega-best-selling book about trauma, “The Body Keeps the Score,” by Bessel van der Kolk, became the defining cultural artifact of the era. Parul Sehgal wrote a perceptive piece in The New Yorker called “The Case Against the Trauma Plot,” noting how many characters in novels, memoirs and TV shows are trying to recover from psychological trauma — from Ted Lasso on down. In January 2022, Vox declared that “trauma” had become “the word of the decade,” noting that there were over 5,500 podcasts with the word in the title.
  • For many people, trauma became their source of identity. People began defining themselves by the way they had been hurt.
  • a culture war, and that’s what happened to the psychological crisis. In one camp, there were the coddlers.
  • They sought to alter behavior and reform institutions so that no one would feel emotionally unsafe
  • the first bad idea in “The Coddling of the American Mind.” It was the notion that “what doesn’t kill you makes you weaker,” inducing people to look at the wounds in their past and feel debilitated, not stronger.
  • the coddling approach turned out to be counterproductive. It was based on a series of false ideas that ended up hurting the people it was trying to help.
  • People on all sides genuinely come to believe they are powerless, unwilling to assume any responsibility for their plight — another classic symptom of immaturity.
  • The third bad idea is, “If I keep you safe, you will be strong.”
  • But overprotective parenting and overprotective school administration don’t produce more resilient children; they produce less resilient ones.
  • The counterreaction to the coddlers came from what you might call the anti-fragile coalition. This was led by Jordan Peterson and thousands of his lesser imitators
  • they merely represented the flip side of the fragile victim mind-set.
  • The right-wing victimologists feel beset by hidden forces trying to oppress them, by a culture that conspires to unman them, dark shadowy conspiracies all around
  • recent right-wing narratives, even J.D. Vance’s “Hillbilly Elegy,” often follow the trauma formula: “Take the lamentations about atrophying manhood and falling sperm counts. Call it what you want, but the core idea is always shaped like trauma. Once, we were whole, but now we’re not; now we suffer from a sickness we struggle to grasp or name.”
  • The instability of the self has created an immature public culture — impulsive, dramatic, erratic and cruel. In institution after institution, from churches to schools to nonprofits, the least mature voices dominate and hurl accusations, while the most mature lie low, trying to get through the day.
  • They are considerate to and gracious toward others because they can see situations from multiple perspectives
  • The founders of the therapeutic ethos thought they were creating autonomous individualists who would feel good about themselves. But, as Lasch forecast: “The narcissist depends on others to validate his self-esteem. He cannot live without an admiring audience. His apparent freedom from family ties and institutional constraints does not free him to stand alone or to glory in his individuality. On the contrary, it contributes to his insecurity.”
  • Maturity, now as ever, is understanding that you’re not the center of the universe. The world isn’t a giant story about me.
  • In a nontherapeutic ethos, people don’t build secure identities on their own. They weave their stable selves out of their commitments to and attachments with others. Their identities are forged as they fulfill their responsibilities as friends, family members, employees, neighbors and citizens. The process is social and other-absorbed; not therapeutic.
  • Maturity in this alternative ethos is achieved by getting out of your own selfish point of view and developing the ability to absorb, understand and inhabit the views of others.
  • Mature people are calm amid the storm because their perception lets them see the present challenges from a long-term vantage.
  • The second false idea was, “I am a thing to whom things happen.” The traumatized person is cast as a passive victim unable to control his own life. He is defined by suffering and
  • They can withstand the setbacks because they have pointed their life toward some concrete moral goal.
  • “one of the greatest indicators of our own spiritual maturity is revealed in how we respond to the weaknesses, the inexperience and the potentially offensive actions of others.”
  • a sign of maturity is the ability to respond with understanding when other people have done something stupid and given you the opportunity to feel superior.
criscimagnael

Deep in Vatican Archives, Scholar Discovers 'Flabbergasting' Secrets - The New York Times - 0 views

  • David Kertzer has spent decades excavating the Vatican’s hidden history, with his work winning a Pulitzer and capturing Hollywood’s attention. A new book examines Pope Pius XII’s role in the Holocaust.
  • Over the last few decades, Mr. Kertzer has turned the inquisitive tables on the church. Using the Vatican’s own archives, the soft-spoken Brown University professor and trustee at the American Academy in Rome has become arguably the most effective excavator of the Vatican’s hidden sins, especially those leading up to and during World War II.
  • The son of a rabbi who participated in the liberation of Rome as an Army chaplain, Mr. Kertzer grew up in a home that had taken in a foster child whose family was murdered in Auschwitz.
  • ...12 more annotations...
  • “Part of what I hope to accomplish,” Mr. Kertzer said, “is to show how important a role Pius XII played.”
  • Mr. Kertzer makes the case that Pius XII’s overriding dread of Communism, his belief that the Axis powers would win the war, and his desire to protect the church’s interests all motivated him to avoid offending Hitler and Mussolini, whose ambassadors had worked to put him on the throne. The pope was also worried, the book shows, that opposing the Führer would alienate millions of German Catholics.
  • The book further reveals that a German prince and fervent Nazi acted as a secret back channel between Pius XII and Hitler, and that the pope’s top Vatican adviser on Jewish issues urged him in a letter not to protest a Fascist order to arrest and send to concentration camps most of Italy’s Jews.
  • “A more open protest would not have saved a single Jew but killed even more,” Michael Hesemann, who considers Pius XII a champion of Jews, wrote in response to the evidence revealed by Mr. Kertzer, whom he called “heavily biased.”
  • Since then, Vatican archivists recognize and, sometimes, encourage him.
  • On Oct. 16, 1943, Nazis rounded up more than a thousand of them throughout the city, including hundreds in the Jewish ghetto, now a tourist attraction where crowds feast on Jewish-style artichokes near a church where Jews were once forced to attend conversion sermons.
  • “They didn’t want to offend the pope,” Mr. Kertzer said. His book shows that Pius XII’s top aides only interceded with the German ambassador to free “non-Aryan Catholics.” About 250 were released. More than a thousand were murdered in Auschwitz.
  • One U.S. soldier, a Jew from Rome who had emigrated to America when Mussolini introduced Italy’s racial laws, asked Rabbi Kertzer if he could make an announcement to see if his mother had survived the war. The rabbi positioned the soldier at his side, and when the services started, a cry broke out and the G.I.’s mother rushed up to embrace her son.
  • At Brown University, his organizing against the Vietnam War nearly got him kicked out, and landed him in a jail cell with Norman Mailer. He stayed in school and became enamored with anthropology and with Susan Dana, a religion major from Maine.
  • In the early 1990s, an Italian history professor told him about Edgardo Mortara, a 6-year-old child of Jewish parents in Bologna. In 1858, the church Inquisitor ordered the boy seized because a Christian servant girl had possibly, and secretly, had him baptized, and so he could not remain in a Jewish family.
  • Mr. Kertzer argues that the unearthed documents paint a more nuanced picture of Pius XII, showing him as neither the antisemitic monster often called “Hitler’s Pope” nor a hero.
  • “Perhaps even they’re happy that some outsider is able to bring this to light because it’s awkward, perhaps, for some of them to do so,” he said.
anonymous

Wooing Trump, Xi Jinping Seeks Great Power Status for China - The New York Times - 0 views

  • Wooing Trump, Xi Jinping Seeks Great Power Status for China
  • Chinese leaders have long sought to present themselves as equals to American presidents. Xi Jinping has wanted something more: a special relationship that sets China apart, as the other great power in an emerging bipolar world.
  • Mr. Trump has often cast China as an unfair trade rival, and, after arriving in Japan on Sunday, he vowed to build a “free and open Indo-Pacific,” a phrase designed to emphasize America’s democratic allies in the region as a balance against China’s rise.
  • ...6 more annotations...
  • “The outcome of this clash of national ambitions will be one of the great, perhaps perilous stories of the next several decades,” said David M. Lampton, a professor at the Johns Hopkins School of Advanced International Studies.
  • the pomp will also be a chance for Mr. Xi to showcase his “China Dream” — a vision of his nation joining or perhaps supplanting the United States as a superpower leading the world.
  • Mr. Xi is expected to propose some version of what he has called a “new type of great power relations,” the idea that China and the United States should share global leadership as equals and break a historical pattern of conflict between rising and established powers.
  • Mr. Xi has positioned China as a stable alternative to the United States, willing to take on the obligations of global leadership and invest in big infrastructure projects across Asia and Europe much as the United States did after World War II.
  • Mr. Xi is now the unquestioned paramount leader of China, Professor Yan added, while Mr. Trump only “represents himself.”
  • “The word ‘partnership’ has a long and sad history in the vocabulary of U.S.-China relations,” Professor Lampton said. “American politicians since have avoided the word partnership.”
Javier E

The Coming Software Apocalypse - The Atlantic - 0 views

  • Our standard framework for thinking about engineering failures—reflected, for instance, in regulations for medical devices—was developed shortly after World War II, before the advent of software, for electromechanical systems. The idea was that you make something reliable by making its parts reliable (say, you build your engine to withstand 40,000 takeoff-and-landing cycles) and by planning for the breakdown of those parts (you have two engines). But software doesn’t break. Intrado’s faulty threshold is not like the faulty rivet that leads to the crash of an airliner. The software did exactly what it was told to do. In fact it did it perfectly. The reason it failed is that it was told to do the wrong thing.
  • Software failures are failures of understanding, and of imagination. Intrado actually had a backup router, which, had it been switched to automatically, would have restored 911 service almost immediately. But, as described in a report to the FCC, “the situation occurred at a point in the application logic that was not designed to perform any automated corrective actions.”
  • This is the trouble with making things out of code, as opposed to something physical. “The complexity,” as Leveson puts it, “is invisible to the eye.”
  • ...52 more annotations...
  • Code is too hard to think about. Before trying to understand the attempts themselves, then, it’s worth understanding why this might be: what it is about code that makes it so foreign to the mind, and so unlike anything that came before it.
  • Technological progress used to change the way the world looked—you could watch the roads getting paved; you could see the skylines rise. Today you can hardly tell when something is remade, because so often it is remade by code.
  • Software has enabled us to make the most intricate machines that have ever existed. And yet we have hardly noticed, because all of that complexity is packed into tiny silicon chips as millions and millions of lines of code
  • The programmer, the renowned Dutch computer scientist Edsger Dijkstra wrote in 1988, “has to be able to think in terms of conceptual hierarchies that are much deeper than a single mind ever needed to face before.” Dijkstra meant this as a warning.
  • “The serious problems that have happened with software have to do with requirements, not coding errors.” When you’re writing code that controls a car’s throttle, for instance, what’s important is the rules about when and how and by how much to open it. But these systems have become so complicated that hardly anyone can keep them straight in their head. “There’s 100 million lines of code in cars now,” Leveson says. “You just cannot anticipate all these things.”
  • What made programming so difficult was that it required you to think like a computer.
  • The introduction of programming languages like Fortran and C, which resemble English, and tools, known as “integrated development environments,” or IDEs, that help correct simple mistakes (like Microsoft Word’s grammar checker but for code), obscured, though did little to actually change, this basic alienation—the fact that the programmer didn’t work on a problem directly, but rather spent their days writing out instructions for a machine.
  • “The problem is that software engineers don’t understand the problem they’re trying to solve, and don’t care to,” says Leveson, the MIT software-safety expert. The reason is that they’re too wrapped up in getting their code to work.
  • As programmers eagerly poured software into critical systems, they became, more and more, the linchpins of the built world—and Dijkstra thought they had perhaps overestimated themselves.
  • a nearly decade-long investigation into claims of so-called unintended acceleration in Toyota cars. Toyota blamed the incidents on poorly designed floor mats, “sticky” pedals, and driver error, but outsiders suspected that faulty software might be responsible
  • software experts spend 18 months with the Toyota code, picking up where NASA left off. Barr described what they found as “spaghetti code,” programmer lingo for software that has become a tangled mess. Code turns to spaghetti when it accretes over many years, with feature after feature piling on top of, and being woven around
  • Using the same model as the Camry involved in the accident, Barr's team demonstrated that there were actually more than 10 million ways for the onboard computer to cause unintended acceleration. They showed that as little as a single bit flip—a one in the computer's memory becoming a zero or vice versa—could make a car run out of control. The fail-safe code that Toyota had put in place wasn't enough to stop it. (A toy illustration of a single bit flip appears after this list.)
  • In all, Toyota recalled more than 9 million cars, and paid nearly $3 billion in settlements and fines related to unintended acceleration.
  • The problem is that programmers are having a hard time keeping up with their own creations. Since the 1980s, the way programmers work and the tools they use have changed remarkably little.
  • “Visual Studio is one of the single largest pieces of software in the world,” he said. “It’s over 55 million lines of code. And one of the things that I found out in this study is more than 98 percent of it is completely irrelevant. All this work had been put into this thing, but it missed the fundamental problems that people faced. And the biggest one that I took away from it was that basically people are playing computer inside their head.” Programmers were like chess players trying to play with a blindfold on—so much of their mental energy is spent just trying to picture where the pieces are that there’s hardly any left over to think about the game itself.
  • The fact that the two of them were thinking about the same problem in the same terms, at the same time, was not a coincidence. They had both just seen the same remarkable talk, given to a group of software-engineering students in a Montreal hotel by a computer researcher named Bret Victor. The talk, which went viral when it was posted online in February 2012, seemed to be making two bold claims. The first was that the way we make software is fundamentally broken. The second was that Victor knew how to fix it.
  • Though he runs a lab that studies the future of computing, he seems less interested in technology per se than in the minds of the people who use it. Like any good toolmaker, he has a way of looking at the world that is equal parts technical and humane. He graduated top of his class at the California Institute of Technology for electrical engineering,
  • WYSIWYG (pronounced “wizzywig”) came along. It stood for “What You See Is What You Get.”
  • “Our current conception of what a computer program is,” he said, is “derived straight from Fortran and ALGOL in the late ’50s. Those languages were designed for punch cards.”
  • in early 2012, Victor had finally landed upon the principle that seemed to thread through all of his work. (He actually called the talk “Inventing on Principle.”) The principle was this: “Creators need an immediate connection to what they’re creating.” The problem with programming was that it violated the principle. That’s why software systems were so hard to think about, and so rife with bugs: The programmer, staring at a page of text, was abstracted from whatever it was they were actually making.
  • Victor’s point was that programming itself should be like that. For him, the idea that people were doing important work, like designing adaptive cruise-control systems or trying to understand cancer, by staring at a text editor, was appalling.
  • With the right interface, it was almost as if you weren’t working with code at all; you were manipulating the game’s behavior directly.
  • When the audience first saw this in action, they literally gasped. They knew they weren’t looking at a kid’s game, but rather the future of their industry. Most software involved behavior that unfolded, in complex ways, over time, and Victor had shown that if you were imaginative enough, you could develop ways to see that behavior and change it, as if playing with it in your hands. One programmer who saw the talk wrote later: “Suddenly all of my tools feel obsolete.”
  • When John Resig saw the "Inventing on Principle" talk, he scrapped his plans for the Khan Academy programming curriculum. He wanted the site's programming exercises to work just like Victor's demos. On the left-hand side you'd have the code, and on the right, the running program: a picture or game or simulation. If you changed the code, it'd instantly change the picture. "In an environment that is truly responsive," Resig wrote about the approach, "you can completely change the model of how a student learns ... [They] can now immediately see the result and intuit how underlying systems inherently work without ever following an explicit explanation." Khan Academy has become perhaps the largest computer-programming class in the world, with a million students, on average, actively using the program each month.
  • In traditional programming, your task is to take complex rules and translate them into code; most of your energy is spent doing the translating, rather than thinking about the rules themselves. In the model-based approach, all you have is the rules. So that's what you spend your time thinking about. It's a way of focusing less on the machine and more on the problem you're trying to get it to solve.
  • “Everyone thought I was interested in programming environments,” he said. Really he was interested in how people see and understand systems—as he puts it, in the “visual representation of dynamic behavior.” Although code had increasingly become the tool of choice for creating dynamic behavior, it remained one of the worst tools for understanding it. The point of “Inventing on Principle” was to show that you could mitigate that problem by making the connection between a system’s behavior and its code immediate.
  • In a pair of later talks, “Stop Drawing Dead Fish” and “Drawing Dynamic Visualizations,” Victor went one further. He demoed two programs he’d built—the first for animators, the second for scientists trying to visualize their data—each of which took a process that used to involve writing lots of custom code and reduced it to playing around in a WYSIWYG interface.
  • Victor suggested that the same trick could be pulled for nearly every problem where code was being written today. “I’m not sure that programming has to exist at all,” he told me. “Or at least software developers.” In his mind, a software developer’s proper role was to create tools that removed the need for software developers. Only then would people with the most urgent computational problems be able to grasp those problems directly, without the intermediate muck of code.
  • Victor implored professional software developers to stop pouring their talent into tools for building apps like Snapchat and Uber. “The inconveniences of daily life are not the significant problems,” he wrote. Instead, they should focus on scientists and engineers—as he put it to me, “these people that are doing work that actually matters, and critically matters, and using really, really bad tools.”
  • “people are not so easily transitioning to model-based software development: They perceive it as another opportunity to lose control, even more than they have already.”
  • In a model-based design tool, you'd represent this rule with a small diagram, as though drawing the logic out on a whiteboard, made of boxes that represent different states—like "door open," "moving," and "door closed"—and lines that define how you can get from one state to the other. The diagrams make the system's rules obvious: Just by looking, you can see that the only way to get the elevator moving is to close the door, or that the only way to get the door open is to stop. (A minimal code sketch of this elevator model appears after this list.)
  • Bantegnie’s company is one of the pioneers in the industrial use of model-based design, in which you no longer write code directly. Instead, you create a kind of flowchart that describes the rules your program should follow (the “model”), and the computer generates code for you based on those rules
  • “Typically the main problem with software coding—and I’m a coder myself,” Bantegnie says, “is not the skills of the coders. The people know how to code. The problem is what to code. Because most of the requirements are kind of natural language, ambiguous, and a requirement is never extremely precise, it’s often understood differently by the guy who’s supposed to code.”
  • On this view, software becomes unruly because the media for describing what software should do—conversations, prose descriptions, drawings on a sheet of paper—are too different from the media describing what software does do, namely, code itself.
  • for this approach to succeed, much of the work has to be done well before the project even begins. Someone first has to build a tool for developing models that are natural for people—that feel just like the notes and drawings they’d make on their own—while still being unambiguous enough for a computer to understand. They have to make a program that turns these models into real code. And finally they have to prove that the generated code will always do what it’s supposed to.
  • The practice brings order and accountability to large codebases. But, Shivappa says, "it's a very labor-intensive process." He estimates that before they used model-based design, on a two-year-long project only two to three months was spent writing code—the rest was spent working on the documentation.
  • Much of the benefit of the model-based approach comes from being able to add requirements on the fly while still ensuring that existing ones are met; with every change, the computer can verify that your program still works. You're free to tweak your blueprint without fear of introducing new bugs. Your code is, in FAA parlance, "correct by construction."
  • The ideas spread. The notion of liveness, of being able to see data flowing through your program instantly, made its way into flagship programming tools offered by Google and Apple. The default language for making new iPhone and Mac apps, called Swift, was developed by Apple from the ground up to support an environment, called Playgrounds, that was directly inspired by Light Table.
  • The bias against model-based design, sometimes known as model-driven engineering, or MDE, is in fact so ingrained that according to a recent paper, “Some even argue that there is a stronger need to investigate people’s perception of MDE than to research new MDE technologies.”
  • “Human intuition is poor at estimating the true probability of supposedly ‘extremely rare’ combinations of events in systems operating at a scale of millions of requests per second,” he wrote in a paper. “That human fallibility means that some of the more subtle, dangerous bugs turn out to be errors in design; the code faithfully implements the intended design, but the design fails to correctly handle a particular ‘rare’ scenario.”
  • Newcombe was convinced that the algorithms behind truly critical systems—systems storing a significant portion of the web’s data, for instance—ought to be not just good, but perfect. A single subtle bug could be catastrophic. But he knew how hard bugs were to find, especially as an algorithm grew more complex. You could do all the testing you wanted and you’d never find them all.
  • An algorithm written in TLA+ could in principle be proven correct. In practice, it allowed you to create a realistic model of your problem and test it not just thoroughly, but exhaustively. This was exactly what he’d been looking for: a language for writing perfect algorithms.
  • TLA+, which stands for "Temporal Logic of Actions," is similar in spirit to model-based design: It's a language for writing down the requirements—TLA+ calls them "specifications"—of computer programs. These specifications can then be completely verified by a computer. That is, before you write any code, you write a concise outline of your program's logic, along with the constraints you need it to satisfy. (A toy illustration of this kind of exhaustive checking appears after this list.)
  • Programmers are drawn to the nitty-gritty of coding because code is what makes programs go; spending time on anything else can seem like a distraction. And there is a patient joy, a meditative kind of satisfaction, to be had from puzzling out the micro-mechanics of code. But code, Lamport argues, was never meant to be a medium for thought. “It really does constrain your ability to think when you’re thinking in terms of a programming language,”
  • Code makes you miss the forest for the trees: It draws your attention to the working of individual pieces, rather than to the bigger picture of how your program fits together, or what it’s supposed to do—and whether it actually does what you think. This is why Lamport created TLA+. As with model-based design, TLA+ draws your focus to the high-level structure of a system, its essential logic, rather than to the code that implements it.
  • But TLA+ occupies just a small, far corner of the mainstream, if it can be said to take up any space there at all. Even to a seasoned engineer like Newcombe, the language read at first as bizarre and esoteric—a zoo of symbols.
  • this is a failure of education. Though programming was born in mathematics, it has since largely been divorced from it. Most programmers aren’t very fluent in the kind of math—logic and set theory, mostly—that you need to work with TLA+. “Very few programmers—and including very few teachers of programming—understand the very basic concepts and how they’re applied in practice. And they seem to think that all they need is code,” Lamport says. “The idea that there’s some higher level than the code in which you need to be able to think precisely, and that mathematics actually allows you to think precisely about it, is just completely foreign. Because they never learned it.”
  • “In the 15th century,” he said, “people used to build cathedrals without knowing calculus, and nowadays I don’t think you’d allow anyone to build a cathedral without knowing calculus. And I would hope that after some suitably long period of time, people won’t be allowed to write programs if they don’t understand these simple things.”
  • Programmers, as a species, are relentlessly pragmatic. Tools like TLA+ reek of the ivory tower. When programmers encounter “formal methods” (so called because they involve mathematical, “formally” precise descriptions of programs), their deep-seated instinct is to recoil.
  • Formal methods had an image problem. And the way to fix it wasn’t to implore programmers to change—it was to change yourself. Newcombe realized that to bring tools like TLA+ to the programming mainstream, you had to start speaking their language.
  • he presented TLA+ as a new kind of “pseudocode,” a stepping-stone to real code that allowed you to exhaustively test your algorithms—and that got you thinking precisely early on in the design process. “Engineers think in terms of debugging rather than ‘verification,’” he wrote, so he titled his internal talk on the subject to fellow Amazon engineers “Debugging Designs.” Rather than bemoan the fact that programmers see the world in code, Newcombe embraced it. He knew he’d lose them otherwise. “I’ve had a bunch of people say, ‘Now I get it,’” Newcombe says.
  • In the world of the self-driving car, software can’t be an afterthought. It can’t be built like today’s airline-reservation systems or 911 systems or stock-trading systems. Code will be put in charge of hundreds of millions of lives on the road and it has to work. That is no small task.
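The Toyota annotations above mention that a single bit flip in memory, a one becoming a zero or vice versa, could send a car out of control. Purely as a toy illustration of that mechanism (the variable name, the chosen bit, and the "throttle percent" framing are assumptions invented for this sketch, not Toyota's actual memory layout or fail-safe logic), here is how inverting one bit turns a small stored value into a very different one:

```python
# Toy illustration only: a single flipped bit turns a benign stored value into
# a dangerous one. The "throttle percent" framing is an assumption made up for
# this example, not any real vehicle's memory layout or safety logic.

def flip_bit(value: int, bit: int) -> int:
    """Return value with the given bit inverted (0 -> 1 or 1 -> 0)."""
    return value ^ (1 << bit)


if __name__ == "__main__":
    commanded_throttle = 5                       # stored value: e.g. 5 percent open
    corrupted = flip_bit(commanded_throttle, 6)  # one bit of that word gets flipped
    print(commanded_throttle, "->", corrupted)   # prints: 5 -> 69
```

Nothing in this snippet is "wrong" in the sense of a coding error; the program faithfully uses whatever value memory holds, which is the excerpt's point that fail-safe logic has to anticipate corrupted state rather than assume stored values are sane.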
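The elevator excerpt describes model-based design as a handful of named states plus the transitions allowed between them. The sketch below is a minimal, assumed rendering of that idea in Python; the state names and the transition table are taken from the excerpt's wording, and the code is not the output of any actual model-based design tool, which would typically generate code from a graphical model instead.

```python
# Minimal sketch of a model-based description of an elevator door/motion rule.
# The states and transitions are illustrative assumptions taken from the
# excerpt's wording, not output from a real model-based design tool.

ALLOWED_TRANSITIONS = {
    "door_open":   {"door_closed"},          # from "door open" you can only close the door
    "door_closed": {"door_open", "moving"},  # once closed, you may reopen or start moving
    "moving":      {"door_closed"},          # you must stop (door closed) before anything else
}


def step(state: str, requested: str) -> str:
    """Apply a requested transition, rejecting anything the model forbids."""
    if requested not in ALLOWED_TRANSITIONS[state]:
        raise ValueError(f"illegal transition: {state} -> {requested}")
    return requested


if __name__ == "__main__":
    state = "door_open"
    for request in ("door_closed", "moving", "door_closed", "door_open"):
        state = step(state, request)
        print(state)
    # step("door_open", "moving") would raise: the only way to get the
    # elevator moving is to close the door first.
```

Because the rules live in one small table rather than in branches scattered through the code, the property the excerpt highlights (you can only get the elevator moving by closing the door, and you can only open the door by stopping) can be read off, or mechanically checked, directly.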
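The TLA+ excerpts describe writing down a program's logic as a specification that a computer can then check over every reachable state, which is also the antidote to the "rare combinations of events" problem quoted above. TLA+ has its own notation and model checker, so the following is only a hypothetical Python stand-in meant to show the flavor of exhaustive checking: it enumerates every reachable state of an invented two-client lock model and asserts a mutual-exclusion invariant in each one. Both the model and the invariant are made up for this example.

```python
# Hypothetical sketch of exhaustive state exploration in the spirit of a model
# checker. The toy "two clients, one lock" model and its invariant are invented
# for this example; real TLA+ specifications use their own notation and the
# TLC model checker.
from collections import deque

# A state is a tuple: (holder_of_lock, client0_status, client1_status),
# where each status is "idle", "waiting", or "critical".
INITIAL = (None, "idle", "idle")


def next_states(state):
    """Yield every state reachable from `state` in one step."""
    holder, *status = state
    for i in (0, 1):
        if status[i] == "idle":                          # client asks for the lock
            s = list(status)
            s[i] = "waiting"
            yield (holder, *s)
        elif status[i] == "waiting" and holder is None:  # lock is free: acquire it
            s = list(status)
            s[i] = "critical"
            yield (i, *s)
        elif status[i] == "critical":                    # done: release the lock
            s = list(status)
            s[i] = "idle"
            yield (None, *s)


def check(invariant):
    """Breadth-first enumeration of all reachable states, asserting the invariant in each."""
    seen, queue = {INITIAL}, deque([INITIAL])
    while queue:
        state = queue.popleft()
        assert invariant(state), f"invariant violated in {state}"
        for nxt in next_states(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return len(seen)


def mutual_exclusion(state):
    """At most one client may be in its critical section at any time."""
    return not (state[1] == "critical" and state[2] == "critical")


if __name__ == "__main__":
    n = check(mutual_exclusion)
    print(f"explored {n} reachable states; the invariant held in every one")
```

The closing claim, that the invariant holds in every reachable state, is what separates this style of checking from ordinary testing, which can only report on the states it happened to visit.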
Javier E

How America Went Haywire - The Atlantic - 0 views

  • You are entitled to your own opinion, but you are not entitled to your own facts.
  • Why are we like this? The short answer is because we're Americans—because being American means we can believe anything we want; that our beliefs are equal or superior to anyone else's, experts be damned.
  • The word mainstream has recently become a pejorative, shorthand for bias, lies, oppression by the elites.
  • ...92 more annotations...
  • Yet the institutions and forces that once kept us from indulging the flagrantly untrue or absurd—media, academia, government, corporate America, professional associations, respectable opinion in the aggregate—have enabled and encouraged every species of fantasy over the past few decades.
  • Our whole social environment and each of its overlapping parts—cultural, religious, political, intellectual, psychological—have become conducive to spectacular fallacy and truthiness and make-believe. There are many slippery slopes, leading in various directions to other exciting nonsense. During the past several decades, those naturally slippery slopes have been turned into a colossal and permanent complex of interconnected, crisscrossing bobsled tracks, which Donald Trump slid down right into the White House.
  • Esalen is a mother church of a new American religion for people who think they don’t like churches or religions but who still want to believe in the supernatural. The institute wholly reinvented psychology, medicine, and philosophy, driven by a suspicion of science and reason and an embrace of magical thinking
  • The great unbalancing and descent into full Fantasyland was the product of two momentous changes. The first was a profound shift in thinking that swelled up in the ’60s; since then, Americans have had a new rule written into their mental operating systems: Do your own thing, find your own reality, it’s all relative.
  • The second change was the onset of the new era of information. Digital technology empowers real-seeming fictions of the ideological and religious and scientific kinds. Among the web’s 1 billion sites, believers in anything and everything can find thousands of fellow fantasists, with collages of facts and “facts” to support them
  • Today, each of us is freer than ever to custom-make reality, to believe whatever and pretend to be whoever we wish. Which makes all the lines between actual and fictional blur and disappear more easily. Truth in general becomes flexible, personal, subjective. And we like this new ultra-freedom, insist on it, even as we fear and loathe the ways so many of our wrongheaded fellow Americans use it.
  • we are the global crucible and epicenter. We invented the fantasy-industrial complex; almost nowhere outside poor or otherwise miserable countries are flamboyant supernatural beliefs so central to the identities of so many people.
  • We’re still rich and free, still more influential and powerful than any other nation, practically a synonym for developed country. But our drift toward credulity, toward doing our own thing, toward denying facts and having an altogether uncertain grip on reality, has overwhelmed our other exceptional national traits and turned us into a less developed country.
  • For most of our history, the impulses existed in a rough balance, a dynamic equilibrium between fantasy and reality, mania and moderation, credulity and skepticism.
  • It was a headquarters for a new religion of no religion, and for “science” containing next to no science. The idea was to be radically tolerant of therapeutic approaches and understandings of reality, especially if they came from Asian traditions or from American Indian or other shamanistic traditions. Invisible energies, past lives, astral projection, whatever—the more exotic and wondrous and unfalsifiable, the better.
  • These influential critiques helped make popular and respectable the idea that much of science is a sinister scheme concocted by a despotic conspiracy to oppress people. Mental illness, both Szasz and Laing said, is “a theory not a fact.”
  • The Greening of America may have been the mainstream’s single greatest act of pandering to the vanity and self-righteousness of the new youth. Its underlying theoretical scheme was simple and perfectly pitched to flatter young readers: There are three types of American “consciousness,” each of which “makes up an individual’s perception of reality … his ‘head,’ his way of life.” Consciousness I people were old-fashioned, self-reliant individualists rendered obsolete by the new “Corporate State”—essentially, your grandparents. Consciousness IIs were the fearful and conformist organization men and women whose rationalism was a tyrannizing trap laid by the Corporate State—your parents.
  • And then there was Consciousness III, which had “made its first appearance among the youth of America,” “spreading rapidly among wider and wider segments of youth, and by degrees to older people.” If you opposed the Vietnam War and dressed down and smoked pot, you were almost certainly a III. Simply by being young and casual and undisciplined, you were ushering in a new utopia.
  • Reich was half-right. An epochal change in American thinking was under way and “not, as far as anybody knows, reversible … There is no returning to an earlier consciousness.” His wishful error was believing that once the tidal surge of new sensibility brought down the flood walls, the waters would flow in only one direction, carving out a peaceful, cooperative, groovy new continental utopia, hearts and minds changed like his, all of America Berkeleyized and Vermontified. Instead, Consciousness III was just one early iteration of the anything-goes, post-reason, post-factual America enabled by the tsunami.
  • During the ’60s, large swaths of academia made a turn away from reason and rationalism as they’d been understood. Many of the pioneers were thoughtful, their work fine antidotes to postwar complacency. The problem was the nature and extent of their influence at that particular time, when all premises and paradigms seemed up for grabs. That is, they inspired half-baked and perverse followers in the academy, whose arguments filtered out into the world at large: All approximations of truth, science as much as any fable or religion, are mere stories devised to serve people’s needs or interests. Reality itself is a purely social construction, a tableau of useful or wishful myths that members of a society or tribe have been persuaded to believe. The borders between fiction and nonfiction are permeable, maybe nonexistent.
  • The delusions of the insane, superstitions, and magical thinking? Any of those may be as legitimate as the supposed truths contrived by Western reason and science. The takeaway: Believe whatever you want, because pretty much everything is equally true and false.
  • over in sociology, in 1966 a pair of professors published The Social Construction of Reality, one of the most influential works in their field. Not only were sanity and insanity and scientific truth somewhat dubious concoctions by elites, Peter Berger and Thomas Luckmann explained—so was everything else. The rulers of any tribe or society do not just dictate customs and laws; they are the masters of everyone’s perceptions, defining reality itself
  • Over in anthropology, where the exotic magical beliefs of traditional cultures were a main subject, the new paradigm took over completely—don’t judge, don’t disbelieve, don’t point your professorial finger.
  • then isn’t everyone able—no, isn’t everyone obliged—to construct their own reality? The book was timed perfectly to become a foundational text in academia and beyond.
  • To create the all-encompassing stage sets that everyone inhabits, rulers first use crude mythology, then more elaborate religion, and finally the “extreme step” of modern science. “Reality”? “Knowledge”? “If we were going to be meticulous,” Berger and Luckmann wrote, “we would put quotation marks around the two aforementioned terms every time we used them.” “What is ‘real’ to a Tibetan monk may not be ‘real’ to an American businessman.”
  • In the ’60s, anthropology decided that oracles, diviners, incantations, and magical objects should be not just respected, but considered equivalent to reason and science. If all understandings of reality are socially constructed, those of Kalabari tribesmen in Nigeria are no more arbitrary or faith-based than those of college professors.
  • Even the social critic Paul Goodman, beloved by young leftists in the ’60s, was flabbergasted by his own students by 1969. “There was no knowledge,” he wrote, “only the sociology of knowledge. They had so well learned that … research is subsidized and conducted for the benefit of the ruling class that they did not believe there was such a thing as simple truth.”
  • Ever since, the American right has insistently decried the spread of relativism, the idea that nothing is any more correct or true than anything else. Conservatives hated how relativism undercut various venerable and comfortable ruling ideas—certain notions of entitlement (according to race and gender) and aesthetic beauty and metaphysical and moral certainty.
  • Conservatives are correct that the anything-goes relativism of college campuses wasn’t sequestered there, but when it flowed out across America it helped enable extreme Christianities and lunacies on the right—gun-rights hysteria, black-helicopter conspiracism, climate-change denial, and more.
  • Elaborate paranoia was an established tic of the Bircherite far right, but the left needed a little time to catch up. In 1964, a left-wing American writer published the first book about a JFK conspiracy, claiming that a Texas oilman had been the mastermind, and soon many books were arguing that the official government inquiry had ignored the hidden conspiracies.
  • Conspiracy became the high-end Hollywood dramatic premise—Chinatown, The Conversation, The Parallax View, and Three Days of the Condor came out in the same two-year period. Of course, real life made such stories plausible. The infiltration by the FBI and intelligence agencies of left-wing groups was then being revealed, and the Watergate break-in and its cover-up were an actual criminal conspiracy. Within a few decades, the belief that a web of villainous elites was covertly seeking to impose a malevolent global regime made its way from the lunatic right to the mainstream.
  • that more and more people on both sides would come to believe that an extraordinarily powerful cabal—international organizations and think tanks and big businesses and politicians—secretly ran America.
  • Each camp, conspiracists on the right and on the left, was ostensibly the enemy of the other, but they began operating as de facto allies. Relativist professors enabled science-denying Christians, and the antipsychiatry craze in the ’60s appealed simultaneously to left-wingers and libertarians (as well as to Scientologists). Conspiracy theories were more of a modern right-wing habit before people on the left signed on. However, the belief that the federal government had secret plans to open detention camps for dissidents sprouted in the ’70s on the paranoid left before it became a fixture on the right.
  • Extreme religious and quasi-religious beliefs and practices, Christian and New Age and otherwise, didn’t subside, but grew and thrived—and came to seem unexceptional.
  • Until we’d passed through the ’60s and half of the ’70s, I’m pretty sure we wouldn’t have given the presidency to some dude, especially a born-again Christian, who said he’d recently seen a huge, color-shifting, luminescent UFO hovering near him.
  • Starting in the ’80s, loving America and making money and having a family were no longer unfashionable. The sense of cultural and political upheaval and chaos dissipated—which lulled us into ignoring all the ways that everything had changed, that Fantasyland was now scaling and spreading and becoming the new normal. What had seemed strange and amazing in 1967 or 1972 became normal and ubiquitous.
  • For most of the 20th century, national news media had felt obliged to pursue and present some rough approximation of the truth rather than to promote a truth, let alone fictions. With the elimination of the Fairness Doctrine, a new American laissez-faire had been officially declared. If lots more incorrect and preposterous assertions circulated in our mass media, that was a price of freedom. If splenetic commentators could now, as never before, keep believers perpetually riled up and feeling the excitement of being in a mob, so be it.
  • Relativism became entrenched in academia—tenured, you could say
  • as he wrote in 1986, “the secret of theory”—this whole intellectual realm now called itself simply “theory”—“is that truth does not exist.”
  • After the ’60s, truth was relative, criticizing was equal to victimizing, individual liberty became absolute, and everyone was permitted to believe or disbelieve whatever they wished. The distinction between opinion and fact was crumbling on many fronts.
  • America didn’t seem as weird and crazy as it had around 1970. But that’s because Americans had stopped noticing the weirdness and craziness. We had defined every sort of deviancy down. And as the cultural critic Neil Postman put it in his 1985 jeremiad about how TV was replacing meaningful public discourse with entertainment, we were in the process of amusing ourselves to death.
  • In 1998, as soon as we learned that President Bill Clinton had been fellated by an intern in the West Wing, his popularity spiked. Which was baffling only to those who still thought of politics as an autonomous realm, existing apart from entertainment
  • Just before the Clintons arrived in Washington, the right had managed to do away with the federal Fairness Doctrine, which had been enacted to keep radio and TV shows from being ideologically one-sided. Until then, big-time conservative opinion media had consisted of two magazines, William F. Buckley Jr.’s biweekly National Review and the monthly American Spectator, both with small circulations. But absent a Fairness Doctrine, Rush Limbaugh’s national right-wing radio show, launched in 1988, was free to thrive, and others promptly appeared.
  • I’m pretty certain that the unprecedented surge of UFO reports in the ’70s was not evidence of extraterrestrials’ increasing presence but a symptom of Americans’ credulity and magical thinking suddenly unloosed. We wanted to believe in extraterrestrials, so we did.
  • Limbaugh’s virtuosic three hours of daily talk started bringing a sociopolitical alternate reality to a huge national audience. Instead of relying on an occasional magazine or newsletter to confirm your gnarly view of the world, now you had talk radio drilling it into your head for hours every day.
  • Fox News brought the Limbaughvian talk-radio version of the world to national TV, offering viewers an unending and immersive propaganda experience of a kind that had never existed before.
  • Over the course of the century, electronic mass media had come to serve an important democratic function: presenting Americans with a single shared set of facts. Now TV and radio were enabling a reversion to the narrower, factional, partisan discourse that had been normal in America’s earlier centuries.
  • there was also the internet, which eventually would have mooted the Fairness Doctrine anyhow. In 1994, the first modern spam message was sent, visible to everyone on Usenet: global alert for all: jesus is coming soon. Over the next year or two, the masses learned of the World Wide Web. The tinder had been gathered and stacked since the ’60s, and now the match was lit and thrown
  • After the ’60s and ’70s happened as they happened, the internet may have broken America’s dynamic balance between rational thinking and magical thinking for good.
  • Before the web, cockamamy ideas and outright falsehoods could not spread nearly as fast or as widely, so it was much easier for reason and reasonableness to prevail. Before the web, institutionalizing any one alternate reality required the long, hard work of hundreds of full-time militants. In the digital age, however, every tribe and fiefdom and principality and region of Fantasyland—every screwball with a computer and an internet connection—suddenly had an unprecedented way to instruct and rile up and mobilize believers
  • Why did Senator Daniel Patrick Moynihan begin remarking frequently during the ’80s and ’90s that people were entitled to their own opinions but not to their own facts? Because until then, that had not been necessary to say
  • Reason remains free to combat unreason, but the internet entitles and equips all the proponents of unreason and error to a previously unimaginable degree. Particularly for a people with our history and propensities, the downside of the internet seems at least as profound as the upside.
  • On the internet, the prominence granted to any factual assertion or belief or theory depends on the preferences of billions of individual searchers. Each click on a link is effectively a vote pushing that version of the truth toward the top of the pile of results.
  • Exciting falsehoods tend to do well in the perpetual referenda, and become self-validating. A search for almost any “alternative” theory or belief seems to generate more links to true believers’ pages and sites than to legitimate or skeptical ones, and those tend to dominate the first few pages of results.
  • If more and more of a political party’s members hold more and more extreme and extravagantly supernatural beliefs, doesn’t it make sense that the party will be more and more open to make-believe in its politics?
  • an individual who enters the communications system pursuing one interest soon becomes aware of stigmatized material on a broad range of subjects. As a result, those who come across one form of stigmatized knowledge will learn of others, in connections that imply that stigmatized knowledge is a unified domain, an alternative worldview, rather than a collection of unrelated ideas.
  • Academic research shows that religious and supernatural thinking leads people to believe that almost no big life events are accidental or random. As the authors of some recent cognitive-science studies at Yale put it, “Individuals’ explicit religious and paranormal beliefs” are the best predictors of their “perception of purpose in life events”—their tendency “to view the world in terms of agency, purpose, and design.”
  • Americans have believed for centuries that the country was inspired and guided by an omniscient, omnipotent planner and interventionist manager. Since the ’60s, that exceptional religiosity has fed the tendency to believe in conspiracies.
  • Oliver and Wood found the single strongest driver of conspiracy belief to be belief in end-times prophecies.
  • People on the left are by no means all scrupulously reasonable. Many give themselves over to the appealingly dubious and the untrue. But fantastical politics have become highly asymmetrical. Starting in the 1990s, America’s unhinged right became much larger and more influential than its unhinged left. There is no real left-wing equivalent of Sean Hannity, let alone Alex Jones. Moreover, the far right now has unprecedented political power; it controls much of the U.S. government.
  • Why did the grown-ups and designated drivers on the political left manage to remain basically in charge of their followers, while the reality-based right lost out to fantasy-prone true believers?
  • One reason, I think, is religion. The GOP is now quite explicitly Christian
  • as the Syracuse University professor Michael Barkun saw back in 2003 in A Culture of Conspiracy, “such subject-specific areas as crank science, conspiracist politics, and occultism are not isolated from one another,” but rather they are interconnected. Someone seeking information on UFOs, for example, can quickly find material on antigravity, free energy, Atlantis studies, alternative cancer cures, and conspiracy.
  • Religion aside, America simply has many more fervid conspiracists on the right, as research about belief in particular conspiracies confirms again and again. Only the American right has had a large and organized faction based on paranoid conspiracism for the past six decades.
  • The right has had three generations to steep in this, its taboo vapors wafting more and more into the main chambers of conservatism, becoming familiar, seeming less outlandish. Do you believe that “a secretive power elite with a globalist agenda is conspiring to eventually rule the world through an authoritarian world government”? Yes, say 34 percent of Republican voters, according to Public Policy Polling.
  • starting in the ’90s, the farthest-right quarter of Americans, let’s say, couldn’t and wouldn’t adjust their beliefs to comport with their side’s victories and the dramatically new and improved realities. They’d made a god out of Reagan, but they ignored or didn’t register that he was practical and reasonable, that he didn’t completely buy his own antigovernment rhetoric.
  • Another way the GOP got loopy was by overdoing libertarianism
  • Republicans are very selective, cherry-picking libertarians: Let business do whatever it wants and don’t spoil poor people with government handouts; let individuals have gun arsenals but not abortions or recreational drugs or marriage with whomever they wish
  • For a while, Republican leaders effectively encouraged and exploited the predispositions of their variously fantastical and extreme partisans
  • Karl Rove was stone-cold cynical, the Wizard of Oz’s evil twin coming out from behind the curtain for a candid chat shortly before he won a second term for George W. Bush, about how “judicious study of discernible reality [is] … not the way the world really works anymore.” These leaders were rational people who understood that a large fraction of citizens don’t bother with rationality when they vote, that a lot of voters resent the judicious study of discernible reality. Keeping those people angry and frightened won them elections.
  • But over the past few decades, a lot of the rabble they roused came to believe all the untruths. “The problem is that Republicans have purposefully torn down the validating institutions,”
  • “They have convinced voters that the media cannot be trusted; they have gotten them used to ignoring inconvenient facts about policy; and they have abolished standards of discourse.”
  • What had been the party’s fantastical fringe became its middle. Reasonable Republicanism was replaced by absolutism: no new taxes, virtually no regulation, abolish the EPA and the IRS and the Federal Reserve.
  • The Christian takeover happened gradually, but then quickly in the end, like a phase change from liquid to gas. In 2008, three-quarters of the major GOP presidential candidates said they believed in evolution, but in 2012 it was down to a third, and then in 2016, just one did
  • A two-to-one majority of Republicans say they “support establishing Christianity as the national religion,” according to Public Policy Polling.
  • Although constitutionally the U.S. can have no state religion, faith of some kind has always bordered on mandatory for politicians.
  • What connects them all, of course, is the new, total American embrace of admixtures of reality and fiction and of fame for fame’s sake. His reality was a reality show before that genre or term existed
  • When he entered political show business, after threatening to do so for most of his adult life, the character he created was unprecedented—presidential candidate as insult comic with an artificial tan and ridiculous hair, shamelessly unreal and whipped into shape as if by a pâtissier.
  • Republicans hated Trump’s ideological incoherence—they didn’t yet understand that his campaign logic was a new kind, blending exciting tales with a showmanship that transcends ideology.
  • Trump waited to run for president until he sensed that a critical mass of Americans had decided politics were all a show and a sham. If the whole thing is rigged, Trump’s brilliance was calling that out in the most impolitic ways possible, deriding his straight-arrow competitors as fakers and losers and liars—because that bullshit-calling was uniquely candid and authentic in the age of fake.
  • Trump took a key piece of cynical wisdom about show business—the most important thing is sincerity, and once you can fake that, you’ve got it made—to a new level: His actual thuggish sincerity is the opposite of the old-fashioned, goody-goody sanctimony that people hate in politicians.
  • Trump’s genius was to exploit the skeptical disillusion with politics—there’s too much equivocating; democracy’s a charade—but also to pander to Americans’ magical thinking about national greatness. Extreme credulity is a fraternal twin of extreme skepticism.
  • Trump launched his political career by embracing a brand-new conspiracy theory twisted around two American taproots—fear and loathing of foreigners and of nonwhites.
  • The fact-checking website PolitiFact looked at more than 400 of his statements as a candidate and as president and found that almost 50 percent were false and another 20 percent were mostly false.
  • He gets away with this as he wouldn’t have in the 1980s or ’90s, when he first talked about running for president, because now factual truth really is just one option. After Trump won the election, he began referring to all unflattering or inconvenient journalism as “fake news.”
  • indeed, their most honest defense of his false statements has been to cast them practically as matters of religious conviction—he deeply believes them, so … there. When White House Press Secretary Sean Spicer was asked at a press conference about the millions of people who the president insists voted illegally, he earnestly reminded reporters that Trump “has believed that for a while” and “does believe that” and it’s “been a long-standing belief that he’s maintained” and “it’s a belief that he has maintained for a while.”
  • Which is why nearly half of Americans subscribe to that preposterous belief themselves. And in Trump’s view, that overrides any requirement for facts.
  • The idea that progress has some kind of unstoppable momentum, as if powered by a Newtonian law, was always a very American belief. However, it’s really an article of faith, the Christian fantasy about history’s happy ending reconfigured during and after the Enlightenment as a set of modern secular fantasies
  • I really can imagine, for the first time in my life, that America has permanently tipped into irreversible decline, heading deeper into Fantasyland. I wonder whether it’s only America’s destiny, exceptional as ever, to unravel in this way. Or maybe we’re just early adopters, the canaries in the global mine
  • I do despair of our devolution into unreason and magical thinking, but not everything has gone wrong.
  • I think we can slow the flood, repair the levees, and maybe stop things from getting any worse. If we’re splitting into two different cultures, we in reality-based America—whether the blue part or the smaller red part—must try to keep our zone as large and robust and attractive as possible for ourselves and for future generations
  • We need to firmly commit to Moynihan’s aphorism about opinions versus facts. We must call out the dangerously untrue and unreal
  • do not give acquaintances and friends and family members free passes. If you have children or grandchildren, teach them to distinguish between true and untrue as fiercely as you do between right and wrong and between wise and foolish.
  • How many Americans now inhabit alternate realities?
  • reams of survey research from the past 20 years reveal a rough, useful census of American credulity and delusion. By my reckoning, the solidly reality-based are a minority, maybe a third of us but almost certainly fewer than half.
  • Only a third of us, for instance, don’t believe that the tale of creation in Genesis is the word of God. Only a third strongly disbelieve in telepathy and ghosts. Two-thirds of Americans believe that “angels and demons are active in the world.”
  • A third of us believe not only that global warming is no big deal but that it’s a hoax perpetrated by scientists, the government, and journalists. A third believe that our earliest ancestors were humans just like us; that the government has, in league with the pharmaceutical industry, hidden evidence of natural cancer cures; that extraterrestrials have visited or are visiting Earth.
jongardner04

Future Tsar Nicholas II Born in Russia - 0 views

  • On 18th May, 1868, the last Tsar of Russia, Nicholas II, was born near St. Petersburg. Nicholas’ reign would go on to witness some of the most significant moments in Russian history: the Russo-Japanese War, the 1905 Revolution, the First World War and ultimately, the Revolution of 1917 which saw the abolition of the monarchy and Russia’s transition to Bolshevism.
Javier E

AOC Isn't Interested in American Exceptionalism - The Atlantic - 0 views

  • American exceptionalism does not merely connote cultural and political uniqueness. It connotes moral superiority
  • Embedded in exceptionalist discourse is the belief that, because America has a special devotion to democracy and freedom, its sins are mostly incidental. The greatest evils humankind has witnessed, in places such as the Nazi death camps, are far removed from anything Americans would ever do
  • America’s adversaries commit crimes; America merely stumbles on its way to doing the right thing. This distinction means that, in mainstream political discourse, the ugliest terms—fascism, dictatorship, tyranny, terrorism, imperialism, genocide—are generally reserved for phenomena beyond America’s shores.
  • ...18 more annotations...
  • when the anti-war and other protest movements of the 1960s faded, so did their challenge to exceptionalist language. By the 1980s, Democrats were playing catch-up to Ronald Reagan’s flag-waving patriotism
  • During the Barack Obama years, questioning American exceptionalism was considered a career-imperiling transgression. When Republicans questioned his commitment to the creed, Obama in 2014 replied, “I believe in American exceptionalism with every fiber of my being.”
  • a resurgent left fueled by an influx of Millennial voters has launched a new challenge to exceptionalist discourse
  • Partly, it’s because a higher percentage of Millennials are people of color, who generally look more skeptically on America’s claims of moral innocence
  • Partly, it’s because the financial crisis has cast doubt on whether America’s economic model is preferable to those practiced in other nations. Younger Americans—a majority of whom embrace “socialism”—believe it’s not
  • Most of all, the challenge to exceptionalism is a response to Trump.
  • A 2017 Pew Research Center survey found that Americans over the age of 65 were 37 points more likely to say the “U.S. stands above all other countries in the world” than that “there are other countries that are better than the U.S.”
  • Americans under 30 split in the opposite direction. By a margin of 16 points, they said some other countries were better.
  • While conservatives affirm America’s superiority by a margin of almost 10 to one, liberals reject it by more than two to one.
  • A few years ago, commentators rarely evoked the specter of American “authoritarianism.” Now it’s commonplace
  • With his embrace of foreign authoritarians and his cultivation of conservatism’s xenophobic and racist fringes, Trump has become a galvanizing figure for the left, which for the first time since the 1960s has begun regularly evoking the specter of American “fascism.”
  • Fascism didn’t seem like an American problem. That’s no longer the case. Leftist street activists now embrace the term antifa, and the movement has grown dramatically under Trump.
  • they’re also reinterpreting the American past. New scholarship has, for instance, muddied the distinction between German Nazism and early-20th-century American white supremacy.
  • Adam Serwer excavated the work of World War I–era racial theorists such as Madison Grant to show that the “seed of Nazism’s ultimate objective—the preservation of a pure white race, uncontaminated by foreign blood—was in fact sown with striking success in the United States.”
  • This willingness to equate American white supremacy with the barbarism that occurs in other countries has also shaped the way the left describes terrorism.
  • Now it’s become common, not only among leftist commentators but among Democratic politicians, to apply the term to violence committed by native-born white Americans.
  • When remembering the detention of Japanese Americans during World War II, Americans have generally employed the term internment camps—largely, the historian Roger Daniels has argued, to create a clear separation between America’s misdeeds and those of its hated foes.
  • They are challenging not only the physical and legal barriers that Trump is erecting against immigrants entering the United States, but also the conceptual barriers that American exceptionalism erects against seeing the United States as a nation capable of evil
Javier E

How Will the Coronavirus Change Us? - The Atlantic - 0 views

  • Although medical data from the time are too scant to be definitive, its first attack is generally said to have occurred in Kansas in March 1918, as the U.S. was stepping up its involvement in the First World War.
  • Estimates of the final death toll range from 17 million to 100 million, depending on assumptions about the number of uncounted victims. Almost 700,000 people are thought to have died in the United States—as a proportion of the population, equivalent to more than 2 million people today.
  • Garthwaite matched NHIS respondents’ health conditions to the dates when their mothers were probably exposed to the flu. Mothers who got sick in the first months of pregnancy, he discovered, had babies who, 60 or 70 years later, were unusually likely to have diabetes; mothers afflicted at the end of pregnancy tended to bear children prone to kidney disease. The middle months were associated with heart disease.
  • ...23 more annotations...
  • Other studies showed different consequences. Children born during the pandemic grew into shorter, poorer, less educated adults with higher rates of physical disability than one would expect
  • the microorganisms likely killed more people than the war did. And their effects weren’t confined to European battlefields, but spread across the globe, emptying city streets and filling cemeteries on six continents.
  • Unlike the war, the flu was incomprehensible—the influenza virus wasn’t even identified until 1931. It inspired fear of immigrants and foreigners, and anger toward the politicians who played down the virus
  • killed more men than women, skewing sex ratios for years afterward. Can one be sure that the ensuing, abrupt changes in gender roles had nothing to do with the virus?
  • the accompanying flood of anti-Semitic violence. As it spread through Germany, Switzerland, France, Spain, and the Low Countries, it left behind a trail of beaten cadavers and burned homes.
  • In northern Italy, landlords tended to raise wages, which fostered the development of a middle class. In southern Italy, the nobility enacted decrees to prevent peasants from leaving to take better offers. Some historians date the separation in fortunes of the two halves of Italy—the rich north, the poor south—to these decisions.
  • When the Black Death began, the English Plantagenets were in the middle of a long, brutal campaign to conquer France. The population losses meant such a rise in the cost of infantrymen that the whole enterprise foundered. English nobles did not occupy French châteaus. Instead they stayed home and tried to force their farmhands to accept lower wages. The result, the Peasants’ Revolt of 1381, nearly toppled the English crown. King Richard II narrowly won out, but the monarchy’s ability to impose taxes, and thus its will, was permanently weakened.
  • The coronavirus is hitting societies that regarded deadly epidemics as things of the past, like whalebone corsets and bowler hats.
  • The American public has not enjoyed its surprise reentry into the world of contagion and quarantine—and this unhappiness seems likely to have consequences.
  • People sought new sources of authority, finding them through direct personal experience with the world and with God.
  • With the supply of European workers suddenly reduced and the demand for labor relatively unchanged, medieval landowners found themselves in a pickle: They could leave their grain to rot in the fields, or they could abandon all sense of right and wrong and raise wages enough to attract scarce workers
  • Within a few decades, Cohn wrote, hysteria gave way to sober observation. Medical tracts stopped referring to conjunctions of Saturn and prescribed more earthly cures: ointments, herbs, methods for lancing boils. Even priestly writings focused on the empirical. “God was not mentioned,” Cohn noted. The massacres of Jews mostly stopped.
  • the lesson seems more that humans confronting unexpected disaster engage in a contest for explanation—and the outcome can have consequences that ripple for decades or centuries.
  • Columbus’s journey to the Americas set off the worst demographic catastrophe in history
  • Somewhere between two-thirds and nine-tenths of the people in the Americas died. Many later European settlers, like my umpteen-great-grandparents, believed they were coming to a vacant wilderness. But the land was not empty; it had been emptied—a world of loss encompassed in a shift of tense.
  • Absent the diseases, it is difficult to imagine how small groups of poorly equipped Europeans at the end of very long supply chains could have survived and even thrived in the alien ecosystems of the Americas
  • “I fully support banning travel from Europe to prevent the spread of infectious disease,” the Cherokee journalist Rebecca Nagle remarked after President Trump announced his plan to do this. “I just think it’s 528 years too late.”
  • a possible legacy of Hong Kong’s success with SARS is that its citizens seem to put more faith in collective action than they used to
  • The result will be, among other things, a test of how much contemporary U.S. society values the elderly.
  • The speed with which pundits emerged to propose that the U.S. could more easily tolerate a raft of dead oldsters than an economic contraction indicates that the reservoir of appreciation for today’s elders is not as deep as it once was
  • the 2003 SARS epidemic in Hong Kong. That epidemic, which killed about 300 people, was stopped only by heroic communal efforts. (As a percentage of the population, the equivalent U.S. death toll would be about 15,000.)
  • For Native peoples, the U-shaped curve was as devastating as the sheer loss of life. As an indigenous archaeologist once put it to me, the epidemics simultaneously robbed his nation of its future and its past: the former, by killing all the children; the latter, by killing all the elders, who were its storehouses of wisdom and experience.
  • Past societies mourned the loss of collective memory caused by epidemics. Ours may not, at least at first.