The Wages of Guilt: Memories of War in Germany and Japan (Ian Buruma)

  • the main reason why Germans were more trusted by their neighbors was that they were learning, slowly and painfully, and not always fully, to trust themselves.
  • elders, in government and the mass media, still voice opinions about the Japanese war that are unsettling, to say the least. Conservative politicians still pay their annual respects at a shrine where war criminals are officially remembered. Justifications and denials of war crimes are still heard. Too many Japanese in conspicuous places, including the prime minister’s office itself, have clearly not “coped” with the war.
  • unlike Nazi Germany, Japan had no systematic program to destroy the life of every man, woman, and child of a people that, for ideological reasons, was deemed to have no right to exist.
  • “We never knew,” a common reaction in the 1950s, had worn shamefully thin in the eyes of a younger generation by the 1960s. The extraordinary criminality of a deliberate genocide was so obvious that it left no room for argument.
  • Right-wing nationalists like to cite the absence of a Japanese Holocaust as proof that Japanese have no reason to feel remorse about their war at all. It was, in their eyes, a war like any other; brutal, yes, just as wars fought by all great nations in history have been brutal. In fact, since the Pacific War was fought against Western imperialists, it was a justified—even noble—war of Asian liberation.
  • in the late 1940s or 1950s, a time when most Germans were still trying hard not to remember. It is in fact extraordinary how honestly Japanese novelists and filmmakers dealt with the horrors of militarism in those early postwar years. Such honesty is much less evident now.
  • Popular comic books, aimed at the young, extol the heroics of Japanese soldiers and kamikaze pilots, while the Chinese and their Western allies are depicted as treacherous and belligerent. In 2008, the chief of staff of the Japanese Air Self-Defense Force stated that Japan had been “tricked” into the war by China and the US. In 2013, Prime Minister Abe Shinzo publicly doubted whether Japan’s military aggression in China could even be called an invasion.
  • The fact is that Japan is still haunted by historical issues that should have been settled decades ago. The reasons are political rather than cultural, and have to do with the pacifist constitution—written by American jurists in 1946—and with the imperial institution, absolved of war guilt by General Douglas MacArthur after the war for the sake of expediency.
  • Japan, even under Allied occupation, continued to be governed by much the same bureaucratic and political elite, albeit under a new, more democratic constitution,
  • a number of conservatives felt humiliated by what they rightly saw as an infringement of their national sovereignty. Henceforth, to them, everything from the Allied Tokyo War Crimes Tribunal to the denunciations of Japan’s war record by left-wing teachers and intellectuals would be seen in this light.
  • The more “progressive” Japanese used the history of wartime atrocities as a warning against turning away from pacifism, the more defensive right-wing politicians and commentators became about the Japanese war.
  • Views of history, in other words, were politicized—and polarized—from the beginning.
  • To take the sting out of this confrontation between constitutional pacifists and revisionists, which had led to much political turmoil in the 1950s, mainstream conservatives made a deliberate attempt to distract people’s attention from war and politics by concentrating on economic growth.
  • For several decades, the chauvinistic right wing, with its reactionary views on everything from high school education to the emperor’s status, was kept in check by the sometimes equally dogmatic Japanese left. Marxism was the prevailing ideology of the teachers union and academics.
  • the influence of Marxism waned after the collapse of the Soviet empire in the early 1990s, and the brutal records of Chairman Mao and Pol Pot became widely known.
  • Marginalized in the de facto one-party LDP state and discredited by its own dogmatism, the Japanese left did not just wane, it collapsed. This gave a great boost to the war-justifying right-wing nationalists,
  • Japanese young, perhaps out of boredom with nothing but materialistic goals, perhaps out of frustration with being made to feel guilty, perhaps out of sheer ignorance, or most probably out of a combination of all three, are not unreceptive to these patriotic blandishments.
  • Anxiety about the rise of China, whose rulers have a habit of using Japan’s historical crimes as a form of political blackmail, has boosted a prickly national pride, even at the expense of facing the truth about the past.
  • By 1996, the LDP was back in power, the constitutional issue had not been resolved, and historical debates continue to be loaded with political ideology. In fact, they are not really debates at all, but exercises in propaganda, tilted toward the reactionary side.
  • My instinct—call it a prejudice, if you prefer—before embarking on this venture was that people from distinct cultures still react quite similarly to similar circumstances.
  • The Japanese and the Germans, on the whole, did not behave in the same ways—but then the circumstances, both wartime and postwar, were quite different in the two Germanies and Japan. They still are.
  • Our comic-book prejudices turned into an attitude of moral outrage. This made life easier in a way. It was comforting to know that a border divided us from a nation that personified evil. They were bad, so we must be good. To grow up after the war in a country that had suffered German occupation was to know that one was on the side of the angels.
  • The question that obsessed us was not how we would have acquitted ourselves in uniform, going over the top, running into machine-gun fire or mustard gas, but whether we would have joined the resistance, whether we would have cracked under torture, whether we would have hidden Jews and risked deportation ourselves. Our particular shadow was not war, but occupation.
  • the frightened man who betrayed to save his life, who looked the other way, who grasped the wrong horn of a hideous moral dilemma, interested me more than the hero. This is no doubt partly because I fear I would be much like that frightened man myself. And partly because, to me, failure is more typical of the human condition than heroism.
  • I was curious to learn how Japanese saw the war, how they remembered it, what they imagined it to have been like, how they saw themselves in view of their past. What I heard and read was often surprising to a European:
  • this led me to the related subject of modern Japanese nationalism. I became fascinated by the writings of various emperor worshippers, historical revisionists, and romantic seekers after the unique essence of Japaneseness.
  • Bataan, the sacking of Manila, the massacres in Singapore, these were barely mentioned. But the suffering of the Japanese, in China, Manchuria, the Philippines, and especially in Hiroshima and Nagasaki, was remembered vividly, as was the imprisonment of Japanese soldiers in Siberia after the war. The Japanese have two days of remembrance: August 6, when Hiroshima was bombed, and August 15, the date of the Japanese surrender.
  • The curious thing was that much of what attracted Japanese to Germany before the war—Prussian authoritarianism, romantic nationalism, pseudo-scientific racialism—had lingered in Japan while becoming distinctly unfashionable in Germany. Why?
  • the two peoples saw their own purported virtues reflected in each other: the warrior spirit, racial purity, self-sacrifice, discipline, and so on. After the war, West Germans tried hard to discard this image of themselves. This was less true of the Japanese.
  • Which meant that any residual feelings of nostalgia for the old partnership in Japan were likely to be met with embarrassment in Germany.
  • I have concentrated on the war against the Jews in the case of Germany, since it was that parallel war, rather than, say, the U-boat battles in the Atlantic, or even the battle of Stalingrad, that left the most sensitive scar on the collective memory of (West) Germany.
  • I have emphasized the war in China and the bombing of Hiroshima, for these episodes, more than others, have lodged themselves, often in highly symbolic ways, in Japanese public life.
  • Do Germans perhaps have more reason to mourn? Is it because Japan has an Asian “shame culture,” to quote Ruth Benedict’s phrase, and Germany a Christian “guilt culture”?
  • why the collective German memory should appear to be so different from the Japanese. Is it cultural? Is it political? Is the explanation to be found in postwar history, or in the history of the war itself?
  • If the two peoples still have anything in common after the war, it is a residual distrust of themselves.
  • when Michael sees thousands of German peace demonstrators, he does not see thousands of gentle people who have learned their lesson from the past; he sees “100 percent German Protestant rigorism, aggressive, intolerant, hard.”
  • To be betroffen implies a sense of guilt, a sense of shame, or even embarrassment. To be betroffen is to be speechless. But it also implies an idea of moral purity. To be betroffen is one way to “master the past,” to show contriteness, to confess, and to be absolved and purified.
  • In their famous book, written in the sixties, entitled The Inability to Mourn, Alexander and Margarete Mitscherlich analyzed the moral anesthesia that afflicted postwar Germans who would not face their past. They were numbed by defeat; their memories appeared to be blocked. They would or could not do their labor, and confess. They appeared to have completely forgotten that they had glorified a leader who caused the death of millions.
  • There is something religious about the act of being betroffen, something close to Pietism,
  • heart of Pietism was the moral renovation of the individual, achieved by passing through the anguish of contrition into the overwhelming realization of the assurance of God’s grace.” Pietism served as an antidote to the secular and rational ideas of the French Enlightenment.
  • It began in the seventeenth century with the works of Philipp Jakob Spener. He wanted to reform the Church and bring the Gospel into daily life, as it were, by stressing good works and individual spiritual labor.
  • German television is rich in earnest discussion programs where people sit at round tables and debate the issues of the day. The audience sits at smaller tables, sipping drinks as the featured guests hold forth. The tone is generally serious, but sometimes the arguments get heated. It is easy to laugh at the solemnity of these programs, but there is much to admire about them. It is partly through these talk shows that a large number of Germans have become accustomed to political debate.
  • There was a real dilemma: at least two generations had been educated to renounce war and never again to send German soldiers to the front, educated, in other words, to want Germany to be a larger version of Switzerland. But they had also been taught to feel responsible for the fate of Israel, and to be citizens of a Western nation, firmly embedded in a family of allied Western nations. The question was whether they really could be both.
  • the Gulf War showed that German pacifism could not be dismissed simply as anti-Americanism or a rebellion against Adenauer’s West.
  • the West German mistrust of East Germans—the East Germans whose soldiers still marched in goose step, whose petit bourgeois style smacked of the thirties, whose system of government, though built on a pedestal of antifascism, contained so many disturbing remnants of the Nazi past; the East Germans, in short, who had been living in “Asia.”
  • Michael, the Israeli, compared the encounter of Westerners (“Wessies”) with Easterners (“Ossies”) with the unveiling of the portrait of Dorian Gray: the Wessies saw their own image and they didn’t like what they saw.
  • he added: “I also happen to think Japanese and Germans are racists.”
  • Germany for its Nazi inheritance and its sellout to the United States. But now that Germany had been reunified, with its specters of “Auschwitz” and its additional hordes of narrow-minded Ossies, Adenauer was deemed to have been right after all.
  • The picture was of Kiel in 1945, a city in ruins. He saw me looking at it and said: “It’s true that whoever is being bombed is entitled to some sympathy from us.”
  • “My personal political philosophy and maybe even my political ambition has to do with an element of distrust for the people I represent, people whose parents and grandparents made Hitler and the persecution of the Jews possible.”
  • in the seventies he had tried to nullify verdicts given in Nazi courts—without success until well into the eighties. One of the problems was that the Nazi judiciary itself was never purged. This continuity was broken only by time.
  • To bury Germany in the bosom of its Western allies, such as NATO and the EC, was to bury the distrust of Germans. Or so it was hoped. As Europeans they could feel normal, Western, civilized. Germany; the old “land in the middle,” the Central European colossus, the power that fretted over its identity and was haunted by its past, had become a Western nation.
  • It is a miracle, really, how quickly the Germans in the Federal Republic became civilized. We are truly part of the West now. We have internalized democracy. But the Germans of the former GDR, they are still stuck in a premodern age. They are the ugly Germans, very much like the West Germans after the war, the people I grew up with. They are not yet civilized.”
  • “I like the Germans very much, but I think they are a dangerous people. I don’t know why—perhaps it is race, or culture, or history. Whatever. But we Japanese are the same: we swing from one extreme to the other. As peoples, we Japanese, like the Germans, have strong collective discipline. When our energies are channeled in the right direction, this is fine, but when they are misused, terrible things happen.”
  • to be put in the same category as the Japanese—even to be compared—bothered many Germans. (Again, unlike the Japanese, who made the comparison often.) Germans I met often stressed how different they were from the Japanese,
  • To some West Germans, now so “civilized,” so free, so individualistic, so, well, Western, the Japanese, with their group discipline, their deference to authority, their military attitude toward work, might appear too close for comfort to a self-image only just, and perhaps only barely, overcome.
  • To what extent the behavior of nations, like that of individual people, is determined by history, culture, or character is a question that exercises many Japanese, almost obsessively.
  • not much sign of betroffenheit on Japanese television during the Gulf War. Nor did one see retired generals explain tactics and strategy. Instead, there were experts from journalism and academe talking in a detached manner about a faraway war which was often presented as a cultural or religious conflict between West and Middle East. The history of Muslim-Christian-Jewish animosity was much discussed. And the American character was analyzed at length to understand the behavior of George Bush and General Schwarzkopf.
  • In the words of one Albrecht Fürst von Urach, a Nazi propagandist, Japanese emperor worship was “the most unique fusion in the world of state form, state consciousness, and religious fanaticism.” Fanaticism was, of course, a positive word in the Nazi lexicon.
  • the identity question nags in almost any discussion about Japan and the outside world.
  • It was a respectable view, but also one founded on a national myth of betrayal. Japan, according to the myth, had become the unique moral nation of peace, betrayed by the victors who had sat in judgment of Japan’s war crimes; betrayed in Vietnam, in Afghanistan, in Nicaragua; betrayed by the arms race, betrayed by the Cold War; Japan had been victimized not only by the “gratuitous,” perhaps even “racist,” nuclear attacks on Hiroshima and Nagasaki, but by all subsequent military actions taken by the superpowers,
  • When the Prime Minister of Japan, Shidehara Kijuro, protested in 1946 to General MacArthur that it was all very well saying that Japan should assume moral leadership in renouncing war, but that in the real world no country would follow this example, MacArthur replied: “Even if no country follows you, Japan will lose nothing. It is those who do not support this who are in the wrong.” For a long time most Japanese continued to take this view.
  • What is so convenient in the cases of Germany and Japan is that pacifism happens to be a high-minded way to dull the pain of historical guilt. Or, conversely, if one wallows in it, pacifism turns national guilt into a virtue, almost a mark of superiority, when compared to the complacency of other nations.
  • The denial of historical discrimination is not just a way to evade guilt. It is intrinsic to pacifism. To even try to distinguish between wars, to accept that some wars are justified, is already an immoral position.
  • That Kamei discussed this common paranoia in such odd, Volkish terms could mean several things: that some of the worst European myths got stuck in Japan, that the history of the Holocaust had no impact, or that Japan is in some respects a deeply provincial place. I think all three explanations apply.
  • “the problem with the U.S.-Japan relationship is difficult. A racial problem, really. Yankees are friendly people, frank people. But, you know, it’s hard. You see, we have to be friendly …”
  • Like Oda, indeed like many people of the left, Kamei thought in racial terms. He used the word jinshu, literally race. He did not even use the more usual minzoku, which corresponds, in the parlance of Japanese right-wingers, to Volk, or the more neutral kokumin, meaning the citizens of a state.
  • many Germans in the liberal democratic West have tried to deal honestly with their nation’s terrible past, the Japanese, being different, have been unable to do so. It is true that the Japanese, compared with the West Germans, have paid less attention to the suffering they inflicted on others, and shown a greater inclination to shift the blame. And liberal democracy, whatever it may look like on paper, has not been the success in Japan that it was in the German Federal Republic. Cultural differences might account for this. But one can look at these matters in a different, more political way. In his book The War Against the West, published in London in 1938, the Hungarian scholar Aurel Kolnai followed the Greeks in his definition of the West: “For the ancient Greeks ‘the West’ (or ‘Europe’) meant society with a free constitution and self-government under recognized rules, where ‘law is king,’ whereas the ‘East’ (or ‘Asia’) signified theocratic societies under godlike rulers whom their subjects serve ‘like slaves.’
  • According to this definition, both Hitler’s Germany and prewar Japan were of the East.
  • There was a great irony here: in their zeal to make Japan part of the West, General MacArthur and his advisers made it impossible for Japan to do so in spirit. For a forced, impotent accomplice is not really an accomplice at all.
  • In recent years, Japan has often been called an economic giant and a political dwarf. But this has less to do with a traditional Japanese mentality—isolationism, pacifism, shyness with foreigners, or whatnot—than with the particular political circumstances after the war that the United States helped to create.
  • when the Cold War prompted the Americans to make the Japanese subvert their constitution by creating an army which was not supposed to exist, the worst of all worlds appeared: sovereignty was not restored, distrust remained, and resentment mounted.
  • Kamei’s hawks are angry with the Americans for emasculating Japan; Oda’s doves hate the Americans for emasculating the “peace constitution.” Both sides dislike being forced accomplices, and both feel victimized, which is one reason Japanese have a harder time than Germans in coming to terms with their wartime past.
  • As far as the war against the Jews is concerned, one might go back to 1933, when Hitler came to power. Or at the latest to 1935, when the race laws were promulgated in Nuremberg. Or perhaps those photographs of burning synagogues on the night of November 9, 1938, truly marked the first stage of the Holocaust.
  • There is the famous picture of German soldiers lifting the barrier on the Polish border in 1939, but was that really the beginning? Or did it actually start with the advance into the Rhineland in 1936, or was it the annexation of the Sudetenland, or Austria, or Czechoslovakia?
  • IT IS DIFFICULT TO SAY when the war actually began for the Germans and the Japanese. I cannot think of a single image that fixed the beginning of either war in the public mind.
  • Possibly to avoid these confusions, many Germans prefer to talk about the Hitlerzeit (Hitler era) instead of “the war.”
  • only Japanese of a liberal disposition call World War II the Pacific War. People who stick to the idea that Japan was fighting a war to liberate Asia from Bolshevism and white colonialism call it the Great East Asian War (Daitōa Sensō), as in the Great East Asian Co-Prosperity Sphere.
  • The German equivalent, I suppose, would be the picture of Soviet soldiers raising their flag on the roof of the gutted Reichstag in Berlin.
  • People of this opinion separate the world war of 1941–45 from the war in China, which they still insist on calling the China Incident.
  • Liberals and leftists, on the other hand, tend to splice these wars together and call them the Fifteen-Year War (1931–45).
  • images marking the end are more obvious.
  • argued that the struggle against Western imperialism actually began in 1853, with the arrival in Japan of Commodore Perry’s ships, and spoke of the Hundred-Year War.
  • These are among the great clichés of postwar Japan: shorthand for national defeat, suffering, and humiliation.
  • The Germans called it Zusammenbruch (the collapse) or Stunde Null (Zero Hour): everything seemed to have come to an end, everything had to start all over. The Japanese called it haisen (defeat) or shusen (termination of the war).
  • kokka (nation, state) and minzoku (race, people) are not quite of the same order as Sonderbehandlung (special treatment) or Einsatzgruppe (special action squad). The jargon of Japanese imperialism was racist and overblown, but it did not carry the stench of death camps.
  • The German people are spiritually starved, Adenauer told him. “The imagination has to be provided for.” This was no simple matter, especially in the German language, which had been so thoroughly infected by the jargon of mass murder.
  • All they had been told to believe in, the Germans and the Japanese, everything from the Führerprinzip to the emperor cult, from the samurai spirit to the Herrenvolk, from Lebensraum to the whole world under one (Japanese) roof, all that lay in ruins
  • How to purge this language from what a famous German philologist called the Lingua Tertii Imperii? “… the language is no longer lived,” wrote George Steiner in 1958, “it is merely spoken.”
  • out of defeat and ruin a new school of literature (and cinema) did arise. It is known in Germany as Trümmerliteratur (literature of the ruins). Japanese writers who came of age among the ruins called themselves the yakeato seidai (burnt-out generation). Much literature of the late forties and fifties was darkened by nihilism and despair.
  • It was as though Germany—Sonderweg or no Sonderweg—needed only to be purged of Nazism, while Japan’s entire cultural tradition had to be overhauled.
  • In Germany there was a tradition to fall back on. In the Soviet sector, the left-wing culture of the Weimar Republic was actively revived. In the Western sectors, writers escaped the rats and the ruins by dreaming of Goethe. His name was often invoked to prove that Germany, too, belonged to the humanist, enlightened strain of European civilization.
  • the Americans (and many Japanese leftists) distrusted anything associated with “feudalism,” which they took to include much of Japan’s premodern past. Feudalism was the enemy of democracy. So not only did the American censors, in their effort to teach the Japanese democracy, forbid sword-fight films and samurai dramas, but at one point ninety-eight Kabuki plays were banned too.
  • yet, what is remarkable about much of the literature of the period, or more precisely, of the literature about that time, since much of it was written later, is the deep strain of romanticism, even nostalgia. This colors personal memories of people who grew up just after the war as well.
  • If the mushroom cloud and the imperial radio speech are the clichés of defeat, the scene of an American soldier (usually black) raping a Japanese girl (always young, always innocent), usually in a pristine rice field (innocent, pastoral Japan), is a stock image in postwar movies about the occupation.
  • To Ango, then, as to other writers, the ruins offered hope. At last the Japanese, without “the fake kimono” of traditions and ideals, were reduced to basic human needs; at last they could feel real love, real pain; at last they would be honest. There was no room, among the ruins, for hypocrisy.
  • Böll was able to be precise about the end of the Zusammenbruch and the beginning of bourgeois hypocrisy and moral amnesia. It came on June 20, 1948, the day of the currency reform, the day that Ludwig Erhard, picked by the Americans as Economics Director in the U.S.-British occupation zone, gave birth to the Deutsche Mark. The DM, from then on, would be the new symbol of West German national pride;
  • the amnesia, and definitely the identification with the West, was helped further along by the Cold War. West Germany now found itself on the same side as the Western allies. Their common enemy was the “Asiatic” Soviet empire. Fewer questions needed to be asked.
  • Indeed, to some people the Cold War simply confirmed what they had known all along: Germany always had been on the right side, if only our American friends had realized it earlier.
  • The process of willed forgetfulness culminated in the manic effort of reconstruction, in the great rush to prosperity.
  • “Prosperity for All” was probably the best that could have happened to the Germans of the Federal Republic. It took the seed of resentment (and thus future extremism) out of defeat. And the integration of West Germany into a Western alliance was a good thing too.
  • The “inability to mourn,” the German disassociation from the piles of corpses strewn all over Central and Eastern Europe, so that the Third Reich, as the Mitscherlichs put it, “faded like a dream,” made it easier to identify with the Americans, the victors, the West.
  • Yet the disgust felt by Böll and others for a people getting fat (“flabby” is the usual term, denoting sloth and decadence) and forgetting about its murderous past was understandable.
  • The Brückners were the price Germany had to pay for the revival of its fortunes. Indeed, they were often instrumental in it. They were the apparatchik who functioned in any system, the small, efficient fish who voted for Christian conservatives in the West and became Communists in the East.
  • Staudte was clearly troubled by this, as were many Germans, but he offered no easy answers. Perhaps it was better this way: flabby democrats do less harm than vengeful old Nazis.
  • the forgetful, prosperous, capitalist Federal Republic of Germany was in many more or less hidden ways a continuation of Hitler’s Reich. This perfectly suited the propagandists of the GDR, who would produce from time to time lists of names of former Nazis who were prospering in the West. These lists were often surprisingly accurate.
  • In a famous film, half fiction, half documentary, made by a number of German writers and filmmakers (including Böll) in 1977, the continuity was made explicit. The film, called Germany in Autumn (Deutschland im Herbst),
  • Rainer Werner Fassbinder was one of the participants in this film. A year later he made The Marriage of Maria Braun.
  • To lifelong “antifascists” who had always believed that the Federal Republic was the heir to Nazi Germany, unification seemed—so they said—almost like a restoration of 1933. The irony was that many Wessies saw their new Eastern compatriots as embarrassing reminders of the same unfortunate past.
  • Rarely was the word “Auschwitz” heard more often than during the time of unification, partly as an always salutary reminder that Germans must not forget, but partly as an expression of pique that the illusion of a better, antifascist, anticapitalist, idealistic Germany, born in the ruins of 1945, and continued catastrophically for forty years in the East, had now been dashed forever.
  • Ludwig Erhard’s almost exact counterpart in Japan was Ikeda Hayato, Minister of Finance from 1949 and Prime Minister from 1960 to 1964. His version of Erhard’s “Prosperity for All” was the Double Your Incomes policy, which promised to make the Japanese twice as rich in ten years. Japan had an average growth rate of 11 percent during the 1960s.
  • It explains, at any rate, why the unification of the two Germanys was considered a defeat by antifascists on both sides of the former border.
  • Very few wartime bureaucrats had been purged. Most ministries remained intact. Instead it was the Communists, who had welcomed the Americans as liberators, who were purged after 1949, the year China was “lost.”
  • so the time of ruins was seen by people on the left as a time of missed chances and betrayal. Far from achieving a pacifist utopia of popular solidarity, they ended up with a country driven by materialism, conservatism, and selective historical amnesia.
  • the “red purges” of 1949 and 1950 and the return to power of men whose democratic credentials were not much better helped to turn many potential Japanese friends of the United States into enemies. For the Americans were seen as promoters of the right-wing revival and the crackdown on the left.
  • For exactly twelve years Germany was in the hands of a criminal regime, a bunch of political gangsters who had started a movement. Removing this regime was half the battle.
  • It is easier to change political institutions and hope that habits and prejudices will follow. This, however, was more easily done in Germany than in Japan.
  • There had not been a cultural break either in Japan. There were no exiled writers and artists who could return to haunt the consciences of those who had stayed.
  • There was no Japanese Thomas Mann or Alfred Döblin. In Japan, everyone had stayed.
  • In Japan there was never a clear break between a fascist and a prefascist past. In fact, Japan was never really a fascist state at all. There was no fascist or National Socialist ruling party, and no Führer either. The closest thing to it would have been the emperor, and whatever else he may have been, he was not a fascist dictator.
  • whereas after the war Germany lost its Nazi leaders, Japan lost only its admirals and generals.
  • Japan was effectively occupied only by the Americans. West Germany was part of NATO and the European Community, and the GDR was in the Soviet empire. Japan’s only formal alliance is with the United States, through a security treaty that many Japanese have opposed.
  • But the systematic subservience of Japan meant that the country never really grew up. There is a Japanese fixation on America, an obsession which goes deeper, I believe, than German anti-Americanism,
  • Yet nothing had stayed entirely the same in Japan. The trouble was that virtually all the changes were made on American orders. This was, of course, the victor’s prerogative, and many changes were beneficial.
  • American Hijiki, a novella by Nosaka Akiyuki, is, to my mind, a masterpiece in the short history of Japanese Trümmerliteratur.
  • Older Japanese do, however, remember the occupation, the first foreign army occupation in their national history. But it was, for the Japanese, a very unusual army. Whereas the Japanese armies in Asia had brought little but death, rape, and destruction, this one came with Glenn Miller music, chewing gum, and lessons in democracy. These blessings left a legacy of gratitude, rivalry, and shame.
  • did these films teach the Japanese democracy? Oshima thinks not. Instead, he believes, Japan learned the values of “progress” and “development.” Japan wanted to be just as rich as America—no, even richer:
  • I think it is a romantic assumption, based less on history than on myth; a religious notion, expressed less through scholarship than through monuments, memorials, and historical sites turned into sacred grounds.
  • The past, wrote the West German historian Christian Meier, is in our bones. “For a nation to appropriate its history,” he argued, “is to look at it through the eyes of identity.” What we have “internalized,” he concluded, is Auschwitz.
  • Auschwitz is such a place, a sacred symbol of identity for Jews, Poles, and perhaps even Germans. The question is what or whom Germans are supposed to identify with.
  • The idea that visiting the relics of history brings the past closer is usually an illusion. The opposite is more often true.
  • To visit the site of suffering, any description of which cannot adequately express the horror, is upsetting, not because one gets closer to knowing what it was actually like to be a victim, but because such visits stir up emotions one cannot trust. It is tempting to take on the warm moral glow of identification—so easily done and so presumptuous—with the victims:
  • Were the crimes of Auschwitz, then, part of the German “identity”? Was genocide a product of some ghastly flaw in German culture, the key to which might be found in the sentimental proverbs, the cruel fairy tales, the tight leather shorts?
  • yet the imagination is the only way to identify with the past. Only in the imagination—not through statistics, documents, or even photographs—do people come alive as individuals, do stories emerge, instead of History.
  • It is all right to let the witnesses speak, in the courtroom, in the museums, on videotape (Claude Lanzmann’s Shoah has been shown many times on German television), but it is not all right for German artists to use their imagination.
  • the reluctance in German fiction to look Auschwitz in the face, the almost universal refusal to deal with the Final Solution outside the shrine, the museum, or the schoolroom, suggests a fear of committing sacrilege.
  • beneath the fear of bad taste or sacrilege may lie a deeper problem. To imagine people in the past as people of flesh and blood, not as hammy devils in silk capes, is to humanize them. To humanize is not necessarily to excuse or to sympathize, but it does demolish the barriers of abstraction between us and them. We could, under certain circumstances, have been them.
  • the flight into religious abstraction was to be all too common among Germans of the Nazi generation, as well as their children; not, as is so often the case with Jews, to lend mystique to a new identity, as a patriotic Zionist, but on the contrary to escape from being the heir to a peculiarly German crime, to get away from having to “internalize” Auschwitz, or indeed from being German at all.
  • a Hollywood soap opera, a work of skillful pop, which penetrated the German imagination in a way nothing had before. Holocaust was first shown in Germany in January 1979. It was seen by 20 million people, about half the adult population of the Federal Republic; 58 percent wanted it to be repeated; 12,000 letters, telegrams, and postcards were sent to the broadcasting stations; 5,200 called the stations by telephone after the first showing; 72.5 percent were positive, 7.3 percent negative.
  • “After Holocaust,” wrote a West German woman to her local television station, “I feel deep contempt for those beasts of the Third Reich. I am twenty-nine years old and a mother of three children. When I think of the many mothers and children sent to the gas chambers, I have to cry. (Even today the Jews are not left in peace. We Germans have the duty to work every day for peace in Israel.) I bow to the victims of the Nazis, and I am ashamed to be a German.”
  • Auschwitz was a German crime, to be sure. “Death is a master from Germany.” But it was a different Germany. To insist on viewing history through the “eyes of identity,” to repeat the historian Christian Meier’s phrase, is to resist the idea of change.
  • The novelist Martin Walser, who was a child during the war, believes, like Meier, that Auschwitz binds the German people, as does the language of Goethe. When a Frenchman or an American sees pictures of Auschwitz, “he doesn’t have to think: We human beings! He can think: Those Germans! Can we think: Those Nazis! I for one cannot …”
  • Adorno, a German Jew who wished to save high German culture, on whose legacy the Nazis left their bloody finger marks, resisted the idea that Auschwitz was a German crime. To him it was a matter of modern pathology, the sickness of the “authoritarian personality,” of the dehumanized SS guards, those inhumane cogs in a vast industrial wheel.
  • Is there no alternative to these opposing views? I believe there is.
  • To the majority of Japanese, Hiroshima is the supreme symbol of the Pacific War. All the suffering of the Japanese people is encapsulated in that almost sacred word: Hiroshima. But it is more than a symbol of national martyrdom; Hiroshima is a symbol of absolute evil, often compared to Auschwitz.
  • has the atmosphere of a religious center. It has martyrs, but no single god. It has prayers, and it has a ready-made myth about the fall of man. Hiroshima, says a booklet entitled Hiroshima Peace Reader, published by the Hiroshima Peace Culture Foundation, “is no longer merely a Japanese city. It has become recognized throughout the world as a Mecca of world peace.”
  • They were not enshrined in the Japanese park, and later attempts by local Koreans to have the monument moved into Peace Park failed. There could only be one cenotaph, said the Hiroshima municipal authorities. And the cenotaph did not include Koreans.
  • What is interesting about Hiroshima—the Mecca rather than the modern Japanese city, which is prosperous and rather dull—is the tension between its universal aspirations and its status as the exclusive site of Japanese victimhood.
  • it is an opinion widely held by Japanese nationalists. The right always has been concerned with the debilitating effects on the Japanese identity of war guilt imposed by American propaganda.
  • Hiroshima, Uno wrote, should have been left as it was, in ruins, just as Auschwitz, so he claims, was deliberately preserved by the Jews. By reminding the world of their martyrdom, he said, the Jews have kept their racial identity intact and restored their virility.
  • The Japanese, in contrast, were duped by the Americans into believing that the traces of Japanese suffering should be swept away by the immediate reconstruction of Hiroshima. As a result, the postwar Japanese lack an identity and their racial virility has been sapped by American propaganda about Japanese war guilt.
  • But the idea that the bomb was a racist experiment is less plausible, since the bomb was developed for use against Nazi Germany.
  • There is another view, however, held by leftists and liberals, who would not dream of defending the “Fifteen-Year War.” In this view, the A-bomb was a kind of divine punishment for Japanese militarism. And having learned their lesson through this unique suffering, having been purified through hellfire and purgatory, so to speak, the Japanese people have earned the right, indeed have the sacred duty, to sit in judgment of others, specifically the United States, whenever they show signs of sinning against the “Hiroshima spirit.”
  • The left has its own variation of Japanese martyrdom, in which Hiroshima plays a central role. It is widely believed, for instance, that countless Japanese civilians fell victim to either a wicked military experiment or to the first strike in the Cold War, or both.
  • However, right-wing nationalists care less about Hiroshima than about the idée fixe that the “Great East Asian War” was to a large extent justified.
  • This is at the heart of what is known as Peace Education, which has been much encouraged by the leftist Japan Teachers’ Union and has been regarded with suspicion by the conservative government. Peace Education has traditionally meant pacifism, anti-Americanism, and a strong sympathy for Communist states, especially China.
  • The A-bomb, in this version, was dropped to scare the Soviets away from invading Japan. This at least is an arguable position.
  • left-wing pacifism in Japan has something in common with the romantic nationalism usually associated with the right: it shares the right’s resentment about being robbed by the Americans of what might be called a collective memory.
  • The romantic pacifists believe that the United States, to hide its own guilt and to rekindle Japanese militarism in aid of the Cold War, tried to wipe out the memory of Hiroshima.
  • few events in World War II have been described, analyzed, lamented, reenacted, re-created, depicted, and exhibited so much and so often as the bombing of Hiroshima
  • The problem with Nagasaki was not just that Hiroshima came first but also that Nagasaki had more military targets than Hiroshima. The Mitsubishi factories in Nagasaki produced the bulk of Japanese armaments. There was also something else, which is not often mentioned: the Nagasaki bomb exploded right over the area where outcasts and Christians lived. And unlike in Hiroshima, much of the rest of the city was spared the worst.
  • yet, despite these diatribes, the myth of Hiroshima and its pacifist cult is based less on American wickedness than on the image of martyred innocence and visions of the apocalypse.
  • The comparison between Hiroshima and Auschwitz is based on this notion; the idea, namely, that Hiroshima, like the Holocaust, was not part of the war, not even connected with it, but “something that occurs at the end of the world
  • still I wonder whether it is really so different from the position of many Germans who wish to “internalize” Auschwitz, who see Auschwitz “through the eyes of identity.”
  • This allows the Japanese to take two routes at once, a national one, as unique victims of the A-bomb, and a universal one, as the apostles of the Hiroshima spirit. This, then, is how Japanese pacifists, engaged in Peace Education, define the Japanese identity.
  • the case for Hiroshima is at least open to debate. The A-bomb might have saved lives; it might have shortened the war. But such arguments are incompatible with the Hiroshima spirit.
  • In either case, nationality has come to be based less on citizenship than on history, morality, and a religious spirit.
  • The problem with this quasi-religious view of history is that it makes it hard to discuss past events in anything but nonsecular terms. Visions of absolute evil are unique, and they are beyond human explanation or even comprehension. To explain is hubristic and amoral.
  • in the history of Japan’s foreign wars, the city of Hiroshima is far from innocent. When Japan went to war with China in 1894, the troops set off for the battlefronts from Hiroshima, and the Meiji emperor moved his headquarters there. The city grew wealthy as a result. It grew even wealthier when Japan went to war with Russia eleven years later, and Hiroshima once again became the center of military operations. As the Hiroshima Peace Reader puts it with admirable conciseness, “Hiroshima, secure in its position as a military city, became more populous and prosperous as wars and incidents occurred throughout the Meiji and Taisho periods.” At the time of the bombing, Hiroshima was the base of the Second General Headquarters of the Imperial Army (the First was in Tokyo). In short, the city was swarming with soldiers. One of the few literary masterpieces to emerge
  • when a local group of peace activists petitioned the city of Hiroshima in 1987 to incorporate the history of Japanese aggression into the Peace Memorial Museum, the request was turned down. The petition for an “Aggressors’ Corner” was prompted by junior high school students from Osaka, who had embarrassed Peace Museum officials by asking for an explanation about Japanese responsibility for the war.
  • Yukoku Ishinkai (Society for Lament and National Restoration) thought the bombing had saved Japan from total destruction. But he insisted that Japan could not be held solely responsible for the war. The war, he said, had simply been part of the “flow of history.”
  • They also demanded an official recognition of the fact that some of the Korean victims of the bomb had been slave laborers. (Osaka, like Kyoto and Hiroshima, still has a large Korean population.) Both requests were denied. So a group called Peace Link was formed, from local people, many of whom were Christians, antinuclear activists, or involved with discriminated-against minorities.
  • The history of the war, or indeed any history, is not what the Hiroshima spirit is about. This is why Auschwitz is the only comparison that is officially condoned. Anything else is too controversial, too much part of the “flow of history.”
  • “You see, this museum was not really intended to be a museum. It was built by survivors as a place of prayer for the victims and for world peace. Mankind must build a better world. That is why Hiroshima must persist. We must go back to the basic roots. We must think of human solidarity and world peace. Otherwise we just end up arguing about history.”
  • Only when a young Japanese history professor named Yoshimi Yoshiaki dug up a report in American archives in the 1980s did it become known that the Japanese had stored 15,000 tons of chemical weapons on and near the island and that a 200-kilogram container of mustard gas was buried under Hiroshima.
  • The island was the site of what was the largest toxic gas factory in the Japanese Empire. More than 5,000 people worked there during the war, many of them women and schoolchildren. About 1,600 died of exposure to hydrocyanic acid gas, nausea gas, and lewisite. Some were damaged for life. Official Chinese sources claim that more than 80,000 Chinese fell victim to gases produced at the factory. The army was so secretive about the place that the island simply disappeared from Japanese maps.
  • in 1988, through the efforts of survivors, the small museum was built, “to pass on,” in the words of the museum guide, “the historical truth to future generations.”
  • Surviving workers from the factory, many of whom suffered from chronic lung diseases, asked for official recognition of their plight in the 1950s. But the government turned them down. If the government had compensated the workers, it would have been an official admission that the Japanese Army had engaged in an illegal enterprise. When a brief mention of chemical warfare crept into Japanese school textbooks, the Ministry of Education swiftly took it out.
  • I asked him about the purpose of the museum. He said: “Before shouting ‘no more war,’ I want people to see what it was really like. To simply look at the past from the point of view of the victim is to encourage hatred.”
  • “Look,” he said, “when you fight another man, and hit him and kick him, he will hit and kick back. One side will win. How will this be remembered? Do we recall that we were kicked, or that we started the kicking ourselves? Without considering this question, we cannot have peace.”
  • The fact that Japanese had buried poison gas under Hiroshima did not lessen the horror of the A-bomb. But it put Peace Park, with all its shrines, in a more historical perspective. It took the past away from God and put it in the fallible hands of man.
  • What did he think of the Peace Museum in Hiroshima? “At the Hiroshima museum it is easy to feel victimized,” he said. “But we must realize that we were aggressors too. We were educated to fight for our country. We made toxic gas for our country. We lived to fight the war. To win the war was our only goal.”
  • Nanking, as the capital of the Nationalist government, was the greatest prize in the attempted conquest of China. Its fall was greeted in Japan with banner headlines and nationwide celebration. For six weeks Japanese Army officers allowed their men to run amok. The figures are imprecise, but tens of thousands, perhaps hundreds of thousands (the Chinese say 300,000) of Chinese soldiers and civilians, many of them refugees from other towns, were killed. And thousands of women between the ages of about nine and seventy-five were raped, mutilated, and often murdered.
  • Was it a deliberate policy to terrorize the Chinese into submission? The complicity of the officers suggests there was something to this. But it might also have been a kind of payoff to the Japanese troops for slogging through China in the freezing winter without decent pay or rations. Or was it largely a matter of a peasant army running out of control? Or just the inevitable consequence of war, as many Japanese maintain?
  • inevitable cruelty of war. An atrocity is a willful act of criminal brutality, an act that violates the law as well as any code of human decency. It isn’t that the Japanese lack such codes or are morally incapable of grasping the concept. But “atrocity,” like “human rights,” is part of a modern terminology which came from the West, along with “feminism,” say, or “war crimes.” To right-wing nationalists it has a leftist ring, something subversive, something almost anti-Japanese.
  • During the Tokyo War Crimes Tribunal, Nanking had the same resonance as Auschwitz had in Nuremberg. And being a symbol, the Nanking Massacre is as vulnerable to mythology and manipulation as Auschwitz and Hiroshima.
  • Mori’s attitude also raises doubts about Ruth Benedict’s distinction between Christian “guilt culture” and Confucian “shame culture.”
  • In her opinion, a “society that inculcates absolute standards of morality and relies on man’s developing a conscience is a guilt culture by definition …” But in “a culture where shame is a major sanction, people are chagrined about acts which we expect people to feel guilty about.” However, this “chagrin cannot be relieved, as guilt can be, by confession and atonement …”
  • If memory was admitted at all, the Mitscherlichs wrote about Germans in the 1950s, “it was only in order to balance one’s own guilt against that of others. Many horrors had been unavoidable, it was claimed, because they had been dictated by crimes committed by the adversary.” This was precisely what many Japanese claimed, and still do claim. And it is why Mori insists on making his pupils view the past from the perspective of the aggressors.
  • Two young Japanese officers, Lieutenant N. and Lieutenant M., were on their way to Nanking and decided to test their swordsmanship: the first to cut off one hundred Chinese heads would be the winner. And thus they slashed their way through Chinese ranks, taking scalps in true samurai style. Lieutenant M. got 106, and Lieutenant N. bagged 105.
  • The story made a snappy headline in a major Tokyo newspaper: “Who Will Get There First! Two Lieutenants Already Claimed 80.” In the Nanking museum is a newspaper photograph of the two friends, glowing with youthful high spirits. Lieutenant N. boasted in the report that he had cut the necks off 56 men without even denting the blade of his ancestral sword.
  • I was told by a Japanese veteran who had fought in Nanking that such stories were commonly made up or at least exaggerated by Japanese reporters, who were ordered to entertain the home front with tales of heroism.
  • Honda Katsuichi, a famous Asahi Shimbun reporter, was told the story in Nanking. He wrote it up in a series of articles, later collected in a book entitled A Journey to China, published in 1981.
  • the whole thing developed into the Nankin Ronso, or Nanking Debate. In 1984, an anti-Honda book came out, by Tanaka Masaaki, entitled The Fabrication of the “Nanking Massacre.”
  • back in Japan, Lieutenant M. began to revise his story. Speaking at his old high school, he said that in fact he had beheaded only four or five men in actual combat. As for the rest … “After we occupied the city, I stood facing a ditch, and told the Chinese prisoners to step forward. Since Chinese soldiers are stupid, they shuffled over to the ditch, one by one, and I cleanly cut off their heads.”
  • The nationalist intellectuals are called goyo gakusha by their critics. It is a difficult term to translate, but the implied meaning is “official scholars,” who do the government’s bidding.
  • One reason is that there are very few modern historians in Japan. Until the end of the war, it would have been dangerously subversive, even blasphemous, for a critical scholar to write about modern…
  • The other reason was that modern history was not considered academically respectable. It was too fluid, too political, too controversial. Until 1955, there was not one modern historian on the staff of Tokyo University. History stopped around the middle of the nineteenth century. And even now, modern…
  • the debate on the Japanese war is conducted almost entirely outside Japanese universities, by journalists, amateur historians, political columnists, civil rights activists, and so forth. This means that the zanier theories of the likes of Tanaka…
  • they have considerable influence on public opinion, as television commentators, lecturers, and contributors to popular magazines. Virtually none of them are professional historians.
  • Tanaka and others have pointed out that it is physically impossible for one man to cut off a hundred heads with one blade, and that for the same reason Japanese troops could never have…
  • Besides, wrote Tanaka, none of the Japanese newspapers reported any massacre at the time, so why did it suddenly come up…
  • He admits that a few innocent people got killed in the cross fire, but these deaths were incidental. Some soldiers were doubtless a bit rough, but…
  • In any case, so the argument invariably ends, Hiroshima, having been planned in cold blood, was a far worse crime. “Unlike in Europe or China,” writes Tanaka, “you won’t find one instance of planned, systematic murder in the entire history of Japan.” This is because the Japanese…
  • even he defends an argument that all the apologists make too: “On the battlefield men face the ultimate extremes of human existence, life or death. Extreme conduct, although still ethically…
  • atrocities carried out far from the battlefield dangers and imperatives and according to a rational plan were acts of evil barbarism. The Auschwitz gas chambers of our ‘ally’ Germany and the atomic bombing of our…
  • The point that it was not systematic was made by leftist opponents of the official scholars too. The historian Ienaga Saburo, for example, wrote that the Nanking Massacre, whose scale and horror he does not deny, “may have been a reaction to the fierce Chinese resistance after the Shanghai fighting.” Ienaga’s…
  • The nationalist right takes the opposite view. To restore the true identity of Japan, the emperor must be reinstated as a religious head of state, and Article Nine must be revised to make Japan a legitimate military power again. For this reason, the Nanking Massacre, or any other example of extreme Japanese aggression, has to be ignored, softened, or denied.
  • the question remains whether the raping and killing of thousands of women, and the massacre of thousands, perhaps hundreds of thousands, of other unarmed people, in the course of six weeks, can still be called extreme conduct in the heat of battle. The question is pertinent, particularly when such extreme violence is justified by an ideology which teaches the aggressors that killing an inferior race is in accordance with the will of their divine emperor.
  • The Nanking Massacre, for leftists and many liberals too, is the main symbol of Japanese militarism, supported by the imperial (and imperialist) cult. Which is why it is a keystone of postwar pacifism. Article Nine of the constitution is necessary to avoid another Nanking Massacre.
  • The politics behind the symbol are so divided and so deeply entrenched that they hinder a rational historical debate about what actually happened in 1937. The more one side insists on Japanese guilt, the more the other insists on denying it.
  • The Japanese, he said, should see their history through their own eyes, for “if we rely on the information of aliens and alien countries, who use history for the sake of propaganda, then we are in danger of losing the sense of our own history.” Yet another variation of seeing history through the eyes of identity.
  • their emotions were often quite at odds with the idea of “shame culture” versus “guilt culture.” Even where the word for shame, hazukashii, was used, its meaning was impossible to distinguish from the Western notion of guilt.
  • [Raping the women] wasn’t so bad in itself. But then they killed them. You see, rape was against military regulations, so we had to destroy the evidence. While the women were fucked, they were considered human, but when we killed them, they were just pigs. We felt no shame about it, no guilt. If we had, we couldn’t have done it.
  • “Whenever we would enter a village, the first thing we’d do was steal food, then we’d take the women and rape them, and finally we’d kill all the men, women, and children to make sure they couldn’t slip away and tell the Chinese troops where we were. Otherwise we wouldn’t have been able to sleep at night.”
  • Clearly, then, the Nanking Massacre had been the culmination of countless massacres on a smaller scale. But it had been mass murder without a genocidal ideology. It was barbaric, but to Azuma and his comrades, barbarism was part of war.
  • “Sexual desire is human,” he said. “Since I suffered from a venereal disease, I never actually did it with Chinese women. But I did peep at their private parts. We’d always order them to drop their trousers. They never wore any underwear, you know. But the others did it with any woman that crossed our path.
  • He did have friends, however, who took part in the killings. One of them, Masuda Rokusuke, killed five hundred men by the Yangtze River with his machine gun. Azuma visited his friend in the hospital just before he died in the late 1980s. Masuda was worried about going to hell. Azuma tried to reassure him that he was only following orders. But Masuda remained convinced that he was going to hell.
  • “One of the worst moments I can remember was the killing of an old man and his grandson. The child was bayoneted and the grandfather started to suck the boy’s blood, as though to conserve his grandson’s life a bit longer. We watched a while and then killed both. Again, I felt no guilt, but I was bothered by this kind of thing. I felt confused. So I decided to keep a diary. I thought it might help me think straight.”
  • What about his old comrades? I asked. How did they discuss the war? “Oh,” said Azuma, “we wouldn’t talk about it much. When we did, it was to justify it. The Chinese resisted us, so we had to do what we did, and so on. None of us felt any remorse. And I include myself.”
  • He got more and more agitated. “They turned the emperor into a living god, a false idol, like the Ayatollah in Iran or like Kim Il Sung. Because we believed in the divine emperor, we were prepared to do anything, anything at all, kill, rape, anything. But I know he fucked his wife every night, just like we do …” He paused and lowered his voice. “But you know we cannot say this in Japan, even today. It is impossible in this country to tell the truth.”
  • My first instinct was to applaud West German education. Things had come a long way since 1968. There had been no school classes at Nuremberg, or even at the Auschwitz trial in Frankfurt from 1963 till 1965. Good for the teacher, I thought. Let them hear what was done. But I began to have doubts.
  • Just as belief belongs in church, surely history education belongs in school. When the court of law is used for history lessons, then the risk of show trials cannot be far off. It may be that show trials can be good politics—though I have my doubts about this too. But good politics don’t necessarily serve the truth.
  • There is a story about the young Richard von Weizsäcker when he was in Nuremberg at the time of the war crimes trials. He is said to have turned to a friend and to have remarked, in his best Wehrmacht officer style, that they should storm the court and release the prisoners. The friend, rather astonished, asked why on earth they should do such a thing. “So that we can try them ourselves” was Weizsäcker’s alleged response.
  • There was also concern that international law might not apply to many of the alleged crimes. If revenge was the point, why drag the law into it? Why not take a political decision to punish? This was what Becker, in his office, called the Italian solution: “You kill as many people as you can in the first six weeks, and then you forget about it: not very legal, but for the purposes of purification, well …”
  • Becker was not against holding trials as such. But he believed that existing German laws should have been applied, instead of retroactive laws about crimes against peace (preparing, planning, or waging an aggressive war).
  • It was to avoid a travesty of the legal process that the British had been in favor of simply executing the Nazi leaders without a trial. The British were afraid that a long trial might change public opinion. The trial, in the words of one British diplomat, might be seen as a “put-up job.”
  • The question is how to achieve justice without distorting the law, and how to stage a trial by victors over the vanquished without distorting history. A possibility would have been to make victors’ justice explicit, by letting military courts try the former enemies.
  • This would have avoided much hypocrisy and done less damage to the due process of law in civilian life. But if the intention was to teach Germans a history lesson, a military court would have run into the same problems as a civilian one.
  • Due process or revenge. This problem had preoccupied the ancient Greek tragedians. To break the cycle of vendetta, Orestes had to be tried by the Athenian court for the murder of his mother. Without a formal trial, the vengeful Furies would continue to haunt the living.
  • The aspect of revenge might have been avoided had the trial been held by German judges. There was a precedent for this, but it was not a happy one. German courts had been allowed to try alleged war criminals after World War I. Despite strong evidence against them, virtually all were acquitted, and the foreign delegates were abused by local mobs. Besides, Wetzka was right: German judges had collaborated with the Nazi regime; they could hardly be expected to be impartial. So it was left to the victors to see that justice was done.
  • When the American chief prosecutor in Nuremberg, Robert H. Jackson, was asked by the British judge, Lord Justice Lawrence, what he thought the purpose of the trials should be, Jackson answered that they were to prove to the world that the German conduct of the war had been unjustified and illegal, and to demonstrate to the German people that this conduct deserved severe punishment and to prepare them for
  • What becomes clear from this kind of language is that law, politics, and religion became confused: Nuremberg became a morality play, in which Göring, Kaltenbrunner, Keitel, and the others were cast in the leading roles. It was a play that claimed to deliver justice, truth, and the defeat of evil.
  • The Nuremberg trials were to be a history lesson, then, as well as a symbolic punishment of the German people—a moral history lesson cloaked in all the ceremonial trappings of due legal process. They were the closest that man, or at least the men belonging to the victorious powers, could come to dispensing divine justice. This was certainly the way some German writers felt about it. Some welcomed it
  • We now have this law on our books, the prosecutor said: “It will be used against the German aggressor this time. But the four powers, who are conducting this trial in the name of twenty-three nations, know this law and declare: Tomorrow we shall be judged before history by the same yardstick by which we judge these defendants today.”
  • “We had seen through the amorality of the Nazis, and wanted to rid ourselves of it. It was from the moral seriousness of the American prosecution that we wished to learn sensible political thinking. “And we did learn. “And we allowed ourselves to apply this thinking to the present time. For example, we will use it now to take quite literally the morality of those American prosecutors. Oradour and Lidice—today they are cities in South Vietnam” (Italics in the original text.)
  • The play ends with a statement by the American prosecutor on crimes against peace
  • (It was decided in 1979, after the shock of the Holocaust TV series, to abolish the statute of limitations for crimes against humanity.)
  • after Nuremberg, most Germans were tired of war crimes. And until the mid-1950s German courts were permitted to deal only with crimes committed by Germans against other Germans. It took the bracing example of the Eichmann trial in Jerusalem to jolt German complacency—that, and the fact that crimes committed before 1946 would no longer be subject to prosecution after 1965.
  • Trying the vanquished for conventional war crimes was never convincing, since the victors could be accused of the same. Tu quoque could be invoked, in private if not in the Nuremberg court, when memories of Dresden and Soviet atrocities were still fresh. But Auschwitz had no equivalent. That was part of another war, or, better, it was not really a war at all; it was mass murder pure and simple, not for reasons of strategy or tactics, but of ideology alone.
  • Whether you are a conservative who wants Germany to be a “normal” nation or a liberal/leftist engaging in the “labor of mourning,” the key event of World War II is Auschwitz, not the Blitzkrieg, not Dresden, not even the war on the eastern front. This was the one history lesson of Nuremberg that stuck. As Hellmut Becker said, despite his skepticism about Nuremberg: “It was most important that the German population realized that crimes against humanity had taken place and that during the trials it became clear how they had taken place.”
  • In his famous essay on German guilt, Die Schuldfrage (The Question of German Guilt), written in 1946, Karl Jaspers distinguished four categories of guilt: criminal guilt, for breaking the law; political guilt, for being part of a criminal political system; moral guilt, for personal acts of criminal behavior; and metaphysical guilt, for failing in one’s responsibility to maintain the standards of civilized humanity. Obviously these categories overlap.
  • The great advantage, in his view, of a war crimes trial was its limitation. By allowing the accused to defend themselves with arguments, by laying down the rules of due process, the victors limited their own powers.
  • In any event, the trial distanced the German people even further from their former leaders. It was a comfortable distance, and few people had any desire to bridge it. This might be why the Nazi leaders are hardly ever featured in German plays, films, or novels.
  • And: “For us Germans this trial has the advantage that it distinguishes between the particular crimes of the leaders and that it does not condemn the Germans collectively.”
  • Serious conservative intellectuals, such as Hermann Lübbe, argued that too many accusations would have blocked West Germany’s way to becoming a stable, prosperous society. Not that Lübbe was an apologist for the Third Reich. Far from it: the legitimacy of the Federal Republic, in his opinion, lay in its complete rejection of the Nazi state.
  • their reaction was often one of indignation. “Why me?” they would say. “I just did my duty. I just followed orders like every decent German. Why must I be punished?”
  • “that these criminals were so like all of us at any point between 1918 and 1945 that we were interchangeable, and that particular circumstances caused them to take a different course, which resulted in this trial, these matters could not be properly discussed in the courtroom.” The terrible acts of individuals are lifted from their historical context. History is reduced to criminal pathology and legal argument.
  • they will not do as history lessons, nor do they bring us closer to that elusive thing that Walser seeks, a German identity.
  • The GDR had its own ways of using courts of law to deal with the Nazi past. They were in many respects the opposite of West German ways. The targets tended to be the very people that West German justice had ignored.
  • Thorough purges took place in the judiciary, the bureaucracy, and industry. About 200,000 people—four-fifths of the Nazi judges and prosecutors—lost their jobs. War crimes trials were held too; until 1947 by the Soviets, after that in German courts.
  • There were two more before 1957, and none after that. All in all, about 30,000 people had been tried and 500 executed. In the Federal Republic the number was about 91,000, and none were executed, as the death penalty was abolished by the 1949 constitution.
  • East German methods were both ruthless and expedient, and the official conclusion to the process was that the GDR no longer had to bear the burden of guilt. As state propaganda ceaselessly pointed out, the guilty were all in the West. There the fascists still sat as judges and ran the industries that produced the economic boom, the Wirtschaftswunder.
  • Although some of his critics, mostly on the old left, in both former Germanys, called him a grand inquisitor, few doubted the pastor’s good intentions. His arguments for trials were moral, judicial, and historical. He set out his views in a book entitled The Stasi Documents. Echoes of an earlier past rang through almost every page. “We can
  • Germany of the guilty, the people who felt betroffen by their own “inability to mourn,” the nation that staged the Auschwitz and Majdanek trials, that Germany was now said to stand in judgment over the other Germany—the Germany of the old antifascists, the Germany that had suffered under two dictatorships, the Germany of uniformed marches, goose-stepping drills, and a secret police network, vast beyond even the Gestapo’s dreams.
  • It is almost a form of subversion to defend a person who stands accused in court. So the idea of holding political and military leaders legally accountable for their actions was even stranger in Japan than it was in Germany. And yet, the shadows thrown by the Tokyo trial have been longer and darker in Japan than those of the Nuremberg trial in Germany.
  • never was—unlike, say, the railway station or the government ministry—a central institution of the modern Japanese state. The law was not a means to protect the people from arbitrary rule; it was, rather, a way for the state to exercise more control over the people. Even today, there are relatively few lawyers in Japan.
  • Japanese school textbooks are the product of so many compromises that they hardly reflect any opinion at all. As with all controversial matters in Japan, the more painful, the less said. In a standard history textbook for middle school students, published in the 1980s, mention of the Tokyo trial takes up less than half a page. All it says is that the trial…
  • As long as the British and the Americans continued to be oppressors in Asia, wrote a revisionist historian named Hasegawa Michiko, who was born in 1945, “confrontation with Japan was inevitable. We did not fight for Japan alone. Our aim was to fight a Greater East Asia War. For this reason the war between Japan and China and Japan’s oppression of…
  • West German textbooks describe the Nuremberg trial in far more detail. And they make a clear distinction between the retroactive law on crimes against peace and the…
  • Nationalist revisionists talk about “the Tokyo Trial View of History,” as though the conclusions of the tribunal had been nothing but rabid anti-Japanese propaganda. The tribunal has been called a lynch mob, and Japanese leftists are blamed for undermining the morale of generations of Japanese by passing on the Tokyo Trial View of History in school textbooks and liberal publications. The Tokyo Trial…
  • When Hellmut Becker said that few Germans wished to criticize the procedures of the Nuremberg trial because the criminality of the defendants was so plain to see, he was talking about crimes against humanity—more precisely, about the Holocaust. And it was…
  • The knowledge compiled by the doctors of Unit 731—of freezing experiments, injection of deadly diseases, vivisections, among other things—was considered so valuable by the Americans in 1945 that the doctors…
  • those aspects of the war that were most revolting and furthest removed from actual combat, such as the medical experiments on human guinea pigs (known as “logs”) carried out by Unit 731 in…
  • There never were any Japanese war crimes trials, nor is there a Japanese Ludwigsburg. This is partly because there was no exact equivalent of the Holocaust. Even though the behavior of Japanese troops was often barbarous, and the psychological consequences of State Shinto and emperor worship were frequently as hysterical as Nazism, Japanese atrocities were part of a…
  • This difference between (West) German and Japanese textbooks is not just a matter of detail; it shows a gap in perception. To the Japanese, crimes against humanity are not associated with an equivalent to the…
  • on what grounds would Japanese courts have prosecuted their own former leaders? Hata’s answer: “For starting a war which they knew they would lose.” Hata used the example of General Galtieri and his colleagues in Argentina after losing the Falklands War. In short, they would have been tried for losing the war, and the intense suffering they inflicted on their own people. This is as though German courts in 1918 had put General Hindenburg or General Ludendorff on trial.
  • it shows yet again the fundamental difference between the Japanese war, in memory and, I should say, in fact, and the German experience. The Germans fought a war too, but the one for which they tried their own people, the Bogers and the Schwammbergers, was a war they could not lose, unless defeat meant that some of the enemies survived.
  • Just as German leftists did in the case of Nuremberg, Kobayashi used the trial to turn the tables against the judges. But not necessarily to mitigate Japanese guilt. Rather, it was his intention to show how the victors had betrayed the pacifism they themselves had imposed on Japan.
  • the Japanese left has a different view of the Tokyo trial than the revisionist right. It is comparable to the way the German left looks upon Nuremberg. This was perfectly, if somewhat long-windedly, expressed in Kobayashi Masaki’s documentary film Tokyo Trial, released in 1983. Kobayashi is anything but an apologist for the Japanese war. His most famous film, The Human Condition, released in 1959, took a highly critical view of the war.
  • Yoshimoto’s memory was both fair and devastating, for it pointed straight at the reason for the trial’s failure. The rigging of a political trial—the “absurd ritual”—undermined the value of that European idea of law.
  • Yoshimoto went on to say something no revisionist would ever mention: “I also remember my fresh sense of wonder at this first encounter with the European idea of law, which was so different from the summary justice in our Asiatic courts. Instead of getting your head chopped off without a proper trial, the accused were able to defend themselves, and the careful judgment appeared to follow a public procedure.”
  • Yoshimoto Takaaki, philosopher of the 1960s New Left. Yet he wrote in 1986 that “from our point of view as contemporaries and witnesses, the trial was partly plotted from the very start. It was an absurd ritual before slaughtering the sacrificial lamb.”
  • This, from all accounts, was the way it looked to most Japanese, even if they had little sympathy for most of the “lambs.” In 1948, after three years of American occupation censorship and boosterism, people listened to the radio broadcast of the verdicts with a sad but fatalist shrug: this is what you can expect when you lose the war.
  • Some of the information even surprised the defendants. General Itagaki Seishiro, a particularly ruthless figure, who was in command of prison camps in Southeast Asia and whose troops had massacred countless Chinese civilians, wrote in his diary: “I am learning of matters I had not known and recalling things I had forgotten.”
  • hindsight, one can only conclude that instead of helping the Japanese to understand and accept their past, the trial left them with an attitude of cynicism and resentment.
  • After it was over, the Nippon Times pointed out the flaws of the trial, but added that “the Japanese people must ponder over why it is that there has been such a discrepancy between what they thought and what the rest of the world accepted almost as common knowledge. This is at the root of the tragedy which Japan brought upon herself.”
  • Political trials produce politicized histories. This is what the revisionists mean when they talk about the Tokyo Trial View of History. And they are right, even if their own conclusions are not.
  • Frederick Mignone, one of the prosecutors, said a trifle histrionically that “in Japan and in the Orient in general, the trial is one of the most important phases of the occupation. It has received wide coverage in the Japanese press and revealed for the first time to millions of Japanese the scheming, duplicity, and insatiable desire for power of her entrenched militaristic leaders, writing a much-needed history of events which otherwise would not have been written.” It was indeed much-needed, since so little was known.
  • The president of the Tokyo tribunal, Sir William Webb, thought “the crimes of the German accused were far more heinous, varied and extensive than those of the Japanese accused.” Put in another way, nearly all the defendants at Nuremberg, convicted of crimes against peace, were also found guilty of crimes against humanity. But half the Japanese defendants received life sentences for political crimes only.
  • the question of responsibility is always a tricky affair in Japan, where formal responsibility is easier to identify than actual guilt. Not only were there many men, such as the hero of Kinoshita’s play, who took the blame for what their superiors had done—a common practice in Japan, in criminal gangs as well as in politics or business corporations—but the men at the top were often not at all in control of their unscrupulous subordinates.
  • “These men were not the hoodlums who were the powerful part of the group which stood before the tribunal at Nuremberg, dregs of a criminal environment, thoroughly schooled in the ways of crime and knowing no other methods but those of crime. These men were supposed to be the elite of the nation, the honest and trusted leaders to whom the fate of the nation had been confidently entrusted
  • many people were wrongly accused of the wrong things for the wrong reasons. This is why there was such sympathy in Japan for the men branded by foreigners as war criminals, particularly the so-called Class B and Class C criminals, the men who followed orders, or gave them at a lower level: field commanders, camp guards, and so on.
  • “The Japanese people are of the opinion that the actual goal of the war crimes tribunals was never realized, since the judgments were reached by the victors alone and had the character of revenge. The [Japanese] war criminal is not conscious of having committed a crime, for he regards his deeds as acts of war, committed out of patriotism.”
  • Yamashita Tomoyuki. Terrible atrocities were committed under his command in the Philippines. The sacking of Manila in 1945 was about as brutal as the Nanking Massacre. So to depict him in the movie as a peaceful gentleman, while portraying the American prosecutor in Manila as one of the main villains, might seem an odd way to view the past.
  • The Shrine ranks highest. It is the supreme symbol of authority, shouldered (like a shrine on festival days) by the Officials.
  • The political theorist Maruyama Masao called the prewar Japanese government a “system of irresponsibilities.” He identified three types of political personalities: the portable Shrine, the Official, and the Outlaw.
  • those who carry it, the Officials, are the ones with actual power. But the Officials—bureaucrats, politicians, admirals and generals—are often manipulated by the lowest-ranking Outlaws, the military mavericks, the hotheaded officers in the field, the mad nationalists, and other agents of violence.
  • But it was not entirely wrong, for the trial was rigged. Yamashita had no doubt been a tough soldier, but in this case he had been so far removed from the troops who ran amok in Manila that he could hardly have known what was going on. Yet the American prosecutor openly talked about his desire to hang “Japs.”
  • When the system spins out of control, as it did during the 1930s, events are forced by violent Outlaws, reacted to by nervous Officials, and justified by the sacred status of the Shrines.
  • Here we come to the nub of the problem, which the Tokyo trial refused to deal with, the role of the Shrine in whose name every single war crime was committed, Emperor Hirohito,
  • The historian Ienaga Saburo tells a story about a Japanese schoolchild in the 1930s who was squeamish about having to dissect a live frog. The teacher rapped him hard on the head with his knuckles and said: “Why are you crying about one lousy frog? When you grow up you’ll have to kill a hundred, two hundred Chinks.”
  • the lethal consequences of the emperor-worshipping system of irresponsibilities did emerge during the Tokyo trial. The savagery of Japanese troops was legitimized, if not driven, by an ideology that did not include a Final Solution but was as racialist as Hitler’s National Socialism. The Japanese were the Asian Herrenvolk, descended from the gods.
  • A veteran of the war in China said in a television interview that he was able to kill Chinese without qualms only because he didn’t regard them as human.
  • For to keep the emperor in place (he could at least have been made to resign), Hirohito’s past had to be freed from any blemish; the symbol had to be, so to speak, cleansed from what had been done in its name.
  • The same was true of the Japanese imperial institution, no matter who sat on the throne, a ruthless war criminal or a gentle marine biologist.
  • the chaplain at Sugamo prison, questioned Japanese camp commandants about their reasons for mistreating POWs. This is how he summed up their answers: “They had a belief that any enemy of the emperor could not be right, so the more brutally they treated their prisoners, the more loyal to their emperor they were being.”
  • The Mitscherlichs described Hitler as “an object on which Germans depended, to which they transferred responsibility, and he was thus an internal object. As such, he represented and revived the ideas of omnipotence that we all cherish about ourselves from infancy.
  • The fear after 1945 was that without the emperor Japan would be impossible to govern. In fact, MacArthur behaved like a traditional Japanese strongman (and was admired for doing so by many Japanese), using the imperial symbol to enhance his own power. As a result, he hurt the chances of a working Japanese democracy and seriously distorted history.
  • Aristides George Lazarus, the defense counsel of one of the generals on trial, was asked to arrange that “the military defendants, and their witnesses, would go out of their way during their testimony to include the fact that Hirohito was only a benign presence when military actions or programs were discussed at meetings that, by protocol, he had to attend.” No doubt the other counsel were given similar instructions. Only once during the trial
Javier E

12 Rules for Life: An Antidote to Chaos (Jordan B. Peterson) - 0 views

  • RULES? MORE RULES? REALLY? Isn’t life complicated enough, restricting enough, without abstract rules that don’t take our unique, individual situations into account? And given that our brains are plastic, and all develop differently based on our life experiences, why even expect that a few rules might be helpful to us all?
  • “I’ve got some good news…and I’ve got some bad news,” the lawgiver yells to them. “Which do you want first?” “The good news!” the hedonists reply. “I got Him from fifteen commandments down to ten!” “Hallelujah!” cries the unruly crowd. “And the bad?” “Adultery is still in.”
  • Maps of Meaning was sparked by Jordan’s agonized awareness, as a teenager growing up in the midst of the Cold War, that much of mankind seemed on the verge of blowing up the planet to defend their various identities. He felt he had to understand how it could be that people would sacrifice everything for an “identity,”
  • the story of the golden calf also reminds us that without rules we quickly become slaves to our passions—and there’s nothing freeing about that.
  • And the story suggests something more: unchaperoned, and left to our own untutored judgment, we are quick to aim low and worship qualities that are beneath us—in this case, an artificial animal that brings out our own animal instincts in a completely unregulated way.
  • Similarly, in this book Professor Peterson doesn’t just propose his twelve rules, he tells stories, too, bringing to bear his knowledge of many fields as he illustrates and explains why the best rules do not ultimately restrict us but instead facilitate our goals and make for fuller, freer lives.
  • Peterson wasn’t really an “eccentric”; he had sufficient conventional chops, had been a Harvard professor, was a gentleman (as cowboys can be) though he did say damn and bloody a lot, in a rural 1950s sort of way. But everyone listened, with fascination on their faces, because he was in fact addressing questions of concern to everyone at the table.
  • unlike many academics who take the floor and hold it, if someone challenged or corrected him he really seemed to like it. He didn’t rear up and neigh. He’d say, in a kind of folksy way, “Yeah,” and bow his head involuntarily, wag it if he had overlooked something, laughing at himself for overgeneralizing. He appreciated being shown another side of an issue, and it became clear that thinking through a problem was, for him, a dialogic process.
  • for an egghead Peterson was extremely practical. His examples were filled with applications to everyday life: business management, how to make furniture (he made much of his own), designing a simple house, making a room beautiful (now an internet meme) or in another, specific case related to education, creating an online writing project that kept minority students from dropping out of school by getting them to do a kind of psychoanalytic exercise on themselves,
  • These Westerners were different: self-made, unentitled, hands on, neighbourly and less precious than many of their big-city peers, who increasingly spend their lives indoors, manipulating symbols on computers. This cowboy psychologist seemed to care about a thought only if it might, in some way, be helpful to someone.
  • I was drawn to him because here was a clinician who also had given himself a great books education, and who not only loved soulful Russian novels, philosophy and ancient mythology, but who also seemed to treat them as his most treasured inheritance. But he also did illuminating statistical research on personality and temperament, and had studied neuroscience. Though trained as a behaviourist, he was powerfully drawn to psychoanalysis with its focus on dreams, archetypes, the persistence of childhood conflicts in the adult, and the role of defences and rationalization in everyday life. He was also an outlier in being the only member of the research-oriented Department of Psychology at the University of Toronto who also kept a clinical practice.
  • Maps of Meaning, published nearly two decades ago, shows Jordan’s wide-ranging approach to understanding how human beings and the human brain deal with the archetypal situation that arises whenever we, in our daily lives, must face something we do not understand.
  • The brilliance of the book is in his demonstration of how rooted this situation is in evolution, our DNA, our brains and our most ancient stories. And he shows that these stories have survived because they still provide guidance in dealing with uncertainty, and the unavoidable unknown.
  • this is why many of the rules in this book, being based on Maps of Meaning, have an element of universality to them.
  • We are ambivalent about rules, even when we know they are good for us. If we are spirited souls, if we have character, rules seem restrictive, an affront to our sense of agency and our pride in working out our own lives. Why should we be judged according to another’s rule?
  • And he felt he had to understand the ideologies that drove totalitarian regimes to a variant of that same behaviour: killing their own citizens.
  • Ideologies are simple ideas, disguised as science or philosophy, that purport to explain the complexity of the world and offer remedies that will perfect it.
  • Ideologues are people who pretend they know how to “make the world a better place” before they’ve taken care of their own chaos within.
  • Ideologies are substitutes for true knowledge, and ideologues are always dangerous when they come to power, because a simple-minded I-know-it-all approach is no match for the complexity of existence.
  • To understand ideology, Jordan read extensively about not only the Soviet gulag, but also the Holocaust and the rise of Nazism. I had never before met a person, born Christian and of my generation, who was so utterly tormented by what happened in Europe to the Jews, and who had worked so hard to understand how it could have occurred.
  • I saw what now millions have seen online: a brilliant, often dazzling public speaker who was at his best riffing like a jazz artist; at times he resembled an ardent Prairie preacher (not in evangelizing, but in his passion, in his ability to tell stories that convey the life-stakes that go with believing or disbelieving various ideas). Then he’d just as easily switch to do a breathtakingly systematic summary of a series of scientific studies. He was a master at helping students become more reflective, and take themselves and their futures seriously. He taught them to respect many of the greatest books ever written. He gave vivid examples from clinical practice, was (appropriately) self-revealing, even of his own vulnerabilities, and made fascinating links between evolution, the brain and religious stories.
  • Above all, he alerted his students to topics rarely discussed in university, such as the simple fact that all the ancients, from Buddha to the biblical authors, knew what every slightly worn-out adult knows, that life is suffering.
  • chances are, if you or someone you love is not suffering now, they will be within five years, unless you are freakishly lucky. Rearing kids is hard, work is hard, aging, sickness and death are hard, and Jordan emphasized that doing all that totally on your own, without the benefit of a loving relationship, or wisdom, or the psychological insights of the greatest psychologists, only makes it harder.
  • focused on triumphant heroes. In all these triumph stories, the hero has to go into the unknown, into an unexplored territory, and deal with a new great challenge and take great risks. In the process, something of himself has to die, or be given up, so he can be reborn and meet the challenge. This requires courage, something rarely discussed in a psychology class or textbook.
  • Views of Jordan’s first YouTube statements quickly numbered in the hundreds of thousands. But people have kept listening because what he is saying meets a deep and unarticulated need. And that is because alongside our wish to be free of rules, we all search for structure.
  • the first generation to have been so thoroughly taught two seemingly contradictory ideas about morality, simultaneously—at their schools, colleges and universities, by many in my own generation. This contradiction has left them at times disoriented and uncertain, without guidance and, more tragically, deprived of riches they don’t even know exist.
  • morality and the rules associated with it are just a matter of personal opinion or happenstance, “relative to” or “related to” a particular framework, such as one’s ethnicity, one’s upbringing, or the culture or historical…
  • The first idea or teaching is that morality is relative, at best a…
  • So, the decent thing to do—once it becomes apparent how arbitrary your, and your society’s, “moral values” are—is to show tolerance for people who think differently, and…
  • That emphasis on tolerance is so paramount that for many people one of the worst character flaws a person can have is to be “judgmental.”* And, since we don’t know right from wrong, or what is good, just about the most inappropriate thing an adult can…
  • And so a generation has been raised untutored in what was once called, aptly, “practical wisdom,” which guided previous generations. Millennials, often told they have received the finest education available anywhere, have actually…
  • professors, chose to devalue thousands of years of human knowledge about how to acquire virtue, dismissing it as passé, “…
  • They were so successful at it that the very word “virtue” sounds out of date, and someone using it appears…
  • The study of virtue is not quite the same as the study of morals (right and wrong, good and evil). Aristotle defined the virtues simply as the ways of behaving that are most conducive to happiness in life. Vice was…
  • Cultivating judgment about the difference between virtue and vice is the beginning of wisdom, something…
  • By contrast, our modern relativism begins by asserting that making judgments about how to live is impossible, because there is no real good, and no…
  • Thus relativism’s closest approximation to “virtue” is “tolerance.” Only tolerance will provide social cohesion between different groups, and save us from harming each other. On Facebook and other forms of social media, therefore, you signal your so-called…
  • Intolerance of others’ views (no matter how ignorant or incoherent they may be) is not simply wrong; in a world where there is no right or wrong, it is worse: it is a sign you are…
  • But it turns out that many people cannot tolerate the vacuum—the chaos—which is inherent in life, but made worse by this moral relativism; they cannot live without a moral compass,…
  • So, right alongside relativism, we find the spread of nihilism and despair, and also the opposite of moral relativism: the blind certainty offered by ideologies…
  • Dr. Norman Doidge, MD, is the author of The Brain That Changes Itself
  • so we arrive at the second teaching that millennials have been bombarded with. They sign up for a humanities course, to study the greatest books ever written. But they’re not assigned the books; instead they are given…
  • (But the idea that we can easily separate facts and values was and remains naive; to some extent, one’s values determine what one will pay…
  • For the ancients, the discovery that different people have different ideas about how, practically, to live, did not paralyze them; it deepened their understanding of humanity and led to some of the most satisfying conversations human beings have ever had, about how life might be lived.
  • Modern moral relativism has many sources. As we in the West learned more history, we understood that different epochs had different moral codes. As we travelled the seas and explored the globe, we learned of far-flung tribes on different continents whose different moral codes made sense relative to, or within the framework of, their societies. Science played a role, too, by attacking the religious view of the world, and thus undermining the religious grounds for ethics and rules. Materialist social science implied that we could divide the world into facts (which all could observe, and were objective and “real”) and values (…
  • it seems that all human beings are, by some kind of biological endowment, so ineradicably concerned with morality that we create a structure of laws and rules wherever we are. The idea that human life can be free of moral concerns is a fantasy.
  • given that we are moral animals, what must be the effect of our simplistic modern relativism upon us? It means we are hobbling ourselves by pretending to be something we are not. It is a mask, but a strange one, for it mostly deceives the one who wears it.
  • Far better to integrate the best of what we are now learning with the books human beings saw fit to preserve over millennia, and with the stories that have survived, against all odds, time’s tendency to obliterate.
  • these really are rules. And the foremost rule is that you must take responsibility for your own life. Period.
  • Jordan’s message that each individual has ultimate responsibility to bear; that if one wants to live a full life, one first sets one’s own house in order; and only then can one sensibly aim to take on bigger responsibilities.
  • if it’s uncertain that our ideals are attainable, why do we bother reaching in the first place? Because if you don’t reach for them, it is certain you will never feel that your life has meaning.
  • And perhaps because, as unfamiliar and strange as it sounds, in the deepest part of our psyche, we all want to be judged.
  • Instead of despairing about these differences in moral codes, Aristotle argued that though specific rules, laws and customs differed from place to place, what does not differ is that in all places human beings, by their nature, have a proclivity to make rules, laws and customs.
  • Freud never argued (as do some who want all culture to become one huge group therapy session) that one can live one’s entire life without ever making judgments, or without morality. In fact, his point in Civilization and Its Discontents is that civilization only arises when some restraining rules and morality are in place.
  • Aleksandr Solzhenitsyn, the great documenter of the slave-labour-camp horrors of the latter, once wrote that the “pitiful ideology” holding that “human beings are created for happiness” was an ideology “done in by the first blow of the work assigner’s cudgel.”1 In a crisis, the inevitable suffering that life entails can rapidly make a mockery of the idea that happiness is the proper pursuit of the individual. On the radio show, I suggested, instead, that a deeper meaning was required. I noted that the nature of such meaning was constantly re-presented in the great stories of the past, and that it had more to do with developing character in the face of suffering than with happiness.
  • I proposed in Maps of Meaning that the great myths and religious stories of the past, particularly those derived from an earlier, oral tradition, were moral in their intent, rather than descriptive. Thus, they did not concern themselves with what the world was, as a scientist might have it, but with how a human being should act.
  • I suggested that our ancestors portrayed the world as a stage—a drama—instead of a place of objects. I described how I had come to believe that the constituent elements of the world as drama were order and chaos, and not material things.
  • Order is where the people around you act according to well-understood social norms, and remain predictable and cooperative. It’s the world of social structure, explored territory, and familiarity. The state of Order is typically portrayed, symbolically—imaginatively—as masculine.
  • Chaos, by contrast, is where—or when—something unexpected happens.
  • As the antithesis of symbolically masculine order, it’s presented imaginatively as feminine. It’s the new and unpredictable suddenly emerging in the midst of the commonplace familiar. It’s Creation and Destruction,
  • Order is the white, masculine serpent; Chaos, its black, feminine counterpart. The black dot in the white—and the white in the black—indicate the possibility of transformation: just when things seem secure, the unknown can loom, unexpectedly and large. Conversely, just when everything seems lost, new order can emerge from catastrophe and chaos.
  • For the Taoists, meaning is to be found on the border between the ever-entwined pair. To walk that border is to stay on the path of life, the divine Way. And that’s much better than happiness.
  • trying to address a perplexing problem: the reason or reasons for the nuclear standoff of the Cold War. I couldn’t understand how belief systems could be so important to people that they were willing to risk the destruction of the world to protect them. I came to realize that shared belief systems made people intelligible to one another—and that the systems weren’t just about belief.
  • People who live by the same code are rendered mutually predictable to one another. They act in keeping with each other’s expectations and desires. They can cooperate. They can even compete peacefully, because everyone knows what to expect from everyone else.
  • Shared beliefs simplify the world, as well, because people who know what to expect from one another can act together to tame the world. There is perhaps nothing more important than the maintenance of this organization—this simplification. If it’s threatened, the great ship of state rocks.
  • It isn’t precisely that people will fight for what they believe. They will fight, instead, to maintain the match between what they believe, what they expect, and what they desire. They will fight to maintain the match between what they expect and how everyone is acting. It is precisely the maintenance of that match that enables everyone
  • There’s more to it, too. A shared cultural system stabilizes human interaction, but is also a system of value—a hierarchy of value, where some things are given priority and importance and others are not. In the absence of such a system of value, people simply cannot act. In fact, they can’t even perceive, because both action and perception require a goal, and a valid goal is, by necessity, something valued.
  • We experience much of our positive emotion in relation to goals. We are not happy, technically speaking, unless we see ourselves progressing—and the very idea of progression implies value.
  • Worse yet is the fact that the meaning of life without positive value is not simply neutral. Because we are vulnerable and mortal, pain and anxiety are an integral part of human existence. We must have something to set against the suffering that is intrinsic to Being.*2 We must have the meaning inherent in a profound system of value or the horror of existence rapidly becomes paramount. Then, nihilism beckons, with its hopelessness and despair.
  • So: no value, no meaning. Between value systems, however, there is the possibility of conflict. We are thus eternally caught between the most diamantine rock and the hardest of places:
  • loss of group-centred belief renders life chaotic, miserable, intolerable; presence of group-centred belief makes conflict with other groups inevitable.
  • In the West, we have been withdrawing from our tradition-, religion- and even nation-centred cultures, partly to decrease the danger of group conflict. But we are increasingly falling prey to the desperation of meaninglessness, and that is no improvement at all.
  • While writing Maps of Meaning, I was (also) driven by the realization that we can no longer afford conflict—certainly not on the scale of the world conflagrations of the twentieth century.
  • I came to a more complete, personal realization of what the great stories of the past continually insist upon: the centre is occupied by the individual.
  • It is possible to transcend slavish adherence to the group and its doctrines and, simultaneously, to avoid the pitfalls of its opposite extreme, nihilism. It is possible, instead, to find sufficient meaning in individual consciousness and experience.
  • How could the world be freed from the terrible dilemma of conflict, on the one hand, and psychological and social dissolution, on the other? The answer was this: through the elevation and development of the individual, and through the willingness of everyone to shoulder the burden of Being and to take the heroic path. We must each adopt as much responsibility as possible for individual life, society and the world.
  • We must each tell the truth and repair what is in disrepair and break down and recreate what is old and outdated. It is in this manner that we can and must reduce the suffering that poisons the world. It’s asking a lot. It’s asking for everything.
  • the alternative—the horror of authoritarian belief, the chaos of the collapsed state, the tragic catastrophe of the unbridled natural world, the existential angst and weakness of the purposeless individual—is clearly worse.
  • a title: 12 Rules for Life: An Antidote to Chaos. Why did that one rise up above all others? First and foremost, because of its simplicity. It indicates clearly that people need ordering principles, and that chaos otherwise beckons.
  • We require rules, standards, values—alone and together. We’re pack animals, beasts of burden. We must bear a load, to justify our miserable existence. We require routine and tradition. That’s order. Order can become excessive, and that’s not good, but chaos can swamp us, so we drown—and that is also not good. We need to stay on the straight and narrow path.
  • I hope that these rules and their accompanying essays will help people understand what they already know: that the soul of the individual eternally hungers for the heroism of genuine Being, and that the willingness to take on that responsibility is identical to the decision to live a meaningful life.
  • RULE 1   STAND UP STRAIGHT WITH YOUR SHOULDERS BACK
  • Because territory matters, and because the best locales are always in short supply, territory-seeking among animals produces conflict. Conflict, in turn, produces another problem: how to win or lose without the disagreeing parties incurring too great a cost.
  • It’s winner-take-all in the lobster world, just as it is in human societies, where the top 1 percent have as much loot as the bottom 50 percent11—and where the richest eighty-five people have as much as the bottom three and a half billion.
  • This principle is sometimes known as Price’s law, after Derek J. de Solla Price,13 the researcher who discovered its application in science in 1963. It can be modelled using an approximately L-shaped graph, with number of people on the vertical axis, and productivity or resources on the horizontal.
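The heavy-tailed shape the excerpt describes—a few people holding most of the output, with the curve hugging both axes—can be sketched with a quick simulation. This is an illustrative aside, not from the book: the `alpha` value and function names below are assumptions chosen to produce a roughly "80/20" Pareto shape.

```python
import random

def pareto_sample(n, alpha=1.16, seed=0):
    # alpha ~ 1.16 is a conventional choice that yields roughly an
    # "80/20" split; smaller alpha means a heavier tail.
    rng = random.Random(seed)
    return [rng.paretovariate(alpha) for _ in range(n)]

def top_share(values, fraction):
    # Fraction of total output produced by the top `fraction` of people.
    vals = sorted(values, reverse=True)
    k = max(1, int(len(vals) * fraction))
    return sum(vals[:k]) / sum(vals)

outputs = pareto_sample(10_000)
print(f"top 1% share of output:  {top_share(outputs, 0.01):.2f}")
print(f"top 20% share of output: {top_share(outputs, 0.20):.2f}")
```

Plotting how many people fall at each productivity level would give the approximately L-shaped graph the excerpt mentions: a tall spike of low producers near the vertical axis and a long, thin tail of prolific ones along the horizontal.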
  • Instead of undertaking the computationally difficult task of identifying the best man, the females outsource the problem to the machine-like calculations of the dominance hierarchy. They let the males fight it out and peel their paramours from the top.
  • The dominant male, with his upright and confident posture, not only gets the prime real estate and easiest access to the best hunting grounds. He also gets all the girls. It is exponentially more worthwhile to be successful, if you are a lobster, and male.
  • dominance hierarchies have been an essentially permanent feature of the environment to which all complex life has adapted. A third of a billion years ago, brains and nervous systems were comparatively simple. Nonetheless, they already had the structure and neurochemistry necessary to process information about status and society. The importance of this fact can hardly be overstated.
  • evolution works, in large part, through variation and natural selection. Variation exists for many reasons, including gene-shuffling (to put it simply) and random mutation. Individuals vary within a species for such reasons. Nature chooses from among them, across time. That theory, as stated, appears to account for the continual alteration of life-forms over the eons.
  • But there’s an additional question lurking under the surface: what exactly is the “nature” in “natural selection”? What exactly is “the environment” to which animals adapt?
  • Nature “selects.” The idea of selects contains implicitly nested within it the idea of fitness. It is “fitness” that is “selected.” Fitness, roughly speaking, is the probability that a given organism will leave offspring (will propagate its genes through time). The “fit” in “fitness” is therefore the matching of organismal attribute to environmental demand.
  • But nature, the selecting agent, is not a static selector—not in any simple sense.
  • As the environment supporting a species transforms and changes, the features that make a given individual successful in surviving and reproducing also transform and change. Thus, the theory of natural selection does not posit creatures matching themselves ever more precisely to a template specified by the world. It is more that creatures are in a dance with nature, albeit one that is deadly.
  • Nature is not simply dynamic, either. Some things change quickly, but they are nested within other things that change less quickly (music
  • It’s chaos, within order, within chaos, within higher order. The order that is most real is the order that is most unchanging—and that is not necessarily the order that is most easily seen. The leaf, when perceived, might blind the observer to the tree. The tree can blind him to the forest.
  • It is also a mistake to conceptualize nature romantically.
  • Unfortunately, “the environment” is also elephantiasis and guinea worms (don’t ask), anopheles mosquitoes and malaria, starvation-level droughts, AIDS and the Black Plague.
  • It is because of the existence of such things, of course, that we attempt to modify our surroundings, protecting our children, building cities and transportation systems and growing food and generating power.
  • this brings us to a third erroneous concept: that nature is something strictly segregated from the cultural constructs that have emerged within it.
  • It does not matter whether that feature is physical and biological, or social and cultural. All that matters, from a Darwinian perspective, is permanence—and the dominance hierarchy, however social or cultural it might appear, has been around for some half a billion years.
  • The dominance hierarchy is not capitalism. It’s not communism, either, for that matter. It’s not the military-industrial complex. It’s not the patriarchy—that disposable, malleable, arbitrary cultural artefact. It’s not even a human creation; not in the most profound sense. It is instead a near-eternal aspect of the environment, and much of what is blamed on these more ephemeral manifestations is a consequence of its unchanging existence.
  • We were struggling for position before we had skin, or hands, or lungs, or bones. There is little more natural than culture. Dominance hierarchies are older than trees.
  • The part of our brain that keeps track of our position in the dominance hierarchy is therefore exceptionally ancient and fundamental.17 It is a master control system, modulating our perceptions, values, emotions, thoughts and actions. It powerfully affects every aspect of our Being, conscious and unconscious alike.
  • The ancient part of your brain specialized for assessing dominance watches how you are treated by other people. On that evidence, it renders a determination of your value and assigns you a status. If you are judged by your peers as of little worth, the counter restricts serotonin availability. That makes you much more physically and psychologically reactive to any circumstance or event that might produce emotion, particularly if it is negative. You need that reactivity. Emergencies are common at the bottom, and you must be ready to survive. Unfortunately, that physical hyper-response, that constant alertness, burns up a lot of precious energy and physical resources.
  • It will leave you far more likely to live, or die, carelessly, for a rare opportunity at pleasure, when it manifests itself. The physical demands of emergency preparedness will wear you down in every way.21
  • If you have a high status, on the other hand, the counter’s cold, pre-reptilian mechanics assume that your niche is secure, productive
  • You can delay gratification, without forgoing it forever. You can afford to be a reliable and thoughtful citizen.
  • Sometimes, however, the counter mechanism can go wrong. Erratic habits of sleeping and eating can interfere with its function. Uncertainty can throw it for a loop. The body, with its various parts, needs to function like a well-rehearsed orchestra. Every system must play its role properly, and at exactly the right time, or noise and chaos ensue. It is for this reason that routine is so necessary. The acts of life we repeat every day need to be automatized. They must be turned into stable and reliable habits, so they lose their complexity and gain predictability and simplicity.
  • It is for such reasons that I always ask my clinical clients first about sleep. Do they wake up in the morning at approximately the time the typical person wakes up, and at the same time every day?
  • The next thing I ask about is breakfast. I counsel my clients to eat a fat and protein-heavy breakfast as soon as possible after they awaken (no simple carbohydrates, no sugars,
  • I have had many clients whose anxiety was reduced to subclinical levels merely because they started to sleep on a predictable schedule and eat breakfast.
  • Other bad habits can also interfere with the counter’s accuracy.
  • There are many systems of interaction between brain, body and social world that can get caught in positive feedback loops. Depressed people, for example, can start feeling useless and burdensome, as well as grief-stricken and pained. This makes them withdraw from contact with friends and family. Then the withdrawal makes them more lonesome and isolated, and more likely to feel useless and burdensome. Then they withdraw more. In this manner, depression spirals and amplifies.
  • If someone is badly hurt at some point in life—traumatized—the dominance counter can transform in a manner that makes additional hurt more rather than less likely. This often happens in the case of people, now adults, who were viciously bullied during childhood or adolescence. They become anxious and easily upset. They shield themselves with a defensive crouch, and avoid the direct eye contact interpretable as a dominance challenge.
  • With their capacity for aggression strait-jacketed within a too-narrow morality, those who are only or merely compassionate and self-sacrificing (and naïve and exploitable) cannot call forth the genuinely righteous and appropriately self-protective anger necessary to defend themselves. If you can bite, you generally don’t have to. When skillfully integrated, the ability to respond with aggression and violence decreases rather than increases the probability that actual aggression will become necessary.
  • Naive, harmless people usually guide their perceptions and actions with a few simple axioms: people are basically good; no one really wants to hurt anyone else; the threat (and, certainly, the use) of force, physical or otherwise, is wrong. These axioms collapse, or worse, in the presence of individuals who are genuinely malevolent.27
  • I have had clients who were terrified into literally years of daily hysterical convulsions by the sheer look of malevolence on their attackers’ faces. Such individuals typically come from hyper-sheltered families, where nothing terrible is allowed to exist, and everything is fairyland wonderful (or else).
  • When the wakening occurs—when once-naïve people recognize in themselves the seeds of evil and monstrosity, and see themselves as dangerous (at least potentially)— their fear decreases. They develop more self-respect. Then, perhaps, they begin to resist oppression. They see that they have the ability to withstand, because they are terrible too. They see they can and must stand up, because they begin to understand how genuinely monstrous they will become, otherwise,
  • There is very little difference between the capacity for mayhem and destruction, integrated, and strength of character. This is one of the most difficult lessons of life.
  • even if you came by your poor posture honestly—even if you were unpopular or bullied at home or in grade school28—it’s not necessarily appropriate now. Circumstances change. If you slump around, with the same bearing that characterizes a defeated lobster, people will assign you a lower status, and the old counter that you share with crustaceans, sitting at the very base of your brain, will assign you a low dominance number.
  • the other, far more optimistic lesson of Price’s law and the Pareto distribution: those who start to have will probably get more.
  • Some of these upwardly moving loops can occur in your own private, subjective space.
  • If you are asked to move the muscles one by one into a position that looks happy, you will report feeling happier. Emotion is partly bodily expression, and can be amplified (or dampened) by that expression.29
  • To stand up straight with your shoulders back is to accept the terrible responsibility of life, with eyes wide open.
  • It means deciding to voluntarily transform the chaos of potential into the realities of habitable order. It means adopting the burden of self-conscious vulnerability, and accepting the end of the unconscious paradise of childhood, where finitude and mortality are only dimly comprehended. It means willingly undertaking the sacrifices necessary to generate a productive and meaningful reality (it means acting to please God, in the ancient language).
  • So, attend carefully to your posture. Quit drooping and hunching around. Speak your mind. Put your desires forward, as if you had a right to them—at least the same right as others. Walk tall and gaze forthrightly ahead. Dare to be dangerous. Encourage the serotonin to flow plentifully through the neural pathways desperate for its calming influence.
  • Thus emboldened, you will embark on the voyage of your life, let your light shine, so to speak, on the heavenly hill, and pursue your rightful destiny. Then the meaning of your life may be sufficient to keep the corrupting influence of mortal despair at bay. Then you may be able to accept the terrible burden of the World, and find joy.
  • RULE 2   TREAT YOURSELF LIKE SOMEONE YOU ARE RESPONSIBLE FOR HELPING
  • People are better at filling and properly administering prescription medication to their pets than to themselves. That
  • It is difficult to conclude anything from this set of facts except that people appear to love their dogs, cats, ferrets and birds (and maybe even their lizards) more than themselves. How horrible is that? How much shame must exist, for something like that to be true? What could it be about people that makes them prefer their pets to themselves?
  • To understand Genesis 1, the Priestly story, with its insistence on speech as the fundamental creative force, it is first necessary to review a few fundamental, ancient assumptions (these are markedly different in type and intent from the assumptions of science, which are, historically speaking, quite novel).
  • those who existed during the distant time in which the foundational epics of our culture emerged were much more concerned with the actions that dictated survival (and with interpreting the world in a manner commensurate with that goal) than with anything approximating what we now understand as objective truth.
  • Before the dawn of the scientific worldview, reality was construed differently. Being was understood as a place of action, not a place of things.31 It was understood as something more akin to story or drama. That story or drama was lived, subjective experience, as it manifested itself moment to moment in the consciousness of every living person.
  • subjective pain. That’s something so real no argument can stand against it. Everyone acts as if their pain is real—ultimately, finally real. Pain matters, more than matter matters. It is for this reason, I believe, that so many of the world’s traditions regard the suffering attendant upon existence as the irreducible truth of Being.
  • In any case, that which we subjectively experience can be likened much more to a novel or a movie than to a scientific description of physical reality.
  • The Domain, Not of Matter, but of What Matters
  • the world of experience has primal constituents, as well. These are the necessary elements whose interactions define drama and fiction. One of these is chaos. Another is order. The third (as there are three) is the process that mediates between the two, which appears identical to what modern people call consciousness.
  • Chaos is the domain of ignorance itself. It’s unexplored territory. Chaos is what extends, eternally and without limit, beyond the boundaries of all states, all ideas, and all disciplines. It’s the foreigner, the stranger, the member of another gang, the rustle in the bushes in the night-time,
  • It is, in short, all those things and situations we neither know nor understand.
  • Chaos is also the formless potential from which the God of Genesis 1 called forth order using language at the beginning of time. It’s the same potential from which we, made in that Image, call forth the novel and ever-changing moments of our lives. And Chaos is freedom, dreadful freedom, too.
  • Order, by contrast, is explored territory. That’s the hundreds-of-millions-of-years-old hierarchy of place, position and authority. That’s the structure of society. It’s the structure provided by biology, too—particularly insofar as you are adapted, as you are, to the structure of society. Order is tribe, religion, hearth, home and country.
  • Order is the public façade we’re called upon to wear, the politeness of a gathering of civilized strangers, and the thin ice on which we all skate. Order is the place where the behavior of the world matches our expectations and our desires; the place where all things turn out the way we want them to.
  • But order is sometimes tyranny and stultification, as well, when the demand for certainty and uniformity and purity becomes too one-sided.
  • In order, we’re able to think about things in the long term. There, things work, and we’re stable, calm and competent. We seldom leave places we understand—geographical or conceptual—for that reason, and we certainly do not like it when we are compelled to or when it happens accidentally.
  • When the same person betrays you, sells you out, you move from the daytime world of clarity and light to the dark underworld of chaos, confusion and despair. That’s the same move you make, and the same place you visit, when the company you work for starts to fail and your job is placed in doubt.
  • Before the Twin Towers fell—that was order. Chaos manifested itself afterward. Everyone felt it. The very air became uncertain. What exactly was it that fell? Wrong question. What exactly remained standing? That was the issue at hand.
  • Chaos is the deep ocean bottom to which Pinocchio voyaged to rescue his father from Monstro, whale and fire-breathing dragon. That journey into darkness and rescue is the most difficult thing a puppet must do, if he wants to be real; if he wants to extract himself from the temptations of deceit and acting and victimization and impulsive pleasure and totalitarian subjugation; if he wants to take his place as a genuine Being in the world.
  • Chaos is the new place and time that emerges when tragedy strikes suddenly, or malevolence reveals its paralyzing visage, even in the confines of your own home. Something unexpected or undesired can always make its appearance, when a plan is being laid out, regardless of how familiar the circumstances.
  • Our brains respond instantly when chaos appears, with simple, hyper-fast circuits maintained from the ancient days, when our ancestors dwelled in trees, and snakes struck in a flash.32 After that nigh-instantaneous, deeply reflexive bodily response comes the later-evolving, more complex but slower responses of emotions—and, after that, comes thinking, of the higher order, which can extend over seconds, minutes or years. All that response is instinctive, in some sense—but the faster the response, the more instinctive.
  • Things or objects are part of the objective world. They’re inanimate; spiritless. They’re dead. This is not true of chaos and order. Those are perceived, experienced and understood (to the degree that they are understood at all) as personalities—and that is just as true of the perceptions, experiences and understanding of modern people as of their ancient forebears. It’s just that moderns don’t notice.
  • Perception of things as entities with personality also occurs before perception of things as things. This is particularly true of the action of others,34 living others, but we also see the non-living “objective world” as animated, with purpose and intent.
  • This is because of the operation of what psychologists have called “the hyperactive agency detector” within us.35 We evolved, over millennia, within intensely social circumstances. This means that the most significant elements of our environment of origin were personalities, not things, objects or situations.
  • The personalities we have evolved to perceive have been around, in predictable form, and in typical, hierarchical configurations, forever, for all intents and purposes. They have been…
  • the category of “parent” and/or “child” has been around for 200 million years. That’s longer than birds have existed. That’s longer than flowers have grown. It’s not a billion years, but it’s still a very long time. It’s plenty long enough for male and female and parent and child to serve as vital and fundamental parts of the environment to which we have adapted. This means that male and female and parent and child are…
  • Our brains are deeply social. Other creatures (particularly, other humans) were crucially important to us as we lived, mated and evolved. Those creatures were…
  • From a Darwinian perspective, nature—reality itself; the environment, itself—is what selects. The environment cannot be defined in any more fundamental manner. It is not mere inert matter. Reality itself is whatever we contend with when we are striving to survive and reproduce. A…
  • as our brain capacity increased and we developed curiosity to spare, we became increasingly aware of and curious about the nature of the world—what we eventually conceptualized as the objective…
  • “outside” is not merely unexplored physical territory. Outside is outside of what we currently understand—and understanding is dealing with and coping with…
  • when we first began to perceive the unknown, chaotic, non-animal world, we used categories that had originally evolved to represent the pre-human animal social world. Our minds are far older than mere…
  • Our most…
  • category—as old, in some sense, as the sexual act itself—appears to be that of sex, male and female. We appear to have taken that primordial knowledge of structured, creative opposition and…
  • Order, the known, appears symbolically associated with masculinity (as illustrated in the aforementioned yang of the Taoist yin-yang symbol). This is perhaps because the primary…
  • Chaos—the unknown—is symbolically associated with the feminine. This is partly because all the things we have come to know were born, originally, of the unknown, just as all beings we encounter were born of mothers. Chaos is mater, origin, source, mother; materia, the substance from which all things are made.
  • In its positive guise, chaos is possibility itself, the source of ideas, the mysterious realm of gestation and birth. As a negative force, it’s the impenetrable darkness of a cave and the accident by the side of the road.
  • Chaos, the eternal feminine, is also the crushing force of sexual selection.
  • Most men do not meet female human standards. It is for this reason that women on dating sites rate 85 percent of men as below average in attractiveness.40
  • Women’s proclivity to say no, more than any other force, has shaped our evolution into the creative, industrious, upright, large-brained (competitive, aggressive, domineering) creatures that we are.42 It is Nature as Woman who says, “Well, bucko, you’re good enough for a friend, but my experience of you so far has not indicated the suitability of your genetic material for continued propagation.”
  • Many things begin to fall into place when you begin to consciously understand the world in this manner. It’s as if the knowledge of your body and soul falls into alignment with the knowledge of your intellect.
  • And there’s more: such knowledge is proscriptive, as well as descriptive. This is the kind of knowing what that helps you know how. This is the kind of is from which you can derive an ought. The Taoist juxtaposition of yin and yang, for example, doesn’t simply portray chaos and order as the fundamental elements of Being—it also tells you how to act.
  • The Way, the Taoist path of life, is represented by (or exists on) the border between the twin serpents. The Way is the path of proper Being. It’s the same Way as that referred to by Christ in John 14:6: I am the way, and the truth and the life. The same idea is expressed in Matthew 7:14: Because strait is the gate, and narrow is the way, which leadeth unto life, and few there be that find it.
  • We eternally inhabit order, surrounded by chaos. We eternally occupy known territory, surrounded by the unknown. We experience meaningful engagement when we mediate appropriately between them. We are adapted, in the deepest Darwinian sense, not to the world of objects, but to the meta-realities of order and chaos, yang and yin. Chaos and order make up the eternal, transcendent environment of the living.
  • To straddle that fundamental duality is to be balanced: to have one foot firmly planted in order and security, and the other in chaos, possibility, growth and adventure.
  • Chaos and order are fundamental elements because every lived situation (even every conceivable lived situation) is made up of both.
  • you need to place one foot in what you have mastered and understood and the other in what you are currently exploring and mastering. Then you have positioned yourself where the terror of existence is under control and you are secure, but where you are also alert and engaged. That is where there is something new to master and some way that you can be improved. That is where meaning is to be found.
  • The serpent in Eden therefore means the same thing as the black dot in the yin side of the Taoist yin/yang symbol of totality—that is, the possibility of the unknown and revolutionary suddenly manifesting itself where everything appears calm.
  • The outside, chaos, always sneaks into the inside, because nothing can be completely walled off from the rest of reality. So even the ultimate in safe spaces inevitably harbours a snake.
  • We have seen the enemy, after all, and he is us. The snake inhabits each of our souls.
  • The worst of all possible snakes is the eternal human proclivity for evil. The worst of all possible snakes is psychological, spiritual, personal, internal. No walls, however tall, will keep that out. Even if the fortress were thick enough, in principle, to keep everything bad whatsoever outside, it would immediately appear again within.
  • I have learned that these old stories contain nothing superfluous. Anything accidental—anything that does not serve the plot—has long been forgotten in the telling. As the Russian playwright Anton Chekhov advised, “If there is a rifle hanging on the wall in act one, it must be fired in the next act. Otherwise it has no business being there.”50
  • Eve immediately shares the fruit with Adam. That makes him self-conscious. Little has changed. Women have been making men self-conscious since the beginning of time. They do this primarily by rejecting them—but they also do it by shaming them, if men do not take responsibility. Since women bear the primary burden of reproduction, it’s no wonder. It is very hard to see how it could be otherwise. But the capacity of women to shame men and render them self-conscious is still a primal force of nature.
  • What does it mean to know yourself naked?
  • Naked means vulnerable and easily damaged. Naked means subject to judgment for beauty and health. Naked means unprotected and unarmed in the jungle of nature and man. This is why Adam and Eve became ashamed, immediately after their eyes were opened. They could see—and what they first saw was themselves.
  • In their vulnerability, now fully realized, they felt unworthy to stand before God.
  • Beauty shames the ugly. Strength shames the weak. Death shames the living—and the Ideal shames us all.
  • He tells the woman that she will now bring forth children in sorrow, and desire an unworthy, sometimes resentful man, who will in consequence lord her biological fate over her, permanently. What might this mean? It could just mean that God is a patriarchal tyrant, as politically motivated interpretations of the ancient story insist. I think it’s merely descriptive.
  • women pay a high price for pregnancy and child-rearing, particularly in the early stages, and that one of the inevitable consequences is increased dependence upon the sometimes unreliable and always problematic good graces of men.
  • then God banishes the first man and the first woman from Paradise, out of infancy, out of the unconscious animal world, into the horrors of history itself. And then He puts cherubim and a flaming sword at the gate of Eden, just to stop them from eating the Fruit of the Tree of Life.
  • Perhaps Heaven is something you must build, and immortality something you must earn.
  • so we return to our original query: Why would someone buy prescription medication for his dog, and then so carefully administer it, when he would not do the same for himself?
  • Why should anyone take care of anything as naked, ugly, ashamed, frightened, worthless, cowardly, resentful, defensive and accusatory as a descendant of Adam? Even if that thing, that being, is himself?
  • We know how we are naked, and how that nakedness can be exploited—and that means we know how others are naked, and how they can be exploited. We can terrify other people, consciously. We can hurt and humiliate them for faults we understand only too well. We can torture them—literally—slowly, artfully and terribly. That’s far more than predation. That’s a qualitative shift in understanding. That’s a cataclysm as large as the development of self-consciousness itself. That’s the entry of the knowledge of Good and Evil into the world.
  • Only man could conceive of the rack, the iron maiden and the thumbscrew. Only man will inflict suffering for the sake of suffering. That is the best definition of evil I have been able to formulate.
  • with this realization we have well-nigh full legitimization of the idea, very unpopular in modern intellectual circles, of Original Sin.
  • Human beings have a great capacity for wrongdoing. It’s an attribute that is unique in the world of life. We can and do make things worse, voluntarily, with full knowledge of what we are doing (as well as accidentally, and carelessly, and in a manner that is willfully blind). Given that terrible capacity, that proclivity for malevolent actions, is it any wonder we have a hard time taking care of ourselves, or others—or even that we doubt the value of the entire human enterprise?
  • The juxtaposition of Genesis 1 with Genesis 2 & 3 (the latter two chapters outlining the fall of man, describing why our lot is so tragedy-ridden and ethically torturous) produces a narrative sequence almost unbearable in its profundity. The moral of Genesis 1 is that Being brought into existence through true speech is Good.
  • The original Man and Woman, existing in unbroken unity with their Creator, did not appear conscious (and certainly not self-conscious). Their eyes were not open. But, in their perfection, they were also less, not more, than their post-Fall counterparts. Their goodness was something bestowed, rather than deserved or earned.
  • Maybe, even in some cosmic sense (assuming that consciousness itself is a phenomenon of cosmic significance), free choice matters.
  • here’s a proposition: perhaps it is not simply the emergence of self-consciousness and the rise of our moral knowledge of Death and the Fall that besets us and makes us doubt our own worth. Perhaps it is instead our unwillingness—reflected in Adam’s shamed hiding—to walk with God, despite our fragility and propensity for evil.
  • The entire Bible is structured so that everything after the Fall—the history of Israel, the prophets, the coming of Christ—is presented as a remedy for that Fall, a way out of evil. The beginning of conscious history, the rise of the state and all its pathologies of pride and rigidity, the emergence of great moral figures who try to set things right, culminating in the Messiah Himself—that is all part of humanity’s attempt, God willing, to set itself right. And what would that mean?
  • And this is an amazing thing: the answer is already implicit in Genesis 1: to embody the Image of God—to speak out of chaos the Being that is Good—but to do so consciously, of our own free choice.
  • Back is the way forward—as T. S. Eliot so rightly insisted
  • We shall not cease from exploration / And the end of all our exploring / Will be to arrive where we started / And know the place for the first time.
  • If we wish to take care of ourselves properly, we would have to respect ourselves—but we don’t, because we are—not least in our own eyes—fallen creatures.
  • If we lived in Truth; if we spoke the Truth—then we could walk with God once again, and respect ourselves, and others, and the world. Then we might treat ourselves like people we cared for.
  • We might strive to set the world straight. We might orient it toward Heaven, where we would want people we cared for to dwell, instead of Hell, where our resentment and hatred would eternally sentence everyone.
  • Then, the primary moral issue confronting society was control of violent, impulsive selfishness and the mindless greed and brutality that accompanies it.
  • It is easy to believe that people are arrogant, and egotistical, and always looking out for themselves. The cynicism that makes that opinion a universal truism is widespread and fashionable.
  • But such an orientation to the world is not at all characteristic of many people. They have the opposite problem: they shoulder intolerable burdens of self-disgust, self-contempt, shame and self-consciousness. Thus, instead of narcissistically inflating their own importance, they don’t value themselves at all, and they don’t take care of themselves with attention and skill.
  • Christ’s archetypal death exists as an example of how to accept finitude, betrayal and tyranny heroically—how to walk with God despite the tragedy of self-conscious knowledge—and not as a directive to victimize ourselves in the service of others.
  • To sacrifice ourselves to God (to the highest good, if you like) does not mean to suffer silently and willingly when some person or organization demands more from us, consistently, than is offered in return. That means we are supporting tyranny, and allowing ourselves to be treated like slaves.
  • I learned two very important lessons from Carl Jung, the famous Swiss depth psychologist, about “doing unto others as you would have them do unto you” or “loving your neighbour as yourself.”
  • The first lesson was that neither of these statements has anything to do with being nice. The second was that both are equations, rather than injunctions.
  • If I am someone’s friend, family member, or lover, then I am morally obliged to bargain as hard on my own behalf as they are on theirs.
  • there is little difference between standing up and speaking for yourself, when you are being bullied or otherwise tormented and enslaved, and standing up and speaking for someone else.
  • you do not simply belong to yourself. You are not simply your own possession to torture and mistreat. This is partly because your Being is inexorably tied up with that of others, and your mistreatment of yourself can have catastrophic consequences for others.
  • metaphorically speaking, there is also this: you have a spark of the divine in you, which belongs not to you, but to God. We are, after all—according to Genesis—made in His image.
  • We can make order from chaos—and vice versa—in our way, with our words. So, we may not exactly be God, but we’re not exactly nothing, either.
  • In my own periods of darkness, in the underworld of the soul, I find myself frequently overcome and amazed by the ability of people to befriend each other, to love their intimate partners and parents and children, and to do what they must do to keep the machinery of the world running.
  • It is this sympathy that should be the proper medicament for self-conscious self-contempt, which has its justification, but is only half the full and proper story. Hatred for self and mankind must be balanced with gratefulness for tradition and the state and astonishment at what normal, everyday people accomplish
  • You have some vital role to play in the unfolding destiny of the world. You are, therefore, morally obliged to take care of yourself.
  • To treat yourself as if you were someone you are responsible for helping is, instead, to consider what would be truly good for you. This is not “what you want.” It is also not “what would make you happy.”
  • You must help a child become a virtuous, responsible, awake being, capable of full reciprocity—able to take care of himself and others, and to thrive while doing so. Why would you think it acceptable to do anything less for yourself?
  • You need to know who you are, so that you understand your armament and bolster yourself in respect to your limitations. You need to know where you are going, so that you can limit the extent of chaos in your life, restructure order, and bring the divine force of Hope to bear on the world.
  • You need to determine how to act toward yourself so that you are most likely to become and to stay a good person.
  • Don’t underestimate the power of vision and direction. These are irresistible forces, able to transform what might appear to be unconquerable obstacles into traversable pathways and expanding opportunities.
  • Once having understood Hell, researched it, so to speak—particularly your own individual Hell—you could decide against going there or creating that.
  • You could, in fact, devote your life to this. That would give you a Meaning, with a capital M. That would justify your miserable existence.
  • That would atone for your sinful nature, and replace your shame and self-consciousness with the natural pride and forthright confidence of someone who has learned once again to walk with God in the Garden.
  • RULE 3   MAKE FRIENDS WITH PEOPLE WHO WANT THE BEST FOR YOU
  • It would be more romantic, I suppose, to suggest that we would have all jumped at the chance for something more productive, bored out of our skulls as we were. But it’s not true. We were all too prematurely cynical and world-weary and leery of responsibility to stick to the debating clubs and Air Cadets and school sports that the adults around us tried to organize. Doing anything wasn’t cool.
  • When you move, everything is up in the air, at least for a while. It’s stressful, but in the chaos there are new possibilities. People, including you, can’t hem you in with their old notions. You get shaken out of your ruts. You can make new, better ruts, with people aiming at better things. I thought this was just a natural development. I thought that every person who moved would have—and want—the same phoenix-like experience.
  • What was it that made Chris and Carl and Ed unable (or, worse, perhaps, unwilling) to move or to change their friendships and improve the circumstances of their lives? Was it inevitable—a consequence of their own limitations, nascent illnesses and traumas of the past?
  • Why did he—like his cousin, like my other friends—continually choose people who, and places that, were not good for him?
  • perhaps, they don’t want the trouble of better. Freud called this a “repetition compulsion.” He thought of it as an unconscious drive to repeat the horrors of the past
  • People create their worlds with the tools they have directly at hand. Faulty tools produce faulty results. Repeated use of the same faulty tools produces the same faulty results.
  • It is in this manner that those who fail to learn from the past doom themselves to repeat it. It’s partly fate. It’s partly inability. It’s partly…unwillingness to learn? Refusal to learn? Motivated refusal to learn?
  • People choose friends who aren’t good for them for other reasons, too. Sometimes it’s because they want to rescue someone.
  • it is not easy to distinguish between someone truly wanting and needing help and someone who is merely exploiting a willing helper. The distinction is difficult even for the person who is wanting and needing and possibly exploiting.
  • When it’s not just naïveté, the attempt to rescue someone is often fuelled by vanity and narcissism.
  • But Christ himself, you might object, befriended tax-collectors and prostitutes. How dare I cast aspersions on the motives of those who are trying to help? But Christ was the archetypal perfect man. And you’re you.
  • How do you know that your attempts to pull someone up won’t instead bring them—or you—further down?
  • The same thing happens when well-meaning counsellors place a delinquent teen among comparatively civilized peers. The delinquency spreads, not the stability.65 Down is a lot easier than up.
  • maybe you’re saving someone because you want to convince yourself that the strength of your character is more than just a side effect of your luck and birthplace. Or maybe it’s because it’s easier to look virtuous when standing alongside someone utterly irresponsible.
  • Or maybe you have no plan, genuine or otherwise, to rescue anybody. You’re associating with people who are bad for you not because it’s better for anyone, but because it’s easier.
  • You know it. Your friends know it. You’re all bound by an implicit contract—one aimed at nihilism, and failure, and suffering of the stupidest sort.
  • Before you help someone, you should find out why that person is in trouble. You shouldn’t merely assume that he or she is a noble victim of unjust circumstances and exploitation. It’s the most unlikely explanation, not the most probable.
  • Besides, if you buy the story that everything terrible just happened on its own, with no personal responsibility on the part of the victim, you deny that person all agency in the past (and, by implication, in the present and future, as well).
  • It is far more likely that a given individual has just decided to reject the path upward, because of its difficulty. Perhaps that should even be your default assumption, when faced with such a situation.
  • failure is easy to understand. No explanation for its existence is required. In the same manner, fear, hatred, addiction, promiscuity, betrayal and deception require no explanation. It’s not the existence of vice, or the indulgence in it, that requires explanation. Vice is easy.
  • Failure is easy, too. It’s easier not to shoulder a burden. It’s easier not to think, and not to do, and not to care. It’s easier to put off until tomorrow what needs to be done today,
  • Success: that’s the mystery. Virtue: that’s what’s inexplicable. To fail, you merely have to cultivate a few bad habits. You just have to bide your time. And once someone has spent enough time cultivating bad habits and biding their time, they are much diminished.
  • I am not saying that there is no hope of redemption. But it is much harder to extract someone from a chasm than to lift him from a ditch. And some chasms are very deep. And there’s not much left of the body at the bottom.
  • Carl Rogers, the famous humanistic psychologist, believed it was impossible to start a therapeutic relationship if the person seeking help did not want to improve.67 Rogers believed it was impossible to convince someone to change for the better. The
  • none of this is a justification for abandoning those in real need to pursue your narrow, blind ambition, in case it has to be said.
  • Here’s something to consider: If you have a friend whose friendship you wouldn’t recommend to your sister, or your father, or your son, why would you have such a friend for yourself?
  • You are not morally obliged to support someone who is making the world a worse place. Quite the opposite. You should choose people who want things to be better, not worse. It’s a good thing, not a selfish thing, to choose people who are good for you.
  • It is for this reason that every good example is a fateful challenge, and every hero, a judge. Michelangelo’s great perfect marble David cries out to its observer: “You could be more than you are.”
  • Don’t think that it is easier to surround yourself with good healthy people than with bad unhealthy people. It’s not. A good, healthy person is an ideal. It requires strength and daring to stand up near such a person.
  • RULE 4   COMPARE YOURSELF TO WHO YOU WERE YESTERDAY, NOT TO WHO SOMEONE ELSE IS TODAY
  • IT WAS EASIER FOR PEOPLE to be good at something when more of us lived in small, rural communities. Someone could be homecoming queen. Someone else could be spelling-bee champ, math whiz or basketball star. There were only one or two mechanics and a couple of teachers. In each of their domains, these local heroes had the opportunity to enjoy the serotonin-fuelled confidence of the victor.
  • Our hierarchies of accomplishment are now dizzyingly vertical.
  • No matter how good you are at something, or how you rank your accomplishments, there is someone out there who makes you look incompetent.
  • We are not equal in ability or outcome, and never will be. A very small number of people produce very much of everything.
  • People are unhappy at the bottom. They get sick there, and remain unknown and unloved. They waste their lives there. They die there. In consequence, the self-denigrating voice in the minds of people weaves a devastating tale. Life is a zero-sum game. Worthlessness is the default condition.
  • It is for such reasons that a whole generation of social psychologists recommended “positive illusions” as the only reliable route to mental health.69 Their credo? Let a lie be your umbrella. A more dismal, wretched, pessimistic philosophy can hardly be imagined:
  • Here is an alternative approach (and one that requires no illusions). If the cards are always stacked against you, perhaps the game you are playing is somehow rigged (perhaps by you, unbeknownst to yourself). If the internal voice makes you doubt the value of your endeavours—or your life, or life itself—perhaps you should stop listening.
  • There will always be people better than you—that’s a cliché of nihilism, like the phrase, In a million years, who’s going to know the difference? The proper response to that statement is not, Well, then, everything is meaningless. It’s, Any idiot can choose a frame of time within which nothing matters.
  • Standards of better or worse are not illusory or unnecessary. If you hadn’t decided that what you are doing right now was better than the alternatives, you wouldn’t be doing it. The idea of a value-free choice is a contradiction in terms. Value judgments are a precondition for action.
  • Furthermore, every activity, once chosen, comes with its own internal standards of accomplishment. If something can be done at all, it can be done better or worse. To do anything at all is therefore to play a game with a defined and valued end, which can always be reached more or less efficiently and elegantly.
  • We might start by considering the all-too-black-and-white words themselves: “success” or “failure.” You are either a success, a comprehensive, singular, over-all good thing, or its opposite, a failure, a comprehensive, singular, irredeemably bad thing.
  • There are vital degrees and gradations of value obliterated by this binary system, and the consequences are not good.
  • there is not just one game at which to succeed or fail. There are many games and, more specifically, many good games—
  • if changing games does not work, you can invent a new one. I
  • and athletic pursuits. You might consider judging your success across all the games you play.
  • When we are very young we are neither individual nor informed. We have not had the time nor gained the wisdom to develop our own standards. In consequence, we must compare ourselves to others, because standards are necessary.
  • As we mature we become, by contrast, increasingly individual and unique. The conditions of our lives become more and more personal and less and less comparable with those of others. Symbolically speaking, this means we must leave the house ruled by our father, and confront the chaos of our individual Being.
  • We must then rediscover the values of our culture—veiled from us by our ignorance, hidden in the dusty treasure-trove of the past—rescue them, and integrate them into our own lives. This is what gives existence its full and necessary meaning.
  • What is it that you actually love? What is it that you genuinely want? Before you can articulate your own standards of value, you must see yourself as a stranger—and then you must get to know yourself. What
  • Dare to be truthful. Dare to articulate yourself, and express (or at least become aware of) what would really justify your life.
  • Consult your resentment. It’s a revelatory emotion, for all its pathology. It’s part of an evil triad: arrogance, deceit, and resentment. Nothing causes more harm than this underworld Trinity. But resentment always means one of two things. Either the resentful person is immature, in which case he or she should shut up, quit whining, and get on with it, or there is tyranny afoot—in which case the person subjugated has a moral obligation to speak up.
  • Be cautious when you’re comparing yourself to others. You’re a singular being, once you’re an adult. You have your own particular, specific problems—financial, intimate, psychological, and otherwise.
  • Those are embedded in the unique broader context of your existence. Your career or job works for you in a personal manner, or it does not, and it does so in a unique interplay with the other specifics of your life.
  • We must see, but to see, we must aim, so we are always aiming. Our minds are built on the hunting-and-gathering platforms of our bodies. To hunt is to specify a target, track it, and throw at it.
  • We live within a framework that defines the present as eternally lacking and the future as eternally better. If we did not see things this way, we would not act at all. We wouldn’t even be able to see, because to see we must focus, and to focus we must pick one thing above all else on which to focus.
  • The disadvantage to all this foresight and creativity is chronic unease and discomfort. Because we always contrast what is with what could be, we have to aim at what could be.
  • The present is eternally flawed. But where you start might not be as important as the direction you are heading. Perhaps happiness is always to be found in the journey uphill, and not in the fleeting sense of satisfaction awaiting at the next peak.
  • Called upon properly, the internal critic will suggest something to set in order, which you could set in order, which you would set in order—voluntarily, without resentment, even with pleasure.
  • “Excuse me,” you might say to yourself, without irony or sarcasm. “I’m trying to reduce some of the unnecessary suffering around here. I could use some help.” Keep the derision at bay. “I’m wondering if there is anything that you would be willing to do? I’d be very grateful for your service.” Ask honestly and with humility. That’s no simple matter.
Javier E

These Truths: A History of the United States (Jill Lepore) - 1 views

  • It was meant to mark the start of a new era, in which the course of history might be made predictable and a government established that would be ruled not by accident and force but by reason and choice. The origins of that idea, and its fate, are the story of American history.
  • It seems to have been reserved to the people of this country, by their conduct and example, to decide the important question, whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force.6 This was the question of that autumn. And, in a way, it has been the question of every season since,
  • I once came across a book called The Constitution Made Easy.7 The Constitution cannot be made easy. It was never meant to be easy.
  • THE AMERICAN EXPERIMENT rests on three political ideas—“these truths,” Thomas Jefferson called them—political equality, natural rights, and the sovereignty of the people.
  • After Benjamin Franklin read Jefferson’s draft, he picked up his quill, scratched out the words “sacred & undeniable,” and suggested that “these truths” were, instead, “self-evident.” This was more than a quibble. Truths that are sacred and undeniable are God-given and divine, the stuff of religion. Truths that are self-evident are laws of nature, empirical and observable, the stuff of science. This divide has nearly rent the Republic apart.
  • The real dispute is between “these truths” and the course of events: Does American history prove these truths, or does it belie them?
  • The United States rests on a dedication to equality, which is chiefly a moral idea, rooted in Christianity, but it rests, too, on a dedication to inquiry, fearless and unflinching. Its founders agreed with the Scottish philosopher and historian David Hume, who wrote, in 1748, that “Records of Wars, Intrigues, Factions, and Revolutions are so many Collections of Experiments.”9 They believed that truth is to be found in ideas about morality but also in the study of history.
  • understanding history as a form of inquiry—not as something easy or comforting but as something demanding and exhausting—was central to the nation’s founding. This, too, was new.
  • A new kind of historical writing, less memorial and more unsettling, only first emerged in the fourteenth century. “History is a philosophical science,” the North African Muslim scholar Ibn Khaldun wrote in 1377, in the prologue to his history of the world, in which he defined history as the study “of the causes and origins of existing things.”11
  • Only by fits and starts did history become not merely a form of memory but also a form of investigation, to be disputed, like philosophy, its premises questioned, its evidence examined, its arguments countered.
  • Declaring independence was itself an argument about the relationship between the present and the past, an argument that required evidence of a very particular kind: historical evidence. That’s why most of the Declaration of Independence is a list of historical claims. “To prove this,” Jefferson wrote, “let facts be submitted to a candid world.”
  • In an attempt to solve this problem, the earliest historians of the United States decided to begin their accounts with Columbus’s voyage, stitching 1776 to 1492. George Bancroft published his History of the United States from the Discovery of the American Continent to the Present in 1834, when the nation was barely more than a half-century old, a fledgling, just hatched. By beginning with Columbus, Bancroft made the United States nearly three centuries older than it was, a many-feathered old bird.
  • In 1787, then, when Alexander Hamilton asked “whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force,” that was the kind of question a scientist asks before beginning an experiment. Time alone would tell. But time has passed. The beginning has come to an end. What, then, is the verdict of history?
  • In deciding what to leave in and what to leave out, I’ve confined myself to what, in my view, a people constituted as a nation in the early twenty-first century need to know about their own past, mainly because this book is meant to double as an old-fashioned civics book, an explanation of the origins and ends of democratic institutions, from the town meeting to the party system, from the nominating convention to the secret ballot, from talk radio to Internet polls. This book is chiefly a political history.
  • Aside from being a brief history of the United States and a civics primer, this book aims to be something else, too: it’s an explanation of the nature of the past. History isn’t only a subject; it’s also a method.
  • The truths on which the nation was founded are not mysteries, articles of faith, never to be questioned, as if the founding were an act of God, but neither are they lies, all facts fictions, as if nothing can be known, in a world without truth.
  • Between reverence and worship, on the one side, and irreverence and contempt, on the other, lies an uneasy path, away from false pieties and petty triumphs over people who lived and died and committed both their acts of courage and their sins and errors long before we committed ours. “We cannot hallow this ground,” Lincoln said at Gettysburg. We are obliged, instead, to walk this ground, dedicating ourselves to both the living and the dead.
  • studying history is like that, looking into one face and seeing, behind it, another, face after face after face. “Know whence you came,” Baldwin told his nephew.17 The past is an inheritance, a gift and a burden. It can’t be shirked. You carry it everywhere. There’s nothing for it but to get to know it.
  • Nature takes one toll, malice another. History is the study of what remains, what’s left behind, which can be almost anything, so long as it survives the ravages of time and war: letters, diaries, DNA, gravestones, coins, television broadcasts, paintings, DVDs, viruses, abandoned Facebook pages, the transcripts of congressional hearings, the ruins of buildings. Some of these things are saved by chance or accident, like the one house that, as if by miracle, still stands after a hurricane razes a town. But most of what historians study survives because it was purposely kept—placed
  • As nation-states emerged, they needed to explain themselves, which they did by telling stories about their origins, tying together ribbons of myths, as if everyone in the “English nation,” for instance, had the same ancestors, when, of course, they did not. Very often, histories of nation-states are little more than myths that hide the seams that stitch the nation to the state.15
  • When the United States declared its independence in 1776, plainly, it was a state, but what made it a nation? The fiction that its people shared a common ancestry was absurd on its face; they came from all over, and, having waged a war against England, the very last thing they wanted to celebrate was their Englishness.
  • Facts, knowledge, experience, proof. These words come from the law. Around the seventeenth century, they moved into what was then called “natural history”: astronomy, physics, chemistry, geology. By the eighteenth century they were applied to history and to politics, too. These truths: this was the language of reason, of enlightenment, of inquiry, and of history.
  • Against conquest, slaughter, and slavery came the urgent and abiding question, “By what right?”
  • Yet the origins of the United States date to 1492 for another, more troubling reason: the nation’s founding truths were forged in a crucible of violence, the products of staggering cruelty, conquest and slaughter, the assassination of worlds.
  • Locke, spurred both by a growing commitment to religious toleration and by a desire to distinguish English settlement from Spanish conquest, stressed the lack of cultivation as a better justification for taking the natives’ land than religious difference, an emphasis with lasting consequences.
  • Unlike Polo and Mandeville, Columbus did not make a catalogue of the ways and beliefs of the people he met (only later did he hire Pané to do that). Instead, he decided that the people he met had no ways and beliefs. Every difference he saw as an absence.22 Insisting that they had no faith and no civil government and were therefore infidels and savages who could not rightfully own anything, he claimed possession of their land, by the act of writing. They were a people without truth; he would make his truth theirs. He would tell them where the dead go.
  • It became commonplace, inevitable, even, first among the Spanish, and then, in turn, among the French, the Dutch, and the English, to see their own prosperity and good health and the terrible sicknesses suffered by the natives as signs from God. “Touching these savages, there is a thing that I cannot omit to remark to you,” one French settler wrote: “it appears visibly that God wishes that they yield their place to new peoples.” Death convinced them at once of their right and of the truth of their faith. “The natives, they are all dead of small Poxe,” John Winthrop wrote when he arrived in New England in 1630: “the Lord hathe cleared our title to what we possess.”
  • In much of New Spain, the mixed-race children of Spanish men and Indian women, known as mestizos, outnumbered Indians; an intricate caste system marked gradations of skin color, mixtures of Europeans, Native Americans, and Africans, as if skin color were like dyes made of plants, the yellow of sassafras, the red of beets, the black of carob. Later, the English would recognize only black and white, a fantasy of stark and impossible difference, of nights without twilight and days without dawns. And yet both regimes of race, a culture of mixing or a culture of pretending not to mix, pressed upon the brows of every person of the least curiosity the question of common humanity: Are all peoples one?
  • Elizabeth’s best defender argued that if God decided “the female should rule and govern,” it didn’t matter that women were “weake in nature, feable in bodie, softe in courage,” because God would make every right ruler strong. In any case, England’s constitution abided by a “rule mixte,” in which the authority of the monarch was checked by the power of Parliament; also, “it is not she that ruleth but the lawes.” Elizabeth herself called on yet another authority: the favor of the people.48 A mixed constitution, the rule of law, the will of the people: these were English ideas that Americans would one day make their own, crying, “Liberty!”
  • In the brutal, bloody century between Columbus’s voyage and John White’s, an idea was born, out of fantasy, out of violence, the idea that there exists in the world a people who live in an actual Garden of Eden, a state of nature, before the giving of laws, before the forming of government. This imagined history of America became an English book of genesis, their new truth. “In the beginning,” the Englishman John Locke would write, “all the world was America.” In America, everything became a beginning.
  • England’s empire would have a different character than that of either Spain or France. Catholics could make converts by the act of baptism, but Protestants were supposed to teach converts to read the Bible; that meant permanent settlements, families, communities, schools, and churches. Also, England’s empire would be maritime—its navy was its greatest strength. It would be commercial. And, of greatest significance for the course of the nation that would grow out of those settlements, its colonists would be free men, not vassals, guaranteed their “English liberties.”
  • Beginning with the Virginia charter, the idea of English liberties for English subjects was planted on American soil and, with it, the king’s claim to dominion, a claim that rested on the idea that people like Powhatan and his people lived in darkness and without government, no matter that the English called their leaders kings.
  • Twenty Englishmen were elected to the House of Burgesses. Twenty Africans were condemned to the house of bondage. Another chapter opened in the American book of genesis: liberty and slavery became the American Abel and Cain.
  • To build his case against the king, Coke dusted off a copy of an ancient and almost entirely forgotten legal document, known as Magna Carta (literally, the “great charter”), in which, in the year 1215, King John had pledged to his barons that he would obey the “law of the land.” Magna Carta wasn’t nearly as important as Coke made it out to be, but by arguing for its importance, he made it important, not only for English history, but for American history, too, tying the political fate of everyone in England’s colonies to the strange doings of a very bad king from the Middle Ages.
  • Magna Carta explains a great deal about how it is that some English colonists would one day come to believe that their king had no right to rule them and why their descendants would come to believe that the United States needed a written constitution. But Magna Carta played one further pivotal role, the role it played in the history of truth—a history that had taken a different course in England than in any other part of Europe.
  • The most crucial right established under Magna Carta was the right to a trial by jury.
  • in 1215, the pope banned trial by ordeal. In Europe, it was replaced by a new system of divine judgment: judicial torture. But in England, where there existed a tradition of convening juries to judge civil disputes—like disagreements over boundaries between neighboring freeholds—trial by ordeal was replaced not by judicial torture but by trial by jury.
  • This turn marked the beginning of a new era in the history of knowledge: it required a new doctrine of evidence and new method of inquiry and eventually led to the idea that an observed or witnessed act or thing—the substance, the matter, of fact—is the basis of truth. A judge decided the law; a jury decided the facts. Mysteries were matters of faith, a different kind of truth, known only to God.
  • The age of mystery began to wane, and, soon, the culture of fact spread from law to government.
  • There would never be very many Africans in New England, but New Englanders would have slave plantations, on the distant shores. Nearly half of colonial New Englanders’ wealth would come from sugar grown by West Indian slaves.
  • One million Europeans migrated to British America between 1600 and 1800 and two and a half million Africans were carried there by force over that same stretch of centuries, on ships that sailed past one another by day and by night.42 Africans died faster, but as a population of migrants, they outnumbered Europeans two and a half to one.
  • In the last twenty-five years of the seventeenth century, English ships, piloted by English sea captains, crewed by English sailors, carried more than a quarter of a million men, women, and children across the ocean, shackled in ships’ holds.44 Theirs was not a ship of state crossing a sea of troubles, another Mayflower, their bond a covenant. Theirs was a ship of slavery, their bonds forged in fire. They whispered and wept; they screamed and sat in silence. They grew ill; they grieved; they died; they endured.
  • By what right did the English hold these people as their slaves?
  • Under Roman law, all men are born free and can only be made slaves by the law of nations, under certain narrow conditions—for instance, when they’re taken as prisoners of war, or when they sell themselves as payment of debt. Aristotle had disagreed with Roman law, insisting that some men are born slaves. Neither of these traditions from antiquity proved to be of much use to English colonists attempting to codify their right to own slaves, because laws governing slavery, like slavery itself, had disappeared from English common law by the fourteenth century. Said one Englishman in Barbados in 1661, there was “no track to guide us where to walk nor any rule sett us how to govern such Slaves.”46
  • With no track or rule to guide them, colonial assemblies adopted new practices and devised new laws with which they attempted to establish a divide between “blacks” and “whites.”
  • Adopting these practices and passing these laws required turning English law upside down, because much in existing English law undermined the claims of owners of people. In 1655, a Virginia woman with an African mother and an English father sued for her freedom by citing English common law, under which children’s status follows that of their father, not their mother. In 1662, Virginia’s House of Burgesses answered doubts about “whether children got by any Englishman upon a Negro woman should be slave or ffree” by reaching back to an archaic Roman rule, partus sequitur ventrem (you are what your mother was). Thereafter, any child born of a woman who was a slave inherited her condition.
  • By giving Americans a more ancient past, he hoped to make America’s founding appear inevitable and its growth inexorable, God-ordained. He also wanted to celebrate the United States, not as an offshoot of England, but instead as a pluralist and cosmopolitan nation, with ancestors all over the world.
  • No book should be censored before publication, Milton argued (though it might be condemned after printing), because truth could only be established if allowed to do battle with lies. “Let her and falsehood grapple,” he urged, since, “whoever knew Truth to be put to the worst in a free and open encounter?” This view depended on an understanding of the capacity of the people to reason. The people, Milton insisted, are not “slow and dull, but of a quick, ingenious and piercing spirit, acute to invent, subtle and sinewy to discourse, not beneath the reach of any point the highest that human capacity can soar to.”52
  • All men, Locke argued, are born equal, with a natural right to life, liberty, and property; to protect those rights, they erect governments by consent. Slavery, for Locke, was no part either of a state of nature or of civil society. Slavery was a matter of the law of nations, “nothing else, but the state of war continued, between a lawful conqueror and a captive.” To introduce slavery in the Carolinas, then, was to establish, as fundamental to the political order, an institution at variance with everything about how Locke understood civil society.
  • Long before shots were fired at Lexington and Concord, long before George Washington crossed the Delaware, long before American independence was thought of, or even thinkable, a revolutionary tradition was forged, not by the English in America, but by Indians waging wars and slaves waging rebellions. They revolted again and again and again. Their revolutions came in waves that lashed the land. They asked the same question, unrelentingly: By what right are we ruled?
  • Rebellion hardened lines between whites and blacks. Before Bacon and his men burned Jamestown, poor Englishmen had very little political power. As many as three out of every four Englishmen and women who sailed to the colonies were either debtors or convicts or indentured servants; they weren’t slaves, but neither were they free.61 Property requirements for voting meant that not all free white men could vote. Meanwhile, the fact that slaves could be manumitted by their masters meant that it was possible to be both black and free and white and unfree. But after Bacon’s Rebellion, free white men were granted the right to vote, and it became nearly impossible for black men and women to secure their freedom. By 1680, one observer could remark that “these two words, Negro and Slave” had “grown Homogeneous and convertible”: to be black was to be a slave.
  • Benjamin Franklin eventually settled in the tidy Quaker town of Philadelphia and began printing his own newspaper, the Pennsylvania Gazette, in 1729. In its pages, he fought for freedom of the press. In a Miltonian 1731 “Apology for Printers,” he observed “that the Opinions of Men are almost as various as their Faces” but that “Printers are educated in the Belief, that when Men differ in Opinion, both Sides ought equally to have the Advantage of being heard by the Publick; and that when Truth and Error have fair Play, the former is always an overmatch for the latter.”
  • But if the culture of the fact hadn’t yet spread to newspapers, it had spread to history. In Leviathan, Thomas Hobbes had written that “The register of Knowledge of Fact is called History.”74 One lesson Americans would learn from the facts of their own history had to do with the limits of the freedom of the press, and this was a fact on which they dwelled, and a liberty they grew determined to protect.
  • Slavery does not exist outside of politics. Slavery is a form of politics, and slave rebellion a form of violent political dissent. The Zenger trial and the New York slave conspiracy were much more than a dispute over freedom of the press and a foiled slave rebellion: they were part of a debate about the nature of political opposition, and together they established its limits. Both Cosby’s opponents and Caesar’s followers allegedly plotted to depose the governor. One kind of rebellion was celebrated, the other suppressed—a division that would endure.
  • In American history, the relationship between liberty and slavery is at once deep and dark: the threat of black rebellion gave a license to white political opposition.
  • This, too, represented a kind of revolution: Whitefield emphasized the divinity of ordinary people, at the expense of the authority of their ministers.
  • he wrote in 1751 an essay about the size of the population, called “Observations concerning the Increase of Mankind, Peopling of Countries, &c.”
  • Franklin guessed the population of the mainland colonies to be about “One Million English Souls,” and his calculations suggested that this number would double every twenty-five years. At that rate, in only a century, “the greatest Number of Englishmen will be on this Side the Water.” Franklin’s numbers were off; his estimates weren’t too high; they were too low. At the time, more than 1.5 million people lived in Britain’s thirteen mainland colonies. Those colonies were far more densely settled than New France or New Spain. Only 60,000 French settlers lived in Canada and 10,000 more in Louisiana. New Spain was even more thinly settled.
  • he wrote about a new race, a people who were “white.” “The Number of purely white People in the World is proportionably very small,” Franklin began. As he saw it, Africans were “black”; Asians and Native Americans were “tawny”; Spaniards, Italians, French, Russians, Swedes, and Germans were “swarthy.” That left very few people, and chiefly the English, as the only “white people” in the world. “I could wish their Numbers were increased,” Franklin said, adding, wonderingly, “But perhaps I am partial to the Complexion of my Country, for such Kind of Partiality is natural to Mankind.”
  • Franklin’s “JOIN, or DIE” did some of that, too: it offered a lesson about the rulers and the ruled, and the nature of political communities. It made a claim about the colonies: they were parts of a whole.
  • When Benjamin Franklin began writing his autobiography, in 1771, he turned the story of his own escape—running away from his apprenticeship to his brother James—into a metaphor for the colonies’ growing resentment of parliamentary rule. James’s “harsh and tyrannical Treatment,” Franklin wrote, had served as “a means of impressing me with that Aversion to arbitrary Power that has stuck to me thro’ my whole Life.”7 But that was also the story of every runaway slave ad, testament after testament to an aversion to arbitrary power.
  • The American Revolution did not begin in 1775 and it didn’t end when the war was over. “The success of Mr. Lay, in sowing the seeds of . . . a revolution in morals, commerce, and government, in the new and in the old world, should teach the benefactors of mankind not to despair, if they do not see the fruits of their benevolent propositions, or undertakings, during their lives,” Philadelphia doctor Benjamin Rush later wrote.
  • There were not one but two American revolutions at the end of the eighteenth century: the struggle for independence from Britain, and the struggle to end slavery. Only one was won.
  • The Revolution was at its most radical in the challenge it presented to the institution of slavery and at its most conservative in its failure to meet that challenge. Still, the institution had begun to break, like a pane of glass streaked with cracks but not yet shattered.
  • “I wish our Poor Distracted State would atend to the many good Lessons” of history, Jane Franklin wrote to her brother, and not “keep always in a Flame.”21
  • After Annapolis, Madison went home to Virginia and resumed his course of study. In April of 1787, he drafted an essay called “Vices of the Political System of the United States.” It took the form of a list of eleven deficiencies,
  • it closed with a list of causes for these vices, which he located primarily “in the people themselves.” By this last he meant the danger that a majority posed to a minority: “In republican Government the majority however composed, ultimately give the law. Whenever therefore an apparent interest or common passion unites a majority what is to restrain them from unjust violations of the rights and interests of the minority, or of individuals?”27 What force restrains good men from doing bad things? Honesty, character, religion—these, history demonstrated, were not to be relied upon. No, the only force that could restrain the tyranny of the people was the force of a well-constructed constitution. It would have to be as finely wrought as an iron gate.
  • At the convention, it proved impossible to set the matter of slavery aside, both because the question of representation turned on it and because any understanding of the nature of tyranny rested on it. When Madison argued about the inevitability of a majority oppressing a minority, he cited ancient history, and told of how the rich oppressed the poor in Greece and Rome. But he cited, too, modern American history. “We have seen the mere distinction of color made in the most enlightened period of time, the ground of the most oppressive dominion ever exercised by man over man.”40
  • If not for the three-fifths rule, the representatives of free states would have outnumbered representatives of slave states by 57 to 33.44
  • Wilson, half Franklin’s age, read his remarks instead. “Mr. President,” he began, addressing Washington, “I confess that there are several parts of this constitution which I do not at present approve, but I am not sure I shall never approve them.” He suggested that he might, one day, change his mind. “For having lived long, I have experienced many instances of being obliged by better information, or fuller consideration, to change opinions even on important subjects, which I once thought right, but found to be otherwise. It is therefore that the older I grow, the more apt I am to doubt my own judgment, and to pay more respect to the judgment of others.” Hoping to pry open the minds of delegates who were closed to the compromise before them, he reminded them of the cost of zealotry. “Most men indeed as well as most sects in Religion, think themselves in possession of all truth, and that wherever others differ from them it is so far error.” But wasn’t humility the best course, in such circumstances? “Thus I consent, Sir, to this Constitution,” he closed, “because I expect no better, and because I am not sure, that it is not the best.”
  • Except for the Massachusetts Constitution, in 1780, and the second New Hampshire Constitution, in 1784, no constitution, no written system of government, had ever before been submitted to the people for their approval. “This is a new event in the history of mankind,” said the governor of Connecticut at his state’s ratification convention.
  • Nearly everything Washington did set a precedent. What would have happened if he had decided, before taking that oath of office, to emancipate his slaves? He’d grown disillusioned with slavery; his own slaves, and the greater number of slaves owned by his wife, were, to him, a moral burden, and he understood very well that for all the wealth generated by forced, unpaid labor, the institution of slavery was a moral burden to the nation. There is some evidence—slight though it is—that Washington drafted a statement announcing that he intended to emancipate his slaves before assuming the presidency. (Or maybe that statement, like Washington’s inaugural address, had been written by Hamilton, a member of New York’s Manumission Society.) This, too, Washington understood, would have established a precedent: every president after him would have had to emancipate his slaves. And yet he would not, could not, do it.65 Few of Washington’s decisions would have such lasting and terrible consequences as this one failure to act.
  • In the century and a half between the Connecticut charter and the 1787 meeting of the constitutional convention lies an entire revolution—not just a political revolution but also a religious revolution. So far from establishing a religion, the Constitution doesn’t even mention “God,” except in naming the date (“the year of our Lord . . .”). At a time when all but two states required religious tests for office, the Constitution prohibited them. At a time when all but three states still had an official religion, the Bill of Rights forbade the federal government from establishing one. Most Americans believed, with Madison, that religion can only thrive if it is no part of government, and that a free government can only thrive if it is no part of religion.
  • The replacement of debtors’ prison with bankruptcy protection would change the nature of the American economy, spurring investment, speculation, and the taking of risks.
  • as early as 1791, Madison had begun to revise his thinking. In an essay called “Public Opinion,” he considered a source of instability particular to a large republic: the people might be deceived. “The larger a country, the less easy for its real opinion to be ascertained,” he explained. That is, factions might not, in the end, consist of wise, knowledgeable, and reasonable men. They might consist of passionate, ignorant, and irrational men, who had been led to hold “counterfeit” opinions by persuasive men. (Madison was thinking of Hamilton and his ability to gain public support for his financial plan.)
  • The way out of this political maze was the newspaper. “A circulation of newspapers through the entire body of the people,” he explained, “is equivalent to a contraction of territorial limits.” Newspapers would make the country, effectively, smaller.90 It was an ingenious idea. It would be revisited by each passing generation of exasperated advocates of republicanism. The newspaper would hold the Republic together; the telegraph would hold the Republic together; the radio would hold the Republic together; the Internet would hold the Republic together. Each time, this assertion would be both right and terribly wrong.
  • Newspapers in the early republic weren’t incidentally or inadvertently partisan; they were entirely and enthusiastically partisan. They weren’t especially interested in establishing facts; they were interested in staging a battle of opinions. “Professions of impartiality I shall make none,” wrote a Federalist printer. “They are always useless, and are besides perfect nonsense.”92
  • Washington’s Farewell Address consists of a series of warnings about the danger of disunion. The North and the South, the East and the West, ought not to consider their interests separate or competing, Washington urged: “your union ought to be considered as a main prop of your liberty.” Parties, he warned, were the “worst enemy” of every government, agitating “the community with ill-founded jealousies and false alarms,” kindling “the animosity of one part against another,” and even fomenting “riot and insurrection.”
  • As to the size of the Republic, “Is there a doubt whether a common government can embrace so large a sphere? Let experience solve it.” The American experiment must go on. But it could only thrive if the citizens were supported by religion and morality, and if they were well educated. “Promote, then, as an object of primary importance, institutions for the general diffusion of knowledge,” he urged. “In proportion as the structure of a government gives force to public opinion, it is essential that public opinion should be enlightened.”95
  • “Passion” or variants of the word appear seven times in the Farewell; it is the source of every problem; reason is its only remedy. Passion is a river. There would be no changing its course.
  • Adams and Jefferson lived in an age of quantification. It began with the measurement of time. Time used to be a wheel that turned, and turned again; during the scientific revolution, time became a line. Time, the easiest quantity to measure, became the engine of every empirical inquiry: an axis, an arrow. This new use and understanding of time contributed to the idea of progress—if time is a line instead of a circle, things can get better and even better, instead of forever rising and falling in endless cycles, like the seasons. The idea of progress animated American independence and animated, too, the advance of capitalism.
  • The quantification of time led to the quantification of everything else: the counting of people, the measurement of their labor, and the calculation of profit as a function of time. Keeping time and accumulating wealth earned a certain equivalency. “Time is money,” Benjamin Franklin used to say.
  • The two-party system turned out to be essential to the strength of the Republic. A stable party system organizes dissent. It turns discontent into a public good. And it insures the peaceful transfer of power, in which the losing party willingly, and without hesitation, surrenders its power to the winning party.
  • Behind Madison’s remarks about “lessening the proportion of slaves to the free people,” behind Jefferson’s tortured calculations about how many generations would have to pass before his own children could pass for “white,” lay this hard truth: none of these men could imagine living with descendants of Africans as political equals.
  • If the battle between John Adams and Thomas Jefferson had determined whether aristocracy or republicanism would prevail (and, with Jefferson, republicanism won), the battle between Andrew Jackson and John Quincy Adams would determine whether republicanism or democracy would prevail (and, with Jackson, democracy would, eventually, win). Jackson’s rise to power marked the birth of American populism. The argument of populism is that the best government is that most closely directed by a popular majority.
  • He was provincial, and poorly educated. (Later, when Harvard gave Jackson an honorary doctorate, John Quincy Adams refused to attend the ceremony, calling him “a barbarian who could not write a sentence of grammar and hardly could spell his own name.”)68 He had a well-earned reputation for being ferocious, ill-humored, and murderous, on the battlefield and off. When he ran for president, he had served less than a year in the Senate. Of his bid for the White House Jefferson declared, “He is one of the most unfit men I know of for such a place.”69 Jackson made a devilishly shrewd decision. He would make his lack of certain qualities—judiciousness, education, political experience—into strengths.
  • Eaton, who ran Jackson’s campaign, shrewdly revised his Life of Andrew Jackson, deleting or dismissing everything in Jackson’s past that looked bad and lavishing attention on anything that looked good and turning into strengths what earlier had been considered weaknesses: Eaton’s Jackson wasn’t uneducated; he was self-taught. He wasn’t ill-bred; he was “self-made.”
  • Watching the rise of American democracy, an aging political elite despaired, and feared that the Republic could not survive the rule of the people. Wrote John Randolph of Virginia, “The country is ruined past redemption.”
  • “The first principle of our system,” Jackson said, “is that the majority is to govern.” He bowed to the people. Then, all at once, the people nearly crushed him with their affection.
  • The democratization of American politics was hastened by revivalists like Stewart who believed in the salvation of the individual through good works and in the equality of all people in the eyes of God. Against that belief stood the stark and brutal realities of an industrializing age, the grinding of souls.
  • The great debates of the middle decades of the nineteenth century had to do with the soul and the machine. One debate merged religion and politics. What were the political consequences of the idea of the equality of souls? Could the soul of America be redeemed from the nation’s original sin, the Constitution’s sanctioning of slavery?
  • Another debate merged politics and technology. Could the nation’s new democratic traditions survive in the age of the factory, the railroad, and the telegraph? If all events in time can be explained by earlier events in time, if history is a line, and not a circle, then the course of events—change over time—is governed by a set of laws, like the laws of physics, and driven by a force, like gravity. What is that force? Is change driven by God, by people, or by machines? Is progress the progress of Pilgrim’s Progress, John Bunyan’s 1678 allegory—the journey of a Christian from sin to salvation? Is progress the extension of suffrage, the spread of democracy? Or is progress invention, the invention of new machines?
  • A distinctively American idea of progress involved geography as destiny, picturing improvement as change not only over time but also over space.
  • If the sincerity of converts was often dubious, another kind of faith was taking deeper root in the 1820s, an evangelical faith in technological progress, an unquestioning conviction that each new machine was making the world better. That faith had a special place in the United States, as if machines had a distinctive destiny on the American continent. In prints and paintings, “Progress” appeared as a steam-powered locomotive, chugging across the continent, unstoppable. Writers celebrated inventors as “Men of Progress” and “Conquerors of Nature” and lauded their machines as far worthier than poetry. The triumph of the sciences over the arts meant the defeat of the ancients by the moderns. The genius of Eli Whitney, hero of modernity, was said to rival that of Shakespeare; the head of the U.S. Patent Office declared the steamboat “a mightier epic” than the Iliad.18
  • To Jackson’s supporters, his election marked not degeneration but a new stage in the history of progress. Nowhere was this argument made more forcefully, or more influentially, than in George Bancroft’s History of the United States from the Discovery of the American Continent to the Present. The book itself, reviewers noted, voted for Jackson. The spread of evangelical Christianity, the invention of new machines, and the rise of American democracy convinced Bancroft that humanity is steadily advancing, and that “the advance of liberty and justice is certain.” That advance, men like Bancroft and Jackson believed, required Americans to march across the continent, to carry these improvements from east to west, the way Jefferson had pictured it. Democracy, John O’Sullivan, a New York lawyer and Democratic editor, argued in 1839, is nothing more or less than “Christianity in its earthly aspect.” O’Sullivan would later coin the term “manifest destiny” to describe this set of beliefs, the idea that the people of the United States were fated “to overspread and to possess the whole of the continent which Providence has given for the development of the great experiment of liberty.”23
  • To evangelical Democrats, Democracy, Christianity, and technology were levers of the same machine. And yet, all along, there were critics and dissenters and objectors who saw, in the soul of the people, in the march of progress, in the unending chain of machines, in the seeming forward movement of history, little but violence and backwardness and a great crushing of men, women, and children. “Oh, America, America,” Maria Stewart cried, “foul and indelible is thy stain!”24
  • The self-evident, secular truths of the Declaration of Independence became, to evangelical Americans, the truths of revealed religion. To say that this marked a turn away from the spirit of the nation’s founding is to wildly understate the case. The United States was founded during the most secular era in American history, either before or since. In the late eighteenth century, church membership was low, and anticlerical feeling was high.
  • The United States was not founded as a Christian nation. The Constitution prohibits religious tests for officeholders. The Bill of Rights forbids the federal government from establishing a religion, James Madison having argued that to establish
  • The separation of church and state allowed religion to thrive; that was one of its intentions. Lacking an established state religion, Americans founded new sects, from Shakers to Mormons, and rival Protestant denominations sprang up in town after town. Increasingly, the only unifying, national religion was a civil religion, a belief in the American creed. This faith bound the nation together, and provided extraordinary political stability in an era of astonishing change,
  • Slavery wasn’t an aberration in an industrializing economy; slavery was its engine. Factories had mechanical slaves; plantations had human slaves. The power of machines was measured by horsepower, the power of slaves by hand power. A healthy man counted as “two hands,” a nursing woman as a “half-hand,” a child as a “quarter-hand.”
  • With Walker, the antislavery argument for gradual emancipation, with compensation for slave owners, became untenable. Abolitionists began arguing for immediate emancipation. And southern antislavery societies shut their doors. As late as 1827, the number of antislavery groups in the South had outnumbered those in the North by more than four to one. Southern antislavery activists were usually supporters of colonization, not of emancipation. Walker’s Appeal ended the antislavery movement in the South and radicalized it in the North.
  • The rebellion rippled across the Union. The Virginia legislature debated the possibility of emancipating its slaves, fearing “a Nat Turner might be in every family.” Quakers submitted a petition to the state legislature calling for abolition. The petition was referred to a committee, headed by Thomas Jefferson’s thirty-nine-year-old grandson, Thomas Jefferson Randolph, who proposed a scheme of gradual emancipation. Instead, the legislature passed new laws banning the teaching of slaves to read and write, and prohibiting, too, teaching slaves about the Bible.43 In a nation founded on a written Declaration, made sacred by evangelicals during a religious revival, reading about equality became a crime.
  • One consequence of the rise of Jacksonian democracy and the Second Great Awakening was the participation of women in the reformation of American politics by way of American morals. When suffrage was stripped of all property qualifications, women’s lack of political power became starkly obvious. For women who wished to exercise power, the only source of power seemingly left to them was their role as mothers, which, they suggested, rendered them morally superior to men—more loving, more caring, and more responsive to the cries of the weak.
  • Purporting to act less as citizens than as mothers, cultivating the notion of “republican motherhood,” women formed temperance societies, charitable aid societies, peace societies, vegetarian societies, and abolition societies. The first Female Anti-Slavery Society was founded in Boston in 1833; by 1837, 139 Female Anti-Slavery Societies had been founded across the country,
  • After 1835, she never again spoke in public. As Catherine Beecher argued in 1837, in An Essay on Slavery and Abolitionism, with Reference to the Duty of American Females, “If the female advocate chooses to come upon a stage, and expose her person, dress, and elocution to public criticism, it is right to express disgust.”
  • Jacksonian democracy distributed political power to the many, but industrialization consolidated economic power in the hands of a few. In Boston, the top 1 percent of the population controlled 10 percent of wealth in 1689, 16 percent in 1771, 33 percent in 1833, and 37 percent in 1848, while the lowest 80 percent of the population controlled 39 percent of the wealth in 1689, 29 percent in 1771, 14 percent in 1833, and a mere 4 percent in 1848.
  • In New York, the top 1 percent of the population controlled 40 percent of the wealth in 1828 and 50 percent in 1845; the top 4 percent of the population controlled 63 percent of the wealth in 1828 and 80 percent in 1845.49
  • While two and a half million Europeans had migrated to all of the Americas between 1500 and 1800, the same number—two and a half million—arrived specifically in the United States between 1845 and 1854 alone. As a proportion of the U.S. population, European immigrants grew from 1.6 percent in the 1820s to 11.2 percent in 1860. Writing in 1837, one Michigan reformer called the nation’s rate of immigration “the boldest experiment upon the stability of government ever made in the annals of time.”51 The largest
  • Critics of Jackson—himself the son of Irish immigrants—had blamed his election on the rising population of poor, newly enfranchised Irishmen. “Everything in the shape of an Irishman was drummed to the polls,” one newspaper editor wrote in 1828.52 By 1860, more than one in eight Americans were born in Europe, including 1.6 million Irish and 1.2 million Germans, the majority of whom were Catholic. As the flood of immigrants swelled, the force of nativism gained strength, as did hostility toward Catholics, fueled by the animus of evangelical Protestants.
  • The insularity of both Irish and German communities contributed to a growing movement to establish tax-supported public elementary schools, known as “common schools,” meant to provide a common academic and civic education to all classes of Americans. Like the extension of suffrage to all white men, this element of the American experiment propelled the United States ahead of European nations. Much of the movement’s strength came from the fervor of revivalists. They hoped that these new schools would assimilate a diverse population of native-born and foreign-born citizens by introducing them to the traditions of American culture and government, so that boys, once men, would vote wisely, and girls, once women, would raise virtuous children. “It is our duty to make men moral,” read one popular teachers’ manual, published in 1830. Other advocates hoped that a shared education would diminish partisanship. Whatever the motives of its advocates, the common school movement emerged out of, and nurtured, a strong civic culture.56
  • With free schools, literacy spread, and the number of newspapers rose, a change that was tied to the rise of a new party system. Parties come and go, but a party system—a stable pair of parties—has characterized American politics since the ratification debates. In American history the change from one party system to another has nearly always been associated with a revolution in communications that allows the people to shake loose of the control of parties. In the 1790s, during the rise of the first party system, which pitted Federalists against Republicans, the number of newspapers had swelled. During the shift to the second party system, which, beginning in 1833, pitted Democrats against the newly founded Whig Party, not only did the number of newspapers rise, but their prices plummeted.
  • The newspapers of the first party system, which were also known as “commercial advertisers,” had consisted chiefly of partisan commentary and ads, and generally sold for six cents an issue. The new papers cost only one cent, and were far more widely read. The rise of the so-called penny press also marked the beginning of the triumph of “facts” over “opinion” in American journalism, mainly because the penny press aimed at a different, broader, and less exclusively partisan, audience. The New York Sun appeared in 1833. “It shines for all” was its common-man motto. “The object of this paper is to lay before the public, at a price within the means of everyone, ALL THE NEWS OF THE DAY,” it boasted. It dispensed with subscriptions and instead was circulated at newsstands, where it was sold for cash, to anyone who had a ready penny. Its front page was filled not with advertising but with news. The penny press was a “free press,” as James Gordon Bennett of the New York Herald put it, because it wasn’t beholden to parties. (Bennett, born in Scotland, had immigrated to the United States after reading Benjamin Franklin’s Autobiography.) Since the paper was sold at newsstands, rather than mailed to subscribers, he explained, its editors and writers were “entirely ignorant who are its readers and who are not.” They couldn’t favor their readers’ politics because they didn’t know them. “We shall support no party,” Bennett insisted. “We shall endeavor to record facts.”
  • During the days of the penny press, Tocqueville observed that Americans had a decided preference for weighing the facts of a matter themselves: They mistrust systems; they adhere closely to facts and study facts with their own senses. As they do not easily defer to the mere name of any fellow man, they are never inclined to rest upon any man’s authority; but, on the contrary, they are unremitting in their efforts to find out the weaker points of their neighbor’s doctrine.60
  • For centuries, Europeans had based their claims to lands in the New World on arguments that native peoples had no right to the land they inhabited, no sovereignty over it, because they had no religion, or because they had no government, or because they had no system of writing. The Cherokees, with deliberation and purpose, challenged each of these arguments.
  • Britain, Calhoun argued that if a state were to decide that a law passed by Congress was unconstitutional, the Constitution would have to be amended, and if such an amendment were not ratified—if it didn’t earn the necessary approval of three-quarters of the states—the objecting state would have the right to secede from the Union. The states had been sovereign before the Constitution was ever written, or even thought of, Calhoun argued, and they remained sovereign. Calhoun also therefore argued against majority rule; nullification is fundamentally anti-majoritarian. If states can secede, the majority does not rule.78 The nullification crisis was
  • New York abolished debtors’ prison in 1831, and in 1841, Congress passed a federal law offering bankruptcy protection to everyone. Within two years, 41,000 Americans had filed for bankruptcy. Two years later, the law was repealed, but state laws continued to offer bankruptcy protection and, still more significantly, debtors’ prisons were gone for good. In Britain and all of Europe except Portugal, offenders were still being thrown in debtors’ prison (a plot that animated many a nineteenth-century novel); in the United States, debtors could declare bankruptcy and begin again.
  • A nation of debtors, Americans came to see that most people who fall into debt are victims of the business cycle and not of fate or divine retribution or the wheel of fortune. The nation’s bankruptcy laws, even as they came and went again, made taking risks less risky for everyone, which meant that everyone took more risks.
  • the geographical vastness of the United States meant that the anxiety about the machinery of industrial capitalism took the form not of Marxism, with its argument that “the history of all hitherto existing society is the history of class struggles,” but instead of a romance with nature, and with the land, and with all things rustic. Against the factory, Americans posed not a socialist utopia but the log cabin.
  • Were all these vast designs and rapid strides worth it? Thoreau thought not. He came to this truth: “They are but improved means to an unimproved end.”112
  • Expansion, even more than abolition, pressed upon the public the question of the constitutionality of slavery. How or even whether this crisis would be resolved was difficult to see not only because of the nature of the dispute but also because there existed very little agreement about who might resolve it: Who was to decide whether a federal law was unconstitutional?
  • In the midst of all this clamoring among the thundering white-haired patriarchs of American politics, there emerged the idea that the authority to interpret the Constitution rests with the people themselves. Or, at least, this became a rather fashionable thing to say. “It is, Sir, the people’s Constitution, the people’s government, made for the people, made by the people, and answerable to the people,” Daniel Webster roared from the floor of Congress.14 Every man could read and understand the Constitution, Webster insisted.
  • The Notes, it appeared, could be read as variously as the Constitution itself. As one shrewd observer remarked, “The Constitution threatens to be a subject of infinite sects, like the Bible.” And, as with many sects, those politicians who most strenuously staked their arguments on the Constitution often appeared the least acquainted with it. Remarked New York governor Silas Wright, “No one familiar with the affairs of our government, can have failed to notice how large a proportion of our statesmen appear never to have read the Constitution of the United States with a careful reference to its precise language and exact provisions, but rather, as occasion presents, seem to exercise their ingenuity . . . to stretch both to the line of what they, at the moment, consider expedient.”22
  • A nation has borders but the edges of an empire are frayed.23 While abolitionists damned the annexation of Texas as an extension of the slave power, other critics called it an act of imperialism, inconsistent with a republican form of government. “We have a republic, gentlemen, of vast extent and unequalled natural advantages,” Daniel Webster pointed out. “Instead of aiming to enlarge its boundaries, let us seek, rather, to strengthen its union.”24 Webster lost that argument, and, in the end, it was the American reach for empire that, by sundering the Union, brought about the collapse of slavery.
  • Although hardly ever reported in the press, the years between 1830 and 1860 saw more than one hundred incidents of violence between congressmen, from melees in the aisles to mass brawls on the floor, from fistfights and duels to street fights. “It is the game of these men, and of their profligate organs,” Dickens wrote, “to make the strife of politics so fierce and brutal, and so destructive of all self-respect in worthy men, that sensitive and delicate-minded persons shall be kept aloof, and they, and such as they, be left to battle out their selfish views unchecked.”
  • They spat venom. They pulled guns. They unsheathed knives. Divisions of party were abandoned; the splinter in Congress was sectional. Before heading to the Capitol every morning, southern congressmen strapped bowie knives to their belts and tucked pistols into their pockets. Northerners, on principle, came unarmed. When northerners talked about the slave power, they meant that literally.32
  • If the United States were to acquire territory from Mexico, and if this territory were to enter the Union, would Mexicans become American citizens? Calhoun, now in the Senate, vehemently opposed this idea. “I protest against the incorporation of such a people,” he declared. “Ours is the government of the white man.”
  • And yet, as different as were Wilmot’s interests from Calhoun’s, they were both interested in the rights of white men, as Wilmot made plain. “I plead the cause of the rights of white freemen,” he said. “I would preserve for free white labor a fair country, a rich inheritance, where the sons of toil, of my own race and own color, can live without the disgrace which association with negro slavery brings upon free labor.”
  • If the problem was the size of the Republic, the sprawl of its borders, the frayed edges of empire, couldn’t railroads, and especially the telegraph, tie the Republic together? “Doubt has been entertained by many patriotic minds how far the rapid, full, and thorough intercommunication of thought and intelligence, so necessary to the people living under a common representative republic, could be expected to take place throughout such immense bounds,” said one House member in 1845, but “that doubt can no longer exist.”45
  • even Americans with an unflinching faith in machine-driven progress understood that a pulse along a wire could not stop the slow but steady dissolution of the Union.
  • the Treaty of Guadalupe Hidalgo, under which the top half of Mexico became the bottom third of the United States. The gain to the United States was as great as the loss to Mexico. In 1820, the United States of America had spanned 1.8 million square miles, with a population of 9.6 million people; Mexico had spanned 1.7 million square miles, with a population of 6.5 million people. By 1850, the United States had acquired one million square miles of Mexico, and its population had grown to 23.2 million; Mexico’s population was 7.5 million.49
  • The Louisiana Purchase had doubled the size of the United States. In gaining territory from Mexico, the United States grew by 64 percent.
  • the territory comprising the United States had grown to “nearly ten times as large as the whole of France and Great Britain combined; three times as large as the whole of France, Britain, Austria, Prussia, Spain, Portugal, Belgium, Holland, and Denmark, together; one-and-a-half times as large as the Russian empire in Europe; one-sixth less only than the area covered by the fifty-nine or sixty empires, states, and Republics of Europe; of equal extent with the Roman Empire or that of Alexander, neither of which is said to have exceeded 3,000,000 square miles.”50
  • Sentiment was not Fuller’s way; debate was her way. She was a scourge of lesser intellects. Edgar Allan Poe, whose work she did not admire, described her as wearing a perpetual sneer. In “The Great Lawsuit: Man versus Men, Woman versus Women,” Fuller argued that the democratization of American politics had cast light on the tyranny of men over women: “As men become aware that all men have not had their fair chance,” she observed, women had become willing to say “that no women have had a fair chance.”
  • In 1845, in Woman in the Nineteenth Century, Fuller argued for fundamental and complete equality: “We would have every path laid open to Woman as freely as to Man.”56 The book was wildly successful, and Greeley, who had taken to greeting Fuller with one of her catchphrases about women’s capacity—“Let them be sea-captains, if you will”—sent her to Europe to become his newspaper’s foreign correspondent.
  • Reeling from those revolutions, the king of Bavaria asked the historian Leopold von Ranke to explain why his people had rebelled against monarchical rule, as had so many peoples in Europe that year. “Ideas spread most rapidly when they have found adequate concrete expression,” Ranke told the king, and the United States had “introduced a new force in the world,” the idea that “the nation should govern itself,” an idea that would determine “the course of the modern world”: free speech, spread by wire, would make the whole world free.61
  • Unlike Thoreau, who cursed the railroads, Free-Soilers believed in improvement, improvement through the hard work of the laboring man, his power, his energy. “Our paupers to-day, thanks to free labor, are our yeoman and merchants of tomorrow,” the New York Times boasted. “Why, who are the laboring people of the North?” Daniel Webster asked. “They are the whole North. They are the people who till their own farms with their own hands, freeholders, educated men, independent men.”
  • This attack by northerners led southerners to greater exertions in defending their way of life. They battled on several fronts. They described northern “wage slavery” as a far more exploitative system of labor than slavery. They celebrated slavery as fundamental to American prosperity. Slavery “has grown with our growth, and strengthened with our strength,” Calhoun said. And they elaborated an increasingly virulent ideology of racial difference, arguing against the very idea of equality embodied in the American creed.
  • Conservative Virginian George Fitzhugh, himself inspired by ethnological thinking, dismissed the “self-evident truths” of the Declaration of Independence as utter nonsense. “Men are not born physically, morally, or intellectually equal,” he wrote. “It would be far nearer the truth to say, ‘that some were born with saddles on their backs, and others booted and spurred to ride them,’—and the riding does them good.”
  • For Fitzhugh, the error had begun in the imaginations of the philosophes of the Enlightenment and in their denial of the reality of history. Life and liberty are not “inalienable rights,” Fitzhugh argued: instead, people “have been sold in all countries, and in all ages, and must be sold so long as human nature lasts.” Equality means calamity: “Subordination, difference of caste and classes, difference of sex, age, and slavery beget peace and good will.”
  • Progress is an illusion: “the world has not improved in the last two thousand, probably four thousand years.” Perfection is to be found in the past, not in the future.66 As for the economic systems of the North and the South, “Free laborers have not a thousandth part of the rights and liberties of negro slaves,” Fitzhugh insisted. “The negro slaves of the South are the happiest, and, in some sense, the freest people in the world.”67
  • History teems with mishaps and might-have-beens: explosions on the Potomac, storms not far from port, narrowly contested elections, court cases lost and won, political visionaries drowned. But over the United States in the 1850s, a sense of inevitability fell, as if there were a fate, a dismal dismantlement, that no series of events or accidents could thwart.
  • Douglas promoted the idea of popular sovereignty, proclaiming, “If there is any one principle dearer and more sacred than all others in free governments, it is that which asserts the exclusive right of a free people to form and adopt their own fundamental law.”75 Unfree people, within Stephen Douglas’s understanding, had no such rights.
  • the Fugitive Slave Law, required citizens to turn in runaway slaves and denied fugitives the right to a jury trial. The law, said Harriet Jacobs, a fugitive slave living in New York, marked “the beginning of a reign of terror to the colored population.”76 Bounty hunters and slave catchers hunted down and captured former slaves and returned them to their owners for a fee. Little stopped them from seizing men, women, and children who had been born free, or who had been legally emancipated, and selling them to the South, too. Nothing so brutally exposed the fragility of freedom or the rapaciousness of slavery.
  • In February 1854, at their convention in Philadelphia, northern Know-Nothings proposed a platform plank calling for the reinstatement of the Missouri Compromise. When that motion was rejected, some fifty delegates from eight northern states bolted: they left the convention, and the party, to set up their own party, the short-lived North American Party. Nativism would endure as a force in American politics, but, meanwhile, nativists split over slavery.
  • Lincoln’s was the language of free soil, free speech, and free labor. He grounded his argument against slavery in his understanding of American history, in the language of Frederick Douglass, and in his reading of the Constitution. “Let no one be deceived,” he said. “The spirit of seventy-six and the spirit of Nebraska, are utter antagonisms.”
  • As a nation, we began by declaring that “all men are created equal.” We now practically read it “all men are created equal, except negroes.” When the Know-Nothings get control, it will read “all men are created equal, except negroes, and foreigners, and Catholics.” When it comes to this I should prefer emigrating to some country where they make no pretense of loving liberty—to Russia, for instance, where despotism can be taken pure, and without the base alloy of hypocrisy.
  • “That negroes, whether slave or free, that is, men of the African race, are not citizens of the United States by the Constitution.” The implications of the ruling stunned his readers. Even Americans who held no strong views on the question of slavery—and they were rare enough—were nonetheless shocked by the court’s exercise of the authority to determine the unconstitutionality of the law.
  • “A large meeting of colored people” was held in Philadelphia in April, at which it was resolved that “the only duty the colored man owes to a Constitution under which he is declared to be an inferior and degraded being, having no rights which white men are bound to respect, is to denounce and repudiate it, and to do what he can by all proper means to bring it into contempt.”
  • “You may close your Supreme Court against the black man’s cry for justice, but you cannot, thank God, close against him the ear of a sympathising world, nor shut up the Court of Heaven.” Taney’s interpretation of the Constitution would be ignored, Douglass predicted. “Slavery lives in this country not because of any paper Constitution, but in the moral blindness of the American people.”102
  • A PHOTOGRAPH STOPS TIME, TRAPPING IT LIKE A BUTTERFLY in a jar.
  • No other kind of historical evidence has this quality of instantaneity, of an impression taken in a moment, in a flicker, an eye opened and then shut. Photographs also capture the ordinary, the humble, the speechless. The camera discriminates between light and dark but not between the rich and the poor, the literate and the illiterate, the noisy and the quiet.
  • portraits were also closely associated with death, with being trapped in time, on glass, for eternity, and, even more poignantly, with equality.3 With photography, Walt Whitman predicted, “Art will be democratized.”
  • Morse had long predicted that the telegraph would usher in an age of world peace. “I trust that one of its effects will be to bind man to his fellow-man in such bonds of amity as to put an end to war,” he insisted.8 War was a failure of technology, Morse argued, a shortcoming of communication that could be remedied by way of a machine. Endowing his work with the grandest of purposes, he believed that the laying of telegraph wires across the American continent would bind the nation together into one people, and that the laying of cable across the ocean would bind Europe to the Americas, ushering in the dawn of an age of global harmony.
  • But war isn’t a failure of technology; it’s a failure of politics.
  • Debate is to war what trial by jury is to trial by combat: a way to settle a dispute without coming to blows. The form and its rules had been established over centuries. They derived from rules used in the courts and in Parliament, and even from the rules of rhetoric used in the writing of poetry. Since the Middle Ages and the founding of the first universities, debate had been the foundation of a liberal arts education.
  • (Etymologically and historically, the artes liberales are the arts acquired by people who are free, or liber.)10 In the eighteenth century, debate was understood as the foundation of civil society. In 1787, delegates to the constitutional convention had agreed to “to argue without asperity, and to endeavor to convince the judgment without hurting the feelings of each other.”
  • Some twelve thousand people showed up for their first debate, at two o’clock in the afternoon on August 21, in Ottawa, Illinois. There were no seats; the audience stood, without relief, for three hours.
  • They’d agreed to strict rules: the first speaker would speak for an hour and the second for an hour and a half, whereupon the first speaker would offer a thirty-minute rebuttal.
  • And, as to the wrongness of slavery, he called it tyranny, and the idea of its naturalness as much an error as a belief in the divine right of kings. The question wasn’t sectionalism or nationalism, the Democratic Party or the Republican Party. The question was right against wrong. “That is the issue that will continue in this country when these poor tongues of Judge Douglas and myself shall be silent,” Lincoln said.16
  • The price of slaves grew so high that a sizable number of white southerners urged the reopening of the African slave trade. In the 1850s, legislatures in several states, including South Carolina, proposed reopening the trade. Adopting this measure would have violated federal law. Some “reopeners” believed that the federal ban on the trade was unconstitutional; others were keen to nullify it, in a dress rehearsal for secession.
  • “If it is right to buy slaves in Virginia and carry them to New Orleans, why is it not right to buy them in Cuba, Brazil, or Africa and carry them there?”21 Proslavery southerners made these arguments under the banner of “free trade,” their rhetorical answer to “free labor.”
  • To George Fitzhugh, all societies were “at all times and places, regulated by laws as universal and as similar as those which control the affairs of bees,” and trade itself, including the slave trade, was “as old, as natural, and irresistible as the tides of the ocean.”
  • In 1855, David Christy, the author of Cotton Is King, wrote about the vital importance of “the doctrine of Free Trade,” which included abolishing the tariffs that made imported English goods more expensive than manufactured goods produced in the North. As one southerner put it, “Free trade, unshackled industry, is the motto of the South.”23
  • Darwin’s Origin of Species would have a vast and lingering influence on the world of ideas. Most immediately, it refuted the racial arguments of ethnologists like Louis Agassiz. And, in the months immediately following the book’s publication—the last, unsettling months before the beginning of the Civil War—abolitionists took it as evidence of the common humanity of man.30
  • The truths of the Confederacy disavowed the truths of the Union. The Confederacy’s newly elected vice president, a frail Georgian named Alexander Stephens, delivered a speech in Savannah in which he made those differences starkly clear. The ideas that lie behind the Constitution “rested upon the assumption of the equality of races,” Stephens said, but
  • “Our new government is founded upon exactly the opposite idea: its foundations are laid, its cornerstone rests, upon the great truth that the negro is not equal to the white man; that slavery . . . is his natural and moral condition. This, our new government, is the first, in the history of the world, based upon this great physical, philosophical, and moral truth.”52 It would become politically expedient, after the war, for ex-Confederates to insist that the Confederacy was founded on states’ rights. But the Confederacy was founded on white supremacy.
  • Opposition to free speech had long been the position of slave owners, a position taken at the constitutional convention and extended through the gag rule, antiliteracy laws, bans on the mails, and the suppression of speakers. An aversion to political debate also structured the Confederacy, which had both a distinctive character and a lasting influence on Americans’ ideas about federal authority as against popular sovereignty.
  • Secessionists were attempting to build a modern, proslavery, antidemocratic state. In order to wage a war, the leaders of this fundamentally antidemocratic state needed popular support. Such support was difficult to gain and impossible to maintain. The Confederacy therefore suppressed dissent.55
  • By May of 1861, the Confederacy comprised fifteen states stretching over 900,000 square miles and containing 12 million people, including 4 million slaves, and 4 million white women who were disenfranchised. It rested on the foundational belief that a minority governs a majority. “The condition of slavery is with us nothing but a form of civil government for a class of people not fit to govern themselves,” said Jefferson Davis.
  • There would be those, after the war ended, who said that it had been fought over states’ rights or to preserve the Union or for a thousand other reasons and causes. Soldiers, North and South, knew better. “The fact that slavery is the sole undeniable cause of this infamous rebellion, that it is a war of, by, and for Slavery, is as plain as the noon-day sun,” a soldier writing for his Wisconsin regimental newspaper explained in 1862. “Any man who pretends to believe that this is not a war for the emancipation of the blacks,” a soldier writing for his Confederate brigade’s newspaper wrote that same year, “is either a fool or a liar.”
  • Lincoln would remain a man trapped in time, in the click of a shutter and by the trigger of a gun. In mourning him, in sepia and yellow, in black and white, beneath plates of glinting glass, Americans deferred a different grief, a vaster and more dire reckoning with centuries of suffering and loss, not captured by any camera, not settled by any amendment, the injuries wrought on the bodies of millions of men, women, and children, stolen, shackled, hunted, whipped, branded, raped, starved, and buried in unmarked graves.
  • No president consecrated their cemeteries or delivered their Gettysburg address; no committee of arrangements built monuments to their memory. With Lincoln’s death, it was as if millions of people had been crammed into his tomb, trapped in a vault that could not hold them.
  • People running for Congress didn’t have to meet property requirements; they didn’t have to have been born in the United States; and they couldn’t be subjected to religious tests. This same logic applied to citizenship, and for the same reason: the framers of the Constitution understood these sorts of requirements as forms of political oppression. The door to the United States was meant to be open.
  • Before the 1880s, no federal law restricted immigration. And, despite periods of fervent nativism, especially in the 1840s, the United States welcomed immigrants into citizenship, and valued them. After the Civil War, the U.S. Treasury estimated the worth of each immigrant as equal to an $800 contribution to the nation’s economy,
  • Nineteenth-century politicians and political theorists interpreted American citizenship within the context of an emerging set of ideas about human rights and the authority of the state, holding dear the conviction that a good government guarantees everyone eligible for citizenship the same set of political rights, equal and irrevocable.
  • The Civil War raised fundamental questions not only about the relationship between the states and the federal government but also about citizenship itself and about the very notion of a nation-state. What is a citizen? What powers can a state exert over its citizens? Is suffrage a right of citizenship, or a special right, available only to certain citizens? Are women citizens? And if women are citizens, why aren’t they voters? What about Chinese immigrants, pouring into the West? They were free. Were they, under American law, “free white persons” or “free persons of color” or some other sort of persons?
  • In 1866, Congress searched in vain for a well-documented definition of the word “citizen.” Over the next thirty years, that definition would become clear, and it would narrow.
  • In 1896, the U.S. passport office, in the Department of State, which had grown to thousands of clerks, began processing applications according to new “Rules Governing the Application of Passports,” which required evidence of identity, including a close physical description (Age, _____ years; stature, _____ feet _____ inches (English measure); forehead, _____; eyes, _____; nose, _____; mouth, _____; chin, _____; hair, _____; complexion, _____; face, _____) as well as affidavits, signatures, witnesses, an oath of loyalty, and, by way of an application fee, one dollar.12 [Photograph caption: Lew Wa Ho worked at a dry goods shop in St. Louis; the photograph was included in his Immigration Service case file as evidence of employment.]
  • The Fourteenth Amendment, drafted by the Joint Committee on Reconstruction, marked the signal constitutional achievement of a century of debate and war, of suffering and struggle. It proposed a definition of citizenship guaranteeing its privileges and immunities, and insuring equal protection and due process to all citizens. “All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside,”
  • “No state shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any state deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.”20
  • During the drafting of the amendment, the committee betrayed the national phalanx of women who for decades had fought for abolition and for black civil rights by proposing to insert, into the amendment’s second section, a provision that any state that denied the right to vote “to any of the male inhabitants of such state” would lose representation in Congress. “Male” had never before appeared in any part of the Constitution. “If that word ‘male’ be inserted,” Stanton warned, “it will take us a century at least to get it out.”21 She was not far wrong.
  • Women protested. “Can any one tell us why the great advocates of Human Equality . . . forget that when they were a weak party and needed all the womanly strength of the nation to help them on, they always united the words ‘without regard to sex, race, or color’?” asked Ohio-born reformer Frances Gage. Charles Sumner offered this answer: “We know how the Negro will vote, but are not so sure of the women.” How women would vote was impossible to know. Would black women vote the way black men voted? Would white women vote like black women? Republicans decided they’d rather not find out.
  • In the federal census of 1860, 24,282 out of 34,935 Chinese toiled in mines. Although some Chinese immigrants left mining—and some were forced out—many continued to mine well into the 1880s, often working in sites abandoned by other miners.
  • An 1867 government report noted that in Montana, “the diggings now fall into the hands of the Chinese, who patiently glean the fields abandoned by the whites.” Chinese workers began settling in Boise in 1865 and only five years later constituted a third of Idaho’s settlers and nearly 60 percent of its miners. In 1870, Chinese immigrants and their children made up nearly 9 percent of the population of California, and one-quarter of the state’s wage earners.
  • Their rights, under state constitutions and statutes, were markedly limited. Oregon’s 1857 constitution barred “Chinamen” from owning real estate, while California barred Chinese immigrants from testifying in court, a provision upheld in an 1854 state supreme court opinion, People v. Hall, which described the Chinese as “a race of people whom nature has marked as inferior, and who are incapable of progress or intellectual development beyond a certain point, as their history has shown.”29
  • And what about the voting rights of U.S.-born Chinese Americans? Much turned on the Fifteenth Amendment, proposed early in 1869. While the aim of the amendment was to guarantee African Americans the right to vote and hold office, its language inevitably raised the question of Chinese citizenship and suffrage. Opponents of the amendment found its entire premise scandalous. Garrett Davis, a Democratic senator from Kentucky, fumed, “I want no negro government; I want no Mongolian government; I want the government of the white man which our fathers incorporated.”33
  • Douglass spoke about what he called a “composite nation,” a strikingly original and generative idea, about a citizenry made better, and stronger, not in spite of its many elements, but because of them: “I want a home here not only for the negro, the mulatto and the Latin races; but I want the Asiatic to find a home here in the United States, and feel at home here, both for his sake and for ours.”36
  • Tilden won the nomination anyway and, in the general election, he won the popular vote against Hayes. Unwilling to accept the result of the election, Republicans disputed the returns in Florida, Louisiana, and South Carolina.
  • Eventually, the decision was thrown to an electoral commission that brokered a nefarious compromise: Democrats agreed to throw their support behind the man ever after known as Rutherfraud B. Hayes, so that he could become president, in exchange for a promise from Republicans to end the military occupation of the South. For a minor and petty political win over the Democratic Party, Republicans first committed electoral fraud and then, in brokering a compromise, abandoned a century-long fight for civil rights.
  • As soon as federal troops withdrew, white Democrats, calling themselves the “Redeemers,” took control of state governments of the South, and the era of black men’s enfranchisement came to a violent and terrible end. The Klan terrorized the countryside, burning homes and hunting, torturing, and killing people. (Between 1882 and 1930, murderers lynched more than three thousand black men and women.)
  • Black politicians elected to office were thrown out. And all-white legislatures began passing a new set of black codes, known as Jim Crow laws, that segregated blacks from whites in every conceivable public place, down to the last street corner. Tennessee passed the first Jim Crow law, in 1881, mandating the separation of blacks and whites in railroad cars. Georgia became the first state to demand separate seating for whites and blacks in streetcars, in 1891.
  • “Capital buys and sells to-day the very heart-beats of humanity,” she said. Democracy itself had been corrupted by it: “the speculators, the land-robbers, the pirates and gamblers of this Nation have knocked unceasingly at the doors of Congress, and Congress has in every case acceded to their demands.”44 The capitalists, she said, had subverted the will of the people.
  • In the late nineteenth century, a curious reversal took place. Electoral politics, the politics men engaged in, became domesticated, the office work of education and advertising—even voting moved indoors. Meanwhile, women’s political expression moved to the streets. And there, at marches, rallies, and parades, women deployed the tools of the nineteenth-century religious revival: the sermon, the appeal, the conversion.45
  • In 1862 alone, in addition to the Homestead Act, the Republican Congress passed the Pacific Railway Act (chartering railroad companies to build the line from Omaha, Nebraska, to Sacramento, California) and the National Bank Act (to issue paper money to pay for it all). After the war, political power moved from the states to the federal government and as the political influence of the South waned, the importance of the West rose. Congress not only sent to the states amendments to the Constitution that defined citizenship and guaranteed voting rights but also passed landmark legislation involving the management of western land, the control of native populations, the growth and development of large corporations, and the construction of a national transportation infrastructure.
  • The independent farmer—the lingering ideal of the Jeffersonian yeoman—remained the watchword of the West, but in truth, the family farming for subsistence, free of government interference, was far less common than a federally subsidized, capitalist model of farming and cattle raising for a national or even an international market. The small family farm—Jefferson’s republican dream—was in many parts of the arid West an environmental impossibility.
  • Much of the property distributed under the terms of the Homestead Act, primarily in the Great Basin, was semi-arid, the kind of land on which few farmers could manage a productive farm with only 160 acres. Instead, Congress typically granted the best land to railroads, and allowed other, bigger interests to step in, buying up large swaths for agricultural business or stock raising and fencing it in, especially after the patenting of barbed wire in 1874.46
  • In 1885, an American economist tried to reckon the extraordinary transformation wrought by what was now 200,000 miles of railroad, more than in all of Europe. It was possible to move one ton of freight one mile for less than seven-tenths of one cent, “a sum so small,” he wrote, “that outside of China it would be difficult to find a coin of equivalent value to give a boy as a reward for carrying an ounce package across a street.”48
  • This instability contributed to a broader set of political concerns that became Mary Lease’s obsession, concerns known as “the money question,” and traceable all the way back to Hamilton’s economic plan: Should the federal government control banking and industry?
  • No group of native-born Americans was more determined to end Chinese immigration than factory workers. The 1876 platform of the Workingmen’s Party of California declared that “to an American death is preferable to life on par with a Chinaman.”55 In 1882, spurred by the nativism of populists, Congress passed its first-ever immigration law, the Chinese Exclusion Act, which barred immigrants from China from entering the United States and, determining that the Fourteenth Amendment did not apply to people of Chinese ancestry, decreed that Chinese people already in the United States were permanent aliens who could never become citizens.
  • Populists, whether farmers or factory workers, for all their invocation of “the people,” tended to take a narrow view of citizenship. United in their opposition to the “money power,” members of the alliance, like members of the Knights of Labor, were also nearly united in their opposition to the political claims of Chinese immigrants, and of black people. The Farmers’ Alliance excluded African Americans, who formed their own association, the Colored Farmers’ Alliance. Nor did populists count Native Americans within the body of “the people.”
  • In 1887, Congress passed the Dawes Severalty Act, under whose terms the U.S. government offered native peoples a path to citizenship in a nation whose reach had extended across the lands of their ancestors. The Dawes Act granted to the federal government the authority to divide Indian lands into allotments and guaranteed U.S. citizenship to Indians who agreed to live on those allotments and renounce tribal membership.
  • In proposing the allotment plan, Massachusetts senator Henry Laurens Dawes argued that the time had come for Indians to choose between “extermination or civilization” and insisted that the law offered Americans the opportunity to “wipe out the disgrace of our past treatment” and instead lift Indians up “into citizenship and manhood.”58
  • But in truth the Dawes Act understood native peoples neither as citizens nor as “persons of color,” and led to nothing so much as forced assimilation and the continued takeover of native lands. In 1887 Indians held 138 million acres; by 1900, they held only half of that territory.
  • In 1877, railroad workers protesting wage cuts went on strike in cities across the country. President Hayes sent in federal troops to end the strikes, marking the first use of the power of the federal government to support business against labor. The strikes continued, with little success in improving working conditions. Between 1881 and 1894, there was, on average, one major railroad strike a week. Labor was, generally and literally, crushed: in a single year, of some 700,000 men working on the railroads, more than 20,000 were injured on the job and nearly 2,000 killed.59
  • In 1882, Roscoe Conkling represented the Southern Pacific Railroad Company’s challenge to a California tax rule. He told the U.S. Supreme Court, “I come now to say that the Southern Pacific Railroad Company and its creditors and stockholders are among the ‘persons’ protected by the Fourteenth Amendment.”
  • In offering an argument about the meaning and original intention of the word “person” in the Fourteenth Amendment, Conkling enjoyed a singular authority: he’d served on the Joint Committee on Reconstruction that had drafted the amendment and by 1882 was the lone member of that committee still living. With no one alive to contradict him, Conkling assured the court that the committee had specifically rejected the word “citizen” in favor of “person” in order to include corporations.
  • Much evidence suggests, however, that Conkling was lying. The record of the deliberations of the Joint Committee on Reconstruction does not support his argument regarding the committee’s original intentions, nor is it plausible that between 1866 and 1882, the framers of the Fourteenth Amendment had kept mysteriously hidden their secret intention to guarantee equal protection and due process to corporations. But
  • in 1886, when another railroad case, Santa Clara County v. Southern Pacific Railroad, reached the Supreme Court, the court’s official recorder implied that the court had accepted the doctrine that “corporations are persons within the meaning of the Fourteenth Amendment.”62 After that, the Fourteenth Amendment, written and ratified to guarantee freed slaves equal protection and due process of law, became the chief means by which corporations freed themselves from government regulation.
  • In 1937, Supreme Court Justice Hugo Black would observe, with grim dismay, that, over the course of fifty years, “only one half of one percent of the Fourteenth Amendment cases that came before the court had anything to do with African Americans or former slaves, while over half of the cases were about protecting the rights of corporations.”63 Rights guaranteed to the people were proffered, instead, to corporations.
  • He devised an economic plan that involved abolishing taxes on labor and instead imposing a single tax on land. Tocqueville had argued that democracy in America is made possible by economic equality; people with equal estates will eventually fight for, and win, equal political rights. George agreed. But, like Mary Lease, he thought that financial capitalism was destroying democracy by making economic equality impossible. He saw himself as defending “the Republicanism of Jefferson and the Democracy of Jackson.”72
  • Between 1889 and 1893, the mortgages on so many farms were foreclosed that 90 percent of farmland fell into the hands of bankers. The richest 1 percent of Americans owned 51 percent of the nation’s wealth, and the poorest 44 percent owned less than 2 percent.
  • For all its passionate embrace of political equality and human rights and its energetic championing of suffrage, the People’s Party rested on a deep and abiding commitment to exclude from full citizenship anyone from or descended from anyone from Africa or Asia.
  • Many of the reforms proposed by populists had the effect of diminishing the political power of blacks and immigrants. Chief among them was the Australian ballot, more usually known as the secret ballot, which, by serving as a de facto literacy test, disenfranchised both black men in the rural South and new immigrants in northern cities.
  • States deliberated at length over the secret ballot. Quickest to adopt the reform were the states of the former Confederacy, where the reform appealed to legislatures eager to find legal ways to keep black men from voting. In 1890, Mississippi held a constitutional convention.
  • Both by law and by brute force, southern legislators, state by state, and poll workers, precinct by precinct, denied black men the right to vote. In Louisiana, black voter registration dropped from 130,000 in 1898 to 5,300 in 1908, and to 730 in 1910. In 1893, Arkansas Democrats celebrated their electoral advantage by singing,

        The Australian ballot works like a charm
        It makes them think and scratch
        And when a Negro gets a ballot
        He has certainly met his match.82
  • One Republican said, “I felt that Bryan was the first politician I had ever heard speak the truth and nothing but the truth,” even though in every case, when he read a transcript of the speech in the newspaper the next day, he “disagreed with almost all of it.”85
  • In 1894, Bryan tacked an income tax amendment to a tariff bill, which managed to pass. But the populist victory—a 2 percent federal income tax that applied only to Americans who earned more than $4,000—didn’t last long. The next year, in Pollock v. Farmers’ Loan and Trust Company, the Supreme Court ruled 5–4 that the tax was a direct tax, and therefore unconstitutional, one justice calling the tax the first campaign in “a war of the poor against the rich.”
  • POPULISM ENTERED AMERICAN politics at the end of the nineteenth century, and it never left. It pitted “the people,” meaning everyone but the rich, against corporations, which fought back in the courts by defining themselves as “persons”; and it pitted “the people,” meaning white people, against nonwhite people who were fighting for citizenship and whose ability to fight back in the courts was far more limited, since those fights require well-paid lawyers.
  • After 1859, and the Origin of Species, the rise of Darwinism contributed to the secularization of the university, as did the influence of the German educational model, in which universities were divided into disciplines and departments, each with a claim to secular, and especially scientific, expertise. These social sciences—political science, economics, sociology, and anthropology—used the methods of science, and especially of quantification, to study history, government, the economy, society, and culture.96
  • For Wilson’s generation of political scientists, the study of the state replaced the study of the people. The erection of the state became, in their view, the greatest achievement of civilization. The state also provided a bulwark against populism. In the first decades of the twentieth century, populism would yield to progressivism as urban reformers applied the new social sciences to the study of political problems, to be remedied by the intervention of the state.
  • The rise of populism and the social sciences reshaped the press, too. In the 1790s, the weekly partisan newspaper produced the two-party system. The penny press of the 1830s produced the popular politics of Jacksonian democracy. And in the 1880s and 1890s the spirit of populism and the empiricism of the social sciences drove American newspapers to a newfound obsession with facts.
  • The newspapers of the 1880s and 1890s were full of stunts and scandals and crusades, even as they defended their accuracy. “Facts, facts piled up to the point of dry certitude was what the American people really wanted,” wrote the reporter Ray Stannard Baker. Julius Chambers said that writing for the New York Herald involved “Facts; facts; nothing but facts. So many peas at so much a peck; so much molasses at so much a quart.”
  • Ballot reform, far from keeping money out of elections, had ushered more money into elections, along with a new political style: using piles of money to sell a candidate’s personality, borrowing from the methods of business by using mass advertising and education, slogans and billboards. McKinley ran a new-style campaign; Bryan ran an old-style campaign. Bryan barnstormed all over the country: he gave some six hundred speeches to five million people in twenty-seven states and traveled nearly twenty thousand miles.
  • But McKinley’s campaign coffers were fuller: Republicans spent $7 million; Democrats, $300,000. John D. Rockefeller alone provided the GOP with a quarter of a million dollars. McKinley’s campaign manager, Cleveland businessman Mark Hanna, was nearly buried in donations from fellow businessmen. He used that money to print 120 million pieces of campaign literature. He hired fourteen hundred speakers to stump for McKinley; dubbing the populists Popocrats, they agitated voters to a state of panic.108 As Mary Lease liked to say, money elected McKinley.
  • Turner, born in Wisconsin in 1861, was one of the first Americans to receive a doctorate in history. At the exposition, he delivered his remarks before the American Historical Association, an organization that had been founded in 1884 and incorporated by an act of Congress in 1889 “for the promotion of historical studies, the collection and preservation of historical manuscripts and for kindred purposes in the interest of American history and of history in America.”110
  • like journalists, historians borrowed from the emerging social sciences, relying on quantitative analysis to understand how change happens. Where George Bancroft, in his History of the United States, had looked for explanations in the hand of providence, Frederick Jackson Turner looked to the census.
  • The difference between Turner’s methods and Bancroft’s signaled a profound shift in the organization of knowledge, one that would have lasting consequences for the relationship between the people and the state and for civil society itself. Like Darwinism, the rise of the social sciences involved the abdication of other ways of knowing, and, indirectly, contributed to the rise of fundamentalism.
  • Across newly defined academic disciplines, scholars abandoned the idea of mystery—the idea that there are things known only by God—in favor of the claim to objectivity, a development sometimes called “the disenchantment of the world.”111 When universities grew more secular, religious instruction became confined to divinity schools and theological seminaries.
  • theologian at the University of Chicago’s divinity school defined modernism as “the use of scientific, historical, and social methods in understanding and applying evangelical Christianity to the needs of living persons.”112 Increasingly, this is exactly what evangelicals who eventually identified themselves as fundamentalists found objectionable.
  • Influenced by both Jefferson and Darwin, Turner saw the American frontier as the site of political evolution, beginning with the “savages” of a “wilderness,” proceeding to the arrival of European traders, and continuing through various forms of settlement, through the establishment of cities and factories, “the evolution of each into a higher stage,” and culminating in the final stage of civilization: capitalism and democracy.114
  • “American democracy is fundamentally the outcome of the experiences of the American people in dealing with the West,” by which he meant the experience of European immigrants to the United States in defeating its native peoples, taking possession of their homelands, and erecting there a civilization of their own. This, for Turner, was the story of America and the lesson of American history: evolution.116
  • Douglass, who, as the former U.S. ambassador to Haiti, had represented the nation of Haiti at the Haitian pavilion, was the only eminent African American with a role at the fair, whose program had been planned by a board of 208 commissioners, all white.117 There were, however, black people at the fair: on display. In the Hall of Agriculture, old men and women, former slaves, sold miniature bales of cotton, souvenirs, while, in a series of exhibits intended to display the Turnerian progress of humankind from savagery to civilization, black Americans were posed in a fake African village. “As if to shame the Negro,” Douglass wrote, they “exhibit the Negro as a repulsive savage.”118
  • “A ship at anchor, with halliards broken, sails mildewed, hull empty, her bottom covered with sea-weed and barnacles, meets no resistance,” Douglass said that day, turning the idea of a ship of state to the problem of Jim Crow. “But when she spread her canvas to the breeze and sets out on her voyage, turns prow to the open sea, the higher shall be her speed, the greater shall be her resistance. And so it is with the colored man.”
  • He paused to allow his listeners to conjure the scene, and its meaning, of a people struggling against the sea. “My dear young friends,” Douglass closed. “Accept the inspiration of hope. Imitate the example of the brave mariner, who, amid clouds and darkness, amid hail, rain and storm bolts, battles his way against all that the sea opposes to his progress and you will reach the goal of your noble ambition in safety.”124
  • The majority in Plessy v. Ferguson asserted that separation and equality were wholly separate ideas. “We consider the underlying fallacy of the plaintiff’s argument to consist in the assumption that the enforced separation of the two races stamps the colored race with a badge of inferiority. If this be so, it is not by reason of anything found in the act, but solely because the colored race chooses to put that construction upon it.” The resulting legal principle—that public accommodations could be “separate but equal”—would last for more than half a century.
  • The sole dissenter, John Marshall Harlan, objecting to the establishment of separate classes of citizens, insisted that the achievement of the United States had been the establishment, by amendment, of a Constitution that was blind to race. “Our constitution is color-blind, and neither knows nor tolerates classes among citizens,” Harlan wrote, and it is therefore a plain violation of the Constitution “for a state to regulate the enjoyment by citizens of their civil rights solely upon the basis of race.”
  • What all these laws had in common, Harlan argued, was that they were based on race. And yet a war had been fought and won to establish that laws in the United States could not be based on race; nor could citizenship be restricted by race. The court’s opinion in Plessy, Harlan warned, was so dreadfully in error as to constitutional principles that “the judgment this day rendered will, in time, prove to be quite as pernicious as the decision made by this tribunal in the Dred Scott Case.”128 This prediction proved true.
  • Four centuries had passed since continents, separated by oceans, had met again. A century had passed since Jefferson had declared all men equal. Three decades had passed since the Fourteenth Amendment had declared all persons born or naturalized in the United States to be citizens.
  • And now the Supreme Court ruled that those who would set aside equality in favor of separation had not violated the nation’s founding truths. In one of the most wrenching tragedies in American history—a chronicle not lacking for tragedy—the Confederacy had lost the war, but it had won the peace.
  • Lippmann started out as a socialist, when even mentioning the masses hinted at socialism; The Masses was the name of a socialist monthly, published in New York, and, especially after the Russian Revolution of 1917, which brought the Bolshevists to power (“bol’shinstvo” means “the majority”), “the masses” sounded decidedly Red.
  • But Lippmann soon began to write about the masses as “the bewildered herd,” unthinking and instinctual, and as dangerous as an impending stampede. For Lippmann, and for an entire generation of intellectuals, politicians, journalists, and bureaucrats who styled themselves Progressives—the term dates to 1910—the masses posed a threat to American democracy.
  • This change was wrought in the upheaval of the age. In the years following the realigning election of 1896, everything seemed, suddenly, bigger than before, more crowded, and more anonymous: looming and teeming. Even buildings were bigger: big office buildings, big factories, big mansions, big museums. Quantification became the only measure of value: how big, how much, how many.
  • To fight monopolies, protect the people, and conserve the land, the federal government grew bigger, too; dozens of new federal agencies were founded in this era,
  • “Mass” came to mean anything that involved a giant and possibly terrifying quantity, on a scale so great that it overwhelmed existing arrangements—including democracy. “Mass production” was coined in the 1890s, when factories got bigger and faster, when the number of people who worked in them skyrocketed, and when the men who owned them got staggeringly rich.
  • “Mass migration” dates to 1901, when nearly a million immigrants were entering the United States every year, “mass consumption” to 1905, “mass consciousness” to 1912. “Mass hysteria” had been defined by 1925 and “mass communication” by 1927, when the New York Times described the radio as “a system of mass communication with a mass audience.”3
  • And the masses themselves? They formed a mass audience for mass communication and had a tendency, psychologists believed, to mass hysteria—the political stampede—posing a political problem unanticipated by James Madison and Thomas Jefferson,
  • To meet that challenge in what came to be called the Progressive Era, activists, intellectuals, and politicians campaigned for and secured far-reaching reforms that included municipal, state, and federal legislation.
  • Their most powerful weapon was the journalistic exposé. Their biggest obstacle was the courts, which they attempted to hurdle by way of constitutional amendments. Out of these campaigns came the federal income tax, the Federal Reserve Bank, the direct election of U.S. senators, presidential primaries, minimum-wage and maximum-hour laws, women’s suffrage, and Prohibition.
  • And all of what Progressives accomplished in the management of mass democracy was vulnerable to the force that so worried the unrelenting Walter Lippmann: the malleability of public opinion, into mass delusion.
  • Progressives championed the same causes as Populists, and took their side in railing against big business, but while Populists generally wanted less government, Progressives wanted more, seeking solutions in reform legislation and in the establishment of bureaucracies, especially government agencies.6
  • Populists believed that the system was broken; Progressives believed that the government could fix it. Conservatives, who happened to dominate the Supreme Court, didn’t believe that there was anything to fix but believed that, if there was, the market would fix it. Notwithstanding conservatives’ influence in the judiciary, Progressivism spanned both parties.
  • Woodrow Wilson himself admitted, “When I sit down and compare my views with those of a Progressive Republican I can’t see what the difference is.”7
  • Much that was vital in Progressivism grew out of Protestantism, and especially out of a movement known as the Social Gospel, adopted by almost all theological liberals and by a large number of theological conservatives,
  • The Social Gospel movement was led by seminary professors—academic theologians who accepted the theory of evolution, seeing it as entirely consistent with the Bible and evidence of a divinely directed, purposeful universe; at the same time, they fiercely rejected the social Darwinism of writers like Herbert Spencer, the English natural scientist who coined the phrase “the survival of the fittest” and used the theory of evolution to defend all manner of force, violence, and oppression.
  • argued that fighting inequality produced by industrialism was an obligation of Christians: “We must make men believe that Christianity has a right to rule this kingdom of industry, as well as all the other kingdoms of this world.”9 Social Gospelers brought the zeal of abolitionism to the problem of industrialism.
  • In 1908, Methodists wrote a Social Creed and pledged to fight to end child labor and to promote a living wage. It was soon adopted by the thirty-three-member Federal Council of Churches, which proceeded to investigate a steelworkers’ strike in Bethlehem, ultimately taking the side of the strikers.10
  • Washington, in the debate over the annexation of the Philippines, Americans revisited unsettled questions about expansion that had rent the nation during the War with Mexico and unsettled questions about citizenship that remained the unfinished business of Reconstruction. The debate also marked the limits of the Progressive vision: both sides in this debate availed themselves, at one time or another, of the rhetoric of white supremacy. Eight million people of color in the Pacific and the Caribbean, from the Philippines to Puerto Rico, were now part of the United States, a nation that already, in practice, denied the right to vote to millions of its own people because of the color of their skin.
  • “You are undertaking to annex and make a component part of this Government islands inhabited by ten millions of the colored race, one-half or more of whom are barbarians of the lowest type,” said Ben Tillman, a one-eyed South Carolina Democrat who’d boasted of having killed black men and expressed his support for lynch mobs. “It is to the injection into the body politic of the United States of that vitiated blood, that debased and ignorant people, that we object.”
  • Tillman reminded Republicans that they had not so long ago freed slaves and then “forced on the white men of the South, at the point of the bayonet, the rule and domination of those ex-slaves. Why the difference? Why the change? Do you acknowledge that you were wrong in 1868?”14
  • The war that began in Cuba in 1898 and was declared over in the Philippines in 1902 dramatically worsened conditions for people of color in the United States, who faced, at home, a campaign of terrorism. Pro-war rhetoric, filled with racist venom, only further incited American racial hatreds. “If it is necessary, every Negro in the state will be lynched,” the governor of Mississippi pledged in 1903.
  • By one estimate, someone in the South was hanged or burned alive every four days. The court’s decision in Plessy v. Ferguson meant that there was no legal recourse to fight segregation, which grew more brutal with each passing year.
  • Nor was discrimination confined to the South. Cities and counties in the North and West passed racial zoning laws, banning blacks from middle-class communities. In 1890, in Montana, blacks lived in all fifty-six counties in the state; by 1930, they'd been confined to just eleven. In Baltimore, blacks couldn't buy houses on blocks where whites were a majority.
  • In 1917, in Buchanan v. Warley, the Supreme Court availed itself of the Fourteenth Amendment not to guarantee equal protection for blacks but to guarantee what the court had come to understand as the “liberty of contract”—the liberty of businesses to discriminate.16
  • A generation earlier, he’d have become a preacher, like his father, but instead he became a professor of political science.23 In the academy and later in the White House, he dedicated himself to the problem of adapting a Constitution written in the age of the cotton gin to the age of the automobile.
  • “We have grown more and more inclined from generation to generation to look to the President as the unifying force in our complex system, the leader both of his party and of the nation. To do so is not inconsistent with the actual provisions of the Constitution; it is only inconsistent with a very mechanical theory of its meaning and intention.” A president’s power, Wilson concluded, is virtually limitless: “His office is anything he has the sagacity and force to make it.”24
  • the U.S. Supreme Court overruled much Progressive labor legislation. The most important of these decisions came in 1905. In a 5–4 decision in Lochner v. New York, the U.S. Supreme Court voided a state law establishing that bakers could work no longer than ten hours a day, six days a week, on the ground that the law violated a business owner’s liberty of contract, the freedom to forge agreements with his workers, something the court’s majority said was protected under the Fourteenth Amendment.
  • The laissez-faire conservatism of the court was informed, in part, by social Darwinism, which suggested that the parties in disputes should be left to battle it out, and if one side had an advantage, even so great an advantage as a business owner has over its employees, then it should win.
  • In a dissenting opinion in Lochner, Oliver Wendell Holmes accused the court of violating the will of the people. “This case is decided upon an economic theory which a large part of the country does not entertain,” he began. The court, he said, had also wildly overreached its authority and had carried social Darwinism into the Constitution. “A Constitution is not intended to embody a particular economic theory,” Holmes wrote. “The Fourteenth Amendment does not enact Mr. Herbert Spencer’s Social Statics.”
  • Wilson pointed out that the Constitution, written before mass industrialization, couldn’t be expected to have anticipated it, and couldn’t solve the problems industrialization had created, unless the Constitution were treated like a living thing that, like an organism, evolved.
  • Critics further to the left argued that the courts had become an instrument of business interests. Unions, in fact, often failed to support labor reform legislation, partly because they expected it to be struck down by the courts as unconstitutional, and partly because they wanted unions to provide benefits to their members, which would be an argument for organizing.
  • conservatives insisted that the courts were right to protect the interests of business and that either market forces would find a way to care for sick, injured, and old workers, or (for social Darwinists) the weakest, who were not meant to thrive, would wither and die.
  • “No other social movement in modern economic development is so pregnant with benefit to the public,” wrote the editor of the Journal of the American Medical Association. “At present the United States has the unenviable distinction of being the only great industrial nation without compulsory health insurance,” the Yale economist Irving Fisher pointed out in 1916.36 It would maintain that unenviable distinction for a century.
  • In California, the legislature passed a constitutional amendment providing for universal health insurance. But when it was put on the ballot for ratification, a federation of insurance companies took out an ad in the San Francisco Chronicle warning that it “would spell social ruin in the United States.” Every voter in the state received in the mail a pamphlet with a picture of the kaiser and the words “Born in Germany. Do you want it in California?” The measure was defeated. Opponents called universal health insurance “UnAmerican, Unsafe, Uneconomic, Unscientific, Unfair and Unscrupulous.”
  • “Scientific management has no place for a bird that can sing and won’t sing,” answered Taylor. “We are not . . . dealing with horses nor singing birds,” Wilson told Taylor. “We are dealing with men who are a part of society and for whose benefit society is organized.
  • Jim Crow thrived because, after the end of Reconstruction in 1877, reformers who had earlier fought for the cause of civil rights abandoned it for the sake of forging a reunion between the states and the federal government and between the North and the South. This wasn’t Wilson’s doing; this was the work of his generation, the work of the generation that came before him, and the work of the generation that would follow him, an abdication of struggle, an abandonment of justice.
  • War steered the course of American politics like a gale-force wind. The specter of slaughter undercut Progressivism, suppressed socialism, and produced anticolonialism. And, by illustrating the enduring wickedness of humanity and appearing to fulfill prophecies of apocalypse as a punishment for the moral travesty of modernism, the war fueled fundamentalism.
  • Bryan’s difficulty was that he saw no difference between Darwinism and social Darwinism, but it was social Darwinism that he attacked, the brutality of a political philosophy that seemed to believe in nothing more than the survival of the fittest, or what Bryan called “the law of hate—the merciless law by which the strong crowd out and kill the weak.”77
  • Germany was the enemy, the same Germany whose model of education had secularized American colleges and universities, which were now teaching eugenics, sometimes known as the science of human betterment, calling for the elimination from the human race of people deemed unfit to reproduce on the basis of their intelligence, criminality, or background.
  • Nor was this academic research without consequence. Beginning in 1907, with Indiana, two-thirds of American states passed forced sterilization laws.
  • In 1916, Madison Grant, the president of the Museum of Natural History in New York, who had degrees from Yale and Columbia, published The Passing of the Great Race; Or, the Racial Basis of European History, a “hereditary history” of the human race, in which he identified northern Europeans (the “blue-eyed, fair-haired peoples of the north of Europe” that he called the “Nordic race”) as genetically superior to southern Europeans (the “dark-haired, dark-eyed” people he called “the Alpine race”) and lamented the presence of “swarms of Jews” and “half-breeds.” In the United States, Grant argued, the Alpine race was overwhelming the Nordic race, threatening the American republic, since “democracy is fatal to progress when two races of unequal value live side by side.”79
  • fundamentalists were, of course, making an intellectual argument, if one that not many academics wanted to hear. In 1917, William B. Riley, who, like J. Frank Norris, had trained at the Southern Baptist Theological Seminary, published a book called The Menace of Modernism, whose attack on evolution included a broader attack on the predominance in public debate of liberal faculty housed at secular universities—and the silencing of conservative opinion.
  • The horror of the war fueled the movement, convincing many evangelicals that the growing secularization of society was responsible for this grotesque parade of inhumanity: mass slaughter. “The new theology has led Germany into barbarism,” one fundamentalist argued in 1918, “and it will lead any nation into the same demoralization.”
  • “If my re-election as President depends upon my getting into war, I don’t want to be President,” Wilson said privately. “He kept us out of war” became his campaign slogan, and when Theodore Roosevelt called that an “ignoble shirking of responsibility,” Wilson countered, “I am an American, but I do not believe that any of us loves a blustering nationality.”
  • Wilson had in fact pledged not to make the world democratic, or even to support the establishment of democratic institutions everywhere, but instead to establish the conditions of stability in which democracy was possible.
  • nearly five million were called to serve. How were they to be persuaded of the war’s cause? In a speech to new recruits, Wilson’s new secretary of state, Robert Lansing, ventured an explanation. “Were every people on earth able to express their will, there would be no wars of aggression and, if there were no wars of aggression, then there would be no wars, and lasting peace would come to this earth,” Lansing said, stringing one conditional clause after another. “The only way that a people can express their will is through democratic institutions,” Lansing went on. “Therefore, when the world is made safe for democracy . . . universal peace will be an accomplished fact.”88
  • Wilson, the political scientist, tried to earn the support of the American people with an intricate theory of the relationship between democracy and peace. It didn’t work. To recast his war message and shore up popular support, he established a propaganda department,
  • Social scientists called the effect produced by wartime propaganda “herd psychology”; the philosopher John Dewey called it the “conscription of thought.”89
  • To suppress dissent, Congress passed a Sedition Act in 1918. Not since the Alien and Sedition Acts of 1798 had Congress so brazenly defied the First Amendment. Fewer than two dozen people had been arrested under the 1798 Sedition Act. During the First World War, the Justice Department charged more than two thousand Americans with sedition and convicted half of them. Appeals that went to the Supreme Court failed.
  • “If we want real peace,” Du Bois wrote, “we must extend the democratic ideal to the yellow, brown, and black peoples.” But after the United States entered the war, Creel called thirty-one black editors and publishers to a conference in Washington and warned them about “Negro subversion.”
  • Du Bois asked black men who could not vote in the United States to give their lives to make the world “safe for democracy” and asked black people to hold off on fighting against lynchings, whose numbers kept rising.91
  • Wilson signed a tax bill, raising taxes on incomes, doubling a tax on corporate earnings, eliminating an exemption for dividend income, and introducing an estate tax and a tax on excess profits. Rates for the wealthiest Americans rose from 2 percent to 77 percent, but most people paid no tax at all (80 percent of the revenue was drawn from the income of the wealthiest 1 percent of American families).
  • Wars, as ever, expanded the powers of the state. It rearranged the relationship between the federal government and business, establishing new forms of cooperation, oversight, and regulation that amounted to erecting a welfare state for business owners.
  • As the war drew to a close, the reckoning began. American losses were almost trivial compared to the staggering losses in European nations. Against America’s 116,000 casualties, France lost 1.6 million lives, Britain 800,000, and Germany 1.8 million. Cities across Europe lay in ashes; America was untouched. Europe, composed of seventeen countries before the war, had splintered into twenty-six, all of them deeply in debt, and chiefly to Americans.
  • Before the war, Americans owed $3.7 billion to foreigners; after the war, foreigners owed $12.6 billion to Americans. Even the terrifying influenza epidemic of 1918, which took 21 million lives worldwide, claimed the lives of only 675,000 Americans. The war left European economies in ruins, America’s thriving. In the United States, steel production rose by a quarter between 1913 and 1920; everywhere else, it fell by a third.98 The Armistice came on November 11, 1918.
  • Wilson left a lasting legacy: his rhetoric of self-determination contributed to a wave of popular protests in the Middle East and Asia, including a revolution in Egypt in 1919; made the nation-state the goal of stateless societies; and lies behind the emergence and force of anticolonial nationalism.100
  • Thirty black men were lynched in 1917, twice as many the next year, and in 1919, seventy-six, including ten veterans, some still wearing their uniforms, having fought, some people thought, the wrong war.101
  • IN 1922, when Walter Lippmann turned thirty-two, he wrote a book called Public Opinion, in which he concluded that in a modern democracy the masses, asked to make decisions about matters far removed from their direct knowledge, had been asked to do too much. “Decisions in a modern state tend to be made by the interaction, not of Congress and the executive, but of public opinion and the executive,” he’d once observed.108 Mass democracy can’t work, Lippmann argued, because the new tools of mass persuasion—especially mass advertising—meant that a tiny minority could very easily persuade the majority to believe whatever it wished them to believe.
  • The best hope for mass democracy might have seemed to be the scrupulously and unfailingly honest reporting of news, but this, Lippmann thought, was doomed to fall short, because of the gap between facts and truth.
  • Reporters chronicle events, offering facts, but “they cannot govern society by episodes, incidents, and eruptions,” he said.109 To govern, the people need truth, sense out of the whole, but people can’t read enough in the morning paper or hear enough on the evening news to turn facts into truth when they’re driven like dray horses all day.
lenaurick

Your Hitler analogy is wrong, and other complaints from a history professor - Vox - 0 views

  • Recently, writers and pundits have been on a quest to find historical analogs for people, parties, and movements in our own times. Trump is like Hitler, Mussolini, and Napoleon; the imploding GOP getting rid of one ill-suited candidate after another is like Robespierre in the French Revolution, who stuck the executioner in the guillotine because there was no one left to behead. The late Supreme Court Justice Antonin Scalia was like Robert E. Lee.
  • Oh, and how Obama was like Hitler? But that's so 2015.
  • Really? Trump is like Hitler? The egotistical buffoon who sees himself as his own primary foreign adviser and changes his views on abortion three times in one day is like the despicable human being who oversaw the death of 6 million Jews? The Hitler comparison has become so common over the years that it has its own probability factor, known as Godwin's Law.
  • ...25 more annotations...
  • History is alive, and she has a lot to teach us. I quote William Faulkner (what history professor hasn't?) who famously declared: "The past is never dead. It is not even past."
  • History is not a deck of cards from which to randomly draw for comparative purposes. It is an immense repository of human thinking, doing, and being that can and should help us be slightly less narrow-minded and shortsighted than our forefathers and foremothers sometimes were. Good uses of history require more substance, unpacking, and analysis than a few quick sound bites can provide.
  • People aren't always sure what to do with history. But the laziest use is to make facile comparisons between then and now, this person and that.
  • Mostly these comparisons are shallow and not rooted in any depth of meaningful knowledge of the past. They rely on caricatures and selective historical tidbits in a way that, indeed, just about anyone can be compared to anyone else.
  • These comparisons tend to come in two forms: those meant to elevate, and those meant to denigrate. Both use historical comparisons to accomplish their goals
  • By associating their 21st-century political agendas with the 18th-century American rebels, modern Tea Partiers collapse the distance between then and now in order to legitimize their cause.
  • Slavery is another popular go-to comparison. But ... sorry, Kesha: Recording contracts are not like slavery. And Republicans: neither is the national debt, Obamacare, income tax, or gun control. Or the TSA, global warming, or Affirmative Action.
  • In fact, presidential hopeful Ben Carson's comparisons to slavery were so common that he was parodied as suggesting that even buying a Megabus ticket is like slavery (which, sadly, is almost believable).
  • History as critique, honest assessment, and self-examination. Thinking long and hard about the treatment of Native Americans, past and present. American imperialism. Slavery, and its intertwining with the rise of modern capitalism. Xenophobia. Suppression of women's rights. These stories need to be told and retold, painful as they may be.
  • People who make historical comparisons don't actually believe that Ted Cruz is like Robespierre. But then why bother? The reason there aren't longer expositions of how exactly Trump is like Hitler is because, well, very quickly the analogy would break down. Male ... popular ... racist ... oh, never mind. These analogies are usually politically motivated, shallow, and intended to shock or damn. It's just lazy, and more politics as usual.
  • When we say that Trump or Obama is like Hitler, we slowly water down our actual knowledge of the very historical things we are using for comparison. When people link their frustration with the Affordable Care Act or gun control to slavery, they greatly diminish the historical magnitude and importance of a horrific historical reality that irreversibly altered the lives of 10 to 12 million enslaved Africans who were forced across the Atlantic to the Americas between the 15th and 19th centuries. Scholars speak of a "social death" that came from the incredible violence, emotional damage, and physical dislocation that took place during the Middle Passage and beyond.
  • The GOP's current crisis mirrors the French Revolution? Ted Cruz is like Robespierre? Please. You are granting way too much historical importance to the self-implosion of a political movement that rose to power over the past 30 years on a platform of moralistic piety, militarism, anti-abortion, and xenophobia.
  • One charitable reading of why people make these comparisons is that they fear we will end up in unpleasant and unfortunate situations that are like past circumstances. Behind the charge of Trump being a fascist is the fear that Trump, if elected president, will rule unilaterally in a way that oppresses certain segments of the population.
  • The only problem is that history really doesn't repeat itself. If anything, it remixes themes, reprises melodies, and borrows nasty racist ideologies. There are no exact historical analogs to today's politicians — jackasses or saviors.
  • "History doesn't repeat itself. But it rhymes." And it is in the rhyming that history still plays an important role.
  • Historian William Bouwsma once noted that the past is not the "private preserve of professional historians." Rather, he argued that history is a public utility, like water and electricity. If Bouwsma is right, the kind of history most people want is like water: clear, available at the turn of a knob, and easily controllable. But really, history is more like electricity shooting down the string of Franklin's fabled kite: wild, with alternating currents and unexpected twists, offshoots, and end results.
  • Voting for Trump won't bring about an American Holocaust, but it could usher in a new yet rhyming phase of history in which US citizens and immigrants from certain backgrounds are targeted and legally discriminated against, have their civil liberties curtailed, and even get forcibly relocated into "safe" areas. Hard to imagine?
  • American history, as Jon Stewart brilliantly reminded us, is at its core a series of events in which the current dominant group (no matter how recently established) dumps on the newest immigrant group. Catholics. Jews. Irish. Asians. They've all been in the crosshairs. All of them have been viewed as just as dangerous as the current out-group: Muslims.
  • Flippant comparisons also belittle and ignore the way that historical trauma creates immense ongoing psychological pain and tangible collective struggle that continues through generations, even up through the present.
  • If simplistic comparisons cheapen the past and dumb down our public discourse, using the past to understand how we got to where we are today is actually productive. It increases knowledge, broadens our perspective, and helps connect dots over time.
  • If Americans truly want to understand this GOP moment, we need not look to revolutionary France, but to the circa-1970s US, when the modern Republican Party was born. I know, Republican pundits like to call themselves the "party of Lincoln," but that is mostly nonsense
  • To compare Trump to Napoleon or Hitler is to make a vacuous historical comparison that obscures more than it reveals. But it is actually constructive to try to understand Trump as a fairly logical outcome of some of the cultural impulses that drove the moral majority and the religious right in the late 1970s and early 1980s. It tells us how we got here and, potentially, how to move forward.
  • Done well, history gives us perspective; it helps us gain a longer view of things. Through an understanding of the past we come to see trends over time, outcomes, causes, effects. We understand that stories and individual lives are embedded in larger processes. We learn of the boundless resilience of the human spirit, along with the depressing capacity for evil — even the banal variety — of humankind.
  • The past warns us against cruelty, begs us to be compassionate, asks that we simply stop and look our fellow human beings in the eyes.
  • Why, then, is Obama-Washington still on my office wall? Mostly to remind me of the irony of history. Of its complexity. That the past might not be past but is also not the present. It is a warning against mistaking progression in years with progress on issues. It is a reminder that each one of us plays an important part in the unfolding of history.
Javier E

How the Internet Is Like a Dying Star - 0 views

  • We are experiencing the same problems and having the same arguments. It’s all leading to a pervasive feeling, especially among younger people, that our systems in the United States (including our system of government) “are no longer able to meet the challenges our country is facing.”
  • The internet, as a mediator of human interactions, is not a place, it is a time. It is the past. I mean this in a literal sense. The layers of artifice that mediate our online interactions mean that everything that comes to us online comes to us from the past—sometimes the very recent past, but the past nonetheless.
  • Sacasas asks us to revise the notion of real-time communications online, and to instead view our actions as “inscriptions,” or written and visual records. Like stars in the galaxy, our inscriptions seem to twinkle in the present, but their light is actually many years old.
  • ...22 more annotations...
  • “Because we live in the past when we are online,” Sacasas suggests, “we will find ourselves fighting over the past.”
  • my hunch is that people feel stuck or move on because online, these events feel like things that have happened, rather than something that is happening.
  • “What we’re focused on is not the particular event or movement before us, but the one right behind us,”
  • “As we layer on these events, it becomes difficult for anything to break through. You’re trying to enter the information environment and the debate, and you find layer upon layer of abstraction over the initial point of conflict. You find yourself talking about what people are saying about the thing, instead of talking about the thing. We’re caking layers of commentary over the event itself and the event fades.” This is, if you ask me, a decent description of the last five years of news cycles.
  • So, what’s changed? Why do we feel more stuck now?
  • “I think it also has to do with the proportion of one’s daily experience to dispatches from the past,” Sacasas said. Pre-internet, “the totality of my day wasn't enclosed by this experience of media artifacts coming to me.”
  • the smartphone-bound, reasonably-but-not-terminally online people—the amount they spend engaged with the recent past has increased considerably, to the point that some are enclosed in this online world and develop a disordered relationship to time.
  • Constantly absorbing and commenting on things that have just happened sounds to me like a recipe for feeling powerless.
  • “That feeling of helplessness comes out of the fact that all our agency is being channeled through these media,” he said. “We have these events that are ponderously large, like climate change or gun control, and to view them only through the lens of what happened or the abstraction of what people are saying strips away the notion of our agency and makes it all feel so futile.”
  • the social-media platforms we live on push us toward contribution, and they make it feel necessary. Yet what is the sum total of these contributions? “If I'm cynical,” Sacasas said, “what I think it generates is something akin to influencer culture. It creates people who will make money off of channeling that attention—for better or for ill. Everyone else is stuck watching the show, feeling like we’re unable to effectively change the channel or change our circumstances.”
  • ubiquitous connectivity and our media environments naturally lend themselves toward an influencer-and-fandom dynamic. If the system is built to inspire more and more layers of commentary, then that system will privilege and reward people who feed it
  • On an internet that democratizes publishing, what this might mean is that all media takes on the meta-commentary characteristics of political or sports talk radio.
  • When the Depp-Heard trial began gaining traction online in April, Internet users around the world recognized a fresh opportunity to seize and monetize the attention. Christopher Orec, a 20-year-old content creator in Los Angeles, has posted a dozen videos about the trial to his more than 1.4 million followers on Instagram across several pages. “Personally, what I’ve gained from it is money as well as exposure from how well the videos do,” he said. You can “go from being a kid in high school and, if you hop on it early, it can basically change your life,” Orec said. “You can use those views and likes and shares that you get from it, to monetize and build your account and make more money from it, meet more people and network.”
  • if you were going to design a nightmare scenario, it might look a bit like what is described in this Washington Post story from last Thursday:
  • Like the Depp-Heard coverage, the forces that Sacasas describes can be deeply cynical and destructive. They’re also almost always exhausting for those of us consuming them
  • Examining and discussing and understanding the past is important, and our technologies are enormously helpful in this respect.
  • Sacasas compared the way our media ecosystem works—and all these feedback loops—to a novelty finger trap. “Almost every action generates more difficult conditions—to struggle is to feed the thing that’s keeping you bogged down.”
  • As politicians—especially those on the far right—transition into full time influencers, they no longer need to govern even reasonably effectively to gain power. They don’t need to show what they’ve done for their constituents. Simply culture warring—posting—is enough. The worse the post, the more attention it gets, and the more power they accrue.
  • There's a reason Marjorie Taylor Greene raised $9 million and Sarah Palin has only raised $600,000. MTG has recognized something Palin used to know. Her job is to say something terrible every day so we do all her viral marketing for her.
  • One outcome of elected officials adopting the influencer model is a politics that is obsessed with, and stuck in, the past. I don’t just mean a focus on making America “great again,” but a politics that is obsessed with relitigating its recent past.
  • we are forever talking about Hillary’s emails or Hunter Biden’s laptop or Merrick Garland’s thwarted Supreme Court seat or the legitimacy of the previous election.
  • How do we break the cycle? Is silence our best weapon to starve the attention? That feels wrong. I don’t have answers, but Sacasas has given me a valuable guiding question: How do we train our attention on our present and future, when so much of our life is spent ensconced in dispatches from the recent past?
Javier E

Why Study History? (1985) | AHA - 0 views

  • Isn't there quite enough to learn about the world today? Why add to the burden by looking at the past?
  • Historical knowledge is no more and no less than carefully and critically constructed collective memory. As such it can both make us wiser in our public choices and more richly human in our private lives.
  • Collective memory is similar, though its loss does not immediately paralyze everyday private activity. But ignorance of history (that is, absent or defective collective memory) does deprive us of the best available guide for public action, especially in encounters with outsiders
  • ...37 more annotations...
  • Without individual memory, a person literally loses his or her identity, and would not know how to act in encounters with others. Imagine waking up one morning unable to tell total strangers from family and friends!
  • Often it is enough for experts to know about outsiders, if their advice is listened to. But democratic citizenship and effective participation in the determination of public policy require citizens to share a collective memory, organized into historical knowledge and belief
  • This value of historical knowledge obviously justifies teaching and learning about what happened in recent times, for the way things are descends from the way they were yesterday and the day before that
  • in fact, institutions that govern a great deal of our everyday behavior took shape hundreds or even thousands of years ago
  • Only an acquaintance with the entire human adventure on earth allows us to understand these dimensions of contemporary reality.
  • Memory is not something fixed and forever. As time passes, remembered personal experiences take on new meanings.
  • Collective memory is quite the same. Historians are always at work reinterpreting the past, asking new questions, searching new sources and finding new meanings in old documents in order to bring the perspective of new knowledge and experience to bear on the task of understanding the past.
  • what we know and believe about history is always changing. In other words, our collective, codified memory alters with time just as personal memories do, and for the same reasons.
  • skeptics are likely to conclude that history has no right to take student time from other subjects. If what is taught today is not really true, how can it claim space in a crowded school curriculum?
  • what if the world is more complicated and diverse than words can ever tell? What if human minds are incapable of finding neat pigeonholes into which everything that happens will fit?
  • What if we have to learn to live with uncertainty and probabilities, and act on the basis of the best guesswork we are capable of?
  • Then, surely, the changing perspectives of historical understanding are the very best introduction we can have to the practical problems of real life. Then, surely, a serious effort to understand the interplay of change and continuity in human affairs is the only adequate introduction human beings can have to the confusing flow of events that constitutes the actual, adult world.
  • it follows that study of history is essential for every young person.
  • Systematic sciences are not enough. They discount time, and therefore oversimplify reality, especially human reality.
  • Memory, indeed, makes us human. History, our collective memory, carefully codified and critically revised, makes us social, sharing ideas and ideals with others so as to form all sorts of different human groups
  • The varieties of history are enormous; facts and probabilities about the past are far too numerous for anyone to comprehend them all. Every sort of human group has its own history
  • Where to start? How bring some sort of order to the enormous variety of things known and believed about the past?
  • Early in this century, teachers and academic administrators pretty well agreed that two sorts of history courses were needed: a survey of the national history of the United States and a survey of European history.
  • This second course was often broadened into a survey of Western civilization in the 1930s and 1940s
  • But by the 1960s and 1970s these courses were becoming outdated, left behind by the rise of new kinds of social and quantitative history, especially the history of women, of Blacks, and of other formerly overlooked groups within the borders of the United States, and of peoples emerging from colonial status in the world beyond our borders.
  • much harder to combine old with new to make an inclusive, judiciously balanced (and far less novel) introductory course for high school or college students.
  • But abandoning the effort to present a meaningful portrait of the entire national and civilizational past destroyed the original justification for requiring students to study history
  • Competing subjects abounded, and no one could or would decide what mattered most and should take precedence. As this happened, studying history became only one among many possible ways of spending time in school.
  • The costs of this change are now becoming apparent, and many concerned persons agree that returning to a more structured curriculum, in which history ought to play a prominent part, is imperative.
  • three levels of generality seem likely to have the greatest importance for ordinary people.
  • First is family, local, neighborhood history
  • Second is national history, because that is where political power is concentrated in our time.
  • Last is global history, because intensified communications make encounters with all the other peoples of the earth increasingly important.
  • Other pasts are certainly worth attention, but are better studied in the context of a prior acquaintance with personal-local, national, and global history. That is because these three levels are the ones that affect most powerfully what all other groups and segments of society actually do.
  • National history that leaves out Blacks and women and other minorities is no longer acceptable; but American history that leaves out the Founding Fathers and the Constitution is not acceptable either. What is needed is a vision of the whole, warts and all.
  • the study of history does not lead to exact prediction of future events. Though it fosters practical wisdom, knowledge of the past does not permit anyone to know exactly what is going to happen
  • Consequently, the lessons of history, though supremely valuable when wisely formulated, become grossly misleading when oversimplifiers try to transfer them mechanically from one age to another, or from one place to another.
  • Predictable fixity is simply not the human way of behaving. Probabilities and possibilities-together with a few complete surprises-are what we live with and must learn to expect.
  • Second, as acquaintance with the past expands, delight in knowing more and more can and often does become an end in itself.
  • On the other hand, studying alien religious beliefs, strange customs, diverse family patterns and vanished social structures shows how differently various human groups have tried to cope
  • Broadening our humanity and extending our sensibilities by recognizing sameness and difference throughout the recorded past is therefore an important reason for studying history, and especially the history of peoples far away and long ago
  • For we can only know ourselves by knowing how we resemble and how we differ from others. Acquaintance with the human past is the only way to such self knowledge.
mortondo

The Importance of History - Gutenberg College - 0 views

  • But history does matter. It has been said that he who controls the past controls the future. Our view of history shapes the way we view the present, and therefore it dictates what answers we offer for existing problems. Let me offer a few examples to indicate how this might be true.
  • I must have a good understanding of the past in order to know how to deal wisely with these children in the present.
  • Whenever you return to the doctor, he or she pulls out a file which contains all the notes from past visits. This file is a history of your health. Doctors understand very clearly that the past matters.
  • ...13 more annotations...
  • “History is a story about the past that is significant and true.” This simple definition contains two words packed with meaning which must be understood in order to understand history.
  • Significance is determined by the historian. The historian sorts through the evidence and presents only that which, given his particular world view, is significant.
  • Therefore, the community of historians has a large say in deciding what about the past is significant.
  • But historians are just as much a part of society as anyone else, and we are all greatly influenced by those around us. As a result, the community of historians tends to share the same notion of significance as is held by society as a whole. Therefore, historians tend to tell stories which reflect the dominant values of the society in which they live.
  • If we refuse to listen to history, we will find ourselves fabricating a past that reinforces our understanding of current problems.
  • The past does not change, but history changes with every generation.
  • Two people can read the same document, however, and interpret it very differently.
  • When Columbus talked about his desire to evangelize the natives, Marshall took him very seriously; Marshall can identify with such desires and is willing to take Columbus at face value at this point. Zinn, on the other hand, does not take these same statements at face value; he dismisses them by saying, “He was full of religious talk. . . ” (p. 3), implying that Columbus was not sincere.
  • This raises the awkward question, “Can we learn from history?” If every historian reads his own world view into the past, can the past ever break through and speak to us?
  • The answer is “yes.” The past speaks in a voice audible to those who want to hear and to listen attentively. Establishing what really happened at a given point in history is much like establishing the guilt or innocence of an accused criminal in a courtroom trial.
  • But even though most histories are built on facts, the histories can be very different, even contradictory, because falsehoods can be constructed solely with facts.
  • People tend to underestimate the power of history. If I want to convince you that capitalism is evil, I could simply tell you that capitalism is evil, but this is likely to have little effect on the skeptical.
  • History teaches values. If it is true history, it teaches true values; if it is pseudo-history, it teaches false values. The history taught to our children is playing a role in shaping their values and beliefs—a much greater role than we may suspect.
Javier E

U.S. History Has Plenty of Good and Bad. Here's How to See Both. - WSJ - 0 views

  • I believe that most of us are willing to broaden our understanding of our country’s history to look at both the best and the worst. But we often can’t—not for intellectual reasons but because of unrecognized psychological ones. Understanding those psychological roadblocks is a formidable challenge. But it’s crucial to do so if we want to get past them.
  • Let’s begin with the four reasons our minds sometimes make it hard to have a more honest, nuanced view of our history.
  • First, our minds tend to play down our wrongdoing from the past.
  • ...20 more annotations...
  • Our minds are asymmetric judges, applying harsher moral judgment to present and future transgressions than past ones. It is as if the past becomes blurry. We even tend to blame the victim of a past event more than we blame the victim of a future one.
  • Third, our minds struggle with the negative emotions that our country’s complicated past gives rise to.
  • Research shows we are drawn to a sentimental form of history—nostalgia—which leads us to feel more loved, more protected, and even more competent in our ability to start and maintain relationships.
  • Nostalgia is often tied to the identities that we care most deeply about, such as our family or national identity. And, nostalgia is big business—in fashion, advertising, music and tourism, among other things.
  • Second, our minds tend to overplay sweet memories that favor our ancestors from the past.
  • When we learn about historical atrocities, particularly ones that expose our limited knowledge, contradict the narratives we believe, or implicate our own ancestors, we might experience shame, guilt, disbelief or anger. In response, we have a natural desire to pull away from the new knowledge and perhaps even refute it, rather than try to better understand it.
  • Fourth, our minds want to pick either a beautiful or a brutal narrative.
  • Contradictions, though, pocket our history, beginning with forefathers who had an extraordinary vision of equality, and simultaneously enslaved other humans
  • Our minds resist the paradoxes that characterize our country’s past. It’s so much less psychologically painful to pick one path than to grapple with both ideas at the same time.
  • Tools to use: While the past is in the past, we can address the psychological challenge, however formidable, in the present. We have tools that will help, and I anticipate (and hope) that our debates will take on more psychological nuance as we shift from arguments over whether to explore our history more fully to how to do it.
  • For example, research shows the importance of returning to our values again and again as a way of inoculating us from setbacks
  • The daily arguments over curfews or messy rooms or study habits can cause us to shut down (“Do whatever you want”) or double down (“I’m your parent and you’ll do what I say”). Instead, it’s helpful to remind ourselves and our children that a parent has three jobs—to teach them, to protect them and to love them. Just doing that can ground us, and enable us to stay engaged, resilient and calm.
  • Similarly, when we confront a historical event, it can help to reflect on questions like, "Which American ideals do you most value?" and, "How do you hope others see your country?"
  • You can even write out your responses, share them with others, and reread what you have written. Think of it as a values booster shot
  • Say, for instance, that you deeply value freedom. Keeping this value in your thoughts can help you notice the ways in which this country has delivered on the promise of freedom in important ways. But it also enables you to consider the disheartening realizations when those freedoms are not upheld.
  • Research by Wendy Smith and others shows that we are capable of embracing paradox, rather than rejecting it. It doesn't always come naturally. But we simply need to give ourselves permission to allow multiple truths to coexist.
  • In a paradox mind-set, we allow both of these things to be true. When both are true, we can challenge our either/or assumptions, and be more creative in finding solutions.
  • When you spot the paradox, allow both things to be true and observe if your mind shifts from solving the unsolvable puzzle (reconciling how can both of these things be true) to more deeply processing the knowledge that you may otherwise have pushed away. This is the greater resilience and creativity that comes with a paradox mind-set.
  • We simply need to accept that the formidable challenge will require us to be intentional in our approach.
  • In doing so, we become what I call “gritty patriots.” Psychologist Angela Duckworth defines grit as “passion and perseverance in pursuit of a meaningful, long-term goal.” Love of country is not something we are entitled to; it is something we work toward, with grit.
Javier E

On Grand Strategy (John Lewis Gaddis) - 0 views

  • Ordinary experience, he pointed out, is filled with “ends equally ultimate . . . , the realization of some of which must inevitably involve the sacrifice of others.” The choices facing us are less often between stark alternatives—good versus evil, for instance—than between good things we can’t have simultaneously. “One can save one’s soul, or one can found or maintain or serve a great and glorious State,” Berlin wrote, “but not always both at once.”
  • We resolve these dilemmas by stretching them over time. We seek certain things now, put off others until later, and regard still others as unattainable. We select what fits where, and then decide which we can achieve when. The process can be difficult: Berlin emphasized the “necessity and agony of choice.” But if such choices were to disappear, he added, so too would “the freedom to choose,” and hence liberty itself.24
  • only narratives can show dilemmas across time. It’s not enough to display choices like slivers on a microscope slide. We need to see change happen, and we can do that only by reconstituting the past as histories, biographies, poems, plays, novels, or films. The best of these sharpen and shade simultaneously: they compress what’s happening in order to clarify, even as they blur, the line between instruction and entertainment. They are, in short, dramatizations. And a fundamental requirement of these is never to bore.
  • ...74 more annotations...
  • When Thaddeus Stevens (Tommy Lee Jones) asks the president how he can reconcile so noble an aim with such malodorous methods, Lincoln recalls what his youthful years as a surveyor taught him: [A] compass . . . [will] point you true north from where you’re standing, but it’s got no advice about the swamps and deserts and chasms that you’ll encounter along the way. If in pursuit of your destination, you plunge ahead, heedless of obstacles, and achieve nothing more than to sink in a swamp . . . , [then] what’s the use of knowing true north?
  • The real Lincoln, as far as I know, never said any of this, and the real Berlin, sadly, never got to see Spielberg’s film. But Tony Kushner’s screenplay shows Fitzgerald’s linkage of intelligence, opposing ideas, and the ability to function: Lincoln keeps long-term aspirations and immediate necessities in mind at the same time. It reconciles Berlin’s foxes and hedgehogs with his insistence on the inevitability—and the unpredictability—of choice:
  • Whether we approach reality from the top down or the bottom up, Tolstoy seems to be saying, an infinite number of possibilities exist at an indeterminate number of levels, all simultaneously. Some are predictable, most aren’t, and only dramatization—free from the scholar’s enslavement to theory and archives—can begin to represent them.
  • what is “training,” as Clausewitz understands it? It’s being able to draw upon principles extending across time and space, so that you’ll have a sense of what’s worked before and what hasn’t. You then apply these to the situation at hand: that’s the role of scale. The result is a plan, informed by the past, linked to the present, for achieving some future goal.
  • I think he’s describing here an ecological sensitivity that equally respects time, space, and scale. Xerxes never had it, despite Artabanus’ efforts. Tolstoy approximated it, if only in a novel. But Lincoln—who lacked an Artabanus and who didn’t live to read War and Peace—seems somehow to have achieved it, by way of a common sense that’s uncommon among great leaders.
  • It’s worth remembering also that Lincoln—and Shakespeare—had a lifetime to become who they were. Young people today don’t, because society so sharply segregates general education, professional training, ascent within an organization, responsibility for it, and then retirement.
  • This worsens a problem Henry Kissinger identified long ago: that the “intellectual capital” leaders accumulate prior to reaching the top is all they’ll be able to draw on while at the top.37 There’s less time now than Lincoln had to learn anything new.
  • A gap has opened between the study of history and the construction of theory, both of which are needed if ends are to be aligned with means. Historians, knowing that their field rewards specialized research, tend to avoid the generalizations
  • Theorists, keen to be seen as social “scientists,” seek “reproducibility” in results: that replaces complexity with simplicity in the pursuit of predictability. Both communities neglect relationships between the general and the particular—between universal and local knowledge—that nurture strategic thinking.
  • concrete events in time and space—the sum of the actual experience of actual men and women in their relation to one another and to an actual three-dimensional, empirically experienced, physical environment—this alone contained the truth,
  • Collaboration, in theory, could have secured the sea and the land from all future dangers. That would have required, though, the extension of trust, a quality with strikingly shallow roots in the character of all Greeks.
  • The only solution then is to improvise, but this is not just making it up as you go along. Maybe you’ll stick to the plan, maybe you’ll modify it, maybe you’ll scrap it altogether. Like Lincoln, though, you’ll know your compass heading, whatever the unknowns that lie between you and your destination. You’ll have in your mind a range of options for dealing with these, based—as if from Machiavelli—upon hard-won lessons from those who’ve gone before.
  • The past and future are no more equivalent, in Thucydides, than are capabilities and aspirations in strategy—they are, however, connected.
  • The past we can know only from imperfect sources, including our own memories. The future we can’t know, other than that it will originate in the past but then depart from it. Thucydides’ distinction between resemblance and reflection—between patterns surviving across time and repetitions degraded by time—aligns the asymmetry, for it suggests that the past prepares us for the future only when, however imperfectly, it transfers. Just as capabilities restrict aspirations to what circumstances will allow.
  • Insufficiency demands indirection, and that, Sun Tzu insists, requires maneuver: [W]hen capable, feign incapacity; when active, inactivity. When near, make it appear that you are far; when far away, that you are near. Offer an enemy a bait to lure him; feign disorder and strike him. . . . When he concentrates, prepare against him; where he is strong, avoid him. . . . Pretend inferiority and encourage his arrogance. . . . Keep him under a strain and wear him down. Opposites held in mind simultaneously, thus, are “the strategist’s keys to victory.”
  • it was Pericles who, more than anyone else, unleashed the Peloponnesian War—the unintended result of constructing a culture to support a strategy.
  • By the mid-450s Pericles, who agreed, had finished the walls around Athens and Piraeus, allowing total reliance on the sea in any future war. The new strategy made sense, but it made the Athenians, as Thucydides saw, a different people. Farmers, traditionally, had sustained Athens: their fields and vineyards supplied the city in peacetime, and their bodies filled the ranks of its infantry and cavalry when wars came. Now, though, their properties were expendable and their influence diminished.
  • If Athens were to rely upon the ardor of individuals, then it would have to inspire classes within the city and peoples throughout the empire—even as it retained the cohesiveness of its rival Sparta, still in many ways a small town.
  • Pericles used his “funeral oration,” delivered in Athens at the end of the Peloponnesian War’s first year, to explain what he hoped for. The dead had given their lives, he told the mourners, for the universality of Athenian distinctiveness: Athens imitated no one, but was a pattern for everyone. How, though, to reconcile these apparent opposites? Pericles’ solution was to connect scale, space, and time: Athenian culture would appeal to the city, the empire, and the ages.
  • The city had acquired its “friends,” Pericles acknowledged, by granting favors, “in order by continued kindness to keep the recipient in [its] debt; while the debtor [knows] that the return he makes will be a payment, not a free gift.” Nevertheless, the Athenians had provided these benefits “not from calculations of expediency, but in the confidence of liberality.” What he meant was that Athens would make its empire at once more powerful and more reassuring than that of any rival.
  • It could in this way project democracy across cultures because insecure states, fearing worse, would freely align with Athens.22 Self-interest would become comfort and then affinity.
  • The Athenians’ strategy of walling their cities, however, had reshaped their character, obliging them restlessly to roam the world. Because they had changed, they would have to change others—that’s what having an empire means—but how many, to what extent, and by what means? No one, not even Pericles, could easily say.
  • Equality, then, was the loop in Pericles’ logic. He saw both it and empire as admirable, but was slow to sense that encouraging one would diminish the other.
  • Like Lincoln, Pericles looked ahead to the ages. He even left them monuments and sent them messages. But he didn’t leave behind a functional state: it would take well over two millennia for democracy again to become a model with mass appeal.
  • as Thucydides grimly observes, war “brings most men’s character to a level with their fortunes.”
  • “Island” strategies require steady nerves. You have to be able to watch smoke rise on horizons you once controlled without losing your own self-confidence, or shaking that of allies, or strengthening that of adversaries.
  • For the abstractions of strategy and the emotions of strategists can never be separated: they can only be balanced. The weight attached to each, however, will vary with circumstances. And the heat of emotions requires only an instant to melt abstractions drawn from years of cool reflection.
  • if credibility is always in doubt, then capabilities must become infinite or bluffs must become routine. Neither approach is sustainable: that’s why walls exist in the first place.
  • he encouraged his readers to seek “knowledge of the past as an aid to the understanding of the future, which in the course of human things must resemble if it does not reflect it.” For without some sense of the past the future can be only loneliness: amnesia is a solitary affliction.
  • But to know the past only in static terms—as moments frozen in time and space—would be almost as disabling, because we’re the progeny of progressions across time and space that shift from small scales to big ones and back again. We know these through narratives, whether historical or fictional or a combination of both.
  • No one can anticipate everything that might happen. Sensing possibilities, though, is better than having no sense at all of what to expect. Sun Tzu seeks sense—even common sense—by tethering principles, which are few, to practices, which are many.
  • Clausewitz’s concept of training, however, retains its relevance. It’s the best protection we have against strategies getting stupider as they become grander, a recurring problem in peace as well as war. It’s the only way to combine the apparent opposites of planning and improvisation: to teach the common sense that comes from knowing when to be a hedgehog and when a fox.
  • Victories must connect: otherwise they won’t lead anywhere. They can’t be foreseen, though, because they arise from unforeseen opportunities. Maneuvering, thus, requires planning, but also improvisation. Small triumphs in a single arena set up larger ones elsewhere, allowing weaker contenders to become stronger.
  • The actions of man, Kennan concluded, “are governed not so much by what he intellectually believes as by what he vividly realizes.”
  • Nor is it clear, even now, whether Christianity caused Rome’s “fall”—as Gibbon believed—or—as the legacies of Augustus suggest—secured Rome’s institutional immortalities. These opposites have shaped “western” civilization ever since. Not least by giving rise to two truly grand strategies, parallel in their purposes but devised a thousand years apart
  • Augustine shows that reality always falls short of the ideal: one can strive toward it, but never expect to achieve it. Seeking, therefore, is the best man can manage in a fallen world, and what he seeks is his choice. Nevertheless, not all ends are legitimate; not all means are appropriate. Augustine seeks, therefore, to guide choice by respecting choice. He does this through an appeal to reason: one might even say to common sense.
  • A peaceful faith—the only source of justice for Christians—can’t flourish without protection, whether through toleration, as in pre-Constantine Rome, or by formal edict, as afterward.20 The City of God is a fragile structure within the sinful City of Man. It’s this that leads Christians to entrust authority to selected sinners—we call it “politics”—and Augustine, for all his piety, is a political philosopher.
  • Augustine concluded that war, if necessary to save the state, could be a lesser evil than peace—and that the procedural prerequisites for necessity could be stated. Had provocation occurred? Had competent authority exhausted peaceful alternatives? Would the resort to violence be a means chosen, not an end in itself? Was the expenditure of force proportionate to its purposes, so that it wouldn’t destroy what it was meant to defend?
  • No one before Augustine, however, had set standards to be met by states in choosing war. This could be done only within an inclusionary monotheism, for only a God claiming universal authority could judge the souls of earthly rulers. And only Augustine, in his era, spoke so self-confidently for Him.
  • Augustine’s great uncertainty was the status of souls in the City of Man, for only the fittest could hope to enter the City of God. Pre-Christian deities had rarely made such distinctions: the pagan afterlife was equally grim for heroes, scoundrels, and all in between.25 Not so, though, with the Christian God: behavior in life would make a huge difference in death. It was vital, then, to fight wars within rules. The stakes could hardly be higher.
  • Alignment, in turn, implies interdependence. Justice is unattainable in the absence of order, peace may require the fighting of wars, Caesar must be propitiated—perhaps even, like Constantine, converted—if man is to reach God. Each capability brings an aspiration within reach, much as Sun Tzu’s practices tether his principles, but what’s the nature of the tether? I think it’s proportionality: the means employed must be appropriate to—or at least not corrupt—the end envisaged. This, then, is Augustine’s tilt: toward a logic of strategy transcending time, place, culture, circumstance, and the differences between saints and sinners.
  • a more revealing distinction may lie in temperament: to borrow from Milan Kundera,37 Machiavelli found “lightness of being” bearable. For Augustine—perhaps because traumatized as a youth by a pear tree—it was unendurable.
  • “I judge that it might be true that fortune is arbiter of half our actions, but also that she leaves the other half, or close to it, for us to govern.” Fifty percent fortune, fifty percent man—but zero percent God. Man is, however precariously, on his own.
  • States, Machiavelli suggests, operate similarly. If governed badly, men’s rapacity will soon overwhelm them, whether through internal rebellion or external war. But if run with virtù—his untranslatable term for planning without praying40—states can constrain, if not in all ways control, the workings of fortune, or chance. The skills needed are those of imitation, adaptation, and approximation.
  • Machiavelli commends the study of history, “for since men almost always walk on paths beaten by others and proceed in their actions by imitation . . . , a prudent man should always enter upon the paths beaten by great men, and imitate those who have been most excellent, so that if his own virtue does not reach that far, it is at least in the odor of it.”
  • What, then, to do? It helped that Machiavelli and Berlin had lightness of being, for their answer is the same: don’t sweat it. Learn to live with the contradictions. Machiavelli shows “no trace of agony,” Berlin points out, and he doesn’t either:
  • Eternal truths have little to do with any of this, beyond the assurance that circumstances will change. Machiavelli knows, as did Augustine, that what makes sense in one situation may not in the next. They differ, though, in that Machiavelli, expecting to go to Hell, doesn’t attempt to resolve such disparities. Augustine, hoping for Heaven, feels personally responsible for them. Despite his afflictions, Machiavelli often sees comedy.42 Despite his privileges, Augustine carries a tragic burden of guilt. Machiavelli sweats, but not all the time. Augustine never stops.
  • “Lightness of being,” then, is the ability, if not to find the good in bad things, then at least to remain afloat among them, perhaps to swim or to sail through them, possibly even to take precautions that can keep you dry. It’s not to locate logic in misfortunes, or to show that they’re for the best because they reflect God’s will.
  • Augustine and Machiavelli agree that wars should be fought—indeed that states should be run—by pre-specifiable procedures. Both know that aspirations aren’t capabilities. Both prefer to connect them through checklists, not commandments.43
  • Augustine admits, which is why good men may have to seek peace by shedding blood. The greater privilege, however, is to avert “that calamity which others are under the necessity of producing.” Machiavelli agrees, but notes that a prince so infrequently has this privilege that if he wishes to remain in power he must “learn to be able not to be good,” and to use this proficiency or not use it “according to necessity.”51 As fits man’s fallen state, Augustine sighs. As befits man, Machiavelli simplifies.
  • As Machiavelli’s finest translator has put it: “[J]ustice is no more reasonable than what a person’s prudence tells him he must acquire for himself, or must submit to, because men cannot afford justice in any sense that transcends their own preservation.”53
  • princes need advisers. The adviser can’t tell the prince what to do, but he can suggest what the prince should know. For Machiavelli this means seeking patterns—across time, space, and status—by shifting perspectives. “[J]ust as those who sketch landscapes place themselves down in the plain to consider the nature of mountains . . . and to consider the nature of low places place themselves high atop mountains,
  • Machiavelli embraces, then, a utilitarian morality: you proportion your actions to your objective, not to progress from one nebulous city to another, but because some things have been shown to work and others haven’t.60
  • Who, then, will oversee them? They’ll do it themselves, Machiavelli replies, by balancing power. First, there’ll be a balance among states, unlike older Roman and Catholic traditions of universality. Machiavelli anticipates the statecraft of Richelieu, Metternich, Bismarck,
  • But Machiavelli understands balancing in a second and subtler sense, conveyed more explicitly in The Discourses than in The Prince: [I]t is only in republics that the common good is looked to properly in that all that promotes it is carried out; and, however much this or that private person may be the loser on this account, there are so many who benefit thereby that the common good can be realized in spite of those few who suffer in consequence.64 This idea of an internal equilibrium within which competition strengthens community wouldn’t appear again until Adam Smith unveiled an “invisible hand” in The Wealth of Nations (1776), until the American Founding Fathers drafted and in The Federalist justified constitutional checks and balances (1787–88), and until Immanuel Kant linked republics, however distantly, with Perpetual Peace (1795).
  • Machiavelli’s great transgression, Berlin concluded, was to confirm what everyone knows but no one will admit: that ideals “cannot be attained.” Statecraft, therefore, can never balance realism against idealism: there are only competing realisms. There is no contest, in governing, between politics and morality: there is only politics. And no state respects Christian teaching on saving souls. The incompatibilities are irreconcilable. To deny this is, in Berlin’s words but in Machiavelli’s mind, to “vacillate, fall between two stools, and end in weakness and failure.”
  • And approximation? “[P]rudent archers,” Machiavelli points out, knowing the strength of their bow, “set their aim much higher than the place intended, not to reach such height with their arrow, but to be able with the aid of so high an aim to achieve their plan.”41 For there will be deflection—certainly from gravity, perhaps from wind, who knows from what else? And the target itself will probably be moving.
  • Augustine’s City of God no longer exists on earth. The City of Man, which survives, has no single path to salvation. “[T]he belief that the correct, objectively valid solution to the question of how men should live can in principle be discovered,” Berlin finds, “is itself in principle not true.” Machiavelli thus split open the rock “upon which Western beliefs and lives had been founded.” It was he “who lit the fatal fuse.”
  • Machiavelli’s blood ran colder than was ordinary: he praised Cesare Borgia, for example, and he refused to condemn torture despite having suffered it (Augustine, never tortured, took a similar position).75 Machiavelli was careful, however, to apportion enormities: they should only forestall greater horrors—violent revolution, defeat in war, descent into anarchy, mass killing, or what we would today call “genocide.”
  • Berlin sees in this an “economy of violence,” by which he means holding a “reserve of force always in the background to keep things going in such a way that the virtues admired by [Machiavelli] and by the classical thinkers to whom he appeals can be protected and allowed to flower.”76 It’s no accident that Berlin uses the plural. For it comes closer than the singular, in English, to Machiavelli’s virtù, implying no single standard by which men must live.
  • “[T]here are many different ends that men may seek and still be fully rational,” Berlin insists, “capable of understanding . . . and deriving light from each other.” Otherwise, civilizations would exist in “impenetrable bubble[s],” incomprehensible to anyone on the outside. “Intercommunication between cultures in time and space is possible only because what makes men human is common to them, and acts as a bridge between them. But our values are ours, and theirs are theirs.”
  • Perhaps there are other worlds in which all principles are harmonized, but “it is on earth that we live, and it is here that we must believe and act.”77 By shattering certainty, Machiavelli showed how. “[T]he dilemma has never given men peace since it came to light,” Berlin lightly concludes, “but we have learnt to live with it.”
  • Posterity has long regarded Augustine and Machiavelli as pivots in the history of “western” thought because each, with enduring effects, shifted long-standing relationships between souls and states.
  • Philip promises obedience to God, not his subjects. Elizabeth serves her subjects, fitting God to their interests. The king, looking to Heaven, venerates. The queen, feet on earth, calculates. The differences test the ideas of Augustine and Machiavelli against the demands of statecraft at the dawn of the modern age.
  • Relishing opposites, the queen was constant only in her patriotism, her insistence on keeping ends within means, and her determination—a requirement for pivoting—never to be pinned down.
  • Pivoting requires gyroscopes, and Elizabeth’s were the best of her era. She balanced purposefulness with imagination, guile, humor, timing, and an economy in movement that, however extravagant her display, kept her steady on the tightrope she walked.
  • Machiavelli, thinking gyroscopically, advised his prince to be a lion and a fox, the former to frighten wolves, the latter to detect snares. Elizabeth went him one better by being lion, fox, and female, a combination the crafty Italian might have learned to appreciate. Philip was a grand lion, but he was only a lion.
  • princes can through conscientiousness, Machiavelli warned, become trapped. For a wise ruler “cannot observe faith, nor should he, when such observance turns against him, and the causes that made him promise have been eliminated. . . . Nor does a prince ever lack legitimate causes to color his failure to observe faith.”46
  • What we like to recall as the Elizabethan “golden age” survived only through surveillance and terror: that was another of its contradictions, maintained regretfully with resignation.
  • The queen’s instincts were more humane than those of her predecessors, but too many contemporaries were trying to kill her. “Unlike her sister, Elizabeth never burned men for their faith,” her recent biographer Lisa Hilton has written. “She tortured and hanged them for treason.”60 Toleration, Machiavelli might have said, had turned against Elizabeth. She wanted to be loved—who wouldn’t? It was definitely safer for princes, though, to be feared.
  • “The failure of the Spanish Armada,” Geoffrey Parker has argued, “laid the American continent open to invasion and colonization by northern Europeans, and thus made possible the creation of the United States.” If that’s right, then the future pivoted on a single evening—August 7, 1588—owing to a favorable wind, a clever lord admiral, and a few fiery ships. Had he succeeded, Philip would have required Elizabeth to end all English voyages to America.4
  • In contrast to Spain’s “new world” colonies—and to the territories that France, more recently, had claimed (but barely settled) along the banks of the St. Lawrence, the Great Lakes, and the Ohio and Mississippi rivers—British America “was a society whose political and administrative institutions were more likely to evolve from below than to be imposed from above.”10 That made it a hodgepodge, but also a complex adaptive system.
  • The principles seem at odds—how can supremacies share?—but within that puzzle, the modern historian Robert Tombs has suggested, lay the foundations of England’s post-Stuart political culture: [S]uspicion of Utopias and zealots; trust in common sense and experience; respect for tradition; preference for gradual change; and the view that “compromise” is victory, not betrayal. These things stem from the failure of both royal absolutism and of godly republicanism: costly failures, and fruitful ones.
Javier E

Excerpt: 'Shame' by Shelby Steele - ABC News - 0 views

  • since the 1960s, “liberal” and “conservative” have come to function almost like national identities in their own right. To be one or the other is not merely to lean left or right—toward “labor” or toward “business”— within a common national identity; it is to belong to a different vision of America altogether, a vision that seeks to supersede the opposing vision and to establish itself as the nation’s common identity. Today the Left and the Right don’t work within a shared understanding of the national purpose; nor do they seek such an understanding. Rather, each seeks to win out over the other and to define the nation by its own terms.
  • It was all the turmoil of the 1960s—the civil rights and women’s movements, Vietnam, the sexual revolution, and so on—that triggered this change by making it clear that America could not go back to being the country it had been before. It would have to reinvent itself. It would have to become a better country. Thus, the reinvention of America as a country shorn of its past sins became an unspoken, though extremely powerful, mandate in our national politics
  • Liberals and conservatives could no longer think of themselves simply as political rivals competing within a common and settled American identity. That identity was no longer settled—or even legitimate—because it was stigmatized in the 1960s as racist, sexist, and imperialistic
  • It was no longer enough for the proponents of these perspectives merely to vie over the issues of the day. Both worldviews would now have to evolve into full-blown ideologies capable of projecting a new political and cultural vision of America.
  • This is how the mandate of the 1960s to reinvent America launched the infamous “culture war” between liberalism and conservatism.
  • When we argue over health care or immigration or Middle East policy, it is as if two distinct Americas were arguing, each with a different idea of what it means to be an American. And these arguments are intense and often uncivil, because each side feels that its American identity is at risk from the other side. So the conflict is very much a culture war, with each side longing for “victory” over the other, and each side seeing itself as America’s last and best hope.
  • Since the 1960s, this war has divided up our culture into what might be called “identity territories.”
  • America’s universities are now almost exclusively left-leaning; most public-policy think tanks are right-leaning. Talk radio is conservative; National Public Radio and the major television networks are liberal. On cable television, almost every news and commentary channel is a recognizable identity territory—Fox/ right; MSNBC/left; CNN/left. In the print media our two great national newspapers are the liberal New York Times and the conservative Wall Street Journal (especially in the editorial pages). The Pulitzer Prize and MacArthur Grants are left; the Bradley Prize is right. The blogosphere is notoriously divided by political stripe. And then there are “red” and “blue” states, cities, towns, and even neighborhoods. At election time, Americans can see on television a graphic of their culture war: those blue and red electoral maps that give us a virtual topography of political identity.
  • In the America envisioned by both ideologies, there is no racism or sexism or imperialism to be embarrassed by. After all, ideologies project idealized images of the near-perfect America that they promise to deliver. Thus, in one’s ideological identity, one can find the innocence that is no longer possible—since the 1960s—in America’s defamed national identity.
  • To announce oneself as a liberal or a conservative is like announcing oneself as a Frenchman or a Brit. It is virtually an announcement of tribal identity, and it means something much larger than ideology
  • Nationalism—the nationalist impulse—is passion itself; it is atavistic, beyond the reach of reason, a secular sacredness. The nationalist is expected to be intolerant of all opposition to his nation’s sovereignty, and is most often willing to defend that sovereignty with his life.
  • when we let nationalism shape the form of our liberal or conservative identities—when we practice our ideological leaning as if it were a divine right, an atavism to be defended at all cost—then we put ourselves on a warlike footing. We feel an impunity toward our opposition, and we grant ourselves a scorched-earth license to fight back.
  • yes, like my young nemesis, I could experience my ideology as a nationalism. But unlike him I wanted to discipline that impulse, to subject my ideology—and all the policies it fostered—to every sort of test of truth and effectiveness. And I was ready to modify accordingly, to disabuse myself of even long-held beliefs that didn’t pan out in reality
  • these disparities—and many others—most certainly had their genesis in centuries of racial oppression. But post-1960s liberalism conflates the past with the present: it argues that today’s racial disparities are caused by precisely the same white racism that caused them in the past—thus the poetic truth that blacks today remain stymied and victimized by white racism.
  • I had stated a hard fact: that since the 1960s, white racism had lost so much of its authority, power, and legitimacy that it was no longer, in itself, a prohibitive barrier to black advancement. Blacks have now risen to every level of American society, including the presidency. If you are black and you want to be a poet, or a doctor, or a corporate executive, or a movie star, there will surely be barriers to overcome, but white racism will be among the least of them. You will be far more likely to receive racial preferences than to suffer racial discrimination.
  • But past oppression cannot be conflated into present-day oppression. It is likely, for example, that today’s racial disparities are due more to dysfunctions within the black community, and—I would argue—to liberal social policies that have encouraged us to trade more on our past victimization than to overcome the damage done by that victimization through dint of our own pride and will
  • The young man at Aspen demanded to speak so that he could corral people back into a prescribed correctness and away from a more open-minded approach to the complex problems that our racial history has left us to deal with—problems that the former victims of this history will certainly bear the greatest responsibility for overcoming
  • there also comes a time when he must stop thinking of himself as a victim by acknowledging that—existentially—his fate is always in his own hands. One of the more pernicious corruptions of post-1960s liberalism is that it undermined the spirit of self-help and individual responsibility in precisely the people it sought to uplift
  • The truth—that blacks had now achieved a level of freedom comparable to that of all others
  • what was not true—that racism was still the greatest barrier to black advancement
  • Poetic truth—this assertion of a broad characteristic “truth” that invalidates actual truth—is contemporary liberalism’s greatest source of power. It is also liberalism’s most fundamental corruption.
  • the great trick of modern liberalism is to link its poetic truths (false as they may be) with innocence from all the great sins of America’s past—racism, sexism, imperialism, capitalist greed
  • if you want to be politically correct, if you want to be seen as someone who is cleansed of America’s past ugliness, you will go along with the poetic truth that racism is still a great barrier for blacks.
  • A distinction must be made. During and immediately after the 1960s, racism and sexism were still more literal truth than poetic truth. As we moved through the 1970s, 1980s, and 1990s, America morally evolved so that these old American evils became more “poetic” than literal
  • Yet redeeming America from these evils has become liberalism’s rationale for demanding real power in the real world—the political and cultural power to create social programs, to socially engineer on a national scale, to expand welfare, to entrench group preferences in American institutions, and so on
  • what happens to liberal power when America actually earns considerable redemption—when there are more women than men in the nation’s medical schools, when a black can serve as the president, when public accommodations are open to anyone with the price of the ticket?
  • My young antagonist in Aspen was not agitated by some racial injustice. He would have only relished a bit of good old-fashioned racial injustice, since it would have justified his entire political identity
  • a divide like this suggests that America has in fact become two Americas, two political cultures forever locked in a “cold war” within a single society. This implies a spiritual schism within America itself, and, following from that, the prospect of perpetual and hopeless debate—the kind of ego-driven debate in which both sides want the other side to “think like us.”
  • Today, liberal and conservative Americans are often contemptuous of each other with a passion that would more logically be reserved for foreign enemies.
  • Our national debate over foreign and domestic issues has come to be framed as much by poetic truths as by dispassionate assessments of the realities we face
  • the poetic truth that blacks are still held back from full equality by ongoing “structural” racism carries more authority than the objective truth: that today racism is not remotely the barrier to black advancement in American life that it once was.
  • In foreign affairs, the poetic truth that we Americans are essentially imperialistic cowboys bent on exploiting the world has more credibility than the obvious truth, which is that our wealth and power (accumulated over centuries of unprecedented innovation in a context of freedom) has often drawn us into the unwanted role of policing a turbulent world—and, it must be added, to the world’s immense benefit.
  • Today the actual facts fail to support the notion that racial victimization is a prevailing truth of American life. So today, a poetic truth, like “black victimization,” or the ongoing “repression of women,” or the systematic “abuse of the environment,” must be imposed on society not by fact and reason but by some regime of political correctness
  • Poetic license occurs when poets take a certain liberty with the conventional rules of grammar and syntax in order to achieve an effect. They break the rules in order to create a more beautiful or more powerful effect than would otherwise be possible. Adapting this idea of license and rule breaking to the realm of ideology, we might say that “poetic truth” disregards the actual truth in order to assert a larger essential truth that supports one’s ideological position
  • He could subscribe to “diversity,” “inclusiveness,” and “social justice” and think himself solidly on the side of the good. The problem is that these prescriptions only throw fuzzy and unattainable idealisms at profound problems
  • What is “diversity” beyond a vague apologia, an amorphous expression of goodwill that offers no objective assessment whatsoever of the actual problems that minority groups face?
  • The danger here is that the nation’s innocence— its redemption from past sins—becomes linked to a kind of know-nothingism
  • We can’t afford to know, for example, that America’s military might—a vulgarity in the minds of many—has stabilized vast stretches of Asia and Europe since World War II, so that nations under the umbrella of our power have become prosperous trading partners today
  • Today’s great divide comes from a shallowness of understanding. We don’t altogether know what to do with our history
  • many of our institutions are being held in thrall to the idea of moral intimidation as power. Try to get a job today as an unapologetic conservative in the average American university, or in the State Department, or on public radio.
  • We all know, to the point of cliché, what the solutions are: mutual respect, empathy, flexibility, compromise
  • We can’t admit today that the lives of minorities are no longer stunted by either prejudice or “white privilege.”
  • Those who doubt this will always point to today’s long litany of racial disparities. Blacks are still behind virtually all other groups by the most important measures of social and economic well-being: educational achievement, home ownership, employment levels, academic test scores, marriage rates, household net worth, and so on. The fact that seven out of ten black women are single, along with the fact that 70 percent of first black marriages fail (47 percent for whites), means that black women are married at roughly half the rate of white women and divorced at twice the rate. Thus it is not surprising that nearly three-quarters of all black children are born out of wedlock. In 2008, black college students were three times more likely than whites to graduate with a grade point average below a meager 2.5—this on top of a graduation rate for blacks of only 42 percent, according to the Journal of Blacks in Higher Education. Consequently, blacks in general have the highest college dropout rate and the lowest grade point average of any student group in America
proudsa

Obama in Hiroshima: Why It's So Hard for Countries to Apologize - The Atlantic - 0 views

  • When Barack Obama goes to Hiroshima on May 27, becoming the first sitting U.S. president to visit the site of the world’s first nuclear attack, he will not apologize on behalf of his country for carrying out that strike 71 years ago.
  • But he will affirm America’s “moral responsibility,” as the only nation to have used nuclear weapons, to prevent their future use. He will recognize the painful past, but he won’t revisit it. When it’s all over, we still won’t know whether or not he thinks there’s something about the atomic bombings to be sorry for.
  • This is especially true with the Hiroshima bombing, where the competing stories about what happened are so morally complex. Nevertheless, political apologies do occur.
  • Lind compared Trudeau’s statement to President Bill Clinton’s apology for America’s failure to intervene in the Rwandan Genocide. Clinton was saying sorry for inaction rather than action, she said, and there was no American constituency to be offended by his expression of regret.
  • Germany’s various apologies for the crimes of the Nazis, Lind added, are the shining exception in international affairs, not the rule: “The world we live in is one in which countries routinely whitewash their past violence. They routinely even lie about their past violence. They sometimes glorify their past violence.”
  • But Lind has found that apologies and reparations for those wrongs can be damaging as well, since such actions are likely to polarize people within those countries. The best approach, she says, is recognizing and remembering wrongs in ways that unify rather than divide—that emphasize shared suffering, not perpetrators and victims.
  • “Liberals have this idea that the way to be a strong nation is to be transparent about the past, and to be self-critical, and to constantly question your leaders, and constantly ask, ‘Are we living up to our own values?’ And so this kind of historical reckoning with the past [that Obama is undertaking in Hiroshima]—they love that.”
  • Conservatives, meanwhile, “say that national strength comes from national unity, and national unity is best served by instilling pride in people, and pride comes from remembering the really great things that we’ve done, and remembering what’s different and great about America. … And so they would say, ‘What are you doing talking about all the people we killed? Why aren’t you celebrating that we brought democracy to Japan?’”
  • Politics. Everyone has their own story.
  • The vast majority of Japanese don’t think the atomic bombings were justified, and that belief has only become more widespread over time. But Japanese leaders have not demanded that the U.S. government apologize for the attacks, in part because they don’t want to jeopardize the flourishing U.S.-Japanese alliance or encourage calls for Japan to apologize for its own wartime aggression.
  • This lack of a demand from the Japanese, Lind argues, has created the friendly space in which a visit like Obama’s can take place—and in which the two countries can adopt a common narrative about the atomic bombings.
  • Lind compared the situation to the way in which American and European leaders, including German leaders, now gather in Normandy to commemorate the Allied invasion of Nazi-occupied France
  • “For most Japanese, Obama’s visit actually fits with the Japanese story about the bomb—that the atomic bomb gave Japan its postwar mission for peace,”
  • that the atomic bomb ended the war and saved American lives. So the Japanese bomb story begins in 1945 and goes forward in the mission for peace. The American bomb story ends in ’45. Those are two separate stories. They will not cross. And the president’s position that the lesson of the past is for a non-nuclear future, it’s almost like a third story.”
    Another view on reflecting on Hiroshima
Javier E

History Majors Are Becoming a Thing of the Past, Except in the Ivy League - 0 views

  • According to a new analysis by the American Historical Association, the number of students choosing to major in history at the nation’s colleges has plummeted. Undergraduate history majors have fallen by more than a third in less than a decade, declining to their lowest levels since the ’80s.
  • If anything, the trend is accelerating. The undergraduate history major seems to be on the way out.
  • Of all college majors since the financial crash of 2008, data from the National Center for Education Statistics show that none has fallen faster than history, which has experienced the steepest declines by far in student concentrators. In 2008, there were 34,642 majors in history; by 2017, the most recent year for which data are available, the number had fallen to 24,266.
  • Between 2016 and 2017 alone, some 1,500 fewer American undergraduates chose to major in history. The drop-off has continued even among students who entered college long after the economic recovery began.
  • Where a student studies also has little impact on the numbers choosing history (except in one sector of campuses): History majors have fallen across the board and throughout the country—at research universities, small and large colleges, private and state institutions, among white and nonwhite students, as well as among both men and women.
  • colleges have begun cutting faculty wholesale, hollowing out their history offerings. Due to steep falloff in student interest, last summer the University of Akron announced plans to eliminate advanced degrees and significantly reduce course offerings in history. Like bobby sox and saddle shoes, study of the past at many colleges and universities seems not just to have gone out of style; it is going away.
  • several among the nation’s elite colleges seem to be in the midst of a renaissance in interest in history. Their undergraduates are flocking to history courses and the history major. An article in the Yale Daily News, the Yale College newspaper, found that the history major is, according to a Yale professor of history there, “thriving.”
  • Contrary to nationwide trends, the history major, which fell in popularity at Yale after the 2008 crisis, has now zoomed back up to be the third most popular major. About 10 percent of this year’s graduating class—129 students—majored in history.
  • Nationally only about 1 percent of students choose history as their major.
  • According to the Yale Daily News reporter, Yale plans to add 11 new history professors in 2019 to meet rising demand.
  • The same goes at other Ivies. Princeton has also hired new history faculty in response to rising student numbers.
  • At Brown, the director of undergraduate studies in history said that along with more history majors, non-majors are clamoring to take history classes. Brown has had to expand its history course offerings, as enrollments grew in just a year from 1,082 students to 1,385 students.
  • Even in today’s relatively good times, few students can feel certain enough about their future to major in the study of the past. Majoring in history, it seems, has become just another luxury item the anxious majority of undergraduates cannot afford.
  • Few history majors—and even fewer of those who take history courses while in college—become historians, but they do move on to become citizens. Knowledge of the past provides young people with a sense of place and a concept of temporal continuity, lessons to apply to the present and future, an interpretive framework and perspective for navigating the choppy global world.
  • An epidemic of historical amnesia already plagues this country, which has often paid a terrible price and done grave harm to other foreign people and lands due to its ignorance of the past
Javier E

The Myth of Western Civilization - The Atlantic - 0 views

  • Democracy is a struggle, not a trophy and not a bragging right. This is not a matter of being polite and sensitive. It is understanding that we live on the edge of the volcano, that the volcano is in us. Judt is keenly aware that late 20th century Europe's accomplishments could be wrecked by the simple actions of men.
  • What Judt wants us to see is the tenuousness of human creations, and thus the tenuousness of the West, itself. Having concluded that Europe (though not its Eastern half) has finally, in fits and starts, come to grapple with the Holocaust, he grows skeptical: Evil, above all evil on the scale practiced by Nazi Germany, can never be satisfactorily remembered. The very enormity of the crime renders all memorialisation incomplete. Its inherent implausibility—the sheer difficulty of conceiving of it in calm retrospect—opens the door to diminution and even denial. Impossible to remember as it truly was, it is inherently vulnerable to being remembered as it wasn’t. Against this challenge memory itself is helpless
  • From Timothy Snyder's Bloodlands: No matter which technology was used, the killing was personal. People who starved were observed, often from watchtowers, by those who denied them food. People who were shot were seen through the sights of rifles at very close range, or held by two men while a third placed a pistol at the base of the skull. People who were asphyxiated were rounded up, put on trains, and then rushed into the gas chambers. They lost their possessions and then their clothes and then, if they were women, their hair. Each one of them died a different death, since each one of them had lived a different life.
  • the European super-nation has long needed to believe itself above the world, above native America, above Asia, and particularly above Africa. The truth is more disconcerting: The dark continent has never been South of the Sahara, but South of Minsk and East of Aachen in the jungles of the European soul. 
  • I don't have any gospel of my own. Postwar, and the early pages of Bloodlands, have revealed a truth to me: I am an atheist. (I have recently realized this.) I don't believe the arc of the universe bends towards justice. I don't even believe in an arc. I believe in chaos. I believe powerful people who think they can make Utopia out of chaos should be watched closely.
  • I don't know that it all ends badly. But I think it probably does.
  • I'm also not a cynic. I think that those of us who reject divinity, who understand that there is no order, there is no arc, that we are night travelers on a great tundra, that stars can't guide us, will understand that the only work that will matter, will be the work done by us.
  • Maybe the very myths I decry are necessary for that work. I don't know. But history is a brawny refutation of the claim that religion brings morality.
  • "History contributes to the disenchantment of the world," writes Judt. ...most of what it has to offer is discomforting, even disruptive—which is why it is not always politically prudent to wield the past as a moral cudgel with which to beat and berate a people for its past sins.
  • But history does need to be learned—and periodically re-learned. In a popular Soviet-era joke, a listener calls up ‘Armenian Radio’ with a question: ‘Is it possible’, he asks, ‘to foretell the future?’ Answer: ‘Yes, no problem. We know exactly what the future will be. Our problem is with the past: that keeps changing’.  So it does—and not only in totalitarian societies. 
  • All the same, the rigorous investigation and interrogation of Europe’s competing pasts—and the place occupied by those pasts in Europeans’ collective sense of themselves—has been one of the unsung achievements and sources of European unity in recent decades. It is, however, an achievement that will surely lapse unless ceaselessly renewed.
  • Europe’s barbarous recent history, the dark ‘other’ against which post-war Europe was laboriously constructed, is already beyond recall for young Europeans.  Within a generation the memorials and museums will be gathering dust—visited, like the battlefields of the Western Front today, only by aficionados and relatives. If in years to come we are to remember why it seemed so important to build a certain sort of Europe out of the crematoria of Auschwitz, only history can help us.
  • If Europeans are to maintain this vital link—if Europe’s past is to continue to furnish Europe’s present with admonitory meaning and moral purpose—then it will have to be taught afresh with each passing generation. ‘European Union’ may be a response to history, but it can never be a substitute.
Javier E

Spanish Election: The Left Won, the Right Didn't Lose - The Atlantic - 0 views

  • far from marking a clear rejection of the fascist past, the Spanish election can just as easily be understood as a sign that the lessons of the postwar period have lost their hold over European politics. The past hasn’t lost, in other words; it’s just been forgotten.
  • Seventy-five years after World War II, the German taboo against far-right politics has been breached.
  • Vox gained 10 percent of the vote on Sunday, proving that it has significant national appeal. Though Vox did not “win” the elections, its ascent was the day’s most significant development—more than the PSOE’s victory—one that marks a true turning point in Spain’s, and Europe’s, attitude toward the past.
  • When I was born in 1982, the Communist system that the Soviet Union had, in the war’s immediate aftermath, imposed upon the continent’s eastern half seemed unlikely to collapse anytime soon. Meanwhile, the liberal democracies in its western half looked remarkably stable, in part because the brutal reality of totalitarianism had robbed extremists of the allure they had once enjoyed. Both systems, in their own ways, were based on an explicit rejection of the fascist politics that had led the continent on the path to perdition.
  • In 2000, when I turned 18, the fault lines of World War II no longer marked the continent as starkly, but the war’s lessons seemed to mark its politics even more deeply: In Central and Eastern Europe, citizens who had purchased their freedom with decades of suffering could be counted upon to guard it jealously. Meanwhile, the political culture of Western European countries—especially those, such as Germany, that bore the heaviest historical responsibility for World War II, or those, such as Spain and Portugal, that had been ruled by fascists well into the 1970s—seemed committed to moderation.
  • Meanwhile, countries that thought they were immune to the resurgence of the far right are finding out that the salutary effects of their violent past have started to fade.
  • Two decades into the 21st century, both of these assumptions have turned out to be wishful illusions. Across Central Europe, citizens have freely elected authoritarian populists
  • Sánchez said that the past had lost
  • After three-quarters of a century in which the legacy of World War II shaped the basic categories of the continent’s politics, its lessons are being thrown overboard.
  • Europe’s long 20th century is coming to an end. The past is lost—and the future is far less certain than many European politicians appear to grasp.
Javier E

How America Went Haywire - The Atlantic - 0 views

  • You are entitled to your own opinion, but you are not entitled to your own facts.
  • Why are we like this? The short answer is because we’re Americans—because being American means we can believe anything we want; that our beliefs are equal or superior to anyone else’s, experts be damned.
  • The word mainstream has recently become a pejorative, shorthand for bias, lies, oppression by the elites.
  • Yet the institutions and forces that once kept us from indulging the flagrantly untrue or absurd—media, academia, government, corporate America, professional associations, respectable opinion in the aggregate—have enabled and encouraged every species of fantasy over the past few decades.
  • Our whole social environment and each of its overlapping parts—cultural, religious, political, intellectual, psychological—have become conducive to spectacular fallacy and truthiness and make-believe. There are many slippery slopes, leading in various directions to other exciting nonsense. During the past several decades, those naturally slippery slopes have been turned into a colossal and permanent complex of interconnected, crisscrossing bobsled tracks, which Donald Trump slid down right into the White House.
  • Esalen is a mother church of a new American religion for people who think they don’t like churches or religions but who still want to believe in the supernatural. The institute wholly reinvented psychology, medicine, and philosophy, driven by a suspicion of science and reason and an embrace of magical thinking
  • The great unbalancing and descent into full Fantasyland was the product of two momentous changes. The first was a profound shift in thinking that swelled up in the ’60s; since then, Americans have had a new rule written into their mental operating systems: Do your own thing, find your own reality, it’s all relative.
  • The second change was the onset of the new era of information. Digital technology empowers real-seeming fictions of the ideological and religious and scientific kinds. Among the web’s 1 billion sites, believers in anything and everything can find thousands of fellow fantasists, with collages of facts and “facts” to support them
  • Today, each of us is freer than ever to custom-make reality, to believe whatever and pretend to be whoever we wish. Which makes all the lines between actual and fictional blur and disappear more easily. Truth in general becomes flexible, personal, subjective. And we like this new ultra-freedom, insist on it, even as we fear and loathe the ways so many of our wrongheaded fellow Americans use it.
  • we are the global crucible and epicenter. We invented the fantasy-industrial complex; almost nowhere outside poor or otherwise miserable countries are flamboyant supernatural beliefs so central to the identities of so many people.
  • We’re still rich and free, still more influential and powerful than any other nation, practically a synonym for developed country. But our drift toward credulity, toward doing our own thing, toward denying facts and having an altogether uncertain grip on reality, has overwhelmed our other exceptional national traits and turned us into a less developed country.
  • For most of our history, the impulses existed in a rough balance, a dynamic equilibrium between fantasy and reality, mania and moderation, credulity and skepticism.
  • It was a headquarters for a new religion of no religion, and for “science” containing next to no science. The idea was to be radically tolerant of therapeutic approaches and understandings of reality, especially if they came from Asian traditions or from American Indian or other shamanistic traditions. Invisible energies, past lives, astral projection, whatever—the more exotic and wondrous and unfalsifiable, the better.
  • These influential critiques helped make popular and respectable the idea that much of science is a sinister scheme concocted by a despotic conspiracy to oppress people. Mental illness, both Szasz and Laing said, is “a theory not a fact.”
  • The Greening of America may have been the mainstream’s single greatest act of pandering to the vanity and self-righteousness of the new youth. Its underlying theoretical scheme was simple and perfectly pitched to flatter young readers: There are three types of American “consciousness,” each of which “makes up an individual’s perception of reality … his ‘head,’ his way of life.” Consciousness I people were old-fashioned, self-reliant individualists rendered obsolete by the new “Corporate State”—essentially, your grandparents. Consciousness IIs were the fearful and conformist organization men and women whose rationalism was a tyrannizing trap laid by the Corporate State—your parents.
  • And then there was Consciousness III, which had “made its first appearance among the youth of America,” “spreading rapidly among wider and wider segments of youth, and by degrees to older people.” If you opposed the Vietnam War and dressed down and smoked pot, you were almost certainly a III. Simply by being young and casual and undisciplined, you were ushering in a new utopia.
  • Reich was half-right. An epochal change in American thinking was under way and “not, as far as anybody knows, reversible … There is no returning to an earlier consciousness.” His wishful error was believing that once the tidal surge of new sensibility brought down the flood walls, the waters would flow in only one direction, carving out a peaceful, cooperative, groovy new continental utopia, hearts and minds changed like his, all of America Berkeleyized and Vermontified. Instead, Consciousness III was just one early iteration of the anything-goes, post-reason, post-factual America enabled by the tsunami.
  • During the ’60s, large swaths of academia made a turn away from reason and rationalism as they’d been understood. Many of the pioneers were thoughtful, their work fine antidotes to postwar complacency. The problem was the nature and extent of their influence at that particular time, when all premises and paradigms seemed up for grabs. That is, they inspired half-baked and perverse followers in the academy, whose arguments filtered out into the world at large: All approximations of truth, science as much as any fable or religion, are mere stories devised to serve people’s needs or interests. Reality itself is a purely social construction, a tableau of useful or wishful myths that members of a society or tribe have been persuaded to believe. The borders between fiction and nonfiction are permeable, maybe nonexistent.
  • The delusions of the insane, superstitions, and magical thinking? Any of those may be as legitimate as the supposed truths contrived by Western reason and science. The takeaway: Believe whatever you want, because pretty much everything is equally true and false.
  • over in sociology, in 1966 a pair of professors published The Social Construction of Reality, one of the most influential works in their field. Not only were sanity and insanity and scientific truth somewhat dubious concoctions by elites, Peter Berger and Thomas Luckmann explained—so was everything else. The rulers of any tribe or society do not just dictate customs and laws; they are the masters of everyone’s perceptions, defining reality itself
  • Over in anthropology, where the exotic magical beliefs of traditional cultures were a main subject, the new paradigm took over completely—don’t judge, don’t disbelieve, don’t point your professorial finger.
  • then isn’t everyone able—no, isn’t everyone obliged—to construct their own reality? The book was timed perfectly to become a foundational text in academia and beyond.
  • To create the all-encompassing stage sets that everyone inhabits, rulers first use crude mythology, then more elaborate religion, and finally the “extreme step” of modern science. “Reality”? “Knowledge”? “If we were going to be meticulous,” Berger and Luckmann wrote, “we would put quotation marks around the two aforementioned terms every time we used them.” “What is ‘real’ to a Tibetan monk may not be ‘real’ to an American businessman.”
  • In the ’60s, anthropology decided that oracles, diviners, incantations, and magical objects should be not just respected, but considered equivalent to reason and science. If all understandings of reality are socially constructed, those of Kalabari tribesmen in Nigeria are no more arbitrary or faith-based than those of college professors.
  • Even the social critic Paul Goodman, beloved by young leftists in the ’60s, was flabbergasted by his own students by 1969. “There was no knowledge,” he wrote, “only the sociology of knowledge. They had so well learned that … research is subsidized and conducted for the benefit of the ruling class that they did not believe there was such a thing as simple truth.”
  • Ever since, the American right has insistently decried the spread of relativism, the idea that nothing is any more correct or true than anything else. Conservatives hated how relativism undercut various venerable and comfortable ruling ideas—certain notions of entitlement (according to race and gender) and aesthetic beauty and metaphysical and moral certaint
  • Conservatives are correct that the anything-goes relativism of college campuses wasn’t sequestered there, but when it flowed out across America it helped enable extreme Christianities and lunacies on the right—gun-rights hysteria, black-helicopter conspiracism, climate-change denial, and more.
  • Elaborate paranoia was an established tic of the Bircherite far right, but the left needed a little time to catch up. In 1964, a left-wing American writer published the first book about a JFK conspiracy, claiming that a Texas oilman had been the mastermind, and soon many books were arguing that the official government inquiry had ignored the hidden conspiracies.
  • Conspiracy became the high-end Hollywood dramatic premise—Chinatown, The Conversation, The Parallax View, and Three Days of the Condor came out in the same two-year period. Of course, real life made such stories plausible. The infiltration by the FBI and intelligence agencies of left-wing groups was then being revealed, and the Watergate break-in and its cover-up were an actual criminal conspiracy. Within a few decades, the belief that a web of villainous elites was covertly seeking to impose a malevolent global regime made its way from the lunatic right to the mainstream.
  • But more and more people on both sides would come to believe that an extraordinarily powerful cabal—international organizations and think tanks and big businesses and politicians—secretly ran America.
  • Each camp, conspiracists on the right and on the left, was ostensibly the enemy of the other, but they began operating as de facto allies. Relativist professors enabled science-denying Christians, and the antipsychiatry craze in the ’60s appealed simultaneously to left-wingers and libertarians (as well as to Scientologists). Conspiracy theories were more of a modern right-wing habit before people on the left signed on. However, the belief that the federal government had secret plans to open detention camps for dissidents sprouted in the ’70s on the paranoid left before it became a fixture on the right.
  • Extreme religious and quasi-religious beliefs and practices, Christian and New Age and otherwise, didn’t subside, but grew and thrived—and came to seem unexceptional.
  • Until we’d passed through the ’60s and half of the ’70s, I’m pretty sure we wouldn’t have given the presidency to some dude, especially a born-again Christian, who said he’d recently seen a huge, color-shifting, luminescent UFO hovering near him.
  • Starting in the ’80s, loving America and making money and having a family were no longer unfashionable. The sense of cultural and political upheaval and chaos dissipated—which lulled us into ignoring all the ways that everything had changed, that Fantasyland was now scaling and spreading and becoming the new normal. What had seemed strange and amazing in 1967 or 1972 became normal and ubiquitous.
  • For most of the 20th century, national news media had felt obliged to pursue and present some rough approximation of the truth rather than to promote a truth, let alone fictions. With the elimination of the Fairness Doctrine, a new American laissez-faire had been officially declared. If lots more incorrect and preposterous assertions circulated in our mass media, that was a price of freedom. If splenetic commentators could now, as never before, keep believers perpetually riled up and feeling the excitement of being in a mob, so be it.
  • Relativism became entrenched in academia—tenured, you could say
  • as he wrote in 1986, “the secret of theory”—this whole intellectual realm now called itself simply “theory”—“is that truth does not exist.”
  • After the ’60s, truth was relative, criticizing was equal to victimizing, individual liberty became absolute, and everyone was permitted to believe or disbelieve whatever they wished. The distinction between opinion and fact was crumbling on many fronts.
  • America didn’t seem as weird and crazy as it had around 1970. But that’s because Americans had stopped noticing the weirdness and craziness. We had defined every sort of deviancy down. And as the cultural critic Neil Postman put it in his 1985 jeremiad about how TV was replacing meaningful public discourse with entertainment, we were in the process of amusing ourselves to death.
  • In 1998, as soon as we learned that President Bill Clinton had been fellated by an intern in the West Wing, his popularity spiked. Which was baffling only to those who still thought of politics as an autonomous realm, existing apart from entertainment
  • Just before the Clintons arrived in Washington, the right had managed to do away with the federal Fairness Doctrine, which had been enacted to keep radio and TV shows from being ideologically one-sided. Until then, big-time conservative opinion media had consisted of two magazines, William F. Buckley Jr.’s biweekly National Review and the monthly American Spectator, both with small circulations. But absent a Fairness Doctrine, Rush Limbaugh’s national right-wing radio show, launched in 1988, was free to thrive, and others promptly appeared.
  • I’m pretty certain that the unprecedented surge of UFO reports in the ’70s was not evidence of extraterrestrials’ increasing presence but a symptom of Americans’ credulity and magical thinking suddenly unloosed. We wanted to believe in extraterrestrials, so we did.
  • Limbaugh’s virtuosic three hours of daily talk started bringing a sociopolitical alternate reality to a huge national audience. Instead of relying on an occasional magazine or newsletter to confirm your gnarly view of the world, now you had talk radio drilling it into your head for hours every day.
  • Fox News brought the Limbaughvian talk-radio version of the world to national TV, offering viewers an unending and immersive propaganda experience of a kind that had never existed before.
  • Over the course of the century, electronic mass media had come to serve an important democratic function: presenting Americans with a single shared set of facts. Now TV and radio were enabling a reversion to the narrower, factional, partisan discourse that had been normal in America’s earlier centuries.
  • there was also the internet, which eventually would have mooted the Fairness Doctrine anyhow. In 1994, the first modern spam message was sent, visible to everyone on Usenet: global alert for all: jesus is coming soon. Over the next year or two, the masses learned of the World Wide Web. The tinder had been gathered and stacked since the ’60s, and now the match was lit and thrown
  • After the ’60s and ’70s happened as they happened, the internet may have broken America’s dynamic balance between rational thinking and magical thinking for good.
  • Before the web, cockamamy ideas and outright falsehoods could not spread nearly as fast or as widely, so it was much easier for reason and reasonableness to prevail. Before the web, institutionalizing any one alternate reality required the long, hard work of hundreds of full-time militants. In the digital age, however, every tribe and fiefdom and principality and region of Fantasyland—every screwball with a computer and an internet connection—suddenly had an unprecedented way to instruct and rile up and mobilize believers
  • Why did Senator Daniel Patrick Moynihan begin remarking frequently during the ’80s and ’90s that people were entitled to their own opinions but not to their own facts? Because until then, that had not been necessary to say
  • Reason remains free to combat unreason, but the internet entitles and equips all the proponents of unreason and error to a previously unimaginable degree. Particularly for a people with our history and propensities, the downside of the internet seems at least as profound as the upside.
  • On the internet, the prominence granted to any factual assertion or belief or theory depends on the preferences of billions of individual searchers. Each click on a link is effectively a vote pushing that version of the truth toward the top of the pile of results.
  • Exciting falsehoods tend to do well in the perpetual referenda, and become self-validating. A search for almost any “alternative” theory or belief seems to generate more links to true believers’ pages and sites than to legitimate or skeptical ones, and those tend to dominate the first few pages of results
  • If more and more of a political party’s members hold more and more extreme and extravagantly supernatural beliefs, doesn’t it make sense that the party will be more and more open to make-believe in its politics?
  • an individual who enters the communications system pursuing one interest soon becomes aware of stigmatized material on a broad range of subjects. As a result, those who come across one form of stigmatized knowledge will learn of others, in connections that imply that stigmatized knowledge is a unified domain, an alternative worldview, rather than a collection of unrelated ideas.
  • Academic research shows that religious and supernatural thinking leads people to believe that almost no big life events are accidental or random. As the authors of some recent cognitive-science studies at Yale put it, “Individuals’ explicit religious and paranormal beliefs” are the best predictors of their “perception of purpose in life events”—their tendency “to view the world in terms of agency, purpose, and design.”
  • Americans have believed for centuries that the country was inspired and guided by an omniscient, omnipotent planner and interventionist manager. Since the ’60s, that exceptional religiosity has fed the tendency to believe in conspiracies.
  • Oliver and Wood found the single strongest driver of conspiracy belief to be belief in end-times prophecies.
  • People on the left are by no means all scrupulously reasonable. Many give themselves over to the appealingly dubious and the untrue. But fantastical politics have become highly asymmetrical. Starting in the 1990s, America’s unhinged right became much larger and more influential than its unhinged left. There is no real left-wing equivalent of Sean Hannity, let alone Alex Jones. Moreover, the far right now has unprecedented political power; it controls much of the U.S. government.
  • Why did the grown-ups and designated drivers on the political left manage to remain basically in charge of their followers, while the reality-based right lost out to fantasy-prone true believers?
  • One reason, I think, is religion. The GOP is now quite explicitly Christian
  • As the Syracuse University professor Michael Barkun saw back in 2003 in A Culture of Conspiracy, “such subject-specific areas as crank science, conspiracist politics, and occultism are not isolated from one another,” but rather they are interconnected. Someone seeking information on UFOs, for example, can quickly find material on antigravity, free energy, Atlantis studies, alternative cancer cures, and conspiracy.
  • Religion aside, America simply has many more fervid conspiracists on the right, as research about belief in particular conspiracies confirms again and again. Only the American right has had a large and organized faction based on paranoid conspiracism for the past six decades.
  • The right has had three generations to steep in this, its taboo vapors wafting more and more into the main chambers of conservatism, becoming familiar, seeming less outlandish. Do you believe that “a secretive power elite with a globalist agenda is conspiring to eventually rule the world through an authoritarian world government”? Yes, say 34 percent of Republican voters, according to Public Policy Polling.
  • starting in the ’90s, the farthest-right quarter of Americans, let’s say, couldn’t and wouldn’t adjust their beliefs to comport with their side’s victories and the dramatically new and improved realities. They’d made a god out of Reagan, but they ignored or didn’t register that he was practical and reasonable, that he didn’t completely buy his own antigovernment rhetoric.
  • Another way the GOP got loopy was by overdoing libertarianism
  • Republicans are very selective, cherry-picking libertarians: Let business do whatever it wants and don’t spoil poor people with government handouts; let individuals have gun arsenals but not abortions or recreational drugs or marriage with whomever they wish
  • For a while, Republican leaders effectively encouraged and exploited the predispositions of their variously fantastical and extreme partisans
  • Karl Rove was stone-cold cynical, the Wizard of Oz’s evil twin coming out from behind the curtain for a candid chat shortly before he won a second term for George W. Bush, about how “judicious study of discernible reality [is] … not the way the world really works anymore.” These leaders were rational people who understood that a large fraction of citizens don’t bother with rationality when they vote, that a lot of voters resent the judicious study of discernible reality. Keeping those people angry and frightened won them elections.
  • But over the past few decades, a lot of the rabble they roused came to believe all the untruths. “The problem is that Republicans have purposefully torn down the validating institutions,”
  • “They have convinced voters that the media cannot be trusted; they have gotten them used to ignoring inconvenient facts about policy; and they have abolished standards of discourse.”
  • What had been the party’s fantastical fringe became its middle. Reasonable Republicanism was replaced by absolutism: no new taxes, virtually no regulation, abolish the EPA and the IRS and the Federal Reserve.
  • The Christian takeover happened gradually, but then quickly in the end, like a phase change from liquid to gas. In 2008, three-quarters of the major GOP presidential candidates said they believed in evolution, but in 2012 it was down to a third, and then in 2016, just one did
  • A two-to-one majority of Republicans say they “support establishing Christianity as the national religion,” according to Public Policy Polling.
  • Although constitutionally the U.S. can have no state religion, faith of some kind has always bordered on mandatory for politicians.
  • What connects them all, of course, is the new, total American embrace of admixtures of reality and fiction and of fame for fame’s sake. His reality was a reality show before that genre or term existed
  • When he entered political show business, after threatening to do so for most of his adult life, the character he created was unprecedented—presidential candidate as insult comic with an artificial tan and ridiculous hair, shamelessly unreal and whipped into shape as if by a pâtissier.
  • Republicans hated Trump’s ideological incoherence—they didn’t yet understand that his campaign logic was a new kind, blending exciting tales with a showmanship that transcends ideology.
  • Trump waited to run for president until he sensed that a critical mass of Americans had decided politics were all a show and a sham. If the whole thing is rigged, Trump’s brilliance was calling that out in the most impolitic ways possible, deriding his straight-arrow competitors as fakers and losers and liars—because that bullshit-calling was uniquely candid and authentic in the age of fake.
  • Trump took a key piece of cynical wisdom about show business—the most important thing is sincerity, and once you can fake that, you’ve got it made—to a new level: His actual thuggish sincerity is the opposite of the old-fashioned, goody-goody sanctimony that people hate in politicians.
  • Trump’s genius was to exploit the skeptical disillusion with politics—there’s too much equivocating; democracy’s a charade—but also to pander to Americans’ magical thinking about national greatness. Extreme credulity is a fraternal twin of extreme skepticism.
  • Trump launched his political career by embracing a brand-new conspiracy theory twisted around two American taproots—fear and loathing of foreigners and of nonwhites.
  • The fact-checking website PolitiFact looked at more than 400 of his statements as a candidate and as president and found that almost 50 percent were false and another 20 percent were mostly false.
  • He gets away with this as he wouldn’t have in the 1980s or ’90s, when he first talked about running for president, because now factual truth really is just one option. After Trump won the election, he began referring to all unflattering or inconvenient journalism as “fake news.”
  • indeed, their most honest defense of his false statements has been to cast them practically as matters of religious conviction—he deeply believes them, so … there. When White House Press Secretary Sean Spicer was asked at a press conference about the millions of people who the president insists voted illegally, he earnestly reminded reporters that Trump “has believed that for a while” and “does believe that” and it’s “been a long-standing belief that he’s maintained” and “it’s a belief that he has maintained for a while.”
  • Which is why nearly half of Americans subscribe to that preposterous belief themselves. And in Trump’s view, that overrides any requirement for facts.
  • The idea that progress has some kind of unstoppable momentum, as if powered by a Newtonian law, was always a very American belief. However, it’s really an article of faith, the Christian fantasy about history’s happy ending reconfigured during and after the Enlightenment as a set of modern secular fantasies
  • I really can imagine, for the first time in my life, that America has permanently tipped into irreversible decline, heading deeper into Fantasyland. I wonder whether it’s only America’s destiny, exceptional as ever, to unravel in this way. Or maybe we’re just early adopters, the canaries in the global mine
  • I do despair of our devolution into unreason and magical thinking, but not everything has gone wrong.
  • I think we can slow the flood, repair the levees, and maybe stop things from getting any worse. If we’re splitting into two different cultures, we in reality-based America—whether the blue part or the smaller red part—must try to keep our zone as large and robust and attractive as possible for ourselves and for future generations
  • We need to firmly commit to Moynihan’s aphorism about opinions versus facts. We must call out the dangerously untrue and unreal
  • do not give acquaintances and friends and family members free passes. If you have children or grandchildren, teach them to distinguish between true and untrue as fiercely as you do between right and wrong and between wise and foolish.
  • How many Americans now inhabit alternate realities?
  • reams of survey research from the past 20 years reveal a rough, useful census of American credulity and delusion. By my reckoning, the solidly reality-based are a minority, maybe a third of us but almost certainly fewer than half.
  • Only a third of us, for instance, don’t believe that the tale of creation in Genesis is the word of God. Only a third strongly disbelieve in telepathy and ghosts. Two-thirds of Americans believe that “angels and demons are active in the world.”
  • A third of us believe not only that global warming is no big deal but that it’s a hoax perpetrated by scientists, the government, and journalists. A third believe that our earliest ancestors were humans just like us; that the government has, in league with the pharmaceutical industry, hidden evidence of natural cancer cures; that extraterrestrials have visited or are visiting Earth.
lmunch

United States Records Its Worst Week Yet for Virus Cases - The New York Times - 0 views

  • The country reported a record of more than 500,000 new coronavirus cases in the past week.
  • Almost a third saw a record in the past week.
  • In the Upper Midwest and Mountain West, records are being smashed almost daily, and in some counties as much as 5 percent of the population has tested positive for the virus to date.
  • Hospitalization data, which the Covid Tracking Project collects at the state level, shows that the number of people hospitalized with the coronavirus reached record highs in almost half of states in recent weeks.
  • Recent studies have provided some hope that improved treatment has led to a better survival rate among those ill enough to be hospitalized. But experts worry that the 46 percent increase in hospitalizations compared with a month ago could overwhelm hospital capacity — especially in rural areas with limited health resources — and roll back improvements in survival rates.
  • In the past month, about a third of U.S. counties hit a daily record of more deaths than any other time during the pandemic.
  • The daily death toll is lower than it was at its peak, but on average, about 800 people who contracted the coronavirus are dying each day
  • The recent surge in cases has not yet brought a similar surge in reported deaths, which can lag cases by up to several weeks. But already deaths are increasing in about half of states.
  • The outlook for the pandemic continues to worsen, and many areas of the United States are experiencing their worst weeks yet.
  • It’s not just a few areas driving the surge, as was the case early on. Half of U.S. counties saw new cases peak during the past month.
  • And in some less populous places, a record number is not necessarily a very high one. Orleans County, Vt., for example, saw eight cases in the past week — a record for the rural county of about 27,000 people on the Canadian border, but hardly a severe outbreak.
  • Taylor County, Fla., a Gulf Coast county of similar size, had 32 cases in the past week, four times as many as Orleans but far fewer than the record 600 new cases it had during the first week of August.
Javier E

In fight to lead America's future, battle rages over its racial past - The Washington Post - 0 views

  • The political and pedagogical firefight encapsulates a broader debate that has erupted across the country about what to teach about race, history and the intersection of the two. It underscores how the nation’s metastasizing culture wars — now firmly ensconced in the nation’s classrooms — have broadened to strip Americans of a shared sense of history, leaving many to view the past through the filter of contemporary polarization.
  • “Most of our prior arguments were about who to include in the story, not the story itself,” said Jonathan Zimmerman, a University of Pennsylvania professor who studies the history of education. “America has lost a shared national narrative.”
  • history has become a defining topic for contenders angling for the presidency.
  • DeSantis, whose “anti-woke” agenda has put Florida at the forefront of revising how Black history is taught, has come under fire for supporting a set of standards for middle school instruction that include teaching “how slaves developed skills which, in some instances, could be applied for their personal benefit.”
  • It all portends hopeless bifurcation, Zimmerman said, suggesting that America is becoming a place where what students learn about the country’s past will depend whether they live in a conservative or liberal state.
  • Conservatives contend that instruction on race and history has shifted since then to reflect liberal ideologies and values in ways inappropriate to the schoolhouse. They have advocated returning to a more traditional way of teaching American history, one less critical of the nation’s past flaws and less explicit about linking current inequalities to past injustices.
  • The dueling American histories “are about not just what has happened, but what we do about it going forward,” he said. “If you can tell a story that removes the harm that has been done, if you can tell a story that removes the violence, that removes the disenfranchisement, that removes the targeting of certain communities — then what you do is you change the way we believe we have to deal with it.”
  • Events during Trump’s presidency — including a deadly white-supremacist rally in Charlottesville, the murder of George Floyd and the publication of the New York Times’s 1619 Project reexamining the role of slavery in America’s founding — propelled the country toward a cultural conflagration over the idea that America’s history of systemic racism was still affecting minorities today.
  • Various institutions embraced the Black Lives Matter movement and sought to take actions aimed at acknowledging and curing past injustices. The movement was especially potent among liberals, and then-candidate Biden reoriented much of his campaign in the summer of 2020 to focus on “equity.”
  • A string of recent activity — from Supreme Court decision striking down college affirmative action programs to mass shootings by white supremacists to book bans by some Republican officials — has propelled the issue back to the forefront of Democratic agenda.
  • After the 2020 protests over Floyd’s murder, more than 160 Confederate memorials were removed, relocated or renamed, according to the Southern Poverty Law Center. Still, a January 2022 report found that there were still 723 monuments and 741 roadways dedicated to Confederates, as well as hundreds of schools, counties, parks, buildings and holidays.
  • “There are states that can remove history from a textbook, but they can never destroy the physical places where history happened,” Leggs said. “Historic preservation is all the more important at this moment in our history, and through our work, we can ignite both a cultural reckoning and cultural renaissance.”
Javier E

Is Holocaust Education Making Anti-Semitism Worse? - The Atlantic - 0 views

  • The recent rise in American anti-Semitism is well documented. I could fill pages with FBI hate-crime statistics, or with a list of violent attacks from the past six years or even the past six months, or with the growing gallery of American public figures saying vile things about Jews. Or I could share stories you probably haven’t heard, such as one about a threatened attack on a Jewish school in Ohio in March 2022—where the would-be perpetrator was the school’s own security guard. But none of that would capture the vague sense of dread one encounters these days in the Jewish community, a dread unprecedented in my lifetime.
  • What I didn’t expect was the torrent of private stories I received from American Jew
  • well-meaning people everywhere from statehouses to your local middle school have responded to this surging anti-Semitism by doubling down on Holocaust education. Before 2016, only seven states required Holocaust education in schools. In the past seven years, 18 more have passed Holocaust-education mandates
  • These casual stories sickened me in their volume and their similarity, a catalog of small degradations. At a time when many people in other minority groups have become bold in publicizing the tiniest of slights, these American Jews instead expressed deep shame in sharing these stories with me, feeling that they had no right to complain. After all, as many of them told me, it wasn’t the Holocaust.
  • These people talked about bosses and colleagues who repeatedly ridiculed them with anti-Semitic “jokes,” friends who turned on them when they mentioned a son’s bar mitzvah or a trip to Israel, romantic partners who openly mocked their traditions, classmates who defaced their dorm rooms and pilloried them online, teachers and neighbors who parroted conspiratorial lies. I was surprised to learn how many people were getting pennies thrown at them in 21st-century America.
  • the blood libel, which would later be repurposed as a key part of the QAnon conspiracy theory. This craze wasn’t caused by one-party control over printing presses, but by the lie’s popularity
  • I have come to the disturbing conclusion that Holocaust education is incapable of addressing contemporary anti-Semitism. In fact, in the total absence of any education about Jews alive today, teaching about the Holocaust might even be making anti-Semitism worse.
  • The Illinois Holocaust Museum & Education Center is a victim of its own success. When I arrived on a weekday morning to join a field trip from a local Catholic middle school, the museum was having a light day, with only 160 students visiting
  • the docent established that the ’30s featured media beyond town criers, and that one-party control over such media helped spread propaganda. “If radio’s controlled by a certain party, you have to question that,” she said. “Back then, they didn’t.”
  • I wondered about that premise. Historians have pointed out that it doesn’t make sense to assume that people in previous eras were simply stupider than we are, and I doubted that 2020s Americans could outsmart 1930s Germans in detecting media bias. Propaganda has been used to incite violent anti-Semitism since ancient times, and only rarely because of one-party control.
  • The Nazi project was about murdering Jews, but also about erasing Jewish civilization. The museum’s valiant effort to teach students that Jews were “just like everyone else,” after Jews have spent 3,000 years deliberately not being like everyone else, felt like another erasure.
  • I was starting to see how isolating the Holocaust from the rest of Jewish history made it hard for even the best educators to upload this irrational reality into seventh-grade brains.
  • the docent began by saying, “Let’s establish facts. Is Judaism a religion or a nationality?”
  • My stomach sank. The question betrayed a fundamental misunderstanding of Jewish identity—Jews predate the concepts of both religion and nationality. Jews are members of a type of social group that was common in the ancient Near East but is uncommon in the West today: a joinable tribal group with a shared history, homeland, and culture, of which a nonuniversalizing religion is but one feature
  • Millions of Jews identify as secular, which would be illogical if Judaism were merely a religion. But every non-Jewish society has tried to force Jews into whatever identity boxes it knows best—which is itself a quiet act of domination.
  • “Religion, right,” the docent affirmed. (Later, in the gallery about Kristallnacht, she pointed out how Jews had been persecuted for having the “wrong religion,” which would have surprised the many Jewish converts to Christianity who wound up murdered. I know the docent knew this; she later told me she had abbreviated things to hustle our group to the museum’s boxcar.)
  • The docent motioned toward the prewar gallery’s photos showing Jewish school groups and family outings, and asked how the students would describe their subjects’ lives, based on the pictures.“Normal,” a girl said.“Normal, perfect,” the docent said. “They paid taxes, they fought in the wars—all of a sudden, things changed.”
  • the museum had made a conscious decision not to focus on the long history of anti-Semitism that preceded the Holocaust, and made it possible. To be fair, adequately covering this topic would have required an additional museum
  • The bedrock assumption that has endured for nearly half a century is that learning about the Holocaust inoculates people against anti-Semitism. But it doesn’t
  • Then there was the word normal. More than 80 percent of Jewish Holocaust victims spoke Yiddish, a 1,000-year-old European Jewish language spoken around the world, with its own schools, books, newspapers, theaters, political organizations, advertising, and film industry. On a continent where language was tightly tied to territory, this was hardly “normal.” Traditional Jewish practices—which include extremely detailed rules governing food and clothing and 100 gratitude blessings recited each day—were not “normal” either.
  • the idea of sudden change—referring to not merely the Nazi takeover, but the shift from a welcoming society to an unwelcoming one—was also reinforced by survivors in videos around the museum
  • Teaching children that one shouldn’t hate Jews, because Jews are “normal,” only underlines the problem: If someone doesn’t meet your version of “normal,” then it’s fine to hate them.
  • When I asked about worst practices in Holocaust education, Szany had many to share, which turned out to be widely agreed-upon among American Holocaust educators.
  • First on the list: “simulations.” Apparently some teachers need to be told not to make students role-play Nazis versus Jews in class, or not to put masking tape on the floor in the exact dimensions of a boxcar in order to cram 200 students into it.
  • Szany also condemned Holocaust fiction such as the international best seller The Boy in the Striped Pajamas, an exceedingly popular work of ahistorical Christian-savior schlock
  • She didn’t feel that Anne Frank’s diary was a good choice either, because it’s “not a story of the Holocaust”—it offers little information about most Jews’ experiences of persecution, and ends before the author’s capture and murder.
  • Other officially failed techniques include showing students gruesome images, and prompting self-flattery by asking “What would you have done?”
  • Yet another bad idea is counting objects. This was the conceit of a widely viewed 2004 documentary called Paper Clips, in which non-Jewish Tennessee schoolchildren, struggling to grasp the magnitude of 6 million murdered Jews, represented those Jews by collecting millions of paper clips
  • it is demeaning to represent Jewish people as office supplies.
  • Best practices, Szany explained, are the opposite: focusing on individual stories, hearing from survivors and victims in their own words. The Illinois museum tries to “rescue the individuals from the violence,
  • In the language I often encountered in Holocaust-education resources, people who lived through the Holocaust were neatly categorized as “perpetrators,” “victims,” “bystanders,” or “upstanders.” Jewish resisters, though, were rarely classified as “upstanders.”
  • I felt as I often had with actual Holocaust survivors I’d known when I was younger: frustrated as they answered questions I hadn’t asked, and vaguely insulted as they treated me like an annoyance to be managed. (I bridged this divide once I learned Yiddish in my 20s, and came to share with them a vast vocabulary of not only words, but people, places, stories, ideas—a way of thinking and being that contained not a few horrific years but centuries of hard-won vitality and resilience.)
  • Szany at last explained to me what the dead Elster couldn’t: The woman who sheltered his sister took only girls because it was too easy for people to confirm that the boys were Jews.
  • I realized that I wouldn’t have wanted to hear this answer from Elster. I did not want to make this thoughtful man sit onstage and discuss his own circumcision with an audience of non-Jewish teenagers. The idea felt just as dehumanizing as pulling down a boy’s pants to reveal a reality of embodied Judaism that, both here and in that barn, had been drained of any meaning beyond persecution
  • Here I am in a boxcar, I thought, and tried to make it feel real. I spun my head to take in the immersive scene, which swung around me as though I were on a rocking ship. I felt dizzy and disoriented, purely physical feelings that distracted me. Did this not count as a simulation?
  • I had visited Auschwitz in actual reality, years ago. With my headset on, I tried to summon the emotional intensity I remembered feeling then. But I couldn’t, because all of the things that had made it powerful were missing. When I was there, I was touching things, smelling things, sifting soil between my fingers that the guide said contained human bone ash, feeling comforted as I recited the mourner’s prayer, the kaddish, with others, the ancient words an undertow of paradox and praise: May the great Name be blessed, forever and ever and ever
  • Students at the Skokie museum can visit an area called the Take a Stand Center, which opens with a bright display of modern and contemporary “upstanders,” including activists such as the Nobel laureate Malala Yousafzai and the athlete Carli Lloyd. Szany had told me that educators “wanted more resources” to connect “the history of the Holocaust to lessons of today.” (I heard this again and again elsewhere too.) As far as I could discern, almost nobody in this gallery was Jewish.
  • As Szany ran a private demo of the technology for me, I asked how visitors react to it. “They’re more comfortable with the holograms than the real survivors,” Szany said. “Because they know they won’t be judged.”
  • Yet the post-Holocaust activists featured in this gallery were nearly all people who had stood up for their own group. Only Jews, the unspoken assumption went, were not supposed to stand up for themselves.
  • Visitors were asked to “take the pledge” by posting notes on a wall (“I pledge to protect the Earth!” “I pledge to be KIND!”)
  • It was all so earnest that for the first time since entering the museum, I felt something like hope. Then I noticed it: “Steps for Organizing a Demonstration.” The Nazis in Skokie, like their predecessors, had known how to organize a demonstration. They hadn’t been afraid to be unpopular. They’d taken a stand.
  • I left the museum haunted by the uncomfortable truth that the structures of a democratic society could not really prevent, and could even empower, dangerous, irrational rage. Something of that rage haunted me too.
  • the more I thought about it, the less obvious it seemed. What were students being taught to “take a stand” for? How could anyone, especially young people with little sense of proportion, connect the murder of 6 million Jews to today without landing in a swamp of Holocaust trivialization, like the COVID-protocol protesters who’d pinned Jewish stars to their shirts and carried posters of Anne Frank?
  • weren’t they and others like them doing exactly what Holocaust educators claimed they wanted people to do?
  • The 2019 law was inspired by a changing reality in Washington and around the country. In recent years, Kennedy said, she’s received more and more messages about anti-Semitic vandalism and harassment in schools. For example, she told me, “someone calls and says, ‘There’s a swastika drawn in the bathroom.’ ”
  • Maybe not, Kennedy admitted. “What frightens me is that small acts of anti-Semitism are becoming very normalized,” she said. “We’re getting used to it. That keeps me up at night.” “Sadly, I don’t think we can fix this,” Regelbrugge said. “But we’re gonna die trying.”
  • Almost every city where I spoke with Holocaust-museum educators, whether by phone or in person, had also been the site of a violent anti-Semitic attack in the years since these museums had opened
  • I was struck by how minimally these attacks were discussed in the educational materials shared by the museums.
  • In fact, with the exception of Kennedy and Regelbrugge, no one I spoke with mentioned these anti-Semitic attacks at all.
  • The failure to address contemporary anti-Semitism in most of American Holocaust education is, in a sense, by design
  • the story of the (mostly non-Jewish) teachers in Massachusetts and New Jersey who created the country’s first Holocaust curricula, in the ’70s. The point was to teach morality in a secular society. “Everyone in education, regardless of ethnicity, could agree that Nazism was evil and that the Jews were innocent victims,” Fallace wrote, explaining the topic’s appeal. “Thus, teachers used the Holocaust to activate the moral reasoning of their students”—to teach them to be good people.
  • The idea that Holocaust education can somehow serve as a stand-in for public moral education has not left us. And because of its obviously laudable goals, objecting to it feels like clubbing a baby seal. Who wouldn’t want to teach kids to be empathetic?
  • by this logic, shouldn’t Holocaust education, because of its moral content alone, automatically inoculate people against anti-Semitism?
  • Apparently not. “Essentially the moral lessons that the Holocaust is often used to teach reflect much the same values that were being taught in schools before the Holocaust,”
  • (Germans in the ’30s, after all, were familiar with the Torah’s commandment, repeated in the Christian Bible, to love their neighbors.) This fact undermines nearly everything Holocaust education is trying to accomplish, and reveals the roots of its failure.
  • One problem with using the Holocaust as a morality play is exactly its appeal: It flatters everyone. We can all congratulate ourselves for not committing mass murder.
  • This approach excuses current anti-Semitism by defining anti-Semitism as genocide in the past
  • When anti-Semitism is reduced to the Holocaust, anything short of murdering 6 million Jews—like, say, ramming somebody with a shopping cart, or taunting kids at school, or shooting up a Jewish nonprofit, or hounding Jews out of entire countries—seems minor by comparison.
  • If we teach that the Holocaust happened because people weren’t nice enough—that they failed to appreciate that humans are all the same, for instance, or to build a just society—we create the self-congratulatory space where anti-Semitism grows
  • One can believe that humans are all the same while being virulently anti-Semitic, because according to anti-Semites, Jews, with their millennia-old insistence on being different from their neighbors, are the obstacle to humans all being the same
  • One can believe in creating a just society while being virulently anti-Semitic, because according to anti-Semites, Jews, with their imagined power and privilege, are the obstacle to a just society
  • To inoculate people against the myth that humans have to erase their differences in order to get along, and the related myth that Jews, because they have refused to erase their differences, are supervillains, one would have to acknowledge that these myths exist
  • To really shatter them, one would have to actually explain the content of Jewish identity, instead of lazily claiming that Jews are just like everyone else.
  • one of several major Holocaust-curriculum providers, told me about the “terrible Jew jokes” she’d heard from her own students in Virginia. “They don’t necessarily know where they come from or even really why they’re saying them,” Goss said. “Many kids understand not to say the N-word, but they would say, ‘Don’t be such a Jew.’ ”
  • There’s a decline in history education at the same time that there’s a rise in social media,”
  • “We’ve done studies with our partners at Holocaust centers that show that students are coming in with questions about whether the Holocaust was an actual event. That wasn’t true 20 years ago.”
  • Goss believes that one of the reasons for the lack of stigma around anti-Semitic conspiracy theories and jokes is baked into the universal-morality approach to Holocaust education. “The Holocaust is not a good way to teach about ‘bullying,’ 
  • Echoes & Reflections’ lesson plans do address newer versions of anti-Semitism, including the contemporary demonization of Israel’s existence—as opposed to criticism of Israeli policies—and its manifestation in aggression against Jews. Other Holocaust-curriculum providers also have material on contemporary anti-Semitism.
  • providers rarely explain or explore who Jews are today—and their raison d’être remains Holocaust education.
  • Many teachers had told me that their classrooms “come alive” when they teach about the Holocaust
  • Holocaust-education materials are just plain better than those on most other historical topics. All of the major Holocaust-education providers offer lessons that teachers can easily adapt for different grade levels and subject areas. Instead of lecturing and memorization, they use participation-based methods such as group work, hands-on activities, and “learner driven” projects.
  • A 2019 Pew Research Center survey found a correlation between “warm” feelings about Jews and knowledge about the Holocaust—but the respondents who said they knew a Jewish person also tended to be more knowledgeable about the Holocaust, providing a more obvious source for their feelings
  • In 2020, Echoes & Reflections published a commissioned study of 1,500 college students, comparing students who had been exposed to Holocaust education in high school with those who hadn’t. The published summary shows that those who had studied the Holocaust were more likely to tolerate diverse viewpoints, and more likely to privately support victims of bullying scenarios, which is undoubtedly good news. It did not, however, show a significant difference in respondents’ willingness to defend victims publicly, and students who’d received Holocaust education were less likely to be civically engaged—in other words, to be an “upstander.”
  • These studies puzzled me. As Goss told me, the Holocaust was not about bullying—so why was the Echoes study measuring that? More important, why were none of these studies examining awareness of anti-Semitism, whether past or present?
  • One major study addressing this topic was conducted in England, where a national Holocaust-education mandate has been in place for more than 20 years. In 2016, researchers at University College London’s Centre for Holocaust Education published a survey of more than 8,000 English secondary-school students, including 244 whom they interviewed at length.
  • The study’s most disturbing finding was that even among those who studied the Holocaust, there was “a very common struggle among many students to credibly explain why Jews were targeted” in the Holocaust—that is, to cite anti-Semitism
  • “many students appeared to regard [Jews’] existence as problematic and a key cause of Nazi victimisation.” In other words, students blamed the Holocaust on the Jews
  • This result resembles that of a large 2020 survey of American Millennials and Gen Zers, in which 11 percent of respondents believed that Jews caused the Holocaust. The state with the highest percentage of respondents believing this—an eye-popping 19 percent—was New York, which has mandated Holocaust education since the 1990s.
  • Worse, in the English study, “a significant number of students appeared to tacitly accept some of the egregious claims once circulated by Nazi propaganda,” instead of recognizing them as anti-Semitic myths.
  • One typical student told researchers, “Is it because like they were kind of rich, so maybe they thought that that was kind of in some way evil, like the money didn’t belong to them[;] it belonged to the Germans and the Jewish people had kind of taken that away from them?
  • Another was even more blunt: “The Germans, when they saw the Jews were better off than them, kind of, I don’t know, it kind of pissed them off a bit.” Hitler’s speeches were more eloquent in making similar points.
  • One of the teachers I met was Benjamin Vollmer, a veteran conference participant who has spent years building his school’s Holocaust-education program. He teaches eighth-grade English in Venus, Texas, a rural community with 5,700 residents; his school is majority Hispanic, and most students qualify for free or reduced-price lunch. When I asked him why he focuses on the Holocaust, his initial answer was simple: “It meets the TEKS.”
  • The TEKS are the Texas Essential Knowledge and Skills, an elaborate list of state educational requirements that drive standardized testing
  • it became apparent that Holocaust education was something much bigger for his students: a rare access point to a wider world. Venus is about 30 miles from Dallas, but Vollmer’s annual Holocaust-museum field trip is the first time that many of his students ever leave their town.
  • “It’s become part of the school culture,” Vollmer said. “In eighth grade, they walk in, and the first thing they ask is, ‘When are we going to learn about the Holocaust?’
  • Vollmer is not Jewish—and, as is common for Holocaust educators, he has never had a Jewish student. (Jews are 2.4 percent of the U.S. adult population, according to a 2020 Pew survey.) Why not focus on something more relevant to his students, I asked him, like the history of immigration or the civil-rights movement?
  • I hadn’t yet appreciated that the absence of Jews was precisely the appeal.“Some topics have been so politicized that it’s too hard to teach them,” Vollmer told me. “Making it more historical takes away some of the barriers to talking about it.”
  • Wouldn’t the civil-rights movement, I asked, be just as historical for his students? He paused, thinking it through. “You have to build a level of rapport in your class before you have the trust to explore your own history,” he finally said.
  • “The Holocaust happened long ago, and we’re not responsible for it,” she said. “Anything happening in our world today, the wool comes down over our eyes.” Her colleague attending the conference with her, a high-school teacher who also wouldn’t share her name, had tried to take her mostly Hispanic students to a virtual-reality experience called Carne y Arena, which follows migrants attempting to illegally cross the U.S.-Mexico border. Her administrators refused, claiming that it would traumatize students. But they still learn about the Holocaust.
  • Student discomfort has been a legal issue in Texas. The state’s House Bill 3979, passed in 2021, is one of many “anti-critical-race-theory” laws that conservative state legislators have introduced since 2020. The bill forbade teachers from causing students “discomfort, guilt, anguish, or any other form of psychological distress on account of the individual’s race or sex,” and also demanded that teachers introduce “diverse and contending perspectives” when teaching “controversial” topics, “without giving deference to any one perspective.
  • These vaguely worded laws stand awkwardly beside a 2019 state law mandating Holocaust education for Texas students at all grade levels during an annual Holocaust Remembrance Week
  • the administrator who’d made the viral remarks in Southlake is a strong proponent of Holocaust education, but was acknowledging a reality in that school district. Every year, the administrator had told Higgins, some parents in her district object to their children reading the Nobel laureate Elie Wiesel’s memoir Night—because it isn’t their “belief” that the Holocaust happened.
  • In one model lesson at the conference, participants examined a speech by the Nazi official Heinrich Himmler about the need to murder Jews, alongside a speech by the Hebrew poet and ghetto fighter Abba Kovner encouraging a ghetto uprising. I only later realized that this lesson plan quite elegantly satisfied the House bill’s requirement of providing “contending perspectives.”
  • The next day, I asked the instructor if that was an unspoken goal of her lesson plan. With visible hesitation, she said that teaching in Texas can be like “walking the tightrope.” This way, she added, “you’re basing your perspectives on primary texts and not debating with Holocaust deniers.” Less than an hour later, a senior museum employee pulled me aside to tell me that I wasn’t allowed to interview the staff.
  • Many of the visiting educators at the conference declined to talk with me, even anonymously; nearly all who did spoke guardedly. The teachers I met, most of whom were white Christian women, did not seem to be of any uniform political bent. But virtually all of them were frustrated by what administrators and parents were demanding of them.
  • Two local middle-school teachers told me that many parents insist on seeing reading lists. Parents “wanting to keep their kid in a bubble,” one of them said, has been “the huge stumbling block.”
  • “It is healthy to begin this study by talking about anti-Semitism, humanizing the victims, sticking to primary sources, and remaining as neutral as possible.”
  • Wasn’t “remaining as neutral as possible” exactly the opposite of being an upstander?
  • In trying to remain neutral, some teachers seemed to want to seek out the Holocaust’s bright side—and ask dead Jews about it
  • We watched a brief introduction about Glauben’s childhood and early adolescence in the Warsaw Ghetto and in numerous camps. When the dead man appeared, one teacher asked, “Was there any joy or happiness in this ordeal? Moments of joy in the camps?”
  • These experiences, hardly unusual for Jewish victims, were not the work of a faceless killing machine. Instead they reveal a gleeful and imaginative sadism. For perpetrators, this was fun. Asking this dead man about “joy” seemed like a fundamental misunderstanding of the Holocaust. There was plenty of joy, just on the Nazi side.
  • In the educational resources I explored, I did not encounter any discussions of sadism—the joy derived from humiliating people, the dopamine hit from landing a laugh at someone else’s expense, the self-righteous high from blaming one’s problems on others—even though this, rather than the fragility of democracy or the passivity of bystanders, is a major origin point of all anti-Semitism
  • To anyone who has spent 10 seconds online, that sadism is familiar, and its source is familiar too: the fear of being small, and the desire to feel big by making others feel small instead.
  • Nazis were, among other things, edgelords, in it for the laughs. So, for that matter, were the rest of history’s anti-Semites, then and now. For Americans today, isn’t this the most relevant insight of all?
  • “People say we’ve learned from the Holocaust. No, we didn’t learn a damn thing,”
  • “People glom on to this idea of the upstander,” she said. “Kids walk away with the sense that there were a lot of upstanders, and they think, Yes, I can do it too.”
  • The problem with presenting the less inspiring reality, she suggested, is how parents or administrators might react. “If you teach historical anti-Semitism, you have to teach contemporary anti-Semitism. A lot of teachers are fearful, because if you try to connect it to today, parents are going to call, or administrators are going to call, and say you’re pushing an agenda.”
  • But weren’t teachers supposed to “push an agenda” to stop hatred? Wasn’t that the entire hope of those survivors who built museums and lobbied for mandates and turned themselves into holograms?
  • I asked Klett why no one seemed to be teaching anything about Jewish culture. If the whole point of Holocaust education is to “humanize” those who were “dehumanized,” why do most teachers introduce students to Jews only when Jews are headed for a mass grave? “There’s a real fear of teaching about Judaism,” she confided. “Especially if the teacher is Jewish.”
  • Teachers who taught about industrialized mass murder were scared of teaching about … Judaism? Why?
  • “Because the teachers are afraid that the parents are going to say that they’re pushing their religion on the kids.”
  • “Survivors have told me, ‘Thank you for teaching this. They’ll listen to you because you’re not Jewish,’ ” she said. “Which is weird.”
  • perhaps we could be honest and just say “There is no point in teaching any of this”—because anti-Semitism is so ingrained in our world that even when discussing the murders of 6 million Jews, it would be “pushing an agenda” to tell people not to hate them, or to tell anyone what it actually means to be Jewish
  • The Dallas Museum was the only one I visited that opened with an explanation of who Jews are. Its exhibition began with brief videos about Abraham and Moses—limiting Jewish identity to a “religion” familiar to non-Jews, but it was better than nothing. The museum also debunked the false charge that the Jews—rather than the Romans—killed Jesus, and explained the Jews’ refusal to convert to other faiths. It even had a panel or two about contemporary Dallas Jewish life. Even so, a docent there told me that one question students ask is “Are any Jews still alive today?”
  • American Holocaust education, in this museum and nearly everywhere else, never ends with Jews alive today. Instead it ends by segueing to other genocides, or to other minorities’ suffering
  • But when one reaches the end of the exhibition on American slavery at the National Museum of African American History and Culture, in Washington, D.C., one does not then enter an exhibition highlighting the enslavement of other groups throughout world history, or a room full of interactive touchscreens about human trafficking today, asking that visitors become “upstanders” in fighting it
  • That approach would be an insult to Black history, ignoring Black people’s current experiences while turning their past oppression into nothing but a symbol for something else, something that actually matters.
  • It is dehumanizing to be treated as a symbol. It is even more dehumanizing to be treated as a warning.
  • How should we teach children about anti-Semitism?
  • Decoster began her conference workshop by introducing “vocabulary must-knows.” At the top of her list: anti-Semitism.
  • “If you don’t explain the ism,” she cautioned the teachers in the room, “you will need to explain to the kids ‘Why the Jews?’ Students are going to see Nazis as aliens who bring with them anti-Semitism when they come to power in ’33, and they take it back away at the end of the Holocaust in 1945.”
  • She asked the teachers, “What’s the first example of the persecution of the Jews in history?”
  • “Think ancient Egypt,” Decoster said. “Does this sound familiar to any of you?”“They’re enslaved by the Egyptian pharaoh,” a teacher said
  • I wasn’t sure that the biblical Exodus narrative exactly qualified as “history,” but it quickly became clear that wasn’t Decoster’s point. “Why does the pharaoh pick on the Jews?” she asked. “Because they had one God.”
  • I was stunned. Rarely in my journey through American Holocaust education did I hear anyone mention a Jewish belief.
  • “The Jews worship one God, and that’s their moral structure. Egyptian society has multiple gods whose authority goes to the pharaoh. When things go wrong, you can see how Jews as outsiders were perceived by the pharaoh as the threat.”
  • This unexpected understanding of Jewish belief revealed a profound insight about Judaism: Its rejection of idolatry is identical to its rejection of tyranny. I could see how that might make people uncomfortable.
  • Decoster moved on to a snazzy infographic of a wheel divided in thirds, each explaining a component of anti-Semitism
  • “Racial Antisemitism = False belief that Jews are a race and a threat to other races,”
  • Anti-Judaism = Hatred of Jews as a religious group,”
  • then “Anti-Jewish Conspiracy Theory = False belief that Jews want to control and overtake the world.” The third part, the conspiracy theory, was what distinguished anti-Semitism from other bigotries. It allowed closed-minded people to congratulate themselves for being open-minded—for “doing their own research,” for “punching up,” for “speaking truth to power,” while actually just spreading lies.
  • Wolfson clarified for his audience what this centuries-long demonization of Jews actually means, citing the scholar David Patterson, who has written: “In the end, the antisemite’s claim is not that all Jews are evil, but rather that all evil is Jewish.”
  • Wolfson told the teachers that it was important that “anti-Semitism should not be your students’ first introduction to Jews and Judaism.” He said this almost as an aside, just before presenting the pig-excrement image. “If you’re teaching about anti-Semitism before you teach about the content of Jewish identity, you’re doing it wrong.
  • this—introducing students to Judaism by way of anti-Semitism—was exactly what they were doing. The same could be said, I realized, for nearly all of American Holocaust education.
  • The Holocaust educators I met across America were all obsessed with building empathy, a quality that relies on finding commonalities between ourselves and others.
  • a more effective way to address anti-Semitism might lie in cultivating a completely different quality, one that happens to be the key to education itself: curiosity. Why use Jews as a means to teach people that we’re all the same, when the demand that Jews be just like their neighbors is exactly what embedded the mental virus of anti-Semitism in the Western mind in the first place? Why not instead encourage inquiry about the diversity, to borrow a de rigueur word, of the human experience?
  • I want a hologram of the late Rabbi Jonathan Sacks telling people about what he called “the dignity of difference.”
  • I want to mandate this for every student in this fractured and siloed America, even if it makes them much, much more uncomfortable than seeing piles of dead Jews does
  • There is no empathy without curiosity, no respect without knowledge, no other way to learn what Jews first taught the world: love your neighbor
Javier E

Opinion | Climate Change Is Real. Markets, Not Governments, Offer the Cure. - The New Y... - 0 views

  • For years, I saw myself not as a global-warming denier (a loaded term with its tendentious echo of Holocaust denial) but rather as an agnostic on the causes of climate change and a scoffer at the idea that it was a catastrophic threat to the future of humanity.
  • It’s not that I was unalterably opposed to the idea that, by pumping carbon dioxide into the atmosphere, modern civilization was contributing to the warming by 1 degree Celsius and the inches of sea-level rise the planet had experienced since the dawn of the industrial age. It’s that the severity of the threat seemed to me wildly exaggerated and that the proposed cures all smacked of old-fashioned statism mixed with new-age religion.
  • Hadn’t we repeatedly lived through previous alarms about other, allegedly imminent, environmental catastrophes that didn’t come to pass, like the belief, widespread in the 1970s, that overpopulation would inevitably lead to mass starvation? And if the Green Revolution had spared us from that Malthusian nightmare, why should we not have confidence that human ingenuity would also prevent the parade of horribles that climate change was supposed to bring about?
  • ...63 more annotations...
  • I had other doubts, too. It seemed hubristic, or worse, to make multitrillion-dollar policy bets based on computer models trying to forecast climate patterns decades into the future. Climate activists kept promoting policies based on technologies that were either far from mature (solar energy) or sometimes actively harmful (biofuels).
  • Expensive efforts to curb greenhouse gas emissions in Europe and North America seemed particularly fruitless when China, India and other developing countries weren’t about to curb their own appetite for fossil fuels
  • just how fast is Greenland’s ice melting right now? Is this an emergency for our time, or is it a problem for the future?
  • His pitch was simple: The coastline we have taken for granted for thousands of years of human history changed rapidly in the past on account of natural forces — and would soon be changing rapidly and disastrously by man-made ones. A trip to Greenland, which holds one-eighth of the world’s ice on land (most of the rest is in Antarctica) would show me just how drastic those changes have been. Would I join him?
  • Greenland is about the size of Alaska and California combined and, except at its coasts, is covered by ice that in places is nearly two miles thick. Even that’s only a fraction of the ice in Antarctica, which is more than six times as large
  • Greenland’s ice also poses a nearer-term risk because it is melting faster. If all its ice were to melt, global sea levels would rise by some 24 feet. That would be more than enough to inundate hundreds of coastal cities in scores of nations, from Jakarta and Bangkok to Copenhagen and Amsterdam to Miami and New Orleans.
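As a rough plausibility check on that 24-foot figure, one can spread Greenland's melted ice over the world's oceans. This sketch uses common reference values that are not stated in the article (an ice volume of roughly 2.9 million cubic kilometers and a global ocean area of about 361 million square kilometers), and it ignores second-order effects such as thermal expansion:

```python
# Rough check: divide Greenland's ice (as meltwater) by global ocean area.
ICE_VOLUME_KM3 = 2.9e6    # Greenland ice volume, standard estimate (assumption)
OCEAN_AREA_KM2 = 3.61e8   # global ocean surface area, standard estimate (assumption)
ICE_DENSITY = 0.917       # ice takes up more volume than the water it becomes

meltwater_km3 = ICE_VOLUME_KM3 * ICE_DENSITY
rise_m = meltwater_km3 / OCEAN_AREA_KM2 * 1000.0  # km -> m
rise_ft = rise_m * 3.281
print(round(rise_ft))     # roughly 24 feet, matching the article's figure
```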
  • There was also a millenarian fervor that bothered me about climate activism, with its apocalyptic imagery (the Statue of Liberty underwater) and threats of doom unless we were willing to live far more frugally.
  • “We haven’t had a good positive mass balance year since the late 1990s,” he told me in a follow-on email when I asked him to explain the data for me. The losses can vary sharply by year. The annualized average over the past 30 years, he added, is 170 gigatons per year. That’s the equivalent of about 5,400 tons of ice loss per second. That “suggests that Greenland ice loss has been tracking the I.P.C.C. worst-case, highest-carbon-emission scenario.”
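The quoted conversion is straightforward to verify; this is just unit arithmetic on the 170-gigaton annualized figure from the text:

```python
# Convert 170 gigatons of ice loss per year into tons per second.
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # about 3.16e7 seconds
gigatons_per_year = 170                  # annualized average quoted in the text
tons_per_second = gigatons_per_year * 1e9 / SECONDS_PER_YEAR
print(round(tons_per_second))            # about 5,400 tons of ice per second
```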
  • The data shows unmistakably that Greenland’s ice is not in balance. It is losing far more than it is gaining.
  • scientists have been drilling ice-core samples from Greenland for decades, giving them a very good idea of climatic changes stretching back thousands of years. Better yet, a pair of satellites that detect anomalies in Earth’s gravity fields have been taking measurements of the sheet regularly for nearly 20 years, giving scientists a much more precise idea of what is happening.
  • it’s hard to forecast with any precision what that means. “Anyone who says they know what the sea level is going to be in 2100 is giving you an educated guess,” said NASA’s Willis. “The fact is, we’re seeing these big ice sheets melt for the first time in history, and we don’t really know how fast they can go.”
  • His own educated guess: “By 2100, we are probably looking at more than a foot or two and hopefully less than seven or eight feet. But we are struggling to figure out just how fast the ice sheets can melt. So the upper end of range is still not well known.”
  • On the face of it, that sounds manageable. Even if sea levels rise by eight feet, won’t the world have nearly 80 years to come to grips with the problem, during which technologies that help us mitigate the effects of climate change while adapting to its consequences are likely to make dramatic advances?
  • Won’t the world — including countries that today are poor — become far richer and thus more capable of weathering the floods, surges and superstorms?
  • The average rate at which sea level is rising around the world, he estimates, has more than tripled over the past three decades, to five millimeters a year from 1.5 millimeters. That may still seem minute, yet as the world learned during the pandemic, exponential increases have a way of hitting hard.
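One way to see why that tripling matters is a back-of-the-envelope sketch assuming smooth compound growth — an assumption Englander himself cautions against, since sea level changes irregularly rather than on a neat curve:

```python
import math

# If the average rate of sea-level rise went from 1.5 mm/yr to 5 mm/yr
# over roughly 30 years, the implied compound growth rate is:
start_rate, end_rate, years = 1.5, 5.0, 30   # figures quoted in the text
annual_growth = (end_rate / start_rate) ** (1 / years) - 1
doubling_time = math.log(2) / math.log(1 + annual_growth)
print(f"{annual_growth:.1%}")   # about 4% per year
print(f"{doubling_time:.0f}")   # the rate doubles roughly every 17 years
```

At that pace the rate of rise, not just the total, keeps doubling every couple of decades — the "exponential increases hit hard" point the annotation is making.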
  • “When something is on a straight line or a smooth curve, you can plot its trajectory,” Englander said. “But sea level, like earthquakes and mudslides, is something that happens irregularly and can change rather quickly and surprise us. The point is, you can no longer predict the future by the recent past.”
  • In The Wall Street Journal’s editorial pages, where I used to work, the theoretical physicist Steven Koonin, a former under secretary for science in the Obama administration’s Energy Department, cast doubt on the threat from Thwaites in a voice that could have once been mine. He also thinks the risks associated with Greenland’s melting are less a product of human-induced global warming than of natural cycles in North Atlantic currents and temperatures, which over time have a way of regressing to the mean.
  • Even the poorest countries, while still unacceptably vulnerable, are suffering far fewer human and economic losses to climate-related disasters.
  • Another climate nonalarmist is Roger Pielke Jr., a professor of environmental studies at the University of Colorado Boulder. I call Pielke a nonalarmist rather than a skeptic because he readily acknowledges that the challenges associated with climate change, including sea-level rise, are real, serious and probably unstoppable, at least for many decades.
  • “If we have to have a problem,” he told me when I reached him by phone, “we probably want one with a slow onset that we can see coming. It’s not like an asteroid coming from space.”
  • “Since the 1940s, the impact of floods as a proportion of U.S. gross domestic product has dropped by 70 percent-plus,” Pielke said. “We see this around the world, across phenomena. The story is that fewer people are dying and we are having less damage proportional to G.D.P.”
  • “Much climate reporting today highlights short-term changes when they fit the narrative of a broken climate but then ignores or plays down changes when they don’t, often dismissing them as ‘just weather,’” he wrote in February.
  • Global warming is real and getting worse, Pielke said, yet still it’s possible that humanity will be able to adapt to, and compensate for, its effects.
  • A few years ago, I would have found voices like Koonin’s and Pielke’s persuasive. Now I’m less sure. What intervened was a pandemic.
  • That’s what I thought until the spring of 2020, when, along with everyone else, I experienced how swiftly and implacably nature can overwhelm even the richest and most technologically advanced societies. It was a lesson in the sort of intellectual humility I recommended for others
  • It was also a lesson in thinking about risk, especially those in the category known as high-impact, low-probability events that seem to be hitting us with such regularity in this century: the attacks of Sept. 11, 2001; the tsunamis of 2004 and 2011, the mass upheavals in the Arab world
  • What if the past does nothing to predict the future? What if climate risks do not evolve gradually and relatively predictably but instead suddenly soar uncontrollably? How much lead time is required to deal with something like sea-level rise? How do we weigh the risks of underreacting to climate change against the risks of overreacting to it?
  • I called Seth Klarman, one of the world’s most successful hedge-fund managers, to think through questions of risk. While he’s not an expert on climate change, he has spent decades thinking deeply about every manner of risk
  • And we will almost certainly have to do it from sources other than Russia, China, the Democratic Republic of Congo and other places that pose unacceptable strategic, environmental or humanitarian risks
  • “If you face something that is potentially existential,” he explained, “existential for nations, even for life as we know it, even if you thought the risk is, say, 5 percent, you’d want to hedge against it.”
  • “One thing we try to do,” he said, “is we buy protection when it’s really inexpensive, even when we think we may well not need it.” The forces contributing to climate change, he noted, echoing Englander, “might be irreversible sooner than the damage from climate change has become fully apparent. You can’t say it’s far off and wait when, if you had acted sooner, you might have dealt with it better and at less cost. We have to act now.”
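Klarman's hedging logic can be made concrete with a little expected-cost arithmetic. The sketch below uses entirely hypothetical numbers (a 5 percent chance of a 100-unit loss, a hedge costing 2 units that averts 80 percent of the damage); they illustrate the shape of the argument, not any real climate estimate.

```python
# Illustrative expected-cost comparison for hedging a low-probability,
# high-impact risk. All numbers are hypothetical, chosen only to show
# the arithmetic of the hedging argument, not real climate figures.

def expected_cost(p_disaster: float, damage: float, hedge_cost: float,
                  hedge_effectiveness: float) -> tuple[float, float]:
    """Return (expected cost without the hedge, expected cost with it)."""
    without = p_disaster * damage
    with_hedge = hedge_cost + p_disaster * damage * (1 - hedge_effectiveness)
    return without, with_hedge

# A 5% chance of a 100-unit loss, vs. paying 2 units up front for
# protection that averts 80% of the damage if disaster strikes.
without, with_hedge = expected_cost(0.05, 100.0, 2.0, 0.8)
print(without)     # 5.0
print(with_hedge)  # 3.0
```

Even at only a 5 percent probability, the cheap hedge lowers the expected cost, which is why "buying protection when it's really inexpensive" can make sense well before the damage is fully apparent.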
  • In other words, an ounce of prevention is worth a pound of cure. That’s particularly true if climate change is akin to cancer — manageable or curable in its earlier stages, disastrous in its later ones.
  • As I’ve always believed, knowing there is grave risk to future generations — and expecting current ones to make immediate sacrifices for it — defies most of what we know about human nature. So I began to think more deeply about that challenge, and others.
  • For the world to achieve the net-zero goal for carbon dioxide emissions by 2050, according to the International Energy Agency, we will have to mine, by 2040, six times the current amounts of critical minerals — nickel, cobalt, copper, lithium, manganese, graphite, chromium, rare earths and other minerals and elements — needed for electric vehicles, wind turbines and solar panels.
  • The poster child for this kind of magical thinking is Germany, which undertook a historic Energiewende — “energy revolution” — only to come up short. At the turn of the century, Germany got about 85 percent of its primary energy from fossil fuels. Now it gets about 78 percent, a puny reduction, considering that the country has spent massive sums on renewables to increase the share of electricity it generates from them.
  • As in everything else in life, so too with the environment: There is no such thing as a free lunch. Whether it’s nuclear, biofuels, natural gas, hydroelectric or, yes, wind and solar, there will always be serious environmental downsides to any form of energy when used on a massive scale. A single industrial-size wind turbine, for instance, typically requires about a ton of rare earth metals as well as three metric tons of copper, which is notoriously destructive and dirty to mine.
  • no “clean energy” solution will easily liberate us from our overwhelming and, for now, inescapable dependence on fossil fuels.
  • Nobody brings the point home better than Vaclav Smil, the Canadian polymath whose most recent book, “How the World Really Works,” should be required reading for policymakers and anyone else interested in a serious discussion about potential climate solutions.
  • “I’ve talked to so many experts and seen so much evidence,” he told me over Zoom, “I’m convinced the climate is changing, and addressing climate change has become a philanthropic priority of mine.”
  • Things could turn a corner once scientists finally figure out a technical solution to the energy storage problem. Or when governments and local actors get over their NIMBYism when it comes to permitting and building a large energy grid to move electricity from Germany’s windy north to its energy-hungry south. Or when thoughtful environmental activists finally come to grips with the necessity of nuclear energy
  • Till then, even as I’ve come to accept the danger we face, I think it’s worth extending the cancer metaphor a little further: Just as cancer treatments, when they work at all, can have terrible side effects, much the same can be said of climate treatments: The gap between an accurate diagnosis and effective treatment remains dismayingly wide
  • Only when countries like Vietnam and China turned to a different model, of largely bottom-up, market-driven development, did hundreds of millions of people get lifted out of destitution.
  • the most important transformation has come in agriculture, which uses about 70 percent of the world’s freshwater supply.
  • Farmers gradually adopted sprinkler and drip irrigation systems, rather than more wasteful flood irrigation, not to conserve water but because the technology provided higher crop yields and larger profit margins.
  • Water shortages “will spur a revolutionary, aggressive approach to getting rid of flood irrigation,” said Seth Siegel, the chief sustainability officer of the Israeli AgTech company N-Drip. “Most of this innovation will be driven by free-market capitalism, with important incentives from government and NGOs.
  • meaningful environmental progress has been made through market forces. In this century, America’s carbon dioxide emissions across fuel types have fallen to well below 5,000 million metric tons per year, from a peak of about 6,000 million in 2007, even as our inflation-adjusted G.D.P. has grown by over 50 percent and total population by about 17 percent.
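The decoupling claim above can be checked with back-of-the-envelope arithmetic using only the figures cited in the text (a peak of about 6,000 million metric tons in 2007, now "well below 5,000," GDP up over 50 percent, population up about 17 percent); the 4,900 figure is an assumed stand-in for "well below 5,000."

```python
# Back-of-the-envelope check of the decoupling claim, using the figures
# cited in the text. 4900 is an assumed value standing in for
# "well below 5,000 million metric tons."

peak_emissions = 6000.0   # million metric tons CO2, 2007 peak
now_emissions = 4900.0    # assumed current value below 5,000
gdp_growth = 1.50         # real GDP index relative to 2007 (+50%)
pop_growth = 1.17         # population index relative to 2007 (+17%)

intensity_change = (now_emissions / peak_emissions) / gdp_growth - 1
per_capita_change = (now_emissions / peak_emissions) / pop_growth - 1

print(f"CO2 per unit of GDP: {intensity_change:.0%}")   # about -46%
print(f"CO2 per capita:      {per_capita_change:.0%}")  # about -30%
```

On these assumptions, carbon intensity per dollar of output fell by nearly half even as absolute emissions fell by less than a fifth, which is the sense in which growth and emissions have partially decoupled.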
  • 1) Engagement with critics is vital. Insults and stridency are never good tools of persuasion, and trying to cow or censor climate skeptics into silence rarely works
  • the biggest single driver in emissions reductions from 2005 to 2017 was the switch from coal to natural gas for power generation, since gas produces roughly half as much carbon dioxide as coal. This, in turn, was the result of a fracking revolution in the past decade, fiercely resisted by many environmental activists, that made the United States the world’s largest gas producer.
  • In the long run, we are likelier to make progress when we adopt partial solutions that work with the grain of human nature, not big ones that work against it
  • Renewables, particularly wind power, played a role. So did efficiency mandates.
  • The problem with our civilization isn’t overconfidence. It’s polarization, paralysis and a profound lack of trust in all institutions, including the scientific one
  • Devising effective climate policies begins with recognizing the reality of the social and political landscape in which all policy operates. Some thoughts on how we might do better:
  • They may not be directly related to climate change but can nonetheless have a positive impact on it. And they probably won’t come in the form of One Big Idea but in thousands of little ones whose cumulative impacts add up.
  • 2) Separate facts from predictions and predictions from policy. Global warming is a fact. So is the human contribution to it. So are observed increases in temperature and sea levels. So are continued increases if we continue to do more of the same. But the rate of those increases is difficult to predict even with the most sophisticated computer modeling
  • 3) Don’t allow climate to become a mainly left-of-center concern. One reason the topic of climate has become so anathema to many conservatives is that so many of the proposed solutions have the flavor, and often the price tag, of old-fashioned statism
  • 4) Be honest about the nature of the challenge. Talk of an imminent climate catastrophe is probably misleading, at least in the way most people understand “imminent.”
  • A more accurate description of the challenge might be a “potentially imminent tipping point,” meaning the worst consequences of climate change can still be far off but our ability to reverse them is drawing near. Again, the metaphor of cancer — never safe to ignore and always better to deal with at Stage 2 than at Stage 4 — can be helpful.
  • 5) Be humble about the nature of the solutions. The larger the political and financial investment in a “big fix” response to climate change on the scale of the Energiewende, the greater the loss in time, capital and (crucially) public trust when it doesn’t work as planned
  • 6) Begin solving problems our great-grandchildren will face. Start with sea-level rise
  • We can also stop providing incentives for building in flood-prone areas by raising the price of federal flood insurance to reflect the increased risk more accurately.
  • 7) Stop viewing economic growth as a problem. Industrialization may be the leading cause of climate change. But we cannot and will not reverse it through some form of deindustrialization, which would send the world into poverty and deprivation
  • 8) Get serious about the environmental trade-offs that come with clean energy. You cannot support wind farms but hinder the transmission lines needed to bring their power to the markets where they are needed.
  • 9) A problem for the future is, by its very nature, a moral one. A conservative movement that claims to care about what we owe the future has the twin responsibility of setting an example for its children and at the same time preparing for that future.
Javier E

The New History Wars - The Atlantic - 0 views

  • Critical historians who thought they were winning the fight for control within the academy now face dire retaliation from outside the academy. The dizzying turn from seeming triumph in 2020 to imminent threat in 2022 has unnerved many practitioners of the new history. Against this background, they did not welcome it when their association’s president suggested that maybe their opponents had a smidgen of a point.
  • a background reality of the humanities in the contemporary academy: a struggle over who is entitled to speak about what. Nowhere does this struggle rage more fiercely than in anything to do with the continent of Africa. Who should speak? What may be said? Who will be hired?
  • One of the greatest American Africanists was the late Philip Curtin. He wrote one of the first attempts to tally the exact number of persons trafficked by the transatlantic slave trade. Upon publication in 1972, his book was acclaimed as a truly pioneering work of history. By 1995, however, he was moved to protest against trends in the discipline at that time in an article in the Chronicle of Higher Education: “I am troubled by increasing evidence of the use of racial criteria in filling faculty posts in the field of African history … This form of intellectual apartheid has been around for several decades, but it appears to have become much more serious in the past few years, to the extent that white scholars trained in African history now have a hard time finding jobs.”
  • The political and methodological stresses within the historical profession are intensified by economic troubles. For a long time, but especially since the economic crisis of 2008, university students have turned away from the humanities, preferring to major in fields that seem to offer more certain and lucrative employment. Consequently, academic jobs in the humanities and especially in history have become radically more precarious for younger faculty—even as universities have sought to meet diversity goals in their next-generation hiring by expanding offerings in history-adjacent specialties, such as gender and ethnic studies.
  • Yet this silence has consequences, too. One of the most unsettling is the displacement of history by mythmaking
  • The result has produced a generational divide. Younger scholars feel oppressed and exploited by universities pressing them to do more labor for worse pay with less security than their elders; older scholars feel that overeager juniors are poised to pounce on the least infraction as an occasion to end an elder’s career and seize a job opening for themselves. Add racial difference as an accelerant, and what was intended as an interesting methodological discussion in a faculty newsletter can explode into a national culture war.
  • Much of academia is governed these days by a joke from the Soviet Union: “If you think it, don’t speak it. If you speak it, don’t write it. If you write it, don’t sign it. But if you do think it, speak it, write it, and sign it—don’t be surprised.”
  • One obvious escape route from the generational divide in the academy—and the way the different approaches to history, presentist and antiquarian, tend to map onto it—is for some people, especially those on the older and whiter side of the divide, to keep their mouths shut about sensitive issues
  • mythmaking is spreading from “just the movies” to more formal and institutional forms of public memory. If old heroes “must fall,” their disappearance opens voids for new heroes to be inserted in their place—and that insertion sometimes requires that new history be fabricated altogether, the “bad history” that Sweet tried to warn against.
  • If it is not the job of the president of the American Historical Association to confront those questions, then whose is it?
  • Sweet used a play on words—“Is History History?”—for the title of his complacency-shaking essay. But he was asking not whether history is finished, done with, but Is history still history? Is it continuing to do what history is supposed to do? Or is it being annexed for other purposes, ideological rather than historical ones?
  • Advocates of studying the more distant past to disturb and challenge our ideas about the present may accuse their academic rivals of “presentism.”
  • In real life, of course, almost everybody who cares about history believes in a little of each option. But how much of each? What’s the right balance? That’s the kind of thing that historians do argue about, and in the arguing, they have developed some dismissive labels for one another
  • Those who look to the more recent past to guide the future may accuse the other camp of “antiquarianism.”
  • The accusation of presentism hurts because it implies that the historian is sacrificing scholarly objectivity for ideological or political purposes. The accusation of antiquarianism stings because it implies that the historian is burrowing into the dust for no useful purpose at all.
  • In his mind, he was merely reopening one of the most familiar debates in professional history: the debate over why? What is the value of studying the past? To reduce the many available answers to a stark choice: Should we study the more distant past to explore its strangeness—and thereby jolt ourselves out of easy assumptions that the world we know is the only possible one?
  • Or should we study the more recent past to understand how our world came into being—and thereby learn some lessons for shaping the future?
  • The August edition of the association’s monthly magazine featured, as usual, a short essay by the association’s president, James H. Sweet, a professor at the University of Wisconsin at Madison. Within hours of its publication, an outrage volcano erupted on social media. A professor at Cornell vented about the author’s “white gaze.”