
It All Turns on Affection: Wendell E. Berry Lecture | National Endowment for the Humanities

  • Wallace Stegner. He thought rightly that we Americans, by inclination at least, have been divided into two kinds: “boomers” and “stickers.” Boomers, he said, are “those who pillage and run,” who want “to make a killing and end up on Easy Street,” whereas stickers are “those who settle, and love the life they have made and the place they have made it in.”2 “Boomer” names a kind of person and a kind of ambition that is the major theme, so far, of the history of the European races in our country. “Sticker” names a kind of person and also a desire that is, so far, a minor theme of that history, but a theme persistent enough to remain significant and to offer, still, a significant hope.
  • We may, as we say, “know” statistical sums, but we cannot imagine them. It is by imagination that knowledge is “carried to the heart” (to borrow again from Allen Tate).5 The faculties of the mind—reason, memory, feeling, intuition, imagination, and the rest—are not distinct from one another. Though some may be favored over others and some ignored, none functions alone. But the human mind, even in its wholeness, even in instances of greatest genius, is irremediably limited. Its several faculties, when we try to use them separately or specialize them, are even more limited.
  • The fact is that we humans are not much to be trusted with what I am calling statistical knowledge, and the larger the statistical quantities the less we are to be trusted. We don’t learn much from big numbers. We don’t understand them very well, and we aren’t much affected by them. The reality that is responsibly manageable by human intelligence is much nearer in scale to a small rural community or urban neighborhood than to the “globe.”
  • Propriety of scale in all human undertakings is paramount, and we ignore it. We are now betting our lives on quantities that far exceed all our powers of comprehension. We believe that we have built a perhaps limitless power of comprehension into computers and other machines, but our minds remain as limited as ever. Our trust that machines can manipulate to humane effect quantities that are unintelligible and unimaginable to humans is incorrigibly strange.
  • We cannot know the whole truth, which belongs to God alone, but our task nevertheless is to seek to know what is true. And if we offend gravely enough against what we know to be true, as by failing badly enough to deal affectionately and responsibly with our land and our neighbors, truth will retaliate with ugliness, poverty, and disease. The crisis of this line of thought is the realization that we are at once limited and unendingly responsible for what we know and do.
  • It is a horrible fact that we can read in the daily paper, without interrupting our breakfast, numerical reckonings of death and destruction that ought to break our hearts or scare us out of our wits. This brings us to an entirely practical question:  Can we—and, if we can, how can we—make actual in our minds the sometimes urgent things we say we know? This obviously cannot be accomplished by a technological breakthrough, nor can it be accomplished by a big thought. Perhaps it cannot be accomplished at all.

Losing Earth: The Decade We Almost Stopped Climate Change - The New York Times

  • As Malcolm Forbes Baldwin, the acting chairman of the president’s Council for Environmental Quality, told industry executives in 1981, “There can be no more important or conservative concern than the protection of the globe itself.”
  • Among those who called for urgent, immediate and far-reaching climate policy were Senators John Chafee, Robert Stafford and David Durenberger; the E.P.A. administrator, William K. Reilly; and, during his campaign for president, George H.W. Bush.
  • It was understood that action would have to come immediately. At the start of the 1980s, scientists within the federal government predicted that conclusive evidence of warming would appear on the global temperature record by the end of the decade, at which point it would be too late to avoid disaster.
  • If the world had adopted the proposal widely endorsed at the end of the ’80s — a freezing of carbon emissions, with a reduction of 20 percent by 2005 — warming could have been held to less than 1.5 degrees.
  • Action had to be taken, and the United States would need to lead. It didn’t.
  • There can be no understanding of our current and future predicament without understanding why we failed to solve this problem when we had the chance.
  • The first suggestion to Rafe Pomerance that humankind was destroying the conditions necessary for its own survival came on Page 66 of the government publication EPA-600/7-78-019. It was a technical report about coal
  • ‘This Is the Whole Banana’ Spring 1979
  • Here was an urgent problem that demanded their attention, MacDonald believed, because human civilization faced an existential crisis. In “How to Wreck the Environment,” a 1968 essay published while he was a science adviser to Lyndon Johnson, MacDonald predicted a near future in which “nuclear weapons were effectively banned and the weapons of mass destruction were those of environmental catastrophe.” One of the most potentially devastating such weapons, he believed, was the gas that we exhaled with every breath: carbon dioxide. By vastly increasing carbon emissions, the world’s most advanced militaries could alter weather patterns and wreak famine, drought and economic collapse.
  • the Jasons. They were like one of those teams of superheroes with complementary powers that join forces in times of galactic crisis. They had been brought together by federal agencies, including the C.I.A., to devise scientific solutions to national-security problems: how to detect an incoming missile; how to predict fallout from a nuclear bomb; how to develop unconventional weapons, like plague-infested rats.
  • Agle pointed to an article about a prominent geophysicist named Gordon MacDonald, who was conducting a study on climate change with the Jasons, the mysterious coterie of elite scientists to which he belonged.
  • During the spring of 1977 and the summer of 1978, the Jasons met to determine what would happen once the concentration of carbon dioxide in the atmosphere doubled from pre-Industrial Revolution levels. It was an arbitrary milestone, the doubling, but a useful one, as its inevitability was not in question; the threshold would most likely be breached by 2035.
  • The Jasons’ report to the Department of Energy, “The Long-Term Impact of Atmospheric Carbon Dioxide on Climate,” was written in an understated tone that only enhanced its nightmarish findings: Global temperatures would increase by an average of two to three degrees Celsius; Dust Bowl conditions would “threaten large areas of North America, Asia and Africa”; access to drinking water and agricultural production would fall, triggering mass migration on an unprecedented scale. “Perhaps the most ominous feature,” however, was the effect of a changing climate on the poles. Even a minimal warming “could lead to rapid melting” of the West Antarctic ice sheet. The ice sheet contained enough water to raise the level of the oceans 16 feet.
  • MacDonald explained that he first studied the carbon-dioxide issue when he was about Pomerance’s age — in 1961, when he served as an adviser to John F. Kennedy. Pomerance pieced together that MacDonald, in his youth, had been something of a prodigy: In his 20s, he advised Dwight D. Eisenhower on space exploration; at 32, he became a member of the National Academy of Sciences; at 40, he was appointed to the inaugural Council on Environmental Quality, where he advised Richard Nixon on the environmental dangers of burning coal. He monitored the carbon-dioxide problem the whole time, with increasing alarm.
  • They were surprised to learn how few senior officials were familiar with the Jasons’ findings, let alone understood the ramifications of global warming. At last, having worked their way up the federal hierarchy, the two went to see the president’s top scientist, Frank Press.
  • Thus began the Gordon and Rafe carbon-dioxide roadshow. Beginning in the spring of 1979, Pomerance arranged informal briefings with the E.P.A., the National Security Council, The New York Times, the Council on Environmental Quality and the Energy Department, which, Pomerance learned, had established an Office of Carbon Dioxide Effects two years earlier at MacDonald’s urging.
  • Out of respect for MacDonald, Press had summoned to their meeting what seemed to be the entire senior staff of the president’s Office of Science and Technology Policy — the officials consulted on every critical matter of energy and national security. What Pomerance had expected to be yet another casual briefing assumed the character of a high-level national-security meeting.
  • MacDonald would begin his presentation by going back more than a century to John Tyndall — an Irish physicist who was an early champion of Charles Darwin’s work and died after being accidentally poisoned by his wife. In 1859, Tyndall found that carbon dioxide absorbed heat and that variations in the composition of the atmosphere could create changes in climate. These findings inspired Svante Arrhenius, a Swedish chemist and future Nobel laureate, to deduce in 1896 that the combustion of coal and petroleum could raise global temperatures. This warming would become noticeable in a few centuries, Arrhenius calculated, or sooner if consumption of fossil fuels continued to increase.
  • Four decades later, a British steam engineer named Guy Stewart Callendar discovered that, at the weather stations he observed, the previous five years were the hottest in recorded history. Humankind, he wrote in a paper, had become “able to speed up the processes of Nature.” That was in 1939.
  • MacDonald’s history concluded with Roger Revelle, perhaps the most distinguished of the priestly caste of government scientists who, since the Manhattan Project, advised every president on major policy; he had been a close colleague of MacDonald and Press since they served together under Kennedy. In a 1957 paper written with Hans Suess, Revelle concluded that “human beings are now carrying out a large-scale geophysical experiment of a kind that could not have happened in the past nor be reproduced in the future.” Revelle helped the Weather Bureau establish a continuous measurement of atmospheric carbon dioxide at a site perched near the summit of Mauna Loa on the Big Island of Hawaii, 11,500 feet above the sea — a rare pristine natural laboratory on a planet blanketed by fossil-fuel emissions.
  • After nearly a decade of observation, Revelle had shared his concerns with Lyndon Johnson, who included them in a special message to Congress two weeks after his inauguration. Johnson explained that his generation had “altered the composition of the atmosphere on a global scale” through the burning of fossil fuels, and his administration commissioned a study of the subject by his Science Advisory Committee. Revelle was its chairman, and its 1965 executive report on carbon dioxide warned of the rapid melting of Antarctica, rising seas, increased acidity of fresh waters — changes that would require no less than a coordinated global effort to forestall. Yet emissions continued to rise, and at this rate, MacDonald warned, they could see a snowless New England, the swamping of major coastal cities, as much as a 40 percent decline in national wheat production, the forced migration of about one-quarter of the world’s population. Not within centuries — within their own lifetimes.
  • On May 22, Press wrote a letter to the president of the National Academy of Sciences requesting a full assessment of the carbon-dioxide issue. Jule Charney, the father of modern meteorology, would gather the nation’s top oceanographers, atmospheric scientists and climate modelers to judge whether MacDonald’s alarm was justified — whether the world was, in fact, headed to cataclysm.
  • If Charney’s group confirmed that the world was careering toward an existential crisis, the president would be forced to act.
  • Hansen turned from the moon to Venus. Why, he tried to determine, was its surface so hot? In 1967, a Soviet satellite beamed back the answer: The planet’s atmosphere was mainly carbon dioxide. Though once it may have had habitable temperatures, it was believed to have succumbed to a runaway greenhouse effect: As the sun grew brighter, Venus’s ocean began to evaporate, thickening the atmosphere, which forced yet greater evaporation — a self-perpetuating cycle that finally boiled off the ocean entirely and heated the planet’s surface to more than 800 degrees Fahrenheit.
  • At the other extreme, Mars’s thin atmosphere had insufficient carbon dioxide to trap much heat at all, leaving it about 900 degrees colder. Earth lay in the middle, its Goldilocks greenhouse effect just strong enough to support life.
  • We want to learn more about Earth’s climate, Jim told Anniek — and how humanity can influence it. He would use giant new supercomputers to map the planet’s atmosphere. They would create Mirror Worlds: parallel realities that mimicked our own. These digital simulacra, technically called “general circulation models,” combined the mathematical formulas that governed the behavior of the sea, land and sky into a single computer model. Unlike the real world, they could be sped forward to reveal the future.
  • The government officials, many of them scientists themselves, tried to suppress their awe of the legends in their presence: Henry Stommel, the world’s leading oceanographer; his protégé, Carl Wunsch, a Jason; the Manhattan Project alumnus Cecil Leith; the Harvard planetary physicist Richard Goody. These were the men who, in the last three decades, had discovered foundational principles underlying the relationships among sun, atmosphere, land and ocean — which is to say, the climate.
  • When, at Charney’s request, Hansen programmed his model to consider a future of doubled carbon dioxide, it predicted a temperature increase of four degrees Celsius. That was twice as much warming as the prediction made by the most prominent climate modeler, Syukuro Manabe, whose government lab at Princeton was the first to model the greenhouse effect. The difference between the two predictions — between warming of two degrees Celsius and four degrees Celsius — was the difference between damaged coral reefs and no reefs whatsoever, between thinning forests and forests enveloped by desert, between catastrophe and chaos.
  • The discrepancy between the models, Arakawa concluded, came down to ice and snow. The whiteness of the world’s snowfields reflected light; if snow melted in a warmer climate, less radiation would escape the atmosphere, leading to even greater warming. Shortly before dawn, Arakawa concluded that Manabe had given too little weight to the influence of melting sea ice, while Hansen had overemphasized it. The best estimate lay in between. Which meant that the Jasons’ calculation was too optimistic. When carbon dioxide doubled in 2035 or thereabouts, global temperatures would increase between 1.5 and 4.5 degrees Celsius, with the most likely outcome a warming of three degrees.
  • within the highest levels of the federal government, the scientific community and the oil-and-gas industry — within the commonwealth of people who had begun to concern themselves with the future habitability of the planet — the Charney report would come to have the authority of settled fact. It was the summation of all the predictions that had come before, and it would withstand the scrutiny of the decades that followed it. Charney’s group had considered everything known about ocean, sun, sea, air and fossil fuels and had distilled it to a single number: three. When the doubling threshold was broached, as appeared inevitable, the world would warm three degrees Celsius.
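The Charney group’s “three degrees per doubling” can be read as a climate sensitivity in the standard logarithmic approximation ΔT ≈ S · log₂(C/C₀). A minimal sketch of that arithmetic (the 280 ppm pre-industrial baseline and the functional form are textbook conventions, not figures stated in the article):

```python
import math

def equilibrium_warming(co2_ppm, baseline_ppm=280.0, sensitivity=3.0):
    # Equilibrium warming (degrees Celsius) implied by a CO2 concentration,
    # under the standard logarithmic approximation: each doubling of CO2
    # adds `sensitivity` degrees. Baseline and sensitivity here are
    # illustrative textbook values.
    return sensitivity * math.log2(co2_ppm / baseline_ppm)

# One doubling (280 -> 560 ppm) recovers the Charney estimate:
print(equilibrium_warming(560.0))  # 3.0
```

The panel’s 1.5-to-4.5 degree range corresponds to setting `sensitivity` between 1.5 and 4.5 in this sketch.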
  • The last time the world was three degrees warmer was during the Pliocene, three million years ago, when beech trees grew in Antarctica, the seas were 80 feet higher and horses galloped across the Canadian coast of the Arctic Ocean.
  • After the publication of the Charney report, Exxon decided to create its own dedicated carbon-dioxide research program, with an annual budget of $600,000. Only Exxon was asking a slightly different question than Jule Charney. Exxon didn’t concern itself primarily with how much the world would warm. It wanted to know how much of the warming Exxon could be blamed for.
  • “It behooves us to start a very aggressive defensive program,” Shaw wrote in a memo to a manager, “because there is a good probability that legislation affecting our business will be passed.”
  • Shaw turned to Wallace Broecker, a Columbia University oceanographer who was the second author of Roger Revelle’s 1965 carbon-dioxide report for Lyndon Johnson. In 1977, in a presentation at the American Geophysical Union, Broecker predicted that fossil fuels would have to be restricted, whether by taxation or fiat. More recently, he had testified before Congress, calling carbon dioxide “the No. 1 long-term environmental problem.” If presidents and senators trusted Broecker to tell them the bad news, he was good enough for Exxon.
  • The company had been studying the carbon-dioxide problem for decades, since before it changed its name to Exxon. In 1957, scientists from Humble Oil published a study tracking “the enormous quantity of carbon dioxide” contributed to the atmosphere since the Industrial Revolution “from the combustion of fossil fuels.” Even then, the observation that burning fossil fuels had increased the concentration of carbon in the atmosphere was well understood and accepted by Humble’s scientists.
  • The American Petroleum Institute, the industry’s largest trade association, asked the same question in 1958 through its air-pollution study group and replicated the findings made by Humble Oil. So did another A.P.I. study conducted by the Stanford Research Institute a decade later, in 1968, which concluded that the burning of fossil fuels would bring “significant temperature changes” by the year 2000 and ultimately “serious worldwide environmental changes,” including the melting of the Antarctic ice cap and rising seas.
  • The ritual repeated itself every few years. Industry scientists, at the behest of their corporate bosses, reviewed the problem and found good reasons for alarm and better excuses to do nothing. Why should they act when almost nobody within the United States government — nor, for that matter, within the environmental movement — seemed worried?
  • Why take on an intractable problem that would not be detected until this generation of employees was safely retired? Worse, the solutions seemed more punitive than the problem itself. Historically, energy use had correlated to economic growth — the more fossil fuels we burned, the better our lives became. Why mess with that?
  • That June, Jimmy Carter signed the Energy Security Act of 1980, which directed the National Academy of Sciences to start a multiyear, comprehensive study, to be called “Changing Climate,” that would analyze social and economic effects of climate change. More urgently, the National Commission on Air Quality, at the request of Congress, invited two dozen experts, including Henry Shaw himself, to a meeting in Florida to propose climate policy.
  • On April 3, 1980, Senator Paul Tsongas, a Massachusetts Democrat, held the first congressional hearing on carbon-dioxide buildup in the atmosphere. Gordon MacDonald testified that the United States should “take the initiative” and develop, through the United Nations, a way to coordinate every nation’s energy policies to address the problem.
  • During the expansion of the Clean Air Act, he pushed for the creation of the National Commission on Air Quality, charged with ensuring that the goals of the act were being met. One such goal was a stable global climate. The Charney report had made clear that goal was not being met, and now the commission wanted to hear proposals for legislation. It was a profound responsibility, and the two dozen experts invited to the Pink Palace — policy gurus, deep thinkers, an industry scientist and an environmental activist — had only three days to achieve it, but the utopian setting made everything seem possible.
  • We have less time than we realize, said an M.I.T. nuclear engineer named David Rose, who studied how civilizations responded to large technological crises. “People leave their problems until the 11th hour, the 59th minute,” he said. “And then: ‘Eloi, Eloi, Lama Sabachthani?’ ” — “My God, my God, why hast thou forsaken me?”
  • The attendees seemed to share a sincere interest in finding solutions. They agreed that some kind of international treaty would ultimately be needed to keep atmospheric carbon dioxide at a safe level. But nobody could agree on what that level was.
  • William Elliott, a NOAA scientist, introduced some hard facts: If the United States stopped burning carbon that year, it would delay the arrival of the doubling threshold by only five years. If Western nations somehow managed to stabilize emissions, it would forestall the inevitable by only eight years. The only way to avoid the worst was to stop burning coal. Yet China, the Soviet Union and the United States, by far the world’s three largest coal producers, were frantically accelerating extraction.
  • “Do we have a problem?” asked Anthony Scoville, a congressional science consultant. “We do, but it is not the atmospheric problem. It is the political problem.” He doubted that any scientific report, no matter how ominous its predictions, would persuade politicians to act.
  • The talk of ending oil production stirred for the first time the gentleman from Exxon. “I think there is a transition period,” Henry Shaw said. “We are not going to stop burning fossil fuels and start looking toward solar or nuclear fusion and so on. We are going to have a very orderly transition from fossil fuels to renewable energy sources.”
  • What if the problem was that they were thinking of it as a problem? “What I am saying,” Scoville continued, “is that in a sense we are making a transition not only in energy but the economy as a whole.” Even if the coal and oil industries collapsed, renewable technologies like solar energy would take their place. Jimmy Carter was planning to invest $80 billion in synthetic fuel. “My God,” Scoville said, “with $80 billion, you could have a photovoltaics industry going that would obviate the need for synfuels forever!”
  • nobody could agree what to do. John Perry, a meteorologist who had worked as a staff member on the Charney report, suggested that American energy policy merely “take into account” the risks of global warming, though he acknowledged that a nonbinding measure might seem “intolerably stodgy.” “It is so weak,” Pomerance said, the air seeping out of him, “as to not get us anywhere.”
  • Scoville pointed out that the United States was responsible for the largest share of global carbon emissions. But not for long. “If we’re going to exercise leadership,” he said, “the opportunity is now.”
  • One way to lead, he proposed, would be to classify carbon dioxide as a pollutant under the Clean Air Act and regulate it as such. This was received by the room like a belch. By Scoville’s logic, every sigh was an act of pollution. Did the science really support such an extreme measure? The Charney report did exactly that, Pomerance said.
  • Slade, the director of the Energy Department’s carbon-dioxide program, considered the lag a saving grace. If changes did not occur for a decade or more, he said, those in the room couldn’t be blamed for failing to prevent them. So what was the problem?
  • “Call it whatever.” Besides, Pomerance added, they didn’t have to ban coal tomorrow. A pair of modest steps could be taken immediately to show the world that the United States was serious: the implementation of a carbon tax and increased investment in renewable energy. Then the United States could organize an international summit meeting to address climate change
  • these two dozen experts, who agreed on the major points and had made a commitment to Congress, could not draft a single paragraph. Hours passed in a hell of fruitless negotiation, self-defeating proposals and impulsive speechifying. Pomerance and Scoville pushed to include a statement calling for the United States to “sharply accelerate international dialogue,” but they were sunk by objections and caveats.
  • They never got to policy proposals. They never got to the second paragraph. The final statement was signed by only the moderator, who phrased it more weakly than the declaration calling for the workshop in the first place. “The guide I would suggest,” Jorling wrote, “is whether we know enough not to recommend changes in existing policy.”
  • Pomerance had seen enough. A consensus-based strategy would not work — could not work — without American leadership. And the United States wouldn’t act unless a strong leader persuaded it to do so — someone who would speak with authority about the science, demand action from those in power and risk everything in pursuit of justice.
  • The meeting ended Friday morning. On Tuesday, four days later, Ronald Reagan was elected president.
  • ‘Otherwise, They’ll Gurgle’ November 1980-September 1981
  • In the midst of this carnage, the Council on Environmental Quality submitted a report to the White House warning that fossil fuels could “permanently and disastrously” alter Earth’s atmosphere, leading to “a warming of the Earth, possibly with very serious effects.” Reagan did not act on the council’s advice. Instead, his administration considered eliminating the council.
  • After the election, Reagan considered plans to close the Energy Department, increase coal production on federal land and deregulate surface coal mining. Once in office, he appointed James Watt, the president of a legal firm that fought to open public lands to mining and drilling, to run the Interior Department. “We’re deliriously happy,” the president of the National Coal Association was reported to have said. Reagan preserved the E.P.A. but named as its administrator Anne Gorsuch, an anti-regulation zealot who proceeded to cut the agency’s staff and budget by about a quarter
Reagan “has declared open war on solar energy,” the director of the nation’s lead solar-energy research agency said after he was asked to resign. Reagan appeared determined to reverse the environmental achievements of Jimmy Carter, before undoing those of Richard Nixon, Lyndon Johnson, John F. Kennedy and, if he could get away with it, Theodore Roosevelt.
  • When Reagan considered closing the Council on Environmental Quality, its acting chairman, Malcolm Forbes Baldwin, wrote to the vice president and the White House chief of staff begging them to reconsider; in a major speech the same week, “A Conservative’s Program for the Environment,” Baldwin argued that it was “time for today’s conservatives explicitly to embrace environmentalism.” Environmental protection was not only good sense. It was good business. What could be more conservative than an efficient use of resources that led to fewer federal subsidies?
  • Meanwhile the Charney report continued to vibrate at the periphery of public consciousness. Its conclusions were confirmed by major studies from the Aspen Institute, the International Institute for Applied Systems Analysis near Vienna and the American Association for the Advancement of Science. Every month or so, nationally syndicated articles appeared summoning apocalypse: “Another Warning on ‘Greenhouse Effect,’ ” “Global Warming Trend ‘Beyond Human Experience,’ ” “Warming Trend Could ‘Pit Nation Against Nation.’ ”
  • Pomerance read on the front page of The New York Times on Aug. 22, 1981, about a forthcoming paper in Science by a team of seven NASA scientists. They had found that the world had already warmed in the past century. Temperatures hadn’t increased beyond the range of historical averages, but the scientists predicted that the warming signal would emerge from the noise of routine weather fluctuations much sooner than previously expected. Most unusual of all, the paper ended with a policy recommendation: In the coming decades, the authors wrote, humankind should develop alternative sources of energy and use fossil fuels only “as necessary.” The lead author was James Hansen.
  • Pomerance listened and watched. He understood Hansen’s basic findings well enough: Earth had been warming since 1880, and the warming would reach “almost unprecedented magnitude” in the next century, leading to the familiar suite of terrors, including the flooding of a 10th of New Jersey and a quarter of Louisiana and Florida. But Pomerance was excited to find that Hansen could translate the complexities of atmospheric science into plain English.
  • ‘We’re All Going to Be the Victims’ March 1982
  • Gore had learned about climate change a dozen years earlier as an undergraduate at Harvard, when he took a class taught by Roger Revelle. Humankind was on the brink of radically transforming the global atmosphere, Revelle explained, drawing Keeling’s rising zigzag on the blackboard, and risked bringing about the collapse of civilization. Gore was stunned: Why wasn’t anyone talking about this?
  • Most in Congress considered the science committee a legislative backwater, if they considered it at all; this made Gore’s subcommittee, which had no legislative authority, an afterthought to an afterthought. That, Gore vowed, would change. Environmental and health stories had all the elements of narrative drama: villains, victims and heroes. In a hearing, you could summon all three, with the chairman serving as narrator, chorus and moral authority. He told his staff director that he wanted to hold a hearing every week.
  • The Revelle hearing went as Grumbly had predicted. The urgency of the issue was lost on Gore’s older colleagues, who drifted in and out while the witnesses testified. There were few people left by the time the Brookings Institution economist Lester Lave warned that humankind’s profligate exploitation of fossil fuels posed an existential test to human nature. “Carbon dioxide stands as a symbol now of our willingness to confront the future,” he said. “It will be a sad day when we decide that we just don’t have the time or thoughtfulness to address those issues.”
  • That night, the news programs featured the resolution of the baseball strike, the ongoing budgetary debate and the national surplus of butter.
  • There emerged, despite the general comity, a partisan divide. Unlike the Democrats, the Republicans demanded action. “Today I have a sense of déjà vu,” said Robert Walker, a Republican from Pennsylvania. In each of the last five years, he said, “we have been told and told and told that there is a problem with the increasing carbon dioxide in the atmosphere. We all accept that fact, and we realize that the potential consequences are certainly major in their impact on mankind.” Yet they had failed to propose a single law. “Now is the time,” he said. “The research is clear. It is up to us now to summon the political will.”
  • Hansen flew to Washington to testify on March 25, 1982, performing before a gallery even more thinly populated than at Gore’s first hearing on the greenhouse effect. Gore began by attacking the Reagan administration for cutting funding for carbon-dioxide research despite the “broad consensus in the scientific community that the greenhouse effect is a reality.” William Carney, a Republican from New York, bemoaned the burning of fossil fuels and argued passionately that science should serve as the basis for legislative policy.
  • The experts invited by Gore agreed with the Republicans: The science was certain enough. Melvin Calvin, a Berkeley chemist who won the Nobel Prize for his work on the carbon cycle, said that it was useless to wait for stronger evidence of warming. “You cannot do a thing about it when the signals are so big that they come out of the noise,” he said. “You have to look for early warning signs.”
  • Hansen’s job was to share the warning signs, to translate the data into plain English. He explained a few discoveries that his team had made — not with computer models but in libraries. By analyzing records from hundreds of weather stations, he found that the surface temperature of the planet had already increased four-tenths of a degree Celsius in the previous century. Data from several hundred tide-gauge stations showed that the oceans had risen four inches since the 1880s.
  • It occurred to Hansen that this was the only political question that mattered: How long until the worst began? It was not a question on which geophysicists expended much effort; the difference between five years and 50 years in the future was meaningless in geologic time. Politicians were capable of thinking only in terms of electoral time: six years, four years, two years. But when it came to the carbon problem, the two time schemes were converging.
  • “Within 10 or 20 years,” Hansen said, “we will see climate changes which are clearly larger than the natural variability.” James Scheuer wanted to make sure he understood this correctly. No one else had predicted that the signal would emerge that quickly. “If it were one or two degrees per century,” he said, “that would be within the range of human adaptability. But we are pushing beyond the range of human adaptability.” “Yes,” Hansen said.
  • How soon, Scheuer asked, would they have to change the national model of energy production? Hansen hesitated — it wasn’t a scientific question. But he couldn’t help himself. He had been irritated, during the hearing, by all the ludicrous talk about the possibility of growing more trees to offset emissions. False hopes were worse than no hope at all: They undermined the prospect of developing real solutions. “That time is very soon,” Hansen said finally. “My opinion is that it is past,” Calvin said, but he was not heard because he spoke from his seat. He was told to speak into the microphone. “It is already later,” Calvin said, “than you think.”
  • From Gore’s perspective, the hearing was an unequivocal success. That night Dan Rather devoted three minutes of “CBS Evening News” to the greenhouse effect. A correspondent explained that temperatures had increased over the previous century, great sheets of pack ice in Antarctica were rapidly melting, the seas were rising; Calvin said that “the trend is all in the direction of an impending catastrophe”; and Gore mocked Reagan for his shortsightedness. Later, Gore could take credit for protecting the Energy Department’s carbon-dioxide program, which in the end was largely preserved.
  • 8. ‘The Direction of an Impending Catastrophe’ 1982
  • Following Henry Shaw’s recommendation to establish credibility ahead of any future legislative battles, Exxon had begun to spend conspicuously on global-warming research. It donated tens of thousands of dollars to some of the most prominent research efforts, including one at Woods Hole led by the ecologist George Woodwell, who had been calling for major climate policy as early as the mid-1970s, and an international effort coordinated by the United Nations. Now Shaw offered to fund the October 1982 symposium on climate change at Columbia’s Lamont-Doherty campus.
  • Edward David Jr., the president of Exxon Research and Engineering, boasted that Exxon would usher in a new global energy system to save the planet from the ravages of climate change. He went so far as to argue that capitalism’s blind faith in the wisdom of the free market was “less than satisfying” when it came to the greenhouse effect. Ethical considerations were necessary, too. He pledged that Exxon would revise its corporate strategy to account for climate change, even if it were not “fashionable” to do so. As Exxon had already made heavy investments in nuclear and solar technology, he was “generally upbeat” that Exxon would “invent” a future of renewable energy.
  • Hansen had reason to feel upbeat himself. If the world’s largest oil-and-gas company supported a new national energy model, the White House would not stand in its way. The Reagan administration was hostile to change from within its ranks. But it couldn’t be hostile to Exxon.
  • The carbon-dioxide issue was beginning to receive major national attention — Hansen’s own findings had become front-page news, after all. What started as a scientific story was turning into a political story.
  • The political realm was itself a kind of Mirror World, a parallel reality that crudely mimicked our own. It shared many of our most fundamental laws, like the laws of gravity and inertia and publicity. And if you applied enough pressure, the Mirror World of politics could be sped forward to reveal a new future. Hansen was beginning to understand that too.
  • 1. ‘Caution, Not Panic’ 1983-1984
  • In the fall of 1983, the climate issue entered an especially long, dark winter. And all because of a single report that had done nothing to change the state of climate science but transformed the state of climate politics.
  • After the publication of the Charney report in 1979, Jimmy Carter had directed the National Academy of Sciences to prepare a comprehensive, $1 million analysis of the carbon-dioxide problem: a Warren Commission for the greenhouse effect. A team of scientist-dignitaries — among them Revelle, the Princeton modeler Syukuro Manabe and the Harvard political economist Thomas Schelling, one of the intellectual architects of Cold War game theory — would review the literature, evaluate the consequences of global warming for the world order and propose remedies
  • Then Reagan won the White House.
  • The incipient report served as the Reagan administration’s answer to every question on the subject. There could be no climate policy, Fred Koomanoff and his associates said, until the academy ruled. In the Mirror World of the Reagan administration, the warming problem hadn’t been abandoned at all. A careful, comprehensive solution was being devised. Everyone just had to wait for the academy’s elders to explain what it was.
  • The committee’s chairman, William Nierenberg — a Jason, presidential adviser and director of Scripps, the nation’s pre-eminent oceanographic institution — argued that action had to be taken immediately, before all the details could be known with certainty, or else it would be too late.
  • Better to bet on American ingenuity to save the day. Major interventions in national energy policy, taken immediately, might end up being more expensive, and less effective, than actions taken decades in the future, after more was understood about the economic and social consequences of a warmer planet. Yes, the climate would change, mostly for the worse, but future generations would be better equipped to change with it.
  • Government officials who knew Nierenberg were not surprised by his conclusions: He was an optimist by training and experience, a devout believer in the doctrine of American exceptionalism, one of the elite class of scientists who had helped the nation win a global war, invent the most deadly weapon conceivable and create the booming aerospace and computer industries. America had solved every existential problem it had confronted over the previous generation; it would not be daunted by an excess of carbon dioxide. Nierenberg had also served on Reagan’s transition team. Nobody believed that he had been directly influenced by his political connections, but his views — optimistic about the saving graces of market forces, pessimistic about the value of government regulation — reflected all the ardor of his party.
  • That’s what Nierenberg wrote in “Changing Climate.” But it’s not what he said in the press interviews that followed. He argued the opposite: There was no urgent need for action. The public should not entertain the most “extreme negative speculations” about climate change (despite the fact that many of those speculations appeared in his report). Though “Changing Climate” urged an accelerated transition to renewable fuels, noting that it would take thousands of years for the atmosphere to recover from the damage of the last century, Nierenberg recommended “caution, not panic.” Better to wait and see
  • The damage of “Changing Climate” was squared by the amount of attention it received. Nierenberg’s speech in the Great Hall, being one-500th the length of the actual assessment, received 500 times the press coverage. As The Wall Street Journal put it, in a line echoed by trade journals across the nation: “A panel of top scientists has some advice for people worried about the much-publicized warming of the Earth’s climate: You can cope.”
  • On “CBS Evening News,” Dan Rather said the academy had given “a cold shoulder” to a grim, 200-page E.P.A. assessment published earlier that week (titled “Can We Delay a Greenhouse Warming?”; the E.P.A.’s answer, reduced to a word, was no). The Washington Post described the two reports, taken together, as “clarion calls to inaction.”
  • George Keyworth II, Reagan’s science adviser. Keyworth used Nierenberg’s optimism as reason to discount the E.P.A.’s “unwarranted and unnecessarily alarmist” report and warned against taking any “near-term corrective action” on global warming. Just in case it wasn’t clear, Keyworth added, “there are no actions recommended other than continued research.”
  • Edward David Jr., two years removed from boasting of Exxon’s commitment to transforming global energy policy, told Science that the corporation had reconsidered. “Exxon has reverted to being mainly a supplier of conventional hydrocarbon fuels — petroleum products, natural gas and steam coal,” David said. The American Petroleum Institute canceled its own carbon-dioxide research program, too.
  • Exxon soon revised its position on climate-change research. In a presentation at an industry conference, Henry Shaw cited “Changing Climate” as evidence that “the general consensus is that society has sufficient time to technologically adapt to a CO₂ greenhouse effect.” If the academy had concluded that regulations were not a serious option, why should Exxon protest?
  • 2. ‘You Scientists Win’ 1985
  • 3. The Size of the Human Imagination Spring-Summer 1986
  • Curtis Moore’s proposal: Use the ozone issue to revive the climate issue. The ozone hole had a solution — an international treaty, already in negotiation. Why not hitch the milk wagon to the bullet train? Pomerance was skeptical. The problems were related, sure: Without a reduction in CFC emissions, you didn’t have a chance of averting cataclysmic global warming. But it had been difficult enough to explain the carbon issue to politicians and journalists; why complicate the sales pitch? Then again, he didn’t see what choice he had. The Republicans controlled the Senate, and Moore was his connection to the Senate’s environmental committee.
  • Pomerance met with Senator John Chafee, a Republican from Rhode Island, and helped persuade him to hold a double-barreled hearing on the twin problems of ozone and carbon dioxide on June 10 and 11, 1986
  • F. Sherwood Rowland; Robert Watson, a NASA scientist; and Richard Benedick, the administration’s lead representative in international ozone negotiations, would discuss ozone; James Hansen, Al Gore, the ecologist George Woodwell and Carl Wunsch, a veteran of the Charney group, would testify about climate change.
  • As Pomerance had hoped, fear about the ozone layer ensured a bounty of press coverage for the climate-change testimony. But as he had feared, it caused many people to conflate the two crises. One was Peter Jennings, who aired footage of the ozone hole on ABC’s “World News Tonight,” warning that it “could lead to flooding all over the world, also to drought and to famine.”
  • The confusion helped: For the first time since the “Changing Climate” report, global-warming headlines appeared by the dozen. William Nierenberg’s “caution, not panic” line was inverted. It was all panic without a hint of caution: “A Dire Forecast for ‘Greenhouse’ Earth” (the front page of The Washington Post); “Scientists Predict Catastrophes in Growing Global Heat Wave” (Chicago Tribune); “Swifter Warming of Globe Foreseen” (The New York Times).
  • After three years of backsliding and silence, Pomerance was exhilarated to see interest in the issue spike overnight. Not only that: A solution materialized, and a moral argument was passionately articulated — by Rhode Island’s Republican senator no less. “Ozone depletion and the greenhouse effect can no longer be treated solely as important scientific questions,” Chafee said. “They must be seen as critical problems facing the nations of the world, and they are problems that demand solutions.”
  • The old canard about the need for more research was roundly mocked — by Woodwell, by a W.R.I. colleague named Andrew Maguire, by Senator George Mitchell, a Democrat from Maine. “Scientists are never 100 percent certain,” the Princeton historian Theodore Rabb testified. “That notion of total certainty is something too elusive ever to be sought.” As Pomerance had been saying since 1979, it was past time to act. Only now the argument was so broadly accepted that nobody dared object.
  • The ozone hole, Pomerance realized, had moved the public because, though it was no more visible than global warming, people could be made to see it. They could watch it grow on video. Its metaphors were emotionally wrought: Instead of summoning a glass building that sheltered plants from chilly weather (“Everything seems to flourish in there”), the hole evoked a violent rending of the firmament, inviting deathly radiation. Americans felt that their lives were in danger. An abstract, atmospheric problem had been reduced to the size of the human imagination. It had been made just small enough, and just large enough, to break through.
  • Four years after “Changing Climate,” two years after a hole had torn open the firmament and a month after the United States and more than three dozen other nations signed a treaty to limit use of CFCs, the climate-change corps was ready to celebrate. It had become conventional wisdom that climate change would follow ozone’s trajectory. Reagan’s E.P.A. administrator, Lee M. Thomas, said as much the day he signed the Montreal Protocol on Substances That Deplete the Ozone Layer (the successor to the Vienna Convention), telling reporters that global warming was likely to be the subject of a future international agreement
  • Congress had already begun to consider policy — in 1987 alone, there were eight days of climate hearings, in three committees, across both chambers of Congress; Senator Joe Biden, a Delaware Democrat, had introduced legislation to establish a national climate-change strategy. And so it was that Jim Hansen found himself on Oct. 27 in the not especially distinguished ballroom of the Quality Inn on New Jersey Avenue, a block from the Capitol, at “Preparing for Climate Change,” which was technically a conference but felt more like a wedding.
  • John Topping was an old-line Rockefeller Republican, a Commerce Department lawyer under Nixon and an E.P.A. official under Reagan. He first heard about the climate problem in the halls of the E.P.A. in 1982 and sought out Hansen, who gave him a personal tutorial. Topping was amazed to discover that out of the E.P.A.’s 13,000-person staff, only seven people, by his count, were assigned to work on climate, though he figured it was more important to the long-term security of the nation than every other environmental issue combined.
  • Glancing around the room, Jim Hansen could chart, like an arborist counting rings on a stump, the growth of the climate issue over the decade. Veterans like Gordon MacDonald, George Woodwell and the environmental biologist Stephen Schneider stood at the center of things. Former and current staff members from the congressional science committees (Tom Grumbly, Curtis Moore, Anthony Scoville) made introductions to the congressmen they advised. Hansen’s owlish nemesis Fred Koomanoff was present, as were his counterparts from the Soviet Union and Western Europe. Rafe Pomerance’s cranium could be seen above the crowd, but unusually he was surrounded by colleagues from other environmental organizations that until now had shown little interest in a diffuse problem with no proven fund-raising record. The party’s most conspicuous newcomers, however, the outermost ring, were the oil-and-gas executives.
  • That evening, as a storm spat and coughed outside, Rafe Pomerance gave one of his exhortative speeches urging cooperation among the various factions, and John Chafee and Roger Revelle received awards; introductions were made and business cards earnestly exchanged. Not even a presentation by Hansen of his research could sour the mood. The next night, on Oct. 28, at a high-spirited dinner party in Topping’s townhouse on Capitol Hill, the oil-and-gas men joked with the environmentalists, the trade-group representatives chatted up the regulators and the academics got merrily drunk. Mikhail Budyko, the don of the Soviet climatologists, settled into an extended conversation about global warming with Topping’s 10-year-old son. It all seemed like the start of a grand bargain, a uniting of factions — a solution.
  • Hansen was accustomed to the bureaucratic nuisances that attended testifying before Congress; before a hearing, he had to send his formal statement to NASA headquarters, which forwarded it to the White House’s Office of Management and Budget for approval. “Major greenhouse climate changes are a certainty,” he had written. “By the 2010s [in every scenario], essentially the entire globe has very substantial warming.”
  • By all appearances, plans for major policy continued to advance rapidly. After the Johnston hearing, Timothy Wirth, a freshman Democratic senator from Colorado on the energy committee, began to plan a comprehensive package of climate-change legislation — a New Deal for global warming. Wirth asked a legislative assistant, David Harwood, to consult with experts on the issue, beginning with Rafe Pomerance, in the hope of converting the science of climate change into a new national energy policy.
  • In March 1988, Wirth joined 41 other senators, nearly half of them Republicans, to demand that Reagan call for an international treaty modeled after the ozone agreement. Because the United States and the Soviet Union were the world’s two largest contributors of carbon emissions, responsible for about one-third of the world total, they should lead the negotiations. Reagan agreed. In May, he signed a joint statement with Mikhail Gorbachev that included a pledge to cooperate on global warming.
  • Al Gore himself had, for the moment, withdrawn his political claim to the issue. In 1987, at the age of 39, Gore announced that he was running for president, in part to bring attention to global warming, but he stopped emphasizing it after the subject failed to captivate New Hampshire primary voters.
  • 5. ‘You Will See Things That You Shall Believe’ Summer 1988
  • It was the hottest and driest summer in history. Everywhere you looked, something was bursting into flames. Two million acres in Alaska incinerated, and dozens of major fires scored the West. Yellowstone National Park lost nearly one million acres. Smoke was visible from Chicago, 1,600 miles away.
  • In Nebraska, suffering its worst drought since the Dust Bowl, there were days when every weather station registered temperatures above 100 degrees. The director of the Kansas Department of Health and Environment warned that the drought might be the dawning of a climatic change that within a half century could turn the state into a desert.
  • On June 22 in Washington, where it hit 100 degrees, Rafe Pomerance received a call from Jim Hansen, who was scheduled to testify the following morning at a Senate hearing called by Timothy Wirth. “I hope we have good media coverage tomorrow,” Hansen said.
  • Hansen had just received the most recent global temperature data. Just over halfway into the year, 1988 was setting records. Already it had all but clinched the hottest year in history. Ahead of schedule, the signal was emerging from the noise. “I’m going to make a pretty strong statement,” Hansen said.
  • Hansen returned to his testimony. He wrote: “The global warming is now large enough that we can ascribe with a high degree of confidence a cause-and-effect relationship to the greenhouse effect.” He wrote: “1988 so far is so much warmer than 1987, that barring a remarkable and improbable cooling, 1988 will be the warmest year on record.” He wrote: “The greenhouse effect has been detected, and it is changing our climate now.”
  • “We have only one planet,” Senator Bennett Johnston intoned. “If we screw it up, we have no place to go.” Senator Max Baucus, a Democrat from Montana, called for the United Nations Environment Program to begin preparing a global remedy to the carbon-dioxide problem. Senator Dale Bumpers, a Democrat of Arkansas, previewed Hansen’s testimony, saying that it “ought to be cause for headlines in every newspaper in America tomorrow morning.” The coverage, Bumpers emphasized, was a necessary precursor to policy. “Nobody wants to take on any of the industries that produce the things that we throw up into the atmosphere,” he said. “But what you have are all these competing interests pitted against our very survival.”
  • Hansen, wiping his brow, spoke without affect, his eyes rarely rising from his notes. The warming trend could be detected “with 99 percent confidence,” he said. “It is changing our climate now.” But he saved his strongest comment for after the hearing, when he was encircled in the hallway by reporters. “It is time to stop waffling so much,” he said, “and say that the evidence is pretty strong that the greenhouse effect is here.”
  • The press followed Bumpers’s advice. Hansen’s testimony prompted headlines in dozens of newspapers across the country, including The New York Times, which announced, across the top of its front page: “Global Warming Has Begun, Expert Tells Senate.”
  • Rafe Pomerance called his allies on Capitol Hill, the young staff members who advised politicians, organized hearings, wrote legislation. We need to finalize a number, he told them, a specific target, in order to move the issue — to turn all this publicity into policy. The Montreal Protocol had called for a 50 percent reduction in CFC emissions by 1998. What was the right target for carbon emissions? It wasn’t enough to exhort nations to do better. That kind of talk might sound noble, but it didn’t change investments or laws. They needed a hard goal — something ambitious but reasonable. And they needed it soon: Just four days after Hansen’s star turn, politicians from 46 nations and more than 300 scientists would convene in Toronto at the World Conference on the Changing Atmosphere, an event described by Philip Shabecoff of The New York Times as “Woodstock for climate change.”
  • Pomerance had a proposal: a 20 percent reduction in carbon emissions by 2000. Ambitious, Harwood said. In all his work planning climate policy, he had seen no assurance that such a steep drop in emissions was possible. Then again, 2000 was more than a decade off, so it allowed for some flexibility.
  • Mintzer pointed out that a 20 percent reduction was consistent with the academic literature on energy efficiency. Various studies over the years had shown that you could improve efficiency in most energy systems by roughly 20 percent if you adopted best practices.
  • Of course, with any target, you had to take into account the fact that the developing world would inevitably consume much larger quantities of fossil fuels by 2000. But those gains could be offset by a wider propagation of the renewable technologies already at hand — solar, wind, geothermal. It was not a rigorous scientific analysis, Mintzer granted, but 20 percent sounded plausible. We wouldn’t need to solve cold fusion or ask Congress to repeal the law of gravity. We could manage it with the knowledge and technology we already had.
  • Besides, Pomerance said, 20 by 2000 sounds good.
  • The conference’s final statement, signed by all 400 scientists and politicians in attendance, repeated the demand with a slight variation: a 20 percent reduction in carbon emissions by 2005. Just like that, Pomerance’s best guess became global diplomatic policy.
  • Hansen, emerging from Anniek’s successful cancer surgery, took it upon himself to start a one-man public information campaign. He gave news conferences and was quoted in seemingly every article about the issue; he even appeared on television with homemade props. Like an entrant at an elementary-school science fair, he made “loaded dice” out of sections of cardboard and colored paper to illustrate the increased likelihood of hotter weather in a warmer climate. Public awareness of the greenhouse effect reached a new high of 68 percent.
  • global warming became a major subject of the presidential campaign. While Michael Dukakis proposed tax incentives to encourage domestic oil production and boasted that coal could satisfy the nation’s energy needs for the next three centuries, George Bush took advantage. “I am an environmentalist,” he declared on the shore of Lake Erie, the first stop on a five-state environmental tour that would take him to Boston Harbor, Dukakis’s home turf. “Those who think we are powerless to do anything about the greenhouse effect,” he said, “are forgetting about the White House effect.”
  • His running mate emphasized the ticket’s commitment to the issue at the vice-presidential debate. “The greenhouse effect is an important environmental issue,” Dan Quayle said. “We need to get on with it. And in a George Bush administration, you can bet that we will.”
  • This kind of talk roused the oil-and-gas men. “A lot of people on the Hill see the greenhouse effect as the issue of the 1990s,” a gas lobbyist told Oil & Gas Journal. Before a meeting of oil executives shortly after the “environmentalist” candidate won the election, Representative Dick Cheney, a Wyoming Republican, warned, “It’s going to be very difficult to fend off some kind of gasoline tax.” The coal industry, which had the most to lose from restrictions on carbon emissions, had moved beyond denial to resignation. A spokesman for the National Coal Association acknowledged that the greenhouse effect was no longer “an emerging issue. It is here already, and we’ll be hearing more and more about it.”
  • By the end of the year, 32 climate bills had been introduced in Congress, led by Wirth’s omnibus National Energy Policy Act of 1988. Co-sponsored by 13 Democrats and five Republicans, it established as a national goal an “International Global Agreement on the Atmosphere by 1992,” ordered the Energy Department to submit to Congress a plan to reduce energy use by at least 2 percent a year through 2005 and directed the Congressional Budget Office to calculate the feasibility of a carbon tax. A lawyer for the Senate energy committee told an industry journal that lawmakers were “frightened” by the issue and predicted that Congress would eventually pass significant legislation after Bush took office
  • The other great powers refused to wait. The German Parliament created a special commission on climate change, which concluded that action had to be taken immediately, “irrespective of any need for further research,” and that the Toronto goal was inadequate; it recommended a 30 percent reduction of carbon emissions.
  • Margaret Thatcher, who had studied chemistry at Oxford, warned in a speech to the Royal Society that global warming could “greatly exceed the capacity of our natural habitat to cope” and that “the health of the economy and the health of our environment are totally dependent upon each other.”
  • The prime ministers of Canada and Norway called for a binding international treaty on the atmosphere; Sweden’s Parliament went further, announcing a national strategy to stabilize emissions at the 1988 level and eventually imposing a carbon tax.
  • The United Nations unanimously endorsed the establishment, by the World Meteorological Organization and the United Nations Environment Program, of an Intergovernmental Panel on Climate Change, composed of scientists and policymakers, to conduct scientific assessments and develop global climate policy.
  • One of the I.P.C.C.’s first sessions to plan an international treaty was hosted by the State Department, 10 days after Bush’s inauguration. James Baker chose the occasion to make his first speech as secretary of state. “We can probably not afford to wait until all of the uncertainties about global climate change have been resolved,” he said. “Time will not make the problem go away.”
  • On April 14, 1989, a bipartisan group of 24 senators, led by the majority leader, George Mitchell, requested that Bush cut emissions in the United States even before the I.P.C.C.’s working group made its recommendation. “We cannot afford the long lead times associated with a comprehensive global agreement,” the senators wrote. Bush had promised to combat the greenhouse effect with the White House effect. The self-proclaimed environmentalist was now seated in the Oval Office. It was time.
  • 8. ‘You Never Beat The White House’ April 1989
  • After Jim Baker gave his boisterous address to the I.P.C.C. working group at the State Department, he received a visit from John Sununu, Bush’s chief of staff. Leave the science to the scientists, Sununu told Baker. Stay clear of this greenhouse-effect nonsense. You don’t know what you’re talking about. Baker, who had served as Reagan’s chief of staff, didn’t speak about the subject again.
  • despite his reputation as a political wolf, he still thought of himself as a scientist — an “old engineer,” as he was fond of putting it, having earned a Ph.D. in mechanical engineering from M.I.T. decades earlier. He lacked the reflexive deference that so many of his political generation reserved for the class of elite government scientists.
  • Since World War II, he believed, conspiratorial forces had used the imprimatur of scientific knowledge to advance an “anti-growth” doctrine. He reserved particular disdain for Paul Ehrlich’s “The Population Bomb,” which prophesied that hundreds of millions of people would starve to death if the world took no step to curb population growth; the Club of Rome, an organization of European scientists, heads of state and economists, which similarly warned that the world would run out of natural resources; and as recently as the mid-’70s, the hypothesis advanced by some of the nation’s most celebrated scientists — including Carl Sagan, Stephen Schneider and Ichtiaque Rasool — that a new ice age was dawning, thanks to the proliferation of man-made aerosols. All were theories of questionable scientific merit, portending vast, authoritarian remedies to halt economic progress.
  • Sununu had suspected that the greenhouse effect belonged to this nefarious cabal since 1975, when the anthropologist Margaret Mead convened a symposium on the subject at the National Institute of Environmental Health Sciences.
  • When Mead talked about “far-reaching” decisions and “long-term consequences,” Sununu heard the marching of jackboots.
  • While Sununu and Darman reviewed Hansen’s statements, the E.P.A. administrator, William K. Reilly, took a new proposal to the White House. The next meeting of the I.P.C.C.’s working group was scheduled for Geneva the following month, in May; it was the perfect occasion, Reilly argued, to take a stronger stand on climate change. Bush should demand a global treaty to reduce carbon emissions.
  • Sununu wouldn’t budge. He ordered the American delegates not to make any commitment in Geneva. Very soon after that, someone leaked the exchange to the press.
  • A deputy of Jim Baker pulled Reilly aside. He said he had a message from Baker, who had observed Reilly’s infighting with Sununu. “In the long run,” the deputy warned Reilly, “you never beat the White House.”
  • 9. ‘A Form of Science Fraud’ May 1989
  • The cameras followed Hansen and Gore into the marbled hallway. Hansen insisted that he wanted to focus on the science. Gore focused on the politics. “I think they’re scared of the truth,” he said. “They’re scared that Hansen and the other scientists are right and that some dramatic policy changes are going to be needed, and they don’t want to face up to it.”
  • The censorship did more to publicize Hansen’s testimony and the dangers of global warming than anything he could have possibly said. At the White House briefing later that morning, Press Secretary Marlin Fitzwater admitted that Hansen’s statement had been changed. He blamed an official “five levels down from the top” and promised that there would be no retaliation. Hansen, he added, was “an outstanding and distinguished scientist” and was “doing a great job.”
  • 10. The White House Effect Fall 1989
  • The Los Angeles Times called the censorship “an outrageous assault.” The Chicago Tribune said it was the beginning of “a cold war on global warming,” and The New York Times warned that the White House’s “heavy-handed intervention sends the signal that Washington wants to go slow on addressing the greenhouse problem.”
  • Darman went to see Sununu. He didn’t like being accused of censoring scientists. They needed to issue some kind of response. Sununu called Reilly to ask if he had any ideas. We could start, Reilly said, by recommitting to a global climate treaty. The United States was the only Western nation on record as opposing negotiations.
  • Sununu sent a telegram to Geneva endorsing a plan “to develop full international consensus on necessary steps to prepare for a formal treaty-negotiating process. The scope and importance of this issue are so great that it is essential for the U.S. to exercise leadership.”
  • Sununu seethed at any mention of the subject. He had taken it upon himself to study the greenhouse effect more deeply; he had a rudimentary, one-dimensional general circulation model installed on his personal desktop computer. He decided that the models promoted by Jim Hansen were a lot of bunk. They were horribly imprecise in scale and underestimated the ocean’s ability to mitigate warming. Sununu complained about Hansen to D. Allan Bromley, a nuclear physicist from Yale who, at Sununu’s recommendation, was named Bush’s science adviser. Hansen’s findings were “technical poppycock” that didn’t begin to justify such wild-eyed pronouncements that “the greenhouse effect is here” or that the 1988 heat waves could be attributed to global warming, let alone serve as the basis for national economic policy.
  • When a junior staff member in the Energy Department, in a meeting at the White House with Sununu and Reilly, mentioned an initiative to reduce fossil-fuel use, Sununu interrupted her. “Why in the world would you need to reduce fossil-fuel use?” he asked. “Because of climate change,” the young woman replied. “I don’t want anyone in this administration without a scientific background using ‘climate change’ or ‘global warming’ ever again,” he said. “If you don’t have a technical basis for policy, don’t run around making decisions on the basis of newspaper headlines.” After the meeting, Reilly caught up to the staff member in the hallway. She was shaken. Don’t take it personally, Reilly told her. Sununu might have been looking at you, but that was directed at me.
  • Reilly, for his part, didn’t entirely blame Sununu for Bush’s indecision on the prospect of a climate treaty. The president had never taken a vigorous interest in global warming and was mainly briefed about it by nonscientists. Bush had brought up the subject on the campaign trail, in his speech about the White House effect, after leafing through a briefing booklet for a new issue that might generate some positive press. When Reilly tried in person to persuade him to take action, Bush deferred to Sununu and Baker. Why don’t the three of you work it out, he said. Let me know when you decide.
  • Relations between Sununu and Reilly became openly adversarial. Reilly, Sununu thought, was a creature of the environmental lobby. He was trying to impress his friends at the E.P.A. without having a basic grasp of the science himself.
  • Pomerance had the sinking feeling that the momentum of the previous year was beginning to flag. The censoring of Hansen’s testimony and the inexplicably strident opposition from John Sununu were ominous signs. So were the findings of a report Pomerance had commissioned, published in September by the World Resources Institute, tracking global greenhouse-gas emissions. The United States was the largest contributor by far, producing nearly a quarter of the world’s carbon emissions, and its contribution was growing faster than that of every other country. Bush’s indecision, or perhaps inattention, had already managed to delay the negotiation of a global climate treaty until 1990 at the earliest, perhaps even 1991. By then, Pomerance worried, it would be too late.
  • Pomerance tried to be more diplomatic. “The president made a commitment to the American people to deal with global warming,” he told The Washington Post, “and he hasn’t followed it up.” He didn’t want to sound defeated. “There are some good building blocks here,” Pomerance said, and he meant it. The Montreal Protocol on CFCs wasn’t perfect at first, either — it had huge loopholes and weak restrictions. Once in place, however, the restrictions could be tightened. Perhaps the same could happen with climate change. Perhaps. Pomerance was not one for pessimism. As William Reilly told reporters, dutifully defending the official position forced upon him, it was the first time that the United States had formally endorsed the concept of an emissions limit. Pomerance wanted to believe that this was progress.
  • All week in Noordwijk, Becker couldn’t stop talking about what he had seen in Zeeland. After a flood in 1953, when the sea swallowed much of the region, killing more than 2,000 people, the Dutch began to build the Delta Works, a vast concrete-and-steel fortress of movable barriers, dams and sluice gates — a masterpiece of human engineering. The whole system could be locked into place within 90 minutes, defending the land against storm surge. It reduced the country’s exposure to the sea by 700 kilometers, Becker explained. The United States coastline was about 153,000 kilometers long. How long, he asked, was the entire terrestrial coastline? Because the whole world was going to need this. In Zeeland, he said, he had seen the future.
  • Ken Caldeira, a climate scientist at the Carnegie Institution for Science in Stanford, Calif., has a habit of asking new graduate students to name the largest fundamental breakthrough in climate physics since 1979. It’s a trick question. There has been no breakthrough. As with any mature scientific discipline, there is only refinement. The computer models grow more precise; the regional analyses sharpen; estimates solidify into observational data. Where there have been inaccuracies, they have tended to be in the direction of understatement.
  • More carbon has been released into the atmosphere since the final day of the Noordwijk conference, Nov. 7, 1989, than in the entire history of civilization preceding it.
  • Despite every action taken since the Charney report — the billions of dollars invested in research, the nonbinding treaties, the investments in renewable energy — the only number that counts, the total quantity of global greenhouse gas emitted per year, has continued its inexorable rise.
  • When it comes to our own nation, which has failed to make any binding commitments whatsoever, the dominant narrative for the last quarter century has concerned the efforts of the fossil-fuel industries to suppress science, confuse public knowledge and bribe politicians.
  • The mustache-twirling depravity of these campaigns has left the impression that the oil-and-gas industry always operated thus; while the Exxon scientists and American Petroleum Institute clerics of the ’70s and ’80s were hardly good Samaritans, they did not start multimillion-dollar disinformation campaigns, pay scientists to distort the truth or try to brainwash children in elementary schools, as their successors would.
  • It was James Hansen’s testimony before Congress in 1988 that, for the first time since the “Changing Climate” report, made oil-and-gas executives begin to consider the issue’s potential to hurt their profits. Exxon, as ever, led the field. Six weeks after Hansen’s testimony, Exxon’s manager of science and strategy development, Duane LeVine, prepared an internal strategy paper urging the company to “emphasize the uncertainty in scientific conclusions.” This shortly became the default position of the entire sector. LeVine, it so happened, served as chairman of the global petroleum industry’s Working Group on Global Climate Change, created the same year, which adopted Exxon’s position as its own.
  • The American Petroleum Institute, after holding a series of internal briefings on the subject in the fall and winter of 1988, including one for the chief executives of the dozen or so largest oil companies, took a similar, if slightly more diplomatic, line. It set aside money for carbon-dioxide policy — about $100,000, a fraction of the millions it was spending on the health effects of benzene, but enough to establish a lobbying organization called, in an admirable flourish of newspeak, the Global Climate Coalition.
  • The G.C.C. was conceived as a reactive body, to share news of any proposed regulations, but on a whim, it added a press campaign, to be coordinated mainly by the A.P.I. It gave briefings to politicians known to be friendly to the industry and approached scientists who professed skepticism about global warming. The A.P.I.’s payment for an original op-ed was $2,000.
  • It was joined by the U.S. Chamber of Commerce and 14 other trade associations, including those representing the coal, electric-grid and automobile industries.
  • In October 1989, scientists allied with the G.C.C. began to be quoted in national publications, giving an issue that lacked controversy a convenient fulcrum. “Many respected scientists say the available evidence doesn’t warrant the doomsday warnings,” was the caveat that began to appear in articles on climate change.
  • The following year, when President Bill Clinton proposed an energy tax in the hope of meeting the goals of the Rio treaty, the A.P.I. invested $1.8 million in a G.C.C. disinformation campaign. Senate Democrats from oil-and-coal states joined Republicans to defeat the tax proposal, which later contributed to the Republicans’ rout of Democrats in the midterm congressional elections in 1994 — the first time the Republican Party had won control of both houses in 40 years
  • The G.C.C. spent $13 million on a single ad campaign intended to weaken support for the 1997 Kyoto Protocol, which committed its parties to reducing greenhouse-gas emissions by 5 percent relative to 1990 levels. The Senate, which would have had to ratify the agreement, took a pre-emptive vote declaring its opposition; the resolution passed 95-0. There has never been another serious effort to negotiate a binding global climate treaty.
  • This has made the corporation an especially vulnerable target for the wave of compensatory litigation that began in earnest in the last three years and may last a generation. Tort lawsuits have become possible only in recent years, as scientists have begun more precisely to attribute regional effects to global emission levels. This is one subfield of climate science that has advanced significantly since.
  • Pomerance had not been among the 400 delegates invited to Noordwijk. But together with three young activists — Daniel Becker of the Sierra Club, Alden Meyer of the Union of Concerned Scientists and Stewart Boyle from Friends of the Earth — he had formed his own impromptu delegation. Their constituency, they liked to say, was the climate itself. Their mission was to pressure the delegates to include in the final conference statement, which would be used as the basis for a global treaty, the target proposed in Toronto: a 20 percent reduction of greenhouse-gas combustion by 2005. It was the only measure that mattered, the amount of emissions reductions, and the Toronto number was the strongest global target yet proposed.
  • The delegations would review the progress made by the I.P.C.C. and decide whether to endorse a framework for a global treaty. There was a general sense among the delegates that they would, at minimum, agree to the target proposed by the host, the Dutch environmental minister, more modest than the Toronto number: a freezing of greenhouse-gas emissions at 1990 levels by 2000. Some believed that if the meeting was a success, it would encourage the I.P.C.C. to accelerate its negotiations and reach a decision about a treaty sooner. But at the very least, the world’s environmental ministers should sign a statement endorsing a hard, binding target of emissions reductions. The mood among the delegates was electric, nearly giddy — after more than a decade of fruitless international meetings, they could finally sign an agreement that meant something.
  • 11. ‘The Skunks at The Garden Party’ November 1989
  • It was nearly freezing — Nov. 6, 1989, on the coast of the North Sea in the Dutch resort town of Noordwijk.
  • Losing Earth: The Decade We Almost Stopped Climate Change. We knew everything we needed to know, and nothing stood in our way. Nothing, that is, except ourselves. A tragedy in two acts. By Nathaniel Rich. Photographs and Videos by George Steinmetz. AUG. 1, 2018
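Several of the excerpts above turn on simple climate models: Sununu's one-dimensional model, Hansen's general circulation models. As an illustration only (the code and parameter values below are standard textbook material, not from the article), the logic of the very simplest such model, a zero-dimensional energy balance, can be sketched in a few lines:

```python
# Zero-dimensional energy-balance model: the simplest "climate model."
# Absorbed sunlight must balance the thermal radiation Earth emits to space.
# All constants are standard textbook values, not figures from the article.

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0        # solar constant, W m^-2
ALBEDO = 0.3       # fraction of sunlight reflected back to space

def equilibrium_temp(emissivity=1.0):
    """Surface temperature (K) at which emitted IR balances absorbed solar.

    absorbed = S0/4 * (1 - albedo);  emitted = emissivity * sigma * T^4.
    Lowering the effective emissivity mimics a stronger greenhouse effect.
    """
    absorbed = S0 / 4.0 * (1.0 - ALBEDO)
    return (absorbed / (emissivity * SIGMA)) ** 0.25

bare = equilibrium_temp(1.0)          # no greenhouse effect: ~255 K (-18 C)
greenhouse = equilibrium_temp(0.612)  # tuned emissivity: ~288 K (+15 C)
print(round(bare, 1), round(greenhouse, 1))
```

Radiative balance alone gives roughly 255 K; lowering the effective emissivity (a crude stand-in for greenhouse gases trapping outgoing infrared) raises the equilibrium toward the observed surface temperature of about 288 K. Real models, even Sununu's one-dimensional one, add vertical structure, circulation and ocean uptake on top of this balance.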
Javier E

Inside the Struggle to Make Lab-Grown Meat - WSJ - 0 views

  • “We can make it on small scales successfully,” said Josh Tetrick, chief executive officer of a rival food-technology company, Eat Just Inc. “What is uncertain is whether we and other companies will be able to produce this at the largest of scales, at the lowest of costs within the next decade.”
  • Mr. Tetrick said Eat Just’s Good Meat unit sells less than 5,000 pounds annually of its hybrid cultivated chicken in Singapore.
  • ...10 more annotations...
  • Uma Valeti, the company’s CEO, said Upside has proven it can safely produce a delicious product. The company said that it has helped pioneer an industry and that it is making progress on growing larger quantities of meat, while bringing down its cost.
  • According to former employees, Upside has struggled to produce large quantities of meat. They said the company often scrambled to make enough for lab analysis and tastings. Upside for years worked to grow whole cuts of meat, which proved difficult in its bioreactors. It battled contamination in its labs. Traces of rodent DNA once tainted a chicken cell line, according to former employees, and confirmed by company executives.  
  • Today, the company is growing its marquee filet not in large bioreactors at its pilot plant but in two-liter plastic bottles akin to those used to grow cells for decades by pharmaceutical companies. 
  • “Roller bottles aren’t scalable. Too small, too labor-intensive,”
  • Upside’s pilot plant isn’t yet operating at the 50,000-pound annual capacity the company announced when it opened in 2021, according to company executives, much less its future target of 400,000 pounds. Production can accelerate once Upside receives USDA clearance, company executives said.
  • Industry champions said they are confident that steady scientific progress will help reduce production costs for cultivated meat, while climate change and global population growth will intensify the need for it.
  • “It turned out that tissue, or creating this whole-cut texture, was really challenging,” said Amy Chen, Upside’s chief operating officer.
  • Upside also wrestled with problems common to other cultivated-meat makers, including a battle against bacteria, according to former employees. Growing meat requires meticulous sterilization because small quantities of bacteria can quickly overtake a bioreactor, ruining a batch.
  • The company said contamination can slow production, but doesn’t affect final cultivated products, unlike conventional meat. The company said that autoclaves sometimes require maintenance and that meat grown for consumers won’t be produced in the older building.
  • Some industry officials think companies can surmount contamination problems, but that other hurdles will still abound, including those tied to growing the finicky cells and the high cost of supplies.  
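One excerpt above notes that small quantities of bacteria can quickly overtake a bioreactor. A toy calculation shows why: bacteria divide on the order of every half hour, while animal cells take roughly a day, so a single contaminant outgrows even a billion-cell culture within 24 hours. (The doubling times here are rough, illustrative figures, not numbers from the article.)

```python
# Why trace contamination ruins a batch: bacteria double far faster than
# animal cells, so even a tiny inoculum overtakes the culture in a day.
# Doubling times below are rough, illustrative figures, not from the article.

BACTERIA_DOUBLING_H = 0.5   # e.g. ~30 min for fast-growing bacteria
CELL_DOUBLING_H = 24.0      # ~1 day for typical animal cells

def population(initial, doubling_h, hours):
    """Exponential growth: N(t) = N0 * 2 ** (t / doubling_time)."""
    return initial * 2 ** (hours / doubling_h)

# Start with a billion animal cells and a single stray bacterium.
cells = population(1e9, CELL_DOUBLING_H, 24)       # doubles once: 2e9
bacteria = population(1, BACTERIA_DOUBLING_H, 24)  # doubles 48 times: ~2.8e14
print(bacteria > cells)  # True: one bacterium outnumbers the culture in a day
```

The arithmetic is unforgiving: 48 doublings from a single cell yields about 2.8e14 bacteria, five orders of magnitude more than the meat culture, which is why the sterilization regime has to be meticulous rather than merely careful.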
Javier E

There Are Too Many Books; Or, Publishing Shouldn't Be All About Quantity ‹ Li... - 0 views

  • In a January 30 interview, newly installed Penguin Random House CEO Nihar Malaviya told New York Times reporter Liz Harris that after the deal to acquire Simon & Schuster fell through, he envisions a new strategy for increasing market share. “Much of its growth will have to come organically—by selling more books. Mr. Malaviya said that, hopefully, A.I. will help, making it easier to publish more titles without hiring ever more employees.”
  • It’s about the very American and capitalist idea that more is always better: that constantly churning out new products will help companies achieve year-over-year growth, which, of course, is the paramount goal.
  • Their authors increasingly wonder if they should reach inside their own wallets and hire outside help, not because the people working on their books are too lazy to do their jobs, but because freelance publicists and marketers are more likely to have the bandwidth to be thorough.
  • ...5 more annotations...
  • I’ve spoken to in-house editors and publicists who are more inundated than ever, unable to give each of their titles the attention they deserve. Their submissions and workloads have increased even as marketing and editorial resources for individual titles have tapered off.
  • In the corporate world, output seems to be becoming more and more of a percentage game. You throw a bunch of products against the wall, see what sticks, and write off the ones (a vast majority) that don’t.
  • What a remarkable change it would be if corporations would allow their employees to do the best job they can with each book that the company has chosen to buy, rather than allowing them to flail.
  • I had always thought that “discoverability” was a unique problem for books because so much browsing happens online rather than in carefully curated physical stores, but the world of streaming TV and movies has begun to catch up.
  • What do corporate publishing and streaming have in common? They’re very often run by people who don’t engage with the products they put out.
Javier E

Economics of Good and Evil: The Quest for Economic Meaning from Gilgamesh to Wall Stree... - 2 views

  • Instead of self-confident and self-centered answers, the author humbly asks fundamental questions: What is economics? What is its meaning? Where does this new religion, as it is sometimes called, come from? What are its possibilities and its limitations and borders, if there are any? Why are we so dependent on permanent growing of growth and growth of growing of growth? Where did the idea of progress come from, and where is it leading us? Why are so many economic debates accompanied by obsession and fanaticism?
  • The majority of our political parties act with a narrow materialistic focus when, in their programs, they present the economy and finance first; only then, somewhere at the end, do we find culture as something pasted on or as a libation for a couple of madmen.
  • most of them—consciously or unconsciously—accept and spread the Marxist thesis of the economic base and the spiritual superstructure.
  • ...297 more annotations...
  • He tries to break free of narrow specialization and cross the boundaries between scientific disciplines. Expeditions beyond economics’ borders and its connection to history, philosophy, psychology, and ancient myths are not only refreshing, but necessary for understanding the world of the twenty-first century.
  • Reality is spun from stories, not from material. Zdeněk Neubauer
  • “The separation between the history of a science, its philosophy, and the science itself dissolves into thin air, and so does the separation between science and non-science; differences between the scientific and unscientific are vanishing.”
  • Outside of our history, we have nothing more.
  • The study of the history of a certain field is not, as is commonly held, a useless display of its blind alleys or a collection of the field’s trials and errors (until we got it right), but history is the fullest possible scope of study of a menu that the given field can offer.
  • History of thought helps us to get rid of the intellectual brainwashing of the age, to see through the intellectual fashion of the day, and to take a couple of steps back.
  • Almost all of the key concepts by which economics operates, both consciously and unconsciously, have a long history, and their roots extend predominantly outside the range of economics, and often completely beyond that of science.
  • That is the reason for this book: to look for economic thought in ancient myths and, vice versa, to look for myths in today’s economics.
  • Adam Smith believed in stories. As he puts it in The Theory of Moral Sentiments, “the desire of being believed, or the desire of persuading, of leading and directing other people, seems to be one of the strongest of all our natural desires.”
  • “The human mind is built to think in terms of narratives … in turn, much of human motivation comes from living through a story of our lives, a story that we tell to ourselves and that creates a framework of our motivation. Life could be just ‘one damn thing after another’ if it weren’t for such stories. The same is true for confidence in a nation, a company, or an institution. Great leaders are foremost creators of stories.”
  • contrary to what our textbooks say, economics is predominantly a normative field. Economics not only describes the world but is frequently about how the world should be (it should be effective, we have an ideal of perfect competition, an ideal of high-GDP growth in low inflation, the effort to achieve high competitiveness …). To this end, we create models, modern parables,
  • I will try to show that mathematics, models, equations, and statistics are just the tip of the iceberg of economics; that the biggest part of the iceberg of economic knowledge consists of everything else; and that disputes in economics are rather a battle of stories and various metanarratives than anything else.
  • Before it was emancipated as a field, economics lived happily within subsets of philosophy—ethics, for example—miles away from today’s concept of economics as a mathematical-allocative science that views “soft sciences” with a scorn born from positivistic arrogance. But our thousand-year “education” is built on a deeper, broader, and oftentimes more solid base. It is worth knowing about.
  • It is a paradox that a field that primarily studies values wants to be value-free. One more paradox is this: a field that believes in the invisible hand of the market wants to be without mysteries.
  • Is mathematics at the core of economics, or is it just the icing on the cake, the tip of the iceberg of our field’s inquiry?
  • we seek to chart the development of the economic ethos. We ask questions that come before any economic thinking can begin—both philosophically and, to a degree, historically. The area here lies at the very borders of economics—and often beyond. We may refer to this as protoeconomics (to borrow a term from protosociology) or, perhaps more fittingly, metaeconomics (to borrow a term from metaphysics).
  • In this sense, “the study of economics is too narrow and too fragmentary to lead to valid insight, unless complemented and completed by a study of metaeconomics.”17
  • The more important elements of a culture or field of inquiry such as economics are found in fundamental assumptions that adherents of all the various systems within the epoch unconsciously presuppose. Such assumptions appear so obvious that people do not know what they are assuming, because no other way of putting things has ever occurred to them, as the philosopher Alfred Whitehead notes in Adventures of Ideas.
  • I argue that economic questions were with mankind long before Adam Smith. I argue that the search for values in economics did not start with Adam Smith but culminated with him.
  • We should go beyond economics and study what beliefs are “behind the scenes,” ideas that have often become the dominant yet unspoken assumptions in our theories. Economics is surprisingly full of tautologies that economists are predominantly unaware of. I
  • argue that economics should seek, discover, and talk about its own values, although we have been taught that economics is a value-free science. I argue that none of this is true and that there is more religion, myth, and archetype in economics than there is mathematics.
  • In a way, this is a study of the evolution of both homo economicus and, more importantly, the history of the animal spirits within him. This book tries to study the evolution of the rational as well as the emotional and irrational side of human beings.
  • I argue that his most influential contribution to economics was ethical. His other thoughts had been clearly expressed long before him, whether on specialization, or on the principle of the invisible hand of the market. I try to show that the principle of the invisible hand of the market is much more ancient and developed long before Adam Smith. Traces of it appear even in the Epic of Gilgamesh, Hebrew thought, and in Christianity, and it is expressly stated by Aristophanes and Thomas Aquinas.
  • This is not a book on the thorough history of economic thought. The author aims instead to supplement certain chapters on the history of economic thought with a broader perspective and analysis of the influences that often escape the notice of economists and the wider public.
  • Progress (Naturalness and Civilization)
  • The Economy of Good and Evil
  • from his beginnings, man has been marked as a naturally unnatural creature, who for unique reasons surrounds himself with external possessions. Insatiability, both material and spiritual, are basic human metacharacteristics, which appear as early as the oldest myths and stories.
  • the Hebrews, with linear time, and later the Christians gave us the ideal (or amplified the Hebrew ideal) we now embrace. Then the classical economists secularized progress. How did we come to today’s progression of progress, and growth for growth’s sake?
  • The Need for Greed: The History of Consumption and Labor
  • Metamathematics From where did economics get the concept of numbers as the very foundation of the world?
  • All of economics is, in the end, economics of good and evil. It is the telling of stories by people of people to people. Even the most sophisticated mathematical model is, de facto, a story, a parable, our effort to (rationally) grasp the world around us.
  • idea that we can manage to utilize our natural egoism, and that this evil is good for something, is an ancient philosophical and mythical concept. We will also look into the development of the ethos of homo economicus, the birth of “economic man.”
  • The History of Animal Spirits: Dreams Never Sleep
  • Masters of the Truth
  • Originally, truth was a domain of poems and stories, but today we perceive truth as something much more scientific, mathematical. Where does one go (to shop) for the truth? And who “has the truth” in our epoch?
  • Our animal spirits (something of a counterpart to rationality) are influenced by the archetype of the hero and our concept of what is good.
  • The entire history of ethics has been ruled by an effort to create a formula for the ethical rules of behavior. In the final chapter we will show the tautology of Max Utility, and we will discuss the concept of Max Good.
  • The History of the Invisible Hand of the Market and Homo Economicus
  • We understand “economics” to mean a broader field than just the production, distribution, and consumption of goods and services. We consider economics to be the study of human relations that are sometimes expressible in numbers, a study that deals with tradables, but one that also deals with nontradables (friendship, freedom, efficiency, growth).
  • When we mention economics in this book, we mean the mainstream perception of it, perhaps as best represented by Paul Samuelson.
  • By the term homo economicus, we mean the primary concept of economic anthropology. It comes from the concept of a rational individual, who, led by narrowly egotistical motives, sets out to maximize his benefit.
  • the Epic of Gilgamesh bears witness to the opposite—despite the fact that the first written clay fragments (such as notes and bookkeeping) of our ancestors may have been about business and war, the first written story is mainly about great friendship and adventure.
  • there is no mention of either money or war; for example, not once does anyone in the whole epic sell or purchase something.5 No nation conquers another, and we do not encounter a mention even of the threat of violence.
  • Gilgamesh becomes a hero not only due to his strength, but also due to discoveries and deeds whose importance were in large part economic—direct gaining of construction materials in the case of felling the cedar forest, stopping Enkidu from devastating Uruk’s economy, and discovering new desert routes during his expeditions.
  • Even today we live in Gilgamesh’s vision that human relations—and therefore humanity itself—are a disturbance to work and efficiency; that people would perform better if they did not “waste” their time and energy on nonproductive things.
  • is a story of nature and civilization, of heroism, defiance, and the battle against the gods, and evil; an epic about wisdom, immortality, and also futility.
  • But labour is unlike any other commodity. The work environment is of no concern for steel; we do not care about steel’s well-being.16
  • But it is in friendship where—often by-the-way, as a side product, an externality—ideas and deeds are frequently performed or created that together can altogether change the face of society.19 Friendship can go against an ingrained system in places where an individual does not have the courage to do so himself or herself.
  • As Joseph Stiglitz says, One of the great “tricks” (some say “insights”) of neoclassical economics is to treat labour like any other factor of production. Output is written as a function of inputs—steel, machines, and labour. The mathematics treats labour like any other commodity, lulling one into thinking of labour like an ordinary commodity, such as steel or plastic.
  • Even the earliest cultures were aware of the value of cooperation on the working level—today we call this collegiality, fellowship, or, if you want to use a desecrated term, comradeship. These “lesser relationships” are useful and necessary for society and for companies because work can be done much faster and more effectively if people get along with each other on a human level
  • But true friendship, which becomes one of the central themes of the Epic of Gilgamesh, comes from completely different material than teamwork. Friendship, as C. S. Lewis accurately describes it, is completely uneconomical, unbiological, unnecessary for civilization, and an unneeded relationship
  • Here we have a beautiful example of the power of friendship, one that knows how to transform (or break down) a system and change a person. Enkidu, sent to Gilgamesh as a punishment from the gods, in the end becomes his faithful friend, and together they set out against the gods. Gilgamesh would never have gathered the courage to do something like that on his own—nor would Enkidu.
  • Due to their friendship, Gilgamesh and Enkidu then intend to stand up to the gods themselves and turn a holy tree into mere (construction) material they can handle almost freely, thereby making it a part of the city-construct, part of the building material of civilization, thus “enslaving” that which originally was part of wild nature. This is a beautiful proto-example of the shifting of the borders between the sacred and profane (secular)—and to a certain extent also an early illustration of the idea that nature is there to provide cities and people with raw material and production resources.
  • This view started with the Babylonians—rural nature becomes just a supplier of raw materials, resources (and humans the source of human resources). Nature is not the garden in which humans were created and placed, which they should care for and which they should reside in, but becomes a mere reservoir for natural (re)sources.
  • Even today, we often consider the domain of humanity (human relations, love, friendship, beauty, art, etc.) to be unproductive;
  • Both heroes change—each from opposite poles—into humans. In this context, a psychological dimension to the story may be useful: “Enkidu (…) is Gilgamesh’s alter ego, the dark, animal side of his soul, the complement to his restless heart. When Gilgamesh found Enkidu, he changed from a hated tyrant into the protector of his city. (…)
  • To be human seems to be somewhere in between, or both of these two.
  • In this moment of rebirth from an animal to a human state, the world’s oldest preserved epic implicitly hints at something highly important. Here we see what early cultures considered the beginning of civilization. Here is depicted the difference between people and animals or, better, savages. Here the epic quietly describes birth, the awakening of a conscious, civilized human. We are witnesses to the emancipation of humanity from animals.
  • The entire history of culture is dominated by an effort to become as independent as possible from the whims of nature.39 The more developed a civilization is, the more an individual is protected from nature and natural influences and knows how to create around him a constant or controllable environment to his liking.
  • The price we pay for independence from the whims of nature is dependence on our societies and civilizations. The more sophisticated a given society is as a whole, the less its members are able to survive on their own as individuals, without society.
  • The epic captures one of the greatest leaps in the development of the division of labor. Uruk itself is one of the oldest cities of all, and in the epic it reflects a historic step forward in specialization—in the direction of a new social city arrangement. Because of the city wall, people in the city can devote themselves to things other than worrying about their own safety, and they can continue to specialize more deeply.
  • Human life in the city gains a new dimension and suddenly it seems more natural to take up issues going beyond the life span of an individual. “The city wall symbolizes as well as founds the permanence of the city as an institution which will remain forever and give its inhabitants the certainty of unlimited safety, allowing them to start investing with an outlook reaching far beyond the borders of individual life.”
  • The wall around the city of Uruk is, among other things, a symbol of an internal distancing from nature, a symbol of revolts against submission to laws that do not come under the control of man and that man can at most discover and use to his benefit.
  • “The chief thing which the common-sense individual wants is not satisfactions for the wants he had, but more, and better wants.”47
  • If a consumer buys something, theoretically it should rid him of one of his needs—and the aggregate of things they need should be decreased by one item. In reality, though, the aggregate of “I want to have” expands together with the growing aggregate of “I have.”
  • It can be said that Enkidu was therefore happy in his natural state, because all of his needs were satiated. On the other hand, with people, it appears that the more a person has, the more developed and richer he is, the greater the number of his needs (including the unsaturated ones).
  • In the Old Testament, this relationship is perceived completely differently. Man (humanity) is created in nature, in a garden. Man was supposed to care for the Garden of Eden and live in harmony with nature and the animals. Soon after creation, man walks naked and is not ashamed, de facto the same as the animals. What is characteristic is that man dresses (the natural state of creation itself is not enough for him), and he (literally and figuratively) covers52 himself—in shame after the fall.53
  • Nature is where one goes to hunt, collect crops, or gather the harvest. It is perceived as the saturator of our needs and nothing more. One goes back to the city to sleep and be “human.” On the contrary, evil resides in nature. Humbaba lives in the cedar forest, which also happens to be the reason to completely eradicate it.
  • Symbolically, then, we can view the entire issue from the standpoint of the epic in the following way: Our nature is insufficient, bad, evil, and good (humane) occurs only after emancipation from nature (from naturalness), through culturing and education. Humanity is considered as being in civilization.
  • The city was frequently (at least in older Jewish writings) a symbol of sin, degeneration, and decadence—nonhumanity. The Hebrews were originally a nomadic nation, one that avoided cities. It is no accident that the first important city57 mentioned in the Bible is proud Babylon,58 which God later turns to dust.
  • It is enough, for example, to read the Book of Revelation to see how the vision of paradise developed from the deep Old Testament period, when paradise was a garden. John describes his vision of heaven as a city—paradise is in New Jerusalem, a city where the dimensions of the walls(!) are described in detail, as are the golden streets and gates of pearl.
  • The Hebrews later also chose a king (despite the unanimous opposition of God’s prophets) and settled in cities, where they eventually founded the Lord’s Tabernacle and built a temple for Him. The city of Jerusalem later gained an illustrious position in all of religion.
  • At the same time, Christianity (as well as the influence of the Greeks) does not consider human naturalness to be an unambiguous good, and it does not have such an idyllic relationship to nature as the Old Testament prophets did.
  • If a tendency toward good is not naturally endowed in people, it must be imputed from above through violence or at least the threat of violence.
  • If we were to look at human naturalness as a good, then collective social actions need a much weaker ruling hand. If people themselves have a natural tendency (propensity) toward good, this role does not have to be supplied by the state, ruler, or, if you wish, Leviathan.
  • How does this affect economics?
  • Let us return for the last time to the humanization of the wild Enkidu, which is a process we can perceive with a bit of imagination as the first seed of the principle of the market’s invisible hand, and therefore the parallels with one of the central schematics of economic thinking.
  • Sometimes it is better to “harness the devil to the plow” than to fight with him. Instead of summoning up enormous energy in the fight against evil, it is better to use its own energy to reach a goal we desire; setting up a mill on the turbulent river instead of futile efforts to remove the current. This is also how Saint Prokop approached it in one of the oldest Czech legends.
  • Enkidu caused damage, and it was impossible to fight against him. But with the help of a trap, a trick, this evil was transformed into something that greatly benefited civilization.
  • By culturing and “domesticating” Enkidu, humanity tamed the uncontrollable wild and chaotic evil.
  • Enkidu devastated the doings (the external, outside-the-walls) of the city. But he was later harnessed and fights at the side of civilization against nature, naturalness, the natural state of things.
  • A similar motif appears a thousand years later in the reversal that is well known even to noneconomists as the central idea of economics: the invisible hand of the market.
  • A similar story (reforming something animally wild and uncultivated into a civilizational achievement) is used by Thomas Aquinas in his teachings. Several centuries later, this idea is fully emancipated in the hands of Bernard Mandeville and his Fable of the Bees: or, Private Vices, Publick Benefits. The economic and political aspects of this idea are—often incorrectly—ascribed to Adam Smith.
  • Here the individual does not try anymore to maximize his goods or profits, but what is important is writing his name in human memory in the form of heroic acts or deeds.
  • There is a second kind of immortality, one connected with letters and the cult of the word: “A name and especially a written name survives the body.”77
  • After this disappointment, he comes to the edge of the sea, where the innkeeper Siduri lives. As tonic for his sorrow, she offers him the garden of bliss, a sort of hedonistic fortress of carpe diem, where a person comes to terms with his mortality and at least in the course of the end of his life maximizes earthly pleasures, or earthly utility.
  • In the second stage, after finding his friend Enkidu, Gilgamesh abandons the wall and sets out beyond the city to maximize heroism.
  • The hero refuses hedonism in the sense of maximizing terrestrial pleasure and throws himself into things that will exceed his life. In the blink of an eye, the epic turns on its head the entire utility maximization role that mainstream economics has tirelessly tried to sew on people as a part of their nature.81
  • It is simpler to observe the main features of our civilization at a time when the picture was more readable—at a time when our civilization was just being born and was still “half-naked.” In other words, we have tried to dig down to the bedrock of our written civilization;
  • We today remember Gilgamesh for his story of heroic friendship with Enkidu, not for his wall, which no longer reaches monumental heights.
  • In the eleventh and final tablet, Gilgamesh again loses what he sought. Like Sisyphus, he misses his goal just before the climax.
  • Is there something from it that is valid today? Have we found in Gilgamesh certain archetypes that are in us to this day?
  • The very existence of questions similar to today’s economic ones can be considered as the first observation. The first written considerations of the people of that time were not so different from those today. In other words: The epic is understandable for us, and we can identify with it.
  • We have also been witnesses to the very beginnings of man’s culturing—a great drama based on a liberation and then a distancing from the natural state.
  • Let us take this as a memento in the direction of our restlessness, our inherited dissatisfaction and the volatility connected to it. Considering that they have lasted five thousand years and to this day we find ourselves in harmony with a certain feeling of futility, perhaps these characteristics are inherent in man.
  • Gilgamesh had a wall built that divided the city from wild nature and created a space for the first human culture. Nevertheless, “not even far-reaching works of civilization could satisfy human desire.”
  • Friendship shows us new, unsuspected adventures, gives us the opportunity to leave the wall and to become neither its builder nor its part—to not be another brick in the wall.
  • With the phenomenon of the creation of the city, we have seen how specialization and the accumulation of wealth were born, how holy nature was transformed into a secular supplier of resources, and also how humans’ individualistic ego was emancipated.
  • To change the system, to break down that which is standing and go on an expedition against the gods (to awaken, from naïveté to awakening), requires friendship.
  • For small acts (hunting together, work in a factory), small love is enough: camaraderie. For great acts, however, great love is necessary, real love: friendship. Friendship that eludes the economic understanding of quid pro quo. Friendship gives. One friend gives (fully) for the other. That is friendship for life and death.
  • The thought that humanity comes at the expense of efficiency is just as old as humanity itself—as we have shown, subjects without emotion are the ideal of many tyrants.
  • The epic later crushes this idea through the friendship of Gilgamesh and Enkidu. Friendship—the biologically least essential love, which at first sight appears to be unnecessary.
  • The less a civilized, city person is dependent on nature, the more he or she is dependent on the rest of society. Like Enkidu, we have exchanged nature for society; harmony with (incalculable) nature for harmony with (incalculable) man.
  • Is human nature good or evil? To this day these questions are key for economic policy: If we believe that man is evil by nature, that a person himself is dog-eat-dog (an animal), then the hard hand of a ruler is called for. If we believe that people in and of themselves, in their nature, gravitate toward good, then it is possible to loosen the reins and live in a society that is more laissez-faire.
  • For a concept of historical progress, for the undeification of heroes, rulers, and nature, mankind had to wait for the Hebrews.
  • Because nature is not undeified, it is beyond consideration to explore it, let alone intervene in it (unless a person was a two-thirds god like Gilgamesh).
  • They practiced money lending, traded in many assets (…) and especially were engaged in the trading of shares on capital markets, worked in currency exchange and frequently figured as mediators in financial transactions (…), they functioned as bankers and participated in emissions of all possible forms.
  • As regards modern capitalism (as opposed to the ancient and medieval periods) … there are activities in it which are, in certain forms, inherently (and completely necessarily) present—both from an economic and legal standpoint.7
  • As early as the “dark” ages, the Jews commonly used economic tools that were in many ways ahead of their time and that later became key elements of the modern economy:
  • Gilgamesh’s story ends where it began. There is a consistency in this with Greek myths and fables: At the end of the story, no progress occurs, no essential historic change; the story is set in indefinite time, something of a temporal limbo.
  • Jews believe in historical progress, and that progress is in this world.
  • For a nation originally based on nomadism, where did this Jewish business ethos come from? And can the Hebrews truly be considered as the architects of the values that set the direction of our civilization’s economic thought?
  • Hebrew religiosity is therefore strongly connected with this world, not with any abstract world, and those who take pleasure in worldly possessions are not a priori doing anything wrong.
  • PROGRESS: A SECULARIZED RELIGION
    One of the things the writers of the Old Testament gave to mankind is the idea and notion of progress. The Old Testament stories have their development; they change the history of the Jewish nation and tie in to each other. The Jewish understanding of time is linear—it has a beginning and an end.
  • The observance of God’s Commandments in Judaism leads not to some ethereal other world, but to an abundance of material goods (Genesis 49:25–26, Leviticus 26:3–13, Deuteronomy 28:1–13) (…) There are no accusing fingers pointed at
  • There are no echoes of asceticism nor for the cleansing and spiritual effect of poverty. It is fitting therefore, that the founders of Judaism, the Patriarchs Abraham, Isaac and Jacob, were all wealthy men.12
  • This came about due to a linear understanding of history. If history has a beginning as well as an end, and they are not the same point, then exploration suddenly makes sense in areas where the fruits are borne only in the next generation.
  • What’s more, economic progress has almost become an assumption of modern functional societies. We expect growth. We take it for granted. Today, if nothing “new” happens, if GDP does not grow (we say it stagnates) for several quarters, we consider it an anomaly.
  • Over time, however, the idea of progress itself underwent major changes, and today we perceive it very differently. As opposed to the original spiritual conceptions, today we perceive progress almost exclusively in an economic or scientific-technological sense.
  • Because care for the soul has today been replaced by care for external things,
  • This is why we must constantly grow, because we (deep down and often implicitly) believe that we are headed toward an (economic) paradise on Earth.
  • Only since the period of scientific-technological revolution (and at a time when economics was born as an independent field) is material progress automatically assumed.
  • Jewish thought is the most grounded, most realistic school of thought of all those that have influenced our culture.17 An abstract world of ideas was unknown to the Jews. To this day it is still forbidden to even depict God, people, and animals in symbols, paintings, statues, and drawings.
  • Economists have become key figures of great importance in our time (Kacířské eseje o filosofii dějin [Heretical Essays in the Philosophy of History]). They are expected to perform interpretations of reality, give prophetic services (macroeconomic forecasts), reshape reality (mitigate the impacts of the crisis, speed up growth), and, in the long run, provide leadership on the way to the Promised Land—paradise on Earth.
  • REALISM AND ANTIASCETICISM
    Aside from ideas of progress, the Hebrews brought another very fundamental contribution to our culture: the desacralization of heroes, nature, and rulers.
  • Voltaire writes: “It is a certain fact that in his public laws he [Moses] never so much as once made mention of a life to come, limiting all punishments and all rewards to the present life.”21
  • As opposed to Christianity, the concept of an extraterrestrial paradise or heaven was not developed much in Hebrew thought.19 The paradise of the Israelites—Eden—was originally placed on Earth at a given place in Mesopotamia20 and at a given time,
  • The Hebrews consider the world to be real—not just a shadow reflection of a better world somewhere in the cloud of ideas, something the usual interpretation of history ascribes to Plato. The soul does not struggle against the body and is not its prisoner, as Augustine would write later.
  • The land, the world, the body, and material reality are for Jews the paramount setting for divine history, the pinnacle of creation. This idea is the conditio sine qua non of the development of economics, something of an utterly earthly making.
  • The mythology of the hero-king was strongly developed in that period, which Claire Lalouette summarizes into these basic characteristics: Beauty (a perfect face, on which it is “pleasant to look upon,” but also “beauty,” expressed in the Egyptian word nefer, not only means aesthetics, but contains moral qualities as well),
  • THE HERO AND HIS UNDEIFICATION: THE DREAM NEVER SLEEPS
    The concept of the hero is more important than it might appear. It may be the remote origin of Keynes’s animal spirits, or the desire to follow a kind of internal archetype that a given individual accepts as his own and that society values.
  • This internal animator of ours, our internal mover, this dream, never sleeps and it influences our behavior—including economic behavior—more than we want to realize.
  • manliness and strength,28 knowledge and intelligence,29 wisdom and understanding, vigilance and performance, fame and renown (fame which overcomes enemies because “a thousand men would not be able to stand firmly in his presence”);30 the hero is a good shepherd (who takes care of his subordinates), is a copper-clad rampart, the shield of the land, and the defender of heroes.
  • Each of us probably has a sort of “hero within”—a kind of internal role-model, template, an example that we (knowingly or not) follow. It is very important what kind of archetype it is, because its role is dominantly irrational and changes depending on time and the given civilization.
  • The oldest was the so-called Trickster—a fraudster; then the culture bearer—Rabbit; the musclebound hero called Redhorn; and finally the most developed form of hero: the Twins.
  • the Egyptian ruler, just as the Sumerian, was partly a god, or the son of a god.31
  • Jacob defrauds his father Isaac and steals his brother Esau’s blessing of the firstborn. Moses murders an Egyptian. King David seduces the wife of his military commander and then has him killed. In his old age, King Solomon turns to pagan idols, and so on.
  • Anthropology knows several archetypes of heroes. The Polish-born American anthropologist Paul Radin examined the myths of North American Indians and, for example, in his most influential book, The Trickster, he describes their four basic archetypes of heroes.
  • The Torah’s heroes (if that term can be used at all) frequently make mistakes and their mistakes are carefully recorded in the Bible—maybe precisely so that none of them could be deified.32
  • We do not have to go far for examples. Noah gets so drunk he becomes a disgrace; Lot lets his own daughters seduce him in a similar state of drunkenness. Abraham lies and (repeatedly) tries to sell his wife as a concubine.
  • The Hebrew heroes correspond most to the Tricksters, the Culture Bearers, and the Twins. The divine muscleman, that dominant symbol we think of when we say hero, is absent here.
  • To a certain extent it can be said that the Hebrews—and later Christianity—added another archetype, the archetype of the heroic Sufferer:35 Job.
  • Undeification, however, does not mean a call to pillage or desecration; man was put here to take care of nature (see the story of the Garden of Eden or the symbolism of the naming of the animals). This protection and care of nature is also related to the idea of progress
  • For the heroes who moved our civilization to where it is today, the heroic archetypes of the cunning trickster, culture bearer, and sufferer are rather more appropriate.
  • The Old Testament strongly emphasizes the undeification of nature.37 Nature is God’s creation, which speaks of divinity but is not the domain of moody gods.
  • This is very important for democratic capitalism, because the Jewish heroic archetype lays the groundwork much better for the development of the later phenomenon of the hero, which better suits life as we know it today. “The heroes laid down their arms and set about trading to become wealthy.”
  • In an Old Testament context, the pharaoh was a mere man (with whom one could disagree, and who could be resisted!).
  • RULERS ARE MERE MEN
    In a similar historical context, the Old Testament teachings carried out a similar desacralization of rulers, the so-called bearers of economic policy.
  • Ultimately the entire idea of a political ruler stood against the Lord’s will, which is explicitly presented in the Torah. The Lord unequivocally preferred the judge as the highest form of rule.
  • The needs of future generations will have to be considered; after all humankind are the guardians of God’s world. Waste of natural resources, whether privately owned or nationally owned is forbidden.”39
  • Politics lost its character of divine infallibility, and political issues were subject to questioning. Economic policy could become a subject of examination.
  • God first creates with the word and then on individual days He divides light from darkness, water from dry land, day from night, and so forth—and He gives order to things.45 The world is created orderly—it is wisely, reasonably put together. The way of the world is put together at least partially46 decipherably by any other wise and reasonable being who honors rational rules.
  • This is very important for the methodology of science and economics, because disorder and chaos are difficult to examine scientifically.43 Faith in some kind of rational and logical order in a system (society, the economy) is a silent assumption of any (economic) examination.
  • THE PRAISE OF ORDER AND WISDOM: MAN AS A PERFECTER OF CREATION
    The created world has an order of sorts, an order recognizable by us as people.
  • From the very beginning, when God distances Himself from the entire idea, there is an anticipation that there is nothing holy, let alone divine, in politics. Rulers make mistakes, and it is possible to subject them to tough criticism—which frequently occurs indiscriminately through the prophets in the Old Testament.
  • Hebrew culture laid the foundations for the scientific examination of the world.
  • Examining the world is therefore an absolutely legitimate activity, and one that is even requested by God—it is a kind of participation in the Creator’s work.51 Man is called on to understand himself and his surroundings and to use his knowledge for good.
  • I was there when he set heavens in place, when he marked out the horizon on the face of the deep (…) Then I was the craftsman at his side.47
  • There are more urgings to gain wisdom in the Old Testament. “Wisdom calls aloud in the street (…): ‘How long will you simple ones love your simple ways?’”49 Or several chapters later: “Wisdom is supreme; therefore get wisdom. Though it cost all you have, get understanding.”50
  • Examination is not forbidden. The fact that order can be grasped by human reason is another unspoken assumption that serves as a cornerstone of any scientific examination.
  • “Now then, my sons, listen to me; blessed are those who keep my ways (…) Blessed is the man who listens to me, watching daily at my doors, waiting at my doorway. For whoever finds me finds life and receives favor from the Lord.”
  • The rational examination of nature has its roots, surprisingly, in religion.
  • The Lord brought me forth as the first of his works, before his deeds of old. I was appointed from eternity, from the beginning, before the world began. When there were no oceans, I was given birth, when there were no springs abounding with water, before the mountains were settled in place,
  • The Book of Proverbs emphasizes specifically several times that it was wisdom that was present at the creation of the world. Wisdom personified calls out:
  • The last act, final stroke of the brush of creation, naming of the animals—this act is given to a human, it is not done by God, as one would expect. Man was given the task of completing the act of creation that the Lord began:
  • MAN AS A FINISHER OF CREATION
    The creation of the world, as it is explained in Jewish teachings, is described in the Book of Genesis. Here God (i) creates, (ii) separates, and (iii) names [my emphasis]:
  • Naming is a symbolic expression. In Jewish culture (and also in our culture to this day), the right to name meant sovereign rights and belonged, for example, to explorers (new places), inventors (new principles), or parents (children)—that is, to those who were there at the genesis, at the origin. This right was handed over by God to mankind.
  • The Naming itself (the capital N is appropriate) traditionally belongs to the crowning act of the Creator and represents a kind of grand finale of creation, the last move of the brush to complete the picture—a signature of the master.
  • Without naming, reality does not exist; it is created together with language. Wittgenstein aptly names this in his Tractatus—the limits of our language are the limits of our world.53
  • He invented (fictitiously and completely abstractly!) a framework that was generally accepted and soon “made into” reality. Marx invented similarly; he created the notion of class exploitation. Through his idea, the perception of history and reality was changed for a large part of the world for nearly an entire century.
  • Reality is not a given; it is not passive. Perceiving reality and “facts” requires man’s active participation. It is man who must take the last step, an act.
  • How does this relate to economics? Reality itself, our “objective” world, is cocreated, man himself participates in the creation; creation, which is somewhat constantly being re-created.
  • Our scientific models put the finishing touches on reality, because (1) they interpret, (2) they give phenomena a name, (3) they enable us to classify the world and phenomena according to logical forms, and (4) through these models we de facto perceive reality.
  • When man finds a new linguistic framework or analytical model, or stops using the old one, he molds or remolds reality. Models are only in our heads; they are not “in objective reality.” In this sense, Newton invented (not merely discovered!) gravity.
  • A real-ization act on our part represents the creation of a construct, the imputation of sense and order (which is beautifully expressed by the biblical act of naming, or categorization, sorting, ordering).
  • Keynes enters into the history of economic thought from the same intellectual cadence; his greatest contribution to economics was precisely the resurrection of the imperceptible—for example in the form of animal spirits or uncertainty. The economist Piero Mini even ascribes Keynes’s doubting and rebellious approach to his almost Talmudic education.63
  • God connects man with the task of guarding and protecting the Garden of Eden, and thus man actually cocreates the cultural landscape. The Czech philosopher Zdeněk Neubauer also describes this: “Such is reality, and it is so deep that it willingly crystallizes into worlds. Therefore I profess that reality is a creation and not a place of occurrence for objectively given phenomena.”61
  • In this viewpoint it is possible to see how Jewish thought is mystical—it admits the role of the incomprehensible. Therefore, through its groundedness, Jewish thought indulges mystery and defends itself against a mechanistic-causal explanation of the world: “The Jewish way of thinking, according to Veblen, emphasizes the spiritual, the miraculous, the intangible.”
  • The Jews believed the exact opposite. The world is created by a good God, and evil appears in it as a result of immoral human acts. Evil, therefore, is induced by man.66 History unwinds according to the morality of human acts.
  • What’s more, history seems to be based on morals; morals seem to be the key determining factors of history. For the Hebrews, history proceeds according to how morally its actors behave.
  • The Sumerians believed in dualism—good and evil deities exist, and the earth of people becomes their passive battlefield.
  • GOOD AND EVIL IN US: A MORAL EXPLANATION OF WELL-BEING
    We have seen that in the Epic of Gilgamesh, good and evil are not yet addressed systematically on a moral level.
  • This was not about moral-human evil, but rather a kind of natural evil. It is as if good and evil were not touched by morality at all. Evil simply occurred. Period.
  • In the epic, good and evil are not envisaged morally—they are not the result of an (a)moral act. Evil was not associated with free moral action or individual will.
  • Hebrew thought, on the other hand, deals intensively with moral good and evil. A moral dimension touches the core of its stories.65
  • discrepancy between savings and investment, and others are convinced of the monetary essence
  • The entire history of the Jewish nation is interpreted and perceived in terms of morality. Morality has become, so to speak, a mover and shaker of Hebrew history.
  • sunspots. The Hebrews came up with the idea that morals were behind good and bad years, behind the economic cycle. But we would be getting ahead of ourselves. Pharaoh’s Dream: Joseph and the First Business Cycle To
  • It is the Pharaoh’s well-known dream of seven fat and seven lean cows, which he told to Joseph, the son of Jacob. Joseph interpreted the dream as a macroeconomic prediction of sorts: Seven years of abundance were to be followed by seven years of poverty, famine, and misery.
  • Self-Contradicting Prophecy
    Here, let's make several observations on this: Through taxation74 on the level of one-fifth of a crop75 in good years to save the crop and then open granaries in bad years, the prophecy was de facto prevented (prosperous years were limited and hunger averted—through a predecessor of fiscal stabilization).
  • The Old Testament prophesies therefore were not any deterministic look into the future, but warnings and strategic variations of the possible, which demanded some kind of reaction. If the reaction was adequate, what was prophesied would frequently not occur at all.
  • This principle stands directly against the self-fulfilling prophecy,80 the well-known concept of social science. Certain prophecies become self-fulfilling when expressed (and believed) while others become self-contradicting prophecies when pronounced (and believed).
  • If the threat is anticipated, it is possible to totally or at least partially avoid it. Neither Joseph nor the pharaoh had the power to avoid bounty or crop failure (in this the dream interpretation was true and the appearance of the future mystical), but they avoided the impacts and implications of the prophecy (in this the interpretation of the dream was “false”)—famine did not ultimately occur in Egypt, and this was due to the application of reasonable and very intuitive economic policy.
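The fiscal-stabilization scheme described above (save a fifth of the crop in fat years, open the granaries in lean ones) can be sketched in a few lines. The harvest figures and the subsistence target below are hypothetical numbers chosen for illustration, not from the biblical account.

```python
def smooth_harvest(harvests, lean_threshold=60.0, tax_rate=0.2, target=60.0):
    """Tax a share of the crop in fat years; release the stores in lean ones."""
    granary = 0.0
    consumption = []
    for h in harvests:
        if h > lean_threshold:              # fat year: one-fifth into the granary
            granary += tax_rate * h
            consumption.append(h - tax_rate * h)
        else:                               # lean year: open the granaries
            draw = min(granary, max(0.0, target - h))
            granary -= draw
            consumption.append(h + draw)
    return consumption, granary

# Seven fat years (harvest 100) followed by seven lean years (harvest 40):
consumption, leftover = smooth_harvest([100.0] * 7 + [40.0] * 7)
print([round(c, 9) for c in consumption])  # seven years of 80.0, then seven of 60.0
print(round(leftover, 9))                  # 0.0 -- the stores are exactly exhausted
```

The point the text makes survives in the sketch: the prophecy of famine, once acted upon, contradicts itself, because the lean years are lived at the subsistence target rather than at 40.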
  • Let us further note that the first “macroeconomic forecast” appears in a dream.
  • back to Torah: Later in this story we will notice that there is no reason offered as to why the cycle occurs (that will come later). Fat years will simply come, and then lean years after them.
  • Moral Explanation of a Business Cycle
    That is fundamentally different from later Hebrew interpretations, when the Jewish nation tries to offer reasons why the nation fared well or poorly. And those reasons are moral.
  • If you pay attention to these laws and are careful to follow them, then the Lord your God will keep his covenant of love with you, as he swore to your forefathers. He will love you and bless you and increase your numbers.
  • Only in recent times have some currents of economics again become aware of the importance of morals and trust in the form of measuring the quality of institutions, the level of justice, business ethics, corruption, and so forth, and examining their influence on the economy,
  • From today’s perspective, we can state that the moral dimension entirely disappeared from economic thought for a long time, especially due to the implementation of Mandeville’s concept of private vices that contrarily support the public welfare
  • Without being timid, we can say this is the first documented attempt to explain the economic cycle. The economic cycle, the explanation of which is to this day a mystery to economists, is explained morally in the Old Testament.
  • But how do we reconcile these two conflicting interpretations of the economic cycle: Can ethics be responsible for it or not? Can we influence reality around us through our acts?
  • it is not within the scope of this book to answer that question; justice has been done to the question if it manages to sketch out the main contours of possible searches for answers.
  • THE ECONOMICS OF GOOD AND EVIL: DOES GOOD PAY OFF? This is probably the most difficult moral problem we could ask.
  • Kant, the most important modern thinker in the area of ethics, answers on the contrary that if we carry out a "moral" act on the basis of economic calculus (therefore we carry out a hedonistic consideration; see below) in the expectation of later recompense, its morality is lost. Recompense, according to the strict Kant, annuls ethics.
  • Inquiring about the economics of good and evil, however, is not that easy. Where would Kant’s “moral dimension of ethics” go if ethics paid? If we do good for profit, the question of ethics becomes a mere question of rationality.
  • Job’s friends try to show that he must have sinned in some way and, in doing so, deserved God’s punishment. They are absolutely unable to imagine a situation in which Job, as a righteous man, would suffer without (moral) cause. Nevertheless, Job insists that he deserves no punishment because he has committed no offense: “God has wronged me and drawn his net around me.”94
  • But Job remains righteous, even though it does not pay to do so: Though he slay me, yet will I hope in him.95 And till I die, I will not deny my integrity I will maintain my righteousness and never let go of it; my conscience will not reproach me as long as I live.96
  • He remains righteous, even if his only reward is death. What economic advantage could he have from that?
  • morals cannot be considered in the economic dimension of productivity and calculus. The role of the Hebrews was to do good, whether it paid off or not. If good (outgoing) is rewarded by incoming goodness, it is a bonus,99 not a reason to do outgoing good. Good and reward do not correlate to each other.
  • This reasoning takes on a dimension of its own in the Old Testament. Good (incoming) has already happened to us. We must do good (outgoing) out of gratitude for the good (incoming) shown to us in the past.
  • So why do good? After all, suffering is the fate of many biblical figures. The answer can only be: For good itself. Good has the power to be its own reward. In this sense, goodness gets its reward, which may or may not take on a material dimension.
  • the Hebrews offered an interesting compromise between the teachings of the Stoics and Epicureans. We will go into it in detail later, so only briefly
  • constraint. It calls for bounded optimization (with limits). A kind of symbiosis existed between the legitimate search for one's own utility (or enjoyment of life) and maintaining rules, which are not negotiable and which are not subject to optimization.
  • In other words, clear (exogenously given) rules exist that must be observed and cannot be contravened. But within these borders it is absolutely possible, and even recommended, to increase utility.
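The "bounded optimization" idea of the two bullets above (maximize utility freely, but only within exogenously given rules that are never themselves traded off) can be illustrated with a toy search. The utility function and the rule below are invented for illustration; nothing here is from the source text.

```python
def best_choice(options, utility, rules):
    """Return the utility-maximizing option among those satisfying every rule."""
    feasible = [x for x in options if all(rule(x) for rule in rules)]
    return max(feasible, key=utility)

# Hypothetical example: choose weekly working hours from 0 to 80.
utility = lambda h: 12 * h - 0.05 * h * h   # invented utility: income minus fatigue
rules = [lambda h: h <= 60]                 # exogenous rule, not up for optimization

print(best_choice(range(81), utility, rules))  # bounded optimum: 60
print(best_choice(range(81), utility, []))     # unbounded optimum: 80
```

The rule acts as a hard filter, not as a cost entering the calculus, which is exactly the contrast the text draws with the Epicurean approach.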
  • the mining of enjoyment must not come at the expense of exogenously given rules. “Judaism comes therefore to train or educate the unbounded desire … for wealth, so that market activities and patterns of consumption operate within a God-given morality.”102
  • The Epicureans acted with the goal of maximizing utility without regard for rules (rules developed endogenously, from within the system, computed from that which increased utility—this was one of the main trumps of the Epicurean school; they did not need exogenously given norms, and argued that they could “calculate” ethics (what to do) for every given situation from the situation itself).
  • The Stoics could not seek their enjoyment—or, by another name, utility. They could not in any way look back on it, and in no way could they count on it. They could only live according to rules (the greatest weakness of this school was defending where the exogenously given rules came from and whether they are universal) and take an indifferent stand toward the results of their actions.
  • To Love the Law The Jews not only had to observe the law (perhaps the word covenant would be more appropriate), but they were to love it because it was good.
  • Their relationship to the law was not supposed to be one of duty,105 but one of gratitude, love. Hebrews were to do good (outgoing), because goodness (incoming) has already been done to them.
  • This is in stark contrast with today's legal system, where, naturally, no mention of love or gratefulness exists. But God expects a full internalization of the commandments and their fulfillment with love, not so much as a duty. By no means was this on the basis of the cost-benefit analyses so widespread in economics today, which determine when it pays to break the law and when not to (calculated on the basis of the probability of being caught and the amount of punishment vis-à-vis the possible gain).
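The cost-benefit calculus of law-breaking that the text mentions (and contrasts with the Hebrew ideal of internalized law) reduces to a simple expected-value comparison. The figures below are hypothetical, chosen only to show the mechanics of the calculus being criticized.

```python
def expected_payoff(gain, p_caught, punishment):
    """The calculus the text criticizes: break the law iff this is positive."""
    return gain - p_caught * punishment

# Hypothetical: gain of 1000, 10% chance of being caught, fine of 5000.
print(expected_payoff(1000, 0.1, 5000))  # 500.0 -> the calculus says it "pays"
```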
  • And now, O Israel, what does the Lord your God ask of you but to fear the Lord your God, to walk in all his ways, to love him, to serve the Lord your God with all your heart and with all your soul, and to observe the Lord’s commands and decrees that I am giving you today for your own good? To the Lord your God belong the heavens, even the highest heavens, the earth and everything in it. Yet the Lord set his affection on your forefathers and loved them….
  • the principle of doing good (outgoing) on the basis of a priori demonstrated good (incoming) was also taken over by the New Testament. Atonement itself is based on an a priori principle; all our acts are preceded by good.
  • The Hebrews, originally a nomadic tribe, preferred to be unrestrained and grew up in constant freedom of motion.
  • Human laws, if they are in conflict with the responsibilities given by God, are subordinate to personal responsibility, and a Jew cannot simply join the majority, even if it is legally allowed. Ethics, the concept of good, is therefore always superior to all local laws, rules, and customs:
  • THE SHACKLES OF THE CITY
    Owing to the Hebrews' liberation from Egyptian slavery, freedom and responsibility become the key values of Jewish thought.
  • Laws given by God are binding for Jews, and God is the absolute source of all values,
  • The Hebrew ideal is represented by the paradise of the Garden of Eden, not a city.116 The despised city civilization or the tendency to see in it a sinful and shackling way of life appears in glimpses and allusions in many places in the Old Testament.
  • The nomadic Jewish ethos is frequently derived from Abraham, who left the Chaldean city of Ur on the basis of a command:
  • In addition, they were aware of a thin two-way line between owner and owned. We own material assets, but—to a certain extent—they own us and tie us down. Once we become used to a certain material
  • This way of life had understandably immense economic impacts. First, such a society lived in much more connected relationships, where there was no doubt that everyone mutually depended on each other. Second, their frequent wanderings meant the inability to own more than they could carry; the gathering up of material assets did not have great weight—precisely because the physical weight (mass) of things was tied to one place.
  • One of Moses’s greatest deeds was that he managed to explain to his nation once and for all that it is better to remain hungry and liberated than to be a slave with food “at no cost.”
  • SOCIAL WELFARE: NOT TO ACT IN THE MANNER OF SODOM
  • regulations is developed in the Old Testament, one we hardly find in any other nation of the time. In Hebrew teachings, aside from individual utility, indications of the concept of maximizing utility societywide appear for the first time, embodied in the Talmudic principle of Kofin al midat S'dom, which can be translated as "one is compelled not to act in the manner of Sodom" and to take care of the weaker members of society.
  • In a jubilee year, debts were to be forgiven,125 and Israelites who fell into slavery due to their indebtedness were to be set free.126
  • Such provisions can be seen as the antimonopoly and social measures of the time. The economic system even then had a clear tendency to converge toward asset concentration, and therefore power as well. It would appear that these provisions were supposed to prevent this process
  • Land at the time could be "sold," but it was not a sale; it was a rent. The price (rent) of real estate depended on how long remained until a forgiveness year. It reflected the awareness that we may work the land, but in the last instance we are merely "aliens and strangers," who have the land only rented to us for a fixed time. All land and riches came from the Lord.
  • These provisions express a conviction that freedom and inheritance should not be permanently taken away from any Israelite. Last but not least, this system reminds us that no ownership lasts forever and that the fields we plow are not ours but the Lord’s.
  • Glean
    Another social provision was the right to glean, which in Old Testament times ensured at least basic sustenance for the poorest. Anyone who owned a field had the responsibility not to harvest it to the last grain but to leave the remains in the field for the poor.
  • Tithes and Early Social Net
    Every Israelite also had the responsibility of levying a tithe from their entire crop. They had to be aware from whom all ownership comes and, by doing so, express their thanks.
  • “Since the community has an obligation to provide food, shelter, and basic economic goods for the needy, it has a moral right and duty to tax its members for this purpose. In line with this duty, it may have to regulate markets, prices and competition, to protect the interests of its weakest members.”135
  • In Judaism, charity is not perceived as a sign of goodness; it is more of a responsibility. Such a society then has the right to regulate its economy in such a way that the responsibility of charity is carried out to its satisfaction.
  • With a number of responsibilities, however, comes the difficulty of getting them into practice. Their fulfillment, then, in cases when it can be done, takes place gradually “in layers.” Charitable activities are classified in the Talmud according to several target groups with various priorities, classified according to, it could be said, rules of subsidiarity.
  • Do not mistreat an alien or oppress him, for you were aliens in Egypt.140 As one can see, aside from widows and orphans, the Old Testament also includes immigrants in its area of social protection.141 The Israelites had to have the same rules apply for them as for themselves—they could not discriminate on the basis of their origin.
  • ABSTRACT MONEY, FORBIDDEN INTEREST, AND OUR DEBT AGE
    If it appears to us that today's era is based on money and debt, and our time will be written into history as the "Debt age," then it will certainly be interesting to follow how this development occurred.
  • Money is a social abstractum. It is a social agreement, an unwritten contract.
  • The first money came in the form of clay tablets from Mesopotamia, on which debts were written. These debts were transferable, so the debts became currency. In the end, “It is no coincidence that in English the root of ‘credit’ is ‘credo,’ the Latin for ‘I believe.’”
  • To a certain extent it could be said that credit, or trust, was the first currency. It can materialize, it can be embodied in coins, but what is certain is that “money is not metal,” even the rarest metal, “it is trust inscribed,”
  • Inseparably, with the original credit (money) goes interest. For the Hebrews, the problem of interest was a social issue: “If you lend money to one of my people among you who is needy, do not be like a moneylender; charge him no interest.”
  • there were also clearly set rules setting how far one could go in setting guarantees and the nonpayment of debts. No one should become indebted to the extent that they could lose the source of their livelihood:
  • In the end, the term "bank" comes from the Italian banchi, or the benches that Jewish lenders sat on.157
  • Money is playing not only its classical roles (as a means of exchange, a holder of value, etc.) but also a much greater, stronger role: It can stimulate, drive (or slow down) the whole economy. Money plays a national economic role.
  • In the course of history, however, the role of loans changed, and the rich borrowed especially for investment purposes,
  • Today the position and significance of money and debt has gone so far and reached such a dominant position in society that operating with debts (fiscal policy) or interest or money supply (monetary policy) means that these can, to a certain extent, direct (or at least strongly influence) the whole economy and society.
  • In such a case a ban on interest did not have great ethical significance. Thomas Aquinas, a medieval scholar (1225-1274), reasoned similarly; in his time, the strict ban on lending at usurious interest was loosened, possibly due to him.
  • As a form of energy, money can travel in three dimensions, vertically (those who have capital lend to those who do not) and horizontally (speed and freedom in horizontal or geographic motion has become the by-product—or driving force?—of globalization). But money (as opposed to people) can also travel through time.
  • money is something like energy that can travel through time. And it is a very useful energy, but at the same time very dangerous as well. Wherever
  • Aristotle condemned interest162 not only from a moral standpoint, but also for metaphysical reasons. Thomas Aquinas shared the same fear of interest and he too argued that time does not belong to us, and that is why we must not require interest.
  • MONEY AS ENERGY: TIME TRAVEL AND GROSS DEBT PRODUCT (GDP)
  • Due to this characteristic, we can energy-strip the future to the benefit of the present. Debt can transfer energy from the future to the present.163 On the other hand, saving can accumulate energy from the past and send it to the present.
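The image above of money as energy traveling through time corresponds to ordinary compounding and discounting: saving pushes purchasing power into the future, while debt pulls it into the present. The 5 percent annual rate below is, of course, a hypothetical figure.

```python
def future_value(amount, rate, years):
    """Saving: push purchasing power from the present into the future."""
    return amount * (1 + rate) ** years

def present_value(amount, rate, years):
    """Debt: pull purchasing power from the future into the present."""
    return amount / (1 + rate) ** years

# Hypothetical 5% annual rate over two years:
print(round(future_value(100, 0.05, 2), 2))    # 110.25
print(round(present_value(110.25, 0.05, 2), 2))  # 100.0
```

The two functions are exact inverses, which is the sense in which the "energy" merely travels through time rather than being created or destroyed.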
  • labor was not considered degrading in the Old Testament. On the contrary, the subjugation of nature is even a mission from God that originally belonged to man’s very first blessings.
  • LABOR AND REST: THE SABBATH ECONOMY
  • The Jews as well as Aristotle behaved very guardedly toward loans. The issue of interest/usury became one of the first economic debates. Without having an inkling of the future role of economic policy (fiscal and monetary), the ancient Hebrews may have unwittingly felt that they were discovering in interest a very powerful weapon, one that can be a good servant, but (literally) an enslaving master as well.
  • It’s something like a dam. When we build one, we are preventing periods of drought and flooding in the valley; we are limiting nature’s whims and, to a large extent, avoiding its incalculable cycles. Using dams, we can regulate the flow of water to nearly a constant. With it we tame the river (and we can also gain
  • But if we do not regulate the water wisely, it may happen that we would overfill the dam and it would break. For the cities lying in the valley, their end would be worse than if a dam were never there.
  • If man lived in harmony with nature before, now, after the fall, he must fight; nature stands against him and he against it and the animals. From the Garden we have moved unto a (battle)field.
  • Only after man’s fall does labor turn into a curse.168 It could even be said that this is actually the only curse, the curse of the unpleasantness of labor, that the Lord places on Adam.
  • Both Plato and Aristotle consider labor to be necessary for survival, but that only the lower classes should devote themselves to it so that the elites would not have to be bothered with it and so that they could devote themselves to “purely spiritual matters—art, philosophy, and politics.”
  • Work is not only a source of pleasure but also of social standing; it is considered an honor. "Do you see a man skilled in his work? He will serve before kings."170 None of the surrounding cultures appreciated work as much. The idea of the dignity of labor is unique in the Hebrew tradition.
  • Hebrew thinking is characterized by a strict separation of the sacred from the profane. In life, there are simply areas that are holy, and in which it is not allowed to economize, rationalize, or maximize efficiency.
  • good example is the commandment on the Sabbath. No one at all could work on this day, not even the ones who were subordinate to an observant Jew:
  • the message of the commandment on Saturday communicated that people were not primarily created for labor.
  • Paradoxically, it is precisely this commandment out of all ten that is probably the most violated today.
  • Aristotle even considers labor to be “a corrupted waste of time which only burdens people’s path to true honour.”
  • we have days when we must not toil connected (at least lexically) with the word meaning emptiness: the English term “vacation” (or emptying), as with the French term, les vacances, or German die Freizeit, meaning open time, free time, but also…
  • Translated into economic language: The meaning of utility is not to increase it permanently but to rest among existing gains. Why do we learn how to constantly increase gains but not how to…
  • This dimension has disappeared from today’s economics. Economic effort has no goal at which it would be possible to rest. Today we only know growth for growth’s sake, and if our company or country prospers, that does not…
  • For six-sevenths of time, be dissatisfied and reshape the world into your own image, man; but for one-seventh you will rest and not change the creation. On the seventh day, enjoy creation and enjoy the work of your hands.
  • the purpose of creation was not just creating but that it had an end, a goal. The process was just a process, not a purpose. The whole of Being was created so…
  • Saturday was not established to increase efficiency. It was a real ontological break that followed the example of the Lord's seventh day of creation. The Lord did not rest due to tiredness or to regenerate strength; He rested because He was done. He was done with His work, so that He could enjoy it and delight in His creation.
  • If we believe in rest at all today, it is for different reasons. It is the rest of the exhausted machine, the rest of the weak, and the rest of those who can’t handle the tempo. It’s no wonder that the word “rest…
  • Related to this, we have studied the first mention of a business cycle with the pharaoh’s dream as well as seen a first attempt (that we may call…
  • We have tried to show that the quest for a heaven on Earth (similar to the Jewish one) has, in its desacralized form, actually also been the same quest for many of the…
  • We have also seen that the Hebrews tried to explain the business cycle with morality and ethics. For the Hebrews,…
  • ancient Greek economic ethos, we will examine two extreme approaches to laws and rules. While the Stoics considered laws to be absolutely valid, and utility had infinitesimal meaning in their philosophy, the Epicureans, at least in the usual historical explanation, placed utility and pleasure in first place—rules were to be made based on the principle of utility.
  • CONCLUSION: BETWEEN UTILITY AND PRINCIPLE
    The influence of Jewish thought on the development of market democracy cannot be overestimated. The key heritage for us was the lack of ascetic perception of the world, respect for law and private…
  • We have tried to show how the Torah desacralized three important areas in our lives: the earthly ruler, nature,…
  • What is the relationship between the good and evil that we do (outgoing) and the utility of disutility that we (expect to) get as a reward (incoming)? We have seen…
  • The Hebrews never despised material wealth; on the contrary, the Jewish faith puts great responsibility on property management. Also, the idea of progress and the linear perception of time gives our (economic)…
  • the Hebrews managed to find something of a happy compromise between both of these principles.
  • will not be able to completely understand the development of the modern notion of economics without understanding the disputes between the Epicureans and the Stoics;
  • poets actually went even further, and with their speech they shaped and established reality and truth. Honor, adventure, great deeds, and the acclaim connected with them played an important role in the establishment of the true, the real.
  • those who are famous will be remembered by people. They become more real, part of the story, and they start to be “realized,” “made real” in the lives of other people. That which is stored in memory is real; that which is forgotten is as if it never existed.
  • Today’s scientific truth is founded on the notion of exact and objective facts, but poetic truth stands on an interior (emotional) consonance with the story or poem. “It is not addressed first to the brain … [myth] talks directly to the feeling system.”
  • “epic and tragic poets were widely assumed to be the central ethical thinkers and teachers of Greece; nobody thought of their work as less serious, less aimed at truth, than the speculative prose treatises of historians and philosophers.”5 Truth and reality were hidden in speech, stories, and narration.
  • Ancient philosophy, just as science would later, tries to find constancy, constants, quantities, inalterabilities. Science seeks (creates?) order and neglects everything else as much as it can. In their own experiences, everyone knows that life is not like that,
  • Just as scientists do today, artists drew images of the world that were representative, and therefore symbolic, picturelike, and simplifying (but thus also misleading), just like scientific models, which often do not strive to be “realistic.”
  • general? In the end, poetry could be more sensitive to the truth than the philosophical method or, later, the scientific method. “Tragic poems, in virtue of their subject matter and their social function, are likely to confront and explore problems about human beings and luck that a philosophical text might be able to omit or avoid.”8

These Truths: A History of the United States (Jill Lepore) - 1 views

  • It was meant to mark the start of a new era, in which the course of history might be made predictable and a government established that would be ruled not by accident and force but by reason and choice. The origins of that idea, and its fate, are the story of American history.
  • It seems to have been reserved to the people of this country, by their conduct and example, to decide the important question, whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force.6 This was the question of that autumn. And, in a way, it has been the question of every season since,
  • I once came across a book called The Constitution Made Easy.7 The Constitution cannot be made easy. It was never meant to be easy.
  • THE AMERICAN EXPERIMENT rests on three political ideas—“these truths,” Thomas Jefferson called them—political equality, natural rights, and the sovereignty of the people.
  • After Benjamin Franklin read Jefferson’s draft, he picked up his quill, scratched out the words “sacred & undeniable,” and suggested that “these truths” were, instead, “self-evident.” This was more than a quibble. Truths that are sacred and undeniable are God-given and divine, the stuff of religion. Truths that are self-evident are laws of nature, empirical and observable, the stuff of science. This divide has nearly rent the Republic apart.
  • The real dispute is between “these truths” and the course of events: Does American history prove these truths, or does it belie them?
  • The United States rests on a dedication to equality, which is chiefly a moral idea, rooted in Christianity, but it rests, too, on a dedication to inquiry, fearless and unflinching. Its founders agreed with the Scottish philosopher and historian David Hume, who wrote, in 1748, that “Records of Wars, Intrigues, Factions, and Revolutions are so many Collections of Experiments.”9 They believed that truth is to be found in ideas about morality but also in the study of history.
  • understanding history as a form of inquiry—not as something easy or comforting but as something demanding and exhausting—was central to the nation’s founding. This, too, was new.
  • A new kind of historical writing, less memorial and more unsettling, only first emerged in the fourteenth century. “History is a philosophical science,” the North African Muslim scholar Ibn Khaldun wrote in 1377, in the prologue to his history of the world, in which he defined history as the study “of the causes and origins of existing things.”11
  • Only by fits and starts did history become not merely a form of memory but also a form of investigation, to be disputed, like philosophy, its premises questioned, its evidence examined, its arguments countered.
  • Declaring independence was itself an argument about the relationship between the present and the past, an argument that required evidence of a very particular kind: historical evidence. That’s why most of the Declaration of Independence is a list of historical claims. “To prove this,” Jefferson wrote, “let facts be submitted to a candid world.”
  • In an attempt to solve this problem, the earliest historians of the United States decided to begin their accounts with Columbus’s voyage, stitching 1776 to 1492. George Bancroft published his History of the United States from the Discovery of the American Continent to the Present in 1834, when the nation was barely more than a half-century old, a fledgling, just hatched. By beginning with Columbus, Bancroft made the United States nearly three centuries older than it was, a many-feathered old bird.
  • In 1787, then, when Alexander Hamilton asked “whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force,” that was the kind of question a scientist asks before beginning an experiment. Time alone would tell. But time has passed. The beginning has come to an end. What, then, is the verdict of history?
  • In deciding what to leave in and what to leave out, I’ve confined myself to what, in my view, a people constituted as a nation in the early twenty-first century need to know about their own past, mainly because this book is meant to double as an old-fashioned civics book, an explanation of the origins and ends of democratic institutions, from the town meeting to the party system, from the nominating convention to the secret ballot, from talk radio to Internet polls. This book is chiefly a political
  • Aside from being a brief history of the United States and a civics primer, this book aims to be something else, too: it’s an explanation of the nature of the past. History isn’t only a subject; it’s also a method.
  • The truths on which the nation was founded are not mysteries, articles of faith, never to be questioned, as if the founding were an act of God, but neither are they lies, all facts fictions, as if nothing can be known, in a world without truth.
  • Between reverence and worship, on the one side, and irreverence and contempt, on the other, lies an uneasy path, away from false pieties and petty triumphs over people who lived and died and committed both their acts of courage and their sins and errors long before we committed ours. “We cannot hallow this ground,” Lincoln said at Gettysburg. We are obliged, instead, to walk this ground, dedicating ourselves to both the living and the dead.
  • studying history is like that, looking into one face and seeing, behind it, another, face after face after face. “Know whence you came,” Baldwin told his nephew.17 The past is an inheritance, a gift and a burden. It can’t be shirked. You carry it everywhere. There’s nothing for it but to get to know it.
  • Nature takes one toll, malice another. History is the study of what remains, what’s left behind, which can be almost anything, so long as it survives the ravages of time and war: letters, diaries, DNA, gravestones, coins, television broadcasts, paintings, DVDs, viruses, abandoned Facebook pages, the transcripts of congressional hearings, the ruins of buildings. Some of these things are saved by chance or accident, like the one house that, as if by miracle, still stands after a hurricane razes a town. But most of what historians study survives because it was purposely kept—placed
  • As nation-states emerged, they needed to explain themselves, which they did by telling stories about their origins, tying together ribbons of myths, as if everyone in the “English nation,” for instance, had the same ancestors, when, of course, they did not. Very often, histories of nation-states are little more than myths that hide the seams that stitch the nation to the state.15
  • When the United States declared its independence in 1776, plainly, it was a state, but what made it a nation? The fiction that its people shared a common ancestry was absurd on its face; they came from all over, and, having waged a war against England, the very last thing they wanted to celebrate was their Englishness.
  • Facts, knowledge, experience, proof. These words come from the law. Around the seventeenth century, they moved into what was then called “natural history”: astronomy, physics, chemistry, geology. By the eighteenth century they were applied to history and to politics, too. These truths: this was the language of reason, of enlightenment, of inquiry, and of history.
  • Against conquest, slaughter, and slavery came the urgent and abiding question, “By what right?”
  • Yet the origins of the United States date to 1492 for another, more troubling reason: the nation’s founding truths were forged in a crucible of violence, the products of staggering cruelty, conquest and slaughter, the assassination of worlds.
  • Locke, spurred both by a growing commitment to religious toleration and by a desire to distinguish English settlement from Spanish conquest, stressed the lack of cultivation as a better justification for taking the natives’ land than religious difference, an emphasis with lasting consequences.
  • Unlike Polo and Mandeville, Columbus did not make a catalogue of the ways and beliefs of the people he met (only later did he hire Pané to do that). Instead, he decided that the people he met had no ways and beliefs. Every difference he saw as an absence.22 Insisting that they had no faith and no civil government and were therefore infidels and savages who could not rightfully own anything, he claimed possession of their land, by the act of writing. They were a people without truth; he would make his truth theirs. He would tell them where the dead go.
  • It became commonplace, inevitable, even, first among the Spanish, and then, in turn, among the French, the Dutch, and the English, to see their own prosperity and good health and the terrible sicknesses suffered by the natives as signs from God. “Touching these savages, there is a thing that I cannot omit to remark to you,” one French settler wrote: “it appears visibly that God wishes that they yield their place to new peoples.” Death convinced them at once of their right and of the truth of their faith. “The natives, they are all dead of small Poxe,” John Winthrop wrote when he arrived in New England in 1630: “the Lord hathe cleared our title to what we possess.”
  • In much of New Spain, the mixed-race children of Spanish men and Indian women, known as mestizos, outnumbered Indians; an intricate caste system marked gradations of skin color, mixtures of Europeans, Native Americans, and Africans, as if skin color were like dyes made of plants, the yellow of sassafras, the red of beets, the black of carob. Later, the English would recognize only black and white, a fantasy of stark and impossible difference, of nights without twilight and days without dawns. And yet both regimes of race, a culture of mixing or a culture of pretending not to mix, pressed upon the brows of every person of the least curiosity the question of common humanity: Are all peoples one?
  • Elizabeth’s best defender argued that if God decided “the female should rule and govern,” it didn’t matter that women were “weake in nature, feable in bodie, softe in courage,” because God would make every right ruler strong. In any case, England’s constitution abided by a “rule mixte,” in which the authority of the monarch was checked by the power of Parliament; also, “it is not she that ruleth but the lawes.” Elizabeth herself called on yet another authority: the favor of the people.48 A mixed constitution, the rule of law, the will of the people: these were English ideas that Americans would one day make their own, crying, “Liberty!”
  • In the brutal, bloody century between Columbus’s voyage and John White’s, an idea was born, out of fantasy, out of violence, the idea that there exists in the world a people who live in an actual Garden of Eden, a state of nature, before the giving of laws, before the forming of government. This imagined history of America became an English book of genesis, their new truth. “In the beginning,” the Englishman John Locke would write, “all the world was America.” In America, everything became a beginning.
  • England’s empire would have a different character than that of either Spain or France. Catholics could make converts by the act of baptism, but Protestants were supposed to teach converts to read the Bible; that meant permanent settlements, families, communities, schools, and churches. Also, England’s empire would be maritime—its navy was its greatest strength. It would be commercial. And, of greatest significance for the course of the nation that would grow out of those settlements, its colonists would be free men, not vassals, guaranteed their “English liberties.”
  • Beginning with the Virginia charter, the idea of English liberties for English subjects was planted on American soil and, with it, the king’s claim to dominion, a claim that rested on the idea that people like Powhatan and his people lived in darkness and without government, no matter that the English called their leaders kings.
  • Twenty Englishmen were elected to the House of Burgesses. Twenty Africans were condemned to the house of bondage. Another chapter opened in the American book of genesis: liberty and slavery became the American Abel and Cain.
  • To build his case against the king, Coke dusted off a copy of an ancient and almost entirely forgotten legal document, known as Magna Carta (literally, the “great charter”), in which, in the year 1215, King John had pledged to his barons that he would obey the “law of the land.” Magna Carta wasn’t nearly as important as Coke made it out to be, but by arguing for its importance, he made it important, not only for English history, but for American history, too, tying the political fate of everyone in England’s colonies to the strange doings of a very bad king from the Middle Ages.
  • Magna Carta explains a great deal about how it is that some English colonists would one day come to believe that their king had no right to rule them and why their descendants would come to believe that the United States needed a written constitution. But Magna Carta played one further pivotal role, the role it played in the history of truth—a history that had taken a different course in England than in any other part of Europe.
  • The most crucial right established under Magna Carta was the right to a trial by jury.
  • in 1215, the pope banned trial by ordeal. In Europe, it was replaced by a new system of divine judgment: judicial torture. But in England, where there existed a tradition of convening juries to judge civil disputes—like disagreements over boundaries between neighboring freeholds—trial by ordeal was replaced not by judicial torture but by trial by jury.
  • This turn marked the beginning of a new era in the history of knowledge: it required a new doctrine of evidence and new method of inquiry and eventually led to the idea that an observed or witnessed act or thing—the substance, the matter, of fact—is the basis of truth. A judge decided the law; a jury decided the facts. Mysteries were matters of faith, a different kind of truth, known only to God.
  • The age of mystery began to wane, and, soon, the culture of fact spread from law to government.
  • There would never be very many Africans in New England, but New Englanders would have slave plantations, on the distant shores. Nearly half of colonial New Englanders’ wealth would come from sugar grown by West Indian slaves.
  • One million Europeans migrated to British America between 1600 and 1800 and two and a half million Africans were carried there by force over that same stretch of centuries, on ships that sailed past one another by day and by night.42 Africans died faster, but as a population of migrants, they outnumbered Europeans two and a half to one.
  • In the last twenty-five years of the seventeenth century, English ships, piloted by English sea captains, crewed by English sailors, carried more than a quarter of a million men, women, and children across the ocean, shackled in ships’ holds.44 Theirs was not a ship of state crossing a sea of troubles, another Mayflower, their bond a covenant. Theirs was a ship of slavery, their bonds forged in fire. They whispered and wept; they screamed and sat in silence. They grew ill; they grieved; they died; they endured.
  • By what right did the English hold these people as their slaves?
  • Under Roman law, all men are born free and can only be made slaves by the law of nations, under certain narrow conditions—for instance, when they’re taken as prisoners of war, or when they sell themselves as payment of debt. Aristotle had disagreed with Roman law, insisting that some men are born slaves. Neither of these traditions from antiquity proved to be of much use to English colonists attempting to codify their right to own slaves, because laws governing slavery, like slavery itself, had disappeared from English common law by the fourteenth century. Said one Englishman in Barbados in 1661, there was “no track to guide us where to walk nor any rule sett us how to govern such Slaves.”46
  • With no track or rule to guide them, colonial assemblies adopted new practices and devised new laws with which they attempted to establish a divide between “blacks” and “whites.”
  • Adopting these practices and passing these laws required turning English law upside down, because much in existing English law undermined the claims of owners of people. In 1655, a Virginia woman with an African mother and an English father sued for her freedom by citing English common law, under which children’s status follows that of their father, not their mother. In 1662, Virginia’s House of Burgesses answered doubts about “whether children got by any Englishman upon a Negro woman should be slave or ffree” by reaching back to an archaic Roman rule, partus sequitur ventrem (you are what your mother was). Thereafter, any child born of a woman who was a slave inherited her condition.
  • By giving Americans a more ancient past, he hoped to make America’s founding appear inevitable and its growth inexorable, God-ordained. He also wanted to celebrate the United States, not as an offshoot of England, but instead as a pluralist and cosmopolitan nation, with ancestors all over the world.
  • No book should be censored before publication, Milton argued (though it might be condemned after printing), because truth could only be established if allowed to do battle with lies. “Let her and falsehood grapple,” he urged, since, “whoever knew Truth to be put to the worst in a free and open encounter?” This view depended on an understanding of the capacity of the people to reason. The people, Milton insisted, are not “slow and dull, but of a quick, ingenious and piercing spirit, acute to invent, subtle and sinewy to discourse, not beneath the reach of any point the highest that human capacity can soar to.”52
  • All men, Locke argued, are born equal, with a natural right to life, liberty, and property; to protect those rights, they erect governments by consent. Slavery, for Locke, was no part either of a state of nature or of civil society. Slavery was a matter of the law of nations, “nothing else, but the state of war continued, between a lawful conqueror and a captive.” To introduce slavery in the Carolinas, then, was to establish, as fundamental to the political order, an institution at variance with everything about how Locke understood civil society.
  • Long before shots were fired at Lexington and Concord, long before George Washington crossed the Delaware, long before American independence was thought of, or even thinkable, a revolutionary tradition was forged, not by the English in America, but by Indians waging wars and slaves waging rebellions. They revolted again and again and again. Their revolutions came in waves that lashed the land. They asked the same question, unrelentingly: By what right are we ruled?
  • Rebellion hardened lines between whites and blacks. Before Bacon and his men burned Jamestown, poor Englishmen had very little political power. As many as three out of every four Englishmen and women who sailed to the colonies were either debtors or convicts or indentured servants; they weren’t slaves, but neither were they free.61 Property requirements for voting meant that not all free white men could vote. Meanwhile, the fact that slaves could be manumitted by their masters meant that it was possible to be both black and free and white and unfree. But after Bacon’s Rebellion, free white men were granted the right to vote, and it became nearly impossible for black men and women to secure their freedom. By 1680, one observer could remark that “these two words, Negro and Slave” had “grown Homogeneous and convertible”: to be black was to be a slave.
  • Benjamin Franklin eventually settled in the tidy Quaker town of Philadelphia and began printing his own newspaper, the Pennsylvania Gazette, in 1729. In its pages, he fought for freedom of the press. In a Miltonian 1731 “Apology for Printers,” he observed “that the Opinions of Men are almost as various as their Faces” but that “Printers are educated in the Belief, that when Men differ in Opinion, both Sides ought equally to have the Advantage of being heard by the Publick; and that when Truth and Error have fair Play, the former is always an overmatch for the latter.”
  • But if the culture of the fact hadn’t yet spread to newspapers, it had spread to history. In Leviathan, Thomas Hobbes had written that “The register of Knowledge of Fact is called History.”74 One lesson Americans would learn from the facts of their own history had to do with the limits of the freedom of the press, and this was a fact on which they dwelled, and a liberty they grew determined to protect.
  • Slavery does not exist outside of politics. Slavery is a form of politics, and slave rebellion a form of violent political dissent. The Zenger trial and the New York slave conspiracy were much more than a dispute over freedom of the press and a foiled slave rebellion: they were part of a debate about the nature of political opposition, and together they established its limits. Both Cosby’s opponents and Caesar’s followers allegedly plotted to depose the governor. One kind of rebellion was celebrated, the other suppressed—a division that would endure.
  • In American history, the relationship between liberty and slavery is at once deep and dark: the threat of black rebellion gave a license to white political opposition.
  • This, too, represented a kind of revolution: Whitefield emphasized the divinity of ordinary people, at the expense of the authority of their ministers.
  • he wrote in 1751 an essay about the size of the population, called “Observations concerning the Increase of Mankind, Peopling of Countries, &c.”
  • Franklin guessed the population of the mainland colonies to be about “One Million English Souls,” and his calculations suggested that this number would double every twenty-five years. At that rate, in only a century, “the greatest Number of Englishmen will be on this Side the Water.” Franklin’s numbers were off; his estimates weren’t too high; they were too low. At the time, more than 1.5 million people lived in Britain’s thirteen mainland colonies. Those colonies were far more densely settled than New France or New Spain. Only 60,000 French settlers lived in Canada and 10,000 more in Louisiana. New Spain was even more thinly settled.
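Franklin's projection is simple compound doubling: a population that doubles every twenty-five years grows sixteenfold in a century. A minimal sketch of that arithmetic (the function name is illustrative; the starting figure and horizon are Franklin's own):

```python
def projected_population(start, years, doubling_period=25):
    """Population after `years`, doubling once every `doubling_period` years."""
    return start * 2 ** (years // doubling_period)


if __name__ == "__main__":
    # Franklin's guess: "One Million English Souls," doubling every 25 years.
    # After a century that yields 16 million -- hence his prediction that
    # "the greatest Number of Englishmen will be on this Side the Water."
    print(projected_population(1_000_000, 100))  # 16000000
```

As the excerpt notes, the projection erred only in its starting point: the actual colonial population in 1751 was already above 1.5 million, so the doubling ran from a higher base than Franklin supposed.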
  • he wrote about a new race, a people who were “white.” “The Number of purely white People in the World is proportionably very small,” Franklin began. As he saw it, Africans were “black”; Asians and Native Americans were “tawny”; Spaniards, Italians, French, Russians, Swedes, and Germans were “swarthy.” That left very few people, and chiefly the English, as the only “white people” in the world. “I could wish their Numbers were increased,” Franklin said, adding, wonderingly, “But perhaps I am partial to the Complexion of my Country, for such Kind of Partiality is natural to Mankind.”
  • Franklin’s “JOIN, or DIE” did some of that, too: it offered a lesson about the rulers and the ruled, and the nature of political communities. It made a claim about the colonies: they were parts of a whole.
  • When Benjamin Franklin began writing his autobiography, in 1771, he turned the story of his own escape—running away from his apprenticeship to his brother James—into a metaphor for the colonies’ growing resentment of parliamentary rule. James’s “harsh and tyrannical Treatment,” Franklin wrote, had served as “a means of impressing me with that Aversion to arbitrary Power that has stuck to me thro’ my whole Life.”7 But that was also the story of every runaway slave ad, testament after testament to an aversion to arbitrary power.
  • The American Revolution did not begin in 1775 and it didn’t end when the war was over. “The success of Mr. Lay, in sowing the seeds of . . . a revolution in morals, commerce, and government, in the new and in the old world, should teach the benefactors of mankind not to despair, if they do not see the fruits of their benevolent propositions, or undertakings, during their lives,” Philadelphia doctor Benjamin Rush later wrote.
  • There were not one but two American revolutions at the end of the eighteenth century: the struggle for independence from Britain, and the struggle to end slavery. Only one was won.
  • The Revolution was at its most radical in the challenge it presented to the institution of slavery and at its most conservative in its failure to meet that challenge. Still, the institution had begun to break, like a pane of glass streaked with cracks but not yet shattered.
  • “I wish our Poor Distracted State would atend to the many good Lessons” of history, Jane Franklin wrote to her brother, and not “keep always in a Flame.”21
  • After Annapolis, Madison went home to Virginia and resumed his course of study. In April of 1787, he drafted an essay called “Vices of the Political System of the United States.” It took the form of a list of eleven deficiencies,
  • it closed with a list of causes for these vices, which he located primarily “in the people themselves.” By this last he meant the danger that a majority posed to a minority: “In republican Government the majority however composed, ultimately give the law. Whenever therefore an apparent interest or common passion unites a majority what is to restrain them from unjust violations of the rights and interests of the minority, or of individuals?”27 What force restrains good men from doing bad things? Honesty, character, religion—these, history demonstrated, were not to be relied upon. No, the only force that could restrain the tyranny of the people was the force of a well-constructed constitution. It would have to be as finely wrought as an iron gate.
  • At the convention, it proved impossible to set the matter of slavery aside, both because the question of representation turned on it and because any understanding of the nature of tyranny rested on it. When Madison argued about the inevitability of a majority oppressing a minority, he cited ancient history, and told of how the rich oppressed the poor in Greece and Rome. But he cited, too, modern American history. “We have seen the mere distinction of color made in the most enlightened period of time, the ground of the most oppressive dominion ever exercised by man over man.”40
  • If not for the three-fifths rule, the representatives of free states would have outnumbered representatives of slave states by 57 to 33.44
  • Wilson, half Franklin’s age, read his remarks instead. “Mr. President,” he began, addressing Washington, “I confess that there are several parts of this constitution which I do not at present approve, but I am not sure I shall never approve them.” He suggested that he might, one day, change his mind. “For having lived long, I have experienced many instances of being obliged by better information, or fuller consideration, to change opinions even on important subjects, which I once thought right, but found to be otherwise. It is therefore that the older I grow, the more apt I am to doubt my own judgment, and to pay more respect to the judgment of others.” Hoping to pry open the minds of delegates who were closed to the compromise before them, he reminded them of the cost of zealotry. “Most men indeed as well as most sects in Religion, think themselves in possession of all truth, and that wherever others differ from them it is so far error.” But wasn’t humility the best course, in such circumstances? “Thus I consent, Sir, to this Constitution,” he closed, “because I expect no better, and because I am not sure, that it is not the best.”
  • Except for the Massachusetts Constitution, in 1780, and the second New Hampshire Constitution, in 1784, no constitution, no written system of government, had ever before been submitted to the people for their approval. “This is a new event in the history of mankind,” said the governor of Connecticut at his state’s ratification convention.
  • Nearly everything Washington did set a precedent. What would have happened if he had decided, before taking that oath of office, to emancipate his slaves? He’d grown disillusioned with slavery; his own slaves, and the greater number of slaves owned by his wife, were, to him, a moral burden, and he understood very well that for all the wealth generated by forced, unpaid labor, the institution of slavery was a moral burden to the nation. There is some evidence—slight though it is—that Washington drafted a statement announcing that he intended to emancipate his slaves before assuming the presidency. (Or maybe that statement, like Washington’s inaugural address, had been written by Hamilton, a member of New York’s Manumission Society.) This, too, Washington understood, would have established a precedent: every president after him would have had to emancipate his slaves. And yet he would not, could not, do it.65 Few of Washington’s decisions would have such lasting and terrible consequences as this one failure to act.
  • In the century and a half between the Connecticut charter and the 1787 meeting of the constitutional convention lies an entire revolution—not just a political revolution but also a religious revolution. So far from establishing a religion, the Constitution doesn’t even mention “God,” except in naming the date (“the year of our Lord . . .”). At a time when all but two states required religious tests for office, the Constitution prohibited them. At a time when all but three states still had an official religion, the Bill of Rights forbade the federal government from establishing one. Most Americans believed, with Madison, that religion can only thrive if it is no part of government, and that a free government can only thrive if it is no part of religion.
  • The replacement of debtors’ prison with bankruptcy protection would change the nature of the American economy, spurring investment, speculation, and the taking of risks.
  • as early as 1791, Madison had begun to revise his thinking. In an essay called “Public Opinion,” he considered a source of instability particular to a large republic: the people might be deceived. “The larger a country, the less easy for its real opinion to be ascertained,” he explained. That is, factions might not, in the end, consist of wise, knowledgeable, and reasonable men. They might consist of passionate, ignorant, and irrational men, who had been led to hold “counterfeit” opinions by persuasive men. (Madison was thinking of Hamilton and his ability to gain public support for his financial plan.)
  • The way out of this political maze was the newspaper. “A circulation of newspapers through the entire body of the people,” he explained, “is equivalent to a contraction of territorial limits.” Newspapers would make the country, effectively, smaller.90 It was an ingenious idea. It would be revisited by each passing generation of exasperated advocates of republicanism. The newspaper would hold the Republic together; the telegraph would hold the Republic together; the radio would hold the Republic together; the Internet would hold the Republic together. Each time, this assertion would be both right and terribly wrong.
  • Newspapers in the early republic weren’t incidentally or inadvertently partisan; they were entirely and enthusiastically partisan. They weren’t especially interested in establishing facts; they were interested in staging a battle of opinions. “Professions of impartiality I shall make none,” wrote a Federalist printer. “They are always useless, and are besides perfect nonsense.”92
  • Washington’s Farewell Address consists of a series of warnings about the danger of disunion. The North and the South, the East and the West, ought not to consider their interests separate or competing, Washington urged: “your union ought to be considered as a main prop of your liberty.” Parties, he warned, were the “worst enemy” of every government, agitating “the community with ill-founded jealousies and false alarms,” kindling “the animosity of one part against another,” and even fomenting “riot and insurrection.”
  • As to the size of the Republic, “Is there a doubt whether a common government can embrace so large a sphere? Let experience solve it.” The American experiment must go on. But it could only thrive if the citizens were supported by religion and morality, and if they were well educated. “Promote, then, as an object of primary importance, institutions for the general diffusion of knowledge,” he urged. “In proportion as the structure of a government gives force to public opinion, it is essential that public opinion should be enlightened.”95
  • “Passion” or variants of the word appear seven times in the Farewell; it is the source of every problem; reason is its only remedy. Passion is a river. There would be no changing its course.
  • Adams and Jefferson lived in an age of quantification. It began with the measurement of time. Time used to be a wheel that turned, and turned again; during the scientific revolution, time became a line. Time, the easiest quantity to measure, became the engine of every empirical inquiry: an axis, an arrow. This new use and understanding of time contributed to the idea of progress—if time is a line instead of a circle, things can get better and even better, instead of forever rising and falling in endless cycles, like the seasons. The idea of progress animated American independence and animated, too, the advance of capitalism.
  • The quantification of time led to the quantification of everything else: the counting of people, the measurement of their labor, and the calculation of profit as a function of time. Keeping time and accumulating wealth earned a certain equivalency. “Time is money,” Benjamin Franklin used to say.
  • The two-party system turned out to be essential to the strength of the Republic. A stable party system organizes dissent. It turns discontent into a public good. And it insures the peaceful transfer of power, in which the losing party willingly, and without hesitation, surrenders its power to the winning party.
  • Behind Madison’s remarks about “lessening the proportion of slaves to the free people,” behind Jefferson’s tortured calculations about how many generations would have to pass before his own children could pass for “white,” lay this hard truth: none of these men could imagine living with descendants of Africans as political equals.
  • If the battle between John Adams and Thomas Jefferson had determined whether aristocracy or republicanism would prevail (and, with Jefferson, republicanism won), the battle between Andrew Jackson and John Quincy Adams would determine whether republicanism or democracy would prevail (and, with Jackson, democracy would, eventually, win). Jackson’s rise to power marked the birth of American populism. The argument of populism is that the best government is that most closely directed by a popular majority.
  • He was provincial, and poorly educated. (Later, when Harvard gave Jackson an honorary doctorate, John Quincy Adams refused to attend the ceremony, calling him “a barbarian who could not write a sentence of grammar and hardly could spell his own name.”)68 He had a well-earned reputation for being ferocious, ill-humored, and murderous, on the battlefield and off. When he ran for president, he had served less than a year in the Senate. Of his bid for the White House Jefferson declared, “He is one of the most unfit men I know of for such a place.”69 Jackson made a devilishly shrewd decision. He would make his lack of certain qualities—judiciousness, education, political experience—into strengths.
  • Eaton, who ran Jackson’s campaign, shrewdly revised his Life of Andrew Jackson, deleting or dismissing everything in Jackson’s past that looked bad and lavishing attention on anything that looked good and turning into strengths what earlier had been considered weaknesses: Eaton’s Jackson wasn’t uneducated; he was self-taught. He wasn’t ill-bred; he was “self-made.”
  • Watching the rise of American democracy, an aging political elite despaired, and feared that the Republic could not survive the rule of the people. Wrote John Randolph of Virginia, “The country is ruined past redemption.”
  • “The first principle of our system,” Jackson said, “is that the majority is to govern.” He bowed to the people. Then, all at once, the people nearly crushed him with their affection.
  • The democratization of American politics was hastened by revivalists like Stewart who believed in the salvation of the individual through good works and in the equality of all people in the eyes of God. Against that belief stood the stark and brutal realities of an industrializing age, the grinding of souls.
  • The great debates of the middle decades of the nineteenth century had to do with the soul and the machine. One debate merged religion and politics. What were the political consequences of the idea of the equality of souls? Could the soul of America be redeemed from the nation’s original sin, the Constitution’s sanctioning of slavery?
  • Another debate merged politics and technology. Could the nation’s new democratic traditions survive in the age of the factory, the railroad, and the telegraph? If all events in time can be explained by earlier events in time, if history is a line, and not a circle, then the course of events—change over time—is governed by a set of laws, like the laws of physics, and driven by a force, like gravity. What is that force? Is change driven by God, by people, or by machines? Is progress the progress of Pilgrim’s Progress, John Bunyan’s 1678 allegory—the journey of a Christian from sin to salvation? Is progress the extension of suffrage, the spread of democracy? Or is progress invention, the invention of new machines?
  • A distinctively American idea of progress involved geography as destiny, picturing improvement as change not only over time but also over space.
  • If the sincerity of converts was often dubious, another kind of faith was taking deeper root in the 1820s, an evangelical faith in technological progress, an unquestioning conviction that each new machine was making the world better. That faith had a special place in the United States, as if machines had a distinctive destiny on the American continent. In prints and paintings, “Progress” appeared as a steam-powered locomotive, chugging across the continent, unstoppable. Writers celebrated inventors as “Men of Progress” and “Conquerors of Nature” and lauded their machines as far worthier than poetry. The triumph of the sciences over the arts meant the defeat of the ancients by the moderns. The genius of Eli Whitney, hero of modernity, was said to rival that of Shakespeare; the head of the U.S. Patent Office declared the steamboat “a mightier epic” than the Iliad.18
  • To Jackson’s supporters, his election marked not degeneration but a new stage in the history of progress. Nowhere was this argument made more forcefully, or more influentially, than in George Bancroft’s History of the United States from the Discovery of the American Continent to the Present. The book itself, reviewers noted, voted for Jackson. The spread of evangelical Christianity, the invention of new machines, and the rise of American democracy convinced Bancroft that “humanity is steadily advancing,” and that “the advance of liberty and justice is certain.” That advance, men like Bancroft and Jackson believed, required Americans to march across the continent, to carry these improvements from east to west, the way Jefferson had pictured it. Democracy, John O’Sullivan, a New York lawyer and Democratic editor, argued in 1839, is nothing more or less than “Christianity in its earthly aspect.” O’Sullivan would later coin the term “manifest destiny” to describe this set of beliefs, the idea that the people of the United States were fated “to overspread and to possess the whole of the continent which Providence has given for the development of the great experiment of liberty.”23
  • To evangelical Democrats, Democracy, Christianity, and technology were levers of the same machine. And yet, all along, there were critics and dissenters and objectors who saw, in the soul of the people, in the march of progress, in the unending chain of machines, in the seeming forward movement of history, little but violence and backwardness and a great crushing of men, women, and children. “Oh, America, America,” Maria Stewart cried, “foul and indelible is thy stain!”24
  • The self-evident, secular truths of the Declaration of Independence became, to evangelical Americans, the truths of revealed religion. To say that this marked a turn away from the spirit of the nation’s founding is to wildly understate the case. The United States was founded during the most secular era in American history, either before or since. In the late eighteenth century, church membership was low, and anticlerical feeling was high.
  • The United States was not founded as a Christian nation. The Constitution prohibits religious tests for officeholders. The Bill of Rights forbids the federal government from establishing a religion, James Madison having argued that to establish
  • The separation of church and state allowed religion to thrive; that was one of its intentions. Lacking an established state religion, Americans founded new sects, from Shakers to Mormons, and rival Protestant denominations sprang up in town after town. Increasingly, the only unifying, national religion was a civil religion, a belief in the American creed. This faith bound the nation together, and provided extraordinary political stability in an era of astonishing change,
  • Slavery wasn’t an aberration in an industrializing economy; slavery was its engine. Factories had mechanical slaves; plantations had human slaves. The power of machines was measured by horsepower, the power of slaves by hand power. A healthy man counted as “two hands,” a nursing woman as a “half-hand,” a child as a “quarter-hand.”
  • With Walker, the antislavery argument for gradual emancipation, with compensation for slave owners, became untenable. Abolitionists began arguing for immediate emancipation. And southern antislavery societies shut their doors. As late as 1827, the number of antislavery groups in the South had outnumbered those in the North by more than four to one. Southern antislavery activists were usually supporters of colonization, not of emancipation. Walker’s Appeal ended the antislavery movement in the South and radicalized it in the North.
  • The rebellion rippled across the Union. The Virginia legislature debated the possibility of emancipating its slaves, fearing “a Nat Turner might be in every family.” Quakers submitted a petition to the state legislature calling for abolition. The petition was referred to a committee, headed by Thomas Jefferson’s thirty-nine-year-old grandson, Thomas Jefferson Randolph, who proposed a scheme of gradual emancipation. Instead, the legislature passed new laws banning the teaching of slaves to read and write, and prohibiting, too, teaching slaves about the Bible.43 In a nation founded on a written Declaration, made sacred by evangelicals during a religious revival, reading about equality became a crime.
  • One consequence of the rise of Jacksonian democracy and the Second Great Awakening was the participation of women in the reformation of American politics by way of American morals. When suffrage was stripped of all property qualifications, women’s lack of political power became starkly obvious. For women who wished to exercise power, the only source of power seemingly left to them was their role as mothers, which, they suggested, rendered them morally superior to men—more loving, more caring, and more responsive to the cries of the weak.
  • Purporting to act less as citizens than as mothers, cultivating the notion of “republican motherhood,” women formed temperance societies, charitable aid societies, peace societies, vegetarian societies, and abolition societies. The first Female Anti-Slavery Society was founded in Boston in 1833; by 1837, 139 Female Anti-Slavery Societies had been founded across the country,
  • After 1835, she never again spoke in public. As Catherine Beecher argued in 1837, in An Essay on Slavery and Abolitionism, with Reference to the Duty of American Females, “If the female advocate chooses to come upon a stage, and expose her person, dress, and elocution to public criticism, it is right to express disgust.”
  • Jacksonian democracy distributed political power to the many, but industrialization consolidated economic power in the hands of a few. In Boston, the top 1 percent of the population controlled 10 percent of wealth in 1689, 16 percent in 1771, 33 percent in 1833, and 37 percent in 1848, while the lowest 80 percent of the population controlled 39 percent of the wealth in 1689, 29 percent in 1771, 14 percent in 1833, and a mere 4 percent in 1848.
  • In New York, the top 1 percent of the population controlled 40 percent of the wealth in 1828 and 50 percent in 1845; the top 4 percent of the population controlled 63 percent of the wealth in 1828 and 80 percent in 1845.49
  • While two and a half million Europeans had migrated to all of the Americas between 1500 and 1800, the same number—two and a half million—arrived specifically in the United States between 1845 and 1854 alone. As a proportion of the U.S. population, European immigrants grew from 1.6 percent in the 1820s to 11.2 percent in 1860. Writing in 1837, one Michigan reformer called the nation’s rate of immigration “the boldest experiment upon the stability of government ever made in the annals of time.”51 The largest
  • Critics of Jackson—himself the son of Irish immigrants—had blamed his election on the rising population of poor, newly enfranchised Irishmen. “Everything in the shape of an Irishman was drummed to the polls,” one newspaper editor wrote in 1828.52 By 1860, more than one in eight Americans were born in Europe, including 1.6 million Irish and 1.2 million Germans, the majority of whom were Catholic. As the flood of immigrants swelled, the force of nativism gained strength, as did hostility toward Catholics, fueled by the animus of evangelical Protestants.
  • The insularity of both Irish and German communities contributed to a growing movement to establish tax-supported public elementary schools, known as “common schools,” meant to provide a common academic and civic education to all classes of Americans. Like the extension of suffrage to all white men, this element of the American experiment propelled the United States ahead of European nations. Much of the movement’s strength came from the fervor of revivalists. They hoped that these new schools would assimilate a diverse population of native-born and foreign-born citizens by introducing them to the traditions of American culture and government, so that boys, once men, would vote wisely, and girls, once women, would raise virtuous children. “It is our duty to make men moral,” read one popular teachers’ manual, published in 1830. Other advocates hoped that a shared education would diminish partisanship. Whatever the motives of its advocates, the common school movement emerged out of, and nurtured, a strong civic culture.56
  • With free schools, literacy spread, and the number of newspapers rose, a change that was tied to the rise of a new party system. Parties come and go, but a party system—a stable pair of parties—has characterized American politics since the ratification debates. In American history the change from one party system to another has nearly always been associated with a revolution in communications that allows the people to shake loose of the control of parties. In the 1790s, during the rise of the first party system, which pitted Federalists against Republicans, the number of newspapers had swelled. During the shift to the second party system, which, beginning in 1833, pitted Democrats against the newly founded Whig Party, not only did the number of newspapers rise, but their prices plummeted.
  • The newspapers of the first party system, which were also known as “commercial advertisers,” had consisted chiefly of partisan commentary and ads, and generally sold for six cents an issue. The new papers cost only one cent, and were far more widely read. The rise of the so-called penny press also marked the beginning of the triumph of “facts” over “opinion” in American journalism, mainly because the penny press aimed at a different, broader, and less exclusively partisan audience. The New York Sun appeared in 1833. “It shines for all” was its common-man motto. “The object of this paper is to lay before the public, at a price within the means of everyone, ALL THE NEWS OF THE DAY,” it boasted. It dispensed with subscriptions and instead was circulated at newsstands, where it was sold for cash, to anyone who had a ready penny. Its front page was filled not with advertising but with news. The penny press was a “free press,” as James Gordon Bennett of the New York Herald put it, because it wasn’t beholden to parties. (Bennett, born in Scotland, had immigrated to the United States after reading Benjamin Franklin’s Autobiography.) Since the paper was sold at newsstands, rather than mailed to subscribers, he explained, its editors and writers were “entirely ignorant who are its readers and who are not.” They couldn’t favor their readers’ politics because they didn’t know them. “We shall support no party,” Bennett insisted. “We shall endeavor to record facts.”
  • During the days of the penny press, Tocqueville observed that Americans had a decided preference for weighing the facts of a matter themselves: They mistrust systems; they adhere closely to facts and study facts with their own senses. As they do not easily defer to the mere name of any fellow man, they are never inclined to rest upon any man’s authority; but, on the contrary, they are unremitting in their efforts to find out the weaker points of their neighbor’s doctrine.60
  • For centuries, Europeans had based their claims to lands in the New World on arguments that native peoples had no right to the land they inhabited, no sovereignty over it, because they had no religion, or because they had no government, or because they had no system of writing. The Cherokees, with deliberation and purpose, challenged each of these arguments.
  • Britain, Calhoun argued that if a state were to decide that a law passed by Congress was unconstitutional, the Constitution would have to be amended, and if such an amendment were not ratified—if it didn’t earn the necessary approval of three-quarters of the states—the objecting state would have the right to secede from the Union. The states had been sovereign before the Constitution was ever written, or even thought of, Calhoun argued, and they remained sovereign. Calhoun also therefore argued against majority rule; nullification is fundamentally anti-majoritarian. If states can secede, the majority does not rule.78 The nullification crisis was
  • New York abolished debtors’ prison in 1831, and in 1841, Congress passed a federal law offering bankruptcy protection to everyone. Within two years, 41,000 Americans had filed for bankruptcy. Two years later, the law was repealed, but state laws continued to offer bankruptcy protection and, still more significantly, debtors’ prisons were gone for good. In Britain and all of Europe except Portugal, offenders were still being thrown in debtors’ prison (a plot that animated many a nineteenth-century novel); in the United States, debtors could declare bankruptcy and begin again.
  • A nation of debtors, Americans came to see that most people who fall into debt are victims of the business cycle and not of fate or divine retribution or the wheel of fortune. The nation’s bankruptcy laws, even as they came and went again, made taking risks less risky for everyone, which meant that everyone took more risks.
  • the geographical vastness of the United States meant that the anxiety about the machinery of industrial capitalism took the form not of Marxism, with its argument that “the history of all hitherto existing society is the history of class struggles,” but instead of a romance with nature, and with the land, and with all things rustic. Against the factory, Americans posed not a socialist utopia but the log cabin.
  • Were all these vast designs and rapid strides worth it? Thoreau thought not. He came to this truth: “They are but improved means to an unimproved end.”112
  • Expansion, even more than abolition, pressed upon the public the question of the constitutionality of slavery. How or even whether this crisis would be resolved was difficult to see not only because of the nature of the dispute but also because there existed very little agreement about who might resolve it: Who was to decide whether a federal law was unconstitutional?
  • In the midst of all this clamoring among the thundering white-haired patriarchs of American politics, there emerged the idea that the authority to interpret the Constitution rests with the people themselves. Or, at least, this became a rather fashionable thing to say. “It is, Sir, the people’s Constitution, the people’s government, made for the people, made by the people, and answerable to the people,” Daniel Webster roared from the floor of Congress.14 Every man could read and understand the Constitution, Webster insisted.
  • The Notes, it appeared, could be read as variously as the Constitution itself. As one shrewd observer remarked, “The Constitution threatens to be a subject of infinite sects, like the Bible.” And, as with many sects, those politicians who most strenuously staked their arguments on the Constitution often appeared the least acquainted with it. Remarked New York governor Silas Wright, “No one familiar with the affairs of our government, can have failed to notice how large a proportion of our statesmen appear never to have read the Constitution of the United States with a careful reference to its precise language and exact provisions, but rather, as occasion presents, seem to exercise their ingenuity . . . to stretch both to the line of what they, at the moment, consider expedient.”22
  • A NATION HAS borders but the edges of an empire are frayed.23 While abolitionists damned the annexation of Texas as an extension of the slave power, more critics called it an act of imperialism, inconsistent with a republican form of government. “We have a republic, gentlemen, of vast extent and unequalled natural advantages,” Daniel Webster pointed out. “Instead of aiming to enlarge its boundaries, let us seek, rather, to strengthen its union.”24 Webster lost that argument, and, in the end, it was the American reach for empire that, by sundering the Union, brought about the collapse of slavery.
  • Although hardly ever reported in the press, the years between 1830 and 1860 saw more than one hundred incidents of violence between congressmen, from melees in the aisles to mass brawls on the floor, from fistfights and duels to street fights. “It is the game of these men, and of their profligate organs,” Dickens wrote, “to make the strife of politics so fierce and brutal, and so destructive of all self-respect in worthy men, that sensitive and delicate-minded persons shall be kept aloof, and they, and such as they, be left to battle out their selfish views unchecked.”
  • They spat venom. They pulled guns. They unsheathed knives. Divisions of party were abandoned; the splinter in Congress was sectional. Before heading to the Capitol every morning, southern congressmen strapped bowie knives to their belts and tucked pistols into their pockets. Northerners, on principle, came unarmed. When northerners talked about the slave power, they meant that literally.32
  • If the United States were to acquire territory from Mexico, and if this territory were to enter the Union, would Mexicans become American citizens? Calhoun, now in the Senate, vehemently opposed this idea. “I protest against the incorporation of such a people,” he declared. “Ours is the government of the white man.”
  • And yet, as different as were Wilmot’s interests from Calhoun’s, they were both interested in the rights of white men, as Wilmot made plain. “I plead the cause of the rights of white freemen,” he said. “I would preserve for free white labor a fair country, a rich inheritance, where the sons of toil, of my own race and own color, can live without the disgrace which association with negro slavery brings upon free labor.”
  • If the problem was the size of the Republic, the sprawl of its borders, the frayed edges of empire, couldn’t railroads, and especially the telegraph, tie the Republic together? “Doubt has been entertained by many patriotic minds how far the rapid, full, and thorough intercommunication of thought and intelligence, so necessary to the people living under a common representative republic, could be expected to take place throughout such immense bounds,” said one House member in 1845, but “that doubt can no longer exist.”45
  • even Americans with an unflinching faith in machine-driven progress understood that a pulse along a wire could not stop the slow but steady dissolution of the Union.
  • the Treaty of Guadalupe Hidalgo, under which the top half of Mexico became the bottom third of the United States. The gain to the United States was as great as the loss to Mexico. In 1820, the United States of America had spanned 1.8 million square miles, with a population of 9.6 million people; Mexico had spanned 1.7 million square miles, with a population of 6.5 million people. By 1850, the United States had acquired one million square miles of Mexico, and its population had grown to 23.2 million; Mexico’s population was 7.5 million.49
  • The Louisiana Purchase had doubled the size of the United States. In gaining territory from Mexico, the United States grew by 64 percent.
  • the territory comprising the United States had grown to “nearly ten times as large as the whole of France and Great Britain combined; three times as large as the whole of France, Britain, Austria, Prussia, Spain, Portugal, Belgium, Holland, and Denmark, together; one-and-a-half times as large as the Russian empire in Europe; one-sixth less only than the area covered by the fifty-nine or sixty empires, states, and Republics of Europe; of equal extent with the Roman Empire or that of Alexander, neither of which is said to have exceeded 3,000,000 square miles.”50
  • Sentiment was not Fuller’s way; debate was her way. She was a scourge of lesser intellects. Edgar Allan Poe, whose work she did not admire, described her as wearing a perpetual sneer. In “The Great Lawsuit: Man versus Men, Woman versus Women,” Fuller argued that the democratization of American politics had cast light on the tyranny of men over women: “As men become aware that all men have not had their fair chance,” she observed, women had become willing to say “that no women have had a fair chance.”
  • In 1845, in Woman in the Nineteenth Century, Fuller argued for fundamental and complete equality: “We would have every path laid open to Woman as freely as to Man.”56 The book was wildly successful, and Greeley, who had taken to greeting Fuller with one of her catchphrases about women’s capacity—“Let them be sea-captains, if you will”—sent her to Europe to become his newspaper’s foreign correspondent.
  • Reeling from those revolutions, the king of Bavaria asked the historian Leopold von Ranke to explain why his people had rebelled against monarchial rule, as had so many peoples in Europe that year. “Ideas spread most rapidly when they have found adequate concrete expression,” Ranke told the king, and the United States had “introduced a new force in the world,” the idea that “the nation should govern itself,” an idea that would determine “the course of the modern world”: free speech, spread by wire, would make the whole world free.61
  • Unlike Thoreau, who cursed the railroads, Free-Soilers believed in improvement, improvement through the hard work of the laboring man, his power, his energy. “Our paupers to-day, thanks to free labor, are our yeoman and merchants of tomorrow,” the New York Times boasted. “Why, who are the laboring people of the North?” Daniel Webster asked. “They are the whole North. They are the people who till their own farms with their own hands, freeholders, educated men, independent men.”
  • This attack by northerners led southerners to greater exertions in defending their way of life. They battled on several fronts. They described northern “wage slavery” as a far more exploitative system of labor than slavery. They celebrated slavery as fundamental to American prosperity. Slavery “has grown with our growth, and strengthened with our strength,” Calhoun said. And they elaborated an increasingly virulent ideology of racial difference, arguing against the very idea of equality embodied in the American creed.
  • Conservative Virginian George Fitzhugh, himself inspired by ethnological thinking, dismissed the “self-evident truths” of the Declaration of Independence as utter nonsense. “Men are not born physically, morally, or intellectually equal,” he wrote. “It would be far nearer the truth to say, ‘that some were born with saddles on their backs, and others booted and spurred to ride them,’—and the riding does them good.”
  • For Fitzhugh, the error had begun in the imaginations of the philosophes of the Enlightenment and in their denial of the reality of history. Life and liberty are not “inalienable rights,” Fitzhugh argued: instead, people “have been sold in all countries, and in all ages, and must be sold so long as human nature lasts.” Equality means calamity: “Subordination, difference of caste and classes, difference of sex, age, and slavery beget peace and good will.”
  • Progress is an illusion: “the world has not improved in the last two thousand, probably four thousand years.” Perfection is to be found in the past, not in the future.66 As for the economic systems of the North and the South, “Free laborers have not a thousandth part of the rights and liberties of negro slaves,” Fitzhugh insisted. “The negro slaves of the South are the happiest, and, in some sense, the freest people in the world.”67
  • HISTORY TEEMS WITH mishaps and might-have-beens: explosions on the Potomac, storms not far from port, narrowly contested elections, court cases lost and won, political visionaries drowned. But over the United States in the 1850s, a sense of inevitability fell, as if there were a fate, a dismal dismantlement, that no series of events or accidents could thwart.
  • Douglas promoted the idea of popular sovereignty, proclaiming, “If there is any one principle dearer and more sacred than all others in free governments, it is that which asserts the exclusive right of a free people to form and adopt their own fundamental law.”75 Unfree people, within Stephen Douglas’s understanding, had no such rights.
  • the Fugitive Slave Law, required citizens to turn in runaway slaves and denied fugitives the right to a jury trial. The law, said Harriet Jacobs, a fugitive slave living in New York, marked “the beginning of a reign of terror to the colored population.”76 Bounty hunters and slave catchers hunted down and captured former slaves and returned them to their owners for a fee. Little stopped them from seizing men, women, and children who had been born free, or who had been legally emancipated, and selling them to the South, too. Nothing so brutally exposed the fragility of freedom or the rapaciousness of slavery.
  • February 1854, at their convention in Philadelphia, northern Know-Nothings proposed a platform plank calling for the reinstatement of the Missouri Compromise. When that motion was rejected, some fifty delegates from eight northern states bolted: they left the convention, and the party, to set up their own party, the short-lived North American Party. Nativism would endure as a force in American politics, but, meanwhile, nativists split over slavery.
  • Lincoln’s was the language of free soil, free speech, and free labor. He grounded his argument against slavery in his understanding of American history, in the language of Frederick Douglass, and in his reading of the Constitution. “Let no one be deceived,” he said. “The spirit of seventy-six and the spirit of Nebraska, are utter antagonisms.”
  • As a nation, we began by declaring that “all men are created equal.” We now practically read it “all men are created equal, except negroes.” When the Know-Nothings get control, it will read “all men are created equal, except negroes, and foreigners, and Catholics.” When it comes to this I should prefer emigrating to some country where they make no pretense of loving liberty—to Russia, for instance, where despotism can be taken pure, and without the base alloy of hypocrisy.
  • “That negroes, whether slave or free, that is, men of the African race, are not citizens of the United States by the Constitution.” The implications of the ruling stunned his readers. Even Americans who held no strong views on the question of slavery—and they were rare enough—were nonetheless shocked by the court’s exercise of the authority to determine the unconstitutionality of the law.
  • “A large meeting of colored people” was held in Philadelphia in April, at which it was resolved that “the only duty the colored man owes to a Constitution under which he is declared to be an inferior and degraded being, having no rights which white men are bound to respect, is to denounce and repudiate it, and to do what he can by all proper means to bring it into contempt.”
  • “You may close your Supreme Court against the black man’s cry for justice, but you cannot, thank God, close against him the ear of a sympathising world, nor shut up the Court of Heaven.” Taney’s interpretation of the Constitution would be ignored, Douglass predicted. “Slavery lives in this country not because of any paper Constitution, but in the moral blindness of the American people.”102
  • A PHOTOGRAPH STOPS TIME, TRAPPING IT LIKE A BUTTERFLY in a jar.
  • No other kind of historical evidence has this quality of instantaneity, of an impression taken in a moment, in a flicker, an eye opened and then shut. Photographs also capture the ordinary, the humble, the speechless. The camera discriminates between light and dark but not between the rich and the poor, the literate and the illiterate, the noisy and the quiet.
  • portraits were also closely associated with death, with being trapped in time, on glass, for eternity, and, even more poignantly, with equality.3 With photography, Walt Whitman predicted, “Art will be democratized.”
  • Morse had long predicted that the telegraph would usher in an age of world peace. “I trust that one of its effects will be to bind man to his fellow-man in such bonds of amity as to put an end to war,” he insisted.8 War was a failure of technology, Morse argued, a shortcoming of communication that could be remedied by way of a machine. Endowing his work with the grandest of purposes, he believed that the laying of telegraph wires across the American continent would bind the nation together into one people, and that the laying of cable across the ocean would bind Europe to the Americas, ushering in the dawn of an age of global harmony.
  • But war isn’t a failure of technology; it’s a failure of politics.
  • Debate is to war what trial by jury is to trial by combat: a way to settle a dispute without coming to blows. The form and its rules had been established over centuries. They derived from rules used in the courts and in Parliament, and even from the rules of rhetoric used in the writing of poetry. Since the Middle Ages and the founding of the first universities, debate had been the foundation of a liberal arts education.
  • (Etymologically and historically, the artes liberales are the arts acquired by people who are free, or liber.)10 In the eighteenth century, debate was understood as the foundation of civil society. In 1787, delegates to the constitutional convention had agreed “to argue without asperity, and to endeavor to convince the judgment without hurting the feelings of each other.”
  • Some twelve thousand people showed up for their first debate, at two o’clock in the afternoon on August 21, in Ottawa, Illinois. There were no seats; the audience stood, without relief, for three hours.
  • They’d agreed to strict rules: the first speaker would speak for an hour and the second for an hour and a half, whereupon the first speaker would offer a thirty-minute rebuttal.
  • And, as to the wrongness of slavery, he called it tyranny, and the idea of its naturalness as much an error as a belief in the divine right of kings. The question wasn’t sectionalism or nationalism, the Democratic Party or the Republican Party. The question was right against wrong. “That is the issue that will continue in this country when these poor tongues of Judge Douglas and myself shall be silent,” Lincoln said.16
  • The price of slaves grew so high that a sizable number of white southerners urged the reopening of the African slave trade. In the 1850s, legislatures in several states, including South Carolina, proposed reopening the trade. Adopting this measure would have violated federal law. Some “reopeners” believed that the federal ban on the trade was unconstitutional; others were keen to nullify it, in a dress rehearsal for secession.
  • “If it is right to buy slaves in Virginia and carry them to New Orleans, why is it not right to buy them in Cuba, Brazil, or Africa and carry them there?”21 Proslavery southerners made these arguments under the banner of “free trade,” their rhetorical answer to “free labor.”
  • To George Fitzhugh, all societies were “at all times and places, regulated by laws as universal and as similar as those which control the affairs of bees,” and trade itself, including the slave trade, was “as old, as natural, and irresistible as the tides of the ocean.”
  • In 1855, David Christy, the author of Cotton Is King, wrote about the vital importance of “the doctrine of Free Trade,” which included abolishing the tariffs that made imported English goods more expensive than manufactured goods produced in the North. As one southerner put it, “Free trade, unshackled industry, is the motto of the South.”23
  • Darwin’s Origin of Species would have a vast and lingering influence on the world of ideas. Most immediately, it refuted the racial arguments of ethnologists like Louis Agassiz. And, in the months immediately following the book’s publication—the last, unsettling months before the beginning of the Civil War—abolitionists took it as evidence of the common humanity of man.30
  • The truths of the Confederacy disavowed the truths of the Union. The Confederacy’s newly elected vice president, a frail Georgian named Alexander Stephens, delivered a speech in Savannah in which he made those differences starkly clear. The ideas that lie behind the Constitution “rested upon the assumption of the equality of races,” Stephens said, but
  • “Our new government is founded upon exactly the opposite idea: its foundations are laid, its cornerstone rests, upon the great truth that the negro is not equal to the white man; that slavery . . . is his natural and moral condition. This, our new government, is the first, in the history of the world, based upon this great physical, philosophical, and moral truth.”52 It would become politically expedient, after the war, for ex-Confederates to insist that the Confederacy was founded on states’ rights. But the Confederacy was founded on white supremacy.
  • Opposition to free speech had long been the position of slave owners, a position taken at the constitutional convention and extended through the gag rule, antiliteracy laws, bans on the mails, and the suppression of speakers. An aversion to political debate also structured the Confederacy, which had both a distinctive character and a lasting influence on Americans’ ideas about federal authority as against popular sovereignty.
  • Secessionists were attempting to build a modern, proslavery, antidemocratic state. In order to wage a war, the leaders of this fundamentally antidemocratic state needed popular support. Such support was difficult to gain and impossible to maintain. The Confederacy therefore suppressed dissent.55
  • By May of 1861, the Confederacy comprised eleven states stretching over 900,000 square miles and containing 12 million people, including 4 million slaves, and 4 million white women who were disenfranchised. It rested on the foundational belief that a minority governs a majority. “The condition of slavery is with us nothing but a form of civil government for a class of people not fit to govern themselves,” said Jefferson Davis.
  • There would be those, after the war ended, who said that it had been fought over states’ rights or to preserve the Union or for a thousand other reasons and causes. Soldiers, North and South, knew better. “The fact that slavery is the sole undeniable cause of this infamous rebellion, that it is a war of, by, and for Slavery, is as plain as the noon-day sun,” a soldier writing for his Wisconsin regimental newspaper explained in 1862. “Any man who pretends to believe that this is not a war for the emancipation of the blacks,” a soldier writing for his Confederate brigade’s newspaper wrote that same year, “is either a fool or a liar.”
  • Lincoln would remain a man trapped in time, in the click of a shutter and by the trigger of a gun. In mourning him, in sepia and yellow, in black and white, beneath plates of glinting glass, Americans deferred a different grief, a vaster and more dire reckoning with centuries of suffering and loss, not captured by any camera, not settled by any amendment, the injuries wrought on the bodies of millions of men, women, and children, stolen, shackled, hunted, whipped, branded, raped, starved, and buried in unmarked graves.
  • No president consecrated their cemeteries or delivered their Gettysburg address; no committee of arrangements built monuments to their memory. With Lincoln’s death, it was as if millions of people had been crammed into his tomb, trapped in a vault that could not hold them.
  • People running for Congress didn’t have to meet property requirements; they didn’t have to have been born in the United States; and they couldn’t be subjected to religious tests. This same logic applied to citizenship, and for the same reason: the framers of the Constitution understood these sorts of requirements as forms of political oppression. The door to the United States was meant to be open.
  • Before the 1880s, no federal law restricted immigration. And, despite periods of fervent nativism, especially in the 1840s, the United States welcomed immigrants into citizenship, and valued them. After the Civil War, the U.S. Treasury estimated the worth of each immigrant as equal to an $800 contribution to the nation’s economy.
  • Nineteenth-century politicians and political theorists interpreted American citizenship within the context of an emerging set of ideas about human rights and the authority of the state, holding dear the conviction that a good government guarantees everyone eligible for citizenship the same set of political rights, equal and irrevocable.
  • The Civil War raised fundamental questions not only about the relationship between the states and the federal government but also about citizenship itself and about the very notion of a nation-state. What is a citizen? What powers can a state exert over its citizens? Is suffrage a right of citizenship, or a special right, available only to certain citizens? Are women citizens? And if women are citizens, why aren’t they voters? What about Chinese immigrants, pouring into the West? They were free. Were they, under American law, “free white persons” or “free persons of color” or some other sort of persons?
  • In 1866, Congress searched in vain for a well-documented definition of the word “citizen.” Over the next thirty years, that definition would become clear, and it would narrow.
  • In 1896, the U.S. passport office, in the Department of State, which had grown to thousands of clerks, began processing applications according to new “Rules Governing the Application of Passports,” which required evidence of identity, including a close physical description (Age, _____ years; stature, _____ feet _____ inches (English measure); forehead, _____; eyes, _____; nose, _____; mouth, _____; chin, _____; hair, _____; complexion, _____; face, _____), as well as affidavits, signatures, witnesses, an oath of loyalty, and, by way of an application fee, one dollar.12 [Photograph caption: Lew Wa Ho worked at a dry goods shop in St. Louis; the photograph was included in his Immigration Service case file as evidence of employment.]
  • The Fourteenth Amendment, drafted by the Joint Committee on Reconstruction, marked the signal constitutional achievement of a century of debate and war, of suffering and struggle. It proposed a definition of citizenship guaranteeing its privileges and immunities, and insuring equal protection and due process to all citizens. “All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside.”
  • “No state shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any state deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.”20
  • During the drafting of the amendment, the committee betrayed the national phalanx of women who for decades had fought for abolition and for black civil rights by proposing to insert, into the amendment’s second section, a provision that any state that denied the right to vote “to any of the male inhabitants of such state” would lose representation in Congress. “Male” had never before appeared in any part of the Constitution. “If that word ‘male’ be inserted,” Stanton warned, “it will take us a century at least to get it out.”21 She was not far wrong.
  • Women protested. “Can any one tell us why the great advocates of Human Equality . . . forget that when they were a weak party and needed all the womanly strength of the nation to help them on, they always united the words ‘without regard to sex, race, or color’?” asked Ohio-born reformer Frances Gage. Charles Sumner offered this answer: “We know how the Negro will vote, but are not so sure of the women.” How women would vote was impossible to know. Would black women vote the way black men voted? Would white women vote like black women? Republicans decided they’d rather not find out.
  • In the federal census of 1860, 24,282 out of 34,935 Chinese toiled in mines. Although some Chinese immigrants left mining—and some were forced out—many continued to mine well into the 1880s, often working in sites abandoned by other miners.
  • An 1867 government report noted that in Montana, “the diggings now fall into the hands of the Chinese, who patiently glean the fields abandoned by the whites.” Chinese workers began settling in Boise in 1865 and only five years later constituted a third of Idaho’s settlers and nearly 60 percent of its miners. In 1870, Chinese immigrants and their children made up nearly 9 percent of the population of California, and one-quarter of the state’s wage earners.
  • Their rights, under state constitutions and statutes, were markedly limited. Oregon’s 1857 constitution barred “Chinamen” from owning real estate, while California barred Chinese immigrants from testifying in court, a provision upheld in an 1854 state supreme court opinion, People v. Hall, which described the Chinese as “a race of people whom nature has marked as inferior, and who are incapable of progress or intellectual development beyond a certain point, as their history has shown.”29
  • And what about the voting rights of U.S.-born Chinese Americans? Much turned on the Fifteenth Amendment, proposed early in 1869. While the aim of the amendment was to guarantee African Americans the right to vote and hold office, its language inevitably raised the question of Chinese citizenship and suffrage. Opponents of the amendment found its entire premise scandalous. Garrett Davis, a Democratic senator from Kentucky, fumed, “I want no negro government; I want no Mongolian government; I want the government of the white man which our fathers incorporated.”33
  • Douglass spoke about what he called a “composite nation,” a strikingly original and generative idea, about a citizenry made better, and stronger, not in spite of its many elements, but because of them: “I want a home here not only for the negro, the mulatto and the Latin races; but I want the Asiatic to find a home here in the United States, and feel at home here, both for his sake and for ours.”36
  • Tilden won the nomination anyway and, in the general election, he won the popular vote against Hayes. Unwilling to accept the result of the election, Republicans disputed the returns in Florida, Louisiana, and South Carolina.
  • Eventually, the decision was thrown to an electoral commission that brokered a nefarious compromise: Democrats agreed to throw their support behind the man ever after known as Rutherfraud B. Hayes, so that he could become president, in exchange for a promise from Republicans to end the military occupation of the South. For a minor and petty political win over the Democratic Party, Republicans first committed electoral fraud and then, in brokering a compromise, abandoned a century-long fight for civil rights.
  • As soon as federal troops withdrew, white Democrats, calling themselves the “Redeemers,” took control of state governments of the South, and the era of black men’s enfranchisement came to a violent and terrible end. The Klan terrorized the countryside, burning homes and hunting, torturing, and killing people. (Between 1882 and 1930, murderers lynched more than three thousand black men and women.)
  • Black politicians elected to office were thrown out. And all-white legislatures began passing a new set of black codes, known as Jim Crow laws, that segregated blacks from whites in every conceivable public place, down to the last street corner. Tennessee passed the first Jim Crow law, in 1881, mandating the separation of blacks and whites in railroad cars. Georgia became the first state to demand separate seating for whites and blacks in streetcars, in 1891.
  • “Capital buys and sells to-day the very heart-beats of humanity,” she said. Democracy itself had been corrupted by it: “the speculators, the land-robbers, the pirates and gamblers of this Nation have knocked unceasingly at the doors of Congress, and Congress has in every case acceded to their demands.”44 The capitalists, she said, had subverted the will of the people.
  • In the late nineteenth century, a curious reversal took place. Electoral politics, the politics men engaged in, became domesticated, the office work of education and advertising—even voting moved indoors. Meanwhile, women’s political expression moved to the streets. And there, at marches, rallies, and parades, women deployed the tools of the nineteenth-century religious revival: the sermon, the appeal, the conversion.45
  • In 1862 alone, in addition to the Homestead Act, the Republican Congress passed the Pacific Railway Act (chartering railroad companies to build the line from Omaha, Nebraska, to Sacramento, California) and the National Bank Act (to issue paper money to pay for it all). After the war, political power moved from the states to the federal government, and as the political influence of the South waned, the importance of the West rose. Congress not only sent to the states amendments to the Constitution that defined citizenship and guaranteed voting rights but also passed landmark legislation involving the management of western land, the control of native populations, the growth and development of large corporations, and the construction of a national transportation infrastructure.
  • The independent farmer—the lingering ideal of the Jeffersonian yeoman—remained the watchword of the West, but in truth, the family farming for subsistence, free of government interference, was far less common than a federally subsidized, capitalist model of farming and cattle raising for a national or even an international market. The small family farm—Jefferson’s republican dream—was in many parts of the arid West an environmental impossibility.
  • Much of the property distributed under the terms of the Homestead Act, primarily in the Great Basin, was semi-arid, the kind of land on which few farmers could manage a productive farm with only 160 acres. Instead, Congress typically granted the best land to railroads, and allowed other, bigger interests to step in, buying up large swaths for agricultural business or stock raising and fencing it in, especially after the patenting of barbed wire in 1874.46
  • In 1885, an American economist tried to reckon the extraordinary transformation wrought by what was now 200,000 miles of railroad, more than in all of Europe. It was possible to move one ton of freight one mile for less than seven-tenths of one cent, “a sum so small,” he wrote, “that outside of China it would be difficult to find a coin of equivalent value to give a boy as a reward for carrying an ounce package across a street.”48
  • Instability contributed to a broader set of political concerns that became Mary Lease’s obsession, concerns known as “the money question,” and traceable all the way back to Hamilton’s economic plan: Should the federal government control banking and industry?
  • No group of native-born Americans was more determined to end Chinese immigration than factory workers. The 1876 platform of the Workingmen’s Party of California declared that “to an American death is preferable to life on par with a Chinaman.”55 In 1882, spurred by the nativism of populists, Congress passed its first-ever immigration law, the Chinese Exclusion Act, which barred immigrants from China from entering the United States and, determining that the Fourteenth Amendment did not apply to people of Chinese ancestry, decreed that Chinese people already in the United States were permanent aliens who could never become citizens.
  • Populists, whether farmers or factory workers, for all their invocation of “the people,” tended to take a narrow view of citizenship. United in their opposition to the “money power,” members of the alliance, like members of the Knights of Labor, were also nearly united in their opposition to the political claims of Chinese immigrants, and of black people. The Farmers’ Alliance excluded African Americans, who formed their own association, the Colored Farmers’ Alliance. Nor did populists count Native Americans within the body of “the people.”
  • In 1887, Congress passed the Dawes Severalty Act, under whose terms the U.S. government offered native peoples a path to citizenship in a nation whose reach had extended across the lands of their ancestors. The Dawes Act granted to the federal government the authority to divide Indian lands into allotments and guaranteed U.S. citizenship to Indians who agreed to live on those allotments and renounce tribal membership.
  • In proposing the allotment plan, Massachusetts senator Henry Laurens Dawes argued that the time had come for Indians to choose between “extermination or civilization” and insisted that the law offered Americans the opportunity to “wipe out the disgrace of our past treatment” and instead lift Indians up “into citizenship and manhood.”58
  • But in truth the Dawes Act understood native peoples neither as citizens nor as “persons of color,” and led to nothing so much as forced assimilation and the continued takeover of native lands. In 1887 Indians held 138 million acres; by 1900, they held only half of that territory.
  • In 1877, railroad workers protesting wage cuts went on strike in cities across the country. President Hayes sent in federal troops to end the strikes, marking the first use of the power of the federal government to support business against labor. The strikes continued, with little success in improving working conditions. Between 1881 and 1894, there was, on average, one major railroad strike a week. Labor was, generally and literally, crushed: in a single year, of some 700,000 men working on the railroads, more than 20,000 were injured on the job and nearly 2,000 killed.59
  • In 1882, Roscoe Conkling represented the Southern Pacific Railroad Company’s challenge to a California tax rule. He told the U.S. Supreme Court, “I come now to say that the Southern Pacific Railroad Company and its creditors and stockholders are among the ‘persons’ protected by the Fourteenth Amendment.”
  • In offering an argument about the meaning and original intention of the word “person” in the Fourteenth Amendment, Conkling enjoyed a singular authority: he’d served on the Joint Committee on Reconstruction that had drafted the amendment and by 1882 was the lone member of that committee still living. With no one alive to contradict him, Conkling assured the court that the committee had specifically rejected the word “citizen” in favor of “person” in order to include corporations.
  • Much evidence suggests, however, that Conkling was lying. The record of the deliberations of the Joint Committee on Reconstruction does not support his argument regarding the committee’s original intentions, nor is it plausible that between 1866 and 1882, the framers of the Fourteenth Amendment had kept mysteriously hidden their secret intention to guarantee equal protection and due process to corporations.
  • In 1886, when another railroad case, Santa Clara County v. Southern Pacific Railroad, reached the Supreme Court, the court’s official recorder implied that the court had accepted the doctrine that “corporations are persons within the meaning of the Fourteenth Amendment.”62 After that, the Fourteenth Amendment, written and ratified to guarantee freed slaves equal protection and due process of law, became the chief means by which corporations freed themselves from government regulation.
  • In 1937, Supreme Court Justice Hugo Black would observe, with grim dismay, that, over the course of fifty years, “only one half of one percent of the Fourteenth Amendment cases that came before the court had anything to do with African Americans or former slaves, while over half of the cases were about protecting the rights of corporations.”63 Rights guaranteed to the people were proffered, instead, to corporations.
  • He devised an economic plan that involved abolishing taxes on labor and instead imposing a single tax on land. Tocqueville had argued that democracy in America is made possible by economic equality; people with equal estates will eventually fight for, and win, equal political rights. George agreed. But, like Mary Lease, he thought that financial capitalism was destroying democracy by making economic equality impossible. He saw himself as defending “the Republicanism of Jefferson and the Democracy of Jackson.”72
  • Between 1889 and 1893, the mortgages on so many farms were foreclosed that 90 percent of farmland fell into the hands of bankers. The richest 1 percent of Americans owned 51 percent of the nation’s wealth, and the poorest 44 percent owned less than 2 percent.
  • For all its passionate embrace of political equality and human rights and its energetic championing of suffrage, the People’s Party rested on a deep and abiding commitment to exclude from full citizenship anyone from or descended from anyone from Africa or Asia.
  • Many of the reforms proposed by populists had the effect of diminishing the political power of blacks and immigrants. Chief among them was the Australian ballot, more usually known as the secret ballot, which, by serving as a de facto literacy test, disenfranchised both black men in the rural South and new immigrants in northern cities.
  • States deliberated at length over the secret ballot. Quickest to adopt the reform were the states of the former Confederacy, where the reform appealed to legislatures eager to find legal ways to keep black men from voting. In 1890, Mississippi held a constitutional convention.
  • Both by law and by brute force, southern legislators, state by state, and poll workers, precinct by precinct, denied black men the right to vote. In Louisiana, black voter registration dropped from 130,000 in 1898 to 5,300 in 1908, and to 730 in 1910. In 1893, Arkansas Democrats celebrated their electoral advantage by singing:
        The Australian ballot works like a charm
        It makes them think and scratch
        And when a Negro gets a ballot
        He has certainly met his match.82
  • One Republican said, “I felt that Bryan was the first politician I had ever heard speak the truth and nothing but the truth,” even though in every case, when he read a transcript of the speech in the newspaper the next day, he “disagreed with almost all of it.”85
  • In 1894, Bryan tacked an income tax amendment onto a tariff bill, which managed to pass. But the populist victory—a 2 percent federal income tax that applied only to Americans who earned more than $4,000—didn’t last long. The next year, in Pollock v. Farmers’ Loan and Trust Company, the Supreme Court ruled 5–4 that the tax was a direct tax, and therefore unconstitutional, one justice calling the tax the first campaign in “a war of the poor against the rich.”
  • POPULISM ENTERED AMERICAN politics at the end of the nineteenth century, and it never left. It pitted “the people,” meaning everyone but the rich, against corporations, which fought back in the courts by defining themselves as “persons”; and it pitted “the people,” meaning white people, against nonwhite people who were fighting for citizenship and whose ability to fight back in the courts was far more limited, since those fights require well-paid lawyers.
  • After 1859, and the Origin of Species, the rise of Darwinism contributed to the secularization of the university, as did the influence of the German educational model, in which universities were divided into disciplines and departments, each with a claim to secular, and especially scientific, expertise. These social sciences—political science, economics, sociology, and anthropology—used the methods of science, and especially of quantification, to study history, government, the economy, society, and culture.96
  • For Wilson’s generation of political scientists, the study of the state replaced the study of the people. The erection of the state became, in their view, the greatest achievement of civilization. The state also provided a bulwark against populism. In the first decades of the twentieth century, populism would yield to progressivism as urban reformers applied the new social sciences to the study of political problems, to be remedied by the intervention of the state.
  • The rise of populism and the social sciences reshaped the press, too. In the 1790s, the weekly partisan newspaper produced the two-party system. The penny press of the 1830s produced the popular politics of Jacksonian democracy. And in the 1880s and 1890s the spirit of populism and the empiricism of the social sciences drove American newspapers to a newfound obsession with facts.
  • The newspapers of the 1880s and 1890s were full of stunts and scandals and crusades, even as they defended their accuracy. “Facts, facts piled up to the point of dry certitude was what the American people really wanted,” wrote the reporter Ray Stannard Baker. Julius Chambers said that writing for the New York Herald involved “Facts; facts; nothing but facts. So many peas at so much a peck; so much molasses at so much a quart.”
  • Ballot reform, far from keeping money out of elections, had ushered more money into elections, along with a new political style: using piles of money to sell a candidate’s personality, borrowing from the methods of business by using mass advertising and education, slogans and billboards. McKinley ran a new-style campaign; Bryan ran an old-style campaign. Bryan barnstormed all over the country: he gave some six hundred speeches to five million people in twenty-seven states and traveled nearly twenty thousand miles.
  • But McKinley’s campaign coffers were fuller: Republicans spent $7 million; Democrats, $300,000. John D. Rockefeller alone provided the GOP with a quarter of a million dollars. McKinley’s campaign manager, Cleveland businessman Mark Hanna, was nearly buried in donations from fellow businessmen. He used that money to print 120 million pieces of campaign literature. He hired fourteen hundred speakers to stump for McKinley; dubbing the populists Popocrats, they agitated voters to a state of panic.108 As Mary Lease liked to say, money elected McKinley.
  • Turner, born in Wisconsin in 1861, was one of the first Americans to receive a doctorate in history. At the exposition, he delivered his remarks before the American Historical Association, an organization that had been founded in 1884 and incorporated by an act of Congress in 1889 “for the promotion of historical studies, the collection and preservation of historical manuscripts and for kindred purposes in the interest of American history and of history in America.”110
  • like journalists, historians borrowed from the emerging social sciences, relying on quantitative analysis to understand how change happens. Where George Bancroft, in his History of the United States, had looked for explanations in the hand of providence, Frederick Jackson Turner looked to the census.
  • The difference between Turner’s methods and Bancroft’s signaled a profound shift in the organization of knowledge, one that would have lasting consequences for the relationship between the people and the state and for civil society itself. Like Darwinism, the rise of the social sciences involved the abdication of other ways of knowing, and, indirectly, contributed to the rise of fundamentalism.
  • Across newly defined academic disciplines, scholars abandoned the idea of mystery—the idea that there are things known only by God—in favor of the claim to objectivity, a development sometimes called “the disenchantment of the world.”111 When universities grew more secular, religious instruction became confined to divinity schools and theological seminaries.
  • A theologian at the University of Chicago’s divinity school defined modernism as “the use of scientific, historical, and social methods in understanding and applying evangelical Christianity to the needs of living persons.”112 Increasingly, this is exactly what evangelicals who eventually identified themselves as fundamentalists found objectionable.
  • Influenced by both Jefferson and Darwin, Turner saw the American frontier as the site of political evolution, beginning with the “savages” of a “wilderness,” proceeding to the arrival of European traders, and continuing through various forms of settlement, through the establishment of cities and factories, “the evolution of each into a higher stage,” and culminating in the final stage of civilization: capitalism and democracy.114
  • “American democracy is fundamentally the outcome of the experiences of the American people in dealing with the West,” by which he meant the experience of European immigrants to the United States in defeating its native peoples, taking possession of their homelands, and erecting there a civilization of their own. This, for Turner, was the story of America and the lesson of American history: evolution.116
  • Douglass, who, as the former U.S. ambassador to Haiti, had represented the nation of Haiti at the Haitian pavilion, was the only eminent African American with a role at the fair, whose program had been planned by a board of 208 commissioners, all white.117 There were, however, black people at the fair: on display. In the Hall of Agriculture, old men and women, former slaves, sold miniature bales of cotton, souvenirs, while, in a series of exhibits intended to display the Turnerian progress of humankind from savagery to civilization, black Americans were posed in a fake African village. “As if to shame the Negro,” Douglass wrote, they “exhibit the Negro as a repulsive savage.”118
  • “A ship at anchor, with halliards broken, sails mildewed, hull empty, her bottom covered with sea-weed and barnacles, meets no resistance,” Douglass said that day, turning the idea of a ship of state to the problem of Jim Crow. “But when she spread her canvas to the breeze and sets out on her voyage, turns prow to the open sea, the higher shall be her speed, the greater shall be her resistance. And so it is with the colored man.”
  • He paused to allow his listeners to conjure the scene, and its meaning, of a people struggling against the sea. “My dear young friends,” Douglass closed. “Accept the inspiration of hope. Imitate the example of the brave mariner, who, amid clouds and darkness, amid hail, rain and storm bolts, battles his way against all that the sea opposes to his progress and you will reach the goal of your noble ambition in safety.”124
  • The majority in Plessy v. Ferguson asserted that separation and equality were wholly separate ideas. “We consider the underlying fallacy of the plaintiff’s argument to consist in the assumption that the enforced separation of the two races stamps the colored race with a badge of inferiority. If this be so, it is not by reason of anything found in the act, but solely because the colored race chooses to put that construction upon it.” The resulting legal principle—that public accommodations could be “separate but equal”—would last for more than half a century.
  • The sole dissenter, John Marshall Harlan, objecting to the establishment of separate classes of citizens, insisted that the achievement of the United States had been the establishment, by amendment, of a Constitution that was blind to race. “Our constitution is color-blind, and neither knows nor tolerates classes among citizens,” Harlan wrote, and it is therefore a plain violation of the Constitution “for a state to regulate the enjoyment by citizens of their civil rights solely upon the basis of race.”
  • What all these laws had in common, Harlan argued, was that they were based on race. And yet a war had been fought and won to establish that laws in the United States could not be based on race; nor could citizenship be restricted by race. The court’s opinion in Plessy, Harlan warned, was so dreadfully in error as to constitutional principles that “the judgment this day rendered will, in time, prove to be quite as pernicious as the decision made by this tribunal in the Dred Scott Case.”128 This prediction proved true.
  • Four centuries had passed since continents, separated by oceans, had met again. A century had passed since Jefferson had declared all men equal. Three decades had passed since the Fourteenth Amendment had declared all persons born or naturalized in the United States to be citizens.
  • And now the Supreme Court ruled that those who would set aside equality in favor of separation had not violated the nation’s founding truths. In one of the most wrenching tragedies in American history—a chronicle not lacking for tragedy—the Confederacy had lost the war, but it had won the peace.
  • Lippmann started out as a socialist, when even mentioning the masses hinted at socialism; The Masses was the name of a socialist monthly, published in New York, and, especially after the Russian Revolution of 1917, which brought the Bolshevists to power (“bol’shinstvo” means “the majority”), “the masses” sounded decidedly Red.
  • But Lippmann soon began to write about the masses as “the bewildered herd,” unthinking and instinctual, and as dangerous as an impending stampede. For Lippmann, and for an entire generation of intellectuals, politicians, journalists, and bureaucrats who styled themselves Progressives—the term dates to 1910—the masses posed a threat to American democracy.
  • This change was wrought in the upheaval of the age. In the years following the realigning election of 1896, everything seemed, suddenly, bigger than before, more crowded, and more anonymous: looming and teeming. Even buildings were bigger: big office buildings, big factories, big mansions, big museums. Quantification became the only measure of value: how big, how much, how many.
  • To fight monopolies, protect the people, and conserve the land, the federal government grew bigger, too; dozens of new federal agencies were founded in this era,
  • “Mass” came to mean anything that involved a giant and possibly terrifying quantity, on a scale so great that it overwhelmed existing arrangements—including democracy. “Mass production” was coined in the 1890s, when factories got bigger and faster, when the number of people who worked in them skyrocketed, and when the men who owned them got staggeringly rich.
  • “Mass migration” dates to 1901, when nearly a million immigrants were entering the United States every year, “mass consumption” to 1905, “mass consciousness” to 1912. “Mass hysteria” had been defined by 1925 and “mass communication” by 1927, when the New York Times described the radio as “a system of mass communication with a mass audience.”3
  • And the masses themselves? They formed a mass audience for mass communication and had a tendency, psychologists believed, to mass hysteria—the political stampede—posing a political problem unanticipated by James Madison and Thomas Jefferson,
  • To meet that challenge in what came to be called the Progressive Era, activists, intellectuals, and politicians campaigned for and secured far-reaching reforms that included municipal, state, and federal legislation.
  • Their most powerful weapon was the journalistic exposé. Their biggest obstacle was the courts, which they attempted to hurdle by way of constitutional amendments. Out of these campaigns came the federal income tax, the Federal Reserve Bank, the direct election of U.S. senators, presidential primaries, minimum-wage and maximum-hour laws, women’s suffrage, and Prohibition.
  • And all of what Progressives accomplished in the management of mass democracy was vulnerable to the force that so worried the unrelenting Walter Lippmann: the malleability of public opinion, into mass delusion.
  • Progressives championed the same causes as Populists, and took their side in railing against big business, but while Populists generally wanted less government, Progressives wanted more, seeking solutions in reform legislation and in the establishment of bureaucracies, especially government agencies.6
  • Populists believed that the system was broken; Progressives believed that the government could fix it. Conservatives, who happened to dominate the Supreme Court, didn’t believe that there was anything to fix but believed that, if there was, the market would fix it. Notwithstanding conservatives’ influence in the judiciary, Progressivism spanned both parties.
  • Woodrow Wilson himself admitted, “When I sit down and compare my views with those of a Progressive Republican I can’t see what the difference is.”7
  • Much that was vital in Progressivism grew out of Protestantism, and especially out of a movement known as the Social Gospel, adopted by almost all theological liberals and by a large number of theological conservatives,
  • The Social Gospel movement was led by seminary professors—academic theologians who accepted the theory of evolution, seeing it as entirely consistent with the Bible and evidence of a divinely directed, purposeful universe; at the same time, they fiercely rejected the social Darwinism of writers like Herbert Spencer, the English natural scientist who coined the phrase “the survival of the fittest” and used the theory of evolution to defend all manner of force, violence, and oppression.
  • argued that fighting inequality produced by industrialism was an obligation of Christians: “We must make men believe that Christianity has a right to rule this kingdom of industry, as well as all the other kingdoms of this world.”9 Social Gospelers brought the zeal of abolitionism to the problem of industrialism.
  • In 1908, Methodists wrote a Social Creed and pledged to fight to end child labor and to promote a living wage. It was soon adopted by the thirty-three-member Federal Council of Churches, which proceeded to investigate a steelworkers’ strike in Bethlehem, ultimately taking the side of the strikers.10
  • Washington, in the debate over the annexation of the Philippines, Americans revisited unsettled questions about expansion that had rent the nation during the War with Mexico and unsettled questions about citizenship that remained the unfinished business of Reconstruction. The debate also marked the limits of the Progressive vision: both sides in this debate availed themselves, at one time or another, of the rhetoric of white supremacy. Eight million people of color in the Pacific and the Caribbean, from the Philippines to Puerto Rico, were now part of the United States, a nation that already, in practice, denied the right to vote to millions of its own people because of the color of their skin.
  • “You are undertaking to annex and make a component part of this Government islands inhabited by ten millions of the colored race, one-half or more of whom are barbarians of the lowest type,” said Ben Tillman, a one-eyed South Carolina Democrat who’d boasted of having killed black men and expressed his support for lynch mobs. “It is to the injection into the body politic of the United States of that vitiated blood, that debased and ignorant people, that we object.”
  • Tillman reminded Republicans that they had not so long ago freed slaves and then “forced on the white men of the South, at the point of the bayonet, the rule and domination of those ex-slaves. Why the difference? Why the change? Do you acknowledge that you were wrong in 1868?”14
  • The war that began in Cuba in 1898 and was declared over in the Philippines in 1902 dramatically worsened conditions for people of color in the United States, who faced, at home, a campaign of terrorism. Pro-war rhetoric, filled with racist venom, only further incited American racial hatreds. “If it is necessary, every Negro in the state will be lynched,” the governor of Mississippi pledged in 1903.
  • By one estimate, someone in the South was hanged or burned alive every four days. The court’s decision in Plessy v. Ferguson meant that there was no legal recourse to fight segregation, which grew more brutal with each passing year.
  • Nor was discrimination confined to the South. Cities and counties in the North and West passed racial zoning laws, banning blacks from middle-class communities. In 1890, in Montana, blacks lived in all fifty-six counties in the state; by 1930, they’d been confined to just eleven. In Baltimore, blacks couldn’t buy houses on blocks where whites were a majority.
  • In 1917, in Buchanan v. Warley, the Supreme Court availed itself of the Fourteenth Amendment not to guarantee equal protection for blacks but to guarantee what the court had come to understand as the “liberty of contract”—the liberty of businesses to discriminate.16
  • A generation earlier, he’d have become a preacher, like his father, but instead he became a professor of political science.23 In the academy and later in the White House, he dedicated himself to the problem of adapting a Constitution written in the age of the cotton gin to the age of the automobile.
  • “We have grown more and more inclined from generation to generation to look to the President as the unifying force in our complex system, the leader both of his party and of the nation. To do so is not inconsistent with the actual provisions of the Constitution; it is only inconsistent with a very mechanical theory of its meaning and intention.” A president’s power, Wilson concluded, is virtually limitless: “His office is anything he has the sagacity and force to make it.”24
  • the U.S. Supreme Court overruled much Progressive labor legislation. The most important of these decisions came in 1905. In a 5–4 decision in Lochner v. New York, the U.S. Supreme Court voided a state law establishing that bakers could work no longer than ten hours a day, six days a week, on the ground that the law violated a business owner’s liberty of contract, the freedom to forge agreements with his workers, something the court’s majority said was protected under the Fourteenth Amendment.
  • The laissez-faire conservatism of the court was informed, in part, by social Darwinism, which suggested that the parties in disputes should be left to battle it out, and if one side had an advantage, even so great an advantage as a business owner has over its employees, then it should win.
  • In a dissenting opinion in Lochner, Oliver Wendell Holmes accused the court of violating the will of the people. “This case is decided upon an economic theory which a large part of the country does not entertain,” he began. The court, he said, had also wildly overreached its authority and had carried social Darwinism into the Constitution. “A Constitution is not intended to embody a particular economic theory,” Holmes wrote. “The Fourteenth Amendment does not enact Mr. Herbert Spencer’s Social Statics.”
  • Wilson pointed out that the Constitution, written before mass industrialization, couldn’t be expected to have anticipated it, and couldn’t solve the problems industrialization had created, unless the Constitution were treated like a living thing that, like an organism, evolved.
  • Critics further to the left argued that the courts had become an instrument of business interests. Unions, in fact, often failed to support labor reform legislation, partly because they expected it to be struck down by the courts as unconstitutional, and partly because they wanted unions to provide benefits to their members, which would be an argument for organizing.
  • conservatives insisted that the courts were right to protect the interests of business and that either market forces would find a way to care for sick, injured, and old workers, or (for social Darwinists) the weakest, who were not meant to thrive, would wither and die.
  • “No other social movement in modern economic development is so pregnant with benefit to the public,” wrote the editor of the Journal of the American Medical Association. “At present the United States has the unenviable distinction of being the only great industrial nation without compulsory health insurance,” the Yale economist Irving Fisher pointed out in 1916.36 It would maintain that unenviable distinction for a century.
  • In California, the legislature passed a constitutional amendment providing for universal health insurance. But when it was put on the ballot for ratification, a federation of insurance companies took out an ad in the San Francisco Chronicle warning that it “would spell social ruin in the United States.” Every voter in the state received in the mail a pamphlet with a picture of the kaiser and the words “Born in Germany. Do you want it in California?” The measure was defeated. Opponents called universal health insurance “UnAmerican, Unsafe, Uneconomic, Unscientific, Unfair and Unscrupulous.”
  • “Scientific management has no place for a bird that can sing and won’t sing,” answered Taylor. “We are not . . . dealing with horses nor singing birds,” Wilson told Taylor. “We are dealing with men who are a part of society and for whose benefit society is organized.
  • Jim Crow thrived because, after the end of Reconstruction in 1877, reformers who had earlier fought for the cause of civil rights abandoned it for the sake of forging a reunion between the states and the federal government and between the North and the South. This wasn’t Wilson’s doing; this was the work of his generation, the work of the generation that came before him, and the work of the generation that would follow him, an abdication of struggle, an abandonment of justice.
  • War steered the course of American politics like a gale-force wind. The specter of slaughter undercut Progressivism, suppressed socialism, and produced anticolonialism. And, by illustrating the enduring wickedness of humanity and appearing to fulfill prophecies of apocalypse as a punishment for the moral travesty of modernism, the war fueled fundamentalism.
  • Bryan’s difficulty was that he saw no difference between Darwinism and social Darwinism, but it was social Darwinism that he attacked, the brutality of a political philosophy that seemed to believe in nothing more than the survival of the fittest, or what Bryan called “the law of hate—the merciless law by which the strong crowd out and kill the weak.”77
  • Germany was the enemy, the same Germany whose model of education had secularized American colleges and universities, which were now teaching eugenics, sometimes known as the science of human betterment, calling for the elimination from the human race of people deemed unfit to reproduce on the basis of their intelligence, criminality, or background.
  • Nor was this academic research without consequence. Beginning in 1907, with Indiana, two-thirds of American states passed forced sterilization laws.
  • In 1916, Madison Grant, the president of the Museum of Natural History in New York, who had degrees from Yale and Columbia, published The Passing of the Great Race; Or, the Racial Basis of European History, a “hereditary history” of the human race, in which he identified northern Europeans (the “blue-eyed, fair-haired peoples of the north of Europe” that he called the “Nordic race”) as genetically superior to southern Europeans (the “dark-haired, dark-eyed” people he called “the Alpine race”) and lamented the presence of “swarms of Jews” and “half-breeds.” In the United States, Grant argued, the Alpine race was overwhelming the Nordic race, threatening the American republic, since “democracy is fatal to progress when two races of unequal value live side by side.”79
  • fundamentalists were, of course, making an intellectual argument, if one that not many academics wanted to hear. In 1917, William B. Riley, who, like J. Frank Norris, had trained at the Southern Baptist Theological Seminary, published a book called The Menace of Modernism, whose attack on evolution included a broader attack on the predominance in public debate of liberal faculty housed at secular universities—and the silencing of conservative opinion.
  • The horror of the war fueled the movement, convincing many evangelicals that the growing secularization of society was responsible for this grotesque parade of inhumanity: mass slaughter. “The new theology has led Germany into barbarism,” one fundamentalist argued in 1918, “and it will lead any nation into the same demoralization.”
  • “If my re-election as President depends upon my getting into war, I don’t want to be President,” Wilson said privately. “He kept us out of war” became his campaign slogan, and when Theodore Roosevelt called that an “ignoble shirking of responsibility,” Wilson countered, “I am an American, but I do not believe that any of us loves a blustering nationality.”
  • Wilson had in fact pledged not to make the world democratic, or even to support the establishment of democratic institutions everywhere, but instead to establish the conditions of stability in which democracy was possible.
  • nearly five million were called to serve. How were they to be persuaded of the war’s cause? In a speech to new recruits, Wilson’s new secretary of state, Robert Lansing, ventured an explanation. “Were every people on earth able to express their will, there would be no wars of aggression and, if there were no wars of aggression, then there would be no wars, and lasting peace would come to this earth,” Lansing said, stringing one conditional clause after another. “The only way that a people can express their will is through democratic institutions,” Lansing went on. “Therefore, when the world is made safe for democracy . . . universal peace will be an accomplished fact.”88
  • Wilson, the political scientist, tried to earn the support of the American people with an intricate theory of the relationship between democracy and peace. It didn’t work. To recast his war message and shore up popular support, he established a propaganda department,
  • Social scientists called the effect produced by wartime propaganda “herd psychology”; the philosopher John Dewey called it the “conscription of thought.”89
  • To suppress dissent, Congress passed a Sedition Act in 1918. Not since the Alien and Sedition Acts of 1798 had Congress so brazenly defied the First Amendment. Fewer than two dozen people had been arrested under the 1798 Sedition Act. During the First World War, the Justice Department charged more than two thousand Americans with sedition and convicted half of them. Appeals that went to the Supreme Court failed.
  • “If we want real peace,” Du Bois wrote, “we must extend the democratic ideal to the yellow, brown, and black peoples.” But after the United States entered the war, Creel called thirty-one black editors and publishers to a conference in Washington and warned them about “Negro subversion.”
  • Du Bois asked black men who could not vote in the United States to give their lives to make the world “safe for democracy” and asked black people to hold off on fighting against lynchings, whose numbers kept rising.91
  • Wilson signed a tax bill, raising taxes on incomes, doubling a tax on corporate earnings, eliminating an exemption for dividend income, and introducing an estate tax and a tax on excess profits. Rates for the wealthiest Americans rose from 2 percent to 77, but most people paid no tax at all (80 percent of the revenue was drawn from the income of the wealthiest 1 percent of American families).
  • War, as ever, expanded the powers of the state. It rearranged the relationship between the federal government and business, establishing new forms of cooperation, oversight, and regulation that amounted to erecting a welfare state for business owners.
  • As the war drew to a close, the reckoning began. American losses were almost trivial compared to the staggering losses in European nations. Against America’s 116,000 casualties, France lost 1.6 million lives, Britain 800,000, and Germany 1.8 million. Cities across Europe lay in ashes; America was untouched. Europe, composed of seventeen countries before the war, had splintered into twenty-six, all of them deeply in debt, and chiefly to Americans.
  • Before the war, Americans owed $3.7 billion to foreigners; after the war, foreigners owed $12.6 billion to Americans. Even the terrifying influenza epidemic of 1918, which took 21 million lives worldwide, claimed the lives of only 675,000 Americans. The war left European economies in ruins, America’s thriving. In the United States, steel production rose by a quarter between 1913 and 1920; everywhere else, it fell by a third.98 The Armistice came on November
  • Wilson left a lasting legacy: his rhetoric of self-determination contributed to a wave of popular protests in the Middle East and Asia, including a revolution in Egypt in 1919; made the nation-state the goal of stateless societies; and lies behind the emergence and force of anticolonial nationalism.100
  • Thirty black men were lynched in 1917, twice as many the next year, and in 1919, seventy-six, including ten veterans, some still wearing their uniforms, having fought, some people thought, the wrong war.101
  • IN 1922, when Walter Lippmann turned thirty-two, he wrote a book called Public Opinion, in which he concluded that in a modern democracy the masses, asked to make decisions about matters far removed from their direct knowledge, had been asked to do too much. “Decisions in a modern state tend to be made by the interaction, not of Congress and the executive, but of public opinion and the executive,” he’d once observed.108 Mass democracy can’t work, Lippmann argued, because the new tools of mass persuasion—especially mass advertising—meant that a tiny minority could very easily persuade the majority to believe whatever it wished them to believe.
  • The best hope for mass democracy might have seemed to be the scrupulously and unfailingly honest reporting of news, but this, Lippmann thought, was doomed to fall short, because of the gap between facts and truth.
  • Reporters chronicle events, offering facts, but “they cannot govern society by episodes, incidents, and eruptions,” he said.109 To govern, the people need truth, sense out of the whole, but people can’t read enough in the morning paper or hear enough on the evening news to turn facts into truth when they’re driven like dray horses all day.
Javier E

How did Neanderthals and other ancient humans learn to count? - 0 views

  • Rafael Núñez, a cognitive scientist at the University of California, San Diego, and one of the leaders of QUANTA, accepts that many animals might have an innate appreciation of quantity. However, he argues that the human perception of numbers is typically much more sophisticated, and can’t have arisen through a process such as natural selection. Instead, many aspects of numbers, such as the spoken words and written signs that are used to represent them, must be produced by cultural evolution — a process in which individuals learn through imitation or formal teaching to adopt a new skill (such as how to use a tool).
  • Although many animals have culture, one that involves numbers is essentially unique to humans. A handful of chimpanzees have been taught in captivity to use abstract symbols to represent quantities, but neither chimps nor any other non-human species use such symbols in the natural world.
  • during excavations at Border Cave in South Africa, archaeologists discovered an approximately 42,000-year-old baboon fibula that was also marked with notches. D’Errico suspects that anatomically modern humans living there at the time used the bone to record numerical information. In the case of this bone, microscopic analysis of its 29 notches suggests they were carved using four distinct tools and so represent four counting events, which D’Errico thinks took place on four separate occasions1.
  • D’Errico has developed a scenario to explain how number systems might have arisen through the very act of producing such artefacts. His hypothesis is one of only two published so far for the prehistoric origin of numbers.
  • It all started by accident, he suggests, as early hominins unintentionally left marks on bones while they were butchering animal carcasses. Later, the hominins made a cognitive leap when they realized that they could deliberately mark bones to produce abstract designs — such as those seen on an approximately 430,000-year-old shell found in Trinil, Indonesia6. At some point after that, another leap occurred: individual marks began to take on meaning, with some of them perhaps encoding numerical information
  • The Les Pradelles hyena bone is potentially the earliest known example of this type of mark-making, says D’Errico. He thinks that with further leaps, or what he dubs cultural exaptations, such notches eventually led to the invention of number signs such as 1, 2 and 37.
  • Overmann has developed her own hypothesis to explain how number systems might have emerged in prehistory — a task made easier by the fact that a wide variety of number systems are still in use around the world. For example, linguists Claire Bowern and Jason Zentz at Yale University in New Haven, Connecticut, reported in a 2012 survey that 139 Aboriginal Australian languages have an upper limit of ‘three’ or ‘four’ for specific numerals. Some of those languages use natural quantifiers such as ‘several’ and ‘many’ to indicate higher values
  • There is even one group, the Pirahã people of the Brazilian Amazon, that is sometimes claimed not to use numbers at all10.
  • In a 2013 study11, Overmann analysed anthropological data relating to 33 contemporary hunter-gatherer societies across the world. She discovered that those with simple number systems (an upper limit not much higher than ‘four’) often had few material possessions, such as weapons, tools or jewellery. Those with elaborate systems (an upper numeral limit much higher than ‘four’) always had a richer array of possessions.
  • In societies with complex number systems, there were clues to how those systems developed. Significantly, Overmann noted that it was common for these societies to use quinary (base 5), decimal or vigesimal (base 20) systems. This suggested to her that many number systems began with a finger-counting stage.
  • This finger-counting stage is important, according to Overmann. She is an advocate of material engagement theory (MET), a framework devised about a decade ago by cognitive archaeologist Lambros Malafouris at the University of Oxford, UK12. MET maintains that the mind extends beyond the brain and into objects, such as tools or even a person’s fingers. This extension allows ideas to be realized in physical form; so, in the case of counting, MET suggests that the mental conceptualization of numbers can include the fingers. That makes numbers more tangible and easier to add or subtract.
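The quinary, decimal, and vigesimal systems mentioned above differ only in their base; a minimal sketch (the function name and sample value are illustrative, not from the study) shows how one quantity is written in each:

```python
def to_base(n, base):
    """Return the digits of n, most significant first, in the given base."""
    if n == 0:
        return [0]
    digits = []
    while n > 0:
        n, remainder = divmod(n, base)
        digits.append(remainder)
    return digits[::-1]

# The same count of 23 objects in the three bases discussed above.
print(to_base(23, 5))   # quinary:   [4, 3]
print(to_base(23, 10))  # decimal:   [2, 3]
print(to_base(23, 20))  # vigesimal: [1, 3]
```

A base-5 or base-20 system recycles digit names after one hand, or after all fingers and toes, which is why these bases are read as evidence of a finger-counting stage.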
  • The societies that moved beyond finger-counting did so, argues Overmann, because they developed a clearer social need for numbers. Perhaps most obviously, a society with more material possessions has a greater need to count (and to count much higher than ‘four’) to keep track of objects.
  • An artefact such as a tally stick also becomes an extension of the mind, and the act of marking tally notches on the stick helps to anchor and stabilize numbers as someone counts.
  • some societies moved beyond tally sticks. This first happened in Mesopotamia around the time when cities emerged there, creating an even greater need for numbers to keep track of resources and people. Archaeological evidence suggests that by 5,500 years ago, some Mesopotamians had begun using small clay tokens as counting aids.
  • Overmann acknowledges that her hypothesis is silent on one issue: when in prehistory human societies began developing number systems. Linguistics might offer some help here. One line of evidence suggests that number words could have a history stretching back at least tens of thousands of years.
  • Evolutionary biologist Mark Pagel at the University of Reading, UK, and his colleagues have spent many years exploring the history of words in extant language families, with the aid of computational tools that they initially developed to study biological evolution. Essentially, words are treated as entities that either remain stable or are outcompeted and replaced as languages spread and diversify.
  • Using this approach, Pagel and Andrew Meade at Reading showed that low-value number words (‘one’ to ‘five’) are among the most stable features of spoken languages14. Indeed, they change so infrequently across language families — such as the Indo-European family, which includes many modern European and southern Asian languages — that they seem to have been stable for anywhere between 10,000 and 100,000 years.
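One way to see how a replacement rate translates into stability over such long spans is a simple exponential survival model; the rates below are assumptions chosen for illustration, not values from Pagel and Meade's analysis:

```python
import math

# Illustrative replacement rates (replacements per 1,000 years); these
# numbers are assumptions, not data from the study.
rates = {"low-value numeral": 0.01, "typical content word": 0.2}

for word_class, rate in rates.items():
    # Under a memoryless (exponential) replacement model,
    # the expected half-life is ln(2) / rate.
    half_life_kyr = math.log(2) / rate
    print(f"{word_class}: half-life ~ {half_life_kyr:.0f} thousand years")
```

Even a twentyfold difference in replacement rate is enough to separate words that turn over within a few millennia from words that persist for tens of thousands of years, the pattern reported for low-value number words.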
Javier E

With a Big If, Science Panel Finds Deep Cuts Possible in U.S. Vehicle Emissions and Oil... - 0 views

  • deep cuts in oil use and emissions of greenhouse gases from cars and light trucks are possible in the United States by 2050, but only with a mix of diverse and intensified research and policies far stronger than those pursued so far by the Obama administration.
  • by the year 2050, the U.S. may be able to reduce petroleum consumption and greenhouse gas emissions by 80 percent for light-duty vehicles -- cars and small trucks -- via a combination of more efficient vehicles; the use of alternative fuels like biofuels, electricity, and hydrogen; and strong government policies to overcome high costs and influence consumer choices.
  • "In addition, alternative fuels to petroleum must be readily available, cost-effective and produced with low emissions of greenhouse gases.  Such a transition will be costly and require several decades.
  •  The committee's model calculations, while exploratory and highly uncertain, indicate that the benefits of making the transition, i.e. energy cost savings, improved vehicle technologies, and reductions in petroleum use and greenhouse gas emissions, exceed the additional costs of the transition over and above what the market is willing to do voluntarily."
  • Improving the efficiency of conventional vehicles is, up to a point, the most economical and easiest-to-implement approach to saving fuel and lowering emissions, the report says.  This approach includes reducing work the engine must perform -- reducing vehicle weight, aerodynamic resistance, rolling resistance, and accessories -- plus improving the efficiency of the internal combustion engine powertrain.
  • Improved efficiency alone will not meet the 2050 goals, however.  The average fuel economy of vehicles on the road would have to exceed 180 mpg, which, the report says, is extremely unlikely with current technologies.  Therefore, the study committee also considered other alternatives for vehicles and fuels, including:
  • Although driving costs per mile will be lower, especially for vehicles powered by natural gas or electricity, the high initial purchase cost is likely to be a significant barrier to widespread consumer acceptance
  • Wide consumer acceptance is essential, however, and large numbers of alternative vehicles must be purchased long before 2050 if the on-road fleet is to meet desired performance goals.  Strong policies and technology advances are critical in overcoming this challenge.
  • While corn-grain ethanol and biodiesel are the only biofuels to have been produced in commercial quantities in the U.S. to date, the study committee found much greater potential in biofuels made from lignocellulosic biomass -- which includes crop residues like wheat straw, switchgrass, whole trees, and wood waste.  This "drop-in" fuel is designed to be a direct replacement for gasoline and could lead to large reductions in both petroleum use and greenhouse gas emissions; it can also be introduced without major changes in fuel delivery infrastructure or vehicles.  The report finds that sufficient lignocellulosic biomass could be produced by 2050 to meet the goal of an 80 percent reduction in petroleum use when combined with highly efficient vehicles
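The report's claim that efficiency alone would require a fleet average above 180 mpg follows from simple arithmetic. A rough sanity check, assuming a current light-duty fleet average of about 36 mpg (a figure not given in the excerpt) and constant miles driven:

```python
# Back-of-envelope check of the "exceed 180 mpg" figure.
# Assumption (not from the excerpt): current fleet averages ~36 mpg,
# and total vehicle-miles traveled stay constant through 2050.
current_mpg = 36          # assumed current fleet average
reduction_target = 0.80   # 80 percent cut in petroleum use

# Fuel burned per mile scales as 1/mpg, so cutting fuel use by 80%
# at constant mileage requires a 1 / (1 - 0.80) = 5x efficiency gain.
required_multiplier = 1 / (1 - reduction_target)
required_mpg = current_mpg * required_multiplier

print(round(required_multiplier))  # 5
print(round(required_mpg))         # 180
```

The 5x multiplier is why the committee concludes efficiency improvements must be combined with alternative fuels rather than relied on alone.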
Javier E

Equal Opportunity, Our National Myth - NYTimes.com - 1 views

  • Today, the United States has less equality of opportunity than almost any other advanced industrial country. Study after study has exposed the myth that America is a land of opportunity.
  • The Pew Research Center has found that some 90 percent of Americans believe that the government should do everything it can to ensure equality of opportunity.
  • the upwardly mobile American is becoming a statistical oddity. According to research from the Brookings Institution, only 58 percent of Americans born into the bottom fifth of income earners move out of that category, and just 6 percent born into the bottom fifth move into the top
  • Perhaps a hundred years ago, America might have rightly claimed to have been the land of opportunity, or at least a land where there was more opportunity than elsewhere. But not for at least a quarter of a century. Horatio Alger-style rags-to-riches stories were not a deliberate hoax, but given how they’ve lulled us into a sense of complacency, they might as well have been.
  • government support for many state schools has been steadily gutted
  • the life prospects of an American are more dependent on the income and education of his parents than in almost any other advanced country for which there is data.
  • Discrimination, however, is only a small part of the picture. Probably the most important reason for lack of equality of opportunity is education: both its quantity and quality
  • While racial segregation decreased, economic segregation increased. After 1980, the poor grew poorer, the middle stagnated, and the top did better and better. Disparities widened between those living in poor localities and those living in rich suburbs — or rich enough to send their kids to private schools.
  • A result was a widening gap in educational performance — the achievement gap between rich and poor kids born in 2001 was 30 to 40 percent larger than it was for those born 25 years earlier
  • there are other forces at play, some of which start even before birth. Children in affluent families get more exposure to reading and less exposure to environmental hazards. Their families can afford enriching experiences like music lessons and summer camp. They get better nutrition and health care, which enhance their learning, directly and indirectly.
  • the situation is likely to get even worse
  • Economic mobility in the United States is lower than in most of Europe and lower than in all of Scandinavia.
  • students are crushed by giant student loan debts
  • at the same time that a college education is more important than ever for getting a good job.
  • Young people from families of modest means face a Catch-22: without a college education, they are condemned to a life of poor prospects; with a college education, they may be condemned to a lifetime of living at the brink.
  • increasingly even a college degree isn’t enough; one needs either a graduate degree or a series of (often unpaid) internships. Those at the top have the connections and social capital to get those opportunities
  • no one makes it on his or her own. And those at the top get more help from their families than do those lower down on the ladder
  • Without substantial policy changes, our self-image, and the image we project to the world, will diminish — and so will our economic standing and stability. Inequality of outcomes and inequality of opportunity reinforce each other — and contribute to economic weakness,
  • Policies that promote equality of opportunity must target the youngest Americans. First, we have to make sure that mothers are not exposed to environmental hazards and get adequate prenatal health care. Then, we have to reverse the damaging cutbacks to preschool education,
  • The right says that money isn’t the solution. They’ve chased reforms like charter schools and private-school vouchers, but most of these efforts have shown ambiguous results at best. Giving more money to poor schools would help. So would summer and extracurricular programs that enrich low-income students’ skills.
  • it is unconscionable that a rich country like the United States has made access to higher education so difficult for those at the bottom and middle. There are many alternative ways of providing universal access to higher education, from Australia’s income-contingent loan program to the near-free system of universities in Europe.
  • A more educated population yields greater innovation, a robust economy and higher incomes — which mean a higher tax base. Those benefits are, of course, why we’ve long been committed to free public education through 12th grade. But while a 12th-grade education might have sufficed a century ago, it doesn’t today
Javier E

More Guns, Less Crime: The Switzerland Example - Ta-Nehisi Coates - The Atlantic - 0 views

  • Swiss men remain part of the "militia" in reserve capacity until age 30 (age 34 for officers). 
  • Each such individual is required to keep his army-issued personal weapon (the 5.56x45mm Sig 550 rifle for enlisted personnel and/or the 9mm SIG-Sauer P220 semi-automatic pistol for officers, military police, medical and postal personnel) at home. Up until October 2007, a specified personal retention quantity of government-issued personal ammunition (50 rounds 5.56 mm / 48 rounds 9mm) was issued as well, which was sealed and inspected regularly to ensure that no unauthorized use had taken place. The ammunition was intended for use while traveling to the army barracks in case of invasion. In October 2007, the Swiss Federal Council decided that the distribution of ammunition to soldiers would stop and that all previously issued ammunition would be returned. By March 2011, more than 99% of the ammunition had been returned. Only special rapid deployment units and the military police still have ammunition stored at home today.
  •  Switzerland does have a gun culture -- one that is heavily regulated by the government, right down to counting your bullets.
Javier E

Stop climate change: Move to the city, start walking - Salon.com - 0 views

  • electric cars are currently a bit greener than gasoline cars — per mile. Driving one hundred miles in a Nissan Altima results in the emission of 90.5 pounds of greenhouse gases. Driving the same distance in an all-electric Nissan Leaf emits 63.6 pounds of greenhouse gases — a significant improvement. But while the Altima driver pays 14 cents a mile for fuel, the Leaf driver pays less than 3 cents per mile, and this difference, thanks to the law of supply and demand, causes the Leaf driver to drive more.
  • What do you expect when you put people in cars they feel good (or at least less guilty) about driving, which are also cheap to buy and run? Naturally, they drive them more. So much more, in fact, that they obliterate energy gains made by increased fuel efficiency.
  • The real problem with cars is not that they don’t get enough miles per gallon; it’s that they make it too easy for people to spread out, encouraging forms of development that are inherently wasteful and damaging … The critical energy drain in a typical American suburb is not the Hummer in the driveway; it’s everything else the Hummer makes possible — the oversized houses and irrigated yards, the network of new feeder roads and residential streets, the costly and inefficient outward expansion of the power grid, the duplicated stores and schools, the two-hour solo commutes.
  • it turns out that the way we move largely determines the way we live.
  • gadgets cumulatively contribute only a fraction of what we save by living in a walkable neighborhood. It turns out that trading all of your incandescent lightbulbs for energy savers conserves as much carbon per year as living in a walkable neighborhood does each week.
  • “gizmo green”; the obsession with “sustainable” products that often have a statistically insignificant impact on the carbon footprint when compared to our location. And, as already suggested, our location’s greatest impact on our carbon footprint comes from how much it makes us drive.
  • study made it clear that, while every factor counts, none counts more than walkability. Specifically, it showed how, in drivable locations, transportation energy use consistently tops household energy use, in some cases by more than 2.4 to 1. As a result, the most green home (with Prius) in sprawl still loses out to the least green home in a walkable neighborhood.
  • because it’s better than nothing, LEED — like the Prius — is a get-out-of-jail-free card that allows us to avoid thinking more deeply about our larger footprint. For most organizations and agencies, it is enough. Unfortunately, as the transportation planner Dan Malouff puts it, “LEED architecture without good urban design is like cutting down the rainforest using hybrid-powered bulldozers.”
  • 10 to 20 units per acre is the density at which drivable suburbanism transitions into walkable urbanism.
  • “We are a destructive species, and if you love nature, stay away from it. The best means of protecting the environment is to
  • The average New Yorker consumes roughly one-third the electricity of the average Dallas resident, and ultimately generates less than one-third the greenhouse gases of the average American.
  • the American anti-urban ethos remained intact as everything else changed. The desire to be isolated in nature, adopted en masse, led to the quantities and qualities we now call “sprawl,” which somehow mostly manages to combine the traffic congestion of the city with the intellectual culture of the countryside.
  • New York consumes half the gasoline of Atlanta (326 versus 782 gallons per person per year). But Toronto cuts that number in half, as does Sydney — and most European cities use only half as much as those places. Cut Europe’s number in half, and you end up with Hong Kong
  • Paris is one place that has determined that its future depends on reducing its auto dependence. The city has recently decided to create 25 miles of dedicated busways, introduced 20,000 shared city bikes in 1,450 locations, and committed to removing 55,000 parking spaces from the city every year for the next 20 years. These changes sound pretty radical, but they are supported by 80 percent of the population.
  • increasing density from two units per acre to 20 units per acre resulted in about the same savings as the increase from 20 to 200.
  • New York is our densest big city and, not coincidentally, the one with the best transit service. All the other subway stations in America put together would not outnumber the 468 stops of the MTA. In terms of resource efficiency, it’s the best we’ve got.
  • most communities with these densities are also organized as traditional mixed-use, pedestrian-friendly neighborhoods, the sort of accommodating environment that entices people out of their cars. Everything above that is icing on the cake.
  • unless we hit a national crisis of unprecedented severity, it is hard to imagine any argument framed in the language of sustainability causing many people to modify their behavior. So what will?
  • The gold standard of quality-of-life rankings is the Mercer Survey, which carefully compares global cities in the 10 categories of political stability, economics, social quality, health and sanitation, education, public services, recreation, consumer goods, housing, and climate.
  • the top 10 cities always seem to include a bunch of places where they speak German (Vienna, Zurich, Dusseldorf, etc.), along with Vancouver, Auckland, and Sydney. These are all places with compact settlement patterns, good transit, and principally walkable neighborhoods. Indeed, there isn’t a single auto-oriented city in the top 50. The highest-rated American cities in 2010, which don’t appear until No. 31, are Honolulu, San Francisco, Boston, Chicago, Washington, New York, and Seattle.
  • Our cities, which are twice as efficient as our suburbs, burn twice the fuel of these European, Canadian, and Aussie/Kiwi places. Yet the quality of life in these foreign cities is deemed higher than ours by a long shot.
  • if we pollute so much because we are throwing away our time, money, and lives on the highway, then both problems would seem to share a single solution, and that solution is to make our cities more walkable. Doing so is not easy, but it can be done, it has been done,
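The rebound-effect argument in the excerpts above can be checked against the quoted figures. A quick calculation using the Altima and Leaf numbers (the break-even threshold is illustrative, not from the article):

```python
# Per-mile figures quoted in the excerpt: lbs of greenhouse gases
# emitted per 100 miles for each car.
altima_lb_per_mile = 90.5 / 100   # 0.905 lb per mile, gasoline
leaf_lb_per_mile = 63.6 / 100     # 0.636 lb per mile, all-electric

# The Leaf's per-mile emissions advantage (~30%).
advantage = 1 - leaf_lb_per_mile / altima_lb_per_mile

# Extra driving that would erase that advantage entirely: with fuel
# at ~3 cents/mile instead of 14, about 42% more miles wipes it out.
breakeven_extra_miles = altima_lb_per_mile / leaf_lb_per_mile - 1

print(f"{advantage:.0%} lower emissions per mile")
print(f"{breakeven_extra_miles:.0%} more driving erases the advantage")
```

This is the mechanism the author describes: cheap per-mile costs induce enough additional driving to offset much of the per-mile efficiency gain.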
Javier E

The Extraordinary Science of Addictive Junk Food - NYTimes.com - 0 views

  • Today, one in three adults is considered clinically obese, along with one in five kids, and 24 million Americans are afflicted by type 2 diabetes, often caused by poor diet, with another 79 million people having pre-diabetes. Even gout, a painful form of arthritis once known as “the rich man’s disease” for its associations with gluttony, now afflicts eight million Americans.
  • The public and the food companies have known for decades now — or at the very least since this meeting — that sugary, salty, fatty foods are not good for us in the quantities that we consume them. So why are the diabetes and obesity and hypertension numbers still spiraling out of control? It’s not just a matter of poor willpower on the part of the consumer and a give-the-people-what-they-want attitude on the part of the food manufacturers. What I found, over four years of research and reporting, was a conscious effort — taking place in labs and marketing meetings and grocery-store aisles — to get people hooked on foods that are convenient and inexpensive
  • the powerful sensory force that food scientists call “mouth feel.” This is the way a product interacts with the mouth, as defined more specifically by a host of related sensations, from dryness to gumminess to moisture release.
  • the mouth feel of soda and many other food items, especially those high in fat, is second only to the bliss point in its ability to predict how much craving a product will induce.
  • He organized focus-group sessions with the people most responsible for buying bologna — mothers — and as they talked, he realized the most pressing issue for them was time. Working moms strove to provide healthful food, of course, but they spoke with real passion and at length about the morning crush, that nightmarish dash to get breakfast on the table and lunch packed and kids out the door.
  • as the focus swung toward kids, Saturday-morning cartoons started carrying an ad that offered a different message: “All day, you gotta do what they say,” the ads said. “But lunchtime is all yours.”
  • When it came to Lunchables, they did try to add more healthful ingredients. Back at the start, Drane experimented with fresh carrots but quickly gave up on that, since fresh components didn’t work within the constraints of the processed-food system, which typically required weeks or months of transport and storage before the food arrived at the grocery store. Later, a low-fat version of the trays was developed, using meats and cheese and crackers that were formulated with less fat, but it tasted inferior, sold poorly and was quickly scrapped.
  • One of the company’s responses to criticism is that kids don’t eat the Lunchables every day — on top of which, when it came to trying to feed them more healthful foods, kids themselves were unreliable. When their parents packed fresh carrots, apples and water, they couldn’t be trusted to eat them. Once in school, they often trashed the healthful stuff in their brown bags to get right to the sweets.
  • This idea — that kids are in control — would become a key concept in the evolving marketing campaigns for the trays. In what would prove to be their greatest achievement of all, the Lunchables team would delve into adolescent psychology to discover that it wasn’t the food in the trays that excited the kids; it was the feeling of power it brought to their lives.
  • The prevailing attitude among the company’s food managers — through the 1990s, at least, before obesity became a more pressing concern — was one of supply and demand. “People could point to these things and say, ‘They’ve got too much sugar, they’ve got too much salt,’ ” Bible said. “Well, that’s what the consumer wants, and we’re not putting a gun to their head to eat it. That’s what they want. If we give them less, they’ll buy less, and the competitor will get our market. So you’re sort of trapped.”
  • at last count, including sales in Britain, they were approaching the $1 billion mark. Lunchables was more than a hit; it was now its own category
  • he holds the entire industry accountable. “What do University of Wisconsin M.B.A.’s learn about how to succeed in marketing?” his presentation to the med students asks. “Discover what consumers want to buy and give it to them with both barrels. Sell more, keep your job! How do marketers often translate these ‘rules’ into action on food? Our limbic brains love sugar, fat, salt. . . . So formulate products to deliver these. Perhaps add low-cost ingredients to boost profit margins. Then ‘supersize’ to sell more. . . . And advertise/promote to lock in ‘heavy users.’ Plenty of guilt to go around here!”
  • men in the eastern part of Finland had the highest rate of fatal cardiovascular disease in the world. Research showed that this plague was not just a quirk of genetics or a result of a sedentary lifestyle — it was also owing to processed foods. So when Finnish authorities moved to address the problem, they went right after the manufacturers. (The Finnish response worked. Every grocery item that was heavy in salt would come to be marked prominently with the warning “High Salt Content.” By 2007, Finland’s per capita consumption of salt had dropped by a third, and this shift — along with improved medical care — was accompanied by a 75 percent to 80 percent decline in the number of deaths from strokes and heart disease.)
  • I tracked Lin down in Irvine, Calif., where we spent several days going through the internal company memos, strategy papers and handwritten notes he had kept. The documents were evidence of the concern that Lin had for consumers and of the company’s intent on using science not to address the health concerns but to thwart them. While at Frito-Lay, Lin and other company scientists spoke openly about the country’s excessive consumption of sodium and the fact that, as Lin said to me on more than one occasion, “people get addicted to salt
  • the marketing team was joined by Dwight Riskey, an expert on cravings who had been a fellow at the Monell Chemical Senses Center in Philadelphia, where he was part of a team of scientists that found that people could beat their salt habits simply by refraining from salty foods long enough for their taste buds to return to a normal level of sensitivity. He had also done work on the bliss point, showing how a product’s allure is contextual, shaped partly by the other foods a person is eating, and that it changes as people age. This seemed to help explain why Frito-Lay was having so much trouble selling new snacks. The largest single block of customers, the baby boomers, had begun hitting middle age. According to the research, this suggested that their liking for salty snacks — both in the concentration of salt and how much they ate — would be tapering off.
  • Riskey realized that he and his colleagues had been misreading things all along. They had been measuring the snacking habits of different age groups and were seeing what they expected to see, that older consumers ate less than those in their 20s. But what they weren’t measuring, Riskey realized, is how those snacking habits of the boomers compared to themselves when they were in their 20s. When he called up a new set of sales data and performed what’s called a cohort study, following a single group over time, a far more encouraging picture — for Frito-Lay, anyway — emerged. The baby boomers were not eating fewer salty snacks as they aged. “In fact, as those people aged, their consumption of all those segments — the cookies, the crackers, the candy, the chips — was going up,” Riskey said. “They were not only eating what they ate when they were younger, they were eating more of it.” In fact, everyone in the country, on average, was eating more salty snacks than they used to. The rate of consumption was edging up about one-third of a pound every year, with the average intake of snacks like chips and cheese crackers pushing past 12 pounds a year
  • Riskey had a theory about what caused this surge: Eating real meals had become a thing of the past.
  • “We looked at this behavior, and said, ‘Oh, my gosh, people were skipping meals right and left,’ ” Riskey told me. “It was amazing.” This led to the next realization, that baby boomers did not represent “a category that is mature, with no growth. This is a category that has huge growth potential.”
  • The food technicians stopped worrying about inventing new products and instead embraced the industry’s most reliable method for getting consumers to buy more: the line extension.
  • He zeroed right in on the Cheetos. “This,” Witherly said, “is one of the most marvelously constructed foods on the planet, in terms of pure pleasure.” He ticked off a dozen attributes of the Cheetos that make the brain say more. But the one he focused on most was the puff’s uncanny ability to melt in the mouth. “It’s called vanishing caloric density,” Witherly said. “If something melts down quickly, your brain thinks that there’s no calories in it . . . you can just keep eating it forever.”
  • Frito-Lay acquired Stacy’s Pita Chip Company, which was started by a Massachusetts couple who made food-cart sandwiches and started serving pita chips to their customers in the mid-1990s. In Frito-Lay’s hands, the pita chips averaged 270 milligrams of sodium — nearly one-fifth a whole day’s recommended maximum for most American adults — and were a huge hit among boomers.
  • There’s a paradox at work here. On the one hand, reduction of sodium in snack foods is commendable. On the other, these changes may well result in consumers eating more. “The big thing that will happen here is removing the barriers for boomers and giving them permission to snack,” Carey said. The prospects for lower-salt snacks were so amazing, he added, that the company had set its sights on using the designer salt to conquer the toughest market of all for snacks: schools
  • The company’s chips, he wrote, were not selling as well as they could for one simple reason: “While people like and enjoy potato chips, they feel guilty about liking them. . . . Unconsciously, people expect to be punished for ‘letting themselves go’ and enjoying them.” Dichter listed seven “fears and resistances” to the chips: “You can’t stop eating them; they’re fattening; they’re not good for you; they’re greasy and messy to eat; they’re too expensive; it’s hard to store the leftovers; and they’re bad for children.” He spent the rest of his memo laying out his prescriptions, which in time would become widely used not just by Frito-Lay but also by the entire industry.
  • Dichter advised Frito-Lay to move its chips out of the realm of between-meals snacking and turn them into an ever-present item in the American diet. “The increased use of potato chips and other Lay’s products as a part of the regular fare served by restaurants and sandwich bars should be encouraged in a concentrated way,”
  • the largest weight-inducing food was the potato chip. The coating of salt, the fat content that rewards the brain with instant feelings of pleasure, the sugar that exists not as an additive but in the starch of the potato itself — all of this combines to make it the perfect addictive food. “The starch is readily absorbed,” Eric Rimm, an associate professor of epidemiology and nutrition at the Harvard School of Public Health and one of the study’s authors, told me. “More quickly even than a similar amount of sugar. The starch, in turn, causes the glucose levels in the blood to spike” — which can result in a craving for more.
  • If Americans snacked only occasionally, and in small amounts, this would not present the enormous problem that it does. But because so much money and effort has been invested over decades in engineering and then relentlessly selling these products, the effects are seemingly impossible to unwind.
  • Todd Putman, who worked at Coca-Cola from 1997 to 2001, said the goal became much larger than merely beating the rival brands; Coca-Cola strove to outsell every other thing people drank, including milk and water. The marketing division’s efforts boiled down to one question, Putman said: “How can we drive more ounces into more bodies more often?”
Javier E

Do We Really Need to Sleep 7 Hours a Night? - The New York Times - 0 views

  • By some estimates, Americans sleep two to three hours fewer today than they did before the industrial revolution.
  • now a new study is challenging that notion. It found that Americans on average sleep as much as people in three different hunter-gatherer societies where there is no electricity and the lifestyles have remained largely the same for thousands of years.
  • Dr. Siegel said he worried that putting a number on the amount of sleep people require could push those who get less to resort to using sleeping pills, which carry severe side effects. About 5 percent of Americans take sleeping pills, a percentage that has doubled in the past two decades.
  • their daily energy expenditure is about the same as most Americans, suggesting physical activity is not the reason for their relative good health.
  • The prevailing notion in sleep medicine is that humans evolved to go to bed when the sun goes down, and that by and large we stay up much later than we should because we are flooded with artificial light
  • Dr. Siegel and his colleagues found no evidence of this. The hunter-gatherer groups they studied, which slept outside or in crude huts, did not go to sleep when the sun went down. Usually they stayed awake three to four hours past sunset, with no light exposure other than the faint glow of a small fire that would keep animals away and provide a bit of warmth in the winter. Most days they would wake up about an hour before sunrise.
  • In a typical night, they slept just six and a half hours — slightly less than the average American
  • “I think this paper is going to transform the field of sleep,” said John Peever, a sleep expert at the University of Toronto who was not involved in the new research. “It’s difficult to envision how we can claim that Western society is highly sleep deprived if these groups that live without all these modern distractions and pressing schedules sleep less or about the same amount
  • in the new study, the hunter-gatherer societies were found to have a sleep period — meaning the time they were actually in bed — of roughly seven to eight and a half hours, which he said was consistent with his group’s recommendations.
  • the question of how much sleep people require was a delicate one. “Really it’s just the amount that allows people to wake up feeling refreshed and alert,
  • Yet the hunter-gatherers included in the new study, which was published in Current Biology, were relatively fit and healthy despite regularly sleeping amounts that are near the low end of those in industrialized societies.
  • called the new study “excellent and very timely,” and he said it suggests that sleep quality is much more important than quantity.
  • Some historians have also argued that it is not natural for people to sleep straight through the night. They say that before the introduction of artificial light it was normal for people to sleep in two intervals separated by an hour of wakefulness, a phenomenon known as segmented sleep, or “first” and “second” sleep.
  • But Dr. Siegel said he always questioned those assertions because there were no rigorous studies of sleep behaviors back then. He and his colleagues decided that one way to get some insight was to study cultures relatively unaffected by artificial light.
  • Among those they chose to follow were the Hadza people, who spend their days hunting and foraging in northern Tanzania, much as their ancestors have for tens of thousands of years; the San of Namibia, who have lived as hunter-gatherers in the Kalahari for at least 20,000 years; and the Tsimané, a seminomadic group that lives in the Andean foothills of Bolivia, near the farthest reaches of the human migration out of Africa.
  • The researchers found that in addition to sleeping roughly similar amounts each night, the three groups rarely took naps during the day and did not sleep in two separate intervals at night.
  • “The Hadza and the San live in the area where we know humans evolved, and then the Tsimané live in some sense at the end of the human migration,” he said. “The fact that we see very similar sleep times gives me great confidence that this is how all of our ancestors slept.”
  • Their sleep did not seem to be problematic. Chronic insomnia, which affects 20 percent to 30 percent of Americans, occurred in just 2 percent of the hunter-gatherers. The San and the Tsimané did not even have a word for it in their languages.
  • ambient temperature may be a major factor. The groups did not go to sleep at sunset and they did not wake up at sunrise, suggesting that light exposure did not have much influence on their sleep patterns. But they almost always fell asleep as temperatures began to fall at night, and they would wake up right as the temperatures were rising again.
  • This suggests that humans may have evolved to sleep during the coldest hours of the day, perhaps as a way to conserve energy
  • “Today we sleep in environments with fixed temperatures, but none of our ancestors did,” Dr. Siegel said. “We evolved to sleep in a natural environment where the temperature falls at night. Whether we can treat insomnia by putting people in an environment where the temperature is modulated in this way is something to be studied in the future.
Javier E

At Kimberly-Clark, 'Dead Wood' Workers Have Nowhere to Hide - WSJ - 0 views

  • One of the company’s goals now is “managing out dead wood,” aided by performance-management software that helps track and evaluate salaried workers’ progress and quickly expose laggards. Turnover is now about twice as high as it was a decade ago, with approximately 10% of U.S. employees leaving annually, voluntarily or not, the company said.
  • Armed with personalized goals for employees and large quantities of data, Kimberly-Clark said it expects employees to keep improving—or else. “People can’t duck and hide in the same way they could in the past,” said Mr. Boston, who oversees talent management globally for the firm.
  • Coca-Cola Co. in June approved pushing its new performance-management process from the pilot stage to a global rollout. The new system encourages managers to conduct a monthly “reflection” on every direct report, answering five questions that include “Given his/her performance, would you assign this associate to increased scale, scope, and responsibilities?” and “Is this associate at risk for low performance?”
  • ...14 more annotations...
  • The changes mirror what is happening inside many large companies, where “performance management” reflects the conviction that a sharpened focus on creating a high-performing workforce is a vital tool to generate revenue and profit.
  • Performance management shifts companies away from backward-looking, once-a-year reviews framed largely as compliance requirements—a paper trail for potential job cuts and salary decisions—to a process that is real-time, continuous and focused on helping people meet ambitious goals, or move out of the company faster.
  • The last recession led many employers to rethink the nearly automatic merit raises they had been doling out, forcing them to do a better job identifying high and low performers when giving raises and bonuses. Millennial workers, meanwhile, demand more feedback, more coaching and a stronger sense of their career path.
  • systems let managers track workers’ progress via dashboards that display their goals, accomplishments, attendance, peer feedback and other data.
  • Executives’ use of phrases like “performance culture” in conference calls with analysts and investors has doubled in the past five years, according to a review of transcripts in the Factiva news database. Firms that set goals and hold workers accountable “clearly outperform,” said Nicholas Bloom, an economist at Stanford University and co-author of a recent paper that used Census data to examine more than 32,000 U.S. manufacturing plants. He said they have faster growth, higher profitability and are less likely to go bankrupt.
  • Some academics say constant monitoring can feel intrusive and threatening to workers, especially those who value stability. But human-resources experts largely agree that the traditional review process is a waste of time and needs an overhaul.
  • Remaining employees are expected to work “smarter” and meet regularly raised targets. “We have to routinely shuffle the resources and say, what’s the most important thing we need to do today, this week, this month, to drive this objective?”
  • Using the Workday tool, Kimberly-Clark’s salaried employees set goals and report their progress, record accomplishments or mistakes, and solicit and send feedback
  • The system collects and archives feedback, which can be seen by employees’ managers. It also holds data on staffers’ strengths and development needs, their performance ratings and the risk they might leave the company.
  • “It’s certainly more challenging” for employees, said Mr. Herbert, the retired sales director. “If you really don’t have the mettle, you’re asked to get on with your life’s work [elsewhere].”
  • In 2015, Kimberly-Clark retained 95% of its top performers. Among the employees whose work was rated “unacceptable” or “inconsistent,” 44% left the company voluntarily or were let go. Ms. Gottung said she is “pretty pleased” that low-performer turnover has been rising.
  • Mr. Falk, the CEO, reviews 100 senior managers’ performance plans every year to make sure their goals are ambitious and reflect company priorities. Managers are instructed to begin every meeting with a story about how someone demonstrated one of the six behaviors the company promotes, such as “build trust” or “think customer.”
  • Regular “culture of accountability” sessions train employees in giving and receiving difficult feedback. When a colleague suggests improvements, “the proper response was ‘thank you for the feedback,’ not defensiveness,” Mr. Luettgen said. Employees also practice reinforcing positive behaviors, such as praising a colleague who had given up a weekend to solve a customer complaint.
  • More than 10,000 of Kimberly-Clark’s workers used the feedback feature in Workday in 2014, and about 25% of the comments were considered “constructive,” while the rest were positive or neutral, said Sandy Allred, a senior director on the talent management team. Staffers can send feedback to peers or workers above or below them
Javier E

Having It All-and Hating It - The Atlantic - 0 views

  • Nearly 35 years ago, Helen Gurley Brown published Having It All: Love, Success, Sex, Money, Even If You’re Starting With Nothing, a landmark bestseller in a pre-Oprah world about living your best life. In the ’80s, this was a go-girl message about putting on that power suit, and having great sex while doing it. Becoming a mother always complicated the equation
  • Today, it’s perhaps even more complicated: Work can no longer be left at the office; parenting is competitive and all-encompassing (one study found that working mothers today spend six hours more per week on childcare than stay-at-home mothers in the 1970s); marriage is expected to be both financially and emotionally satisfying; social media beckons its users to compare every element of their lives to everyone else’s in a very public space, and then feel inadequate about not filling their feeds with smiling, well-appointed children nibbling perfectly composed, locally-sourced dinners. Having it all, as unattainable as it may have always been, is beyond the realm of possibility.
  • Of the women we interviewed for this project, our Highest Achievers (women who are C-Suite-adjacent or recognized in their fields) have ascended to that level in part because they’re cool with not having it all: For them, being a physically present parent was not their number-one priority.
  • ...6 more annotations...
  • Instead, the women still chasing the having-it-all dream are the group we’re calling the Scale Backers—13 women who dialed down high-powered careers to simultaneously be full-time mothers and workers. And in the process of downsizing, they became, ironically, the most stressed-out of our subjects, attempting to do everything well, but feeling like they excelled at none of it.
  • Having it all has always been exhausting, but our interviewees are attempting it not because they’re aspiring to be CEO, but under the illusion of work-life balance.
  • A January 2013 study by the American Sociological Association backed the idea that flexible work environments make for happier, healthier, more productive workers. But even our subjects with flexible or work-from-home jobs, while grateful for the arrangement, still seemed to operate from a baseline of frazzled.
  • While our Opt Out group has left the primary earning to their spouses, and our High Achievers have hired the help they need to run their lives, the Scale Backers insist on having one (super-flexed) leg in every realm—leaving many of them hobbling through their days.
  • Having it all today means answering emails from the playground, abruptly ending a conference call to deliver a forgotten lunch, and giving both work and your kids short shrift.
  • And yet, when asked what might make their lives easier, most of these subjects demurred, saying they wouldn’t change a thing. Every one of them described her life, complete with compromises and chaos, as a good life. Most seemed pleased at how their lives had turned out 25 years after college, despite sacrifices for both their career and their children. And many women admitted that part of what they liked about attempting to juggle it all is the sense of engineering their own destiny in every avenue.
proudsa

Hillary Clinton Attacks Donald Trump In Foreign Policy Speech - The Atlantic - 0 views

  • Contrasting her track record as secretary of state with Trump’s lack of foreign-policy experience, Clinton made the case that the presumptive Republican nominee is, above all, unqualified to be president.
    • proudsa
       
      This seems to be the part that voters are regularly forgetting.
  • dangerously incoherent.
  • Her speech, delivered in San Diego, portrayed Trump, by turns, as menacing, reckless, comical, even pathological.
  • ...9 more annotations...
  • But it remains unclear whether voters want a candidate, like Clinton, who can deploy careful, nuanced national-security arguments, or if they prefer a candidate, like Trump, whose approach to foreign policy frequently appears to be grounded in gut instinct.
  • divisive
  • alienates
  • It is unclear if Clinton’s arguments will resonate with Republican voters who have rallied around Trump’s brash promises to protect America, seemingly at any cost.
  • 56 percent of voters think Clinton would handle foreign policy better than Trump
  • It is impossible to predict exactly how Clinton or Trump would engage in diplomacy, or retreat from it, if elected president.
  • For his part, Trump can turn Clinton’s credentials to his advantage
  • In a general election that pits Clinton against Trump, voters may have to decide what they find more appealing: an established track record or a relatively unknown quantity who brings with him the promise of brute force.
    • proudsa
       
      Like Mr. Ergueta was saying in MFW, whether you agree with him or not, Trump is a candidate unlike any we have ever seen before
Javier E

Why Americans Lead the World in Food Waste - The Atlantic - 0 views

  • roughly 50 percent of all produce in the United States is thrown away—some 60 million tons (or $160 billion) worth of produce annually, an amount constituting “one third of all foodstuffs.”
  • Wasted food is also the single biggest occupant in American landfills
  • the great American squandering of produce appears to be a cultural dynamic as well, enabled in large part by a national obsession with the aesthetic quality of food.
  • ...13 more annotations...
  • bruise, brown, wilt, oxidize, ding, or discolor and that is apparently something American shoppers will not abide. For an American family of four, the average value of discarded produce is nearly $1,600 annually
  • (Globally, the United Nations Food and Agriculture Organization estimates that one-third of all food grown is lost or wasted, an amount valued at nearly $3 trillion. )
  • “Grocery stores routinely trash produce for being the wrong shape or containing minor blemishes,
  • “Vast quantities of fresh produce grown in the U.S. are left in the field to rot, fed to livestock or hauled directly from the field to landfill, because of unrealistic and unyielding cosmetic standards.”
  • “In my mind, the desire for perfect produce came about in the 1940s as housewives adapted to widespread refrigeration and new CPG [consumer packaged goods] products,”
  • Perfection and manicured foods came to represent safety and new technology.
  • this obsession might become amplified in an era of high foodie-ism and Instagram where a sort of heirloom airbrushing has taken hold. Writing in The Times in 2014, Pete Wells christened the extension of this phenomenon in restaurants as “camera cuisine,”
  • in the last year, ‘foodies’ and chefs have catapulted the issue of food waste into popular conversations,” she adds, naming initiatives by chefs and public intellectuals such as Dan Barber and Roy Choi as well as the pu pu platter of coverage of the issue in elite food magazines.
  • start-ups like the Bay Area’s Imperfect Produce are starting to deliver ugly but otherwise consumable goods at a discount
  • France has banned supermarkets from throwing away food by directing them to compost or donate all expiring or unsold food.
  • Germany is focusing on the issue in part by reforming expiration dates, which many argue are arbitrary and problematic.
  • “My hope is that as food education proliferates, so will an appreciation for ugly fruits and veggies, biodiversity, local crops, and so much more, all of which can help mitigate food waste,”
  • “Wouldn't it be neat if the power of Instagram was used to share recipes for carrot top pesto and food scrap stock? Or if we had easy-to-use apps for sharing extra produce with neighbors or food pantries? Both ideas I've already seen foodies fiddling with.”
Javier E

McLean High School student's heroin overdose shows disturbing trend facing police - The... - 0 views

  • The drug seems to be permeating many places across the country. In a news release announcing a bust in New York on Friday, James J. Hunt, acting special agent in charge with the Drug Enforcement Administration, said heroin was “pummeling the northeast, leaving addiction, overdoses and fear in its wake.” In Vermont, the governor devoted much of his State of the State address to discussing heroin and opiate addiction.
  • In Maryland, state health officials said the number of such deaths increased from 245 in 2011 to 378 in 2012. In Virginia, officials said they recorded 91 accidental heroin deaths in the first nine months of 2012, up from 90 for all of 2011 and 70 for 2010. D.C. officials said their statistics are current only through 2011.
  • Many users, they say, are people who became dependent on prescription pain pills but can no longer get them because doctors and pharmacies have reformed how they are doled out.
  • ...1 more annotation...
  • But heroin, officials say, is a dangerous substitute. Its dosage, they say, is not controlled by the pill, and its purity can vary wildly. “If you go to heroin, you don’t know who you’re getting it from, what it’s cut with, what quantity can I handle,”
Javier E

The Collapse of Big Law: A Cautionary Tale for Big Med - Richard Gunderman and Mark Mut... - 0 views

  • The law is not well. US law school applications are down by nearly half from eight years ago, and 85% of graduates now carry at least $100,000 in debt. More than 180 of the 200 US law schools are unable to find jobs for more than 80% of their graduates. Median starting salaries for those who do find work are down by 17%, and more than a third of graduates cannot find full-time employment. Tellingly, lawyers have higher rates of depression and alcoholism than the general population. 
  • more fundamental problems emerge. One is the increasing popularity of law school rankings. In order to compete for students and tuition dollars, law schools do what they can to improve their standing, which means in part encouraging as many students as possible to apply and to take jobs with high-paying firms when they graduate
  • An even more serious problem is the way law firms keep score. One prevalent measure is PPP, or profit per partner, introduced by The American Lawyer in 1985. When such statistics began to be published, firms that thought they were doing well suddenly discovered that they were being outperformed by peers.  Soon bidding wars ensued for top earners, who are sometimes referred to as “rainmakers.”
  • ...5 more annotations...
  • as soon as law firms begin measuring their performance by the revenue each attorney generates, money begins to supplant all other means of assessing performance.
  • To professionals who choose careers in fields such as law, medicine, and teaching, it is demoralizing to be treated as a unit of production. Even some of the lawyers earning millions of dollars report that they find little or no fulfillment in the work they do.
  • by stoking the flames of competition between law firms and attorneys, the current system has engrained what economists call a “zero-sum” mentality. There is only a relatively fixed quantity of legal work to be done, and for one firm or attorney to command more of it, others must make do with less.
  • As a professional, a lawyer represents her clients in the courtroom, in her office and at the negotiating table. She operates with an appreciation for her role in the adversarial judicial process, the need to educate clients about the limits and purpose of the law, and the importance of helping clients create frameworks to work together to form organizations, build businesses, and plan for the future. Doing these things well provides a sense of meaning and value in work.
  • As a mere service provider, by contrast, her role is to provide a discrete technical service—usually assumed to be the same as any other lawyer would provide—for a fee. Her success is measured not in the professional insight and practical wisdom she offers but in the technical efficiency with which she provides services and her ability to attract other clients willing to pay her to do the same. The sense of professional fulfillment associated with the role of service provider is small at best. 
Javier E

How Technology Wrecks the Middle Class - NYTimes.com - 0 views

  • the productivity of American workers — those lucky enough to have jobs — has risen smartly
  • the United States still has two million fewer jobs than before the downturn, the unemployment rate is stuck at levels not seen since the early 1990s and the proportion of adults who are working is four percentage points off its peak in 2000.
  • Do “smart machines” threaten us with “long-term misery,” as the economists Jeffrey D. Sachs and Laurence J. Kotlikoff prophesied earlier this year?
  • ...17 more annotations...
  • Economists have historically rejected what we call the “lump of labor” fallacy: the supposition that an increase in labor productivity inevitably reduces employment because there is only a finite amount of work to do. While intuitively appealing, this idea is demonstrably false.
  • Labor-saving technological change necessarily displaces workers performing certain tasks — that’s where the gains in productivity come from — but over the long run, it generates new products and services that raise national income and increase the overall demand for labor.
  • The multi-trillionfold decline in the cost of computing since the 1970s has created enormous incentives for employers to substitute increasingly cheap and capable computers for expensive labor.
  • Computers excel at “routine” tasks: organizing, storing, retrieving and manipulating information, or executing exactly defined physical movements in production processes. These tasks are most pervasive in middle-skill jobs
  • Logically, computerization has reduced the demand for these jobs, but it has boosted demand for workers who perform “nonroutine” tasks that complement the automated activities
  • At one end are so-called abstract tasks that require problem-solving, intuition, persuasion and creativity.
  • On the other end are so-called manual tasks, which require situational adaptability, visual and language recognition, and in-person interaction.
  • Computerization has therefore fostered a polarization of employment, with job growth concentrated in both the highest- and lowest-paid occupations, while jobs in the middle have declined.
  • overall employment rates have largely been unaffected in states and cities undergoing this rapid polarization.
  • So computerization is not reducing the quantity of jobs, but rather degrading the quality of jobs for a significant subset of workers. Demand for highly educated workers who excel in abstract tasks is robust, but the middle of the labor market, where the routine task-intensive jobs lie, is sagging.
  • Spurred by growing demand for workers performing abstract job tasks, the payoff for college and professional degrees has soared; despite its formidable price tag, higher education has perhaps never been a better investment.
  • The good news, however, is that middle-education, middle-wage jobs are not slated to disappear completely. While many middle-skill jobs are susceptible to automation, others demand a mixture of tasks that take advantage of human flexibility
  • we predict that the middle-skill jobs that survive will combine routine technical tasks with abstract and manual tasks in which workers have a comparative advantage — interpersonal interaction, adaptability and problem-solving.
  • this category includes numerous jobs for people in the skilled trades and repair: plumbers; builders; electricians; heating, ventilation and air-conditioning installers; automotive technicians; customer-service representatives; and even clerical workers who are required to do more than type and file
  • Lawrence F. Katz, a labor economist at Harvard, memorably called those who fruitfully combine the foundational skills of a high school education with specific vocational skills the “new artisans.”
  • The outlook for workers who haven’t finished college is uncertain, but not devoid of hope. There will be job opportunities in middle-skill jobs, but not in the traditional blue-collar production and white-collar office jobs of the past
  • we expect to see growing employment among the ranks of the “new artisans”: licensed practical nurses and medical assistants; teachers, tutors and learning guides at all educational levels; kitchen designers, construction supervisors and skilled tradespeople of every variety; expert repair and support technicians; and the many people who offer personal training and assistance, like physical therapists, personal trainers, coaches and guides