
What Have We Learned, If Anything? by Tony Judt | The New York Review of Books

  • During the Nineties, and again in the wake of September 11, 2001, I was struck more than once by a perverse contemporary insistence on not understanding the context of our present dilemmas, at home and abroad; on not listening with greater care to some of the wiser heads of earlier decades; on seeking actively to forget rather than remember, to deny continuity and proclaim novelty on every possible occasion. We have become stridently insistent that the past has little of interest to teach us. Ours, we assert, is a new world; its risks and opportunities are without precedent.
  • the twentieth century that we have chosen to commemorate is curiously out of focus. The overwhelming majority of places of official twentieth-century memory are either avowedly nostalgo-triumphalist—praising famous men and celebrating famous victories—or else, and increasingly, they are opportunities for the recollection of selective suffering.
  • The problem with this lapidary representation of the last century as a uniquely horrible time from which we have now, thankfully, emerged is not the description—it was in many ways a truly awful era, an age of brutality and mass suffering perhaps unequaled in the historical record. The problem is the message: that all of that is now behind us, that its meaning is clear, and that we may now advance—unencumbered by past errors—into a different and better era.
  • Today, the “common” interpretation of the recent past is thus composed of the manifold fragments of separate pasts, each of them (Jewish, Polish, Serb, Armenian, German, Asian-American, Palestinian, Irish, homosexual…) marked by its own distinctive and assertive victimhood.
  • The resulting mosaic does not bind us to a shared past, it separates us from it. Whatever the shortcomings of the national narratives once taught in school, however selective their focus and instrumental their message, they had at least the advantage of providing a nation with past references for present experience. Traditional history, as taught to generations of schoolchildren and college students, gave the present a meaning by reference to the past: today’s names, places, inscriptions, ideas, and allusions could be slotted into a memorized narrative of yesterday. In our time, however, this process has gone into reverse. The past now acquires meaning only by reference to our many and often contrasting present concerns.
  • the United States thus has no modern memory of combat or loss remotely comparable to that of the armed forces of other countries. But it is civilian casualties that leave the most enduring mark on national memory, and here the contrast is piquant indeed.
  • Today, the opposite applies. Most people in the world outside of sub-Saharan Africa have access to a near infinity of data. But in the absence of any common culture beyond a small elite, and not always even there, the fragmented information and ideas that people select or encounter are determined by a multiplicity of tastes, affinities, and interests. As the years pass, each one of us has less in common with the fast-multiplying worlds of our contemporaries, not to speak of the world of our forebears.
  • What is significant about the present age of transformations is the unique insouciance with which we have abandoned not merely the practices of the past but their very memory. A world just recently lost is already half forgotten.
  • In the US, at least, we have forgotten the meaning of war. There is a reason for this.
  • Until the last decades of the twentieth century most people in the world had limited access to information; but—thanks to national education, state-controlled radio and television, and a common print culture—within any one state or nation or community people were all likely to know many of the same things.
  • it was precisely that claim, that “it’s torture, and therefore it’s no good,” which until very recently distinguished democracies from dictatorships. We pride ourselves on having defeated the “evil empire” of the Soviets. Indeed so. But perhaps we should read again the memoirs of those who suffered at the hands of that empire—the memoirs of Eugen Loebl, Artur London, Jo Langer, Lena Constante, and countless others—and then compare the degrading abuses they suffered with the treatments approved and authorized by President Bush and the US Congress. Are they so very different?
  • As a consequence, the United States today is the only advanced democracy where public figures glorify and exalt the military, a sentiment familiar in Europe before 1945 but quite unknown today.
  • the complacent neoconservative claim that war and conflict are things Americans understand—in contrast to naive Europeans with their pacifistic fantasies—seems to me exactly wrong: it is Europeans (along with Asians and Africans) who understand war all too well. Most Americans have been fortunate enough to live in blissful ignorance of its true significance.
  • That same contrast may account for the distinctive quality of much American writing on the cold war and its outcome. In European accounts of the fall of communism, from both sides of the former Iron Curtain, the dominant sentiment is one of relief at the closing of a long, unhappy chapter. Here in the US, however, the story is typically recorded in a triumphalist key.5
  • For many American commentators and policymakers the message of the twentieth century is that war works. Hence the widespread enthusiasm for our war on Iraq in 2003 (despite strong opposition to it in most other countries). For Washington, war remains an option—on that occasion the first option. For the rest of the developed world it has become a last resort.6
  • Ignorance of twentieth-century history does not just contribute to a regrettable enthusiasm for armed conflict. It also leads to a misidentification of the enemy.
  • This abstracting of foes and threats from their context—this ease with which we have talked ourselves into believing that we are at war with “Islamofascists,” “extremists” from a strange culture, who dwell in some distant “Islamistan,” who hate us for who we are and seek to destroy “our way of life”—is a sure sign that we have forgotten the lesson of the twentieth century: the ease with which war and fear and dogma can bring us to demonize others, deny them a common humanity or the protection of our laws, and do unspeakable things to them.
  • How else are we to explain our present indulgence for the practice of torture? For indulge it we assuredly do.
  • “But what would I have achieved by proclaiming my opposition to torture?” he replied. “I have never met anyone who is in favor of torture.”8 Well, times have changed. In the US today there are many respectable, thinking people who favor torture—under the appropriate circumstances and when applied to those who merit it.
  • American civilian losses (excluding the merchant navy) in both world wars amounted to less than 2,000 dead.
  • We are slipping down a slope. The sophistic distinctions we draw today in our war on terror—between the rule of law and “exceptional” circumstances, between citizens (who have rights and legal protections) and noncitizens to whom anything can be done, between normal people and “terrorists,” between “us” and “them”—are not new. The twentieth century saw them all invoked. They are the selfsame distinctions that licensed the worst horrors of the recent past: internment camps, deportation, torture, and murder—those very crimes that prompt us to murmur “never again.” So what exactly is it that we think we have learned from the past? Of what possible use is our self-righteous cult of memory and memorials if the United States can build its very own internment camp and torture people there?
  • We need to learn again—or perhaps for the first time—how war brutalizes and degrades winners and losers alike and what happens to us when, having heedlessly waged war for no good reason, we are encouraged to inflate and demonize our enemies in order to justify that war’s indefinite continuance.

Economics of Good and Evil: The Quest for Economic Meaning from Gilgamesh to Wall Street

  • Instead of self-confident and self-centered answers, the author humbly asks fundamental questions: What is economics? What is its meaning? Where does this new religion, as it is sometimes called, come from? What are its possibilities and its limitations and borders, if there are any? Why are we so dependent on permanent growing of growth and growth of growing of growth? Where did the idea of progress come from, and where is it leading us? Why are so many economic debates accompanied by obsession and fanaticism?
  • The majority of our political parties act with a narrow materialistic focus when, in their programs, they present the economy and finance first; only then, somewhere at the end, do we find culture as something pasted on or as a libation for a couple of madmen.
  • most of them—consciously or unconsciously—accept and spread the Marxist thesis of the economic base and the spiritual superstructure.
  • He tries to break free of narrow specialization and cross the boundaries between scientific disciplines. Expeditions beyond economics’ borders and its connection to history, philosophy, psychology, and ancient myths are not only refreshing, but necessary for understanding the world of the twenty-first century.
  • Reality is spun from stories, not from material. Zdeněk Neubauer
  • Before it was emancipated as a field, economics lived happily within subsets of philosophy—ethics, for example—miles away from today’s concept of economics as a mathematical-allocative science that views “soft sciences” with a scorn born from positivistic arrogance. But our thousand-year “education” is built on a deeper, broader, and oftentimes more solid base. It is worth knowing about.
  • Outside of our history, we have nothing more.
  • The study of the history of a certain field is not, as is commonly held, a useless display of its blind alleys or a collection of the field’s trials and errors (until we got it right), but history is the fullest possible scope of study of a menu that the given field can offer.
  • History of thought helps us to get rid of the intellectual brainwashing of the age, to see through the intellectual fashion of the day, and to take a couple of steps back.
  • “The separation between the history of a science, its philosophy, and the science itself dissolves into thin air, and so does the separation between science and non-science; differences between the scientific and unscientific are vanishing.”
  • we seek to chart the development of the economic ethos. We ask questions that come before any economic thinking can begin—both philosophically and, to a degree, historically. The area here lies at the very borders of economics—and often beyond. We may refer to this as protoeconomics (to borrow a term from protosociology) or, perhaps more fittingly, metaeconomics (to borrow a term from metaphysics).
  • Adam Smith believed in stories. As he puts it in The Theory of Moral Sentiments, “the desire of being believed, or the desire of persuading, of leading and directing other people, seems to be one of the strongest of all our natural desires.”
  • “The human mind is built to think in terms of narratives … in turn, much of human motivation comes from living through a story of our lives, a story that we tell to ourselves and that creates a framework of our motivation. Life could be just ‘one damn thing after another’ if it weren’t for such stories. The same is true for confidence in a nation, a company, or an institution. Great leaders are foremost creators of stories.”
  • contrary to what our textbooks say, economics is predominantly a normative field. Economics not only describes the world but is frequently about how the world should be (it should be effective, we have an ideal of perfect competition, an ideal of high-GDP growth in low inflation, the effort to achieve high competitiveness …). To this end, we create models, modern parables.
  • I will try to show that mathematics, models, equations, and statistics are just the tip of the iceberg of economics; that the biggest part of the iceberg of economic knowledge consists of everything else; and that disputes in economics are rather a battle of stories and various metanarratives than anything else.
  • That is the reason for this book: to look for economic thought in ancient myths and, vice versa, to look for myths in today’s economics.
  • It is a paradox that a field that primarily studies values wants to be value-free. One more paradox is this: A field that believes in the invisible hand of the market wants to be without mysteries.
  • Almost all of the key concepts by which economics operates, both consciously and unconsciously, have a long history, and their roots extend predominantly outside the range of economics, and often completely beyond that of science.
  • The History of Animal Spirits: Dreams Never Sleep
  • In this sense, “the study of economics is too narrow and too fragmentary to lead to valid insight, unless complemented and completed by a study of metaeconomics.”17
  • The more important elements of a culture or field of inquiry such as economics are found in fundamental assumptions that adherents of all the various systems within the epoch unconsciously presuppose. Such assumptions appear so obvious that people do not know what they are assuming, because no other way of putting things has ever occurred to them, as the philosopher Alfred Whitehead notes in Adventures of Ideas.
  • I argue that economic questions were with mankind long before Adam Smith. I argue that the search for values in economics did not start with Adam Smith but culminated with him.
  • We should go beyond economics and study what beliefs are “behind the scenes,” ideas that have often become the dominant yet unspoken assumptions in our theories. Economics is surprisingly full of tautologies that economists are predominantly unaware of.
  • I argue that economics should seek, discover, and talk about its own values, although we have been taught that economics is a value-free science. I argue that none of this is true and that there is more religion, myth, and archetype in economics than there is mathematics.
  • In a way, this is a study of the evolution of both homo economicus and, more importantly, the history of the animal spirits within him. This book tries to study the evolution of the rational as well as the emotional and irrational side of human beings.
  • I argue that Smith’s most influential contribution to economics was ethical. His other thoughts had been clearly expressed long before him, whether on specialization or on the principle of the invisible hand of the market. I try to show that the principle of the invisible hand of the market is much more ancient and was developed long before Adam Smith. Traces of it appear even in the Epic of Gilgamesh, in Hebrew thought, and in Christianity, and it is expressly stated by Aristophanes and Thomas Aquinas.
  • This is not a book on the thorough history of economic thought. The author aims instead to supplement certain chapters on the history of economic thought with a broader perspective and analysis of the influences that often escape the notice of economists and the wider public.
  • Progress (Naturalness and Civilization)
  • The Economy of Good and Evil
  • From his beginnings, man has been marked as a naturally unnatural creature, who for unique reasons surrounds himself with external possessions. Insatiability, both material and spiritual, is a basic human metacharacteristic, which appears as early as the oldest myths and stories.
  • the Hebrews, with linear time, and later the Christians gave us the ideal (or amplified the Hebrew ideal) we now embrace. Then the classical economists secularized progress. How did we come to today’s progression of progress, and growth for growth’s sake?
  • The Need for Greed: The History of Consumption and Labor
  • Metamathematics: From where did economics get the concept of numbers as the very foundation of the world?
  • Is mathematics at the core of economics, or is it just the icing on the cake, the tip of the iceberg of our field’s inquiry?
  • The idea that we can manage to utilize our natural egoism, and that this evil is good for something, is an ancient philosophical and mythical concept. We will also look into the development of the ethos of homo economicus, the birth of “economic man.”
  • All of economics is, in the end, economics of good and evil. It is the telling of stories by people of people to people. Even the most sophisticated mathematical model is, de facto, a story, a parable, our effort to (rationally) grasp the world around us.
  • Masters of the Truth
  • Originally, truth was a domain of poems and stories, but today we perceive truth as something much more scientific, mathematical. Where does one go (to shop) for the truth? And who “has the truth” in our epoch?
  • Our animal spirits (something of a counterpart to rationality) are influenced by the archetype of the hero and our concept of what is good.
  • The entire history of ethics has been ruled by an effort to create a formula for the ethical rules of behavior. In the final chapter we will show the tautology of Max Utility, and we will discuss the concept of Max Good.
  • The History of the Invisible Hand of the Market and Homo Economicus
  • We understand “economics” to mean a broader field than just the production, distribution, and consumption of goods and services. We consider economics to be the study of human relations that are sometimes expressible in numbers, a study that deals with tradables, but one that also deals with nontradables (friendship, freedom, efficiency, growth).
  • When we mention economics in this book, we mean the mainstream perception of it, perhaps as best represented by Paul Samuelson.
  • By the term homo economicus, we mean the primary concept of economic anthropology. It comes from the concept of a rational individual, who, led by narrowly egotistical motives, sets out to maximize his benefit.
  • the Epic of Gilgamesh bears witness to the opposite—despite the fact that the first written clay fragments (such as notes and bookkeeping) of our ancestors may have been about business and war, the first written story is mainly about great friendship and adventure.
  • there is no mention of either money or war; for example, not once does anyone in the whole epic sell or purchase something.5 No nation conquers another, and we do not encounter a mention even of the threat of violence.
  • It is a story of nature and civilization, of heroism, defiance, and the battle against the gods, and evil; an epic about wisdom, immortality, and also futility.
  • Gilgamesh becomes a hero not only due to his strength, but also due to discoveries and deeds whose importance was in large part economic—the direct gaining of construction materials in the case of felling the cedar forest, stopping Enkidu from devastating Uruk’s economy, and discovering new desert routes during his expeditions.
  • Even today, we often consider the domain of humanity (human relations, love, friendship, beauty, art, etc.) to be unproductive.
  • Even today we live in Gilgamesh’s vision that human relations—and therefore humanity itself—are a disturbance to work and efficiency; that people would perform better if they did not “waste” their time and energy on nonproductive things.
  • But it is in friendship where—often by-the-way, as a side product, an externality—ideas and deeds are frequently performed or created that together can altogether change the face of society.19 Friendship can go against an ingrained system in places where an individual does not have the courage to do so himself or herself.
  • As Joseph Stiglitz says, One of the great “tricks” (some say “insights”) of neoclassical economics is to treat labour like any other factor of production. Output is written as a function of inputs—steel, machines, and labour. The mathematics treats labour like any other commodity, lulling one into thinking of labour like an ordinary commodity, such as steel or plastic.
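Stiglitz’s point can be made concrete with standard notation. Below is a generic textbook production function in the Cobb-Douglas form (an illustration of the kind of formula he has in mind, not an equation from this book), in which labour enters exactly like any other input:

```latex
% A generic Cobb-Douglas production function (illustrative, not from the book):
%   Y = output, A = total factor productivity,
%   K = capital (machines), L = labour, M = materials (steel),
%   with output elasticities alpha, beta, gamma.
Y = F(K, L, M) = A \, K^{\alpha} L^{\beta} M^{\gamma}
```

Nothing in the symmetry of the exponents distinguishes L from K or M; that formal interchangeability is exactly what, on Stiglitz’s account, lulls one into treating labour as an ordinary commodity.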
  • Even the earliest cultures were aware of the value of cooperation on the working level—today we call this collegiality, fellowship, or, if you want to use a desecrated term, comradeship. These “lesser relationships” are useful and necessary for society and for companies because work can be done much faster and more effectively if people get along with each other on a human level
  • But true friendship, which becomes one of the central themes of the Epic of Gilgamesh, comes from completely different material than teamwork. Friendship, as C. S. Lewis accurately describes it, is completely uneconomical, unbiological, unnecessary for civilization, and an unneeded relationship
  • Here we have a beautiful example of the power of friendship, one that knows how to transform (or break down) a system and change a person. Enkidu, sent to Gilgamesh as a punishment from the gods, in the end becomes his faithful friend, and together they set out against the gods. Gilgamesh would never have gathered the courage to do something like that on his own—nor would Enkidu.
  • Due to their friendship, Gilgamesh and Enkidu then intend to stand up to the gods themselves and turn a holy tree into mere (construction) material they can handle almost freely, thereby making it a part of the city-construct, part of the building material of civilization, thus “enslaving” that which originally was part of wild nature. This is a beautiful proto-example of the shifting of the borders between the sacred and profane (secular)—and to a certain extent also an early illustration of the idea that nature is there to provide cities and people with raw material and production resources.
  • It started with the Babylonians—rural nature becomes just a supplier of raw materials, resources (and humans the source of human resources). Nature is not the garden in which humans were created and placed, which they should care for and in which they should reside, but becomes a mere reservoir for natural (re)sources.
  • But labour is unlike any other commodity. The work environment is of no concern for steel; we do not care about steel’s well-being.16
  • Both heroes change—each from opposite poles—into humans. In this context, a psychological dimension to the story may be useful: “Enkidu (…) is Gilgamesh’s alter ego, the dark, animal side of his soul, the complement to his restless heart. When Gilgamesh found Enkidu, he changed from a hated tyrant into the protector of his city. (…)
  • To be human seems to be somewhere in between, or both of these two.
  • In this moment of rebirth from an animal to a human state, the world’s oldest preserved epic implicitly hints at something highly important. Here we see what early cultures considered the beginning of civilization. Here is depicted the difference between people and animals or, better, savages. Here the epic quietly describes birth, the awakening of a conscious, civilized human. We are witnesses to the emancipation of humanity from animals.
  • The entire history of culture is dominated by an effort to become as independent as possible from the whims of nature.39 The more developed a civilization is, the more an individual is protected from nature and natural influences and knows how to create around him a constant or controllable environment to his liking.
  • The price we pay for independence from the whims of nature is dependence on our societies and civilizations. The more sophisticated a given society is as a whole, the less its members are able to survive on their own as individuals, without society.
  • The epic captures one of the greatest leaps in the development of the division of labor. Uruk itself is one of the oldest cities of all, and in the epic it reflects a historic step forward in specialization—in the direction of a new social city arrangement. Because of the city wall, people in the city can devote themselves to things other than worrying about their own safety, and they can continue to specialize more deeply.
  • Human life in the city gains a new dimension and suddenly it seems more natural to take up issues going beyond the life span of an individual. “The city wall symbolizes as well as founds the permanence of the city as an institution which will remain forever and give its inhabitants the certainty of unlimited safety, allowing them to start investing with an outlook reaching far beyond the borders of individual life.”
  • The wall around the city of Uruk is, among other things, a symbol of an internal distancing from nature, a symbol of revolts against submission to laws that do not come under the control of man and that man can at most discover and use to his benefit.
  • “The chief thing which the common-sense individual wants is not satisfactions for the wants he had, but more, and better wants.”47
  • If a consumer buys something, theoretically it should rid him of one of his needs—and the aggregate of things he needs should be decreased by one item. In reality, though, the aggregate of “I want to have” expands together with the growing aggregate of “I have.”
  • It can be said that Enkidu was therefore happy in his natural state, because all of his needs were satiated. On the other hand, with people, it appears that the more a person has, the more developed and richer he is, the greater the number of his needs (including the unsatiated ones).
  • In the Old Testament, this relationship is perceived completely differently. Man (humanity) is created in nature, in a garden. Man was supposed to care for the Garden of Eden and live in harmony with nature and the animals. Soon after creation, man walks naked and is not ashamed, de facto the same as the animals. What is characteristic is that man dresses (the natural state of creation itself is not enough for him), and he (literally and figuratively) covers52 himself—in shame after the fall.53
  • Nature is where one goes to hunt, collect crops, or gather the harvest. It is perceived as the saturator of our needs and nothing more. One goes back to the city to sleep and be “human.” On the contrary, evil resides in nature. Humbaba lives in the cedar forest, which also happens to be the reason to completely eradicate it.
  • Symbolically, then, we can view the entire issue from the standpoint of the epic in the following way: Our nature is insufficient, bad, evil, and good (humane) occurs only after emancipation from nature (from naturalness), through culturing and education. Humanity is considered as being in civilization.
  • The city was frequently (at least in older Jewish writings) a symbol of sin, degeneration, and decadence—nonhumanity. The Hebrews were originally a nomadic nation, one that avoided cities. It is no accident that the first important city57 mentioned in the Bible is proud Babylon,58 which God later turns to dust.
  • It is enough, for example, to read the Book of Revelation to see how the vision of paradise developed from the deep Old Testament period, when paradise was a garden. John describes his vision of heaven as a city—paradise is in New Jerusalem, a city where the dimensions of the walls(!) are described in detail, as are the golden streets and gates of pearl.
  • The Hebrews later also chose a king (despite the unanimous opposition of God’s prophets) and settled in cities, where they eventually founded the Lord’s Tabernacle and built a temple for Him. The city of Jerusalem later gained an illustrious position in all of religion.
  • This time Christianity (as well as the influence of the Greeks) does not consider human naturalness to be an unambiguous good, and it does not have such an idyllic relationship to nature as the Old Testament prophets did.
  • If a tendency toward good is not naturally endowed in people, it must be imputed from above through violence or at least the threat of violence.
  • If we were to look at human naturalness as a good, then collective social actions need a much weaker ruling hand. If people themselves have a natural tendency (propensity) toward good, this role does not have to be supplied by the state, ruler, or, if you wish, Leviathan.
  • How does this affect economics?
  • Let us return for the last time to the humanization of the wild Enkidu, a process we can perceive, with a bit of imagination, as the first seed of the principle of the market’s invisible hand, and therefore as a parallel with one of the central schematics of economic thinking.
  • Sometimes it is better to “harness the devil to the plow” than to fight with him. Instead of summoning up enormous energy in the fight against evil, it is better to use its own energy to reach a goal we desire; setting up a mill on the turbulent river instead of futile efforts to remove the current. This is also how Saint Prokop approached it in one of the oldest Czech legends.
  • Enkidu caused damage, and it was impossible to fight against him. But with the help of a trap, a trick, this evil was transformed into something that greatly benefited civilization.
  • By culturing and “domesticating” Enkidu, humanity tamed the uncontrollable wild and chaotic evil.
  • Enkidu devastated the doings (the external, outside-the-walls) of the city. But he was later harnessed and fights at the side of civilization against nature, naturalness, the natural state of things.
  • A similar motif appears a thousand years later in the reversal that is well known even to noneconomists as the central idea of economics: the invisible hand of the market.
  • A similar story (reforming something animally wild and uncultivated in civilizational achievement) is used by Thomas Aquinas in his teachings. Several centuries later, this idea is fully emancipated in the hands of Bernard Mandeville and his Fable of the Bees: or, Private Vices, Publick Benefits. The economic and political aspects of this idea are—often incorrectly—ascribed to Adam Smith.
  • Here the individual no longer tries to maximize his goods or profits; what is important is writing his name into human memory in the form of heroic acts or deeds.
  • There is another kind of immortality, one connected with letters and the cult of the word: “A name and especially a written name survives the body.”77
  • After this disappointment, he comes to the edge of the sea, where the innkeeper Siduri lives. As tonic for his sorrow, she offers him the garden of bliss, a sort of hedonistic fortress of carpe diem, where a person comes to terms with his mortality and at least in the course of the end of his life maximizes earthly pleasures, or earthly utility.
  • In the second stage, after finding his friend Enkidu, Gilgamesh abandons the wall and sets out beyond the city to maximize heroism. “In his (…) search of immortal life, Gilgamesh …”
  • The hero refuses hedonism in the sense of maximizing terrestrial pleasure and throws himself into things that will exceed his life. In the blink of an eye, the epic turns on its head the entire utility maximization role that mainstream economics has tirelessly tried to sew on people as a part of their nature.81
  • It is simpler to observe the main features of our civilization at a time when the picture was more readable—at a time when our civilization was just being born and was still “half-naked.” In other words, we have tried to dig down to the bedrock of our written civilization.
  • We today remember Gilgamesh for his story of heroic friendship with Enkidu, not for his wall, which no longer reaches monumental heights.
  • In the eleventh and final tablet, Gilgamesh again loses what he sought. Like Sisyphus, he misses his goal just before the climax.
  • Is there something from it that is valid today? Have we found in Gilgamesh certain archetypes that are in us to this day?
  • The very existence of questions similar to today’s economic ones can be considered as the first observation. The first written considerations of the people of that time were not so different from those today. In other words: The epic is understandable for us, and we can identify with it.
  • We have also been witnesses to the very beginnings of man’s culturing—a great drama based on a liberation and then a distancing from the natural state.
  • Let us take this as a memento in the direction of our restlessness, our inherited dissatisfaction and the volatility connected to it. Considering that they have lasted five thousand years and to this day we find ourselves in harmony with a certain feeling of futility, perhaps these characteristics are inherent in man.
  • Gilgamesh had a wall built that divided the city from wild nature and created a space for the first human culture. Nevertheless, “not even far-reaching works of civilization could satisfy human desire.”
  • Friendship shows us new, unsuspected adventures, gives us the opportunity to leave the wall and to become neither its builder nor its part—to not be another brick in the wall.
  • With the phenomenon of the creation of the city, we have seen how specialization and the accumulation of wealth were born, how holy nature was transformed into a secular supplier of resources, and also how humans’ individualistic ego was emancipated.
  • to change the system, to break down that which is standing and go on an expedition against the gods (to awaken, from naïveté to awakening) requires friendship.
  • For small acts (hunting together, work in a factory), small love is enough: Camaraderie. For great acts, however, great love is necessary, real love: Friendship. Friendship that eludes the economic understanding of quid pro quo. Friendship gives. One friend gives (fully) for the other. That is friendship for life and death.
  • The thought that humanity comes at the expense of efficiency is just as old as humanity itself—as we have shown, subjects without emotion are the ideal of many tyrants.
  • The epic later crushes this idea through the friendship of Gilgamesh and Enkidu. Friendship—the biologically least essential love, which at first sight appears to be unnecessary …
  • The less a civilized, city person is dependent on nature, the more he or she is dependent on the rest of society. Like Enkidu, we have exchanged nature for society; harmony with (incalculable) nature for harmony with (incalculable) man.
  • Is human nature good or evil? To this day these questions are key for economic policy: If we believe that man is evil in his nature, that a person is dog-eat-dog (an animal), then the hard hand of a ruler is called for. If we believe that people in and of themselves, in their nature, gravitate toward good, then it is possible to loosen up the reins and live in a society that is more laissez-faire.
  • For a concept of historical progress, for the undeification of heroes, rulers, and nature, mankind had to wait for the Hebrews.
  • Because nature is not undeified, it is beyond consideration to explore it, let alone intervene in it (unless a person was a two-thirds god like Gilgamesh).
  • They practiced money lending, traded in many assets (…) and especially were engaged in the trading of shares on capital markets, worked in currency exchange and frequently figured as mediators in financial transactions (…), they functioned as bankers and participated in emissions of all possible forms.
  • As regards modern capitalism (as opposed to the ancient and medieval periods) … there are activities in it which are, in certain forms, inherently (and completely necessarily) present—both from an economic and legal standpoint.7
  • As early as the “dark” ages, the Jews commonly used economic tools that were in many ways ahead of their time and that later became key elements of the modern economy.
  • Gilgamesh’s story ends where it began. There is a consistency in this with Greek myths and fables: At the end of the story, no progress occurs, no essential historic change; the story is set in indefinite time, something of a temporal limbo.
  • Jews believe in historical progress, and that progress is in this world.
  • For a nation originally based on nomadism, where did this Jewish business ethos come from? And can the Hebrews truly be considered as the architects of the values that set the direction of our civilization’s economic thought?
  • Hebrew religiosity is therefore strongly connected with this world, not with any abstract world, and those who take pleasure in worldly possessions are not a priori doing anything wrong.
  • PROGRESS: A SECULARIZED RELIGION. One of the things the writers of the Old Testament gave to mankind is the idea and notion of progress. The Old Testament stories have their development; they change the history of the Jewish nation and tie in to each other. The Jewish understanding of time is linear—it has a beginning and an end.
  • The observance of God’s Commandments in Judaism leads not to some ethereal other world, but to an abundance of material goods (Genesis 49:25–26, Leviticus 26:3–13, Deuteronomy 28:1–13) (…) There are no accusing fingers pointed at …
  • There are no echoes of asceticism, nor of the cleansing and spiritual effect of poverty. It is fitting, therefore, that the founders of Judaism, the Patriarchs Abraham, Isaac, and Jacob, were all wealthy men.12
  • This came about due to a linear understanding of history. If history has a beginning as well as an end, and they are not the same point, then exploration suddenly makes sense in areas where the fruits are borne only in the next generation.
  • What’s more, economic progress has almost become an assumption of modern functional societies. We expect growth. We take it automatically. Today, if nothing “new” happens, if GDP does not grow (we say it stagnates) for several quarters, we consider it an anomaly.
  • however, the idea of progress itself underwent major changes, and today we perceive it very differently. As opposed to the original spiritual conceptions, today we perceive progress almost exclusively in an economic or scientific-technological sense.
  • Because care for the soul has today been replaced by care for external things …
  • This is why we must constantly grow, because we (deep down and often implicitly) believe that we are headed toward an (economic) paradise on Earth.
  • Only since the period of scientific-technological revolution (and at a time when economics was born as an independent field) is material progress automatically assumed.
  • Jewish thought is the most grounded, most realistic school of thought of all those that have influenced our culture.17 An abstract world of ideas was unknown to the Jews. To this day it is still forbidden to even depict God, people, and animals in symbols, paintings, statues, and drawings.
  • economists have become key figures of great importance in our time (Kacířské eseje o filosofii dějin [Heretical Essays in the Philosophy of History]). They are expected to perform interpretations of reality, give prophetic services (macroeconomic forecasts), reshape reality (mitigate the impacts of the crisis, speed up growth), and, in the long run, provide leadership on the way to the Promised Land—paradise on Earth.
  • REALISM AND ANTIASCETICISM. Aside from ideas of progress, the Hebrews brought another very fundamental contribution to our culture: the desacralization of heroes, nature, and rulers.
  • Voltaire writes: “It is a certain fact that in his public laws he [Moses] never so much as once made mention of a life to come, limiting all punishments and all rewards to the present life.”21
  • As opposed to Christianity, the concept of an extraterrestrial paradise or heaven was not developed much in Hebrew thought.19 The paradise of the Israelites—Eden—was originally placed on Earth at a given place in Mesopotamia20 and at a given time,
  • The Hebrews consider the world to be real—not just a shadow reflection of a better world somewhere in the cloud of ideas, something the usual interpretation of history ascribes to Plato. The soul does not struggle against the body and is not its prisoner, as Augustine would write later.
  • The land, the world, the body, and material reality are for Jews the paramount setting for divine history, the pinnacle of creation. This idea is the conditio sine qua non of the development of economics, something of an utterly earthly making.
  • The mythology of the hero-king was strongly developed in that period, which Claire Lalouette summarizes into these basic characteristics: Beauty (a perfect face, on which it is “pleasant to look upon,” but also “beauty,” expressed in the Egyptian word nefer, not only means aesthetics, but contains moral qualities as well) …
  • THE HERO AND HIS UNDEIFICATION: THE DREAM NEVER SLEEPS. The concept of the hero is more important than it might appear. It may be the remote origin of Keynes’s animal spirits, or the desire to follow a kind of internal archetype that a given individual accepts as his own and that society values.
  • This internal animator of ours, our internal mover, this dream, never sleeps and it influences our behavior—including economic behavior—more than we want to realize.
  • manliness and strength,28 knowledge and intelligence,29 wisdom and understanding, vigilance and performance, fame and renown (fame which overcomes enemies because “a thousand men would not be able to stand firmly in his presence”);30 the hero is a good shepherd (who takes care of his subordinates), is a copper-clad rampart, the shield of the land, and the defender of heroes.
  • Each of us probably has a sort of “hero within”—a kind of internal role-model, template, an example that we (knowingly or not) follow. It is very important what kind of archetype it is, because its role is dominantly irrational and changes depending on time and the given civilization.
  • The oldest was the so-called Trickster—a fraudster; then the culture bearer—Rabbit; the musclebound hero called Redhorn; and finally the most developed form of hero: the Twins.
  • the Egyptian ruler, just as the Sumerian, was partly a god, or the son of a god.31
  • Jacob defrauds his father Isaac and steals his brother Esau’s blessing of the firstborn. Moses murders an Egyptian. King David seduces the wife of his military commander and then has him killed. In his old age, King Solomon turns to pagan idols, and so on.
  • Anthropology knows several archetypes of heroes. The Polish-born American anthropologist Paul Radin examined the myths of North American Indians and, for example, in his most influential book, The Trickster, he describes their four basic archetypes of heroes.
  • The Torah’s heroes (if that term can be used at all) frequently make mistakes and their mistakes are carefully recorded in the Bible—maybe precisely so that none of them could be deified.32
  • We do not have to go far for examples. Noah gets so drunk he becomes a disgrace; Lot lets his own daughters seduce him in a similar state of drunkenness. Abraham lies and (repeatedly) tries to sell his wife as a concubine.
  • the Hebrew heroes correspond most to the Tricksters, the Culture Bearers, and the Twins. The divine muscleman, that dominant symbol we think of when we say hero, is absent here.
  • To a certain extent it can be said that the Hebrews—and later Christianity—added another archetype, the archetype of the heroic Sufferer:35 Job.
  • Undeification, however, does not mean a call to pillage or desecration; man was put here to take care of nature (see the story of the Garden of Eden or the symbolism of the naming of the animals). This protection and care of nature is also related to the idea of progress
  • For the heroes who moved our civilization to where it is today, the heroic archetypes of the cunning trickster, culture bearer, and sufferer are rather more appropriate.
  • the Old Testament strongly emphasizes the undeification of nature.37 Nature is God’s creation, which speaks of divinity but is not the domain of moody gods
  • This is very important for democratic capitalism, because the Jewish heroic archetype lays the groundwork much better for the development of the later phenomenon of the hero, which better suits life as we know it today. “The heroes laid down their arms and set about trading to become wealthy.”
  • in an Old Testament context, the pharaoh was a mere man (whom one could disagree with, and who could be resisted!).
  • RULERS ARE MERE MEN. In a similar historical context, the Old Testament teachings carried out a similar desacralization of rulers, the so-called bearers of economic policy.
  • Ultimately the entire idea of a political ruler stood against the Lord’s will, which is explicitly presented in the Torah. The Lord unequivocally preferred the judge as the highest form of rule.
  • “The needs of future generations will have to be considered; after all, humankind are the guardians of God’s world. Waste of natural resources, whether privately owned or nationally owned, is forbidden.”39
  • Politics lost its character of divine infallibility, and political issues were subject to questioning. Economic policy could become a subject of examination.
  • God first creates with the word and then on individual days He divides light from darkness, water from dry land, day from night, and so forth—and He gives order to things.45 The world is created orderly—it is wisely, reasonably put together. The way of the world is put together so as to be at least partially46 decipherable by any other wise and reasonable being who honors rational rules.
  • This, for the methodology of science and economics, is very important, because disorder and chaos are difficult to examine scientifically.43 Faith in some kind of rational and logical order in a system (society, the economy) is a silent assumption of any (economic) examination.
  • THE PRAISE OF ORDER AND WISDOM: MAN AS A PERFECTER OF CREATION. The created world has an order of sorts, an order recognizable by us as people.
  • From the very beginning, when God distances Himself from the entire idea, there is an anticipation that there is nothing holy, let alone divine, in politics. Rulers make mistakes, and it is possible to subject them to tough criticism—which frequently occurs indiscriminately through the prophets in the Old Testament.
  • Hebrew culture laid the foundations for the scientific examination of the world.
  • Examining the world is therefore an absolutely legitimate activity, and one that is even requested by God—it is a kind of participation in the Creator’s work.51 Man is called on to understand himself and his surroundings and to use his knowledge for good.
  • “I was there when he set heavens in place, when he marked out the horizon on the face of the deep (…) Then I was the craftsman at his side.”47
  • There are more urgings to gain wisdom in the Old Testament. “Wisdom calls aloud in the street (…): ‘How long will you simple ones love your simple ways?’”49 Or several chapters later: “Wisdom is supreme; therefore get wisdom. Though it cost all you have, get understanding.”50
  • examination is not forbidden. The fact that order can be grasped by human reason is another unspoken assumption that serves as a cornerstone of any scientific examination.
  • “Now then, my sons, listen to me; blessed are those who keep my ways (…) Blessed is the man who listens to me, watching daily at my doors, waiting at my doorway. For whoever finds me finds life and receives favor from the Lord.”
  • the rational examination of nature has its roots, surprisingly, in religion.
  • “The Lord brought me forth as the first of his works, before his deeds of old. I was appointed from eternity, from the beginning, before the world began. When there were no oceans, I was given birth, when there were no springs abounding with water, before the mountains were settled in place …”
  • The Book of Proverbs emphasizes specifically several times that it was wisdom that was present at the creation of the world. Wisdom personified calls out:
  • The last act, the final stroke of the brush of creation—the naming of the animals—is given to a human; it is not done by God, as one would expect. Man was given the task of completing the act of creation that the Lord began:
  • MAN AS A FINISHER OF CREATION. The creation of the world, as it is explained in Jewish teachings, is described in the Book of Genesis. Here God (i) creates, (ii) separates, and (iii) names [my emphasis]:
  • Naming is a symbolic expression. In Jewish culture (and also in our culture to this day), the right to name meant sovereign rights and belonged, for example, to explorers (new places), inventors (new principles), or parents (children)—that is, to those who were there at the genesis, at the origin. This right was handed over by God to mankind.
  • The Naming itself (the capital N is appropriate) traditionally belongs to the crowning act of the Creator and represents a kind of grand finale of creation, the last move of the brush to complete the picture—a signature of the master.
  • Without naming, reality does not exist; it is created together with language. Wittgenstein puts this aptly in his Tractatus: the limits of our language are the limits of our world.53
  • He [Newton] invented (fictitiously and completely abstractly!) a framework that was generally accepted and soon “made into” reality. Marx invented similarly; he created the notion of class exploitation. Through his idea, the perception of history and reality was changed for a large part of the world for nearly an entire century.
  • Reality is not a given; it is not passive. Perceiving reality and “facts” requires man’s active participation. It is man who must take the last step, an act.
  • How does this relate to economics? Reality itself, our “objective” world, is cocreated, man himself participates in the creation; creation, which is somewhat constantly being re-created.
  • Our scientific models put the finishing touches on reality, because (1) they interpret, (2) they give phenomena a name, (3) they enable us to classify the world and phenomena according to logical forms, and (4) through these models we de facto perceive reality.
  • When man finds a new linguistic framework or analytical model, or stops using the old one, he molds or remolds reality. Models are only in our heads; they are not “in objective reality.” In this sense, Newton invented (not merely discovered!) gravity.
  • A real-ization act on our part represents the creation of a construct, the imputation of sense and order (which is beautifully expressed by the biblical act of naming, or categorization, sorting, ordering).
  • Keynes enters into the history of economic thought from the same intellectual cadence; his greatest contribution to economics was precisely the resurrection of the imperceptible—for example in the form of animal spirits or uncertainty. The economist Piero Mini even ascribes Keynes’s doubting and rebellious approach to his almost Talmudic education.63
  • God connects man with the task of guarding and protecting the Garden of Eden, and thus man actually cocreates the cultural landscape. The Czech philosopher Zdeněk Neubauer also describes this: “Such is reality, and it is so deep that it willingly crystallizes into worlds. Therefore I profess that reality is a creation and not a place of occurrence for objectively given phenomena.”61
  • In this viewpoint it is possible to see how Jewish thought is mystical—it admits the role of the incomprehensible. Therefore, through its groundedness, Jewish thought indulges mystery and defends itself against a mechanistic-causal explanation of the world: “The Jewish way of thinking, according to Veblen, emphasizes the spiritual, the miraculous, the intangible.”
  • The Jews believed the exact opposite. The world is created by a good God, and evil appears in it as a result of immoral human acts. Evil, therefore, is induced by man.66 History unwinds according to the morality of human acts.
  • What’s more, history seems to be based on morals; morals seem to be the key determining factors of history. For the Hebrews, history proceeds according to how morally its actors behave.
  • The Sumerians believed in dualism—good and evil deities exist, and the earth of people becomes their passive battlefield.
  • GOOD AND EVIL IN US: A MORAL EXPLANATION OF WELL-BEING. We have seen that in the Epic of Gilgamesh, good and evil are not yet addressed systematically on a moral level.
  • This was not about moral-human evil, but rather a kind of natural evil. It is as if good and evil were not touched by morality at all. Evil simply occurred. Period.
  • In the epic, good and evil are not envisaged morally—they are not the result of an (a)moral act. Evil was not associated with free moral action or individual will.
  • Hebrew thought, on the other hand, deals intensively with moral good and evil. A moral dimension touches the core of its stories.65
  • Some economists see the cause of the business cycle in a discrepancy between savings and investment; others are convinced of its monetary essence; still others have blamed sunspots.
  • The entire history of the Jewish nation is interpreted and perceived in terms of morality. Morality has become, so to speak, a mover and shaker of Hebrew history.
  • The Hebrews came up with the idea that morals were behind good and bad years, behind the economic cycle. But we would be getting ahead of ourselves. Pharaoh’s Dream: Joseph and the First Business Cycle.
  • It is the Pharaoh’s well-known dream of seven fat and seven lean cows, which he told to Joseph, the son of Jacob. Joseph interpreted the dream as a macroeconomic prediction of sorts: Seven years of abundance were to be followed by seven years of poverty, famine, and misery.
  • Self-Contradicting Prophecy. Here, let’s make several observations on this: Through taxation74 on the level of one-fifth of a crop75 in good years to save the crop and then open granaries in bad years, the prophecy was de facto prevented (prosperous years were limited and hunger averted—through a predecessor of fiscal stabilization).
  • The Old Testament prophecies therefore were not any deterministic look into the future, but warnings and strategic variations of the possible, which demanded some kind of reaction. If the reaction was adequate, what was prophesied would frequently not occur at all.
  • This principle stands directly against the self-fulfilling prophecy,80 the well-known concept of social science. Certain prophecies become self-fulfilling when expressed (and believed) while others become self-contradicting prophecies when pronounced (and believed).
  • If the threat is anticipated, it is possible to totally or at least partially avoid it. Neither Joseph nor the pharaoh had the power to avoid bounty or crop failure (in this the dream interpretation was true and the appearance of the future mystical), but they avoided the impacts and implications of the prophecy (in this the interpretation of the dream was “false”)—famine did not ultimately occur in Egypt, and this was due to the application of reasonable and very intuitive economic policy.
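The arithmetic of this granary scheme is easy to check with a short simulation. The sketch below is a minimal illustration in Python: only the one-fifth levy and the seven-fat, seven-lean year pattern come from the passage above; the harvest figures and the even release schedule are hypothetical assumptions.

```python
# Minimal sketch of the granary policy described above.
# Assumptions: only the one-fifth levy and the 7 fat / 7 lean year pattern
# come from the text; the yield figures below are hypothetical.

FAT_HARVEST, LEAN_HARVEST = 100, 40  # assumed yields per year (illustrative)
LEVY = 1 / 5                         # the one-fifth tax in good years

granary = 0.0
for year in range(1, 15):            # years 1-7 fat, years 8-14 lean
    if year <= 7:
        harvest = FAT_HARVEST
        saved = harvest * LEVY       # set aside one-fifth of the crop
        granary += saved
        consumption = harvest - saved
    else:
        harvest = LEAN_HARVEST
        released = granary / (15 - year)  # spread stores over remaining lean years
        granary -= released
        consumption = harvest + released
    print(f"year {year:2d}: harvest {harvest:3d}, "
          f"consumption {consumption:5.1f}, granary {granary:6.1f}")
```

With these assumed numbers, consumption never falls to the bare lean-year harvest of 40: the levy trims the prosperous years to 80 and the stores lift each lean year to 60. That smoothing is the sense in which the prophecy, once believed and acted upon, contradicts itself: the predicted famine never arrives.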
  • Let us further note that the first “macroeconomic forecast” appears in a dream.
  • Back to the Torah: Later in this story we will notice that there is no reason offered as to why the cycle occurs (that will come later). Fat years will simply come, and then lean years after them.
  • Moral Explanation of a Business Cycle. That is fundamentally different from later Hebrew interpretations, when the Jewish nation tries to offer reasons why the nation fared well or poorly. And those reasons are moral.
  • “If you pay attention to these laws and are careful to follow them, then the Lord your God will keep his covenant of love with you, as he swore to your forefathers. He will love you and bless you and increase your numbers.”
  • Only in recent times have some currents of economics again become aware of the importance of morals and trust in the form of measuring the quality of institutions, the level of justice, business ethics, corruption, and so forth, and examining their influence on the economy.
  • From today’s perspective, we can state that the moral dimension entirely disappeared from economic thought for a long time, especially due to the implementation of Mandeville’s concept of private vices that contrarily support the public welfare.
  • Without being timid, we can say this is the first documented attempt to explain the economic cycle. The economic cycle, the explanation of which is to this day a mystery to economists, is explained morally in the Old Testament.
  • But how do we reconcile these two conflicting interpretations of the economic cycle: Can ethics be responsible for it or not? Can we influence reality around us through our acts?
  • it is not within the scope of this book to answer that question; the question has been done justice if this book manages to sketch out the main contours of a possible search for answers.
  • THE ECONOMICS OF GOOD AND EVIL: DOES GOOD PAY OFF? This is probably the most difficult moral question we could ask.
  • Kant, the most important modern thinker in the area of ethics, answers to the contrary: if we carry out a “moral” act on the basis of economic calculus (that is, a hedonistic consideration; see below) in the expectation of later recompense, its morality is lost. Recompense, according to the strict Kant, annuls ethics.
  • Inquiring about the economics of good and evil, however, is not that easy. Where would Kant’s “moral dimension of ethics” go if ethics paid? If we do good for profit, the question of ethics becomes a mere question of rationality.
  • Job’s friends try to show that he must have sinned in some way and, in doing so, deserved God’s punishment. They are absolutely unable to imagine a situation in which Job, as a righteous man, would suffer without (moral) cause. Nevertheless, Job insists that he deserves no punishment because he has committed no offense: “God has wronged me and drawn his net around me.”94
  • But Job remains righteous, even though it does not pay to do so: Though he slay me, yet will I hope in him.95 And till I die, I will not deny my integrity. I will maintain my righteousness and never let go of it; my conscience will not reproach me as long as I live.96
  • He remains righteous, even if his only reward is death. What economic advantage could he have from that?
  • morals cannot be considered in the economic dimension of productivity and calculus. The role of the Hebrews was to do good, whether it paid off or not. If good (outgoing) is rewarded by incoming goodness, it is a bonus,99 not a reason to do outgoing good. Good and reward do not correlate with each other.
  • This reasoning takes on a dimension of its own in the Old Testament. Good (incoming) has already happened to us. We must do good (outgoing) out of gratitude for the good (incoming) shown to us in the past.
  • So why do good? After all, suffering is the fate of many biblical figures. The answer can only be: For good itself. Good has the power to be its own reward. In this sense, goodness gets its reward, which may or may not take on a material dimension.
  • the Hebrews offered an interesting compromise between the teachings of the Stoics and Epicureans. We will go into it in detail later, so only briefly
  • constraint. It calls for bounded optimization (with limits). A kind of symbiosis existed between the legitimate search for one’s own utility (or enjoyment of life) and the maintenance of rules, which are not negotiable and which are not subject to optimization.
  • In other words, clear (exogenously given) rules exist that must be observed and cannot be contravened. But within these borders it is absolutely possible, and even recommended, to increase utility.
  • the mining of enjoyment must not come at the expense of exogenously given rules. “Judaism comes therefore to train or educate the unbounded desire … for wealth, so that market activities and patterns of consumption operate within a God-given morality.”102
  • The Epicureans acted with the goal of maximizing utility without regard for rules (rules developed endogenously, from within the system, computed from that which increased utility—this was one of the main trumps of the Epicurean school; they did not need exogenously given norms, and argued that they could “calculate” ethics (what to do) for every given situation from the situation itself).
  • The Stoics could not seek their enjoyment—or, by another name, utility. They could not in any way have regard for it, and in no way could they count on it. They could only live according to rules (the greatest weakness of this school was to defend where the exogenously given rules came from and whether they were universal) and take an indifferent stand toward the results of their actions.
  • To Love the Law The Jews not only had to observe the law (perhaps the word covenant would be more appropriate), but they were to love it because it was good.
  • Their relationship to the law was not supposed to be one of duty,105 but one of gratitude, love. Hebrews were to do good (outgoing), because goodness (incoming) has already been done to them.
  • This is in stark contrast with today’s legal system, where, naturally, no mention of love or gratefulness exists. But God expects a full internalization of the commandments and their fulfillment out of love, not merely out of duty. By no means was this to be on the basis of the cost-benefit analyses so widespread in economics today, which determine when it pays to break the law and when it does not (calculated on the basis of the probability of being caught and the amount of punishment vis-à-vis the possible gain).
  • And now, O Israel, what does the Lord your God ask of you but to fear the Lord your God, to walk in all his ways, to love him, to serve the Lord your God with all your heart and with all your soul, and to observe the Lord’s commands and decrees that I am giving you today for your own good? To the Lord your God belong the heavens, even the highest heavens, the earth and everything in it. Yet the Lord set his affection on your forefathers and loved them….
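The cost-benefit calculus mentioned just above is usually formalized along the lines of Gary Becker's economics of crime: a purely "rational" agent breaks a rule whenever the expected gain exceeds the probability of detection times the punishment. A minimal sketch of that logic, with invented numbers, makes plain what the Hebrew ideal of loving the law rejects:

```python
def pays_to_break_the_law(gain: float, p_caught: float, penalty: float) -> bool:
    """Becker-style calculus: break the rule iff the gain exceeds the
    expected punishment (probability of being caught times the penalty).
    All figures below are invented for illustration."""
    return gain > p_caught * penalty

print(pays_to_break_the_law(gain=1000, p_caught=0.1, penalty=5000))  # True:  1000 > 500
print(pays_to_break_the_law(gain=1000, p_caught=0.5, penalty=5000))  # False: 1000 < 2500
```

On this reasoning, obedience is merely a price calculation; the passage above insists that the commandments were to be internalized regardless of how that calculation comes out.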
  • the principle of doing good (outgoing) on the basis of a priori demonstrated good (incoming) was also taken over by the New Testament. Atonement itself is based on an a priori principle; all our acts are preceded by good.
  • The Hebrews, originally a nomadic tribe, preferred to be unrestrained and grew up in constant freedom of motion.
  • Human laws, if they are in conflict with the responsibilities given by God, are subordinate to personal responsibility, and a Jew cannot simply join the majority, even if it is legally allowed. Ethics, the concept of good, is therefore always superior to all local laws, rules, and customs:
  • THE SHACKLES OF THE CITY Owing to the Hebrews’ liberation from Egyptian slavery, freedom and responsibility become the key values of Jewish thought.
  • Laws given by God are binding for Jews, and God is the absolute source of all values,
  • The Hebrew ideal is represented by the paradise of the Garden of Eden, not a city.116 The despised city civilization or the tendency to see in it a sinful and shackling way of life appears in glimpses and allusions in many places in the Old Testament.
  • The nomadic Jewish ethos is frequently derived from Abraham, who left the Chaldean city of Ur on the basis of a command:
  • In addition, they were aware of a thin two-way line between owner and owned. We own material assets, but—to a certain extent—they own us and tie us down. Once we become used to a certain material
  • This way of life had understandably immense economic impacts. First, such a society lived in much more connected relationships, where there was no doubt that everyone mutually depended on each other. Second, their frequent wanderings meant the inability to own more than they could carry; the gathering up of material assets did not carry great weight—precisely because the physical weight (mass) of things ties their owner down to one place.
  • One of Moses’s greatest deeds was that he managed to explain to his nation once and for all that it is better to remain hungry and liberated than to be a slave with food “at no cost.”
  • SOCIAL WELFARE: NOT TO ACT IN THE MANNER OF SODOM
  • regulations is developed in the Old Testament, one we hardly find in any other nation of the time. In Hebrew teachings, aside from individual utility, indications of the concept of maximizing utility society-wide appear for the first time, embodied in the Talmudic principle of Kofin al midat S’dom, which can be translated as “one is compelled not to act in the manner of Sodom” and to take care of the weaker members of society.
  • In a jubilee year, debts were to be forgiven,125 and Israelites who fell into slavery due to their indebtedness were to be set free.126
  • Such provisions can be seen as the antimonopoly and social measures of the time. The economic system even then had a clear tendency to converge toward asset concentration, and therefore power as well. It would appear that these provisions were supposed to prevent this process
  • Land at the time could be “sold,” but it was not a sale so much as a rental. The price (rent) of real estate depended on how long remained until a forgiveness year. Behind this was the awareness that we may work the land, but in the last instance we are merely “aliens and strangers,” to whom the land is only rented for a fixed time. All land and riches came from the Lord.
  • These provisions express a conviction that freedom and inheritance should not be permanently taken away from any Israelite. Last but not least, this system reminds us that no ownership lasts forever and that the fields we plow are not ours but the Lord’s.
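The pricing rule described above can be read as valuing a finite stream of harvests rather than the land itself: the closer the jubilee, the fewer crop years change hands, and the lower the price. A minimal sketch, assuming a flat annual yield (the proportional rule is from the passage; the cycle length and figures are invented for illustration):

```python
JUBILEE_CYCLE = 50  # assumed years between forgiveness years

def land_price(annual_yield_value: float, years_since_jubilee: int) -> float:
    """'Sale' price of land = value of the harvests remaining until the
    jubilee, when the land returns to its original family. The buyer is
    effectively renting crop years, not buying the land outright."""
    years_remaining = JUBILEE_CYCLE - years_since_jubilee
    return annual_yield_value * years_remaining

print(land_price(annual_yield_value=10.0, years_since_jubilee=40))  # 100.0 (10 harvests left)
print(land_price(annual_yield_value=10.0, years_since_jubilee=10))  # 400.0 (40 harvests left)
```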
  • Glean Another social provision was the right to glean, which in Old Testament times ensured at least basic sustenance for the poorest. Anyone who owned a field had the responsibility not to harvest it to the last grain but to leave the remains in the field for the poor.
  • Tithes and Early Social Net Every Israelite also had the responsibility of levying a tithe from their entire crop. They had to be aware from whom all ownership comes and, by doing so, express their thanks.
  • “Since the community has an obligation to provide food, shelter, and basic economic goods for the needy, it has a moral right and duty to tax its members for this purpose. In line with this duty, it may have to regulate markets, prices and competition, to protect the interests of its weakest members.”135
  • In Judaism, charity is not perceived as a sign of goodness; it is more of a responsibility. Such a society then has the right to regulate its economy in such a way that the responsibility of charity is carried out to its satisfaction.
  • With a number of responsibilities, however, comes the difficulty of getting them into practice. Their fulfillment, then, in cases when it can be done, takes place gradually “in layers.” Charitable activities are classified in the Talmud according to several target groups with various priorities, classified according to, it could be said, rules of subsidiarity.
  • Do not mistreat an alien or oppress him, for you were aliens in Egypt.140 As one can see, aside from widows and orphans, the Old Testament also includes immigrants in its area of social protection.141 The Israelites had to have the same rules apply for them as for themselves—they could not discriminate on the basis of their origin.
  • ABSTRACT MONEY, FORBIDDEN INTEREST, AND OUR DEBT AGE If it appears to us that today’s era is based on money and debt, and our time will be written into history as the “Debt age,” then it will certainly be interesting to follow how this development occurred.
  • Money is a social abstractum. It is a social agreement, an unwritten contract.
  • The first money came in the form of clay tablets from Mesopotamia, on which debts were written. These debts were transferable, so the debts became currency. In the end, “It is no coincidence that in English the root of ‘credit’ is ‘credo,’ the Latin for ‘I believe.’”
  • To a certain extent it could be said that credit, or trust, was the first currency. It can materialize, it can be embodied in coins, but what is certain is that “money is not metal,” even the rarest metal, “it is trust inscribed,”
  • Inseparably, with the original credit (money) goes interest. For the Hebrews, the problem of interest was a social issue: “If you lend money to one of my people among you who is needy, do not be like a moneylender; charge him no interest.”
  • there were also clearly set rules on how far one could go in demanding guarantees and on the nonpayment of debts. No one should become indebted to the extent that they could lose the source of their livelihood:
  • In the end, the term “bank” comes from the Italian banchi, the benches that Jewish lenders sat on.157
  • Money is playing not only its classical roles (as a means of exchange, a holder of value, etc.) but also a much greater, stronger role: It can stimulate, drive (or slow down) the whole economy. Money plays a national economic role.
  • In the course of history, however, the role of loans changed, and the rich borrowed especially for investment purposes,
  • Today the position and significance of money and debt has gone so far and reached such a dominant position in society that operating with debts (fiscal policy) or interest or money supply (monetary policy) means that these can, to a certain extent, direct (or at least strongly influence) the whole economy and society.
  • In such a case a ban on interest did not have great ethical significance. Thomas Aquinas, a medieval scholar (1225-1274), reasoned similarly; in his time, the strict ban on lending at usurious interest was loosened, possibly due to his influence.
  • As a form of energy, money can travel in three dimensions, vertically (those who have capital lend to those who do not) and horizontally (speed and freedom in horizontal or geographic motion has become the by-product—or driving force?—of globalization). But money (as opposed to people) can also travel through time.
  • money is something like energy that can travel through time. And it is a very useful energy, but at the same time very dangerous as well.
  • Aristotle condemned interest162 not only from a moral standpoint, but also for metaphysical reasons. Thomas Aquinas shared the same fear of interest and he too argued that time does not belong to us, and that is why we must not require interest.
  • MONEY AS ENERGY: TIME TRAVEL AND GROSS DEBT PRODUCT (GDP)
  • Due to this characteristic, we can energy-strip the future to the benefit of the present. Debt can transfer energy from the future to the present.163 On the other hand, saving can accumulate energy from the past and send it to the present.
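The "time travel" metaphor maps directly onto the ordinary arithmetic of compounding and discounting: saving sends value forward in time, debt pulls future value into the present, and interest is the fare charged in both directions. A minimal sketch, assuming a single constant interest rate (the rate and amounts are invented):

```python
RATE = 0.05  # assumed constant annual interest rate

def future_value(amount_now: float, years: int) -> float:
    """Saving: accumulate present 'energy' and send it to the future;
    it grows by compound interest."""
    return amount_now * (1 + RATE) ** years

def present_value(amount_later: float, years: int) -> float:
    """Debt: strip 'energy' from the future for the benefit of the present;
    discounting shrinks it by the same compound factor."""
    return amount_later / (1 + RATE) ** years

print(round(future_value(100, 10), 2))   # 162.89: savings grown over a decade
print(round(present_value(100, 10), 2))  # 61.39: what a future 100 is worth today
```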
  • labor was not considered degrading in the Old Testament. On the contrary, the subjugation of nature is even a mission from God that originally belonged to man’s very first blessings.
  • LABOR AND REST: THE SABBATH ECONOMY
  • The Jews as well as Aristotle behaved very guardedly toward loans. The issue of interest/usury became one of the first economic debates. Without having an inkling of the future role of economic policy (fiscal and monetary), the ancient Hebrews may have unwittingly felt that they were discovering in interest a very powerful weapon, one that can be a good servant, but (literally) an enslaving master as well.
  • It’s something like a dam. When we build one, we are preventing periods of drought and flooding in the valley; we are limiting nature’s whims and, to a large extent, avoiding its incalculable cycles. Using dams, we can regulate the flow of water to nearly a constant. With it we tame the river (and we can also gain
  • But if we do not regulate the water wisely, it may happen that we would overfill the dam and it would break. For the cities lying in the valley, their end would be worse than if a dam were never there.
  • If man lived in harmony with nature before, now, after the fall, he must fight; nature stands against him and he against it and the animals. From the Garden we have moved unto a (battle)field.
  • Only after man’s fall does labor turn into a curse.168 It could even be said that this is actually the only curse, the curse of the unpleasantness of labor, that the Lord places on Adam.
  • Both Plato and Aristotle consider labor to be necessary for survival, but that only the lower classes should devote themselves to it so that the elites would not have to be bothered with it and so that they could devote themselves to “purely spiritual matters—art, philosophy, and politics.”
  • Work is not only a source of pleasure but also of social standing; it is considered an honor. “Do you see a man skilled in his work? He will serve before kings.”170 None of the surrounding cultures appreciated work as much. The idea of the dignity of labor is unique in the Hebrew tradition.
  • Hebrew thinking is characterized by a strict separation of the sacred from the profane. In life, there are simply areas that are holy, and in which it is not allowed to economize, rationalize, or maximize efficiency.
  • A good example is the commandment on the Sabbath. No one at all could work on this day, not even those who were subordinate to an observant Jew:
  • the message of the commandment on Saturday communicated that people were not primarily created for labor.
  • Paradoxically, it is precisely this commandment out of all ten that is probably the most violated today.
  • Aristotle even considers labor to be “a corrupted waste of time which only burdens people’s path to true honour.”
  • we have days when we must not toil that are connected (at least lexically) with the word meaning emptiness: the English term “vacation” (an emptying), as with the French term les vacances, or the German die Freizeit, meaning open time, free time, but also…
  • Translated into economic language: The meaning of utility is not to increase it permanently but to rest among existing gains. Why do we learn how to constantly increase gains but not how to…
  • This dimension has disappeared from today’s economics. Economic effort has no goal at which it would be possible to rest. Today we only know growth for growth’s sake, and if our company or country prospers, that does not…
  • For six-sevenths of time, be dissatisfied and reshape the world into your own image, man; but for one-seventh you will rest and not change the creation. On the seventh day, enjoy creation and enjoy the work of your hands.
  • the purpose of creation was not just creating but that it had an end, a goal. The process was just a process, not a purpose. The whole of Being was created so…
  • Saturday was not established to increase efficiency. It was a real ontological break that followed the example of the Lord’s seventh day of creation. The Lord did not rest due to tiredness or to regenerate strength, but because He was done. He was done with His work, so that He could enjoy it, delight in His creation.
  • If we believe in rest at all today, it is for different reasons. It is the rest of the exhausted machine, the rest of the weak, and the rest of those who can’t handle the tempo. It’s no wonder that the word “rest…
  • Related to this, we have studied the first mention of a business cycle with the pharaoh’s dream as well as seen a first attempt (that we may call…
  • We have tried to show that the quest for a heaven on Earth (similar to the Jewish one) has, in its desacralized form, actually also been the same quest for many of the…
  • We have also seen that the Hebrews tried to explain the business cycle with morality and ethics. For the Hebrews,…
  • ancient Greek economic ethos, we will examine two extreme approaches to laws and rules. While the Stoics considered laws to be absolutely valid, and utility had infinitesimal meaning in their philosophy, the Epicureans, at least in the usual historical explanation, placed utility and pleasure in first place—rules were to be made based on the principle of utility.
  • CONCLUSION: BETWEEN UTILITY AND PRINCIPLE The influence of Jewish thought on the development of market democracy cannot be overestimated. The key heritage for us was the lack of ascetic perception of the world, respect for law and private…
  • We have tried to show how the Torah desacralized three important areas in our lives: the earthly ruler, nature,…
  • What is the relationship between the good and evil that we do (outgoing) and the utility or disutility that we (expect to) get as a reward (incoming)? We have seen…
  • The Hebrews never despised material wealth; on the contrary, the Jewish faith puts great responsibility on property management. Also, the idea of progress and the linear perception of time gives our (economic)…
  • the Hebrews managed to find something of a happy compromise between both of these principles.
  • we will not be able to completely understand the development of the modern notion of economics without understanding the disputes between the Epicureans and the Stoics;
  • poets actually went even further, and with their speech they shaped and established reality and truth. Honor, adventure, great deeds, and the acclaim connected with them played an important role in the establishment of the true, the real.
  • those who are famous will be remembered by people. They become more real, part of the story, and they start to be “realized,” “made real” in the lives of other people. That which is stored in memory is real; that which is forgotten is as if it never existed.
  • Today’s scientific truth is founded on the notion of exact and objective facts, but poetic truth stands on an interior (emotional) consonance with the story or poem. “It is not addressed first to the brain … [myth] talks directly to the feeling system.”
  • “epic and tragic poets were widely assumed to be the central ethical thinkers and teachers of Greece; nobody thought of their work as less serious, less aimed at truth, than the speculative prose treatises of historians and philosophers.”5 Truth and reality were hidden in speech, stories, and narration.
  • Ancient philosophy, just as science would later, tries to find constancy, constants, quantities, inalterabilities. Science seeks (creates?) order and neglects everything else as much as it can. From their own experience, everyone knows that life is not like that,
  • Just as scientists do today, artists drew images of the world that were representative, and therefore symbolic, picturelike, and simplifying (but thus also misleading), just like scientific models, which often do not strive to be “realistic.”
  • general? In the end, poetry could be more sensitive to the truth than the philosophical method or, later, the scientific method. “Tragic poems, in virtue of their subject matter and their social function, are likely to confront and explore problems about human beings and luck that a philosophical text might be able to omit or avoid.”8
Javier E

At the Existentialist Café: Freedom, Being, and Apricot Cocktails with Jean-P... - 0 views

  • The phenomenologists’ leading thinker, Edmund Husserl, provided a rallying cry, ‘To the things themselves!’ It meant: don’t waste time on the interpretations that accrue upon things, and especially don’t waste time wondering whether the things are real. Just look at this that’s presenting itself to you, whatever this may be, and describe it as precisely as possible.
  • You might think you have defined me by some label, but you are wrong, for I am always a work in progress. I create myself constantly through action, and this is so fundamental to my human condition that, for Sartre, it is the human condition, from the moment of first consciousness to the moment when death wipes it out. I am my own freedom: no more, no less.
  • Sartre wrote like a novelist — not surprisingly, since he was one. In his novels, short stories and plays as well as in his philosophical treatises, he wrote about the physical sensations of the world and the structures and moods of human life. Above all, he wrote about one big subject: what it meant to be free. Freedom, for him, lay at the heart of all human experience, and this set humans apart from all other kinds of object.
  • Sartre listened to his problem and said simply, ‘You are free, therefore choose — that is to say, invent.’ No signs are vouchsafed in this world, he said. None of the old authorities can relieve you of the burden of freedom. You can weigh up moral or practical considerations as carefully as you like, but ultimately you must take the plunge and do something, and it’s up to you what that something is.
  • Even if the situation is unbearable — perhaps you are facing execution, or sitting in a Gestapo prison, or about to fall off a cliff — you are still free to decide what to make of it in mind and deed. Starting from where you are now, you choose. And in choosing, you also choose who you will be.
  • The war had made people realise that they and their fellow humans were capable of departing entirely from civilised norms; no wonder the idea of a fixed human nature seemed questionable.
  • If this sounds difficult and unnerving, it’s because it is. Sartre does not deny that the need to keep making decisions brings constant anxiety. He heightens this anxiety by pointing out that what you do really matters. You should make your choices as though you were choosing on behalf of the whole of humanity, taking the entire burden of responsibility for how the human race behaves. If you avoid this responsibility by fooling yourself that you are the victim of circumstance or of someone else’s bad advice, you are failing to meet the demands of human life and choosing a fake existence, cut off from your own ‘authenticity’.
  • Along with the terrifying side of this comes a great promise: Sartre’s existentialism implies that it is possible to be authentic and free, as long as you keep up the effort.
  • almost all agreed that it was, as an article in Les nouvelles littéraires phrased it, a ‘sickening mixture of philosophic pretentiousness, equivocal dreams, physiological technicalities, morbid tastes and hesitant eroticism … an introspective embryo that one would take distinct pleasure in crushing’.
  • he offered a philosophy designed for a species that had just scared the hell out of itself, but that finally felt ready to grow up and take responsibility.
  • In this rebellious world, just as with the Parisian bohemians and Dadaists in earlier generations, everything that was dangerous and provocative was good, and everything that was nice or bourgeois was bad.
  • Such interweaving of ideas and life had a long pedigree, although the existentialists gave it a new twist. Stoic and Epicurean thinkers in the classical world had practised philosophy as a means of living well, rather than of seeking knowledge or wisdom for their own sake. By reflecting on life’s vagaries in philosophical ways, they believed they could become more resilient, more able to rise above circumstances, and better equipped to manage grief, fear, anger, disappointment or anxiety.
  • In the tradition they passed on, philosophy is neither a pure intellectual pursuit nor a collection of cheap self-help tricks, but a discipline for flourishing and living a fully human, responsible life.
  • For Kierkegaard, Descartes had things back to front. In his own view, human existence comes first: it is the starting point for everything we do, not the result of a logical deduction. My existence is active: I live it and choose it, and this precedes any statement I can make about myself.
  • Studying our own moral genealogy cannot help us to escape or transcend ourselves. But it can enable us to see our illusions more clearly and lead a more vital, assertive existence.
  • What was needed, he felt, was not high moral or theological ideals, but a deeply critical form of cultural history or ‘genealogy’ that would uncover the reasons why we humans are as we are, and how we came to be that way. For him, all philosophy could even be redefined as a form of psychology, or history.
  • For those oppressed on grounds of race or class, or for those fighting against colonialism, existentialism offered a change of perspective — literally, as Sartre proposed that all situations be judged according to how they appeared in the eyes of those most oppressed, or those whose suffering was greatest.
  • She observed that we need not expect moral philosophers to ‘live by’ their ideas in a simplistic way, as if they were following a set of rules. But we can expect them to show how their ideas are lived in. We should be able to look in through the windows of a philosophy, as it were, and see how people occupy it, how they move about and how they conduct themselves.
  • the existentialists inhabited their historical and personal world, as they inhabited their ideas. This notion of ‘inhabited philosophy’ is one I’ve borrowed from the English philosopher and novelist Iris Murdoch, who wrote the first full-length book on Sartre and was an early adopter of existentialism
  • What is existentialism anyway?
  • An existentialist who is also phenomenological provides no easy rules for dealing with this condition, but instead concentrates on describing lived experience as it presents itself. — By describing experience well, he or she hopes to understand this existence and awaken us to ways of living more authentic lives.
  • Existentialists concern themselves with individual, concrete human existence. — They consider human existence different from the kind of being other things have. Other entities are what they are, but as a human I am whatever I choose to make of myself at every moment. I am free — — and therefore I’m responsible for everything I do, a dizzying fact which causes — an anxiety inseparable from human existence itself.
  • On the other hand, I am only free within situations, which can include factors in my own biology and psychology as well as physical, historical and social variables of the world into which I have been thrown. — Despite the limitations, I always want more: I am passionately involved in personal projects of all kinds. — Human existence is thus ambiguous: at once boxed in by borders and yet transcendent and exhilarating. —
  • The first part of this is straightforward: a phenomenologist’s job is to describe. This is the activity that Husserl kept reminding his students to do. It meant stripping away distractions, habits, clichés of thought, presumptions and received ideas, in order to return our attention to what he called the ‘things themselves’. We must fix our beady gaze on them and capture them exactly as they appear, rather than as we think they are supposed to be.
  • Husserl therefore says that, to phenomenologically describe a cup of coffee, I should set aside both the abstract suppositions and any intrusive emotional associations. Then I can concentrate on the dark, fragrant, rich phenomenon in front of me now. This ‘setting aside’ or ‘bracketing out’ of speculative add-ons Husserl called epoché — a term borrowed from the ancient Sceptics,
  • The point about rigour is crucial; it brings us back to the first half of the command to describe phenomena. A phenomenologist cannot get away with listening to a piece of music and saying, ‘How lovely!’ He or she must ask: is it plaintive? is it dignified? is it colossal and sublime? The point is to keep coming back to the ‘things themselves’ — phenomena stripped of their conceptual baggage — so as to bail out weak or extraneous material and get to the heart of the experience.
  • Husserlian ‘bracketing out’ or epoché allows the phenomenologist to temporarily ignore the question ‘But is it real?’, in order to ask how a person experiences his or her world. Phenomenology gives a formal mode of access to human experience. It lets philosophers talk about life more or less as non-philosophers do, while still being able to tell themselves they are being methodical and rigorous.
  • Besides claiming to transform the way we think about reality, phenomenologists promised to change how we think about ourselves. They believed that we should not try to find out what the human mind is, as if it were some kind of substance. Instead, we should consider what it does, and how it grasps its experiences.
  • For Brentano, this reaching towards objects is what our minds do all the time. Our thoughts are invariably of or about something, he wrote: in love, something is loved, in hatred, something is hated, in judgement, something is affirmed or denied. Even when I imagine an object that isn’t there, my mental structure is still one of ‘about-ness’ or ‘of-ness’.
  • Except in deepest sleep, my mind is always engaged in this aboutness: it has ‘intentionality’. Having taken the germ of this from Brentano, Husserl made it central to his whole philosophy.
  • Husserl saw in the idea of intentionality a way to sidestep two great unsolved puzzles of philosophical history: the question of what objects ‘really’ are, and the question of what the mind ‘really’ is. By doing the epoché and bracketing out all consideration of reality from both topics, one is freed to concentrate on the relationship in the middle. One can apply one’s descriptive energies to the endless dance of intentionality that takes place in our lives: the whirl of our minds as they seize their intended phenomena one after the other and whisk them around the floor,
  • Understood in this way, the mind hardly is anything at all: it is its aboutness. This makes the human mind (and possibly some animal minds) different from any other naturally occurring entity. Nothing else can be as thoroughly about or of things as the mind is:
  • Some Eastern meditation techniques aim to still this scurrying creature, but the extreme difficulty of this shows how unnatural it is to be mentally inert. Left to itself, the mind reaches out in all directions as long as it is awake — and even carries on doing it in the dreaming phase of its sleep.
  • a mind that is experiencing nothing, imagining nothing, or speculating about nothing can hardly be said to be a mind at all.
  • Three simple ideas — description, phenomenon, intentionality — provided enough inspiration to keep roomfuls of Husserlian assistants busy in Freiburg for decades. With all of human existence awaiting their attention, how could they ever run out of things to do?
  • For Sartre, this gives the mind an immense freedom. If we are nothing but what we think about, then no predefined ‘inner nature’ can hold us back. We are protean.
  • way of this interpretation. Real, not real; inside, outside; what difference did it make? Reflecting on this, Husserl began turning his phenomenology into a branch of ‘idealism’ — the philosophical tradition which denied external reality and defined everything as a kind of private hallucination.
  • For Sartre, if we try to shut ourselves up inside our own minds, ‘in a nice warm room with the shutters closed’, we cease to exist. We have no cosy home: being out on the dusty road is the very definition of what we are.
  • One might think that, if Heidegger had anything worth saying, he could have communicated it in ordinary language. The fact is that he does not want to be ordinary, and he may not even want to communicate in the usual sense. He wants to make the familiar obscure, and to vex us. George Steiner thought that Heidegger’s purpose was less to be understood than to be experienced through a ‘felt strangeness’.
  • He takes Dasein in its most ordinary moments, then talks about it in the most innovative way he can. For Heidegger, Dasein’s everyday Being is right here: it is Being-in-the-world, or In-der-Welt-sein. The main feature of Dasein’s everyday Being-in-the-world right here is that it is usually busy doing something.
  • Thus, for Heidegger, all Being-in-the-world is also a ‘Being-with’ or Mitsein. We cohabit with others in a ‘with-world’, or Mitwelt. The old philosophical problem of how we prove the existence of other minds has now vanished. Dasein swims in the with-world long before it wonders about other minds.
  • Sometimes the best-educated people were those least inclined to take the Nazis seriously, dismissing them as too absurd to last. Karl Jaspers was one of those who made this mistake, as he later recalled, and Beauvoir observed similar dismissive attitudes among the French students in Berlin.
  • In any case, most of those who disagreed with Hitler’s ideology soon learned to keep their view to themselves. If a Nazi parade passed on the street, they would either slip out of view or give the obligatory salute like everyone else, telling themselves that the gesture meant nothing if they did not believe in it. As the psychologist Bruno Bettelheim later wrote of this period, few people will risk their life for such a small thing as raising an arm — yet that is how one’s powers of resistance are eroded away, and eventually one’s responsibility and integrity go with them.
  • for Arendt, if you do not respond adequately when the times demand it, you show a lack of imagination and attention that is as dangerous as deliberately committing an abuse. It amounts to disobeying the one command she had absorbed from Heidegger in those Marburg days: Think!
  • ‘Everything takes place under a kind of anaesthesia. Objectively dreadful events produce a thin, puny emotional response. Murders are committed like schoolboy pranks. Humiliation and moral decay are accepted like minor incidents.’ Haffner thought modernity itself was partly to blame: people had become yoked to their habits and to mass media, forgetting to stop and think, or to disrupt their routines long enough to question what was going on.
  • Heidegger’s former lover and student Hannah Arendt would argue, in her 1951 study The Origins of Totalitarianism, that totalitarian movements thrived at least partly because of this fragmentation in modern lives, which made people more vulnerable to being swept away by demagogues. Elsewhere, she coined the phrase ‘the banality of evil’ to describe the most extreme failures of personal moral awareness.
  • His communicative ideal fed into a whole theory of history: he traced all civilisation to an ‘Axial Period’ in the fifth century BC, during which philosophy and culture exploded simultaneously in Europe, the Middle East and Asia, as though a great bubble of minds had erupted from the earth’s surface. ‘True philosophy needs communion to come into existence,’ he wrote, and added, ‘Uncommunicativeness in a philosopher is virtually a criterion of the untruth of his thinking.’
  • The idea of being called to authenticity became a major theme in later existentialism, the call being interpreted as saying something like ‘Be yourself!’, as opposed to being phony. For Heidegger, the call is more fundamental than that. It is a call to take up a self that you didn’t know you had: to wake up to your Being. Moreover, it is a call to action. It requires you to do something: to take a decision of some sort.
  • Being and Time contained at least one big idea that should have been of use in resisting totalitarianism. Dasein, Heidegger wrote there, tends to fall under the sway of something called das Man or ‘the they’ — an impersonal entity that robs us of the freedom to think for ourselves. To live authentically requires resisting or outwitting this influence, but this is not easy because das Man is so nebulous. Man in German does not mean ‘man’ as in English (that’s der Mann), but a neutral abstraction, something like ‘one’ in the English phrase ‘one doesn’t do that’,
  • for Heidegger, das Man is me. It is everywhere and nowhere; it is nothing definite, but each of us is it. As with Being, it is so ubiquitous that it is difficult to see. If I am not careful, however, das Man takes over the important decisions that should be my own. It drains away my responsibility or ‘answerability’. As Arendt might put it, we slip into banality, failing to think.
  • Jaspers focused on what he called Grenzsituationen — border situations, or limit situations. These are the moments when one finds oneself constrained or boxed in by what is happening, but at the same time pushed by these events towards the limits or outer edge of normal experience. For example, you might have to make a life-or-death choice, or something might remind you suddenly of your mortality,
  • Jaspers’ interest in border situations probably had much to do with his own early confrontation with mortality. From childhood, he had suffered from a heart condition so severe that he always expected to die at any moment. He also had emphysema, which forced him to speak slowly, taking long pauses to catch his breath. Both illnesses meant that he had to budget his energies with care in order to get his work done without endangering his life.
  • If I am to resist das Man, I must become answerable to the call of my ‘voice of conscience’. This call does not come from God, as a traditional Christian definition of the voice of conscience might suppose. It comes from a truly existentialist source: my own authentic self. Alas, this voice is one I do not recognise and may not hear, because it is not the voice of my habitual ‘they-self’. It is an alien or uncanny version of my usual voice. I am familiar with my they-self, but not with my unalienated voice — so, in a weird twist, my real voice is the one that sounds strangest to me.
  • Marcel developed a strongly theological branch of existentialism. His faith distanced him from both Sartre and Heidegger, but he shared a sense of how history makes demands on individuals. In his essay ‘On the Ontological Mystery’, written in 1932 and published in the fateful year of 1933, Marcel wrote of the human tendency to become stuck in habits, received ideas, and a narrow-minded attachment to possessions and familiar scenes. Instead, he urged his readers to develop a capacity for remaining ‘available’ to situations as they arise. Similar ideas of disponibilité or availability had been explored by other writers,
  • Marcel made it his central existential imperative. He was aware of how rare and difficult it was. Most people fall into what he calls ‘crispation’: a tensed, encrusted shape in life — ‘as though each one of us secreted a kind of shell which gradually hardened and imprisoned him’.
  • Bettelheim later observed that, under Nazism, only a few people realised at once that life could not continue unaltered: these were the ones who got away quickly. Bettelheim himself was not among them. Caught in Austria when Hitler annexed it, he was sent first to Dachau and then to Buchenwald, but was then released in a mass amnesty to celebrate Hitler’s birthday in 1939 — an extraordinary reprieve, after which he left at once for America.
  • we are used to reading philosophy as offering a universal message for all times and places — or at least as aiming to do so. But Heidegger disliked the notion of universal truths or universal humanity, which he considered a fantasy. For him, Dasein is not defined by shared faculties of reason and understanding, as the Enlightenment philosophers thought. Still less is it defined by any kind of transcendent eternal soul, as in religious tradition. We do not exist on a higher, eternal plane at all. Dasein’s Being is local: it has a historical situation, and is constituted in time and place.
  • For Marcel, learning to stay open to reality in this way is the philosopher’s prime job. Everyone can do it, but the philosopher is the one who is called on above all to stay awake, so as to be the first to sound the alarm if something seems wrong.
  • Second, it also means understanding that we are historical beings, and grasping the demands our particular historical situation is making on us. In what Heidegger calls ‘anticipatory resoluteness’, Dasein discovers ‘that its uttermost possibility lies in giving itself up’. At that moment, through Being-towards-death and resoluteness in facing up to one’s time, one is freed from the they-self and attains one’s true, authentic self.
  • If we are temporal beings by our very nature, then authentic existence means accepting, first, that we are finite and mortal. We will die: this all-important realisation is what Heidegger calls authentic ‘Being-towards-Death’, and it is fundamental to his philosophy.
  • Hannah Arendt, instead, left early on: she had the benefit of a powerful warning. Just after the Nazi takeover, in spring 1933, she had been arrested while researching materials on anti-Semitism for the German Zionist Organisation at Berlin’s Prussian State Library. Her apartment was searched; both she and her mother were locked up briefly, then released. They fled, without stopping to arrange travel documents. They crossed to Czechoslovakia (then still safe) by a method that sounds almost too fabulous to be true: a sympathetic German family on the border had a house with its front door in Germany and its back door in Czechoslovakia. The family would invite people for dinner, then let them leave through the back door at night.
  • As Sartre argued in his 1943 review of The Stranger, basic phenomenological principles show that experience comes to us already charged with significance. A piano sonata is a melancholy evocation of longing. If I watch a soccer match, I see it as a soccer match, not as a meaningless scene in which a number of people run around taking turns to apply their lower limbs to a spherical object. If the latter is what I’m seeing, then I am not watching some more essential, truer version of soccer; I am failing to watch it properly as soccer at all.
  • Much as they liked Camus personally, neither Sartre nor Beauvoir accepted his vision of absurdity. For them, life is not absurd, even when viewed on a cosmic scale, and nothing can be gained by saying it is. Life for them is full of real meaning, although that meaning emerges differently for each of us.
  • For Sartre, we show bad faith whenever we portray ourselves as passive creations of our race, class, job, history, nation, family, heredity, childhood influences, events, or even hidden drives in our subconscious which we claim are out of our control. It is not that such factors are unimportant: class and race, in particular, he acknowledged as powerful forces in people’s lives, and Simone de Beauvoir would soon add gender to that list.
  • Sartre takes his argument to an extreme point by asserting that even war, imprisonment or the prospect of imminent death cannot take away my existential freedom. They form part of my ‘situation’, and this may be an extreme and intolerable situation, but it still provides only a context for whatever I choose to do next. If I am about to die, I can decide how to face that death. Sartre here resurrects the ancient Stoic idea that I may not choose what happens to me, but I can choose what to make of it, spiritually speaking.
  • But the Stoics cultivated indifference in the face of terrible events, whereas Sartre thought we should remain passionately, even furiously engaged with what happens to us and with what we can achieve. We should not expect freedom to be anything less than fiendishly difficult.
  • Freedom does not mean entirely unconstrained movement, and it certainly does not mean acting randomly. We often mistake the very things that enable us to be free — context, meaning, facticity, situation, a general direction in our lives — for things that define us and take away our freedom. It is only with all of these that we can be free in a real sense.
  • Nor did he mean that privileged groups have the right to pontificate to the poor and downtrodden about the need to ‘take responsibility’ for themselves. That would be a grotesque misreading of Sartre’s point, since his sympathy in any encounter always lay with the more oppressed side. But for each of us — for me — to be in good faith means not making excuses for myself.
  • Camus’ novel gives us a deliberately understated vision of heroism and decisive action compared to those of Sartre and Beauvoir. One can only do so much. It can look like defeatism, but it shows a more realistic perception of what it takes to actually accomplish difficult tasks like liberating one’s country.
  • Camus just kept returning to his core principle: no torture, no killing — at least not with state approval. Beauvoir and Sartre believed they were taking a more subtle and more realistic view. If asked why a couple of innocuous philosophers had suddenly become so harsh, they would have said it was because the war had changed them in profound ways. It had shown them that one’s duties to humanity could be more complicated than they seemed. ‘The war really divided my life in two,’ Sartre said later.
  • Poets and artists ‘let things be’, but they also let things come out and show themselves. They help to ease things into ‘unconcealment’ (Unverborgenheit), which is Heidegger’s rendition of the Greek term alētheia, usually translated as ‘truth’. This is a deeper kind of truth than the mere correspondence of a statement to reality, as when we say ‘The cat is on the mat’ and point to a mat with a cat on it. Long before we can do this, both cat and mat must ‘stand forth out of concealedness’. They must un-hide themselves.
  • Heidegger does not use the word ‘consciousness’ here because — as with his earlier work — he is trying to make us think in a radically different way about ourselves. We are not to think of the mind as an empty cavern, or as a container filled with representations of things. We are not even supposed to think of it as firing off arrows of intentional ‘aboutness’, as in the earlier phenomenology of Brentano. Instead, Heidegger draws us into the depths of his Schwarzwald, and asks us to imagine a gap with sunlight filtering in. We remain in the forest, but we provide a relatively open spot where other beings can bask for a moment. If we did not do this, everything would remain in the thickets, hidden even to itself.
  • The astronomer Carl Sagan began his 1980 television series Cosmos by saying that human beings, though made of the same stuff as the stars, are conscious and are therefore ‘a way for the cosmos to know itself’. Merleau-Ponty similarly quoted his favourite painter Cézanne as saying, ‘The landscape thinks itself in me, and I am its consciousness.’ This is something like what Heidegger thinks humanity contributes to the earth. We are not made of spiritual nothingness; we are part of Being, but we also bring something unique with us. It is not much: a little open space, perhaps with a path and a bench like the one the young Heidegger used to sit on to do his homework. But through us, the miracle occurs.
  • Beauty aside, Heidegger’s late writing can also be troubling, with its increasingly mystical notion of what it is to be human. If one speaks of a human being mainly as an open space or a clearing, or a means of ‘letting beings be’ and dwelling poetically on the earth, then one doesn’t seem to be talking about any recognisable person. The old Dasein has become less human than ever. It is now a forestry feature.
  • Even today, Jaspers, the dedicated communicator, is far less widely read than Heidegger, who has influenced architects, social theorists, critics, psychologists, artists, film-makers, environmental activists, and innumerable students and enthusiasts — including the later deconstructionist and post-structuralist schools, which took their starting point from his late thinking. Having spent the late 1940s as an outsider and then been rehabilitated, Heidegger became the overwhelming presence in university philosophy all over the European continent from then on.
  • As Levinas reflected on this experience, it helped to lead him to a philosophy that was essentially ethical, rather than ontological like Heidegger’s. He developed his ideas from the work of Jewish theologian Martin Buber, whose I and Thou in 1923 had distinguished between my relationship with an impersonal ‘it’ or ‘them’, and the direct personal encounter I have with a ‘you’. Levinas took it further: when I encounter you, we normally meet face-to-face, and it is through your face that you, as another person, can make ethical demands on me. This is very different from Heidegger’s Mitsein or Being-with, which suggests a group of people standing alongside one another, shoulder to shoulder as if in solidarity — perhaps as a unified nation or Volk.
  • For Levinas, we literally face each other, one individual at a time, and that relationship becomes one of communication and moral expectation. We do not merge; we respond to one another. Instead of being co-opted into playing some role in my personal drama of authenticity, you look me in the eyes — and you remain Other. You remain you.
  • This relationship is more fundamental than the self, more fundamental than consciousness, more fundamental even than Being — and it brings an unavoidable ethical obligation. Ever since Husserl, phenomenologists and existentialists had been trying to stretch the definition of existence to incorporate our social lives and relationships. Levinas did more: he turned philosophy around entirely so that these relationships were the foundation of our existence, not an extension of it.
  • Her last work, The Need for Roots, argues, among other things, that none of us has rights, but each one of us has a near-infinite degree of duty and obligation to the other. Whatever the underlying cause of her death — and anorexia nervosa seems to have been involved — no one could deny that she lived out her philosophy with total commitment. Of all the lives touched on in this book, hers is surely the most profound and challenging application of Iris Murdoch’s notion that a philosophy can be ‘inhabited’.
  • Other thinkers took radical ethical turns during the war years. The most extreme was Simone Weil, who actually tried to live by the principle of putting other people’s ethical demands first. Having returned to France after her travels through Germany in 1932, she had worked in a factory so as to experience the degrading nature of such work for herself. When France fell in 1940, her family fled to Marseilles (against her protests), and later to the US and to Britain. Even in exile, Weil made extraordinary sacrifices. If there were people in the world who could not sleep in a bed, she would not do so either, so she slept on the floor.
  • The mystery tradition had roots in Kierkegaard’s ‘leap of faith’. It owed much to the other great nineteenth-century mystic of the impossible, Dostoevsky, and to older theological notions. But it also grew from the protracted trauma that was the first half of the twentieth century. Since 1914, and especially since 1939, people in Europe and elsewhere had come to the realisation that we cannot fully know or trust ourselves; that we have no excuses or explanations for what we do — and yet that we must ground our existence and relationships on something firm, because otherwise we cannot survive.
  • One striking link between these radical ethical thinkers, all on the fringes of our main story, is that they had religious faith. They also granted a special role to the notion of ‘mystery’ — that which cannot be known, calculated or understood, especially when it concerns our relationships with each other. Heidegger was different from them, since he rejected the religion he grew up with and had no real interest in ethics — probably as a consequence of his having no real interest in the human.
  • Meanwhile, the Christian existentialist Gabriel Marcel was also still arguing, as he had since the 1930s, that ethics trumps everything else in philosophy and that our duty to each other is so great as to play the role of a transcendent ‘mystery’. He too had been led to this position partly by a wartime experience: during the First World War he had worked for the Red Cross’ Information Service, with the unenviable job of answering relatives’ inquiries about missing soldiers. Whenever news came, he passed it on, and usually it was not good. As Marcel later said, this task permanently inoculated him against warmongering rhetoric of any kind, and it made him aware of the power of what is unknown in our lives.
  • As the play’s much-quoted and frequently misunderstood final line has it: ‘Hell is other people.’ Sartre later explained that he did not mean to say that other people were hellish in general. He meant that after death we become frozen in their view, unable any longer to fend off their interpretation. In life, we can still do something to manage the impression we make; in death, this freedom goes and we are left entombed in other people’s memories and perceptions.
  • We have to do two near-impossible things at once: understand ourselves as limited by circumstances, and yet continue to pursue our projects as though we are truly in control. In Beauvoir’s view, existentialism is the philosophy that best enables us to do this, because it concerns itself so deeply with both freedom and contingency. It acknowledges the radical and terrifying scope of our freedom in life, but also the concrete influences that other philosophies tend to ignore: history, the body, social relationships and the environment.
  • The aspects of our existence that limit us, Merleau-Ponty says, are the very same ones that bind us to the world and give us scope for action and perception. They make us what we are. Sartre acknowledged the need for this trade-off, but he found it more painful to accept. Everything in him longed to be free of bonds, of impediments and limitations
  • Of course we have to learn this skill of interpreting and anticipating the world, and this happens in early childhood, which is why Merleau-Ponty thought child psychology was essential to philosophy. This is an extraordinary insight. Apart from Rousseau, very few philosophers before him had taken childhood seriously; most wrote as though all human experience were that of a fully conscious, rational, verbal adult who has been dropped into this world from the sky — perhaps by a stork.
  • For Merleau-Ponty, we cannot understand our experience if we don’t think of ourselves in part as overgrown babies. We fall for optical illusions because we once learned to see the world in terms of shapes, objects and things relevant to our own interests. Our first perceptions came to us in tandem with our first active experiments in observing the world and reaching out to explore it, and are still linked with those experiences.
  • Another factor in all of this, for Merleau-Ponty, is our social existence: we cannot thrive without others, or not for long, and we need this especially in early life. This makes solipsistic speculation about the reality of others ridiculous; we could never engage in such speculation if we hadn’t already been formed by them.
  • As Descartes could have said (but didn’t), ‘I think, therefore other people exist.’ We grow up with people playing with us, pointing things out, talking, listening, and getting us used to reading emotions and movements; this is how we become capable, reflective, smoothly integrated beings.
  • In general, Merleau-Ponty thinks human experience only makes sense if we abandon philosophy’s time-honoured habit of starting with a solitary, capsule-like, immobile adult self, isolated from its body and world, which must then be connected up again — adding each element around it as though adding clothing to a doll. Instead, for him, we slide from the womb to the birth canal to an equally close and total immersion in the world. That immersion continues as long as we live, although we may also cultivate the art of partially withdrawing from time to time when we want to think or daydream.
  • When he looks for his own metaphor to describe how he sees consciousness, he comes up with a beautiful one: consciousness, he suggests, is like a ‘fold’ in the world, as though someone had crumpled a piece of cloth to make a little nest or hollow. It stays for a while, before eventually being unfolded and smoothed away. There is something seductive, even erotic, in this idea of my conscious self as an improvised pouch in the cloth of the world. I still have my privacy — my withdrawing room. But I am part of the world’s fabric, and I remain formed out of it for as long as I am here.
  • By the time of these works, Merleau-Ponty is taking his desire to describe experience to the outer limits of what language can convey. Just as with the late Husserl or Heidegger, or Sartre in his Flaubert book, we see a philosopher venturing so far from shore that we can barely follow. Emmanuel Levinas would head out to the fringes too, eventually becoming incomprehensible to all but his most patient initiates.
  • Sartre once remarked — speaking of a disagreement they had about Husserl in 1941 — that ‘we discovered, astounded, that our conflicts had, at times, stemmed from our childhood, or went back to the elementary differences of our two organisms’. Merleau-Ponty also said in an interview that Sartre’s work seemed strange to him, not because of philosophical differences, but because of a certain ‘register of feeling’, especially in Nausea, that he could not share. Their difference was one of temperament and of the whole way the world presented itself to them.
  • The two also differed in their purpose. When Sartre writes about the body or other aspects of experience, he generally does it in order to make a different point. He expertly evokes the grace of his café waiter, gliding between the tables, bending at an angle just so, steering the drink-laden tray through the air on the tips of his fingers — but he does it all in order to illustrate his ideas about bad faith. When Merleau-Ponty writes about skilled and graceful movement, the movement itself is his point. This is the thing he wants to understand.
  • We can never move definitively from ignorance to certainty, for the thread of the inquiry will constantly lead us back to ignorance again. This is the most attractive description of philosophy I’ve ever read, and the best argument for why it is worth doing, even (or especially) when it takes us no distance at all from our starting point.
  • By prioritising perception, the body, social life and childhood development, Merleau-Ponty gathered up philosophy’s far-flung outsider subjects and brought them in to occupy the centre of his thought.
  • In his inaugural lecture at the Collège de France on 15 January 1953, published as In Praise of Philosophy, he said that philosophers should concern themselves above all with whatever is ambiguous in our experience. At the same time, they should think clearly about these ambiguities, using reason and science. Thus, he said, ‘The philosopher is marked by the distinguishing trait that he possesses inseparably the taste for evidence and the feeling for ambiguity.’ A constant movement is required between these two
  • As Sartre wrote in response to Hiroshima, humanity had now gained the power to wipe itself out, and must decide every single day that it wanted to live. Camus also wrote that humanity faced the task of choosing between collective suicide and a more intelligent use of its technology — ‘between hell and reason’. After 1945, there seemed little reason to trust in humanity’s ability to choose well.
  • Merleau-Ponty observed in a lecture of 1951 that, more than any previous century, the twentieth century had reminded people how ‘contingent’ their lives were — how at the mercy of historical events and other changes that they could not control. This feeling went on long after the war ended. After the A-bombs were dropped on Hiroshima and Nagasaki, many feared that a Third World War would not be long in coming, this time between the Soviet Union and the United States.
Javier E

The War in Ukraine Has Unleashed a New Word - The New York Times - 0 views

  • As I read about Irpin, about Bucha, about Trostyanets, of the bodies crushed by tanks, of the bicyclists shot on the street, of the desecrated corpses, there it was, “рашизм,” again and again
  • Grasping its meaning requires crossing differences in alphabet and pronunciation, thinking our way into the experience of a bilingual society at war with a fascist empire.
  • “Pашизм” sounds like “fascism,” but with an “r” sound instead of an “f” at the beginning; it means, roughly, “Russian fascism.”
  • ...19 more annotations...
  • The aggressor in this war keeps trying to push back toward a past as it never happened, toward nonsensical and necrophiliac accounts of history. Russia must conquer Ukraine, Vladimir Putin says, because of a baptism a thousand years ago, or because of bloodshed during World War II.
  • The new word “рашизм” is a useful conceptualization of Putin’s worldview. Far more than Western analysts, Ukrainians have noticed the Russian tilt toward fascism in the last decade.
  • Undistracted by Putin’s operational deployment of genocide talk, they have seen fascist practices in Russia: the cults of the leader and of the dead, the corporatist state, the mythical past, the censorship, the conspiracy theories, the centralized propaganda and now the war of destruction
  • we have tended to overlook the central example of fascism’s revival, which is the Putin regime in the Russian Federation.
  • A bilingual nation like Ukraine is not just a collection of bilingual individuals; it is an unending set of encounters in which people habitually adjust the language they use to other people and new settings, manipulating language in ways that are foreign to monolingual nations
  • I have gone on Ukrainian television and radio, taken questions in Russian and answered them in Ukrainian, without anyone for a moment finding that switch worthy of mention.
  • Ukrainians change languages effortlessly — not just as situations change, but also to make situations change, sometimes in the middle of a sentence, or even in the middle of a word.
  • “Pашизм” is a word built up from the inside, from several languages, as a complex of puns and references that reveal a bilingual society thinking out its predicament and communicating to itself.
  • Putin’s ethnic imperialism insists that Ukrainians must be Russians because they speak Russian. They do — and they speak Ukrainian. But Ukrainian identity has as much to do with an ability to live between languages as it does with the use of any one of them
  • Those six Cyrillic letters contain references to Italian, Russian and English, all of which a mechanical, letter-by-letter transliteration would block
  • The best (if imperfect) way I have found to render “рашизм” from Ukrainian into English is “ruscism”
  • When we see “ruscism” we might guess this word has to do with Russia (“rus”), with politics (“ism”) and with the extreme right (“ascism”) — as, indeed, it does
  • I have had to spell “рашизм” as “ruscism” in English because we need “rus,” with a “u,” to see the reference to Russia. In losing the original Ukrainian “a,” though, we weaken a multilayered reference — because the “a” in “рашизм,” conveniently, allows the Ukrainian word to associate Russia and fascism in a way English cannot. (A mechanical, letter-by-letter transliteration is sketched in code after the last annotation in this list.)
  • If you don’t know either language, you might think that Russian and Ukrainian are very similar. They are pretty close — much as, say, Spanish and Italian are.
  • the semantics are not that close
  • From a Russian perspective, the false friends are legion. There is an elegant four-syllable Ukrainian word that simply means “soon” or “without delay,” but to a Russian it sounds like “not behind the bar.” The Ukrainian word for “cat” sounds like the Russian for “whale,” while the Ukrainian for “female cats” sounds like Russian for “intestines.”
  • Russians do not understand Ukrainian, because they have not learned it. Ukrainians do understand Russian, because they have learned it.
  • Ukrainian soldiers often speak Russian, though they are instructed to use Ukrainian to spot infiltrators and spies. This is a drastic example of a general practice of code-switching.
  • Ukrainians are perfectly capable of writing Russian correctly, but during the war some internet commentators have spelled the occasional Russian word using the Ukrainian writing system, leaving it looking unmoored and pitiable. Writing in Ukrainian, you might spell “oсвобождение” as “aсвобaждениe,” the way it is pronounced — a bit of lexicographic alchemy that makes it (and, by extension, Russians) look silly, and mocks the political concepts being used to justify a war. In a larger sense, such efforts are a means of displacing Russia from its central position in regional culture.
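To make the transliteration point concrete, here is a minimal sketch of the mechanical, letter-by-letter rendering that the author argues would block the word's layered references. The romanization table and the function name are mine, for illustration only:

```python
# Letter-by-letter romanization of "рашизм", using a common Ukrainian
# convention (и -> y). The output is phonetically faithful, but it contains
# neither "rus" nor "ascism", so the Russia/fascism double reference that
# the coinage "ruscism" preserves is lost.
TRANSLIT = {"р": "r", "а": "a", "ш": "sh", "и": "y", "з": "z", "м": "m"}

def romanize(word: str) -> str:
    """Replace each Cyrillic letter with its Latin equivalent, one by one."""
    return "".join(TRANSLIT.get(ch, ch) for ch in word)

print(romanize("рашизм"))  # -> "rashyzm"
```

No such mapping can recover the puns, which is why the annotations above present "ruscism" as an imperfect, hand-crafted rendering rather than a transliteration.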
Javier E

Pandemic-Era Politics Are Ruining Public Education - The Atlantic - 0 views

  • You’re also the nonvoting, perhaps unwitting, subject of adults’ latest pedagogical experiments: either relentless test prep or test abolition; quasi-religious instruction in identity-based virtue and sin; a flood of state laws to keep various books out of your hands and ideas out of your head.
  • Your parents, looking over your shoulder at your education and not liking what they see, have started showing up at school-board meetings in a mortifying state of rage. If you live in Virginia, your governor has set up a hotline where they can rat out your teachers to the government. If you live in Florida, your governor wants your parents to sue your school if it ever makes you feel “discomfort” about who you are
  • Adults keep telling you the pandemic will never end, your education is being destroyed by ideologues, digital technology is poisoning your soul, democracy is collapsing, and the planet is dying—but they’re counting on you to fix everything when you grow up.
  • ...37 more annotations...
  • It isn’t clear how the American public-school system will survive the COVID years. Teachers, whose relative pay and status have been in decline for decades, are fleeing the field. In 2021, buckling under the stresses of the pandemic, nearly 1 million people quit jobs in public education, a 40 percent increase over the previous year.
  • These kids, and the investments that come with them, may never return—the beginning of a cycle of attrition that could continue long after the pandemic ends and leave public schools even more underfunded and dilapidated than before. “It’s an open question whether the public-school system will recover,” Steiner said. “That is a real concern for democratic education.”
  • The high-profile failings of public schools during the pandemic have become a political problem for Democrats, because of their association with unions, prolonged closures, and the pedagogy of social justice, which can become a form of indoctrination.
  • The party that stands for strong government services in the name of egalitarian principles supported the closing of schools far longer than either the science or the welfare of children justified, and it has been woefully slow to acknowledge how much this damaged the life chances of some of America’s most disadvantaged students.
  • Public education is too important to be left to politicians and ideologues. Public schools still serve about 90 percent of children across red and blue America.
  • Since the common-school movement in the early 19th century, the public school has had an exalted purpose in this country. It’s our core civic institution—not just because, ideally, it brings children of all backgrounds together in a classroom, but because it prepares them for the demands and privileges of democratic citizenship. Or at least, it needs to.
  • What is school for? This is the kind of foundational question that arises when a crisis shakes the public’s faith in an essential institution. “The original thinkers about public education were concerned almost to a point of paranoia about creating self-governing citizens,”
  • “Horace Mann went to his grave having never once uttered the phrase college- and career-ready. We’ve become more accustomed to thinking about the private ends of education. We’ve completely lost the habit of thinking about education as citizen-making.”
  • School can’t just be an economic sorting system. One reason we have a stake in the education of other people’s children is that they will grow up to be citizens.
  • Public education is meant not to mirror the unexamined values of a particular family or community, but to expose children to ways that other people, some of them long dead, think.
  • If the answer were simply to push more and more kids into college, the United States would be entering its democratic prime
  • So the question isn’t just how much education, but what kind. Is it quaint, or utopian, to talk about teaching our children to be capable of governing themselves?
  • The COVID era, with Donald Trump out of office but still in power and with battles over mask mandates and critical race theory convulsing Twitter and school-board meetings, shows how badly Americans are able to think about our collective problems—let alone read, listen, empathize, debate, reconsider, and persuade in the search for solutions.
  • democratic citizenship can, at least in part, be learned.
  • The history warriors build their metaphysics of national good or evil on a foundation of ignorance. In a 2019 survey, only 40 percent of Americans were able to pass the test that all applicants for U.S. citizenship must take, which asks questions like “Who did the United States fight in World War II?” and “We elect a President for how many years?” The only state in which a majority passed was Vermont.
  • The orthodoxies currently fighting for our children’s souls turn the teaching of U.S. history into a static and morally simple quest for some American essence. They proceed from celebration or indictment toward a final judgment—innocent or guilty—and bury either oppression or progress in a subordinate clause. The most depressing thing about this gloomy pedagogy of ideologies in service to fragile psyches is how much knowledge it takes away from students who already have so little
  • A central goal for history, social-studies, and civics instruction should be to give students something more solid than spoon-fed maxims—to help them engage with the past on its own terms, not use it as a weapon in the latest front of the culture wars.
  • Releasing them to do “research” in the vast ocean of the internet without maps and compasses, as often happens, guarantees that they will drown before they arrive anywhere.
  • The truth requires a grounding in historical facts, but facts are quickly forgotten without meaning and context
  • The goal isn’t just to teach students the origins of the Civil War, but to give them the ability to read closely, think critically, evaluate sources, corroborate accounts, and back up their claims with evidence from original documents.
  • This kind of instruction, which requires teachers to distinguish between exposure and indoctrination, isn’t easy; it asks them to be more sophisticated professionals than their shabby conditions and pay (median salary: $62,000, less than accountants and transit police) suggest we are willing to support.
  • To do that, we’ll need to help kids restore at least part of their crushed attention spans.
  • staring at a screen for hours is a heavy depressant, especially for teenagers.
  • we’ll look back on the amount of time we let our children spend online with the same horror that we now feel about earlier generations of adults who hooked their kids on smoking.
  • “It’s not a choice between tech or no tech,” Bill Tally, a researcher with the Education Development Center, told me. “The question is what tech infrastructure best enables the things we care about,” such as deep engagement with instructional materials, teachers, and other students.
  • The pandemic should have forced us to reassess what really matters in public school; instead, it’s a crisis that we’ve just about wasted.
  • Like learning to read as historians, learning to sift through the tidal flood of memes for useful, reliable information can emancipate children who have been heedlessly hooked on screens by the adults in their lives
  • Finally, let’s give children a chance to read books—good books. It’s a strange feature of all the recent pedagogical innovations that they’ve resulted in the gradual disappearance of literature from many classrooms.
  • The best way to interest young people in literature is to have them read good literature, and not just books that focus with grim piety on the contemporary social and psychological problems of teenagers.
  • We sell them insultingly short in thinking that they won’t read unless the subject is themselves. Mirrors are ultimately isolating; young readers also need windows, even if the view is unfamiliar, even if it’s disturbing
  • connection through language to universal human experience and thought is the reward of great literature, a source of empathy and wisdom.
  • The culture wars, with their atmosphere of resentment, fear, and petty faultfinding, are hostile to the writing and reading of literature.
  • W. E. B. Du Bois wrote: “Nations reel and stagger on their way; they make hideous mistakes; they commit frightful wrongs; they do great and beautiful things. And shall we not best guide humanity by telling the truth about all this, so far as the truth is ascertainable?”
  • The classroom has become a half-abandoned battlefield, where grown-ups who claim to be protecting students from the virus, from books, from ideologies and counter-ideologies end up using children to protect themselves and their own entrenched camps.
  • American democracy can’t afford another generation of adults who don’t know how to talk and listen and think. We owe our COVID-scarred children the means to free themselves from the failures of the past and the present.
  • Students are leaving as well. Since 2020, nearly 1.5 million children have been removed from public schools to attend private or charter schools or be homeschooled.
  • “COVID has encouraged poor parents to question the quality of public education. We are seeing diminished numbers of children in our public schools, particularly our urban public schools.” In New York, more than 80,000 children have disappeared from city schools; in Los Angeles, more than 26,000; in Chicago, more than 24,000.
Javier E

Drones, Ethics and the Armchair Soldier - NYTimes.com - 0 views

  • the difference between humans and robots is precisely the ability to think and reflect, in Immanuel Kant’s words, to set and pursue ends for themselves. And these ends cannot be set beforehand in some hard and fast way
  • Working one’s way through the complexities of “just war” and moral theory makes it perfectly clear that ethics is not about arriving easily at a single right answer, but rather coming to understand the profound difficulty of doing so. Experiencing this difficulty is what philosophers call existential responsibility.
  • One of the jobs of philosophy, at least as I understand it, is neither to help people to avoid these difficulties nor to exaggerate them, but rather to face them in resolute and creative ways.
  • ...6 more annotations...
  • ground troops, unfortunately, had more pressing concerns than existential responsibility. They did not have leisure, unlike their commanders, who also often had the philosophical training to think through the complexities of their jobs.
  • This training was not simply a degree requirement at Officer Candidate School or one of the United States military academies, but a sustained, ongoing, and rigorous engagement with a philosophical tradition. Alexander lived with Aristotle.
  • Jeff McMahan argued that traditional “just war theory” should be reworked in several important ways. He suggested that the tenets of a revised theory apply not only to governments, traditionally represented by commanders and heads of state, but also to individual soldiers. This is a significant revision since it broadens the scope of responsibility for warfare
  • McMahan believes that individuals are to bear at least some responsibility in upholding “just cause” requirements. McMahan expects more of soldiers and, in this age of drones and leisure, he is right to do so.
  • while drones are to be applauded for keeping these soldiers out of harm’s way physically, we would do well to remember that they do not keep them out of harm’s way morally or psychologically. The high rates of “burnout” should drive this home. Supporting our troops requires ensuring that they are provided not just with training and physical armor, but with the intellectual tools to navigate these new difficulties.
  • Just as was the case in the invasion of Iraq 10 years ago, the most important questions we should be asking should not be directed to armchair soldiers but to those of us in armchairs at home: What wars are being fought in our name? On what grounds are they being fought?
katherineharron

FBI arrests spotlight lessons learned after Charlottesville (opinion) - CNN - 0 views

  • On Thursday, the FBI arrested three men on firearms charges: Patrik J. Mathews, 27, Brian M. Lemley Jr., 33, and William G. Bilbrough IV, 19. They had plans, an official said, to attend a Virginia pro-gun rally. This followed Virginia Gov. Ralph Northam's declaration of a temporary state of emergency after authorities learned that extremists hoped to use the anti-gun control rally planned next Monday -- Martin Luther King, Jr. Day -- to incite a violent clash.
  • These arrests add to mounting evidence that a decades-old and violent white-power movement is alive and well, perhaps even gaining strength. White power is a social movement that has united neo-Nazis, Klansmen, skinheads, and militiamen around a shared fear of racial annihilation and cultural change. Since 1983, when movement leaders declared war on the federal government, members of such groups have worked together to bring about a race war.
  • ...4 more annotations...
  • Silver linings aside, it will take many, many more instances of coordinated response to stop a movement generations in the making. In more than a decade of studying the earlier white power movement, I have become familiar with the themes of underground activity that are today clearly drawing from the earlier movement. In the absence of decisive action across multiple institutions, a rich record of criminal activity and violence will continue to provide these activists with a playbook for further chaos.
  • The Base, furthermore, is what experts call "accelerationist," meaning that its members hope to provoke what they see as an inevitable race war. They have conducted paramilitary training in the Pacific Northwest. Both of these strategies date back to the 1980s, when the Order trained in those forests with hopes of provoking the same race war.
  • One of the men arrested Thursday was formerly a reservist in the Canadian Army, where he received training in explosives and demolition, according to the New York Times. This kind of preparation, too, is common among extremists like these. To take just a few representative examples, in the 1960s, Bobby Frank Cherry, a former Marine trained in demolition, helped fellow members of the United Klans of America to bomb the 16th Street Baptist Church in Birmingham, killing four black girls.
  • This news out of Virginia shows that there is a real social benefit when people direct their attention to these events -- and sustain the public conversation about the presence of a renewed white-power movement and what it means for our society.
Javier E

They're Watching You at Work - Don Peck - The Atlantic - 2 views

  • Predictive statistical analysis, harnessed to big data, appears poised to alter the way millions of people are hired and assessed.
  • By one estimate, more than 98 percent of the world’s information is now stored digitally, and the volume of that data has quadrupled since 2007.
  • The application of predictive analytics to people’s careers—an emerging field sometimes called “people analytics”—is enormously challenging, not to mention ethically fraught
  • ...52 more annotations...
  • By the end of World War II, however, American corporations were facing severe talent shortages. Their senior executives were growing old, and a dearth of hiring from the Depression through the war had resulted in a shortfall of able, well-trained managers. Finding people who had the potential to rise quickly through the ranks became an overriding preoccupation of American businesses. They began to devise a formal hiring-and-management system based in part on new studies of human behavior, and in part on military techniques developed during both world wars, when huge mobilization efforts and mass casualties created the need to get the right people into the right roles as efficiently as possible. By the 1950s, it was not unusual for companies to spend days with young applicants for professional jobs, conducting a battery of tests, all with an eye toward corner-office potential.
  • But companies abandoned their hard-edged practices for another important reason: many of their methods of evaluation turned out not to be very scientific.
  • this regime, so widespread in corporate America at mid-century, had almost disappeared by 1990. “I think an HR person from the late 1970s would be stunned to see how casually companies hire now,”
  • Many factors explain the change, he said, and then he ticked off a number of them: Increased job-switching has made it less important and less economical for companies to test so thoroughly. A heightened focus on short-term financial results has led to deep cuts in corporate functions that bear fruit only in the long term. The Civil Rights Act of 1964, which exposed companies to legal liability for discriminatory hiring practices, has made HR departments wary of any broadly applied and clearly scored test that might later be shown to be systematically biased.
  • about a quarter of the country’s corporations were using similar tests to evaluate managers and junior executives, usually to assess whether they were ready for bigger roles.
  • Aptitude, skills, personal history, psychological stability, discretion, loyalty—companies at the time felt they had a need (and the right) to look into them all. That ambit is expanding once again, and this is undeniably unsettling. Should the ideas of scientists be dismissed because of the way they play a game? Should job candidates be ranked by what their Web habits say about them? Should the “data signature” of natural leaders play a role in promotion? These are all live questions today, and they prompt heavy concerns: that we will cede one of the most subtle and human of skills, the evaluation of the gifts and promise of other people, to machines; that the models will get it wrong; that some people will never get a shot in the new workforce.
  • Knack makes app-based video games, among them Dungeon Scrawl, a quest game requiring the player to navigate a maze and solve puzzles, and Wasabi Waiter, which involves delivering the right sushi to the right customer at an increasingly crowded happy hour. These games aren’t just for play: they’ve been designed by a team of neuroscientists, psychologists, and data scientists to suss out human potential. Play one of them for just 20 minutes, says Guy Halfteck, Knack’s founder, and you’ll generate several megabytes of data, exponentially more than what’s collected by the SAT or a personality test. How long you hesitate before taking every action, the sequence of actions you take, how you solve problems—all of these factors and many more are logged as you play, and then are used to analyze your creativity, your persistence, your capacity to learn quickly from mistakes, your ability to prioritize, and even your social intelligence and personality. The end result, Halfteck says, is a high-resolution portrait of your psyche and intellect, and an assessment of your potential as a leader or an innovator.
  • When the results came back, Haringa recalled, his heart began to beat a little faster. Without ever seeing the ideas, without meeting or interviewing the people who’d proposed them, without knowing their title or background or academic pedigree, Knack’s algorithm had identified the people whose ideas had panned out. The top 10 percent of the idea generators as predicted by Knack were in fact those who’d gone furthest in the process.
  • What Knack is doing, Haringa told me, “is almost like a paradigm shift.” It offers a way for his GameChanger unit to avoid wasting time on the 80 people out of 100—nearly all of whom look smart, well-trained, and plausible on paper—whose ideas just aren’t likely to work out.
  • He has encouraged the company’s HR executives to think about applying the games to the recruitment and evaluation of all professional workers.
  • scoring distance from work could violate equal-employment-opportunity standards. Marital status? Motherhood? Church membership? “Stuff like that,” Meyerle said, “we just don’t touch”—at least not in the U.S., where the legal environment is strict. Meyerle told me that Evolv has looked into these sorts of factors in its work for clients abroad, and that some of them produce “startling results.”
  • consider the alternative. A mountain of scholarly literature has shown that the intuitive way we now judge professional potential is rife with snap judgments and hidden biases, rooted in our upbringing or in deep neurological connections that doubtless served us well on the savanna but would seem to have less bearing on the world of work.
  • We may like to think that society has become more enlightened since those days, and in many ways it has, but our biases are mostly unconscious, and they can run surprisingly deep. Consider race. For a 2004 study called “Are Emily and Greg More Employable Than Lakisha and Jamal?,” the economists Sendhil Mullainathan and Marianne Bertrand put white-sounding names (Emily Walsh, Greg Baker) or black-sounding names (Lakisha Washington, Jamal Jones) on similar fictitious résumés, which they then sent out to a variety of companies in Boston and Chicago. To get the same number of callbacks, they learned, they needed to either send out half again as many résumés with black names as those with white names, or add eight extra years of relevant work experience to the résumés with black names.
  • Lauren Rivera, a sociologist at Northwestern, spent parts of the three years from 2006 to 2008 interviewing professionals from elite investment banks, consultancies, and law firms about how they recruited, interviewed, and evaluated candidates, and concluded that among the most important factors driving their hiring recommendations were—wait for it—shared leisure interests.
  • Lacking “reliable predictors of future performance,” Rivera writes, “assessors purposefully used their own experiences as models of merit.” Former college athletes “typically prized participation in varsity sports above all other types of involvement.” People who’d majored in engineering gave engineers a leg up, believing they were better prepared.
  • the prevailing system of hiring and management in this country involves a level of dysfunction that should be inconceivable in an economy as sophisticated as ours. Recent survey data collected by the Corporate Executive Board, for example, indicate that nearly a quarter of all new hires leave their company within a year of their start date, and that hiring managers wish they’d never extended an offer to one out of every five members on their team
  • In the late 1990s, as these assessments shifted from paper to digital formats and proliferated, data scientists started doing massive tests of what makes for a successful customer-support technician or salesperson. This has unquestionably improved the quality of the workers at many firms.
  • In 2010, however, Xerox switched to an online evaluation that incorporates personality testing, cognitive-skill assessment, and multiple-choice questions about how the applicant would handle specific scenarios that he or she might encounter on the job. An algorithm behind the evaluation analyzes the responses, along with factual information gleaned from the candidate’s application, and spits out a color-coded rating: red (poor candidate), yellow (middling), or green (hire away). Those candidates who score best, I learned, tend to exhibit a creative but not overly inquisitive personality, and participate in at least one but not more than four social networks, among many other factors. (Previous experience, one of the few criteria that Xerox had explicitly screened for in the past, turns out to have no bearing on either productivity or retention
  • the idea that hiring was a science fell out of favor. But now it’s coming back, thanks to new technologies and methods of analysis that are cheaper, faster, and much-wider-ranging than what we had before
  • Gone are the days, Ostberg told me, when, say, a small survey of college students would be used to predict the statistical validity of an evaluation tool. “We’ve got a data set of 347,000 actual employees who have gone through these different types of assessments or tools,” he told me, “and now we have performance-outcome data, and we can split those and slice and dice by industry and location.”
  • Evolv’s tests allow companies to capture data about everybody who applies for work, and everybody who gets hired—a complete data set from which sample bias, long a major vexation for industrial-organization psychologists, simply disappears. The sheer number of observations that this approach makes possible allows Evolv to say with precision which attributes matter more to the success of retail-sales workers (decisiveness, spatial orientation, persuasiveness) or customer-service personnel at call centers (rapport-building)
  • There are some data that Evolv simply won’t use, out of a concern that the information might lead to systematic bias against whole classes of people
  • When Xerox started using the score in its hiring decisions, the quality of its hires immediately improved. The rate of attrition fell by 20 percent in the initial pilot period, and over time, the number of promotions rose. Xerox still interviews all candidates in person before deciding to hire them, Morse told me, but, she added, “We’re getting to the point where some of our hiring managers don’t even want to interview anymore”
  • what most excites him are the possibilities that arise from monitoring the entire life cycle of a worker at any given company.
  • Mullainathan expressed amazement at how little most creative and professional workers (himself included) know about what makes them effective or ineffective in the office. Most of us can’t even say with any certainty how long we’ve spent gathering information for a given project, or our pattern of information-gathering, never mind know which parts of the pattern should be reinforced, and which jettisoned. As Mullainathan put it, we don’t know our own “production function.”
  • What begins with an online screening test for entry-level workers ends with the transformation of nearly every aspect of hiring, performance assessment, and management.
  • I turned to Sandy Pentland, the director of the Human Dynamics Laboratory at MIT. In recent years, Pentland has pioneered the use of specialized electronic “badges” that transmit data about employees’ interactions as they go about their days. The badges capture all sorts of information about formal and informal conversations: their length; the tone of voice and gestures of the people involved; how much those people talk, listen, and interrupt; the degree to which they demonstrate empathy and extroversion; and more. Each badge generates about 100 data points a minute.
  • he tried the badges out on about 2,500 people, in 21 different organizations, and learned a number of interesting lessons. About a third of team performance, he discovered, can usually be predicted merely by the number of face-to-face exchanges among team members. (Too many is as much of a problem as too few.) Using data gathered by the badges, he was able to predict which teams would win a business-plan contest, and which workers would (rightly) say they’d had a “productive” or “creative” day. Not only that, but he claimed that his researchers had discovered the “data signature” of natural leaders, whom he called “charismatic connectors” and all of whom, he reported, circulate actively, give their time democratically to others, engage in brief but energetic conversations, and listen at least as much as they talk. (A toy quadratic fit illustrating this inverted-U pattern appears after the last annotation in this list.)
  • His group is developing apps to allow team members to view their own metrics more or less in real time, so that they can see, relative to the benchmarks of highly successful employees, whether they’re getting out of their offices enough, or listening enough, or spending enough time with people outside their own team.
  • Torrents of data are routinely collected by American companies and now sit on corporate servers, or in the cloud, awaiting analysis. Bloomberg reportedly logs every keystroke of every employee, along with their comings and goings in the office. The Las Vegas casino Harrah’s tracks the smiles of the card dealers and waitstaff on the floor (its analytics team has quantified the impact of smiling on customer satisfaction). E‑mail, of course, presents an especially rich vein to be mined for insights about our productivity, our treatment of co-workers, our willingness to collaborate or lend a hand, our patterns of written language, and what those patterns reveal about our intelligence, social skills, and behavior.
  • people analytics will ultimately have a vastly larger impact on the economy than the algorithms that now trade on Wall Street or figure out which ads to show us. He reminded me that we’ve witnessed this kind of transformation before in the history of management science. Near the turn of the 20th century, both Frederick Taylor and Henry Ford famously paced the factory floor with stopwatches, to improve worker efficiency.
  • “The quantities of data that those earlier generations were working with,” he said, “were infinitesimal compared to what’s available now. There’s been a real sea change in the past five years, where the quantities have just grown so large—petabytes, exabytes, zetta—that you start to be able to do things you never could before.”
  • People analytics will unquestionably provide many workers with more options and more power. Gild, for example, helps companies find undervalued software programmers, working indirectly to raise those people’s pay. Other companies are doing similar work. One called Entelo, for instance, specializes in using algorithms to identify potentially unhappy programmers who might be receptive to a phone call.
  • He sees it not only as a boon to a business’s productivity and overall health but also as an important new tool that individual employees can use for self-improvement: a sort of radically expanded The 7 Habits of Highly Effective People, custom-written for each of us, or at least each type of job, in the workforce.
  • the most exotic development in people analytics today is the creation of algorithms to assess the potential of all workers, across all companies, all the time.
  • The way Gild arrives at these scores is not simple. The company’s algorithms begin by scouring the Web for any and all open-source code, and for the coders who wrote it. They evaluate the code for its simplicity, elegance, documentation, and several other factors, including the frequency with which it’s been adopted by other programmers. For code that was written for paid projects, they look at completion times and other measures of productivity. Then they look at questions and answers on social forums such as Stack Overflow, a popular destination for programmers seeking advice on challenging projects. They consider how popular a given coder’s advice is, and how widely that advice ranges.
  • The algorithms go further still. They assess the way coders use language on social networks from LinkedIn to Twitter; the company has determined that certain phrases and words used in association with one another can distinguish expert programmers from less skilled ones. Gild knows these phrases and words are associated with good coding because it can correlate them with its evaluation of open-source code, and with the language and online behavior of programmers in good positions at prestigious companies.
  • having made those correlations, Gild can then score programmers who haven’t written open-source code at all, by analyzing the host of clues embedded in their online histories. They’re not all obvious, or easy to explain. Vivienne Ming, Gild’s chief scientist, told me that one solid predictor of strong coding is an affinity for a particular Japanese manga site. (A toy sketch of this kind of signal aggregation also appears after the last annotation in this list.)
  • Gild’s CEO, Sheeroy Desai, told me he believes his company’s approach can be applied to any occupation characterized by large, active online communities, where people post and cite individual work, ask and answer professional questions, and get feedback on projects. Graphic design is one field that the company is now looking at, and many scientific, technical, and engineering roles might also fit the bill. Regardless of their occupation, most people leave “data exhaust” in their wake, a kind of digital aura that can reveal a lot about a potential hire.
  • professionally relevant personality traits can be judged effectively merely by scanning Facebook feeds and photos. LinkedIn, of course, captures an enormous amount of professional data and network information, across just about every profession. A controversial start-up called Klout has made its mission the measurement and public scoring of people’s online social influence.
  • Now the two companies are working together to marry pre-hire assessments to an increasing array of post-hire data: about not only performance and duration of service but also who trained the employees; who has managed them; whether they were promoted to a supervisory role, and how quickly; how they performed in that role; and why they eventually left.
  • Over time, better job-matching technologies are likely to begin serving people directly, helping them see more clearly which jobs might suit them and which companies could use their skills. In the future, Gild plans to let programmers see their own profiles and take skills challenges to try to improve their scores. It intends to show them its estimates of their market value, too, and to recommend coursework that might allow them to raise their scores even more. Not least, it plans to make accessible the scores of typical hires at specific companies, so that software engineers can better see the profile they’d need to land a particular job
  • Knack, for its part, is making some of its video games available to anyone with a smartphone, so people can get a better sense of their strengths, and of the fields in which their strengths would be most valued. (Palo Alto High School recently adopted the games to help students assess careers.) Ultimately, the company hopes to act as matchmaker between a large network of people who play its games (or have ever played its games) and a widening roster of corporate clients, each with its own specific profile for any given type of job.
  • When I began my reporting for this story, I was worried that people analytics, if it worked at all, would only widen the divergent arcs of our professional lives, further gilding the path of the meritocratic elite from cradle to grave, and shutting out some workers more definitively. But I now believe the opposite is likely to happen, and that we’re headed toward a labor market that’s fairer to people at every stage of their careers
  • For decades, as we’ve assessed people’s potential in the professional workforce, the most important piece of data—the one that launches careers or keeps them grounded—has been educational background: typically, whether and where people went to college, and how they did there. Over the past couple of generations, colleges and universities have become the gatekeepers to a prosperous life. A degree has become a signal of intelligence and conscientiousness, one that grows stronger the more selective the school and the higher a student’s GPA, that is easily understood by employers, and that, until the advent of people analytics, was probably unrivaled in its predictive powers.
  • the limitations of that signal—the way it degrades with age, its overall imprecision, its many inherent biases, its extraordinary cost—are obvious. “Academic environments are artificial environments,” Laszlo Bock, Google’s senior vice president of people operations, told The New York Times in June. “People who succeed there are sort of finely trained, they’re conditioned to succeed in that environment,” which is often quite different from the workplace.
  • because one’s college history is such a crucial signal in our labor market, perfectly able people who simply couldn’t sit still in a classroom at the age of 16, or who didn’t have their act together at 18, or who chose not to go to graduate school at 22, routinely get left behind for good. That such early factors so profoundly affect career arcs and hiring decisions made two or three decades later is, on its face, absurd.
  • I spoke with managers at a lot of companies who are using advanced analytics to reevaluate and reshape their hiring, and nearly all of them told me that their research is leading them toward pools of candidates who didn’t attend college—for tech jobs, for high-end sales positions, for some managerial roles. In some limited cases, this is because their analytics revealed no benefit whatsoever to hiring people with college degrees; in other cases, and more often, it’s because they revealed signals that function far better than college history,
  • Google, too, is hiring a growing number of nongraduates. Many of the people I talked with reported that when it comes to high-paying and fast-track jobs, they’re reducing their preference for Ivy Leaguers and graduates of other highly selective schools.
  • This process is just beginning. Online courses are proliferating, and so are online markets that involve crowd-sourcing. Both arenas offer new opportunities for workers to build skills and showcase competence. Neither produces the kind of instantly recognizable signals of potential that a degree from a selective college, or a first job at a prestigious firm, might. That’s a problem for traditional hiring managers, because sifting through lots of small signals is so difficult and time-consuming.
  • all of these new developments raise philosophical questions. As professional performance becomes easier to measure and see, will we become slaves to our own status and potential, ever-focused on the metrics that tell us how and whether we are measuring up? Will too much knowledge about our limitations hinder achievement and stifle our dreams? All I can offer in response to these questions, ironically, is my own gut sense, which leads me to feel cautiously optimistic.
  • Google’s understanding of the promise of analytics is probably better than anybody else’s, and the company has been changing its hiring and management practices as a result of its ongoing analyses. (Brainteasers are no longer used in interviews, because they do not correlate with job success; GPA is not considered for anyone more than two years out of school, for the same reason—the list goes on.) But for all of Google’s technological enthusiasm, these same practices are still deeply human. A real, live person looks at every résumé the company receives. Hiring decisions are made by committee and are based in no small part on opinions formed during structured interviews.
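Two of the systems described above are concrete enough to caricature in code. First, the badge experiments: the finding that team performance tracks the number of face-to-face exchanges, with too many as harmful as too few, describes an inverted-U curve. A minimal sketch, with invented numbers and a simple quadratic fit standing in for whatever model Pentland's lab actually used:

```python
# Hypothetical illustration of the inverted-U claim: regress team performance
# on face-to-face exchange counts with a squared term, so the fitted curve
# can peak at an intermediate level of interaction. All data are invented.
import numpy as np

exchanges   = np.array([5, 10, 20, 30, 40, 50, 60, 80], dtype=float)
performance = np.array([48, 60, 74, 80, 79, 72, 63, 50], dtype=float)

a, b, c = np.polyfit(exchanges, performance, deg=2)  # performance ~ a*x^2 + b*x + c
peak = -b / (2 * a)  # vertex of the parabola: the "just right" exchange count
print(f"fitted optimum: about {peak:.0f} exchanges")
```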
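Second, the Gild passage: scoring programmers by aggregating heterogeneous signals (code quality, adoption by other coders, forum reputation, language patterns). The sketch below shows only the aggregation step; the feature names and weights are invented for illustration, and the real system, as described, learned its correlations from evaluated open-source code rather than taking fixed weights on faith:

```python
# Invented features and weights, illustrating signal aggregation only.
from dataclasses import dataclass

@dataclass
class CoderSignals:
    code_elegance: float        # 0-1, from analysis of open-source code
    adoption_by_others: float   # 0-1, how often others reuse that code
    qa_reputation: float        # 0-1, e.g. popularity of forum answers
    language_affinity: float    # 0-1, phrase patterns linked to strong coders

WEIGHTS = {
    "code_elegance": 0.4,
    "adoption_by_others": 0.3,
    "qa_reputation": 0.2,
    "language_affinity": 0.1,
}

def score(s: CoderSignals) -> float:
    """Weighted sum of normalized signals, scaled to a 0-100 rating."""
    return 100 * (WEIGHTS["code_elegance"] * s.code_elegance
                  + WEIGHTS["adoption_by_others"] * s.adoption_by_others
                  + WEIGHTS["qa_reputation"] * s.qa_reputation
                  + WEIGHTS["language_affinity"] * s.language_affinity)

print(score(CoderSignals(0.8, 0.6, 0.7, 0.5)))  # about 69
```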
Javier E

Naomi Oreskes, a Lightning Rod in a Changing Climate - The New York Times - 0 views

  • Dr. Oreskes is fast becoming one of the biggest names in climate science — not as a climatologist, but as a defender who uses the tools of historical scholarship to counter what she sees as ideologically motivated attacks on the field.
  • Formally, she is a historian of science
  • Dr. Oreskes’s approach has been to dig deeply into the history of climate change denial, documenting its links to other episodes in which critics challenged a developing scientific consensus.
  • ...20 more annotations...
  • Her core discovery, made with a co-author, Erik M. Conway, was twofold. They reported that dubious tactics had been used over decades to cast doubt on scientific findings relating to subjects like acid rain, the ozone shield, tobacco smoke and climate change. And most surprisingly, in each case, the tactics were employed by the same group of people.
  • The central players were serious scientists who had major career triumphs during the Cold War, but in subsequent years apparently came to equate environmentalism with socialism, and government regulation with tyranny.
  • In a 2010 book, Dr. Oreskes and Dr. Conway called these men “Merchants of Doubt,” and this spring the book became a documentary film, by Robert Kenner. At the heart of both works is a description of methods that were honed by the tobacco industry in the 1960s and have since been employed to cast doubt on just about any science being cited to support new government regulations.
  • Dr. Oreskes, the more visible and vocal of the “Merchants” authors, has been threatened with lawsuits and vilified on conservative websites, and routinely gets hate mail calling her a communist or worse.
  • She established her career as a historian with a book-length study examining the role of dissent in the scientific method. As she put it a few months ago to an audience at Indiana University, she wanted to wrestle with this question: “How do you distinguish a maverick from a crank?”
  • Dr. Oreskes found that Wegener had been treated badly, particularly by American geologists. But he did not abandon his faith in the scientific method. He kept publishing until his death in 1930, trying to convince fellow scientists of his position, and was finally vindicated three decades later by oceanographic research conducted during the Cold War.
  • As she completed that study, Dr. Oreskes sought to understand how science was affected not only by the Cold War but by its end. In particular, she started wondering about climate science. Global warming had seemed to rise as an important issue around the time the Iron Curtain came down. Was this just a way for scientists to scare up research money that would no longer be coming their way through military channels?
  • the widespread public impression was that scientists were still divided over whether humans were primarily responsible for the warming of the planet. But how sharp was the split, she wondered?
  • She decided to do something no climate scientist had thought to do: count the published scientific papers. Pulling 928 of them, she was startled to find that not one dissented from the basic findings that warming was underway and human activity was the main reason.
  • She published that finding in a short paper in the journal Science in 2004, and the reaction was electric. Advocates of climate action seized on it as proof of a level of scientific consensus that most of them had not fully perceived. Just as suddenly, Dr. Oreskes found herself under political attack.
  • Some of the voices criticizing her — scientists like Dr. Singer and groups like the George C. Marshall Institute in Washington — were barely known to her at the time, Dr. Oreskes said in an interview. Just who were they?
  • It did not take them long to document that this group, which included prominent Cold War scientists, had been attacking environmental research for decades, challenging the science of the ozone layer and acid rain, even the finding that breathing secondhand tobacco smoke was harmful. Trying to undermine climate science was simply the latest project.
  • Dr. Oreskes and Dr. Conway came to believe that the attacks were patterned on the strategy employed by the tobacco industry when evidence of health risks first emerged. Documents pried loose by lawyers showed that the industry had paid certain scientists to contrive dubious research, had intimidated reputable scientists, and had cherry-picked evidence to present a misleading picture.
  • The tobacco industry had used these tactics in defense of profits. But Dr. Oreskes and Dr. Conway wrote that the so-called merchants of doubt had adopted them for a deep ideological reason: contempt for government regulation. The insight gave climate scientists a new way of understanding the politics that had engulfed their field.
  • Following Dr. Oreskes’s cue, researchers have in recent years developed a cottage industry of counting scientific papers and polling scientists. The results typically show that about 97 percent of working climate scientists accept that global warming is happening, that humans are largely responsible, and that the situation poses long-term risks, though the severity of those risks is not entirely clear. That wave of evidence has prompted many national news organizations to stop portraying the field as split evenly between scientists who are convinced and unconvinced.
  • Dr. Oreskes’s critics have taken delight in searching out errors in her books and other writings, prompting her to post several corrections. They have generally been minor, though, like describing a pH of six as neutral, when the correct number is seven. Dr. Oreskes described that as a typographical error.
  • In the leaked emails, Dr. Singer told a group of his fellow climate change denialists that he felt that Dr. Oreskes and Dr. Conway had libeled him. But in an interview, when pressed for specific errors in the book that might constitute libel, he listed none. Nor did he provide such a list in response to a follow-up email request.
  • However much she might be hated by climate change denialists, Dr. Oreskes is often welcomed on college campuses these days. She usually outlines the decades of research supporting the idea that human emissions pose serious risks.
  • “One of the things that should always be asked about scientific evidence is, how old is it?” Dr. Oreskes said. “It’s like wine. If the science about climate change were only a few years old, I’d be a skeptic, too.”
  • Dr. Oreskes and Dr. Conway keep looking for ways to reach new audiences. Last year, they published a short work of science fiction, written as a historical essay from the distant future. “The Collapse of Western Civilization: A View From the Future” argues that conservatives, by fighting sensible action to cope with the climate crisis, are essentially guaranteeing the long-term outcome they fear, a huge expansion of government.
Javier E

Where We Went Wrong | Harvard Magazine - 0 views

  • John Kenneth Galbraith assessed the trajectory of America’s increasingly “affluent society.” His outlook was not a happy one. The nation’s increasingly evident material prosperity was not making its citizens any more satisfied. Nor, at least in its existing form, was it likely to do so
  • One reason, Galbraith argued, was the glaring imbalance between the opulence in consumption of private goods and the poverty, often squalor, of public services like schools and parks
  • Another was that even the bountifully supplied private goods often satisfied no genuine need, or even desire; a vast advertising apparatus generated artificial demand for them, and satisfying this demand failed to provide meaningful or lasting satisfaction.
  • ...28 more annotations...
  • economist J. Bradford DeLong ’82, Ph.D. ’87, looking back on the twentieth century two decades after its end, comes to a similar conclusion but on different grounds.
  • DeLong, professor of economics at Berkeley, looks to matters of “contingency” and “choice”: at key junctures the economy suffered “bad luck,” and the actions taken by the responsible policymakers were “incompetent.”
  • these were “the most consequential years of all humanity’s centuries.” The changes they saw, while in the first instance economic, also “shaped and transformed nearly everything sociological, political, and cultural.”
  • DeLong’s look back over the twentieth century energetically encompasses political and social trends as well; nor is his scope limited to the United States. The result is a work of striking breadth.
  • labeling the book an economic history fails to convey its sweeping frame.
  • The century that is DeLong’s focus is what he calls the “long twentieth century,” running from just after the Civil War to the end of the 2000s when a series of events, including the biggest financial crisis since the 1930s followed by likewise the most severe business downturn, finally rendered the advanced Western economies “unable to resume economic growth at anything near the average pace that had been the rule since 1870.”
  • And behind those missteps in policy stood not just failures of economic thinking but a voting public that reacted perversely, even if understandably, to the frustrations poor economic outcomes had brought them.
  • Within this 140-year span, DeLong identifies two eras of “El Dorado” economic growth, each facilitated by expanding globalization, and each driven by rapid advances in technology and changes in business organization for applying technology to economic ends
  • from 1870 to World War I, and again from World War II to 1973
  • fellow economist Robert J. Gordon ’62, who in his monumental treatise on The Rise and Fall of American Economic Growth (reviewed in “How America Grew,” May-June 2016, page 68) hailed 1870-1970 as a “special century” in this regard (interrupted midway by the disaster of the 1930s).
  • Gordon highlighted the role of a cluster of once-for-all-time technological advances—the steam engine, railroads, electrification, the internal combustion engine, radio and television, powered flight
  • Pessimistic that future technological advances (most obviously, the computer and electronics revolutions) will generate productivity gains to match those of the special century, Gordon therefore saw little prospect of a return to the rapid growth of those halcyon days.
  • DeLong instead points to a series of noneconomic (and non-technological) events that slowed growth, followed by a perverse turn in economic policy triggered in part by public frustration: In 1973 the OPEC cartel tripled the price of oil, and then quadrupled it yet again six years later.
  • For all too many Americans (and citizens of other countries too), the combination of high inflation and sluggish growth meant that “social democracy was no longer delivering the rapid progress toward utopia that it had delivered in the first post-World War II generation.”
  • Frustration over these and other ills in turn spawned what DeLong calls the “neoliberal turn” in public attitudes and economic policy. The new economic policies introduced under this rubric “did not end the slowdown in productivity growth but reinforced it.”
  • the tax and regulatory changes enacted in this new climate channeled most of what economic gains there were to people already at the top of the income scale
  • Meanwhile, progressive “inclusion” of women and African Americans in the economy (and in American society more broadly) meant that middle- and lower-income white men saw even smaller gains—and, perversely, reacted by providing still greater support for policies like tax cuts for those with far higher incomes than their own.
  • Daniel Bell’s argument in his 1976 classic The Cultural Contradictions of Capitalism. Bell famously suggested that the very success of a capitalist economy would eventually undermine a society’s commitment to the values and institutions that made capitalism possible in the first place.
  • In DeLong’s view, the “greatest cause” of the neoliberal turn was “the extraordinary pace of rising prosperity during the Thirty Glorious Years, which raised the bar that a political-economic order had to surpass in order to generate broad acceptance.” At the same time, “the fading memory of the Great Depression led to the fading of the belief, or rather recognition, by the middle class that they, as well as the working class, needed social insurance.”
  • what the economy delivered to “hard-working white men” no longer matched what they saw as their just deserts: in their eyes, “the rich got richer, the unworthy and minority poor got handouts.”
  • As Bell would have put it, the politics of entitlement, bred by years of economic success that so many people had come to take for granted, squeezed out the politics of opportunity and ambition, giving rise to the politics of resentment.
  • The new era therefore became “a time to question the bourgeois virtues of hard, regular work and thrift in pursuit of material abundance.”
  • DeLong’s unspoken agenda would surely include rolling back many of the changes made in the U.S. tax code over the past half-century, as well as reinvigorating antitrust policy to blunt the dominance, and therefore outsize profits, of the mega-firms that now tower over key sectors of the economy
  • He would also surely reverse the recent trend moving away from free trade. Central bankers should certainly behave like Paul Volcker (appointed by President Carter), whose decisive action finally broke the 1970s inflation even at considerable economic cost
  • Not only Galbraith’s main themes but many of his more specific observations as well seem as pertinent, and important, today as they did then.
  • What will future readers of Slouching Towards Utopia conclude?
  • If anything, DeLong’s narratives will become more valuable as those events fade into the past. Alas, his description of fascism as having at its center “a contempt for limits, especially those implied by reason-based arguments; a belief that reality could be altered by the will; and an exaltation of the violent assertion of that will as the ultimate argument” will likely strike a nerve with many Americans not just today but in years to come.
  • what about DeLong’s core explanation of what went wrong in the latter third of his, and our, “long century”? I predict that it too will still look right, and important.
Javier E

How Poor Are the Poor? - NYTimes.com - 0 views

  • “Anyone who studies the issue seriously understands that material poverty has continued to fall in the U.S. in recent decades, primarily due to the success of anti-poverty programs” and the declining cost of “food, air-conditioning, communications, transportation, and entertainment,”
  • Despite the rising optimism, there are disagreements over how many poor people there are and the conditions they live under. There are also questions about the problem of relative poverty, what we are now calling inequality
  • Jencks argues that the actual poverty rate has dropped over the past five decades – far below the official government level — if poverty estimates are adjusted for food and housing benefits, refundable tax credits and a better method of determining inflation rates. In Jencks’s view, the war on poverty worked.
  • ...15 more annotations...
  • Democratic supporters of safety net programs can use Jencks’s finding that poverty has dropped below 5 percent as evidence that the war on poverty has been successful.
  • At the same time liberals are wary of positive news because, as Jencks notes: It is easier to rally support for such an agenda by saying that the problem in question is getting worse
  • The plus side for conservatives of Jencks’s low estimate of the poverty rate is the implication that severe poverty has largely abated, which then provides justification for enemies of government entitlement programs to cut social spending further.
  • At the same time, however, Jencks’s data undermines Republican claims that the war on poverty has been a failure – a claim exemplified by Ronald Reagan’s famous 1987 quip: “In the sixties we waged a war on poverty, and poverty won.”
  • Jencks’s conclusion: “The absolute poverty rate has declined dramatically since President Johnson launched his war on poverty in 1964.” At 4.8 percent, Jencks’s calculation is the lowest poverty estimate by a credible expert in the field.
  • his conclusion — that instead of the official count of 45.3 million people living in poverty, the number of poor people in America is just under 15 million — understates the scope of hardship in this country.
  • There are strong theoretical justifications for the use of a relative poverty measure. The Organization for Economic Cooperation and Development puts it this way: In order to participate fully in the social life of a community, individuals may need a level of resources that is not too inferior to the norms of a community. For example, the clothing budget that allows a child not to feel ashamed of his school attire is much more related to national living standards than to strict requirements for physical survival
  • using a relative measure shows that the United States lags well behind other developed countries: If you use the O.E.C.D. standard of 50 percent of median income as a poverty line, the United States looks pretty bad in cross-national relief. We have a relative poverty rate exceeded only by Chile, Turkey, Mexico and Israel (which has seen a big increase in inequality in recent years). And that rate in 2010 was essentially where it was in 1995
  • While the United States “has achieved real progress in reducing absolute poverty over the past 50 years,” according to Burtless, “the country may have made no progress at all in reducing the relative economic deprivation of folks at the bottom.”
  • the heart of the dispute: How severe is the problem of poverty?
  • Kathryn Edin, a professor of sociology at Johns Hopkins, and Luke Schaefer, a professor of social work at the University of Michigan, contend that the poverty debate overlooks crucial changes that have taken place within the population of the poor.
  • welfare reform, signed into law by President Clinton in 1996 (the Personal Responsibility and Work Opportunity Act), which limited eligibility for welfare benefits to five years. The limitation has forced many of the poor off welfare: over the past 19 years, the percentage of families falling under the official poverty line who receive welfare benefits has fallen to 26 percent from 68 percent. Currently, three-quarters of those in poverty, under the official definition, receive no welfare payments.
  • The enactment of expanded benefits for the working poor through the earned-income tax credit and the child tax credit. According to Edin and Schaefer, the consequence of these changes, taken together, has been to divide the poor who no longer receive welfare into two groups. The first group is made up of those who have gone to work and have qualified for tax credits. Expanded tax credits lifted about 3.2 million children out of poverty in 2013
  • The second group, though, has really suffered. These are the very poor who are without work, part of a population that is struggling desperately. Edin and Schaefer write that among the losers are an estimated 3.4 million “children who over the course of a year live for at least three months under a $2 per person per day threshold.”
  • Focusing on these findings, Mishel argues, diverts attention from the more serious problem of “the failure of the labor market to adequately reward low-wage workers.” To support his case, Mishel points out that hourly pay for those in the bottom fifth grew only 7.7 percent from 1979 to 2007, while productivity grew by 64 percent, and education levels among workers in this quintile substantially improved.
Javier E

With Dr. Stella Immanuel's viral video, this was the week America lost the war on misin... - 0 views

  • With nearly 150,000 dead from covid-19, we’ve not only lost the public-health war, we’ve lost the war for truth. Misinformation and lies have captured the castle.
  • And the bad guys’ most powerful weapon? Social media — in particular, Facebook
  • new research, out just this morning from Pew, tells us in painstaking numerical form exactly what’s going on, and it’s not pretty: Americans who rely on social media as their pathway to news are more ignorant and more misinformed than those who come to news through print, a news app on their phones or network TV.
  • ...6 more annotations...
  • And that group is growing.
  • “Even as Americans who primarily turn to social media for political news are less aware and knowledgeable about a wide range of events and issues in the news, they are more likely than other Americans to have heard about a number of false or unproven claims.”
  • Specifically, they’ve been far more exposed to the conspiracy theory that powerful people intentionally planned the pandemic. Yet this group, says Pew, is also less concerned about the impact of made-up news like this than the rest of the U.S. population.
  • They’re absorbing fake news, but they don’t see it as a problem. In a society that depends on an informed citizenry to make reasonably intelligent decisions about self-governance, this is the worst kind of trouble.
  • In a sweeping piece on disinformation and the 2020 campaign in February — in the pre-pandemic era — the Atlantic’s McKay Coppins concluded with a telling quote from the political theorist Hannah Arendt that bears repetition now. Through an onslaught of lies, which may be debunked before the cycle is repeated, totalitarian leaders are able to instill in their followers “a mixture of gullibility and cynicism,” she warned.
  • Over time, people are conditioned to “believe everything and nothing, think that everything was possible and that nothing was true.” And then such leaders can do pretty much whatever they wish
Javier E

'The Fourth Turning' Enters Pop Culture - The New York Times - 0 views

  • According to “fourth turning” proponents, American history goes through recurring cycles. Each one, which lasts about 80 to 100 years, consists of four generation-long seasons, or “turnings.” The winter season is a time of upheaval and reconstruction — a fourth turning.
  • The theory first appeared in “The Fourth Turning,” a work of pop political science that has had a cult following more or less since it was published in 1997. In the last few years of political turmoil, the book and its ideas have bubbled into the mainstream.
  • According to “The Fourth Turning,” previous crisis periods include the American Revolution, the Civil War and World War II. America entered its latest fourth turning in the mid-2000s. It will culminate in a crisis sometime in the 2020s — i.e., now.
  • ...13 more annotations...
  • One of the book’s authors, Neil Howe, 71, has become a frequent podcast guest. A follow-up, “The Fourth Turning Is Here,” comes out this month.
  • The play’s author, Will Arbery, 33, said he heard about “The Fourth Turning” while researching Stephen K. Bannon, the right-wing firebrand and former adviser to President Donald J. Trump, who is a longtime fan of the book and directed a 2010 documentary based on its ideas.
  • He described it as “this almost fun theory about history,” but added: “And yet there’s something deeply menacing about it.”
  • Mr. Arbery, who said he does not subscribe to the theory, sees parallels between the fourth turning and other nonscientific beliefs. “I modeled the way that Teresa talks about the fourth turning on the way that young liberals talk about astrology,” he said.
  • The book’s outlook on the near future has made it appealing to macro traders and crypto enthusiasts, and it is frequently cited on the podcasts “Macro Voices,” “Wealthion” and “On the Margin.”
  • In the new book, he describes what a coming civil war or geopolitical conflict might look like — though he shies away from casting himself as a modern-day Nostradamus.
  • “The Fourth Turning” captured a mood of decline in recent American life. “I remember feeling safe in the ’90s, and then as soon as 9/11 hit, the world went topsy-turvy,” he said. “Every time my cohort got to the point where we were optimistic, another crisis happened. When I read the book, I was like, ‘That makes sense.’”
  • “The Fourth Turning” was conceived during a period of relative calm. In the late 1980s, Mr. Howe, a Washington, D.C., policy analyst, teamed with William Strauss, a founder of the political satire troupe the Capitol Steps.
  • Their first book, “Generations,” told a story of American history through generational profiles going back to the 1600s. The book was said to have influenced Bill Clinton to choose a fellow baby boomer, Al Gore, as his running mate
  • when the 2008 financial crisis hit at almost exactly the point when the start of the fourth turning was predicted, it seemed to many that the authors might have been onto something. Recent events — the pandemic, the storming of the Capitol — have seemingly provided more evidence for the book’s fans.
  • Historically, a fourth turning crisis has always translated into a civil war, a war of great nations, or both, according to the book. Either is possible over the next decade, Mr. Howe said. But he is a doomsayer with an optimistic streak: Each fourth turning, in his telling, kicks off a renaissance in civic life.
  • “I’ve read ‘The Fourth Turning,’ and indeed found it useful from a macroeconomic investing perspective,” Lyn Alden, 35, an investment analyst, wrote in an email. “History doesn’t repeat, but it kind of gives us a loose framework to work with.”
  • “This big tidal shift is arriving,” Mr. Howe said. “But if you’re asking me which wave is going to knock down the lighthouse, I can’t do that. I can just tell you that this is the time period. It gives you a good idea of what to watch for.”
Javier E

Episode 203 - Transcript - Philosophize This! - 0 views

  • what do you think the average person LIVING in postmodern society would say if you asked them…how do you determine what right or WRONG is in a given situation?
  • I think MOST people…a GOOD percentage of specifically YOUNG people alive today if you PRESSED them HARD enough on it would say that they think morality…is something that’s RELATIVE. 
  • They’ll say who am I to claim… that one culture is better or worse than any OTHER culture. THEIR values make sense to THEM…MY values make sense to ME. I can’t appeal to anything objectively BETTER about mine than theirs…and I CERTAINLY, as someone born into a postmodern type of subjectivity, have to be VERY skeptical of any sort of GRAND NARRATIVE that’s been constructed out there that tries to make CLAIMS about moral objectivity. Those don’t EXIST to me. So therefore, morality is relative. 
  • ...46 more annotations...
  • And then if you ask those SAME PEOPLE okay: well if that’s the CASE… then how should we be TREATING other people or cultures that see things differently than YOU do. And again for a lot of young people LIVING in a postmodern society their answer is often…that we should treat them with TOLERANCE.
  • And it makes SENSE: see because in a world where every moral conclusion is equally valid…then, of COURSE, you should be TOLERANT of people to be able hold whatever positions they WANT to. 
  • there’s OTHER people out there that would say to this person… that this tolerant relativism is actually…a glaring contradiction. That it’s SUCH A contradiction that it actually becomes an indefensible, philosophical position…because if every person and every culture out there is equally correct about morality…then that would mean that even the most INTOLERANT cultures, would have to be right as well
  • Which then makes your ADDITIONAL belief that TOLERANCE is the CORRECT way to be BEHAVING in this world…it makes it INCOMPATIBLE with TRUE moral relativism. 
  • the reason YOUNG people would be the ones that you see HOLDING this kind of position… is because they often times haven’t really been TESTED yet in life…where there’s a LINE in the sand and they’re FORCED to TAKE SIDES in difficult, moral issues, that NEED a decision to be made. 
  • Tolerant Relativism if you wanted to break it down…is REALLY something you see MOSTLY… in privileged, wealthy, WESTERN societies…because they would say the ONLY type of person that can HOLD that position for very long… are people that live in societies that are PEACEFUL enough… that they don’t really HAVE some group that opposes their entire existence that they feel they need to DEFEND themselves against. 
  • You know they’d say it’s funny… how your moral relativism starts to FADE a bit the second there’s a dude with an axe on your doorstep…it’s a pretty difficult act to pull off when you’re watching your family get dismembered in front of you to say your beliefs, my beliefs…let’s just call it halfsies halfsies why don’t we. 
  • Again there’s SOME people out there that would say that TRUE moral reasoning…. ONLY actually begins…when someone DECLARES a set of moral universals…and then is mature enough to recognize the WEIGHT and COMPLEXITY that comes along with DOING something like that.
  • as we talked about a couple episodes ago to Zizek: EVEN WITHIN something like postmodernism… that on the surface is SKEPTICAL of ANY of these universals…in the sense that postmodernism ELEVATES DIFFERENCE and CELEBRATES it as the most important factor…to someone like Zizek…this is NOT a postmodernist REJECTING universals…to HIM this is JUST creating a UNIVERSAL out of DIFFERENCE. 
  • maybe it’s IMPOSSIBLE for someone to NOT be following moral universals…it’s just possible for people to not be AWARE of the ones they’re supporting…or to live in a place that’s PEACEFUL enough to not REQUIRE you to look at yours deeper. 
  • let’s PROCEED from here as though this is the case. That a VERY important piece of making ANY sort of PROGRESS in the world…is GOING to require people to DECLARE certain moral universals…and then to be able to ACT on them without having to apologize for them constantly. 
  • This PERSON would say there’s an INFINITE number of WAYS that history can be interpreted…and OUR responsibility is to SUBVERT the existing narratives and tell the stories of the voiceless from the past!
  • IF that is TRUE…then it would make TOTAL SENSE to Mark Fisher why the cultural LOGIC of postmodernism…LEAVES us in a PLACE he thinks…where we are COMPLETELY STUCK…in the present. 
  • he CALLS the western world a society that has a memory condition: the western world has what’s called anterograde amnesia. 
  • there’s a MOVIE that can help illustrate his point here. Mark Fisher compares how we are as a society…to the character named Leonard…in the movie Memento, directed by Christopher Nolan in the year 2000
  • The main character is a guy named Leonard…that can’t FORM new memories. Importantly in the movie he’s ALSO a guy whose wife was murdered not too long ago.
  • And he remembers EVERYTHING about his life up until a certain POINT…but once he gets sick, no matter how hard he tries, he just doesn’t remember anything BEYOND that.
  • Now in the movie…he’s ALSO trying to SOLVE the murder of his wife, so whenever he gets a piece of information he doesn’t want to forget that could help him figure it out…he tattoos it on his body, he takes a bunch of pictures, he makes notes about it…he essentially is a man…that has a MAJOR MYSTERY that he needs to solve that is SUPER important to him, but is constantly living in this HAZE where he CAN’T form new memories has to be SKEPTICAL of everything around him and lives pretty much every day in a state of confusion. 
  • To Mark Fisher…this DESCRIBES the life of a modern person maybe BETTER than it first may seem, and it CERTAINLY describes the condition of society overall. We are LIVING in a state of CULTURAL amnesia…where we CAN’T remember our PAST, which then makes it IMPOSSIBLE to accurately diagnose the present, and even MORE difficult to be able to IMAGINE a different social future that may be better off for people. 
  • THINK of the CONFUSION that postmodernism often LEAVES people in. When you QUESTION…GRAND NARRATIVES about the world you live in…and MORE than that: when QUESTIONING narratives and universals BECOMES something that’s VERY important to you
  • the COST of that often times are the things that TRADITIONALLY, have GIVEN people a clear sense of IDENTITY all throughout human history: that is the METANARRATIVES that UNIFY societies together around certain common stories we have about reality. 
  • As an example: THINK of how this applies to HISTORY…as ONE of those common stories societies usually have.
  • There’s ONE version of history that’s taught to people in CLASSROOMS…that centers history around great WARS that have taken place. Memorizing a bunch of dates…THIS is when Napoleon invaded Russia…THIS is when the Magna Carta was signed…in other words: HUMAN HISTORY… is just a progression of different great leaders… SEIZING territory from each other. 
  • And there’s a CRITICISM of that view that is well received by people in post modern society that says: well THAT’S not the whole story of what humanity is! We’re talking about ALL human BEINGS here…HUMAN history is JUST as much the summer romance between two people that fall in love…the life of a street vendor in 9th century baghdad…
  • that could BE because you live in a really safe, peaceful country…it could ALSO just be you MANUFACTURING a peaceful environment like that in your LIFE, by surrounding yourself with FRIENDS who all AGREE with you. 
  • again this is generally seen as a REALLY NICE sentiment to people LIVING in a postmodern world. 
  • But what that ALSO brings along with it some people say…is a CREATIVE LICENSE to able to REINTERPRET human history…and PRESENT it in a way that just BENEFITS whatever political ends you’re trying to JUSTIFY. 
  • For example in MY country the United States…the FOUNDING FATHERS of our country, who WERE any of these dudes with buckles on me shoes and powdered wigs? Like what’s the TRUE answer to that question?
  • in MANY cases in postmodern society…it all depends on what side of the political aisle you LAND on…ONE side of it interprets history in a way where these men were some of the greatest political minds to have ever LIVED on planet earth, launching the greatest experiment in nation building that has EVER been launched.
  • Now it’s ALSO possible to see these men as SLAVE owners, bigots, people that were actively complicit in the extermination of the native americans, and MUCH MORE. But WHICH one of these is TRUE? 
  • you’ll SEE this happen when it comes to MOST of a postmodern subject’s view of history. Where DEPENDING on what STORY you believe about the recent PAST of the place you live in…that will DETERMINE the way that you see the present, and then what you think the next, best MOVES are for the future
  • But if nobody can AGREE on what their HISTORY is…then HISTORY isn’t a METANARRATIVE anymore that UNIFIES a society…HISTORY just becomes this fragmented STORY that’s used as an INSTRUMENT to prove your political bias. The SAME events, the SAME historical FIGURES…the MEANING of them will COMPLETELY CHANGE depending on who’s EVOKING them.
  • HISTORY is not the ONLY example of a metanarrative that’s been deconstructed to the point that it no longer has the same unifying potential as in former societies
  • From shared rituals, community bonds, a shared conception of truth more generally, MOST things that unify your understanding of what your culture is all about, and who YOU are as a person WITHIN that culture.
  • there’s a REASON SEVERAL, modern day philosophers… have DESCRIBED the world we live in… as Schizophrenic.
  • that’s obviously not a CLINICAL diagnosis they’re making…
  • It’s a metaphor for the TYPE of experience that’s often available for people, where there’s a BREAKDOWN… of these unifying metanarratives...that help us develop a CLEAR sense of who we ARE…and an obvious, DEFINED POSITION within the world around us with clear boundaries to it. 
  • Feeling confused, like you DON’T REALLY know what’s going on, and you don’t know who or what to read to FIGURE out what’s going on, and you think the ONE thing that’s for sure is that people that CLAIM to know what’s going on, are CLEARLY idiots, and you feel like every year sort of blends into the next with no REAL prospects on the horizon for different ways of living that may come about in the future…this is a COMMON complaint…of people LIVING in postmodern culture
  • it’s BECAUSE postmodernism…at bottom…IS the critique of the critique. It is a reaction video to a reaction video about reality. It is FUNDAMENTALLY, NOT ABOUT CONSTRUCTING any NEW cultural forms…it’s about DECONSTRUCTION. It’s about the elevation of DIFFERENCE to the level of the universal. 
  • This is what MAKES the critique so EFFECTIVE…but it ALSO COMES with certain social effects. It becomes VERY difficult to go EXTERNAL to yourself to find MEANING…or to DECLARE universals and look to the FUTURE as a way out
  • So, what HAPPENS…is when people can’t go EXTERNAL they turn INWARD towards NARCISSISM…and because they can’t go FORWARDS they turn BACKWARDS towards nostalgia. 
  • THIS is going to be the other part of this unique BLEND we talked about last episode that is going to LEAD us to this state of affairs called Capitalist Realism
  • Where everything we talked about LAST episode with neoliberalism, the focus is on the individual and the expansion of CAPITAL for the sake of CAPITAL…gets combined with postmodernism…that puts people in a HAZE where they are CONFUSED and INCAPABLE of ORIENTING themselves in TIME…let ALONE being able to imagine a different social future. 
  • To put it ANOTHER way: we are STUCK for Mark Fisher in a confused, narcissistic PRESENT moment…with NO conception of what the future should look like.
  • And as HE said: Capitalist Realism’s IMPOSSIBLE to define in a single sentence…the best way to SHOW people what Capitalist Realism is…is just to give them example, after example… that they can see in the world all around them
  • show through examples how IN this postmodern, neoliberal VACUUM that’s been created…how we ACCEPT the FALSE reality that CAPITALISM…is NOT an economic system…it’s just simply the WAY the world is, with no hope of changing it. 
  • Episode #203 - Why the future is being slowly cancelled. - Postmodernism (Mark Fisher, Capitalist Realism)
Javier E

David Hare: A Political and Personal Playwright - Lantern Theater Company: Searchlight ... - 0 views

  • The Vertical Hour was Hare’s next play after Stuff Happens, making the global machinations of the Iraq War and the related decisions and consequences intensely personal. This, too, is a hallmark of Hare’s work. His plays are populated by damaged idealists trying to find a way to live right in a world of broken and corrupt institutions, but they are never content to offer just one view — even if the playwright himself passionately holds one opinion.
  • Working to bridge the political left and right, or at least to locate the places where they overlap, is a core component of his work, and specifically of The Vertical Hour
  • “I was very interested in the position of the pro-war liberals. They had a very strong moral case for intervention. It was part of something that had been building up over the previous 10 or 15 years in Africa, in Yugoslavia, where a whole lot of well-intentioned liberals came to believe that the West had a moral duty to intervene when there was great deal of suffering. Although I was against the war, I could see that there was a virtuous case.”
  • ...4 more annotations...
  • Giving all sides a hearing is not just a political tactic in his plays, though; it is essential to crafting work that highlights contemporary issues through finely realized and deeply felt characters
  • Hare’s work, populated by those who want to help, those who want to exploit broken institutions, and those caught in the morass of contemporary life, aims to make a tangible difference in those lives and institutions.
  • “Beckett said that famous thing, the number of tears in the world is constant, meaning, whatever you do, there is such a thing called the human condition and it’s always the same,” Hare said to NPR.
  • “I don’t believe that. I believe things are very different in one country to another and at one time and another, and you can actually relieve the number of tears in the world. And you can make them less and that the job of making them less is a noble job and something worth undertaking.”
anonymous

Opinion | He's a Famous Evangelical Preacher, but His Kids Wish He'd Pipe Down - The Ne... - 1 views

  • He’s a Famous Evangelical Preacher, but His Kids Wish He’d Pipe Down
  • The Rev. Rick Joyner has called on Christians to arm themselves for civil war. But his children would be on the other side.
  • The Rev. Rick Joyner is a famous evangelical leader who has called on Christians to arm themselves for an inevitable civil war against liberals, whom he suggests are allies of the devil.
  • ...23 more annotations...
  • But this is the awkward part: His five children would be on the other side of that civil war, as he and his kids all acknowledge
  • She worries that his far-right rhetoric may get people killed, so she feels a responsibility to challenge him.
  • “He talks about Democrats being evil, forgetting that all five of his kids vote Democratic,”
  • “Who is he asking his followers to take up arms against? Liberal activists? That’s me.”
  • Just as America is torn asunder by politics and polarization, so is the Joyner family.
  • “I hope my kids don’t get involved in the violence, but it’s coming,”
  • “I think what he does is morally wrong, but I love him,”
  • “I don’t want to hurt him, but when he’s spreading dangerous ideas, it gets complicated.”
  • “One of my goals as a parent was to raise strong, independent children,”
  • “But I think I overshot the runway.”
  • “I think it’s completely possible that some of my dad’s followers could pick up guns and cause violence because they think they’re defending the country,”
  • He claims liberals are in league with Satan and Democrats are plotting “to criminalize Christianity.”
  • Joyner’s rants leave his children furiously texting back and forth in exasperation (they say their mom is somewhere in the middle).
  • “There is a responsibility to hold those you love accountable,”
  • But how? The siblings disagree among themselves.
  • “I’m not willing to sacrifice my relationship with him to call him out.”
  • The most outspoken is Anna Jane, who says her father’s rhetoric became more extreme in recent years. She had her first falling out with him when she became a Democrat and an environmentalist while a sophomore at the University of North Carolina at Chapel Hill. He cut off her college funding, and she moved to New Zealand. Soon after, she nearly died in a boating accident — and the first person she called was her dad. They reconciled.
  • He also promoted a film by his son Ben about a gay man in the South, even though it likewise had him gritting his teeth.
  • “The church in America has been tremendously weakened,”
  • “At what point can I no longer go home for Thanksgiving and watch football with my dad?” Ben mused. “By doing so, am I condoning his behavior? It can be hard to draw that line in the sand, especially when you love this person.”
  • “Is it OK to just talk about movies and dogs with someone who’s trying to incite civil war? I don’t know.”
  • She was furious at her father for his climate denial — but he’s the person she called in that crisis, and he stayed on the phone with her for much of the night, relaying the latest information and helping to keep her safe.
  • “I’m so angry at him for his politics and for endangering me and all of us by not believing in climate change,” she said. “And yet he’s the one I turn to in the middle of the night when I’m evacuating and I’m really scared.”
Javier E

The Coming Software Apocalypse - The Atlantic - 1 views

  • Our standard framework for thinking about engineering failures—reflected, for instance, in regulations for medical devices—was developed shortly after World War II, before the advent of software, for electromechanical systems. The idea was that you make something reliable by making its parts reliable (say, you build your engine to withstand 40,000 takeoff-and-landing cycles) and by planning for the breakdown of those parts (you have two engines). But software doesn’t break. Intrado’s faulty threshold is not like the faulty rivet that leads to the crash of an airliner. The software did exactly what it was told to do. In fact it did it perfectly. The reason it failed is that it was told to do the wrong thing.
  • Software failures are failures of understanding, and of imagination. Intrado actually had a backup router, which, had it been switched to automatically, would have restored 911 service almost immediately. But, as described in a report to the FCC, “the situation occurred at a point in the application logic that was not designed to perform any automated corrective actions.”
  • “The problem is that software engineers don’t understand the problem they’re trying to solve, and don’t care to,” says Leveson, the MIT software-safety expert. The reason is that they’re too wrapped up in getting their code to work.
  • ...52 more annotations...
  • Code is too hard to think about. Before trying to understand the attempts themselves, then, it’s worth understanding why this might be: what it is about code that makes it so foreign to the mind, and so unlike anything that came before it.
  • Technological progress used to change the way the world looked—you could watch the roads getting paved; you could see the skylines rise. Today you can hardly tell when something is remade, because so often it is remade by code.
  • Software has enabled us to make the most intricate machines that have ever existed. And yet we have hardly noticed, because all of that complexity is packed into tiny silicon chips as millions and millions of lines of code
  • The programmer, the renowned Dutch computer scientist Edsger Dijkstra wrote in 1988, “has to be able to think in terms of conceptual hierarchies that are much deeper than a single mind ever needed to face before.” Dijkstra meant this as a warning.
  • As programmers eagerly poured software into critical systems, they became, more and more, the linchpins of the built world—and Dijkstra thought they had perhaps overestimated themselves.
  • What made programming so difficult was that it required you to think like a computer.
  • The introduction of programming languages like Fortran and C, which resemble English, and tools, known as “integrated development environments,” or IDEs, that help correct simple mistakes (like Microsoft Word’s grammar checker but for code), obscured, though did little to actually change, this basic alienation—the fact that the programmer didn’t work on a problem directly, but rather spent their days writing out instructions for a machine.
  • Though he runs a lab that studies the future of computing, Bret Victor seems less interested in technology per se than in the minds of the people who use it. Like any good toolmaker, he has a way of looking at the world that is equal parts technical and humane. He graduated at the top of his class in electrical engineering at the California Institute of Technology.
  • “The serious problems that have happened with software have to do with requirements, not coding errors.” When you’re writing code that controls a car’s throttle, for instance, what’s important is the rules about when and how and by how much to open it. But these systems have become so complicated that hardly anyone can keep them straight in their head. “There’s 100 million lines of code in cars now,” Leveson says. “You just cannot anticipate all these things.”
  • a nearly decade-long investigation into claims of so-called unintended acceleration in Toyota cars. Toyota blamed the incidents on poorly designed floor mats, “sticky” pedals, and driver error, but outsiders suspected that faulty software might be responsible
  • software experts spent 18 months with the Toyota code, picking up where NASA left off. Barr described what they found as “spaghetti code,” programmer lingo for software that has become a tangled mess. Code turns to spaghetti when it accretes over many years, with feature after feature piling on top of, and being woven around, the code that came before.
  • Using the same model as the Camry involved in the accident, Barr’s team demonstrated that there were actually more than 10 million ways for the onboard computer to cause unintended acceleration. They showed that as little as a single bit flip—a one in the computer’s memory becoming a zero or vice versa—could make a car run out of control. The fail-safe code that Toyota had put in place wasn’t enough to stop it (a toy bit-flip sketch follows this list).
  • In all, Toyota recalled more than 9 million cars, and paid nearly $3 billion in settlements and fines related to unintended acceleration.
  • The problem is that programmers are having a hard time keeping up with their own creations. Since the 1980s, the way programmers work and the tools they use have changed remarkably little.
  • This is the trouble with making things out of code, as opposed to something physical. “The complexity,” as Leveson puts it, “is invisible to the eye.”
  • The fact that the two of them were thinking about the same problem in the same terms, at the same time, was not a coincidence. They had both just seen the same remarkable talk, given to a group of software-engineering students in a Montreal hotel by a computer researcher named Bret Victor. The talk, which went viral when it was posted online in February 2012, seemed to be making two bold claims. The first was that the way we make software is fundamentally broken. The second was that Victor knew how to fix it.
  • “Visual Studio is one of the single largest pieces of software in the world,” he said. “It’s over 55 million lines of code. And one of the things that I found out in this study is more than 98 percent of it is completely irrelevant. All this work had been put into this thing, but it missed the fundamental problems that people faced. And the biggest one that I took away from it was that basically people are playing computer inside their head.” Programmers were like chess players trying to play with a blindfold on—so much of their mental energy is spent just trying to picture where the pieces are that there’s hardly any left over to think about the game itself.
  • in early 2012, Victor had finally landed upon the principle that seemed to thread through all of his work. (He actually called the talk “Inventing on Principle.”) The principle was this: “Creators need an immediate connection to what they’re creating.” The problem with programming was that it violated the principle. That’s why software systems were so hard to think about, and so rife with bugs: The programmer, staring at a page of text, was abstracted from whatever it was they were actually making.
  • “Our current conception of what a computer program is,” he said, is “derived straight from Fortran and ALGOL in the late ’50s. Those languages were designed for punch cards.”
  • WYSIWYG (pronounced “wizzywig”) came along. It stood for “What You See Is What You Get.”
  • Victor’s point was that programming itself should be like that. For him, the idea that people were doing important work, like designing adaptive cruise-control systems or trying to understand cancer, by staring at a text editor, was appalling.
  • With the right interface, it was almost as if you weren’t working with code at all; you were manipulating the game’s behavior directly.
  • When the audience first saw this in action, they literally gasped. They knew they weren’t looking at a kid’s game, but rather the future of their industry. Most software involved behavior that unfolded, in complex ways, over time, and Victor had shown that if you were imaginative enough, you could develop ways to see that behavior and change it, as if playing with it in your hands. One programmer who saw the talk wrote later: “Suddenly all of my tools feel obsolete.”
  • When John Resig saw the “Inventing on Principle” talk, he scrapped his plans for the Khan Academy programming curriculum. He wanted the site’s programming exercises to work just like Victor’s demos. On the left-hand side you’d have the code, and on the right, the running program: a picture or game or simulation. If you changed the code, it’d instantly change the picture. “In an environment that is truly responsive,” Resig wrote about the approach, “you can completely change the model of how a student learns ... [They] can now immediately see the result and intuit how underlying systems inherently work without ever following an explicit explanation.” Khan Academy has become perhaps the largest computer-programming class in the world, with a million students, on average, actively using the program each month.
  • “Everyone thought I was interested in programming environments,” he said. Really he was interested in how people see and understand systems—as he puts it, in the “visual representation of dynamic behavior.” Although code had increasingly become the tool of choice for creating dynamic behavior, it remained one of the worst tools for understanding it. The point of “Inventing on Principle” was to show that you could mitigate that problem by making the connection between a system’s behavior and its code immediate.
  • “Typically the main problem with software coding—and I’m a coder myself,” Bantegnie says, “is not the skills of the coders. The people know how to code. The problem is what to code. Because most of the requirements are kind of natural language, ambiguous, and a requirement is never extremely precise, it’s often understood differently by the guy who’s supposed to code.”
  • In a pair of later talks, “Stop Drawing Dead Fish” and “Drawing Dynamic Visualizations,” Victor went one further. He demoed two programs he’d built—the first for animators, the second for scientists trying to visualize their data—each of which took a process that used to involve writing lots of custom code and reduced it to playing around in a WYSIWYG interface.
  • Victor suggested that the same trick could be pulled for nearly every problem where code was being written today. “I’m not sure that programming has to exist at all,” he told me. “Or at least software developers.” In his mind, a software developer’s proper role was to create tools that removed the need for software developers. Only then would people with the most urgent computational problems be able to grasp those problems directly, without the intermediate muck of code.
  • Victor implored professional software developers to stop pouring their talent into tools for building apps like Snapchat and Uber. “The inconveniences of daily life are not the significant problems,” he wrote. Instead, they should focus on scientists and engineers—as he put it to me, “these people that are doing work that actually matters, and critically matters, and using really, really bad tools.”
  • Bantegnie’s company is one of the pioneers in the industrial use of model-based design, in which you no longer write code directly. Instead, you create a kind of flowchart that describes the rules your program should follow (the “model”), and the computer generates code for you based on those rules
  • In a model-based design tool, you’d represent this rule with a small diagram, as though drawing the logic out on a whiteboard, made of boxes that represent different states—like “door open,” “moving,” and “door closed”—and lines that define how you can get from one state to the other. The diagrams make the system’s rules obvious: Just by looking, you can see that the only way to get the elevator moving is to close the door, or that the only way to get the door open is to stop (see the state-machine sketch after this list).
  • In traditional programming, your task is to take complex rules and translate them into code; most of your energy is spent doing the translating, rather than thinking about the rules themselves. In the model-based approach, all you have is the rules. So that’s what you spend your time thinking about. It’s a way of focusing less on the machine and more on the problem you’re trying to get it to solve.
  • The ideas spread. The notion of liveness, of being able to see data flowing through your program instantly, made its way into flagship programming tools offered by Google and Apple. The default language for making new iPhone and Mac apps, called Swift, was developed by Apple from the ground up to support an environment, called Playgrounds, that was directly inspired by Light Table.
  • On this view, software becomes unruly because the media for describing what software should do—conversations, prose descriptions, drawings on a sheet of paper—are too different from the media describing what software does do, namely, code itself.
  • for this approach to succeed, much of the work has to be done well before the project even begins. Someone first has to build a tool for developing models that are natural for people—that feel just like the notes and drawings they’d make on their own—while still being unambiguous enough for a computer to understand. They have to make a program that turns these models into real code. And finally they have to prove that the generated code will always do what it’s supposed to.
  • The practice brings order and accountability to large codebases. But, Shivappa says, “it’s a very labor-intensive process.” He estimates that before they used model-based design, on a two-year-long project only two to three months were spent writing code—the rest was spent working on the documentation.
  • Much of the benefit of the model-based approach comes from being able to add requirements on the fly while still ensuring that existing ones are met; with every change, the computer can verify that your program still works. You’re free to tweak your blueprint without fear of introducing new bugs. Your code is, in FAA parlance, “correct by construction.”
  • “people are not so easily transitioning to model-based software development: They perceive it as another opportunity to lose control, even more than they have already.”
  • The bias against model-based design, sometimes known as model-driven engineering, or MDE, is in fact so ingrained that according to a recent paper, “Some even argue that there is a stronger need to investigate people’s perception of MDE than to research new MDE technologies.”
  • “Human intuition is poor at estimating the true probability of supposedly ‘extremely rare’ combinations of events in systems operating at a scale of millions of requests per second,” he wrote in a paper. “That human fallibility means that some of the more subtle, dangerous bugs turn out to be errors in design; the code faithfully implements the intended design, but the design fails to correctly handle a particular ‘rare’ scenario.”
  • Newcombe was convinced that the algorithms behind truly critical systems—systems storing a significant portion of the web’s data, for instance—ought to be not just good, but perfect. A single subtle bug could be catastrophic. But he knew how hard bugs were to find, especially as an algorithm grew more complex. You could do all the testing you wanted and you’d never find them all.
  • An algorithm written in TLA+ could in principle be proven correct. In practice, it allowed you to create a realistic model of your problem and test it not just thoroughly, but exhaustively. This was exactly what he’d been looking for: a language for writing perfect algorithms (a toy exhaustive-checking sketch follows this list).
  • TLA+, which stands for “Temporal Logic of Actions,” is similar in spirit to model-based design: It’s a language for writing down the requirements—TLA+ calls them “specifications”—of computer programs. These specifications can then be completely verified by a computer. That is, before you write any code, you write a concise outline of your program’s logic, along with the constraints you need it to satisfy
  • Programmers are drawn to the nitty-gritty of coding because code is what makes programs go; spending time on anything else can seem like a distraction. And there is a patient joy, a meditative kind of satisfaction, to be had from puzzling out the micro-mechanics of code. But code, Lamport argues, was never meant to be a medium for thought. “It really does constrain your ability to think when you’re thinking in terms of a programming language,”
  • Code makes you miss the forest for the trees: It draws your attention to the working of individual pieces, rather than to the bigger picture of how your program fits together, or what it’s supposed to do—and whether it actually does what you think. This is why Lamport created TLA+. As with model-based design, TLA+ draws your focus to the high-level structure of a system, its essential logic, rather than to the code that implements it.
  • But TLA+ occupies just a small, far corner of the mainstream, if it can be said to take up any space there at all. Even to a seasoned engineer like Newcombe, the language read at first as bizarre and esoteric—a zoo of symbols.
  • this is a failure of education. Though programming was born in mathematics, it has since largely been divorced from it. Most programmers aren’t very fluent in the kind of math—logic and set theory, mostly—that you need to work with TLA+. “Very few programmers—and including very few teachers of programming—understand the very basic concepts and how they’re applied in practice. And they seem to think that all they need is code,” Lamport says. “The idea that there’s some higher level than the code in which you need to be able to think precisely, and that mathematics actually allows you to think precisely about it, is just completely foreign. Because they never learned it.”
  • “In the 15th century,” he said, “people used to build cathedrals without knowing calculus, and nowadays I don’t think you’d allow anyone to build a cathedral without knowing calculus. And I would hope that after some suitably long period of time, people won’t be allowed to write programs if they don’t understand these simple things.”
  • Programmers, as a species, are relentlessly pragmatic. Tools like TLA+ reek of the ivory tower. When programmers encounter “formal methods” (so called because they involve mathematical, “formally” precise descriptions of programs), their deep-seated instinct is to recoil.
  • Formal methods had an image problem. And the way to fix it wasn’t to implore programmers to change—it was to change yourself. Newcombe realized that to bring tools like TLA+ to the programming mainstream, you had to start speaking their language.
  • he presented TLA+ as a new kind of “pseudocode,” a stepping-stone to real code that allowed you to exhaustively test your algorithms—and that got you thinking precisely early on in the design process. “Engineers think in terms of debugging rather than ‘verification,’” he wrote, so he titled his internal talk on the subject to fellow Amazon engineers “Debugging Designs.” Rather than bemoan the fact that programmers see the world in code, Newcombe embraced it. He knew he’d lose them otherwise. “I’ve had a bunch of people say, ‘Now I get it,’” Newcombe says.
  • In the world of the self-driving car, software can’t be an afterthought. It can’t be built like today’s airline-reservation systems or 911 systems or stock-trading systems. Code will be put in charge of hundreds of millions of lives on the road and it has to work. That is no small task.
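An aside on what “exhaustively” means here. Real TLA+ specifications are written in Lamport’s mathematical notation and checked with the TLC model checker; the sketch below is only a loose analogy in Python, not TLA+, and the toy system (two threads performing an unlocked read-increment-write on a shared counter) and its invariant are invented for illustration. The method is the point: enumerate every reachable state of a small model and check the invariant in each one, rather than sampling a few test runs and hoping.

```python
from collections import deque

# Toy model: two threads each do read-increment-write on a shared counter,
# with no locking. A state is (counter, (pc1, local1), (pc2, local2)),
# where pc 0 = about to read, 1 = about to write, 2 = done.
INITIAL = (0, (0, None), (0, None))

def next_states(state):
    """Every state reachable in one atomic step, i.e. every interleaving."""
    counter, *threads = state
    successors = []
    for i, (pc, local) in enumerate(threads):
        if pc == 0:  # atomically read the shared counter
            updated = list(threads)
            updated[i] = (1, counter)
            successors.append((counter, *updated))
        elif pc == 1:  # atomically write back the (possibly stale) value + 1
            updated = list(threads)
            updated[i] = (2, local)
            successors.append((local + 1, *updated))
    return successors

def invariant(state):
    """Once both threads have finished, the counter must equal 2."""
    counter, t1, t2 = state
    return counter == 2 if (t1[0] == 2 and t2[0] == 2) else True

def check(initial):
    """Breadth-first search of the entire state space; returns a bad state, if any."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if not invariant(state):
            return state  # a counterexample, akin to what TLC reports
        for nxt in next_states(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return None

print(check(INITIAL))  # prints (1, (2, 0), (2, 0)): both threads read 0
```

Ordinary testing could run this toy program many times and never hit the bad interleaving; the exhaustive search visits every interleaving by construction and surfaces the lost-update race at once. That, in miniature, is what “not just thoroughly, but exhaustively” buys you.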
Javier E

Marie Kondo and the Ruthless War on Stuff - The New York Times - 1 views

  • the method outlined in Kondo’s book. It includes something called a “once-in-a-lifetime tidying marathon,” which means piling five categories of material possessions — clothing, books, papers, miscellaneous items and sentimental items, including photos, in that order — one at a time, surveying how much of each you have, seeing that it’s way too much and then holding each item to see if it sparks joy in your body. The ones that spark joy get to stay. The ones that don’t get a heartfelt and generous goodbye, via actual verbal communication, and are then sent on their way to their next life.
  • She is often mistaken for someone who thinks you shouldn’t own anything, but that’s wrong. Rather, she thinks you can own as much or as little as you like, as long as every possession brings you true joy.
  • By the time her book arrived, America had entered a time of peak stuff, when we had accumulated a mountain of disposable goods — from Costco toilet paper to Isaac Mizrahi swimwear by Target — but hadn’t (and still haven’t) learned how to dispose of them. We were caught between an older generation that bought a princess phone in 1970 for $25 that was still working and a generation that bought $600 iPhones, knowing they would have to replace them within two years. We had the princess phone and the iPhone, and we couldn’t dispose of either. We were burdened by our stuff; we were drowning in it.
  • ...16 more annotations...
  • A woman named Diana, who wore star-and-flower earrings, said that before she tidied, her life was out of control. Her job had recently been eliminated when she found the book. “It’s a powerful message for women that you should be surrounded by things that make you happy.”
  • “I found the opposite of happiness is not sadness,” Diana told us. “It’s chaos.”
  • Another woman said she KonMaried a bad boyfriend. Having tidied everything in her home and finding she still distinctly lacked happiness, she held her boyfriend in her hands, realized he no longer sparked joy and got rid of him.
  • She realized that the work she was doing as a tidying consultant was far more psychological than it was practical. Tidying wasn’t just a function of your physical space; it was a function of your soul.
  • She wants you to override the instinct to keep a certain thing because an HGTV show or a home-design magazine or a Pinterest page said it would brighten up your room or make your life better. She wants you to possess your possessions on your own terms, not theirs.
  • she would say to him what she said to me, that yes, America is a little different from Japan, but ultimately it’s all the same. We’re all the same in that we’re enticed into the false illusion of happiness through material purchase.
  • She leaves room for something that people don’t often give her credit for: that the KonMari method might not be your speed. “I think it’s good to have different types of organizing methods,” she continued, “because my method might not spark joy with some people, but his method might.”
  • Conference was different from the KonMari events that I attended. Whereas Kondo does not believe that you need to buy anything in order to organize and that storage systems provide only the illusion of tidiness, the women of Conference traded recon on timesaving apps, label makers, the best kind of Sharpie, and the best tool they own (“supersticky notes,” “drawer dividers”).
  • They don’t like that you have to get rid of all of your papers, which is actually a mischaracterization: Kondo just says you should limit them because they’re incapable of sparking joy, and you should confine them to three folders: needs immediate attention, must be kept for now, must be kept forever.
  • each organizer I spoke with said that she had the same fundamental plan that Kondo did, that the client should purge (they cry “purge” for what Kondo gently calls “discarding”) what is no longer needed or wanted; somehow the extra step of thanking the object or folding it a little differently enrages them. This rage hides behind the notion that things are different here in America, that our lives are more complicated and our stuff is more burdensome and our decisions are harder to make.
  • Ultimately, the women of NAPO said that Kondo’s methods were too draconian and that the clients they knew couldn’t live in Kondo’s world. They had jobs and children, and they needed baby steps and hand-holding and maintenance plans. They needed someone to do for them what they couldn’t naturally do for themselves.
  • the most potent difference between Kondo and the NAPO women is that the NAPO women seek to make a client’s life good by organizing their stuff; Kondo, on the other hand, leads with her spiritual mission, to change their lives through magic.
  • She went to work in finance, but she found the work empty and meaningless. She would come home and find herself overwhelmed by her stuff. So she began searching for “minimalism” on the internet almost constantly, happening on Pinterest pages of beautiful, empty bathrooms and kitchens, and she began to imagine that it was her stuff that was weighing her down. She read philosophy blogs about materialism and the accumulation of objects. “They just all talked about feeling lighter.”
  • “I never knew how to get here from there,” she said. Ning looked around her apartment, which is spare. She loves it here now, but that seemed impossible just a couple of years ago.
  • She found Kondo’s book, and she felt better immediately, just having read it. She began tidying, and immediately she lost three pounds. She had been trying to lose weight forever, and then suddenly, without effort, three pounds, just gone.
  • when it comes to stuff, we are all the same. Once we’ve divided all the drawers and eliminated that which does not bring us joy and categorized ourselves within an inch of our lives, we’ll find that the person lying beneath all the stuff was still just plain old us. We are all a mess, even when we’re done tidying.
Javier E

E.D. Hirsch Jr.'s 'Cultural Literacy' in the 21st Century - The Atlantic - 0 views

  • much of this angst can be interpreted as part of a noisy but inexorable endgame: the end of white supremacy. From this vantage point, Americanness and whiteness are fitfully, achingly, but finally becoming delinked—and like it or not, over the course of this generation, Americans are all going to have to learn a new way to be American.
  • What is the story of “us” when “us” is no longer by default “white”? The answer, of course, will depend on how aware Americans are of what they are, of what their culture already (and always) has been.
  • The thing about the list, though, was that it was—by design—heavy on the deeds and words of the “dead white males” who had formed the foundations of American culture but who had by then begun to fall out of academic fashion.
  • ...38 more annotations...
  • Conservatives thus embraced Hirsch eagerly and breathlessly. He was a stout defender of the patrimony. Liberals attacked him with equal vigor. He was retrograde, Eurocentric, racist, sexist.
  • Lost in all the crossfire, however, were two facts: First, Hirsch, a lifelong Democrat who considered himself progressive, believed his enterprise to be in service of social justice and equality. Cultural illiteracy, he argued, is most common among the poor and power-illiterate, and compounds both their poverty and powerlessness. Second: He was right.
  • A generation of hindsight now enables Americans to see that it is indeed necessary for a nation as far-flung and entropic as the United States, one where rising economic inequality begets worsening civic inequality, to cultivate continuously a shared cultural core. A vocabulary. A set of shared referents and symbols.
  • So, first of all, Americans do need a list. But second, it should not be Hirsch’s list. And third, it should not be made the way he made his. In the balance of this essay, I want to unpack and explain each of those three statements.
  • If you take the time to read the book attached to Hirsch’s appendix, you’ll find a rather effective argument about the nature of background knowledge and public culture. Literacy is not just a matter of decoding the strings of letters that make up words or the meaning of each word in sequence. It is a matter of decoding context: the surrounding matrix of things referred to in the text and things implied by it.
  • That means understanding what’s being said in public, in the media, in colloquial conversation. It means understanding what’s not being said. Literacy in the culture confers power, or at least access to power. Illiteracy, whether willful or unwitting, creates isolation from power.
  • his point about background knowledge and the content of shared public culture extends well beyond schoolbooks. It applies to the “texts” of everyday life, in commercial culture, in sports talk, in religious language, in politics. In all cases, people become literate in patterns—“schema” is the academic word Hirsch uses. They come to recognize bundles of concept and connotation like “Party of Lincoln.” They perceive those patterns of meaning the same way a chess master reads an in-game chessboard or the way a great baseball manager reads an at-bat. And in all cases, pattern recognition requires literacy in particulars.
  • Lots and lots of particulars. This isn’t, or at least shouldn’t be, an ideologically controversial point. After all, parents on both left and right have come to accept recent research that shows that the more spoken words an infant or toddler hears, the more rapidly she will learn and advance in school. Volume and variety matter. And what is true about the vocabulary of spoken or written English is also true, one fractal scale up, about the vocabulary of American culture.
  • those who demonized Hirsch as a right-winger missed the point. Just because an endeavor requires fluency in the past does not make it worshipful of tradition or hostile to change.
  • radicalism is made more powerful when garbed in traditionalism. As Hirsch put it: “To be conservative in the means of communication is the road to effectiveness in modern life, in whatever direction one wishes to be effective.”
  • Hence, he argued, an education that in the name of progressivism disdains past forms, schema, concepts, figures, and symbols is an education that is in fact anti-progressive and “helps preserve the political and economic status quo.” This is true. And it is made more urgently true by the changes in American demography since Hirsch gave us his list in 1987.
  • If you are an immigrant to the United States—or, if you were born here but are the first in your family to go to college, and thus a socioeconomic new arrival; or, say, a black citizen in Ferguson, Missouri deciding for the first time to participate in a municipal election, and thus a civic neophyte—you have a single overriding objective shared by all immigrants at the moment of arrival: figure out how stuff really gets done here.
  • So, for instance, a statement like “One hundred and fifty years after Appomattox, our house remains deeply divided” assumes that the reader knows that Appomattox is both a place and an event; that the event signified the end of a war; that the war was the Civil War and had begun during the presidency of a man, Abraham Lincoln, who earlier had famously declared that “a house divided against itself cannot stand”; that the divisions then were in large part about slavery; and that the divisions today are over the political, social, and economic legacies of slavery and how or whether we are to respond to those legacies.
  • But why a list, one might ask? Aren’t lists just the very worst form of rote learning and standardized, mechanized education? Well, yes and no.
  • it’s not just newcomers who need greater command of common knowledge. People whose families have been here ten generations are often as ignorant about American traditions, mores, history, and idioms as someone “fresh off the boat.”
  • The more serious challenge, for Americans new and old, is to make a common culture that’s greater than the sum of our increasingly diverse parts. It’s not enough for the United States to be a neutral zone where a million little niches of identity might flourish; in order to make our diversity a true asset, Americans need those niches to be able to share a vocabulary. Americans need a broad base of common knowledge so that diversity can be most fully activated.
  • as the pool of potential culture-makers has widened, the modes of culture creation have similarly shifted away from hierarchies and institutions to webs and networks. Wikipedia is the prime embodiment of this reality, both in how the online encyclopedia is crowd-created and how every crowd-created entry contains links to other entries.
  • so any endeavor that makes it easier for those who do not know the memes and themes of American civic life to attain them closes the opportunity gap. It is inherently progressive.
  • since I started writing this essay, dipping into the list has become a game my high-school-age daughter and I play together.
  • I’ll name each of those entries, she’ll describe what she thinks to be its meaning. If she doesn’t know, I’ll explain it and give some back story. If I don’t know, we’ll look it up together. This of course is not a good way for her teachers to teach the main content of American history or English. But it is definitely a good way for us both to supplement what school should be giving her.
  • And however long we end up playing this game, it is already teaching her a meta-lesson about the importance of cultural literacy. Now anytime a reference we’ve discussed comes up in the news or on TV or in dinner conversation, she can claim ownership. Sometimes she does so proudly, sometimes with a knowing look. My bet is that the satisfaction of that ownership, and the value of it, will compound as the years and her education progress.
  • The trouble is, there are also many items on Hirsch’s list that don’t seem particularly necessary for entry into today’s civic and economic mainstream.
  • Which brings us back to why diversity matters. The same diversity that makes it necessary to have and to sustain a unifying cultural core demands that Americans make the core less monochromatic, more inclusive, and continuously relevant for contemporary life.
  • it’s worth unpacking the baseline assumption of both Hirsch’s original argument and the battles that erupted around it. The assumption was that multiculturalism sits in polar opposition to a traditional common culture, that the fight between multiculturalism and the common culture was zero-sum.
  • As scholars like Ronald Takaki made clear in books like A Different Mirror, the dichotomy made sense only to the extent that one imagined that nonwhite people had had no part in shaping America until they started speaking up in the second half of the twentieth century.
  • The truth, of course, is that since well before the formation of the United States, the United States has been shaped by nonwhites in its mores, political structures, aesthetics, slang, economic practices, cuisine, dress, song, and sensibility.
  • In its serious forms, multiculturalism never asserted that every racial group should have its own sealed and separate history or that each group’s history was equally salient to the formation of the American experience. It simply claimed that the omni-American story—of diversity and hybridity—was the legitimate American story.
  • as Nathan Glazer has put it (somewhat ruefully), “We are all multiculturalists now.” Americans have come to see—have chosen to see—that multiculturalism is not at odds with a single common culture; it is a single common culture.
  • it is true that in a finite school year, say, with finite class time and books of finite heft, not everything about everyone can be taught. There are necessary trade-offs. But in practice, recognizing the true and longstanding diversity of American identity is not an either-or. Learning about the internment of Japanese Americans does not block out knowledge of D-Day or Midway. It is additive.
  • As more diverse voices attain ever more forms of reach and power we need to re-integrate and reimagine Hirsch’s list of what literate Americans ought to know.
  • To be clear: A 21st-century omni-American approach to cultural literacy is not about crowding out “real” history with the perishable stuff of contemporary life. It’s about drawing lines of descent from the old forms of cultural expression, however formal, to their progeny, however colloquial.
  • Nor is Omni-American cultural literacy about raising the “self-esteem” of the poor, nonwhite, and marginalized. It’s about raising the collective knowledge of all—and recognizing that the wealthy, white, and powerful also have blind spots and swaths of ignorance.
  • What, then, would be on your list? It’s not an idle question. It turns out to be the key to rethinking how a list should even get made.
  • the Internet has transformed who makes culture and how. As barriers to culture creation have fallen, orders of magnitude more citizens—amateurs—are able to shape the culture in which we must all be literate. Cat videos and Star Trek fan fiction may not hold up long beside Toni Morrison. But the entry of new creators leads to new claims of right: The right to be recognized. The right to be counted. The right to make the means of recognition and accounting.
  • It is true that lists alone, with no teaching to bring them to life and no expectation that they be connected to a broader education, are somewhere between useless and harmful.
  • This will be a list of nodes and nested networks. It will be a fractal of associations, which reflects how our brains work and how we learn and create far better than a linear list does. Hirsch himself nodded to this reality in Cultural Literacy when he described the process he and his colleagues used for collecting items for their list, though he raised it by way of pointing out the danger of infinite regress.
  • His conclusion, appropriate to his times, was that you had to draw boundaries somewhere with the help of experts. My take, appropriate to our times, is that Americans can draw not boundaries so much as circles and linkages, concept sets and pathways among them.
  • Because 5,000 or even 500 items is too daunting a place to start, I ask here only for your top ten. What are ten things every American—newcomer or native born, affluent or indigent—should know? What ten things do you feel are both required knowledge and illuminating gateways to those unenlightened about American life? Here are my entries: Whiteness; The Federalist Papers; The Almighty Dollar; Organized labor; Reconstruction; Nativism; The American Dream; The Reagan Revolution; DARPA; A sucker born every minute.
Javier E

Can truth survive this president? An honest investigation. - The Washington Post - 0 views

  • in the summer of 2002, long before “fake news” or “post-truth” infected the vernacular, one of President George W. Bush’s top advisers mocked a journalist for being part of the “reality-based community.” Seeking answers in reality was for suckers, the unnamed adviser explained. “We’re an empire now, and when we act, we create our own reality.”
  • This was the hubris and idealism of a post-Cold War, pre-Iraq War superpower: If you exert enough pressure, events will bend to your will.
  • the deceit emanating from the White House today is lazier, more cynical. It is not born of grand strategy or ideology; it is impulsive and self-serving. It is not arrogant, but shameless.
  • ...26 more annotations...
  • Bush wanted to remake the world. President Trump, by contrast, just wants to make it up as he goes along.
  • Through all their debates over who is to blame for imperiling truth (whether Trump, postmodernism, social media or Fox News), as well as the consequences (invariably dire) and the solutions (usually vague), a few conclusions materialize, should you choose to believe them.
  • There is a pattern and logic behind the dishonesty of Trump and his surrogates; however, it’s less multidimensional chess than the simple subordination of reality to political and personal ambition.
  • Trump’s untruth sells best precisely when feelings and instincts overpower facts, when America becomes a safe space for fabrication.
  • Rand Corp. scholars Jennifer Kavanagh and Michael D. Rich point to the Gilded Age, the Roaring Twenties and the rise of television in the mid-20th century as recent periods of what they call “Truth Decay” — marked by growing disagreement over facts and interpretation of data; a blurring of lines between opinion, fact and personal experience; and diminishing trust in once-respected sources of information.
  • In eras of truth decay, “competing narratives emerge, tribalism within the U.S. electorate increases, and political paralysis and dysfunction grow.”
  • Once you add the silos of social media as well as deeply polarized politics and deteriorating civic education, it becomes “nearly impossible to have the types of meaningful policy debates that form the foundation of democracy.”
  • To interpret our era’s debasement of language, Kakutani reflects perceptively on the World War II-era works of Victor Klemperer, who showed how the Nazis used “words as ‘tiny doses of arsenic’ to poison and subvert the German culture,” and of Stefan Zweig, whose memoir “The World of Yesterday” highlights how ordinary Germans failed to grasp the sudden erosion of their freedoms.
  • Kakutani calls out lefty academics who for decades preached postmodernism and social constructivism, which argued that truth is not universal but a reflection of relative power, structural forces and personal vantage points.
  • postmodernists rejected Enlightenment ideals as “vestiges of old patriarchal and imperialist thinking,” Kakutani writes, paving the way for today’s violence against fact in politics and science.
  • “dumbed-down corollaries” of postmodernist thought have been hijacked by Trump’s defenders, who use them to explain away his lies, inconsistencies and broken promises.
  • intelligent-design proponents and later climate deniers drew from postmodernism to undermine public perceptions of evolution and climate change. “Even if right-wing politicians and other science deniers were not reading Derrida and Foucault, the germ of the idea made its way to them: science does not have a monopoly on the truth.”
  • McIntyre quotes at length from mea culpas by postmodernist and social constructivist writers agonizing over what their theories have wrought, shocked that conservatives would use them for nefarious purposes.
  • pro-Trump troll and conspiracy theorist Mike Cernovich, who helped popularize the “Pizzagate” lie, has forthrightly cited his unlikely influences. “Look, I read postmodernist theory in college,” Cernovich told the New Yorker in 2016. “If everything is a narrative, then we need alternatives to the dominant narrative. I don’t seem like a guy who reads [Jacques] Lacan, do I?”
  • When truth becomes malleable and contestable regardless of evidence, a mere tussle of manufactured narratives, it becomes less about conveying facts than about picking sides, particularly in politics.
  • In “On Truth,” Cambridge University philosopher Simon Blackburn writes that truth is attainable, if at all, “only at the vanishing end points of enquiry,” adding that, “instead of ‘facts first’ we may do better if we think of ‘enquiry first,’ with the notion of fact modestly waiting to be invited to the feast afterward.”
  • He is concerned, but not overwhelmingly so, about the survival of truth under Trump. “Outside the fevered world of politics, truth has a secure enough foothold,” Blackburn writes. “Perjury is still a serious crime, and we still hope that our pilots and surgeons know their way about.”
  • Kavanagh and Rich offer similar consolation: “Facts and data have become more important in most other fields, with political and civil discourse being striking exceptions. Thus, it is hard to argue that the world is truly ‘post-fact.’”
  • McIntyre argues persuasively that our methods of ascertaining truth — not just the facts themselves — are under attack, too, and that this assault is especially dangerous.
  • Ideologues don’t just disregard facts they disagree with, he explains, but willingly embrace any information, however dubious, that fits their agenda. “This is not the abandonment of facts, but a corruption of the process by which facts are credibly gathered and reliably used to shape one’s beliefs about reality. Indeed, the rejection of this undermines the idea that some things are true irrespective of how we feel about them.”
  • “It is hardly a depressing new phenomenon that people’s beliefs are capable of being moved by their hopes, grievances and fears,” Blackburn writes. “In order to move people, objective facts must become personal beliefs.” But it can’t work — or shouldn’t work — in reverse.
  • More than fearing a post-truth world, Blackburn is concerned by a “post-shame environment,” in which politicians easily brush off their open disregard for truth.
  • it is human nature to rationalize away the dissonance. “Why get upset by his lies, when all politicians lie?” Kakutani asks, distilling the mind-set. “Why get upset by his venality, when the law of the jungle rules?”
  • So any opposition is deemed a witch hunt, or fake news, rigged or just so unfair. Trump is not killing the truth. But he is vandalizing it, constantly and indiscriminately, diminishing its prestige and appeal, coaxing us to look away from it.
  • the collateral damage includes the American experiment.
  • “One of the most important ways to fight back against post-truth is to fight it within ourselves,” he writes, whatever our particular politics may be. “It is easy to identify a truth that someone else does not want to see. But how many of us are prepared to do this with our own beliefs? To doubt something that we want to believe, even though a little piece of us whispers that we do not have all the facts?”