TOK Friends / Group items tagged WHO

Duncan H

Other People's Suffering - NYTimes.com - 0 views

  • members of the upper class are more likely than others to behave unethically, to lie during negotiations, to drive illegally and to cheat when competing for a prize. “Greed is a robust determinant of unethical behavior,” the authors conclude. “Relative to lower-class individuals, individuals from upper-class backgrounds behaved more unethically in both naturalistic and laboratory settings.”
  • Our findings suggest that when a person is suffering, upper-class individuals perceive these signals less well on average, consistent with other findings documenting reduced empathic accuracy in upper-class individuals (Kraus et al., 2010). Taken together, these findings suggest that upper-class individuals may underestimate the distress and suffering in their social environments.
  • each participant was assigned to listen, face to face, from two feet away, to someone else describing real personal experiences of suffering and distress. The listeners’ responses were measured two ways, first by self-reported levels of compassion and second by electrocardiogram readings to determine the intensity of their emotional response. The participants all took a test known as the “sense of power” scale, ranking themselves on such personal strengths and weaknesses as “I can get people to listen to what I say” and “I can get others to do what I want,” as well as “My wishes do not carry much weight” and “Even if I voice them, my views have little sway,” which are reverse scored. The findings were noteworthy, to say the least. For “low-power” listeners, compassion levels shot up as the person describing suffering became more distressed. Exactly the opposite happened for “high-power” listeners: their compassion dropped as distress rose.
  • ...7 more annotations...
  • Who fits the stereotype of the rich and powerful described in this research? Mitt Romney. Empathy: “I’m not concerned about the very poor.” Compassion: “I like being able to fire people who provide services to me.” Sympathy for the disadvantaged: My wife “drives a couple of Cadillacs.” Willingness to lie in negotiations: “I was a severely conservative Republican governor.”
  • 48 percent described the Democratic Party as “weak,” compared to 28 percent who described the Republican Party that way. Conversely, 50 percent said the Republican Party is “cold hearted,” compared to 30 percent who said that was true of the Democrats.
  • This is the war that is raging throughout America. It is between conservatives, who emphasize personal responsibility and achievement, against liberals, who say the government must take from the wealthy and give to the poor. So it will be interesting this week to see if President Obama can rally the country to support his vision of a strong social compact. He has compassion on his side. Few Americans want to see their fellow citizens suffer. But the president does have that fiscal responsibility issue haunting him because the country remains in dire trouble.
  • For power holders, the world is viewed through an instrumental lens, and approach is directed toward those individuals who populate the useful parts of the landscape. Our results suggest that power not only channels its possessor’s energy toward goal completion but also targets and attempts to harness the energy of useful others. Thus, power appears to be a great facilitator of goal pursuit through a combination of intrapersonal and interpersonal processes. The nature of the power holder’s goals and interpersonal relationships ultimately determine how power is harnessed and what is accomplished in the end.
  • Republicans recognize the political usefulness of objectification, capitalizing on “compassion fatigue,” or the exhaustion of empathy, among large swathes of the electorate who are already stressed by the economic collapse of 2008, high levels of unemployment, an epidemic of foreclosures, stagnant wages and a hyper-competitive business arena.
  • Republican debates provided further evidence of compassion fatigue when audiences cheered the record-setting use of the death penalty in Texas and applauded the prospect of a gravely ill pauper who, unable to pay medical fees, was allowed to die. Even Rick Santorum, who has been described by the National Review as holding “unstinting devotion to human dignity” and as fluent in “the struggles of the working class,” wants to slash aid to the poor. At a Feb. 21 gathering of 500 voters in Maricopa County, Ariz., Santorum brought the audience to its feet as he declared: “We need to take everything from food stamps to Medicaid to the housing programs to education and training programs, we need to cut them, cap them, freeze them, send them to the states, say that there has to be a time limit and a work requirement, and be able to give them the flexibility to do those programs here at the state level.”
  • President Obama has a substantial advantage this year because he does not have a primary challenger, which frees him from the need to emphasize his advocacy for the disempowered — increasing benefits or raising wages for the poor. This allows him to pick and choose the issues he wants to address. At the same time, compassion fatigue may make it easier for the Republican nominee to overcome the liabilities stemming from his own primary rhetoric, to reach beyond the core of the party to white centrist voters less openly drawn to hard-edged conservatism. With their capacity for empathy frayed by a pervasive sense of diminishing opportunity and encroaching shortfall, will these voters once again become dependable Republicans in 2012?
  • Do you agree with Edsall? I think he is definitely taking an anti-Republican stance, but the findings are interesting.
Javier E

He Wants to Save Classics From Whiteness. Can the Field Survive? - The New York Times - 0 views

  • Padilla laid out an indictment of his field. “If one were intentionally to design a discipline whose institutional organs and gatekeeping protocols were explicitly aimed at disavowing the legitimate status of scholars of color,” he said, “one could not do better than what classics has done.”
  • Padilla believes that classics is so entangled with white supremacy as to be inseparable from it. “Far from being extrinsic to the study of Greco-Roman antiquity,” he has written, “the production of whiteness turns on closer examination to reside in the very marrows of classics.”
  • Rather than kowtowing to criticism, Williams said, “maybe we should start defending our discipline.” She protested that it was imperative to stand up for the classics as the political, literary and philosophical foundation of European and American culture: “It’s Western civilization. It matters because it’s the West.” Hadn’t classics given us the concepts of liberty, equality and democracy?
  • ...46 more annotations...
  • Williams ceded the microphone, and Padilla was able to speak. “Here’s what I have to say about the vision of classics that you outlined,” he said. “I want nothing to do with it. I hope the field dies that you’ve outlined, and that it dies as swiftly as possible.”
  • “I believe in merit. I don’t look at the color of the author.” She pointed a finger in Padilla’s direction. “You may have got your job because you’re Black,” Williams said, “but I would prefer to think you got your job because of merit.”
  • What he did find was a slim blue-and-white textbook titled “How People Lived in Ancient Greece and Rome.” “Western civilization was formed from the union of early Greek wisdom and the highly organized legal minds of early Rome,” the book began. “The Greek belief in a person’s ability to use his powers of reason, coupled with Roman faith in military strength, produced a result that has come to us as a legacy, or gift from the past.” Thirty years later, Padilla can still recite those opening lines.
  • In 2017, he published a paper in the journal Classical Antiquity that compared evidence from antiquity and the Black Atlantic to draw a more coherent picture of the religious life of the Roman enslaved. “It will not do merely to adopt a pose of ‘righteous indignation’ at the distortions and gaps in the archive,” he wrote. “There are tools available for the effective recovery of the religious experiences of the enslaved, provided we work with these tools carefully and honestly.”
  • Padilla sensed that his pursuit of classics had displaced other parts of his identity, just as classics and “Western civilization” had displaced other cultures and forms of knowledge. Recovering them would be essential to dismantling the white-supremacist framework in which both he and classics had become trapped. “I had to actively engage in the decolonization of my mind,” he told me.
  • He also gravitated toward contemporary scholars like José Esteban Muñoz, Lorgia García Peña and Saidiya Hartman, who speak of race not as a physical fact but as a ghostly system o
  • In response to rising anti-immigrant sentiment in Europe and the United States, Mary Beard, perhaps the most famous classicist alive, wrote in The Wall Street Journal that the Romans “would have been puzzled by our modern problems with migration and asylum,” because the empire was founded on the “principles of incorporation and of the free movement of people.”
  • In November 2015, he wrote an essay for Eidolon, an online classics journal, clarifying that in Rome, as in the United States, paeans to multiculturalism coexisted with hatred of foreigners. Defending a client in court, Cicero argued that “denying foreigners access to our city is patently inhumane,” but ancient authors also recount the expulsions of whole “suspect” populations, including a roundup of Jews in 139 B.C., who were not considered “suitable enough to live alongside Romans.”
  • The job of classicists is not to “point out the howlers,” he said on a 2017 panel. “To simply take the position of the teacher, the qualified classicist who knows things and can point to these mistakes, is not sufficient.”
  • Dismantling structures of power that have been shored up by the classical tradition will require more than fact-checking; it will require writing an entirely new story about antiquity, and about who we are today
  • To find that story, Padilla is advocating reforms that would “explode the canon” and “overhaul the discipline from nuts to bolts,” including doing away with the label “classics” altogether.
  • “What I want to be thinking about in the next few weeks,” he told them, “is how we can be telling the story of the early Roman Empire not just through a variety of sources but through a variety of persons.” He asked the students to consider the lives behind the identities he had assigned them, and the way those lives had been shaped by the machinery of empire, which, through military conquest, enslavement and trade, creates the conditions for the large-scale movement of human beings.
  • ultimately, he decided that leaving enslaved characters out of the role play was an act of care. “I’m not yet ready to turn to a student and say, ‘You are going to be a slave.’”
  • Privately, even some sympathetic classicists worry that Padilla’s approach will only hasten the field’s decline. “I’ve spoken to undergrad majors who say that they feel ashamed to tell their friends they’re studying classics,”
  • “I very much admire Dan-el’s work, and like him, I deplore the lack of diversity in the classical profession,” Mary Beard told me via email. But “to ‘condemn’ classical culture would be as simplistic as to offer it unconditional admiration.”
  • In a 2019 talk, Beard argued that “although classics may become politicized, it doesn’t actually have a politics,” meaning that, like the Bible, the classical tradition is a language of authority — a vocabulary that can be used for good or ill by would-be emancipators and oppressors alike.
  • Over the centuries, classical civilization has acted as a model for people of many backgrounds, who turned it into a matrix through which they formed and debated ideas about beauty, ethics, power, nature, selfhood, citizenship and, of course, race
  • Anthony Grafton, the great Renaissance scholar, put it this way in his preface to “The Classical Tradition”: “An exhaustive exposition of the ways in which the world has defined itself with regard to Greco-Roman antiquity would be nothing less than a comprehensive history of the world.”
  • Classics as we know it today is a creation of the 18th and 19th centuries. During that period, as European universities emancipated themselves from the control of the church, the study of Greece and Rome gave the Continent its new, secular origin story. Greek and Latin writings emerged as a competitor to the Bible’s moral authority, which lent them a liberatory power
  • Historians stress that such ideas cannot be separated from the discourses of nationalism, colorism and progress that were taking shape during the modern colonial period, as Europeans came into contact with other peoples and their traditions. “The whiter the body is, the more beautiful it is,” Winckelmann wrote.
  • While Renaissance scholars were fascinated by the multiplicity of cultures in the ancient world, Enlightenment thinkers created a hierarchy with Greece and Rome, coded as white, on top, and everything else below.
  • Jefferson, along with most wealthy young men of his time, studied classics at college, where students often spent half their time reading and translating Greek and Roman texts. “Next to Christianity,” writes Caroline Winterer, a historian at Stanford, “the central intellectual project in America before the late 19th century was classicism.”
  • Of the 2.5 million people living in America in 1776, perhaps only 3,000 had gone to college, but that number included many of the founders
  • They saw classical civilization as uniquely educative — a “lamp of experience,” in the words of Patrick Henry, that could light the path to a more perfect union. However true it was, subsequent generations would come to believe, as Hannah Arendt wrote in “On Revolution,” that “without the classical example … none of the men of the Revolution on either side of the Atlantic would have possessed the courage for what then turned out to be unprecedented action.”
  • Comparisons between the United States and the Roman Empire became popular as the country emerged as a global power. Even after Latin and Greek were struck from college-entrance exams, the proliferation of courses on “great books” and Western civilization, in which classical texts were read in translation, helped create a coherent national story after the shocks of industrialization and global warfare.
  • even as the classics were pulled apart, laughed at and transformed, they continued to form the raw material with which many artists shaped their visions of modernity.
  • Over the centuries, thinkers as disparate as John Adams and Simone Weil have likened classical antiquity to a mirror. Generations of intellectuals, among them feminist, queer and Black scholars, have seen something of themselves in classical texts, flashes of recognition that held a kind of liberatory promise
  • The language that is used to describe the presence of classical antiquity in the world today — the classical tradition, legacy or heritage — contains within it the idea of a special, quasi-genetic relationship. In his lecture “There Is No Such Thing as Western Civilization,” Kwame Anthony Appiah (this magazine’s Ethicist columnist) mockingly describes the belief in such a kinship as the belief in a “golden nugget” of insight — a precious birthright and shimmering sign of greatness — that white Americans and Europeans imagine has been passed down to them from the ancients.
  • To see classics the way Padilla sees it means breaking the mirror; it means condemning the classical legacy as one of the most harmful stories we’ve told ourselves
  • Padilla is wary of colleagues who cite the radical uses of classics as a way to forestall change; he believes that such examples have been outmatched by the field’s long alliance with the forces of dominance and oppression.
  • Classics and whiteness are the bones and sinew of the same body; they grew strong together, and they may have to die together. Classics deserves to survive only if it can become “a site of contestation” for the communities who have been denigrated by it in the past.
  • if classics fails his test, Padilla and others are ready to give it up. “I would get rid of classics altogether,” Walter Scheidel, another of Padilla’s former advisers at Stanford, told me. “I don’t think it should exist as an academic field.”
  • One way to get rid of classics would be to dissolve its faculties and reassign their members to history, archaeology and language departments.
  • many classicists are advocating softer approaches to reforming the discipline, placing the emphasis on expanding its borders. Schools including Howard and Emory have integrated classics with Ancient Mediterranean studies, turning to look across the sea at Egypt, Anatolia, the Levant and North Africa. The change is a declaration of purpose: to leave behind the hierarchies of the Enlightenment and to move back toward the Renaissance model of the ancient world as a place of diversity and mixture.
  • Ian Morris put it more bluntly. “Classics is a Euro-American foundation myth,” Morris said to me. “Do we really want that sort of thing?”
  • “There’s a more interesting story to be told about the history of what we call the West, the history of humanity, without valorizing particular cultures in it,” said Josephine Quinn, a professor of ancient history at Oxford. “It seems to me the really crucial mover in history is always the relationship between people, between cultures.”
  • “In some moods, I feel that this is just a moment of despair, and people are trying to find significance even if it only comes from self-accusation,” he told me. “I’m not sure that there is a discipline that is exempt from the fact that it is part of the history of this country. How distinctly wicked is classics? I don’t know that it is.”
  • “One of the dubious successes of my generation is that it did break the canon,” Richlin told me. “I don’t think we could believe at the time that we would be putting ourselves out of business, but we did.” She added: “If they blew up the classics departments, that would really be the end.”
  • Padilla, like Douglass, now sees the moment of absorption into the classical, literary tradition as simultaneous with his apprehension of racial difference; he can no longer find pride or comfort in having used it to bring himself out of poverty.
  • “Claiming dignity within this system of structural oppression,” Padilla has said, “requires full buy-in into its logic of valuation.” He refuses to “praise the architects of that trauma as having done right by you at the end.”
  • Last June, as racial-justice protests unfolded across the nation, Padilla turned his attention to arenas beyond classics. He and his co-authors — the astrophysicist Jenny Greene, the literary theorist Andrew Cole and the poet Tracy K. Smith — began writing their open letter to Princeton with 48 proposals for reform. “Anti-Blackness is foundational to America,” the letter began. “Indifference to the effects of racism on this campus has allowed legitimate demands for institutional support and redress in the face of microaggression and outright racist incidents to go long unmet.”
  • Padilla believes that the uproar over free speech is misguided. “I don’t see things like free speech or the exchange of ideas as ends in themselves,” he told me. “I have to be honest about that. I see them as a means to the end of human flourishing.”
  • “There is a certain kind of classicist who will look on what transpired and say, ‘Oh, that’s not us,’” Padilla said when we spoke recently. “What is of interest to me is why is it so imperative for classicists of a certain stripe to make this discursive move? ‘This is not us.’
  • Joel Christensen, the Brandeis professor, now feels that it is his “moral and ethical and intellectual responsibility” to teach classics in a way that exposes its racist history. “Otherwise we’re just participating in propaganda,”
  • Christensen, who is 42, was in graduate school before he had his “crisis of faith,” and he understands the fear that many classicists may experience at being asked to rewrite the narrative of their life’s work. But, he warned, “that future is coming, with or without Dan-el.”
  • On Jan. 6, Padilla turned on the television minutes after the windows of the Capitol were broken. In the crowd, he saw a man in a Greek helmet with TRUMP 2020 painted in white. He saw a man in a T-shirt bearing a golden eagle on a fasces — symbols of Roman law and governance — below the logo 6MWE, which stands for “Six Million Wasn’t Enough.”
Javier E

Opinion | Richard Powers on What We Can Learn From Trees - The New York Times - 0 views

  • Theo and Robin have a nightly ritual where they say a prayer that Alyssa, the deceased wife and mother, taught them: May all sentient beings be free from needless suffering. That prayer itself comes from the four immeasurables in the Buddhist tradition.
  • When we enter into or recover this sense of kinship that was absolutely fundamental to so many indigenous cultures everywhere around the world at many, many different points in history, that there is no radical break between us and our kin, that even consciousness is shared, to some degree and to a large degree, with a lot of other creatures, then death stops seeming like the enemy and it starts seeming like one of the most ingenious kinds of design for keeping evolution circulating and keeping the experiment running and recombining.
  • Look, I’m 64 years old. I can remember sitting in psychology class as an undergraduate and having my professor declare that no, of course animals don’t have emotions because they don’t have an internal life. They don’t have conscious awareness. And so what looks to you like your dog being extremely happy or being extremely guilty, which dogs do so beautifully, is just your projection, your anthropomorphizing of those other creatures. And this prohibition against anthropomorphism created an artificial gulf between even those animals that are ridiculously near of kin to us, genetically.
  • ...62 more annotations...
  • I don’t know if that sounds too complicated. But the point is, it’s not just giving up domination. It’s giving up this sense of separateness in favor of a sense of kinship. And those people who do often wonder how they failed to see how much continuity there is in the more-than-human world with the human world.
  • to go from terror into being and into that sense that the experiment is sacred, not this one outcome of the experiment, is to immediately transform the way that you think even about very fundamental social and economic and cultural things. If the experiment is sacred, how can we possibly justify our food systems, for instance?
  • when I first went to the Smokies and hiked up into the old growth in the Southern Appalachians, it was like somebody threw a switch. There was some odd filter that had just been removed, and the world sounded different and smelled different.
  • richard powers: Yeah. In human exceptionalism, we may be completely aware of evolutionary continuity. We may understand that we have a literal kinship with the rest of creation, that all life on Earth employs the same genetic code, that there is a very small core of core genes and core proteins that is shared across all the kingdoms and phyla of life. But conceptually, we still have this demented idea that somehow consciousness creates a sanctity and a separation that almost nullifies the continuous elements of evolution and biology that we’ve come to understand.
  • if we want to begin this process of rehabilitation and transformation of consciousness that we are going to need in order to become part of the living Earth, it is going to be other kinds of minds that give us that clarity and strength and diversity and alternative way of thinking that could free us from this stranglehold of thought that looks only to the maximizing return on investment in very leverageable ways.
  • richard powersIt amazed me to get to the end of the first draft of “Bewilderment” and to realize how much Buddhism was in the book, from the simplest things.
  • I think there is nothing more science inflected than being out in the living world and the more-than-human world and trying to understand what’s happening.
  • And of course, we can combine this with what we were talking about earlier with death. If we see all of evolution as somehow leading up to us, all of human, cultural evolution leading up to neoliberalism and here we are just busily trying to accumulate and make meaning for ourselves, death becomes the enemy.
  • And you’re making the point in different ways throughout the book that it is the minds we think of as unusual, that we would diagnose as having some kind of problem or dysfunction, that are, in some cases, the only ones responding to the moment in the most common-sense way it deserves. It is almost everybody else’s brain that has been broken.
  • it isn’t surprising. If you think of the characteristics of this dominant culture that we’ve been talking about — the fixation on control, the fixation on mastery, the fixation on management and accumulation and the resistance of decay — it isn’t surprising that that culture is also threatened by difference and divergence. It seeks out old, stable hierarchies — clear hierarchies — of control, and anything that’s not quite exploitable or leverageable in the way that the normal is, is terrifying and threatening.
  • And the more I looked for it, the more it pervaded the book.
  • ezra klein: I’ve heard you say that it has changed the way you measure a good day. Can you tell me about that?

    richard powers: That’s true. I suppose when I was still enthralled to commodity-mediated individualist market-driven human exceptionalism — we need a single word for this
  • And since moving to the Smokies and since publishing “The Overstory,” my days have been entirely inverted. I wake up, I go to the window, and I look outside. Or I step out onto the deck — if I haven’t been sleeping on the deck, which I try to do as much as I can in the course of the year — and see what’s in the air, gauge the temperature and the humidity and the wind and see what season it is and ask myself, you know, what’s happening out there now at 1,700 feet or 4,000 feet or 5,000 feet.
  • let me talk specifically about the work of a scientist who has herself just recently published a book. It’s Dr. Suzanne Simard, and the book is “Finding the Mother Tree.” Simard has been instrumental in a revolution in our way of thinking about what’s happening underground at the root level in a forest.
  • it was a moving moment for me, as an easterner, to stand up there and to say, this is what an eastern forest looks like. This is what a healthy, fully-functioning forest looks like. And I’m 56 years old, and I’d never seen it.
  • the other topics of that culture tend to circle back around these sorts of trends, human fascinations, ways of magnifying our throw weight and our ability and removing the last constraints to our desires and, in particular, to eliminate the single greatest enemy of meaning in the culture of the technological sublime that is, itself, such a strong instance of the culture of human separatism and commodity-mediated individualist capitalism — that is to say, the removal of death.
  • Why is it that we have known about the crisis of species extinction for at least half a century and longer? And I mean the lay public, not just scientists. But why has this been general knowledge for a long time without public will demanding some kind of action or change
  • And when you make kinship beyond yourself, your sense of meaning gravitates outwards into that reciprocal relationship, into that interdependence. And you know, it’s a little bit like scales falling off your eyes. When you do turn that corner, all of the sources of anxiety that are so present and so deeply internalized become much more identifiable. And my own sense of hope and fear gets a much larger frame of reference to operate in.
  • I think, for most of my life, until I did kind of wake up to forests and to trees, I shared — without really understanding this as a kind of concession or a kind of subscription — I did share this cultural consensus that meaning is a private thing that we do for ourselves and by ourselves and that our kind of general sense of the discoveries of the 19th and 20th century have left us feeling a bit unsponsored and adrift beyond the accident of human existence.
  • The largest single influence on any human being’s mode of thought is other human beings. So if you are surrounded by lots of terrified but wishful-thinking people who want to believe that somehow the cavalry is going to come at the last minute and that we don’t really have to look inwards and change our belief in where meaning comes from, that we will somehow be able to get over the finish line with all our stuff and that we’ll avert this disaster, as we have other kinds of disasters in the past.
  • I think what was happening to me at that time, as I was turning outward and starting to take the non-human world seriously, is my sense of meaning was shifting from something that was entirely about me and authored by me outward into this more collaborative, reciprocal, interdependent, exterior place that involved not just me but all of these other ways of being that I could make kinship with.
  • And I think I was right along with that sense that somehow we are a thing apart. We can make purpose and make meaning completely arbitrarily. It consists mostly of trying to be more in yourself, of accumulating in one form or another.
  • I can’t really be out for more than two or three miles before my head just fills with associations and ideas and scenes and character sketches. And I usually have to rush back home to keep it all in my head long enough to get it down on paper.
  • for my journey, the way to characterize this transition is from being fascinated with technologies of mastery and control and what they’re doing to us as human beings, how they’re changing what the capacities and affordances of humanity are and how we narrate ourselves, to being fascinated with technologies and sciences of interdependence and cooperation, of those sciences that increase our sense of kinship and being one of many, many neighbors.
  • And that’s an almost impossible persuasion to rouse yourself from if you don’t have allies. And I think the one hopeful thing about the present is the number of people trying to challenge that consensual understanding and break away into a new way of looking at human standing is growing.
  • And when you do subscribe to a culture like that and you are confronted with the reality of your own mortality, as I was when I was living in Stanford, that sense of stockpiling personal meaning starts to feel a little bit pointless.
  • And I just head out. I head out based on what the day has to offer. And to have that come first has really changed not only how I write, but what I’ve been writing. And I think it really shows in “Bewilderment.” It’s a totally different kind of book from my previous 12.
  • the marvelous thing about the work, which continues to get more sophisticated and continues to turn up newer and newer astonishments, is that there was odd kind of reciprocal interdependence and cooperation across the species barrier, that Douglas firs and birches were actually involved in these sharing back and forth of essential nutrients. And that’s a whole new way of looking at forest.
  • she began to see that the forests were actually wired up in very complex and identifiable ways and that there was an enormous system of resource sharing going on underground, that trees were sharing not only sugars and the hydrocarbons necessary for survival, but also secondary metabolites. And these were being passed back and forth, both symbiotically between the trees and the fungi, but also across the network to other trees, so that in wired-up, fungally connected forests, large, dominant, healthy trees were actually subsidizing, as it were, trees that were injured, or not in favorable positions, or damaged in some way, or just failing to thrive.
  • so when I was still pretty much a card-carrying member of that culture, I had this sense that to become a better person and to get ahead and to really make more of myself, I had to be as productive as possible. And that meant waking up every morning and getting 1,000 words that I was proud of. And it’s interesting that I would even settle on a quantitative target. That’s very typical for that kind of mindset that I’m talking about — 1,000 words and then you’re free, and then you can do what you want with the day.
  • there will be a threshold, as there have been for these other great social transformations that we’ve witnessed in the last couple of decades where somehow it goes from an outsider position to absolutely mainstream and common sense.
  • I am persuaded by those scholars who have shown the degree to which the concept of nature is itself an artificial construction, born of cultures of human separatism. I believe that everything that life does is part of the living enterprise, and that includes the construction of cities. And there is no question at all about the warning that you just gave: a nostalgia that creates a false binary between the built world and the true natural world is itself a form of cultural isolation.
  • Religion is a technology to discipline, to discipline certain parts of the human impulse. A lot of the book revolves around the decoded neurofeedback machine, which is a very real literalization of a technology, of changing the way we think
  • one of the things I think that we have to take seriously is that we have created technologies to supercharge some parts of our natural impulse. Capitalism, I think, should be understood as a technology to supercharge the growth impulse, and it creates some wonders out of that and some horrors out of that.
  • Richard Powers: Sure. I base my machine on existing technology. Decoded neurofeedback is a nascent field of exploration. You can read about it; it’s been publishing results for a decade. I first came across it in 2013. It involves using fMRI to record the brain activity of a human being who is learning a process, interacting with an object or engaged in a certain emotional state. That neural activity is recorded and stored as a data structure. A second human being is then also scanned in real time and fed kinds of feedback based on their own internal neural activity, as determined by a software analysis of their fMRI data structures.
  • And they are cued little by little to approximate, to learn how to approximate, the recorded states of the original subject. When I first read about this, I did get a little bit of a revelation. I did feel my skin pucker and think, if pushed far enough, this would be something like a telepathy conduit. It would be a first big step in answering that age-old question of what it feels like to be something other than we are
  • in the book I simply take that basic concept and extend it, juke it up a little bit, blur the line between what the reader might think is possible right now and what they might wonder about, and maybe even introduce possibilities for this empathetic transference
  • Ezra Klein: One thing I loved about the role this played in the book is that it’s highlighting its inverse. So a reader might look at this and say, wow, wouldn’t that be cool if we had a machine that could in real time change how we think and change our neural pathways and change our mental state in a particular direction? But of course, all of society is that machine,
  • Robin and Theo are in an airport. And you’ve got TVs everywhere playing the news, which is to say playing a constant loop of outrage, and disaster, and calamity. And Robbie, who’s going through these neural feedback sessions during this period, turns to his dad and says, “Dad, you know how the training’s rewiring my brain? This is what is rewiring everybody else.”
  • Ezra Klein: I think Marshall McLuhan knew it all. I really do. Not exactly what it would look like, but his view and Postman’s view that we are creating a digital global nervous system, as they put it, was exactly right. A nervous system: it was exactly the right metaphor.
  • the great insight of McLuhan, to me, what now gets called “the medium is the message,” is this idea that the way media act upon us is not in the content they deliver. The point of Twitter is not the link that you click or even the tweet that you read; it is that the nature and structure of the Twitter system itself begins to act on your system, and you become more like it. If you watch a lot of TV, you become more like TV. If you watch a lot of Twitter, you become more like Twitter; Facebook, more like Facebook. Your identities become more important to you; the content is distraction from the medium, and the medium changes you
  • it is happening to all of us in ways that at least we are not engaging in intentionally, not at that level of how do we want to be transformed.
  • Richard Powers: I believe that the digital neural system is now so comprehensive that the idea that you could escape it somewhere, certainly not in the Smokies, even somewhere more remote, becomes, I think, more and more laughable. Yeah, and to build on this idea of the medium being the message: not only do we become more like the forms and affordances of the medium, we begin to expect that those affordances, the methods in which those media are used, the physiological dependencies and casts of behavior and thought that are required to operate them and interact with them, are actual, that they’re real somehow, and that we just take them into human nature and say, no, this is what we’ve always wanted, and we’ve simply been able to become more like our true selves.
  • Well, the warpage in our sense of time, the warpage in our sense of place, are profound. The ways in which digital feedback and the affordances of social media and all the rest have changed our expectations with regard to what we need to concentrate on, what we need to learn for ourselves, are changing profoundly.
  • If you look far enough back, you can find Socrates expressing great anxiety and suspicion about the ways in which writing is going to transform the human brain and human expectation. He was worried that somehow it was going to ruin our memories. Well, it did up to a point — nothing like the way the digital technologies have ruined our memories.
  • my tradition is Jewish; the Sabbath is a technology, a technology to create a different relationship between the human being and time, and growth, and productive society than you would have without the Sabbath, which is framed in terms of godliness but is also a way of creating separation from the other impulses of the week.
  • Governments are a technology; monogamy is a technology, originally a religiously driven one but now culturally driven. And these things do good and they do bad. I’m not making an argument for any one of them in particular. But the idea that we would need to invent something wholly new to come up with a way to change the way human beings act is ridiculous
  • My view of the story of this era is that capitalism was one of many forces, and it has become, in many societies, functionally the only one. It was in relationship with religion; it was in relationship with more rooted communities.
  • it has become not just an economic system but a belief system, and it’s a little bit untrammeled. I’m not an anti-capitalist person, but I believe it needs countervailing forces. And my basic view is that it doesn’t have them anymore.
  • the book does introduce this kind of fable, this kind of thought experiment about the way a new and slightly stronger technology of empathy might deflect, first, the story of a little boy, and then the story of his father, who’s scrambling to be a responsible single parent. And then, beyond that, the community of people who hear about this boy and become fascinated with him as a narrative, which again ripples outward through these digital technologies in ways that can’t be controlled or whose consequences can’t be foreseen.
  • Something I’ve said before is that I think a push against, functionally, materialism and want is an important counterweight that our society needs. And when people say it is the way we’ll deal with climate change in the three-to-five-year time frame, I become much more skeptical, because, to the point of things like the technology you have in the book with neural feedback, I do think one of the questions you have to ask is, socially and culturally, how do you move people’s minds so you can then move their politics?
  • You’re going to need something, it seems to me, outside of politics, that changes humans’ sense of themselves more fundamentally. And that takes a minute at the scale of billions.
  • Richard Powers: Well, you are correct. And I don’t think it’s giving away any great reveal in the book to say that a reader who gets far enough into the story probably has this moment of recursive awareness, where he or she comes to understand that what Robin is doing in this gradual training on the cast of mind of some other person is precisely what they’re doing in the act of reading the novel “Bewilderment”: by living this act of active empathy for these two characters, they are undergoing their own kind of neurofeedback.
  • The more we understand about the complexities of living systems, of organisms and the evolution of organisms, the more capable we are of feeling a kind of spiritual awe. And that certainly makes it easier to have reverence for the experiment beyond me and beyond my species. I don’t think those are incommensurable or incompatible ways of knowing the world. In fact, to invoke one last time that Buddhist precept of interbeing, I think there is a kind of interbeing between the desire, the true selfless desire, to understand the world out there through presence, care, measurement, attention, reproduction of experiment, and the desire to have a spiritual affinity and shared fate with the world out there. They’re really the same project.
  • Richard Powers: Well, sure. If we turn back to the new forestry again and researchers like Suzanne Simard, who were showing the literal interconnectivity across species boundaries and the cooperation of resource sharing between different species in a forest, that is rigorous science, rigorous reproducible science. And it does participate in that central principle of practice, or collection of practices, which always requires the renunciation of personal wish and ego and prior belief in favor of empirical reproduction.
  • I’ve begun to see people beginning to build out of the humbling sciences a worldview that seems quite spiritual. And as you’re somebody who seems to me to have done that and it has changed your life, would you reflect on that a bit?
  • So much of the book is about the possibility of life beyond Earth. Tell me a bit about the role that’s playing. Why did you make the possibility of alien life in the way it might look and feel and evolve and act so central in a book about protecting and cherishing life here?
  • Richard Powers: I’m glad that we’re slipping this in at the end, because, yes, this framing of the book around the question of whether we are alone, or whether the universe wants life, is really important. Theo, Robin’s father, is an astrobiologist.
  • Imagine that everything happens just right, so that every square inch of this place is colonized by new forms of experiments, new kinds of life. And the father, trying to entertain his son with the story of this remarkable place in the sun, and the son just stopping him and saying, Dad, come on, that’s asking too much. Get real, that’s science fiction. That’s the vision that I had when I finished the book, an absolutely limitless sense of just how lucky we’ve had it here.
  • one thing I kept thinking about that didn’t make it into the final book but exists as a kind of parallel story in my own head is the father and son on some very distant planet in some very distant star, many light years from here, playing that same game. And the father saying, OK, now imagine a world that’s just the right size, and it has plate tectonics, and it has water, and it has a nearby moon to stabilize its rotation, and it has incredible security and safety from asteroids because of other large planets in the solar system.
  • they make this journey across the universe through all kinds of incubators, all kinds of petri dishes for life and the possibilities of life. And rather than answer the question — so where is everybody? — it keeps deferring the question, it keeps making that question more subtle and stranger
  • For the purposes of the book, Robin, who desperately believes in the sanctity of life beyond himself, begs his father for these nighttime, bedtime stories, and Theo gives him easy travel to other planets. Father and son going to a new planet based on the kinds of planets that Theo’s science is turning up and asking this question, what would life look like if it was able to get started here?
Javier E

Opinion | Reflections on Stephen L. Carter's 1991 Book, 'Reflections of an Affirmative ... - 0 views

  • In 1991, Stephen L. Carter, a professor at Yale Law School, began his book “Reflections of an Affirmative Action Baby” with a discomfiting anecdote. A fellow professor had criticized one of Carter’s papers because it “showed a lack of sensitivity to the experience of Black people in America.”
  • “I live in a box,” he wrote, one bearing all kinds of labels, including “Careful: Discuss Civil Rights Law or Law and Race Only” and “Warning! Affirmative Action Baby! Do Not Assume That This Individual Is Qualified!”
  • The diversity argument holds that people of different races benefit from one another’s presence, which sounds desirable on its face
  • ...17 more annotations...
  • The fact that Thomas was very likely nominated because he was Black and because he opposed affirmative action posed a conundrum for many supporters of racial preferences. Was being Black enough? Or did you have to be “the right kind” of Black person? It’s a question Carter openly wrestles with in his book.
  • What immediately struck me on rereading it was how prescient Carter was about these debates 32 years ago. What role affirmative action should take was playing out then in ways that continue to reverberate.
  • The demise of affirmative action, in Carter’s view, was both necessary and inevitable. “We must reject the common claim that an end to preferences ‘would be a disastrous situation, amounting to a virtual nullification of the 1954 desegregation ruling,’” he wrote, quoting the activist and academic Robert Allen. “The prospect of its end should be a challenge and a chance.”
  • Like many people today — both proponents and opponents of affirmative action — he expressed reservations about relying on diversity as the constitutional basis for racial preferences.
  • Carter bristled at the judgment of many of his Black peers, describing several situations in which he found himself accused of being “inauthentically” Black, as if people of a particular race were a monolith and those who deviated from it were somehow shirking their duty. He said he didn’t want to be limited in what he was allowed to say by “an old and vicious form of silencing.”
  • But the implication of recruiting for diversity, Carter explained, had less to do with admitting Black students to redress past discrimination and more to do with supporting and reinforcing essentialist notions about Black people.
  • An early critic of groupthink, Carter warned against “the idea that Black people who gain positions of authority or influence are vested with a special responsibility to articulate the presumed views of other people who are Black — in effect, to think and act and speak in a particular way, the Black way — and that there is something peculiar about Black people who insist on doing anything else.”
  • A graduate of Stanford and Yale Law, Carter was a proud beneficiary of affirmative action. Yet he acknowledged the personal toll it took (“a decidedly mixed blessing”) as well as affirmative action’s sometimes troubling effects on Black people as the programs evolved.
  • it’s hard to imagine Carter welcoming the current vogue for white allyship, with its reductive assumption that all Black people have the same interests and values
  • He disparaged what he called “the peculiar relationship between Black intellectuals and the white ones who seem loath to criticize us for fear of being branded racists — which is itself a mark of racism of a sort.”
  • In the past, such ideas might have been seen as “frankly racist,” Carter noted. “Now, however, they are almost a gospel for people who want to show their commitment to equality.”
  • Carter took issue with the belief, now practically gospel in academic, cultural and media circles, that heightened race consciousness would be central to overcoming racism
  • However well intentioned you may be, when you reduce people to their race-based identity rather than view them as individuals in their full, complex humanity, you risk making sweeping assumptions about who they are. This used to be called stereotyping or racism.
  • he rejected all efforts to label him, insisting that intellectuals should be “politically unpredictable.”
  • “Critics who attempt to push (or pull) Carter into the ranks of the Black right wing will be making a mistake. He is not a conservative, neo- or otherwise. He is an honest Black scholar — the product of the pre-politically correct era — who abhors the stifling of debate by either wing or by people of any hue.”
  • This strikes me as the greatest difference between reading the book today and reading it as an undergrad at a liberal Ivy League college: the attitude toward debating controversial views. “Reflections” offers a vigorous and unflinching examination of ideas, something academia, media and the arts still prized in 1991.
  • Today, a kind of magical thinking has seized ideologues on both the left and the right, who seem to believe that stifling debate on difficult questions will make them go away
Javier E

What Have We Learned, If Anything? by Tony Judt | The New York Review of Books - 0 views

  • During the Nineties, and again in the wake of September 11, 2001, I was struck more than once by a perverse contemporary insistence on not understanding the context of our present dilemmas, at home and abroad; on not listening with greater care to some of the wiser heads of earlier decades; on seeking actively to forget rather than remember, to deny continuity and proclaim novelty on every possible occasion. We have become stridently insistent that the past has little of interest to teach us. Ours, we assert, is a new world; its risks and opportunities are without precedent.
  • the twentieth century that we have chosen to commemorate is curiously out of focus. The overwhelming majority of places of official twentieth-century memory are either avowedly nostalgo-triumphalist—praising famous men and celebrating famous victories—or else, and increasingly, they are opportunities for the recollection of selective suffering.
  • The problem with this lapidary representation of the last century as a uniquely horrible time from which we have now, thankfully, emerged is not the description—it was in many ways a truly awful era, an age of brutality and mass suffering perhaps unequaled in the historical record. The problem is the message: that all of that is now behind us, that its meaning is clear, and that we may now advance—unencumbered by past errors—into a different and better era.
  • ...19 more annotations...
  • Today, the “common” interpretation of the recent past is thus composed of the manifold fragments of separate pasts, each of them (Jewish, Polish, Serb, Armenian, German, Asian-American, Palestinian, Irish, homosexual…) marked by its own distinctive and assertive victimhood.
  • The resulting mosaic does not bind us to a shared past, it separates us from it. Whatever the shortcomings of the national narratives once taught in school, however selective their focus and instrumental their message, they had at least the advantage of providing a nation with past references for present experience. Traditional history, as taught to generations of schoolchildren and college students, gave the present a meaning by reference to the past: today’s names, places, inscriptions, ideas, and allusions could be slotted into a memorized narrative of yesterday. In our time, however, this process has gone into reverse. The past now acquires meaning only by reference to our many and often contrasting present concerns.
  • the United States thus has no modern memory of combat or loss remotely comparable to that of the armed forces of other countries. But it is civilian casualties that leave the most enduring mark on national memory and here the contrast is piquant indeed
  • Today, the opposite applies. Most people in the world outside of sub-Saharan Africa have access to a near infinity of data. But in the absence of any common culture beyond a small elite, and not always even there, the fragmented information and ideas that people select or encounter are determined by a multiplicity of tastes, affinities, and interests. As the years pass, each one of us has less in common with the fast-multiplying worlds of our contemporaries, not to speak of the world of our forebears.
  • What is significant about the present age of transformations is the unique insouciance with which we have abandoned not merely the practices of the past but their very memory. A world just recently lost is already half forgotten.
  • In the US, at least, we have forgotten the meaning of war. There is a reason for this.
  • Until the last decades of the twentieth century most people in the world had limited access to information; but—thanks to national education, state-controlled radio and television, and a common print culture—within any one state or nation or community people were all likely to know many of the same things.
  • it was precisely that claim, that “it’s torture, and therefore it’s no good,” which until very recently distinguished democracies from dictatorships. We pride ourselves on having defeated the “evil empire” of the Soviets. Indeed so. But perhaps we should read again the memoirs of those who suffered at the hands of that empire—the memoirs of Eugen Loebl, Artur London, Jo Langer, Lena Constante, and countless others—and then compare the degrading abuses they suffered with the treatments approved and authorized by President Bush and the US Congress. Are they so very different?
  • As a consequence, the United States today is the only advanced democracy where public figures glorify and exalt the military, a sentiment familiar in Europe before 1945 but quite unknown today
  • the complacent neoconservative claim that war and conflict are things Americans understand—in contrast to naive Europeans with their pacifistic fantasies—seems to me exactly wrong: it is Europeans (along with Asians and Africans) who understand war all too well. Most Americans have been fortunate enough to live in blissful ignorance of its true significance.
  • That same contrast may account for the distinctive quality of much American writing on the cold war and its outcome. In European accounts of the fall of communism, from both sides of the former Iron Curtain, the dominant sentiment is one of relief at the closing of a long, unhappy chapter. Here in the US, however, the story is typically recorded in a triumphalist key.
  • For many American commentators and policymakers the message of the twentieth century is that war works. Hence the widespread enthusiasm for our war on Iraq in 2003 (despite strong opposition to it in most other countries). For Washington, war remains an option — on that occasion the first option. For the rest of the developed world it has become a last resort.
  • Ignorance of twentieth-century history does not just contribute to a regrettable enthusiasm for armed conflict. It also leads to a misidentification of the enemy.
  • This abstracting of foes and threats from their context—this ease with which we have talked ourselves into believing that we are at war with “Islamofascists,” “extremists” from a strange culture, who dwell in some distant “Islamistan,” who hate us for who we are and seek to destroy “our way of life”—is a sure sign that we have forgotten the lesson of the twentieth century: the ease with which war and fear and dogma can bring us to demonize others, deny them a common humanity or the protection of our laws, and do unspeakable things to them.
  • How else are we to explain our present indulgence for the practice of torture? For indulge it we assuredly do.
  • “But what would I have achieved by proclaiming my opposition to torture?” he replied. “I have never met anyone who is in favor of torture.” Well, times have changed. In the US today there are many respectable, thinking people who favor torture — under the appropriate circumstances and when applied to those who merit it.
  • American civilian losses (excluding the merchant navy) in both world wars amounted to less than 2,000 dead.
  • We are slipping down a slope. The sophistic distinctions we draw today in our war on terror—between the rule of law and “exceptional” circumstances, between citizens (who have rights and legal protections) and noncitizens to whom anything can be done, between normal people and “terrorists,” between “us” and “them”—are not new. The twentieth century saw them all invoked. They are the selfsame distinctions that licensed the worst horrors of the recent past: internment camps, deportation, torture, and murder—those very crimes that prompt us to murmur “never again.” So what exactly is it that we think we have learned from the past? Of what possible use is our self-righteous cult of memory and memorials if the United States can build its very own internment camp and torture people there?
  • We need to learn again—or perhaps for the first time—how war brutalizes and degrades winners and losers alike and what happens to us when, having heedlessly waged war for no good reason, we are encouraged to inflate and demonize our enemies in order to justify that war’s indefinite continuance.
Javier E

What Machines Can't Do - NYTimes.com - 0 views

  • certain mental skills will become less valuable because computers will take over. Having a great memory will probably be less valuable. Being able to be a straight-A student will be less valuable — gathering masses of information and regurgitating it back on tests. So will being able to do any mental activity that involves following a set of rules.
  • what human skills will be more valuable?
  • In the news business, some of those skills are already evident.
  • ...13 more annotations...
  • Technology has rewarded sprinters (people who can recognize and alertly post a message on Twitter about some interesting immediate event) and marathoners (people who can write large conceptual stories), but it has hurt middle-distance runners (people who write 800-word summaries of yesterday’s news conference).
  • Technology has rewarded graphic artists who can visualize data, but it has punished those who can’t turn written reporting into video presentations.
  • More generally, the age of brilliant machines seems to reward a few traits.
  • First, it rewards enthusiasm. The amount of information in front of us is practically infinite; so is the amount of data that can be collected with new tools. The people who seem to do best possess a voracious explanatory drive, an almost obsessive need to follow their curiosity.
  • Second, the era seems to reward people with extended time horizons and strategic discipline.
  • a human can provide an overall sense of direction and a conceptual frame. In a world of online distractions, the person who can maintain a long obedience toward a single goal, and who can filter out what is irrelevant to that goal, will obviously have enormous worth.
  • Third, the age seems to reward procedural architects. The giant Internet celebrities didn’t so much come up with ideas, they came up with systems in which other people could express ideas: Facebook, Twitter, Wikipedia, etc.
  • One of the oddities of collaboration is that tightly knit teams are not the most creative. Loosely bonded teams are, teams without a few domineering presences, teams that allow people to think alone before they share results with the group. So a manager who can organize a decentralized network around a clear question, without letting it dissipate or clump, will have enormous value.
  • Fifth, essentialists will probably be rewarded.
  • creativity can be described as the ability to grasp the essence of one thing, and then the essence of some very different thing, and smash them together to create some entirely new thing.
  • In the 1950s, the bureaucracy was the computer. People were organized into technocratic systems in order to perform routinized information processing.
  • now the computer is the computer. The role of the human is not to be dispassionate, depersonalized or neutral. It is precisely the emotive traits that are rewarded: the voracious lust for understanding, the enthusiasm for work, the ability to grasp the gist, the empathetic sensitivity to what will attract attention and linger in the mind.
  • Unable to compete when it comes to calculation, the best workers will come with heart in hand.
Javier E

Measles Proves Delicate Issue to G.O.P. Field - NYTimes.com - 0 views

  • The politics of medicine, morality and free will have collided in an emotional debate over vaccines and the government’s place in requiring them, posing a challenge for Republicans who find themselves in the familiar but uncomfortable position of reconciling modern science with the skepticism of their core conservative voters.
  • the national debate is forcing the Republican Party’s 2016 presidential hopefuls to confront questions about whether it is in the public’s interest to allow parents to decide for themselves.
  • The vaccination controversy is a twist on an old problem for the Republican Party: how to approach matters that have largely been settled among scientists but are not widely accepted by conservatives.
  • ...6 more annotations...
  • It is a dance Republican candidates often do when they hedge their answers about whether evolution should be taught in schools. It is what makes the fight over global warming such a liability for their party, and what led last year to a widely criticized response to the Ebola scare.
  • There is evidence that vaccinations have become more of a political issue in recent years. Pew Research Center polls show that in 2009, 71 percent of both Republicans and Democrats favored requiring the vaccination of children. Five years later, Democratic support had grown to 76 percent, but Republican support had fallen to 65 percent.
  • The debate does not break entirely along right-left lines. The movement to forgo vaccinations has been popular in more liberal and affluent communities where some parents are worried that vaccines cause autism or other disorders among children.
  • Howard Dean, a presidential candidate in 2004 and a former chairman of the Democratic National Committee, said there are three groups of people who object to required vaccines: “One is people who are very much scared about their kids getting autism, which is an idea that has been completely discredited. Two, is entitled people who don’t want to put any poison in their kids and view this as poison, which is ignorance more than anything else. And three, people who are antigovernment in any way.”
  • The issue has more political potency among conservative voters who are highly skeptical of anything required by the government.
  • for Republicans like Mr. Paul who appeal to the kind of libertarian conservatives who are influential in states like Iowa and New Hampshire, which hold the first two contests in the battle for the nomination, there is an appeal in framing the issue as one of individual liberty. Asked about immunizations again later on Monday, Mr. Paul was even more insistent, saying it was a question of “freedom.” He grew irritated with a CNBC host who pressed him and snapped: “The state doesn’t own your children. Parents own the children.”
Javier E

ROUGH TYPE | Nicholas Carr's blog - 0 views

  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Social skills and relationships seem to suffer as well.
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
Javier E

The Tech Industry's Psychological War on Kids - Member Feature Stories - Medium - 0 views

  • she cried, “They took my f***ing phone!” Attempting to engage Kelly in conversation, I asked her what she liked about her phone and social media. “They make me happy,” she replied.
  • Even though they were loving and involved parents, Kelly’s mom couldn’t help feeling that they’d failed their daughter and must have done something terribly wrong that led to her problems.
  • My practice as a child and adolescent psychologist is filled with families like Kelly’s. These parents say their kids’ extreme overuse of phones, video games, and social media is the most difficult parenting issue they face — and, in many cases, is tearing the family apart.
  • What none of these parents understand is that their children’s and teens’ destructive obsession with technology is the predictable consequence of a virtually unrecognized merger between the tech industry and psychology.
  • Dr. B.J. Fogg, is a psychologist and the father of persuasive technology, a discipline in which digital machines and apps — including smartphones, social media, and video games — are configured to alter human thoughts and behaviors. As the lab’s website boldly proclaims: “Machines designed to change humans.”
  • These parents have no idea that lurking behind their kids’ screens and phones are a multitude of psychologists, neuroscientists, and social science experts who use their knowledge of psychological vulnerabilities to devise products that capture kids’ attention for the sake of industry profit.
  • psychology — a discipline that we associate with healing — is now being used as a weapon against children.
  • This alliance pairs the consumer tech industry’s immense wealth with the most sophisticated psychological research, making it possible to develop social media, video games, and phones with drug-like power to seduce young users.
  • Likewise, social media companies use persuasive design to prey on the age-appropriate desire for preteen and teen kids, especially girls, to be socially successful. This drive is built into our DNA, since real-world relational skills have fostered human evolution.
  • Called “the millionaire maker,” Fogg has groomed former students who have used his methods to develop technologies that now consume kids’ lives. As he recently touted on his personal website, “My students often do groundbreaking projects, and they continue having impact in the real world after they leave Stanford… For example, Instagram has influenced the behavior of over 800 million people. The co-founder was a student of mine.”
  • Persuasive technology (also called persuasive design) works by deliberately creating digital environments that users feel fulfill their basic human drives — to be social or obtain goals — better than real-world alternatives.
  • Kids spend countless hours in social media and video game environments in pursuit of likes, “friends,” game points, and levels — because it’s stimulating, they believe that this makes them happy and successful, and they find it easier than doing the difficult but developmentally important activities of childhood.
  • While persuasion techniques work well on adults, they are particularly effective at influencing the still-maturing child and teen brain.
  • “Video games, better than anything else in our culture, deliver rewards to people, especially teenage boys,” says Fogg. “Teenage boys are wired to seek competency. To master our world and get better at stuff. Video games, in dishing out rewards, can convey to people that their competency is growing, you can get better at something second by second.”
  • it’s persuasive design that’s helped convince this generation of boys they are gaining “competency” by spending countless hours on game sites, when the sad reality is they are locked away in their rooms gaming, ignoring school, and not developing the real-world competencies that colleges and employers demand.
  • Persuasive technologies work because of their apparent triggering of the release of dopamine, a powerful neurotransmitter involved in reward, attention, and addiction.
  • As she says, “If you don’t get 100 ‘likes,’ you make other people share it so you get 100…. Or else you just get upset. Everyone wants to get the most ‘likes.’ It’s like a popularity contest.”
  • there are costs to Casey’s phone obsession, noting that the “girl’s phone, be it Facebook, Instagram or iMessage, is constantly pulling her away from her homework, sleep, or conversations with her family.
  • Casey says she wishes she could put her phone down. But she can’t. “I’ll wake up in the morning and go on Facebook just… because,” she says. “It’s not like I want to or I don’t. I just go on it. I’m, like, forced to. I don’t know why. I need to. Facebook takes up my whole life.”
  • B.J. Fogg may not be a household name, but Fortune Magazine calls him a “New Guru You Should Know,” and his research is driving a worldwide legion of user experience (UX) designers who utilize and expand upon his models of persuasive design.
  • “No one has perhaps been as influential on the current generation of user experience (UX) designers as Stanford researcher B.J. Fogg.”
  • the core of UX research is about using psychology to take advantage of our human vulnerabilities.
  • As Fogg is quoted in Kosner’s Forbes article, “Facebook, Twitter, Google, you name it, these companies have been using computers to influence our behavior.” However, the driving force behind behavior change isn’t computers. “The missing link isn’t the technology, it’s psychology,” says Fogg.
  • UX researchers not only follow Fogg’s design model, but also his apparent tendency to overlook the broader implications of persuasive design. They focus on the task at hand, building digital machines and apps that better demand users’ attention, compel users to return again and again, and grow businesses’ bottom line.
  • the “Fogg Behavior Model” is a well-tested method to change behavior and, in its simplified form, involves three primary factors: motivation, ability, and triggers.
  • “We can now create machines that can change what people think and what people do, and the machines can do that autonomously.”
  • Regarding ability, Fogg suggests that digital products should be made so that users don’t have to “think hard.” Hence, social networks are designed for ease of use
  • Finally, Fogg says that potential users need to be triggered to use a site. This is accomplished by a myriad of digital tricks, including the sending of incessant notifications
  • moral questions about the impact of turning persuasive techniques on children and teens are not being asked. For example, should the fear of social rejection be used to compel kids to compulsively use social media? Is it okay to lure kids away from school tasks that demand a strong mental effort so they can spend their lives on social networks or playing video games that don’t make them think much at all?
  • Describing how his formula is effective at getting people to use a social network, the psychologist says in an academic paper that a key motivator is users’ desire for “social acceptance,” although he says an even more powerful motivator is the desire “to avoid being socially rejected.”
  • the startup Dopamine Labs boasts about its use of persuasive techniques to increase profits: “Connect your app to our Persuasive AI [Artificial Intelligence] and lift your engagement and revenue up to 30% by giving your users our perfect bursts of dopamine,” and “A burst of Dopamine doesn’t just feel good: it’s proven to re-wire user behavior and habits.”
  • Ramsay Brown, the founder of Dopamine Labs, says in a KQED Science article, “We have now developed a rigorous technology of the human mind, and that is both exciting and terrifying. We have the ability to twiddle some knobs in a machine learning dashboard we build, and around the world hundreds of thousands of people are going to quietly change their behavior in ways that, unbeknownst to them, feel second-nature but are really by design.”
  • Programmers call this “brain hacking,” as it compels users to spend more time on sites even though they mistakenly believe it’s strictly due to their own conscious choices.
  • Banks of computers employ AI to “learn” which of a countless number of persuasive design elements will keep users hooked
  • A persuasion profile of a particular user’s unique vulnerabilities is developed in real time and exploited to keep users on the site and make them return again and again for longer periods of time. This drives up profits for consumer internet companies whose revenue is based on how much their products are used.
  • “The leaders of Internet companies face an interesting, if also morally questionable, imperative: either they hijack neuroscience to gain market share and make large profits, or they let competitors do that and run away with the market.”
  • Social media and video game companies believe they are compelled to use persuasive technology in the arms race for attention, profits, and survival.
  • Children’s well-being is not part of the decision calculus.
  • one breakthrough occurred in 2017 when Facebook documents were leaked to The Australian. The internal report crafted by Facebook executives showed the social network boasting to advertisers that by monitoring posts, interactions, and photos in real time, the network is able to track when teens feel “insecure,” “worthless,” “stressed,” “useless” and a “failure.”
  • The report also bragged about Facebook’s ability to micro-target ads down to “moments when young people need a confidence boost.”
  • These design techniques provide tech corporations a window into kids’ hearts and minds to measure their particular vulnerabilities, which can then be used to control their behavior as consumers. This isn’t some strange future… this is now.
  • The official tech industry line is that persuasive technologies are used to make products more engaging and enjoyable. But the revelations of industry insiders can reveal darker motives.
  • Revealing the hard science behind persuasive technology, Hopson says, “This is not to say that players are the same as rats, but that there are general rules of learning which apply equally to both.”
  • After penning the paper, Hopson was hired by Microsoft, where he helped lead the development of the Xbox Live, Microsoft’s online gaming system
  • “If game designers are going to pull a person away from every other voluntary social activity or hobby or pastime, they’re going to have to engage that person at a very deep level in every possible way they can.”
  • This is the dominant effect of persuasive design today: building video games and social media products so compelling that they pull users away from the real world to spend their lives in for-profit domains.
  • Persuasive technologies are reshaping childhood, luring kids away from family and schoolwork to spend more and more of their lives sitting before screens and phones.
  • “Since we’ve figured to some extent how these pieces of the brain that handle addiction are working, people have figured out how to juice them further and how to bake that information into apps.”
  • Today, persuasive design is likely distracting adults from driving safely, productive work, and engaging with their own children — all matters which need urgent attention
  • Still, because the child and adolescent brain is more easily controlled than the adult mind, the use of persuasive design is having a much more hurtful impact on kids.
  • But to engage in a pursuit at the expense of important real-world activities is a core element of addiction.
  • younger U.S. children now spend 5 ½ hours each day with entertainment technologies, including video games, social media, and online videos.
  • Even more, the average teen now spends an incredible 8 hours each day playing with screens and phones
  • U.S. kids only spend 16 minutes each day using the computer at home for school.
  • Quietly, using screens and phones for entertainment has become the dominant activity of childhood.
  • Younger kids spend more time engaging with entertainment screens than they do in school
  • teens spend even more time playing with screens and phones than they do sleeping
  • kids are so taken with their phones and other devices that they have turned their backs to the world around them.
  • many children are missing out on real-life engagement with family and school — the two cornerstones of childhood that lead them to grow up happy and successful
  • persuasive technologies are pulling kids into often toxic digital environments
  • A too frequent experience for many is being cyberbullied, which increases their risk of skipping school and considering suicide.
  • And there is growing recognition of the negative impact of FOMO, or the fear of missing out, as kids spend their social media lives watching a parade of peers who look to be having a great time without them, feeding their feelings of loneliness and being less than.
  • The combined effects of the displacement of vital childhood activities and exposure to unhealthy online environments is wrecking a generation.
  • as the typical age when kids get their first smartphone has fallen to 10, it’s no surprise to see serious psychiatric problems — once the domain of teens — now enveloping young kids
  • Self-inflicted injuries, such as cutting, that are serious enough to require treatment in an emergency room, have increased dramatically in 10- to 14-year-old girls, up 19% per year since 2009.
  • While girls are pulled onto smartphones and social media, boys are more likely to be seduced into the world of video gaming, often at the expense of a focus on school
  • it’s no surprise to see this generation of boys struggling to make it to college: a full 57% of college admissions are granted to young women compared with only 43% to young men.
  • Economists working with the National Bureau of Economic Research recently demonstrated how many young U.S. men are choosing to play video games rather than join the workforce.
  • The destructive forces of psychology deployed by the tech industry are making a greater impact on kids than the positive uses of psychology by mental health providers and child advocates. Put plainly, the science of psychology is hurting kids more than helping them.
  • Hope for this wired generation has seemed dim until recently, when a surprising group has come forward to criticize the tech industry’s use of psychological manipulation: tech executives
  • Tristan Harris, formerly a design ethicist at Google, has led the way by unmasking the industry’s use of persuasive design. Interviewed in The Economist’s 1843 magazine, he says, “The job of these companies is to hook people, and they do that by hijacking our psychological vulnerabilities.”
  • Marc Benioff, CEO of the cloud computing company Salesforce, is one of the voices calling for the regulation of social media companies because of their potential to addict children. He says that just as the cigarette industry has been regulated, so too should social media companies. “I think that, for sure, technology has addictive qualities that we have to address, and that product designers are working to make those products more addictive, and we need to rein that back as much as possible,”
  • “If there’s an unfair advantage or things that are out there that are not understood by parents, then the government’s got to come forward and illuminate that.”
  • Since millions of parents, for example the parents of my patient Kelly, have absolutely no idea that devices are used to hijack their children’s minds and lives, regulation of such practices is the right thing to do.
  • Another improbable group to speak out on behalf of children is tech investors.
  • How has the consumer tech industry responded to these calls for change? By going even lower.
  • Facebook recently launched Messenger Kids, a social media app that will reach kids as young as five years old. Suggestive that harmful persuasive design is now homing in on very young children is the declaration of Messenger Kids Art Director, Shiu Pei Luu, “We want to help foster communication [on Facebook] and make that the most exciting thing you want to be doing.”
  • the American Psychological Association (APA) — which is tasked with protecting children and families from harmful psychological practices — has been essentially silent on the matter
  • APA Ethical Standards require the profession to make efforts to correct the “misuse” of the work of psychologists, which would include the application of B.J. Fogg’s persuasive technologies to influence children against their best interests
  • Manipulating children for profit without their own or parents’ consent, and driving kids to spend more time on devices that contribute to emotional and academic problems is the embodiment of unethical psychological practice.
  • “Never before in history have basically 50 mostly men, mostly 20–35, mostly white engineer designer types within 50 miles of where we are right now [Silicon Valley], had control of what a billion people think and do.”
  • Some may argue that it’s the parents’ responsibility to protect their children from tech industry deception. However, parents have no idea of the powerful forces aligned against them, nor do they know how technologies are developed with drug-like effects to capture kids’ minds
  • Others will claim that nothing should be done because the intention behind persuasive design is to build better products, not manipulate kids
  • similar circumstances exist in the cigarette industry, as tobacco companies have as their intention profiting from the sale of their product, not hurting children. Nonetheless, because cigarettes and persuasive design predictably harm children, actions should be taken to protect kids from their effects.
  • in a 1998 academic paper, Fogg describes what should happen if things go wrong, saying, if persuasive technologies are “deemed harmful or questionable in some regard, a researcher should then either take social action or advocate that others do so.”
  • I suggest turning to President John F. Kennedy’s prescient guidance: He said that technology “has no conscience of its own. Whether it will become a force for good or ill depends on man.”
  • The APA should begin by demanding that the tech industry’s behavioral manipulation techniques be brought out of the shadows and exposed to the light of public awareness
  • Changes should be made in the APA’s Ethics Code to specifically prevent psychologists from manipulating children using digital machines, especially if such influence is known to pose risks to their well-being.
  • Moreover, the APA should follow its Ethical Standards by making strong efforts to correct the misuse of psychological persuasion by the tech industry and by user experience designers outside the field of psychology.
  • It should join with tech executives who are demanding that persuasive design in kids’ tech products be regulated
  • The APA also should make its powerful voice heard amongst the growing chorus calling out tech companies that intentionally exploit children’s vulnerabilities.
Javier E

The Startling Link Between Sugar and Alzheimer's - The Atlantic - 0 views

  • A longitudinal study, published Thursday in the journal Diabetologia, followed 5,189 people over 10 years and found that people with high blood sugar had a faster rate of cognitive decline than those with normal blood sugar
  • In other words, the higher the blood sugar, the faster the cognitive decline.
  • “Currently, dementia is not curable, which makes it very important to study risk factors.”
  • People who have type 2 diabetes are about twice as likely to get Alzheimer’s, and people who have diabetes and are treated with insulin are also more likely to get Alzheimer’s, suggesting elevated insulin plays a role in Alzheimer’s. In fact, many studies have found that elevated insulin, or “hyperinsulinemia,” significantly increases your risk of Alzheimer’s. On the other hand, people with type 1 diabetes, who don’t make insulin at all, are also thought to have a higher risk of Alzheimer’s. How could these both be true?
  • Schilling posits this happens because of the insulin-degrading enzyme, which breaks down both insulin and amyloid proteins in the brain—the same proteins that clump up and lead to Alzheimer’s disease. People who don’t have enough insulin, like those whose bodies’ ability to produce insulin has been tapped out by diabetes, aren’t going to make enough of this enzyme to break up those brain clumps. Meanwhile, in people who use insulin to treat their diabetes and end up with a surplus of insulin, most of this enzyme gets used up breaking that insulin down, leaving not enough enzyme to address those amyloid brain clumps.
  • this can happen even in people who don’t have diabetes yet—who are in a state known as “prediabetes.” It simply means your blood sugar is higher than normal, and it’s something that affects roughly 86 million Americans.
  • In a 2012 study, Roberts broke nearly 1,000 people down into four groups based on how much of their diet came from carbohydrates. The group that ate the most carbs had an 80 percent higher chance of developing mild cognitive impairment—a pit stop on the way to dementia—than those who ate the smallest amount of carbs.
  • “It’s hard to be sure at this stage, what an ‘ideal’ diet would look like,” she said. “There’s a suggestion that a Mediterranean diet, for example, may be good for brain health.”
  • there are several theories out there to explain the connection between high blood sugar and dementia. Diabetes can also weaken the blood vessels, which increases the likelihood that you’ll have ministrokes in the brain, causing various forms of dementia. A high intake of simple sugars can make cells, including those in the brain, insulin resistant, which could cause the brain cells to die. Meanwhile, eating too much in general can cause obesity. The extra fat in obese people releases cytokines, or inflammatory proteins that can also contribute to cognitive deterioration, Roberts said. In one study by Gottesman, obesity doubled a person’s risk of having elevated amyloid proteins in their brains later in life.
  • even people who don’t have any kind of diabetes should watch their sugar intake, she said.
  • as these and other researchers point out, decisions we make about food are one risk factor we can control. And it’s starting to look like decisions we make while we’re still relatively young can affect our future cognitive health.
  • “Alzheimer’s is like a slow-burning fire that you don’t see when it starts,” Schilling said. It takes time for clumps to form and for cognition to begin to deteriorate. “By the time you see the signs, it’s way too late to put out the fire.”
Javier E

Thieves of experience: On the rise of surveillance capitalism - 1 views

  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
  • Social skills and relationships seem to suffer as well.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Google’s once-patient investors grew restive, demanding that the founders figure out a way to make money, preferably lots of it.
  • Under pressure, Page and Brin authorized the launch of an auction system for selling advertisements tied to search queries. The system was designed so that the company would get paid by an advertiser only when a user clicked on an ad. This feature gave Google a huge financial incentive to make accurate predictions about how users would respond to ads and other online content. Even tiny increases in click rates would bring big gains in income. And so the company began deploying its stores of behavioral data not for the benefit of users but to aid advertisers — and to juice its own profits. Surveillance capitalism had arrived.
  • Google’s business now hinged on what Zuboff calls “the extraction imperative.” To improve its predictions, it had to mine as much information as possible from web users. It aggressively expanded its online services to widen the scope of its surveillance.
  • Through Gmail, it secured access to the contents of people’s emails and address books. Through Google Maps, it gained a bead on people’s whereabouts and movements. Through Google Calendar, it learned what people were doing at different moments during the day and whom they were doing it with. Through Google News, it got a readout of people’s interests and political leanings. Through Google Shopping, it opened a window onto people’s wish lists,
  • The company gave all these services away for free to ensure they’d be used by as many people as possible. It knew the money lay in the data.
  • the organization grew insular and secretive. Seeking to keep the true nature of its work from the public, it adopted what its CEO at the time, Eric Schmidt, called a “hiding strategy” — a kind of corporate omerta backed up by stringent nondisclosure agreements.
  • Page and Brin further shielded themselves from outside oversight by establishing a stock structure that guaranteed their power could never be challenged, neither by investors nor by directors.
  • What’s most remarkable about the birth of surveillance capitalism is the speed and audacity with which Google overturned social conventions and norms about data and privacy. Without permission, without compensation, and with little in the way of resistance, the company seized and declared ownership over everyone’s information
  • The companies that followed Google presumed that they too had an unfettered right to collect, parse, and sell personal data in pretty much any way they pleased. In the smart homes being built today, it’s understood that any and all data will be beamed up to corporate clouds.
  • Google conducted its great data heist under the cover of novelty. The web was an exciting frontier — something new in the world — and few people understood or cared about what they were revealing as they searched and surfed. In those innocent days, data was there for the taking, and Google took it
  • Google also benefited from decisions made by lawmakers, regulators, and judges — decisions that granted internet companies free use of a vast taxpayer-funded communication infrastructure, relieved them of legal and ethical responsibility for the information and messages they distributed, and gave them carte blanche to collect and exploit user data.
  • Consider the terms-of-service agreements that govern the division of rights and the delegation of ownership online. Non-negotiable, subject to emendation and extension at the company’s whim, and requiring only a casual click to bind the user, TOS agreements are parodies of contracts, yet they have been granted legal legitimacy by the courts
  • Law professors, writes Zuboff, “call these ‘contracts of adhesion’ because they impose take-it-or-leave-it conditions on users that stick to them whether they like it or not.” Fundamentally undemocratic, the ubiquitous agreements helped Google and other firms commandeer personal data as if by fiat.
  • In the choices we make as consumers and private citizens, we have always traded some of our autonomy to gain other rewards. Many people, it seems clear, experience surveillance capitalism less as a prison, where their agency is restricted in a noxious way, than as an all-inclusive resort, where their agency is restricted in a pleasing way
  • Zuboff makes a convincing case that this is a short-sighted and dangerous view — that the bargain we’ve struck with the internet giants is a Faustian one
  • but her case would have been stronger still had she more fully addressed the benefits side of the ledger.
  • there’s a piece missing. While Zuboff’s assessment of the costs that people incur under surveillance capitalism is exhaustive, she largely ignores the benefits people receive in return — convenience, customization, savings, entertainment, social connection, and so on
  • What the industries of the future will seek to manufacture is the self.
  • Behavior modification is the thread that ties today’s search engines, social networks, and smartphone trackers to tomorrow’s facial-recognition systems, emotion-detection sensors, and artificial-intelligence bots.
  • All of Facebook’s information wrangling and algorithmic fine-tuning, she writes, “is aimed at solving one problem: how and when to intervene in the state of play that is your daily life in order to modify your behavior and thus sharply increase the predictability of your actions now, soon, and later.”
  • “The goal of everything we do is to change people’s actual behavior at scale,” a top Silicon Valley data scientist told her in an interview. “We can test how actionable our cues are for them and how profitable certain behaviors are for us.”
  • This goal, she suggests, is not limited to Facebook. It is coming to guide much of the economy, as financial and social power shifts to the surveillance capitalists
  • Combining rich information on individuals’ behavioral triggers with the ability to deliver precisely tailored and timed messages turns out to be a recipe for behavior modification on an unprecedented scale.
  • it was Facebook, with its incredibly detailed data on people’s social lives, that grasped digital media’s full potential for behavior modification. By using what it called its “social graph” to map the intentions, desires, and interactions of literally billions of individuals, it saw that it could turn its network into a worldwide Skinner box, employing psychological triggers and rewards to program not only what people see but how they react.
  • spying on the populace is not the end game. The real prize lies in figuring out ways to use the data to shape how people think and act. “The best way to predict the future is to invent it,” the computer scientist Alan Kay once observed. And the best way to predict behavior is to script it.
  • competition for personal data intensified. It was no longer enough to monitor people online; making better predictions required that surveillance be extended into homes, stores, schools, workplaces, and the public squares of cities and towns. Much of the recent innovation in the tech industry has entailed the creation of products and services designed to vacuum up data from every corner of our lives
  • “The typical complaint is that privacy is eroded, but that is misleading,” Zuboff writes. “In the larger societal pattern, privacy is not eroded but redistributed . . . . Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism.” The transfer of decision rights is also a transfer of autonomy and agency, from the citizen to the corporation.
  • What we lose under this regime is something more fundamental than privacy. It’s the right to make our own decisions about privacy — to draw our own lines between those aspects of our lives we are comfortable sharing and those we are not
  • Other possible ways of organizing online markets, such as through paid subscriptions for apps and services, never even got a chance to be tested.
  • Online surveillance came to be viewed as normal and even necessary by politicians, government bureaucrats, and the general public
  • Google and other Silicon Valley companies benefited directly from the government’s new stress on digital surveillance. They earned millions through contracts to share their data collection and analysis techniques with the National Security Agency
  • As much as the dot-com crash, the horrors of 9/11 set the stage for the rise of surveillance capitalism. Zuboff notes that, in 2000, members of the Federal Trade Commission, frustrated by internet companies’ lack of progress in adopting privacy protections, began formulating legislation to secure people’s control over their online information and severely restrict the companies’ ability to collect and store it. It seemed obvious to the regulators that ownership of personal data should by default lie in the hands of private citizens, not corporations.
  • The 9/11 attacks changed the calculus. The centralized collection and analysis of online data, on a vast scale, came to be seen as essential to national security. “The privacy provisions debated just months earlier vanished from the conversation more or less overnight,”
Javier E

Technopoly-Chs. 9,10--Scientism, the great symbol drain - 0 views

  • By Scientism, I mean three interrelated ideas that, taken together, stand as one of the pillars of Technopoly.
  • The first and indispensable idea is, as noted, that the methods of the natural sciences can be applied to the study of human behavior. This idea is the backbone of much of psychology and sociology as practiced at least in America, and largely accounts for the fact that social science, to quote F. A. Hayek, "has contributed scarcely anything to our understanding of social phenomena." 2
  • The second idea is, as also noted, that social science generates specific principles which can be used to organize society on a rational and humane basis. This implies that technical means--mostly "invisible technologies" supervised by experts--can be designed to control human behavior and set it on the proper course.
  • The third idea is that faith in science can serve as a comprehensive belief system that gives meaning to life, as well. as a sense of well-being, morality, and even immortality.
  • the spirit behind this scientific ideal inspired several men to believe that the reliable and predictable knowledge that could be obtained about stars and atoms could also be obtained about human behavior.
  • Among the best known of these early "social scientists" were Claude-Henri de Saint-Simon, Prosper Enfantin, and, of course, Auguste Comte.
  • They held in common two beliefs to which Technopoly is deeply indebted: that the natural sciences provide a method to unlock the secrets of both the human heart and the direction of social life; that society can be rationally and humanely reorganized according to principles that social science will uncover. It is with these men that the idea of "social engineering" begins and the seeds of Scientism are planted.
  • Information produced by counting may sometimes be valuable in helping a person get an idea, or, even more so, in providing support for an idea. But the mere activity of counting does not make science.
  • Nor does observing things, though it is sometimes said that if one is empirical, one is scientific. To be empirical means to look at things before drawing conclusions. Everyone, therefore, is an empiricist, with the possible exception of paranoid schizophrenics.
  • What we may call science, then, is the quest to find the immutable and universal laws that govern processes, presuming that there are cause-and-effect relations among these processes. It follows that the quest to understand human behavior and feeling can in no sense except the most trivial be called science.
  • Scientists do strive to be empirical and where possible precise, but it is also basic to their enterprise that they maintain a high degree of objectivity, which means that they study things independently of what people think or do about them.
  • I do not say, incidentally, that the Oedipus complex and God do not exist. Nor do I say that to believe in them is harmful-far from it. I say only that, there being no tests that could, in principle, show them to be false, they fall outside the purview of science, as do almost all theories that make up the content of "social science."
  • in the nineteenth century, novelists provided us with most of the powerful metaphors and images of our culture.
  • This fact relieves the scientist of inquiring into their values and motivations and for this reason alone separates science from what is called social science, consigning the methodology of the latter (to quote Gunnar Myrdal) to the status of the "metaphysical and pseudo-objective." 3
  • The status of social-science methods is further reduced by the fact that there are almost no experiments that will reveal a social-science theory to be false.
  • Let us further suppose that Milgram had found that 100 percent of his subjects did what they were told, with or without Hannah Arendt. And now let us suppose that I tell you a story of a group of people who in some real situation refused to comply with the orders of a legitimate authority--let us say, the Danes who in the face of Nazi occupation helped nine thousand Jews escape to Sweden. Would you say to me that this cannot be so because Milgram's study proves otherwise? Or would you say that this overturns Milgram's work? Perhaps you would say that the Danish response is not relevant, since the Danes did not regard the Nazi occupation as constituting legitimate authority. But then, how would we explain the cooperative response to Nazi authority of the French, the Poles, and the Lithuanians? I think you would say none of these things, because Milgram's experiment does not confirm or falsify any theory that might be said to postulate a law of human nature. His study--which, incidentally, I find both fascinating and terrifying--is not science. It is something else entirely.
  • Freud, could not imagine how the book could be judged exemplary: it was science or it was nothing. Well, of course, Freud was wrong. His work is exemplary-indeed, monumental-but scarcely anyone believes today that Freud was doing science, any more than educated people believe that Marx was doing science, or Max Weber or Lewis Mumford or Bruno Bettelheim or Carl Jung or Margaret Mead or Arnold Toynbee. What these people were doing-and Stanley Milgram was doing-is documenting the behavior and feelings of people as they confront problems posed by their culture.
  • the stories of social researchers are much closer in structure and purpose to what is called imaginative literature; that is to say, both a social researcher and a novelist give unique interpretations to a set of human events and support their interpretations with examples in various forms. Their interpretations cannot be proved or disproved but will draw their appeal from the power of their language, the depth of their explanations, the relevance of their examples, and the credibility of their themes.
  • And all of this has, in both cases, an identifiable moral purpose.
  • The words "true" and "false" do not apply here in the sense that they are used in mathematics or science. For there is nothing universally and irrevocably true or false about these interpretations. There are no critical tests to confirm or falsify them. There are no natural laws from which they are derived. They are bound by time, by situation, and above all by the cultural prejudices of the researcher or writer.
  • Both the novelist and the social researcher construct their stories by the use of archetypes and metaphors.
  • Cervantes, for example, gave us the enduring archetype of the incurable dreamer and idealist in Don Quixote. The social historian Marx gave us the archetype of the ruthless and conspiring, though nameless, capitalist. Flaubert gave us the repressed bourgeois romantic in Emma Bovary. And Margaret Mead gave us the carefree, guiltless Samoan adolescent. Kafka gave us the alienated urbanite driven to self-loathing. And Max Weber gave us hardworking men driven by a mythology he called the Protestant Ethic. Dostoevsky gave us the egomaniac redeemed by love and religious fervor. And B. F. Skinner gave us the automaton redeemed by a benign technology.
  • Why do such social researchers tell their stories? Essentially for didactic and moralistic purposes. These men and women tell their stories for the same reason the Buddha, Confucius, Hillel, and Jesus told their stories (and for the same reason D. H. Lawrence told his).
  • Moreover, in their quest for objectivity, scientists proceed on the assumption that the objects they study are indifferent to the fact that they are being studied.
  • If, indeed, the price of civilization is repressed sexuality, it was not Sigmund Freud who discovered it. If the consciousness of people is formed by their material circumstances, it was not Marx who discovered it. If the medium is the message, it was not McLuhan who discovered it. They have merely retold ancient stories in a modem style.
  • Unlike science, social research never discovers anything. It only rediscovers what people once were told and need to be told again.
  • Only in knowing something of the reasons why they advocated education can we make sense of the means they suggest. But to understand their reasons we must also understand the narratives that governed their view of the world. By narrative, I mean a story of human history that gives meaning to the past, explains the present, and provides guidance for the future.
  • In Technopoly, it is not enough to say, it is immoral and degrading to allow people to be homeless. You cannot get anywhere by asking a judge, a politician, or a bureaucrat to read Les Miserables or Nana or, indeed, the New Testament. You must show that statistics have produced data revealing the homeless to be unhappy and to be a drain on the economy. Neither Dostoevsky nor Freud, Dickens nor Weber, Twain nor Marx, is now a dispenser of legitimate knowledge. They are interesting; they are "worth reading"; they are artifacts of our past. But as for "truth," we must turn to "science."
  • In Technopoly, it is not enough for social research to rediscover ancient truths or to comment on and criticize the moral behavior of people. In Technopoly, it is an insult to call someone a "moralizer." Nor is it sufficient for social research to put forward metaphors, images, and ideas that can help people live with some measure of understanding and dignity.
  • Such a program lacks the aura of certain knowledge that only science can provide. It becomes necessary, then, to transform psychology, sociology, and anthropology into "sciences," in which humanity itself becomes an object, much like plants, planets, or ice cubes.
  • That is why the commonplaces that people fear death and that children who come from stable families valuing scholarship will do well in school must be announced as "discoveries" of scientific enterprise. In this way, social researchers can see themselves, and can be seen, as scientists, researchers without bias or values, unburdened by mere opinion. In this way, social policies can be claimed to rest on objectively determined facts.
  • given the psychological, social, and material benefits that attach to the label "scientist," it is not hard to see why social researchers should find it hard to give it up.
  • Our social "scientists" have from the beginning been less tender of conscience, or less rigorous in their views of science, or perhaps just more confused about the questions their procedures can answer and those they cannot. In any case, they have not been squeamish about imputing to their "discoveries" and the rigor of their procedures the power to direct us in how we ought rightly to behave.
  • It is less easy to see why the rest of us have so willingly, even eagerly, cooperated in perpetuating the same illusion.
  • When the new technologies and techniques and spirit of men like Galileo, Newton, and Bacon laid the foundations of natural science, they also discredited the authority of earlier accounts of the physical world, as found, for example, in the great tale of Genesis. By calling into question the truth of such accounts in one realm, science undermined the whole edifice of belief in sacred stories and ultimately swept away with it the source to which most humans had looked for moral authority. It is not too much to say, I think, that the desacralized world has been searching for an alternative source of moral authority ever since.
  • We welcome them gladly, and the claim explicitly made or implied, because we need so desperately to find some source outside the frail and shaky judgments of mortals like ourselves to authorize our moral decisions and behavior. And outside of the authority of brute force, which can scarcely be called moral, we seem to have little left but the authority of procedures.
  • It is not merely the misapplication of techniques such as quantification to questions where numbers have nothing to say; not merely the confusion of the material and social realms of human experience; not merely the claim of social researchers to be applying the aims and procedures of natural science to the human world.
  • This, then, is what I mean by Scientism.
  • It is the desperate hope, and wish, and ultimately the illusory belief that some standardized set of procedures called "science" can provide us with an unimpeachable source of moral authority, a suprahuman basis for answers to questions like "What is life, and when, and why?" "Why is death, and suffering?" "What is right and wrong to do?" "What are good and evil ends?" "How ought we to think and feel and behave?"
  • Science can tell us when a heart begins to beat, or movement begins, or what are the statistics on the survival of neonates of different gestational ages outside the womb. But science has no more authority than you do or I do to establish such criteria as the "true" definition of "life" or of human state or of personhood.
  • Social research can tell us how some people behave in the presence of what they believe to be legitimate authority. But it cannot tell us when authority is "legitimate" and when not, or how we must decide, or when it may be right or wrong to obey.
  • To ask of science, or expect of science, or accept unchallenged from science the answers to such questions is Scientism. And it is Technopoly's grand illusion.
  • In the institutional form it has taken in the United States, advertising is a symptom of a world-view that sees tradition as an obstacle to its claims. There can, of course, be no functioning sense of tradition without a measure of respect for symbols. Tradition is, in fact, nothing but the acknowledgment of the authority of symbols and the relevance of the narratives that gave birth to them. With the erosion of symbols there follows a loss of narrative, which is one of the most debilitating consequences of Technopoly's power.
  • What the advertiser needs to know is not what is right about the product but what is wrong about the buyer. And so the balance of business expenditures shifts from product research to market research, which means orienting business away from making products of value and toward making consumers feel valuable. The business of business becomes pseudo-therapy; the consumer, a patient reassured by psychodramas.
  • At the moment, it is considered necessary to introduce computers to the classroom, as it once was thought necessary to bring closed-circuit television and film to the classroom. To the question "Why should we do this?" the answer is: "To make learning more efficient and more interesting." Such an answer is considered entirely adequate, since in Technopoly efficiency and interest need no justification. It is, therefore, usually not noticed that this answer does not address the question "What is learning for?"
  • What this means is that somewhere near the core of Technopoly is a vast industry with license to use all available symbols to further the interests of commerce, by devouring the psyches of consumers.
  • In the twentieth century, such metaphors and images have come largely from the pens of social historians and researchers. Think of John Dewey, William James, Erik Erikson, Alfred Kinsey, Thorstein Veblen, Margaret Mead, Lewis Mumford, B. F. Skinner, Carl Rogers, Marshall McLuhan, Barbara Tuchman, Noam Chomsky, Robert Coles, even Stanley Milgram, and you must acknowledge that our ideas of what we are like and what kind of country we live in come from their stories to a far greater extent than from the stories of our most renowned novelists.
  • social idea that must be advanced through education.
  • Confucius advocated teaching "the Way" because in tradition he saw the best hope for social order. As our first systematic fascist, Plato wished education to produce philosopher kings. Cicero argued that education must free the student from the tyranny of the present. Jefferson thought the purpose of education is to teach the young how to protect their liberties. Rousseau wished education to free the young from the unnatural constraints of a wicked and arbitrary social order. And among John Dewey's aims was to help the student function without certainty in a world of constant change and puzzling ambiguities.
  • The point is that cultures must have narratives and will find them where they will, even if they lead to catastrophe. The alternative is to live without meaning, the ultimate negation of life itself.
  • It is also to the point to say that each narrative is given its form and its emotional texture through a cluster of symbols that call for respect and allegiance, even devotion.
  • by definition, there can be no education philosophy that does not address what learning is for. Confucius, Plato, Quintilian, Cicero, Comenius, Erasmus, Locke, Rousseau, Jefferson, Russell, Montessori, Whitehead, and Dewey--each believed that there was some transcendent political, spiritual, or
  • The importance of the American Constitution is largely in its function as a symbol of the story of our origins. It is our political equivalent of Genesis. To mock it, to ignore it, to circumvent it is to declare the irrelevance of the story of the United States as a moral light unto the world. In like fashion, the Statue of Liberty is the key symbol of the story of America as the natural home of the teeming masses, from anywhere, yearning to be free.
  • There are those who believe--as did the great historian Arnold Toynbee--that without a comprehensive religious narrative at its center a culture must decline. Perhaps. There are, after all, other sources--mythology, politics, philosophy, and science, for example--but it is certain that no culture can flourish without narratives of transcendent origin and power.
  • This does not mean that the mere existence of such a narrative ensures a culture's stability and strength. There are destructive narratives. A narrative provides meaning, not necessarily survival-as, for example, the story provided by Adolf Hitler to the German nation in the 1930s.
  • What story does American education wish to tell now? In a growing Technopoly, what do we believe education is for?
  • The answers are discouraging, and one of them can be inferred from any television commercial urging the young to stay in school. The commercial will either imply or state explicitly that education will help the persevering student to get a good job. And that's it. Well, not quite. There is also the idea that we educate ourselves to compete with the Japanese or the Germans in an economic struggle to be number one.
  • Young men, for example, will learn how to make lay-up shots when they play basketball. To be able to make them is part of the definition of what good players are. But they do not play basketball for that purpose. There is usually a broader, deeper, and more meaningful reason for wanting to play-to assert their manhood, to please their fathers, to be acceptable to their peers, even for the sheer aesthetic pleasure of the game itself. What you have to do to be a success must be addressed only after you have found a reason to be successful.
  • Bloom's solution is that we go back to the basics of Western thought.
  • He wants us to teach our students what Plato, Aristotle, Cicero, Saint Augustine, and other luminaries have had to say on the great ethical and epistemological questions. He believes that by acquainting themselves with great books our students will acquire a moral and intellectual foundation that will give meaning and texture to their lives.
  • Hirsch's encyclopedic list is not a solution but a description of the problem of information glut. It is therefore essentially incoherent. But it also confuses a consequence of education with a purpose. Hirsch attempted to answer the question "What is an educated person?" He left unanswered the question "What is an education for?"
  • Those who reject Bloom's idea have offered several arguments against it. The first is that such a purpose for education is elitist: the mass of students would not find the great story of
Western civilization inspiring, are too deeply alienated from the past to find it so, and would therefore have difficulty connecting the "best that has been thought and said" to their own struggles to find meaning in their lives.
  • A second argument, coming from what is called a "leftist" perspective, is even more discouraging. In a sense, it offers a definition of what is meant by elitism. It asserts that the "story of Western civilization" is a partial, biased, and even oppressive one. It is not the story of blacks, American Indians, Hispanics, women, homosexuals-of any people who are not white heterosexual males of Judeo-Christian heritage. This claim denies that there is or can be a national culture, a narrative of organizing power and inspiring symbols which all citizens can identify with and draw sustenance from. If this is true, it means nothing less than that our national symbols have been drained of their power to unite, and that education must become a tribal affair; that is, each subculture must find its own story and symbols, and use them as the moral basis of education.
  • Into this void comes the Technopoly story, with its emphasis on progress without limits, rights without responsibilities, and technology without cost. The Technopoly story is without a moral center. It puts in its place efficiency, interest, and economic advance. It promises heaven on earth through the conveniences of technological progress. It casts aside all traditional narratives and symbols that suggest stability and orderliness, and tells, instead, of a life of skills, technical expertise, and the ecstasy of consumption. Its purpose is to produce functionaries for an ongoing Technopoly.
  • It answers Bloom by saying that the story of Western civilization is irrelevant; it answers the political left by saying there is indeed a common culture whose name is Technopoly and whose key symbol is now the computer, toward which there must be neither irreverence nor blasphemy. It even answers Hirsch by saying that there are items on his list that, if thought about too deeply and taken too seriously, will interfere with the progress of technology.
Javier E

Why it's as hard to escape an echo chamber as it is to flee a cult | Aeon Essays - 0 views

  • there are two very different phenomena at play here, each of which subvert the flow of information in very distinct ways. Let’s call them echo chambers and epistemic bubbles. Both are social structures that systematically exclude sources of information. Both exaggerate their members’ confidence in their beliefs.
  • they work in entirely different ways, and they require very different modes of intervention
  • An epistemic bubble is when you don’t hear people from the other side. An echo chamber is what happens when you don’t trust people from the other side.
  • start with epistemic bubbles
  • That omission might be purposeful
  • But that omission can also be entirely inadvertent. Even if we’re not actively trying to avoid disagreement, our Facebook friends tend to share our views and interests
  • An ‘echo chamber’ is a social structure from which other relevant voices have been actively discredited. Where an epistemic bubble merely omits contrary views, an echo chamber brings its members to actively distrust outsiders.
  • an echo chamber is something like a cult. A cult isolates its members by actively alienating them from any outside sources. Those outside are actively labelled as malignant and untrustworthy.
  • In epistemic bubbles, other voices are not heard; in echo chambers, other voices are actively undermined.
  • The way to break an echo chamber is not to wave “the facts” in the faces of its members. It is to attack the echo chamber at its root and repair that broken trust.
  • Looking to others for corroboration is a basic method for checking whether one has reasoned well or badly
  • They have been in the limelight lately, most famously in Eli Pariser’s The Filter Bubble (2011) and Cass Sunstein’s #Republic: Divided Democracy in the Age of Social Media (2017).
  • The general gist: we get much of our news from Facebook feeds and similar sorts of social media. Our Facebook feed consists mostly of our friends and colleagues, the majority of whom share our own political and cultural views
  • various algorithms behind the scenes, such as those inside Google search, invisibly personalise our searches, making it more likely that we’ll see only what we want to see. These processes all impose filters on information.
  • Such filters aren’t necessarily bad. The world is overstuffed with information, and one can’t sort through it all by oneself: filters need to be outsourced.
  • That’s why we all depend on extended social networks to deliver us knowledge
  • any such informational network needs the right sort of broadness and variety to work
  • Each individual person in my network might be superbly reliable about her particular informational patch but, as an aggregate structure, my network lacks what Sanford Goldberg in his book Relying on Others (2010) calls ‘coverage-reliability’. It doesn’t deliver to me a sufficiently broad and representative coverage of all the relevant information.
  • Epistemic bubbles also threaten us with a second danger: excessive self-confidence.
  • An ‘epistemic bubble’ is an informational network from which relevant voices have been excluded by omission
  • Suppose that I believe that the Paleo diet is the greatest diet of all time. I assemble a Facebook group called ‘Great Health Facts!’ and fill it only with people who already believe that Paleo is the best diet. The fact that everybody in that group agrees with me about Paleo shouldn’t increase my confidence level one bit. They’re not mere copies – they actually might have reached their conclusions independently – but their agreement can be entirely explained by my method of selection.
  • Luckily, though, epistemic bubbles are easily shattered. We can pop an epistemic bubble simply by exposing its members to the information and arguments that they’ve missed.
  • echo chambers are a far more pernicious and robust phenomenon.
  • amieson and Cappella’s book is the first empirical study into how echo chambers function
  • echo chambers work by systematically alienating their members from all outside epistemic sources.
  • Their research centres on Rush Limbaugh, a wildly successful conservative firebrand in the United States, along with Fox News and related media
  • His constant attacks on the ‘mainstream media’ are attempts to discredit all other sources of knowledge. He systematically undermines the integrity of anybody who expresses any kind of contrary view.
  • outsiders are not simply mistaken – they are malicious, manipulative and actively working to destroy Limbaugh and his followers. The resulting worldview is one of deeply opposed force, an all-or-nothing war between good and evil
  • The result is a rather striking parallel to the techniques of emotional isolation typically practised in cult indoctrination
  • cult indoctrination involves new cult members being brought to distrust all non-cult members. This provides a social buffer against any attempts to extract the indoctrinated person from the cult.
  • The echo chamber doesn’t need any bad connectivity to function. Limbaugh’s followers have full access to outside sources of information
  • As Elijah Millgram argues in The Great Endarkenment (2015), modern knowledge depends on trusting long chains of experts. And no single person is in the position to check up on the reliability of every member of that chain
  • Their worldview can survive exposure to those outside voices because their belief system has prepared them for such intellectual onslaught.
  • exposure to contrary views could actually reinforce their views. Limbaugh might offer his followers a conspiracy theory: anybody who criticises him is doing it at the behest of a secret cabal of evil elites, which has already seized control of the mainstream media.
  • Perversely, exposure to outsiders with contrary views can thus increase echo-chamber members’ confidence in their insider sources, and hence their attachment to their worldview.
  • ‘evidential pre-emption’. What’s happening is a kind of intellectual judo, in which the power and enthusiasm of contrary voices are turned against those contrary voices through a carefully rigged internal structure of belief.
  • One might be tempted to think that the solution is just more intellectual autonomy. Echo chambers arise because we trust others too much, so the solution is to start thinking for ourselves.
  • that kind of radical intellectual autonomy is a pipe dream. If the philosophical study of knowledge has taught us anything in the past half-century, it is that we are irredeemably dependent on each other in almost every domain of knowledge
  • Limbaugh’s followers regularly read – but do not accept – mainstream and liberal news sources. They are isolated, not by selective exposure, but by changes in who they accept as authorities, experts and trusted sources.
  • we depend on a vastly complicated social structure of trust. We must trust each other, but, as the philosopher Annette Baier says, that trust makes us vulnerable. Echo chambers operate as a kind of social parasite on that vulnerability, taking advantage of our epistemic condition and social dependency.
  • I am quite confident that there are plenty of echo chambers on the political Left. More importantly, nothing about echo chambers restricts them to the arena of politics
  • The world of anti-vaccination is clearly an echo chamber, and it is one that crosses political lines. I’ve also encountered echo chambers on topics as broad as diet (Paleo!), exercise technique (CrossFit!), breastfeeding, some academic intellectual traditions, and many, many more
  • Here’s a basic check: does a community’s belief system actively undermine the trustworthiness of any outsiders who don’t subscribe to its central dogmas? Then it’s probably an echo chamber.
  • much of the recent analysis has lumped epistemic bubbles together with echo chambers into a single, unified phenomenon. But it is absolutely crucial to distinguish between the two.
  • Epistemic bubbles are rather ramshackle; they go up easily, and they collapse easily
  • Echo chambers are far more pernicious and far more robust. They can start to seem almost like living things. Their belief systems provide structural integrity, resilience and active responses to outside attacks
  • the two phenomena can also exist independently. And of the events we’re most worried about, it’s the echo-chamber effects that are really causing most of the trouble.
  • new data does, in fact, seem to show that people on Facebook actually do see posts from the other side, or that people often visit websites with opposite political affiliation.
  • their basis for evaluation – their background beliefs about whom to trust – are radically different. They are not irrational, but systematically misinformed about where to place their trust.
  • Many people have claimed that we have entered an era of ‘post-truth’.
  • Not only do some political figures seem to speak with a blatant disregard for the facts, but their supporters seem utterly unswayed by evidence. It seems, to some, that truth no longer matters.
  • This is an explanation in terms of total irrationality. To accept it, you must believe that a great number of people have lost all interest in evidence or investigation, and have fallen away from the ways of reason.
  • echo chambers offers a less damning and far more modest explanation. The apparent ‘post-truth’ attitude can be explained as the result of the manipulations of trust wrought by echo chambers.
  • We don’t have to attribute a complete disinterest in facts, evidence or reason to explain the post-truth attitude. We simply have to attribute to certain communities a vastly divergent set of trusted authorities.
  • An echo chamber doesn’t destroy their members’ interest in the truth; it merely manipulates whom they trust and changes whom they accept as trustworthy sources and institutions.
  • in many ways, echo-chamber members are following reasonable and rational procedures of enquiry. They’re engaging in critical reasoning. They’re questioning, they’re evaluating sources for themselves, they’re assessing different pathways to information. They are critically examining those who claim expertise and trustworthiness, using what they already know about the world
  • none of this weighs against the existence of echo chambers. We should not dismiss the threat of echo chambers based only on evidence about connectivity and exposure.
  • Notice how different what’s going on here is from, say, Orwellian doublespeak, a deliberately ambiguous, euphemism-filled language designed to hide the intent of the speaker.
  • echo chambers don’t trade in vague, ambiguous pseudo-speech. We should expect that echo chambers would deliver crisp, clear, unambiguous claims about who is trustworthy and who is not
  • clearly articulated conspiracy theories, and crisply worded accusations of an outside world rife with untrustworthiness and corruption.
  • Once an echo chamber starts to grip a person, its mechanisms will reinforce themselves.
  • In an epistemically healthy life, the variety of our informational sources will put an upper limit to how much we’re willing to trust any single person. Everybody’s fallible; a healthy informational network tends to discover people’s mistakes and point them out. This puts an upper ceiling on how much you can trust even your most beloved leader
  • Inside an echo chamber, that upper ceiling disappears.
  • Being caught in an echo chamber is not always the result of laziness or bad faith. Imagine, for instance, that somebody has been raised and educated entirely inside an echo chamber
  • when the child finally comes into contact with the larger world – say, as a teenager – the echo chamber’s worldview is firmly in place. That teenager will distrust all sources outside her echo chamber, and she will have gotten there by following normal procedures for trust and learning.
  • It certainly seems like our teenager is behaving reasonably. She could be going about her intellectual life in perfectly good faith. She might be intellectually voracious, seeking out new sources, investigating them, and evaluating them using what she already knows.
  • The worry is that she’s intellectually trapped. Her earnest attempts at intellectual investigation are led astray by her upbringing and the social structure in which she is embedded.
  • Echo chambers might function like addiction, under certain accounts. It might be irrational to become addicted, but all it takes is a momentary lapse – once you’re addicted, your internal landscape is sufficiently rearranged such that it’s rational to continue with your addiction
  • Similarly, all it takes to enter an echo chamber is a momentary lapse of intellectual vigilance. Once you’re in, the echo chamber’s belief systems function as a trap, making future acts of intellectual vigilance only reinforce the echo chamber’s worldview.
  • There is at least one possible escape route, however. Notice that the logic of the echo chamber depends on the order in which we encounter the evidence. An echo chamber can bring our teenager to discredit outside beliefs precisely because she encountered the echo chamber’s claims first. Imagine a counterpart to our teenager who was raised outside of the echo chamber and exposed to a wide range of beliefs. Our free-range counterpart would, when she encounters that same echo chamber, likely see its many flaws
  • Those caught in an echo chamber are giving far too much weight to the evidence they encounter first, just because it’s first. Rationally, they should reconsider their beliefs without that arbitrary preference. But how does one enforce such informational a-historicity?
  • The escape route is a modified version of René Descartes’s infamous method.
  • Meditations on First Philosophy (1641). He had come to realise that many of the beliefs he had acquired in his early life were false. But early beliefs lead to all sorts of other beliefs, and any early falsehoods he’d accepted had surely infected the rest of his belief system.
  • The only solution, thought Descartes, was to throw all his beliefs away and start over again from scratch.
  • He could start over, trusting nothing and no one except those things that he could be entirely certain of, and stamping out those sneaky falsehoods once and for all. Let’s call this the Cartesian epistemic reboot.
  • Notice how close Descartes’s problem is to our hapless teenager’s, and how useful the solution might be. Our teenager, like Descartes, has problematic beliefs acquired in early childhood. These beliefs have infected outwards, infesting that teenager’s whole belief system. Our teenager, too, needs to throw everything away, and start over again.
  • Let’s call the modernised version of Descartes’s methodology the social-epistemic reboot.
  • when she starts from scratch, we won’t demand that she trust only what she’s absolutely certain of, nor will we demand that she go it alone
  • For the social reboot, she can proceed, after throwing everything away, in an utterly mundane way – trusting her senses, trusting others. But she must begin afresh socially – she must reconsider all possible sources of information with a presumptively equanimous eye. She must take the posture of a cognitive newborn, open and equally trusting to all outside sources
  • we’re not asking people to change their basic methods for learning about the world. They are permitted to trust, and trust freely. But after the social reboot, that trust will not be narrowly confined and deeply conditioned by the particular people they happened to be raised by.
  • Such a profound deep-cleanse of one’s whole belief system seems to be what’s actually required to escape. Look at the many stories of people leaving cults and echo chambers
  • Take, for example, the story of Derek Black in Florida – raised by a neo-Nazi father, and groomed from childhood to be a neo-Nazi leader. Black left the movement by, basically, performing a social reboot. He completely abandoned everything he’d believed in, and spent years building a new belief system from scratch. He immersed himself broadly and open-mindedly in everything he’d missed – pop culture, Arabic literature, the mainstream media, rap – all with an overall attitude of generosity and trust.
  • It was the project of years and a major act of self-reconstruction, but those extraordinary lengths might just be what’s actually required to undo the effects of an echo-chambered upbringing.
  • we need to attack the root, the systems of discredit themselves, and restore trust in some outside voices.
  • Stories of actual escapes from echo chambers often turn on particular encounters – moments when the echo-chambered individual starts to trust somebody on the outside.
  • Black’s is a case in point. By high school, he was already something of a star on neo-Nazi media, with his own radio talk-show. He went on to college, openly neo-Nazi, and was shunned by almost every other student in his community college. But then Matthew Stevenson, a Jewish fellow undergraduate, started inviting Black to Stevenson’s Shabbat dinners. In Black’s telling, Stevenson was unfailingly kind, open and generous, and slowly earned Black’s trust. This was the seed, says Black, that led to a massive intellectual upheaval – a slow-dawning realisation of the depths to which he had been misled
  • Similarly, accounts of people leaving echo-chambered homophobia rarely involve them encountering some institutionally reported fact. Rather, they tend to revolve around personal encounters – a child, a family member, a close friend coming out.
  • These encounters matter because a personal connection comes with a substantial store of trust.
  • We don’t simply trust people as educated experts in a field – we rely on their goodwill. And this is why trust, rather than mere reliability, is the key concept
  • goodwill is a general feature of a person’s character. If I demonstrate goodwill in action, then you have some reason to think that I also have goodwill in matters of thought and knowledge.
  • If one can demonstrate goodwill to an echo-chambered member – as Stevenson did with Black – then perhaps one can start to pierce that echo chamber.
  • the path I’m describing is a winding, narrow and fragile one. There is no guarantee that such trust can be established, and no clear path to its being established systematically.
  • what we’ve found here isn’t an escape route at all. It depends on the intervention of another. This path is not even one an echo-chamber member can trigger on her own; it is only a whisper-thin hope for rescue from the outside.
Javier E

What Gamergate should have taught us about the 'alt-right' | Technology | The Guardian - 0 views

  • Gamergate
  • The 2014 hashtag campaign, ostensibly founded to protest about perceived ethical failures in games journalism, clearly thrived on hate – even though many of those who aligned themselves with the movement either denied there was a problem with harassment, or wrote it off as an unfortunate side effect
  • Sure, women, minorities and progressive voices within the industry were suddenly living in fear. Sure, those who spoke out in their defence were quickly silenced through exhausting bursts of online abuse. But that wasn’t why people supported it, right? They were disenfranchised, felt ignored, and wanted to see a systematic change.
  • ...23 more annotations...
  • Is this all sounding rather familiar now? Does it remind you of something?
  • it quickly became clear that the GamerGate movement was a mess – an undefined mission to Make Video Games Great Again via undecided means.
  • After all, the culture war that began in games now has a senior representative in The White House. As a founder member and former executive chair of Breitbart News, Steve Bannon had a hand in creating media monster Milo Yiannopoulos, who built his fame and Twitter following by supporting and cheerleading Gamergate. This hashtag was the canary in the coalmine, and we ignored it.
  • Gamergate was an online movement that effectively began because a man wanted to punish his ex girlfriend. Its most notable achievement was harassing a large number of progressive figures - mostly women – to the point where they felt unsafe or considered leaving the industry
  • The similarities between Gamergate and the far-right online movement, the “alt-right”, are huge, startling and in no way a coincidence
  • These figures gave Gamergate a new sense of direction – generalising the rhetoric: this was now a wider war between “Social Justice Warriors” (SJWs) and everyday, normal, decent people. Games were simply the tip of the iceberg – progressive values, went the argument, were destroying everything
  • In 2016, new wave conservative media outlets like Breitbart have gained trust with their audience by painting traditional news sources as snooty and aloof. In 2014, video game YouTube stars, seeking to appear in touch with online gaming communities, unscrupulously proclaimed that traditional old-media sources were corrupt. Everything we’re seeing now, had its precedent two years ago.
  • With 2014’s Gamergate, Breitbart seized the opportunity to harness the pre-existing ignorance and anger among disaffected young white dudes. With Trump’s movement in 2016, the outlet was effectively running his campaign: Steve Bannon took leave of his role at the company in August 2016 when he was hired as chief executive of Trump’s presidential campaign
  • young men converted via 2014’s Gamergate, are being more widely courted now. By leveraging distrust and resentment towards women, minorities and progressives, many of Gamergate’s most prominent voices – characters like Mike Cernovich, Adam Baldwin, and Milo Yiannopoulos – drew power and influence from its chaos
  • no one in the movement was willing to be associated with the abuse being carried out in its name. Prominent supporters on Twitter, in subreddits and on forums like 8Chan, developed a range of pernicious rhetorical devices and defences to distance themselves from threats to women and minorities in the industry: the targets were lying or exaggerating, they were too precious; a language of dismissal and belittlement was formed against them. Safe spaces, snowflakes, unicorns, cry bullies. Even when abuse was proven, the usual response was that people on their side were being abused too. These techniques, forged in Gamergate, have become the standard toolset of far-right voices online
  • The majority of people who voted for Trump will never take responsibility for his racist, totalitarian policies, but they’ll provide useful cover and legitimacy for those who demand the very worst from the President Elect. Trump himself may have disavowed the “alt-right”, but his rhetoric has led to them feeling legitimised. As with Gamergate, the press risks being manipulated into a position where it has to tread a respectful middle ground that doesn’t really exist.
  • Using 4chan (and then the more sympathetic offshoot 8Chan) to plan their subversions and attacks made Gamergate a terribly sloppy operation, leaving a trail of evidence that made it quite clear the whole thing was purposefully, plainly nasty. But the video game industry didn’t have the spine to react, and allowed the movement to coagulate – forming a mass of spiteful disappointment that Breitbart was more than happy to coddle
  • Historically, that seems to be Breitbart’s trick - strongly represent a single issue in order to earn trust, and then gradually indoctrinate to suit wider purposes. With Gamergate, they purposefully went fishing for anti-feminists. 2016’s batch of fresh converts – the white extremists – came from enticing conspiracy theories about the global neoliberal elite secretly controlling the world.
  • The greatest strength of Gamergate, though, was that it actually appeared to represent many left-leaning ideals: stamping out corruption in the press, pushing for better ethical practices, battling for openness.
  • There are similarities here with many who support Trump because of his promises to put an end to broken neo-liberalism, to “drain the swamp” of establishment corruption. Many left-leaning supporters of Gamergate sought to intellectualise their alignment with the hashtag, adopting familiar and acceptable labels of dissent – identifying as libertarian, egalitarian, humanist.
  • At best they unknowingly facilitated abuse, defending their own freedom of expression while those who actually needed support were threatened and attacked.
  • Genuine discussions over criticism, identity and censorship were paralysed and waylaid by Twitter voices obsessed with rhetorical fallacies and pedantic debating practices. While the core of these movements make people’s lives hell, the outer shell – knowingly or otherwise – protect abusers by insisting that the real problem is that you don’t want to talk, or won’t provide the ever-shifting evidence they politely require.
  • In 2017, the tactics used to discredit progressive game critics and developers will be used to discredit Trump and Bannon’s critics. There will be gaslighting, there will be attempts to make victims look as though they are losing their grip on reality, to the point that they gradually even start to believe it. The “post-truth” reality is not simply an accident – it is a concerted assault on the rational psyche.
  • The strangest aspect of Gamergate is that it consistently didn’t make any sense: people chose to align with it, and yet refused responsibility. It was constantly demanded that we debate the issues, but explanations and facts were treated with scorn. Attempts to find common ground saw the specifics of the demands being shifted: we want you to listen to us; we want you to change your ways; we want you to close your publication down. This movement that ostensibly wanted to protect free speech from cry bully SJWs simultaneously did what it could to endanger sites it disagreed with, encouraging advertisers to abandon support for media outlets that published stories critical of the hashtag. The petulance of that movement is disturbingly echoed in Trump’s own Twitter feed.
  • Looking back, Gamergate really only made sense in one way: as an exemplar of what Umberto Eco called “eternal fascism”, a form of extremism he believed could flourish at any point in time, in any place – a fascism that would extol traditional values, rally against diversity and cultural critics, believe in the value of action above thought and encourage a distrust of intellectuals or experts – a fascism built on frustration and machismo. The requirement of this formless fascism would – above all else – be to remain in an endless state of conflict, a fight against a foe who must always be portrayed as impossibly strong and laughably weak
  • 2016 has presented us with a world in which our reality is being wilfully manipulated. Fake news, divisive algorithms, misleading social media campaigns.
  • The same voices moved into other geek communities, especially comics, where Marvel and DC were criticised for progressive storylines and decisions. They moved into science fiction with the controversy over the Hugo awards. They moved into cinema with the revolting kickback against the all-female Ghostbusters reboot.
  • Perhaps the true lesson of Gamergate was that the media is culturally unequipped to deal with the forces actively driving these online movements. The situation was horrifying enough two years ago, it is many times more dangerous now.
Maria Delzi

How Dangerous Neighborhoods Make You Feel Paranoid | TIME.com - 0 views

  • Simply walking through a sketchy-looking neighborhood can make you feel more paranoid and lower your trust in others
  • In a study published in the journal PeerJ, student volunteers who spent less than an hour in a more dangerous neighborhood showed significant changes in some of their social perceptions.
  • The researchers’ goal was to investigate the relationship between lower income neighborhoods and reduced trust and poor mental health.
  • ...10 more annotations...
  • from Newcastle University in the UK, wanted to determine whether the connection was due to people reacting to the environment around them, or because those who are generally less trusting were more likely to live in troubled areas. Prior research showed that kids who grew up in such neighborhoods were less likely to graduate from high school and more likely to develop stress that can lead to depression.
  • The study took 50 students and sent half of them to a low-income, high-crime neighborhood and the other half to an affluent neighborhood with little crime.
  • Before the students ventured into their respective areas, the researchers interviewed the neighborhood residents and found that residents of the high-crime neighborhood harbored more feelings of paranoia and lower levels of social trust compared to the residents of the other neighborhood.
  • The students in the study were not from either neighborhood, and did not know what the study was about. They were dropped off by a taxi and told to deliver envelopes containing a packet of questions to a list of residential addresses. They spent 45 minutes walking around their assigned neighborhood distributing the envelopes. When the students returned, the researchers surveyed them about their experience, their feelings of trust, and their feelings of paranoia.
  • Despite the short amount of time they spent in the neighborhoods, the students picked up the prevailing social attitudes of the residents living in those environments; those who went to the more dangerous neighborhood scored higher on measures of paranoia and lower on measures of trust compared to the other group, just as the residents had.
  • Not only that, but their levels of reported paranoia and trust were indistinguishable from the residents who spent years living there.
  • That came as an intriguing surprise to other experts. Ingrid Gould Ellen, the director of the Urban Planning Program at New York University Wagner Graduate School of Public Service, studies how the make-up of neighborhoods can impact the attitudes and interactions of people who live in them
  • found that kids who live on blocks where violent crimes occurred the week before they took a standardized test performed worse on those tests than students from similar backgrounds who were not exposed to a violent crime in their neighborhood before their exam.
  • paranoia and lack of trust set in after just a short time in the more troubled neighborhood suggested how powerful the influence of these environments can be.
  • For urban planners, the findings confirm what most probably understood instinctively — that people do tend to make snap judgments about both their environments and the people in them based on visual cues such as broken windows and abandoned houses. But the results also show how these cues can influence deeper perceptions and mental states as well.
Javier E

Untier Of Knots « The Dish - 0 views

  • Benedict XVI and John Paul II focused on restoring dogmatic certainty as the counterpart to papal authority. Francis is arguing that both, if taken too far, can be sirens leading us away from God, not ensuring our orthodoxy but sealing us off in calcified positions and rituals that can come to mean nothing outside themselves
  • In this quest to seek and find God in all things there is still an area of uncertainty. There must be. If a person says that he met God with total certainty and is not touched by a margin of uncertainty, then this is not good. For me, this is an important key. If one has the answers to all the questions – that is the proof that God is not with him. It means that he is a false prophet using religion for himself. The great leaders of the people of God, like Moses, have always left room for doubt. You must leave room for the Lord, not for our certainties; we must be humble.
  • If the Christian is a restorationist, a legalist, if he wants everything clear and safe, then he will find nothing. Tradition and memory of the past must help us to have the courage to open up new areas to God.
  • ...31 more annotations...
  • In the end, you realize your only real option – against almost every fiber in your irate being – is to take each knot in turn, patiently and gently undo it, loosen a little, see what happens, and move on to the next. You will never know exactly when all the knots will resolve themselves – it can happen quite quickly after a while or seemingly never. But you do know that patience, and concern with the here and now, is the only way to “solve” the “problem.” You don’t look forward with a plan; you look down with a practice.
  • we can say what God is not, we can speak of his attributes, but we cannot say what He is. That apophatic dimension, which reveals how I speak about God, is critical to our theology
  • I would also classify as arrogant those theologies that not only attempted to define with certainty and exactness God’s attributes, but also had the pretense of saying who He was.
  • It is only in living that we achieve hints and guesses – and only hints and guesses – of what the Divine truly is. And because the Divine is found and lost by humans in time and history, there is no reachable truth for humans outside that time and history.
  • We are part of an unfolding drama in which the Christian, far from clinging to some distant, pristine Truth he cannot fully understand, will seek to understand and discern the “signs of the times” as one clue as to how to live now, in the footsteps of Jesus. Or in the words of T.S. Eliot, There is only the fight to recover what has been lost And found and lost again and again: and now, under conditions That seem unpropitious. But perhaps neither gain nor loss. For us, there is only the trying. The rest is not our business.
  • Ratzinger’s Augustinian notion of divine revelation: it is always a radical gift; it must always be accepted without question; it comes from above to those utterly unworthy below; and we are too flawed, too sinful, too human to question it in even the slightest respect. And if we ever compromise an iota on that absolute, authentic, top-down truth, then we can know nothing as true. We are, in fact, lost for ever.
  • A Christian life is about patience, about the present and about trust that God is there for us. It does not seek certainty or finality to life’s endless ordeals and puzzles. It seeks through prayer and action in the world to listen to God’s plan and follow its always-unfolding intimations. It requires waiting. It requires diligence
  • We may never know why exactly Benedict resigned as he did. But I suspect mere exhaustion of the body and mind was not the whole of it. He had to see, because his remains such a first-rate mind, that his project had failed, that the levers he continued to pull – more and more insistent doctrinal orthodoxy, more political conflict with almost every aspect of the modern world, more fastidious control of liturgy – simply had no impact any more.
  • The Pope must accompany those challenging existing ways of doing things! Others may know better than he does. Or, to feminize away the patriarchy: I dream of a church that is a mother and shepherdess. The church’s ministers must be merciful, take responsibility for the people, and accompany them like the good Samaritan, who washes, cleans, and raises up his neighbor. This is pure Gospel.
  • the key to Francis’ expression of faith is an openness to the future, a firm place in the present, and a willingness to entertain doubt, to discern new truths and directions, and to grow. Think of Benedict’s insistence on submission of intellect and will to the only authentic truth (the Pope’s), and then read this: Within the Church countless issues are being studied and reflected upon with great freedom. Differing currents of thought in philosophy, theology, and pastoral practice, if open to being reconciled by the Spirit in respect and love, can enable the Church to grow, since all of them help to express more clearly the immense riches of God’s word. For those who long for a monolithic body of doctrine guarded by all and leaving no room for nuance, this might appear as undesirable and leading to confusion. But in fact such variety serves to bring out and develop different facets of the inexhaustible riches of the Gospel.
  • Francis, like Jesus, has had such an impact in such a short period of time simply because of the way he seems to be. His being does not rely on any claims to inherited, ecclesiastical authority; his very way of life is the only moral authority he wants to claim.
  • faith is, for Francis, a way of life, not a set of propositions. It is a way of life in community with others, lived in the present yet always, deeply, insistently aware of eternity.
  • Father Howard Gray S.J. has put it simply enough: Ultimately, Ignatian spirituality trusts the world as a place where God dwells and labors and gathers all to himself in an act of forgiveness where that is needed, and in an act of blessing where that is prayed for.
  • Underlying all this is a profound shift away from an idea of religion as doctrine and toward an idea of religion as a way of life. Faith is a constantly growing garden, not a permanently finished masterpiece
  • Some have suggested that much of what Francis did is compatible with PTSD. He disowned his father and family business, and he chose to live homeless, and close to naked, in the neighboring countryside, among the sick and the animals. From being the dashing man of society he had once been, he became a homeless person with what many of us today would call, at first blush, obvious mental illness.
  • these actions – of humility, of kindness, of compassion, and of service – are integral to Francis’ resuscitation of Christian moral authority. He is telling us that Christianity, before it is anything else, is a way of life, an orientation toward the whole, a living commitment to God through others. And he is telling us that nothing – nothing – is more powerful than this.
  • I would not speak about, not even for those who believe, an “absolute” truth, in the sense that absolute is something detached, something lacking any relationship. Now, the truth is a relationship! This is so true that each of us sees the truth and expresses it, starting from oneself: from one’s history and culture, from the situation in which one lives, etc. This does not mean that the truth is variable and subjective. It means that it is given to us only as a way and a life. Was it not Jesus himself who said: “I am the way, the truth, the life”? In other words, the truth is one with love, it requires humbleness and the willingness to be sought, listened to and expressed.
  • “proselytism is solemn nonsense.” That phrase – deployed by the Pope in dialogue with the Italian atheist Eugenio Scalfari (as reported by Scalfari) – may seem shocking at first. But it is not about denying the revelation of Jesus. It is about how that revelation is expressed and lived. Evangelism, for Francis, is emphatically not about informing others about the superiority of your own worldview and converting them to it. That kind of proselytism rests on a form of disrespect for another human being. Something else is needed:
  • Instead of seeming to impose new obligations, Christians should appear as people who wish to share their joy, who point to a horizon of beauty and who invite others to a delicious banquet. It is not by proselytizing that the Church grows, but “by attraction.”
  • what you see in the life of Saint Francis is a turn from extreme violence to extreme poverty, as if only the latter could fully compensate for the reality of the former. This was not merely an injunction to serve the poor. It is the belief that it is only by being poor or becoming poor that we can come close to God
  • Pope Francis insists – and has insisted throughout his long career in the church – that poverty is a key to salvation. And in choosing the name Francis, he explained last March in Assisi, this was the central reason why:
  • Saint Francis. His conversion came after he had gone off to war in defense of his hometown, and, after witnessing horrifying carnage, became a prisoner of war. After his release from captivity, his strange, mystical journey began.
  • the priority of practice over theory, of life over dogma. Evangelization is about sitting down with anyone anywhere and listening and sharing and being together. A Christian need not be afraid of this encounter. Neither should an atheist. We are in this together, in the same journey of life, with the same ultimate mystery beyond us. When we start from that place – of radical humility and radical epistemological doubt – proselytism does indeed seem like nonsense, a form of arrogance and detachment, reaching for power, not freedom. And evangelization is not about getting others to submit their intellect and will to some new set of truths; it is about an infectious joy for a new way of living in the world. All it requires – apart from joy and faith – is patience.
  • “Preach the Gospel always. If necessary, with words.”
  • But there is little sense that a political or economic system can somehow end the problem of poverty in Francis’ worldview. And there is the discomfiting idea that poverty itself is not an unmitigated evil. There is, indeed, a deep and mysterious view, enunciated by Jesus, and held most tenaciously by Saint Francis, that all wealth, all comfort, and all material goods are suspect and that poverty itself is a kind of holy state to which we should all aspire.
  • Not only was Saint Francis to become homeless and give up his patrimony, he was to travel on foot, wearing nothing but a rough tunic held together with rope. Whatever else it is, this is not progressivism. It sees no structural, human-devised system as a permanent improver of our material lot. It does not envision a world without poverty, but instead a church of the poor and for the poor. The only material thing it asks of the world, or of God, is daily bread – and only for today, never for tomorrow.
  • From this perspective, the idea that a society should be judged by the amount of things it can distribute to as many people as possible is anathema. The idea that there is a serious social and political crisis if we cannot keep our wealth growing every year above a certain rate is an absurdity.
  • this is a 21st-century heresy. Which means, I think, that this Pope is already emerging and will likely only further emerge as the most potent critic of the newly empowered global capitalist project.
  • Now, the only dominant ideology in the world is the ideology of material gain – either through the relatively free markets of the West or the state-controlled markets of the East. And so the church’s message is now harder to obscure. It stands squarely against the entire dominant ethos of our age. It is the final resistance.
  • For Francis, history has not come to an end, and capitalism, in as much as it is a global ideology that reduces all of human activity to the cold currency of wealth, is simply another “ism” to be toppled in humankind’s unfolding journey toward salvation on earth.
  • Francis will grow as the church reacts to him; it will be a dynamic, not a dogma; and it will be marked less by the revelation of new things than by the new recognition of old things, in a new language. It will be, if its propitious beginnings are any sign, a patient untying of our collective, life-denying knots.
Javier E

The New York Times > Magazine > In the Magazine: Faith, Certainty and the Presidency of... - 0 views

  • The Delaware senator was, in fact, hearing what Bush's top deputies -- from cabinet members like Paul O'Neill, Christine Todd Whitman and Colin Powell to generals fighting in Iraq -- have been told for years when they requested explanations for many of the president's decisions, policies that often seemed to collide with accepted facts. The president would say that he relied on his ''gut'' or his ''instinct'' to guide the ship of state, and then he ''prayed over it.''
  • What underlies Bush's certainty? And can it be assessed in the temporal realm of informed consent?
  • Top officials, from cabinet members on down, were often told when they would speak in Bush's presence, for how long and on what topic. The president would listen without betraying any reaction. Sometimes there would be cross-discussions -- Powell and Rumsfeld, for instance, briefly parrying on an issue -- but the president would rarely prod anyone with direct, informed questions.
  • ...13 more annotations...
  • This is one key feature of the faith-based presidency: open dialogue, based on facts, is not seen as something of inherent value. It may, in fact, create doubt, which undercuts faith. It could result in a loss of confidence in the decision-maker and, just as important, by the decision-maker.
  • has spent a lot of time trying to size up the president. ''Most successful people are good at identifying, very early, their strengths and weaknesses, at knowing themselves,'' he told me not long ago. ''For most of us average Joes, that meant we've relied on strengths but had to work on our weakness -- to lift them to adequacy -- otherwise they might bring us down. I don't think the president really had to do that, because he always had someone there -- his family or friends -- to bail him out. I don't think, on balance, that has served him well for the moment he's in now as president. He never seems to have worked on his weaknesses.''
  • Details vary, but here's the gist of what I understand took place. George W., drunk at a party, crudely insulted a friend of his mother's. George senior and Barbara blew up. Words were exchanged along the lines of something having to be done. George senior, then the vice president, dialed up his friend, Billy Graham, who came to the compound and spent several days with George W. in probing exchanges and walks on the beach. George W. was soon born again. He stopped drinking, attended Bible study and wrestled with issues of fervent faith. A man who was lost was saved.
  • Rubenstein described that time to a convention of pension managers in Los Angeles last year, recalling that Malek approached him and said: ''There is a guy who would like to be on the board. He's kind of down on his luck a bit. Needs a job. . . . Needs some board positions.'' Though Rubenstein didn't think George W. Bush, then in his mid-40's, ''added much value,'' he put him on the Caterair board. ''Came to all the meetings,'' Rubenstein told the conventioneers. ''Told a lot of jokes. Not that many clean ones. And after a while I kind of said to him, after about three years: 'You know, I'm not sure this is really for you. Maybe you should do something else. Because I don't think you're adding that much value to the board. You don't know that much about the company.' He said: 'Well, I think I'm getting out of this business anyway. And I don't really like it that much. So I'm probably going to resign from the board.' And I said thanks. Didn't think I'd ever see him again.''
  • challenges -- from either Powell or his opposite number as the top official in domestic policy, Paul O'Neill -- were trials that Bush had less and less patience for as the months passed. He made that clear to his top lieutenants. Gradually, Bush lost what Richard Perle, who would later head a largely private-sector group under Bush called the Defense Policy Board Advisory Committee, had described as his open posture during foreign-policy tutorials prior to the 2000 campaign. (''He had the confidence to ask questions that revealed he didn't know very much,'' Perle said.) By midyear 2001, a stand-and-deliver rhythm was established. Meetings, large and small, started to take on a scripted quality.
  • That a deep Christian faith illuminated the personal journey of George W. Bush is common knowledge. But faith has also shaped his presidency in profound, nonreligious ways. The president has demanded unquestioning faith from his followers, his staff, his senior aides and his kindred in the Republican Party. Once he makes a decision -- often swiftly, based on a creed or moral position -- he expects complete faith in its rightness.
  • A cluster of particularly vivid qualities was shaping George W. Bush's White House through the summer of 2001: a disdain for contemplation or deliberation, an embrace of decisiveness, a retreat from empiricism, a sometimes bullying impatience with doubters and even friendly questioners.
  • By summer's end that first year, Vice President Dick Cheney had stopped talking in meetings he attended with Bush. They would talk privately, or at their weekly lunch. The president was spending a lot of time outside the White House, often at the ranch, in the presence of only the most trustworthy confidants.
  • ''When I was first with Bush in Austin, what I saw was a self-help Methodist, very open, seeking,'' Wallis says now. ''What I started to see at this point was the man that would emerge over the next year -- a messianic American Calvinist. He doesn't want to hear from anyone who doubts him.''
  • , I had a meeting with a senior adviser to Bush. He expressed the White House's displeasure, and then he told me something that at the time I didn't fully comprehend -- but which I now believe gets to the very heart of the Bush presidency.
  • The aide said that guys like me were ''in what we call the reality-based community,'' which he defined as people who ''believe that solutions emerge from your judicious study of discernible reality.'' I nodded and murmured something about enlightenment principles and empiricism. He cut me off. ''That's not the way the world really works anymore,'' he continued. ''We're an empire now, and when we act, we create our own reality. And while you're studying that reality -- judiciously, as you will -- we'll act again, creating other new realities, which you can study too, and that's how things will sort out. We're history's actors . . . and you, all of you, will be left to just study what we do.''
  • ''If you operate in a certain way -- by saying this is how I want to justify what I've already decided to do, and I don't care how you pull it off -- you guarantee that you'll get faulty, one-sided information,'' Paul O'Neill, who was asked to resign his post of treasury secretary in December 2002, said when we had dinner a few weeks ago. ''You don't have to issue an edict, or twist arms, or be overt.''
  • George W. Bush and his team have constructed a high-performance electoral engine. The soul of this new machine is the support of millions of likely voters, who judge his worth based on intangibles -- character, certainty, fortitude and godliness -- rather than on what he says or does.
Javier E

E.D. Hirsch Jr.'s 'Cultural Literacy' in the 21st Century - The Atlantic - 0 views

  • much of this angst can be interpreted as part of a noisy but inexorable endgame: the end of white supremacy. From this vantage point, Americanness and whiteness are fitfully, achingly, but finally becoming delinked—and like it or not, over the course of this generation, Americans are all going to have to learn a new way to be American.
  • What is the story of “us” when “us” is no longer by default “white”? The answer, of course, will depend on how aware Americans are of what they are, of what their culture already (and always) has been.
  • The thing about the list, though, was that it was—by design—heavy on the deeds and words of the “dead white males” who had formed the foundations of American culture but who had by then begun to fall out of academic fashion.
  • Conservatives thus embraced Hirsch eagerly and breathlessly. He was a stout defender of the patrimony. Liberals eagerly and breathlessly attacked him with equal vigor. He was retrograde, Eurocentric, racist, sexist.
  • Lost in all the crossfire, however, were two facts: First, Hirsch, a lifelong Democrat who considered himself progressive, believed his enterprise to be in service of social justice and equality. Cultural illiteracy, he argued, is most common among the poor and power-illiterate, and compounds both their poverty and powerlessness. Second: He was right.
  • A generation of hindsight now enables Americans to see that it is indeed necessary for a nation as far-flung and entropic as the United States, one where rising economic inequality begets worsening civic inequality, to cultivate continuously a shared cultural core. A vocabulary. A set of shared referents and symbols.
  • So, first of all, Americans do need a list. But second, it should not be Hirsch’s list. And third, it should not be made the way he made his. In the balance of this essay, I want to unpack and explain each of those three statements.
  • If you take the time to read the book attached to Hirsch’s appendix, you’ll find a rather effective argument about the nature of background knowledge and public culture. Literacy is not just a matter of decoding the strings of letters that make up words or the meaning of each word in sequence. It is a matter of decoding context: the surrounding matrix of things referred to in the text and things implied by it
  • That means understanding what’s being said in public, in the media, in colloquial conversation. It means understanding what’s not being said. Literacy in the culture confers power, or at least access to power. Illiteracy, whether willful or unwitting, creates isolation from power.
  • his point about background knowledge and the content of shared public culture extends well beyond schoolbooks. They are applicable to the “texts” of everyday life, in commercial culture, in sports talk, in religious language, in politics. In all cases, people become literate in patterns—“schema” is the academic word Hirsch uses. They come to recognize bundles of concept and connotation like “Party of Lincoln.” They perceive those patterns of meaning the same way a chess master reads an in-game chessboard or the way a great baseball manager reads an at bat. And in all cases, pattern recognition requires literacy in particulars.
  • Lots and lots of particulars. This isn’t, or at least shouldn’t be, an ideologically controversial point. After all, parents on both left and right have come to accept recent research that shows that the more spoken words an infant or toddler hears, the more rapidly she will learn and advance in school. Volume and variety matter. And what is true about the vocabulary of spoken or written English is also true, one fractal scale up, about the vocabulary of American culture.
  • those who demonized Hirsch as a right-winger missed the point. Just because an endeavor requires fluency in the past does not make it worshipful of tradition or hostile to change.
  • radicalism is made more powerful when garbed in traditionalism. As Hirsch put it: “To be conservative in the means of communication is the road to effectiveness in modern life, in whatever direction one wishes to be effective.”
  • Hence, he argued, an education that in the name of progressivism disdains past forms, schema, concepts, figures, and symbols is an education that is in fact anti-progressive and “helps preserve the political and economic status quo.” This is true. And it is made more urgently true by the changes in American demography since Hirsch gave us his list in 1987.
  • If you are an immigrant to the United States—or, if you were born here but are the first in your family to go to college, and thus a socioeconomic new arrival; or, say, a black citizen in Ferguson, Missouri deciding for the first time to participate in a municipal election, and thus a civic neophyte—you have a single overriding objective shared by all immigrants at the moment of arrival: figure out how stuff really gets done here.
  • So, for instance, a statement like “One hundred and fifty years after Appomattox, our house remains deeply divided” assumes that the reader knows that Appomattox is both a place and an event; that the event signified the end of a war; that the war was the Civil War and had begun during the presidency of a man, Abraham Lincoln, who earlier had famously declared that “a house divided against itself cannot stand”; that the divisions then were in large part about slavery; and that the divisions today are over the political, social, and economic legacies of slavery and how or whether we are to respond to those legacies.
  • But why a list, one might ask? Aren’t lists just the very worst form of rote learning and standardized, mechanized education? Well, yes and no.
  • it’s not just newcomers who need greater command of common knowledge. People whose families have been here ten generations are often as ignorant about American traditions, mores, history, and idioms as someone “fresh off the boat.”
  • The more serious challenge, for Americans new and old, is to make a common culture that’s greater than the sum of our increasingly diverse parts. It’s not enough for the United States to be a neutral zone where a million little niches of identity might flourish; in order to make our diversity a true asset, Americans need those niches to be able to share a vocabulary. Americans need to be able to have a broad base of common knowledge so that diversity can be most fully activated.
  • as the pool of potential culture-makers has widened, the modes of culture creation have similarly shifted away from hierarchies and institutions to webs and networks. Wikipedia is the prime embodiment of this reality, both in how the online encyclopedia is crowd-created and how every crowd-created entry contains links to other entries.
  • so any endeavor that makes it easier for those who do not know the memes and themes of American civic life to attain them closes the opportunity gap. It is inherently progressive.
  • since I started writing this essay, dipping into the list has become a game my high-school-age daughter and I play together.
  • I’ll name each of those entries, she’ll describe what she thinks to be its meaning. If she doesn’t know, I’ll explain it and give some back story. If I don’t know, we’ll look it up together. This of course is not a good way for her teachers to teach the main content of American history or English. But it is definitely a good way for us both to supplement what school should be giving her.
  • And however long we end up playing this game, it is already teaching her a meta-lesson about the importance of cultural literacy. Now anytime a reference we’ve discussed comes up in the news or on TV or in dinner conversation, she can claim ownership. Sometimes she does so proudly, sometimes with a knowing look. My bet is that the satisfaction of that ownership, and the value of it, will compound as the years and her education progress.
  • The trouble is, there are also many items on Hirsch’s list that don’t seem particularly necessary for entry into today’s civic and economic mainstream.
  • Which brings us back to why diversity matters. The same diversity that makes it necessary to have and to sustain a unifying cultural core demands that Americans make the core less monochromatic, more inclusive, and continuously relevant for contemporary life
  • it’s worth unpacking the baseline assumption of both Hirsch’s original argument and the battles that erupted around it. The assumption was that multiculturalism sits in polar opposition to a traditional common culture, that the fight between multiculturalism and the common culture was zero-sum.
  • As scholars like Ronald Takaki made clear in books like A Different Mirror, the dichotomy made sense only to the extent that one imagined that nonwhite people had had no part in shaping America until they started speaking up in the second half of the twentieth century.
  • The truth, of course, is that since well before the formation of the United States, the United States has been shaped by nonwhites in its mores, political structures, aesthetics, slang, economic practices, cuisine, dress, song, and sensibility.
  • In its serious forms, multiculturalism never asserted that every racial group should have its own sealed and separate history or that each group’s history was equally salient to the formation of the American experience. It simply claimed that the omni-American story—of diversity and hybridity—was the legitimate American story.
  • as Nathan Glazer has put it (somewhat ruefully), “We are all multiculturalists now.” Americans have come to see—have chosen to see—that multiculturalism is not at odds with a single common culture; it is a single common culture.
  • it is true that in a finite school year, say, with finite class time and books of finite heft, not everything about everyone can be taught. There are necessary trade-offs. But in practice, recognizing the true and longstanding diversity of American identity is not an either-or. Learning about the internment of Japanese Americans does not block out knowledge of D-Day or Midway. It is additive.
  • As more diverse voices attain ever more forms of reach and power we need to re-integrate and reimagine Hirsch’s list of what literate Americans ought to know.
  • To be clear: A 21st-century omni-American approach to cultural literacy is not about crowding out “real” history with the perishable stuff of contemporary life. It’s about drawing lines of descent from the old forms of cultural expression, however formal, to their progeny, however colloquial.
  • Nor is Omni-American cultural literacy about raising the “self-esteem” of the poor, nonwhite, and marginalized. It’s about raising the collective knowledge of all—and recognizing that the wealthy, white, and powerful also have blind spots and swaths of ignorance
  • What, then, would be on your list? It’s not an idle question. It turns out to be the key to rethinking how a list should even get made.
  • the Internet has transformed who makes culture and how. As barriers to culture creation have fallen, orders of magnitude more citizens—amateurs—are able to shape the culture in which we must all be literate. Cat videos and Star Trek fan fiction may not hold up long beside Toni Morrison. But the entry of new creators leads to new claims of right: The right to be recognized. The right to be counted. The right to make the means of recognition and accounting.
  • It is true that lists alone, with no teaching to bring them to life and no expectation that they be connected to a broader education, are somewhere between useless and harmful.
  • This will be a list of nodes and nested networks. It will be a fractal of associations, which reflects far more than a linear list how our brains work and how we learn and create. Hirsch himself nodded to this reality in Cultural Literacy when he described the process he and his colleagues used for collecting items for their list, though he raised it by way of pointing out the danger of infinite regress.
  • His conclusion, appropriate to his times, was that you had to draw boundaries somewhere with the help of experts. My take, appropriate to our times, is that Americans can draw not boundaries so much as circles and linkages, concept sets and pathways among them.
  • Because 5,000 or even 500 items is too daunting a place to start, I ask here only for your top ten. What are ten things every American—newcomer or native born, affluent or indigent—should know? What ten things do you feel are both required knowledge and illuminating gateways to those unenlightened about American life? Here are my entries: Whiteness; The Federalist Papers; The Almighty Dollar; Organized labor; Reconstruction; Nativism; The American Dream; The Reagan Revolution; DARPA; A sucker born every minute.
Javier E

Mind - Past Adversity May Aid Emotional Recovery - NYTimes.com - 0 views

  • “As with so many of life’s experiences, humans are simply not very good at predicting how they’ll behave when hit by a real adversity,”
  • no one can reliably predict who will move on quickly and who will lapse into longer-term despair.
  • the number of life blows a person has taken may affect his or her mental toughness more than any other factor.
  • “Each negative event a person faces leads to an attempt to cope, which forces people to learn about their own capabilities, about their support networks — to learn who their real friends are. That kind of learning, we think, is extremely valuable for subsequent coping,” up to a point.
  • A subset of the participants, 194, reported that they had experienced not one of the fairly comprehensive list of 37 events on the survey. “We wondered: Who are these people who have managed to go through life with nothing bad happening to them?”
  • Dr. Cohen Silver said. “Are they hyper-conscientious? Socially isolated? Just young? Or otherwise unique?” They weren’t, the researchers found. Stranger still, they were not the most satisfied with their lives. Their sense of well-being was about the same, on average, as people who had suffered up to a dozen memorable blows.
  • It was those in the middle, those reporting two to six stressful events, who scored highest on several measures of well-being, and who showed the most resilience in response to recent hits.
Javier E

Is Everyone a Little Bit Racist? - NYTimes.com - 0 views

  • Research in the last couple of decades suggests that the problem is not so much overt racists. Rather, the larger problem is a broad swath of people who consider themselves enlightened, who intellectually believe in racial equality, who deplore discrimination, yet who harbor unconscious attitudes that result in discriminatory policies and behavior.
  • The player takes on the role of a police officer who is confronted with a series of images of white or black men variously holding guns or innocent objects such as wallets or cellphones. The aim is to shoot anyone with a gun while holstering your weapon in other cases.Ordinary players (often university undergraduates) routinely shoot more quickly at black men than at white men, and are more likely to mistakenly shoot an unarmed black man than an unarmed white man.
  • Correll has found no statistically significant difference between the play of blacks and that of whites in the shooting game.
  • an uncomfortable starting point is to understand that racial stereotyping remains ubiquitous, and that the challenge is not a small number of twisted white supremacists but something infinitely more subtle and complex: People who believe in equality but who act in ways that perpetuate bias and inequality.
  • One finding is that we unconsciously associate “American” with “white.” Thus, in 2008, some California college students — many of whom were supporting Barack Obama for president — unconsciously treated Obama as more foreign than Tony Blair, the former British prime minister.
  • “There’s a whole culture that promotes this idea of aggressive young black men,” Correll notes. “In our minds, young black men are associated with danger.”
  • Joshua Correll of the University of Colorado at Boulder has used an online shooter video game to try to measure these unconscious attitudes (you can play the game yourself).