TOK Friends: Group items tagged "affinity"

Javier E

Stephen Hawking just gave humanity a due date for finding another planet - The Washington Post - 0 views

  • Hawking told the audience that Earth's cataclysmic end may be hastened by humankind, which will continue to devour the planet’s resources at unsustainable rates
  • “Although the chance of a disaster to planet Earth in a given year may be quite low, it adds up over time, and becomes a near certainty in the next thousand or ten thousand years. By that time we should have spread out into space, and to other stars, so a disaster on Earth would not mean the end of the human race.”
  • “I think the development of full artificial intelligence could spell the end of the human race,” Hawking told the BBC in a 2014 interview that touched upon everything from online privacy to his affinity for his robotic-sounding voice.
  • ...1 more annotation...
  • “Once humans develop artificial intelligence, it will take off on its own and redesign itself at an ever-increasing rate,” Hawking warned in recent months. “Humans, who are limited by slow biological evolution, couldn't compete and would be superseded.”
Javier E

Do Your Friends Actually Like You? - The New York Times - 1 views

  • Recent research indicates that only about half of perceived friendships are mutual. That is, someone you think is your friend might not be so keen on you. Or, vice versa, as when someone you feel you hardly know claims you as a bestie.
  • Some blame human beings’ basic optimism, if not egocentrism, for the disconnect between perceived and actual friendships. Others point to a misunderstanding of the very notion of friendship in an age when “friend” is used as a verb, and social inclusion and exclusion are as easy as a swipe or a tap on a smartphone screen.
  • It’s a concern because the authenticity of one’s relationships has an enormous impact on one’s health and well-being.
  • ...11 more annotations...
  • The study analyzed friendship ties among 84 subjects (ages 23 to 38) in a business management class by asking them to rank one another on a five-point continuum of closeness from “I don’t know this person” to “One of my best friends.” The feelings were mutual 53 percent of the time while the expectation of reciprocity was pegged at 94 percent. This is consistent with data from several other friendship studies conducted over the past decade, encompassing more than 92,000 subjects, in which the reciprocity rates ranged from 34 percent to 53 percent.
  • “Friendship is difficult to describe,” said Alexander Nehamas, a professor of philosophy at Princeton, who in his latest book, “On Friendship,” spends almost 300 pages trying to do just that. “It’s easier to say what friendship is not and, foremost, it is not instrumental.”
  • It is not a means to obtain higher status, wangle an invitation to someone’s vacation home or simply escape your own boredom. Rather, Mr. Nehamas said, friendship is more like beauty or art, which kindles something deep within us and is “appreciated for its own sake.”
  • “Treating friends like investments or commodities is anathema to the whole idea of friendship,” said Ronald Sharp, a professor of English at Vassar College, who teaches a course on the literature of friendship. “It’s not about what someone can do for you, it’s who and what the two of you become in each other’s presence.”
  • “The notion of doing nothing but spending time in each other’s company has, in a way, become a lost art,” replaced by volleys of texts and tweets, Mr. Sharp said. “People are so eager to maximize efficiency of relationships that they have lost touch with what it is to be a friend.”
  • By his definition, friends are people you take the time to understand and allow to understand you.
  • Because time is limited, so, too, is the number of friends you can have, according to the work of the British evolutionary psychologist Robin I.M. Dunbar. He describes layers of friendship, where the topmost layer consists of only one or two people, say a spouse and best friend with whom you are most intimate and interact daily. The next layer can accommodate at most four people for whom you have great affinity, affection and concern and who require weekly attention to maintain. Out from there, the tiers contain more casual friends with whom you invest less time and tend to have a less profound and more tenuous connection. Without consistent contact, they easily fall into the realm of acquaintance. You may be friendly with them but they aren’t friends.
  • “There is a limited amount of time and emotional capital we can distribute, so we only have five slots for the most intense type of relationship,” Mr. Dunbar said. “People may say they have more than five but you can be pretty sure they are not high-quality friendships.”
  • Such boasting implies they have soul mates to spare in a culture where we are taught that leaning on someone is a sign of weakness and power is not letting others affect you. But friendship requires the vulnerability of caring as well as revealing things about yourself that don’t match the polished image in your Facebook profile or Instagram feed, said Mr. Nehamas at Princeton. Trusting that your bond will continue, and might even be strengthened, despite your shortcomings and inevitable misfortunes, he said, is a risk many aren’t willing to take.
  • According to medical experts, playing it safe by engaging in shallow, unfulfilling or nonreciprocal relationships has physical repercussions. Not only do the resulting feelings of loneliness and isolation increase the risk of death as much as smoking, alcoholism and obesity; you may also lose tone, or function, in the so-called smart vagus nerve, which brain researchers think allows us to be in intimate, supportive and reciprocal relationships in the first place.
  • In the presence of a true friend, Dr. Banks said, the smart or modulating aspect of the vagus nerve is what makes us feel at ease rather than on guard as when we are with a stranger or someone judgmental. It’s what enables us to feel O.K. about exposing the soft underbelly of our psyche and helps us stay engaged and present in times of conflict. Lacking authentic friendships, the smart vagus nerve is not exercised. It loses tone and one’s anxiety remains high, making abiding, deep connections difficult.
Javier E

What Have We Learned, If Anything? by Tony Judt | The New York Review of Books - 0 views

  • During the Nineties, and again in the wake of September 11, 2001, I was struck more than once by a perverse contemporary insistence on not understanding the context of our present dilemmas, at home and abroad; on not listening with greater care to some of the wiser heads of earlier decades; on seeking actively to forget rather than remember, to deny continuity and proclaim novelty on every possible occasion. We have become stridently insistent that the past has little of interest to teach us. Ours, we assert, is a new world; its risks and opportunities are without precedent.
  • the twentieth century that we have chosen to commemorate is curiously out of focus. The overwhelming majority of places of official twentieth-century memory are either avowedly nostalgo-triumphalist—praising famous men and celebrating famous victories—or else, and increasingly, they are opportunities for the recollection of selective suffering.
  • The problem with this lapidary representation of the last century as a uniquely horrible time from which we have now, thankfully, emerged is not the description—it was in many ways a truly awful era, an age of brutality and mass suffering perhaps unequaled in the historical record. The problem is the message: that all of that is now behind us, that its meaning is clear, and that we may now advance—unencumbered by past errors—into a different and better era.
  • ...19 more annotations...
  • Today, the “common” interpretation of the recent past is thus composed of the manifold fragments of separate pasts, each of them (Jewish, Polish, Serb, Armenian, German, Asian-American, Palestinian, Irish, homosexual…) marked by its own distinctive and assertive victimhood.
  • The resulting mosaic does not bind us to a shared past, it separates us from it. Whatever the shortcomings of the national narratives once taught in school, however selective their focus and instrumental their message, they had at least the advantage of providing a nation with past references for present experience. Traditional history, as taught to generations of schoolchildren and college students, gave the present a meaning by reference to the past: today’s names, places, inscriptions, ideas, and allusions could be slotted into a memorized narrative of yesterday. In our time, however, this process has gone into reverse. The past now acquires meaning only by reference to our many and often contrasting present concerns.
  • the United States thus has no modern memory of combat or loss remotely comparable to that of the armed forces of other countries. But it is civilian casualties that leave the most enduring mark on national memory and here the contrast is piquant indeed
  • Today, the opposite applies. Most people in the world outside of sub-Saharan Africa have access to a near infinity of data. But in the absence of any common culture beyond a small elite, and not always even there, the fragmented information and ideas that people select or encounter are determined by a multiplicity of tastes, affinities, and interests. As the years pass, each one of us has less in common with the fast-multiplying worlds of our contemporaries, not to speak of the world of our forebears.
  • What is significant about the present age of transformations is the unique insouciance with which we have abandoned not merely the practices of the past but their very memory. A world just recently lost is already half forgotten.
  • In the US, at least, we have forgotten the meaning of war. There is a reason for this.
  • Until the last decades of the twentieth century most people in the world had limited access to information; but—thanks to national education, state-controlled radio and television, and a common print culture—within any one state or nation or community people were all likely to know many of the same things.
  • it was precisely that claim, that “it’s torture, and therefore it’s no good,” which until very recently distinguished democracies from dictatorships. We pride ourselves on having defeated the “evil empire” of the Soviets. Indeed so. But perhaps we should read again the memoirs of those who suffered at the hands of that empire—the memoirs of Eugen Loebl, Artur London, Jo Langer, Lena Constante, and countless others—and then compare the degrading abuses they suffered with the treatments approved and authorized by President Bush and the US Congress. Are they so very different?
  • As a consequence, the United States today is the only advanced democracy where public figures glorify and exalt the military, a sentiment familiar in Europe before 1945 but quite unknown today
  • the complacent neoconservative claim that war and conflict are things Americans understand—in contrast to naive Europeans with their pacifistic fantasies—seems to me exactly wrong: it is Europeans (along with Asians and Africans) who understand war all too well. Most Americans have been fortunate enough to live in blissful ignorance of its true significance.
  • That same contrast may account for the distinctive quality of much American writing on the cold war and its outcome. In European accounts of the fall of communism, from both sides of the former Iron Curtain, the dominant sentiment is one of relief at the closing of a long, unhappy chapter. Here in the US, however, the story is typically recorded in a triumphalist key.
  • For many American commentators and policymakers the message of the twentieth century is that war works. Hence the widespread enthusiasm for our war on Iraq in 2003 (despite strong opposition to it in most other countries). For Washington, war remains an option—on that occasion the first option. For the rest of the developed world it has become a last resort.
  • Ignorance of twentieth-century history does not just contribute to a regrettable enthusiasm for armed conflict. It also leads to a misidentification of the enemy.
  • This abstracting of foes and threats from their context—this ease with which we have talked ourselves into believing that we are at war with “Islamofascists,” “extremists” from a strange culture, who dwell in some distant “Islamistan,” who hate us for who we are and seek to destroy “our way of life”—is a sure sign that we have forgotten the lesson of the twentieth century: the ease with which war and fear and dogma can bring us to demonize others, deny them a common humanity or the protection of our laws, and do unspeakable things to them.
  • How else are we to explain our present indulgence for the practice of torture? For indulge it we assuredly do.
  • “But what would I have achieved by proclaiming my opposition to torture?” he replied. “I have never met anyone who is in favor of torture.” Well, times have changed. In the US today there are many respectable, thinking people who favor torture—under the appropriate circumstances and when applied to those who merit it.
  • American civilian losses (excluding the merchant navy) in both world wars amounted to less than 2,000 dead.
  • We are slipping down a slope. The sophistic distinctions we draw today in our war on terror—between the rule of law and “exceptional” circumstances, between citizens (who have rights and legal protections) and noncitizens to whom anything can be done, between normal people and “terrorists,” between “us” and “them”—are not new. The twentieth century saw them all invoked. They are the selfsame distinctions that licensed the worst horrors of the recent past: internment camps, deportation, torture, and murder—those very crimes that prompt us to murmur “never again.” So what exactly is it that we think we have learned from the past? Of what possible use is our self-righteous cult of memory and memorials if the United States can build its very own internment camp and torture people there?
  • We need to learn again—or perhaps for the first time—how war brutalizes and degrades winners and losers alike and what happens to us when, having heedlessly waged war for no good reason, we are encouraged to inflate and demonize our enemies in order to justify that war’s indefinite continuance.
Duncan H

Mitt Romney's Problem Speaking About Money - NYTimes.com - 0 views

  • Why is someone who is so good at making money so bad at talking about it? Mitt Romney is not the first presidential candidate who’s had trouble communicating with working-class voters: John Kerry famously enjoyed wind-surfing, and George Bush blamed a poor showing in a straw poll on the fact that many of his supporters were “at their daughter’s coming out party.” Veritable battalions of Kennedys and Roosevelts have dealt with the economic and cultural gaps between themselves and the voters over the years without much difficulty. Not so Barack Obama, whose attempt to commiserate with Iowa farmers in 2007 about crop prices by mentioning the cost of arugula at Whole Foods fell flat.
  • Romney’s reference last week to the fact that his wife “drives a couple of Cadillacs, actually” is not grounds in itself for a voter to oppose his candidacy. Neither was the $10,000 bet he offered to Rick Perry during a debate in December or the time he told a group of the unemployed in Florida that he was “also unemployed.” But his penchant for awkward references to his own wealth has underscored the suspicion that many voters have about his ability to understand their economic problems. His opponents in both parties are gleefully highlighting these moments as a way to drive a wedge between Romney and the working class voters who have become an increasingly important part of the Republican Party base.
  • The current economic circumstances have undoubtedly exacerbated the problem for Romney. Had Obama initially sought the presidency during a primary season dominated by concerns about the domestic economy rather than war in Iraq, his explanation that small town voters “get bitter, they cling to guns or religion or antipathy to people who aren’t like them” might have created an opportunity for Hillary Clinton or even the populist message of John Edwards.
  • ...7 more annotations...
  • But Obama’s early opposition to the Iraq war gave him a political firewall that protected him throughout that primary campaign, while Romney has no such policy safe harbor to safeguard him from an intramural backlash.
  • Romney and Obama share a lack of natural affinity for this key group of swing voters, but it is Romney who needs to figure out some way of addressing this shortcoming if he wants to make it to the White House. It’s Romney’s misfortune that the voters’ prioritization of economic issues, his own privileged upbringing and his lack of connection with his party’s base on other core issues put him in a much more precarious position than candidate Obama ever reached.
  • By the time the 2008 general election rolled around, Obama had bolstered his outreach to these voters by recruiting the blue-collar avatar Joe Biden as his running mate. Should Romney win the Republican nomination this year, his advisers will almost certainly be tempted by the working-class credentials that a proletarian like New Jersey Governor Chris Christie or Florida Senator Marco Rubio would bring to the ticket.
  • Of more immediate concern to Team Romney should be how their candidate can overcome his habit of economic tone-deafness before Rick Santorum steals away enough working-class and culturally conservative voters to throw the Republican primary into complete and utter turmoil.
  • The curious thing about Romney’s verbal missteps is how limited they are to this very specific area of public policy. He is usually quite articulate when talking about foreign affairs and national security. Despite his complicated history on social and cultural matters like health care and abortion, his explanations are usually both coherent and comprehensible, even to those who oppose his positions. It’s only when he begins talking about economic issues – his biographical strength – that he seems to get clumsy.
  • The second possibility would be for him to outline a series of proposals specifically targeted at the needs of working-class and poor Americans, not only to control the damage from his gaffes but also to underscore the conservative premise that a right-leaning agenda will create opportunities for those on the lower rungs of the economic ladder. But while that approach might help Romney in a broader philosophical conversation, it’s unlikely to offer him much protection from the attacks and ridicule that his unforced errors will continue to bring him.
  • The question is why Romney hasn’t embraced a third alternative – admitting the obvious and then explaining why he gets so tongue-tied when the conversation turns to money. Romney’s upbringing and religious faith suggest a sense of obligation to the less fortunate and an unspoken understanding that it isn’t appropriate to call attention to one’s financial success. It wouldn’t be that hard for him to say something like: “I was taught not to brag and boast and think I’m better than other people because of the successes I’ve had, so occasionally I’m going to say things that sound awkward. It’s because I’d rather talk about what it takes to get America back to work.”
  • Do you think the solution Douthat proposes would work?
Javier E

The American Scholar: The Decline of the English Department - William M. Chace - 1 views

  • The number of young men and women majoring in English has dropped dramatically; the same is true of philosophy, foreign languages, art history, and kindred fields, including history. As someone who has taught in four university English departments over the last 40 years, I am dismayed by this shift, as are my colleagues here and there across the land. And because it is probably irreversible, it is important to attempt to sort out the reasons—the many reasons—for what has happened.
  • English: from 7.6 percent of the majors to 3.9 percent
  • In one generation, then, the numbers of those majoring in the humanities dropped from a total of 30 percent to a total of less than 16 percent; during that same generation, business majors climbed from 14 percent to 22 percent.
  • ...23 more annotations...
  • History: from 18.5 percent to 10.7 percent
  • But the deeper explanation resides not in something that has happened to it, but in what it has done to itself. English has become less and less coherent as a discipline and, worse, has come near exhaustion as a scholarly pursuit.
  • The twin focus, then, was on the philological nature of the enterprise and the canon of great works to be studied in their historical evolution.
  • Studying English taught us how to write and think better, and to make articulate many of the inchoate impulses and confusions of our post-adolescent minds. We began to see, as we had not before, how such books could shape and refine our thinking. We began to understand why generations of people coming before us had kept them in libraries and bookstores and in classes such as ours. There was, we got to know, a tradition, a historical culture, that had been assembled around these books. Shakespeare had indeed made a difference—to people before us, now to us, and forever to the language of English-speaking people.
  • today there are stunning changes in the student population: there are more and more gifted and enterprising students coming from immigrant backgrounds, students with only slender connections to Western culture and to the assumption that the “great books” of England and the United States should enjoy a fixed centrality in the world. What was once the heart of the matter now seems provincial. Why throw yourself into a study of something not emblematic of the world but representative of a special national interest? As the campus reflects the cultural, racial, and religious complexities of the world around it, reading British and American literature looks more and more marginal. From a global perspective, the books look smaller.
  • With the cost of a college degree surging upward during the last quarter century—tuition itself increasing far beyond any measure of inflation—and with consequent growth in loan debt after graduation, parents have become anxious about the relative earning power of a humanities degree. Their college-age children doubtless share such anxiety. When college costs were lower, anxiety could be kept at bay. (Berkeley in the early ’60s cost me about $100 a year, about $700 in today’s dollars.)
  • Economists, chemists, biologists, psychologists, computer scientists, and almost everyone in the medical sciences win sponsored research, grants, and federal dollars. By and large, humanists don’t, and so they find themselves as direct employees of the institution, consuming money in salaries, pensions, and operating needs—not external money but institutional money.
  • These, then, are some of the external causes of the decline of English: the rise of public education; the relative youth and instability (despite its apparent mature solidity) of English as a discipline; the impact of money; and the pressures upon departments within the modern university to attract financial resources rather than simply use them up.
  • several of my colleagues around the country have called for a return to the aesthetic wellsprings of literature, the rock-solid fact, often neglected, that it can indeed amuse, delight, and educate. They urge the teaching of English, or French, or Russian literature, and the like, in terms of the intrinsic value of the works themselves, in all their range and multiplicity, as well-crafted and appealing artifacts of human wisdom. Second, we should redefine our own standards for granting tenure, placing more emphasis on the classroom and less on published research, and we should prepare to contest our decisions with administrators whose science-based model is not an appropriate means of evaluation.
  • “It may be that what has happened to the profession is not the consequence of social or philosophical changes, but simply the consequence of a tank now empty.” His homely metaphor pointed to the absence of genuinely new frontiers of knowledge and understanding for English professors to explore.
  • In this country and in England, the study of English literature began in the latter part of the 19th century as an exercise in the scientific pursuit of philological research, and those who taught it subscribed to the notion that literature was best understood as a product of language.
  • no one has come forward in years to assert that the study of English (or comparative literature or similar undertakings in other languages) is coherent, does have self-limiting boundaries, and can be described as this but not that.
  • to teach English today is to do, intellectually, what one pleases. No sense of duty remains toward works of English or American literature; amateur sociology or anthropology or philosophy or comic books or studies of trauma among soldiers or survivors of the Holocaust will do. You need not even believe that works of literature have intelligible meaning; you can announce that they bear no relationship at all to the world beyond the text.
  • With everything on the table, and with foundational principles abandoned, everyone is free, in the classroom or in prose, to exercise intellectual laissez-faire in the largest possible way—I won’t interfere with what you do and am happy to see that you will return the favor
  • Consider the English department at Harvard University. It has now agreed to remove its survey of English literature for undergraduates, replacing it and much else with four new “affinity groups”
  • there would be no one book, or family of books, that every English major at Harvard would have read by the time he or she graduates. The direction to which Harvard would lead its students in this “clean slate” or “trickle down” experiment is to suspend literary history, thrusting into the hands of undergraduates the job of cobbling together intellectual coherence for themselves
  • Those who once strove to give order to the curriculum will have learned, from Harvard, that terms like core knowledge and foundational experience only trigger acrimony, turf protection, and faculty mutinies. No one has the stomach anymore to refight the Western culture wars. Let the students find their own way to knowledge.
  • In English, the average number of years spent earning a doctoral degree is almost 11. After passing that milestone, only half of new Ph.D.’s find teaching jobs, the number of new positions having declined over the last year by more than 20 percent; many of those jobs are part-time or come with no possibility of tenure. News like that, moving through student networks, can be matched against, at least until recently, the reputed earning power of recent graduates of business schools, law schools, and medical schools. The comparison is akin to what young people growing up in Rust Belt cities are forced to see: the work isn’t here anymore; our technology is obsolete.
  • unlike other members of the university community, they might well have been plying their trade without proper credentials: “Whereas economists or physicists, geologists or climatologists, physicians or lawyers must master a body of knowledge before they can even think of being licensed to practice,” she said, “we literary scholars, it is tacitly assumed, have no definable expertise.”
  • English departments need not refight the Western culture wars. But they need to fight their own book wars. They must agree on which texts to teach and argue out the choices and the principles of making them if they are to claim the respect due a department of study.
  • They can teach their students to write well, to use rhetoric. They should place their courses in composition and rhetoric at the forefront of their activities. They should announce that the teaching of composition is a skill their instructors have mastered and that students majoring in English will be certified, upon graduation, as possessing rigorously tested competence in prose expression.
  • The study of literature will then take on the profile now held, with moderate dignity, by the study of the classics, Greek and Latin.
  • But we can, we must, do better. At stake are the books themselves and what they can mean to the young. Yes, it is just a literary tradition. That’s all. But without such traditions, civil societies have no compass to guide them.
grayton downing

Drug Widens Immunity to Flu | The Scientist Magazine® - 0 views

  • drug rapamycin paradoxically helped to protect mice against a diverse range of influenza viruses after the animals were vaccinated against just one flu strain.
  • many subtypes and strains of influenza, which evolve at great speed and often hybridize into entirely new strains. Current flu vaccines cannot protect against all of these strains, which forces scientists to try and predict those most likely to cause problems in the coming year.
  • In treated mice, the B cells produced a more diverse repertoire of antibodies, which targeted different parts of the incoming viruses, including regions that are conserved across many strains. This provided protection against flu viruses regardless of strain. 
  • ...3 more annotations...
  • cross-reactive antibodies bind relatively weakly to their targets and, under normal circumstances, would probably get outcompeted by antibodies with a narrower focus but higher affinity. “For whatever reason, antibodies to the conserved regions are very rare,”
  • “possible to skew the response towards more broadly cross-reactive antibodies, in mice, in a particular situation,”
  • “not advocating that we use rapamycin [in humans],” said McGargill. However, her group’s discovery could point to other ways of achieving the same effect, perhaps by manipulating the immune system into producing more cross-reactive antibodies. “Maybe instead of trying to enhance the immune response, we need to dampen it a little bit, and allow it to be more diverse.”
Javier E

The Twitter Trap - NYTimes.com - 0 views

  • innovation often comes at a price. And sometimes I wonder if the price is a piece of ourselves.
  • Basically, we are outsourcing our brains to the cloud. The upside is that this frees a lot of gray matter for important pursuits like FarmVille and “Real Housewives.” But my inner worrywart wonders whether the new technologies overtaking us may be eroding characteristics that are essentially human: our ability to reflect, our pursuit of meaning, genuine empathy, a sense of community connected by something deeper than snark or political affinity.
  • Twitter is not just an ambient presence. It demands attention and response. It is the enemy of contemplation.
  • ...4 more annotations...
  • I’m not even sure these new instruments are genuinely “social.” There is something decidedly faux about the camaraderie of Facebook, something illusory about the connectedness of Twitter.
  • In an actual discussion, the marshaling of information is cumulative, complication is acknowledged, sometimes persuasion occurs. In a Twitter discussion, opinions and our tolerance for others’ opinions are stunted. Whether or not Twitter makes you stupid, it certainly makes some smart people sound stupid.
  • The shortcomings of social media would not bother me awfully if I did not suspect that Facebook friendship and Twitter chatter are displacing real rapport and real conversation, just as Gutenberg’s device displaced remembering. The things we may be unlearning, tweet by tweet — complexity, acuity, patience, wisdom, intimacy — are things that matter.
  • there is a wistful passage about the high-school cohort my daughter is about to join. Wolitzer describes them this way: “The generation that had information, but no context. Butter, but no bread. Craving, but no longing.”
catbclark

Why Do Many Reasonable People Doubt Science? - National Geographic Magazine - 0 views

  • Actually fluoride is a natural mineral that, in the weak concentrations used in public drinking water systems, hardens tooth enamel and prevents tooth decay—a cheap and safe way to improve dental health for everyone, rich or poor, conscientious brusher or not. That’s the scientific and medical consensus.
  • when Galileo claimed that the Earth spins on its axis and orbits the sun, he wasn’t just rejecting church doctrine. He was asking people to believe something that defied common sense
  • all manner of scientific knowledge—from the safety of fluoride and vaccines to the reality of climate change—faces organized and often furious opposition.
  • ...61 more annotations...
  • Empowered by their own sources of information and their own interpretations of research, doubters have declared war on the consensus of experts.
  • Our lives are permeated by science and technology as never before. For many of us this new world is wondrous, comfortable, and rich in rewards—but also more complicated and sometimes unnerving. We now face risks we can’t easily analyze.
  • The world crackles with real and imaginary hazards, and distinguishing the former from the latter isn’t easy.
  • In this bewildering world we have to decide what to believe and how to act on that. In principle that’s what science is for.
  • “Science is not a body of facts,” says geophysicist Marcia McNutt,
  • “Science is a method for deciding whether what we choose to believe has a basis in the laws of nature or not.”
  • The scientific method leads us to truths that are less than self-evident, often mind-blowing, and sometimes hard to swallow.
  • We don’t believe you.
  • Galileo was put on trial and forced to recant. Two centuries later Charles Darwin escaped that fate. But his idea that all life on Earth evolved from a primordial ancestor and that we humans are distant cousins of apes, whales, and even deep-sea mollusks is still a big ask for a lot of people. So is another 19th-century notion: that carbon dioxide, an invisible gas that we all exhale all the time and that makes up less than a tenth of one percent of the atmosphere, could be affecting Earth’s climate.
  • we intellectually accept these precepts of science, we subconsciously cling to our intuitions
  • Shtulman’s research indicates that as we become scientifically literate, we repress our naive beliefs but never eliminate them entirely. They lurk in our brains, chirping at us as we try to make sense of the world.
  • Most of us do that by relying on personal experience and anecdotes, on stories rather than statistics.
  • We have trouble digesting randomness; our brains crave pattern and meaning.
  • we can deceive ourselves.
  • Even for scientists, the scientific method is a hard discipline. Like the rest of us, they’re vulnerable to what they call confirmation bias—the tendency to look for and see only evidence that confirms what they already believe. But unlike the rest of us, they submit their ideas to formal peer review before publishing them
  • other scientists will try to reproduce them
  • Scientific results are always provisional, susceptible to being overturned by some future experiment or observation. Scientists rarely proclaim an absolute truth or absolute certainty. Uncertainty is inevitable at the frontiers of knowledge.
  • Many people in the United States—a far greater percentage than in other countries—retain doubts about that consensus or believe that climate activists are using the threat of global warming to attack the free market and industrial society generally.
  • news media give abundant attention to such mavericks, naysayers, professional controversialists, and table thumpers. The media would also have you believe that science is full of shocking discoveries made by lone geniuses
  • science tells us the truth rather than what we’d like the truth to be. Scientists can be as dogmatic as anyone else—but their dogma is always wilting in the hot glare of new research.
  • But industry PR, however misleading, isn’t enough to explain why only 40 percent of Americans, according to the most recent poll from the Pew Research Center, accept that human activity is the dominant cause of global warming.
  • “science communication problem,”
  • yielded abundant new research into how people decide what to believe—and why they so often don’t accept the scientific consensus.
  • higher literacy was associated with stronger views—at both ends of the spectrum. Science literacy promoted polarization on climate, not consensus. According to Kahan, that’s because people tend to use scientific knowledge to reinforce beliefs that have already been shaped by their worldview.
  • “egalitarian” and “communitarian” mind-set are generally suspicious of industry and apt to think it’s up to something dangerous that calls for government regulation; they’re likely to see the risks of climate change.
  • “hierarchical” and “individualistic” mind-set respect leaders of industry and don’t like government interfering in their affairs; they’re apt to reject warnings about climate change, because they know what accepting them could lead to—some kind of tax or regulation to limit emissions.
  • For a hierarchical individualist, Kahan says, it’s not irrational to reject established climate science: Accepting it wouldn’t change the world, but it might get him thrown out of his tribe.
  • Science appeals to our rational brain, but our beliefs are motivated largely by emotion, and the biggest motivation is remaining tight with our peers.
  • organizations funded in part by the fossil fuel industry have deliberately tried to undermine the public’s understanding of the scientific consensus by promoting a few skeptics.
  • Internet makes it easier than ever for climate skeptics and doubters of all kinds to find their own information and experts
  • Internet has democratized information, which is a good thing. But along with cable TV, it has made it possible to live in a “filter bubble” that lets in only the information with which you already agree.
  • How to convert climate skeptics? Throwing more facts at them doesn’t help.
  • people need to hear from believers they can trust, who share their fundamental values.
  • We believe in scientific ideas not because we have truly evaluated all the evidence but because we feel an affinity for the scientific community.
  • “Believing in evolution is just a description about you. It’s not an account of how you reason.”
  • evolution actually happened. Biology is incomprehensible without it. There aren’t really two sides to all these issues. Climate change is happening. Vaccines really do save lives. Being right does matter—and the science tribe has a long track record of getting things right in the end. Modern society is built on things it got right.
  • Doubting science also has consequences.
  • In the climate debate the consequences of doubt are likely global and enduring. In the U.S., climate change skeptics have achieved their fundamental goal of halting legislative action to combat global warming.
  • “That line between science communication and advocacy is very hard to step back from,”
  • It’s their very detachment, what you might call the cold-bloodedness of science, that makes science the killer app.
  • that need to fit in is so strong that local values and local opinions are always trumping science.
  • not a sin to change your mind when the evidence demands it.
  • for the best scientists, the truth is more important than the tribe.
  • Students come away thinking of science as a collection of facts, not a method.
  • Shtulman’s research has shown that even many college students don’t really understand what evidence is.
  • “Everybody should be questioning,” says McNutt. “That’s a hallmark of a scientist. But then they should use the scientific method, or trust people using the scientific method, to decide which way they fall on those questions.”
  • science has made us the dominant organisms,
  • incredibly rapid change, and it’s scary sometimes. It’s not all progress.
  • But the notion of a vaccine-autism connection has been endorsed by celebrities and reinforced through the usual Internet filters. (Anti-vaccine activist and actress Jenny McCarthy famously said on the Oprah Winfrey Show, “The University of Google is where I got my degree from.”)
    • catbclark
       
      Power of celebrities, internet as a source
  • The scientific method doesn’t come naturally—but if you think about it, neither does democracy. For most of human history neither existed. We went around killing each other to get on a throne, praying to a rain god, and for better and much worse, doing things pretty much as our ancestors did.
  • We need to get a lot better at finding answers, because it’s certain the questions won’t be getting any simpler.
  • That the Earth is round has been known since antiquity—Columbus knew he wouldn’t sail off the edge of the world—but alternative geographies persisted even after circumnavigations had become common
  • We live in an age when all manner of scientific knowledge—from climate change to vaccinations—faces furious opposition. Some even have doubts about the moon landing.
  • Why Do Many Reasonable People Doubt Science?
  • science doubt itself has become a pop-culture meme.
  • Flat-Earthers held that the planet was centered on the North Pole and bounded by a wall of ice, with the sun, moon, and planets a few hundred miles above the surface. Science often demands that we discount our direct sensory experiences—such as seeing the sun cross the sky as if circling the Earth—in favor of theories that challenge our beliefs about our place in the universe.
  • Yet just because two things happened together doesn’t mean one caused the other, and just because events are clustered doesn’t mean they’re not still random.
  • Sometimes scientists fall short of the ideals of the scientific method. Especially in biomedical research, there’s a disturbing trend toward results that can’t be reproduced outside the lab that found them, a trend that has prompted a push for greater transparency about how experiments are conducted
  • “Science will find the truth,” Collins says. “It may get it wrong the first time and maybe the second time, but ultimately it will find the truth.” That provisional quality of science is another thing a lot of people have trouble with.
  • scientists love to debunk one another
  • “they will continue to trump science, especially when there is no clear downside to ignoring science.”
Javier E

ROUGH TYPE | Nicholas Carr's blog - 0 views

  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • ...39 more annotations...
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Social skills and relationships seem to suffer as well.
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms.
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
Javier E

They're Watching You at Work - Don Peck - The Atlantic - 2 views

  • Predictive statistical analysis, harnessed to big data, appears poised to alter the way millions of people are hired and assessed.
  • By one estimate, more than 98 percent of the world’s information is now stored digitally, and the volume of that data has quadrupled since 2007.
  • The application of predictive analytics to people’s careers—an emerging field sometimes called “people analytics”—is enormously challenging, not to mention ethically fraught
  • ...52 more annotations...
  • By the end of World War II, however, American corporations were facing severe talent shortages. Their senior executives were growing old, and a dearth of hiring from the Depression through the war had resulted in a shortfall of able, well-trained managers. Finding people who had the potential to rise quickly through the ranks became an overriding preoccupation of American businesses. They began to devise a formal hiring-and-management system based in part on new studies of human behavior, and in part on military techniques developed during both world wars, when huge mobilization efforts and mass casualties created the need to get the right people into the right roles as efficiently as possible. By the 1950s, it was not unusual for companies to spend days with young applicants for professional jobs, conducting a battery of tests, all with an eye toward corner-office potential.
  • But companies abandoned their hard-edged practices for another important reason: many of their methods of evaluation turned out not to be very scientific.
  • this regime, so widespread in corporate America at mid-century, had almost disappeared by 1990. “I think an HR person from the late 1970s would be stunned to see how casually companies hire now,”
  • Many factors explain the change, he said, and then he ticked off a number of them: Increased job-switching has made it less important and less economical for companies to test so thoroughly. A heightened focus on short-term financial results has led to deep cuts in corporate functions that bear fruit only in the long term. The Civil Rights Act of 1964, which exposed companies to legal liability for discriminatory hiring practices, has made HR departments wary of any broadly applied and clearly scored test that might later be shown to be systematically biased.
  • about a quarter of the country’s corporations were using similar tests to evaluate managers and junior executives, usually to assess whether they were ready for bigger roles.
  • He has encouraged the company’s HR executives to think about applying the games to the recruitment and evaluation of all professional workers.
  • Knack makes app-based video games, among them Dungeon Scrawl, a quest game requiring the player to navigate a maze and solve puzzles, and Wasabi Waiter, which involves delivering the right sushi to the right customer at an increasingly crowded happy hour. These games aren’t just for play: they’ve been designed by a team of neuroscientists, psychologists, and data scientists to suss out human potential. Play one of them for just 20 minutes, says Guy Halfteck, Knack’s founder, and you’ll generate several megabytes of data, exponentially more than what’s collected by the SAT or a personality test. How long you hesitate before taking every action, the sequence of actions you take, how you solve problems—all of these factors and many more are logged as you play, and then are used to analyze your creativity, your persistence, your capacity to learn quickly from mistakes, your ability to prioritize, and even your social intelligence and personality. The end result, Halfteck says, is a high-resolution portrait of your psyche and intellect, and an assessment of your potential as a leader or an innovator. (A toy sketch of this kind of gameplay telemetry appears after this list.)
  • When the results came back, Haringa recalled, his heart began to beat a little faster. Without ever seeing the ideas, without meeting or interviewing the people who’d proposed them, without knowing their title or background or academic pedigree, Knack’s algorithm had identified the people whose ideas had panned out. The top 10 percent of the idea generators as predicted by Knack were in fact those who’d gone furthest in the process.
  • What Knack is doing, Haringa told me, “is almost like a paradigm shift.” It offers a way for his GameChanger unit to avoid wasting time on the 80 people out of 100—nearly all of whom look smart, well-trained, and plausible on paper—whose ideas just aren’t likely to work out.
  • Aptitude, skills, personal history, psychological stability, discretion, loyalty—companies at the time felt they had a need (and the right) to look into them all. That ambit is expanding once again, and this is undeniably unsettling. Should the ideas of scientists be dismissed because of the way they play a game? Should job candidates be ranked by what their Web habits say about them? Should the “data signature” of natural leaders play a role in promotion? These are all live questions today, and they prompt heavy concerns: that we will cede one of the most subtle and human of skills, the evaluation of the gifts and promise of other people, to machines; that the models will get it wrong; that some people will never get a shot in the new workforce.
  • scoring distance from work could violate equal-employment-opportunity standards. Marital status? Motherhood? Church membership? “Stuff like that,” Meyerle said, “we just don’t touch”—at least not in the U.S., where the legal environment is strict. Meyerle told me that Evolv has looked into these sorts of factors in its work for clients abroad, and that some of them produce “startling results.”
  • consider the alternative. A mountain of scholarly literature has shown that the intuitive way we now judge professional potential is rife with snap judgments and hidden biases, rooted in our upbringing or in deep neurological connections that doubtless served us well on the savanna but would seem to have less bearing on the world of work.
  • We may like to think that society has become more enlightened since those days, and in many ways it has, but our biases are mostly unconscious, and they can run surprisingly deep. Consider race. For a 2004 study called “Are Emily and Greg More Employable Than Lakisha and Jamal?,” the economists Sendhil Mullainathan and Marianne Bertrand put white-sounding names (Emily Walsh, Greg Baker) or black-sounding names (Lakisha Washington, Jamal Jones) on similar fictitious résumés, which they then sent out to a variety of companies in Boston and Chicago. To get the same number of callbacks, they learned, they needed to either send out half again as many résumés with black names as those with white names, or add eight extra years of relevant work experience to the résumés with black names.
  • a sociologist at Northwestern, spent parts of the three years from 2006 to 2008 interviewing professionals from elite investment banks, consultancies, and law firms about how they recruited, interviewed, and evaluated candidates, and concluded that among the most important factors driving their hiring recommendations were—wait for it—shared leisure interests.
  • Lacking “reliable predictors of future performance,” Rivera writes, “assessors purposefully used their own experiences as models of merit.” Former college athletes “typically prized participation in varsity sports above all other types of involvement.” People who’d majored in engineering gave engineers a leg up, believing they were better prepared.
  • the prevailing system of hiring and management in this country involves a level of dysfunction that should be inconceivable in an economy as sophisticated as ours. Recent survey data collected by the Corporate Executive Board, for example, indicate that nearly a quarter of all new hires leave their company within a year of their start date, and that hiring managers wish they’d never extended an offer to one out of every five members on their team
  • In the late 1990s, as these assessments shifted from paper to digital formats and proliferated, data scientists started doing massive tests of what makes for a successful customer-support technician or salesperson. This has unquestionably improved the quality of the workers at many firms.
  • In 2010, however, Xerox switched to an online evaluation that incorporates personality testing, cognitive-skill assessment, and multiple-choice questions about how the applicant would handle specific scenarios that he or she might encounter on the job. An algorithm behind the evaluation analyzes the responses, along with factual information gleaned from the candidate’s application, and spits out a color-coded rating: red (poor candidate), yellow (middling), or green (hire away). Those candidates who score best, I learned, tend to exhibit a creative but not overly inquisitive personality, and participate in at least one but not more than four social networks, among many other factors. (Previous experience, one of the few criteria that Xerox had explicitly screened for in the past, turns out to have no bearing on either productivity or retention. A sketch of this kind of color-banded scoring appears after this list.)
  • When Xerox started using the score in its hiring decisions, the quality of its hires immediately improved. The rate of attrition fell by 20 percent in the initial pilot period, and over time, the number of promotions rose. Xerox still interviews all candidates in person before deciding to hire them, Morse told me, but, she added, “We’re getting to the point where some of our hiring managers don’t even want to interview anymore”
  • Gone are the days, Ostberg told me, when, say, a small survey of college students would be used to predict the statistical validity of an evaluation tool. “We’ve got a data set of 347,000 actual employees who have gone through these different types of assessments or tools,” he told me, “and now we have performance-outcome data, and we can split those and slice and dice by industry and location.”
  • Evolv’s tests allow companies to capture data about everybody who applies for work, and everybody who gets hired—a complete data set from which sample bias, long a major vexation for industrial-organization psychologists, simply disappears. The sheer number of observations that this approach makes possible allows Evolv to say with precision which attributes matter more to the success of retail-sales workers (decisiveness, spatial orientation, persuasiveness) or customer-service personnel at call centers (rapport-building)
  • There are some data that Evolv simply won’t use, out of a concern that the information might lead to systematic bias against whole classes of people
  • the idea that hiring was a science fell out of favor. But now it’s coming back, thanks to new technologies and methods of analysis that are cheaper, faster, and much-wider-ranging than what we had before
  • what most excites him are the possibilities that arise from monitoring the entire life cycle of a worker at any given company.
  • Now the two companies are working together to marry pre-hire assessments to an increasing array of post-hire data: about not only performance and duration of service but also who trained the employees; who has managed them; whether they were promoted to a supervisory role, and how quickly; how they performed in that role; and why they eventually left.
  • What begins with an online screening test for entry-level workers ends with the transformation of nearly every aspect of hiring, performance assessment, and management.
  • I turned to Sandy Pentland, the director of the Human Dynamics Laboratory at MIT. In recent years, Pentland has pioneered the use of specialized electronic “badges” that transmit data about employees’ interactions as they go about their days. The badges capture all sorts of information about formal and informal conversations: their length; the tone of voice and gestures of the people involved; how much those people talk, listen, and interrupt; the degree to which they demonstrate empathy and extroversion; and more. Each badge generates about 100 data points a minute.
  • he tried the badges out on about 2,500 people, in 21 different organizations, and learned a number of interesting lessons. About a third of team performance, he discovered, can usually be predicted merely by the number of face-to-face exchanges among team members. (Too many is as much of a problem as too few.) Using data gathered by the badges, he was able to predict which teams would win a business-plan contest, and which workers would (rightly) say they’d had a “productive” or “creative” day. Not only that, but he claimed that his researchers had discovered the “data signature” of natural leaders, whom he called “charismatic connectors” and all of whom, he reported, circulate actively, give their time democratically to others, engage in brief but energetic conversations, and listen at least as much as they talk. (A sketch of the regression behind this sort of prediction appears after this list.)
  • His group is developing apps to allow team members to view their own metrics more or less in real time, so that they can see, relative to the benchmarks of highly successful employees, whether they’re getting out of their offices enough, or listening enough, or spending enough time with people outside their own team.
  • Torrents of data are routinely collected by American companies and now sit on corporate servers, or in the cloud, awaiting analysis. Bloomberg reportedly logs every keystroke of every employee, along with their comings and goings in the office. The Las Vegas casino Harrah’s tracks the smiles of the card dealers and waitstaff on the floor (its analytics team has quantified the impact of smiling on customer satisfaction). E‑mail, of course, presents an especially rich vein to be mined for insights about our productivity, our treatment of co-workers, our willingness to collaborate or lend a hand, our patterns of written language, and what those patterns reveal about our intelligence, social skills, and behavior.
  • people analytics will ultimately have a vastly larger impact on the economy than the algorithms that now trade on Wall Street or figure out which ads to show us. He reminded me that we’ve witnessed this kind of transformation before in the history of management science. Near the turn of the 20th century, both Frederick Taylor and Henry Ford famously paced the factory floor with stopwatches, to improve worker efficiency.
  • “The quantities of data that those earlier generations were working with,” he said, “were infinitesimal compared to what’s available now. There’s been a real sea change in the past five years, where the quantities have just grown so large—petabytes, exabytes, zetta—that you start to be able to do things you never could before.”
  • People analytics will unquestionably provide many workers with more options and more power. Gild, for example, helps companies find undervalued software programmers, working indirectly to raise those people’s pay. Other companies are doing similar work. One called Entelo, for instance, specializes in using algorithms to identify potentially unhappy programmers who might be receptive to a phone call.
  • He sees it not only as a boon to a business’s productivity and overall health but also as an important new tool that individual employees can use for self-improvement: a sort of radically expanded The 7 Habits of Highly Effective People, custom-written for each of us, or at least each type of job, in the workforce.
  • the most exotic development in people analytics today is the creation of algorithms to assess the potential of all workers, across all companies, all the time.
  • The way Gild arrives at these scores is not simple. The company’s algorithms begin by scouring the Web for any and all open-source code, and for the coders who wrote it. They evaluate the code for its simplicity, elegance, documentation, and several other factors, including the frequency with which it’s been adopted by other programmers. For code that was written for paid projects, they look at completion times and other measures of productivity. Then they look at questions and answers on social forums such as Stack Overflow, a popular destination for programmers seeking advice on challenging projects. They consider how popular a given coder’s advice is, and how widely that advice ranges. (A sketch of this style of composite scoring appears after this list.)
  • The algorithms go further still. They assess the way coders use language on social networks from LinkedIn to Twitter; the company has determined that certain phrases and words used in association with one another can distinguish expert programmers from less skilled ones. Gild knows these phrases and words are associated with good coding because it can correlate them with its evaluation of open-source code, and with the language and online behavior of programmers in good positions at prestigious companies.
  • having made those correlations, Gild can then score programmers who haven’t written open-source code at all, by analyzing the host of clues embedded in their online histories. They’re not all obvious, or easy to explain. Vivienne Ming, Gild’s chief scientist, told me that one solid predictor of strong coding is an affinity for a particular Japanese manga site.
  • Gild’s CEO, Sheeroy Desai, told me he believes his company’s approach can be applied to any occupation characterized by large, active online communities, where people post and cite individual work, ask and answer professional questions, and get feedback on projects. Graphic design is one field that the company is now looking at, and many scientific, technical, and engineering roles might also fit the bill. Regardless of their occupation, most people leave “data exhaust” in their wake, a kind of digital aura that can reveal a lot about a potential hire.
  • professionally relevant personality traits can be judged effectively merely by scanning Facebook feeds and photos. LinkedIn, of course, captures an enormous amount of professional data and network information, across just about every profession. A controversial start-up called Klout has made its mission the measurement and public scoring of people’s online social influence.
  • Mullainathan expressed amazement at how little most creative and professional workers (himself included) know about what makes them effective or ineffective in the office. Most of us can’t even say with any certainty how long we’ve spent gathering information for a given project, or our pattern of information-gathering, never mind know which parts of the pattern should be reinforced, and which jettisoned. As Mullainathan put it, we don’t know our own “production function.”
  • Over time, better job-matching technologies are likely to begin serving people directly, helping them see more clearly which jobs might suit them and which companies could use their skills. In the future, Gild plans to let programmers see their own profiles and take skills challenges to try to improve their scores. It intends to show them its estimates of their market value, too, and to recommend coursework that might allow them to raise their scores even more. Not least, it plans to make accessible the scores of typical hires at specific companies, so that software engineers can better see the profile they’d need to land a particular job
  • Knack, for its part, is making some of its video games available to anyone with a smartphone, so people can get a better sense of their strengths, and of the fields in which their strengths would be most valued. (Palo Alto High School recently adopted the games to help students assess careers.) Ultimately, the company hopes to act as matchmaker between a large network of people who play its games (or have ever played its games) and a widening roster of corporate clients, each with its own specific profile for any given type of job.
  • When I began my reporting for this story, I was worried that people analytics, if it worked at all, would only widen the divergent arcs of our professional lives, further gilding the path of the meritocratic elite from cradle to grave, and shutting out some workers more definitively. But I now believe the opposite is likely to happen, and that we’re headed toward a labor market that’s fairer to people at every stage of their careers
  • For decades, as we’ve assessed people’s potential in the professional workforce, the most important piece of data—the one that launches careers or keeps them grounded—has been educational background: typically, whether and where people went to college, and how they did there. Over the past couple of generations, colleges and universities have become the gatekeepers to a prosperous life. A degree has become a signal of intelligence and conscientiousness, one that grows stronger the more selective the school and the higher a student’s GPA, that is easily understood by employers, and that, until the advent of people analytics, was probably unrivaled in its predictive powers.
  • the limitations of that signal—the way it degrades with age, its overall imprecision, its many inherent biases, its extraordinary cost—are obvious. “Academic environments are artificial environments,” Laszlo Bock, Google’s senior vice president of people operations, told The New York Times in June. “People who succeed there are sort of finely trained, they’re conditioned to succeed in that environment,” which is often quite different from the workplace.
  • because one’s college history is such a crucial signal in our labor market, perfectly able people who simply couldn’t sit still in a classroom at the age of 16, or who didn’t have their act together at 18, or who chose not to go to graduate school at 22, routinely get left behind for good. That such early factors so profoundly affect career arcs and hiring decisions made two or three decades later is, on its face, absurd.
  • I spoke with managers at a lot of companies who are using advanced analytics to reevaluate and reshape their hiring, and nearly all of them told me that their research is leading them toward pools of candidates who didn’t attend college—for tech jobs, for high-end sales positions, for some managerial roles. In some limited cases, this is because their analytics revealed no benefit whatsoever to hiring people with college degrees; in other cases, and more often, it’s because they revealed signals that function far better than college history,
  • Google, too, is hiring a growing number of nongraduates. Many of the people I talked with reported that when it comes to high-paying and fast-track jobs, they’re reducing their preference for Ivy Leaguers and graduates of other highly selective schools.
  • This process is just beginning. Online courses are proliferating, and so are online markets that involve crowd-sourcing. Both arenas offer new opportunities for workers to build skills and showcase competence. Neither produces the kind of instantly recognizable signals of potential that a degree from a selective college, or a first job at a prestigious firm, might. That’s a problem for traditional hiring managers, because sifting through lots of small signals is so difficult and time-consuming.
  • all of these new developments raise philosophical questions. As professional performance becomes easier to measure and see, will we become slaves to our own status and potential, ever-focused on the metrics that tell us how and whether we are measuring up? Will too much knowledge about our limitations hinder achievement and stifle our dreams? All I can offer in response to these questions, ironically, is my own gut sense, which leads me to feel cautiously optimistic.
  • Google’s understanding of the promise of analytics is probably better than anybody else’s, and the company has been changing its hiring and management practices as a result of its ongoing analyses. (Brainteasers are no longer used in interviews, because they do not correlate with job success; GPA is not considered for anyone more than two years out of school, for the same reason—the list goes on.) But for all of Google’s technological enthusiasm, these same practices are still deeply human. A real, live person looks at every résumé the company receives. Hiring decisions are made by committee and are based in no small part on opinions formed during structured interviews.
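
The Knack games described in this list reduce raw gameplay telemetry to behavioral features. Below is a minimal sketch of that idea in Python; the event names, the features, and the session log are all invented for illustration and are not Knack's actual pipeline.

```python
from statistics import mean

# Hypothetical gameplay log: (seconds_since_start, event_name).
# Real systems reportedly capture megabytes of such events in a
# 20-minute session; this is a toy excerpt.
events = [
    (0.0, "start"), (1.2, "move"), (1.9, "move"), (4.8, "solve_attempt"),
    (5.1, "error"), (5.6, "solve_attempt"), (7.0, "puzzle_done"),
    (9.5, "move"), (12.2, "solve_attempt"), (13.0, "puzzle_done"),
]

def features(log):
    times = [t for t, _ in log]
    gaps = [b - a for a, b in zip(times, times[1:])]
    errors = sum(1 for _, e in log if e == "error")
    attempts = sum(1 for _, e in log if e == "solve_attempt")
    return {
        # How long the player hesitates between actions.
        "mean_hesitation": round(mean(gaps), 2),
        # How readily they retry after a mistake (error recovery).
        "retry_rate": attempts / max(errors, 1),
        # Breadth of behaviors tried, a crude proxy for exploration.
        "action_diversity": len({e for _, e in log}),
    }

print(features(events))
```
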
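The Xerox evaluation reduces an application to a single score and a red/yellow/green band. A rough sketch of that mechanic follows; the feature names, weights, and thresholds are assumptions, since the article does not disclose the real model. Note the zero weight on prior experience, echoing Xerox's finding.

```python
# Hypothetical screening model: a weighted score mapped to the
# red/yellow/green bands the article describes. All feature names,
# weights, and thresholds are invented.
WEIGHTS = {
    "personality_fit": 0.5,    # "creative but not overly inquisitive"
    "cognitive_score": 0.3,
    "scenario_judgment": 0.3,
    "network_penalty": -0.1,   # small penalty per social network past four
    "prior_experience": 0.0,   # no predictive value, per Xerox's findings
}

def screen(applicant):
    score = sum(w * applicant.get(k, 0.0) for k, w in WEIGHTS.items())
    if score >= 0.7:
        return score, "green"   # hire away
    if score >= 0.4:
        return score, "yellow"  # middling
    return score, "red"         # poor candidate

print(screen({"personality_fit": 0.9, "cognitive_score": 0.8,
              "scenario_judgment": 0.7, "network_penalty": 1,
              "prior_experience": 10}))  # -> (0.8, 'green')
```
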
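Pentland's claim that about a third of team performance can be predicted from the count of face-to-face exchanges is, at bottom, a regression and an R-squared. The data below are invented stand-ins, not badge measurements; the sketch only shows the shape of the analysis, including why a curved fit suits his finding that too many exchanges hurt as much as too few.

```python
from statistics import mean

# Invented stand-in data (not Pentland's badge measurements): daily
# face-to-face exchanges per team member vs. a team-performance rating.
exchanges   = [4, 7, 9, 12, 15, 18, 22, 25]
performance = [2.1, 3.0, 3.4, 4.1, 4.3, 3.9, 3.2, 2.8]

def r_squared(x, y):
    """Share of the variance in y explained by a least-squares line on x."""
    mx, my = mean(x), mean(y)
    beta = sum((a - mx) * (b - my) for a, b in zip(x, y)) \
         / sum((a - mx) ** 2 for a in x)
    alpha = my - beta * mx
    ss_res = sum((b - (alpha + beta * a)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    return 1 - ss_res / ss_tot

# A straight line explains little here because too many exchanges hurt
# as much as too few; regressing on squared distance from a sweet spot
# captures that inverted U.
print(f"linear fit     R^2 = {r_squared(exchanges, performance):.2f}")
print(f"inverted-U fit R^2 = "
      f"{r_squared([(e - 14) ** 2 for e in exchanges], performance):.2f}")
```
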
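Gild's coder scores combine code-quality, reputation, and language signals into one number. The sketch below guesses at that general shape; every feature name and weight is invented, as Gild's real factors are proprietary.

```python
# Hypothetical composite coder score in the spirit of Gild's approach.
# Every feature name and weight here is invented.
FEATURES = {
    "code_simplicity":  0.30,  # static quality of open-source code
    "peer_adoption":    0.25,  # how often other programmers reuse it
    "documentation":    0.15,
    "qna_reputation":   0.20,  # e.g., popularity of Stack Overflow advice
    "language_signals": 0.10,  # phrasing that correlates with skill
}

def coder_score(profile):
    """Weighted sum of signals clipped to 0-1, returned as a 0-100 score."""
    clipped = {k: min(max(profile.get(k, 0.0), 0.0), 1.0) for k in FEATURES}
    return round(100 * sum(w * clipped[k] for k, w in FEATURES.items()))

print(coder_score({"code_simplicity": 0.8, "peer_adoption": 0.6,
                   "documentation": 0.4, "qna_reputation": 0.9,
                   "language_signals": 0.7}))  # -> 70
```
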
Javier E

A scholar asks, 'Can democracy survive the Internet?' - The Washington Post - 0 views

  • Nathaniel Persily, a law professor at Stanford University
  • has written about this in a forthcoming issue of the Journal of Democracy in an article with a title that sums up his concerns: “Can Democracy Survive the Internet?”
  • Persily argues that the 2016 campaign broke down previously established rules and distinctions “between insiders and outsiders, earned media and advertising, media and non-media, legacy media and new media, news and entertainment and even foreign and domestic sources of campaign communication.”
  • ...10 more annotations...
  • Clinton played by old rules; Trump did not. He recognized the potential rewards of exploiting what the Internet offered, and he conducted his campaign through unconventional means.
  • “That’s what Donald Trump realized that a lot of us didn’t,” Persily said. “That it was more important to swamp the communication environment than it was to advocate for a particular belief or fight for the truth of a particular story,”
  • Persily notes that the Internet reacted to the Trump campaign “like an ecosystem welcoming a new and foreign species. His candidacy triggered new strategies and promoted established Internet forces. Some of these (such as the ‘alt-right’) were moved by ideological affinity, while others sought to profit financially or to further a geopolitical agenda.”
  • The rise and power of the Internet has accelerated the decline of institutions that once provided a mediating force in campaigns. Neither the legacy media nor the established political parties exercise the power they once had as referees, particularly in helping to sort out the integrity of information.
  • legacy media that once helped set the agenda for political conversation now often take their cues from new media.
  • The Internet, however, involves characteristics that heighten the disruptive and damaging influences on political campaigns. One, Persily said, is the velocity of information, the speed with which news, including fake news, moves and expands and is absorbed. Viral communication can create dysfunction in campaigns and within democracies.
  • Another factor is the pervasiveness of anonymous communication, clearly greater and more odious today. Anonymity facilitates a coarsening of speech on the Internet. It has become more and more difficult to determine the sources of such information, including whether these communications are produced by real people or by automated programs known as “bots.”
  • “the prevalence of bots in spreading propaganda and fake news appears to have reached new heights. One study found that between 16 September and 21 October 2016, bots produced about a fifth of all tweets related to the upcoming election. Across all three presidential debates, pro-Trump twitter bots generated about four times as many tweets as pro-Clinton bots. During the final debate in particular, that figure rose to seven times as many.” (The arithmetic behind these figures is sketched after this list.)
  • the fear of dark money and “shady outsiders” running television commercials “seems quaint when compared to networks of thousands of bots of uncertain geographic origin creating automated messages designed to malign candidates and misinform voters.”
  • When asked how worrisome all this is, Persily said, “I’m extremely concerned.” He was quick to say he did not believe government should or even could regulate this new environment. But, he said, “We need to come to grips with how the new communication environment affects people’s political beliefs, the information they receive and then the choices that they make.”
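
The bot figures Persily cites are shares and ratios over labeled tweet counts. The counts below are invented, chosen only to be consistent with the quoted proportions (about a fifth of tweets bot-generated; pro-Trump bot output roughly four times pro-Clinton's):

```python
# Invented counts consistent with the proportions quoted in the study;
# the actual dataset sizes are not given in the article.
total_election_tweets  = 10_000_000
bot_tweets             = 2_000_000   # "about a fifth of all tweets"
pro_trump_bot_tweets   = 1_200_000
pro_clinton_bot_tweets = 300_000     # roughly the 4:1 ratio cited

print(f"bot share of all tweets: {bot_tweets / total_election_tweets:.0%}")
print(f"pro-Trump vs pro-Clinton bot output: "
      f"{pro_trump_bot_tweets / pro_clinton_bot_tweets:.0f}x")
```
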
Javier E

Thieves of experience: On the rise of surveillance capitalism - 1 views

  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • ...72 more annotations...
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience.To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
  • Social skills and relationships seem to suffer as well.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Google’s once-patient investors grew restive, demanding that the founders figure out a way to make money, preferably lots of it.
  • Under pressure, Page and Brin authorized the launch of an auction system for selling advertisements tied to search queries. The system was designed so that the company would get paid by an advertiser only when a user clicked on an ad. This feature gave Google a huge financial incentive to make accurate predictions about how users would respond to ads and other online content. Even tiny increases in click rates would bring big gains in income. And so the company began deploying its stores of behavioral data not for the benefit of users but to aid advertisers — and to juice its own profits. Surveillance capitalism had arrived. (A sketch of the expected-revenue logic this created appears after this list.)
  • Google’s business now hinged on what Zuboff calls “the extraction imperative.” To improve its predictions, it had to mine as much information as possible from web users. It aggressively expanded its online services to widen the scope of its surveillance.
  • Through Gmail, it secured access to the contents of people’s emails and address books. Through Google Maps, it gained a bead on people’s whereabouts and movements. Through Google Calendar, it learned what people were doing at different moments during the day and whom they were doing it with. Through Google News, it got a readout of people’s interests and political leanings. Through Google Shopping, it opened a window onto people’s wish lists,
  • The company gave all these services away for free to ensure they’d be used by as many people as possible. It knew the money lay in the data.
  • the organization grew insular and secretive. Seeking to keep the true nature of its work from the public, it adopted what its CEO at the time, Eric Schmidt, called a “hiding strategy” — a kind of corporate omerta backed up by stringent nondisclosure agreements.
  • Page and Brin further shielded themselves from outside oversight by establishing a stock structure that guaranteed their power could never be challenged, neither by investors nor by directors.
  • What’s most remarkable about the birth of surveillance capitalism is the speed and audacity with which Google overturned social conventions and norms about data and privacy. Without permission, without compensation, and with little in the way of resistance, the company seized and declared ownership over everyone’s information
  • The companies that followed Google presumed that they too had an unfettered right to collect, parse, and sell personal data in pretty much any way they pleased. In the smart homes being built today, it’s understood that any and all data will be beamed up to corporate clouds.
  • Google conducted its great data heist under the cover of novelty. The web was an exciting frontier — something new in the world — and few people understood or cared about what they were revealing as they searched and surfed. In those innocent days, data was there for the taking, and Google took it
  • Google also benefited from decisions made by lawmakers, regulators, and judges — decisions that granted internet companies free use of a vast taxpayer-funded communication infrastructure, relieved them of legal and ethical responsibility for the information and messages they distributed, and gave them carte blanche to collect and exploit user data.
  • Consider the terms-of-service agreements that govern the division of rights and the delegation of ownership online. Non-negotiable, subject to emendation and extension at the company’s whim, and requiring only a casual click to bind the user, TOS agreements are parodies of contracts, yet they have been granted legal legitimacy by the courts.
  • Law professors, writes Zuboff, “call these ‘contracts of adhesion’ because they impose take-it-or-leave-it conditions on users that stick to them whether they like it or not.” Fundamentally undemocratic, the ubiquitous agreements helped Google and other firms commandeer personal data as if by fiat.
  • In the choices we make as consumers and private citizens, we have always traded some of our autonomy to gain other rewards. Many people, it seems clear, experience surveillance capitalism less as a prison, where their agency is restricted in a noxious way, than as an all-inclusive resort, where their agency is restricted in a pleasing way.
  • Zuboff makes a convincing case that this is a short-sighted and dangerous view — that the bargain we’ve struck with the internet giants is a Faustian one
  • but her case would have been stronger still had she more fully addressed the benefits side of the ledger.
  • there’s a piece missing. While Zuboff’s assessment of the costs that people incur under surveillance capitalism is exhaustive, she largely ignores the benefits people receive in return — convenience, customization, savings, entertainment, social connection, and so on
  • What the industries of the future will seek to manufacture is the self.
  • Behavior modification is the thread that ties today’s search engines, social networks, and smartphone trackers to tomorrow’s facial-recognition systems, emotion-detection sensors, and artificial-intelligence bots.
  • All of Facebook’s information wrangling and algorithmic fine-tuning, she writes, “is aimed at solving one problem: how and when to intervene in the state of play that is your daily life in order to modify your behavior and thus sharply increase the predictability of your actions now, soon, and later.”
  • “The goal of everything we do is to change people’s actual behavior at scale,” a top Silicon Valley data scientist told her in an interview. “We can test how actionable our cues are for them and how profitable certain behaviors are for us.”
  • This goal, she suggests, is not limited to Facebook. It is coming to guide much of the economy, as financial and social power shifts to the surveillance capitalists
  • Combining rich information on individuals’ behavioral triggers with the ability to deliver precisely tailored and timed messages turns out to be a recipe for behavior modification on an unprecedented scale.
  • it was Facebook, with its incredibly detailed data on people’s social lives, that grasped digital media’s full potential for behavior modification. By using what it called its “social graph” to map the intentions, desires, and interactions of literally billions of individuals, it saw that it could turn its network into a worldwide Skinner box, employing psychological triggers and rewards to program not only what people see but how they react.
  • spying on the populace is not the end game. The real prize lies in figuring out ways to use the data to shape how people think and act. “The best way to predict the future is to invent it,” the computer scientist Alan Kay once observed. And the best way to predict behavior is to script it.
  • competition for personal data intensified. It was no longer enough to monitor people online; making better predictions required that surveillance be extended into homes, stores, schools, workplaces, and the public squares of cities and towns. Much of the recent innovation in the tech industry has entailed the creation of products and services designed to vacuum up data from every corner of our lives
  • “The typical complaint is that privacy is eroded, but that is misleading,” Zuboff writes. “In the larger societal pattern, privacy is not eroded but redistributed . . . . Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism.” The transfer of decision rights is also a transfer of autonomy and agency, from the citizen to the corporation.
  • What we lose under this regime is something more fundamental than privacy. It’s the right to make our own decisions about privacy — to draw our own lines between those aspects of our lives we are comfortable sharing and those we are not
  • Other possible ways of organizing online markets, such as through paid subscriptions for apps and services, never even got a chance to be tested.
  • Online surveillance came to be viewed as normal and even necessary by politicians, government bureaucrats, and the general public
  • Google and other Silicon Valley companies benefited directly from the government’s new stress on digital surveillance. They earned millions through contracts to share their data collection and analysis techniques with the National Security Agency.
  • As much as the dot-com crash, the horrors of 9/11 set the stage for the rise of surveillance capitalism. Zuboff notes that, in 2000, members of the Federal Trade Commission, frustrated by internet companies’ lack of progress in adopting privacy protections, began formulating legislation to secure people’s control over their online information and severely restrict the companies’ ability to collect and store it. It seemed obvious to the regulators that ownership of personal data should by default lie in the hands of private citizens, not corporations.
  • The 9/11 attacks changed the calculus. The centralized collection and analysis of online data, on a vast scale, came to be seen as essential to national security. “The privacy provisions debated just months earlier vanished from the conversation more or less overnight,”
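
The pay-per-click auction described in this list turns click prediction directly into revenue: each impression is worth roughly bid times predicted click-through rate, so better predictions mean more money. A minimal sketch of that ranking logic follows; it is a simplification (real systems such as Google's layer on second-price mechanics and many more signals), and the numbers are illustrative.

```python
# Rank ads by expected revenue per impression: bid * predicted
# click-through rate (pCTR). Bids and pCTRs here are illustrative only.
ads = [
    {"advertiser": "A", "bid": 2.00, "pctr": 0.010},
    {"advertiser": "B", "bid": 0.80, "pctr": 0.040},
    {"advertiser": "C", "bid": 1.50, "pctr": 0.015},
]

for ad in sorted(ads, key=lambda a: a["bid"] * a["pctr"], reverse=True):
    ev = ad["bid"] * ad["pctr"]
    print(f"{ad['advertiser']}: expected revenue per impression = ${ev:.4f}")

# Note that B wins despite the lowest bid, purely on predicted clicks.
# Even a small lift in pCTR raises expected revenue linearly, which is
# the incentive behind what Zuboff calls the "extraction imperative".
```
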
cvanderloo

St Patrick's day: why so many US presidents like to say 'I'm Irish' - 0 views

  • Biden is the most strongly identified Irish-American in the White House since John F Kennedy, the only other Catholic president.
  • Irish nationalist sentiments run high in the US, especially among its large diaspora. US presidents frequently indulge these views, at least symbolically. But, in practical terms, they have had little impact on the US-UK relationship.
  • More than 30 million people in the US – about one in ten Americans – identify as “Irish”.
  • ...6 more annotations...
  • there are over six times as many people in the US who claim to be Irish as there are living in the Republic of Ireland itself. (The arithmetic is sketched after this list.)
  • If measured by when their last ancestor left Ireland, Joe Biden is no more Irish than Barack Obama
  • Perhaps the most dramatic example of this was shown by Jimmy Carter, who – on St Patrick’s Day 1976 – marched down Fifth Avenue in New York wearing a badge emblazoned with the slogan “England, get out of Ireland”.
  • With Donald Trump being the exception, nearly every president of the last half-century has identified as “Irish”, even when the evidence of such a link has been tenuous.
  • In spite of this, US presidential administrations have sought a more balanced approach. The US considers the UK to be one of its most valuable and important strategic partners. US presidents work closely with British governments, while also offering symbolic affirmation for Ireland.
  • While Biden’s personal affinities are clear, we should expect him to follow his predecessors in placing US security interests before Irish nationalist affections.
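
The "six times" claim is simple division. The article gives only the US figure, so the Republic of Ireland's population below (about 4.9 million around the time of the piece) is an assumption supplied here:

```python
# "More than 30 million" Irish-identifying Americans (from the article)
# divided by the Republic of Ireland's population (assumed ~4.9 million).
irish_identifying_americans = 30_000_000
republic_of_ireland_population = 4_900_000

ratio = irish_identifying_americans / republic_of_ireland_population
print(f"{ratio:.1f}x")  # about 6.1x, i.e., "over six times as many"
```
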
katherineharron

How coronavirus hypocrisy is tarnishing Boris Johnson's government (opinion) - CNN - 0 views

  • Johnson has proved staunch in his defense of his close ally since the latter was accused of breaking the UK's strict lockdown by driving 260 miles with his wife, who he admits was displaying some symptoms of coronavirus, and young son to be near his extended family.
  • In quarantine-fatigued Britain, however, where many have agonized over the command to stay away from frightened, sick and dying relatives, the Prime Minister's words have not gone down well. Highly unusually, several of his own Conservative MPs are now calling for Cummings to be sacked, and even the government-friendly Daily Mail asked: "What Planet Are They On?" of his decision to stand by his man.
  • In one of the more moving responses, Helen Goodman, until December the Labour Party MP for Durham, the northern town Cummings visited to stay in a property belonging to his parents, said she was "appalled" by his behavior, given her own father had died alone from Covid-19 in a local care home after she obeyed the rules and did not visit.
  • ...10 more annotations...
  • Saying he had no regrets, he added: "I believe in all circumstances I behaved reasonably and legally. The legal rules do not inevitably cover all circumstances - including those I found myself in." Also on Monday, Johnson expressed "regret" for the "confusion, anger and pain" experienced by the British people as a result of the controversy; when pressed on whether he believes Cummings' decision has compromised the government's coronavirus message, Johnson doubled down on his support for Cummings, asserting, "I do not believe that anybody at Number 10 has done anything to undermine our message."
  • "The regulations made clear, I believe, that risks to the health of a small child were an exceptional situation."
  • To talk of the British sense of fair play is almost a cliché. But there is certainly a particular sensitivity among Britons to suggestions of hypocrisy which have thus far thwarted Cummings' attempts to brush off criticism of his excursion, and which contrast with, say, the relative lack of fuss in the US over the revelation that Ivanka Trump, President Donald Trump's daughter, traveled from Washington DC to New Jersey to celebrate Passover last month.
  • A controversial figure who relishes his role as an outsider, he also has a common touch when it comes to distilling a message with a brilliance complemented by Johnson's own flair for capturing the national mood. So while it was Johnson, then-Mayor of London, who in 2016 sensed an appetite for leaving the EU which his more senior colleagues missed, it was Cummings, head of the Vote Leave Campaign, who boiled it down to the simple and devastatingly effective slogan of "Take Back Control."
  • For a man known for his gregarious nature, the British Prime Minister has few close political friends; his inexperienced cabinet was appointed as much for their loyalty and support for his key policy of leading the UK out of the European Union as any long-term affinity with Johnson.
  • At the start of the lockdown, Dr. Catherine Calderwood, Scotland's Chief Medical Officer, fell on her sword after admitting two overnight visits at her seaside holiday cottage, having fronted the campaign urging Scots to stay home. Though Calderwood apologized for her actions and initially said she planned to stay on in her post, she later released a statement announcing that she had quit, acknowledging that the "justifiable focus" on her actions could pose a distraction to the response to the pandemic.
  • As senior adviser since summer 2019 when Johnson became Prime Minister, the notoriously prickly Cummings has rubbed many Downing Street denizens the wrong way. But when coronavirus hit, it was he who crafted the message, "Stay Home, Protect the NHS, Save Lives," which has come to define Britain's battle against the virus and the protective shield the country threw around its beloved health service.
  • Johnson, hitherto wildly popular, has seen his favorability ratings begin to slip, while a recent poll by YouGov found 49% disapproved of the Prime Minister's path out of lockdown compared to 36% who supported it.
  • The former Chief Constable of Durham Police, Mike Barton, has warned that Cummings' behavior, and the Prime Minister's defense of it, will make attempts to enforce the lockdown impossible, potentially endangering the slow but steady progress the UK has made in reducing the spread of the virus.
  • The consequences could be even more serious if a mass loss of faith in both the Johnson government and his lockdown results in the public breaking the rules just at the moment the Prime Minister is urging them to stand firm.
blythewallick

Recognizing Strangers | Psychology Today - 0 views

  • Benoit Monin, assistant professor of psychology at Stanford University, showed college students 80 photos of faces, then asked them which ones they recognized from among the 40 they'd seen in an earlier session. The more attractive the photo (as rated by another group of students) the more likely it was to be recognized—regardless of whether the face had been seen before.
  • "The face's attractiveness actually changes your perception of your past," in this case, the perception of whether you've seen the face before. The shortcut may lead to errors, but it may also help us manage our busy lives, says Monin. "We tend to like familiar things, so it makes perfect sense that over time we would use liking as a clue to familiarity."
  • In what he calls the "warm-glow heuristic," people consider their affinity for a specific person or place as an indicator of familiarity. As with other mental shortcuts, people resort to this heuristic when they lack enough data on which to base their decisions.
  • ...1 more annotation...
  • In a second session, he showed them an entirely new set of words and asked which words were familiar from the earlier, bogus session. Subjects were more likely to think they'd seen positive words—such as "charm" and "glory"—than either negative or neutral words that appear with the same frequency in English.
Javier E

The Philosopher Redefining Equality | The New Yorker - 0 views

  • The bank experience showed how you could be oppressed by hierarchy, working in an environment where you were neither free nor equal. But this implied that freedom and equality were bound together in some way beyond the basic state of being unenslaved, which was an unorthodox notion. Much social thought is rooted in the idea of a conflict between the two.
  • If individuals exercise freedoms, conservatives like to say, some inequalities will naturally result. Those on the left basically agree—and thus allow constraints on personal freedom in order to reduce inequality. The philosopher Isaiah Berlin called the opposition between equality and freedom an “intrinsic, irremovable element in human life.” It is our fate as a society, he believed, to haggle toward a balance between them.
  • What if they weren’t opposed, Anderson wondered, but, like the sugar-phosphate chains in DNA, interlaced in a structure that we might not yet understand?
  • ...54 more annotations...
  • At fifty-nine, Anderson is the chair of the University of Michigan’s department of philosophy and a champion of the view that equality and freedom are mutually dependent, enmeshed in changing conditions through time.
  • She has built a case, elaborated across decades, that equality is the basis for a free society
  • Because she brings together ideas from both the left and the right to battle increasing inequality, Anderson may be the philosopher best suited to this awkward moment in American life. She builds a democratic frame for a society in which people come from different places and are predisposed to disagree.
  • she sketched out the entry-level idea that one basic way to expand equality is by expanding the range of valued fields within a society.
  • “The ability not to have an identity that one carries from sphere to sphere but, rather, to be able to slip in and adopt whatever values and norms are appropriate while retaining one’s identities in other domains?” She paused. “That is what it is to be free.”
  • How do you move from a basic model of egalitarian variety, in which everybody gets a crack at being a star at something, to figuring out how to respond to a complex one, where people, with different allotments of talent and virtue, get unequal starts, and often meet with different constraints along the way?
  • The problem, she proposed, was that contemporary egalitarian thinkers had grown fixated on distribution: moving resources from lucky-seeming people to unlucky-seeming people, as if trying to spread the luck around.
  • Egalitarians should agree about clear cases of blameless misfortune: the quadriplegic child, the cognitively impaired adult, the teen-ager born into poverty with junkie parents. But Anderson balked there, too. By categorizing people as lucky or unlucky, she argued, these egalitarians set up a moralizing hierarchy.
  • In Anderson’s view, the way forward was to shift from distributive equality to what she called relational, or democratic, equality: meeting as equals, regardless of where you were coming from or going to.
  • By letting the lucky class go on reaping the market’s chancy rewards while asking others to concede inferior status in order to receive a drip-drip-drip of redistributive aid, these egalitarians were actually entrenching people’s status as superior or subordinate.
  • To the ugly and socially awkward: . . . Maybe you won’t be such a loser in love once potential dates see how rich you are.
  • To the stupid and untalented: Unfortunately, other people don’t value what little you have to offer in the system of production. . . . Because of the misfortune that you were born so poorly endowed with talents, we productive ones will make it up to you: we’ll let you share in the bounty of what we have produced with our vastly superior and highly valued abilities. . . 
  • she imagined some citizens getting a state check and a bureaucratic letter:
  • This was, at heart, an exercise of freedom. The trouble was that many people, picking up on libertarian misconceptions, thought of freedom only in the frame of their own actions.
  • To be truly free, in Anderson’s assessment, members of a society had to be able to function as human beings (requiring food, shelter, medical care), to participate in production (education, fair-value pay, entrepreneurial opportunity), to execute their role as citizens (freedom to speak and to vote), and to move through civil society (parks, restaurants, workplaces, markets, and all the rest).
  • Anderson’s democratic model shifted the remit of egalitarianism from the idea of equalizing wealth to the idea that people should be equally free, regardless of their differences.
  • A society in which everyone had the same material benefits could still be unequal, in this crucial sense; democratic equality, being predicated on equal respect, wasn’t something you could simply tax into existence. “People, not nature, are responsible for turning the natural diversity of human beings into oppressive hierarchies,”
  • Her first book, “Value in Ethics and Economics,” appeared in 1993, announcing one of her major projects: reconciling value (an amorphous ascription of worth that is a keystone of ethics and economics) with pluralism (the fact that people seem to value things in different ways).
  • Philosophers have often assumed that pluralistic value reflects human fuzziness—we’re loose, we’re confused, and we mix rational thought with sentimental responses.
  • She offered an “expressive” theory: in her view, each person’s values could be various because they were socially expressed, and thus shaped by the range of contexts and relationships at play in a life. Instead of positing value as a basic, abstract quality across society (the way “utility” functioned for economists), she saw value as something determined by the details of an individual’s history.
  • Like her idea of relational equality, this model resisted the temptation to flatten human variety toward a unifying standard. In doing so, it helped expand the realm of free and reasoned economic choice.
  • Anderson’s model unseated the premises of rational-choice theory, in which individuals invariably make utility-maximizing decisions, occasionally in heartless-seeming ways. It ran with, rather than against, moral intuition. Because values were plural, it was perfectly rational to choose to spend evenings with your family, say, and have guilt toward the people you left in the lurch at work.
  • The theory also pointed out the limits on free-market ideologies, such as libertarianism.
  • In ethics, it broke across old factional debates. The core idea “has been picked up on by people across quite a range of positions,” Peter Railton, one of Anderson’s longtime colleagues, says. “Kantians and consequentialists alike”—people who viewed morality in terms of duties and obligations, and those who measured the morality of actions by their effects in the world—“could look at it and see something important.”
  • Traditionally, the discipline is taught through a-priori thought—you start with basic principles and reason forward. Anderson, by contrast, sought to work empirically, using information gathered from the world, identifying problems to be solved not abstractly but through the experienced problems of real people.
  • “Dewey argued that the primary problems for ethics in the modern world concerned the ways society ought to be organized, rather than personal decisions of the individual,”
  • In 2004, the Stanford Encyclopedia of Philosophy asked Anderson to compose its entry on the moral philosophy of John Dewey, who helped carry pragmatist methods into the social realm. Dewey had an idea of democracy as a system of good habits that began in civil life. He was an anti-ideologue with an eye for pluralism.
  • She started working with historians, trying to hone her understanding of ideas by studying them in the context of their creation. Take Rousseau’s apparent support of direct democracy. It’s rarely mentioned that, at the moment when he made that argument, his home town of Geneva had been taken over by oligarchs who claimed to represent the public. Pragmatism said that an idea was an instrument, which naturally gave rise to such questions as: an instrument for what, and where, and when?
  • In “What Is the Point of Equality?,” Anderson had already started to drift away from what philosophers, following Rawls, call ideal theory, based on an end vision for a perfectly just society. As Anderson began a serious study of race in America, though, she found herself losing faith in that approach entirely.
  • Broadly, there’s a culturally right and a culturally left ideal theory for race and society. The rightist version calls for color blindness. Instead of making a fuss about skin and ethnicity, its advocates say, society should treat people as people, and let the best and the hardest working rise.
  • The leftist theory envisions identity communities: for once, give black people (or women, or members of other historically oppressed groups) the resources and opportunities they need, including, if they want it, civil infrastructure for themselves.
  • In “The Imperative of Integration,” published in 2010, Anderson tore apart both of these models. Sure, it might be nice to live in a color-blind society, she wrote, but that’s nothing like the one that exists.
  • But the case for self-segregation was also weak. Affinity groups provided welcome comfort, yet that wasn’t the same as power or equality, Anderson pointed out. And there was a goose-and-gander problem. Either you let only certain groups self-segregate (certifying their subordinate status) or you also permitted, say, white men to do it,
  • Anderson’s solution was “integration,” a concept that, especially in progressive circles, had been uncool since the late sixties. Integration, by her lights, meant mixing on the basis of equality.
  • in attending to these empirical findings over doctrine, she announced herself as a non-ideal theorist: a philosopher with no end vision of society. The approach recalls E. L. Doctorow’s description of driving at night: “You can see only as far as the headlights, but you can make the whole trip that way.”
  • For others, though, a white woman making recommendations on race policy raised questions of perspective. She was engaging through a mostly white Anglo-American tradition. She worked from the premise that, because she drew on folders full of studies, the limits of her own perspective were not constraining.
  • Some philosophers of color welcomed the book. “She’s taking the need for racial justice seriously, and you could hardly find another white political philosopher over a period of decades doing that,”
  • Recently, Anderson changed the way she assigns undergraduate essays: instead of requiring students to argue a position and fend off objections, doubling down on their original beliefs, she asks them to discuss their position with someone who disagrees, and to explain how and why, if at all, the discussion changed their views.
  • The challenge of pluralism is the challenge of modern society: maintaining equality amid difference in a culture given to constant and unpredictable change.
  • Rather than fighting for the ascendancy of certain positions, Anderson suggests, citizens should fight to bolster healthy institutions and systems—those which insure that all views and experiences will be heard. Today’s righteous projects, after all, will inevitably seem fatuous and blinkered from the vantage of another age.
  • Anderson zeroed in on Adam Smith, whose “The Wealth of Nations,” published in 1776, is taken as a keystone of free-market ideology. At the time, English labor was subject to uncompensated apprenticeships, domestic servitude, and some measure of clerical dominion.
  • Smith saw the markets as an escape from that order. Their “most important” function, he explained, was to bring “liberty and security” to those “who had before lived almost in a continual state of war with their neighbours, and of servile dependency upon their superiors.”
  • Smith, in other words, was an egalitarian. He had written “The Wealth of Nations” in no small part as a solution to what we’d now call structural inequality—the intractable, compounding privileges of an arbitrary hierarchy.
  • It was a historical irony that, a century later, writers such as Marx pointed to the market as a structure of dominion over workers; in truth, Smith and Marx had shared a socioeconomic project. And yet Marx had not been wrong to trash Smith’s ideas, because, during the time between them, the world around Smith’s model had changed, and it was no longer a useful tool.
  • Images of free market society that made sense prior to the Industrial Revolution continue to circulate today as ideals, blind to the gross mismatch between the background social assumptions reigning in the seventeenth and eighteenth centuries, and today’s institutional realities. We are told that our choice is between free markets and state control, when most adults live their working lives under a third thing entirely: private government.
  • Today, people still try to use, variously, both Smith’s and Marx’s tools on a different, postindustrial world:
  • The unnaturalness of this top-heavy arrangement, combined with growing evidence of power abuses, has given many people reason to believe that something is fishy about the structure of American equality. Socialist and anti-capitalist models are again in vogue.
  • Anderson offers a different corrective path. She thinks it’s fine for some people to earn more than others. If you’re a brilliant potter, and people want to pay you more than the next guy for your pottery, great!
  • The problem isn’t that talent and income are distributed in unequal parcels. The problem is that Jeff Bezos earns more than a hundred thousand dollars a minute, while Amazon warehouse employees, many talented and hardworking, have reportedly resorted to urinating in bottles in lieu of a bathroom break. That circumstance reflects some structure of hierarchical oppression. It is a rip in the democratic fabric, and it’s increasingly the norm.
  • Andersonism holds that we don’t have to give up on market society if we can recognize and correct for its limitations—it may even be our best hope, because it’s friendlier to pluralism than most alternatives are.
  • we must be flexible. We must remain alert. We must solve problems collaboratively, in the moment, using society’s ears and eyes and the best tools that we can find.
  • “You can see that, from about 1950 to 1970, the typical American’s wages kept up with productivity growth,” she said. Then, around 1974, she went on, hourly compensation stagnated. American wages have been effectively flat for the past few decades, with the gains of productivity increasingly going to shareholders and to salaries for big bosses.
  • What changed? Anderson rattled off a constellation of factors, from strengthened intellectual-property law to winnowed antitrust law. Financialization, deregulation. Plummeting taxes on capital alongside rising payroll taxes. Privatization, which exchanged modest public-sector salaries for C.E.O. paydays. She gazed into the audience and blinked. “So now we have to ask: What has been used to justify this rather dramatic shift of labor-share of income?”
  • It was no wonder that industrial-age thinking was riddled with contradictions: it reflected what Anderson called “the plutocratic reversal” of classical liberal ideas. Those perversely reversed ideas about freedom were the ones that found a home in U.S. policy, and, well, here we were.
Javier E

Opinion | Richard Powers on What We Can Learn From Trees - The New York Times - 0 views

  • Theo and Robin have a nightly ritual where they say a prayer that Alyssa, the deceased wife and mother, taught them: May all sentient beings be free from needless suffering. That prayer itself comes from the four immeasurables in the Buddhist tradition.
  • When we enter into or recover this sense of kinship that was absolutely fundamental to so many indigenous cultures everywhere around the world at many, many different points in history, that there is no radical break between us and our kin, that even consciousness is shared, to some degree and to a large degree, with a lot of other creatures, then death stops seeming like the enemy and it starts seeming like one of the most ingenious kinds of design for keeping evolution circulating and keeping the experiment running and recombining.
  • Look, I’m 64 years old. I can remember sitting in psychology class as an undergraduate and having my professor declare that no, of course animals don’t have emotions because they don’t have an internal life. They don’t have conscious awareness. And so what looks to you like your dog being extremely happy or being extremely guilty, which dogs do so beautifully, is just your projection, your anthropomorphizing of those other creatures. And this prohibition against anthropomorphism created an artificial gulf between even those animals that are ridiculously near of kin to us, genetically.
  • ...62 more annotations...
  • I don’t know if that sounds too complicated. But the point is, it’s not just giving up domination. It’s giving up this sense of separateness in favor of a sense of kinship. And those people who do often wonder how they failed to see how much continuity there is in the more-than-human world with the human world.
  • to go from terror into being and into that sense that the experiment is sacred, not this one outcome of the experiment, is to immediately transform the way that you think even about very fundamental social and economic and cultural things. If the experiment is sacred, how can we possibly justify our food systems, for instance?
  • when I first went to the Smokies and hiked up into the old growth in the Southern Appalachians, it was like somebody threw a switch. There was some odd filter that had just been removed, and the world sounded different and smelled different.
  • Richard Powers: Yeah. In human exceptionalism, we may be completely aware of evolutionary continuity. We may understand that we have a literal kinship with the rest of creation, that all life on Earth employs the same genetic code, that there is a very small core of core genes and core proteins that is shared across all the kingdoms and phyla of life. But conceptually, we still have this demented idea that somehow consciousness creates a sanctity and a separation that almost nullifies the continuous elements of evolution and biology that we’ve come to understand.
  • if we want to begin this process of rehabilitation and transformation of consciousness that we are going to need in order to become part of the living Earth, it is going to be other kinds of minds that give us that clarity and strength and diversity and alternative way of thinking that could free us from this stranglehold of thought that looks only to the maximizing return on investment in very leverageable ways.
  • Richard Powers: It amazed me to get to the end of the first draft of “Bewilderment” and to realize how much Buddhism was in the book, from the simplest things.
  • I think there is nothing more science inflected than being out in the living world and the more-than-human world and trying to understand what’s happening.
  • And of course, we can combine this with what we were talking about earlier with death. If we see all of evolution as somehow leading up to us, all of human, cultural evolution leading up to neoliberalism and here we are just busily trying to accumulate and make meaning for ourselves, death becomes the enemy.
  • And you’re making the point in different ways throughout the book that it is the minds we think of as unusual, that we would diagnose as having some kind of problem or dysfunction, that are, in some cases, the only ones responding to the moment in the common-sense way it deserves. It is almost everybody else’s brain that has been broken.
  • it isn’t surprising. If you think of the characteristics of this dominant culture that we’ve been talking about — the fixation on control, the fixation on mastery, the fixation on management and accumulation and the resistance of decay — it isn’t surprising that that culture is also threatened by difference and divergence. It seeks out old, stable hierarchies — clear hierarchies — of control, and anything that’s not quite exploitable or leverageable in the way that the normal is becomes terrifying and threatening.
  • And the more I looked for it, the more it pervaded the book.
  • Ezra Klein: I’ve heard you say that it has changed the way you measure a good day. Can you tell me about that? Richard Powers: That’s true. I suppose when I was still enthralled to commodity-mediated individualist market-driven human exceptionalism — we need a single word for this
  • And since moving to the Smokies and since publishing “The Overstory,” my days have been entirely inverted. I wake up, I go to the window, and I look outside. Or I step out onto the deck — if I haven’t been sleeping on the deck, which I try to do as much as I can in the course of the year — and see what’s in the air, gauge the temperature and the humidity and the wind and see what season it is and ask myself, you know, what’s happening out there now at 1,700 feet or 4,000 feet or 5,000 feet.
  • let me talk specifically about the work of a scientist who has herself just recently published a book. It’s Dr. Suzanne Simard, and the book is “Finding the Mother Tree.” Simard has been instrumental in a revolution in our way of thinking about what’s happening underground at the root level in a forest.
  • it was a moving moment for me, as an easterner, to stand up there and to say, this is what an eastern forest looks like. This is what a healthy, fully-functioning forest looks like. And I’m 56 years old, and I’d never seen it.
  • the other topics of that culture tend to circle back around these sorts of trends: human fascinations, ways of magnifying our throw weight and our abilities, of removing the last constraints on our desires and, in particular, of eliminating the single greatest enemy of meaning in the culture of the technological sublime, which is itself such a strong instance of the culture of human separatism and commodity-mediated individualist capitalism — that is to say, the removal of death.
  • Why is it that we have known about the crisis of species extinction for at least half a century and longer? And I mean the lay public, not just scientists. But why has this been general knowledge for a long time without public will demanding some kind of action or change
  • And when you make kinship beyond yourself, your sense of meaning gravitates outwards into that reciprocal relationship, into that interdependence. And you know, it’s a little bit like scales falling off your eyes. When you do turn that corner, all of the sources of anxiety that are so present and so deeply internalized become much more identifiable. And my own sense of hope and fear gets a much larger frame of reference to operate in.
  • I think, for most of my life, until I did kind of wake up to forests and to trees, I shared — without really understanding this as a kind of concession or a kind of subscription — I did share this cultural consensus that meaning is a private thing that we do for ourselves and by ourselves and that our kind of general sense of the discoveries of the 19th and 20th century have left us feeling a bit unsponsored and adrift beyond the accident of human existence.
  • The largest single influence on any human being’s mode of thought is other human beings. So if you are surrounded by lots of terrified but wishful-thinking people who want to believe that somehow the cavalry is going to come at the last minute and that we don’t really have to look inwards and change our belief in where meaning comes from, that we will somehow be able to get over the finish line with all our stuff and that we’ll avert this disaster, as we have other kinds of disasters in the past.
  • I think what was happening to me at that time, as I was turning outward and starting to take the non-human world seriously, is my sense of meaning was shifting from something that was entirely about me and authored by me outward into this more collaborative, reciprocal, interdependent, exterior place that involved not just me but all of these other ways of being that I could make kinship with.
  • And I think I was right along with that sense that somehow we are a thing apart. We can make purpose and make meaning completely arbitrarily. It consists mostly of trying to be more in yourself, of accumulating in one form or another.
  • I can’t really be out for more than two or three miles before my head just fills with associations and ideas and scenes and character sketches. And I usually have to rush back home to keep it all in my head long enough to get it down on paper.
  • for my journey, the way to characterize this transition is from being fascinated with technologies of mastery and control and what they’re doing to us as human beings, how they’re changing what the capacities and affordances of humanity are and how we narrate ourselves, to being fascinated with technologies and sciences of interdependence and cooperation, of those sciences that increase our sense of kinship and being one of many, many neighbors.
  • And that’s an almost impossible persuasion to rouse yourself from if you don’t have allies. And I think the one hopeful thing about the present is the number of people trying to challenge that consensual understanding and break away into a new way of looking at human standing is growing.
  • And when you do subscribe to a culture like that and you are confronted with the reality of your own mortality, as I was when I was living in Stanford, that sense of stockpiling personal meaning starts to feel a little bit pointless.
  • And I just head out. I head out based on what the day has to offer. And to have that come first has really changed not only how I write, but what I’ve been writing. And I think it really shows in “Bewilderment.” It’s a totally different kind of book from my previous 12.
  • the marvelous thing about the work, which continues to get more sophisticated and continues to turn up newer and newer astonishments, is that there was odd kind of reciprocal interdependence and cooperation across the species barrier, that Douglas firs and birches were actually involved in these sharing back and forth of essential nutrients. And that’s a whole new way of looking at forest.
  • she began to see that the forests were actually wired up in very complex and identifiable ways and that there was an enormous system of resource sharing going on underground, that trees were sharing not only sugars and the hydrocarbons necessary for survival, but also secondary metabolites. And these were being passed back and forth, both symbiotically between the trees and the fungi, but also across the network to other trees so that there were actually trees in wired up, fungally-connected forests where large, dominant, healthy trees were subsidizing, as it were, trees that were injured or not in favorable positions or damaged in some way or just failing to thrive.
  • so when I was still pretty much a card-carrying member of that culture, I had this sense that to become a better person and to get ahead and to really make more of myself, I had to be as productive as possible. And that meant waking up every morning and getting 1,000 words that I was proud of. And it’s interesting that I would even settle on a quantitative target. That’s very typical for that kind of mindset that I’m talking about — 1,000 words and then you’re free, and then you can do what you want with the day.
  • there will be a threshold, as there have been for these other great social transformations that we’ve witnessed in the last couple of decades where somehow it goes from an outsider position to absolutely mainstream and common sense.
  • I am persuaded by those scholars who have showed the degree to which the concept of nature is itself an artificial construction that’s born of cultures of human separatism. I believe that everything that life does is part of the living enterprise, and that includes the construction of cities. And there is no question at all the warning that you just gave about nostalgia creating a false binary between the built world and the true natural world is itself a form of cultural isolation.
  • Religion is a technology to discipline, to discipline certain parts of the human impulse. A lot of the book revolves around the decoded neurofeedback machine, which is a very real literalization of a technology, of changing the way we think
  • one of the things I think that we have to take seriously is that we have created technologies to supercharge some parts of our natural impulse. Capitalism, I think, should be understood as a technology to supercharge the growth impulse, and it creates some wonders out of that and some horrors out of that.
  • Richard Powers: Sure. I base my machine on existing technology. Decoded neurofeedback is a kind of nascent field of exploration. You can read about it; it’s been publishing results for a decade. I first came across it in 2013. It involves using fMRI to record the brain activity of a human being who is learning a process, interacting with an object or engaged in a certain emotional state. That neural activity is recorded and stored as a data structure. A second, subsequent human being is then also scanned in real time and fed kinds of feedback based on their own internal neural activity as determined by a kind of software analysis of their fMRI data structures. [A minimal code sketch of this feedback loop appears after this list.]
  • And they are cued little by little to approximate, to learn how to approximate, the recorded states of the original subject. When I first read about this, I did get a little bit of a revelation. I did feel my skin pucker and think, if pushed far enough, this would be something like a telepathy conduit. It would be a first big step in answering that age-old question of what it feels like to be something other than we are
  • in the book I simply take that basic concept and extend it, juke it up a little bit, blur the line between what the reader might think is possible right now and what they might wonder about, and maybe even introduce possibilities for this empathetic transference
  • Ezra Klein: One thing I loved about the role this played in the book is that it’s highlighting its inverse. So a reader might look at this and say, wow, wouldn’t that be cool if we had a machine that could in real time change how we think and change our neural pathways and change our mental state in a particular direction? But of course, all of society is that machine,
  • Robin and Theo are in an airport. And you’ve got TVs everywhere playing the news which is to say playing a constant loop of outrage, and disaster, and calamity. And Robbie, who’s going through these neural feedback sessions during this period, turns to his dad and says, “Dad, you know how the training’s rewiring my brain? This is what is rewiring everybody else.”
  • Ezra Klein: I think Marshall McLuhan knew it all. I really do. Not exactly what it would look like, but his view, and Postman’s view, that we are creating a digital global nervous system, which is the way they put it, was exactly right. A nervous system: it was exactly the right metaphor.
  • the great insight of McLuhan, to me, what now gets called the medium is the message is this idea that the way media acts upon us is not in the content it delivers. The point of Twitter is not the link that you click or even the tweet that you read; it is that the nature and structure of the Twitter system itself begins to act on your system, and you become more like it.If you watch a lot of TV, you become more like TV. If you watch a lot of Twitter, you become more like Twitter, Facebook more like Facebook. Your identities become more important to you — that the content is distraction from the medium, and the medium changes you
  • it is happening to all of us in ways that at least we are not engaging in intentionally, not at that level of how do we want to be transformed.
  • Richard Powers: I believe that the digital neural system is now so comprehensive that the idea that you could escape it somewhere, certainly not in the Smokies, or even somewhere more remote, becomes more and more laughable. Yeah, and to build on this idea of the medium being the message: it’s not just that we become more like the forms and affordances of the medium; it’s that we begin to expect that those affordances, the methods by which those media are used, the physiological dependencies and casts of behavior and thought that are required to operate them and interact with them, are actual — that they’re real somehow, and that we just take them into human nature and say, no, this is what we’ve always wanted, and we’ve simply been able to become more like our true selves.
  • Well, the warpage in our sense of time, the warpage in our sense of place, are profound. The ways in which digital feedback and the affordances of social media and all the rest have changed our expectations with regard to what we need to concentrate on, what we need to learn for ourselves, are changing profoundly.
  • If you look far enough back, you can find Socrates expressing great anxiety and suspicion about the ways in which writing is going to transform the human brain and human expectation. He was worried that somehow it was going to ruin our memories. Well, it did up to a point — nothing like the way the digital technologies have ruined our memories.
  • My tradition is Jewish; the Sabbath is a technology, a technology to create a different relationship between the human being and time, and growth, and productive society than you would have without the Sabbath, which is framed in terms of godliness but is also a way of creating separation from the other impulses of the week.
  • Governments are a technology, monogamy is a technology, a religiously driven technology, but now one that is culturally driven. And these things do good and they do bad. I’m not making an argument for any one of them in particular. But the idea that we would need to invent something wholly new to come up with a way to change the way human beings act is ridiculous
  • My view of the story of this era is that capitalism was one of many forces: it was in relationship with religion, it was in relationship with more rooted communities. And it has become, in many societies, functionally the only one.
  • it has become not just an economic system but a belief system, and it’s a little bit untrammeled. I’m not an anti-capitalist person, but I believe it needs countervailing forces. And my basic view is that it doesn’t have them anymore.
  • the book does introduce this kind of fable, this kind of thought experiment, about the ways the affordances of a new and slightly stronger technology of empathy might deflect, first of all, the story of a little boy, and then the story of his father, who’s scrambling to be a responsible single parent. And then, beyond that, the community of people who hear about this boy and become fascinated with him as a narrative, which again ripples outward through these digital technologies in ways that can’t be controlled and whose consequences can’t be foreseen.
  • Something I’ve talked about before, something I’ve said, is that I think a push against, functionally, materialism and want is an important weight in our society that we need. And when people say it is the way we’ll deal with climate change in the three-to-five-year time frame, I become much more skeptical, because, to the point of things like the technology you have in the book with neural feedback, I do think one of the questions you have to ask is, socially and culturally, how do you move people’s minds so you can then move their politics?
  • You’re going to need something, it seems to me, outside of politics, that changes humans’ sense of themselves more fundamentally. And that takes a minute at the scale of billions.
  • Richard Powers: Well, you are correct. And I don’t think it’s giving away any great reveal in the book to say that a reader who gets far enough into the story probably has this moment of recursive awareness where they come to understand that what Robin is doing in this gradual training on the cast of mind of some other person is precisely what they’re doing in the act of reading the novel “Bewilderment” — by living this act of active empathy for these two characters, they are undergoing their own kind of neurofeedback.
  • The more we understand about the complexities of living systems, of organisms and the evolution of organisms, the more capable it is to feel a kind of spiritual awe. And that certainly makes it easier to have reverence for the experiment beyond me and beyond my species. I don’t think those are incommensurable or incompatible ways of knowing the world. In fact, I think to invoke one last time that Buddhist precept of interbeing, I think there is a kind of interbeing between the desire, the true selfless desire to understand the world out there through presence, care, measurement, attention, reproduction of experiment and the desire to have a spiritual affinity and shared fate with the world out there. They’re really the same project.
  • Richard Powers: Well, sure. If we turn back to the new forestry again and researchers like Suzanne Simard, who were showing the literal interconnectivity across species boundaries and the cooperation of resource sharing between different species in a forest, that is rigorous science, rigorous reproducible science. And it does participate in that central principle of practice, or collection of practices, which always requires the renunciation of personal wish and ego and prior belief in favor of empirical reproduction.
  • I’ve begun to see people beginning to build out of the humbling sciences a worldview that seems quite spiritual. And as you’re somebody who seems to me to have done that and it has changed your life, would you reflect on that a bit?
  • So much of the book is about the possibility of life beyond Earth. Tell me a bit about the role that’s playing. Why did you make the possibility of alien life in the way it might look and feel and evolve and act so central in a book about protecting and cherishing life here?
  • Richard Powers: I’m glad that we’re slipping this in at the end, because, yes, this framing of the book around the question of are we alone, or does the universe want life, is really important. Theo, Robin’s father, is an astrobiologist.
  • Imagine that everything happens just right so that every square inch of this place is colonized by new forms of experiments, new kinds of life. And the father trying to entertain his son with the story of this remarkable place in the sun just stopping him and saying, Dad, come on, that’s asking too much. Get real, that’s science fiction. That’s the vision that I had when I finished the book, an absolutely limitless sense of just how lucky we’ve had it here.
  • one thing I kept thinking about that didn’t make it into the final book but exists as a kind of parallel story in my own head is the father and son on some very distant planet in some very distant star, many light years from here, playing that same game. And the father saying, OK, now imagine a world that’s just the right size, and it has plate tectonics, and it has water, and it has a nearby moon to stabilize its rotation, and it has incredible security and safety from asteroids because of other large planets in the solar system.
  • they make this journey across the universe through all kinds of incubators, all kinds of petri dishes for life and the possibilities of life. And rather than answer the question — so where is everybody? — it keeps deferring the question, it keeps making that question more subtle and stranger
  • For the purposes of the book, Robin, who desperately believes in the sanctity of life beyond himself, begs his father for these nighttime, bedtime stories, and Theo gives him easy travel to other planets. Father and son going to a new planet based on the kinds of planets that Theo’s science is turning up and asking this question, what would life look like if it was able to get started here?
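The decoded neurofeedback Powers describes above is, at bottom, a closed feedback loop: scan, score the current state against a stored target, show the score, repeat. Here is a minimal sketch of that loop in Python; the 64-dimensional vectors, the cosine-similarity cue, and the hill-climbing "trainee" are illustrative assumptions standing in for real fMRI decoding, not how an actual rig works:

```python
import numpy as np

rng = np.random.default_rng(0)

# Target pattern: a stand-in for the recorded neural activity of the
# original subject (in practice, an fMRI decoder's output, not a raw vector).
target = rng.normal(size=64)
target /= np.linalg.norm(target)

def feedback(state: np.ndarray) -> float:
    """The cue shown to the trainee: cosine similarity to the target."""
    return float(state @ target / np.linalg.norm(state))

# The trainee's state starts unrelated to the target; crucially, they
# never see the target itself, only the scalar feedback score.
state = rng.normal(size=64)
score = feedback(state)

for _ in range(2000):
    candidate = state + 0.1 * rng.normal(size=64)  # try a small mental shift
    new_score = feedback(candidate)
    if new_score > score:  # the cue went up, so keep the change
        state, score = candidate, new_score

print(f"similarity to target after training: {score:.2f}")  # climbs toward 1.0
```

The one design point the sketch preserves is that the trainee is cued "little by little" by a score alone, without ever being told whose state, or what state, they are approximating — which is what lets Powers extend the idea toward empathetic transference.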
Javier E

How to Get Things Done When You Don't Want to Do Anything - The New York Times - 0 views

  • As you look for your motivation, it helps to think of it as falling into two categories, said Stefano Di Domenico, a motivation researcher
  • First, there’s controlled motivation, when you feel you’re being ruled by outside forces like end-of-year bonuses and deadlines — or inner carrots and sticks, like guilt or people-pleasing.
  • Often when people say they’ve lost motivation, “what they really mean,” Dr. Di Domenico said, “is ‘I’m doing this because I have to, not because I want to.’”
  • ...15 more annotations...
  • The second kind, autonomous motivation, is what we’re seeking. This is when you feel like you’re self-directed, whether you have a natural affinity for the task at hand, or you’re doing something because you understand why it’s worthwhile.
  • Ms. Winder, who teaches workshops on reconnecting to your sense of purpose, often has students free write about what makes them come alive.
  • Clinical psychologist Richard M. Ryan, one of two scientists who developed a well-known approach to understanding motivation called self-determination theory, encourages those seeking lasting motivation to take a deep dive into their values.
  • when you connect the things that are important to you to the things you need to do — even the drudgeries — you can feel more in control of your actions. What do you love about your work? What core value does it meet?
  • Looking forward to a reward isn’t the best for long-term motivation. But several studies suggest that pairing small, immediate rewards to a task improves both motivation and fun.
  • Social connections like this are critical to rekindling motivation,
  • suggested considering how your motivation is tied to the people around you, whether that’s your family or your basketball team.
  • Reaching out lifts others, too. “Letting someone know that you are thinking of them is enough to kick-start their motivation,” and reminds them that you care,
  • People also motivate each other through competition.
  • Students in competitive groups exercised much more often than those in supportive social networks,
  • New athletic adventures can be motivational gold, too. A 2020 study suggested that trying out novel activities can help you stick with exercise.
  • Treating ourselves with compassion works much more effectively than beating ourselves up,
  • “People think they’re going to shame themselves into action,” yet self-compassion helps people stay focused on their goals, reduces fear of failure and improves self-confidence, which can also improve motivation, she said.
  • Students who were encouraged to be compassionate toward themselves after the test studied longer and performed better on a follow-up test, compared to students given either simple self-esteem-boosting comments or no instruction.
  • “The key thing about self-compassion and motivation is that it allows you to learn from your failures,”
Javier E

The Thread Vibes Are Off - by Anne Helen Petersen - 0 views

  • The way people post on Twitter is different from the way people post on LinkedIn, which is different from how people post on Facebook, which is different from the way people post on Instagram, no matter how much Facebook keeps telling you to cross-post your IG stories
  • Some people whose job relies on onlineness (like me) have to refine their voices, their ways of being, across several platforms. But most normal people have found their lane — the medium that fits their message — and have stuck with it.
  • People post where they feel public speech “belongs.”
  • ...24 more annotations...
  • For some, the only speech they feel should be truly public should also be “professional.” Hence: LinkedIn, where the only associated image is a professional headshot, and the only conversations are those related to work.
  • Which is how some people really would like to navigate the public sphere: with total freedom and total impunity
  • Twitter is where you could publicly (if often anonymously) fight, troll, dunk, harass, joke, and generally speak without consequence; it’s also where the mundane status update/life musing (once the foundation of Facebook) could live peacefully.
  • Twitter was for publicly observing — through the scroll, but also by tweeting, retweeting, quote tweeting — while remaining effectively invisible, a reply-guy amongst reply-guys, a troll amongst trolls.
  • The Facebook of the 2010s was for broadcasting ideological stances under your real name and fighting with your close and extended community about them; now it’s (largely) about finding advice (and fighting about advice) in affinity groups (often) composed of people you’ve never met.
  • It rewards the esoteric, the visually witty, the mimetic — even more than Twitter.
  • TikTok is for monologues, for expertise, for timing and performance. It’s without pretense.
  • On TikTok, you don’t reshare memes, you use them as the soundtrack to your reimagining, even if that reimagining is just “what if I do the same dance, only with my slightly dorky parents?”
  • Instagram is serious and sincere (see: the success of the social justice slideshow) and almost never ironic — maybe because static visual irony is pretty hard to pull off.
  • Like YouTube, far fewer people are posting than consuming, which means that most people aren’t speaking at all.
  • And then there’s Instagram. People think Instagram is for extroverts, for people who want to broadcast every bit of their lives, but most Instagram users I know are shy — at least with public words. Instagram is where parents post pictures of their kids with the caption “these guys right here” or a picture of their dog with “a very good boy.”
  • The text doesn’t matter; the photo speaks loudest. Each post becomes overdetermined, especially when so readily viewed within the context of the greater grid
  • The more you understand your value as the sum of your visual parts, the more addictive, essential, and anxiety-producing Instagram becomes.
  • That emphasis on aesthetic perfection is part of what feminizes Instagram — but it’s also what makes it the most natural home for brands, celebrities, and influencers.
  • a static image can communicate a whole lifestyle — and brands have had decades of practice honing the craft in magazine ads and catalogs.
  • And what is an influencer if not a conduit for brands? What is a celebrity if not a conduit for their own constellation of brands?
  • If LinkedIn is the place where you can pretend that your whole life and personality is “business,” then Instagram is where you can pretend it’s all some form of leisure — or at least fun
  • A “fun” work trip, a “fun” behind-the-scenes shot, a brand doing the very hard work of trying to get you to click through and make a purchase with images that are fun fun fun.
  • On the flip side, Twitter was where you spoke with your real (verified) name — and with great, algorithm-assisted importance. You could amass clout simply by rephrasing others’ scoops in your own words, declaring opinions as facts, or just declaring. If Twitter was gendered masculine — which it certainly was, and is arguably even more so now — it was only because all of those behaviors are as well.
  • Instagram is a great place to post an announcement and feel celebrated or consoled but not feel like you have to respond to people
  • The conversation is easier to both control and ignore; of all the social networks, it most closely resembles the fawning broadcast style of the fan magazine, only the celebs control the final edit, not the magazine publisher
  • Celebrities initially glommed onto Twitter
  • But its utility gradually faded: part of the problem was harassment, but part of it was context collapse, and the way it allowed words to travel across the platform and out of the celebrity’s control.
  • Instagram was just so much simpler, the communication so clearly in the celebrity wheelhouse. There is very little context collapse on Instagram — it’s all curation and control. As such, you can look interesting but say very little.