Home/ TOK Friends/ Group items matching "into" in title, tags, annotations or url

Javier E

Ivy League Schools Are Overrated. Send Your Kids Elsewhere. | New Republic - 1 views

  • a blizzard of admissions jargon that I had to pick up on the fly. “Good rig”: the transcript exhibits a good degree of academic rigor. “Ed level 1”: parents have an educational level no higher than high school, indicating a genuine hardship case. “MUSD”: a musician in the highest category of promise. Kids who had five or six items on their list of extracurriculars—the “brag”—were already in trouble, because that wasn’t nearly enough.
  • With so many accomplished applicants to choose from, we were looking for kids with something special, “PQs”—personal qualities—that were often revealed by the letters or essays. Kids who only had the numbers and the résumé were usually rejected: “no spark,” “not a team-builder,” “this is pretty much in the middle of the fairway for us.” One young person, who had piled up a truly insane quantity of extracurriculars and who submitted nine letters of recommendation, was felt to be “too intense.”
  • On the other hand, the numbers and the résumé were clearly indispensable. I’d been told that successful applicants could either be “well-rounded” or “pointy”—outstanding in one particular way—but if they were pointy, they had to be really pointy: a musician whose audition tape had impressed the music department, a scientist who had won a national award.
  • ...52 more annotations...
  • When I speak of elite education, I mean prestigious institutions like Harvard or Stanford or Williams as well as the larger universe of second-tier selective schools, but I also mean everything that leads up to and away from them—the private and affluent public high schools; the ever-growing industry of tutors and consultants and test-prep courses; the admissions process itself, squatting like a dragon at the entrance to adulthood; the brand-name graduate schools and employment opportunities that come after the B.A.; and the parents and communities, largely upper-middle class, who push their children into the maw of this machine.
  • Our system of elite education manufactures young people who are smart and talented and driven, yes, but also anxious, timid, and lost, with little intellectual curiosity and a stunted sense of purpose: trapped in a bubble of privilege, heading meekly in the same direction, great at what they’re doing but with no idea why they’re doing it.
  • “Super People,” the writer James Atlas has called them—the stereotypical ultra-high-achieving elite college students of today. A double major, a sport, a musical instrument, a couple of foreign languages, service work in distant corners of the globe, a few hobbies thrown in for good measure: They have mastered them all, and with a serene self-assurance
  • Like so many kids today, I went off to college like a sleepwalker. You chose the most prestigious place that let you in; up ahead were vaguely understood objectives: status, wealth—“success.” What it meant to actually get an education and why you might want one—all this was off the table.
  • It was only after 24 years in the Ivy League—college and a Ph.D. at Columbia, ten years on the faculty at Yale—that I started to think about what this system does to kids and how they can escape from it, what it does to our society and how we can dismantle it.
  • I taught many wonderful young people during my years in the Ivy League—bright, thoughtful, creative kids whom it was a pleasure to talk with and learn from. But most of them seemed content to color within the lines that their education had marked out for them. Very few were passionate about ideas. Very few saw college as part of a larger project of intellectual discovery and development. Everyone dressed as if they were ready to be interviewed at a moment’s notice.
  • Look beneath the façade of seamless well-adjustment, and what you often find are toxic levels of fear, anxiety, and depression, of emptiness and aimlessness and isolation. A large-scale survey of college freshmen recently found that self-reports of emotional well-being have fallen to their lowest level in the study’s 25-year history.
  • So extreme are the admission standards now that kids who manage to get into elite colleges have, by definition, never experienced anything but success. The prospect of not being successful terrifies them, disorients them. The cost of falling short, even temporarily, becomes not merely practical, but existential. The result is a violent aversion to risk.
  • There are exceptions, kids who insist, against all odds, on trying to get a real education. But their experience tends to make them feel like freaks. One student told me that a friend of hers had left Yale because she found the school “stifling to the parts of yourself that you’d call a soul.”
  • What no one seems to ask is what the “return” is supposed to be. Is it just about earning more money? Is the only purpose of an education to enable you to get a job? What, in short, is college for?
  • The first thing that college is for is to teach you to think.
  • College is an opportunity to stand outside the world for a few years, between the orthodoxy of your family and the exigencies of career, and contemplate things from a distance.
  • it is only through the act of establishing communication between the mind and the heart, the mind and experience, that you become an individual, a unique being—a soul. The job of college is to assist you to begin to do that. Books, ideas, works of art and thought, the pressure of the minds around you that are looking for their own answers in their own ways.
  • College is not the only chance to learn to think, but it is the best. One thing is certain: If you haven’t started by the time you finish your B.A., there’s little likelihood you’ll do it later. That is why an undergraduate experience devoted exclusively to career preparation is four years largely wasted.
  • Elite schools like to boast that they teach their students how to think, but all they mean is that they train them in the analytic and rhetorical skills that are necessary for success in business and the professions.
  • Everything is technocratic—the development of expertise—and everything is ultimately justified in technocratic terms.
  • Religious colleges—even obscure, regional schools that no one has ever heard of on the coasts—often do a much better job in that respect.
  • At least the classes at elite schools are academically rigorous, demanding on their own terms, no? Not necessarily. In the sciences, usually; in other disciplines, not so much
  • professors and students have largely entered into what one observer called a “nonaggression pact.”
  • higher marks for shoddier work.
  • today’s young people appear to be more socially engaged than kids have been for several decades, and they are more apt to harbor creative or entrepreneurial impulses
  • they tend to be played out within the same narrow conception of what constitutes a valid life: affluence, credentials, prestige.
  • Experience itself has been reduced to instrumental function, via the college essay. From learning to commodify your experiences for the application, the next step has been to seek out experiences in order to have them to commodify
  • there is now a thriving sector devoted to producing essay-ready summers
  • To be a high-achieving student is to constantly be urged to think of yourself as a future leader of society.
  • what these institutions mean by leadership is nothing more than getting to the top. Making partner at a major law firm or becoming a chief executive, climbing the greasy pole of whatever hierarchy you decide to attach yourself to. I don’t think it occurs to the people in charge of elite colleges that the concept of leadership ought to have a higher meaning, or, really, any meaning.
  • The irony is that elite students are told that they can be whatever they want, but most of them end up choosing to be one of a few very similar things
  • As of 2010, about a third of graduates went into finance or consulting at a number of top schools, including Harvard, Princeton, and Cornell.
  • Whole fields have disappeared from view: the clergy, the military, electoral politics, even academia itself, for the most part, including basic science
  • It’s considered glamorous to drop out of a selective college if you want to become the next Mark Zuckerberg, but ludicrous to stay in to become a social worker. “What Wall Street figured out,” as Ezra Klein has put it, “is that colleges are producing a large number of very smart, completely confused graduates. Kids who have ample mental horsepower, an incredible work ethic and no idea what to do next.”
  • It almost feels ridiculous to have to insist that colleges like Harvard are bastions of privilege, where the rich send their children to learn to walk, talk, and think like the rich. Don’t we already know this? They aren’t called elite colleges for nothing. But apparently we like pretending otherwise. We live in a meritocracy, after all.
  • Visit any elite campus across our great nation, and you can thrill to the heart-warming spectacle of the children of white businesspeople and professionals studying and playing alongside the children of black, Asian, and Latino businesspeople and professionals
  • That doesn’t mean there aren’t a few exceptions, but that is all they are. In fact, the group that is most disadvantaged by our current admissions policies are working-class and rural whites, who are hardly present
  • The college admissions game is not primarily about the lower and middle classes seeking to rise, or even about the upper-middle class attempting to maintain its position. It is about determining the exact hierarchy of status within the upper-middle class itself.
  • This system is exacerbating inequality, retarding social mobility, perpetuating privilege, and creating an elite that is isolated from the society that it’s supposed to lead. The numbers are undeniable. In 1985, 46 percent of incoming freshmen at the 250 most selective colleges came from the top quarter of the income distribution. By 2000, it was 55 percent
  • The major reason for the trend is clear. Not increasing tuition, though that is a factor, but the ever-growing cost of manufacturing children who are fit to compete in the college admissions game
  • Wealthy families start buying their children’s way into elite colleges almost from the moment they are born: music lessons, sports equipment, foreign travel (“enrichment” programs, to use the all-too-perfect term)—most important, of course, private-school tuition or the costs of living in a place with top-tier public schools.
  • Is there anything that I can do, a lot of young people have written to ask me, to avoid becoming an out-of-touch, entitled little shit? I don’t have a satisfying answer, short of telling them to transfer to a public university. You cannot cogitate your way to sympathy with people of different backgrounds, still less to knowledge of them. You need to interact with them directly, and it has to be on an equal footing
  • Elite private colleges will never allow their students’ economic profile to mirror that of society as a whole. They can’t afford to—they need a critical mass of full payers and they need to tend to their donor base—and it’s not even clear that they’d want to.
  • Elite colleges are not just powerless to reverse the movement toward a more unequal society; their policies actively promote it.
  • The SAT is supposed to measure aptitude, but what it actually measures is parental income, which it tracks quite closely
  • U.S. News and World Report supplies the percentage of freshmen at each college who finished in the highest 10 percent of their high school class. Among the top 20 universities, the number is usually above 90 percent. I’d be wary of attending schools like that. Students determine the level of classroom discussion; they shape your values and expectations, for good and ill. It’s partly because of the students that I’d warn kids away from the Ivies and their ilk. Kids at less prestigious schools are apt to be more interesting, more curious, more open, and far less entitled and competitive.
  • The best option of all may be the second-tier—not second-rate—colleges, like Reed, Kenyon, Wesleyan, Sewanee, Mount Holyoke, and others. Instead of trying to compete with Harvard and Yale, these schools have retained their allegiance to real educational values.
  • Not being an entitled little shit is an admirable goal. But in the end, the deeper issue is the situation that makes it so hard to be anything else. The time has come, not simply to reform that system top to bottom, but to plot our exit to another kind of society altogether.
  • The education system has to act to mitigate the class system, not reproduce it. Affirmative action should be based on class instead of race, a change that many have been advocating for years. Preferences for legacies and athletes ought to be discarded. SAT scores should be weighted to account for socioeconomic factors. Colleges should put an end to résumé-stuffing by imposing a limit on the number of extracurriculars that kids can list on their applications. They ought to place more value on the kind of service jobs that lower-income students often take in high school and that high achievers almost never do. They should refuse to be impressed by any opportunity that was enabled by parental wealth
  • More broadly, they need to rethink their conception of merit. If schools are going to train a better class of leaders than the ones we have today, they’re going to have to ask themselves what kinds of qualities they need to promote. Selecting students by GPA or the number of extracurriculars more often benefits the faithful drudge than the original mind.
  • reforming the admissions process. That might address the problem of mediocrity, but it won’t address the greater one of inequality
  • The problem is the Ivy League itself. We have contracted the training of our leadership class to a set of private institutions. However much they claim to act for the common good, they will always place their interests first.
  • I’ve come to see that what we really need is to create one where you don’t have to go to the Ivy League, or any private college, to get a first-rate education.
  • High-quality public education, financed with public money, for the benefit of all
  • Everybody gets an equal chance to go as far as their hard work and talent will take them—you know, the American dream. Everyone who wants it gets to have the kind of mind-expanding, soul-enriching experience that a liberal arts education provides.
  • We recognize that free, quality K–12 education is a right of citizenship. We also need to recognize—as we once did and as many countries still do—that the same is true of higher education. We have tried aristocracy. We have tried meritocracy. Now it’s time to try democracy.
Javier E

Opinion | How Genetics Is Changing Our Understanding of 'Race' - The New York Times - 0 views

  • In 1942, the anthropologist Ashley Montagu published “Man’s Most Dangerous Myth: The Fallacy of Race,” an influential book that argued that race is a social concept with no genetic basis.
  • Beginning in 1972, genetic findings began to be incorporated into this argument. That year, the geneticist Richard Lewontin published an important study of variation in protein types in blood. He grouped the human populations he analyzed into seven “races” — West Eurasians, Africans, East Asians, South Asians, Native Americans, Oceanians and Australians — and found that around 85 percent of variation in the protein types could be accounted for by variation within populations and “races,” and only 15 percent by variation across them. To the extent that there was variation among humans, he concluded, most of it was because of “differences between individuals.”
  • In this way, a consensus was established that among human populations there are no differences large enough to support the concept of “biological race.” Instead, it was argued, race is a “social construct,” a way of categorizing people that changes over time and across countries.
  • ...29 more annotations...
  • It is true that race is a social construct. It is also true, as Dr. Lewontin wrote, that human populations “are remarkably similar to each other” from a genetic point of view.
  • this consensus has morphed, seemingly without questioning, into an orthodoxy. The orthodoxy maintains that the average genetic differences among people grouped according to today’s racial terms are so trivial when it comes to any meaningful biological traits that those differences can be ignored.
  • With the help of these tools, we are learning that while race may be a social construct, differences in genetic ancestry that happen to correlate to many of today’s racial constructs are real.
  • I have deep sympathy for the concern that genetic discoveries could be misused to justify racism. But as a geneticist I also know that it is simply no longer possible to ignore average genetic differences among “races.”
  • Groundbreaking advances in DNA sequencing technology have been made over the last two decades
  • The orthodoxy goes further, holding that we should be anxious about any research into genetic differences among populations
  • You will sometimes hear that any biological differences among populations are likely to be small, because humans have diverged too recently from common ancestors for substantial differences to have arisen under the pressure of natural selection. This is not true. The ancestors of East Asians, Europeans, West Africans and Australians were, until recently, almost completely isolated from one another for 40,000 years or longer, which is more than sufficient time for the forces of evolution to work
  • I am worried that well-meaning people who deny the possibility of substantial biological differences among human populations are digging themselves into an indefensible position, one that will not survive the onslaught of science.
  • I am also worried that whatever discoveries are made — and we truly have no idea yet what they will be — will be cited as “scientific proof” that racist prejudices and agendas have been correct all along, and that those well-meaning people will not understand the science well enough to push back against these claims.
  • This is why it is important, even urgent, that we develop a candid and scientifically up-to-date way of discussing any such difference
  • While most people will agree that finding a genetic explanation for an elevated rate of disease is important, they often draw the line there. Finding genetic influences on a propensity for disease is one thing, they argue, but looking for such influences on behavior and cognition is another
  • Is performance on an intelligence test or the number of years of school a person attends shaped by the way a person is brought up? Of course. But does it measure something having to do with some aspect of behavior or cognition? Almost certainly.
  • Recent genetic studies have demonstrated differences across populations not just in the genetic determinants of simple traits such as skin color, but also in more complex traits like bodily dimensions and susceptibility to diseases.
  • in Iceland, there has been measurable genetic selection against the genetic variations that predict more years of education in that population just within the last century.
  • consider what kinds of voices are filling the void that our silence is creating
  • Nicholas Wade, a longtime science journalist for The New York Times, rightly notes in his 2014 book, “A Troublesome Inheritance: Genes, Race and Human History,” that modern research is challenging our thinking about the nature of human population differences. But he goes on to make the unfounded and irresponsible claim that this research is suggesting that genetic factors explain traditional stereotypes.
  • As 139 geneticists (including myself) pointed out in a letter to The New York Times about Mr. Wade’s book, there is no genetic evidence to back up any of the racist stereotypes he promotes.
  • Another high-profile example is James Watson, the scientist who in 1953 co-discovered the structure of DNA, and who was forced to retire as head of the Cold Spring Harbor Laboratories in 2007 after he stated in an interview — without any scientific evidence — that research has suggested that genetic factors contribute to lower intelligence in Africans than in Europeans.
  • What makes Dr. Watson’s and Mr. Wade’s statements so insidious is that they start with the accurate observation that many academics are implausibly denying the possibility of average genetic differences among human populations, and then end with a claim — backed by no evidence — that they know what those differences are and that they correspond to racist stereotypes
  • They use the reluctance of the academic community to openly discuss these fraught issues to provide rhetorical cover for hateful ideas and old racist canards.
  • This is why knowledgeable scientists must speak out. If we abstain from laying out a rational framework for discussing differences among populations, we risk losing the trust of the public and we actively contribute to the distrust of expertise that is now so prevalent.
  • If scientists can be confident of anything, it is that whatever we currently believe about the genetic nature of differences among populations is most likely wrong.
  • For example, my laboratory discovered in 2016, based on our sequencing of ancient human genomes, that “whites” are not derived from a population that existed from time immemorial, as some people believe. Instead, “whites” represent a mixture of four ancient populations that lived 10,000 years ago and were each as different from one another as Europeans and East Asians are today.
  • For me, a natural response to the challenge is to learn from the example of the biological differences that exist between males and females
  • The differences between the sexes are far more profound than those that exist among human populations, reflecting more than 100 million years of evolution and adaptation. Males and females differ by huge tracts of genetic material
  • How do we accommodate the biological differences between men and women? I think the answer is obvious: We should both recognize that genetic differences between males and females exist and we should accord each sex the same freedoms and opportunities regardless of those differences
  • fulfilling these aspirations in practice is a challenge. Yet conceptually it is straightforward.
  • Compared with the enormous differences that exist among individuals, differences among populations are on average many times smaller, so it should be only a modest challenge to accommodate a reality in which the average genetic contributions to human traits differ.
Javier E

The Tech Industry's Psychological War on Kids - Member Feature Stories - Medium - 0 views

  • she cried, “They took my f***ing phone!” Attempting to engage Kelly in conversation, I asked her what she liked about her phone and social media. “They make me happy,” she replied.
  • Even though they were loving and involved parents, Kelly’s mom couldn’t help feeling that they’d failed their daughter and must have done something terribly wrong that led to her problems.
  • My practice as a child and adolescent psychologist is filled with families like Kelly’s. These parents say their kids’ extreme overuse of phones, video games, and social media is the most difficult parenting issue they face — and, in many cases, is tearing the family apart.
  • ...88 more annotations...
  • What none of these parents understand is that their children’s and teens’ destructive obsession with technology is the predictable consequence of a virtually unrecognized merger between the tech industry and psychology.
  • Dr. B.J. Fogg is a psychologist and the father of persuasive technology, a discipline in which digital machines and apps — including smartphones, social media, and video games — are configured to alter human thoughts and behaviors. As the lab’s website boldly proclaims: “Machines designed to change humans.”
  • These parents have no idea that lurking behind their kids’ screens and phones are a multitude of psychologists, neuroscientists, and social science experts who use their knowledge of psychological vulnerabilities to devise products that capture kids’ attention for the sake of industry profit.
  • psychology — a discipline that we associate with healing — is now being used as a weapon against children.
  • This alliance pairs the consumer tech industry’s immense wealth with the most sophisticated psychological research, making it possible to develop social media, video games, and phones with drug-like power to seduce young users.
  • Likewise, social media companies use persuasive design to prey on the age-appropriate desire for preteen and teen kids, especially girls, to be socially successful. This drive is built into our DNA, since real-world relational skills have fostered human evolution.
  • Called “the millionaire maker,” Fogg has groomed former students who have used his methods to develop technologies that now consume kids’ lives. As he recently touted on his personal website, “My students often do groundbreaking projects, and they continue having impact in the real world after they leave Stanford… For example, Instagram has influenced the behavior of over 800 million people. The co-founder was a student of mine.”
  • Persuasive technology (also called persuasive design) works by deliberately creating digital environments that users feel fulfill their basic human drives — to be social or obtain goals — better than real-world alternatives.
  • Kids spend countless hours in social media and video game environments in pursuit of likes, “friends,” game points, and levels — because it’s stimulating, they believe that this makes them happy and successful, and they find it easier than doing the difficult but developmentally important activities of childhood.
  • While persuasion techniques work well on adults, they are particularly effective at influencing the still-maturing child and teen brain.
  • “Video games, better than anything else in our culture, deliver rewards to people, especially teenage boys,” says Fogg. “Teenage boys are wired to seek competency. To master our world and get better at stuff. Video games, in dishing out rewards, can convey to people that their competency is growing, you can get better at something second by second.”
  • it’s persuasive design that’s helped convince this generation of boys they are gaining “competency” by spending countless hours on game sites, when the sad reality is they are locked away in their rooms gaming, ignoring school, and not developing the real-world competencies that colleges and employers demand.
  • Persuasive technologies work because of their apparent triggering of the release of dopamine, a powerful neurotransmitter involved in reward, attention, and addiction.
  • As she says, “If you don’t get 100 ‘likes,’ you make other people share it so you get 100…. Or else you just get upset. Everyone wants to get the most ‘likes.’ It’s like a popularity contest.”
  • there are costs to Casey’s phone obsession, noting that the “girl’s phone, be it Facebook, Instagram or iMessage, is constantly pulling her away from her homework, sleep, or conversations with her family.
  • Casey says she wishes she could put her phone down. But she can’t. “I’ll wake up in the morning and go on Facebook just… because,” she says. “It’s not like I want to or I don’t. I just go on it. I’m, like, forced to. I don’t know why. I need to. Facebook takes up my whole life.”
  • B.J. Fogg may not be a household name, but Fortune Magazine calls him a “New Guru You Should Know,” and his research is driving a worldwide legion of user experience (UX) designers who utilize and expand upon his models of persuasive design.
  • “No one has perhaps been as influential on the current generation of user experience (UX) designers as Stanford researcher B.J. Fogg.”
  • the core of UX research is about using psychology to take advantage of our human vulnerabilities.
  • As Fogg is quoted in Kosner’s Forbes article, “Facebook, Twitter, Google, you name it, these companies have been using computers to influence our behavior.” However, the driving force behind behavior change isn’t computers. “The missing link isn’t the technology, it’s psychology,” says Fogg.
  • UX researchers not only follow Fogg’s design model, but also his apparent tendency to overlook the broader implications of persuasive design. They focus on the task at hand, building digital machines and apps that better demand users’ attention, compel users to return again and again, and grow businesses’ bottom line.
  • the “Fogg Behavior Model” is a well-tested method to change behavior and, in its simplified form, involves three primary factors: motivation, ability, and triggers.
  • “We can now create machines that can change what people think and what people do, and the machines can do that autonomously.”
  • Regarding ability, Fogg suggests that digital products should be made so that users don’t have to “think hard.” Hence, social networks are designed for ease of use
  • Finally, Fogg says that potential users need to be triggered to use a site. This is accomplished by a myriad of digital tricks, including the sending of incessant notifications
  • moral questions about the impact of turning persuasive techniques on children and teens are not being asked. For example, should the fear of social rejection be used to compel kids to compulsively use social media? Is it okay to lure kids away from school tasks that demand a strong mental effort so they can spend their lives on social networks or playing video games that don’t make them think much at all?
  • Describing how his formula is effective at getting people to use a social network, the psychologist says in an academic paper that a key motivator is users’ desire for “social acceptance,” although he says an even more powerful motivator is the desire “to avoid being socially rejected.”
  • the startup Dopamine Labs boasts about its use of persuasive techniques to increase profits: “Connect your app to our Persuasive AI [Artificial Intelligence] and lift your engagement and revenue up to 30% by giving your users our perfect bursts of dopamine,” and “A burst of Dopamine doesn’t just feel good: it’s proven to re-wire user behavior and habits.”
  • Ramsay Brown, the founder of Dopamine Labs, says in a KQED Science article, “We have now developed a rigorous technology of the human mind, and that is both exciting and terrifying. We have the ability to twiddle some knobs in a machine learning dashboard we build, and around the world hundreds of thousands of people are going to quietly change their behavior in ways that, unbeknownst to them, feel second-nature but are really by design.”
  • Programmers call this “brain hacking,” as it compels users to spend more time on sites even though they mistakenly believe it’s strictly due to their own conscious choices.
  • Banks of computers employ AI to “learn” which of a countless number of persuasive design elements will keep users hooked
  • A persuasion profile of a particular user’s unique vulnerabilities is developed in real time and exploited to keep users on the site and make them return again and again for longer periods of time. This drives up profits for consumer internet companies whose revenue is based on how much their products are used.
  • “The leaders of Internet companies face an interesting, if also morally questionable, imperative: either they hijack neuroscience to gain market share and make large profits, or they let competitors do that and run away with the market.”
  • Social media and video game companies believe they are compelled to use persuasive technology in the arms race for attention, profits, and survival.
  • Children’s well-being is not part of the decision calculus.
  • one breakthrough occurred in 2017 when Facebook documents were leaked to The Australian. The internal report crafted by Facebook executives showed the social network boasting to advertisers that by monitoring posts, interactions, and photos in real time, the network is able to track when teens feel “insecure,” “worthless,” “stressed,” “useless” and a “failure.”
  • The report also bragged about Facebook’s ability to micro-target ads down to “moments when young people need a confidence boost.”
  • These design techniques provide tech corporations a window into kids’ hearts and minds to measure their particular vulnerabilities, which can then be used to control their behavior as consumers. This isn’t some strange future… this is now.
  • The official tech industry line is that persuasive technologies are used to make products more engaging and enjoyable. But the revelations of industry insiders can reveal darker motives.
  • Revealing the hard science behind persuasive technology, Hopson says, “This is not to say that players are the same as rats, but that there are general rules of learning which apply equally to both.”
  • After penning the paper, Hopson was hired by Microsoft, where he helped lead the development of the Xbox Live, Microsoft’s online gaming system
  • “If game designers are going to pull a person away from every other voluntary social activity or hobby or pastime, they’re going to have to engage that person at a very deep level in every possible way they can.”
  • This is the dominant effect of persuasive design today: building video games and social media products so compelling that they pull users away from the real world to spend their lives in for-profit domains.
  • Persuasive technologies are reshaping childhood, luring kids away from family and schoolwork to spend more and more of their lives sitting before screens and phones.
  • “Since we’ve figured to some extent how these pieces of the brain that handle addiction are working, people have figured out how to juice them further and how to bake that information into apps.”
  • Today, persuasive design is likely distracting adults from driving safely, productive work, and engaging with their own children — all matters which need urgent attention
  • Still, because the child and adolescent brain is more easily controlled than the adult mind, the use of persuasive design is having a much more hurtful impact on kids.
  • But to engage in a pursuit at the expense of important real-world activities is a core element of addiction.
  • younger U.S. children now spend 5 ½ hours each day with entertainment technologies, including video games, social media, and online videos.
  • Even more, the average teen now spends an incredible 8 hours each day playing with screens and phones
  • U.S. kids only spend 16 minutes each day using the computer at home for school.
  • Quietly, using screens and phones for entertainment has become the dominant activity of childhood.
  • Younger kids spend more time engaging with entertainment screens than they do in school
  • teens spend even more time playing with screens and phones than they do sleeping
  • kids are so taken with their phones and other devices that they have turned their backs to the world around them.
  • many children are missing out on real-life engagement with family and school — the two cornerstones of childhood that lead them to grow up happy and successful
  • persuasive technologies are pulling kids into often toxic digital environments
  • A too frequent experience for many is being cyberbullied, which increases their risk of skipping school and considering suicide.
  • And there is growing recognition of the negative impact of FOMO, or the fear of missing out, as kids spend their social media lives watching a parade of peers who look to be having a great time without them, feeding their feelings of loneliness and being less than.
  • The combined effects of the displacement of vital childhood activities and exposure to unhealthy online environments is wrecking a generation.
  • as the typical age when kids get their first smartphone has fallen to 10, it’s no surprise to see serious psychiatric problems — once the domain of teens — now enveloping young kids
  • Self-inflicted injuries, such as cutting, that are serious enough to require treatment in an emergency room, have increased dramatically in 10- to 14-year-old girls, up 19% per year since 2009.
  • While girls are pulled onto smartphones and social media, boys are more likely to be seduced into the world of video gaming, often at the expense of a focus on school
  • it’s no surprise to see this generation of boys struggling to make it to college: a full 57% of college admissions are granted to young women compared with only 43% to young men.
  • Economists working with the National Bureau of Economic Research recently demonstrated how many young U.S. men are choosing to play video games rather than join the workforce.
  • The destructive forces of psychology deployed by the tech industry are making a greater impact on kids than the positive uses of psychology by mental health providers and child advocates. Put plainly, the science of psychology is hurting kids more than helping them.
  • Hope for this wired generation has seemed dim until recently, when a surprising group has come forward to criticize the tech industry’s use of psychological manipulation: tech executives
  • Tristan Harris, formerly a design ethicist at Google, has led the way by unmasking the industry’s use of persuasive design. Interviewed in The Economist’s 1843 magazine, he says, “The job of these companies is to hook people, and they do that by hijacking our psychological vulnerabilities.”
  • Marc Benioff, CEO of the cloud computing company Salesforce, is one of the voices calling for the regulation of social media companies because of their potential to addict children. He says that just as the cigarette industry has been regulated, so too should social media companies. “I think that, for sure, technology has addictive qualities that we have to address, and that product designers are working to make those products more addictive, and we need to rein that back as much as possible,”
  • “If there’s an unfair advantage or things that are out there that are not understood by parents, then the government’s got to come forward and illuminate that.”
  • Since millions of parents, for example the parents of my patient Kelly, have absolutely no idea that devices are used to hijack their children’s minds and lives, regulation of such practices is the right thing to do.
  • Another improbable group to speak out on behalf of children is tech investors.
  • How has the consumer tech industry responded to these calls for change? By going even lower.
  • Facebook recently launched Messenger Kids, a social media app that will reach kids as young as five years old. Suggestive that harmful persuasive design is now honing in on very young children is the declaration of Messenger Kids Art Director, Shiu Pei Luu, “We want to help foster communication [on Facebook] and make that the most exciting thing you want to be doing.”
  • the American Psychological Association (APA) — which is tasked with protecting children and families from harmful psychological practices — has been essentially silent on the matter
  • APA Ethical Standards require the profession to make efforts to correct the “misuse” of the work of psychologists, which would include the application of B.J. Fogg’s persuasive technologies to influence children against their best interests
  • Manipulating children for profit without their own or parents’ consent, and driving kids to spend more time on devices that contribute to emotional and academic problems is the embodiment of unethical psychological practice.
  • “Never before in history have basically 50 mostly men, mostly 20–35, mostly white engineer designer types within 50 miles of where we are right now [Silicon Valley], had control of what a billion people think and do.”
  • Some may argue that it’s the parents’ responsibility to protect their children from tech industry deception. However, parents have no idea of the powerful forces aligned against them, nor do they know how technologies are developed with drug-like effects to capture kids’ minds
  • Others will claim that nothing should be done because the intention behind persuasive design is to build better products, not manipulate kids
  • similar circumstances exist in the cigarette industry, as tobacco companies have as their intention profiting from the sale of their product, not hurting children. Nonetheless, because cigarettes and persuasive design predictably harm children, actions should be taken to protect kids from their effects.
  • in a 1998 academic paper, Fogg describes what should happen if things go wrong, saying, if persuasive technologies are “deemed harmful or questionable in some regard, a researcher should then either take social action or advocate that others do so.”
  • I suggest turning to President John F. Kennedy’s prescient guidance: He said that technology “has no conscience of its own. Whether it will become a force for good or ill depends on man.”
  • The APA should begin by demanding that the tech industry’s behavioral manipulation techniques be brought out of the shadows and exposed to the light of public awareness
  • Changes should be made in the APA’s Ethics Code to specifically prevent psychologists from manipulating children using digital machines, especially if such influence is known to pose risks to their well-being.
  • Moreover, the APA should follow its Ethical Standards by making strong efforts to correct the misuse of psychological persuasion by the tech industry and by user experience designers outside the field of psychology.
  • It should join with tech executives who are demanding that persuasive design in kids’ tech products be regulated
  • The APA also should make its powerful voice heard amongst the growing chorus calling out tech companies that intentionally exploit children’s vulnerabilities.
Javier E

Look At Me by Patricia Snow | Articles | First Things - 0 views

  • Maurice stumbles upon what is still the gold standard for the treatment of infantile autism: an intensive course of behavioral therapy called applied behavioral analysis that was developed by psychologist O. Ivar Lovaas at UCLA in the 1970s
  • in a little over a year’s time she recovers her daughter to the point that she is indistinguishable from her peers.
  • Let Me Hear Your Voice is not a particularly religious or pious work. It is not the story of a miracle or a faith healing
  • ...54 more annotations...
  • Maurice discloses her Catholicism, and the reader is aware that prayer undergirds the therapy, but the book is about the therapy, not the prayer. Specifically, it is about the importance of choosing methods of treatment that are supported by scientific data. Applied behavioral analysis is all about data: its daily collection and interpretation. The method is empirical, hard-headed, and results-oriented.
  • on a deeper level, the book is profoundly religious, more religious perhaps than its author intended. In this reading of the book, autism is not only a developmental disorder afflicting particular individuals, but a metaphor for the spiritual condition of fallen man.
  • Maurice’s autistic daughter is indifferent to her mother
  • In this reading of the book, the mother is God, watching a child of his wander away from him into darkness: a heartbroken but also a determined God, determined at any cost to bring the child back
  • the mother doesn’t turn back, concedes nothing to the condition that has overtaken her daughter. There is no political correctness in Maurice’s attitude to autism; no nod to “neurodiversity.” Like the God in Donne’s sonnet, “Batter my heart, three-personed God,” she storms the walls of her daughter’s condition
  • Like God, she sets her sights high, commits both herself and her child to a demanding, sometimes painful therapy (life!), and receives back in the end a fully alive, loving, talking, and laughing child
  • the reader realizes that for God, the harrowing drama of recovery is never a singular, or even a twice-told tale, but a perennial one. Every child of his, every child of Adam and Eve, wanders away from him into darkness
  • we have an epidemic of autism, or “autism spectrum disorder,” which includes classic autism (Maurice’s children’s diagnosis); atypical autism, which exhibits some but not all of the defects of autism; and Asperger’s syndrome, which is much more common in boys than in girls and is characterized by average or above average language skills but impaired social skills.
  • At the same time, all around us, we have an epidemic of something else. On the street and in the office, at the dinner table and on a remote hiking trail, in line at the deli and pushing a stroller through the park, people go about their business bent over a small glowing screen, as if praying.
  • This latter epidemic, or experiment, has been going on long enough that people are beginning to worry about its effects.
  • for a comprehensive survey of the emerging situation on the ground, the interested reader might look at Sherry Turkle’s recent book, Reclaiming Conversation: The Power of Talk in a Digital Age.
  • she also describes in exhaustive, chilling detail the mostly horrifying effects recent technology has had on families and workplaces, educational institutions, friendships and romance.
  • many of the promises of technology have not only not been realized, they have backfired. If technology promised greater connection, it has delivered greater alienation. If it promised greater cohesion, it has led to greater fragmentation, both on a communal and individual level.
  • If thinking that the grass is always greener somewhere else used to be a marker of human foolishness and a temptation to be resisted, today it is simply a possibility to be checked out. The new phones, especially, turn out to be portable Pied Pipers, irresistibly pulling people away from the people in front of them and the tasks at hand.
  • all it takes is a single phone on a table, even if that phone is turned off, for the conversations in the room to fade in number, duration, and emotional depth.
  • an infinitely malleable screen isn’t an invitation to stability, but to restlessness
  • Current media, and the fear of missing out that they foster (a motivator now so common it has its own acronym, FOMO), drive lives of continual interruption and distraction, of virtual rather than real relationships, and of “little” rather than “big” talk
  • if you may be interrupted at any time, it makes sense, as a student explains to Turkle, to “keep things light.”
  • we are reaping deficits in emotional intelligence and empathy; loneliness, but also fears of unrehearsed conversations and intimacy; difficulties forming attachments but also difficulties tolerating solitude and boredom
  • consider the testimony of the faculty at a reputable middle school where Turkle is called in as a consultant
  • The teachers tell Turkle that their students don’t make eye contact or read body language, have trouble listening, and don’t seem interested in each other, all markers of autism spectrum disorder
  • Like much younger children, they engage in parallel play, usually on their phones. Like autistic savants, they can call up endless information on their phones, but have no larger context or overarching narrative in which to situate it
  • Students are so caught up in their phones, one teacher says, “they don’t know how to pay attention to class or to themselves or to another person or to look in each other’s eyes and see what is going on.
  • “It is as though they all have some signs of being on an Asperger’s spectrum. But that’s impossible. We are talking about a schoolwide problem.”
  • Can technology cause Asperger’s?
  • “It is not necessary to settle this debate to state the obvious. If we don’t look at our children and engage them in conversation, it is not surprising if they grow up awkward and withdrawn.”
  • In the protocols developed by Ivar Lovaas for treating autism spectrum disorder, every discrete trial in the therapy, every drill, every interaction with the child, however seemingly innocuous, is prefaced by this clear command: “Look at me!”
  • If absence of relationship is a defining feature of autism, connecting with the child is both the means and the whole goal of the therapy. Applied behavioral analysis does not concern itself with when exactly, how, or why a child becomes autistic, but tries instead to correct, do over, and even perhaps actually rewire what went wrong, by going back to the beginning
  • Eye contact—which we know is essential for brain development, emotional stability, and social fluency—is the indispensable prerequisite of the therapy, the sine qua non of everything that happens.
  • There are no shortcuts to this method; no medications or apps to speed things up; no machines that can do the work for us. This is work that only human beings can do
  • it must not only be started early and be sufficiently intensive, but it must also be carried out in large part by parents themselves. Parents must be trained and involved, so that the treatment carries over into the home and continues for most of the child’s waking hours.
  • there are foundational relationships that are templates for all other relationships, and for learning itself.
  • Maurice’s book, in other words, is not fundamentally the story of a child acquiring skills, though she acquires them perforce. It is the story of the restoration of a child’s relationship with her parents
  • it is also impossible to overstate the time and commitment that were required to bring it about, especially today, when we have so little time, and such a faltering, diminished capacity for sustained engagement with small children
  • The very qualities that such engagement requires, whether our children are sick or well, are the same qualities being bred out of us by technologies that condition us to crave stimulation and distraction, and by a culture that, through a perverse alchemy, has changed what was supposed to be the freedom to work anywhere into an obligation to work everywhere.
  • In this world of total work (the phrase is Josef Pieper’s), the work of helping another person become fully human may be work that is passing beyond our reach, as our priorities, and the technologies that enable and reinforce them, steadily unfit us for the work of raising our own young.
  • in Turkle’s book, as often as not, it is young people who are distressed because their parents are unreachable. Some of the most painful testimony in Reclaiming Conversation is the testimony of teenagers who hope to do things differently when they have children, who hope someday to learn to have a real conversation, and so on.
  • it was an older generation that first fell under technology’s spell. At the middle school Turkle visits, as at many other schools across the country, it is the grown-ups who decide to give every child a computer and deliver all course content electronically, meaning that they require their students to work from the very medium that distracts them, a decision the grown-ups are unwilling to reverse, even as they lament its consequences.
  • we have approached what Turkle calls the robotic moment, when we will have made ourselves into the kind of people who are ready for what robots have to offer. When people give each other less, machines seem less inhuman.
  • robot babysitters may not seem so bad. The robots, at least, will be reliable!
  • If human conversations are endangered, what of prayer, a conversation like no other? All of the qualities that human conversation requires—patience and commitment, an ability to listen and a tolerance for aridity—prayer requires in greater measure.
  • this conversation—the Church exists to restore. Everything in the traditional Church is there to facilitate and nourish this relationship. Everything breathes, “Look at me!”
  • there is a second path to God, equally enjoined by the Church, and that is the way of charity to the neighbor, but not the neighbor in the abstract.
  • “Who is my neighbor?” a lawyer asks Jesus in the Gospel of Luke. Jesus’s answer is, the one you encounter on the way.
  • Virtue is either concrete or it is nothing. Man’s path to God, like Jesus’s path on the earth, always passes through what the Jesuit Jean Pierre de Caussade called “the sacrament of the present moment,” which we could equally call “the sacrament of the present person,” the way of the Incarnation, the way of humility, or the Way of the Cross.
  • The tradition of Zen Buddhism expresses the same idea in positive terms: Be here now.
  • Both of these privileged paths to God, equally dependent on a quality of undivided attention and real presence, are vulnerable to the distracting eye-candy of our technologies
  • Turkle is at pains to show that multitasking is a myth, that anyone trying to do more than one thing at a time is doing nothing well. We could also call what she was doing multi-relating, another temptation or illusion widespread in the digital age. Turkle’s book is full of people who are online at the same time that they are with friends, who are texting other potential partners while they are on dates, and so on.
  • This is the situation in which many people find themselves today: thinking that they are special to someone because of something that transpired, only to discover that the other person is spread so thin, the interaction was meaningless. There is a new kind of promiscuity in the world, in other words, that turns out to be as hurtful as the old kind.
  • Who can actually multitask and multi-relate? Who can love everyone without diluting or cheapening the quality of love given to each individual? Who can love everyone without fomenting insecurity and jealousy? Only God can do this.
  • When an individual needs to be healed of the effects of screens and machines, it is real presence that he needs: real people in a real world, ideally a world of God’s own making
  • Nature is restorative, but it is conversation itself, unfolding in real time, that strikes these boys with the force of revelation. More even than the physical vistas surrounding them on a wilderness hike, unrehearsed conversation opens up for them new territory, open-ended adventures. “It was like a stream,” one boy says, “very ongoing. It wouldn’t break apart.”
  • in the waters of baptism, the new man is born, restored to his true parent, and a conversation begins that over the course of his whole life reminds man of who he is, that he is loved, and that someone watches over him always.
  • Even if the Church could keep screens out of her sanctuaries, people strongly attached to them would still be people poorly positioned to take advantage of what the Church has to offer. Anxious people, unable to sit alone with their thoughts. Compulsive people, accustomed to checking their phones, on average, every five and a half minutes. As these behaviors increase in the Church, what is at stake is man’s relationship with truth itself.
Javier E

In Defense of Facts - The Atlantic - 1 views

  • over 13 years, he has published a series of anthologies—of the contemporary American essay, of the world essay, and now of the historical American essay—that misrepresents what the essay is and does, that falsifies its history, and that contains, among its numerous selections, very little one would reasonably classify within the genre. And all of this to wide attention and substantial acclaim
  • D’Agata’s rationale for his “new history,” to the extent that one can piece it together from the headnotes that preface each selection, goes something like this. The conventional essay, nonfiction as it is, is nothing more than a delivery system for facts. The genre, as a consequence, has suffered from a chronic lack of critical esteem, and thus of popular attention. The true essay, however, deals not in knowing but in “unknowing”: in uncertainty, imagination, rumination; in wandering and wondering; in openness and inconclusion
  • Every piece of this is false in one way or another.
  • ...31 more annotations...
  • There are genres whose principal business is fact—journalism, history, popular science—but the essay has never been one of them. If the form possesses a defining characteristic, it is that the essay makes an argument
  • That argument can rest on fact, but it can also rest on anecdote, or introspection, or cultural interpretation, or some combination of all these and more
  • what makes a personal essay an essay and not just an autobiographical narrative is precisely that it uses personal material to develop, however speculatively or intuitively, a larger conclusion.
  • Nonfiction is the source of the narcissistic injury that seems to drive him. “Nonfiction,” he suggests, is like saying “not art,” and if D’Agata, who has himself published several volumes of what he refers to as essays, desires a single thing above all, it is to be known as a maker of art.
  • D’Agata tells us that the term has been in use since about 1950. In fact, it was coined in 1867 by the staff of the Boston Public Library and entered widespread circulation after the turn of the 20th century. The concept’s birth and growth, in other words, did coincide with the rise of the novel to literary preeminence, and nonfiction did long carry an odor of disesteem. But that began to change at least as long ago as the 1960s, with the New Journalism and the “nonfiction novel.”
  • What we really seem to get in D’Agata’s trilogy, in other words, is a compendium of writing that the man himself just happens to like, or that he wants to appropriate as a lineage for his own work.
  • What it’s like is abysmal: partial to trivial formal experimentation, hackneyed artistic rebellion, opaque expressions of private meaning, and modish political posturing
  • If I bought a bag of chickpeas and opened it to find that it contained some chickpeas, some green peas, some pebbles, and some bits of goat poop, I would take it back to the store. And if the shopkeeper said, “Well, they’re ‘lyric’ chickpeas,” I would be entitled to say, “You should’ve told me that before I bought them.”
  • when he isn’t cooking quotes or otherwise fudging the record, he is simply indifferent to issues of factual accuracy, content to rely on a mixture of guesswork, hearsay, and his own rather faulty memory.
  • His rejoinders are more commonly a lot more hostile—not to mention juvenile (“Wow, Jim, your penis must be so much bigger than mine”), defensive, and in their overarching logic, deeply specious. He’s not a journalist, he insists; he’s an essayist. He isn’t dealing in anything as mundane as the facts; he’s dealing in “art, dickhead,” in “poetry,” and there are no rules in art.
  • D’Agata replies that there is something between history and fiction. “We all believe in emotional truths that could never hold water, but we still cling to them and insist on their relevance.” The “emotional truths” here, of course, are D’Agata’s, not Presley’s. If it feels right to say that tae kwon do was invented in ancient India (not modern Korea, as Fingal discovers it was), then that is when it was invented. The term for this is truthiness.
  • D’Agata clearly wants to have it both ways. He wants the imaginative freedom of fiction without relinquishing the credibility (and for some readers, the significance) of nonfiction. He has his fingers crossed, and he’s holding them behind his back. “John’s a different kind of writer,” an editor explains to Fingal early in the book. Indeed he is. But the word for such a writer isn’t essayist. It’s liar.
  • The point of all this nonsense, and a great deal more just like it, is to advance an argument about the essay and its history. The form, D’Agata’s story seems to go, was neglected during the long ages that worshiped “information” but slowly emerged during the 19th and 20th centuries as artists learned to defy convention and untrammel their imaginations, coming fully into its own over the past several decades with the dawning recognition of the illusory nature of knowledge.
  • Most delectable is when he speaks about “the essay’s traditional ‘five-paragraph’ form.” I almost fell off my chair when I got to that one. The five-paragraph essay—introduction, three body paragraphs, conclusion; stultifying, formulaic, repetitive—is the province of high-school English teachers. I have never met one outside of a classroom, and like any decent college writing instructor, I never failed to try to wean my students away from them. The five-paragraph essay isn’t an essay; it’s a paper.
  • What he fails to understand is that facts and the essay are not antagonists but siblings, offspring of the same historical moment
  • —by ignoring the actual contexts of his selections, and thus their actual intentions—D’Agata makes the familiar contemporary move of imposing his own conceits and concerns upon the past. That is how ethnography turns into “song,” Socrates into an essayist, and the whole of literary history into a single man’s “emotional truth.”
  • The history of the essay is indeed intertwined with “facts,” but in a very different way than D’Agata imagines. D’Agata’s mind is Manichaean. Facts bad, imagination good
  • When he refers to his selections as essays, he does more than falsify the essay as a genre. He also effaces all the genres that they do belong to: not only poetry, fiction, journalism, and travel, but, among his older choices, history, parable, satire, the sermon, and more—genres that possess their own particular traditions, conventions, and expectations.
  • one needs to recognize that facts themselves have a history.
  • Facts are not just any sort of knowledge, such as also existed in the ancient and medieval worlds. A fact is a unit of information that has been established through uniquely modern methods
  • Fact, etymologically, means “something done”—that is, an act or deed
  • It was only in the 16th century—an age that saw the dawning of a new empirical spirit, one that would issue not only in modern science, but also in modern historiography, journalism, and scholarship—that the word began to signify our current sense of “real state of things.”
  • It was at this exact time, and in this exact spirit, that the essay was born. What distinguished Montaigne’s new form—his “essays” or attempts to discover and publish the truth about himself—was not that it was personal (precursors like Seneca also wrote personally), but that it was scrupulously investigative. Montaigne was conducting research into his soul, and he was determined to get it right.
  • His famous motto, Que sais-je?—“What do I know?”—was an expression not of radical doubt but of the kind of skepticism that fueled the modern revolution in knowledge.
  • It is no coincidence that the first English essayist, Galileo’s contemporary Francis Bacon, was also the first great theorist of science.
  • That knowledge is problematic—difficult to establish, labile once created, often imprecise and always subject to the limitations of the human mind—is not the discovery of postmodernism. It is a foundational insight of the age of science, of fact and information, itself.
  • The point is not that facts do not exist, but that they are unstable (and are becoming more so as the pace of science quickens). Knowledge is always an attempt. Every fact was established by an argument—by observation and interpretation—and is susceptible to being overturned by a different one
  • A fact, you might say, is nothing more than a frozen argument, the place where a given line of investigation has come temporarily to rest.
  • Sometimes those arguments are scientific papers. Sometimes they are news reports, which are arguments with everything except the conclusions left out (the legwork, the notes, the triangulation of sources—the research and the reasoning).
  • When it comes to essays, though, we don’t refer to those conclusions as facts. We refer to them as wisdom, or ideas
  • the essay draws its strength not from separating reason and imagination but from putting them in conversation. A good essay moves fluidly between thought and feeling. It subjects the personal to the rigors of the intellect and the discipline of external reality. The truths it finds are more than just emotional.
Javier E

My Mom Believes In QAnon. I've Been Trying To Get Her Out. - 0 views

  • An early adopter of the QAnon mass delusion, on board since 2018, she held firm to the claim that a Satan-worshipping cabal of child sex traffickers controlled the world and the only person standing in their way was Trump. She saw him not merely as a politician but a savior, and she expressed her devotion in stark terms.
  • “The prophets have said Trump is anointed,” she texted me once. “God is using him to finally end the evil doings of the cabal which has hurt humanity all these centuries… We are in a war between good & evil.”
  • By 2020, I’d pretty much given up on swaying my mom away from her preferred presidential candidate. We’d spent many hours arguing over basic facts I considered indisputable. Any information I cited to prove Trump’s cruelty, she cut down with a corresponding counterattack. My links to credible news sources disintegrated against a wall of outlets like One America News Network, Breitbart, and Before It’s News. Any cracks I could find in her positions were instantly undermined by the inconvenient fact that I was, in her words, a member of “the liberal media,” a brainwashed acolyte of the sprawling conspiracy trying to take down her heroic leader.
  • ...20 more annotations...
  • The irony gnawed at me: My entire vocation as an investigative reporter was predicated on being able to reveal truths, and yet I could not even rustle up the evidence to convince my own mother that our 45th president was not, in fact, the hero she believed him to be. Or, for that matter, that John F. Kennedy Jr. was dead. Or that Tom Hanks had not been executed for drinking the blood of children.
  • The theories spun from Q’s messages seemed much easier to disprove. Oprah Winfrey couldn’t have been detained during a wave of deep state arrests because we could still see her conducting live interviews on television. Trump’s 4th of July speech at Mount Rushmore came to an end without John F. Kennedy Jr. revealing he was alive and stepping in as the president’s new running mate. The widespread blackouts that her Patriot friend’s “source from the Pentagon” had warned about failed to materialize. And I could testify firsthand that the CIA had no control over my newsroom’s editorial decisions.
  • “I believe the Holy Spirit led me to the QAnons to discover the truth which is being suppressed,” she texted me. “Otherwise, how would I be able to know the truth if the lamestream media suppresses the truth?”
  • Through the years, I’d battled against conspiracy theories my mom threw at me that were far more formidable than QAnon. I’d been stumped when she asked me to prove that Beyoncé wasn’t an Illuminati member, dumbfounded when research studies I sent her weren’t enough to reach an agreement on vaccine efficacy, and too worn down to say anything more than “that’s not true” when confronted with false allegations of murders committed by prominent politicians.
  • Eventually, I accepted the impasse. It didn’t seem healthy that every conversation we had would devolve into a circuitous debate about which one of us was on the side of the bad guys. So I tried to pick my battles.
  • But what I had dismissed as damaging inconsistencies turned out to be the core strength of the belief system: It was alive, flexible, sprouting more questions than answers, more clues to study, an investigation playing out in real time, with the fate of the world at stake.
  • With no overlap between our filters of reality, I was at a loss for any facts that would actually stick.
  • Meanwhile, she wondered where she’d gone wrong with me
  • She regretted not taking politics more seriously when I was younger. I’d grown up blinkered by American privilege, trained to ignore the dirty machinations securing my comforts. My mom had shed that luxury long ago.
  • The year my mom began falling down QAnon rabbit holes, I turned the age she was when she first arrived in the States. By then, I was no longer sure that America was worth the cost of her migration. When the real estate market collapsed under the weight of Wall Street speculation, she had to sell our house at a steep loss to avoid foreclosure and her budding career as a realtor evaporated. Her near–minimum wage jobs weren’t enough to cover her bills, so her credit card debts rose. She delayed retirement plans because she saw no path to breaking even anytime soon, though she was hopeful that a turnaround was on the horizon. Through the setbacks and detours, she drifted into the arms of the people and beliefs I held most responsible for her troubles.
  • With a fervor I knew was futile, I’d tell my mom she was missing the real conspiracy: The powerful people shaping policy to benefit their own interests, to maintain wealth and white predominance, through tax cuts and voter suppression, were commandeering her support solely by catering to her stance on the one issue she cared most about.
  • The voice my mom trusted most now was Trump’s. Our disagreements were no longer ideological to her but part of a celestial conflict.
  • “I love you but you have to be on the side of good,” she texted me. “Im sad cuz u have become part of the deep state. May God have mercy on you...I pray you will see the truth of the evil agenda and be on the side of Trump.”
  • She likened her fellow Patriots to the early Christians who spread the word of Jesus at the risk of persecution. She often sent me a meme with a caption about “ordinary people who spent countless hours researching, debating, meditating and praying” for the truth to be revealed to them. “Although they were mocked, dismissed and cast off, they knew their souls had agreed long ago to do this work.”
  • Last summer, as my mom marched in a pink MAGA hat amid maskless crowds, and armed extremists stalked racial justice protests, and a disputed election loomed like a time bomb, I entertained my darkest thoughts about the fate of our country. Was there any hope in a democracy without a shared set of basic facts? Had my elders fled one authoritarian regime only for their children to face another? Amid the gloom, I found only a single morsel of solace: My mom was as hopeful as she’d ever been.
  • I wish I could offer some evidence showing that the gulf between us might be narrowing, that my love, persistence, and collection of facts might be enough to draw her back into a reality we share, and that when our wager about the storm comes due in a few months, she’ll realize that the voices she trusts have been lying to her. But I don’t think that will happen
  • What can I do but try to limit the damage? Send my mom movie recommendations to occupy the free time she instead spends on conspiracy research. Shift our conversations to the common ground of cooking recipes and family gossip. Raise objections when her beliefs nudge her toward dangerous decisions.
  • I now understand our debates as marks of the very bond I thought was disintegrating. No matter how far she believes I’ve fallen into the deep state, how hard I fight for the forces of evil, how imminent the grand plan’s rapture, my mom will be there on the other side of the line putting in a good word for me with the angels and saints, trying to save me from damnation. And those are the two realities we live in. ●
Javier E

Accelerationism: how a fringe philosophy predicted the future we live in | World news | The Guardian - 1 views

  • Roger Zelazny published his third novel. In many ways, Lord of Light was of its time, shaggy with imported Hindu mythology and cosmic dialogue. Yet there were also glints of something more forward-looking and political.
  • accelerationism has gradually solidified from a fictional device into an actual intellectual movement: a new way of thinking about the contemporary world and its potential.
  • Accelerationists argue that technology, particularly computer technology, and capitalism, particularly the most aggressive, global variety, should be massively sped up and intensified – either because this is the best way forward for humanity, or because there is no alternative.
  • ...31 more annotations...
  • Accelerationists favour automation. They favour the further merging of the digital and the human. They often favour the deregulation of business, and drastically scaled-back government. They believe that people should stop deluding themselves that economic and technological progress can be controlled.
  • Accelerationism, therefore, goes against conservatism, traditional socialism, social democracy, environmentalism, protectionism, populism, nationalism, localism and all the other ideologies that have sought to moderate or reverse the already hugely disruptive, seemingly runaway pace of change in the modern world
  • Robin Mackay and Armen Avanessian edited #Accelerate: The Accelerationist Reader, a sometimes baffling, sometimes exhilarating book, published in 2014, which remains the only proper guide to the movement in existence.
  • “We all live in an operating system set up by the accelerating triad of war, capitalism and emergent AI,” says Steve Goodman, a British accelerationist
  • A century ago, the writers and artists of the Italian futurist movement fell in love with the machines of the industrial era and their apparent ability to invigorate society. Many futurists followed this fascination into war-mongering and fascism.
  • One of the central figures of accelerationism is the British philosopher Nick Land, who taught at Warwick University in the 1990s
  • Land has published prolifically on the internet, not always under his own name, about the supposed obsolescence of western democracy; he has also written approvingly about “human biodiversity” and “capitalistic human sorting” – the pseudoscientific idea, currently popular on the far right, that different races “naturally” fare differently in the modern world; and about the supposedly inevitable “disintegration of the human species” when artificial intelligence improves sufficiently.
  • In our politically febrile times, the impatient, intemperate, possibly revolutionary ideas of accelerationism feel relevant, or at least intriguing, as never before. Noys says: “Accelerationists always seem to have an answer. If capitalism is going fast, they say it needs to go faster. If capitalism hits a bump in the road, and slows down” – as it has since the 2008 financial crisis – “they say it needs to be kickstarted.”
  • On alt-right blogs, Land in particular has become a name to conjure with. Commenters have excitedly noted the connections between some of his ideas and the thinking of both the libertarian Silicon Valley billionaire Peter Thiel and Trump’s iconoclastic strategist Steve Bannon.
  • “In Silicon Valley,” says Fred Turner, a leading historian of America’s digital industries, “accelerationism is part of a whole movement which is saying, we don’t need [conventional] politics any more, we can get rid of ‘left’ and ‘right’, if we just get technology right. Accelerationism also fits with how electronic devices are marketed – the promise that, finally, they will help us leave the material world, all the mess of the physical, far behind.”
  • In 1972, the philosopher Gilles Deleuze and the psychoanalyst Félix Guattari published Anti-Oedipus. It was a restless, sprawling, appealingly ambiguous book, which suggested that, rather than simply oppose capitalism, the left should acknowledge its ability to liberate as well as oppress people, and should seek to strengthen these anarchic tendencies, “to go still further … in the movement of the market … to ‘accelerate the process’”.
  • By the early 90s Land had distilled his reading, which included Deleuze and Guattari and Lyotard, into a set of ideas and a writing style that, to his students at least, were visionary and thrillingly dangerous. Land wrote in 1992 that capitalism had never been properly unleashed, but instead had always been held back by politics, “the last great sentimental indulgence of mankind”. He dismissed Europe as a sclerotic, increasingly marginal place, “the racial trash-can of Asia”. And he saw civilisation everywhere accelerating towards an apocalypse: “Disorder must increase... Any [human] organisation is ... a mere ... detour in the inexorable death-flow.”
  • With the internet becoming part of everyday life for the first time, and capitalism seemingly triumphant after the collapse of communism in 1989, a belief that the future would be almost entirely shaped by computers and globalisation – the accelerated “movement of the market” that Deleuze and Guattari had called for two decades earlier – spread across British and American academia and politics during the 90s. The Warwick accelerationists were in the vanguard.
  • In the US, confident, rainbow-coloured magazines such as Wired promoted what became known as “the Californian ideology”: the optimistic claim that human potential would be unlocked everywhere by digital technology. In Britain, this optimism influenced New Labour
  • The Warwick accelerationists saw themselves as participants, not traditional academic observers
  • The CCRU gang formed reading groups and set up conferences and journals. They squeezed into the narrow CCRU room in the philosophy department and gave each other impromptu seminars.
  • The main result of the CCRU’s frantic, promiscuous research was a conveyor belt of cryptic articles, crammed with invented terms, sometimes speculative to the point of being fiction.
  • At Warwick, however, the prophecies were darker. “One of our motives,” says Plant, “was precisely to undermine the cheery utopianism of the 90s, much of which seemed very conservative” – an old-fashioned male desire for salvation through gadgets, in her view.
  • K-punk was written by Mark Fisher, formerly of the CCRU. The blog retained some Warwick traits, such as quoting reverently from Deleuze and Guattari, but it gradually shed the CCRU’s aggressive rhetoric and pro-capitalist politics for a more forgiving, more left-leaning take on modernity. Fisher increasingly felt that capitalism was a disappointment to accelerationists, with its cautious, entrenched corporations and endless cycles of essentially the same products. But he was also impatient with the left, which he thought was ignoring new technology
  • Nick Srnicek and Alex Williams co-wrote a Manifesto for an Accelerationist Politics. “Capitalism has begun to constrain the productive forces of technology,” they wrote. “[Our version of] accelerationism is the basic belief that these capacities can and should be let loose … repurposed towards common ends … towards an alternative modernity.”
  • What that “alternative modernity” might be was barely, but seductively, sketched out, with fleeting references to reduced working hours, to technology being used to reduce social conflict rather than exacerbate it, and to humanity moving “beyond the limitations of the earth and our own immediate bodily forms”. On politics and philosophy blogs from Britain to the US and Italy, the notion spread that Srnicek and Williams had founded a new political philosophy: “left accelerationism”.
  • Two years later, in 2015, they expanded the manifesto into a slightly more concrete book, Inventing the Future. It argued for an economy based as far as possible on automation, with the jobs, working hours and wages lost replaced by a universal basic income. The book attracted more attention than a speculative leftwing work had for years, with interest and praise from intellectually curious leftists
  • Even the thinking of the arch-accelerationist Nick Land, who is 55 now, may be slowing down. Since 2013, he has become a guru for the US-based far-right movement neoreaction, or NRx as it often calls itself. Neoreactionaries believe in the replacement of modern nation-states, democracy and government bureaucracies by authoritarian city states, which on neoreaction blogs sound as much like idealised medieval kingdoms as they do modern enclaves such as Singapore.
  • Land argues now that neoreaction, like Trump and Brexit, is something that accelerationists should support, in order to hasten the end of the status quo.
  • In 1970, the American writer Alvin Toffler, an exponent of accelerationism’s more playful intellectual cousin, futurology, published Future Shock, a book about the possibilities and dangers of new technology. Toffler predicted the imminent arrival of artificial intelligence, cryonics, cloning and robots working behind airline check-in desks
  • Land left Britain. He moved to Taiwan “early in the new millennium”, he told me, then to Shanghai “a couple of years later”. He still lives there now.
  • In a 2004 article for the Shanghai Star, an English-language paper, he described the modern Chinese fusion of Marxism and capitalism as “the greatest political engine of social and economic development the world has ever known”
  • Once he lived there, Land told me, he realised that “to a massive degree” China was already an accelerationist society: fixated by the future and changing at speed. Presented with the sweeping projects of the Chinese state, his previous, libertarian contempt for the capabilities of governments fell away
  • Without a dynamic capitalism to feed off, as Deleuze and Guattari had in the early 70s, and the Warwick philosophers had in the 90s, it may be that accelerationism just races up blind alleys. In his 2014 book about the movement, Malign Velocities, Benjamin Noys accuses it of offering “false” solutions to current technological and economic dilemmas. With accelerationism, he writes, a breakthrough to a better future is “always promised and always just out of reach”.
  • “The pace of change accelerates,” concluded a documentary version of the book, with a slightly hammy voiceover by Orson Welles. “We are living through one of the greatest revolutions in history – the birth of a new civilisation.”
  • Shortly afterwards, the 1973 oil crisis struck. World capitalism did not accelerate again for almost a decade. For much of the “new civilisation” Toffler promised, we are still waiting
katherineharron

Shouting into the apocalypse: The decade in climate change (opinion) - CNN - 0 views

  • What's that worn-out phrase? Shouting into the wind? Well, after a decade of rising pollution, failed politics and worsening disasters, it seems the many, many of us who care about the climate crisis increasingly are shouting into the hurricane, if not the apocalypse.
  • On the cusp of 2020, the state of the planet is far more dire than in 2010. Preserving a safe and healthy ecological system is no longer a realistic possibility. Now, we're looking at less bad options, conceding that the virtual end of coral reefs, the drowning of some island nations, the worsening of already-devastating storms and the displacement of millions all seem close to inevitable. The climate crisis is already costly, deadly and deeply unjust, putting at terrible risk the most vulnerable people in the world, who have often done the least to cause it.
  • There are two numbers you need to understand to put this moment in perspective. The first is 1.5. The Paris Agreement -- the international treaty on climate change, which admittedly is in trouble, but also is the best thing we've got -- sets the goal of limiting warming to 1.5 or, at most, below 2 degrees Celsius.
  • ...6 more annotations...
  • Worldwide fossil fuel emissions are expected to be up 0.6% in 2019 over 2018, according to projections from the Global Carbon Project. In the past decade, humans have put more than 350 metric gigatons of carbon dioxide into the atmosphere from burning fossil fuels and other industrial processes, according to calculations provided by the World Resources Institute.
  • Meanwhile, scientists are becoming even more concerned about tipping points in the climate system that could lead to rapid rise in sea levels, the deterioration of the Amazon and so on. One particularly frightening commentary last month in the journal Nature, by several notable climate scientists, says the odds we can avoid tipping points in the climate system "could already have shrunk towards zero." In non-science-speak: We're there now.
  • This was the decade when some people finally started to see the climate crisis as personal. Climate attribution science, which looks for human fingerprints on extreme weather events, made its way into the popular imagination. We're starting to realize there are no truly "natural" disasters anymore. We've warmed the climate, and we're already making storms riskier.
  • The news media is picking that up, using terms such as "climate emergency" and "climate crisis" instead of the blander "climate change." Increasingly, lots of people are making these critical connections, which should motivate the political, social and economic revolution necessary to fix things.
  • Only 52% of American adults say they are "very" or "extremely" sure global warming is happening, according to a report from the Yale Program on Climate Change Communication and the George Mason University Center for Climate Change Communication, which is based on a 1,303-person survey conducted in November 2019. Yale's been asking that question for a while now. Go back a decade, to 2009, and the rate is about the same: 51%.
  • The bright spot -- and it truly is a bright one -- is that young people are waking up. They are shouting, loudly and with purpose. Witness Greta Thunberg, the dynamic teenager who started a one-girl protest outside the Swedish Parliament last year, demanding that adults take seriously this emergency, which threatens young people and future generations disproportionately.
Javier E

The Philosopher Redefining Equality | The New Yorker - 0 views

  • The bank experience showed how you could be oppressed by hierarchy, working in an environment where you were neither free nor equal. But this implied that freedom and equality were bound together in some way beyond the basic state of being unenslaved, which was an unorthodox notion. Much social thought is rooted in the idea of a conflict between the two.
  • If individuals exercise freedoms, conservatives like to say, some inequalities will naturally result. Those on the left basically agree—and thus allow constraints on personal freedom in order to reduce inequality. The philosopher Isaiah Berlin called the opposition between equality and freedom an “intrinsic, irremovable element in human life.” It is our fate as a society, he believed, to haggle toward a balance between them.
  • What if they weren’t opposed, Anderson wondered, but, like the sugar-phosphate chains in DNA, interlaced in a structure that we might not yet understand?
  • ...54 more annotations...
  • At fifty-nine, Anderson is the chair of the University of Michigan’s department of philosophy and a champion of the view that equality and freedom are mutually dependent, enmeshed in changing conditions through time.
  • She has built a case, elaborated across decades, that equality is the basis for a free society
  • Because she brings together ideas from both the left and the right to battle increasing inequality, Anderson may be the philosopher best suited to this awkward moment in American life. She builds a democratic frame for a society in which people come from different places and are predisposed to disagree.
  • she sketched out the entry-level idea that one basic way to expand equality is by expanding the range of valued fields within a society.
  • The ability not to have an identity that one carries from sphere to sphere but, rather, to be able to slip in and adopt whatever values and norms are appropriate while retaining one’s identities in other domains?” She paused. “That is what it is to be free.”
  • How do you move from a basic model of egalitarian variety, in which everybody gets a crack at being a star at something, to figuring out how to respond to a complex one, where people, with different allotments of talent and virtue, get unequal starts, and often meet with different constraints along the way?
  • The problem, she proposed, was that contemporary egalitarian thinkers had grown fixated on distribution: moving resources from lucky-seeming people to unlucky-seeming people, as if trying to spread the luck around.
  • Egalitarians should agree about clear cases of blameless misfortune: the quadriplegic child, the cognitively impaired adult, the teen-ager born into poverty with junkie parents. But Anderson balked there, too. By categorizing people as lucky or unlucky, she argued, these egalitarians set up a moralizing hierarchy.
  • In Anderson’s view, the way forward was to shift from distributive equality to what she called relational, or democratic, equality: meeting as equals, regardless of where you were coming from or going to.
  • By letting the lucky class go on reaping the market’s chancy rewards while asking others to concede inferior status in order to receive a drip-drip-drip of redistributive aid, these egalitarians were actually entrenching people’s status as superior or subordinate.
  • she imagined some citizens getting a state check and a bureaucratic letter:
  • To the ugly and socially awkward: . . . Maybe you won’t be such a loser in love once potential dates see how rich you are.
  • To the stupid and untalented: Unfortunately, other people don’t value what little you have to offer in the system of production. . . . Because of the misfortune that you were born so poorly endowed with talents, we productive ones will make it up to you: we’ll let you share in the bounty of what we have produced with our vastly superior and highly valued abilities. . . .
  • This was, at heart, an exercise of freedom. The trouble was that many people, picking up on libertarian misconceptions, thought of freedom only in the frame of their own actions.
  • To be truly free, in Anderson’s assessment, members of a society had to be able to function as human beings (requiring food, shelter, medical care), to participate in production (education, fair-value pay, entrepreneurial opportunity), to execute their role as citizens (freedom to speak and to vote), and to move through civil society (parks, restaurants, workplaces, markets, and all the rest).
  • Anderson’s democratic model shifted the remit of egalitarianism from the idea of equalizing wealth to the idea that people should be equally free, regardless of their differences.
  • A society in which everyone had the same material benefits could still be unequal, in this crucial sense; democratic equality, being predicated on equal respect, wasn’t something you could simply tax into existence. “People, not nature, are responsible for turning the natural diversity of human beings into oppressive hierarchies,”
  • Her first book, “Value in Ethics and Economics,” appeared that year, announcing one of her major projects: reconciling value (an amorphous ascription of worth that is a keystone of ethics and economics) with pluralism (the fact that people seem to value things in different ways).
  • Philosophers have often assumed that pluralistic value reflects human fuzziness—we’re loose, we’re confused, and we mix rational thought with sentimental responses.
  • She offered an “expressive” theory: in her view, each person’s values could be various because they were socially expressed, and thus shaped by the range of contexts and relationships at play in a life. Instead of positing value as a basic, abstract quality across society (the way “utility” functioned for economists), she saw value as something determined by the details of an individual’s history.
  • Like her idea of relational equality, this model resisted the temptation to flatten human variety toward a unifying standard. In doing so, it helped expand the realm of free and reasoned economic choice.
  • Anderson’s model unseated the premises of rational-choice theory, in which individuals invariably make utility-maximizing decisions, occasionally in heartless-seeming ways. It ran with, rather than against, moral intuition. Because values were plural, it was perfectly rational to choose to spend evenings with your family, say, and have guilt toward the people you left in the lurch at work.
  • The theory also pointed out the limits on free-market ideologies, such as libertarianism.
  • In ethics, it broke across old factional debates. The core idea “has been picked up on by people across quite a range of positions,” Peter Railton, one of Anderson’s longtime colleagues, says. “Kantians and consequentialists alike”—people who viewed morality in terms of duties and obligations, and those who measured the morality of actions by their effects in the world—“could look at it and see something important.”
  • Traditionally, the discipline is taught through a-priori thought—you start with basic principles and reason forward. Anderson, by contrast, sought to work empirically, using information gathered from the world, identifying problems to be solved not abstractly but through the experienced problems of real people.
  • “Dewey argued that the primary problems for ethics in the modern world concerned the ways society ought to be organized, rather than personal decisions of the individual,”
  • In 2004, the Stanford Encyclopedia of Philosophy asked Anderson to compose its entry on the moral philosophy of John Dewey, who helped carry pragmatist methods into the social realm. Dewey had an idea of democracy as a system of good habits that began in civil life. He was an anti-ideologue with an eye for pluralism.
  • She started working with historians, trying to hone her understanding of ideas by studying them in the context of their creation. Take Rousseau’s apparent support of direct democracy. It’s rarely mentioned that, at the moment when he made that argument, his home town of Geneva had been taken over by oligarchs who claimed to represent the public. Pragmatism said that an idea was an instrument, which naturally gave rise to such questions as: an instrument for what, and where, and when?
  • In “What Is the Point of Equality?,” Anderson had already started to drift away from what philosophers, following Rawls, call ideal theory, based on an end vision for a perfectly just society. As Anderson began a serious study of race in America, though, she found herself losing faith in that approach entirely.
  • Broadly, there’s a culturally right and a culturally left ideal theory for race and society. The rightist version calls for color blindness. Instead of making a fuss about skin and ethnicity, its advocates say, society should treat people as people, and let the best and the hardest working rise.
  • The leftist theory envisions identity communities: for once, give black people (or women, or members of other historically oppressed groups) the resources and opportunities they need, including, if they want it, civil infrastructure for themselves.
  • In “The Imperative of Integration,” published in 2010, Anderson tore apart both of these models. Sure, it might be nice to live in a color-blind society, she wrote, but that’s nothing like the one that exists.
  • But the case for self-segregation was also weak. Affinity groups provided welcome comfort, yet that wasn’t the same as power or equality, Anderson pointed out. And there was a goose-and-gander problem. Either you let only certain groups self-segregate (certifying their subordinate status) or you also permit, say, white men to do it.
  • Anderson’s solution was “integration,” a concept that, especially in progressive circles, had been uncool since the late sixties. Integration, by her lights, meant mixing on the basis of equality.
  • in attending to these empirical findings over doctrine, she announced herself as a non-ideal theorist: a philosopher with no end vision of society. The approach recalls E. L. Doctorow’s description of driving at night: “You can see only as far as the headlights, but you can make the whole trip that way.”
  • For others, though, a white woman making recommendations on race policy raised questions of perspective. She was engaging through a mostly white Anglo-American tradition. She worked from the premise that, because she drew on folders full of studies, the limits of her own perspective were not constraining.
  • Some philosophers of color welcomed the book. “She’s taking the need for racial justice seriously, and you could hardly find another white political philosopher over a period of decades doing that,”
  • Recently, Anderson changed the way she assigns undergraduate essays: instead of requiring students to argue a position and fend off objections, doubling down on their original beliefs, she asks them to discuss their position with someone who disagrees, and to explain how and why, if at all, the discussion changed their views.
  • The challenge of pluralism is the challenge of modern society: maintaining equality amid difference in a culture given to constant and unpredictable change.
  • Rather than fighting for the ascendancy of certain positions, Anderson suggests, citizens should fight to bolster healthy institutions and systems—those which insure that all views and experiences will be heard. Today’s righteous projects, after all, will inevitably seem fatuous and blinkered from the vantage of another age.
  • Anderson zeroed in on Adam Smith, whose “The Wealth of Nations,” published in 1776, is taken as a keystone of free-market ideology. At the time, English labor was subject to uncompensated apprenticeships, domestic servitude, and some measure of clerical dominion.
  • Smith saw the markets as an escape from that order. Their “most important” function, he explained, was to bring “liberty and security” to those “who had before lived almost in a continual state of war with their neighbours, and of servile dependency upon their superiors.”
  • Smith, in other words, was an egalitarian. He had written “The Wealth of Nations” in no small part to be a solution to what we’d now call structural inequality—the intractable, compounding privileges of an arbitrary hierarchy.
  • It was a historical irony that, a century later, writers such as Marx pointed to the market as a structure of dominion over workers; in truth, Smith and Marx had shared a socioeconomic project. And yet Marx had not been wrong to trash Smith’s ideas, because, during the time between them, the world around Smith’s model had changed, and it was no longer a useful tool.
  • Images of free market society that made sense prior to the Industrial Revolution continue to circulate today as ideals, blind to the gross mismatch between the background social assumptions reigning in the seventeenth and eighteenth centuries, and today’s institutional realities. We are told that our choice is between free markets and state control, when most adults live their working lives under a third thing entirely: private government.
  • Today, people still try to use, variously, both Smith’s and Marx’s tools on a different, postindustrial world.
  • The unnaturalness of this top-heavy arrangement, combined with growing evidence of power abuses, has given many people reason to believe that something is fishy about the structure of American equality. Socialist and anti-capitalist models are again in vogue.
  • Anderson offers a different corrective path. She thinks it’s fine for some people to earn more than others. If you’re a brilliant potter, and people want to pay you more than the next guy for your pottery, great!
  • The problem isn’t that talent and income are distributed in unequal parcels. The problem is that Jeff Bezos earns more than a hundred thousand dollars a minute, while Amazon warehouse employees, many talented and hardworking, have reportedly resorted to urinating in bottles in lieu of a bathroom break. That circumstance reflects some structure of hierarchical oppression. It is a rip in the democratic fabric, and it’s increasingly the norm.
  • Andersonism holds that we don’t have to give up on market society if we can recognize and correct for its limitations—it may even be our best hope, because it’s friendlier to pluralism than most alternatives are.
  • we must be flexible. We must remain alert. We must solve problems collaboratively, in the moment, using society’s ears and eyes and the best tools that we can find.
  • “You can see that, from about 1950 to 1970, the typical American’s wages kept up with productivity growth,” she said. Then, around 1974, she went on, hourly compensation stagnated. American wages have been effectively flat for the past few decades, with the gains of productivity increasingly going to shareholders and to salaries for big bosses.
  • What changed? Anderson rattled off a constellation of factors, from strengthened intellectual-property law to winnowed antitrust law. Financialization, deregulation. Plummeting taxes on capital alongside rising payroll taxes. Privatization, which exchanged modest public-sector salaries for C.E.O. paydays. She gazed into the audience and blinked. “So now we have to ask: What has been used to justify this rather dramatic shift of labor-share of income?”
  • It was no wonder that industrial-age thinking was riddled with contradictions: it reflected what Anderson called “the plutocratic reversal” of classical liberal ideas. Those perversely reversed ideas about freedom were the ones that found a home in U.S. policy, and, well, here we were.
Javier E

The Adams Principle ❧ Current Affairs - 0 views

  • This type of glib quasi-logic works really well in comedy, especially in a format where space is restricted, and where the quick, disposable nature of the strip limits your ability to draw humor from character and plot. You take an idea, find a way to subvert or deconstruct it, and you get an absurd result.
  • while the idea of a “cubicle job” can seem to younger readers like relative bliss, they were (and are) still an emblem of boredom and absurdity, a sign that life was being slowly colonized by gray shapes and Powerpoint slides. Throughout his classic-era work, Adams hits on the feeling that the world has been made unnatural, unconducive to life; materially adequate, but spiritually exhausting. 
  • He makes constant use of something I’m going to call, for want of a better term, the sophoid: something which has the outer semblance of wisdom, but none of the substance; something that sounds weighty if you say it confidently enough, yet can be easily thrown away as “just a thought” if it won’t hold up to scrutiny.
  • ...10 more annotations...
  • Adams did not just stick to comics: he is the author of over a dozen books (not counting the comic compendiums), which advise and analyze not only on surviving the office but also on daily life, future technology trends, romance, self-help strategy, and more. 
  • In his earlier books, you can feel the weight of the 1990s pressing down on his work, flattening and numbing its potency; this was the period that social scientist Francis Fukuyama dubbed “the end of history”, when the Cold War had ended, the West had won, 9/11 was just two numbers, and there were no grand missions left, no worlds left to conquer. While for millions of people, both in the United States and abroad, life was still chaotic and miserable, a lot of people found themselves living lives that were under no great immediate threat: without bombs or fascism or the threat of eviction to worry about, there was nothing left to do but to go to the office and enjoy fast-casual dining and Big Gulps, just as the Founding Fathers envisioned.
  • This dull but steady life produced a sense of slow-burn anxiety prominent in much of the pop culture of the time, as can be seen in movies such as Office Space, Fight Club and The Matrix, movies which cooed to their audience: there’s got to be more to life than this, right?
  • Beware: as I’m pretty sure Nietzsche said, when you gaze into Dilbert, eventually Dilbert gazes back into you.
  • for someone who satirizes business bullshit, Adams is a person who seems to have bought into much of it wholeheartedly; when he explains his approach to life he tends to speak in LinkedIn truisms, expounding on his “skill stacks” and “maximizing [his] personal energy”. (You can read more about this in his career advice book, How to Fail at Almost Everything and Still Win Big.)
  • Following his non-Dilbert career more carefully, you can see that at every stage of his career, he’s actually quite heavily invested in the bullshit he makes fun of every day, or at least some aspects of it: he possesses an MBA from UC Berkeley, and has launched or otherwise been involved in a significant number of business ventures, most amusingly a health food wrap called the “Dilberito”.
  • In the past few years, Adams has gained some notoriety as a Trump supporter; having slowly moved from “vaguely all-over-the-place centrist who has some odd thoughts and thinks some aspects of Trump are impressive” to full-on MAGA guy, even writing a book called Win Bigly praising Trump’s abilities as a “master persuader”.
  • this is a guy who hates drab corporatespeak but loves the ideology behind it, a guy who describes the vast powerlessness of life but believes you can change it by writing some words on a napkin. That blend of rebellion against the symptoms of post-Cold War society and sworn allegiance to its machinations couldn’t lead anywhere else but to Trump, a man who rails against ‘elites’ while allowing them to run the country into the ground.
  • In Dilbert the Pointy-haired Boss uses this type of thinking to evil ends, in the tradition of Catch-22 and other satires of systemic brutality, but the relatable characters use it to their advantage too—by using intellectual sleight of hand with the boss to justify doing less work, or by finding clever ways to look busy when they’re not, or to avoid people who are unpleasant to be around.
  • I just think Adams is a guy who spent so long in the world of slick aphorisms and comic-strip logic that it eventually ate into his brain, became his entire manner of thinking
Javier E

The Age of Social Media Is Ending - The Atlantic - 0 views

  • Slowly and without fanfare, around the end of the aughts, social media took its place. The change was almost invisible, but it had enormous consequences. Instead of facilitating the modest use of existing connections—largely for offline life (to organize a birthday party, say)—social software turned those connections into a latent broadcast channel. All at once, billions of people saw themselves as celebrities, pundits, and tastemakers.
  • A global broadcast network where anyone can say anything to anyone else as often as possible, and where such people have come to think they deserve such a capacity, or even that withholding it amounts to censorship or suppression—that’s just a terrible idea from the outset. And it’s a terrible idea that is entirely and completely bound up with the concept of social media itself: systems erected and used exclusively to deliver an endless stream of content.
  • “social media,” a name so familiar that it has ceased to bear meaning. But two decades ago, that term didn’t exist
  • ...35 more annotations...
  • a “web 2.0” revolution in “user-generated content,” offering easy-to-use, easily adopted tools on websites and then mobile apps. They were built for creating and sharing “content,”
  • As the original name suggested, social networking involved connecting, not publishing. By connecting your personal network of trusted contacts (or “strong ties,” as sociologists call them) to others’ such networks (via “weak ties”), you could surface a larger network of trusted contacts
  • The whole idea of social networks was networking: building or deepening relationships, mostly with people you knew. How and why that deepening happened was largely left to the users to decide.
  • That changed when social networking became social media around 2009, between the introduction of the smartphone and the launch of Instagram. Instead of connection—forging latent ties to people and organizations we would mostly ignore—social media offered platforms through which people could publish content as widely as possible, well beyond their networks of immediate contacts.
  • Social media turned you, me, and everyone into broadcasters (if aspirational ones). The results have been disastrous but also highly pleasurable, not to mention massively profitable—a catastrophic combination.
  • A social network is an idle, inactive system—a Rolodex of contacts, a notebook of sales targets, a yearbook of possible soul mates. But social media is active—hyperactive, really—spewing material across those networks instead of leaving them alone until needed.
  • The authors propose social media as a system in which users participate in “information exchange.” The network, which had previously been used to establish and maintain relationships, becomes reinterpreted as a channel through which to broadcast.
  • The toxicity of social media makes it easy to forget how truly magical this innovation felt when it was new. From 2004 to 2009, you could join Facebook and everyone you’d ever known—including people you’d definitely lost track of—was right there, ready to connect or reconnect. The posts and photos I saw characterized my friends’ changing lives, not the conspiracy theories that their unhinged friends had shared with them
  • Twitter, which launched in 2006, was probably the first true social-media site, even if nobody called it that at the time. Instead of focusing on connecting people, the site amounted to a giant, asynchronous chat room for the world. Twitter was for talking to everyone—which is perhaps one of the reasons journalists have flocked to it
  • on Twitter, anything anybody posted could be seen instantly by anyone else. And furthermore, unlike posts on blogs or images on Flickr or videos on YouTube, tweets were short and low-effort, making it easy to post many of them a week or even a day.
  • soon enough, all social networks became social media first and foremost. When groups, pages, and the News Feed launched, Facebook began encouraging users to share content published by others in order to increase engagement on the service, rather than to provide updates to friends. LinkedIn launched a program to publish content across the platform, too. Twitter, already principally a publishing platform, added a dedicated “retweet” feature, making it far easier to spread content virally across user networks.
  • When we look back at this moment, social media had already arrived in spirit if not by name. RSS readers offered a feed of blog posts to catch up on, complete with unread counts. MySpace fused music and chatter; YouTube did it with video (“Broadcast Yourself”)
  • From being asked to review every product you buy to believing that every tweet or Instagram image warrants likes or comments or follows, social media produced a positively unhinged, sociopathic rendition of human sociality.
  • Other services arrived or evolved in this vein, among them Reddit, Snapchat, and WhatsApp, all far more popular than Twitter. Social networks, once latent routes for possible contact, became superhighways of constant content
  • Although you can connect the app to your contacts and follow specific users, on TikTok, you are more likely to simply plug into a continuous flow of video content that has oozed to the surface via algorithm.
  • In the social-networking era, the connections were essential, driving both content creation and consumption. But the social-media era seeks the thinnest, most soluble connections possible, just enough to allow the content to flow.
  • This is also why journalists became so dependent on Twitter: It’s a constant stream of sources, events, and reactions—a reporting automat, not to mention an outbound vector for media tastemakers to make tastes.
  • “influencer” became an aspirational role, especially for young people for whom Instagram fame seemed more achievable than traditional celebrity—or perhaps employment of any kind.
  • social-media operators discovered that the more emotionally charged the content, the better it spread across its users’ networks. Polarizing, offensive, or just plain fraudulent information was optimized for distribution. By the time the platforms realized and the public revolted, it was too late to turn off these feedback loops.
  • The ensuing disaster was multipart.
  • Rounding up friends or business contacts into a pen in your online profile for possible future use was never a healthy way to understand social relationships.
  • when social networking evolved into social media, user expectations escalated. Driven by venture capitalists’ expectations and then Wall Street’s demands, the tech companies—Google and Facebook and all the rest—became addicted to massive scale
  • Social media showed that everyone has the potential to reach a massive audience at low cost and high gain—and that potential gave many people the impression that they deserve such an audience.
  • On social media, everyone believes that anyone to whom they have access owes them an audience: a writer who posted a take, a celebrity who announced a project, a pretty girl just trying to live her life, that anon who said something afflictive
  • When network connections become activated for any reason or no reason, then every connection seems worthy of traversing.
  • people just aren’t meant to talk to one another this much. They shouldn’t have that much to say, they shouldn’t expect to receive such a large audience for that expression, and they shouldn’t suppose a right to comment or rejoinder for every thought or notion either.
  • Facebook and all the rest enjoyed a massive rise in engagement and the associated data-driven advertising profits that the attention-driven content economy created. The same phenomenon also created the influencer economy, in which individual social-media users became valuable as channels for distributing marketing messages or product sponsorships by means of their posts’ real or imagined reach
  • That’s no surprise, I guess, given that the model was forged in the fires of Big Tech companies such as Facebook, where sociopathy is a design philosophy.
  • If change is possible, carrying it out will be difficult, because we have adapted our lives to conform to social media’s pleasures and torments. It’s seemingly as hard to give up on social media as it was to give up smoking en masse
  • Quitting that habit took decades of regulatory intervention, public-relations campaigning, social shaming, and aesthetic shifts. At a cultural level, we didn’t stop smoking just because the habit was unpleasant or uncool or even because it might kill us. We did so slowly and over time, by forcing social life to suffocate the practice. That process must now begin in earnest for social media.
  • Something may yet survive the fire that would burn it down: social networks, the services’ overlooked, molten core. It was never a terrible idea, at least, to use computers to connect to one another on occasion, for justified reasons, and in moderation
  • The problem came from doing so all the time, as a lifestyle, an aspiration, an obsession. The offer was always too good to be true, but it’s taken us two decades to realize the Faustian nature of the bargain.
  • when I first wrote about downscale, the ambition seemed necessary but impossible. It still feels unlikely—but perhaps newly plausible.
  • To win the soul of social life, we must learn to muzzle it again, across the globe, among billions of people. To speak less, to fewer people and less often, and for them to do the same to you, and everyone else as well.
  • We cannot make social media good, because it is fundamentally bad, deep in its very structure. All we can do is hope that it withers away, and play our small part in helping abandon it.
Javier E

Musk, SBF, and the Myth of Smug, Castle-Building Nerds - 0 views

  • Experts in content moderation suggested that Musk’s actual policies lacked any coherence and, if implemented, would have all kinds of unintended consequences. That has happened with verification. Almost every decision he makes is an unforced error made with extreme confidence in front of a growing audience of people who already know he has messed up, and is supported by a network of sycophants and blind followers who refuse to see or tell him that he’s messing up. The dynamic is … very Trumpy!
  • As with the former president, it can be hard at times for people to believe or accept that our systems are so broken that a guy who is clearly this inept can also be put in charge of something so important. A common pundit claim before Donald Trump got into the White House was that the gravity of the job and prestige of the office might humble or chasten him.
  • The same seems true for Musk. Even people skeptical of Musk’s behavior pointed to his past companies as predictors of future success. He’s rich. He does smart-people stuff. The rockets land pointy-side up!
  • ...18 more annotations...
  • Time and again, we learned there was never a grand plan or big ideas—just weapons-grade ego, incompetence, thin skin, and prejudice against those who don’t revere him.
  • Despite all the incredible, damning reporting coming out of Twitter and all of Musk’s very public mistakes, many people still refuse to believe—even if they detest him—that he is simply incompetent.
  • What is amazing about the current moment is that, despite how ridiculous it all feels, a fundamental tenet of reality and logic appears to be holding true: If you don’t know what you’re doing or don’t really care, you’ll run the thing you’re in charge of into the ground, and people will notice.
  • And so the moment feels too dumb and too on the nose to be real and yet also very real—kind of like all of reality in 2022.
  • I don’t really know where any of this will lead, but one interesting possibility is that Musk gets increasingly reactionary and trollish in his politics and stewardship of Twitter.
  • Leaving the politics aside, from a basic customer-service standpoint this is generally an ill-advised way for the owner of a company to treat an elected official when that elected official wishes to know why your service has failed them. The reason it is ill-advised is because then the elected official could tweet something like what Senator Markey tweeted on Sunday: “One of your companies is under an FTC consent decree. Auto safety watchdog NHTSA is investigating another for killing people. And you’re spending your time picking fights online. Fix your companies. Or Congress will.”
  • It seems clear that Musk, like any dedicated social-media poster, thrives on validation, so it makes sense that, as he continues to dismantle his own mystique as an innovator, he might look for adoration elsewhere
  • Recent history has shown that, for a specific audience, owning the libs frees a person from having to care about the competence or outcomes of their actions. Just anger the right people and you’re good, even if you’re terrible at your job. This won’t help Twitter’s financial situation, which seems bleak, but it’s … something!
  • Bankman-Fried, the archetype, appealed to people for all kinds of reasons. His narrative as a philanthropist, and a smart rationalist, and a stone-cold weirdo was something people wanted to buy into because, generally, people love weirdos who don’t conform to systems and then find clever ways to work around them and become wildly successful as a result.
  • Bankman-Fried was a way that a lot of people could access and maybe obliquely understand what was going on in crypto. They may not have understood what FTX did, but they could grasp a nerd trying to leverage a system in order to do good in the world and advance progressive politics. In that sense, Bankman-Fried is easy to root for and exciting to cover. His origin story and narrative become more important than the particulars of what he may or may not be doing.
  • the past few weeks have been yet another reminder that the smug-nerd-genius narrative may sell magazines, and it certainly raises venture funding, but the visionary founder is, first and foremost, a marketing product, not a reality. It’s a myth that perpetuates itself. Once branded a visionary, the founder can use the narrative to raise money and generate a formidable net worth, and then the financial success becomes its own résumé. But none of it is real.
  • Adversarial journalism ideally questions and probes power. If it is trained on technology companies and their founders, it is because they either wield that power or have the potential to do so. It is, perhaps unintuitively, a form of respect for their influence and potential to disrupt. But that’s not what these founders want.
  • even if all tech coverage had been totally flawless, Silicon Valley would have rejected adversarial tech journalism because most of its players do not actually want the responsibility that comes with their potential power. They want only to embody the myth and reap the benefits. They want the narrative, which is focused on origins, ambitions, ethos, and marketing, and less on the externalities and outcomes.
  • Looking at Musk and Bankman-Fried, it would appear that the tech visionaries mostly get their way. For all the complaints of awful, negative coverage and biased reporting, people still want to cheer for and give money to the “smug nerds building castles in the sky.” Though they vary wildly right now in magnitude, their wounds are self-inflicted—and, perhaps, the result of believing their own hype.
  • That’s because, almost always, the smug-nerd-genius narrative is a trap. It’s one that people fall into because they need to believe that somebody out there is so brilliant, they can see the future, or that they have some greater, more holistic understanding of the world (or that such an understanding is possible)
  • It’s not unlike a conspiracy theory in that way. The smug-nerd-genius narrative helps take the complexity of the world and make it more manageable.
  • Putting your faith in a space billionaire or a crypto wunderkind isn’t just sad fanboydom; it is also a way for people to outsource their brain to somebody else who, they believe, can see what they can’t
  • the smug nerd genius is exceedingly rare, and, even when they’re not outed as a fraud or a dilettante, they can be assholes or flawed like anyone else. There aren’t shortcuts for making sense of the world, and anyone who is selling themselves that way or buying into that narrative about them should read to us as a giant red flag.
Javier E

A Commencement Address Too Honest to Deliver in Person - The Atlantic - 0 views

  • Use this hiatus to do something you would never have done if this emergency hadn’t hit. When the lockdown lifts, move to another state or country. Take some job that never would have made sense if you were worrying about building a career—bartender, handyman, AmeriCorps volunteer.
  • If you use the next two years as a random hiatus, you may not wind up richer, but you’ll wind up more interesting.
  • The biggest way most colleges fail is this: They don’t plant the intellectual and moral seeds students are going to need later, when they get hit by the vicissitudes of life.
  • ...13 more annotations...
  • If you didn’t study Jane Austen while you were here, you probably lack the capacity to think clearly about making a marriage decision. If you didn’t read George Eliot, then you missed a master class on how to judge people’s character. If you didn’t read Nietzsche, you are probably unprepared to handle the complexities of atheism—and if you didn’t read Augustine and Kierkegaard, you’re probably unprepared to handle the complexities of faith.
  • The list goes on. If you didn’t read de Tocqueville, you probably don’t understand your own country. If you didn’t study Gibbon, you probably lack the vocabulary to describe the rise and fall of cultures and nations.
  • The wisdom of the ages is your inheritance; it can make your life easier. These resources often fail to get shared because universities are too careerist, or because faculty members are more interested in their academic specialties or politics than in teaching undergraduates, or because of a host of other reasons
  • What are you putting into your mind? Our culture spends a lot less time worrying about this, and when it does, it goes about it all wrong.
  • my worry is that, especially now that you’re out of college, you won’t put enough really excellent stuff into your brain.
  • I worry that it’s possible to grow up now not even aware that those upper registers of human feeling and thought exist.
  • The theory of maximum taste says that each person’s mind is defined by its upper limit—the best that it habitually consumes and is capable of consuming.
  • After college, most of us resolve to keep doing this kind of thing, but we’re busy and our brains are tired at the end of the day. Months and years go by. We get caught up in stuff, settle for consuming Twitter and, frankly, journalism. Our maximum taste shrinks.
  • I’m worried about the future of your maximum taste. People in my and earlier generations, at least those lucky enough to get a college education, got some exposure to the classics, which lit a fire that gets rekindled every time we sit down to read something really excellent.
  • the “theory of maximum taste.” This theory is based on the idea that exposure to genius has the power to expand your consciousness. If you spend a lot of time with genius, your mind will end up bigger and broader than if you spend your time only with run-of-the-mill stuff.
  • the whole culture is eroding the skill the UCLA scholar Maryanne Wolf calls “deep literacy,” the ability to deeply engage in a dialectical way with a text or piece of philosophy, literature, or art.
  • Or as the neurologist Richard Cytowic put it to Adam Garfinkle, “To the extent that you cannot perceive the world in its fullness, to the same extent you will fall back into mindless, repetitive, self-reinforcing behavior, unable to escape.”
  • I can’t say that to you, because it sounds fussy and elitist and OK Boomer. And if you were in front of me, you’d roll your eyes.
Javier E

Opinion | The Question of Transgender Care - The New York Times - 0 views

  • Doctors and researchers have proposed various theories to try to explain these trends. One is that greater social acceptance of trans people has enabled people to seek these therapies. Another is that teenagers are being influenced by the popularity of searching and experimenting around identity. A third is that the rise of teen mental health issues may be contributing to gender dysphoria.
  • Some activists and medical practitioners on the left have come to see the surge in requests for medical transitioning as a piece of the new civil rights issue of our time — offering recognition to people of all gender identities.
  • Transition through medical interventions was embraced by providers in the United States and Europe after a pair of small Dutch studies showed that such treatment improved patients’ well-being
  • ...11 more annotations...
  • a 2022 Reuters investigation found that some American clinics were quite aggressive with treatment: None of the 18 U.S. clinics that Reuters looked at performed long assessments on their patients, and some prescribed puberty blockers on the first visit.
  • As Cass writes in her report, “The toxicity of the debate is exceptional.” She continues, “There are few other areas of health care where professionals are so afraid to openly discuss their views, where people are vilified on social media and where name-calling echoes the worst bullying behavior.”
  • The report’s greatest strength is its epistemic humility. Cass is continually asking, “What do we really know?” She is carefully examining the various studies — which are high quality, which are not. She is down in the academic weeds.
  • She notes that the quality of the research in this field is poor. The current treatments are “built on shaky foundations,” she writes in The BMJ. Practitioners have raced ahead with therapies when we don’t know what the effects will be. As Cass tells The BMJ, “I can’t think of another area of pediatric care where we give young people a potentially irreversible treatment and have no idea what happens to them in adulthood.”
  • She writes in her report, “The option to provide masculinizing/feminizing hormones from age 16 is available, but the review would recommend extreme caution.”
  • her core conclusion is this: “For most young people, a medical pathway will not be the best way to manage their gender-related distress.” She realizes that this conclusion will not please many of the young people she has come to know, but this is where the evidence has taken her.
  • In 1877 a British philosopher and mathematician named William Kingdon Clifford published an essay called “The Ethics of Belief.” In it he argued that if a shipowner ignored evidence that his craft had problems and sent the ship to sea having convinced himself it was safe, then of course we would blame him if the ship went down and all aboard were lost. To have a belief is to bear responsibility, and one thus has a moral responsibility to dig arduously into the evidence, avoid ideological thinking and take into account self-serving biases.
  • “It is wrong always, everywhere, and for anyone, to believe anything upon insufficient evidence,” Clifford wrote
  • A belief, he continued, is a public possession. If too many people believe things without evidence, “the danger to society is not merely that it should believe wrong things, though that is great enough; but that it should become credulous, and lose the habit of testing things and inquiring into them; for then it must sink back into savagery.”
  • Since the Trump years, this habit of not consulting the evidence has become the underlying crisis in so many realms. People segregate into intellectually cohesive teams, which are always dumber than intellectually diverse teams. Issues are settled by intimidation, not evidence
  • Our natural human tendency is to be too confident in our knowledge, too quick to ignore contrary evidence. But these days it has become acceptable to luxuriate in those epistemic shortcomings, not to struggle against them. See, for example, the modern Republican Party.
Javier E

Can Political Theology Save Secularism? | Religion & Politics - 0 views

  • Osama bin Laden had forced us to admit that, while the U.S. may legally separate church and state, it cannot do so intellectually. Beneath even the most ostensibly faithless of our institutions and our polemicists lie crouching religious lions, ready to devour the infidels who set themselves in opposition to the theology of the free market and the messianic march of democracy
  • As our political system depends on a shaky separation between religion and politics that has become increasingly unstable, scholars are sensing the deep disillusionment afoot and trying to chart a way out.
  • At its best, Religion for Atheists is a chronicle of the smoldering heap that liberal capitalism has made of the social rhythms that used to serve as a buffer between humans and the random cruelty of the universe. Christian and Jewish traditions, Botton argues, reinforced the ideas that people are morally deficient, that disappointment and suffering are normative, and that death is inevitable. The abandonment of those realities for the delusions of the self-made individual, the fantasy superman who can bend reality to his will if he works hard enough and is positive enough, leaves little mystery to why we are perpetually stressed out, overworked, and unsatisfied.
  • ...12 more annotations...
  • Botton’s central obsession is the insane ways bourgeois postmoderns try to live, namely in a perpetual upward swing of ambition and achievement, where failure indicates character deficiency despite an almost total lack of social infrastructure to help us navigate careers, relationships, parenting, and death. But he seems uninterested in how those structures were destroyed or what it might take to rebuild them
  • Botton wants to keep bourgeois secularism and add a few new quasi-religious social routines. Quasi-religious social routines may indeed be a part of the solution, as we shall see, but they cannot be simply flung atop a regime as indifferent to human values as liberal capitalism.
  • Citizens see the structure behind the façade and lose faith in the myth of the state as a dispassionate, egalitarian arbiter of conflict. Once theological passions can no longer be sublimated in material affluence and the fiction of representative democracy, it is little surprise to see them break out in movements that are, on both the left and the right, explicitly hostile to the liberal state.
  • Western politics have an auto-immune disorder: they are structured to pretend that their notions of reason, right, and sovereignty are detached from a deeply theological heritage. When pressed by war and economic dysfunction, liberal ideas prove as compatible with zealotry and domination as any others.
  • Secularism is not strictly speaking a religion, but it represents an orientation toward religion that serves the theological purpose of establishing a hierarchy of legitimate social values. Religion must be “privatized” in liberal societies to keep it out of the way of economic functioning. In this view, legitimate politics is about making the trains run on time and reducing the federal deficit; everything else is radicalism. A surprising number of American intellectuals are able to persuade themselves that this vision of politics is sufficient, even though the train tracks are crumbling, the deficit continues to gain on the GDP, and millions of citizens are sinking into the dark mire of debt and permanent unemployment.
  • Critchley has made a career forging a philosophical account of human ethical responsibility and political motivation. His question is: after the rational hopes of the Enlightenment corroded into nihilism, how do humans write a believable story about what their existence means in the world? After the death of God, how do we account for our feelings of moral responsibility, and how might that account motivate us to resist the deadening political system we face?
  • The question is what to do in the face of the unmistakable religious and political nihilism currently besetting Western democracies.
  • both Botton and Critchley believe the solution involves what Derrida called a “religion without religion”—for Critchley a “faith of the faithless,” for Botton a “religion for atheists.”
  • a new political becoming will require a complete break with the status quo, a new political sphere that we understand as our own deliberate creation, uncoupled from the theological fictions of natural law or God-given rights
  • Critchley proposes as the foundation of politics “the poetic construction of a supreme fiction … a fiction that we know to be a fiction and yet in which we believe nonetheless.” Following the French philosopher Alain Badiou and the Apostle Paul, Critchley conceives political “truth” as something like fidelity: a radical loyalty to the historical moment where true politics came to life.
  • But unlike an evangelist, Critchley understands that attempting to fill the void with traditional religion is to slip back into a slumber that reinforces institutions desperate to maintain the political and economic status quo. Only in our condition of brokenness and finitude, uncomforted by promises of divine salvation, can we be open to a connection with others that might mark the birth of political resistance
  • This is the crux of the difference between Critchley’s radical faithless faith and Botton’s bourgeois secularism. Botton has imagined religion as little more than a coping mechanism for the “terrifying degrees of pain which arise from our vulnerability,” seemingly unaware that the pain and vulnerability may intensify many times over. It won’t be enough simply to sublimate our terror in confessional restaurants and atheist temples. The recognition of finitude, the weight of our nothingness, can hollow us into a different kind of self: one without illusions or reputations or private property, one with nothing but radical openness to others. Only then can there be the possibility of meaning, of politics, of hope.
Emily Horwitz

Struggle For Smarts? How Eastern And Western Cultures Tackle Learning : Shots - Health News : NPR - 1 views

  • In 1979, when Jim Stigler was still a graduate student at the University of Michigan, he went to Japan to research teaching methods and found himself sitting in the back row of a crowded fourth grade math class.
  • and one kid was just totally having trouble with it. His cube looked all cockeyed, so the teacher said to him, 'Why don't you go put yours on the board?' So right there I thought, 'That's interesting! He took the one who can't do it and told him to go and put it on the board.'"
  • the kid didn't break into tears. Stigler says the child continued to draw his cube with equanimity. "And at the end of the class, he did make his cube look right! And the teacher said to the class, 'How does that look, class?' And they all looked up and said, 'He did it!' And they broke into applause." The kid smiled a huge smile and sat down, clearly proud of himself.
  • ...12 more annotations...
  • very early ages we [in America] see struggle as an indicator that you're just not very smart," Stigler says. "It's a sign of low ability — people who are smart don't struggle, they just naturally get it, that's our folk theory. Whereas in Asian cultures they tend to see struggle more as an opportunity."
  • For the most part in American culture, intellectual struggle in schoolchildren is seen as an indicator of weakness, while in Eastern cultures it is not only tolerated, it is often used to measure emotional strength.
  • to understand why these two cultures view struggle so differently, it's good to step back and examine how they think about where academic excellence comes from.
  • American mother is communicating to her son that the cause of his success in school is his intelligence. He's smart — which, Li says, is a common American view.
  • children are not creative. Our children do not have individuality. They're just robots. You hear the educators from Asian countries express that concern.
  • "So the focus is on the process of persisting through it despite the challenges, not giving up, and that's what leads to success," Li says.
  • Obviously if struggle indicates weakness — a lack of intelligence — it makes you feel bad, and so you're less likely to put up with it. But if struggle indicates strength — an ability to face down the challenges that inevitably occur when you are trying to learn something — you're more willing to accept it.
  • American students "worked on it less than 30 seconds on average and then they basically looked at us and said, 'We haven't had this,'" he says.
  • Japanese students worked for the entire hour on the impossible problem.
  • Westerners tend to worry that their kids won't be able to compete against Asian kids who excel in many areas but especially in math and science. Jin Li says that educators from Asian countries have their own set of worries.
  • “The idea of intelligence is believed in the West as a cause,” Li explains. “She is telling him that there is something in him, in his mind, that enables him to do what he does.”
  • in the Japanese classrooms that he's studied, teachers consciously design tasks that are slightly beyond the capabilities of the students they teach, so the students can actually experience struggling with something just outside their reach. Then, once the task is mastered, the teachers actively point out that the student was able to accomplish it through the student's hard work and struggle.
  •  
    An interesting look into the differences between how Eastern and Western cultures see academic struggle
Javier E

The Way to Produce a Person - NYTimes.com - 0 views

  • the brain is a malleable organ. Every time you do an activity, or have a thought, you are changing a piece of yourself into something slightly different than it was before. Every hour you spend with others, you become more like the people around you.
  • Gradually, you become a different person. If there is a large gap between your daily conduct and your core commitment, you will become more like your daily activities and less attached to your original commitment. You will become more hedge fund, less malaria.
  • I would worry about turning yourself into a means rather than an end. If you go to Wall Street mostly to make money for charity, you may turn yourself into a machine for the redistribution of wealth. You may turn yourself into a fiscal policy.
  • ...2 more annotations...
  • a human life is not just a means to produce outcomes, it is an end in itself. When we evaluate our friends, we don’t just measure the consequences of their lives. We measure who they intrinsically are. We don’t merely want to know if they have done good. We want to know if they are good.
  • We live in a relentlessly commercial culture, so it’s natural that many people would organize their lives in utilitarian and consequentialist terms. But it’s possible to get carried away with this kind of thinking — to have logic but no wisdom
Javier E

Putting Economic Data Into Context - The New York Times - 0 views

  • economic historians have been wrestling with this problem for years and have produced an excellent calculator for converting historical data into contemporary figures. The site is called Measuring Worth.
  • Today we use price indexes to convert monetary values from the past into “real” values today. The best-known such index is the Consumer Price Index published monthly by the Bureau of Labor Statistics. For those interested only in a simple inflation adjustment, the bureau maintains a useful calculator. (A code sketch of this conversion, and of the GDP comparison raised below, follows this list.)
  • The area where this is the biggest problem is probably large budget numbers. The raw data is almost universally useless. Saying that the budget deficit was $680.3 billion in fiscal year 2013 tells the average person absolutely nothing of value. It’s just a large number that sounds scary. It would help to at least know that it is down from $1.087 trillion in 2012 and a peak of $1.413 trillion in 2009, but that’s not entirely adequate.
  • ...6 more annotations...
  • it makes no sense to compare the federal budget to a family budget, which is what the Consumer Price Index is based on. One needs to use a broader index, like the gross domestic product deflator, which measures price changes throughout the entire economy.
  • For large numbers, the percentage of the gross domestic product is both the easiest to find and best to use.
  • Since the “burden” of the debt basically falls on the entire economy, the debt-to-G.D.P. ratio is generally considered the best measure of that burden. It also facilitates international comparisons without having to worry about exchange-rate adjustments.
  • international price comparisons can be especially tricky because current market exchange rates may not accurately reflect relative values or standards of living. Economists generally prefer to use something called “purchasing power parity,” but such data is not always easy to come by
  • There is much more to say on this topic. I recommend an essay on the Measuring Worth website that discusses different measures of value over time and how they materially affect our perceptions. There are also new statistical measures coming online that may provide even better data, like the Billion Prices Project from M.I.T., which gathers price data in real time directly from store price scanners.
  • This is an area where trial and error is the best strategy. The important thing is to make an effort to provide proper context where it appears necessary and not to simply ignore the problem.
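To make the two conversions above concrete, here is a minimal Python sketch. It assumes invented price-index values rather than official BLS data, and an approximate FY2013 GDP of $16.7 trillion; the only figure taken from the excerpts is the $680.3 billion deficit.

```python
# A minimal sketch of the two adjustments discussed above.
# The index values are illustrative placeholders, not official BLS data.

def to_real_dollars(amount: float, index_then: float, index_now: float) -> float:
    """Convert a historical dollar amount into present-day "real" dollars
    by scaling with the ratio of price-index levels."""
    return amount * (index_now / index_then)

def share_of_gdp(amount_billions: float, gdp_billions: float) -> float:
    """Express a large budget number as a percentage of GDP."""
    return 100.0 * amount_billions / gdp_billions

# $100 from a year when the index stood at 30, restated at an index of 300:
print(to_real_dollars(100.0, index_then=30.0, index_now=300.0))  # 1000.0

# The FY2013 deficit of $680.3 billion against an approximate
# $16.7 trillion GDP comes to roughly 4 percent:
print(round(share_of_gdp(680.3, 16_700.0), 1))  # 4.1
```

As the excerpts argue, the GDP share is the number that travels: it can be compared across years and countries without inflation or exchange-rate gymnastics, while the raw dollar figure cannot.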
Javier E

Science and gun violence: why is the research so weak? [Part 2] - Boing Boing - 1 views

  • Scientists are missing some important bits of data that would help them better understand the effects of gun policy and the causes of gun-related violence. But that’s not the only reason why we don’t have solid answers. Once you have the data, you still have to figure out what it means. This is where the research gets complicated, because the problem isn’t simply about what we do and don’t know right now. The problem, say some scientists, is that we—from the public, to politicians, to even scientists themselves—may be trying to force research to give a type of answer that we can’t reasonably expect it to offer. To understand what science can do for the gun debates, we might have to rethink what “evidence-based policy” means to us.
  • For the most part, there aren’t a lot of differences in the data that these studies are using. So how can they reach such drastically different conclusions? The issue is in the kind of data that exists, and what you have to do to understand it, says Charles Manski, professor of economics at Northwestern University. Manski studies the ways that other scientists do research and how that research translates into public policy.
  • Even if we did have those gaps filled in, Manski said, what we’d have would still just be observational data, not experimental data. “We don’t have randomized, controlled experiments, here,” he said. “The only way you could do that, you’d have to assign a gun to some people randomly at birth and follow them throughout their lives. Obviously, that’s not something that’s going to work.”
  • ...14 more annotations...
  • This means that, even under the best circumstances, scientists can’t directly test what the results of a given gun policy are. The best you can do is to compare what was happening in a state before and after a policy was enacted, or to compare two different states, one that has the policy and one that doesn’t. And that’s a pretty inexact way of working.
  • Add in enough assumptions, and you can eventually come up with an estimate. But is the estimate correct? Is it even close to reality? That’s a hard question to answer, because the assumptions you made—the correlations you drew between cause and effect, what you know and what you assume to be true because of that—might be totally wrong.
  • It’s hard to tease apart the effect of one specific change, compared to the effects of other things that could be happening at the same time.
  • This process of taking the observational data we do have and then running it through a filter of assumptions plays out in the real world in the form of statistical modeling. When the NAS report says that nobody yet knows whether more guns lead to more crime, or less crime, what they mean is that the models and the assumptions built into those models are all still proving to be pretty weak.
  • From either side of the debate, he said, scientists continue to produce wildly different conclusions using the same data. On either side, small shifts in the assumptions lead the models to produce different results. Both factions continue to choose sets of assumptions that aren’t terribly logical. It’s as if you decided that anybody with blue shoes probably had a belly-button piercing. There’s not really a good reason for making that correlation. And if you change the assumption—actually, belly-button piercings are more common in people who wear green shoes—you end up with completely different results. (A toy numeric version of this problem follows the list below.)
  • The Intergovernmental Panel on Climate Change (IPCC) produces these big reports periodically, which analyze lots of individual papers. In essence, they’re looking at lots of trees and trying to paint you a picture of the forest. IPCC reports are available for free online, you can go and read them yourself. When you do, you’ll notice something interesting about the way that the reports present results. The IPCC never says, “Because we burned fossil fuels and emitted carbon dioxide into the atmosphere then the Earth will warm by x degrees.” Instead, those reports present a range of possible outcomes … for everything. Depending on the different models used, different scenarios presented, and the different assumptions made, the temperature of the Earth might increase by anywhere between 1.5 and 4.5 degrees Celsius.
  • What you’re left with is an environment where it’s really easy to prove that your colleague’s results are probably wrong, and it’s easy for him to prove that yours are probably wrong. But it’s not easy for either of you to make a compelling case for why you’re right.
  • Statistical modeling isn’t unique to gun research. It just happens to be particularly messy in this field. Scientists who study other topics have done a better job of using stronger assumptions and of building models that can’t be upended by changing one small, seemingly randomly chosen detail. It’s not that, in these other fields, there’s only one model being used, or even that all the different models produce the exact same results. But the models are stronger and, more importantly, the scientists do a better job of presenting the differences between models and drawing meaning from them.
  • “Climate change is one of the rare scientific literatures that has actually faced up to this,” Charles Manski said. What he means is that, when scientists model climate change, they don’t expect to produce exact, to-the-decimal-point answers.
  • “It’s been a complete waste of time, because we can’t validate one model versus another,” Pepper said. Most likely, he thinks that all of them are wrong. For instance, all the models he’s seen assume that a law will affect every state in the same way, and every person within that state in the same way. “But if you think about it, that’s just nonsensical,” he said.
  • On the one hand, that leaves politicians in a bit of a lurch. The response you might mount to counteract a 1.5 degree increase in global average temperature is pretty different from the response you’d have to 4.5 degrees. On the other hand, the range does tell us something valuable: the temperature is increasing.
  • The problem with this is that it flies in the face of what most of us expect science to do for public policy. Politics is inherently biased, right? The solutions that people come up with are driven by their ideologies. Science is supposed to cut that Gordian Knot. It’s supposed to lay the evidence down on the table and impartially determine who is right and who is wrong.
  • Manski and Pepper say that this is where we need to rethink what we expect science to do. Science, they say, isn’t here to stop all political debate in its tracks. In a situation like this, it simply can’t provide a detailed enough answer to do that—not unless you’re comfortable with detailed answers that are easily called into question and disproven by somebody else with a detailed answer.
  • Instead, science can reliably produce a range of possible outcomes, but it’s still up to the politicians (and, by extension, up to us) to hash out compromises between wildly differing values on controversial subjects. When it comes to complex social issues like gun ownership and gun violence, science doesn’t mean you get to blow off your political opponents and stake a claim on truth. Chances are, the closest we can get to the truth is a range that encompasses the beliefs of many different groups.
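As a toy illustration of the modeling problem the excerpts describe, the sketch below runs the same invented before-and-after data through two different background-trend assumptions and gets effect estimates with opposite signs. Every number in it is made up.

```python
# Toy illustration: one set of observational data, two defensible but
# different assumptions about a background trend, two opposite conclusions.
# All numbers are invented.

rate_before = 500.0  # incidents per 100k before the policy took effect
rate_after = 470.0   # incidents per 100k after

raw_change = rate_after - rate_before  # -30.0, the naive "effect"

# Assumption A: an unrelated downward trend explains 10 of those incidents,
# so the policy still looks like it reduced crime.
effect_a = raw_change + 10.0  # -20.0

# Assumption B: the trend explains 40 incidents, so after adjusting,
# the policy looks like it *increased* crime.
effect_b = raw_change + 40.0  # +10.0

low, high = sorted((effect_a, effect_b))
print(f"Estimated effect: {low:+.0f} to {high:+.0f} incidents per 100k")
# -> Estimated effect: -20 to +10 incidents per 100k
```

The sign flips with the assumption, which is why, as with the IPCC's temperature projections, the defensible output of such models is a range rather than a single point estimate.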
Javier E

Charlie Sykes on Where the Right Went Wrong - The New York Times - 0 views

  • But I have to admit that the campaign has made my decision easier. The conservative media is broken and the conservative movement deeply compromised.
  • Before this year, I thought I had a relatively solid grasp on what conservatism stood for and where it was going
  • I was under the impression that conservatives actually believed things about free trade, balanced budgets, character and respect for constitutional rights. Then along came this campaign.
  • ...15 more annotations...
  • When I wrote in August 2015 that Mr. Trump was a cartoon version of every left-wing media stereotype of the reactionary, nativist, misogynist right, I thought that I was well within the mainstream of conservative thought — only to find conservative Trump critics denounced for apostasy by a right that decided that it was comfortable with embracing Trumpism.
  • relatively few of my listeners bought into the crude nativism Mr. Trump was selling at his rallies.
  • What they did buy into was the argument that this was a “binary choice.” No matter how bad Mr. Trump was, my listeners argued, he could not possibly be as bad as Mrs. Clinton. You simply cannot overstate this as a factor in the final outcome
  • Even among Republicans who had no illusions about Mr. Trump’s character or judgment, the demands of that tribal loyalty took precedence. To resist was an act of betrayal.
  • In this binary tribal world, where everything is at stake, everything is in play, there is no room for quibbles about character, or truth, or principles.
  • If everything — the Supreme Court, the fate of Western civilization, the survival of the planet — depends on tribal victory, then neither individuals nor ideas can be determinative.
  • As our politics have become more polarized, the essential loyalties shift from ideas, to parties, to tribes, to individuals. Nothing else ultimately matters.
  • For many listeners, nothing was worse than Hillary Clinton. Two decades of vilification had taken their toll: Listeners whom I knew to be decent, thoughtful individuals began forwarding stories with conspiracy theories about President Obama and Mrs. Clinton — that he was a secret Muslim, that she ran a child sex ring out of a pizza parlor. When I tried to point out that such stories were demonstrably false, they generally refused to accept evidence that came from outside their bubble. The echo chamber had morphed into a full-blown alternate reality silo of conspiracy theories, fake news and propaganda.
  • In this political universe, voters accept that they must tolerate bizarre behavior, dishonesty, crudity and cruelty, because the other side is always worse; the stakes are such that no qualms can get in the way of the greater cause.
  • When it became clear that I was going to remain #NeverTrump, conservatives I had known and worked with for more than two decades organized boycotts of my show. One prominent G.O.P. activist sent out an email blast calling me a “Judas goat,” and calling for postelection retribution.
  • And then, there was social media. Unless you have experienced it, it’s difficult to describe the virulence of the Twitter storms that were unleashed on Trump skeptics. In my timelines, I found myself called a “cuckservative,” a favorite gibe of white nationalists; and someone Photoshopped my face into a gas chamber. Under the withering fire of the trolls, one conservative commentator and Republican political leader after another fell in line.
  • we had succeeded in persuading our audiences to ignore and discount any information from the mainstream media. Over time, we’d succeeded in delegitimizing the media altogether — all the normal guideposts were down, the referees discredited.
  • That left a void that we conservatives failed to fill. For years, we ignored the birthers, the racists, the truthers and other conspiracy theorists who indulged fantasies of Mr. Obama’s secret Muslim plot to subvert Christendom, or who peddled baseless tales of Mrs. Clinton’s murder victims. Rather than confront the purveyors of such disinformation, we changed the channel because, after all, they were our allies, whose quirks could be allowed or at least ignored
  • We destroyed our own immunity to fake news, while empowering the worst and most reckless voices on the right.
  • This was not mere naïveté. It was also a moral failure, one that now lies at the heart of the conservative movement even in its moment of apparent electoral triumph. Now that the election is over, don’t expect any profiles in courage from the Republican Party pushing back against those trends; the gravitational pull of our binary politics is too strong.