Home/ TOK Friends/ Group items tagged nudge


Javier E

The Power of Nudges, for Good and Bad - The New York Times - 0 views

  • Nudges, small design changes that can markedly affect individual behavior, have been catching on. These techniques rely on insights from behavioral science
  • when used ethically, they can be very helpful. But we need to be sure that they aren’t being employed to sway people to make bad decisions that they will later regret.
  • Three principles should guide the use of nudges:
    ■ All nudging should be transparent and never misleading.
    ■ It should be as easy as possible to opt out of the nudge, preferably with as little as one mouse click.
    ■ There should be good reason to believe that the behavior being encouraged will improve the welfare of those being nudged.
  • the government teams in Britain and the United States that have focused on nudging have followed these guidelines scrupulously.
  • the private sector is another matter. In this domain, I see much more troubling behavior.
  • Many companies are nudging purely for their own profit and not in customers’ best interests. In a recent column in The New York Times, Robert Shiller called such behavior “phishing.” Mr. Shiller and George Akerlof, both Nobel-winning economists, have written a book on the subject, “Phishing for Phools.”
  • Some argue that phishing — or evil nudging — is more dangerous in government than in the private sector. The argument is that government is a monopoly with coercive power, while we have more choice in the private sector over which newspapers we read and which airlines we fly.
  • I think this distinction is overstated. In a democracy, if a government creates bad policies, it can be voted out of office. Competition in the private sector, however, can easily work to encourage phishing rather than stifle it.
  • One example is the mortgage industry in the early 2000s. Borrowers were encouraged to take out loans that they could not repay when real estate prices fell. Competition did not eliminate this practice, because it was hard for anyone to make money selling the advice “Don’t take that loan.”
Javier E

Opinion | How Behavioral Economics Took Over America - The New York Times - 0 views

  • Some behavioral interventions do seem to lead to positive changes, such as automatically enrolling children in school free lunch programs or simplifying mortgage information for aspiring homeowners. (Whether one might call such interventions “nudges,” however, is debatable.)
  • it’s not clear we need to appeal to psychology studies to make some common-sense changes, especially since the scientific rigor of these studies is shaky at best.
  • Nudges are related to a larger area of research on “priming,” which tests how behavior changes in response to what we think about or even see without noticing
  • Behavioral economics is at the center of the so-called replication crisis, a euphemism for the uncomfortable fact that the results of a significant percentage of social science experiments can’t be reproduced in subsequent trials
  • this key result was not replicated in similar experiments, undermining confidence in a whole area of study. It’s obvious that we do associate old age and slower walking, and we probably do slow down sometimes when thinking about older people. It’s just not clear that that’s a law of the mind.
  • And these attempts to “correct” human behavior are based on tenuous science. The replication crisis doesn’t have a simple solution
  • Journals have instituted reforms like having scientists preregister their hypotheses to avoid the possibility of results being manipulated during the research. But that doesn’t change how many uncertain results are already out there, with a knock-on effect that ripples through huge segments of quantitative social science.
  • The Johns Hopkins science historian Ruth Leys, author of a forthcoming book on priming research, points out that cognitive science is especially prone to building future studies off disputed results. Despite the replication crisis, these fields are a “train on wheels, the track is laid and almost nothing stops them,” Dr. Leys said.
  • These cases result from lax standards around data collection, which will hopefully be corrected. But they also result from strong financial incentives: the possibility of salaries, book deals and speaking and consulting fees that range into the millions. Researchers can get those prizes only if they can show “significant” findings.
  • It is no coincidence that behavioral economics, from Dr. Kahneman to today, tends to be pro-business. Science should be not just reproducible, but also free of obvious ideology.
  • Technology and modern data science have only further entrenched behavioral economics. Its findings have greatly influenced algorithm design.
  • The collection of personal data about our movements, purchases and preferences informs interventions in our behavior, from the grocery store to who is arrested by the police.
  • Setting people up for safety and success and providing good default options isn’t bad in itself, but there are more sinister uses as well. After all, not everyone who wants to exploit your cognitive biases has your best interests at heart.
  • Despite all its flaws, behavioral economics continues to drive public policy, market research and the design of digital interfaces.
  • One might think that a kind of moratorium on applying such dubious science would be in order — except that enacting one would be practically impossible. These ideas are so embedded in our institutions and everyday life that a full-scale audit of the behavioral sciences would require bringing much of our society to a standstill.
  • There is no peer review for algorithms that determine entry to a stadium or access to credit. To perform even the most banal, everyday actions, you have to put implicit trust in unverified scientific results.
  • We can’t afford to defer questions about human nature, and the social and political policies that come from them, to commercialized “research” that is scientifically questionable and driven by ideology. Behavioral economics claims that humans aren’t rational.
  • That’s a philosophical claim, not a scientific one, and it should be fought out in a rigorous marketplace of ideas. Instead of unearthing real, valuable knowledge of human nature, behavioral economics gives us “one weird trick” to lose weight or quit smoking.
  • Humans may not be perfectly rational, but we can do better than the predictably irrational consequences that behavioral economics has left us with today.
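The replication problem running through these excerpts can be made concrete with a back-of-the-envelope simulation. The sketch below is purely illustrative — the prior, statistical power, and significance threshold are assumed values chosen for the example, not figures from the article — but it shows how a literature that publishes only "significant" results can end up dominated by findings that fail to replicate, with no fraud involved:

```python
import random

random.seed(0)

ALPHA = 0.05        # significance threshold for "publishable" results
POWER = 0.35        # chance a real effect reaches significance (underpowered studies)
PRIOR_TRUE = 0.10   # assumed fraction of tested hypotheses that are actually true

def study(is_true):
    """Return True if a single study comes out 'significant'."""
    return random.random() < (POWER if is_true else ALPHA)

# Only significant original studies get "published".
published = []
for _ in range(100_000):
    is_true = random.random() < PRIOR_TRUE
    if study(is_true):
        published.append(is_true)

# Attempt one replication of each published finding.
replicated = sum(study(is_true) for is_true in published)

false_discovery_rate = 1 - sum(published) / len(published)
replication_rate = replicated / len(published)
print(f"false discoveries among published findings: {false_discovery_rate:.0%}")
print(f"published findings that replicate: {replication_rate:.0%}")
```

Under these assumptions, roughly half of the published effects are false positives, and fewer than one in five published findings survives a replication attempt — selection on p < .05 does all the damage by itself.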
Javier E

Why It's OK to Let Apps Make You a Better Person - Evan Selinger - Technology - The Atl... - 0 views

  • one theme emerges from the media coverage of people's relationships with our current set of technologies: Consumers want digital willpower. App designers in touch with the latest trends in behavioral modification--nudging, the quantified self, and gamification--and good old-fashioned financial incentive manipulation, are tackling weakness of will. They're harnessing the power of payouts, cognitive biases, social networking, and biofeedback. The quantified self becomes the programmable self.
  • the trend still has multiple interesting dimensions
  • Individuals are turning ever more aspects of their lives into managerial problems that require technological solutions. We have access to an ever-increasing array of free and inexpensive technologies that harness incredible computational power that effectively allows us to self-police behavior everywhere we go. As pervasiveness expands, so does trust.
  • Some embrace networked, data-driven lives and are comfortable volunteering embarrassing, real time information about what we're doing, whom we're doing it with, and how we feel about our monitored activities.
  • Put it all together and we can see that our conception of what it means to be human has become "design space." We're now Humanity 2.0, primed for optimization through commercial upgrades. And today's apps are more harbinger than endpoint.
  • philosophers have had much to say about the enticing and seemingly inevitable dispersion of technological mental prosthetics that promise to substitute for or enhance some of our motivational powers.
  • beyond the practical issues lie a constellation of central ethical concerns.
  • they should cause us to pause as we think about a possible future that significantly increases the scale and effectiveness of willpower-enhancing apps. Let's call this hypothetical future Digital Willpower World and characterize the ethical traps we're about to discuss as potential general pitfalls
  • it is antithetical to the ideal of "resolute choice." Some may find the norm overly perfectionist, Spartan, or puritanical. However, it is not uncommon for folks to defend the idea that mature adults should strive to develop internal willpower strong enough to avoid external temptations, whatever they are, and wherever they are encountered.
  • In part, resolute choosing is prized out of concern for consistency, as some worry that lapse of willpower in any context indicates a generally weak character.
  • Fragmented selves behave one way while under the influence of digital willpower, but another when making decisions without such assistance. In these instances, inconsistent preferences are exhibited and we risk underestimating the extent of our technological dependency.
  • It simply means that when it comes to digital willpower, we should be on our guard to avoid confusing situational with integrated behaviors.
  • the problem of inauthenticity, a staple of the neuroethics debates, might arise. People might start asking themselves: Has the problem of fragmentation gone away only because devices are choreographing our behavior so powerfully that we are no longer in touch with our so-called real selves -- the selves who used to exist before Digital Willpower World was formed?
  • Infantilized subjects are morally lazy, quick to have others take responsibility for their welfare. They do not view the capacity to assume personal responsibility for selecting means and ends as a fundamental life goal that validates the effort required to remain committed to the ongoing project of maintaining willpower and self-control.
  • Michael Sandel's Atlantic essay, "The Case Against Perfection." He notes that technological enhancement can diminish people's sense of achievement when their accomplishments become attributable to human-technology systems and not an individual's use of human agency.
  • Borgmann worries that this environment, which habituates us to be on auto-pilot and delegate deliberation, threatens to harm the powers of reason, the most central component of willpower (according to the rationalist tradition).
  • In several books, including Technology and the Character of Contemporary Life, he expresses concern about technologies that seem to enhance willpower but only do so through distraction. Borgmann's paradigmatic example of the non-distracted, focally centered person is a serious runner. This person finds the practice of running maximally fulfilling, replete with the rewarding "flow" that can only come when mind/body and means/ends are unified, while skill gets pushed to the limit.
  • Perhaps the very conception of a resolute self was flawed. What if, as psychologist Roy Baumeister suggests, willpower is more a "staple of folk psychology" than a real way of thinking about our brain processes?
  • novel approaches suggest the will is a flexible mesh of different capacities and cognitive mechanisms that can expand and contract, depending on the agent's particular setting and needs. Contrary to the traditional view that identifies the unified and cognitively transparent self as the source of willed actions, the new picture embraces a rather diffused, extended, and opaque self who is often guided by irrational trains of thought. What actually keeps the self and its will together are the given boundaries offered by biology, a coherent self narrative created by shared memories and experiences, and society. If this view of the will as an expanding and contracting system with porous and dynamic boundaries is correct, then it might seem that the new motivating technologies and devices can only increase our reach and further empower our willing selves.
  • "It's a mistake to think of the will as some interior faculty that belongs to an individual--the thing that pushes the motor control processes that cause my action," Gallagher says. "Rather, the will is both embodied and embedded: social and physical environment enhance or impoverish our ability to decide and carry out our intentions; often our intentions themselves are shaped by social and physical aspects of the environment."
  • It makes perfect sense to think of the will as something that can be supported or assisted by technology. Technologies, like environments and institutions can facilitate action or block it. Imagine I have the inclination to go to a concert. If I can get my ticket by pressing some buttons on my iPhone, I find myself going to the concert. If I have to fill out an application form and carry it to a location several miles away and wait in line to pick up my ticket, then forget it.
  • Perhaps the best way forward is to put a digital spin on the Socratic dictum of knowing myself and submit to the new freedom: the freedom of consuming digital willpower to guide me past the sirens.
Javier E

Web Privacy, and How Consumers Let Down Their Guard - NYTimes.com - 0 views

  • We are hurried and distracted and don’t pay close attention to what we are doing. Often, we turn over our data in exchange for a deal we can’t refuse.
  • his research argues that when it comes to privacy, policy makers should carefully consider how people actually behave. We don’t always act in our own best interest, his research suggests. We can be easily manipulated by how we are asked for information. Even something as simple as a playfully designed site can nudge us to reveal more of ourselves than a serious-looking one.
  • “His work has gone a long way in trying to help us figure out how irrational we are in privacy related decisions,” says Woodrow Hartzog, an assistant professor of law who studies digital privacy at Samford University in Birmingham, Ala. “We have too much confidence in our ability to make decisions.”
  • Solutions to our leaky privacy system tend to focus on transparency and control — that our best hope is knowing what our data is being used for and choosing whether to participate. But a challenge to that conventional wisdom emerges in his research. Giving users control may be an essential step, but it may also be a bit of an illusion.
  • personal data is what fuels the barons of the Internet age. Mr. Acquisti investigates the trade-offs that users make when they give up that data, and who gains and loses in those transactions. Often there are immediate rewards (cheap sandals) and sometimes intangible risks downstream (identity theft).
  • “The technologist in me loves the amazing things the Internet is allowing us to do,” he said. “The individual who cares about freedom is concerned about the technology being hijacked, from a technology of freedom into a technology of surveillance.”
  • EARLY in his sojourn in this country, Mr. Acquisti asked himself a question that would become the guiding force of his career: Do Americans value their privacy?
  • If we have something — in this case, ownership of our purchase data — we are more likely to value it. If we don’t have it at the outset, we aren’t likely to pay extra to acquire it. Context matters.
  • “What worries me,” he said, “is that transparency and control are empty words that are used to push responsibility to the user for problems that are being created by others.”
  • We are constantly asked to make decisions about personal data amid a host of distractions, like an e-mail, a Twitter notification or a text message. If Mr. Acquisti is correct, those distractions may hinder our sense of self-protection when it comes to privacy.
  • His latest weapon against distraction is an iPad application, which lets him create a to-do list every morning and set timers for each task: 30 minutes for e-mail, 60 minutes to grade student papers, and so on.
  • it is not surprising that he is cautious in revealing himself online. He says he doesn’t feel compelled to post a picture of his meals on Instagram. He uses different browsers for different activities. He sometimes uses tools that show which ad networks are tracking him. But he knows he cannot hide entirely, which is why some people, he says, follow a policy of “rational ignorance.”
  • The online advertising industry insists that the data is scrambled to make it impossible to identify individuals.
  • Mr. Acquisti offers a sobering counterpoint. In 2011, he took snapshots with a webcam of nearly 100 students on campus. Within minutes, he had identified about one-third of them using facial recognition software. In addition, for about a fourth of the subjects whom he could identify, he found out enough about them on Facebook to guess at least a portion of their Social Security numbers.
  • The point of the experiment was to show how easy it is to identify people from the rich trail of data they scatter around the Web, including seemingly harmless pictures. Facebook can be especially valuable for identity thieves, particularly when a user’s birth date is visible to the public.
  • Does that mean Facebook users should lie about their birthdays (and break Facebook’s terms of service)? Mr. Acquisti demurred. He would say only that there are “complex trade-offs” to be made. “I reveal my date of birth and hometown on my Facebook profile and an identity thief can reconstruct my Social Security number and steal my identity,” he said, “or someone can send me ‘happy birthday’ messages on the day of my birthday, which makes me feel very good.”
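The Social Security number part of the experiment turns on simple arithmetic. Before the Social Security Administration randomized assignment in 2011, SSNs were issued non-randomly: the first three digits (the "area number") tracked the issuing state, and the middle two (the "group number") advanced in a roughly predictable sequence over time. The calculation below is an illustrative sketch, not Mr. Acquisti's actual method; it only shows how much the guessing space shrinks if a birth date and hometown pin down the first five digits:

```python
# Knowing nothing, an attacker faces every possible nine-digit SSN.
full_space = 10 ** 9

# If birth state and birth date pin down the area and group numbers
# (the first five digits), only the four-digit serial number is unknown.
serial_space = 10 ** 4

reduction = full_space // serial_space
print(f"guessing space shrinks from {full_space:,} to {serial_space:,}")
print(f"a factor-of-{reduction:,} reduction")  # factor of 100,000
```

That reduction is why a public birth date and hometown — harmless-looking on their own — were enough to guess portions of students' numbers.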
Megan Flanagan

The Big Search to Find Out Where Dogs Come From - The New York Times - 0 views

  • scientists are still debating exactly when and where the ancient bond originated
  • agree that they evolved from ancient wolves
  • The essence of the idea is that people actively bred wolves to become dogs just the way they now breed dogs to be tiny or large, or to herd sheep.
  • Wolves are hard to tame, even as puppies, and many researchers find it much more plausible that dogs, in effect, invented themselves.
  • gradually evolved to become tamer and tamer, producing lots of offspring because of the relatively easy pickings
  • researchers question whether dogs experience feelings like love and loyalty, or whether their winning ways are just a matter of instincts that evolved because being a hanger-on is an easier way to make a living than running down elk.
  • dogs and wolves interbreed easily and some scientists are not convinced that the two are even different species
  • generally agree that there is good evidence that dogs were domesticated around 15,000 years ago
  • “Maybe dog domestication on some level kicks off this whole change in the way that humans are involved and responding to and interacting with their environment,
  • most dog breeds were invented in the 19th century during a period of dog obsession that he called “the giant whirlwind blender of the European crazy Victorian dog-breeding frenzy.”
  • “There’s hardly a person working in canine genetics that’s not working on that project
  • Almost every group has a different origination hypothesis
  • jaws and occasionally nearly complete skulls from old and recent dogs, wolves and canids that could fall into either category.
  • will be able to determine whether the domestication process occurred closer to 15,000 or 30,000 years ago,
  • major achievement in the world of canine science, and a landmark in the analysis of ancient DNA to show evolution, migrations and descent,
  • based on DNA evidence and the shape of ancient skulls, that dog domestication occurred well over 30,000 years ago.
  • he became fed up with the lack of ancient DNA evidence in papers about the origin of dogs.
  • identified a skull about 32,000 years old from a Belgian cave in Goyet as an early dog.
  • arguing that the evidence just wasn’t there to call the Goyet skull a dog,
  • claims are controversial and is willing, like the rest of the world of canine science, to risk damage to the fossils themselves to get more information on not just the mitochondrial DNA but also the nuclear DNA.
  • geneticists try to establish is how different the DNA of one animal is from another. Adding ancient DNA gives many more points of reference over a long time span.
  • will be able to identify changes in the skulls or jaws of those wolves that show shifts to more doglike shapes, helping to narrow the origins of domestication
  • the project will publish a flagship paper from all of the participants describing their general findings
  • a group in China was forming with the goal of sequencing 10,000 dog genomes
  • growing increasingly confident that they will find what they want, and come close to settling the thorny question of when and where the tearing power of a wolf jaw first gave way to the persuasive force of a nudge from a dog’s cold nose.
Javier E

A Harvard Scholar on the Enduring Lessons of Chinese Philosophy - The New York Times - 0 views

  • Since 2006, Michael Puett has taught an undergraduate survey course at Harvard University on Chinese philosophy, examining how classic Chinese texts are relevant today. The course is now one of Harvard’s most popular, third only to “Introduction to Computer Science” and “Principles of Economics.”
  • So-called Confucianism, for example, is read as simply being about forcing people to accept their social roles, while so-called Taoism is about harmonizing with the larger natural world. So Confucianism is often presented as bad and Taoism as good. But in neither case are we really learning from them.
  • we shouldn’t domesticate them to our own way of thinking. When we read them as self-help, we are assuming our own definition of the self and then simply picking up pieces of these ideas that fit into such a vision
  • these ideas are not about looking within and finding oneself. They are about overcoming the self. They are, in a sense, anti-self-help.
  • Today, we are often told that our goal should be to look within and find ourselves, and, once we do, to strive to be sincere and authentic to that true self, always loving ourselves and embracing ourselves for who we are. All of this sounds great and is a key part of what we think of as a properly “modern” way to live.
  • But what if we’re, on the contrary, messy selves that tend to fall into ruts and patterns of behavior? If so, the last thing we would want to be doing is embracing ourselves for who we are — embracing, in other words, a set of patterns we’ve fallen into. The goal should rather be to break these patterns and ruts, to train ourselves to interact better with those around us.
  • Certainly some strains of Chinese political theory will take this vision of the self — that we tend to fall into patterns of behavior — to argue for a more paternalistic state that will, to use a more recent term, “nudge” us into better patterns.
  • many of the texts we discuss in the book go the other way, and argue that the goal should be to break us from being such passive creatures — calling on us to do things that break us out of these patterns and allow us to train ourselves to start altering our behavior for the better.
  • You argue that Chinese philosophy views rituals as tools that can liberate us from these ruts.
  • Rituals force us for a brief moment to become a different person and to interact with those around us in a different way. They work because they break us from the patterns that we fall into and that otherwise dominate our behavior.
  • In the early Han dynasty, for example, we have examples of rituals that called for role reversals. The father would be called upon to play the son, and the son would play the father. Each is forced to see the world from the other’s perspective, with the son learning what it’s like to be in a position of authority and the father remembering what it was like to be the more subservient one
  • We tend to think that we live in a globalized world, but in a lot of ways we really don’t. The truth is that for a long time only a very limited number of ideas have dominated the world, while ideas that arose elsewhere were seen as “traditional” and not worth learning from.
  • imagine future generations that grow up reading Du Fu along with Shakespeare, and Confucius along with Plato. Imagine that type of world, where great ideas — wherever they arose — are thought about and wrestled with.
  • There’s a very strong debate going on in China about values — a sense that everything has become about wealth and power, and a questioning about whether this should be rethought. And among the ideas that are being brought into the debate are these earlier notions about the self and about how one can lead a good life. So, while the government is appropriating some of these ideas in particular ways, the broader public is debating them, and certainly with very different interpretations.
dicindioha

BBC - Future - The tricks being played on you by UK roads - 0 views

  • When you walk or drive in the UK, you’re being nudged by dozens of hidden messages embedded in the roads and pavements.
  • He suffers from a rare inherited condition that leaves him only able to make out vague colour contrasts around him. Yet he is able to safely pick his way through the hectic city streets, thanks to dozens of hidden messages embedded in our roads and pavements that few of us even notice are there.
  • This subtle form of communication is not just confined to the pavement, either: increasingly, motorists and cyclists are also unknowingly being told what to do.
  • A horizontal pattern of raised lines going across the pavement tells blind pedestrians they are on the footpath side; raised lines running along the direction of travel indicate the side designated for cycles. A wide, raised line divides the two.
  • Because the raised bumps are unpleasant to ride across, cyclists instinctively are drawn toward the tramline pattern which runs in the same direction as they are traveling.
  • Elsewhere, it is possible to find raised, rounded ribs running across pavement, creating a corduroy pattern. They look like they might be there to provide additional grip; in fact, they are sending a warning to anyone who stands on them about what is ahead.
  • The idea is to guide people through busy areas and around objects by drawing them along these raised lines.
  • They found that uncertainty about the layout of the road ahead is a powerful way of getting drivers to slow down.
  • triangles painted along the edge of each road – create an impression of a narrower road for example, and make drivers more cautious.
  • They have been painting boxes onto the road that use a clever combination of white and dark paint to create the illusion of a speed hump.
  • In India, they have taken things even further by painting deliberate optical illusions to give the impression that obstacles are in the road ahead.
  • This article talks about human perception and pattern recognition, and how these help people who lack a sense, such as the blind. Bumps and grooves in the roads we walk on tell us, without our realizing it, which side we should be on and where there are stairs or platforms. It is interesting that these patterns exist, as mentioned in the article, yet everyday pedestrians do not really notice them even though they are there to help us. Another interesting point is the use of perception: designers create illusions of speed humps or obstacles in the road to get drivers to slow down, playing with perception to make traffic safer. Sometimes what we think of as our perceptual shortcomings actually help us without our realizing it.
Javier E

Anti-vaccine activists, 9/11 deniers, and Google's social search. - Slate Magazine - 1 views

  • democratization of information-gathering—when accompanied by smart institutional and technological arrangements—has been tremendously useful, giving us Wikipedia and Twitter. But it has also spawned thousands of sites that undermine scientific consensus, overturn well-established facts, and promote conspiracy theories
  • Meanwhile, the move toward social search may further insulate regular visitors to such sites; discovering even more links found by their equally paranoid friends will hardly enlighten them.
  • Initially, the Internet helped them find and recruit like-minded individuals and promote events and petitions favorable to their causes. However, as so much of our public life has shifted online, they have branched out into manipulating search engines, editing Wikipedia entries, harassing scientists who oppose whatever pet theory they happen to believe in, and amassing digitized scraps of "evidence" that they proudly present to potential recruits.
  • The Vaccine article contains a number of important insights. First, the anti-vaccination cohort likes to move the goal posts: As scientists debunked the link between autism and mercury (once present in some childhood inoculations but now found mainly in certain flu vaccines), most activists dropped their mercury theory and pointed instead to aluminum or said that kids received “too many too soon.”
  • Second, it isn't clear whether scientists can "discredit" the movement's false claims at all: Its members are skeptical of what scientists have to say—not least because they suspect hidden connections between academia and pharmaceutical companies that manufacture the vaccines.
  • mere exposure to the current state of the scientific consensus will not sway hard-core opponents of vaccination. They are too vested in upholding their contrarian theories; some have consulting and speaking gigs to lose while others simply enjoy a sense of belonging to a community, no matter how kooky
  • attempts to influence communities that embrace pseudoscience or conspiracy theories by having independent experts or, worse, government workers join them—the much-debated antidote of “cognitive infiltration” proposed by Cass Sunstein (who now heads the Office of Information and Regulatory Affairs in the White House)—are unlikely to work.
  • perhaps, it's time to accept that many of these communities aren't going to lose core members regardless of how much science or evidence is poured on them. Instead, resources should go into thwarting their growth by targeting their potential—rather than existent—members.
  • Given that censorship of search engines is not an appealing or even particularly viable option, what can be done to ensure that users are made aware that all the pseudoscientific advice they are likely to encounter may not be backed by science?
  • One is to train our browsers to flag information that may be suspicious or disputed. Thus, every time a claim like "vaccination leads to autism" appears in our browser, that sentence would be flagged as disputed.
  • The second—and not necessarily mutually exclusive—option is to nudge search engines to take more responsibility for their index and exercise a heavier curatorial control in presenting search results for issues like "global warming" or "vaccination." Google already has a list of search queries that send most traffic to sites that trade in pseudoscience and conspiracy theories; why not treat them differently than normal queries? Thus, whenever users are presented with search results that are likely to send them to sites run by pseudoscientists or conspiracy theorists, Google may simply display a huge red banner asking users to exercise caution and check a previously generated list of authoritative resources before making up their minds.
  • In more than a dozen countries Google already does something similar for users who are searching for terms like "ways to die" or "suicidal thoughts" by placing a prominent red note urging them to call the National Suicide Prevention Hotline.
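The browser-flagging and search-warning proposals above both amount to matching page text or queries against a curated list of disputed claims. A minimal sketch of that idea (all names here, including `DISPUTED_CLAIMS` and `flag_disputed`, are hypothetical illustrations, not part of any real browser or Google API):

```python
# Illustrative sketch of the claim-flagging idea: scan page text against a
# curated list of disputed claims and return warnings to display.
# DISPUTED_CLAIMS and flag_disputed are hypothetical names for this sketch.

DISPUTED_CLAIMS = {
    "vaccination leads to autism": "Disputed claim: check authoritative sources.",
    "global warming is a hoax": "Disputed claim: check authoritative sources.",
}

def flag_disputed(page_text: str, claims: dict = DISPUTED_CLAIMS) -> list:
    """Return (claim, warning) pairs for each disputed claim found in the text."""
    lowered = page_text.lower()
    return [(claim, warning)
            for claim, warning in claims.items()
            if claim in lowered]
```

A real deployment would need fuzzier matching than exact substrings, which is exactly where the curatorial judgment the excerpt debates comes in.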
Sophia C

New Truths That Only One Can See - NYTimes.com - 0 views

  • A reproducible result may actually be the rarest of birds. Replication, the ability of another lab to reproduce a finding, is the gold standard of science, reassurance that you have discovered something true
  • With the most accessible truths already discovered, what remains are often subtle effects, some so delicate that they can be conjured up only under ideal circumstances, using highly specialized techniques.
  • Many hypotheses already start with a high chance of being wrong
  • ...5 more annotations...
  • Given the human tendency to see what we want to see, unconscious bias is inevitable. Without any ill intent, a scientist may be nudged toward interpreting the data so it supports the hypothesis, even if just barely.
  • He found that a large proportion of the conclusions were undermined or contradicted by later studies.
  • He and his colleagues could not replicate 47 of 53 landmark papers about cancer
  • Researchers deeply believed that their findings were true. But that is the problem. The more passionate scientists are about their work, the more susceptible they are to bias
  • “The slightest shift in their microenvironment can alter the results — something a newcomer might not spot. It is common for even a seasoned scientist to struggle with cell lines and culture conditions, and unknowingly introduce changes that will make it seem that a study cannot be reproduced.”
Javier E

Language and thought: Johnson: Does speaking German change how I see social relationshi... - 0 views

  • Roman Jakobson, a linguist, once said that “Languages differ essentially in what they must convey and not in what they may convey.” How do two-pronoun systems play into this? In German, I must choose du or Sie every time I address someone. According to the logic of language shaping thought, I should therefore be more aware of social relations when I speak German.
  • A believer in the language-shapes-thought idea might argue that speaking German doesn't push me to always be more conscious of social relationships because I'm a non-native speaker, and so I haven't developed the habits of mind of lifelong German speakers. But plenty of native speakers of two-pronoun languages find this system irksome and awkward, just as I do.
  • ...2 more annotations...
  • there is another way in which the double-"you" distinction may nudge thought. It refers to what Dan Slobin, a linguist, has called “thinking for speaking”. Speakers of different languages may well see the world similarly most of the time, but when people are specifically planning to say something, different languages may temporarily force speakers to pay more attention to certain distinctions. For example, every time a German person says “you”, a little attention must be paid to formality. So split pronouns (or other features) may act as a kind of "prime" for certain thoughts or behaviours. Primes can be powerful. Every time I refer to my boss, for example, the formal "you" may prime me to be more aware of the formality and hierarchy of our relationship. So too when I must address an old friend.
  • A bigger question is whether differences between languages persist when people are not "thinking for speaking"—ie, whether they condition something we might call a robust worldview. When silently strolling down a country lane, do speakers of different languages think in profoundly different ways? The popular view is “yes”, but furious debate among researchers continues.
dpittenger

The Long Conversation - NYTimes.com - 0 views

  • It is almost as though we have forgotten the matchless healing power of relationships, a power that I can attest to, since I have been on the couch for almost 45 years with the same person.
  • Still, it was her warmth and consistency as much as her illuminations that were nudging me away from my puppetlike relation to my impulses.
  • To her mind, it was good that our relationship was that deep and strong. To my mind, too.
grayton downing

Lab-Grown Model Brains | The Scientist Magazine® - 0 views

  • In an Austrian laboratory, a team of scientists has grown three-dimensional models of embryonic human brains
  • “Even the most complex organ—the human brain—can start to form without any micro-manipulation.”
  • Knoblich cautioned that the organoids are not “brains-in-a-jar.” “We’re talking about the very first steps of embryonic brain development, like in the first nine weeks of pregnancy,” he said. “They’re nowhere near an adult human brain and they don’t form anything that resembles a neuronal network.”
  • ...6 more annotations...
  • It took a huge amount of work to fine-tune the conditions, but once the team did, the organoids grew successfully within just 20 to 30 days.
  • scientists have developed organoids that mimic several human organs, including eyes, kidneys, intestines, and even brains.
  • They really highlight the ability to just nudge these human embryonic cells and allow them to self-assemble
  • “The mouse brain isn’t good enough for studying microcephaly,” said Huttner. “You need to put those genes into an adequate model like this one. It is, after all, human. It definitely enriches the field. There’s no doubt about that.”
  • organoids are unlikely to replace animal experiments entirely. “We can’t duplicate the elegance with which one can do genetics in animal models,” he said, “but we might be able to reduce the number of animal experiments, especially when it comes to toxicology or drug testing.”
  • In the future, he hopes to develop larger organoids.
Javier E

What's a Metaphor For? - The Chronicle Review - The Chronicle of Higher Education - 0 views

  • New research in the social and cognitive sciences makes it increasingly plain that metaphorical thinking influences our attitudes, beliefs, and actions in surprising, hidden, and often oddball ways.
  • "Metaphor conditions our interpretations of the stock market and, through advertising, it surreptitiously infiltrates our purchasing decisions. In the mouths of politicians, metaphor subtly nudges public opinion; in the minds of businesspeople, it spurs creativity and innovation. In science, metaphor is the preferred nomenclature for new theories and new discoveries; in psychology, it is the natural language of human relationships and emotions."
  • many modern thinkers and scholars have agreed that all language is at root metaphorical.
  • ...2 more annotations...
  • Of all philosophical writers on metaphor, Nietzsche probably draws the strongest conclusions from this situation. ''Tropes,'' he writes, ''are not something that can be added or abstracted from language at will—they are its truest nature.'' He argues that there is ''no real knowing apart from metaphor,'' by which he means that we experience reality through metaphors, and our notion of literal meaning simply reflects the ossification of language, as figures of speech lose their vitality. He emphasizes in The Genealogy of Morals how metaphor tends to extend its sway, to bring wider ranges of experience under its wing. He goes so far as to say that ''the drive toward the formation of metaphor is the fundamental human drive.'' For him, literal and figurative meaning are not stable categories, but historical ones determined by their social context.
  • the evolution of abstractions is always a case of going from the physical and sensible to the abstract.
Javier E

A smarter way to think about willpower - The Washington Post - 0 views

  • in a self-report questionnaire completed by more than 80,000 American adults, self-control ranked lowest among 24 strengths of character.
  • three out of four parents said they thought self-control has declined in the past half-century.
  • Without a time machine that allows us to travel backward and compare Americans from different decades on the same self-control measures, we can’t be sure. Indeed, the scant scientific evidence on the question suggests that if anything, the capacity to delay gratification may be increasing.
  • ...18 more annotations...
  • there are plenty of behaviors that require self-control that have held steady or even improved in recent decades
  • Cigarette smoking has fallen sharply since the Mad Men days.
  • Alcohol consumption peaked in 1980 and has fallen back to the same level as 1960
  • Seat belts,
  • are now used by 9 out of 10 motorists.
  • the ratio of household consumption to household net worth just hit a postwar low: In 2018 consumption was 13.2 percent of net worth, down from 16.3 percent in 1946.
  • it isn’t clear that savings habits have worsened since World War II.
  • Nevertheless, like every generation before us, we crave more self-control.
  • science shows that helping people do better in the internal tug-of-war of self-control depends on creating the right external environment.
  • some temptations require hard paternalism
  • some choices are not in our best interest. Taxing, regulating, restricting or even banning especially addictive drugs may lead to more freedom
  • Cellphones and soda
  • the benefits of constraining access may, in some cases, justify the costs
  • we recommend nudges — subtle changes in how choices are framed that make doing what’s in our long-term interest more obvious, easier or more attractive
  • deploy science-backed strategies that make self-control easier.
  • putting temptations out of sight and out of reach:
  • disabling apps that, upon reflection, do more harm than good.
  • Anything you can do to put time and effort between you and indulgence makes self-control easier.
Javier E

What Cookies and Meth Have in Common - The New York Times - 0 views

  • Why would anyone continue to use recreational drugs despite the medical consequences and social condemnation? What makes someone eat more and more in the face of poor health?
  • modern humans have designed the perfect environment to create both of these addictions.
  • Drug exposure also contributes to a loss of self-control. Dr. Volkow found that low D2 was linked with lower activity in the prefrontal cortex, which would impair one’s ability to think critically and exercise restraint
  • ...17 more annotations...
  • Now we have a body of research that makes the connection between stress and addiction definitive. More surprising, it shows that we can change the path to addiction by changing our environment.
  • Neuroscientists have found that food and recreational drugs have a common target in the “reward circuit” of the brain, and that the brains of humans and other animals who are stressed undergo biological changes that can make them more susceptible to addiction.
  • In a 2010 study, Diana Martinez and colleagues at Columbia scanned the brains of a group of healthy controls and found that lower social status and a lower degree of perceived social support — both presumed to be proxies for stress — were correlated with fewer dopamine receptors, called D2s, in the brain’s reward circuit
  • The reward circuit evolved to help us survive by driving us to locate food or sex in our environment
  • Today, the more D2 receptors you have, the higher your natural level of stimulation and pleasure — and the less likely you are to seek out recreational drugs or comfort food to compensate
  • people addicted to cocaine, heroin, alcohol and methamphetamines experience a significant reduction in their D2 receptor levels that persists long after drug use has stopped. These people are far less sensitive to rewards, are less motivated and may find the world dull, once again making them prone to seek a chemical means to enhance their everyday life.
  • the myth has persisted that addiction is either a moral failure or a hard-wired behavior — that addicts are either completely in command or literally out of their minds
  • The processed food industry has transformed our food into a quasi-drug, while the drug industry has synthesized ever more powerful drugs that have been diverted for recreational use.
  • At this point you may be wondering: What controls the reward circuit in the first place? Some of it is genetic. We know that certain gene variations elevate the risk of addiction to various drugs. But studies of monkeys suggest that our environment can trump genetics and rewire the brain.
  • simply by changing the environment, you can increase or decrease the likelihood of an animal becoming a drug addict.
  • The same appears true for humans. Even people who are not hard-wired for addiction can be made dependent on drugs if they are stressed
  • Is it any wonder, then, that the economically frightening situation that so many Americans experience could make them into addicts? You will literally have a different brain depending on your ZIP code, social circumstances and stress level.
  • In 1990, no state in our country had an adult obesity rate above 15 percent; by 2015, 44 states had obesity rates of 25 percent or higher. What changed?
  • What happened is that cheap, calorie-dense foods that are highly rewarding to your brain are now ubiquitous.
  • Nothing in our evolution has prepared us for the double whammy of caloric modern food and potent recreational drugs. Their power to activate our reward circuit, rewire our brain and nudge us in the direction of compulsive consumption is unprecedented.
  • Food, like drugs, stimulates the brain’s reward circuit. Chronic exposure to high-fat and sugary foods is similarly linked with lower D2 levels, and people with lower D2 levels are also more likely to crave such foods. It’s a vicious cycle in which more exposure begets more craving.
  • Fortunately, our brains are remarkably plastic and sensitive to experience. Although it’s far easier said than done, just limiting exposure to high-calorie foods and recreational drugs would naturally reset our brains to find pleasure in healthier foods and life without drugs.
Javier E

'Our minds can be hijacked': the tech insiders who fear a smartphone dystopia | Technol... - 0 views

  • Rosenstein belongs to a small but growing band of Silicon Valley heretics who complain about the rise of the so-called “attention economy”: an internet shaped around the demands of an advertising economy.
  • “It is very common,” Rosenstein says, “for humans to develop things with the best of intentions and for them to have unintended, negative consequences.”
  • most concerned about the psychological effects on people who, research shows, touch, swipe or tap their phone 2,617 times a day.
  • ...43 more annotations...
  • There is growing concern that as well as addicting users, technology is contributing toward so-called “continuous partial attention”, severely limiting people’s ability to focus, and possibly lowering IQ. One recent study showed that the mere presence of smartphones damages cognitive capacity – even when the device is turned off. “Everyone is distracted,” Rosenstein says. “All of the time.”
  • Drawing a straight line between addiction to social media and political earthquakes like Brexit and the rise of Donald Trump, they contend that digital forces have completely upended the political system and, left unchecked, could even render democracy as we know it obsolete.
  • Without irony, Eyal finished his talk with some personal tips for resisting the lure of technology. He told his audience he uses a Chrome extension, called DF YouTube, “which scrubs out a lot of those external triggers” he writes about in his book, and recommended an app called Pocket Points that “rewards you for staying off your phone when you need to focus”.
  • “One reason I think it is particularly important for us to talk about this now is that we may be the last generation that can remember life before,” Rosenstein says. It may or may not be relevant that Rosenstein, Pearlman and most of the tech insiders questioning today’s attention economy are in their 30s, members of the last generation that can remember a world in which telephones were plugged into walls.
  • One morning in April this year, designers, programmers and tech entrepreneurs from across the world gathered at a conference centre on the shore of the San Francisco Bay. They had each paid up to $1,700 to learn how to manipulate people into habitual use of their products, on a course curated by conference organiser Nir Eyal.
  • Eyal, 39, the author of Hooked: How to Build Habit-Forming Products, has spent several years consulting for the tech industry, teaching techniques he developed by closely studying how the Silicon Valley giants operate.
  • “The technologies we use have turned into compulsions, if not full-fledged addictions,” Eyal writes. “It’s the impulse to check a message notification. It’s the pull to visit YouTube, Facebook, or Twitter for just a few minutes, only to find yourself still tapping and scrolling an hour later.” None of this is an accident, he writes. It is all “just as their designers intended”
  • He explains the subtle psychological tricks that can be used to make people develop habits, such as varying the rewards people receive to create “a craving”, or exploiting negative emotions that can act as “triggers”. “Feelings of boredom, loneliness, frustration, confusion and indecisiveness often instigate a slight pain or irritation and prompt an almost instantaneous and often mindless action to quell the negative sensation,” Eyal writes.
  • The most seductive design, Harris explains, exploits the same psychological susceptibility that makes gambling so compulsive: variable rewards. When we tap those apps with red icons, we don’t know whether we’ll discover an interesting email, an avalanche of “likes”, or nothing at all. It is the possibility of disappointment that makes it so compulsive.
  • Finally, Eyal confided the lengths he goes to protect his own family. He has installed in his house an outlet timer connected to a router that cuts off access to the internet at a set time every day. “The idea is to remember that we are not powerless,” he said. “We are in control.”
  • But are we? If the people who built these technologies are taking such radical steps to wean themselves free, can the rest of us reasonably be expected to exercise our free will?
  • Not according to Tristan Harris, a 33-year-old former Google employee turned vocal critic of the tech industry. “All of us are jacked into this system,” he says. “All of our minds can be hijacked. Our choices are not as free as we think they are.”
  • Harris, who has been branded “the closest thing Silicon Valley has to a conscience”, insists that billions of people have little choice over whether they use these now ubiquitous technologies, and are largely unaware of the invisible ways in which a small number of people in Silicon Valley are shaping their lives.
  • “I don’t know a more urgent problem than this,” Harris says. “It’s changing our democracy, and it’s changing our ability to have the conversations and relationships that we want with each other.” Harris went public – giving talks, writing papers, meeting lawmakers and campaigning for reform after three years struggling to effect change inside Google’s Mountain View headquarters.
  • He explored how LinkedIn exploits a need for social reciprocity to widen its network; how YouTube and Netflix autoplay videos and next episodes, depriving users of a choice about whether or not they want to keep watching; how Snapchat created its addictive Snapstreaks feature, encouraging near-constant communication between its mostly teenage users.
  • The techniques these companies use are not always generic: they can be algorithmically tailored to each person. An internal Facebook report leaked this year, for example, revealed that the company can identify when teens feel “insecure”, “worthless” and “need a confidence boost”. Such granular information, Harris adds, is “a perfect model of what buttons you can push in a particular person”.
  • Tech companies can exploit such vulnerabilities to keep people hooked; manipulating, for example, when people receive “likes” for their posts, ensuring they arrive when an individual is likely to feel vulnerable, or in need of approval, or maybe just bored. And the very same techniques can be sold to the highest bidder. “There’s no ethics,” he says. A company paying Facebook to use its levers of persuasion could be a car business targeting tailored advertisements to different types of users who want a new vehicle. Or it could be a Moscow-based troll farm seeking to turn voters in a swing county in Wisconsin.
  • It was Rosenstein’s colleague, Leah Pearlman, then a product manager at Facebook and on the team that created the Facebook “like”, who announced the feature in a 2009 blogpost. Now 35 and an illustrator, Pearlman confirmed via email that she, too, has grown disaffected with Facebook “likes” and other addictive feedback loops. She has installed a web browser plug-in to eradicate her Facebook news feed, and hired a social media manager to monitor her Facebook page so that she doesn’t have to.
  • Harris believes that tech companies never deliberately set out to make their products addictive. They were responding to the incentives of an advertising economy, experimenting with techniques that might capture people’s attention, even stumbling across highly effective design by accident.
  • It’s this that explains how the pull-to-refresh mechanism, whereby users swipe down, pause and wait to see what content appears, rapidly became one of the most addictive and ubiquitous design features in modern technology. “Each time you’re swiping down, it’s like a slot machine,” Harris says. “You don’t know what’s coming next. Sometimes it’s a beautiful photo. Sometimes it’s just an ad.”
  • The reality TV star’s campaign, he said, had heralded a watershed in which “the new, digitally supercharged dynamics of the attention economy have finally crossed a threshold and become manifest in the political realm”.
  • “Smartphones are useful tools,” he says. “But they’re addictive. Pull-to-refresh is addictive. Twitter is addictive. These are not good things. When I was working on them, it was not something I was mature enough to think about. I’m not saying I’m mature now, but I’m a little bit more mature, and I regret the downsides.”
  • All of it, he says, is reward-based behaviour that activates the brain’s dopamine pathways. He sometimes finds himself clicking on the red icons beside his apps “to make them go away”, but is conflicted about the ethics of exploiting people’s psychological vulnerabilities. “It is not inherently evil to bring people back to your product,” he says. “It’s capitalism.”
  • He identifies the advent of the smartphone as a turning point, raising the stakes in an arms race for people’s attention. “Facebook and Google assert with merit that they are giving users what they want,” McNamee says. “The same can be said about tobacco companies and drug dealers.”
  • McNamee chooses his words carefully. “The people who run Facebook and Google are good people, whose well-intentioned strategies have led to horrific unintended consequences,” he says. “The problem is that there is nothing the companies can do to address the harm unless they abandon their current advertising models.”
  • But how can Google and Facebook be forced to abandon the business models that have transformed them into two of the most profitable companies on the planet?
  • McNamee believes the companies he invested in should be subjected to greater regulation, including new anti-monopoly rules. In Washington, there is growing appetite, on both sides of the political divide, to rein in Silicon Valley. But McNamee worries the behemoths he helped build may already be too big to curtail.
  • Rosenstein, the Facebook “like” co-creator, believes there may be a case for state regulation of “psychologically manipulative advertising”, saying the moral impetus is comparable to taking action against fossil fuel or tobacco companies. “If we only care about profit maximisation,” he says, “we will go rapidly into dystopia.”
  • James Williams does not believe talk of dystopia is far-fetched. The ex-Google strategist who built the metrics system for the company’s global search advertising business, he has had a front-row view of an industry he describes as the “largest, most standardised and most centralised form of attentional control in human history”.
  • It is a journey that has led him to question whether democracy can survive the new technological age.
  • He says his epiphany came a few years ago, when he noticed he was surrounded by technology that was inhibiting him from concentrating on the things he wanted to focus on. “It was that kind of individual, existential realisation: what’s going on?” he says. “Isn’t technology supposed to be doing the complete opposite of this?”
  • That discomfort was compounded during a moment at work, when he glanced at one of Google’s dashboards, a multicoloured display showing how much of people’s attention the company had commandeered for advertisers. “I realised: this is literally a million people that we’ve sort of nudged or persuaded to do this thing that they weren’t going to otherwise do,” he recalls.
  • Williams and Harris left Google around the same time, and co-founded an advocacy group, Time Well Spent, that seeks to build public momentum for a change in the way big tech companies think about design. Williams finds it hard to comprehend why this issue is not “on the front page of every newspaper every day.”
  • “Eighty-seven percent of people wake up and go to sleep with their smartphones,” he says. The entire world now has a new prism through which to understand politics, and Williams worries the consequences are profound.
  • “The attention economy incentivises the design of technologies that grab our attention,” he says. “In so doing, it privileges our impulses over our intentions.”
  • That means privileging what is sensational over what is nuanced, appealing to emotion, anger and outrage. The news media is increasingly working in service to tech companies, Williams adds, and must play by the rules of the attention economy to “sensationalise, bait and entertain in order to survive”.
  • It is not just shady or bad actors who were exploiting the internet to change public opinion. The attention economy itself is set up to promote a phenomenon like Trump, who is masterly at grabbing and retaining the attention of supporters and critics alike, often by exploiting or creating outrage.
  • All of which has left Brichter, who has put his design work on the backburner while he focuses on building a house in New Jersey, questioning his legacy. “I’ve spent many hours and weeks and months and years thinking about whether anything I’ve done has made a net positive impact on society or humanity at all,” he says. He has blocked certain websites, turned off push notifications, restricted his use of the Telegram app to message only with his wife and two close friends, and tried to wean himself off Twitter. “I still waste time on it,” he confesses, “just reading stupid news I already know about.” He charges his phone in the kitchen, plugging it in at 7pm and not touching it until the next morning.
  • He stresses these dynamics are by no means isolated to the political right: they also play a role, he believes, in the unexpected popularity of leftwing politicians such as Bernie Sanders and Jeremy Corbyn, and the frequent outbreaks of internet outrage over issues that ignite fury among progressives.
  • All of which, Williams says, is not only distorting the way we view politics but, over time, may be changing the way we think, making us less rational and more impulsive. “We’ve habituated ourselves into a perpetual cognitive style of outrage, by internalising the dynamics of the medium,” he says.
  • It was another English science fiction writer, Aldous Huxley, who provided the more prescient observation when he warned that Orwellian-style coercion was less of a threat to democracy than the more subtle power of psychological manipulation, and “man’s almost infinite appetite for distractions”.
  • If the attention economy erodes our ability to remember, to reason, to make decisions for ourselves – faculties that are essential to self-governance – what hope is there for democracy itself?
  • “The dynamics of the attention economy are structurally set up to undermine the human will,” he says. “If politics is an expression of our human will, on individual and collective levels, then the attention economy is directly undermining the assumptions that democracy rests on.”
Javier E

My Mom Believes In QAnon. I've Been Trying To Get Her Out. - 0 views

  • An early adopter of the QAnon mass delusion, on board since 2018, she held firm to the claim that a Satan-worshipping cabal of child sex traffickers controlled the world and the only person standing in their way was Trump. She saw him not merely as a politician but a savior, and she expressed her devotion in stark terms.
  • “The prophets have said Trump is anointed,” she texted me once. “God is using him to finally end the evil doings of the cabal which has hurt humanity all these centuries… We are in a war between good & evil.”
  • By 2020, I’d pretty much given up on swaying my mom away from her preferred presidential candidate. We’d spent many hours arguing over basic facts I considered indisputable. Any information I cited to prove Trump’s cruelty, she cut down with a corresponding counterattack. My links to credible news sources disintegrated against a wall of outlets like One America News Network, Breitbart, and Before It’s News. Any cracks I could find in her positions were instantly undermined by the inconvenient fact that I was, in her words, a member of “the liberal media,” a brainwashed acolyte of the sprawling conspiracy trying to take down her heroic leader.
  • ...20 more annotations...
  • The irony gnawed at me: My entire vocation as an investigative reporter was predicated on being able to reveal truths, and yet I could not even rustle up the evidence to convince my own mother that our 45th president was not, in fact, the hero she believed him to be. Or, for that matter, that John F. Kennedy Jr. was dead. Or that Tom Hanks had not been executed for drinking the blood of children.
  • The theories spun from Q’s messages seemed much easier to disprove. Oprah Winfrey couldn’t have been detained during a wave of deep state arrests because we could still see her conducting live interviews on television. Trump’s 4th of July speech at Mount Rushmore came to an end without John F. Kennedy Jr. revealing he was alive and stepping in as the president’s new running mate. The widespread blackouts that her Patriot friend’s “source from the Pentagon” had warned about failed to materialize. And I could testify firsthand that the CIA had no control over my newsroom’s editorial decisions.
  • “I believe the Holy Spirit led me to the QAnons to discover the truth which is being suppressed,” she texted me. “Otherwise, how would I be able to know the truth if the lamestream media suppresses the truth?”
  • Through the years, I’d battled against conspiracy theories my mom threw at me that were far more formidable than QAnon. I’d been stumped when she asked me to prove that Beyoncé wasn’t an Illuminati member, dumbfounded when research studies I sent her weren’t enough to reach an agreement on vaccine efficacy, and too worn down to say anything more than “that’s not true” when confronted with false allegations of murders committed by prominent politicians.
  • Eventually, I accepted the impasse. It didn’t seem healthy that every conversation we had would devolve into a circuitous debate about which one of us was on the side of the bad guys. So I tried to pick my battles.
  • But what I had dismissed as damaging inconsistencies turned out to be the core strength of the belief system: It was alive, flexible, sprouting more questions than answers, more clues to study, an investigation playing out in real time, with the fate of the world at stake.
  • With no overlap between our filters of reality, I was at a loss for any facts that would actually stick.
  • Meanwhile, she wondered where she’d gone wrong with me
  • She regretted not taking politics more seriously when I was younger. I’d grown up blinkered by American privilege, trained to ignore the dirty machinations securing my comforts. My mom had shed that luxury long ago.
  • The year my mom began falling down QAnon rabbit holes, I turned the age she was when she first arrived in the States. By then, I was no longer sure that America was worth the cost of her migration. When the real estate market collapsed under the weight of Wall Street speculation, she had to sell our house at a steep loss to avoid foreclosure and her budding career as a realtor evaporated. Her near–minimum wage jobs weren’t enough to cover her bills, so her credit card debts rose. She delayed retirement plans because she saw no path to breaking even anytime soon, though she was hopeful that a turnaround was on the horizon. Through the setbacks and detours, she drifted into the arms of the people and beliefs I held most responsible for her troubles.
  • With a fervor I knew was futile, I’d tell my mom she was missing the real conspiracy: The powerful people shaping policy to benefit their own interests, to maintain wealth and white predominance, through tax cuts and voter suppression, were commandeering her support solely by catering to her stance on the one issue she cared most about.
  • The voice my mom trusted most now was Trump’s. Our disagreements were no longer ideological to her but part of a celestial conflict.
  • “I love you but you have to be on the side of good,” she texted me. “Im sad cuz u have become part of the deep state. May God have mercy on you...I pray you will see the truth of the evil agenda and be on the side of Trump.”
  • She likened her fellow Patriots to the early Christians who spread the word of Jesus at the risk of persecution. She often sent me a meme with a caption about “ordinary people who spent countless hours researching, debating, meditating and praying” for the truth to be revealed to them. “Although they were mocked, dismissed and cast off, they knew their souls had agreed long ago to do this work.”
  • Last summer, as my mom marched in a pink MAGA hat amid maskless crowds, and armed extremists stalked racial justice protests, and a disputed election loomed like a time bomb, I entertained my darkest thoughts about the fate of our country. Was there any hope in a democracy without a shared set of basic facts? Had my elders fled one authoritarian regime only for their children to face another? Amid the gloom, I found only a single morsel of solace: My mom was as hopeful as she’d ever been.
  • I wish I could offer some evidence showing that the gulf between us might be narrowing, that my love, persistence, and collection of facts might be enough to draw her back into a reality we share, and that when our wager about the storm comes due in a few months, she’ll realize that the voices she trusts have been lying to her. But I don’t think that will happen
  • What can I do but try to limit the damage? Send my mom movie recommendations to occupy the free time she instead spends on conspiracy research. Shift our conversations to the common ground of cooking recipes and family gossip. Raise objections when her beliefs nudge her toward dangerous decisions.
  • I now understand our debates as marks of the very bond I thought was disintegrating. No matter how far she believes I’ve fallen into the deep state, how hard I fight for the forces of evil, how imminent the grand plan’s rapture, my mom will be there on the other side of the line putting in a good word for me with the angels and saints, trying to save me from damnation. And those are the two realities we live in. ●
ilanaprincilus06

Get the vax, win a shotgun: US states get creative to encourage vaccination | US politi... - 1 views

  • And West Virginia upped the ante, adding the chance to win hunting rifles or shotguns.
  • Governors across the country are resorting to almost shameless incentives to lure Americans who haven’t gotten a coronavirus vaccine to willingly take a jab.
  • Businesses, too, have stepped in to nudge the unvaccinated. The percentage of a state’s population that has been vaccinated varies dramatically. Some states are approaching 70%, and others are still below 50%.
  • ...2 more annotations...
  • The incentive programs have become a bipartisan trend with governors from deep-red states like West Virginia or deep-blue states like California offering a range of inducements.
  • “It would be really great if we didn’t need any incentives at all. Hopefully, not dying is a great incentive,” the governor said, according to the Deseret News.
katedriscoll

Frontiers | A Digital Nudge to Counter Confirmation Bias | Big Data - 1 views

  • Information disorder in current information ecosystems arises not only from the publication of “fake news,” but also from individuals' subjective reading of news and from their propagating news to others. Sometimes the difference between real and fake information is apparent. However, often a message is written to evoke certain emotions and opinions by taking partially true base stories and injecting false statements such that the information looks realistic. In addition, the perception of the trustworthiness of news is often influenced by confirmation bias. As a result, people often believe distorted or outright incorrect news and spread such misinformation further. For example, it was shown that in the months preceding the 2016 American presidential election, organizations from both Russia and Iran ran organized efforts to create such stories and spread them on Twitter and Facebook (Cohen, 2018). It is therefore important to raise internet users' awareness of such practices. Key to this is providing users with means to understand whether information should be trusted or not.
  • In this section, we discuss how social networks increase the spread of biased news and misinformation. We discuss confirmation bias, echo chambers and other factors that may subconsciously influence a person's opinion. We show how these processes can interact to form a vicious circle that favors the rise of untrustworthy sources. Often, when an individual thinks they know something, they are satisfied by an explanation that confirms their belief, without necessarily considering all possible other explanations, and regardless of the veracity of this information. This is confirmation bias in action. Nickerson (1998) defined it as the tendency of people to both seek and interpret evidence that supports an already-held belief.
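The annotated paper proposes surfacing trust cues to users before they read or share news. A minimal sketch of what such a "digital nudge" might look like in practice, purely as an illustration: the source lists, the echo-chamber threshold, and the prompt wording below are all assumptions for the sake of the example, not details taken from the paper.

```python
# Hypothetical sketch of a "digital nudge" against confirmation bias:
# before a user opens or shares an article, compare its source against
# small illustrative trust/watch lists and the user's recent reading
# history, and surface a gentle prompt rather than blocking anything.
# Domain names and the 0.8 threshold are assumptions, not from the paper.

from collections import Counter
from typing import Optional

TRUSTED = {"apnews.example", "reuters.example"}   # assumed allowlist
FLAGGED = {"beforeitsnews.example"}               # assumed watchlist


def nudge_message(source: str, history: list[str]) -> Optional[str]:
    """Return a nudge prompt for this article, or None if no nudge applies."""
    if source in FLAGGED:
        return "This source has repeatedly published false stories. Read anyway?"
    if source not in TRUSTED and history:
        # Echo-chamber check: is the user's recent reading dominated
        # by a single outlet?
        top_source, top_count = Counter(history).most_common(1)[0]
        if top_count / len(history) > 0.8:
            return "You've mostly read one outlet recently. See other coverage?"
    return None  # trusted or sufficiently varied reading: stay silent


# A flagged source triggers a warning; a trusted one does not.
print(nudge_message("beforeitsnews.example", []))
print(nudge_message("apnews.example", []))
```

Note the design choice, in keeping with nudge theory: the function only returns a prompt, never blocks the action, so opting out of the nudge stays as easy as dismissing the message.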