TOK Friends: Group items matching "gullibility" in title, tags, annotations or URL

Why coal-fired power handouts would be an attack on climate and common sense | Environm... - 0 views

  • The coal industry knows that to stop runaway climate change all coal-powered generators need to close. Australia joined 174 countries and the European Union in 2015, signing the Paris agreement. In doing so, Australia agreed to do its part in keeping the global temperature rise “well below” 2C.
  • According to data from the Office of the Chief Economist, the demand for coal-generated electricity has dropped by more than 15% in the past eight years.
  • Coal is now the most expensive form of new power. According to Bloomberg New Energy Finance, the cost of energy from a new coal power plant would be $134-$203/MWh. That’s more expensive than wind, solar or highly efficient combined-cycle gas (costing $61-$118/MWh, $78-$140/MWh and $74-$90/MWh, respectively).
  • ...3 more annotations...
  • The only people who still think we need the old-fashioned sort of “baseload power” that coal provides – power that is always running regardless of whether you need it – are those in the coal industry.
    • dicindioha
       
      This claim seems a bit extreme, saying that the only people still interested in coal are in the coal industry. It might be true, but I also feel as if some people do not think of where their power source comes from.
  • In the short term, that can be gas. But, in the longer term, to stop runaway climate change, that service will need to be supplied by renewable sources such as battery storage, hydro, solar thermal with storage or geothermal.
  • “As the world’s largest coal exporter, we have a vested interest in showing that we can provide both lower emissions and reliable baseload power with state-of-the-art, clean, coal-fired technology.”
  •  
    This article is really interesting because it goes to show that one side of the global warming/climate change argument is still making progress. As we learned today, it is important to walk the middle line between over-skepticism and gullibility. Here people recognize that coal emissions are bad, and countries are taking a stand to try to lower them. It does make me wonder, though, what the future of coal holds, and whether one day we really will switch to renewable energy. It seems increasingly important. One more interesting thing I found was the use of graphs to support the information; graphs used to seem to me like something people simply trust, but now I realize we have survival instincts associated even with data, and I wonder whether some people would dismiss this as "fake news."

Satire News Websites Are Cashing in on Gullible, Outraged Readers | New Republic - 1 views

  • The Daily Currant is a fake-news site of a different stripe: one entirely devoid of jokes. Whether this humorlessness is intentional or not—the site's founder contends his critics don't have a sense of subtlety—the site's business model as an ad-driven clickbait-generator relies on it. When Currant stories go viral, it's not because their satire contains essential truths, but rather because their satire is taken as truth—and usually that "truth" is engineered to outrage a particular frequency of the political spectrum. As Slate's Josh Voorhees wrote after Drudge fell for the Bloomberg story, "It's a classic Currant con, one that relies on its mark wanting to believe a particular story is true." 
  • The Daily Currant's headlines don’t engage in subtlety so much as fail entirely to signal humorous intention. That would be acceptable, perhaps even clever, if the stories themselves skillfully exploited the reader's initial credulity, the copy growing increasingly ludicrous until the reader realizes the joke. Instead, jokes sometimes materialize in the final lines, but they’re half-baked at best. The VA story ends with Obama dismissing calls for officials to resign. "Why," Obama asks, "would holding people accountable for their actions be necessary?” That's neither funny nor satirical. But it rings true to partisans who genuinely believe that Obama thinks that way—the same people who, in a flash of outrage, are most likely to share the story on social media.

The Price of Denialism - The New York Times - 0 views

  • As the comedian John Oliver so aptly put it in commenting on a recent Gallup poll that found that one in four Americans disbelieve in climate change: “You don’t need people’s opinion on a fact. You might as well have a poll asking: ‘Which number is bigger, 15 or 5?’ Or ‘Do owls exist’ or ‘Are there hats?’”
  • we are about to be steeped in political arguments on every conceivable issue, all carried out with the usual confusing mix of fact, opinion, opinion stated as fact and fact portrayed as opinion. How can we prepare ourselves to make sense of it?
  • A good first step would be to distinguish between skepticism and what has come to be known as denialism. In other words, we need to be able to tell when we believe or disbelieve in something based on high standards of evidence and when we are just engaging in a bit of motivated reasoning and letting our opinions take over
  • ...11 more annotations...
  • When we withhold belief because the evidence does not live up to the standards of science, we are skeptical. When we refuse to believe something, even in the face of what most others would take to be compelling evidence, we are engaging in denial. In most cases, we do this because at some level it upsets us to think that the theory is true.
  • The throes of denial must feel a lot like skepticism. The rest of the world “just doesn’t get it.” We are the ones being rigorous. How can others be so gullible in believing that something is “true” before all of the facts are in? Yet a warning should occur when these stars align and we find ourselves feeling self-righteous about a belief that apparently means more to us than the preservation of good standards of evidence
  • how to tell a fact from an opinion? By the time we sit down to evaluate the evidence for a scientific theory, it is probably too late. If we take the easy path in our thinking, it eventually becomes a habit. If we lie to others, sooner or later we may believe the lie ourselves. The real battle comes in training ourselves to embrace the right attitudes about belief formation in the first place, and for this we need to do a little philosophy.
  • a telltale sign of denialism: that these alleged skeptics usually have different standards of evidence for those theories that they want to believe
  • Surely few would willingly embrace the title of “denialist.” It sounds so much more rigorous and fair-minded to maintain one’s “skepticism.” To hold that the facts are not yet settled. That there is so much more that we do not know. That the science isn’t certain.
  • The problem here, however, is that this is based not only on a grave misunderstanding of science (which in a sense is never settled), but also of what it means to be a skeptic.
  • Doubting the overwhelming consensus of scientists on an empirical question, for which one has only the spottiest ideologically-motivated “evidence,” is not skepticism, it is the height of gullibility. It is to claim that it is much more likely that there is a vast conspiracy among thousands of climate scientists than that they have instead all merely arrived at the same conclusion because that is where they were led by the evidence.
  • Couldn’t the scientists nonetheless be wrong? Yes, of course. The history of science has shown us that any scientific theory (even Newton’s theory of gravity) can be wrong
  • this does not mean that one is a good skeptic merely for disbelieving the well-corroborated conclusions of science. To reject a cascade of scientific evidence that shows that the global temperature is warming and that humans are almost certainly the cause of it, is not good reasoning, even if some long-shot hypothesis comes along in 50 years to show us why we were wrong.
  • In scientific reasoning, there is such a thing as warrant. Our beliefs must be justified. This means that we should believe what the evidence tells us, even while science insists that we must also try our best to show how any given theory might be wrong. Science will sometimes miss the mark, but its successful track record suggests that there is no superior competitor in discovering the facts about the empirical world
  • When we cynically pretend to withhold belief long past the point at which ample evidence should have convinced us that something is true, we have stumbled past skepticism and landed in the realm of willful ignorance. This is not the realm of science, but of ideological crackpots

Conspiracy theory psychology: People who claim to know the truth about JFK, UFOs, and 9/11... - 0 views

  • people who suspect conspiracies aren’t really skeptics. Like the rest of us, they’re selective doubters. They favor a worldview, which they uncritically defend. But their worldview isn’t about God, values, freedom, or equality. It’s about the omnipotence of elites.
  • the prevalence of such belief, documented in surveys, has forced scholars to take it more seriously. Conspiracy theory psychology is becoming an empirical field with a broader mission: to understand why so many people embrace this way of interpreting history.
  • “People low in trust of others are likely to believe that others are colluding against them,” the authors proposed. This sort of distrust, in other words, favors a certain kind of belief. It makes you more susceptible, not less, to claims of conspiracy.
  • ...5 more annotations...
  • The more you see the world this way—full of malice and planning instead of circumstance and coincidence—the more likely you are to accept conspiracy theories of all kinds. Once you buy into the first theory, with its premises of coordination, efficacy, and secrecy, the next seems that much more plausible.
  • The common thread between distrust and cynicism, as defined in these experiments, is a perception of bad character. More broadly, it’s a tendency to focus on intention and agency, rather than randomness or causal complexity. In extreme form, it can become paranoia
  • In mild form, it’s a common weakness known as the fundamental attribution error—ascribing others’ behavior to personality traits and objectives, forgetting the importance of situational factors and chance
  • Clearly, susceptibility to conspiracy theories isn’t a matter of objectively evaluating evidence. It’s more about alienation. People who fall for such theories don’t trust the government or the media. They aim their scrutiny at the official narrative, not at the alternative explanations
  • Conspiracy believers are the ultimate motivated skeptics. Their curse is that they apply this selective scrutiny not to the left or right, but to the mainstream. They tell themselves that they’re the ones who see the lies, and the rest of us are sheep. But believing that everybody’s lying is just another kind of gullibility.

The Dangers of Pseudoscience - NYTimes.com - 0 views

  • the “demarcation problem,” the issue of what separates good science from bad science and pseudoscience (and everything in between). The problem is relevant for at least three reasons.
  • The first is philosophical: Demarcation is crucial to our pursuit of knowledge; its issues go to the core of debates on epistemology and of the nature of truth and discovery.
  • The second reason is civic: our society spends billions of tax dollars on scientific research, so it is important that we also have a good grasp of what constitutes money well spent in this regard.
  • ...18 more annotations...
  • Third, as an ethical matter, pseudoscience is not — contrary to popular belief — merely a harmless pastime of the gullible; it often threatens people’s welfare,
  • It is precisely in the area of medical treatments that the science-pseudoscience divide is most critical, and where the role of philosophers in clarifying things may be most relevant.
  • some traditional Chinese remedies (like drinking fresh turtle blood to alleviate cold symptoms) may in fact work
  • There is no question that some folk remedies do work. The active ingredient of aspirin, for example, is derived from willow bark, which had been known to have beneficial effects since the time of Hippocrates. There is also no mystery about how this happens: people have more or less randomly tried solutions to their health problems for millennia, sometimes stumbling upon something useful
  • What makes the use of aspirin “scientific,” however, is that we have validated its effectiveness through properly controlled trials, isolated the active ingredient, and understood the biochemical pathways through which it has its effects
  • In terms of empirical results, there are strong indications that acupuncture is effective for reducing chronic pain and nausea, but sham therapy, where needles are applied at random places, or are not even pierced through the skin, turns out to be equally effective (see for instance this recent study on the effect of acupuncture on post-chemotherapy chronic fatigue), thus seriously undermining talk of meridians and Qi lines
  • Asma at one point compares the current inaccessibility of Qi energy to the previous (until this year) inaccessibility of the famous Higgs boson,
  • But the analogy does not hold. The existence of the Higgs had been predicted on the basis of a very successful physical theory known as the Standard Model. This theory is not only exceedingly mathematically sophisticated, but it has been verified experimentally over and over again. The notion of Qi, again, is not really a theory in any meaningful sense of the word. It is just an evocative word to label a mysterious force
  • Philosophers of science have long recognized that there is nothing wrong with positing unobservable entities per se, it’s a question of what work such entities actually do within a given theoretical-empirical framework. Qi and meridians don’t seem to do any, and that doesn’t seem to bother supporters and practitioners of Chinese medicine. But it ought to.
  • what’s the harm in believing in Qi and related notions, if in fact the proposed remedies seem to help?
  • we can incorporate whatever serendipitous discoveries from folk medicine into modern scientific practice, as in the case of the willow bark turned aspirin. In this sense, there is no such thing as “alternative” medicine, there’s only stuff that works and stuff that doesn’t.
  • Second, if we are positing Qi and similar concepts, we are attempting to provide explanations for why some things work and others don’t. If these explanations are wrong, or unfounded as in the case of vacuous concepts like Qi, then we ought to correct or abandon them.
  • pseudo-medical treatments often do not work, or are even positively harmful. If you take folk herbal “remedies,” for instance, while your body is fighting a serious infection, you may suffer severe, even fatal, consequences.
  • Indulging in a bit of pseudoscience in some instances may be relatively innocuous, but the problem is that doing so lowers your defenses against more dangerous delusions that are based on similar confusions and fallacies. For instance, you may expose yourself and your loved ones to harm because your pseudoscientific proclivities lead you to accept notions that have been scientifically disproved, like the increasingly (and worryingly) popular idea that vaccines cause autism.
  • Philosophers nowadays recognize that there is no sharp line dividing sense from nonsense, and moreover that doctrines starting out in one camp may over time evolve into the other. For example, alchemy was a (somewhat) legitimate science in the times of Newton and Boyle, but it is now firmly pseudoscientific (movements in the opposite direction, from full-blown pseudoscience to genuine science, are notably rare).
  • The verdict by philosopher Larry Laudan, echoed by Asma, that the demarcation problem is dead and buried, is not shared by most contemporary philosophers who have studied the subject.
  • the criterion of falsifiability, for example, is still a useful benchmark for distinguishing science and pseudoscience, as a first approximation. Asma’s own counterexample inadvertently shows this: the “cleverness” of astrologers in cherry-picking what counts as a confirmation of their theory, is hardly a problem for the criterion of falsifiability, but rather a nice illustration of Popper’s basic insight: the bad habit of creative fudging and finagling with empirical data ultimately makes a theory impervious to refutation. And all pseudoscientists do it, from parapsychologists to creationists and 9/11 Truthers.
  • The borderlines between genuine science and pseudoscience may be fuzzy, but this should be even more of a call for careful distinctions, based on systematic facts and sound reasoning. To try a modicum of turtle blood here and a little aspirin there is not the hallmark of wisdom and even-mindedness. It is a dangerous gateway to superstition and irrationality.

The Dangers of Pseudoscience - NYTimes.com - 0 views

  • “demarcation problem,” the issue of what separates good science from bad science and pseudoscience
  • Demarcation is crucial to our pursuit of knowledge; its issues go to the core of debates on epistemology and of the nature of truth and discovery
  • our society spends billions of tax dollars on scientific research, so it is important that we also have a good grasp of what constitutes money well spent in this regard
  • ...10 more annotations...
  • pseudoscience is not — contrary to popular belief — merely a harmless pastime of the gullible; it often threatens people’s welfare, sometimes fatally so
  • in the area of medical treatments that the science-pseudoscience divide is most critical, and where the role of philosophers in clarifying things may be most relevant
  • What makes the use of aspirin “scientific,” however, is that we have validated its effectiveness through properly controlled trials, isolated the active ingredient, and understood the biochemical pathways through which it has its effects
  • Popper’s basic insight: the bad habit of creative fudging and finagling with empirical data ultimately makes a theory impervious to refutation. And all pseudoscientists do it, from parapsychologists to creationists and 9/11 Truthers.
  • Philosophers of science have long recognized that there is nothing wrong with positing unobservable entities per se, it’s a question of what work such entities actually do within a given theoretical-empirical framework.
  • we are attempting to provide explanations for why some things work and others don’t. If these explanations are wrong, or unfounded as in the case of vacuous concepts like Qi, then we ought to correct or abandon them.
  • no sharp line dividing sense from nonsense, and moreover that doctrines starting out in one camp may over time evolve into the other.
  • inaccessibility of the famous Higgs boson, a sub-atomic particle postulated by physicists to play a crucial role in literally holding the universe together (it provides mass to all other particles)
  • The open-ended nature of science means that there is nothing sacrosanct in either its results or its methods.
  • The borderlines between genuine science and pseudoscience may be fuzzy, but this should be even more of a call for careful distinctions, based on systematic facts and sound reasoning

Is Huckleberry Finn's ending really lacking? Not if you're talking psychology. | Litera... - 0 views

  • What is it exactly that critics of the novel’s final chapters object to?
  • As Leo Marx put it in a 1953 essay, when Tom enters the picture, Huck falls “almost completely under his sway once more, and we are asked to believe that the boy who felt pity for the rogues is now capable of making Jim’s capture the occasion for a game. He becomes Tom’s helpless accomplice, submissive and gullible.” And to Marx, this regressive transformation is as unforgiveable as it is unbelievable.
  • psychologically, the reversion is as sound as it gets, despite the fury that it inspires. Before we rush to judge Huck—and to criticize Twain for veering so seemingly off course—we’d do well to consider a few key elements of the situation.
  • ...10 more annotations...
  • Huck is a thirteen (or thereabouts)-year-old boy. He is, in other words, a teenager. What’s more, he is a teenager from the antebellum South. Add to that the disparity between his social standing and education and Tom Sawyer’s, and you get a picture of someone who is quite different from a righteous fifty-something (or even thirty-something) literary critic who is writing in the twentieth century for a literary audience. And that someone has to be judged appropriately for his age, background, and social context—and his creator, evaluated accordingly.
  • There are a few important issues at play. Huck is not an adult. Tom Sawyer is not a stranger. The South is not a psychology lab. And slavery is not a bunch of lines projected on a screen. Each one of these factors on its own is enough to complicate the situation immensely—and together, they create one big complicated mess, that makes it increasingly likely that Huck will act just as he does, by conforming to Tom’s wishes and reverting to their old group dynamic.
  • Tom is a part of Huck’s past, and there is nothing like context to cue us back to past habitual behavior in a matter of minutes. (That’s one of the reasons, incidentally, that drug addicts often revert back to old habits when back in old environments.)
  • Jim is an adult—and an adult who has become a whole lot like a parent to Huck throughout their adventures, protecting him and taking care of him (and later, of Tom as well) much as a parent would. And the behavior that he wants from Huck, when he wants anything at all, is prosocial in the extreme (an apology, to take the most famous example, for playing a trick on him in the fog; not much of an ask, it seems, unless you stop to consider that it’s a slave asking a white boy to acknowledge that he was in the wrong). Tom, on the other hand, is a peer. And his demands are far closer to the anti-social side of the scale. Is it so surprising, then, that Huck sides with his old mate?
  • Another crucial caveat to Huck’s apparent metamorphosis: we tend to behave differently in private versus public spheres.
  • behavior is highly contextual—especially when it comes to behaviors that may not be as socially acceptable as one might hope. Huck and Jim’s raft is akin to a private sphere. It is just them, alone on the river, social context flowing away. And when does Huck’s behavior start to shift? The moment that he returns to a social environment, when he joins the Grangerfords in their family feud.
  • When the researchers looked at conformity to parents, they found a steady decrease in conforming behavior. Indeed, for the majority of measures, peer and parental conformity were negatively correlated. And what’s more, the sharpest decline was in conformity to pro-social behaviors.
  • On the raft, Jim was in a new environment, where old rules need not apply—especially given its private nature. But how quickly old ways kick back in, irrespective of whether you were a Huck or a Jim in that prior context.
  • there is a chasm, she points out, between Huck’s stated affection for Jim and his willingness to then act on it, especially in these final episodes. She blames the divide on Twain’s racism. But wouldn’t it be more correct to blame Huck’s only too real humanity?
  • Twain doesn’t make Huck a hero. He makes him real. Can we blame the book for telling it like it is?

The Dangers of Pseudoscience - 0 views

  • Philosophers of science have been preoccupied for a while with what they call the “demarcation problem,” the issue of what separates good science from bad science and pseudoscience (and everything in between).
  • Demarcation is crucial to our pursuit of knowledge; its issues go to the core of debates on epistemology and of the nature of truth and discovery
  • our society spends billions of tax dollars on scientific research, so it is important that we also have a good grasp of what constitutes money well spent in this regard
  • ...2 more annotations...
  • pseudoscience is not — contrary to popular belief — merely a harmless pastime of the gullible; it often threatens people’s welfare, sometimes fatally so
  • It is precisely in the area of medical treatments that the science-pseudoscience divide is most critical, and where the role of philosophers in clarifying things may be most relevant.
  •  
    Pseudoscience is dangerous for three reasons: a philosophical, a civic, and an ethical one.

ROUGH TYPE | Nicholas Carr's blog - 0 views

  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • ...39 more annotations...
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Social skills and relationships seem to suffer as well.
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.

Thieves of experience: On the rise of surveillance capitalism - 1 views

  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • ...72 more annotations...
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
  • Social skills and relationships seem to suffer as well.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Google’s once-patient investors grew restive, demanding that the founders figure out a way to make money, preferably lots of it.
  • Under pressure, Page and Brin authorized the launch of an auction system for selling advertisements tied to search queries. The system was designed so that the company would get paid by an advertiser only when a user clicked on an ad. This feature gave Google a huge financial incentive to make accurate predictions about how users would respond to ads and other online content. Even tiny increases in click rates would bring big gains in income. And so the company began deploying its stores of behavioral data not for the benefit of users but to aid advertisers — and to juice its own profits. Surveillance capitalism had arrived.
  • Google’s business now hinged on what Zuboff calls “the extraction imperative.” To improve its predictions, it had to mine as much information as possible from web users. It aggressively expanded its online services to widen the scope of its surveillance.
  • Through Gmail, it secured access to the contents of people’s emails and address books. Through Google Maps, it gained a bead on people’s whereabouts and movements. Through Google Calendar, it learned what people were doing at different moments during the day and whom they were doing it with. Through Google News, it got a readout of people’s interests and political leanings. Through Google Shopping, it opened a window onto people’s wish lists,
  • The company gave all these services away for free to ensure they’d be used by as many people as possible. It knew the money lay in the data.
  • the organization grew insular and secretive. Seeking to keep the true nature of its work from the public, it adopted what its CEO at the time, Eric Schmidt, called a “hiding strategy” — a kind of corporate omerta backed up by stringent nondisclosure agreements.
  • Page and Brin further shielded themselves from outside oversight by establishing a stock structure that guaranteed their power could never be challenged, neither by investors nor by directors.
  • What’s most remarkable about the birth of surveillance capitalism is the speed and audacity with which Google overturned social conventions and norms about data and privacy. Without permission, without compensation, and with little in the way of resistance, the company seized and declared ownership over everyone’s information
  • The companies that followed Google presumed that they too had an unfettered right to collect, parse, and sell personal data in pretty much any way they pleased. In the smart homes being built today, it’s understood that any and all data will be beamed up to corporate clouds.
  • Google conducted its great data heist under the cover of novelty. The web was an exciting frontier — something new in the world — and few people understood or cared about what they were revealing as they searched and surfed. In those innocent days, data was there for the taking, and Google took it
  • Google also benefited from decisions made by lawmakers, regulators, and judges — decisions that granted internet companies free use of a vast taxpayer-funded communication infrastructure, relieved them of legal and ethical responsibility for the information and messages they distributed, and gave them carte blanche to collect and exploit user data.
  • Consider the terms-of-service agreements that govern the division of rights and the delegation of ownership online. Non-negotiable, subject to emendation and extension at the company’s whim, and requiring only a casual click to bind the user, TOS agreements are parodies of contracts, yet they have been granted legal legitimacy by the courts.
  • Law professors, writes Zuboff, “call these ‘contracts of adhesion’ because they impose take-it-or-leave-it conditions on users that stick to them whether they like it or not.” Fundamentally undemocratic, the ubiquitous agreements helped Google and other firms commandeer personal data as if by fiat.
  • In the choices we make as consumers and private citizens, we have always traded some of our autonomy to gain other rewards. Many people, it seems clear, experience surveillance capitalism less as a prison, where their agency is restricted in a noxious way, than as an all-inclusive resort, where their agency is restricted in a pleasing way
  • Zuboff makes a convincing case that this is a short-sighted and dangerous view — that the bargain we’ve struck with the internet giants is a Faustian one
  • but her case would have been stronger still had she more fully addressed the benefits side of the ledger.
  • there’s a piece missing. While Zuboff’s assessment of the costs that people incur under surveillance capitalism is exhaustive, she largely ignores the benefits people receive in return — convenience, customization, savings, entertainment, social connection, and so on
  • What the industries of the future will seek to manufacture is the self.
  • Behavior modification is the thread that ties today’s search engines, social networks, and smartphone trackers to tomorrow’s facial-recognition systems, emotion-detection sensors, and artificial-intelligence bots.
  • All of Facebook’s information wrangling and algorithmic fine-tuning, she writes, “is aimed at solving one problem: how and when to intervene in the state of play that is your daily life in order to modify your behavior and thus sharply increase the predictability of your actions now, soon, and later.”
  • “The goal of everything we do is to change people’s actual behavior at scale,” a top Silicon Valley data scientist told her in an interview. “We can test how actionable our cues are for them and how profitable certain behaviors are for us.”
  • This goal, she suggests, is not limited to Facebook. It is coming to guide much of the economy, as financial and social power shifts to the surveillance capitalists
  • Combining rich information on individuals’ behavioral triggers with the ability to deliver precisely tailored and timed messages turns out to be a recipe for behavior modification on an unprecedented scale.
  • it was Facebook, with its incredibly detailed data on people’s social lives, that grasped digital media’s full potential for behavior modification. By using what it called its “social graph” to map the intentions, desires, and interactions of literally billions of individuals, it saw that it could turn its network into a worldwide Skinner box, employing psychological triggers and rewards to program not only what people see but how they react.
  • spying on the populace is not the end game. The real prize lies in figuring out ways to use the data to shape how people think and act. “The best way to predict the future is to invent it,” the computer scientist Alan Kay once observed. And the best way to predict behavior is to script it.
  • competition for personal data intensified. It was no longer enough to monitor people online; making better predictions required that surveillance be extended into homes, stores, schools, workplaces, and the public squares of cities and towns. Much of the recent innovation in the tech industry has entailed the creation of products and services designed to vacuum up data from every corner of our lives
  • “The typical complaint is that privacy is eroded, but that is misleading,” Zuboff writes. “In the larger societal pattern, privacy is not eroded but redistributed . . . . Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism.” The transfer of decision rights is also a transfer of autonomy and agency, from the citizen to the corporation.
  • What we lose under this regime is something more fundamental than privacy. It’s the right to make our own decisions about privacy — to draw our own lines between those aspects of our lives we are comfortable sharing and those we are not
  • Other possible ways of organizing online markets, such as through paid subscriptions for apps and services, never even got a chance to be tested.
  • Online surveillance came to be viewed as normal and even necessary by politicians, government bureaucrats, and the general public
  • Google and other Silicon Valley companies benefited directly from the government’s new stress on digital surveillance. They earned millions through contracts to share their data collection and analysis techniques with the National Security Agency.
  • As much as the dot-com crash, the horrors of 9/11 set the stage for the rise of surveillance capitalism. Zuboff notes that, in 2000, members of the Federal Trade Commission, frustrated by internet companies’ lack of progress in adopting privacy protections, began formulating legislation to secure people’s control over their online information and severely restrict the companies’ ability to collect and store it. It seemed obvious to the regulators that ownership of personal data should by default lie in the hands of private citizens, not corporations.
  • The 9/11 attacks changed the calculus. The centralized collection and analysis of online data, on a vast scale, came to be seen as essential to national security. “The privacy provisions debated just months earlier vanished from the conversation more or less overnight,”

Believe what you like: How we fit the facts around our prejudices - TOK Topics - 0 views

  • This idea of a gullible, pliable populace is, of course, nothing new. Voltaire said, “those who can make you believe absurdities can make you commit atrocities”. But no, says Mercier, Voltaire had it backwards: “It is wanting to commit atrocities that makes you believe absurdities”…
  • If someone says Obama is a Muslim, their primary reason may be to indicate that they are a member of the group of people who co-ordinate around that statement. When a social belief and a true belief are in conflict, Klintman says, people will opt for the belief that best signals their social identity – even if it means lying to themselves…
  • Such a “belief” – being largely performative – rarely translates into action. It remains what Mercier calls a reflective belief, with no consequences on one’s behaviour, as opposed to an intuitive belief, which guides decisions and actions.

'The Goop Lab': Gwyneth Paltrow's Shiny, Cynical Netflix Show - The Atlantic - 0 views

  • to watch The Goop Lab as a series, with its arcing assumptions about the limitations of medical science, is also to wonder where to locate the line between open-mindedness and gullibility. It is to wonder why Gwyneth Paltrow, celebrity and salesperson, should be trusted as an arbiter of health.
  • The Goop Lab continues that lulzy approach—each episode begins with a title-card disclaimer that the show is “designed to entertain and inform” rather than offer medical advice—but combines the mirth with deep earnestness. That creates its own kind of chaos. What is the meaningful difference, legal niceties aside, between “information” and “advice”?
  • The Goop Lab is streaming into a moment in America that finds Medicare for All under discussion and the Affordable Care Act under attack. It presents itself as airy infotainment even as many Americans are unable to access even the most basic forms of medical care. That makes the show deeply uncomfortable to watch.

With Dr. Stella Immanuel's viral video, this was the week America lost the war on misinformation - 0 views

  • With nearly 150,000 dead from covid-19, we’ve not only lost the public-health war, we’ve lost the war for truth. Misinformation and lies have captured the castle.
  • And the bad guys’ most powerful weapon? Social media — in particular, Facebook
  • new research, out just this morning from Pew, tells us in painstaking numerical form exactly what’s going on, and it’s not pretty: Americans who rely on social media as their pathway to news are more ignorant and more misinformed than those who come to news through print, a news app on their phones or network TV.
  • ...6 more annotations...
  • And that group is growing.
  • “Even as Americans who primarily turn to social media for political news are less aware and knowledgeable about a wide range of events and issues in the news, they are more likely than other Americans to have heard about a number of false or unproven claims.”
  • Specifically, they’ve been far more exposed to the conspiracy theory that powerful people intentionally planned the pandemic. Yet this group, says Pew, is also less concerned about the impact of made-up news like this than the rest of the U.S. population.
  • They’re absorbing fake news, but they don’t see it as a problem. In a society that depends on an informed citizenry to make reasonably intelligent decisions about self-governance, this is the worst kind of trouble.
  • In a sweeping piece on disinformation and the 2020 campaign in February — in the pre-pandemic era — the Atlantic’s McKay Coppins concluded with a telling quote from the political theorist Hannah Arendt that bears repetition now. Through an onslaught of lies, which may be debunked before the cycle is repeated, totalitarian leaders are able to instill in their followers “a mixture of gullibility and cynicism,” she warned.
  • Over time, people are conditioned to “believe everything and nothing, think that everything was possible and that nothing was true.” And then such leaders can do pretty much whatever they wish