
Home/ TOK Friends/ Group items tagged spying


Javier E

The Worst Part of the Woodward Tapes Isn't COVID. - 0 views

  • 1. Woodward
  • I'd like to take the other side of this Trump-Woodward story and offer two curveball views:
  • (1) I do not believe that Donald Trump "knew" how dangerous the coronavirus was. Allow me to explain.
  • ...21 more annotations...
  • This is simply how the man talks. About everything. What's more, he says everything, takes both sides of everything:
  • Does he believe any of this, either way? Almost certainly not. The man has the brain of a goldfish: He "believes" whatever is in front of him in the moment. No matter whether or not it contradicts something he believed five minutes ago or will believe ten minutes from now.
  • All this guy does is try to create panic. That's his move
  • (2) The most alarming part of the Woodward tapes is the way Trump talks about Kim Jong Un and the moment when Trump literally takes sides with Kim Jong Un against a former American president.
  • In a way, it would be comforting to believe that our president was intelligent enough to grasp the seriousness of the coronavirus, even if his judgment in how to deal with the outbreak was malicious or poor.
  • All of the available evidence suggests the opposite:
  • Donald Trump lacks the cognitive ability to understand any concepts more complicated than self-promotion or self-preservation.
  • Put those two together—constant exaggerating self-aggrandizement and the perpetual attempt to stoke panic—and what you have is a guy who was just saying stuff to Woodward.
  • After the Woodward tapes, anyone still deluding themselves about the authoritarian danger Trump poses to America is, finally, all out of excuses.
  • This, right here, is the most damning revelation from the Woodward tapes (so far):   Trump reflected on his relationships with authoritarian leaders generally, including Turkish President Recep Tayyip Erdogan. “It’s funny, the relationships I have, the tougher and meaner they are, the better I get along with them,” he told Woodward. “You know? Explain that to me someday, okay?” It's not hard to explain. And it's not funny.
  • You have this incredible rise in interest in technology and excitement about technology and the beat itself really took off while I was there. But then at the same time, you have this massive new centralization of government control over technology and the use of technology to control people and along with that rising nationalism.
  • Paul Mozur, who covers China and tech for the New York Times and is currently living in Taiwan, after the Chinese expelled all foreign journalists.
  • That was more apparent, I think, over the past five years or so after Xi Jinping really consolidated power, but the amount of cameras that went up on street corners, the degree to which you used to be able to — there’s a moment maybe seven or eight years ago — where Jack Ma talked about the Tiananmen Square crackdowns on Chinese social media and now that’s just so utterly unthinkable. The degree to which the censorship has increased now to the level where if you say certain things on WeChat, it’s very possible the police will show up at your door where you actually have a truly fully formed Internet Police. . .
  • I think a lot of Chinese people feel more secure from the cameras; there’s been a lot of propaganda out there saying the cameras are here for your safety. There is this extremely positive, almost utopian take on technology in China, and a lot of the stuff that, I think, our knee-jerk response from the United States would be to be worried about, they kind of embrace as a vision of the future. . . .
  • The main reasons WeChat is a concern if you were the United States government is number one, it’s become a major vector of the spread of Chinese propaganda and censorship, and because it’s a social network that is anchored by a vast majority of users in China who are censored and who are receptive to all this propaganda, even if you’re overseas using WeChat and not censored in the same way, what you get is mostly content shared from people who are living in a censored environment, so it basically stays a censored environment. I call that a super filter bubble; the idea is that there are multiple filter bubbles contending in a website like Facebook, but with WeChat, because it’s so dominated by government controls, you get one really big mega pro-China filter bubble that then is spread all over the world over the app, even if people outside of China don’t face the same censorship. So that’s one thing.
  • The second is the surveillance is immense and anybody who creates an account in China brings the surveillance with them overseas
  • And most people, frankly, using WeChat overseas probably created the accounts in China, and even when they don’t create the account in China, when national security priorities hit a certain level, I think they’re probably going to use it to monitor people anyway. I’ve run into a number of people who have had run-ins with the Chinese Internet Police, some in China and some outside of China, in their day-to-day life using WeChat, and then they return home and it becomes apparent that the Internet Police were watching them the whole time, and they get a visit and the police have a discussion with them about what their activities have been
  • So it’s also a major way that the Chinese government is able to spy on and monitor people overseas and then unsurprisingly, because of that, it’s used as a way for the Chinese intel services to harass people overseas. . . .
  • WeChat is particularly suited to this in part because every single person who uses WeChat within China has it linked to their real identity. And then because everybody on WeChat is linked to their real identity, you can map their relationship networks and lean on them that way.
  • It also has a bunch of tools that the Chinese police use, for instance keyword alarms: they can set an alert on a term like “Tiananmen” so that anytime you say it they get a warning, and then they go look at what you’ve written. So there’s all these tools that are uniquely created for Chinese state surveillance that are within the app that they can also use, so there’s a bunch of ways that the app is just better.
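The keyword-alarm mechanism described here can be illustrated with a minimal sketch (purely hypothetical; this is not WeChat’s actual implementation, and the watchlist, messages, and function names are all invented for illustration):

```python
# Toy illustration of keyword-based message flagging, NOT WeChat's real code.
# A monitor registers watched terms; any message containing one produces an
# alert that points a reviewer back to the sender.

WATCHLIST = {"tiananmen"}  # hypothetical watched terms, matched case-insensitively

def scan(message: str, sender: str) -> list[str]:
    """Return one alert per watched term found in the message."""
    text = message.lower()
    return [f"ALERT: {sender} used watched term '{term}'"
            for term in WATCHLIST if term in text]

# A matching message triggers an alert; an unrelated one passes silently.
print(scan("Remembering Tiananmen today", "user-123"))
print(scan("Lunch at noon?", "user-456"))  # []
```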
  • It’s also one of the very few unblocked communication tools that goes between the two countries. So for all these reasons it’s a very, very big deal. For the Chinese government, it’s an important tool of social control, and it’s been a way that they’ve been able to take the social controls that exist within China and expand them to the diaspora community in some pretty unnerving ways.
Javier E

Cognitive Biases and the Human Brain - The Atlantic - 1 views

  • Present bias shows up not just in experiments, of course, but in the real world. Especially in the United States, people egregiously undersave for retirement—even when they make enough money to not spend their whole paycheck on expenses, and even when they work for a company that will kick in additional funds to retirement plans when they contribute.
  • When people hear the word bias, many if not most will think of either racial prejudice or news organizations that slant their coverage to favor one political position over another. Present bias, by contrast, is an example of cognitive bias—the collection of faulty ways of thinking that is apparently hardwired into the human brain. The collection is large. Wikipedia’s “List of cognitive biases” contains 185 entries, from actor-observer bias (“the tendency for explanations of other individuals’ behaviors to overemphasize the influence of their personality and underemphasize the influence of their situation … and for explanations of one’s own behaviors to do the opposite”) to the Zeigarnik effect (“uncompleted or interrupted tasks are remembered better than completed ones”)
  • If I had to single out a particular bias as the most pervasive and damaging, it would probably be confirmation bias. That’s the effect that leads us to look for evidence confirming what we already think or suspect, to view facts and ideas we encounter as further confirmation, and to discount or ignore any piece of evidence that seems to support an alternate view
  • ...48 more annotations...
  • Confirmation bias shows up most blatantly in our current political divide, where each side seems unable to allow that the other side is right about anything.
  • The whole idea of cognitive biases and faulty heuristics—the shortcuts and rules of thumb by which we make judgments and predictions—was more or less invented in the 1970s by Amos Tversky and Daniel Kahneman
  • Tversky died in 1996. Kahneman won the 2002 Nobel Prize in Economics for the work the two men did together, which he summarized in his 2011 best seller, Thinking, Fast and Slow. Another best seller, last year’s The Undoing Project, by Michael Lewis, tells the story of the sometimes contentious collaboration between Tversky and Kahneman
  • Another key figure in the field is the University of Chicago economist Richard Thaler. One of the biases he’s most linked with is the endowment effect, which leads us to place an irrationally high value on our possessions.
  • In an experiment conducted by Thaler, Kahneman, and Jack L. Knetsch, half the participants were given a mug and then asked how much they would sell it for. The average answer was $5.78. The rest of the group said they would spend, on average, $2.21 for the same mug. This flew in the face of classic economic theory, which says that at a given time and among a certain population, an item has a market value that does not depend on whether one owns it or not. Thaler won the 2017 Nobel Prize in Economics.
  • “The question that is most often asked about cognitive illusions is whether they can be overcome. The message … is not encouraging.”
  • Kahneman and others draw an analogy based on an understanding of the Müller-Lyer illusion, two parallel lines with arrows at each end. One line’s arrows point in; the other line’s arrows point out. Because of the direction of the arrows, the latter line appears shorter than the former, but in fact the two lines are the same length.
  • At least with the optical illusion, our slow-thinking, analytic mind—what Kahneman calls System 2—will recognize a Müller-Lyer situation and convince itself not to trust the fast-twitch System 1’s perception
  • That’s not so easy in the real world, when we’re dealing with people and situations rather than lines. “Unfortunately, this sensible procedure is least likely to be applied when it is needed most,” Kahneman writes. “We would all like to have a warning bell that rings loudly whenever we are about to make a serious error, but no such bell is available.”
  • Because biases appear to be so hardwired and inalterable, most of the attention paid to countering them hasn’t dealt with the problematic thoughts, judgments, or predictions themselves
  • Is it really impossible, however, to shed or significantly mitigate one’s biases? Some studies have tentatively answered that question in the affirmative.
  • what if the person undergoing the de-biasing strategies was highly motivated and self-selected? In other words, what if it was me?
  • I met with Kahneman
  • Over an apple pastry and tea with milk, he told me, “Temperament has a lot to do with my position. You won’t find anyone more pessimistic than I am.”
  • “I see the picture as unequal lines,” he said. “The goal is not to trust what I think I see. To understand that I shouldn’t believe my lying eyes.” That’s doable with the optical illusion, he said, but extremely difficult with real-world cognitive biases.
  • In this context, his pessimism relates, first, to the impossibility of effecting any changes to System 1—the quick-thinking part of our brain and the one that makes mistaken judgments tantamount to the Müller-Lyer line illusion
  • The most effective check against them, as Kahneman says, is from the outside: Others can perceive our errors more readily than we can.
  • “slow-thinking organizations,” as he puts it, can institute policies that include the monitoring of individual decisions and predictions. They can also require procedures such as checklists and “premortems,”
  • A premortem attempts to counter optimism bias by requiring team members to imagine that a project has gone very, very badly and write a sentence or two describing how that happened. Conducting this exercise, it turns out, helps people think ahead.
  • “My position is that none of these things have any effect on System 1,” Kahneman said. “You can’t improve intuition.
  • Perhaps, with very long-term training, lots of talk, and exposure to behavioral economics, what you can do is cue reasoning, so you can engage System 2 to follow rules. Unfortunately, the world doesn’t provide cues. And for most people, in the heat of argument the rules go out the window.
  • Kahneman describes an even earlier Nisbett article that showed subjects’ disinclination to believe statistical and other general evidence, basing their judgments instead on individual examples and vivid anecdotes. (This bias is known as base-rate neglect.)
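Base-rate neglect is easy to state numerically. With invented but typical numbers (a condition affecting 1% of people, a test that is 90% sensitive with a 10% false-positive rate), most positive results are still false positives, because the base rate dominates; a vivid individual anecdote hides this:

```python
# Illustrative Bayes calculation; the numbers are invented for the example.
prior = 0.01        # base rate: 1% of people have the condition
sensitivity = 0.90  # P(test positive | condition)
false_pos = 0.10    # P(test positive | no condition)

# Total probability of a positive test, then Bayes' theorem.
p_positive = sensitivity * prior + false_pos * (1 - prior)
posterior = sensitivity * prior / p_positive

print(round(posterior, 3))  # 0.083: under 10%, despite a "90% accurate" test
```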
  • over the years, Nisbett had come to emphasize in his research and thinking the possibility of training people to overcome or avoid a number of pitfalls, including base-rate neglect, fundamental attribution error, and the sunk-cost fallacy.
  • Nisbett’s second-favorite example is that economists, who have absorbed the lessons of the sunk-cost fallacy, routinely walk out of bad movies and leave bad restaurant meals uneaten.
  • When Nisbett has to give an example of his approach, he usually brings up the baseball-phenom survey. This involved telephoning University of Michigan students on the pretense of conducting a poll about sports, and asking them why there are always several Major League batters with .450 batting averages early in a season, yet no player has ever finished a season with an average that high.
  • About half give the right answer: the law of large numbers, which holds that outlier results are much more frequent when the sample size (at bats, in this case) is small. Over the course of the season, as the number of at bats increases, regression to the mean is inevitable.
  • When Nisbett asks the same question of students who have completed the statistics course, about 70 percent give the right answer. He believes this result shows, pace Kahneman, that the law of large numbers can be absorbed into System 2—and maybe into System 1 as well, even when there are minimal cues.
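The survey’s answer, the law of large numbers, shows up in a quick simulation (the .300 true talent level and the at-bat counts are assumptions chosen for illustration):

```python
# Simulate a .300 hitter: tiny samples swing wildly; full seasons converge.
import random

random.seed(0)
TRUE_AVG = 0.300  # assumed true hitting ability

def batting_avg(at_bats: int) -> float:
    hits = sum(random.random() < TRUE_AVG for _ in range(at_bats))
    return hits / at_bats

early = [batting_avg(20) for _ in range(1000)]   # early season: ~20 at bats
full = [batting_avg(500) for _ in range(1000)]   # full season: ~500 at bats

# Hot starts of .450+ are common early but essentially never survive a season.
print(sum(a >= 0.45 for a in early))  # many early-season "phenoms"
print(sum(a >= 0.45 for a in full))   # essentially zero
```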
  • “We’ve tested Michigan students over four years, and they show a huge increase in ability to solve problems. Graduate students in psychology also show a huge gain.”
  • “I know from my own research on teaching people how to reason statistically that just a few examples in two or three domains are sufficient to improve people’s reasoning for an indefinitely large number of events.”
  • Nisbett suggested another factor: “You and Amos specialized in hard problems for which you were drawn to the wrong answer. I began to study easy problems, which you guys would never get wrong but untutored people routinely do … Then you can look at the effects of instruction on such easy problems, which turn out to be huge.”
  • Nisbett suggested that I take “Mindware: Critical Thinking for the Information Age,” an online Coursera course in which he goes over what he considers the most effective de-biasing skills and concepts. Then, to see how much I had learned, I would take a survey he gives to Michigan undergraduates. So I did.
  • The course consists of eight lessons by Nisbett—who comes across on-screen as the authoritative but approachable psych professor we all would like to have had—interspersed with some graphics and quizzes. I recommend it. He explains the availability heuristic this way: “People are surprised that suicides outnumber homicides, and drownings outnumber deaths by fire. People always think crime is increasing” even if it’s not.
  • When I finished the course, Nisbett sent me the survey he and colleagues administer to Michigan undergrads
  • It contains a few dozen problems meant to measure the subjects’ resistance to cognitive biases
  • I got it right. Indeed, when I emailed my completed test, Nisbett replied, “My guess is that very few if any UM seniors did as well as you. I’m sure at least some psych students, at least after 2 years in school, did as well. But note that you came fairly close to a perfect score.”
  • Nevertheless, I did not feel that reading Mindware and taking the Coursera course had necessarily rid me of my biases
  • For his part, Nisbett insisted that the results were meaningful. “If you’re doing better in a testing context,” he told me, “you’ll jolly well be doing better in the real world.”
  • The New York–based NeuroLeadership Institute offers organizations and individuals a variety of training sessions, webinars, and conferences that promise, among other things, to use brain science to teach participants to counter bias. This year’s two-day summit will be held in New York next month; for $2,845, you could learn, for example, “why are our brains so bad at thinking about the future, and how do we do it better?”
  • Philip E. Tetlock, a professor at the University of Pennsylvania’s Wharton School, and his wife and research partner, Barbara Mellers, have for years been studying what they call “superforecasters”: people who manage to sidestep cognitive biases and predict future events with far more accuracy than the pundits
  • One of the most important ingredients is what Tetlock calls “the outside view.” The inside view is a product of fundamental attribution error, base-rate neglect, and other biases that are constantly cajoling us into resting our judgments and predictions on good or vivid stories instead of on data and statistics
  • The most promising are a handful of video games. Their genesis was in the Iraq War.
  • In 2006, seeking to prevent another mistake of that magnitude, the U.S. government created the Intelligence Advanced Research Projects Activity (IARPA), an agency designed to use cutting-edge research and technology to improve intelligence-gathering and analysis. In 2011, IARPA initiated a program, Sirius, to fund the development of “serious” video games that could combat or mitigate what were deemed to be the six most damaging biases: confirmation bias, fundamental attribution error, the bias blind spot (the feeling that one is less biased than the average person), the anchoring effect, the representativeness heuristic, and projection bias (the assumption that everybody else’s thinking is the same as one’s own).
  • Together with collaborators who included staff from Creative Technologies, a company specializing in games and other simulations, and Leidos, a defense, intelligence, and health research company that does a lot of government work, Morewedge devised Missing. Some subjects played the game, which takes about three hours to complete, while others watched a video about cognitive bias. All were tested on bias-mitigation skills before the training, immediately afterward, and then finally after eight to 12 weeks had passed.
  • “The literature on training suggests books and classes are fine entertainment but largely ineffectual. But the game has very large effects. It surprised everyone.”
  • He said he saw the results as supporting the research and insights of Richard Nisbett. “Nisbett’s work was largely written off by the field, the assumption being that training can’t reduce bias,
  • Even the positive results reminded me of something Daniel Kahneman had told me. “Pencil-and-paper doesn’t convince me,” he said. “A test can be given even a couple of years later. But the test cues the test-taker. It reminds him what it’s all about.”
  • Morewedge told me that some tentative real-world scenarios along the lines of Missing have shown “promising results,” but that it’s too soon to talk about them.
  • In the future, I will monitor my thoughts and reactions as best I can