History Readings: Group items tagged heuristics

Cognitive Biases and the Human Brain - The Atlantic

  • If I had to single out a particular bias as the most pervasive and damaging, it would probably be confirmation bias. That’s the effect that leads us to look for evidence confirming what we already think or suspect, to view facts and ideas we encounter as further confirmation, and to discount or ignore any piece of evidence that seems to support an alternate view
  • At least with the optical illusion, our slow-thinking, analytic mind—what Kahneman calls System 2—will recognize a Müller-Lyer situation and convince itself not to trust the fast-twitch System 1’s perception
  • The whole idea of cognitive biases and faulty heuristics—the shortcuts and rules of thumb by which we make judgments and predictions—was more or less invented in the 1970s by Amos Tversky and Daniel Kahneman
  • Tversky died in 1996. Kahneman won the 2002 Nobel Prize in Economics for the work the two men did together, which he summarized in his 2011 best seller, Thinking, Fast and Slow. Another best seller, last year’s The Undoing Project, by Michael Lewis, tells the story of the sometimes contentious collaboration between Tversky and Kahneman
  • Another key figure in the field is the University of Chicago economist Richard Thaler. One of the biases he’s most linked with is the endowment effect, which leads us to place an irrationally high value on our possessions.
  • In an experiment conducted by Thaler, Kahneman, and Jack L. Knetsch, half the participants were given a mug and then asked how much they would sell it for. The average answer was $5.78. The rest of the group said they would spend, on average, $2.21 for the same mug. This flew in the face of classic economic theory, which says that at a given time and among a certain population, an item has a market value that does not depend on whether one owns it or not. Thaler won the 2017 Nobel Prize in Economics.
  • “The question that is most often asked about cognitive illusions is whether they can be overcome. The message … is not encouraging.”
  • Kahneman and others draw an analogy based on an understanding of the Müller-Lyer illusion, two parallel lines with arrows at each end. One line’s arrows point in; the other line’s arrows point out. Because of the direction of the arrows, the latter line appears shorter than the former, but in fact the two lines are the same length.
  • In this context, his pessimism relates, first, to the impossibility of effecting any changes to System 1—the quick-thinking part of our brain and the one that makes mistaken judgments tantamount to the Müller-Lyer line illusion
  • that’s not so easy in the real world, when we’re dealing with people and situations rather than lines. “Unfortunately, this sensible procedure is least likely to be applied when it is needed most,” Kahneman writes. “We would all like to have a warning bell that rings loudly whenever we are about to make a serious error, but no such bell is available.”
  • Because biases appear to be so hardwired and inalterable, most of the attention paid to countering them hasn’t dealt with the problematic thoughts, judgments, or predictions themselves
  • Is it really impossible, however, to shed or significantly mitigate one’s biases? Some studies have tentatively answered that question in the affirmative.
  • what if the person undergoing the de-biasing strategies was highly motivated and self-selected? In other words, what if it was me?
  • I met with Kahneman
  • Confirmation bias shows up most blatantly in our current political divide, where each side seems unable to allow that the other side is right about anything.
  • Over an apple pastry and tea with milk, he told me, “Temperament has a lot to do with my position. You won’t find anyone more pessimistic than I am.”
  • “I see the picture as unequal lines,” he said. “The goal is not to trust what I think I see. To understand that I shouldn’t believe my lying eyes.” That’s doable with the optical illusion, he said, but extremely difficult with real-world cognitive biases.
  • The most effective check against them, as Kahneman says, is from the outside: Others can perceive our errors more readily than we can.
  • “slow-thinking organizations,” as he puts it, can institute policies that include the monitoring of individual decisions and predictions. They can also require procedures such as checklists and “premortems,”
  • A premortem attempts to counter optimism bias by requiring team members to imagine that a project has gone very, very badly and write a sentence or two describing how that happened. Conducting this exercise, it turns out, helps people think ahead.
  • “My position is that none of these things have any effect on System 1,” Kahneman said. “You can’t improve intuition.
  • Perhaps, with very long-term training, lots of talk, and exposure to behavioral economics, what you can do is cue reasoning, so you can engage System 2 to follow rules. Unfortunately, the world doesn’t provide cues. And for most people, in the heat of argument the rules go out the window.
  • Kahneman describes an even earlier Nisbett article that showed subjects’ disinclination to believe statistical and other general evidence, basing their judgments instead on individual examples and vivid anecdotes. (This bias is known as base-rate neglect.)
  • over the years, Nisbett had come to emphasize in his research and thinking the possibility of training people to overcome or avoid a number of pitfalls, including base-rate neglect, fundamental attribution error, and the sunk-cost fallacy.
  • we’ve tested Michigan students over four years, and they show a huge increase in ability to solve problems. Graduate students in psychology also show a huge gain.”
  • about half give the right answer: the law of large numbers, which holds that outlier results are much more frequent when the sample size (at bats, in this case) is small. Over the course of the season, as the number of at bats increases, regression to the mean is inevitable. (A rough simulation of this effect is sketched after this list.)
  • When Nisbett asks the same question of students who have completed the statistics course, about 70 percent give the right answer. He believes this result shows, pace Kahneman, that the law of large numbers can be absorbed into System 2—and maybe into System 1 as well, even when there are minimal cues.
  • Nisbett’s second-favorite example is that economists, who have absorbed the lessons of the sunk-cost fallacy, routinely walk out of bad movies and leave bad restaurant meals uneaten.
  • When Nisbett has to give an example of his approach, he usually brings up the baseball-phenom survey. This involved telephoning University of Michigan students on the pretense of conducting a poll about sports, and asking them why there are always several Major League batters with .450 batting averages early in a season, yet no player has ever finished a season with an average that high.
  • “I know from my own research on teaching people how to reason statistically that just a few examples in two or three domains are sufficient to improve people’s reasoning for an indefinitely large number of events.”
  • Nisbett suggested another factor: “You and Amos specialized in hard problems for which you were drawn to the wrong answer. I began to study easy problems, which you guys would never get wrong but untutored people routinely do … Then you can look at the effects of instruction on such easy problems, which turn out to be huge.”
  • Nisbett suggested that I take “Mindware: Critical Thinking for the Information Age,” an online Coursera course in which he goes over what he considers the most effective de-biasing skills and concepts. Then, to see how much I had learned, I would take a survey he gives to Michigan undergraduates. So I did.
  • The course consists of eight lessons by Nisbett—who comes across on-screen as the authoritative but approachable psych professor we all would like to have had—interspersed with some graphics and quizzes. I recommend it. He explains the availability heuristic this way: “People are surprised that suicides outnumber homicides, and drownings outnumber deaths by fire. People always think crime is increasing” even if it’s not.
  • When I finished the course, Nisbett sent me the survey he and colleagues administer to Michigan undergrads
  • It contains a few dozen problems meant to measure the subjects’ resistance to cognitive biases
  • I got it right. Indeed, when I emailed my completed test, Nisbett replied, “My guess is that very few if any UM seniors did as well as you. I’m sure at least some psych students, at least after 2 years in school, did as well. But note that you came fairly close to a perfect score.”
  • In 2006, seeking to prevent another mistake of that magnitude, the U.S. government created the Intelligence Advanced Research Projects Activity (IARPA), an agency designed to use cutting-edge research and technology to improve intelligence-gathering and analysis. In 2011, IARPA initiated a program, Sirius, to fund the development of “serious” video games that could combat or mitigate what were deemed to be the six most damaging biases: confirmation bias, fundamental attribution error, the bias blind spot (the feeling that one is less biased than the average person), the anchoring effect, the representativeness heuristic, and projection bias (the assumption that everybody else’s thinking is the same as one’s own).
  • For his part, Nisbett insisted that the results were meaningful. “If you’re doing better in a testing context,” he told me, “you’ll jolly well be doing better in the real world.”
  • The New York–based NeuroLeadership Institute offers organizations and individuals a variety of training sessions, webinars, and conferences that promise, among other things, to use brain science to teach participants to counter bias. This year’s two-day summit will be held in New York next month; for $2,845, you could learn, for example, “why are our brains so bad at thinking about the future, and how do we do it better?”
  • Philip E. Tetlock, a professor at the University of Pennsylvania’s Wharton School, and his wife and research partner, Barbara Mellers, have for years been studying what they call “superforecasters”: people who manage to sidestep cognitive biases and predict future events with far more accuracy than the pundits
  • One of the most important ingredients is what Tetlock calls “the outside view.” The inside view is a product of fundamental attribution error, base-rate neglect, and other biases that are constantly cajoling us into resting our judgments and predictions on good or vivid stories instead of on data and statistics
  • most promising are a handful of video games. Their genesis was in the Iraq War
  • Nevertheless, I did not feel that reading Mindware and taking the Coursera course had necessarily rid me of my biases
  • Together with collaborators who included staff from Creative Technologies, a company specializing in games and other simulations, and Leidos, a defense, intelligence, and health research company that does a lot of government work, Morewedge devised Missing. Some subjects played the game, which takes about three hours to complete, while others watched a video about cognitive bias. All were tested on bias-mitigation skills before the training, immediately afterward, and then finally after eight to 12 weeks had passed.
  • he said he saw the results as supporting the research and insights of Richard Nisbett. “Nisbett’s work was largely written off by the field, the assumption being that training can’t reduce bias,
  • “The literature on training suggests books and classes are fine entertainment but largely ineffectual. But the game has very large effects. It surprised everyone.”
  • even the positive results reminded me of something Daniel Kahneman had told me. “Pencil-and-paper doesn’t convince me,” he said. “A test can be given even a couple of years later. But the test cues the test-taker. It reminds him what it’s all about.”
  • Morewedge told me that some tentative real-world scenarios along the lines of Missing have shown “promising results,” but that it’s too soon to talk about them.
  • In the future, I will monitor my thoughts and reactions as best I can
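
The baseball question above comes down to the law of large numbers, and a quick simulation makes it concrete. This is a minimal sketch, not anything from the article or Nisbett's survey; the true hit probability, league size, and at-bat counts below are illustrative assumptions.

```python
# Minimal sketch (illustrative assumptions, not data from the article): why .450
# batting averages show up early in a season but never survive a full one.
import random

random.seed(1)

TRUE_AVG = 0.270   # assumed underlying hit probability for every batter
N_BATTERS = 300    # rough number of everyday hitters in a league

def observed_average(at_bats: int) -> float:
    """Simulate at_bats independent plate appearances and return the observed average."""
    hits = sum(random.random() < TRUE_AVG for _ in range(at_bats))
    return hits / at_bats

for at_bats in (20, 150, 550):   # early season, mid season, full season
    averages = [observed_average(at_bats) for _ in range(N_BATTERS)]
    phenoms = sum(avg >= 0.450 for avg in averages)
    print(f"{at_bats:3d} at bats: best = {max(averages):.3f}, batters at .450+ = {phenoms}")
```

With only 20 at bats, a handful of simulated batters sit at .450 or better purely by chance; by 550 at bats none do, and every observed average has drifted back toward the assumed .270. That shrinking of extremes as the sample grows is exactly the answer the statistics-trained students give.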

Donald Trump's Unstoppable Virality - The New York Times

  • 2015 was the Year of Trump because he is the perfect candidate for our viral age. His success tells us a lot about the nature of what goes viral and how it reflects our beliefs and our fears
  • as long as stories about Mr. Trump are receiving as many eyeballs as possible, it doesn’t really matter if people are reacting negatively to him. In fact, it probably helps his popularity.
  • Virality can be about sheer news value, but emotion also plays a big role in determining what gets shared. If we think about a given news story as a disease waiting to be passed along, human emotion is its most common vector. And some emotions are more contagious.
  • the most shareable moments come when a story lights up the deepest recesses of our minds
  • “Hate, fear of the other, anger — they come directly from the nonconscious, and that’s why they’re so easy to evoke,”
  • news stories were more likely to be shared if they elicited emotions like awe, anger and anxiety.
  • “What goes viral is what we think is remarkable,” Jeff Hemsley, a professor of information studies at Syracuse University and a co-author of the book “Going Viral,” said. “In a way, it represents what we as a society think is worth talking about.”
  • That Mr. Trump is both volatile in nature and allergic to nuance is part of his viral success. Humans use mental shortcuts to process information quickly while conserving brain power. This means that we often don’t think critically about the information we’re receiving before sharing it with others.
  • Unsurprisingly, that can mean that things that are not true go viral. But lies, like fear, can maintain a powerful grasp on the human mind.
  • “Once we see something and accept it as true, it’s really, really hard to falsify the belief,” said Rosanna Guadagno, a social psychologist at the University of Texas at Dallas. “I’ve occasionally spread something that turned out to be false, and the sad thing is, I’m still trying to scrub that out of my memory as something I’ve accepted as real.”
  • According to Bradley M. Okdie, a social psychologist at Ohio State University at Newark, conservatives are more likely to share a given piece of content than liberals are, especially if it provokes a negative emotion.
  • “Conservatives tend to be a lot more reactive to negative information and they also tend to be a lot more insular in nature, and they also tend to have less tolerance for ambiguity,” Professor Okdie said. “Conservatives would prefer a negative concrete statement to a slightly positive, uncertain statement.”
  • With his us vs. them invective and his refusal to denounce hate-filled speech from some of his supporters, Mr. Trump is an echo chamber for certain corners of the far right

Volkswagen, Johnson & Johnson, and Corporate Responsibility - The Atlantic

  • The sociologist Diane Vaughan coined the phrase the normalization of deviance to describe a cultural drift in which circumstances classified as “not okay” are slowly reclassified as “okay.”
  • In the case of the Challenger space-shuttle disaster—the subject of a landmark study by Vaughan—damage to the crucial O‑rings had been observed after previous shuttle launches. Each observed instance of damage, she found, was followed by a sequence “in which the technical deviation of the [O‑rings] from performance predictions was redefined as an acceptable risk.”
  • Repeated over time, this behavior became routinized into what organizational psychologists call a “script.” Engineers and managers “developed a definition of the situation that allowed them to carry on as if nothing was wrong.” To clarify: They were not merely acting as if nothing was wrong. They believed it, bringing to mind Orwell’s concept of doublethink, the method by which a bureaucracy conceals evil not only from the public but from itself.
  • If that comparison sounds overwrought, consider the words of Denny Gioia, a management professor at Penn State who, in the early 1970s, was the coordinator of product recalls at Ford. At the time, the Ford Pinto was showing a tendency to explode when hit from behind, incinerating passengers. Twice, Gioia and his team elected not to recall the car—a fact that, when revealed to his M.B.A. students, goes off like a bomb. “Before I went to Ford I would have argued strongly that Ford had an ethical obligation to recall,” he wrote in the Journal of Business Ethics some 17 years after he’d left the company. “I now argue and teach that Ford had an ethical obligation to recall. But, while I was there, I perceived no strong obligation to recall and I remember no strong ethical overtones to the case whatsoever.”
  • Executives are bombarded with information. To ease the cognitive load, they rely on a set of unwritten scripts imported from the organization around them. You could even define corporate culture as a collection of scripts.
  • back to Volkswagen. You cannot unconsciously install a “defeat device” into hundreds of thousands of cars. You need to be sneaky, and thus deliberate.
  • The most troubling thing, says Vaughan, is the way scripts “expand like an elastic waistband” to accommodate more and more divergence.
  • Embarrassed and unable to overturn the script they themselves had built in the preceding years, Morton-Thiokol’s brass buckled. The “no launch” recommendation was reversed to “launch.”
  • “It’s like losing your virginity,” a NASA teleconference participant later told Vaughan. “Once you’ve done it, you can’t go back.” If you try, you face a credibility spiral: Were you lying then or are you lying now?
  • Scripts are undoubtedly efficient. Managers don’t have to muddle through each new problem afresh, Gioia wrote, because “the mode of handling such problems has already been worked out in advance.” But therein lies the danger. Scripts can be flawed, and grow more so over time, yet they discourage active analysis
  • the final decision to deceive was, on an individual level, rational—the logical end to a long sequence.
  • This sequence of events fits a pattern that appears and reappears in corporate-misconduct cases, beginning with the fantastic commitments made from on high.
  • All of which placed personnel in a position of extreme strain.
  • We know what strain does to people. Even without it, they tend to underestimate the probability of future bad events. Put them under emotional stress, some research suggests, and this tendency gets amplified. People will favor decisions that preempt short-term social discomfort even at the cost of heightened long-term risk. Faced with the immediate certainty of a boss’s wrath or the distant possibility of blowback from a faceless agency, many will focus mostly on the former.
  • What James Burke, Johnson & Johnson’s CEO, did was anticipate the possible results of these pressures, well before they built up. He shared Henry James’s “imagination of disaster.” And it’s why he introduced, if you will, a set of counterscripts. It was a conscious effort to tinker with the unconscious criteria by which decisions at his company were made. The result was an incremental descent into integrity, a slide toward soundness, and the normalization of referencing “Our Credo” in situations that might otherwise have seemed devoid of ethical content.
  • This reaction isn’t excusable. But it is predictable.
  • What we know of Ferdinand Piëch, Volkswagen’s chairman before the scandal, is that he was no James Burke. At a 2008 corruption trial that sent one VW executive to jail, Piëch referred to alleged widespread use of VW funds on prostitutes as mere “irregularities,” and chided a lawyer for mispronouncing Lamborghini. (“Those who can’t afford one should say it properly” were his precise words.) This was around the time the emissions cheating began.
  • “Culture starts at the top,” a businessman recently said in an interview with the Association of Certified Fraud Examiners. “But it doesn’t start at the top with pretty statements. Employees will see through empty rhetoric and will emulate the nature of top-management decision making … A robust ‘code of conduct’ can be emasculated by one action of the CEO or CFO.”

Americans Believe in Climate Change, But Not Climate Action

  • Last month, scientists warned that we had only about 12 years to cut global emissions in half and that doing so would require a worldwide mobilization on the scale of that for World War II.
  • perhaps it should not be surprising that, even in many of the world’s most progressive places, even in the moment of acknowledged environmental crisis, a sort of climate NIMBYism prevails. The cost of inaction is sort of unthinkable — annual deadly heat waves and widespread famine, tens of millions of climate refugees, global coastal flooding, and disasters that will cost double the world’s present-day wealth. And so we choose, most of the time, not to think about it
  • This is denial, too, whatever you check on a survey about whether you “believe” the climate is changing.
  • hard-core, bought-and-paid-for denialism is pernicious for many reasons — in fact, it may help explain why so few Americans believe “most scientists think global warming is happening.” According to the most recent Yale Climate Opinion Survey, just 49 percent do.
  • what is perhaps most remarkable about that same study is that many more Americans believe climate change is happening than believe scientists believe it: 70 percent say global warming is real, and ongoing, versus just 14 percent who say it isn’t.
  • One way of looking at that data is to say that we are, despite what we hear in the media, overwhelmingly a nation of climate-change believers, not deniers — and, in fact, a nation genuinely concerned about it
  • “denial is mostly a distraction at this point.” (“Those still unconvinced mostly cannot or do not want to be convinced,” he added, meaning, “It’s time to stop framing persuasion as the primary task here.”)
  • Another is that even those of us who believe in warming, and believe it is a problem, do not believe enough in it
  • the rest of us are only moderately worried, perhaps in part because we imagine the worst impacts of climate change will hit elsewhere. Forty-one percent of Americans believe climate change “will harm me personally” — actually quite a high number, in absolute terms, but considerably lower than the 62 percent who believe it will harm those in the developing world or the 70 percent who believe it will harm future generations
  • What are those coping mechanisms? Why can’t we see the threat right in front of us?
  • It’s fucking scary. For years now, researchers have known that “unrealistic optimism is a pervasive human trait,” one that, whatever you know about how social-media addicts get used to bad news, leads us to discount scary information and embrace the sunnier stuff
  • the generation of economists and behavioral psychologists who’ve spent the last few decades enumerating all of our cognitive biases have compiled a whole literature of problems with how we process the world, almost every single example of which distorts and distends our perception of a changing climate, typically by making us discount the threat.
  • anchoring, which explains how we build mental models around as few as one or two initial examples, no matter how unrepresentative — in the case of global warming, the world we know today, which is reassuringly temperate
  • the ambiguity effect, which suggests that most people are so uncomfortable contemplating uncertainty they will accept lesser outcomes in a bargain to avoid dealing with it
  • In theory, with climate, uncertainty should be an argument for action — much of the ambiguity arises from the range of possible human inputs, a quite concrete prompt we choose to process instead as a riddle, which discourages us
  • anthropocentric thinking, by which we build our view of the universe outward from our own experience, a reflexive tendency that some especially ruthless environmentalists have derided as “human supremacy” and that surely shapes our ability to apprehend genuinely existential threats to the species — a shortcoming that many climate scientists have mocked. “The planet will survive,” they say. “It’s the humans that may not.”
  • Among the most destructive effects that appear later in the library are these:
  • the bystander effect, or our tendency to wait for others to act rather than acting ourselves;
  • confirmation bias, by which we seek evidence for what we already understand to be true rather than endure the cognitive pain of reconceptualizing our world
  • the default effect, or tendency to choose the present option over alternatives, which is related to the status quo bias, or preference for things as they are, however bad that is
  • the endowment effect, or the instinct to demand more to give up something we have — more than we actually value it (or had paid to acquire or establish it)
  • We have an illusion of control, the behavioral economists tell us, and also suffer from overconfidence. We can’t see anything but through cataracts of self-deception.
  • Already, Yale says, 70 percent of Americans believe “environmental protection is more important than economic growth.” Nudging that number up to 75 percent isn’t the important thing; what’s important is getting those 70 percent to feel their conviction fiercely, to elevate action on climate change to a first-order political priority by speaking loudly about it and to disempower, however we can, those forces conspiring to silence us.
  • Even the ones in our own heads.

Is The 'Green New Deal' Smart Politics For Democrats? | FiveThirtyEight

  • To my understanding, the Green New Deal is pretty clearly written as (and meant as) a rallying cry, “This is what we care about. Let’s move the ‘Overton Window’ kind of stuff.” So why are people treating it like it is (or was meant to be) a detailed policy proposal? It feels like going to an auto show to see “Car of the Future” designs, and then being pissed that you’re not looking at a 2017 Taurus.
  • I think it has a lot to do with the presidential campaign. Democratic candidates want to be able to point out that they’re on board with the new left-leaning litmus tests without having to get pinned down by policies that might prove controversial. I think that’s a learned behavior from the 2016 campaign: people don’t vote on detailed policy proposals, they vote on the good feelings evoked by broad goals.
  • I do think there’s an implicit critique of Obama in there. That he was naive to think the Republicans would go along with his agenda. And that taking half-measures doesn’t really get you anywhere. In fact, it might weaken your bargaining position relative to demanding a ***lot*** and then settling for half of what you get.
  • My guess is that GND activists are right (politically) about the Overton Window stuff — wanting big, bold sweeping initiatives instead of incrementalism. But that they’re wrong (politically) about the strategy of lumping environmental policy along with a grab bag of other left-ish policy positions, instead of being more targeted.
  • But I have no idea. It’s just my priors, and they’re fairly weak priors.
  • I actually think one of the better arguments for swinging for the fences in terms of the GND is that because the Senate is so resistant to change, you need some kind of paradigm shift
  • A paradigm shift where even action that seems incremental is actually quite bold, just because the goalposts have shifted so much.
  • natesilver: I think the shift would just be a generational one. There’s a *lot* of evidence that people under about age 40 are willing to consider left-wing worldviews that a previous generation might have considered too radical. People under age 40 have also lived with two really unpopular Republican presidents, Bush and Trump (along with one semi-popular Democratic one). So I think there’s a decent chance that policy in the U.S. shifts significantly to the left as those young people grow older and gain influence and power.
  • natesilver: I’m not on the fence so much as I just have no f’ing clue. I guess the heuristic is “what we tried before didn’t work, so let’s try something new”, which I suppose on some level I agree with
  • Like, maybe the GND isn’t any more likely to succeed than incrementalism, but when it *does* succeed, there’s a much bigger payoff.
  • natesilver: That even goes a little bit to whether you think climate change is a linear or nonlinear problem. If you think we’re all fucked unless there’s a massive paradigm shift, then you take whatever chance of a paradigm shift you can get, even if you also risk a backlash. If you think climate change harms are more adaptable and/or uncertain and/or solvable by technology and/or with international agreement, maybe you want a more incremental approach.
  • The GND shouldn’t be taken as a stand-in for the overall debate about incrementalism vs. the big swing. You could very easily think that an incremental approach works for health care but is a disaster for the environment, for instance.

A 'carbon law' offers pathway to halve emissions every decade -- ScienceDaily

  • a carbon roadmap, driven by a simple rule of thumb or "carbon law" of halving emissions every decade, could catalyse disruptive innovation.
  • The authors say fossil-fuel emissions should peak by 2020 at the latest and fall to around zero by 2050 to meet the UN's Paris Agreement's climate goal of limiting the global temperature rise to "well below 2°C" from preindustrial times.
  • A "carbon law" approach, say the international team of scientists, ensures that the greatest efforts to reduce emissions happens sooner not later
  • The researchers say halving emissions every decade should be complemented by equally ambitious, exponential roll-out of renewables
  • For example, doubling renewables in the energy sector every 5-7 years, ramping up technologies to remove carbon from the atmosphere, and rapidly reducing emissions from agriculture and deforestation. (A rough arithmetic sketch of the halving-and-doubling trajectories follows this list.)
  • They propose that to remain on this trajectory all sectors of the economy need decadal carbon roadmaps that follow this rule of thumb, modeled on Moore's Law.
  • Moore's Law states that computer processors double in power about every two years. While it is neither a natural nor legal law, this simple rule of thumb or heuristic has been described as a "golden rule" which has held for 50 years and still drives disruptive innovation
  • a "carbon law" offers a flexible way to think about reducing carbon emissions. It can be applied across borders and economic sectors, as well as both regional and global scales.
  • "Our civilization needs to reach a socio-economic tipping point soon, and this roadmap shows just how this can happen. In particular, we identify concrete steps towards full decarbonization by 2050.
  • no single solution will do the job, and that this deep uncertainty thus implies starting today pursuing multiple options simultaneously.
  • Following a "carbon law," which is based on published energy scenarios, would give the world a 75% chance of keeping Earth below 2°C above pre-industrial temperatures, the target agreed by nations in Paris in 2015
  • How to get there: 2020: remove fossil fuel subsidies. Put a price on carbon starting at $50 per ton rising to $400 per ton by 2050. Large-scale energy efficiency measures and large scale trials of carbon sequestration begin at 100-500MtCO2/yr.
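
Taken literally, the "carbon law" is just compound halving (and, for renewables, compound doubling), so a back-of-the-envelope run shows the trajectory it implies. This is a rough sketch under stated assumptions, not a reproduction of the paper's scenarios; the 2020 emissions level and renewable share below are round numbers chosen for illustration.

```python
# Rough sketch of the "carbon law" rule of thumb: halve emissions every decade,
# double renewables roughly every 6 years. Starting values are illustrative
# assumptions, not figures from the paper.

EMISSIONS_2020 = 40.0        # assumed global CO2 emissions in 2020, GtCO2/yr
RENEWABLE_SHARE_2020 = 0.10  # assumed renewable share of energy supply in 2020

emissions = EMISSIONS_2020
renewables = RENEWABLE_SHARE_2020
for year in range(2020, 2051, 10):
    share = min(renewables, 1.0)  # cap the displayed share at 100%
    print(f"{year}: ~{emissions:4.1f} GtCO2/yr, renewables ~{share:.0%}")
    emissions /= 2               # "carbon law": halve every decade
    renewables *= 2 ** (10 / 6)  # doubling roughly every 6 years
```

Halving each decade shrinks the starting level by a factor of eight by 2050 (here, from an assumed 40 down to about 5 GtCO2/yr); that residual is what the roadmap expects carbon-removal technologies and land-use changes to absorb, which is why the authors pair the halving rule with an equally aggressive ramp-up of negative emissions.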

hjkhkj

  • Political messaging is everywhere—especially when we’re a mere month away from a presidential election. 
  • current theories suggest that people’s political beliefs are driven by their identification with a group—similar to sports team loyalties, but more rooted in values, which makes them harder to change. 
  • “We have too much information coming into our brains to process it all slowly and carefully. So our brain uses all sorts of shortcuts so that information can be processed really quickly,”
  • It turns out that just thinking about politics taps into our brains’ tendency to rely on group identity, meaning processing political thoughts may be a simple question of "Us versus Them."