
Group items matching "memes" in title, tags, annotations or url
Javier E

Raymond Tallis Takes Out the 'Neurotrash' - The Chronicle Review - The Chronicle of Higher Education - 0 views

  • Tallis informs 60 people gathered in a Kent lecture hall that his talk will demolish two "pillars of unwisdom." The first, "neuromania," is the notion that to understand people you must peer into the "intracranial darkness" of their skulls with brain-scanning technology. The second, "Darwinitis," is the idea that Charles Darwin's evolutionary theory can explain not just the origin of the human species—a claim Tallis enthusiastically accepts—but also the nature of human behavior and institutions.
  • Aping Mankind argues that neuroscientific approaches to things like love, wisdom, and beauty are flawed because you can't reduce the mind to brain activity alone.
  • Stephen Cave, a Berlin-based philosopher and writer who has called Aping Mankind "an important work," points out that most philosophers and scientists do in fact believe "that mind is just the product of certain brain activity, even if we do not currently know quite how." Tallis "does both the reader and these thinkers an injustice" by declaring that view "obviously" wrong,
  • ...5 more annotations...
  • Geraint Rees, director of University College London's Institute of Cognitive Neuroscience, complains that reading Tallis is "a bit like trying to nail jelly to the wall." He "rubbishes every current theory of the relationship between mind and brain, whether philosophical or neuroscientific," while offering "little or no alternative,"
  • cultural memes. The Darwinesque concept originates in Dawkins's 1976 book, The Selfish Gene. Memes are analogous to genes, Dennett has said, "replicating units of culture" that spread from mind to mind like a virus. Religion, chess, songs, clothing, tolerance for free speech—all have been described as memes. Tallis considers it absurd to talk of a noun-phrase like "tolerance for free speech" as a discrete entity. But Dennett argues that Tallis's objections are based on "a simplistic idea of what one might mean by a unit." Memes aren't units? Well, in that spirit, says Dennett, organisms aren't units of biology, nor are species—they're too complex, with too much variation. "He's got to allow theory to talk about entities which are not simple building blocks," Dennett says.
  • How is it that he perceives the glass of water on the table? How is it that he feels a sense of self over time? How is it that he can remember a patient he saw in 1973, and then cast his mind forward to his impending visit to the zoo? There are serious problems with trying to reduce such things to impulses in the brain, he argues. We can explain "how the light gets in," he says, but not "how the gaze looks out." And isn't it astonishing, he adds, that much neural activity seems to have no link to consciousness? Instead, it's associated with things like controlling automatic movements and regulating blood pressure. Sure, we need the brain for consciousness: "Chop my head off, and my IQ descends." But it's not the whole story. There is more to perceptions, memories, and beliefs than neural impulses can explain. The human sphere encompasses a "community of minds," Tallis has written, "woven out of a trillion cognitive handshakes of shared attention, within which our freedom operates and our narrated lives are led." Those views on perception and memory anchor his attack on "neurobollocks." Because if you can't get the basics right, he says, then it's premature to look to neuroscience for clues to complex things like love.
  • Yes, many unanswered questions persist. But these are early days, and neuroscience remains immature, says Churchland, a professor emerita of philosophy at University of California at San Diego and author of the subfield-spawning 1986 book Neurophilosophy. In the 19th century, she points out, people thought we'd never understand light. "Well, by gosh," she says, "by the time the 20th century rolls around, it turns out that light is electromagnetic radiation. ... So the fact that at a certain point in time something seems like a smooth-walled mystery that we can't get a grip on, doesn't tell us anything about whether some real smart graduate student is going to sort it out in the next 10 years or not."
  • Dennett claims he's got much of it sorted out already. He wrote a landmark book on the topic in 1991, Consciousness Explained. (The title "should have landed him in court, charged with breach of the Trade Descriptions Act," writes Tallis.) Dennett uses the vocabulary of computer science to explain how consciousness emerges from the huge volume of things happening in the brain all at once. We're not aware of everything, he tells me, only a "limited window." He describes that stream of consciousness as "the activities of a virtual machine which is running on the parallel hardware of the brain." "You—the fruits of all your experience, not just your genetic background, but everything you've learned and done and all your memories—what ties those all together? What makes a self?" Dennett asks. "The answer is, and has to be, the self is like a software program that organizes the activities of the brain."
Javier E

How Alignment Charts Went From Dungeons & Dragons to a Meme - The Atlantic - 0 views

  • Bartle recommends against using an alignment chart in a virtual space or online game because, on the internet, “much of what is good or evil, lawful or chaotic, is intangible.” The internet creates so many unpredictable conflicts and confusing scenarios for human interaction that judgment becomes impossible.
  • At the same time, judgment comes down constantly online. Social-media platforms frequently enforce binary responses: either award something a heart because you love it, or reply with something quick and crude when you hate it. The internet is a space of permutations and addled context, yet, as the Motherboard writer Roisin Kiberd argued in a 2019 essay collection about meme culture, “the internet is full of reductive moral judgment.”
Javier E

How the Shoggoth Meme Has Come to Symbolize the State of A.I. - The New York Times - 0 views

  • the Shoggoth had become a popular reference among workers in artificial intelligence, as a vivid visual metaphor for how a large language model (the type of A.I. system that powers ChatGPT and other chatbots) actually works.
  • it was only partly a joke, he said, because it also hinted at the anxieties many researchers and engineers have about the tools they’re building.
  • Since then, the Shoggoth has gone viral, or as viral as it’s possible to go in the small world of hyper-online A.I. insiders. It’s a popular meme on A.I. Twitter (including a now-deleted tweet by Elon Musk), a recurring metaphor in essays and message board posts about A.I. risk, and a bit of useful shorthand in conversations with A.I. safety experts. One A.I. start-up, NovelAI, said it recently named a cluster of computers “Shoggy” in homage to the meme. Another A.I. company, Scale AI, designed a line of tote bags featuring the Shoggoth.
  • ...17 more annotations...
  • Most A.I. researchers agree that models trained using R.L.H.F. are better behaved than models without it. But some argue that fine-tuning a language model this way doesn’t actually make the underlying model less weird and inscrutable. In their view, it’s just a flimsy, friendly mask that obscures the mysterious beast underneath.
  • In a nutshell, the joke was that in order to prevent A.I. language models from behaving in scary and dangerous ways, A.I. companies have had to train them to act polite and harmless. One popular way to do this is called “reinforcement learning from human feedback,” or R.L.H.F., a process that involves asking humans to score chatbot responses, and feeding those scores back into the A.I. model.
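The R.L.H.F. loop described in this annotation can be sketched in a few lines. This is a toy illustration only, not a real training algorithm: the politeness-based scoring rule and the simple preference update are invented stand-ins for human raters and for actual policy optimization.

```python
def human_feedback(response: str) -> float:
    """Stand-in for a human rater: rewards polite, harmless phrasing (invented rule)."""
    polite_markers = ("sorry", "happy to help", "please")
    return 1.0 if any(m in response.lower() for m in polite_markers) else -1.0

def feedback_round(preferences: dict, responses: list) -> dict:
    """One scoring round: humans score each response, and the scores
    are fed back to adjust the model's preference weights."""
    for r in responses:
        preferences[r] = preferences.get(r, 0.0) + human_feedback(r)
    return preferences

prefs = feedback_round({}, [
    "I'm happy to help with that.",
    "Figure it out yourself.",
])
# The polite response accumulates positive weight; the rude one, negative.
```

Repeated over many rounds, this kind of loop shapes the model's outward behavior, which is exactly the "smiley-face mask" the meme refers to: the scoring never touches whatever the underlying model learned during pretraining.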
  • Shoggoths are fictional creatures, introduced by the science fiction author H.P. Lovecraft in his 1936 novella “At the Mountains of Madness.” In Lovecraft’s telling, Shoggoths were massive, blob-like monsters made out of iridescent black goo, covered in tentacles and eyes.
  • @TetraspaceWest said, wasn’t necessarily implying that it was evil or sentient, just that its true nature might be unknowable.
  • And it reinforces the notion that what’s happening in A.I. today feels, to some of its participants, more like an act of summoning than a software development process. They are creating the blobby, alien Shoggoths, making them bigger and more powerful, and hoping that there are enough smiley faces to cover the scary parts.
  • “I was also thinking about how Lovecraft’s most powerful entities are dangerous — not because they don’t like humans, but because they’re indifferent and their priorities are totally alien to us and don’t involve humans, which is what I think will be true about possible future powerful A.I.”
  • when Bing’s chatbot became unhinged and tried to break up my marriage, an A.I. researcher I know congratulated me on “glimpsing the Shoggoth.” A fellow A.I. journalist joked that when it came to fine-tuning Bing, Microsoft had forgotten to put on its smiley-face mask.
  • @TetraspaceWest, the meme’s creator, told me in a Twitter message that the Shoggoth “represents something that thinks in a way that humans don’t understand and that’s totally different from the way that humans think.”
  • In any case, the Shoggoth is a potent metaphor that encapsulates one of the most bizarre facts about the A.I. world, which is that many of the people working on this technology are somewhat mystified by their own creations. They don’t fully understand the inner workings of A.I. language models, how they acquire new capabilities or why they behave unpredictably at times. They aren’t totally sure if A.I. is going to be net-good or net-bad for the world.
  • That some A.I. insiders refer to their creations as Lovecraftian horrors, even as a joke, is unusual by historical standards. (Put it this way: Fifteen years ago, Mark Zuckerberg wasn’t going around comparing Facebook to Cthulhu.)
  • If it’s an A.I. safety researcher talking about the Shoggoth, maybe that person is passionate about preventing A.I. systems from displaying their true, Shoggoth-like nature.
  • A great many people are dismissive of suggestions that any of these systems are “really” thinking, because they’re “just” doing something banal (like making statistical predictions about the next word in a sentence). What they fail to appreciate is that there is every reason to suspect that human cognition is “just” doing those exact same things. It matters not that birds flap their wings but airliners don’t. Both fly. And these machines think. And, just as airliners fly faster and higher and farther than birds while carrying far more weight, these machines are already outthinking the majority of humans at the majority of tasks. Further, that machines aren’t perfect thinkers is about as relevant as the fact that air travel isn’t instantaneous. Now consider: we’re well past the Wright flyer level of thinking machine, past the early biplanes, somewhere about the first commercial airline level. Not quite the DC-10, I think. Can you imagine what the AI equivalent of a 777 will be like? Fasten your seatbelts.
  • @thomas h. You make my point perfectly. You’re observing that the way a plane flies — by using a turbine to generate thrust from combusting kerosene, for example — is nothing like the way that a bird flies, which is by using the energy from eating plant seeds to contract the muscles in its wings to make them flap. You are absolutely correct in that observation, but it’s also almost utterly irrelevant. And it ignores that, to a first approximation, there’s no difference in the physics you would use to describe a hawk riding a thermal and an airliner gliding (essentially) unpowered in its final descent to the runway. Further, you do yourself a grave disservice in being dismissive of the abilities of thinking machines, in exactly the same way that early skeptics have been dismissive of every new technology in all of human history. Writing would make people dumb; automobiles lacked the intelligence of horses; no computer could possibly beat a chess grandmaster because it can’t comprehend strategy; and on and on and on. Humans aren’t nearly as special as we fool ourselves into believing. If you want to have any hope of acting responsibly in the age of intelligent machines, you’ll have to accept that, like it or not, and whether or not it fits with your preconceived notions of what thinking is and how it is or should be done … machines can and do think, many of them better than you in a great many ways.
  • @BLA. You are incorrect. Everything has nature. Its nature is manifested in making humans react. Sure, no humans, no nature, but here we are. The writer and various sources are not attributing nature to AI so much as admitting that they don’t know what this nature might be, and there are reasons to be scared of it. More concerning to me is the idea that this field is resorting to geek culture reference points to explain and comprehend itself. It’s not so much the algorithm has no soul, but that the souls of the humans making it possible are stupendously and tragically underdeveloped.
  • When even tech companies are saying AI is moving too fast, and the articles land on page 1 of the NYT (there's an old reference), I think the greedy will not think twice about exploiting this technology, with no ethical considerations, at all.
  • @nome sane? The problem is it isn't data as we understand it. We know what the datasets are -- they were used to train the AI's. But once trained, the AI is thinking for itself, with results that have surprised everybody.
  • The unique feature of a shoggoth is it can become whatever is needed for a particular job. There’s no actual shape so it’s not a bad metaphor, if an imperfect image. Shoggoths also turned upon and destroyed their creators, so the cautionary metaphor is in there, too. A shame more Asimov wasn’t baked into AI. But then the conflict about how to handle AI in relation to people was key to those stories, too.
sissij

Trash dove: how a purple bird took over Facebook | Technology | The Guardian - 0 views

  • As noted by meme database Know Your Meme, Trash Dove exploded in popularity after it was featured alongside a dancing cat on a Thai Facebook page with millions of followers
  • Pigeons are such strange birds, they have very beautiful mottled, shimmery feathers, but they waddle around and bob their heads and beg for crumbs. They’re like beautiful doves, except they eat trash.
  • The fan art and nice comments have been the highlight for me, but I’m amazed at how mean people can be to someone they’ve never met, because of something silly online.
  • ...1 more annotation...
  • It’s better to spend time building a dedicated viewer base that will support you for you.
  •  
    The popularity of a meme can sometimes reflect the culture online and how people feel about current events. I think the popularity of the Trash Dove might suggest that people feel negative about this world, because the meaning behind the Trash Dove is that "they're like beautiful doves, except they eat trash." I find this meaning ironic. The internet is such a transparent space that every big hit somehow reflects people's values and opinions. --Sissi (2/16/2017)
Javier E

This is what it's like to grow up in the age of likes, lols and longing | The Washington Post - 1 views

  • She slides into the car, and even before she buckles her seat belt, her phone is alight in her hands. A 13-year-old girl after a day of eighth grade.
  • She doesn’t respond, her thumb on Instagram. A Barbara Walters meme is on the screen. She scrolls, and another meme appears. Then another meme, and she closes the app. She opens BuzzFeed. There’s a story about Florida Gov. Rick Scott, which she scrolls past to get to a story about Janet Jackson, then “28 Things You’ll Understand If You’re Both British and American.” She closes it. She opens Instagram. She opens the NBA app. She shuts the screen off. She turns it back on. She opens Spotify. Opens Fitbit. She has 7,427 steps. Opens Instagram again. Opens Snapchat. She watches a sparkly rainbow flow from her friend’s mouth. She watches a YouTube star make pouty faces at the camera. She watches a tutorial on nail art. She feels the bump of the driveway and looks up. They’re home. Twelve minutes have passed.
  • Katherine Pommerening’s iPhone is the place where all of her friends are always hanging out. So it’s the place where she is, too.
  • ...19 more annotations...
  • “Over 100 likes is good, for me. And comments. You just comment to make a joke or tag someone.”
  • The best thing is the little notification box, which means someone liked, tagged or followed her on Instagram. She has 604 followers. There are only 25 photos on her page because she deletes most of what she posts. The ones that don’t get enough likes, don’t have good enough lighting or don’t show the coolest moments in her life must be deleted.
  • Sociologists, advertisers, stock market analysts – everyone wants to know what happens when the generation born glued to screens has to look up and interact with the world.
  • “It kind of, almost, promotes you as a good person. If someone says, ‘tbh you’re nice and pretty,’ that kind of, like, validates you in the comments. Then people can look at it and say ‘Oh, she’s nice and pretty.’ ”
  • School is where she thrives: She is beloved by her teachers, will soon star as young Simba in the eighth-grade performance of “The Lion King” musical, and gets straight A’s. Her school doesn’t offer a math course challenging enough for her, so she takes honors algebra online through Johns Hopkins University.
  • “Happy birthday posts are a pretty big deal,” she says. “It really shows who cares enough to put you on their page.”
  • He checks the phone bill to see who she’s called and how much she’s been texting, but she barely calls anyone and chats mostly through Snapchat, where her messages disappear.
  • Some of Katherine’s very best friends have never been to her house, or she to theirs. To Dave, it seems like they rarely hang out, but he knows that to her, it seems like they’re together all the time.
  • Dave Pommerening wants to figure out how to get her to use it less. One month, she ate up 18 gigabytes of data. Most large plans max out at 10. He intervened and capped her at four GB. “I don’t want to crimp it too much,” he says. “That’s something, from my perspective, I’m going to have to figure out, how to get my arms around that.”
  • Even if her dad tried snooping around her apps, the true dramas of teenage girl life are not written in the comments. Like how sometimes, Katherine’s friends will borrow her phone just to un-like all the Instagram photos of girls they don’t like. Katherine can’t go back to those girls’ pages and re-like the photos because that would be stalking, which is forbidden.
  • Or how last week, at the middle school dance, her friends got the phone numbers of 10 boys, but then they had to delete five of them because they were seventh-graders. And before she could add the boys on Snapchat, she realized she had to change her username because it was her childhood nickname and that was totally embarrassing.
  • Then, because she changed her username, her Snapchat score reverted to zero. The app awards about one point for every snap you send and receive. It’s also totally embarrassing and stressful to have a low Snapchat score. So in one day, she sent enough snaps to earn 1,000 points.
  • Snapchat is where flirting happens. She doesn’t know anyone who has sent a naked picture to a boy, but she knows it happens with older girls, who know they have met the right guy.
  • Nothing her dad could find on her phone shows that for as good as Katherine is at math, basketball and singing, she wants to get better at her phone. To be one of the girls who knows what to post, how to caption it, when to like, what to comment.
  • Katherine doesn’t need magazines or billboards to see computer-perfect women. They’re right on her phone, all the time, in between photos of her normal-looking friends. There’s Aisha, there’s Kendall Jenner’s butt. There’s Olivia, there’s YouTube star Jenna Marbles in lingerie.
  • The whole world is at her fingertips and has been for years. This, Katherine offers as a theory one day, is why she doesn’t feel like she’s 13 years old at all. She’s probably, like, 16.
  • “I don’t feel like a child anymore” she says. “I’m not doing anything childish. At the end of sixth grade” — when all her friends got phones and downloaded Snapchat, Instagram and Twitter — “I just stopped doing everything I normally did. Playing games at recess, playing with toys, all of it, done.”
  • Her scooter sat in the garage, covered in dust. Her stuffed animals were passed down to Lila. The wooden playground in the back yard stood empty. She kept her skateboard with neon yellow wheels, because riding it is still cool to her friends.
  • On the morning of her 14th birthday, Katherine wakes up to an alarm ringing on her phone. It’s 6:30 a.m. She rolls over and shuts it off in the dark. Her grandparents, here to celebrate the end of her first year of teenagehood, are sleeping in the guest room down the hall. She can hear the dogs shuffling across the hardwood downstairs, waiting to be fed. Propping herself up on her peace-sign-covered pillow, she opens Instagram. Later, Lila will give her a Starbucks gift card. Her dad will bring doughnuts to her class. Her grandparents will take her to the Melting Pot for dinner. But first, her friends will decide whether to post pictures of Katherine for her birthday. Whether they like her enough to put a picture of her on their page. Those pictures, if they come, will get likes and maybe tbhs. They should be posted in the morning, any minute now. She scrolls past a friend posing in a bikini on the beach. Then a picture posted by Kendall Jenner. A selfie with coffee. A basketball Vine. A selfie with a girl’s tongue out. She scrolls, she waits. For that little notification box to appear.
Javier E

Opinion | Gen Z slang terms are influenced by incels - The Washington Post - 0 views

  • Incels (as they’re known) are infamous for sharing misogynistic attitudes and bitter hostility toward the romantically successful
  • somehow, incels’ hateful rhetoric has bizarrely become popularized via Gen Z slang.
  • it’s common to hear the suffix “pilled” as a funny way to say “convinced into a lifestyle.” Instead of “I now love eating burritos,” for instance, one might say, “I’m so burritopilled.” “Pilled” as a suffix comes from a scene in 1999’s “The Matrix” where Neo (Keanu Reeves) had to choose between the red pill and the blue pill, but the modern sense is formed through analogy with “blackpilled,” an online slang term meaning “accepting incel ideology.”
  • ...11 more annotations...
  • the popular suffix “maxxing” for “maximizing” (e.g., “I’m burritomaxxing” instead of “I’m eating a lot of burritos”) is drawn from the incel idea of “looksmaxxing,” or “maximizing attractiveness” through surgical or cosmetic techniques.
  • Then there’s the word “cucked” for “weakened” or “emasculated.” If the taqueria is out of burritos, you might be “tacocucked,” drawing on the incel idea of being sexually emasculated by more attractive “chads.”
  • These slang terms developed on 4chan precisely because of the site’s anonymity. Since users don’t have identifiable aliases, they signal their in-group status through performative fluency in shared slang
  • there’s a dark side to the site as well — certain boards, like /r9k/, are known breeding grounds for incel discussion, and the source of the incel words being used today.
  • finally, we have the word “sigma” for “assertive male,” which comes from an incel’s desired position outside the social hierarchy.
  • Memes and niche vocabulary become a form of cultural currency, fueling their proliferation.
  • From there, those words filter out to more mainstream websites such as Reddit and eventually become popularized by viral memes and TikTok trends. Social media algorithms do the rest of the work by curating recommended content for viewers.
  • Because these terms often spread in ironic contexts, people find them funny, engage with them and are eventually rewarded with more memes featuring incel vocabulary.
  • Creators are not just aware of this process — they are directly incentivized to abet it. We know that using trending audio helps our videos perform better and that incorporating popular metadata with hashtags or captions will help us reach wider audiences
  • kids aren’t actually saying “cucked” because they’re “blackpilled”; they’re using it for the same reason all kids use slang: It helps them bond as a group. And what are they bonding over? A shared mockery of incel ideas.
  • These words capture an important piece of the Gen Z zeitgeist. We should therefore be aware of them, keeping in mind that they’re being used ironically.
Javier E

Is Science Kind of a Scam? - The New Yorker - 1 views

  • No well-tested scientific concept is more astonishing than the one that gives its name to a new book by the Scientific American contributing editor George Musser, “Spooky Action at a Distance.”
  • The ostensible subject is the mechanics of quantum entanglement; the actual subject is the entanglement of its observers.
  • his question isn’t so much how this weird thing can be true as why, given that this weird thing had been known about for so long, so many scientists were so reluctant to confront it. What keeps a scientific truth from spreading?
  • ...29 more annotations...
  • it is as if two magic coins, flipped at different corners of the cosmos, always came up heads or tails together. (The spooky action takes place only in the context of simultaneous measurement. The particles share states, but they don’t send signals.)
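The magic-coins analogy in this annotation can be mimicked with a classical pre-shared state. This is a hedged toy only: Bell's theorem shows that real entanglement cannot be reproduced by any pre-shared value, and the sketch illustrates just the narrower point the annotation makes, that the particles share states without sending signals.

```python
import random

def flip_entangled_pair():
    """Both distant 'measurements' read a state fixed at the source;
    nothing is transmitted between the two coins at flip time."""
    shared = random.choice(["heads", "tails"])
    return shared, shared

# Every trial agrees, yet no signal ever passes between the two "coins".
results = [flip_entangled_pair() for _ in range(1000)]
all_match = all(a == b for a, b in results)
```

The correlation looks spooky only if you imagine the coins deciding at flip time; here the agreement was baked in at the source, which is precisely the kind of "local hidden variable" picture that quantum experiments rule out.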
  • fashion, temperament, zeitgeist, and sheer tenacity affected the debate, along with evidence and argument.
  • The certainty that spooky action at a distance takes place, Musser says, challenges the very notion of “locality,” our intuitive sense that some stuff happens only here, and some stuff over there. What’s happening isn’t really spooky action at a distance; it’s spooky distance, revealed through an action.
  • Why, then, did Einstein’s question get excluded for so long from reputable theoretical physics? The reasons, unfolding through generations of physicists, have several notable social aspects,
  • What started out as a reductio ad absurdum became proof that the cosmos is in certain ways absurd. What began as a bug became a feature and is now a fact.
  • “If poetry is emotion recollected in tranquility, then science is tranquility recollected in emotion.” The seemingly neutral order of the natural world becomes the sounding board for every passionate feeling the physicist possesses.
  • Musser explains that the big issue was settled mainly by being pushed aside. Generational imperatives trumped evidentiary ones. The things that made Einstein the lovable genius of popular imagination were also the things that made him an easy object of condescension. The hot younger theorists patronized him,
  • There was never a decisive debate, never a hallowed crucial experiment, never even a winning argument to settle the case, with one physicist admitting, “Most physicists (including me) accept that Bohr won the debate, although like most physicists I am hard pressed to put into words just how it was done.”
  • Arguing about non-locality went out of fashion, in this account, almost the way “Rock Around the Clock” displaced Sinatra from the top of the charts.
  • The same pattern of avoidance and talking-past and taking on the temper of the times turns up in the contemporary science that has returned to the possibility of non-locality.
  • the revival of “non-locality” as a topic in physics may be due to our finding the metaphor of non-locality ever more palatable: “Modern communications technology may not technically be non-local but it sure feels that it is.”
  • Living among distant connections, where what happens in Bangalore happens in Boston, we are more receptive to the idea of such a strange order in the universe.
  • The “indeterminacy” of the atom was, for younger European physicists, “a lesson of modernity, an antidote to a misplaced Enlightenment trust in reason, which German intellectuals in the 1920’s widely held responsible for their country’s defeat in the First World War.” The tonal and temperamental difference between the scientists was as great as the evidence they called on.
  • Science isn’t a slot machine, where you drop in facts and get out truths. But it is a special kind of social activity, one where lots of different human traits—obstinacy, curiosity, resentment of authority, sheer cussedness, and a grudging readiness to submit pet notions to popular scrutiny—end by producing reliable knowledge
  • What was magic became mathematical and then mundane. “Magical” explanations, like spooky action, are constantly being revived and rebuffed, until, at last, they are reinterpreted and accepted. Instead of a neat line between science and magic, then, we see a jumpy, shifting boundary that keeps getting redrawn
  • Real-world demarcations between science and magic, Musser’s story suggests, are like Bugs’s: made on the move and as much a trap as a teaching aid.
  • In the past several decades, certainly, the old lines between the history of astrology and astronomy, and between alchemy and chemistry, have been blurred; historians of the scientific revolution no longer insist on a clean break between science and earlier forms of magic.
  • Where once logical criteria between science and non-science (or pseudo-science) were sought and taken seriously—Karl Popper’s criterion of “falsifiability” was perhaps the most famous, insisting that a sound theory could, in principle, be proved wrong by one test or another—many historians and philosophers of science have come to think that this is a naïve view of how the scientific enterprise actually works.
  • They see a muddle of coercion, old magical ideas, occasional experiment, hushed-up failures—all coming together in a social practice that gets results but rarely follows a definable logic.
  • Yet the old notion of a scientific revolution that was really a revolution is regaining some credibility.
  • David Wootton, in his new, encyclopedic history, “The Invention of Science” (Harper), recognizes the blurred lines between magic and science but insists that the revolution lay in the public nature of the new approach.
  • What killed alchemy was the insistence that experiments must be openly reported in publications which presented a clear account of what had happened, and they must then be replicated, preferably before independent witnesses.
  • Wootton, while making little of Popper’s criterion of falsifiability, makes it up to him by borrowing a criterion from his political philosophy. Scientific societies are open societies. One day the lunar tides are occult, the next day they are science, and what changes is the way in which we choose to talk about them.
  • Wootton also insists, against the grain of contemporary academia, that single observed facts, what he calls “killer facts,” really did polish off antique authorities
  • once we agree that the facts are facts, they can do amazing work. Traditional Ptolemaic astronomy, in place for more than a millennium, was destroyed by what Galileo discovered about the phases of Venus. That killer fact “serves as a single, solid, and strong argument to establish its revolution around the Sun, such that no room whatsoever remains for doubt,” Galileo wrote, and Wootton adds, “No one was so foolish as to dispute these claims.”
  • Several things flow from Wootton’s view. One is that “group think” in the sciences is often true think. Science has always been made in a cloud of social networks.
  • There has been much talk in the pop-sci world of “memes”—ideas that somehow manage to replicate themselves in our heads. But perhaps the real memes are not ideas or tunes or artifacts but ways of making them—habits of mind rather than products of mind
  • Is science, then, a club like any other, with fetishes and fashions, with schemers, dreamers, and blackballed applicants? Is there a real demarcation to be made between science and every other kind of social activity?
  • The claim that basic research is valuable because it leads to applied technology may be true but perhaps is not at the heart of the social use of the enterprise. The way scientists do think makes us aware of how we can think
Javier E

This Is Not a Market | Dissent Magazine - 0 views

  • Given how ordinary people use the term, it’s not surprising that academic economists are a little vague about it—but you’ll be glad to hear that they know they’re being vague. A generation of economists have criticized their colleagues’ inability to specify what a “market” actually is. George Stigler, back in 1967, thought it “a source of embarrassment that so little attention has been paid to the theory of markets.” Sociologists agree: according to Harrison White, there is no “neoclassical theory of the market—[only] a pure theory of exchange.” And Wayne Baker found that the idea of the market is “typically assumed—not studied” by most economists, who “implicitly characterize ‘market’ as a ‘featureless plane.’”
  • When we say “market” now, we mean nothing particularly specific, and, at the same time, everything—the entire economy, of course, but also our lives in general. If you can name it, there’s a market in it: housing, education, the law, dating. Maybe even love is “just an economy based on resource scarcity.”
  • The use of markets to describe everything is odd, because talking about “markets” doesn’t even help us understand how the economy works—let alone the rest of our lives. Even though nobody seems to know what it means, we use the metaphor freely, even unthinkingly. Let the market decide. The markets are volatile. The markets responded poorly. Obvious facts—that the economy hasn’t rebounded after the recession—are hidden or ignored, because “the market” is booming, and what is the economy other than “the market”? Well, it’s lots of other things. We might see that if we talked about it a bit differently.
  • ...9 more annotations...
  • For instance, we might choose a different metaphor—like, say, the traffic system. Sounds ridiculous? No more so than the market metaphor. After all, we already talk about one important aspect of economic life in terms of traffic: online activity. We could describe it in market terms (the market demands Trump memes!), but we use a different metaphor, because it’s just intuitively more suitable. That last Trump meme is generating a lot of traffic. Redirect your attention as required.
  • We don’t know much about markets, because we don’t deal with them very often. But most of us know plenty about traffic systems: drivers will know the frustration of trying to turn left onto a major road, of ceaseless, pointless lane-switching on a stalled rush-hour freeway, but also the joys of clear highways.
  • We know the traffic system because, whether we like it or not, we are always involved in it, from birth
  • As of birth, Jean is in the economy—even if s/he rarely goes to a market. You can’t not be an economic actor; you can’t not be part of the transport system.
  • Consider also the composition of the traffic system and the economy. A market, whatever else it is, is always essentially the same thing: a place where people can come together to buy and sell things. We could set up a market right now, with a few fences and a sign announcing that people could buy and sell. We don’t even really need the fences. A traffic system, however, is far more complex. To begin with, the system includes publicly and privately run elements: most cars are privately owned, as are most airlines
  • If we don’t evaluate traffic systems based on their size, or their growth, how do we evaluate them? Mostly, by how well they help people get where they want to go. The market metaphor encourages us to think that all economic activity is motivated by the search for profit, and pursued in the same fashion everywhere. In a market, everyone’s desires are perfectly interchangeable. But, while everybody engages in the transport system, we have no difficulty remembering that we all want to go to different places, in different ways, at different times, at different speeds, for different reasons
  • Deciding how to improve the traffic system, how to expand people’s opportunities, is obviously a question of resource allocation and prioritization on a scale that private individuals—even traders—cannot influence on their own. That’s why governments have not historically trusted the “magic of the markets” to produce better opportunities for transport. We intuitively understand that these decisions are made at the level of mass society and public policy. And, whether you like it or not, this is true for decisions about the economy as well.
  • Thinking of the economy in terms of the market—a featureless plane, with no entry or exit costs, little need for regulation, and equal opportunity for all—obscures this basic insight. And this underlying misconception creates a lot of problems: we’ve fetishized economic growth, we’ve come to distrust government regulation, and we imagine that the inequalities in our country, and our world, are natural or justified. If we imagine the economy otherwise—as a traffic system, for example—we see more clearly how the economy actually works.
  • We see that our economic life looks a lot less like going to “market” for fun and profit than it does sitting in traffic on our morning commute, hoping against hope that we’ll get where we want to go, and on time.
annabaldwin_

How Fiction Becomes Fact on Social Media - The New York Times - 0 views

  • In the coming weeks, executives from Facebook and Twitter will appear before congressional committees to answer questions about the use of their platforms by Russian hackers and others to spread misinformation and skew elections.
  • Yet the psychology behind social media platforms — the dynamics that make them such powerful vectors of misinformation in the first place — is at least as important, experts say, especially for those who think they’re immune to being duped.
  • Skepticism of online “news” serves as a decent filter much of the time, but our innate biases allow it to be bypassed, researchers have found — especially when presented with the right kind of algorithmically selected “meme.”
  • ...3 more annotations...
  • That kind of curating acts as a fertile host for falsehoods by simultaneously engaging two predigital social-science standbys: the urban myth as “meme,” or viral idea; and individual biases, the automatic, subconscious presumptions that color belief.
  • “My experience is that once this stuff gets going, people just pass these stories on without even necessarily stopping to read them,” Mr. McKinney said.
  • “The networks make information run so fast that it outruns fact-checkers’ ability to check it.
kushnerha

Is That Even a Thing? - The New York Times - 3 views

  • Speakers and writers of American English have recently taken to identifying a staggering and constantly changing array of trends, events, memes, products, lifestyle choices and phenomena of nearly every kind with a single label — a thing.
  • It would be easy to call this a curiosity of the language and leave it at that. Linguistic trends come and go.
  • One could, on the other hand, consider the use of “a thing” a symptom of an entire generation’s linguistic sloth, general inarticulateness and penchant for cutesy, empty, half-ironic formulations that create a self-satisfied barrier preventing any form of genuine engagement with the world around them.
  • ...9 more annotations...
  • My assumption is that language and experience mutually influence each other. Language not only captures experience, it conditions it. It sets expectations for experience and gives shape to it as it happens. What might register as inarticulateness can reflect a different way of understanding and experiencing the world.
  • The word “thing” has of course long played a versatile and generic role in our language, referring both to physical objects and abstract matters. “The thing is …” “Here’s the thing.” “The play’s the thing.” In these examples, “thing” denotes the matter at hand and functions as stage setting to emphasize an important point. One new thing about “a thing,” then, is the typical use of the indefinite article “a” to precede it. We talk about a thing because we are engaged in cataloging. The question is whether something counts as a thing. “A thing” is not just stage setting. Information is conveyed.
  • What information? One definition of “a thing” that suggests itself right away is “cultural phenomenon.” A new app, an item of celebrity gossip, the practices of a subculture. It seems likely that “a thing” comes from the phrase the coolest/newest/latest thing. But now, in a society where everything, even the past, is new — “new thing” verges on the redundant. If they weren’t new they wouldn’t be things.
  • Clearly, cultural phenomena have long existed and been called “fads,” “trends,” “rages” or have been designated by the category they belong to — “product,” “fashion,” “lifestyle,” etc. So why the application of this homogenizing general term to all of them? I think there are four main reasons.
  • First, the flood of content into the cultural sphere. That we are inundated is well known. Information besieges us in waves that thrash us against the shore until we retreat to the solid ground of work or sleep or exercise or actual human interaction, only to wade cautiously back into our smartphones. As we spend more and more time online, it becomes the content of our experience, and in this sense “things” have earned their name. “A thing” has become the basic unit of cultural ontology.
  • Second, the fragmentation of this sphere. The daily barrage of culture requires that we choose a sliver of the whole in order to keep up. Netflix genres like “Understated Romantic Road Trip Movies” make it clear that the individual is becoming his or her own niche market — the converse of the celebrity as brand. We are increasingly a society of brands attuning themselves to markets, and markets evaluating brands. The specificity of the market requires a wider range of content — of things — to satisfy it
  • Third, the closing gap between satire and the real thing. The absurd excess of things has reached a point where the ironic detachment needed to cope with them is increasingly built into the things themselves, their marketing and the language we use to talk about them. The designator “a thing” is thus almost always tinged with ironic detachment. It puts the thing at arm’s length. You can hardly say “a thing” without a wary glint in your eye.
  • Finally, the growing sense that these phenomena are all the same. As we step back from “things,” they recede into the distance and begin to blur together. We call them all by the same name because they are the same at bottom: All are pieces of the Internet. A thing is for the most part experienced through this medium and generated by it. Even if they arise outside it, things owe their existence as things to the Internet. Google is thus always the arbiter of the question, “Is that a real thing?”
  • “A thing,” then, corresponds to a real need we have, to catalog and group together the items of cultural experience, while keeping them at a sufficient distance so that we can at least feign unified consciousness in the face of a world gone to pieces.
Javier E

The Widening World of Hand-Picked Truths - The New York Times - 0 views

  • it’s not just organized religions that are insisting on their own alternate truths. On one front after another, the hard-won consensus of science is also expected to accommodate personal beliefs, religious or otherwise, about the safety of vaccines, G.M.O. crops, fluoridation or cellphone radio waves, along with the validity of global climate change.
  • But presenting people with the best available science doesn’t seem to change many minds. In a kind of psychological immune response, they reject ideas they consider harmful.
  • Viewed from afar, the world seems almost on the brink of conceding that there are no truths, only competing ideologies — narratives fighting narratives. In this epistemological warfare, those with the most power are accused of imposing their version of reality — the “dominant paradigm” — on the rest, leaving the weaker to fight back with formulations of their own. Everything becomes a version.
  • ...3 more annotations...
  • I heard from young anthropologists, speaking the language of postmodernism, who consider science to be just another tool with which Western colonialism further extends its “cultural hegemony” by marginalizing the dispossessed and privileging its own worldview.
  • Science, through this lens, doesn’t discover knowledge, it “manufactures” it, along with other marketable goods.
  • The widening gyre of beliefs is accelerated by the otherwise liberating Internet. At the same time it expands the reach of every mind, it channels debate into clashing memes, often no longer than 140 characters, that force people to extremes and trap them in self-reinforcing bubbles of thought.
Javier E

E.D. Hirsch Jr.'s 'Cultural Literacy' in the 21st Century - The Atlantic - 0 views

  • much of this angst can be interpreted as part of a noisy but inexorable endgame: the end of white supremacy. From this vantage point, Americanness and whiteness are fitfully, achingly, but finally becoming delinked—and like it or not, over the course of this generation, Americans are all going to have to learn a new way to be American.
  • What is the story of “us” when “us” is no longer by default “white”? The answer, of course, will depend on how aware Americans are of what they are, of what their culture already (and always) has been.
  • The thing about the list, though, was that it was—by design—heavy on the deeds and words of the “dead white males” who had formed the foundations of American culture but who had by then begun to fall out of academic fashion.
  • ...38 more annotations...
  • Conservatives thus embraced Hirsch eagerly and breathlessly. He was a stout defender of the patrimony. Liberals eagerly and breathlessly attacked him with equal vigor. He was retrograde, Eurocentric, racist, sexist.
  • Lost in all the crossfire, however, were two facts: First, Hirsch, a lifelong Democrat who considered himself progressive, believed his enterprise to be in service of social justice and equality. Cultural illiteracy, he argued, is most common among the poor and power-illiterate, and compounds both their poverty and powerlessness. Second: He was right.
  • A generation of hindsight now enables Americans to see that it is indeed necessary for a nation as far-flung and entropic as the United States, one where rising economic inequality begets worsening civic inequality, to cultivate continuously a shared cultural core. A vocabulary. A set of shared referents and symbols.
  • So, first of all, Americans do need a list. But second, it should not be Hirsch’s list. And third, it should not be made the way he made his. In the balance of this essay, I want to unpack and explain each of those three statements.
  • If you take the time to read the book attached to Hirsch’s appendix, you’ll find a rather effective argument about the nature of background knowledge and public culture. Literacy is not just a matter of decoding the strings of letters that make up words or the meaning of each word in sequence. It is a matter of decoding context: the surrounding matrix of things referred to in the text and things implied by it
  • That means understanding what’s being said in public, in the media, in colloquial conversation. It means understanding what’s not being said. Literacy in the culture confers power, or at least access to power. Illiteracy, whether willful or unwitting, creates isolation from power.
  • his point about background knowledge and the content of shared public culture extends well beyond schoolbooks. They are applicable to the “texts” of everyday life, in commercial culture, in sports talk, in religious language, in politics. In all cases, people become literate in patterns—“schema” is the academic word Hirsch uses. They come to recognize bundles of concept and connotation like “Party of Lincoln.” They perceive those patterns of meaning the same way a chess master reads an in-game chessboard or the way a great baseball manager reads an at bat. And in all cases, pattern recognition requires literacy in particulars.
  • Lots and lots of particulars. This isn’t, or at least shouldn’t be, an ideologically controversial point. After all, parents on both left and right have come to accept recent research that shows that the more spoken words an infant or toddler hears, the more rapidly she will learn and advance in school. Volume and variety matter. And what is true about the vocabulary of spoken or written English is also true, one fractal scale up, about the vocabulary of American culture.
  • those who demonized Hirsch as a right-winger missed the point. Just because an endeavor requires fluency in the past does not make it worshipful of tradition or hostile to change.
  • radicalism is made more powerful when garbed in traditionalism. As Hirsch put it: “To be conservative in the means of communication is the road to effectiveness in modern life, in whatever direction one wishes to be effective.”
  • Hence, he argued, an education that in the name of progressivism disdains past forms, schema, concepts, figures, and symbols is an education that is in fact anti-progressive and “helps preserve the political and economic status quo.” This is true. And it is made more urgently true by the changes in American demography since Hirsch gave us his list in 1987.
  • If you are an immigrant to the United States—or, if you were born here but are the first in your family to go to college, and thus a socioeconomic new arrival; or, say, a black citizen in Ferguson, Missouri deciding for the first time to participate in a municipal election, and thus a civic neophyte—you have a single overriding objective shared by all immigrants at the moment of arrival: figure out how stuff really gets done here.
  • So, for instance, a statement like “One hundred and fifty years after Appomattox, our house remains deeply divided” assumes that the reader knows that Appomattox is both a place and an event; that the event signified the end of a war; that the war was the Civil War and had begun during the presidency of a man, Abraham Lincoln, who earlier had famously declared that “a house divided against itself cannot stand”; that the divisions then were in large part about slavery; and that the divisions today are over the political, social, and economic legacies of slavery and how or whether we are to respond to those legacies.
  • But why a list, one might ask? Aren’t lists just the very worst form of rote learning and standardized, mechanized education? Well, yes and no.
  • it’s not just newcomers who need greater command of common knowledge. People whose families have been here ten generations are often as ignorant about American traditions, mores, history, and idioms as someone “fresh off the boat.”
  • The more serious challenge, for Americans new and old, is to make a common culture that’s greater than the sum of our increasingly diverse parts. It’s not enough for the United States to be a neutral zone where a million little niches of identity might flourish; in order to make our diversity a true asset, Americans need those niches to be able to share a vocabulary. Americans need to be able to have a broad base of common knowledge so that diversity can be most fully activated.
  • as the pool of potential culture-makers has widened, the modes of culture creation have similarly shifted away from hierarchies and institutions to webs and networks. Wikipedia is the prime embodiment of this reality, both in how the online encyclopedia is crowd-created and how every crowd-created entry contains links to other entries.
  • so any endeavor that makes it easier for those who do not know the memes and themes of American civic life to attain them closes the opportunity gap. It is inherently progressive.
  • since I started writing this essay, dipping into the list has become a game my high-school-age daughter and I play together.
  • I’ll name each of those entries, she’ll describe what she thinks to be its meaning. If she doesn’t know, I’ll explain it and give some back story. If I don’t know, we’ll look it up together. This of course is not a good way for her teachers to teach the main content of American history or English. But it is definitely a good way for us both to supplement what school should be giving her.
  • And however long we end up playing this game, it is already teaching her a meta-lesson about the importance of cultural literacy. Now anytime a reference we’ve discussed comes up in the news or on TV or in dinner conversation, she can claim ownership. Sometimes she does so proudly, sometimes with a knowing look. My bet is that the satisfaction of that ownership, and the value of it, will compound as the years and her education progress.
  • The trouble is, there are also many items on Hirsch’s list that don’t seem particularly necessary for entry into today’s civic and economic mainstream.
  • Which brings us back to why diversity matters. The same diversity that makes it necessary to have and to sustain a unifying cultural core demands that Americans make the core less monochromatic, more inclusive, and continuously relevant for contemporary life
  • it’s worth unpacking the baseline assumption of both Hirsch’s original argument and the battles that erupted around it. The assumption was that multiculturalism sits in polar opposition to a traditional common culture, that the fight between multiculturalism and the common culture was zero-sum.
  • As scholars like Ronald Takaki made clear in books like A Different Mirror, the dichotomy made sense only to the extent that one imagined that nonwhite people had had no part in shaping America until they started speaking up in the second half of the twentieth century.
  • The truth, of course, is that since well before the formation of the United States, the United States has been shaped by nonwhites in its mores, political structures, aesthetics, slang, economic practices, cuisine, dress, song, and sensibility.
  • In its serious forms, multiculturalism never asserted that every racial group should have its own sealed and separate history or that each group’s history was equally salient to the formation of the American experience. It simply claimed that the omni-American story—of diversity and hybridity—was the legitimate American story.
  • as Nathan Glazer has put it (somewhat ruefully), “We are all multiculturalists now.” Americans have come to see—have chosen to see—that multiculturalism is not at odds with a single common culture; it is a single common culture.
  • it is true that in a finite school year, say, with finite class time and books of finite heft, not everything about everyone can be taught. There are necessary trade-offs. But in practice, recognizing the true and longstanding diversity of American identity is not an either-or. Learning about the internment of Japanese Americans does not block out knowledge of D-Day or Midway. It is additive.
  • As more diverse voices attain ever more forms of reach and power we need to re-integrate and reimagine Hirsch’s list of what literate Americans ought to know.
  • To be clear: A 21st-century omni-American approach to cultural literacy is not about crowding out “real” history with the perishable stuff of contemporary life. It’s about drawing lines of descent from the old forms of cultural expression, however formal, to their progeny, however colloquial.
  • Nor is Omni-American cultural literacy about raising the “self-esteem” of the poor, nonwhite, and marginalized. It’s about raising the collective knowledge of all—and recognizing that the wealthy, white, and powerful also have blind spots and swaths of ignorance
  • What, then, would be on your list? It’s not an idle question. It turns out to be the key to rethinking how a list should even get made.
  • the Internet has transformed who makes culture and how. As barriers to culture creation have fallen, orders of magnitude more citizens—amateurs—are able to shape the culture in which we must all be literate. Cat videos and Star Trek fan fiction may not hold up long beside Toni Morrison. But the entry of new creators leads to new claims of right: The right to be recognized. The right to be counted. The right to make the means of recognition and accounting.
  • It is true that lists alone, with no teaching to bring them to life and no expectation that they be connected to a broader education, are somewhere between useless and harmful.
  • This will be a list of nodes and nested networks. It will be a fractal of associations, which reflects far more than a linear list how our brains work and how we learn and create. Hirsch himself nodded to this reality in Cultural Literacy when he described the process he and his colleagues used for collecting items for their list, though he raised it by way of pointing out the danger of infinite regress.
  • His conclusion, appropriate to his times, was that you had to draw boundaries somewhere with the help of experts. My take, appropriate to our times, is that Americans can draw not boundaries so much as circles and linkages, concept sets and pathways among them.
  • Because 5,000 or even 500 items is too daunting a place to start, I ask here only for your top ten. What are ten things every American—newcomer or native born, affluent or indigent—should know? What ten things do you feel are both required knowledge and illuminating gateways to those unenlightened about American life? Here are my entries: Whiteness; The Federalist Papers; The Almighty Dollar; Organized labor; Reconstruction; Nativism; The American Dream; The Reagan Revolution; DARPA; A sucker born every minute.
catbclark

Why Do Many Reasonable People Doubt Science? - National Geographic Magazine - 0 views

  • Actually fluoride is a natural mineral that, in the weak concentrations used in public drinking water systems, hardens tooth enamel and prevents tooth decay—a cheap and safe way to improve dental health for everyone, rich or poor, conscientious brusher or not. That’s the scientific and medical consensus.
  • when Galileo claimed that the Earth spins on its axis and orbits the sun, he wasn’t just rejecting church doctrine. He was asking people to believe something that defied common sense
  • all manner of scientific knowledge—from the safety of fluoride and vaccines to the reality of climate change—faces organized and often furious opposition.
  • ...61 more annotations...
  • Empowered by their own sources of information and their own interpretations of research, doubters have declared war on the consensus of experts.
  • Our lives are permeated by science and technology as never before. For many of us this new world is wondrous, comfortable, and rich in rewards—but also more complicated and sometimes unnerving. We now face risks we can’t easily analyze.
  • The world crackles with real and imaginary hazards, and distinguishing the former from the latter isn’t easy.
  • In this bewildering world we have to decide what to believe and how to act on that. In principle that’s what science is for.
  • “Science is not a body of facts,” says geophysicist Marcia McNutt,
  • “Science is a method for deciding whether what we choose to believe has a basis in the laws of nature or not.”
  • The scientific method leads us to truths that are less than self-evident, often mind-blowing, and sometimes hard to swallow.
  • We don’t believe you.
  • Galileo was put on trial and forced to recant. Two centuries later Charles Darwin escaped that fate. But his idea that all life on Earth evolved from a primordial ancestor and that we humans are distant cousins of apes, whales, and even deep-sea mollusks is still a big ask for a lot of people. So is another 19th-century notion: that carbon dioxide, an invisible gas that we all exhale all the time and that makes up less than a tenth of one percent of the atmosphere, could be affecting Earth’s climate.
  • we intellectually accept these precepts of science, we subconsciously cling to our intuitions
  • Shtulman’s research indicates that as we become scientifically literate, we repress our naive beliefs but never eliminate them entirely. They lurk in our brains, chirping at us as we try to make sense of the world.
  • Most of us do that by relying on personal experience and anecdotes, on stories rather than statistics.
  • We have trouble digesting randomness; our brains crave pattern and meaning.
  • we can deceive ourselves.
  • Even for scientists, the scientific method is a hard discipline. Like the rest of us, they’re vulnerable to what they call confirmation bias—the tendency to look for and see only evidence that confirms what they already believe. But unlike the rest of us, they submit their ideas to formal peer review before publishing them
  • other scientists will try to reproduce them
  • Scientific results are always provisional, susceptible to being overturned by some future experiment or observation. Scientists rarely proclaim an absolute truth or absolute certainty. Uncertainty is inevitable at the frontiers of knowledge.
  • Many people in the United States—a far greater percentage than in other countries—retain doubts about that consensus or believe that climate activists are using the threat of global warming to attack the free market and industrial society generally.
  • news media give abundant attention to such mavericks, naysayers, professional controversialists, and table thumpers. The media would also have you believe that science is full of shocking discoveries made by lone geniuses
  • science tells us the truth rather than what we’d like the truth to be. Scientists can be as dogmatic as anyone else—but their dogma is always wilting in the hot glare of new research.
  • But industry PR, however misleading, isn’t enough to explain why only 40 percent of Americans, according to the most recent poll from the Pew Research Center, accept that human activity is the dominant cause of global warming.
  • “science communication problem,”
  • yielded abundant new research into how people decide what to believe—and why they so often don’t accept the scientific consensus.
  • higher literacy was associated with stronger views—at both ends of the spectrum. Science literacy promoted polarization on climate, not consensus. According to Kahan, that’s because people tend to use scientific knowledge to reinforce beliefs that have already been shaped by their worldview.
  • “egalitarian” and “communitarian” mind-set are generally suspicious of industry and apt to think it’s up to something dangerous that calls for government regulation; they’re likely to see the risks of climate change.
  • “hierarchical” and “individualistic” mind-set respect leaders of industry and don’t like government interfering in their affairs; they’re apt to reject warnings about climate change, because they know what accepting them could lead to—some kind of tax or regulation to limit emissions.
  • For a hierarchical individualist, Kahan says, it’s not irrational to reject established climate science: Accepting it wouldn’t change the world, but it might get him thrown out of his tribe.
  • Science appeals to our rational brain, but our beliefs are motivated largely by emotion, and the biggest motivation is remaining tight with our peers.
  • organizations funded in part by the fossil fuel industry have deliberately tried to undermine the public’s understanding of the scientific consensus by promoting a few skeptics.
  • Internet makes it easier than ever for climate skeptics and doubters of all kinds to find their own information and experts
  • Internet has democratized information, which is a good thing. But along with cable TV, it has made it possible to live in a “filter bubble” that lets in only the information with which you already agree.
  • How to convert climate skeptics? Throwing more facts at them doesn’t help.
  • people need to hear from believers they can trust, who share their fundamental values.
  • We believe in scientific ideas not because we have truly evaluated all the evidence but because we feel an affinity for the scientific community.
  • “Believing in evolution is just a description about you. It’s not an account of how you reason.”
  • evolution actually happened. Biology is incomprehensible without it. There aren’t really two sides to all these issues. Climate change is happening. Vaccines really do save lives. Being right does matter—and the science tribe has a long track record of getting things right in the end. Modern society is built on things it got right.
  • Doubting science also has consequences.
  • In the climate debate the consequences of doubt are likely global and enduring. In the U.S., climate change skeptics have achieved their fundamental goal of halting legislative action to combat global warming.
  • “That line between science communication and advocacy is very hard to step back from,”
  • It’s their very detachment, what you might call the cold-bloodedness of science, that makes science the killer app.
  • that need to fit in is so strong that local values and local opinions are always trumping science.
  • not a sin to change your mind when the evidence demands it.
  • for the best scientists, the truth is more important than the tribe.
  • Students come away thinking of science as a collection of facts, not a method.
  • Shtulman’s research has shown that even many college students don’t really understand what evidence is.
  • “Everybody should be questioning,” says McNutt. “That’s a hallmark of a scientist. But then they should use the scientific method, or trust people using the scientific method, to decide which way they fall on those questions.”
  • science has made us the dominant organisms,
  • incredibly rapid change, and it’s scary sometimes. It’s not all progress.
  • But the notion of a vaccine-autism connection has been endorsed by celebrities and reinforced through the usual Internet filters. (Anti-vaccine activist and actress Jenny McCarthy famously said on the Oprah Winfrey Show, “The University of Google is where I got my degree from.”)
    • catbclark
       
      Power of celebrities, internet as a source
  • The scientific method doesn’t come naturally—but if you think about it, neither does democracy. For most of human history neither existed. We went around killing each other to get on a throne, praying to a rain god, and for better and much worse, doing things pretty much as our ancestors did.
  • We need to get a lot better at finding answers, because it’s certain the questions won’t be getting any simpler.
  • That the Earth is round has been known since antiquity—Columbus knew he wouldn’t sail off the edge of the world—but alternative geographies persisted even after circumnavigations had become common
  • We live in an age when all manner of scientific knowledge—from climate change to vaccinations—faces furious opposition. Some even have doubts about the moon landing.
  • Why Do Many Reasonable People Doubt Science?
  • science doubt itself has become a pop-culture meme.
  • Flat-Earthers held that the planet was centered on the North Pole and bounded by a wall of ice, with the sun, moon, and planets a few hundred miles above the surface. Science often demands that we discount our direct sensory experiences—such as seeing the sun cross the sky as if circling the Earth—in favor of theories that challenge our beliefs about our place in the universe.
  • Yet just because two things happened together doesn’t mean one caused the other, and just because events are clustered doesn’t mean they’re not still random.
  • Sometimes scientists fall short of the ideals of the scientific method. Especially in biomedical research, there’s a disturbing trend toward results that can’t be reproduced outside the lab that found them, a trend that has prompted a push for greater transparency about how experiments are conducted
  • “Science will find the truth,” Collins says. “It may get it wrong the first time and maybe the second time, but ultimately it will find the truth.” That provisional quality of science is another thing a lot of people have trouble with.
  • scientists love to debunk one another
  • they will continue to trump science, especially when there is no clear downside to ignoring science.”
Javier E

Baseball or Soccer? - NYTimes.com - 1 views

  • Baseball is a team sport, but it is basically an accumulation of individual activities. Throwing a strike, hitting a line drive or fielding a grounder is primarily an individual achievement. The team that performs the most individual tasks well will probably win the game.
  • In soccer, almost no task, except the penalty kick and a few others, is intrinsically individual. Soccer, as Simon Critchley pointed out recently in The New York Review of Books, is a game about occupying and controlling space. If you get the ball and your teammates have run the right formations, and structured the space around you, you’ll have three or four options on where to distribute it. If the defenders have structured their formations to control the space, then you will have no options. Even the act of touching the ball is not primarily defined by the man who is touching it; it is defined by the context created by all the other players.
  • Most of us spend our days thinking we are playing baseball, but we are really playing soccer. We think we individually choose what career path to take, whom to socialize with, what views to hold. But, in fact, those decisions are shaped by the networks of people around us more than we dare recognize.
  • ...9 more annotations...
  • “Soccer is a collective game, a team game, and everyone has to play the part which has been assigned to them, which means they have to understand it spatially, positionally and intelligently and make it effective.”
  • Then there is the structure of your network. There is by now a vast body of research on how differently people behave depending on the structure of the social networks. People with vast numbers of acquaintances have more job opportunities than people with fewer but deeper friendships
  • This influence happens through at least three avenues. First there is contagion. People absorb memes, ideas and behaviors from each other the way they catch a cold.
  • soccer is like a 90-minute anxiety dream — one of those frustrating dreams when you’re trying to get somewhere but something is always in the way. This is yet another way soccer is like life.
  • Let me simplify it with a classic observation: Each close friend you have brings out a version of yourself that you could not bring out on your own. When your close friend dies, you are not only losing the friend, you are losing the version of your personality that he or she elicited.
  • Once we acknowledge that, in life, we are playing soccer, not baseball, a few things become clear. First, awareness of the landscape of reality is the highest form of wisdom. It’s not raw computational power that matters most; it’s having a sensitive attunement to the widest environment,
  • Second, predictive models will be less useful. Baseball is wonderful for sabermetricians. In each at bat there is a limited range of possible outcomes. Activities like soccer are not as easily renderable statistically, because the relevant spatial structures are harder to quantify
  • Finally, there is the power of the extended mind. There is also a developed body of research on how much our very consciousness is shaped by the people around us.
  • Is life more like baseball, or is it more like soccer?
kushnerha

BBC - Future - The surprising downsides of being clever - 0 views

  • If ignorance is bliss, does a high IQ equal misery? Popular opinion would have it so. We tend to think of geniuses as being plagued by existential angst, frustration, and loneliness. Think of Virginia Woolf, Alan Turing, or Lisa Simpson – lone stars, isolated even as they burn their brightest. As Ernest Hemingway wrote: “Happiness in intelligent people is the rarest thing I know.”
  • Combing California’s schools for the crème de la crème, he selected 1,500 pupils with an IQ of 140 or more – 80 of whom had IQs above 170. Together, they became known as the “Termites”, and the highs and lows of their lives are still being studied to this day.
  • Termites’ average salary was twice that of the average white-collar job. But not all the group met Terman’s expectations – there were many who pursued more “humble” professions such as police officers, seafarers, and typists. For this reason, Terman concluded that “intellect and achievement are far from perfectly correlated”. Nor did their smarts endow personal happiness. Over the course of their lives, levels of divorce, alcoholism and suicide were about the same as the national average.
  • ...16 more annotations...
  • One possibility is that knowledge of your talents becomes something of a ball and chain. Indeed, during the 1990s, the surviving Termites were asked to look back at the events in their 80-year lifespan. Rather than basking in their successes, many reported that they had been plagued by the sense that they had somehow failed to live up to their youthful expectations.
  • The most notable, and sad, case concerns the maths prodigy Sufiah Yusof. Enrolled at Oxford University aged 12, she dropped out of her course before taking her finals and started waitressing. She later worked as a call girl, entertaining clients with her ability to recite equations during sexual acts.
  • Another common complaint, often heard in student bars and internet forums, is that smarter people somehow have a clearer vision of the world’s failings. Whereas the rest of us are blinkered from existential angst, smarter people lay awake agonising over the human condition or other people’s folly.
  • MacEwan University in Canada found that those with the higher IQ did indeed feel more anxiety throughout the day. Interestingly, most worries were mundane, day-to-day concerns, though; the high-IQ students were far more likely to be replaying an awkward conversation, than asking the “big questions”. “It’s not that their worries were more profound, but they are just worrying more often about more things,” says Penney. “If something negative happened, they thought about it more.”
  • seemed to correlate with verbal intelligence – the kind tested by word games in IQ tests, compared to prowess at spatial puzzles (which, in fact, seemed to reduce the risk of anxiety). He speculates that greater eloquence might also make you more likely to verbalise anxieties and ruminate over them. It’s not necessarily a disadvantage, though. “Maybe they were problem-solving a bit more than most people,” he says – which might help them to learn from their mistakes.
  • The harsh truth, however, is that greater intelligence does not equate to wiser decisions; in fact, in some cases it might make your choices a little more foolish.
  • spent the last decade building tests for rationality, and he has found that fair, unbiased decision-making is largely independent of IQ.
  • “my-side bias” – our tendency to be highly selective in the information we collect so that it reinforces our previous attitudes. The more enlightened approach would be to leave your assumptions at the door as you build your argument – but Stanovich found that smarter people are almost no more likely to do so than people with distinctly average IQs.
  • People who ace standard cognitive tests are in fact slightly more likely to have a “bias blind spot”. That is, they are less able to see their own flaws, even though they are quite capable of criticising the foibles of others. And they have a greater tendency to fall for the “gambler’s fallacy”
  • A tendency to rely on gut instincts rather than rational thought might also explain why a surprisingly high number of Mensa members believe in the paranormal; or why someone with an IQ of 140 is about twice as likely to max out their credit card.
  • “The people pushing the anti-vaccination meme on parents and spreading misinformation on websites are generally of more than average intelligence and education.” Clearly, clever people can be dangerously, and foolishly, misguided.
  • we need to turn our minds to an age-old concept: “wisdom”. His approach is more scientific than it might at first sound. “The concept of wisdom has an ethereal quality to it,” he admits. “But if you look at the lay definition of wisdom, many people would agree it’s the idea of someone who can make good unbiased judgement.”
  • Crucially, Grossmann found that IQ was not related to any of these measures, and certainly didn’t predict greater wisdom. “People who are very sharp may generate, very quickly, arguments [for] why their claims are the correct ones – but may do it in a very biased fashion.”
  • employers may well begin to start testing these abilities in place of IQ; Google has already announced that it plans to screen candidates for qualities like intellectual humility, rather than sheer cognitive prowess.
  • He points out that we often find it easier to leave our biases behind when we consider other people, rather than ourselves. Along these lines, he has found that simply talking through your problems in the third person (“he” or “she”, rather than “I”) helps create the necessary emotional distance, reducing your prejudices and leading to wiser arguments.
  • If you’ve been able to rest on the laurels of your intelligence all your life, it could be very hard to accept that it has been blinding your judgement. As Socrates had it: the wisest person really may be the one who can admit he knows nothing.
Javier E

Jonathan Franzen Is Fine With All of It - The New York Times - 0 views

  • If you’re in a state of perpetual fear of losing market share for you as a person, it’s just the wrong mind-set to move through the world with.” Meaning that if your goal is to get liked and retweeted, then you are perhaps molding yourself into the kind of person you believe will get those things, whether or not that person resembles the actual you. The writer’s job is to say things that are uncomfortable and hard to reduce. Why would a writer mold himself into a product?
  • And why couldn’t people hear him about the social effects this would have? “The internet is all about destroying the elite, destroying the gatekeepers,” he said. “The people know best. You take that to its conclusion, and you get Donald Trump. What do those Washington insiders know? What does the elite know?
  • So he decided to withdraw from it all. After publicity for “The Corrections” ended, he decided he would no longer read about himself — not reviews, not think pieces, not stories, and then, as they came, not status updates and not tweets. He didn’t want to hear reaction to his work. He didn’t want to see the myriad ways he was being misunderstood. He didn’t want to know what the hashtags were.
  • ...7 more annotations...
  • I stopped reading reviews because I noticed all I remember is the negatives. Whatever fleeting pleasure you have in someone applying a laudatory adjective to your book is totally washed away by the unpleasantness of remembering the negative things for the rest of your life verbatim.
  • Franzen thinks that there’s no way for a writer to do good work — to write something that can be called “consuming and extraordinarily moving” — without putting a fence around yourself so that you can control the input you encounter. So that you could have a thought that isn’t subject to pushback all the time from anyone who has ever met you or heard of you or expressed interest in hearing from you. Without allowing yourself to think for a minute.
  • It’s not just writers. It’s everyone. The writer is just an extreme case of something everyone struggles with. “On the one hand, to function well, you have to believe in yourself and your abilities and summon enormous confidence from somewhere. On the other hand, to write well, or just to be a good person, you need to be able to doubt yourself — to entertain the possibility that you’re wrong about everything, that you don’t know everything, and to have sympathy with people whose lives and beliefs and perspectives are very different from yours.”
  • “This balancing act” — the confidence that you know everything plus the ability to believe that you don’t — “only works, or works best, if you reserve a private space for it.”
  • Can you write clearly about something that you don’t yourself swim in? Don’t you have to endure it and hate it most of the time like the rest of us?
  • his answer was no. No. No, you absolutely don’t. You can miss a meme, and nothing really changes. You can be called fragile, and you will live. “I’m pretty much the opposite of fragile. I don’t need internet engagement to make me vulnerable. Real writing makes me — makes anyone doing it — vulnerable.”
  • Has anyone considered that the interaction is the fragility? Has anyone considered that letting other people define how you fill your day and what they fill your head with — a passive, postmodern stream of other people’s thoughts — is the fragility?
Javier E

'ContraPoints' Is Political Philosophy Made for YouTube - The Atlantic - 1 views

  • While Wynn positions herself on the left, she is no dogmatic ideologue, readily admitting to points on the right and criticizing leftist arguments when warranted
  • She has described her work as “edutainment” and “propaganda,” and it’s both
  • But what makes her videos unique is the way Wynn combines those two elements: high standards of rational argument and not-quite-rational persuasion. ContraPoints offers compelling speech aimed at truth, rendered in the raucous, meme-laden idiom of the internet
  • ...16 more annotations...
  • In 2014, Wynn noticed a trend on YouTube that disturbed her: Videos with hyperbolic titles like “why feminism ruins everything,” “SJW cringe compilation,” and “Ben Shapiro DESTROYS Every College Snowflake” were attracting millions of views and spawning long, jeering comment threads. Wynn felt she was watching the growth of a community of outrage that believes feminists, Marxists, and multiculturalists are conspiring to destroy freedom of speech, liquidate gender norms, and demolish Western civilization
  • Wynn created ContraPoints to offer entertaining, coherent rebuttals to these kinds of ideas. Her videos also explain left-wing talking points—like rape culture and cultural appropriation—and use philosophy to explore topics that are important to Wynn, such as the meaning of gender for trans people.
  • Wynn thinks it’s a mistake to assume that viewers of angry, right-wing videos are beyond redemption. “It’s quite difficult to get through to the people who are really committed to these anti-progressive beliefs,” Wynn told me recently. However, she said, she believes that many viewers find such ideas “psychologically resonant” without being hardened reactionaries. This broad, not fully committed center—comprising people whose minds can still be changed—is Wynn’s target audience.
  • Usually, the videos to which Wynn is responding take the stance of dogged reason cutting through the emotional excesses of so-called “political correctness.” For example, the American conservative commentator Ben Shapiro, who is a target of a recent ContraPoints video, has made “facts don’t care about your feelings” his motto. Wynn’s first step in trying to win over those who find anti-progressive views appealing is to show that these ideas often rest on a flimsy foundation. To do so, she fully adopts the rational standards of argument that her rivals pride themselves on following, and demonstrates how they fail to achieve them
  • Wynn dissects her opponents’ positions, holding up fallacies, evasions, and other rhetorical tricks for examination, all the while providing a running commentary on good argumentative method.
  • The host defends her own positions according to the same principles. Wynn takes on the strongest version of her opponent’s argument, acknowledges when she thinks her opponents are right and when she has been wrong, clarifies when misunderstood, and provides plenty of evidence for her claims
  • Wynn is a former Ph.D. student in philosophy, and though her videos are too rich with dick jokes for official settings, her argumentative practice would pass muster in any grad seminar.
  • she critiques many of her leftist allies for being bad at persuasion.
  • Socrates persuaded by both the logic of argument and the dynamic of fandom. Wynn is beginning to grow a dedicated following of her own: Members of online discussion groups refer to her as “mother” and “the queen,” produce fan art, and post photos of themselves dressed as characters from her videos.
  • she shares Socrates’s view that philosophy is more an erotic art than a martial one
  • As she puts it, she’s not trying to destroy the people she addresses, but seduce them
  • for Wynn, the true key to persuasion is to engage her audience on an emotional level.
  • One thing she has come across repeatedly is a disdain for the left’s perceived moral superiority. Anti-progressives of all stripes, Wynn told me, show an “intense defensiveness against being told what to do” and a “repulsion in response to moralizing.”
  • Matching her speech to the audience’s tastes presents a prickly rhetorical challenge. In an early video, Contra complains: “The problem is this medium. These goddamn savages demand a circus, and I intend to give them one, but behind the curtain, I really just want to have a conversation.”
  • Philosophical conversation requires empathy and good-faith engagement. But the native tongue of political YouTube is ironic antagonism. It’s Wynn’s inimitable way of combining these two ingredients that gives ContraPoints its distinctive mouthfeel.
  • Wynn spends weeks in the online communities of her opponents—whether they’re climate skeptics or trans-exclusionary feminists—trying to understand what they believe and why they believe it. In Socrates’s words, she’s studying the souls of her audience.
Javier E

Less cramming. More Frisbee. At Yale, students learn how to live the good life. - The Washington Post - 0 views

  • Santos designed this class after she realized, as the head of a residential college at Yale, that many students were stressed out and unhappy, grinding through long days that seemed to her far more crushing and joyless than her own college years. Her perception was backed up by statistics
  • a national survey that found nearly half of college students reported overwhelming anxiety and feeling hopeless.
  • “They feel they’re in this crazy rat race, they’re working so hard they can’t take a single hour off — that’s awful.”
  • ...15 more annotations...
  • The idea behind the class is deceptively simple, and many of the lessons — such as gratitude, helping others, getting enough sleep — are familiar.
  • “A lot of people are waking up, realizing that we’re struggling,
  • All semester, hundreds of students tried to rewire themselves — to exercise more, to thank their mothers, to care less about the final grade and more about the ideas.
  • But in ways small and large, silly and heartbreakingly earnest, simple and profound, this class changed the conversation at Yale. It surfaced in late-night talks in dorms, it was dissected in newspaper columns, it popped up, again and again, in memes.
  • It’s the application that’s difficult, a point Santos made repeatedly: Our brains often lead us to bad choices, and even when we realize the choices are bad, it’s hard to break habits.
  • In a way, the class is the very essence of a liberal-arts education: learning, exploration, insight into oneself and the world. But many students described it as entirely unlike any class they had ever taken — nothing to do with academics, and everything to do with life.
  • There’s no longer the same stigma around mental-health issues, he said. “Now, so many people are admitting they want to lead happier lives.”
  • The impact is not limited to Yale. Stories about PSYC157 spread around the world. Santos created a pared-down version of the class and offered it to anyone on the online education site Coursera.
  • She taught students about cognitive biases.
  • “We called it the ways in which our minds suck,” Forti said. “Our minds make us think that certain things make us happy, but they don’t.”
  • Then, they had to apply the lessons
  • There was a palpable difference on campus, several students said, during the week when they performed random acts of kindness.
  • The biggest misconception people have about the class is that Santos is offering some kind of easy happiness fix. “It’s something you have to work on every day. . . . If I keep using these skills, they’ll, over time, help me develop better habits and be happier.
  • So many students have told her the class changed their lives. “If you’re really grateful, show me that,” she told them. “Change the culture.”
  • for now students stood and clapped and clapped and clapped, beaming, drowning out even Kanye with their standing ovation. As if they had nothing but time.
Javier E

It's Win-Win When Trump and the Democrats Work Together - 0 views

  • The “punch a Nazi” thread that became popular earlier this year among the left-liberal journalistic class opened my eyes to this, as more than a few liberal thought leaders loved it when they saw a video of Richard Spencer being clocked by a masked thug.
  • How has political violence now become acceptable on lefty Twitter and among one in five college students? I’d argue that it’s too easy to overlook the influence of the neo-Marxist ideology now pervasive on countless campuses — specifically the late philosopher Herbert Marcuse’s concepts of “violence of defense” and “violence of aggression” in the context of what he called “repressive tolerance.” For parts of the New Left, racist democratic capitalism perpetuates so much systemic oppression that any defense of it or acquiescence in it amounts to violence against the victims. Therefore violence in defense of the victims is perfectly defensible. It just levels the playing field.
  • Hence it’s okay to punch a Nazi, but not okay to punch a communist. It’s defensible for an oppressed person of color to assault a white person but never the other way round. Hence a recent discussion in The Guardian about whether cold-cocking a racist is defensible: “A punch may be uncivil, but racism is worse.”
  • ...2 more annotations...
  • Actually, speech is not just interchangeable with violence; even silence is! One of the more popular signs at the rally in Boston a few weeks back was the following: “White Silence = Violence.” If you are not actively speaking out against white supremacy, in other words, you are actively enforcing it. Once you’ve apologized for being born white, and asked permission to speak, your next and only step is to inveigh against racism/sexism, etc. … or be accused of being a white supremacist yourself. At some point your head begins to explode. What is this: a Maoist boot camp?
  • We often discuss these things in the media without understanding the core ideas that animate them. But it’s important to understand that for the social-justice left, there is nothing irrational about any of this. If you take their ideas seriously, oppressive speech is violence and self-defense is legitimate. Violence is therefore not some regrettable incident. Violence to achieve liberation is a key part of the ideology they believe in.