
TOK Friends: group items matching "Nothing" in title, tags, annotations, or URL


The Man Who Would Teach Machines to Think - James Somers - The Atlantic - 1 views

  • Douglas Hofstadter, the Pulitzer Prize–winning author of Gödel, Escher, Bach, thinks we've lost sight of what artificial intelligence really means. His stubborn quest to replicate the human mind.
  • “If somebody meant by artificial intelligence the attempt to understand the mind, or to create something human-like, they might say—maybe they wouldn’t go this far—but they might say this is some of the only good work that’s ever been done.”
  • Their operating premise is simple: the mind is a very unusual piece of software, and the best way to understand how a piece of software works is to write it yourself.
  • ...43 more annotations...
  • “It depends on what you mean by artificial intelligence.”
  • Computers are flexible enough to model the strange evolved convolutions of our thought, and yet responsive only to precise instructions. So if the endeavor succeeds, it will be a double victory: we will finally come to know the exact mechanics of our selves—and we’ll have made intelligent machines.
  • Ever since he was about 14, when he found out that his youngest sister, Molly, couldn’t understand language, because she “had something deeply wrong with her brain” (her neurological condition probably dated from birth, and was never diagnosed), he had been quietly obsessed by the relation of mind to matter.
  • How could consciousness be physical? How could a few pounds of gray gelatin give rise to our very thoughts and selves?
  • Consciousness, Hofstadter wanted to say, emerged via just the same kind of “level-crossing feedback loop.”
  • In 1931, the Austrian-born logician Kurt Gödel had famously shown how a mathematical system could make statements not just about numbers but about the system itself.
  • But then AI changed, and Hofstadter didn’t change with it, and for that he all but disappeared.
  • By the early 1980s, the pressure was great enough that AI, which had begun as an endeavor to answer yes to Alan Turing’s famous question, “Can machines think?,” started to mature—or mutate, depending on your point of view—into a subfield of software engineering, driven by applications.
  • Take Deep Blue, the IBM supercomputer that bested the chess grandmaster Garry Kasparov. Deep Blue won by brute force.
  • Hofstadter wanted to ask: Why conquer a task if there’s no insight to be had from the victory? “Okay,” he says, “Deep Blue plays very good chess—so what? Does that tell you something about how we play chess? No. Does it tell you about how Kasparov envisions, understands a chessboard?”
  • AI started working when it ditched humans as a model; it succeeded precisely because it ditched them. That’s the thrust of the analogy: Airplanes don’t flap their wings; why should computers think?
  • It’s a compelling point. But it loses some bite when you consider what we want: a Google that knows, in the way a human would know, what you really mean when you search for something
  • “Cognition is recognition,” he likes to say. He describes “seeing as” as the essential cognitive act: you see some lines as “an A,” you see a hunk of wood as “a table,” you see a meeting as “an emperor-has-no-clothes situation” and a friend’s pouting as “sour grapes.”
  • How do you make a search engine that understands if you don’t know how you understand?
  • That’s what it means to understand. But how does understanding work?
  • analogy is “the fuel and fire of thinking,” the bread and butter of our daily mental lives.
  • there’s an analogy, a mental leap so stunningly complex that it’s a computational miracle: somehow your brain is able to strip any remark of the irrelevant surface details and extract its gist, its “skeletal essence,” and retrieve, from your own repertoire of ideas and experiences, the story or remark that best relates.
  • in Hofstadter’s telling, the story goes like this: when everybody else in AI started building products, he and his team, as his friend, the philosopher Daniel Dennett, wrote, “patiently, systematically, brilliantly,” way out of the light of day, chipped away at the real problem. “Very few people are interested in how human intelligence works,”
  • For more than 30 years, Hofstadter has worked as a professor at Indiana University at Bloomington, where he directs the Fluid Analogies Research Group, affectionately known as FARG.
  • The quick unconscious chaos of a mind can be slowed down on the computer, or rewound, paused, even edited
  • A project out of IBM called Candide. The idea behind Candide, a machine-translation system, was to start by admitting that the rules-based approach requires too deep an understanding of how language is produced; how semantics, syntax, and morphology work; and how words commingle in sentences and combine into paragraphs—to say nothing of understanding the ideas for which those words are merely conduits.
  • Parts of a program can be selectively isolated to see how it functions without them; parameters can be changed to see how performance improves or degrades. When the computer surprises you—whether by being especially creative or especially dim-witted—you can see exactly why.
  • When you read Fluid Concepts and Creative Analogies: Computer Models of the Fundamental Mechanisms of Thought, which describes in detail this architecture and the logic and mechanics of the programs that use it, you wonder whether maybe Hofstadter got famous for the wrong book.
  • But very few people, even admirers of GEB, know about the book or the programs it describes. And maybe that’s because FARG’s programs are almost ostentatiously impractical. Because they operate in tiny, seemingly childish “microdomains.” Because there is no task they perform better than a human.
  • “The entire effort of artificial intelligence is essentially a fight against computers’ rigidity.”
  • “Nobody is a very reliable guide concerning activities in their mind that are, by definition, subconscious,” he once wrote. “This is what makes vast collections of errors so important. In an isolated error, the mechanisms involved yield only slight traces of themselves; however, in a large collection, vast numbers of such slight traces exist, collectively adding up to strong evidence for (and against) particular mechanisms.”
  • So IBM threw that approach out the window. What the developers did instead was brilliant, but so straightforward…
  • The technique is called “machine learning.” The goal is to make a device that takes an English sentence as input and spits out a French sentence
  • What you do is feed the machine English sentences whose French translations you already know. (Candide, for example, used 2.2 million pairs of sentences, mostly from the bilingual proceedings of Canadian parliamentary debates.)
  • By repeating this process with millions of pairs of sentences, you will gradually calibrate your machine, to the point where you’ll be able to enter a sentence whose translation you don’t know and get a reasonable result. (A toy sketch of this calibration idea appears after this list.)
  • The Google Translate team can be made up of people who don’t speak most of the languages their application translates. “It’s a bang-for-your-buck argument,” Estelle says. “You probably want to hire more engineers instead” of native speakers.
  • But the need to serve 1 billion customers has a way of forcing the company to trade understanding for expediency. You don’t have to push Google Translate very far to see the compromises its developers have made for coverage, and speed, and ease of engineering. Although Google Translate captures, in its way, the products of human intelligence, it isn’t intelligent itself.
  • “Did we sit down when we built Watson and try to model human cognition?” Dave Ferrucci, who led the Watson team at IBM, pauses for emphasis. “Absolutely not. We just tried to create a machine that could win at Jeopardy.”
  • For Ferrucci, the definition of intelligence is simple: it’s what a program can do. Deep Blue was intelligent because it could beat Garry Kasparov at chess. Watson was intelligent because it could beat Ken Jennings at Jeopardy.
  • “There’s a limited number of things you can do as an individual, and I think when you dedicate your life to something, you’ve got to ask yourself the question: To what end? And I think at some point I asked myself that question, and what it came out to was, I’m fascinated by how the human mind works, it would be fantastic to understand cognition, I love to read books on it, I love to get a grip on it”—he called Hofstadter’s work inspiring—“but where am I going to go with it? Really what I want to do is build computer systems that do something.”
  • Peter Norvig, one of Google’s directors of research, echoes Ferrucci almost exactly. “I thought he was tackling a really hard problem,” he told me about Hofstadter’s work. “And I guess I wanted to do an easier problem.”
  • Of course, the folly of being above the fray is that you’re also not a part of it
  • As our machines get faster and ingest more data, we allow ourselves to be dumber. Instead of wrestling with our hardest problems in earnest, we can just plug in billions of examples of them.
  • Hofstadter hasn’t been to an artificial-intelligence conference in 30 years. “There’s no communication between me and these people,” he says of his AI peers. “None. Zero. I don’t want to talk to colleagues that I find very, very intransigent and hard to convince of anything.”
  • Everything from plate tectonics to evolution—all those ideas, someone had to fight for them, because people didn’t agree with those ideas.
  • Academia is not an environment where you just sit in your bath and have ideas and expect everyone to run around getting excited. It’s possible that in 50 years’ time we’ll say, ‘We really should have listened more to Doug Hofstadter.’ But it’s incumbent on every scientist to at least think about what is needed to get people to understand the ideas.”
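
The “calibration” idea in the machine-learning annotations above is concrete enough to sketch in code. What follows is a minimal sketch, not Candide’s actual algorithm: the IBM system estimated probabilistic word alignments and paired its translation model with a language model, whereas this toy version merely scores English–French word pairs by co-occurrence across known translation pairs. The four-sentence corpus and every name in it are invented for illustration.

    # A toy version of "calibrate a translator from sentence pairs."
    # NOT Candide's real method (the IBM models fit probabilistic word
    # alignments); this sketch just counts co-occurrences across a tiny
    # invented parallel corpus and translates word by word.
    from collections import defaultdict

    PAIRS = [  # stand-in for Candide's 2.2 million sentence pairs
        ("the house", "la maison"),
        ("the blue house", "la maison bleue"),
        ("the flower", "la fleur"),
        ("the blue flower", "la fleur bleue"),
    ]

    def train(pairs):
        """Count how often each English word appears alongside each French word."""
        cooc = defaultdict(int)  # (english, french) -> joint count
        e_count, f_count = defaultdict(int), defaultdict(int)
        for english, french in pairs:
            e_words, f_words = set(english.split()), set(french.split())
            for e in e_words:
                e_count[e] += 1
            for f in f_words:
                f_count[f] += 1
            for e in e_words:
                for f in f_words:
                    cooc[(e, f)] += 1
        return cooc, e_count, f_count

    def translate(sentence, cooc, e_count, f_count):
        """Pick, for each English word, the French word it is most associated with."""
        out = []
        for e in sentence.split():
            # Dice-style score: rewards co-occurrence, penalizes words that
            # appear everywhere (otherwise "blue" would map to "la").
            scored = [(cooc[(e, f)] ** 2 / (e_count[e] * f_count[f]), f)
                      for f in f_count if cooc[(e, f)] > 0]
            if scored:
                out.append(max(scored)[1])
        return " ".join(out)

    model = train(PAIRS)
    print(translate("the blue flower", *model))  # -> "la bleue fleur"
    # Word choices are right, word order is wrong: ordering is exactly what
    # the real system delegated to a separate statistical language model.

The point of the sketch is the one the article makes: nothing in it “understands” French. Scale the corpus up by six orders of magnitude and the same blind calibration starts to look like competence.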

All in the Family - The New York Times - 0 views

  • There’s nothing like the holidays to make you grateful for family ties while also wishing you were related to no one.
  • Molly Brodak wrestles with the question of whether telling a true story is a form of erasing it, or at least changing it beyond recognition. “I don’t want to see these words touching these true things,” she writes. “They are all wrong. This whole language I’m using is wrong.”
  •  
    Language can never truly express our feelings. Last year, I read a war story called "How to write a true war story," about how literature can never be as true as reality. Absolute truth can't be preserved because our memory is so unreliable and plastic. It all depends on whether we believe it or not. --Sissi (12/31/2016)

The Selfish Gene turns 40 | Science | The Guardian - 0 views

  • The idea was this: genes strive for immortality, and individuals, families, and species are merely vehicles in that quest. The behaviour of all living things is in service of their genes; hence, metaphorically, they are selfish.
  • Before this, it had been proposed that natural selection was honing the behaviour of living things to promote the continuance through time of the individual creature, or family, or group or species. But in fact, Dawkins said, it was the gene itself that was trying to survive, and it just so happened that the best way for it to survive was in concert with other genes in the impermanent husk of an individual
  • This gene-centric view of evolution also began to explain one of the oddities of life on Earth – the behaviour of social insects. What is the point of a drone bee, doomed to remain childless and in the service of a totalitarian queen? Suddenly it made sense that, with the gene itself steering evolution, the fact that the drone shared its DNA with the queen meant that its servitude guarantees not the individual’s survival, but the endurance of the genes they share.
  • ...9 more annotations...
  • the subject is taught bafflingly minimally and late in the curriculum even today; evolution by natural selection is crucial to every aspect of the living world. In the words of the Russian scientist Theodosius Dobzhansky: “Nothing in biology makes sense except in the light of evolution.”
  • his true legacy is The Selfish Gene and its profound effect on multiple generations of scientists and lay readers. In a sense, The Selfish Gene and Dawkins himself are bridges, both intellectually and chronologically, between the titans of mid-century biology – Ronald Fisher, Trivers, Hamilton, Maynard Smith and Williams – and our era of the genome, in which the interrogation of DNA dominates the study of evolution.
  • Genes aren’t what they used to be either. In 1976 they were simply stretches of DNA that encoded proteins. We now know about genes made of DNA’s cousin, RNA; we’ve discovered genes that hop from genome to genome
  • Since 1976, our understanding of why life is the way it is has blossomed and changed. Once the gene became the dominant idea in biology in the 1990s there followed a technological goldrush – the Human Genome Project – to find them all.
  • None of the complications of modern genomes erodes the central premise of the selfish gene.
  • Much of the enmity stems from people misunderstanding that selfishness is being used as a metaphor. The irony of these attacks is that the selfish gene metaphor actually explains altruism. We help others who are not directly related to us because we share similar versions of genes with them.
  • In the scientific community, the chief objection maintains that natural selection can operate at the level of a group of animals, not solely on genes or even individuals
  • To my mind, and that of the majority of evolutionary biologists, the gene-centric view of evolution always emerges intact.
  • the premise remains exciting that a gene’s only desire is to reproduce itself, and that the complexity of genomes makes that reproduction more efficient.

This is what it's like to grow up in the age of likes, lols and longing | The Washingto... - 1 views

  • She slides into the car, and even before she buckles her seat belt, her phone is alight in her hands. A 13-year-old girl after a day of eighth grade.
  • She doesn’t respond, her thumb on Instagram. A Barbara Walters meme is on the screen. She scrolls, and another meme appears. Then another meme, and she closes the app. She opens BuzzFeed. There’s a story about Florida Gov. Rick Scott, which she scrolls past to get to a story about Janet Jackson, then “28 Things You’ll Understand If You’re Both British and American.” She closes it. She opens Instagram. She opens the NBA app. She shuts the screen off. She turns it back on. She opens Spotify. Opens Fitbit. She has 7,427 steps. Opens Instagram again. Opens Snapchat. She watches a sparkly rainbow flow from her friend’s mouth. She watches a YouTube star make pouty faces at the camera. She watches a tutorial on nail art. She feels the bump of the driveway and looks up. They’re home. Twelve minutes have passed.
  • Katherine Pommerening’s iPhone is the place where all of her friends are always hanging out. So it’s the place where she is, too.
  • ...19 more annotations...
  • “Over 100 likes is good, for me. And comments. You just comment to make a joke or tag someone.”
  • The best thing is the little notification box, which means someone liked, tagged or followed her on Instagram. She has 604 followers. There are only 25 photos on her page because she deletes most of what she posts. The ones that don’t get enough likes, don’t have good enough lighting or don’t show the coolest moments in her life must be deleted.
  • Sociologists, advertisers, stock market analysts – everyone wants to know what happens when the generation born glued to screens has to look up and interact with the world.
  • “It kind of, almost, promotes you as a good person. If someone says, ‘tbh you’re nice and pretty,’ that kind of, like, validates you in the comments. Then people can look at it and say ‘Oh, she’s nice and pretty.’ ”
  • School is where she thrives: She is beloved by her teachers, will soon star as young Simba in the eighth-grade performance of “The Lion King” musical, and gets straight A’s. Her school doesn’t offer a math course challenging enough for her, so she takes honors algebra online through Johns Hopkins University.
  • “Happy birthday posts are a pretty big deal,” she says. “It really shows who cares enough to put you on their page.”
  • He checks the phone bill to see who she’s called and how much she’s been texting, but she barely calls anyone and chats mostly through Snapchat, where her messages disappear.
  • Some of Katherine’s very best friends have never been to her house, or she to theirs. To Dave, it seems like they rarely hang out, but he knows that to her, it seems like they’re together all the time.
  • Dave Pommerening wants to figure out how to get her to use it less. One month, she ate up 18 gigabytes of data. Most large plans max out at 10. He intervened and capped her at four GB. “I don’t want to crimp it too much,” he says. “That’s something, from my perspective, I’m going to have to figure out, how to get my arms around that.”
  • Even if her dad tried snooping around her apps, the true dramas of teenage girl life are not written in the comments. Like how sometimes, Katherine’s friends will borrow her phone just to un-like all the Instagram photos of girls they don’t like. Katherine can’t go back to those girls’ pages and re-like the photos because that would be stalking, which is forbidden.
  • Or how last week, at the middle school dance, her friends got the phone numbers of 10 boys, but then they had to delete five of them because they were seventh-graders. And before she could add the boys on Snapchat, she realized she had to change her username because it was her childhood nickname and that was totally embarrassing.
  • Then, because she changed her username, her Snapchat score reverted to zero. The app awards about one point for every snap you send and receive. It’s also totally embarrassing and stressful to have a low Snapchat score. So in one day, she sent enough snaps to earn 1,000 points.
  • Snapchat is where flirting happens. She doesn’t know anyone who has sent a naked picture to a boy, but she knows it happens with older girls, who know they have met the right guy.
  • Nothing her dad could find on her phone shows that for as good as Katherine is at math, basketball and singing, she wants to get better at her phone. To be one of the girls who knows what to post, how to caption it, when to like, what to comment.
  • Katherine doesn’t need magazines or billboards to see computer-perfect women. They’re right on her phone, all the time, in between photos of her normal-looking friends. There’s Aisha, there’s Kendall Jenner’s butt. There’s Olivia, there’s YouTube star Jenna Marbles in lingerie.
  • The whole world is at her fingertips and has been for years. This, Katherine offers as a theory one day, is why she doesn’t feel like she’s 13 years old at all. She’s probably, like, 16.
  • “I don’t feel like a child anymore,” she says. “I’m not doing anything childish. At the end of sixth grade” — when all her friends got phones and downloaded Snapchat, Instagram and Twitter — “I just stopped doing everything I normally did. Playing games at recess, playing with toys, all of it, done.”
  • Her scooter sat in the garage, covered in dust. Her stuffed animals were passed down to Lila. The wooden playground in the back yard stood empty. She kept her skateboard with neon yellow wheels, because riding it is still cool to her friends.
  • On the morning of her 14th birthday, Katherine wakes up to an alarm ringing on her phone. It’s 6:30 a.m. She rolls over and shuts it off in the dark. Her grandparents, here to celebrate the end of her first year of teenagehood, are sleeping in the guest room down the hall. She can hear the dogs shuffling across the hardwood downstairs, waiting to be fed. Propping herself up on her peace-sign-covered pillow, she opens Instagram. Later, Lila will give her a Starbucks gift card. Her dad will bring doughnuts to her class. Her grandparents will take her to the Melting Pot for dinner. But first, her friends will decide whether to post pictures of Katherine for her birthday. Whether they like her enough to put a picture of her on their page. Those pictures, if they come, will get likes and maybe tbhs. They should be posted in the morning, any minute now. She scrolls past a friend posing in a bikini on the beach. Then a picture posted by Kendall Jenner. A selfie with coffee. A basketball Vine. A selfie with a girl’s tongue out. She scrolls, she waits. For that little notification box to appear.

Noam Chomsky Calls Postmodern Critiques of Science Over-Inflated "Polysyllabic Truisms"... - 0 views

  • we recently featured an interview in which Noam Chomsky slams postmodernist intellectuals like Slavoj Zizek and Jacques Lacan as “charlatans” and posers.
  • The turn against postmodernism has been long in coming,
  • Chomsky characterizes leftist postmodern academics as “a category of intellectuals who are undoubtedly perfectly sincere”
  • ...4 more annotations...
  • in his critique, such thinkers use “polysyllabic words and complicated constructions” to make claims that are “all very inflated” and which have “a terrible effect on the third world.”
  • It’s considered very left wing, very advanced. Some of what appears in it sort of actually makes sense, but when you reproduce it in monosyllables, it turns out to be truisms. It’s perfectly true that when you look at scientists in the West, they’re mostly men, it’s perfectly true that women have had a hard time breaking into the scientific fields, and it’s perfectly true that there are institutional factors determining how science proceeds that reflect power structures.
  • you don’t get to be a respected intellectual by presenting truisms in monosyllables.
  • Chomsky’s cranky contrarianism is nothing new, and some of his polemic recalls the analytic case against “continental” philosophy or Karl Popper’s case against pseudo-science, although his investment is political as much as philosophical.
  •  
    An interesting synopsis and analysis, linked to a relatively short interview with a great thinker.

How Did Consciousness Evolve? - The Atlantic - 0 views

  • Theories of consciousness come from religion, from philosophy, from cognitive science, but not so much from evolutionary biology. Maybe that’s why so few theories have been able to tackle basic questions such as: What is the adaptive value of consciousness? When did it evolve and what animals have it?
  • The Attention Schema Theory (AST), developed over the past five years, may be able to answer those questions.
  • The theory suggests that consciousness arises as a solution to one of the most fundamental problems facing any nervous system: Too much information constantly flows in to be fully processed. The brain evolved increasingly sophisticated mechanisms for deeply processing a few select signals at the expense of others, and in the AST, consciousness is the ultimate result of that evolutionary sequence
  • ...23 more annotations...
  • Even before the evolution of a central brain, nervous systems took advantage of a simple computing trick: competition.
  • At any moment only a few neurons win that intense competition, their signals rising up above the noise and impacting the animal’s behavior. This process is called selective signal enhancement, and without it, a nervous system can do almost nothing. (A toy winner-take-all sketch of this competition appears after this list.)
  • Selective enhancement therefore probably evolved sometime between hydras and arthropods—between about 700 and 600 million years ago, close to the beginning of complex, multicellular life
  • The next evolutionary advance was a centralized controller for attention that could coordinate among all senses. In many animals, that central controller is a brain area called the tectum
  • It coordinates something called overt attention – aiming the satellite dishes of the eyes, ears, and nose toward anything important.
  • All vertebrates—fish, reptiles, birds, and mammals—have a tectum. Even lampreys have one, and they appeared so early in evolution that they don’t even have a lower jaw. But as far as anyone knows, the tectum is absent from all invertebrates
  • According to fossil and genetic evidence, vertebrates evolved around 520 million years ago. The tectum and the central control of attention probably evolved around then, during the so-called Cambrian Explosion when vertebrates were tiny wriggling creatures competing with a vast range of invertebrates in the sea.
  • The tectum is a beautiful piece of engineering. To control the head and the eyes efficiently, it constructs something called an internal model, a feature well known to engineers. An internal model is a simulation that keeps track of whatever is being controlled and allows for predictions and planning.
  • The tectum’s internal model is a set of information encoded in the complex pattern of activity of the neurons. That information simulates the current state of the eyes, head, and other major body parts, making predictions about how these body parts will move next and about the consequences of their movement. (A minimal sketch of such a predictive model appears after this list.)
  • In fish and amphibians, the tectum is the pinnacle of sophistication and the largest part of the brain. A frog has a pretty good simulation of itself.
  • With the evolution of reptiles around 350 to 300 million years ago, a new brain structure began to emerge – the wulst. Birds inherited a wulst from their reptile ancestors. Mammals did too, but our version is usually called the cerebral cortex and has expanded enormously
  • The cortex also takes in sensory signals and coordinates movement, but it has a more flexible repertoire. Depending on context, you might look toward, look away, make a sound, do a dance, or simply store the sensory event in memory in case the information is useful for the future.
  • The most important difference between the cortex and the tectum may be the kind of attention they control. The tectum is the master of overt attention—pointing the sensory apparatus toward anything important. The cortex ups the ante with something called covert attention. You don’t need to look directly at something to covertly attend to it. Even if you’ve turned your back on an object, your cortex can still focus its processing resources on it
  • The cortex needs to control that virtual movement, and therefore like any efficient controller it needs an internal model. Unlike the tectum, which models concrete objects like the eyes and the head, the cortex must model something much more abstract. According to the AST, it does so by constructing an attention schema—a constantly updated set of information that describes what covert attention is doing moment-by-moment and what its consequences are
  • Covert attention isn’t intangible. It has a physical basis, but that physical basis lies in the microscopic details of neurons, synapses, and signals. The brain has no need to know those details. The attention schema is therefore strategically vague. It depicts covert attention in a physically incoherent way, as a non-physical essence
  • this, according to the theory, is the origin of consciousness. We say we have consciousness because deep in the brain, something quite primitive is computing that semi-magical self-description.
  • I’m reminded of Teddy Roosevelt’s famous quote, “Do what you can with what you have where you are.” Evolution is the master of that kind of opportunism. Fins become feet. Gill arches become jaws. And self-models become models of others. In the AST, the attention schema first evolved as a model of one’s own covert attention. But once the basic mechanism was in place, according to the theory, it was further adapted to model the attentional states of others, to allow for social prediction. Not only could the brain attribute consciousness to itself, it began to attribute consciousness to others.
  • In the AST’s evolutionary story, social cognition begins to ramp up shortly after the reptilian wulst evolved. Crocodiles may not be the most socially complex creatures on earth, but they live in large communities, care for their young, and can make loyal if somewhat dangerous pets.
  • If AST is correct, 300 million years of reptilian, avian, and mammalian evolution have allowed the self-model and the social model to evolve in tandem, each influencing the other. We understand other people by projecting ourselves onto them. But we also understand ourselves by considering the way other people might see us.
  • The cortical networks in the human brain that allow us to attribute consciousness to others overlap extensively with the networks that construct our own sense of consciousness.
  • Language is perhaps the most recent big leap in the evolution of consciousness. Nobody knows when human language first evolved. Certainly we had it by 70 thousand years ago when people began to disperse around the world, since all dispersed groups have a sophisticated language. The relationship between language and consciousness is often debated, but we can be sure of at least this much: once we developed language, we could talk about consciousness and compare notes
  • Maybe partly because of language and culture, humans have a hair-trigger tendency to attribute consciousness to everything around us. We attribute consciousness to characters in a story, puppets and dolls, storms, rivers, empty spaces, ghosts and gods. Justin Barrett called it the Hyperactive Agency Detection Device, or HADD
  • the HADD goes way beyond detecting predators. It’s a consequence of our hyper-social nature. Evolution turned up the amplitude on our tendency to model others and now we’re supremely attuned to each other’s mind states. It gives us our adaptive edge. The inevitable side effect is the detection of false positives, or ghosts.
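
The “selective signal enhancement” annotation above describes a winner-take-all competition, which is simple to demonstrate. The loop below is a generic mutual-inhibition sketch with invented constants and stimulus names, not code from the AST literature.

    # Toy selective signal enhancement: candidate signals compete through
    # mutual inhibition until one dominates and the rest are silenced.
    # All constants and stimulus names are made up for this sketch.
    inputs = {"prey_left": 0.9, "shadow_above": 0.6, "ripple_right": 0.5}
    activity = dict(inputs)  # each signal starts at its input strength

    SELF_EXCITATION = 0.5    # how much a signal sustains itself
    INHIBITION = 0.6         # how strongly rivals suppress it

    for _ in range(20):
        total = sum(activity.values())
        activity = {
            name: min(1.0, max(0.0,
                inputs[name]                   # drive from the stimulus
                + SELF_EXCITATION * a          # self-sustaining activity
                - INHIBITION * (total - a)))   # suppression by all rivals
            for name, a in activity.items()
        }

    for name, a in activity.items():
        print(f"{name}: {a:.3f}")
    # prey_left saturates near 1.0 while the weaker signals are driven to
    # zero: only the winning signal is left to steer behavior.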
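
The tectum’s “internal model” can be sketched the same way: a small simulation that tracks the state of the thing being controlled and predicts its next state. The tracker below follows a single head angle through a noisy sensor; the blending constant and the whole setup are invented for illustration, not drawn from the article.

    # A minimal internal model in the control-engineering sense: keep a
    # running estimate of a 1-D head angle, fold in noisy measurements,
    # and use the estimate to predict the next position for planning.
    import random

    class InternalModel:
        def __init__(self):
            self.angle = 0.0     # believed head angle (degrees)
            self.velocity = 0.0  # believed turn rate (degrees per step)

        def predict(self):
            """Where the model expects the head to be one step from now."""
            return self.angle + self.velocity

        def update(self, measured, blend=0.5):
            """Blend a noisy sensor reading into the simulation."""
            estimate = blend * self.predict() + (1 - blend) * measured
            self.velocity = estimate - self.angle  # re-estimate turn rate
            self.angle = estimate

    model = InternalModel()
    true_angle = 0.0
    for step in range(8):
        true_angle += 2.0                            # head turns 2 degrees/step
        sensor = true_angle + random.uniform(-1, 1)  # imperfect measurement
        model.update(sensor)
        print(f"step {step}: true={true_angle:4.1f}  "
              f"model={model.angle:4.1f}  next={model.predict():4.1f}")
    # After a few steps the model tracks the motion and its predictions lead
    # the sensor -- which is what lets a controller plan rather than react.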

Do Your Friends Actually Like You? - The New York Times - 1 views

  • Recent research indicates that only about half of perceived friendships are mutual. That is, someone you think is your friend might not be so keen on you. Or, vice versa, as when someone you feel you hardly know claims you as a bestie.
  • “The notion of doing nothing but spending time in each other’s company has, in a way, become a lost art,” replaced by volleys of texts and tweets, Mr. Sharp said. “People are so eager to maximize efficiency of relationships that they have lost touch with what it is to be a friend.”
  • It’s a concern because the authenticity of one’s relationships has an enormous impact on one’s health and well-being.
  • ...11 more annotations...
  • The study analyzed friendship ties among 84 subjects (ages 23 to 38) in a business management class by asking them to rank one another on a five-point continuum of closeness from “I don’t know this person” to “One of my best friends.” The feelings were mutual 53 percent of the time while the expectation of reciprocity was pegged at 94 percent. This is consistent with data from several other friendship studies conducted over the past decade, encompassing more than 92,000 subjects, in which the reciprocity rates ranged from 34 percent to 53 percent.
  • “Friendship is difficult to describe,” said Alexander Nehamas, a professor of philosophy at Princeton, who in his latest book, “On Friendship,” spends almost 300 pages trying to do just that. “It’s easier to say what friendship is not and, foremost, it is not instrumental.”
  • It is not a means to obtain higher status, wangle an invitation to someone’s vacation home or simply escape your own boredom. Rather, Mr. Nehamas said, friendship is more like beauty or art, which kindles something deep within us and is “appreciated for its own sake.”
  • “Treating friends like investments or commodities is anathema to the whole idea of friendship,” said Ronald Sharp, a professor of English at Vassar College, who teaches a course on the literature of friendship. “It’s not about what someone can do for you, it’s who and what the two of you become in each other’s presence.”
  • Some blame human beings’ basic optimism, if not egocentrism, for the disconnect between perceived and actual friendships. Others point to a misunderstanding of the very notion of friendship in an age when “friend” is used as a verb, and social inclusion and exclusion are as easy as a swipe or a tap on a smartphone screen.
  • By his definition, friends are people you take the time to understand and allow to understand you.
  • Because time is limited, so, too, is the number of friends you can have, according to the work of the British evolutionary psychologist Robin I.M. Dunbar. He describes layers of friendship, where the topmost layer consists of only one or two people, say a spouse and best friend with whom you are most intimate and interact daily. The next layer can accommodate at most four people for whom you have great affinity, affection and concern and who require weekly attention to maintain. Out from there, the tiers contain more casual friends with whom you invest less time and tend to have a less profound and more tenuous connection. Without consistent contact, they easily fall into the realm of acquaintance. You may be friendly with them but they aren’t friends.
  • “There is a limited amount of time and emotional capital we can distribute, so we only have five slots for the most intense type of relationship,” Mr. Dunbar said. “People may say they have more than five but you can be pretty sure they are not high-quality friendships.”
  • Such boasting implies they have soul mates to spare in a culture where we are taught that leaning on someone is a sign of weakness and power is not letting others affect you. But friendship requires the vulnerability of caring as well as revealing things about yourself that don’t match the polished image in your Facebook profile or Instagram feed, said Mr. Nehamas at Princeton. Trusting that your bond will continue, and might even be strengthened, despite your shortcomings and inevitable misfortunes, he said, is a risk many aren’t willing to take.
  • According to medical experts, playing it safe by engaging in shallow, unfulfilling or nonreciprocal relationships has physical repercussions. Not only do the resulting feelings of loneliness and isolation increase the risk of death as much as smoking, alcoholism and obesity; you may also lose tone, or function, in the so-called smart vagus nerve, which brain researchers think allows us to be in intimate, supportive and reciprocal relationships in the first place.
  • In the presence of a true friend, Dr. Banks said, the smart or modulating aspect of the vagus nerve is what makes us feel at ease rather than on guard as when we are with a stranger or someone judgmental. It’s what enables us to feel O.K. about exposing the soft underbelly of our psyche and helps us stay engaged and present in times of conflict. Lacking authentic friendships, the smart vagus nerve is not exercised. It loses tone and one’s anxiety remains high, making abiding, deep connections difficult.

Why Our Children Don't Think There Are Moral Facts - NYTimes.com - 1 views

  • I already knew that many college-aged students don’t believe in moral facts.
  • the overwhelming majority of college freshman in their classrooms view moral claims as mere opinions that are not true or are true only relative to a culture.
  • where is the view coming from?
  • ...32 more annotations...
  • the Common Core standards used by a majority of K-12 programs in the country require that students be able to “distinguish among fact, opinion, and reasoned judgment in a text.”
  • So what’s wrong with this distinction and how does it undermine the view that there are objective moral facts?
  • For example, many people once thought that the earth was flat. It’s a mistake to confuse truth (a feature of the world) with proof (a feature of our mental lives)
  • Furthermore, if proof is required for facts, then facts become person-relative. Something might be a fact for me if I can prove it but not a fact for you if you can’t. In that case, E=MC2 is a fact for a physicist but not for me.
  • worse, students are taught that claims are either facts or opinions. They are given quizzes in which they must sort claims into one camp or the other but not both. But if a fact is something that is true and an opinion is something that is believed, then many claims will obviously be both
  • How does the dichotomy between fact and opinion relate to morality
  • Kids are asked to sort facts from opinions and, without fail, every value claim is labeled as an opinion.
  • Here’s a little test devised from questions available on fact vs. opinion worksheets online: are the following facts or opinions?
    — Copying homework assignments is wrong.
    — Cursing in school is inappropriate behavior.
    — All men are created equal.
    — It is worth sacrificing some personal liberties to protect our country from terrorism.
    — It is wrong for people under the age of 21 to drink alcohol.
    — Vegetarians are healthier than people who eat meat.
    — Drug dealers belong in prison.
  • The answer? In each case, the worksheets categorize these claims as opinions. The explanation on offer is that each of these claims is a value claim and value claims are not facts. This is repeated ad nauseam: any claim with good, right, wrong, etc. is not a fact.
  • In summary, our public schools teach students that all claims are either facts or opinions and that all value and moral claims fall into the latter camp. The punchline: there are no moral facts. And if there are no moral facts, then there are no moral truths.
  • It should not be a surprise that there is rampant cheating on college campuses: If we’ve taught our students for 12 years that there is no fact of the matter as to whether cheating is wrong, we can’t very well blame them for doing so later on.
  • If it’s not true that it’s wrong to murder a cartoonist with whom one disagrees, then how can we be outraged? If there are no truths about what is good or valuable or right, how can we prosecute people for crimes against humanity? If it’s not true that all humans are created equal, then why vote for any political system that doesn’t benefit you over others?
  • the curriculum sets our children up for doublethink. They are told that there are no moral facts in one breath even as the next tells them how they ought to behave.
  • Our children deserve a consistent intellectual foundation. Facts are things that are true. Opinions are things we believe. Some of our beliefs are true. Others are not. Some of our beliefs are backed by evidence. Others are not.
  • Value claims are like any other claims: either true or false, evidenced or not.
  • The hard work lies not in recognizing that at least some moral claims are true but in carefully thinking through our evidence for which of the many competing moral claims is correct.
  • Moral truths are not the same as scientific truths or mathematical truths. Yet they may still be used as a guiding principle for our individual lives as well as our laws. But there is equal danger in giving moral judgments the designation of truth as there is in not doing so. Many people believe that abortion is murder on the same level as shooting someone with a gun. But many others do not. So is it true that abortion is murder? Moral principles can become generally accepted and then form the basis for our laws. But many long-accepted moral principles were later rejected as being faulty. "Separate but equal" is an example. Judging homosexual relationships as immoral is another example.
  • Whoa! That Einstein derived an equation is a fact. But the equation represents a theory that may have to be tweaked at some point in the future. It may be a fact that the equation foretold the violence of atomic explosions, but there are aspects of nature that elude the equation. Remember "the theory of everything?"
  • Here is a moral fact: this is a sermon masquerading as a philosophical debate on facts, opinions and truth. This professor of religion is asserting that the government, via Common Core, is teaching atheism via the opinion-vs.-fact distinction. He is arguing, in a dishonest form, that public schools should be teaching moral facts. Of course, "moral facts" is code for the Ten Commandments.
  • As a fourth grade teacher, I try to teach students to read critically, including distinguishing between facts and opinions as they read (and have been doing this long before the Common Core arrived, by the way). It's not always easy for children to grasp the difference. I can only imagine the confusion that would ensue if I introduced a third category -- moral "facts" that can't be proven but are true nonetheless!
  • horrible acts occur not because of moral uncertainty, but because people are too sure that their views on morality are 100% true, and anyone who fails to recognize and submit themselves is a heathen who deserves death. I can't think of any case where a society has suffered because people are too thoughtful and open-minded to different perspectives on moral truth. In any case, it's not an elementary school's job to teach "moral truths."
  • The characterization of moral anti-realism as some sort of fringe view in philosophy is misleading. Claims that can be true or false are, it seems, 'made true' by features of the world. It's not clear to many in philosophy (like me) just what features of the world could make our moral claims true. We are more likely to see people's value claims as making claims about, and enforcing conformity to, our own (contingent) social norms. This is not to hold, as Mr. McBrayer seems to think follows, that there are no reasons to endorse or criticize these social norms.
  • This is nonsense. Giving kids the tools to distinguish between fact and opinion is hard enough in an age when Republicans actively deny reality on Fox News every night. The last thing we need is to muddy their thinking with the concept of "moral facts." A fact is a belief that everyone _should_ agree upon because it is observable and testable. Morals are not agreed upon by all. Consider the hot-button issue of abortion.
  • Truthfully, I'm not terribly concerned that third graders will end up taking these lessons in the definition of fact versus opinion to the extremes considered here, or take them as a license to cheat. That will come much later, when they figure out, as people always have, what they can get away with. But Prof. McBrayer, with his blithe expectation that all the grownups know that there are moral "facts"? He scares the heck out of me.
  • I've long chafed at the language of "fact" v. "opinion", which is grounded in a very particular, limited view of human cognition. In my own ethics courses, I work actively to undermine the distinction, focusing instead on considered judgment . . . or even more narrowly, on consideration itself. (See http://wp.me/p5Ag0i-6M )
  • The real waffle here is the very concept of "moral facts." Our statements of values, even very important ones are, obviously, not facts. Trying to dress them up as if they are facts, to me, argues for a pretty serious moral weakness on the part of those advancing the idea.
  • Our core values are not important because they are facts. They are important because we collectively hold them and cherish them. To lean on the false crutch of "moral facts" is to admit the weakness of your own moral convictions.
  • I would like to believe that there is a core of moral facts/values upon which all humanity can agree, but it would be tough to identify exactly what those are.
  • For the the ancient philosophers, reality comprised the Good, the True, and the Beautiful (what we might now call ethics, science and art), seeing these as complementary and inseparable, though distinct, realms. With the ascendency of science in our culture as the only valid measure of reality to the detriment of ethics and art (that is, if it is not observable and provable, it is not real), we have turned the good and the beautiful into mere "social constructs" that have no validity on their own. While I am sympathetic in many ways with Dr. McBrayer's objections, I think he falls into the trap of discounting the Good and The Beautiful as valid in and of themselves, and tries, instead, to find ways to give them validity through the True. I think his argument would have been stronger had he used the language of validity rather than the language of truth. Goodness, Truth and Beauty each have their own validity, though interdependent and inseparable. When we artificially extract one of these and give it primacy, we distort reality and alienate ourselves from it.
  • Professor McBrayer seems to miss the major point of the Common Core concern: can students distinguish between premises based on (reasonably construed) fact and premises based on emotion when evaluating conclusions? I would prefer that students learn to reason rather than be taught moral 'truth' that follows Professor McBrayer's logic.
  • Moral issues cannot scientifically be treated on the level that Prof. McBrayer is attempting to use in this column: true or false, fact or opinion or both. Instead, they should be treated as important characteristics of the systematic working of a society or of a group of people in general. One can compare the working of two groups of people: one in which e.g. cheating and lying is acceptable, and one in which they are not. One can use historical or model examples to show the consequences and the working of specific systems of morals. I think that this method - suitably adjusted - can be used even in second grade.
  • Relativism has nothing to do with liberalism. The second point is that I'm not sure it does all that much harm, because I have yet to encounter a student who thought that he or she had to withhold judgment on those who hold opposing political views!

Psychiatry's New Guide Falls Short, Experts Say - NYTimes.com - 1 views

  • his goal was to reshape the direction of psychiatric research to focus on biology, genetics and neuroscience so that scientists can define disorders by their causes, rather than their symptoms.
  • While the Diagnostic and Statistical Manual of Mental Disorders, or D.S.M., is the best tool now available for clinicians treating patients and should not be tossed out, he said, it does not reflect the complexity of many disorders, and its way of categorizing mental illnesses should not guide research.
  • senior figures in psychiatry who have challenged not only decisions about specific diagnoses but the scientific basis of the entire enterprise. Basic research into the biology of mental disorders and treatment has stalled, they say, confounded by the labyrinth of the brain.
  • ...5 more annotations...
  • The creators of the D.S.M. in the 1960s and ’70s “were real heroes at the time,” said Dr. Steven E. Hyman, a psychiatrist and neuroscientist at the Broad Institute and a former director at the National Institute of Mental Health. “They chose a model in which all psychiatric illnesses were represented as categories discontinuous with ‘normal.’ But this is totally wrong in a way they couldn’t have imagined. So in fact what they produced was an absolute scientific nightmare. Many people who get one diagnosis get five diagnoses, but they don’t have five diseases — they have one underlying condition.”
  • Dr. Insel is one of a growing number of scientists who think that the field needs an entirely new paradigm for understanding mental disorders, though neither he nor anyone else knows exactly what it will look like.
  • Decades of spending on neuroscience have taught scientists mostly what they do not know, undermining some of their most elemental assumptions. Genetic glitches that appear to increase the risk of schizophrenia in one person may predispose others to autism-like symptoms, or bipolar disorder. The mechanisms of the field’s most commonly used drugs — antidepressants like Prozac, and antipsychosis medications like Zyprexa — have revealed nothing about the causes of those disorders. And major drugmakers have scaled back psychiatric drug development, having virtually no new biological “targets” to shoot for.
  • Dr. Hyman, Dr. Insel and other experts said they hoped that the science of psychiatry would follow the direction of cancer research, which is moving from classifying tumors by where they occur in the body to characterizing them by their genetic and molecular signatures.
  • Dr. Insel said in the interview that his motivation was not to disparage the D.S.M. as a clinical tool, but to encourage researchers and especially outside reviewers who screen proposals for financing from his agency to disregard its categories and investigate the biological underpinnings of disorders instead.

Googling Is Believing: Trumping the Informed Citizen - The New York Times - 1 views

  • Rubio’s Google gambit, and Trump’s (non)reaction to it, reveals an interesting, and troubling, new change in attitude about a philosophical foundation of democracy: the ideal of an informed citizenry.
  • The idea is obvious: If citizens are going to make even indirect decisions about policy, we need to know the facts about the problem the policy is meant to rectify, and to be able to gain some understanding about how effective that policy would be.
  • Noam Chomsky argued in the 1980s that consent was being “manufactured” by Big Media — large consolidated content-delivery companies (like this newspaper) that could cause opinions to sway one way or the other at their whim.
  • ...13 more annotations...
  • searching the Internet can get you to information that would back up almost any claim of fact, no matter how unfounded. It is both the world’s best fact-checker and the world’s best bias confirmer — often at the same time.
  • Nor is it a coincidence that people are increasingly following the election on social media, using it both as the source of their information and as the way to get their view out. Consent is still being manufactured, but the manufacturing is being done willingly by us, usually intended for consumption by other people with whom we already agree, facts or no facts.
  • It really isn’t a surprise that Rubio would ask us to Google for certain facts; that’s how you and I know almost everything we know nowadays — it is a way of knowing that is so embedded into the very fabric of our lives that we don’t even notice it
  • The problem of course is that having more information available, even more accurate information, isn’t what is required by the ideal.
  • What is required is that people actually know and understand that information, and there are reasons to think we are no closer to an informed citizenry understood in that way than we ever have been. Indeed, we might be further away.
  • The worry is no longer about who controls content. It is about who controls the flow of that content.
  • the flow of digital information is just as prone to manipulation as its content
  • No wonder Trump and his followers on Twitter immediately shrugged off Rubio’s inconvenient truths; there is nothing to fear from information when counterinformation is just as plentiful.
  • The real worry concerns our faith in the ideal of an informed citizenry itself. That worry, as I see it, has two faces.
  • First, as Jason Stanley and others have emphasized recently, appeals to ideals can be used to undermine those very ideals.
  • The very availability of information can make us think that the ideal of the informed citizen is more realized than it is — and that, in turn, can actually undermine the ideal, making us less informed, simply because we think we know all we need to know already.
  • Second, the danger is that increasing recognition of the fact that Googling can get you wherever you want to go can make us deeply cynical about the ideal of an informed citizenry — for the simple reason that what counts as an “informed” citizen is a matter of dispute. We no longer disagree just over values. Nor do we disagree just over the facts. We disagree over whose source — whose fountain of facts — is the right one.
  • And once disagreement reaches that far down, the daylight of reason seems very far away indeed.

How to Be Liked by Everyone Online - NYTimes.com - 1 views

  • The Internet — once again — has upended social and psychological norms. Linguistically speaking, what was formerly undesirable or just unpleasant is now highly sought after
  • To be “linked,” in a previous life, suggested something illicit — an affair or a possible crime associating His Name with Yours. But in Internet World, linking is a professional asset.
  • applying the word “disrupt” to any behavior in people under the age of 18 is bound to involve bodily damage, psychic distress or — later on, perhaps — the buying and selling of hard drugs.
  • ...7 more annotations...
  • “reversification” to describe the phenomenon. “I mean by it a process in which words come, through a process of evolution and innovation, to have a meaning that is opposite to, or at least very different from, their initial sense,”
  • the word “enable” had a dubious cast in the common parlance of therapy and gossip: an enabler was someone who handed the broody tippler a fresh cocktail; to enable was to unleash the codependent. Now it’s a technological upgrade
  • To have something liked online is not as great as having something actually liked. It doesn’t even necessarily mean someone enjoyed it — it might simply mean, “Got it,” or more wanly, “This provoked some kind of feeling, however minor.”
  • To tag someone online is a far nastier enterprise. Anyone can resurface disparaging photographic evidence of youthful folly and post it on a social network, “tagging” it with the unsuspecting’s name.
  • Most people think long and hard about their favorite movie, novel, people and even color. Online, favorites are not so special. To “favorite” (now a verb) something on Twitter is to say, in effect, “I saw this thing and liked it O.K., but not enough to retweet it.” Or a tepid “I see you wrote something about me and I will acknowledge that by favoriting. But expect nothing more.”
  • Even for adults, sharing has historically been considered a commendable activity, no matter the tangled motivations. Sharing in Internet parlance? Pure egotism. Check out my 6-year-old on the viola. Don’t you wish you were this attractive at 41?
  • Being a star in real life signifies tremendous professional success or, at the very least, celebrity; to “star” something on Gmail means you need to write back.

Can You Trust the News Media? - Watchtower ONLINE LIBRARY - 1 views

  • MANY people doubt what they read and hear in the news. In the United States, for example, a 2012 Gallup poll asked people “how much trust and confidence” they had in the accuracy, fairness, and completeness of the news reports of newspapers, TV, and radio. The answer from 6 out of 10 people was either “not very much” or “none at all.” Is such distrust justified?
  • Many journalists and the organizations they work for have expressed a commitment to producing accurate and informative reports. Yet, there is reason for concern. Consider the following factors:
  • MEDIA MOGULS. A small but very powerful number of corporations own primary media outlets.
  • ...4 more annotations...
  • GOVERNMENTS. Much of what we learn in the media has to do with the people and the affairs of government.
  • ADVERTISING. In most lands, media outlets must make money in order to stay in business, and most of it comes from advertising.
  • While it is wise not to believe everything we read in the news, it does not follow that there is nothing we can trust. The key may be to have a healthy skepticism, while keeping an open mind.
  • So, can you trust the news media? Sound advice is found in the wisdom of Solomon, who wrote: “Anyone inexperienced puts faith in every word, but the shrewd one considers his steps.”
  •  
    Can we trust the news media?

The Republican Horse Race Is Over, and Journalism Lost - The New York Times - 0 views

  • Wrong, wrong, wrong — to the very end, we got it wrong.
  • in the end, you have to point the finger at national political journalism, which has too often lost sight of its primary directives in this election season: to help readers and viewers make sense of the presidential chaos; to reduce the confusion, not add to it; to resist the urge to put ratings, clicks and ad sales above the imperative of getting it right.
  • The first signs that something was amiss in the coverage of the Tea Party era actually surfaced in the 2014 midterms. Oh, you broadcast network newscast viewers didn’t know we had important elections with huge consequences for the governance of your country that year? You can be forgiven because the broadcast networks hardly covered them.
  • ...6 more annotations...
  • the lesson in Virginia, as the Washington Post reporter Paul Farhi wrote at the time, was that nothing exceeds the value of shoe-leather reporting, given that politics is an essentially human endeavor and therefore can defy prediction and reason.
  • Yet when Mr. Trump showed up on the scene, it was as if that had never happened.
  • It was another thing to declare, as The Huffington Post did, that coverage of his campaign could be relegated to the entertainment section (and to add a disclaimer to articles about him) and still another to give Mr. Trump a “2 percent” chance at the nomination despite strong polls in his favor, as FiveThirtyEight did six months before the first votes were cast.
  • Predictions that far out can be viewed as being all in good fun. But in Mr. Trump’s case, they also arguably sapped the journalistic will to scour his record as aggressively as those of his supposedly more serious rivals. In other words, predictions can have consequences.
  • The problems weren’t only due to the reliance on data. Don’t forget those moments that were supposed to have augured Mr. Trump’s collapse: the certainty that once the race narrowed to two or three candidates, Mr. Trump would be through, and what at one point became the likelihood of a contested convention.
  • That’s all the more reason in the coming months to be as sharply focused on the data we don’t have as we are on the data we do have (and maybe watching out for making any big predictions about the fall based on the polling of today). But a good place to start would be to get a good night’s sleep, and then talk to some voters.

Is Empathy Overrated? | Big Think - 0 views

  • Empathy seems to be a quality you can never overdo. It’s like a megavitamin of emotionally relating: the more you display, the better a human you are.
  • In his last book, Just Babies, he argued humans are born moral, no religion required.
  • Telling someone empathy is overrated is akin to stating puppies are useless and ugly.
  • ...6 more annotations...
  • Empathy is the act of coming to experience the world as you think someone else does … If your suffering makes me suffer, if I feel what you feel, that’s empathy in the sense that I’m interested in here.
  • For example, donating to foreign charities ups our dopamine intake—we feel better because we’re making a difference (which, of course, can make it more about how we feel than who we’re helping).
  • Yet it’s not in our biological inheritance to offer unchecked empathy. Bloom points to our tribal nature as evidence. We’re going to care more for those closest to us, such as family and friends, than Cambodian orphans.
  • Anyone who thinks that it’s important for a therapist to feel depressed or anxious while dealing with depressed or anxious people is missing the point of therapy.
  • Bloom then discusses the difference between what Binghamton professor and Asian Studies scholar Charles Goodman describes as “sentimental compassion” and “great compassion.” The first is similar to empathy, which leads to imbalances in relationships and one’s own psychological state. Simply put, it’s exhausting.
  • Empathy is going to be a buzzword for some time to come. It feeds into our social nature, which Bloom sees nothing wrong with.
  •  
    I found this article very interesting, as it talks about how empathy, as an emotion, is sometimes bad for us. I really like the point where the author mentions that unchecked empathy is not in our biological inheritance, because our tribal nature leads us to care more for those closest to us. It is very interesting to think about how modern society shapes our emotions and behavior, and how empathy is gradually becoming part of our nature. --Sissi (2/22/2017)

The Right Way to Say 'I'm Sorry' - The New York Times - 1 views

  • Most people say “I’m sorry” many times a day for a host of trivial affronts – accidentally bumping into someone or failing to hold open a door. These apologies are easy and usually readily accepted, often with a response like, “No problem.”
  • But when “I’m sorry” are the words needed to right truly hurtful words, acts or inaction, they can be the hardest ones to utter.
  • apology can be powerful medicine with surprising value for the giver as well as the recipient.
  • ...6 more annotations...
  • Expecting nothing in return, I was greatly relieved when my doorbell rang and the neighbor thanked me warmly for what I had said and done.
  • as an excuse for hurtful behavior.
  • She disputes popular thinking that failing to forgive is bad for one’s health and can lead to a life mired in bitterness and hate.
  • Offering an apology is an admission of guilt that admittedly leaves people vulnerable. There’s no guarantee as to how it will be received.
  • apologies followed by rationalizations are “never satisfying” and can even be harmful.
  • ‘I’m sorry’ are the two most healing words in the English language,” she said. “The courage to apologize wisely and well is not just a gift to the injured person, who can then feel soothed and released from obsessive recriminations, bitterness and corrosive anger.
  •  
    We say "sorry" very easily when it is completely unnecessary, as a performance of politeness, because we know for sure that others will respond with a "No problem." However, when a "sorry" is truly essential, we become very reluctant to offer it, since the recipient is likely to reject the apology. Apologizing for our wrong behavior can release us from guilt and bitterness. An apology should not be a plea for forgiveness; it is a communication between two people and a review of ourselves, a self-reflection. --Sissi (1/31/2017)

The Economics of Obesity: Why Are Poor People Fat? - 0 views

  • This is what poverty looked like in the Great Depression…
  • This is what poverty looks like today…
  • For most of recorded history, fat was revered as a sign of health and prosperity. Plumpness was a status symbol. It showed that you did not have to engage in manual labor for your sustenance. And it meant that you could afford plentiful quantities of food.
  • ...5 more annotations...
  • The constant struggle to hunt and harvest ensured that we stayed active. And for those with little money, the supply of calories was meager. This ensured that most of the working class stayed slim.
  • Rich people were fat. Poor people were thin.
  • What he found is that he could buy well over 1,000 calories of cookies or potato chips. But his dollar would only buy 250 calories of carrots. He could buy almost 900 calories of soda… but only 170 calories of orange juice.
  • The primary reason that lower-income people are more overweight is that the unhealthiest and most fattening foods are the cheapest (see the calories-per-dollar sketch after this item).
  • Within the current system, the best we can hope for is a situation where public funds are diverted from the corporate Agri-Giants (which is nothing more than welfare for the wealthy) to family farms and fruit and vegetable growers. Currently, almost 70% of farmers receive no subsidies at all, while the biggest and strongest take the bulk of public funds.
  •  
    This article examines an interesting stereotype: that rich people ought to be fat and poor people ought to be thin. It reminded me of a video I saw recently, in which a poor but overweight woman tries to explain why people in poverty today are more likely to be fat. She shows some of the comments people make when they hear that she is very poor; the vehement reactions and bad language reveal how deeply this stereotype runs in our society. However, times are very different now. Food is not as expensive as we think; what is expensive is healthy food, and that is why poor people tend to be fat. My grandpa once told me that when he was young, he was confused about why the poor characters in Hong Kong movies were eating chicken legs. This is the result of the transformation of society. --Sissi (2/8/2017)
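
A quick aside on the arithmetic in that excerpt: below is a minimal Python sketch, not from the article, that simply ranks the quoted calories-per-dollar figures and computes how much cheaper junk-food calories are than juice calories. The assumption that each figure corresponds to exactly one dollar spent follows the excerpt's own framing.

```python
# A rough sketch (not from the article) of the calories-per-dollar
# comparison quoted above. The calorie figures are the ones in the
# excerpt; everything is normalized to one dollar spent.

foods = {
    "cookies / potato chips": 1000,  # "well over 1,000 calories"
    "soda": 900,                     # "almost 900 calories"
    "carrots": 250,
    "orange juice": 170,
}

for food, kcal in sorted(foods.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{food:>22}: {kcal:5d} kcal per $1")

# How many times more calories does a junk-food dollar buy than a juice dollar?
ratio = foods["cookies / potato chips"] / foods["orange juice"]
print(f"A dollar of cookies buys roughly {ratio:.1f}x the calories of a dollar of juice.")
```

On those numbers, a dollar of cookies buys roughly six times as many calories as a dollar of orange juice, which is the gap the article's argument rests on.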

Sydney's Swelter Has a Climate Change Link, Scientists Say - 0 views

  •  
    Southeastern Australia has suffered through a series of brutal heat waves over the past two months, with temperatures reaching a scorching 113 degrees Fahrenheit in some parts of the state of New South Wales. "It was nothing short of awful," said Sarah Perkins-Kirkpatrick, of the Climate Change Research Center at the University of New South Wales, in Sydney.

What Does 'Cultural Appropriation' Mean? - The Atlantic - 1 views

  • Some people correctly perceive something like a frat party full of blackface as wrongheaded, file it under "cultural appropriation," and adopt the erroneous heuristic that any appropriation of a culture is wrongheaded. When the chef who staffs the dining hall at their college serves sushi, they see injustice where there is none. Conversely, other folks see a protest over sushi, perceive that it is absurd, see it filed under cultural appropriation, and adopt the bad heuristic that any grievance lodged under that heading is bullshit. Later, when their Facebook stream unearths a story about blackface headlined, "These Frat Boys Are Guilty of Cultural Appropriation," they erroneously conclude that nothing wrongheaded occurred. Perhaps they even ignorantly add a dismissive comment, exacerbating the canard that racial animus or dehumanization is a nonissue.

Breitbart's James Delingpole says reef bleaching is 'fake news', hits peak denial | Gra... - 0 views

  • It takes a very special person to label the photographed, documented, filmed and studied phenomenon of mass coral bleaching on the Great Barrier Reef “fake news”.
  • It also helps if you can hide inside the bubble of the hyper-partisan Breitbart media outlet, whose former boss is the US president’s chief strategist.
  • So our special person is the British journalist James Delingpole who, when he’s not denying the impacts of coral bleaching, is denying the science of human-caused climate change, which he says is “the biggest scam in the history of the world”.
    • dicindioha: oh dear...
  • ...8 more annotations...
  • When we talk about the reef dying, what we are talking about are the corals that form the reef’s structure – the things that when in a good state of health can be splendorous enough to support about 69,000 jobs in Queensland and add about $6bn to Australia’s economy every year.
  • The Great Barrier Reef has suffered mass coral bleaching three times – in 1998, 2002 and 2016 – with a fourth episode now unfolding. The cause is increasing ocean temperatures.
  • So it seems we are now at a stage where absolutely nothing is real unless you have seen it for yourself,
  • Senator Pauline Hanson and her One Nation climate science-denying colleagues tried to pull a similar stunt last year by taking a dive on a part of the reef that had escaped bleaching and then claiming this as proof that everything was OK everywhere else.
  • Corals bleach when they are exposed to abnormally high ocean temperatures for too long. Under stress, the corals expel the algae that give them their colour and most of their nutrients.
  • After the 2016 bleaching, a quarter of all corals on the reef, mostly located in the once “pristine” northern section, died before there was a chance for recovery.
  • Essentially, the study found the only measure that would give corals on the reef a fighting chance was to rapidly reduce greenhouse gas emissions.
  • Some commentators have suggested a key cause of the 2016 bleaching was the El Niño weather pattern that tends to deliver warmer global temperatures. But Hughes says that before 1998, the Great Barrier Reef went through countless El Niños without suffering the extensive mass bleaching episodes that are being seen, photographed, filmed and documented now.
  •  
    This frustrates me enormously. When there is photographic and scientific evidence of coral bleaching and of global warming's impact on the reef, I don't understand how people can say this is fake news. It seems the US, at least, will not be helping to fix this problem, but the whole world is at fault for it, and we should all be part of fixing it.