
Home/ TOK Friends/ Group items tagged prize


sissij

Super Mario Run's Not-So-Super Gender Politics - The New York Times - 0 views

  • Super Mario Run begins, as does almost every Super Mario title, with Princess Peach becoming a hostage who must be rescued by Mario. Just before her ritual kidnapping, Peach invites Mario to her castle and pledges to bake him a cake. Upon her rescue, she kisses Mario. The game also includes a second female character, Toadette, whose job is to wave a flag before and after a race, like a character from “Grease.”
  • But Super Mario Run relegates its female characters to positions of near helplessness. Peach and Toadette become playable only after you complete certain tasks, which makes the women in the game feel like prizes.
  • Still, lots of girls and women play video games. There are more women over 30 who play video games than boys under 18 who play, according to the industry’s lobbying arm, the Entertainment Software Association. A Pew Research Center survey published last year found that almost 60 percent of girls between the ages of 13 and 17 are gamers.
  • ...1 more annotation...
  • The knowledge that video games possess this power, that they allow us to adopt new identities and grant us new ways of seeing ourselves, is as old as Mario’s quest for his princess.
  • The gender stereotype is everywhere, even in the most popular games. I am astonished that I have become so numb to this kind of story. I played this game recently and didn't feel anything weird or strange, which shows how deeply this stereotype is planted in my brain. I subconsciously put this kind of story into the category of normal. What we feel is common or right might not be correct from another person's perspective. It is always mind-blowing to see how unusual or inappropriate something I think is normal can be. --Sissi (12/22/2016)
Emily Freilich

The Man Who Would Teach Machines to Think - James Somers - The Atlantic - 1 views

  • Douglas Hofstadter, the Pulitzer Prize–winning author of Gödel, Escher, Bach, thinks we've lost sight of what artificial intelligence really means. His stubborn quest to replicate the human mind.
  • “If somebody meant by artificial intelligence the attempt to understand the mind, or to create something human-like, they might say—maybe they wouldn’t go this far—but they might say this is some of the only good work that’s ever been done.”
  • Their operating premise is simple: the mind is a very unusual piece of software, and the best way to understand how a piece of software works is to write it yourself.
  • ...43 more annotations...
  • “It depends on what you mean by artificial intelligence.”
  • Computers are flexible enough to model the strange evolved convolutions of our thought, and yet responsive only to precise instructions. So if the endeavor succeeds, it will be a double victory: we will finally come to know the exact mechanics of our selves—and we’ll have made intelligent machines.
  • Ever since he was about 14, when he found out that his youngest sister, Molly, couldn’t understand language, because she “had something deeply wrong with her brain” (her neurological condition probably dated from birth, and was never diagnosed), he had been quietly obsessed by the relation of mind to matter.
  • How could consciousness be physical? How could a few pounds of gray gelatin give rise to our very thoughts and selves?
  • Consciousness, Hofstadter wanted to say, emerged via just the same kind of “level-crossing feedback loop.”
  • In 1931, the Austrian-born logician Kurt Gödel had famously shown how a mathematical system could make statements not just about numbers but about the system itself.
  • But then AI changed, and Hofstadter didn’t change with it, and for that he all but disappeared.
  • By the early 1980s, the pressure was great enough that AI, which had begun as an endeavor to answer yes to Alan Turing’s famous question, “Can machines think?,” started to mature—or mutate, depending on your point of view—into a subfield of software engineering, driven by applications.
  • Take Deep Blue, the IBM supercomputer that bested the chess grandmaster Garry Kasparov. Deep Blue won by brute force.
  • Hofstadter wanted to ask: Why conquer a task if there’s no insight to be had from the victory? “Okay,” he says, “Deep Blue plays very good chess—so what? Does that tell you something about how we play chess? No. Does it tell you about how Kasparov envisions, understands a chessboard?”
  • AI started working when it ditched humans as a model. That’s the thrust of the analogy: airplanes don’t flap their wings; why should computers think?
  • It’s a compelling point. But it loses some bite when you consider what we want: a Google that knows, in the way a human would know, what you really mean when you search for something
  • “Cognition is recognition,” he likes to say. He describes “seeing as” as the essential cognitive act: you see some lines as “an A,” you see a hunk of wood as “a table,” you see a meeting as “an emperor-has-no-clothes situation” and a friend’s pouting as “sour grapes”
  • How do you make a search engine that understands if you don’t know how you understand?
  • That’s what it means to understand. But how does understanding work?
  • analogy is “the fuel and fire of thinking,” the bread and butter of our daily mental lives.
  • there’s an analogy, a mental leap so stunningly complex that it’s a computational miracle: somehow your brain is able to strip any remark of the irrelevant surface details and extract its gist, its “skeletal essence,” and retrieve, from your own repertoire of ideas and experiences, the story or remark that best relates.
  • in Hofstadter’s telling, the story goes like this: when everybody else in AI started building products, he and his team, as his friend, the philosopher Daniel Dennett, wrote, “patiently, systematically, brilliantly,” way out of the light of day, chipped away at the real problem. “Very few people are interested in how human intelligence works,”
  • For more than 30 years, Hofstadter has worked as a professor at Indiana University at Bloomington, where he directs the Fluid Analogies Research Group, affectionately known as FARG.
  • The quick unconscious chaos of a mind can be slowed down on the computer, or rewound, paused, even edited. Parts of a program can be selectively isolated to see how it functions without them; parameters can be changed to see how performance improves or degrades. When the computer surprises you—whether by being especially creative or especially dim-witted—you can see exactly why.
  • The idea behind Candide, a machine-translation project out of IBM, was to start by admitting that the rules-based approach requires too deep an understanding of how language is produced; how semantics, syntax, and morphology work; and how words commingle in sentences and combine into paragraphs—to say nothing of understanding the ideas for which those words are merely conduits.
  • When you read Fluid Concepts and Creative Analogies: Computer Models of the Fundamental Mechanisms of Thought, which describes in detail this architecture and the logic and mechanics of the programs that use it, you wonder whether maybe Hofstadter got famous for the wrong book.
  • But very few people, even admirers of GEB, know about the book or the programs it describes. And maybe that’s because FARG’s programs are almost ostentatiously impractical. Because they operate in tiny, seemingly childish “microdomains.” Because there is no task they perform better than a human.
  • “The entire effort of artificial intelligence is essentially a fight against computers’ rigidity.”
  • “Nobody is a very reliable guide concerning activities in their mind that are, by definition, subconscious,” he once wrote. “This is what makes vast collections of errors so important. In an isolated error, the mechanisms involved yield only slight traces of themselves; however, in a large collection, vast numbers of such slight traces exist, collectively adding up to strong evidence for (and against) particular mechanisms.”
  • So IBM threw that approach out the window. What the developers did instead was brilliant, but so straightforward,
  • The technique is called “machine learning.” The goal is to make a device that takes an English sentence as input and spits out a French sentence
  • What you do is feed the machine English sentences whose French translations you already know. (Candide, for example, used 2.2 million pairs of sentences, mostly from the bilingual proceedings of Canadian parliamentary debates.)
  • By repeating this process with millions of pairs of sentences, you will gradually calibrate your machine, to the point where you’ll be able to enter a sentence whose translation you don’t know and get a reasonable result
  • Google Translate team can be made up of people who don’t speak most of the languages their application translates. “It’s a bang-for-your-buck argument,” Estelle says. “You probably want to hire more engineers instead” of native speakers.
  • But the need to serve 1 billion customers has a way of forcing the company to trade understanding for expediency. You don’t have to push Google Translate very far to see the compromises its developers have made for coverage, and speed, and ease of engineering. Although Google Translate captures, in its way, the products of human intelligence, it isn’t intelligent itself.
  • “Did we sit down when we built Watson and try to model human cognition?” Dave Ferrucci, who led the Watson team at IBM, pauses for emphasis. “Absolutely not. We just tried to create a machine that could win at Jeopardy.”
  • For Ferrucci, the definition of intelligence is simple: it’s what a program can do. Deep Blue was intelligent because it could beat Garry Kasparov at chess. Watson was intelligent because it could beat Ken Jennings at Jeopardy.
  • “There’s a limited number of things you can do as an individual, and I think when you dedicate your life to something, you’ve got to ask yourself the question: To what end? And I think at some point I asked myself that question, and what it came out to was, I’m fascinated by how the human mind works, it would be fantastic to understand cognition, I love to read books on it, I love to get a grip on it”—he called Hofstadter’s work inspiring—“but where am I going to go with it? Really what I want to do is build computer systems that do something.
  • Peter Norvig, one of Google’s directors of research, echoes Ferrucci almost exactly. “I thought he was tackling a really hard problem,” he told me about Hofstadter’s work. “And I guess I wanted to do an easier problem.”
  • Of course, the folly of being above the fray is that you’re also not a part of it
  • As our machines get faster and ingest more data, we allow ourselves to be dumber. Instead of wrestling with our hardest problems in earnest, we can just plug in billions of examples of them.
  • Hofstadter hasn’t been to an artificial-intelligence conference in 30 years. “There’s no communication between me and these people,” he says of his AI peers. “None. Zero. I don’t want to talk to colleagues that I find very, very intransigent and hard to convince of anything
  • Everything from plate tectonics to evolution—all those ideas, someone had to fight for them, because people didn’t agree with those ideas.
  • Academia is not an environment where you just sit in your bath and have ideas and expect everyone to run around getting excited. It’s possible that in 50 years’ time we’ll say, ‘We really should have listened more to Doug Hofstadter.’ But it’s incumbent on every scientist to at least think about what is needed to get people to understand the ideas.”
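The sentence-pair calibration described in the Candide passage above can be caricatured in a few lines. This is a deliberately tiny sketch, not Candide's actual statistical machinery (the IBM alignment models): it learns word translations only by counting how often English and French words co-occur across aligned pairs, and the sentence pairs, the frequency-normalized scoring, and the `best_translation` helper are all invented for illustration.

```python
from collections import Counter, defaultdict

# Aligned sentence pairs standing in for Candide's 2.2 million
# parliamentary-debate pairs. Entirely made up for this sketch.
pairs = [
    ("the house", "la maison"),
    ("the blue house", "la maison bleue"),
    ("the flower", "la fleur"),
]

cooc = defaultdict(Counter)   # cooc[english][french] = co-occurrence count
fr_freq = Counter()           # overall French word frequency
for en, fr in pairs:
    fr_words = fr.split()
    fr_freq.update(fr_words)
    for e in en.split():
        cooc[e].update(fr_words)

def best_translation(e):
    # Score each candidate by co-occurrence normalized by its overall
    # frequency, so ubiquitous words like "la" don't win every time.
    return max(cooc[e], key=lambda f: cooc[e][f] / fr_freq[f])

print(best_translation("house"))   # maison
print(best_translation("flower"))  # fleur
```

Even this toy shows the article's point: nothing in the program "understands" French or English; more pairs simply sharpen the counts.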
Javier E

Facebook Has 50 Minutes of Your Time Each Day. It Wants More. - The New York Times - 0 views

  • Fifty minutes. That’s the average amount of time, the company said, that users spend each day on its Facebook, Instagram and Messenger platforms
  • there are only 24 hours in a day, and the average person sleeps for 8.8 of them. That means more than one-sixteenth of the average user’s waking time is spent on Facebook.
  • That’s more than any other leisure activity surveyed by the Bureau of Labor Statistics, with the exception of watching television programs and movies (an average per day of 2.8 hours)
  • ...19 more annotations...
  • It’s more time than people spend reading (19 minutes); participating in sports or exercise (17 minutes); or social events (four minutes). It’s almost as much time as people spend eating and drinking (1.07 hours).
  • the average time people spend on Facebook has gone up — from around 40 minutes in 2014 — even as the number of monthly active users has surged. And that’s just the average. Some users must be spending many hours a day on the site,
  • time has become the holy grail of digital media.
  • Time is the best measure of engagement, and engagement correlates with advertising effectiveness. Time also increases the supply of impressions that Facebook can sell, which brings in more revenue (a 52 percent increase last quarter to $5.4 billion).
  • And time enables Facebook to learn more about its users — their habits and interests — and thus better target its ads. The result is a powerful network effect that competitors will be hard pressed to match.
  • the only one that comes close is Alphabet’s YouTube, where users spent an average of 17 minutes a day on the site. That’s less than half the 35 minutes a day users spent on Facebook
  • ComScore reported that television viewing (both live and recorded) dropped 2 percent last year, and it said younger viewers in particular are abandoning traditional live television. People ages 18-34 spent just 47 percent of their viewing time on television screens, and 40 percent on mobile devices.
  • People spending the most time on Facebook also tend to fall into the prized 18-to-34 demographic sought by advertisers.
  • Users spent an average of nine minutes on all of Yahoo’s sites, two minutes on LinkedIn and just one minute on Twitter
  • What aren’t Facebook users doing during the 50 minutes they spend there? Is it possibly interfering with work (and productivity), or, in the case of young people, studying and reading?
  • While the Bureau of Labor Statistics surveys nearly every conceivable time-occupying activity (even fencing and spelunking), it doesn’t specifically tally the time spent on social media, both because the activity may have multiple purposes — both work and leisure — and because people often do it at the same time they are ostensibly engaged in other activities
  • The closest category would be “computer use for leisure,” which has grown from eight minutes in 2006, when the bureau began collecting the data, to 14 minutes in 2014, the most recent survey. Or perhaps it would be “socializing and communicating with others,” which slipped from 40 minutes to 38 minutes.
  • But time spent on most leisure activities hasn’t changed much in those eight years of the bureau’s surveys. Time spent reading dropped from an average of 22 minutes to 19 minutes. Watching television and movies increased from 2.57 hours to 2.8. Average time spent working declined from 3.4 hours to 3.25. (Those hours seem low because much of the population, which includes both young people and the elderly, does not work.)
  • The bureau’s numbers, since they cover the entire population, may be too broad to capture important shifts among important demographic groups
  • “You hear a narrative that young people are fleeing Facebook. The data show that’s just not true. Younger users have a wider appetite for social media, and they spend a lot of time on multiple networks. But they spend more time on Facebook by a wide margin.”
  • Among those 55 and older, 70 percent of their viewing time was on television, according to comScore. So among young people, much social media time may be coming at the expense of traditional television.
  • comScore’s data suggests that people are spending on average just six to seven minutes a day using social media on their work computers. “I don’t think Facebook is displacing other activity,” he said. “People use it during downtime during the course of their day, in the elevator, or while commuting, or waiting.
  • Facebook, naturally, is busy cooking up ways to get us to spend even more time on the platform
  • A crucial initiative is improving its News Feed, tailoring it more precisely to the needs and interests of its users, based on how long people spend reading particular posts. For people who demonstrate a preference for video, more video will appear near the top of their news feed. The more time people spend on Facebook, the more data they will generate about themselves, and the better the company will get at the task.
Javier E

Google's new media apocalypse: How the search giant wants to accelerate the end of the ... - 0 views

  • Google is announcing that it wants to cut out the middleman—that is to say, other websites—and serve you content within its own lovely little walled garden. That sound you just heard was a bunch of media publishers rushing to book an extra appointment with their shrink.
  • Back when search, and not social media, ruled the internet, Google was the sun around which the news industry orbited. Getting to the top of Google’s results was the key that unlocked buckets of page views. Outlet after outlet spent countless hours trying to figure out how to game Google’s prized, secretive algorithm. Whole swaths of the industry were killed instantly if Google tweaked the algorithm.
  • Facebook is now the sun. Facebook is the company keeping everyone up at night. Facebook is the place shaping how stories get chosen, how they get written, how they are packaged and how they show up on its site. And Facebook does all of this with just as much secrecy and just as little accountability as Google did.
  • ...3 more annotations...
  • Facebook just opened up its Instant Articles feature to all publishers. The feature allows external outlets to publish their content directly onto Facebook’s platform, eliminating that pesky journey to their actual website. They can either place their own ads on the content or join a revenue-sharing program with Facebook. Facebook has touted this plan as one which provides a better user experience and has noted the ability for publishers to create ads on the platform as well.
  • The benefit to Facebook is obvious: It gets to keep people inside its house. They don’t have to leave for even a second. The publisher essentially has to accept this reality, sigh about the gradual death of websites and hope that everything works out on the financial side.
  • It’s all part of a much bigger story: that of how the internet, that supposed smasher of gates and leveler of playing fields, has coalesced around a mere handful of mega-giants in the space of just a couple of decades. The gates didn’t really come down. The identities of the gatekeepers just changed. Google, Facebook, Apple, Amazon
Javier E

The Amygdala Made Me Do It - NYTimes.com - 1 views

  • It’s the invasion of the Can’t-Help-Yourself books. Unlike most pop self-help books, these are about life as we know it — the one you can change, but only a little, and with a ton of work. Professor Kahneman, who won the Nobel Prize in economic science a decade ago, has synthesized a lifetime’s research in neurobiology, economics and psychology. “Thinking, Fast and Slow” goes to the heart of the matter: How aware are we of the invisible forces of brain chemistry, social cues and temperament that determine how we think and act?
  • The choices we make in day-to-day life are prompted by impulses lodged deep within the nervous system. Not only are we not masters of our fate; we are captives of biological determinism. Once we enter the portals of the strange neuronal world known as the brain, we discover that — to put the matter plainly — we have no idea what we’re doing.
  • Mr. Duhigg’s thesis is that we can’t change our habits, we can only acquire new ones. Alcoholics can’t stop drinking through willpower alone: they need to alter behavior — going to A.A. meetings instead of bars, for instance — that triggers the impulse to drink.
  • ...1 more annotation...
  • they’re full of stories about people who accomplished amazing things in life by, in effect, rewiring themselves
Javier E

Eric Kandel's Visions - The Chronicle Review - The Chronicle of Higher Education - 0 views

  • Judith, "barely clothed and fresh from the seduction and slaying of Holofernes, glows in her voluptuousness. Her hair is a dark sky between the golden branches of Assyrian trees, fertility symbols that represent her eroticism. This young, ecstatic, extravagantly made-up woman confronts the viewer through half-closed eyes in what appears to be a reverie of orgasmic rapture," writes Eric Kandel in his new book, The Age of Insight. Wait a minute. Writes who? Eric Kandel, the Nobel-winning neuroscientist who's spent most of his career fixated on the generously sized neurons of sea snails
  • Kandel goes on to speculate, in a bravura paragraph a few hundred pages later, on the exact neurochemical cognitive circuitry of the painting's viewer:
  • "At a base level, the aesthetics of the image's luminous gold surface, the soft rendering of the body, and the overall harmonious combination of colors could activate the pleasure circuits, triggering the release of dopamine. If Judith's smooth skin and exposed breast trigger the release of endorphins, oxytocin, and vasopressin, one might feel sexual excitement. The latent violence of Holofernes's decapitated head, as well as Judith's own sadistic gaze and upturned lip, could cause the release of norepinephrine, resulting in increased heart rate and blood pressure and triggering the fight-or-flight response. In contrast, the soft brushwork and repetitive, almost meditative, patterning may stimulate the release of serotonin. As the beholder takes in the image and its multifaceted emotional content, the release of acetylcholine to the hippocampus contributes to the storing of the image in the viewer's memory. What ultimately makes an image like Klimt's 'Judith' so irresistible and dynamic is its complexity, the way it activates a number of distinct and often conflicting emotional signals in the brain and combines them to produce a staggeringly complex and fascinating swirl of emotions."
  • ...18 more annotations...
  • His key findings on the snail, for which he shared the 2000 Nobel Prize in Physiology or Medicine, showed that learning and memory change not the neuron's basic structure but rather the nature, strength, and number of its synaptic connections. Further, through focus on the molecular biology involved in a learned reflex like Aplysia's gill retraction, Kandel demonstrated that experience alters nerve cells' synapses by changing their pattern of gene expression. In other words, learning doesn't change what neurons are, but rather what they do.
  • In Search of Memory (Norton), Kandel offered what sounded at the time like a vague research agenda for future generations in the budding field of neuroaesthetics, saying that the science of memory storage lay "at the foothills of a great mountain range." Experts grasp the "cellular and molecular mechanisms," he wrote, but need to move to the level of neural circuits to answer the question, "How are internal representations of a face, a scene, a melody, or an experience encoded in the brain?
  • Since giving a talk on the matter in 2001, he has been piecing together his own thoughts in relation to his favorite European artists
  • The field of neuroaesthetics, says one of its founders, Semir Zeki, of University College London, is just 10 to 15 years old. Through brain imaging and other studies, scholars like Zeki have explored the cognitive responses to, say, color contrasts or ambiguities of line or perspective in works by Titian, Michelangelo, Cubists, and Abstract Expressionists. Researchers have also examined the brain's pleasure centers in response to appealing landscapes.
  • it is fundamental to an understanding of human cognition and motivation. Art isn't, as Kandel paraphrases a concept from the late philosopher of art Denis Dutton, "a byproduct of evolution, but rather an evolutionary adaptation—an instinctual trait—that helps us survive because it is crucial to our well-being." The arts encode information, stories, and perspectives that allow us to appraise courses of action and the feelings and motives of others in a palatable, low-risk way.
  • "as far as activity in the brain is concerned, there is a faculty of beauty that is not dependent on the modality through which it is conveyed but which can be activated by at least two sources—musical and visual—and probably by other sources as well." Specifically, in this "brain-based theory of beauty," the paper says, that faculty is associated with activity in the medial orbitofrontal cortex.
  • It also enables Kandel—building on the work of Gombrich and the psychoanalyst and art historian Ernst Kris, among others—to compare the painters' rendering of emotion, the unconscious, and the libido with contemporaneous psychological insights from Freud about latent aggression, pleasure and death instincts, and other primal drives.
  • although Kandel considers The Age of Insight to be more a work of intellectual history than of science, the book summarizes centuries of research on perception. And so you'll find, in those hundreds of pages between Kandel's introduction to Klimt's "Judith" and the neurochemical cadenza about the viewer's response to it, dossiers on vision as information processing; the brain's three-dimensional-space mapping and its interpretations of two-dimensional renderings; face recognition; the mirror neurons that enable us to empathize and physically reflect the affect and intentions we see in others; and many related topics. Kandel elsewhere describes the scientific evidence that creativity is nurtured by spells of relaxation, which foster a connection between conscious and unconscious cognition.
  • The author of a classic textbook on neuroscience, he seems here to have written a layman's cognition textbook wrapped within a work of art history.
  • "our initial response to the most salient features of the paintings of the Austrian Modernists, like our response to a dangerous animal, is automatic. ... The answer to James's question of how an object simply perceived turns into an object emotionally felt, then, is that the portraits are never objects simply perceived. They are more like the dangerous animal at a distance—both perceived and felt."
  • If imaging is key to gauging therapeutic practices, it will be key to neuroaesthetics as well, Kandel predicts—a broad, intense array of "imaging experiments to see what happens with exaggeration, distorted faces, in the human brain and the monkey brain," viewers' responses to "mixed eroticism and aggression," and the like.
  • while the visual-perception literature might be richer at the moment, there's no reason that neuroaesthetics should restrict its emphasis to the purely visual arts at the expense of music, dance, film, and theater.
  • Kandel views the Expressionists' art through the powerful multiple lenses of turn-of-the-century Vienna's cultural mores and psychological insights. But then he refracts them further, through later discoveries in cognitive science. He seeks to reassure those who fear that the empirical and chemical will diminish the paintings' poetic power. "In art, as in science," he writes, "reductionism does not trivialize our perception—of color, light, and perspective—but allows us to see each of these components in a new way. Indeed, artists, particularly modern artists, have intentionally limited the scope and vocabulary of their expression to convey, as Mark Rothko and Ad Reinhardt do, the most essential, even spiritual ideas of their art."
  • Zeki's message to art historians, aesthetic philosophers, and others who chafe at that idea is twofold. The more diplomatic pitch is that neuroaesthetics is different, complementary, and not oppositional to other forms of arts scholarship. But "the stick," as he puts it, is that if arts scholars "want to be taken seriously" by neurobiologists, they need to take advantage of the discoveries of the past half-century. If they don't, he says, "it's a bit like the guys who said to Galileo that we'd rather not look through your telescope."
  • Matthews, a co-author of The Bard on the Brain: Understanding the Mind Through the Art of Shakespeare and the Science of Brain Imaging (Dana Press, 2003), seems open to the elucidations that science and the humanities can cast on each other. The neural pathways of our aesthetic responses are "good explanations," he says. But "does one [type of] explanation supersede all the others? I would argue that they don't, because there's a fundamental disconnection still between ... explanations of neural correlates of conscious experience and conscious experience" itself.
  • There are, Matthews says, "certain kinds of problems that are fundamentally interesting to us as a species: What is love? What motivates us to anger?" Writers put their observations on such matters into idiosyncratic stories, psychologists conceive their observations in a more formalized framework, and neuroscientists like Zeki monitor them at the level of functional changes in the brain. All of those approaches to human experience "intersect," Matthews says, "but no one of them is the explanation."
  • "Conscious experience," he says, "is something we cannot even interrogate in ourselves adequately. What we're always trying to do in effect is capture the conscious experience of the last moment. ... As we think about it, we have no way of capturing more than one part of it."
  • Kandel sees art and art history as "parent disciplines" and psychology and brain science as "antidisciplines," to be drawn together in an E.O. Wilson-like synthesis toward "consilience as an attempt to open a discussion between restricted areas of knowledge." Kandel approvingly cites Stephen Jay Gould's wish for "the sciences and humanities to become the greatest of pals ... but to keep their ineluctably different aims and logics separate as they ply their joint projects and learn from each other."
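Kandel's central finding above, that learning alters the strength of synaptic connections rather than the neurons themselves, can be caricatured in a few lines of code. This is a minimal sketch under obvious simplifying assumptions (a plain Hebbian update with invented activity values and learning rate), not a model of Aplysia's gill reflex or of Kandel's molecular results.

```python
# Toy Hebbian learning: "learning" changes only the connection weight
# between two model neurons; the neurons themselves are untouched.
# All numbers here (initial weight, rate, activities) are invented.

def hebbian_step(w, pre, post, rate=0.05):
    # Strengthen the synapse in proportion to coincident
    # pre- and post-synaptic activity.
    return w + rate * pre * post

w = 0.1
# Repeated paired firing strengthens the connection...
for _ in range(20):
    w = hebbian_step(w, pre=1.0, post=1.0)
print(round(w, 2))  # 1.1

# ...while one-sided activity leaves the weight unchanged.
w2 = hebbian_step(0.1, pre=1.0, post=0.0)
print(w2)  # 0.1
```

The point of the cartoon is the one Kandel makes: what the units are never changes; what changes is what they do to each other.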
Javier E

Why It's OK to Let Apps Make You a Better Person - Evan Selinger - Technology - The Atl... - 0 views

  • one theme emerges from the media coverage of people's relationships with our current set of technologies: Consumers want digital willpower. App designers in touch with the latest trends in behavioral modification--nudging, the quantified self, and gamification--and good old-fashioned financial incentive manipulation, are tackling weakness of will. They're harnessing the power of payouts, cognitive biases, social networking, and biofeedback. The quantified self becomes the programmable self.
  • the trend still has multiple interesting dimensions
  • Individuals are turning ever more aspects of their lives into managerial problems that require technological solutions. We have access to an ever-increasing array of free and inexpensive technologies that harness incredible computational power that effectively allows us to self-police behavior everywhere we go. As pervasiveness expands, so does trust.
  • ...20 more annotations...
  • Some embrace networked, data-driven lives and are comfortable volunteering embarrassing, real time information about what we're doing, whom we're doing it with, and how we feel about our monitored activities.
  • Put it all together and we can see that our conception of what it means to be human has become "design space." We're now Humanity 2.0, primed for optimization through commercial upgrades. And today's apps are more harbinger than endpoint.
  • philosophers have had much to say about the enticing and seemingly inevitable dispersion of technological mental prosthetics that promise to substitute or enhance some of our motivational powers.
  • beyond the practical issues lie a constellation of central ethical concerns.
  • they should cause us to pause as we think about a possible future that significantly increases the scale and effectiveness of willpower-enhancing apps. Let's call this hypothetical future Digital Willpower World and characterize the ethical traps we're about to discuss as potential general pitfalls
  • it is antithetical to the ideal of " resolute choice." Some may find the norm overly perfectionist, Spartan, or puritanical. However, it is not uncommon for folks to defend the idea that mature adults should strive to develop internal willpower strong enough to avoid external temptations, whatever they are, and wherever they are encountered.
  • In part, resolute choosing is prized out of concern for consistency, as some worry that lapse of willpower in any context indicates a generally weak character.
  • Fragmented selves behave one way while under the influence of digital willpower, but another when making decisions without such assistance. In these instances, inconsistent preferences are exhibited and we risk underestimating the extent of our technological dependency.
  • It simply means that when it comes to digital willpower, we should be on our guard to avoid confusing situational with integrated behaviors.
  • the problem of inauthenticity, a staple of the neuroethics debates, might arise. People might start asking themselves: Has the problem of fragmentation gone away only because devices are choreographing our behavior so powerfully that we are no longer in touch with our so-called real selves -- the selves who used to exist before Digital Willpower World was formed?
  • Infantalized subjects are morally lazy, quick to have others take responsibility for their welfare. They do not view the capacity to assume personal responsibility for selecting means and ends as a fundamental life goal that validates the effort required to remain committed to the ongoing project of maintaining willpower and self-control.
  • Michael Sandel's Atlantic essay, "The Case Against Perfection." He notes that technological enhancement can diminish people's sense of achievement when their accomplishments become attributable to human-technology systems and not an individual's use of human agency.
  • Borgmann worries that this environment, which habituates us to be on auto-pilot and delegate deliberation, threatens to harm the powers of reason, the most central component of willpower (according to the rationalist tradition).
  • In several books, including Technology and the Character of Contemporary Life, he expresses concern about technologies that seem to enhance willpower but only do so through distraction. Borgmann's paradigmatic example of the non-distracted, focally centered person is a serious runner. This person finds the practice of running maximally fulfilling, replete with the rewarding "flow" that can only come when mind/body and means/ends are unified, while skill gets pushed to the limit.
  • Perhaps the very conception of a resolute self was flawed. What if, as psychologist Roy Baumeister suggests, willpower is more a "staple of folk psychology" than a real way of thinking about our brain processes?
  • novel approaches suggest the will is a flexible mesh of different capacities and cognitive mechanisms that can expand and contract, depending on the agent's particular setting and needs. Contrary to the traditional view that identifies the unified and cognitively transparent self as the source of willed actions, the new picture embraces a rather diffused, extended, and opaque self who is often guided by irrational trains of thought. What actually keeps the self and its will together are the given boundaries offered by biology, a coherent self narrative created by shared memories and experiences, and society. If this view of the will as an expanding and contracting system with porous and dynamic boundaries is correct, then it might seem that the new motivating technologies and devices can only increase our reach and further empower our willing selves.
  • "It's a mistake to think of the will as some interior faculty that belongs to an individual--the thing that pushes the motor control processes that cause my action," Gallagher says. "Rather, the will is both embodied and embedded: social and physical environment enhance or impoverish our ability to decide and carry out our intentions; often our intentions themselves are shaped by social and physical aspects of the environment."
  • It makes perfect sense to think of the will as something that can be supported or assisted by technology. Technologies, like environments and institutions can facilitate action or block it. Imagine I have the inclination to go to a concert. If I can get my ticket by pressing some buttons on my iPhone, I find myself going to the concert. If I have to fill out an application form and carry it to a location several miles away and wait in line to pick up my ticket, then forget it.
  • Perhaps the best way forward is to put a digital spin on the Socratic dictum of knowing myself and submit to the new freedom: the freedom of consuming digital willpower to guide me past the sirens.
Javier E

How The Internet Enables Lies - The Dish | By Andrew Sullivan - The Daily Beast - 1 views

  • the ease of self-publishing and search afforded by the Internet, along with a growing skeptical stance toward scientific expertise, has given the anti-vaccination movement a significant boost
  • She regularly shares her "knowledge" about vaccination with her nearly half-million Twitter followers. This is the kind of online influence that Nobel Prize-winning scientists can only dream of; Richard Dawkins, perhaps the most famous working scientist, has only 300,000 Twitter followers.
Duncan H

Other People's Suffering - NYTimes.com - 0 views

  • members of the upper class are more likely than others to behave unethically, to lie during negotiations, to drive illegally and to cheat when competing for a prize. “Greed is a robust determinant of unethical behavior,” the authors conclude. “Relative to lower-class individuals, individuals from upper-class backgrounds behaved more unethically in both naturalistic and laboratory settings.”
  • Our findings suggest that when a person is suffering, upper-class individuals perceive these signals less well on average, consistent with other findings documenting reduced empathic accuracy in upper-class individuals (Kraus et al., 2010). Taken together, these findings suggest that upper-class individuals may underestimate the distress and suffering in their social environments.
  • each participant was assigned to listen, face to face, from two feet away, to someone else describing real personal experiences of suffering and distress. The listeners’ responses were measured two ways, first by self-reported levels of compassion and second by electrocardiogram readings to determine the intensity of their emotional response. The participants all took a test known as the “sense of power” scale, ranking themselves on such personal strengths and weaknesses as ‘‘I can get people to listen to what I say’’ and ‘‘I can get others to do what I want,” as well as ‘‘My wishes do not carry much weight’’ and ‘‘Even if I voice them, my views have little sway,’’ which are reverse scored. The findings were noteworthy, to say the least. For “low-power” listeners, compassion levels shot up as the person describing suffering became more distressed. Exactly the opposite happened for “high-power” listeners: their compassion dropped as distress rose.
  • ...7 more annotations...
  • Who fits the stereotype of the rich and powerful described in this research? Mitt Romney. Empathy: “I’m not concerned about the very poor.” Compassion: “I like being able to fire people who provide services to me.” Sympathy for the disadvantaged: My wife “drives a couple of Cadillacs.” Willingness to lie in negotiations: “I was a severely conservative Republican governor.”
  • 48 percent described the Democratic Party as “weak,” compared to 28 percent who described the Republican Party that way. Conversely, 50 percent said the Republican Party is “cold hearted,” compared to 30 percent who said that was true of the Democrats.
  • This is the war that is raging throughout America. It is between conservatives, who emphasize personal responsibility and achievement, against liberals, who say the government must take from the wealthy and give to the poor. So it will be interesting this week to see if President Obama can rally the country to support his vision of a strong social compact. He has compassion on his side. Few Americans want to see their fellow citizens suffer. But the president does have that fiscal responsibility issue haunting him because the country remains in dire trouble.
  • For power holders, the world is viewed through an instrumental lens, and approach is directed toward those individuals who populate the useful parts of the landscape. Our results suggest that power not only channels its possessor’s energy toward goal completion but also targets and attempts to harness the energy of useful others. Thus, power appears to be a great facilitator of goal pursuit through a combination of intrapersonal and interpersonal processes. The nature of the power holder’s goals and interpersonal relationships ultimately determine how power is harnessed and what is accomplished in the end.
  • Republicans recognize the political usefulness of objectification, capitalizing on “compassion fatigue,” or the exhaustion of empathy, among large swathes of the electorate who are already stressed by the economic collapse of 2008, high levels of unemployment, an epidemic of foreclosures, stagnant wages and a hyper-competitive business arena.
  • Republican debates provided further evidence of compassion fatigue when audiences cheered the record-setting use of the death penalty in Texas and applauded the prospect of a gravely ill pauper who, unable to pay medical fees, was allowed to die. Even Rick Santorum, who has been described by the National Review as holding “unstinting devotion to human dignity” and as fluent in “the struggles of the working class,” wants to slash aid to the poor. At a Feb. 21 gathering of 500 voters in Maricopa County, Ariz., Santorum brought the audience to its feet as he declared: We need to take everything from food stamps to Medicaid to the housing programs to education and training programs, we need to cut them, cap them, freeze them, send them to the states, say that there has to be a time limit and a work requirement, and be able to give them the flexibility to do those programs here at the state level.
  • President Obama has a substantial advantage this year because he does not have a primary challenger, which frees him from the need to emphasize his advocacy for the disempowered — increasing benefits or raising wages for the poor. This allows him to pick and choose the issues he wants to address. At the same time, compassion fatigue may make it easier for the Republican nominee to overcome the liabilities stemming from his own primary rhetoric, to reach beyond the core of the party to white centrist voters less openly drawn to hard-edged conservatism. With their capacity for empathy frayed by a pervasive sense of diminishing opportunity and encroaching shortfall, will these voters once again become dependable Republicans in 2012?
  •  
    Do you agree with Edsall? I think he is definitely taking an anti-Republican stance, but the findings are interesting.
kenjiendo

Impact Factor and the Future of Medical Journals - Haider Javed Warraich - The Atlantic - 0 views

    • kenjiendo
       
      An article highlighting recent criticism for the accuracy of published Medical Journals, origins of the issue, and possible solutions for the future. 
  • Impact Factor and the Future of Medical Journals Some research publications are getting away from flawed measures of influence that make it easy to game the system.
  • This year's Nobel Prize winner in physiology, Randy Scheckman, announced his decision to boycott the three major “luxury” journals: Science, Nature, and Cell.
  • ...19 more annotations...
  • medical journals are very rigid
  • impact factor, defined as the number of citations divided by the number of papers published in the journal, which is a measure to convey the influence of journals and the research they carry.
  • Journals employ several strategies to artificially raise the impact factor
  • caught trying to induce authors to increase the number of citations
  • cite each other’s articles
  • citations are barely a reflection of the quality of the research and that the impact factor is easily manipulated
  • shing’s growth is actually one of its g
  • overwhelmed with the avalanche of information
  • current system of peer-review, which originated in the 18th century, is now stressed
    • kenjiendo
       
      An example from our reading from U6-9
  • future of the medical journal was, he summed it up in just one word: “Digital.”
  • more innovative approaches
  • PLOS One, which provides individual article metrics to anyone who accesses the article.
  • Instead of letting the reputation of the journal decide the impact of its papers, PLOS One provides information about the influence of the article on a more granular level.
  • future of medical publishing is a democratic one
  • Smart software will decide based on largely open access journals which papers will be of most interest to a particular reader.
  • Biology Direct, a journal that provides open peer review that is available for readers to read along with the article, with or without changes suggested by the reviewers.
Javier E

Creativity Becomes an Academic Discipline - NYTimes.com - 0 views

  • Once considered the product of genius or divine inspiration, creativity — the ability to spot problems and devise smart solutions — is being recast as a prized and teachable skill.
  • “The reality is that to survive in a fast-changing world you need to be creative,”
  • “That is why you are seeing more attention to creativity at universities,” he says. “The marketplace is demanding it.”
  • ...16 more annotations...
  • Creativity moves beyond mere synthesis and evaluation and is, he says, “the higher order skill.” This has not been a sudden development. Nearly 20 years ago “creating” replaced “evaluation” at the top of Bloom’s Taxonomy of learning objectives. In 2010 “creativity” was the factor most crucial for success found in an I.B.M. survey of 1,500 chief executives in 33 industries. These days “creative” is the most used buzzword in LinkedIn profiles two years running.
  • The method, which is used in Buffalo State classrooms, has four steps: clarifying, ideating, developing and implementing. People tend to gravitate to particular steps, suggesting their primary thinking style.
  • What’s igniting campuses, though, is the conviction that everyone is creative, and can learn to be more so.
  • Just about every pedagogical toolbox taps similar strategies, employing divergent thinking (generating multiple ideas) and convergent thinking (finding what works). The real genius, of course, is in the how.
  • as content knowledge evolves at lightning speed, educators are talking more and more about “process skills,” strategies to reframe challenges and extrapolate and transform information, and to accept and deal with ambiguity.
  • Ideating is brainstorming and calls for getting rid of your inner naysayer to let your imagination fly.
  • Clarifying — asking the right question — is critical because people often misstate or misperceive a problem. “If you don’t have the right frame for the situation, it’s difficult to come up with a breakthrough,
  • Developing is building out a solution, and maybe finding that it doesn’t work and having to start over
  • Implementing calls for convincing others that your idea has value.
  • “the frequency and intensity of failures is an implicit principle of the course. Getting into a creative mind-set involves a lot of trial and error.”
  • His favorite assignments? Construct a résumé based on things that didn’t work out and find the meaning and influence these have had on your choices.
  • “Examine what in the culture is preventing you from creating something new or different. And what is it like to look like a fool because a lot of things won’t work out and you will look foolish? So how do you handle that?”
  • Because academics run from failure, Mr. Keywell says, universities are “way too often shapers of formulaic minds,” and encourage students to repeat and internalize fail-safe ideas.
  • “The new people who will be creative will sit at the juxtaposition of two or more fields,” she says. When ideas from different fields collide, Dr. Cramond says, fresh ones are generated.
  • Basic creativity tools used at the Torrance Center include thinking by analogy, looking for and making patterns, playing, literally, to encourage ideas, and learning to abstract problems to their essence.
  • students explore definitions of creativity, characteristics of creative people and strategies to enhance their own creativity. These include rephrasing problems as questions, learning not to instinctively shoot down a new idea (first find three positives), and categorizing problems as needing a solution that requires either action, planning or invention.
Javier E

"Breaking Bad" By Niccolo Machiavelli « The Dish - 0 views

  • If a man is truly a man through force and fraud and nerve, then Walter becomes the man he always wanted to be. He trounces every foe; he gains a huge fortune; he dies a natural death. Compared with being a high school chemistry teacher? Niccolo would scoff at the comparison. “I did it for me.”
  • Walt is consumed all along by justified resentment of the success others stole from him, and by a rage that his superior mind was out-foxed by unscrupulous colleagues. He therefore lived and died his final years for human honor – for what Machiavelli calls virtu, a caustic, brutal inversion of Christian virtue
  • his skills were eventually proven beyond any measure in ways that would never have happened if he had never broken bad. And breaking bad cannot mean putting a limit on what you are capable of doing. What Machiavelli insisted upon was that a successful power-broker know how to be “altogether bad.”
  • ...8 more annotations...
  • the cost-benefit analysis of “breaking bad” when the alternative is imminently “dying alone” is rigged in favor of the very short term, i.e. zero-sum evil. If Walt had had to weigh a long, unpredictable lifetime of unending fear and constant danger for his family and himself, he would have stopped cooking meth.
  • was he happy? Yes, but in a way that never really reflects any inner peace. He is happy in a way that all millionaires and tyrants are happy.
  • Machiavelli differs from later realists like Hobbes—and more contemporary “neorealists” like the late Kenneth Waltz—in recognizing that human agency matters as much as the structural fact of international anarchy in determining both foreign policy behavior and ultimate outcomes in world politics.
  • It should be taught because it really does convey the egoist appeal of evil, of acting ruthlessly in the world
  • The benefits only work if your life is nasty, brutish and short. The costs are seen in the exhausted, broken eyes of Skyler, the betrayal of an only painfully faithful son, the murder of a brother-in-law, the grisly massacre of dozens, the endless nervous need to be on the alert, to run and hide and lie and lie and lie again, until life itself becomes merely a means to achieve temporary security.
  • Breaking Bad should be taught alongside Machiavelli – as a riveting companion piece.
  • a leader’s choices can have a pivotal impact on politics, both domestic and international.
  • Though fortune be capricious and history contingent, the able leader may shape his fate and that of his state through the exercise of virtu. This is not to be mistaken for “virtue”, as defined by Christian moral teaching (implying integrity, charity, humility, and the like). Rather, it denotes the human qualities prized in classical antiquity, including knowledge, courage, cunning, pride, and strength.
Javier E

Guess Who Doesn't Fit In at Work - NYTimes.com - 0 views

  • ACROSS cultures and industries, managers strongly prize “cultural fit” — the idea that the best employees are like-minded.
  • One recent survey found that more than 80 percent of employers worldwide named cultural fit as a top hiring priority.
  • When done carefully, selecting new workers this way can make organizations more productive and profitable.
  • ...18 more annotations...
  • In the process, fit has become a catchall used to justify hiring people who are similar to decision makers and rejecting people who are not.
  • The concept of fit first gained traction in the 1980s. The original idea was that if companies hired individuals whose personalities and values — and not just their skills — meshed with an organization’s strategy, workers would feel more attached to their jobs, work harder and stay longer.
  • in many organizations, fit has gone rogue. I saw this firsthand while researching the hiring practices of the country’s top investment banks, management consultancies and law firms. I interviewed 120 decision makers and spent nine months observing
  • While résumés (and connections) influenced which applicants made it into the interview room, interviewers’ perceptions of fit strongly shaped who walked out with job offers.
  • Crucially, though, for these gatekeepers, fit was not about a match with organizational values. It was about personal fit. In these time- and team-intensive jobs, professionals at all levels of seniority reported wanting to hire people with whom they enjoyed hanging out and could foresee developing close relationships with
  • To judge fit, interviewers commonly relied on chemistry.
  • Many used the “airport test.” As a managing director at an investment bank put it, “Would I want to be stuck in an airport in Minneapolis in a snowstorm with them?”
  • interviewers were primarily interested in new hires whose hobbies, hometowns and biographies matched their own. Bonding over rowing college crew, getting certified in scuba, sipping single-malt Scotches in the Highlands or dining at Michelin-starred restaurants was evidence of fit; sharing a love of teamwork or a passion for pleasing clients was not
  • it has become a common feature of American corporate culture. Employers routinely ask job applicants about their hobbies and what they like to do for fun, while a complementary self-help industry informs white-collar job seekers that chemistry, not qualifications, will win them an offer.
  • Selection based on personal fit can keep demographic and cultural diversity low
  • In the elite firms I studied, the types of shared experiences associated with fit typically required large investments of time and money.
  • Class-biased definitions of fit are one reason investment banks, management consulting firms and law firms are dominated by people from the highest socioeconomic backgrounds
  • Also, whether the industry is finance, high-tech or fashion, a good fit in most American corporations still tends to be stereotypically masculine.
  • Perhaps most important, it is easy to mistake rapport for skill. Just as they erroneously believe that they can accurately tell when someone is lying, people tend to be overly confident in their ability to spot talent. Unstructured interviews, which are the most popular hiring tools for American managers and the primary way they judge fit, are notoriously poor predictors of job performance.
  • Organizations that use cultural fit for competitive advantage tend to favor concrete tools like surveys and structured interviews that systematically test behaviors associated with increased performance and employee retention.
  • For managers who want to use cultural fit in a more productive way, I have several suggestions.
  • First, communicate a clear and consistent idea of what the organization’s culture is (and is not) to potential employees. Second, make sure the definition of cultural fit is closely aligned with business goals. Ideally, fit should be based on data-driven analysis of what types of values, traits and behaviors actually predict on-the-job success. Third, create formal procedures like checklists for measuring fit, so that assessment is not left up to the eyes (and extracurriculars) of the beholder.
  • But cultural fit has become a new form of discrimination that keeps demographic and cultural diversity down
Javier E

Rise in Scientific Journal Retractions Prompts Calls for Reform - NYTimes.com - 1 views

  • before long they reached a troubling conclusion: not only that retractions were rising at an alarming rate, but that retractions were just a manifestation of a much more profound problem — “a symptom of a dysfunctional scientific climate,” as Dr. Fang put it.
  • he feared that science had turned into a winner-take-all game with perverse incentives that lead scientists to cut corners and, in some cases, commit acts of misconduct.
  • Members of the committee agreed with their assessment. “I think this is really coming to a head,” said Dr. Roberta B. Ness, dean of the University of Texas School of Public Health. And Dr. David Korn of Harvard Medical School agreed that “there are problems all through the system.”
  • ...20 more annotations...
  • science has changed in some worrying ways in recent decades — especially biomedical research, which consumes a larger and larger share of government science spending.
  • the journal Nature reported that published retractions had increased tenfold over the past decade, while the number of published papers had increased by just 44 percent.
  • because journals are now online, bad papers are simply reaching a wider audience, making it more likely that errors will be spotted.
  • The National Institutes of Health accepts a much lower percentage of grant applications today than in earlier decades. At the same time, many universities expect scientists to draw an increasing part of their salaries from grants, and these pressures have influenced how scientists are promoted.
  • Dr. Fang and Dr. Casadevall looked at the rate of retractions in 17 journals from 2001 to 2010 and compared it with the journals’ “impact factor,” a score based on how often their papers are cited by scientists. The higher a journal’s impact factor, the two editors found, the higher its retraction rate.
  • Each year, every laboratory produces a new crop of Ph.D.’s, who must compete for a small number of jobs, and the competition is getting fiercer. In 1973, more than half of biologists had a tenure-track job within six years of getting a Ph.D. By 2006 the figure was down to 15 percent.
  • Yet labs continue to have an incentive to take on lots of graduate students to produce more research. “I refer to it as a pyramid scheme,”
  • In such an environment, a high-profile paper can mean the difference between a career in science or leaving the field. “It’s becoming the price of admission,”
  • To survive professionally, scientists feel the need to publish as many papers as possible, and to get them into high-profile journals. And sometimes they cut corners or even commit misconduct to get there
  • “What people do is they count papers, and they look at the prestige of the journal in which the research is published, and they see how may grant dollars scientists have, and if they don’t have funding, they don’t get promoted,” Dr. Fang said. “It’s not about the quality of the research.”
  • Dr. Ness likens scientists today to small-business owners, rather than people trying to satisfy their curiosity about how the world works. “You’re marketing and selling to other scientists,” she said. “To the degree you can market and sell your products better, you’re creating the revenue stream to fund your enterprise.”
  • Universities want to attract successful scientists, and so they have erected a glut of science buildings, Dr. Stephan said. Some universities have gone into debt, betting that the flow of grant money will eventually pay off the loans.
  • “You can’t afford to fail, to have your hypothesis disproven,” Dr. Fang said. “It’s a small minority of scientists who engage in frank misconduct. It’s a much more insidious thing that you feel compelled to put the best face on everything.”
  • Dr. Stephan points out that a number of countries — including China, South Korea and Turkey — now offer cash rewards to scientists who get papers into high-profile journals.
  • To change the system, Dr. Fang and Dr. Casadevall say, start by giving graduate students a better understanding of science’s ground rules — what Dr. Casadevall calls “the science of how you know what you know.”
  • They would also move away from the winner-take-all system, in which grants are concentrated among a small fraction of scientists. One way to do that may be to put a cap on the grants any one lab can receive.
  • Such a shift would require scientists to surrender some of their most cherished practices — the priority rule, for example, which gives all the credit for a scientific discovery to whoever publishes results first.
  • To ease such cutthroat competition, the two editors would also change the rules for scientific prizes and would have universities take collaboration into account when they decide on promotions.
  • Even scientists who are sympathetic to the idea of fundamental change are skeptical that it will happen any time soon. “I don’t think they have much chance of changing what they’re talking about,” said Dr. Korn, of Harvard.
  • “When our generation goes away, where is the new generation going to be?” he asked. “All the scientists I know are so anxious about their funding that they don’t make inspiring role models. I heard it from my own kids, who went into art and music respectively. They said, ‘You know, we see you, and you don’t look very happy.’ ”
Javier E

Nobel in Economics Given to Angus Deaton for Studies of Consumption - The New York Times - 1 views

  • “To design economic policy that promotes welfare and reduces poverty, we must first understand individual consumption choices,” the Royal Swedish Academy of Sciences said in announcing the economics prize, the last of this year’s Nobels. “More than anyone else, Angus Deaton has enhanced this understanding.”
  • “Simple ways of looking at the world are often the basis of policy, and if the statements are wrong, then the policy may be wrong also,” said the economist Janet M. Currie, a Princeton colleague. “He’s always had a concern for trying to capture the complexity of the real world.”
  • Economics before the 1980s relied on the assumption that the people represented by a large economic aggregate behaved in roughly the same way, not because that made sense but because accounting for diversity was too hard.
  • ...9 more annotations...
  • “What he’s shown is that you do learn a great deal more by looking at the behavior that underlies the aggregates,”
  • Professor Deaton said he hoped “carefulness in measurement” would be his legacy. He said his mentor, Richard Stone, a Cambridge professor who won the Nobel in economics in 1984, had ingrained in him the importance of good data. “I’ve always wanted to be like him,” Professor Deaton said. “I think putting numbers together into a coherent framework always seemed to me to be what really matters.”
  • His work also is marked by an insistence that theories must explain these more complicated sets of facts. “A good theoretical account must explain all of the evidence that we see,” Professor Deaton wrote in a 2011 essay on his life in economics. “If it doesn’t work everywhere, we have no idea what we are talking about, and all is chaos.”
  • “There’s a fair amount of policy agnosticism that comes from this — it emphasizes more the heterogeneity of outcomes,” Professor Rodrik said of Professor Deaton. “He’s somebody with quite a sharp tongue, and he’s often had as his target people who make very strong statements about this policy or that policy.”
  • Proponents of foreign aid programs are a frequent target. Professor Deaton has argued that such investments often undermine local governance and the development of institutions necessary to sustain development.
  • “Life is better now than at almost any time in history,” he wrote in the opening lines of his 2013 book “The Great Escape,” a popular account of his work. “More people are richer and fewer people live in dire poverty. Lives are longer and parents no longer routinely watch a quarter of their children die.”
  • He counts climate change and increased economic inequality in developed nations as threats to this progress.
  • He has noted in other work that inequality occurs naturally because of divergent luck, but he has said that the growing gaps in recent years pose a new economic and political challenge.
  • “I think inequality has gone past the point where it’s helping us all get rich, and it’s really becoming a serious threat,” he said
Javier E

The Moral Ill Effects of Teaching Economics | Amitai Etzioni - 1 views

  • the hypothesis that teaching economics is debasing people's morality
  • They designed a game where participants were given an allotment of tokens to divide between a private account and a public fund
  • the game was designed to promote free-riding: the socially optimal behavior would be to contribute to the public fund, but the personal advantage was in investing everything in the private fund (as long as the others did not catch on or make the same move).
  • ...10 more annotations...
  • most subjects divided their tokens nearly equally between the public and private accounts
  • Economics students, by contrast, invested only 20 percent of their tokens in the public fund, on average.
  • Three quarters of non-economists reported that a "fair" investment of tokens would necessitate putting at least half of their tokens in the public fund. A third of economists didn't answer the question or gave "complex, uncodable responses." The remaining economics students were much more likely than their non-economist peers to say that "little or no contribution was 'fair'."
  • Other studies have found economics students to exhibit a stronger tendency towards anti-social positions compared to their peers.
  • Carter and Irons had both economics students and non-economics students play the "ultimatum" game -- a two-player game where one player is given a sum of money to divide between the two. The other player is then given a chance to accept or reject the offer; if she accepts it, then each player receives the portion of money proposed by the offerer. If she declines, then neither player gets any money. Carter and Irons found that, relative to non-economics students, economics students were much more likely to offer their partners small sums, and, thus, deviate from a "fair" 50/50 split.
  • Finally, researchers had both economics and non-economics students fill out two "honesty surveys" -- one at the start of the semester and one at the conclusion -- regarding how likely they were to either report being undercharged for a purchase or return found money to its owner. The authors found that, after taking an economics class, students' responses to the end-of-the-semester survey were more likely to reflect a decline in honest behavior than students who studied astronomy.
  • Other studies supported these key findings. They found that economics students are less likely to consider a vendor who increases the price of bottled water on a hot day to be acting "unfairly." Economics students who played a lottery game were willing to commit less of their potential winnings to fund a consolation prize for losers than were their peers. And such students were significantly more willing to accept bribes than other students. Moreover, economics students valued personal achievement and power more than their peers while attributing less importance to social justice and equality.
  • results show that it is not just selection that is responsible for the reported increase in immoral attitudes
  • Later studies support this conclusion. They found ideological differences between lower-level economics students and upper-level economics students that are similar in kind to the measured differences between the ideology of economics students as a whole and their peers. These studies find that upper-level students are even less likely to support egalitarian solutions to distribution problems than lower-level students, suggesting that time spent studying economics does have an indoctrination effect.
  • The problem is not only that students are exposed to such views, but that there are no "balancing" courses taught in typical American colleges, in which a different view of economics is presented. Moreover, while practically all economics classes are taught from the "neoclassical" (libertarian, self-centered) viewpoint, in classes by non-economists -- e.g., in social philosophy, political science, and sociology -- a thousand flowers bloom, such that a great variety of approaches are advanced, thereby leaving students with a cacophony of conflicting pro-social views. What is needed is a systematic pro-social economics that combines appreciation for the common good and for others as well as for the service of self.
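The free-riding incentive in the token game described above can be made concrete with a short sketch. This follows the standard public-goods design (tokens in the public fund are multiplied and split equally among all players); the endowment, multiplier, and group size below are illustrative assumptions, not the parameters of the study the article describes.

```python
# Sketch of a public-goods game payoff: each player keeps the tokens
# placed in a private account and receives an equal share of the
# multiplied public fund. Parameters are illustrative assumptions.

def payoff(own_contribution, others_contributions, endowment=10, multiplier=2.0):
    """One player's payoff: private tokens kept, plus an equal share
    of the multiplied public fund."""
    n = len(others_contributions) + 1
    public_fund = own_contribution + sum(others_contributions)
    return (endowment - own_contribution) + multiplier * public_fund / n

# Everyone contributes fully: the socially optimal outcome.
full = payoff(10, [10, 10, 10])        # 0 kept + 2.0 * 40 / 4 = 20.0

# One player free-rides while the others contribute fully.
free_rider = payoff(0, [10, 10, 10])   # 10 kept + 2.0 * 30 / 4 = 25.0

# The free-rider does better individually, even though total group
# welfare falls -- exactly the tension the experiment exploits.
assert free_rider > full
```

Because the multiplier (2.0) is less than the group size (4), each token a player contributes returns only 0.5 to that player but 2.0 to the group as a whole, which is why contributing is socially optimal yet individually dominated.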
kushnerha

Are scientists blocking their own progress? - The Washington Post - 1 views

  • Max Planck won a Nobel prize for his revolutionary work in quantum mechanics, but it was his interest in the philosophy of science that led to what is now called “Planck’s Principle.” Planck argued that science was an evolving system of thought which changes slowly over time, fueled by the deaths of old ideas. As he wrote in his 1968 autobiography: “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”
  • Is our understanding of the world based in pure objective reason, or are the theories that underpin it shaped by generational biases? Do our most famous thinkers actually block new ideas from gaining ground?
  • A new paper published by the National Bureau of Economic Research suggests that fame does play a significant role in deciding when and whether new scientific ideas can gain traction. When a prominent scientist dies, the paper’s authors found, the number of articles published by his or her collaborators tends to fall “precipitously” in the years following the death — those supporters tend not to continue advocating for a once-famous scientist’s ideas once the scientist is gone.
  • ...8 more annotations...
  • the number of research articles written by other scientists — including those with opposing ideas — increases by 8 percent on average, implying that the work of these scientists had been stifled before, but that after the death of a ubiquitous figure, the field becomes more open to new ideas. The study also found that these new articles are less likely to cite previous research and are more likely to be cited by others in the field. Death signifies a changing of the guard
  • Our instinct is often to view science as a concrete tower, growing ever upward and built upon the immovable foundations of earlier pioneers.  Sir Isaac Newton famously characterized this as “standing on the shoulders of giants.”
  • Mid-20th century philosopher Thomas Kuhn was among the first to come to this conclusion, in his 1962 book “The Structure of Scientific Revolutions.” He argued that scientific theories appeared in punctuated “paradigm shifts,” in which the underlying assumptions of a field are questioned and eventually overthrown
  • Kuhn’s book was, to some extent, a paradigm shift in its own right. According to his logic, commonly held notions in science were bound to change and become outdated. What we believe today will tomorrow be revised, rewritten — and in the most extreme cases ridiculed.
  • the journal Nature earlier this year said scientific data is prone to bias because researchers design experiments and make observations in ways that support hypotheses
  • equally as important are simple shifts in perspective. It only takes one researcher seeing an accepted scientific model in a new light for a solidified paradigm to enter what Kuhn called a “crisis phase” and beg for alternative explanations
  • The NBER study shows that those who questioned consensus ought to be given the opportunity to make their case, not ignored, silenced or pushed to the back of the line.
  • We’re likely to see these “paradigm shifts” happen at a much faster rate as data and research become easier to share worldwide. For some, this reality might seem chaotic; for the truly curious, it is exhilarating. The result may be a more democratic version of science — one in which the progress of ideas doesn’t have to wait until the funeral of a great mind.
kushnerha

There's nothing wrong with grade inflation - The Washington Post - 0 views

  • By the early ’90s, so long as one had the good sense to major in the humanities — all bets were off in the STEM fields — it was nearly impossible to get a final grade below a B-minus at an elite college. According to a 2012 study, the average college GPA, which in the 1930s was a C-plus, had risen to a B at public universities and a B-plus at private schools. At Duke, Pomona and Harvard, D’s and F’s combine for just 2 percent of all grades. A Yale report found that 62 percent of all Yale grades are A or A-minus. According to a 2013 article in the Harvard Crimson, the median grade at Harvard was an A-minus, while the most common grade was an A.
  • The result is widespread panic about grade inflation at elite schools. (The phenomenon is not as prevalent at community colleges and less-selective universities.) Some blame students’ consumer mentality, a few see a correlation with small class sizes (departments with falling enrollments want to keep students happy), and many cite a general loss of rigor in a touchy-feely age.
  • Yet whenever elite schools have tried to fight grade inflation, it’s been a mess. Princeton instituted strict caps on the number of high grades awarded, then abandoned the plan, saying the caps dissuaded applicants and made students miserable. At Wellesley, grade-inflated humanities departments mandated that the average result in their introductory and intermediate classes not exceed a B-plus. According to one study, enrollment fell by one-fifth, and students were 30 percent less likely to major in one of these subjects.
  • ...12 more annotations...
  • I liked the joy my students found when they actually earned a grade they’d been reaching for. But whereas I once thought we needed to contain grades, I now see that we may as well let them float skyward. If grade inflation is bad, fighting it is worse. Our goal should be ending the centrality of grades altogether. For years, I feared that a world of only A’s would mean the end of meaningful grades; today, I’m certain of it. But what’s so bad about that?
  • It’s easy to see why schools want to fight grade inflation. Grades should motivate certain students: those afraid of the stigma of a bad grade or those ambitious, by temperament or conditioning, to succeed in measurable ways. Periodic grading during a term, on quizzes, tests or papers, provides feedback to students, which should enable them to do better. And grades theoretically signal to others, such as potential employers or graduate schools, how well the student did. (Grade-point averages are also used for prizes and class rankings, though that doesn’t strike me as an important feature.)
  • But it’s not clear that grades work well as motivators. Although recent research on the effects of grades is limited, several studies in the 1970s, 1980s and 1990s measured how students related to a task or a class when it was graded compared to when it was ungraded. Overall, graded students are less interested in the topic at hand and — for obvious, common-sense reasons — more inclined to pick the easiest possible task when given the chance. In the words of progressive-education theorist Alfie Kohn, author of “The Homework Myth,” “the quality of learning declines” when grades are introduced, becoming “shallower and more superficial when the point is to get a grade.”
  • Even where grades can be useful, as in describing what material a student has mastered, they are remarkably crude instruments. Yes, the student who gets a 100 on a calculus exam probably grasps the material better than the student with a 60 — but only if she retains the knowledge, which grades can’t show.
  • I still can’t say very well what separates a B from an A. What’s more, I never see the kind of incompetence or impudence that would merit a D or an F. And now, in our grade-inflated world, it’s even harder to use grades to motivate, or give feedback, or send a signal to future employers or graduate schools.
  • According to a 2012 study by the Chronicle of Higher Education, GPA was seventh out of eight factors employers considered in hiring, behind internships, extracurricular activities and previous employment. Last year, Stanford’s registrar told the Chronicle about “a clamor” from employers “for something more meaningful” than the traditional transcript. The Lumina Foundation gave a $1.27 million grant to two organizations for college administrators working to develop better student records, with grades only one part of a student’s final profile.
  • Some graduate schools, too, have basically ditched grades. “As long as you don’t bomb and flunk out, grades don’t matter very much in M.F.A. programs,” the director of one creative-writing program told the New York Times. To top humanities PhD programs, letters of reference and writing samples matter more than overall GPA (although students are surely expected to have received good grades in their intended areas of study). In fact, it’s impossible to get into good graduate or professional schools without multiple letters of reference, which have come to function as the kind of rich, descriptive comments that could go on transcripts in place of grades.
  • suggests that GPAs serve not to validate students from elite schools but to keep out those from less-prestigious schools and large public universities, where grades are less inflated. Grades at community colleges “have actually dropped” over the years, according to Stuart Rojstaczer, a co-author of the 2012 grade-inflation study. That means we have two systems: one for students at elite schools, who get jobs based on references, prestige and connections, and another for students everywhere else, who had better maintain a 3.0. Grades are a tool increasingly deployed against students without prestige.
  • The trouble is that, while it’s relatively easy for smaller colleges to go grade-free, with their low student-to-teacher ratios, it’s tough for professors at larger schools, who must evaluate more students, more quickly, with fewer resources. And adjuncts teaching five classes for poverty wages can’t write substantial term-end comments, so grades are a necessity if they want to give any feedback at all.
  • It would mean hiring more teachers and paying them better (which schools should do anyway). And if transcripts become more textured, graduate-school admission offices and employers will have to devote more resources to reading them, and to getting to know applicants through interviews and letters of reference — a salutary trend that is underway already.
  • When I think about getting rid of grades, I think of happier students, with whom I have more open, democratic relationships. I think about being forced to pay more attention to the quiet ones, since I’ll have to write something truthful about them, too. I’ve begun to wonder if a world without grades may be one of those states of affairs (like open marriages, bicycle lanes and single-payer health care) that Americans resist precisely because they seem too good, suspiciously good. Nothing worth doing is supposed to come easy.
  • Alfie Kohn, too, sees ideology at work in the grade-inflation panic. “Most of what powers the arguments against grade inflation is a very right-wing idea that excellence consists in beating everyone else around you,” he says. “Even when you have sorted them — even when they get to Harvard! — we have to sort them again.” In other words, we can trust only a system in which there are clear winners and losers.
kushnerha

Six steps to stronger willpower - 0 views

  • Besides intelligence, willpower is meant to be the single most important trait for success in life.
  • while the brain is exercising self-control on one task, its discipline spreads to any other task at hand.
  • The participants who needed the toilet were more likely to forgo a smaller, immediate reward in order to receive a bigger pay-out later on – a classic test of willpower.
  • ...5 more annotations...
  • Psychologists think of willpower as a “limited resource” – essentially, you can use it up over the course of a day.
  • Self-control often involves suppressing some difficult emotions, as you keep your eye on the prize. Fortunately, mindful contemplation helps you to balance your feelings
  • ways to restore it. One option is comedy. A recent study found that people who watched funny videos were better at controlling their impulses later on.
  • Self-control uses up the brain’s energy reserves, meaning that you are more weak-willed when you are hungry. One study found that judges are more likely to make rash judgements before lunch for this very reason
  • The mind automatically associates guilt with pleasure – meaning that we find our vices even more enticing when we know we’re not meant to enjoy them. Conversely, a little guilt-free indulgence may just be the rest you need to help you maintain your resolve.
Javier E

To Cut My Spending, I Used Behavioral Economics on Myself - The Atlantic - 0 views

  • “The average person, in my view, a lot of the overspending they do isn’t in the small things, which your system is likely to deal with,” he said. “But it’s large things that are often quite invisible, and wouldn’t be picked up by your system.” There are usually more savings to be had from revisiting one’s auto- or home-insurance policy, or one’s phone bill, than from skipping the marginal cup of coffee. Loewenstein said it’s more effective to make changes with larger “one-time decisions,” instead of regularly having to make “all these micro-decisions.”
  • the dynamics that shape spending. On one side of each credit-card swipe are multiple financial corporations—a phalanx of marketers, programmers, and data analysts who have perfect visibility into countless transactions, and who are thus armed with plentiful information about people’s purchases. On the other is the individual, who lacks this bird’s-eye view and is effectively on their own as they weigh whether and how much to spend at any given time. This arrangement seems lopsided and unfair
  • “A lot of the problem is us … We tend to blame the credit-card industry for our own desire to have a standard of living that is beyond what our income is. You can’t blame Visa for that.” He said the focus should be on norms, and how individual action can alter them—maybe two friends cook dinner together instead of going out. The goal, Pollack says, would be a culture that prizes restraint without being puritanical.
  • ...1 more annotation...
  • What would create such a culture? There is the Consumer Financial Protection Bureau, which (in theory) provides high-level government oversight, and there are small individual actions (like, say, meticulously tracking one’s purchases), but there isn’t something in between—a powerful advocacy group, a mainstream cultural movement, or something else not yet built or imagined—that serves as a counterweight to the pressure on Americans to spend.