
Contents contributed and discussions participated by Javier E

Javier E

How 'ObamaCare' Went from Smear to Cheer - Politics - The Atlantic Wire - 3 views

  • Friday's effort strikes us as a pretty smart, if overdue, campaign strategy to try to make a mark on a term that has undoubtedly entered the lexicon, whether the campaign likes it or not.
  • why shouldn't they use "Obamacare" to describe the bill? There's nothing immediately or obviously pejorative in the word. In fact, given the bill's very long actual name, we're kind of appreciative for an easily recognized alternative. The problem for Democrats is that conservatives really effectively claimed the term and associated it in the public's mind with negative sentiment. As Kiran Moodley wrote in a piece for The Atlantic last year on the term, conservatives, through repetition, were able to link the idea to government intrusion and the larger Obama agenda. "Obamacare" became more than an innocent shorthand: It became a rallying cry.
Javier E

The Benefits of Bilingualism - NYTimes.com - 2 views

  • Being bilingual, it turns out, makes you smarter. It can have a profound effect on your brain, improving cognitive skills not related to language and even shielding against dementia in old age.
  • in a bilingual’s brain both language systems are active even when he is using only one language, thus creating situations in which one system obstructs the other. But this interference, researchers are finding out, isn’t so much a handicap as a blessing in disguise. It forces the brain to resolve internal conflict, giving the mind a workout that strengthens its cognitive muscles.
  • the bilingual experience improves the brain’s so-called executive function — a command system that directs the attention processes that we use for planning, solving problems and performing various other mentally demanding tasks. These processes include ignoring distractions to stay focused, switching attention willfully from one thing to another and holding information in mind — like remembering a sequence of directions while driving.
  • The key difference between bilinguals and monolinguals may be more basic: a heightened ability to monitor the environment. “Bilinguals have to switch languages quite often — you may talk to your father in one language and to your mother in another language,” says Albert Costa, a researcher at the University of Pompeu Fabra in Spain. “It requires keeping track of changes around you in the same way that we monitor our surroundings when driving.”
  • individuals with a higher degree of bilingualism — measured through a comparative evaluation of proficiency in each language — were more resistant than others to the onset of dementia and other symptoms of Alzheimer’s disease: the higher the degree of bilingualism, the later the age of onset.
Javier E

Why It's OK to Let Apps Make You a Better Person - Evan Selinger - Technology - The Atl... - 0 views

  • one theme emerges from the media coverage of people's relationships with our current set of technologies: Consumers want digital willpower. App designers in touch with the latest trends in behavioral modification--nudging, the quantified self, and gamification--and good old-fashioned financial incentive manipulation, are tackling weakness of will. They're harnessing the power of payouts, cognitive biases, social networking, and biofeedback. The quantified self becomes the programmable self.
  • the trend still has multiple interesting dimensions
  • Individuals are turning ever more aspects of their lives into managerial problems that require technological solutions. We have access to an ever-increasing array of free and inexpensive technologies that harness incredible computational power that effectively allows us to self-police behavior everywhere we go. As pervasiveness expands, so does trust.
  • Some embrace networked, data-driven lives and are comfortable volunteering embarrassing, real time information about what we're doing, whom we're doing it with, and how we feel about our monitored activities.
  • Put it all together and we can see that our conception of what it means to be human has become "design space." We're now Humanity 2.0, primed for optimization through commercial upgrades. And today's apps are more harbinger than endpoint.
  • philosophers have had much to say about the enticing and seemingly inevitable dispersion of technological mental prosthetics that promise to substitute for or enhance some of our motivational powers.
  • beyond the practical issues lies a constellation of central ethical concerns.
  • It simply means that when it comes to digital willpower, we should be on our guard to avoid confusing situational with integrated behaviors.
  • it is antithetical to the ideal of "resolute choice." Some may find the norm overly perfectionist, Spartan, or puritanical. However, it is not uncommon for folks to defend the idea that mature adults should strive to develop internal willpower strong enough to avoid external temptations, whatever they are, and wherever they are encountered.
  • In part, resolute choosing is prized out of concern for consistency, as some worry that lapse of willpower in any context indicates a generally weak character.
  • Fragmented selves behave one way while under the influence of digital willpower, but another when making decisions without such assistance. In these instances, inconsistent preferences are exhibited and we risk underestimating the extent of our technological dependency.
  • they should cause us to pause as we think about a possible future that significantly increases the scale and effectiveness of willpower-enhancing apps. Let's call this hypothetical future Digital Willpower World and characterize the ethical traps we're about to discuss as potential general pitfalls
  • the problem of inauthenticity, a staple of the neuroethics debates, might arise. People might start asking themselves: Has the problem of fragmentation gone away only because devices are choreographing our behavior so powerfully that we are no longer in touch with our so-called real selves -- the selves who used to exist before Digital Willpower World was formed?
  • Infantilized subjects are morally lazy, quick to have others take responsibility for their welfare. They do not view the capacity to assume personal responsibility for selecting means and ends as a fundamental life goal that validates the effort required to remain committed to the ongoing project of maintaining willpower and self-control.
  • Michael Sandel's Atlantic essay, "The Case Against Perfection." He notes that technological enhancement can diminish people's sense of achievement when their accomplishments become attributable to human-technology systems and not an individual's use of human agency.
  • Borgmann worries that this environment, which habituates us to be on auto-pilot and delegate deliberation, threatens to harm the powers of reason, the most central component of willpower (according to the rationalist tradition).
  • In several books, including Technology and the Character of Contemporary Life, he expresses concern about technologies that seem to enhance willpower but only do so through distraction. Borgmann's paradigmatic example of the non-distracted, focally centered person is a serious runner. This person finds the practice of running maximally fulfilling, replete with the rewarding "flow" that comes only when mind/body and means/ends are unified, while skill gets pushed to the limit.
  • Perhaps the very conception of a resolute self was flawed. What if, as psychologist Roy Baumeister suggests, willpower is more a "staple of folk psychology" than a real way of thinking about our brain processes?
  • novel approaches suggest the will is a flexible mesh of different capacities and cognitive mechanisms that can expand and contract, depending on the agent's particular setting and needs. Contrary to the traditional view that identifies the unified and cognitively transparent self as the source of willed actions, the new picture embraces a rather diffused, extended, and opaque self who is often guided by irrational trains of thought. What actually keeps the self and its will together are the given boundaries offered by biology, a coherent self narrative created by shared memories and experiences, and society. If this view of the will as an expanding and contracting system with porous and dynamic boundaries is correct, then it might seem that the new motivating technologies and devices can only increase our reach and further empower our willing selves.
  • "It's a mistake to think of the will as some interior faculty that belongs to an individual--the thing that pushes the motor control processes that cause my action," Gallagher says. "Rather, the will is both embodied and embedded: social and physical environment enhance or impoverish our ability to decide and carry out our intentions; often our intentions themselves are shaped by social and physical aspects of the environment."
  • It makes perfect sense to think of the will as something that can be supported or assisted by technology. Technologies, like environments and institutions, can facilitate action or block it. Imagine I have the inclination to go to a concert. If I can get my ticket by pressing some buttons on my iPhone, I find myself going to the concert. If I have to fill out an application form and carry it to a location several miles away and wait in line to pick up my ticket, then forget it.
  • Perhaps the best way forward is to put a digital spin on the Socratic dictum of knowing myself and submit to the new freedom: the freedom of consuming digital willpower to guide me past the sirens.
Javier E

The Way We Read Now - NYTimes.com - 0 views

  • “I would be most content if my children grew up to be the kind of people who think decorating consists mostly of building enough bookshelves.”
Javier E

Millennials Are More 'Generation Me' Than 'Generation We,' Study Finds - Students - The... - 0 views

  • The study, which compares the traits of young people in high school and entering college today with those of baby boomers and Gen X'ers at the same age from 1966 to 2009, shows an increasing trend of valuing money, image, and fame more than inherent principles like self-acceptance, affiliation, and community. "The results generally support the 'Generation Me' view of generational differences rather than the 'Generation We,'
  • college students in 1971 ranked the importance of being very well off financially No. 8 in their life goals, but since 1989, they have consistently placed it at the top of the list.
  • "I see no evidence that today's young people feel much attachment to duty or to group cohesion. Young people have been consistently taught to put their own needs first and to focus on feeling good about themselves."
  • That view is apparent in the new study's findings, such as a steep decline in concern for the environment. The study found that three times more millennials than baby boomers said they made no personal effort at all to practice sustainability. Only 51 percent of millennials said they tried to save energy by cutting down on electricity, compared with 68 percent of baby boomers and 60 percent of Gen X'ers.
  • The study also found a decline in civic interest, such as political participation and trust in government, as well as in concern for others, including charity donations, and in the importance of having a job worthwhile to society.
  • "The aphorisms have shifted to 'believe in yourself' and 'you're special,'" she says. "It emphasizes individualism, and this gets reflected in personality traits and attitudes."
  • Even community service, the one aspect where millennials' engagement rose, does not seem to stem from genuine altruism. The study attributes that gain to high schools in recent years requiring volunteer hours to graduate. The number of public high schools with organized community-service programs jumped from 9 percent in 1984 to 46 percent in 1999, according to the study.
Javier E

Eric Kandel's Visions - The Chronicle Review - The Chronicle of Higher Education - 0 views

  • Judith, "barely clothed and fresh from the seduction and slaying of Holofernes, glows in her voluptuousness. Her hair is a dark sky between the golden branches of Assyrian trees, fertility symbols that represent her eroticism. This young, ecstatic, extravagantly made-up woman confronts the viewer through half-closed eyes in what appears to be a reverie of orgasmic rapture," writes Eric Kandel in his new book, The Age of Insight. Wait a minute. Writes who? Eric Kandel, the Nobel-winning neuroscientist who's spent most of his career fixated on the generously sized neurons of sea snails
  • Kandel goes on to speculate, in a bravura paragraph a few hundred pages later, on the exact neurochemical cognitive circuitry of the painting's viewer:
  • "At a base level, the aesthetics of the image's luminous gold surface, the soft rendering of the body, and the overall harmonious combination of colors could activate the pleasure circuits, triggering the release of dopamine. If Judith's smooth skin and exposed breast trigger the release of endorphins, oxytocin, and vasopressin, one might feel sexual excitement. The latent violence of Holofernes's decapitated head, as well as Judith's own sadistic gaze and upturned lip, could cause the release of norepinephrine, resulting in increased heart rate and blood pressure and triggering the fight-or-flight response. In contrast, the soft brushwork and repetitive, almost meditative, patterning may stimulate the release of serotonin. As the beholder takes in the image and its multifaceted emotional content, the release of acetylcholine to the hippocampus contributes to the storing of the image in the viewer's memory. What ultimately makes an image like Klimt's 'Judith' so irresistible and dynamic is its complexity, the way it activates a number of distinct and often conflicting emotional signals in the brain and combines them to produce a staggeringly complex and fascinating swirl of emotions."
  • His key findings on the snail, for which he shared the 2000 Nobel Prize in Physiology or Medicine, showed that learning and memory change not the neuron's basic structure but rather the nature, strength, and number of its synaptic connections. Further, through focus on the molecular biology involved in a learned reflex like Aplysia's gill retraction, Kandel demonstrated that experience alters nerve cells' synapses by changing their pattern of gene expression. In other words, learning doesn't change what neurons are, but rather what they do.
  • In Search of Memory (Norton), Kandel offered what sounded at the time like a vague research agenda for future generations in the budding field of neuroaesthetics, saying that the science of memory storage lay "at the foothills of a great mountain range." Experts grasp the "cellular and molecular mechanisms," he wrote, but need to move to the level of neural circuits to answer the question, "How are internal representations of a face, a scene, a melody, or an experience encoded in the brain?"
  • Since giving a talk on the matter in 2001, he has been piecing together his own thoughts in relation to his favorite European artists
  • The field of neuroaesthetics, says one of its founders, Semir Zeki, of University College London, is just 10 to 15 years old. Through brain imaging and other studies, scholars like Zeki have explored the cognitive responses to, say, color contrasts or ambiguities of line or perspective in works by Titian, Michelangelo, Cubists, and Abstract Expressionists. Researchers have also examined the brain's pleasure centers in response to appealing landscapes.
  • it is fundamental to an understanding of human cognition and motivation. Art isn't, as Kandel paraphrases a concept from the late philosopher of art Denis Dutton, "a byproduct of evolution, but rather an evolutionary adaptation—an instinctual trait—that helps us survive because it is crucial to our well-being." The arts encode information, stories, and perspectives that allow us to appraise courses of action and the feelings and motives of others in a palatable, low-risk way.
  • "as far as activity in the brain is concerned, there is a faculty of beauty that is not dependent on the modality through which it is conveyed but which can be activated by at least two sources—musical and visual—and probably by other sources as well." Specifically, in this "brain-based theory of beauty," the paper says, that faculty is associated with activity in the medial orbitofrontal cortex.
  • It also enables Kandel—building on the work of Gombrich and the psychoanalyst and art historian Ernst Kris, among others—to compare the painters' rendering of emotion, the unconscious, and the libido with contemporaneous psychological insights from Freud about latent aggression, pleasure and death instincts, and other primal drives.
  • Kandel views the Expressionists' art through the powerful multiple lenses of turn-of-the-century Vienna's cultural mores and psychological insights. But then he refracts them further, through later discoveries in cognitive science. He seeks to reassure those who fear that the empirical and chemical will diminish the paintings' poetic power. "In art, as in science," he writes, "reductionism does not trivialize our perception—of color, light, and perspective—but allows us to see each of these components in a new way. Indeed, artists, particularly modern artists, have intentionally limited the scope and vocabulary of their expression to convey, as Mark Rothko and Ad Reinhardt do, the most essential, even spiritual ideas of their art."
  • The author of a classic textbook on neuroscience, he seems here to have written a layman's cognition textbook wrapped within a work of art history.
  • "our initial response to the most salient features of the paintings of the Austrian Modernists, like our response to a dangerous animal, is automatic. ... The answer to James's question of how an object simply perceived turns into an object emotionally felt, then, is that the portraits are never objects simply perceived. They are more like the dangerous animal at a distance—both perceived and felt."
  • If imaging is key to gauging therapeutic practices, it will be key to neuroaesthetics as well, Kandel predicts—a broad, intense array of "imaging experiments to see what happens with exaggeration, distorted faces, in the human brain and the monkey brain," viewers' responses to "mixed eroticism and aggression," and the like.
  • while the visual-perception literature might be richer at the moment, there's no reason that neuroaesthetics should restrict its emphasis to the purely visual arts at the expense of music, dance, film, and theater.
  • although Kandel considers The Age of Insight to be more a work of intellectual history than of science, the book summarizes centuries of research on perception. And so you'll find, in those hundreds of pages between Kandel's introduction to Klimt's "Judith" and the neurochemical cadenza about the viewer's response to it, dossiers on vision as information processing; the brain's three-dimensional-space mapping and its interpretations of two-dimensional renderings; face recognition; the mirror neurons that enable us to empathize and physically reflect the affect and intentions we see in others; and many related topics. Kandel elsewhere describes the scientific evidence that creativity is nurtured by spells of relaxation, which foster a connection between conscious and unconscious cognition.
  • Zeki's message to art historians, aesthetic philosophers, and others who chafe at that idea is twofold. The more diplomatic pitch is that neuroaesthetics is different, complementary, and not oppositional to other forms of arts scholarship. But "the stick," as he puts it, is that if arts scholars "want to be taken seriously" by neurobiologists, they need to take advantage of the discoveries of the past half-century. If they don't, he says, "it's a bit like the guys who said to Galileo that we'd rather not look through your telescope."
  • Matthews, a co-author of The Bard on the Brain: Understanding the Mind Through the Art of Shakespeare and the Science of Brain Imaging (Dana Press, 2003), seems open to the elucidations that science and the humanities can cast on each other. The neural pathways of our aesthetic responses are "good explanations," he says. But "does one [type of] explanation supersede all the others? I would argue that they don't, because there's a fundamental disconnection still between ... explanations of neural correlates of conscious experience and conscious experience" itself.
  • There are, Matthews says, "certain kinds of problems that are fundamentally interesting to us as a species: What is love? What motivates us to anger?" Writers put their observations on such matters into idiosyncratic stories, psychologists conceive their observations in a more formalized framework, and neuroscientists like Zeki monitor them at the level of functional changes in the brain. All of those approaches to human experience "intersect," Matthews says, "but no one of them is the explanation."
  • "Conscious experience," he says, "is something we cannot even interrogate in ourselves adequately. What we're always trying to do in effect is capture the conscious experience of the last moment. ... As we think about it, we have no way of capturing more than one part of it."
  • Kandel sees art and art history as "parent disciplines" and psychology and brain science as "antidisciplines," to be drawn together in an E.O. Wilson-like synthesis toward "consilience as an attempt to open a discussion between restricted areas of knowledge." Kandel approvingly cites Stephen Jay Gould's wish for "the sciences and humanities to become the greatest of pals ... but to keep their ineluctably different aims and logics separate as they ply their joint projects and learn from each other."
Javier E

Couples and Dating | Men's Health - 0 views

  • in the world of online dating, frivolous similarities really do matter. When researchers at MIT tracked 65,000 online daters for a 2005 study, they observed "significant homophily." Translation: You're typically interested in someone just like you, who likes the same things you do.
  • Finding a decent signal amid all this noise takes work. This is one of the market failures of window-shopping for soul mates, writes behavioral economist Dan Ariely, Ph.D., author of The Upside of Irrationality. He cites this finding from University of Chicago research: A typical online dater spends an average of 12 hours a week screening but only 2 hours dating. Not a good return.
  • All my wife's likes and dislikes—the ones I've had to learn over time—are right there on the screen for some other guy to capitalize on. To make her short list, all he has to do is declare, "Me too!"
  • Not surprisingly, the perception of financial security is a big deal for online Juliets. In one study, Ariely and his colleagues calculated that a man who's 5'9" must outearn a 5'10" suitor by at least $35,000 a year just to be seen as equally attractive.
Javier E

Jon Meacham on Why We Question God | TIME Ideas | TIME.com - 2 views

  • Hamilton was no militant atheist. He was not contemptuous of faith or of the faithful—far from it; he was a longtime churchgoer—and he was therefore, I think, all the more a threat to unreflective Christianity. At heart, he was questioning whether the Christian tradition of encouraging a temporal moral life required belief in a divine order.
  • The questions with which he grappled were eternal, essential, and are with us still: how does a culture that tends to be religious continue to hold to a belief in an all-powerful, all-loving divinity beyond time and space given the evidence of science and of experience?
  • faith has become not a possession but a hope.”
  • My own view of these things is that we simply do not know enough to judge the ultimate truth of the claims of theology. (I’m with Hamlet, who remarked to Horatio: “There are more things in heaven and earth … than are dreamt of in your philosophy.”) Perhaps we will one day; perhaps not. Meanwhile, given that religious faith is an intrinsic element of human experience, it is best to approach and engage the subject with a sense of history and a critical sensibility.
  • In his view that faith was “not a possession but a hope,” Hamilton was tapping into an ancient tradition. As the author of the New Testament Epistle to the Hebrews wrote, “Faith is the substance of things hoped for, the evidence of things not seen”—in this sense, religious faith is a way of interpreting experience that allows for the possibility of the redemptive. Faith in this sense assumes that scripture and tradition are the works of human hands and hearts, efforts undertaken to explain the seemingly inexplicable. Faith in this sense is inextricably tied to doubt; it is an attempt, sometimes successful and sometimes not, to squint and struggle to “see through a glass darkly,” as Paul wrote in Corinthians. Faith without such doubt has never been part of the Christian tradition; it is telling, I think, that one of the earliest resurrection scenes in the Bible is that of Thomas demanding evidence—he wanted to see, to touch, to prove. Those who question and probe and debate are heirs of the apostles just as much as the most fervent of believers.
Javier E

Forget the Money, Follow the Sacredness - NYTimes.com - 0 views

  • Despite what you might have learned in Economics 101, people aren’t always selfish. In politics, they’re more often groupish. When people feel that a group they value — be it racial, religious, regional or ideological — is under attack, they rally to its defense, even at some cost to themselves. We evolved to be tribal, and politics is a competition among coalitions of tribes.
  • The key to understanding tribal behavior is not money, it’s sacredness. The great trick that humans developed at some point in the last few hundred thousand years is the ability to circle around a tree, rock, ancestor, flag, book or god, and then treat that thing as sacred. People who worship the same idol can trust one another, work as a team and prevail over less cohesive groups. So if you want to understand politics, and especially our divisive culture wars, you must follow the sacredness.
  • A good way to follow the sacredness is to listen to the stories that each tribe tells about itself and the larger nation.
  • The Notre Dame sociologist Christian Smith once summarized the moral narrative told by the American left like this: “Once upon a time, the vast majority” of people suffered in societies that were “unjust, unhealthy, repressive and oppressive.” These societies were “reprehensible because of their deep-rooted inequality, exploitation and irrational traditionalism — all of which made life very unfair, unpleasant and short. But the noble human aspiration for autonomy, equality and prosperity struggled mightily against the forces of misery and oppression and eventually succeeded in establishing modern, liberal, democratic, capitalist, welfare societies.” Despite our progress, “there is much work to be done to dismantle the powerful vestiges of inequality, exploitation and repression.” This struggle, as Smith put it, “is the one mission truly worth dedicating one’s life to achieving.” This is a heroic liberation narrative. For the American left, African-Americans, women and other victimized groups are the sacred objects at the center of the story. As liberals circle around these groups, they bond together and gain a sense of righteous common purpose.
  • the Reagan narrative like this: “Once upon a time, America was a shining beacon. Then liberals came along and erected an enormous federal bureaucracy that handcuffed the invisible hand of the free market. They subverted our traditional American values and opposed God and faith at every step of the way.” For example, “instead of requiring that people work for a living, they siphoned money from hard-working Americans and gave it to Cadillac-driving drug addicts and welfare queens.” Instead of the “traditional American values of family, fidelity and personal responsibility, they preached promiscuity, premarital sex and the gay lifestyle” and instead of “projecting strength to those who would do evil around the world, they cut military budgets, disrespected our soldiers in uniform and burned our flag.” In response, “Americans decided to take their country back from those who sought to undermine it.” This, too, is a heroic narrative, but it’s a heroism of defense. In this narrative it’s God and country that are sacred — hence the importance in conservative iconography of the Bible, the flag, the military and the founding fathers. But the subtext in this narrative is about moral order. For social conservatives, religion and the traditional family are so important in part because they foster self-control, create moral order and fend off chaos.
  • Part of Reagan’s political genius was that he told a single story about America that rallied libertarians and social conservatives, who are otherwise strange bedfellows. He did this by presenting liberal activist government as the single devil that is eternally bent on destroying two different sets of sacred values — economic liberty and moral order. Only if all nonliberals unite into a coalition of tribes can this devil be defeated.
Javier E

Renaming Philosophy - NYTimes.com - 0 views

  • I suggested in my earlier essay that philosophy so conceived is best classified as a science, because of its rigor, technicality, universality, falsifiability, connection with other sciences, and concern with the nature of objective being (among other reasons). I did not claim, however, that it is an empirical science, like physics and chemistry; rather, it is an a priori science, like the “formal science” of mathematics.
  • This is not a matter of dubious public relations for a languishing field of study; rather, it is simply the recognition of the intellectual substance of the discipline — its power and achievements
  • There is plenty of room here for ethics, philosophy of art, value theory, and even “practical wisdom.” In my terminology, we might label these parts of philosophy “axiological ontics”— that is, the study of the nature and being of value in all its forms.
  • My main question was what to call this subject, in view of the confusions wrought by its current name and the ancient origin of the word.
  • The general reaction to my original essay from people not professionally involved in philosophy rather proves my point about the need for linguistic reform. There is precious little understanding of what the subject is really like, but a lot of opinion about its demerits and betrayals of its historical ideals. To be sure, we will not cure such ignorance and hostility — either from the dogmatists of empirical science or the disappointed fringe mystics — by simply relabeling the subject; but we should at least forestall some of the ire that stems from the etymology and popular meaning of the word “philosophy”
Javier E

Do People Eat Too Much Because They Enjoy It Too Little? - Hit & Run : Reason Magazine - 0 views

  • Qnexa, which in clinical trials helped subjects lose about one-tenth of their weight on average, combines an appetite-suppressing stimulant with "an anticonvulsant shown to reduce cravings for binge-eaters." Lehrer says it seems to work partly by increasing "activity in the dopamine reward pathway," which "allows dieters to squeeze more satisfaction from every bite."
  • "People crave pleasure, and they don't stop until they get their fill, even if it means consuming the entire pint of Häagen-Dazs." He says one lesson for dieters is that "it's important to seek pleasure from many sources," since "people quickly adapt to the pleasure of any single food."
  • it contradicts the advice commonly heard from anti-obesity crusaders such as Kelly Brownell and David Kessler, who say the problem is that food is too delicious and too varied. Rats who eat their fill of one food, they note, will begin chowing down again if given something different. Hence variety is the dieter's enemy—not, as Lehrer suggests, his friend.
  • These clashing perspectives are reflected in the perennial conflict between two dieting dicta: 1) avoid temptation and 2) don't make yourself feel deprived. There is some truth to both views. 
Javier E

The Rediscovery of Character - NYTimes.com - 0 views

  • broken windows was only a small piece of what Wilson contributed, and he did not consider it the center of his work. The best way to understand the core Wilson is by borrowing the title of one of his essays: “The Rediscovery of Character.”
  • When Wilson began looking at social policy, at the University of Redlands, the University of Chicago and Harvard, most people did not pay much attention to character. The Marxists looked at material forces. Darwinians at the time treated people as isolated products of competition. Policy makers of right and left thought about how to rearrange economic incentives. “It is as if it were a mark of sophistication for us to shun the language of morality in discussing the problems of mankind,” he once recalled.
  • during the 1960s and ’70s, he noticed that the nation’s problems could not be understood by looking at incentives
  • “At root,” Wilson wrote in 1985 in The Public Interest, “in almost every area of important concern, we are seeking to induce persons to act virtuously, whether as schoolchildren, applicants for public assistance, would-be lawbreakers or voters and public officials.”
  • When Wilson wrote about character and virtue, he didn’t mean anything high-flown or theocratic. It was just the basics, befitting a man who grew up in the middle-class suburbs of Los Angeles in the 1940s: Behave in a balanced way. Think about the long-term consequences of your actions. Cooperate. Be decent.
  • Wilson argued that American communities responded to the stresses of industrialization by fortifying self-control.
  • he emphasized that character was formed in groups. As he wrote in “The Moral Sense,” his 1993 masterpiece, “Order exists because a system of beliefs and sentiments held by members of a society sets limits to what those members can do.”
  • Wilson set out to learn how groups created a good order, and why that order sometimes frayed.
  • In “The Moral Sense,” he brilliantly investigated the virtuous sentiments we are born with and how they are cultivated by habit. Wilson’s broken windows theory was promoted in an essay with George Kelling called “Character and Community.” Wilson and Kelling didn’t think of crime primarily as an individual choice. They saw it as something that emerged from the social psychology of a community. When neighborhoods feel disorganized and scary, crime increases.
  • It was habituated by practicing good manners, by being dependable, punctual and responsible day by day.
  • But America responded to the stresses of the information economy by reducing the communal buttresses to self-control, with unfortunate results.
  • Wilson was not a philosopher. He was a social scientist. He just understood that people are moral judgers and moral actors, and he reintegrated the vocabulary of character into discussions of everyday life.
Javier E

WHICH IS THE BEST LANGUAGE TO LEARN? | More Intelligent Life - 2 views

  • For language lovers, the facts are grim: Anglophones simply aren’t learning foreign languages any more. In Britain, despite four decades in the European Union, the number of A-levels taken in French and German has fallen by half in the past 20 years, while what was a growing trend of Spanish-learning has stalled. In America, the numbers are equally sorry.
  • compelling reasons remain for learning other languages.
  • First of all, learning any foreign language helps you understand all language better—many Anglophones first encounter the words “past participle” not in an English class, but in French. Second, there is the cultural broadening. Literature is always best read in the original. Poetry and lyrics suffer particularly badly in translation. And learning another tongue helps the student grasp another way of thinking.
  • is Chinese the language of the future?
  • So which one should you, or your children, learn? If you take a glance at advertisements in New York or A-level options in Britain, an answer seems to leap out: Mandarin.
  • The practical reasons are just as compelling. In business, if the team on the other side of the table knows your language but you don’t know theirs, they almost certainly know more about you and your company than you do about them and theirs—a bad position to negotiate from.
  • If you were to learn ten languages ranked by general usefulness, Japanese would probably not make the list. And the key reason for Japanese’s limited spread will also put the brakes on Chinese.
  • This factor is the Chinese writing system (which Japan borrowed and adapted centuries ago). The learner needs to know at least 3,000-4,000 characters to make sense of written Chinese, and thousands more to have a real feel for it. Chinese, with all its tones, is hard enough to speak. But the mammoth feat of memory required to be literate in Mandarin is harder still. It deters most foreigners from ever mastering the system—and increasingly trips up Chinese natives.
  • A recent survey reported in the People’s Daily found 84% of respondents agreeing that skill in Chinese is declining.
  • Fewer and fewer native speakers learn to produce characters in traditional calligraphy. Instead, they write their language the same way we do—with a computer. And not only that, but they use the Roman alphabet to produce Chinese characters: type in wo and Chinese language-support software will offer a menu of characters pronounced wo; the user selects the one desired. (Or if the user types in wo shi zhongguo ren, “I am Chinese”, the software detects the meaning and picks the right characters.) With less and less need to recall the characters cold, the Chinese are forgetting them
  • As long as China keeps the character-based system—which will probably be a long time, thanks to cultural attachment and practical concerns alike—Chinese is very unlikely to become a true world language, an auxiliary language like English, the language a Brazilian chemist will publish papers in, hoping that they will be read in Finland and Canada. By all means, if China is your main interest, for business or pleasure, learn Chinese. It is fascinating, and learnable—though Moser’s online essay, “Why Chinese is so damn hard,” might discourage the faint of heart and the short of time.
  • But if I was asked what foreign language is the most useful, and given no more parameters (where? for what purpose?), my answer would be French. Whatever you think of France, the language is much less limited than many people realise.
  • French ranks only 16th on the list of languages ranked by native speakers. But ranked above it are languages like Telugu and Javanese that no one would call world languages. Hindi does not even unite India. Also in the top 15 are Arabic, Spanish and Portuguese, major languages to be sure, but regionally concentrated. If your interest is the Middle East or Islam, by all means learn Arabic. If your interest is Latin America, Spanish or Portuguese is the way to go. Or both; learning one makes the second quite easy.
  • if you want another truly global language, there are surprisingly few candidates, and for me French is unquestionably top of the list. It can enhance your enjoyment of art, history, literature and food, while giving you an important tool in business and a useful one in diplomacy. It has native speakers in every region on earth. And lest we forget its heartland itself, France attracts more tourists than any other country—76.8m in 2010, according to the World Tourism Organisation, leaving America a distant second with 59.7m
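The pinyin input flow described in the annotations above can be sketched in a few lines of Python. This is a minimal illustration, not a real input method: the candidate table is a tiny hypothetical sample, and actual IME software uses large frequency-ranked dictionaries plus statistical models to pick characters for whole phrases.

```python
# Hypothetical mini pinyin input method: map romanized syllables to
# candidate Chinese characters, most common first.
CANDIDATES = {
    "wo": ["我", "窝", "卧"],
    "shi": ["是", "十", "时"],
    "zhongguo": ["中国"],
    "ren": ["人", "仁", "任"],
}

def suggest(pinyin: str) -> list[str]:
    """Return the menu of candidate characters for one pinyin syllable/word,
    as when a user types 'wo' and is offered every character pronounced wo."""
    return CANDIDATES.get(pinyin, [])

def convert(phrase: str) -> str:
    """Naively pick the top candidate for each space-separated syllable,
    mimicking software that turns 'wo shi zhongguo ren' into characters."""
    return "".join(suggest(p)[0] for p in phrase.split() if suggest(p))

print(suggest("wo"))                   # menu of characters pronounced "wo"
print(convert("wo shi zhongguo ren"))  # "I am Chinese"
```

Because the user only recognizes the right character from a menu rather than recalling how to write it, production memory for characters atrophies — which is the point the excerpt makes.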
Javier E

Clear Your Google Web History - Wired How-To Wiki - 0 views

  • On March 1st, 2012, Google will implement a new, unified privacy policy. The new policy is retroactive, meaning it will affect any data Google has collected on you prior to that date, as well as any data it gathers afterward.
  • Basically, under the new policy, your Google Web History (all of your searches and the sites you clicked through to) can be combined with other data Google has gathered about you from other services — Gmail, Google+, etc.
  • If you'd like to keep your personal data a good distance away from Google, you'll need to delete your existing search history and prevent Google from using that history in the future.
  • This will not stop Google from gathering data when you search. To do that you would need to block Google cookies completely. However, while Google will still gather the data, it will not use it to serve targeted ads or for anything other than internal purposes. Also, with Web History disabled, your data is at least partially anonymized after 18 months (if you leave Web History on, Google will keep your search records indefinitely).
  • First sign into your Google account and head to the history page. Click the button labeled Remove all Web History. Then click Okay to confirm. Note that this also pauses your web history going forward, and Google won't start listening to your history again unless you let it.
  • in this case, however, a little time spent changing your settings can provide invaluable peace of mind knowing that Google can't exploit your personal tendencies for its own purposes.
  • On the negative side, bear in mind that while this won't prevent Google from making search suggestions, it will prevent you from getting personalized suggestions based on your previous searches.
Javier E

The Poverty of an Idea - NYTimes.com - 1 views

  • THE libertarian writer Charles Murray has probably done more than any other contemporary thinker to keep alive the idea of a “culture of poverty,” the theory that poor people are trapped by distorted norms and aspirations and not merely material deprivation.
  • Harrington had picked up the idea of a “culture of poverty” from the anthropologist Oscar Lewis, whose 1959 study of Mexican slum dwellers identified a “subculture” of lowered aspirations and short-term gratification. Echoing Lewis, Harrington argued that American poverty constituted “a separate culture, another nation, with its own way of life.” It would not be solved merely by economic expansion or moral exhortation, he contended, but by a “comprehensive assault on poverty.”
  • In his view, these problems were not a judgment on the poor as individuals, but on a society indifferent to their plight. His popularization of the phrase “culture of poverty” had unintended consequences. There was nothing in the “vicious circle” of pathology he sketched that was culturally determined, but in the hands of others, the idea came to signify an ingrained system of norms passed from generation to generation.
  • Conservatives took the attitudes and behaviors Harrington saw as symptoms of poverty and portrayed them as its direct causes.
  • In his 1984 book, “Losing Ground,” Mr. Murray argued that welfare programs abet rather than ameliorate poverty. The book dismissed Harrington’s prescription for ending poverty, and Harrington returned the favor. In “The New American Poverty,” published the same year, he called Mr. Murray the right-wing equivalent of a “vulgar Marxist,” a social theorist who believed in a “one-to-one relationship between the economic and the political or the psychological.”
  • Harrington’s culture-of-poverty thesis was at best an ambiguous impediment to understanding — in later books, he made no use of the term. But in its moral clarity, “The Other America” was ultimately optimistic; it was less an indictment and more an appeal to Americans to live up to their better instincts.
Javier E

Young Women Often Trendsetters in Vocal Patterns - NYTimes.com - 0 views

  • vocal trends associated with young women are often seen as markers of immaturity or even stupidity.
  • such thinking is outmoded. Girls and women in their teens and 20s deserve credit for pioneering vocal trends and popular slang, they say, adding that young women use these embellishments in much more sophisticated ways than people tend to realize.
  • they’re not just using them because they’re girls. They’re using them to achieve some kind of interactional and stylistic end.”
  • “The truth is this: Young women take linguistic features and use them as power tools for building relationships.”
  • women tend to be maybe half a generation ahead of males on average.”
  • Less clear is why. Some linguists suggest that women are more sensitive to social interactions and hence more likely to adopt subtle vocal cues. Others say women use language to assert their power in a culture that, at least in days gone by, asked them to be sedate and decorous. Another theory is that young women are simply given more leeway by society to speak flamboyantly.
  • Several studies have shown that uptalk can be used for any number of purposes, even to dominate a listener.
  • by far the most common uptalkers were fathers of young women. For them, it was “a way of showing themselves to be friendly and not asserting power in the situation,” she said.
  • So what does the use of vocal fry denote?
  • a natural result of women’s lowering their voices to sound more authoritative. It can also be used to communicate disinterest, something teenage girls are notoriously fond of doing.
Javier E

What Do You Mean by 'Love'? | Experts' Corner | Big Think - 1 views

  • people use the same word to mean so many different things. One thing I can say for sure is that we tend to use the word love for that which we feel strongly about. But we use it both to refer to personal preferences and to point to that which we consider to be the most meaningful in human experience
  • We should all aspire to know what love is in its many multifaceted manifestations. But while striving to embrace the full spectrum of the human experience, we want to do so within a hierarchy of spiritually informed values—values that always make it clear what is more important than anything else. 
Javier E

Our Dangerous Inability to Agree on What is TRUE | Risk: Reason and Reality | Big Think - 2 views

  • Given that human cognition is never the product of pure dispassionate reason, but a subjective interpretation of the facts based on our feelings and biases and instincts, when can we ever say that we know who is right and who is wrong, about anything? When can we declare a fact so established that it’s fair to say, without being called arrogant, that those who deny this truth don’t just disagree…that they’re just plain wrong?
  • This isn’t about matters of faith, or questions of ultimately unknowable things which by definition cannot be established by fact. This is a question about what is knowable, and provable by careful objective scientific inquiry, a process which includes challenging skepticism rigorously applied precisely to establish what, beyond any reasonable doubt, is in fact true.
  • With enough careful investigation and scrupulously challenged evidence, we can establish knowable truths that are not just the product of our subjective motivated reasoning.
  • This matters for social animals like us, whose safety and very survival ultimately depend on our ability to coexist. Views that have more to do with competing tribal biases than objective interpretations of the evidence create destructive and violent conflict. Denial of scientifically established ‘truth’ causes all sorts of serious direct harms. Consider a few examples:
    • The widespread faith-based rejection of evolution feeds intense polarization.
    • Continued fear of vaccines is allowing nearly eradicated diseases to return.
    • Those who deny the evidence of the safety of genetically modified food are also denying the immense potential benefits of that technology to millions.
    • Denying the powerful evidence for climate change puts us all in serious jeopardy should that evidence prove to be true.
  • To address these harms, we need to understand why we often have trouble agreeing on what is true (what some have labeled science denialism). Social science has taught us that human cognition is innately, and inescapably, a process of interpreting the hard data about our world – its sights and sounds and smells and facts and ideas - through subjective affective filters that help us turn those facts into the judgments and choices and behaviors that help us survive. The brain’s imperative, after all, is not to reason. Its job is survival, and subjective cognitive biases and instincts have developed to help us make sense of information in the pursuit of safety, not so that we might come to know ‘THE universal absolute truth’
  • This subjective cognition is built-in, subconscious, beyond free will, and unavoidably leads to different interpretations of the same facts.
  • But here is a truth with which I hope we can all agree. Our subjective system of cognition can be dangerous.
  • It can produce perceptions that conflict with the evidence, what I call The Perception Gap, which can in turn produce profound harm
  • We need to recognize the greater threat that our subjective system of cognition can pose, and in the name of our own safety and the welfare of the society on which we depend, do our very best to rise above it or, when we can’t, account for this very real danger in the policies we adopt.
  • "Everyone engages in motivated reasoning, everyone screens out unwelcome evidence, no one is a fully rational actor. Sure. But when it comes to something with such enormous consequences to human welfare
  • I think it's fair to say we have an obligation to confront our own ideological priors. We have an obligation to challenge ourselves, to push ourselves, to be suspicious of conclusions that are too convenient, to be sure that we're getting it right.
Javier E

Wikipedia and the Meaning of Truth - Technology Review - 0 views

  • Why the online encyclopedia's epistemology should worry those who care about traditional notions of accuracy.
  • With little notice from the outside world, the community-written encyclopedia Wikipedia has redefined the commonly accepted use of the word "truth."