
Home/ TOK Friends/ Group items tagged enlightenment


Javier E

The Irrational Risk of Thinking We Can Be Rational About Risk | Risk: Reason and Realit... - 0 views

  • in the most precise sense of the word, facts are meaningless…just disconnected ones and zeroes in the computer until we run them through the software of how those facts feel
  • Of all the building evidence about human cognition that suggests we ought to be a little more humble about our ability to reason, no other finding has more significance, because Elliott teaches us that no matter how smart we like to think we are, our perceptions are inescapably a blend of reason and gut reaction, intellect and instinct, facts and feelings.
  • Because our perceptions rely as much as or more on feelings than simply on the facts, we sometimes get risk wrong. We’re more afraid of some risks than we need to be (child abduction, vaccines), and not as afraid of some as we ought to be (climate change, particulate air pollution), and that “Perception Gap” can be a risk in and of itself
  • There are more than a dozen of these risk perception factors (see Ch. 3 of “How Risky Is It, Really? Why Our Fears Don’t Match the Facts,” available online free at)
  • many people, particularly intellectuals and academics and policy makers, maintain a stubborn post-Enlightenment confidence in the supreme power of rationality. They continue to believe that we can make the ‘right’ choices about risk based on the facts, that with enough ‘sound science’ evidence from toxicology and epidemiology and cost-benefit analysis, the facts will reveal THE TRUTH. At best this confidence is hopeful naivete. At worst, it is intellectual arrogance that denies all we’ve learned about the realities of human cognition. In either case, it’s dangerous
  • We must understand that instinct and intellect are interwoven components of a single system that helps us perceive the world and make our judgments and choices, a system that worked fine when the risks we faced were simpler but which can make dangerous mistakes as we try to figure out some of the more complex dangers posed in our modern world.
  • What we can do to avoid the dangers that arise when our fears don’t match the facts—the most rational thing to do—is, first, to recognize that our risk perceptions can never be purely objectively perfectly 'rational', and that our subjective perceptions are prone to potentially dangerous mistakes.
  • Then we can begin to apply all the details we've discovered of how our risk perception system works, and use that knowledge and self-awareness to make wiser, more informed, healthier choices
Javier E

Can Dostoevsky's "Notes from Underground" Still Kick You in the Gut? : The New Yorker - 0 views

  • You can easily imagine what Dostoevsky would make of modern sociology, psychology, advertising techniques, war games, polling of any sort. What’s wrong with such techniques, in both their cynical or ameliorative uses, was simply stated by Sartre, in 1945: “All materialist philosophies create man as an object, a stone.” The underground man says that, on the contrary, human beings are unfathomable, unknowable.
  • Predictors of human behavior, as the underground man says, generally assume we will act in our own best interests. But do we? The same question might be asked today, when “rational-choice theory” is still a predictive model for economists and sociologists and many others. When working-class whites vote for Republican policies that will further reduce their economic power—are they voting in their best interests? What about wealthy liberals in favor of higher taxes on the rich? Do people making terrible life choices—say, poor women having children with unreliable men—act in their best interests? Do they calculate at all? What if our own interest, as we construe it, consists of refusing what others want of us? That motive can’t be measured. It can’t even be known, except by novelists like Dostoevsky. Reason is only one part of our temperament, the underground man says. Individualism as a value includes the right to screw yourself up.
Javier E

AMA: How a Weird Internet Thing Became a Mainstream Delight - Alexis C. Madrigal - The ... - 0 views

  • hundreds of people have offered themselves up to be interrogated via Reddit's crowdsourced question-and-answer sessions. They open a new thread on the social network and say, for example, "IamA nanny for a super-rich family in China AMA!"
  • Then, the assembled Redditors ask whatever they want. Questions are voted up and down, and generally speaking, the most popular ones get answered. These interviews can last for as little as an hour or go on for several days. Politicians tend to play things pretty straight, but the regular people and niche celebrities tend to open up in fascinating ways.
  • Over the last several years, the IamA subreddit has gone from interesting curiosity to a juggernaut of a media brand. Its syntax and abbreviations have invaded the public consciousness like Wired's aged Wired/Tired/Expired rubric. It's a common Twitter joke now to say, "I [did something commonplace], ask me anything." 
  • Reddit was about to become the preeminent place for "real 'expert'" AMAs that were extremely useful and enlightening.
  • AMAs among common folk focus on dishing on what sex, disease, or jobs are really like. The celebrity versions borrow the same idea, but they serve up inside information on celebrity itself (generally speaking) or politics itself. 
  • The AMA is supposed to expose the mechanism. The AMA is about exposing the "inside conversations." The AMA is like the crowdsourced version of those moments when Kevin Spacey turns to the camera in House of Cards and breaks things down. 
Javier E

The Smart Set: I Have My Reasons - May 24, 2011 - 0 views

  • Wasn’t the Enlightenment supposed to wash the world of its sins of superstition and religion? And yet humanity keeps clinging to its belief systems, its religious leaders, and its prayer. More than that, we’re dipping back into the magical realms — one would think that if superstition were to be eradicated through the power of reason and rationality, magic would be the first to go. It turns out our hunger for the irrational and the intuitive is more insatiable than previously assumed. We have our Kabbalah, our Chaos Magick, our Druids. We have our mystics and tarot card readers and our astrologers on morning news shows explaining why Kate and William are a match made by the gods. Wicca is a fast growing religion in the United States, and my German health insurance covers homeopathy and Reiki massage, both of which have always felt more like magic than science to me.
  • And yet the atheists keep on, telling us that we don’t have to believe in God. It may never have occurred to them that perhaps we want to.
  • Magical belief, whether that entails an omnipotent God who watches over us or the conviction that we can communicate with “the other side,” fills a need in us. Some of us, I should say, as atheists would be quick to counter. How seriously we take that belief, and what we do with it, varies from person to person. As the debates between the godless and the faithful continue — and these are so prevalent in our culture now, the religious figure versus the atheist, sponsored by every university and cultural center, that I recently read such a debate in a novel I had picked up — it seems that perhaps these two kinds of people simply have a different set of needs. It’s like a new mother arguing with a woman who has never felt maternal a day in her life. Neither side will ever truly understand the longings of the other, and the fact that they can’t stop arguing and trying to convince is proof of that.
Javier E

Scholarship and Politics - The Case of Noam Chomsky - NYTimes.com - 0 views

  • (1) The academy is a world of its own, complete with rules, protocols, systems of evaluation, recognized achievements, agreed-on goals, a roster of heroes and a list of tasks yet to be done.
  • (2) Academic work proceeds within the confines of that world, within, that is, a professional, not a public, space, although its performance may be, and often is, public.
  • (3) academic work is only tangentially, not essentially, political; politics may attend the formation of academic units and the selection of academic personnel, but political concerns and pressures have no place in the unfolding of academic argument, except as objects of its distinctive forms of attention
  • (4) The academic views of a professor are independent of his or her real-world political views; academic disputes don’t track partisan disputes or vice versa; you can’t reason from an academic’s disciplinary views to the positions he or she would take in the public sphere; they are independent variables.
  • Chomsky gave three lectures under the general title “What Kind of Creatures are We?”
  • The answer given in the first lecture — “What is Language?” — is that we are creatures with language, and that language as a uniquely human biological capacity appeared suddenly and quite late in the evolutionary story, perhaps 75,000 years ago.
  • Language, then, does not arise from the social/cultural environment, although the environment provides the stuff or input it works on. That input is “impoverished”; it can’t account for the creativity of language performance, which has its source not in the empirical world, but in an innate ability that is more powerful than the stimuli it utilizes and plays with. It follows that if you want to understand language, you shouldn’t look to linguistic behavior but to the internal mechanism — the Universal Grammar — of which particular linguistic behaviors are a non-exhaustive expression. (The capacity exceeds the empirical resources it might deploy.)
  • In his second lecture (“What Can We Understand?”), Chomsky took up the question of what humans are capable of understanding and his answer, generally, was that we can understand what we can understand, and that means that we can’t understand what is beyond our innate mental capacities
  • This does not mean, he said, that what we can’t understand is not real: “What is mysterious to me is not an argument that it does not exist.” It’s just that while language is powerful and creative, its power and creativity have limits; and since language is thought rather than an addition to or clothing of thought, the limits of language are the limits of what we can fruitfully think about
  • This is as good as it gets. There is “no evolution in our capacity for language.”
  • These assertions are offered as a counter to what Chomsky sees as the over-optimistic Enlightenment belief — common to many empiricist philosophies — that ours is a “limitless explanatory power” and that “we can do anything.”
  • In the third lecture (“What is the Common Good?”) Chomsky turned from the philosophy of mind and language to political philosophy and the question of what constitutes a truly democratic society
  • He likened dogmatic intellectual structures that interfere with free inquiry to coercive political structures that stifle the individual’s creative independence and fail to encourage humanity’s “richest diversity
  • He asserted that any institution marked by domination and hierarchy must rise to the challenge of justifying itself, and if it cannot meet the challenge, it should be dismantled.
  • He contrasted two accounts of democracy: one — associated by him with James Madison — distrusts the “unwashed” populace and puts its faith in representative government where those doing the representing (and the voting and the distributing of goods) constitute a moneyed and propertied elite
  • the other — associated by him with Adam Smith (in one of his moods), J. S. Mill, the 1960s and a tradition of anarchist writing — seeks to expand the franchise and multiply choices in the realms of thought, politics and economics. The impulse of this second, libertarian, strain of democracy, is “to free society from economic or theological guardianship,” and by “theological” Chomsky meant not formal religion as such but any assumed and frozen ideology that blocked inquiry and limited participation. There can’t, in short, be “too much democracy.”
  • It was thought of the highest order performed by a thinker, now 85 years old, who by and large eschewed rhetorical flourishes (he has called his own speaking style “boring” and says he likes it that way) and just did it, where “it” was the patient exploration of deep issues that had been explored before him by a succession of predecessors, fully acknowledged, in a conversation that is forever being continued and forever being replenished.
  • Yes, I said to myself, this is what we — those of us who bought a ticket on this particular train — do; we think about problems and puzzles and try to advance the understanding of them; and we do that kind of thinking because its pleasures are, in a strong sense, athletic and provide for us, at least on occasion, the experience of fully realizing whatever capabilities we might have. And we do it in order to have that experience, and to share it with colleagues and students of like mind, and not to make a moral or political point.
  • The term “master class” is a bit overused, but I feel no hesitation in using it here. It was a master class taught by a master, and if someone were to ask me what exactly is it that academics do, I would point to these lectures and say, simply, here it is, the thing itself.
Javier E

The Unnecessary Conflict Between Evangelicalism and Science - Forbes - 1 views

  • throughout human history, people have generally understood there to be two kinds of truth: logos - the truth of reason, logic, practical experience, and science – and mythos - the stories of the sacred and divine that reveal truths about the world, but not literal truths. They’re the truths revealed by art, prayer, philosophy and religion that represent the symbolic, transcendent meaning in our life.
  • with the success of the Enlightenment, science, and our modern culture, we seem to have discarded the idea of mythos as part of our mainstream culture. As a consequence, there are those of faith who confuse mythos with logos - that is, they read a story or sacred text and interpret what is intended to be a symbolic aspect of spiritual life and treat it literally, as though the story happened historically or happened exactly as described. And in rejecting religious belief, a lot of atheists make the same mistake.
  • understanding the distinction between these truths of mythos and logos points the way towards realizing the compatibility of scientific and religious thought. We need them both. They don’t have to be enemies, as they represent different aspects of the human search for truth. You don’t have to believe there’s a God to see wisdom in the Bible, or believe in Brahman to be moved by the poetry of the Vedas. Likewise, you don’t have to give up your belief in God to understand the wonder and complexity of evolution, or delight in the counter-intuitive math of quantum mechanics.
  • “All religions, arts and sciences are branches of the same tree. All these aspirations are directed toward ennobling man’s life, lifting it from the sphere of mere physical existence and leading the individual towards freedom. It is no mere chance that our older universities developed from clerical schools. Both churches and universities — insofar as they live up to their true function — serve the ennoblement of the individual.”
Javier E

The Peril of Knowledge Everywhere - NYTimes.com - 1 views

  • Are there things we should try not to know?
  • IBM says that 2.5 quintillion bytes of data are created each day. That is a number both unimaginable and somewhat unhelpful to real understanding. It’s not just the huge scale of the information, after all, it’s the novel types of data
  • many participants expressed concern about the effects all this data would have on the ability of powerful institutions to control people, from state coercion to product marketing.
  • If we want protection from the world we’re building, perhaps we’re asking that the algorithm wielders choose not to know things, despite their being true. To some, that may be a little like the 1616 order by the Catholic Church that Galileo cease from teaching or discussing the idea that the Earth moves around the sun.
  • one bit here and another there, both innocuous, may reveal something personal that is hidden perhaps even from myself.
  • Since then, we have been living in something closer to the spirit of the 18th-century Enlightenment, when all forms of knowledge were acceptable, and learning was a good in its own right. Regulation has been based on actions, not on knowledge.
  • the situation may be something like a vastly more difficult version of laws against redlining
  • we are also entering a new world where individuals can be as powerful as institutions. That phone gives Big Brother lots of data goodies, but it can also have access to its own pattern-finding algorithms, and publish those findings to the world.
Javier E

The New York Times > Magazine > In the Magazine: Faith, Certainty and the Presidency of... - 0 views

  • The Delaware senator was, in fact, hearing what Bush's top deputies -- from cabinet members like Paul O'Neill, Christine Todd Whitman and Colin Powell to generals fighting in Iraq -- have been told for years when they requested explanations for many of the president's decisions, policies that often seemed to collide with accepted facts. The president would say that he relied on his ''gut'' or his ''instinct'' to guide the ship of state, and then he ''prayed over it.''
  • What underlies Bush's certainty? And can it be assessed in the temporal realm of informed consent?
  • Top officials, from cabinet members on down, were often told when they would speak in Bush's presence, for how long and on what topic. The president would listen without betraying any reaction. Sometimes there would be cross-discussions -- Powell and Rumsfeld, for instance, briefly parrying on an issue -- but the president would rarely prod anyone with direct, informed questions.
  • This is one key feature of the faith-based presidency: open dialogue, based on facts, is not seen as something of inherent value. It may, in fact, create doubt, which undercuts faith. It could result in a loss of confidence in the decision-maker and, just as important, by the decision-maker.
  • has spent a lot of time trying to size up the president. ''Most successful people are good at identifying, very early, their strengths and weaknesses, at knowing themselves,'' he told me not long ago. ''For most of us average Joes, that meant we've relied on strengths but had to work on our weakness -- to lift them to adequacy -- otherwise they might bring us down. I don't think the president really had to do that, because he always had someone there -- his family or friends -- to bail him out. I don't think, on balance, that has served him well for the moment he's in now as president. He never seems to have worked on his weaknesses.''
  • Details vary, but here's the gist of what I understand took place. George W., drunk at a party, crudely insulted a friend of his mother's. George senior and Barbara blew up. Words were exchanged along the lines of something having to be done. George senior, then the vice president, dialed up his friend, Billy Graham, who came to the compound and spent several days with George W. in probing exchanges and walks on the beach. George W. was soon born again. He stopped drinking, attended Bible study and wrestled with issues of fervent faith. A man who was lost was saved.
  • Rubenstein described that time to a convention of pension managers in Los Angeles last year, recalling that Malek approached him and said: ''There is a guy who would like to be on the board. He's kind of down on his luck a bit. Needs a job. . . . Needs some board positions.'' Though Rubenstein didn't think George W. Bush, then in his mid-40's, ''added much value,'' he put him on the Caterair board. ''Came to all the meetings,'' Rubenstein told the conventioneers. ''Told a lot of jokes. Not that many clean ones. And after a while I kind of said to him, after about three years: 'You know, I'm not sure this is really for you. Maybe you should do something else. Because I don't think you're adding that much value to the board. You don't know that much about the company.' He said: 'Well, I think I'm getting out of this business anyway. And I don't really like it that much. So I'm probably going to resign from the board.' And I said thanks. Didn't think I'd ever see him again.''
  • challenges -- from either Powell or his opposite number as the top official in domestic policy, Paul O'Neill -- were trials that Bush had less and less patience for as the months passed. He made that clear to his top lieutenants. Gradually, Bush lost what Richard Perle, who would later head a largely private-sector group under Bush called the Defense Policy Board Advisory Committee, had described as his open posture during foreign-policy tutorials prior to the 2000 campaign. (''He had the confidence to ask questions that revealed he didn't know very much,'' Perle said.) By midyear 2001, a stand-and-deliver rhythm was established. Meetings, large and small, started to take on a scripted quality.
  • That a deep Christian faith illuminated the personal journey of George W. Bush is common knowledge. But faith has also shaped his presidency in profound, nonreligious ways. The president has demanded unquestioning faith from his followers, his staff, his senior aides and his kindred in the Republican Party. Once he makes a decision -- often swiftly, based on a creed or moral position -- he expects complete faith in its rightness.
  • A cluster of particularly vivid qualities was shaping George W. Bush's White House through the summer of 2001: a disdain for contemplation or deliberation, an embrace of decisiveness, a retreat from empiricism, a sometimes bullying impatience with doubters and even friendly questioners.
  • By summer's end that first year, Vice President Dick Cheney had stopped talking in meetings he attended with Bush. They would talk privately, or at their weekly lunch. The president was spending a lot of time outside the White House, often at the ranch, in the presence of only the most trustworthy confidants.
  • ''When I was first with Bush in Austin, what I saw was a self-help Methodist, very open, seeking,'' Wallis says now. ''What I started to see at this point was the man that would emerge over the next year -- a messianic American Calvinist. He doesn't want to hear from anyone who doubts him.''
  • I had a meeting with a senior adviser to Bush. He expressed the White House's displeasure, and then he told me something that at the time I didn't fully comprehend -- but which I now believe gets to the very heart of the Bush presidency.
  • The aide said that guys like me were ''in what we call the reality-based community,'' which he defined as people who ''believe that solutions emerge from your judicious study of discernible reality.'' I nodded and murmured something about enlightenment principles and empiricism. He cut me off. ''That's not the way the world really works anymore,'' he continued. ''We're an empire now, and when we act, we create our own reality. And while you're studying that reality -- judiciously, as you will -- we'll act again, creating other new realities, which you can study too, and that's how things will sort out. We're history's actors . . . and you, all of you, will be left to just study what we do.''
  • ''If you operate in a certain way -- by saying this is how I want to justify what I've already decided to do, and I don't care how you pull it off -- you guarantee that you'll get faulty, one-sided information,'' Paul O'Neill, who was asked to resign his post of treasury secretary in December 2002, said when we had dinner a few weeks ago. ''You don't have to issue an edict, or twist arms, or be overt.''
  • George W. Bush and his team have constructed a high-performance electoral engine. The soul of this new machine is the support of millions of likely voters, who judge his worth based on intangibles -- character, certainty, fortitude and godliness -- rather than on what he says or does.
Javier E

How the Internet Gets Inside Us : The New Yorker - 0 views

  • It isn’t just that we’ve lived one technological revolution among many; it’s that our technological revolution is the big social revolution that we live with
  • The idea, for instance, that the printing press rapidly gave birth to a new order of information, democratic and bottom-up, is a cruel cartoon of the truth. If the printing press did propel the Reformation, one of the biggest ideas it propelled was Luther’s newly invented absolutist anti-Semitism. And what followed the Reformation wasn’t the Enlightenment, a new era of openness and freely disseminated knowledge. What followed the Reformation was, actually, the Counter-Reformation, which used the same means—i.e., printed books—to spread ideas about what jerks the reformers were, and unleashed a hundred years of religious warfare.
  • Robert K. Logan’s “The Sixth Language,” begins with the claim that cognition is not a little processing program that takes place inside your head, Robby the Robot style. It is a constant flow of information, memory, plans, and physical movements, in which as much thinking goes on out there as in here. If television produced the global village, the Internet produces the global psyche: everyone keyed in like a neuron, so that to the eyes of a watching Martian we are really part of a single planetary brain. Contraptions don’t change consciousness; contraptions are part of consciousness.
  • In a practical, immediate way, one sees the limits of the so-called “extended mind” clearly in the mob-made Wikipedia, the perfect product of that new vast, supersized cognition: when there’s easy agreement, it’s fine, and when there’s widespread disagreement on values or facts, as with, say, the origins of capitalism, it’s fine, too; you get both sides. The trouble comes when one side is right and the other side is wrong and doesn’t know it. The Shakespeare authorship page and the Shroud of Turin page are scenes of constant conflict and are packed with unreliable information. Creationists crowd cyberspace every bit as effectively as evolutionists, and extend their minds just as fully. Our trouble is not the over-all absence of smartness but the intractable power of pure stupidity, and no machine, or mind, seems extended enough to cure that.
  • “The medium does matter,” Carr has written. “As a technology, a book focuses our attention, isolates us from the myriad distractions that fill our everyday lives. A networked computer does precisely the opposite. It is designed to scatter our attention. . . . Knowing that the depth of our thought is tied directly to the intensity of our attentiveness, it’s hard not to conclude that as we adapt to the intellectual environment of the Net our thinking becomes shallower.”
  • when people struggle to describe the state that the Internet puts them in they arrive at a remarkably familiar picture of disassociation and fragmentation. Life was once whole, continuous, stable; now it is fragmented, multi-part, shimmering around us, unstable and impossible to fix.
  • The odd thing is that this complaint, though deeply felt by our contemporary Better-Nevers, is identical to Baudelaire’s perception about modern Paris in 1855, or Walter Benjamin’s about Berlin in 1930, or Marshall McLuhan’s in the face of three-channel television (and Canadian television, at that) in 1965.
  • If all you have is a hammer, the saying goes, everything looks like a nail; and, if you think the world is broken, every machine looks like the hammer that broke it.
  • Blair argues that the sense of “information overload” was not the consequence of Gutenberg but already in place before printing began.
  • Anyway, the crucial revolution was not of print but of paper: “During the later Middle Ages a staggering growth in the production of manuscripts, facilitated by the use of paper, accompanied a great expansion of readers outside the monastic and scholastic contexts.” For that matter, our minds were altered less by books than by index slips. Activities that seem quite twenty-first century, she shows, began when people cut and pasted from one manuscript to another; made aggregated news in compendiums; passed around précis. “Early modern finding devices” were forced into existence: lists of authorities, lists of headings.
  • The book index was the search engine of its era, and needed to be explained at length to puzzled researchers—as, for that matter, did the Hermione-like idea of “looking things up.” That uniquely evil and necessary thing, the comprehensive review of many different books on a related subject, with the necessary oversimplification of their ideas that it demanded, was already around in 1500, and already being accused of missing all the points.
  • at any given moment, our most complicated machine will be taken as a model of human intelligence, and whatever media kids favor will be identified as the cause of our stupidity. When there were automatic looms, the mind was like an automatic loom; and, since young people in the loom period liked novels, it was the cheap novel that was degrading our minds. When there were telephone exchanges, the mind was like a telephone exchange, and, in the same period, since the nickelodeon reigned, moving pictures were making us dumb. When mainframe computers arrived and television was what kids liked, the mind was like a mainframe and television was the engine of our idiocy. Some machine is always showing us Mind; some entertainment derived from the machine is always showing us Non-Mind.
  • What we live in is not the age of the extended mind but the age of the inverted self. The things that have usually lived in the darker recesses or mad corners of our mind—sexual obsessions and conspiracy theories, paranoid fixations and fetishes—are now out there: you click once and you can read about the Kennedy autopsy or the Nazi salute or hog-tied Swedish flight attendants. But things that were once external and subject to the social rules of caution and embarrassment—above all, our interactions with other people—are now easily internalized, made to feel like mere workings of the id left on its own.
  • A social network is crucially different from a social circle, since the function of a social circle is to curb our appetites and of a network to extend them.
  • And so the peacefulness, the serenity that we feel away from the Internet, and which all the Better-Nevers rightly testify to, has less to do with being no longer harried by others than with being less oppressed by the force of your own inner life. Shut off your computer, and your self stops raging quite as much or quite as loud.
  • Now television is the harmless little fireplace over in the corner, where the family gathers to watch “Entourage.” TV isn’t just docile; it’s positively benevolent. This makes you think that what made television so evil back when it was evil was not its essence but its omnipresence. Once it is not everything, it can be merely something. The real demon in the machine is the tirelessness of the user.
  • the Internet screen has always been like the palantír in Tolkien’s “Lord of the Rings”—the “seeing stone” that lets the wizards see the entire world. Its gift is great; the wizard can see it all. Its risk is real: evil things will register more vividly than the great mass of dull good. The peril isn’t that users lose their knowledge of the world. It’s that they can lose all sense of proportion. You can come to think that the armies of Mordor are not just vast and scary, which they are, but limitless and undefeatable, which they aren’t.
Javier E

The Fortnightly Review › Death to the Reading Class. - 0 views

  • most people don’t want to read and, therefore, don’t read. The evidence on this score is clear: the average American reads for about fifteen minutes a day and almost never reads a book for pleasure.
  • we have tried to solve the reading “problem” by removing the most obvious impediments to reading: we taught everyone to read; we printed millions upon millions of books; and we made those books practically free in libraries. And so the barriers fell: now nearly everyone in the developed world is literate, there is plenty to read, and reading material is dirt cheap. But still people don’t read. Why? The obvious answer—though one that is difficult for us to admit—is that most people don’t like to read.
  • Humans achieved their modern form about 180,000 years ago; for 175,000 of those years they never wrote or read anything. About 40,000 years ago, humans began to make symbols,
  • ...12 more annotations...
  • Most people successfully avoided reading until about 300 years ago. It was about then that Western European priests and princes decided that everyone should be taught to read. These literacy-loving types tried various schemes to make common folks literate; the most effective of these, however, was naked coercion. By the nineteenth century, churches and states all over Europe and North America were forcing parents to send their kids to school to learn to read
  • So it happened that by the early twentieth century most people in Western Europe and North America could read. They had no choice in the matter. They still don’t.
  • Why don’t most people like to read? The answer is surprisingly simple: humans weren’t evolved to read. Note that we have no reading organs: our eyes and brains were made for watching, not for decoding tiny symbols on mulch sheets. To prepare our eyes and brains for reading, we must rewire them. This process takes years of hard work to accomplish, and some people never accomplish it at all. Moreover, even after you’ve learned to read, you probably won’t find reading to be very much fun. It consumes all of your attention, requires active thought, and makes your eyes hurt. For most people, then, reading is naturally hard and, therefore, something to be avoided if at all possible.
  • we have misidentified the “problem” facing us: it is not the much-bemoaned reading gap, but rather a seldom-mentioned knowledge gap. Though it is immodest to say, we readers genuinely know more than those who do not read. Thus we are usually able to make better-informed decisions than non-readers can.
  • If we lived in an aristocracy of readers, this maldistribution of knowledge might be acceptable. But we don’t; rather, we live in a democracy (if we are lucky). In a democracy, the people – readers and non-readers alike – decide. Thus we would like all citizens to be knowledgeable so that they can make well-informed decisions about our common affairs. This has been a central goal of the Reading Class since the literacy-loving Enlightenment.
  • If we in the Reading Class want to teach the reading-averse public more effectively than we have in the past, we must rid ourselves of our reading fetish and admit that we’ve been falling down on the job. Once we take this painful step, then a number of interesting options for closing the knowledge gap become available. The most promising of these options is using audio and video to share what we know with the public at large.
  • We have to laboriously learn to read, but we are born with the ability to watch and listen. We don’t find reading terribly pleasant, but we do find watching and listening generally enjoyable.
  • The results of this “natural experiment” are in: people would much rather watch/listen than read. This is why Americans sit in front of the television for three hours a day, while they read for only a tiny fraction of that time.
  • Our task, then, is to give them something serious to watch and listen to, something that conveys the richness and complexity of our written work in pictures and sounds. The good news is that we can easily do this.
  • Today any lecturer can produce and distribute high quality audio and video programs. Most scholars have the equipment on their desks (that is, a PC). The software is dead simple and inexpensive. And the shows themselves can be distributed the world over on the Internet for almost nothing.
  • I’ve done it. Here are two examples. The first is New Books in History, an author-interview podcast featuring historians with new books. Aside from the computer, the total hardware and software start-up costs were roughly $300. It took me no time to learn the software thanks to some handy on-line tutorials available on Lynda.com. Today New Books in History has a large international audience.
  • The “new books” podcasts are not about serious books; they are about the ideas trapped in those serious, and seriously un-read, books. Books imprison ideas; the “new books” podcasts set them free.
Emilio Ergueta

Performance Is The Thing | Issue 57 | Philosophy Now - 1 views

  • The definition of philosophy is pretty much set – love of wisdom, the rational investigation of questions about existence and knowledge and ethics, the application of reason towards a more enlightened way of life. How can we understand performance in equally clear terms? Also, what are the responsibilities of the performer?
  • How can my performances enrich my life and the lives of my audiences? What is my own personal philosophy of performance?
  • Performance can fundamentally be said to be a transformation of ideas and dreams and all those other little understood human impulses into outward action. In this very basic sense performance happens with every word and gesture. It also presupposes a process of evaluation by a spectator.
  • ...7 more annotations...
  • There’s one very basic thing I learnt that day: never take your writing on stage on a flimsy bit of paper. Always use something hardbacked. Even if your hands start shaking, it won’t be so noticeable.
  • I determined that I was going to break through whatever it was that made me so anxious to stand up in front of a room full of people, and simply be. But I also learnt some other more complex philosophical lessons too, which became clearer over time. In fact, whatever philosophy I have developed about performance has stemmed from that moment.
  • Romantic aestheticians would have it that art, and by extension, performance, is a heightening of the common human activity of expressing emotions to the point where they are experienced and rendered lucid to the performer and audience in a way that is rarely seen in everyday life.
  • When I am performing, there’s a desire I can taste to bridge the gap in understanding between me and my audience. I want to find new ways, new language, verbal and non-verbal to express universal truths. I want to push the challenge of understanding deeper, for me and my audience.
  • I say ‘I am a writer, poet, performer,’ but this self-definition is in itself a kind of performance.
  • One philosophical problem is that performance on the stage of life doesn’t have a beginning, middle and end in the way it does in the theatre. So after a stage performance, when we discard the magnifying glass for the clutter of the quotidian, it’s all too easy to forget to zero in – to forget that we are still and are always performing, and that life is constantly a performance: that we are, in fact, only performing from moment to moment.
  • understanding process can add to our understanding of how we can live our lives for the better and, dare I say, the greater good.
Javier E

What Shamu Taught Me About a Happy Marriage - New York Times - 0 views

  • like many wives before me, I ignored a library of advice books and set about improving him. By nagging, of course, which only made his behavior worse: he'd drive faster instead of slower; shave less frequently, not more; and leave his reeking bike garb on the bedroom floor longer than ever.
  • For a book I was writing about a school for exotic animal trainers, I started commuting from Maine to California, where I spent my days watching students do the seemingly impossible: teaching hyenas to pirouette on command, cougars to offer their paws for a nail clipping, and baboons to skateboard.
  • The central lesson I learned from exotic animal trainers is that I should reward behavior I like and ignore behavior I don't. After all, you don't get a sea lion to balance a ball on the end of its nose by nagging. The same goes for the American husband.
  • ...3 more annotations...
  • I was using what trainers call "approximations," rewarding the small steps toward learning a whole new behavior. You can't expect a baboon to learn to flip on command in one session, just as you can't expect an American husband to begin regularly picking up his dirty socks by praising him once for picking up a single sock.
  • With Scott the husband, I began to praise every small act every time: if he drove just a mile an hour slower, tossed one pair of shorts into the hamper, or was on time for anything.
  • Enlightened trainers learn all they can about a species, from anatomy to social structure, to understand how it thinks, what it likes and dislikes, what comes easily to it and what doesn't.
Javier E

Is Everyone a Little Bit Racist? - NYTimes.com - 0 views

  • Research in the last couple of decades suggests that the problem is not so much overt racists. Rather, the larger problem is a broad swath of people who consider themselves enlightened, who intellectually believe in racial equality, who deplore discrimination, yet who harbor unconscious attitudes that result in discriminatory policies and behavior.
  • The player takes on the role of a police officer who is confronted with a series of images of white or black men variously holding guns or innocent objects such as wallets or cellphones. The aim is to shoot anyone with a gun while holstering your weapon in other cases. Ordinary players (often university undergraduates) routinely shoot more quickly at black men than at white men, and are more likely to mistakenly shoot an unarmed black man than an unarmed white man.
  • Correll has found no statistically significant difference between the play of blacks and that of whites in the shooting game.
  • ...4 more annotations...
  • “There’s a whole culture that promotes this idea of aggressive young black men,” Correll notes. “In our minds, young black men are associated with danger.”
  • One finding is that we unconsciously associate “American” with “white.” Thus, in 2008, some California college students — many who were supporting Barack Obama for president — unconsciously treated Obama as more foreign than Tony Blair, the former British prime minister.
  • an uncomfortable starting point is to understand that racial stereotyping remains ubiquitous, and that the challenge is not a small number of twisted white supremacists but something infinitely more subtle and complex: People who believe in equality but who act in ways that perpetuate bias and inequality.
  • Joshua Correll of the University of Colorado at Boulder has used an online shooter video game to try to measure these unconscious attitudes (you can play the game yourself).
aqconces

BBC - Future - Why do we pick our nose? - 0 views

  • Most of us do it, but few of us will admit to it. If we get caught red-handed, we experience shame and regret. And we tend to frown upon others when they do it in public.
  • Is nose-picking really all that bad? How prevalent or bad is it, really?
  • Andrade and Srihari compiled data from 200 teenagers. Nearly all of them admitted to picking their noses, on average four times per day. That's not all that enlightening; we knew this. But what is interesting are the patterns. Only 7.6% of students reported sticking their fingers into their noses more than 20 times each day, but nearly 20% thought they had a “serious nose-picking problem”. Most of them said they did it to relieve an itch or to clear out nasal debris, but 24 of them, i.e. 12%, admitted that they picked their nose because it felt good.
Javier E

In Defense of Naïve Reading - NYTimes.com - 1 views

  • Clearly, poems and novels and paintings were not produced as objects for future academic study; there is no a priori reason to think that they could be suitable objects of  “research.” By and large they were produced for the pleasure and enlightenment of those who enjoyed them.
  • But just as clearly, the teaching of literature in universities — especially after the 19th-century research model of Humboldt University of Berlin was widely copied — needed a justification consistent with the aims of that academic setting
  • The main aim was research: the creating and accumulation and transmission of knowledge. And the main model was the natural science model of collaborative research: define problems, break them down into manageable parts, create sub-disciplines and sub-sub-disciplines for the study of these, train students for such research specialties and share everything. With that model, what literature and all the arts needed was something like a general “science of meaning” that could eventually fit that sort of aspiration. Texts or art works could be analyzed as exemplifying and so helping establish such a science. Results could be published in scholarly journals, disputed by others, consensus would eventually emerge and so on.
  • ...3 more annotations...
  • literature study in a university education requires some method of evaluation of whether the student has done well or poorly. Students’ papers must be graded and no faculty member wants to face the inevitable “that’s just your opinion” unarmed, as it were. Learning how to use a research methodology, providing evidence that one has understood and can apply such a method, is understandably an appealing pedagogy
  • Literature and the arts have a dimension unique in the academy, not shared by the objects studied, or “researched” by our scientific brethren. They invite or invoke, at a kind of “first level,” an aesthetic experience that is by its nature resistant to restatement in more formalized, theoretical or generalizing language. This response can certainly be enriched by knowledge of context and history, but the objects express a first-person or subjective view of human concerns that is falsified if wholly transposed to a more “sideways on” or third person view.
  • such works also can directly deliver a  kind of practical knowledge and self-understanding not available from a third person or more general formulation of such knowledge. There is no reason to think that such knowledge — exemplified in what Aristotle said about the practically wise man (the phronimos)or in what Pascal meant by the difference between l’esprit géometrique and l’esprit de finesse — is any less knowledge because it cannot be so formalized or even taught as such.
Javier E

Is Science Kind of a Scam? - The New Yorker - 1 views

  • No well-tested scientific concept is more astonishing than the one that gives its name to a new book by the Scientific American contributing editor George Musser, “Spooky Action at a Distance.”
  • The ostensible subject is the mechanics of quantum entanglement; the actual subject is the entanglement of its observers.
  • his question isn’t so much how this weird thing can be true as why, given that this weird thing had been known about for so long, so many scientists were so reluctant to confront it. What keeps a scientific truth from spreading?
  • ...29 more annotations...
  • it is as if two magic coins, flipped at different corners of the cosmos, always came up heads or tails together. (The spooky action takes place only in the context of simultaneous measurement. The particles share states, but they don’t send signals.)
  • fashion, temperament, zeitgeist, and sheer tenacity affected the debate, along with evidence and argument.
  • The certainty that spooky action at a distance takes place, Musser says, challenges the very notion of “locality,” our intuitive sense that some stuff happens only here, and some stuff over there. What’s happening isn’t really spooky action at a distance; it’s spooky distance, revealed through an action.
  • Why, then, did Einstein’s question get excluded for so long from reputable theoretical physics? The reasons, unfolding through generations of physicists, have several notable social aspects,
  • What started out as a reductio ad absurdum became proof that the cosmos is in certain ways absurd. What began as a bug became a feature and is now a fact.
  • “If poetry is emotion recollected in tranquility, then science is tranquility recollected in emotion.” The seemingly neutral order of the natural world becomes the sounding board for every passionate feeling the physicist possesses.
  • Musser explains that the big issue was settled mainly by being pushed aside. Generational imperatives trumped evidentiary ones. The things that made Einstein the lovable genius of popular imagination were also the things that made him an easy object of condescension. The hot younger theorists patronized him,
  • There was never a decisive debate, never a hallowed crucial experiment, never even a winning argument to settle the case, with one physicist admitting, “Most physicists (including me) accept that Bohr won the debate, although like most physicists I am hard pressed to put into words just how it was done.”
  • Arguing about non-locality went out of fashion, in this account, almost the way “Rock Around the Clock” displaced Sinatra from the top of the charts.
  • The same pattern of avoidance and talking-past and taking on the temper of the times turns up in the contemporary science that has returned to the possibility of non-locality.
  • the revival of “non-locality” as a topic in physics may be due to our finding the metaphor of non-locality ever more palatable: “Modern communications technology may not technically be non-local but it sure feels that it is.”
  • Living among distant connections, where what happens in Bangalore happens in Boston, we are more receptive to the idea of such a strange order in the universe.
  • The “indeterminacy” of the atom was, for younger European physicists, “a lesson of modernity, an antidote to a misplaced Enlightenment trust in reason, which German intellectuals in the 1920’s widely held responsible for their country’s defeat in the First World War.” The tonal and temperamental difference between the scientists was as great as the evidence they called on.
  • Science isn’t a slot machine, where you drop in facts and get out truths. But it is a special kind of social activity, one where lots of different human traits—obstinacy, curiosity, resentment of authority, sheer cussedness, and a grudging readiness to submit pet notions to popular scrutiny—end by producing reliable knowledge
  • What was magic became mathematical and then mundane. “Magical” explanations, like spooky action, are constantly being revived and rebuffed, until, at last, they are reinterpreted and accepted. Instead of a neat line between science and magic, then, we see a jumpy, shifting boundary that keeps getting redrawn
  • Real-world demarcations between science and magic, Musser’s story suggests, are like Bugs’s: made on the move and as much a trap as a teaching aid.
  • In the past several decades, certainly, the old lines between the history of astrology and astronomy, and between alchemy and chemistry, have been blurred; historians of the scientific revolution no longer insist on a clean break between science and earlier forms of magic.
  • Where once logical criteria between science and non-science (or pseudo-science) were sought and taken seriously—Karl Popper’s criterion of “falsifiability” was perhaps the most famous, insisting that a sound theory could, in principle, be proved wrong by one test or another—many historians and philosophers of science have come to think that this is a naïve view of how the scientific enterprise actually works.
  • They see a muddle of coercion, old magical ideas, occasional experiment, hushed-up failures—all coming together in a social practice that gets results but rarely follows a definable logic.
  • Yet the old notion of a scientific revolution that was really a revolution is regaining some credibility.
  • David Wootton, in his new, encyclopedic history, “The Invention of Science” (Harper), recognizes the blurred lines between magic and science but insists that the revolution lay in the public nature of the new approach.
  • What killed alchemy was the insistence that experiments must be openly reported in publications which presented a clear account of what had happened, and they must then be replicated, preferably before independent witnesses.
  • Wootton, while making little of Popper’s criterion of falsifiability, makes it up to him by borrowing a criterion from his political philosophy. Scientific societies are open societies. One day the lunar tides are occult, the next day they are science, and what changes is the way in which we choose to talk about them.
  • Wootton also insists, against the grain of contemporary academia, that single observed facts, what he calls “killer facts,” really did polish off antique authorities
  • once we agree that the facts are facts, they can do amazing work. Traditional Ptolemaic astronomy, in place for more than a millennium, was destroyed by what Galileo discovered about the phases of Venus. That killer fact “serves as a single, solid, and strong argument to establish its revolution around the Sun, such that no room whatsoever remains for doubt,” Galileo wrote, and Wootton adds, “No one was so foolish as to dispute these claims.”
  • Several things flow from Wootton’s view. One is that “group think” in the sciences is often true think. Science has always been made in a cloud of social networks.
  • There has been much talk in the pop-sci world of “memes”—ideas that somehow manage to replicate themselves in our heads. But perhaps the real memes are not ideas or tunes or artifacts but ways of making them—habits of mind rather than products of mind
  • science, then, a club like any other, with fetishes and fashions, with schemers, dreamers, and blackballed applicants? Is there a real demarcation to be made between science and every other kind of social activity
  • The claim that basic research is valuable because it leads to applied technology may be true but perhaps is not at the heart of the social use of the enterprise. The way scientists do think makes us aware of how we can think
kushnerha

BBC - Future - The surprising downsides of being clever - 0 views

  • If ignorance is bliss, does a high IQ equal misery? Popular opinion would have it so. We tend to think of geniuses as being plagued by existential angst, frustration, and loneliness. Think of Virginia Woolf, Alan Turing, or Lisa Simpson – lone stars, isolated even as they burn their brightest. As Ernest Hemingway wrote: “Happiness in intelligent people is the rarest thing I know.”
  • Combing California’s schools for the crème de la crème, he selected 1,500 pupils with an IQ of 140 or more – 80 of whom had IQs above 170. Together, they became known as the “Termites”, and the highs and lows of their lives are still being studied to this day.
  • Termites’ average salary was twice that of the average white-collar job. But not all the group met Terman’s expectations – there were many who pursued more “humble” professions such as police officers, seafarers, and typists. For this reason, Terman concluded that “intellect and achievement are far from perfectly correlated”. Nor did their smarts endow personal happiness. Over the course of their lives, levels of divorce, alcoholism and suicide were about the same as the national average.
  • ...16 more annotations...
  • One possibility is that knowledge of your talents becomes something of a ball and chain. Indeed, during the 1990s, the surviving Termites were asked to look back at the events in their 80-year lifespan. Rather than basking in their successes, many reported that they had been plagued by the sense that they had somehow failed to live up to their youthful expectations.
  • The most notable, and sad, case concerns the maths prodigy Sufiah Yusof. Enrolled at Oxford University aged 12, she dropped out of her course before taking her finals and started waitressing. She later worked as a call girl, entertaining clients with her ability to recite equations during sexual acts.
  • Another common complaint, often heard in student bars and internet forums, is that smarter people somehow have a clearer vision of the world’s failings. Whereas the rest of us are blinkered from existential angst, smarter people lay awake agonising over the human condition or other people’s folly.
  • MacEwan University in Canada found that those with the higher IQ did indeed feel more anxiety throughout the day. Interestingly, most worries were mundane, day-to-day concerns, though; the high-IQ students were far more likely to be replaying an awkward conversation, than asking the “big questions”. “It’s not that their worries were more profound, but they are just worrying more often about more things,” says Penney. “If something negative happened, they thought about it more.”
  • seemed to correlate with verbal intelligence – the kind tested by word games in IQ tests, compared to prowess at spatial puzzles (which, in fact, seemed to reduce the risk of anxiety). He speculates that greater eloquence might also make you more likely to verbalise anxieties and ruminate over them. It’s not necessarily a disadvantage, though. “Maybe they were problem-solving a bit more than most people,” he says – which might help them to learn from their mistakes.
  • The harsh truth, however, is that greater intelligence does not equate to wiser decisions; in fact, in some cases it might make your choices a little more foolish.
  • we need to turn our minds to an age-old concept: “wisdom”. His approach is more scientific than it might at first sound. “The concept of wisdom has an ethereal quality to it,” he admits. “But if you look at the lay definition of wisdom, many people would agree it’s the idea of someone who can make good unbiased judgement.”
  • “my-side bias” – our tendency to be highly selective in the information we collect so that it reinforces our previous attitudes. The more enlightened approach would be to leave your assumptions at the door as you build your argument – but Stanovich found that smarter people are almost no more likely to do so than people with distinctly average IQs.
  • People who ace standard cognitive tests are in fact slightly more likely to have a “bias blind spot”. That is, they are less able to see their own flaws, even though they are quite capable of criticising the foibles of others. And they have a greater tendency to fall for the “gambler’s fallacy”
  • A tendency to rely on gut instincts rather than rational thought might also explain why a surprisingly high number of Mensa members believe in the paranormal; or why someone with an IQ of 140 is about twice as likely to max out their credit card.
  • “The people pushing the anti-vaccination meme on parents and spreading misinformation on websites are generally of more than average intelligence and education.” Clearly, clever people can be dangerously, and foolishly, misguided.
  • spent the last decade building tests for rationality, and he has found that fair, unbiased decision-making is largely independent of IQ.
  • Crucially, Grossmann found that IQ was not related to any of these measures, and certainly didn’t predict greater wisdom. “People who are very sharp may generate, very quickly, arguments [for] why their claims are the correct ones – but may do it in a very biased fashion.”
  • employers may well begin to start testing these abilities in place of IQ; Google has already announced that it plans to screen candidates for qualities like intellectual humility, rather than sheer cognitive prowess.
  • He points out that we often find it easier to leave our biases behind when we consider other people, rather than ourselves. Along these lines, he has found that simply talking through your problems in the third person (“he” or “she”, rather than “I”) helps create the necessary emotional distance, reducing your prejudices and leading to wiser arguments.
  • If you’ve been able to rest on the laurels of your intelligence all your life, it could be very hard to accept that it has been blinding your judgement. As Socrates had it: the wisest person really may be the one who can admit he knows nothing.
Javier E

Opinion | Is There Such a Thing as an Authoritarian Voter? - The New York Times - 0 views

  • Jonathan Weiler, a political scientist at the University of North Carolina at Chapel Hill, has spent much of his career studying the appeal of authoritarian figures: politicians who preach xenophobia, beat up on the press and place themselves above the law while extolling “law and order” for everyone else.
  • He is one of many scholars who believe that deep-seated psychological traits help explain voters’ attraction to such leaders. “These days,” he told me, “audiences are more receptive to the idea” than they used to be.
  • “In 2018, the sense of fear and panic — the disorientation about how people who are not like us could see the world the way they do — it’s so elemental,” Mr. Weiler said. “People understand how deeply divided we are, and they are looking for explanations that match the depth of that division.”
  • ...24 more annotations...
  • Moreover, using the child-rearing questionnaire, African-Americans score as far more authoritarian than whites
  • what, exactly, is an “authoritarian” personality? How do you measure it?
  • for more than half a century — social scientists have tried to figure out why some seemingly mild-mannered people gravitate toward a strongman
  • the philosopher (and German refugee) Theodor Adorno collaborated with social scientists at the University of California at Berkeley to investigate why ordinary people supported fascist, anti-Semitic ideology during the war. They used a questionnaire called the F-scale (F is for fascism) and follow-up interviews to analyze the “total personality” of the “potentially antidemocratic individual.”
  • The resulting 1,000-page tome, “The Authoritarian Personality,” published in 1950, found that subjects who scored high on the F-scale disdained the weak and marginalized. They fixated on sexual deviance, embraced conspiracy theories and aligned themselves with domineering leaders “to serve powerful interests and so participate in their power,”
  • “Globalized free trade has shafted American workers and left us looking for a strong male leader, a ‘real man,’” he wrote. “Trump offers exactly what my maladapted unconscious most craves.”
  • one of the F-scale’s prompts: “Obedience and respect for authority are the most important virtues children should learn.” Today’s researchers often diagnose latent authoritarians through a set of questions about preferred traits in children: Would you rather your child be independent or have respect for elders? Have curiosity or good manners? Be self-reliant or obedient? Be well behaved or considerate?
  • a glance at the Christian group Focus on the Family’s “biblical principles for spanking” reminds us that your approach to child rearing is not pre-political; it is shorthand for your stance in the culture wars.
  • “All the social sciences are brought to bear to try to explain all the evil that persists in the world, even though the liberal Enlightenment worldview says that we should be able to perfect things,” said Mr. Strouse, the Trump voter
  • what should have been obvious:
  • “Trump’s electoral strength — and his staying power — have been buoyed, above all, by Americans with authoritarian inclinations,” wrote Matthew MacWilliams, a political consultant who surveyed voters during the 2016 election
  • The child-trait test, then, is a tool to identify white people who are anxious about their decline in status and power.
  • new book, “Prius or Pickup?,” by ditching the charged term “authoritarian.” Instead, they divide people into three temperamental camps: fixed (people who are wary of change and “set in their ways”), fluid (those who are more open to new experiences and people) and mixed (those who are ambivalent).
  • “The term ‘authoritarian’ connotes a fringe perspective, and the perspective we’re describing is far from fringe,” Mr. Weiler said. “It’s central to American public opinion, especially on cultural issues like immigration and race.”
  • Other scholars apply a typology based on the “Big Five” personality traits identified by psychologists in the mid-20th century: extroversion, agreeableness, conscientiousness, neuroticism and openness to experience. (It seems that liberals are open but possibly neurotic, while conservatives are more conscientious.)
  • Historical context matters — it shapes who we are and how we debate politics. “Reason moves slowly,” William English, a political economist at Georgetown, told me. “It’s constituted sociologically, by deep community attachments, things that change over generations.”
  • “it is a deep-seated aspiration of many social scientists — sometimes conscious and sometimes unconscious — to get past wishy-washy culture and belief. Discourses that can’t be scientifically reduced are problematic” for researchers who want to provide “a universal account of behavior.”
  • in our current environment, where polarization is so unyielding, the apparent clarity of psychological and biological explanations becomes seductive
  • Attitudes toward parenting vary across cultures, and for centuries African-Americans have seen the consequences of a social and political hierarchy arrayed against them, so they can hardly be expected to favor it — no matter what they think about child rearing
  • — we know that’s not going to happen. People have wicked tendencies.”
  • as the social scientific portrait of humanity grows more psychological and irrational, it comes closer and closer to approximating the old Adam of traditional Christianity: a fallen, depraved creature, unable to see himself clearly except with the aid of a higher power
  • The conclusions of political scientists should inspire humility rather than hubris. In the end, they have confirmed what so many observers of our species have long suspected: None of us are particularly free or rational creatures.
  • Allen Strouse is not the archetypal Trump voter whom journalists discover in Rust Belt diners. He is a queer Catholic poet and scholar of medieval literature who teaches at the New School in New York City. He voted for Mr. Trump “as a protest against the Democrats’ failures on economic issues,” but the psychological dimensions of his vote intrigue him. “Having studied Freudian analysis, and being in therapy for 10 years, I couldn’t not reflexively ask myself, ‘How does this decision have to do with my psychology?’” he told me.
  • their preoccupation with childhood and “primitive and irrational wishes and fears” have influenced the study of authoritarianism ever since.
Javier E

ROUGH TYPE | Nicholas Carr's blog - 0 views

  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Social skills and relationships seem to suffer as well.
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
Javier E

Assessing the Value of Buddhism, for Individuals and for the World - The New York Times - 0 views

  • Robert Wright sketches an answer early in “Why Buddhism Is True.” He settles on a credible blend that one might call Western Buddhism, a largely secular approach to life and its problems but not devoid of a spiritual dimension. The centerpiece of the approach is the practice of mindful meditation.
  • The goal of “Why Buddhism Is True” is ambitious: to demonstrate “that Buddhism’s diagnosis of the human predicament is fundamentally correct, and that its prescription is deeply valid and urgently important.”
  • It is reasonable to claim that Buddhism, with its focus on suffering, addresses critical aspects of the human predicament. It is also reasonable to suggest that the prescription it offers may be applicable and useful to resolve that predicament.
  • To produce his demonstrations and to support the idea that Buddhism is “true,” Wright relies on science, especially on evolutionary psychology, cognitive science and neuroscience.
  • Wright is up to the task: He’s a Buddhist who has written about religion and morality from a scientific perspective — he is most famous for his 1994 book, “The Moral Animal.”
  • First, the beneficial powers of meditation come from the possibility of realizing that our emotive reactions and the consequent feelings they engender — which operate in automated fashion, outside our deliberate control — are often inappropriate and even counterproductive relative to the situations that trigger them.
  • Second, the mismatch between causes and responses is rooted in evolution. We have inherited from our nonhuman and human forerunners a complex affect apparatus suited to life circumstances very different from ours
  • Third, meditation allows us to realize that the idea of the self as director of our decisions is an illusion, and that the degree to which we are at the mercy of a weakly controlled system places us at a considerable disadvantage
  • Fourth, the awareness brought on by meditation helps the construction of a truly enlightened humanity and counters the growing tribalism of contemporary societies.
  • when, in modern life, emotions such as fear and anger are incorrectly and unnecessarily engaged — for example, road rage — Wright calls the respective feelings “false” or “illusory.” Such feelings, however, are no less true than the thirst, hunger or pain that Wright accepts and welcomes
  • We can agree that mindful meditation promotes a distancing effect and thus may increase our chances of combining affect and reason advantageously. Meditation can help us glean the especially flawed and dislocated status of humans in modern societies, and help us see how social and political conflicts appear to provoke resentment and anger so easily.
  • How does one scale up, from many single individuals to populations, in time to prevent the social catastrophes that seem to be looming?