
Home/ Groups/ TOK Friends
Javier E

The Lies of Science Writing - WSJ.com - 0 views

  • Writing about science poses a fundamental problem right at the outset: You have to lie.
  • because math is the language of science, scientists who want to translate their work into popular parlance have to use verbal or pictorial metaphors that are necessarily inexact.
  • Choosing the proper metaphor can make all the difference between distorting science and providing an appropriate context from which nonscientists can appreciate new scientific findings and put them in perspective.
  • ...4 more annotations...
  • Not only is a good picture, even a mental one, worth at least a thousand words, but many scientists themselves think in these terms.
  • Though metaphors are useful in trying to understand complicated scientific ideas, they have their pitfalls.
  • Consider another famous scientific metaphor, the evolutionary biologist Richard Dawkins's idea of the "selfish gene." This is a brilliant and simple way to explain that natural selection relies on the self-perpetuation of genes that promote higher rates of survival. But for some critics, it suggests an intentionality that is absent in the process of evolution. Others worry that it implies an immoral world where selfishness wins out.
  • When used effectively, an apt metaphor can enhance the real purpose in writing about science for the public: provoking interest and a desire to learn more.
Javier E

Book Review: The Last Lingua Franca - WSJ.com - 0 views

  • After narrating the history of Latin, Persian, Phoenician and other once-dominant languages, all now either dead or consigned to their native communities, Mr. Ostler argues that English too will sputter out relatively soon. Among the factors dooming it is the lack of any institution to demand its survival—no priestly use, as Latin or Sanskrit had, or government that requires its subjects to keep their linguistic skills up to enjoy full citizenship. As English loses cachet, it will become optional, and ultimately its reign will be one of the shortest in the history of lingua francas.
  • But regional languages are gaining enough traction in trade to allow their speakers to discard English, particularly if people can transact their cultural and commercial business with the crutch of computer software and machine translation.
  • The one issue that Mr. Ostler treats insufficiently is what the world might lose after what his subtitle calls "the return of Babel." One needn't be sentimental about English to wonder whether it isn't useful to have one language, rich in literature, that everyone shares in addition to a mother tongue.
Javier E

The Social Network and the Dunbar Number | Mind & Matter - WSJ.com - 0 views

  • Mr. Dunbar's eponymous number is 147.8, plus or minus a lot, and it is the size of the average human being's social network of friends, as predicted by the size of the average human brain
  • Mr. Dunbar famously noticed that there is a tight correlation between the size of a primate's brain and the size of the social group its species generally forms. On this basis human beings should live in groups of around 150. The neat thing about this prediction was the way it seemed to fit the number of good friends most people have, as measured by the length of address books, the size of hunter-gatherer bands, the population of neolithic villages and the strength of army units. In recent years, Facebook has also seemed to confirm the hunch, with rosters of friends often settling around the Dunbar number.
  • Mr. Dunbar's "social brain hypothesis" rests on another idea—the theory of mind—which argues that we use our brains to imagine what others are thinking. So, drilling down further into the physiology of the brain, Mr. Dunbar's team has now found that a rich social network also goes with the ability to reason about others' intentional states.
  • ...1 more annotation...
  • human beings evolved big brains not to understand the world, but to understand each other.
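The prediction described above comes from a simple log-linear regression of primate group size on neocortex ratio. As a minimal sketch (the coefficients are those reported in Dunbar's 1992 primate study; the human neocortex ratio of about 4.1 is the standard figure used to derive the famous number):

```python
import math

# Dunbar's social brain regression (sketch):
# log10(group size) = 0.093 + 3.389 * log10(neocortex ratio)
def predicted_group_size(neocortex_ratio):
    """Predict mean social group size from the neocortex ratio
    (neocortex volume divided by the volume of the rest of the brain)."""
    return 10 ** (0.093 + 3.389 * math.log10(neocortex_ratio))

# The human neocortex ratio is roughly 4.1, which yields the number
# quoted in the article.
human = predicted_group_size(4.1)
print(round(human, 1))  # ~147.8
```

Plugging in 4.1 reproduces the 147.8 figure ("plus or minus a lot" because the regression's confidence interval is wide).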
Javier E

Watson Still Can't Think - NYTimes.com - 0 views

  • Fish argued that Watson “does not come within a million miles of replicating the achievements of everyday human action and thought.” In defending this claim, Fish invoked arguments that one of us (Dreyfus) articulated almost 40 years ago in “What Computers Can’t Do,” a criticism of 1960s and 1970s style artificial intelligence.
  • At the dawn of the AI era the dominant approach to creating intelligent systems was based on finding the right rules for the computer to follow.
  • GOFAI, for Good Old Fashioned Artificial Intelligence.
  • ...12 more annotations...
  • For constrained domains the GOFAI approach is a winning strategy.
  • there is nothing intelligent or even interesting about the brute force approach.
  • the dominant paradigm in AI research has largely “moved on from GOFAI to embodied, distributed intelligence.” And Faustus from Cincinnati insists that as a result “machines with bodies that experience the world and act on it” will be “able to achieve intelligence.”
  • The new, embodied paradigm in AI, deriving primarily from the work of roboticist Rodney Brooks, insists that the body is required for intelligence. Indeed, Brooks’s classic 1990 paper, “Elephants Don’t Play Chess,” rejected the very symbolic computation paradigm against which Dreyfus had railed, favoring instead a range of biologically inspired robots that could solve apparently simple, but actually quite complicated, problems like locomotion, grasping, navigation through physical environments and so on. To solve these problems, Brooks discovered that it was actually a disadvantage for the system to represent the status of the environment and respond to it on the basis of pre-programmed rules about what to do, as the traditional GOFAI systems had. Instead, Brooks insisted, “It is better to use the world as its own model.”
  • although they respond to the physical world rather well, they tend to be oblivious to the global, social moods in which we find ourselves embedded essentially from birth, and in virtue of which things matter to us in the first place.
  • the embodied AI paradigm is irrelevant to Watson. After all, Watson has no useful bodily interaction with the world at all.
  • The statistical machine learning strategies that it uses are indeed a big advance over traditional GOFAI techniques. But they still fall far short of what human beings do.
  • “The illusion is that this computer is doing the same thing that a very good ‘Jeopardy!’ player would do. It’s not. It’s doing something sort of different that looks the same on the surface. And every so often you see the cracks.”
  • Watson doesn’t understand relevance at all. It only measures statistical frequencies. Because it is relatively common to find mismatches of this sort, Watson learns to weigh them as only mild evidence against the answer. But the human just doesn’t do it that way. The human being sees immediately that the mismatch is irrelevant for the Erie Canal but essential for Toronto. Past frequency is simply no guide to relevance.
  • The fact is, things are relevant for human beings because at root we are beings for whom things matter. Relevance and mattering are two sides of the same coin. As Haugeland said, “The problem with computers is that they just don’t give a damn.” It is easy to pretend that computers can care about something if we focus on relatively narrow domains — like trivia games or chess — where by definition winning the game is the only thing that could matter, and the computer is programmed to win. But precisely because the criteria for success are so narrowly defined in these cases, they have nothing to do with what human beings are when they are at their best.
  • Far from being the paradigm of intelligence, therefore, mere matching with no sense of mattering or relevance is barely any kind of intelligence at all. As beings for whom the world already matters, our central human ability is to be able to see what matters when.
  • But, as we show in our recent book, this is an existential achievement orders of magnitude more amazing and wonderful than any statistical treatment of bare facts could ever be. The greatest danger of Watson’s victory is not that it proves machines could be better versions of us, but that it tempts us to misunderstand ourselves as poorer versions of them.
Javier E

The Nation's Science Report Card is out. Everything is going fine. : Greg Laden's Blog - 0 views

  • The Science component of "The Nation's Report Card" was released today and clearly indicates that we have moved one step closer as a nation in two of our most important goals: Building a large and complacent poorly educated low-pay labor class, and increasing the size of our science-illiterate populace in order to allow the advance of medieval morality and Iron Age Christian values.
Richard Monari

Yakko's Universe Song - 0 views

  •  
    This song, released on the Animaniacs in the 90s, first instilled in me a sense of relativity. Later on, I was impressed that an animated show would include material discussing the largeness of the universe for small children. It also depressed me when they said "we're all really puny."
Javier E

A Romp Through Theories on Origins of Life - NYTimes.com - 0 views

  • they debated the definition of life — “anything highly statistically improbable, but in a particular direction,” in the words of Richard Dawkins, the evolutionary biologist at Oxford. Or, they wondered if it could be defined at all in the absence of a second example to the Earth’s biosphere — a web of interdependence all based on DNA.
  • The rapid appearance of complex life in some accounts — “like Athena springing from the head of Zeus,” in the words of Dr. McKay — has rekindled interest recently in a theory fancied by Francis Crick, one of the discoverers of the double helix, that life originated elsewhere and floated here through space. These days the favorite candidate for such an extraterrestrial cradle is Mars
  • “If you want to think of it that way, life is a very simple process,” said Sidney Altman, who shared a Nobel Prize in 1989 for showing that RNA had these dual abilities. “It uses energy, it sustains itself and it replicates.” One lesson of the meeting was how finicky are the chemical reactions needed for carrying out these simple-sounding functions. “There might be a reason why amino acids and nucleotides are the way they are,”
Javier E

Secrets of a Mind-Gamer - NYTimes.com - 0 views

  • “What you have to understand is that even average memories are remarkably powerful if used properly,” Cooke said. He explained to me that mnemonic competitors saw themselves as “participants in an amateur research program” whose aim is to rescue a long-lost tradition of memory training.
  • it wasn’t so long ago that culture depended on individual memories. A trained memory was not just a handy tool but also a fundamental facet of any worldly mind. It was considered a form of character-building, a way of developing the cardinal virtue of prudence and, by extension, ethics. Only through memorizing, the thinking went, could ideas be incorporated into your psyche and their values absorbed.
  • all the other mental athletes I met kept insisting that anyone could do what they do. It was simply a matter of learning to “think in more memorable ways,” using a set of mnemonic techniques almost all of which were invented in ancient Greece. These techniques existed not to memorize useless information like decks of playing cards but to etch into the brain foundational texts and ideas.
  • ...10 more annotations...
  • not only did the brains of the mental athletes appear anatomically indistinguishable from those of the control subjects, but on every test of general cognitive ability, the mental athletes’ scores came back well within the normal range.
  • There was, however, one telling difference between the brains of the mental athletes and those of the control subjects. When the researchers looked at the parts of the brain that were engaged when the subjects memorized, they found that the mental athletes were relying more heavily on regions known to be involved in spatial memory.
  • just about anything could be imprinted upon our memories, and kept in good order, simply by constructing a building in the imagination and filling it with imagery of what needed to be recalled. This imagined edifice could then be walked through at any time in the future. Such a building would later come to be called a memory palace.
  • Memory training was considered a centerpiece of classical education in the language arts, on par with grammar, logic and rhetoric. Students were taught not just what to remember but how to remember it. In a world with few books, memory was sacrosanct.
  • In his essay “First Steps Toward a History of Reading,” Robert Darnton describes a switch from “intensive” to “extensive” reading that occurred as printed books began to proliferate.
  • Until relatively recently, people read “intensively,” Darnton says. “They had only a few books — the Bible, an almanac, a devotional work or two — and they read them over and over again, usually aloud and in groups, so that a narrow range of traditional literature became deeply impressed on their consciousness.” Today we read books “extensively,” often without sustained focus, and with rare exceptions we read each book only once. We value quantity of reading over quality of reading.
  • “Rhetorica ad Herennium” underscores the importance of purposeful attention by making a distinction between natural memory and artificial memory:
  • Our hunter-gatherer ancestors didn’t need to recall phone numbers or word-for-word instructions from their bosses or the Advanced Placement U.S. history curriculum or (because they lived in relatively small, stable groups) the names of dozens of strangers at a cocktail party. What they did need to remember was where to find food and resources and the route home and which plants were edible and which were poisonous
  • What distinguishes a great mnemonist, I learned, is the ability to create lavish images on the fly, to paint in the mind a scene so unlike any other it cannot be forgotten. And to do it quickly.
  • the three stages of acquiring a new skill. During the first phase, known as the cognitive phase, we intellectualize the task and discover new strategies to accomplish it more proficiently.
Javier E

How the Internet Gets Inside Us : The New Yorker - 0 views

  • It isn’t just that we’ve lived one technological revolution among many; it’s that our technological revolution is the big social revolution that we live with
  • The idea, for instance, that the printing press rapidly gave birth to a new order of information, democratic and bottom-up, is a cruel cartoon of the truth. If the printing press did propel the Reformation, one of the biggest ideas it propelled was Luther’s newly invented absolutist anti-Semitism. And what followed the Reformation wasn’t the Enlightenment, a new era of openness and freely disseminated knowledge. What followed the Reformation was, actually, the Counter-Reformation, which used the same means—i.e., printed books—to spread ideas about what jerks the reformers were, and unleashed a hundred years of religious warfare.
  • Robert K. Logan’s “The Sixth Language” begins with the claim that cognition is not a little processing program that takes place inside your head, Robby the Robot style. It is a constant flow of information, memory, plans, and physical movements, in which as much thinking goes on out there as in here. If television produced the global village, the Internet produces the global psyche: everyone keyed in like a neuron, so that to the eyes of a watching Martian we are really part of a single planetary brain. Contraptions don’t change consciousness; contraptions are part of consciousness.
  • ...14 more annotations...
  • In a practical, immediate way, one sees the limits of the so-called “extended mind” clearly in the mob-made Wikipedia, the perfect product of that new vast, supersized cognition: when there’s easy agreement, it’s fine, and when there’s widespread disagreement on values or facts, as with, say, the origins of capitalism, it’s fine, too; you get both sides. The trouble comes when one side is right and the other side is wrong and doesn’t know it. The Shakespeare authorship page and the Shroud of Turin page are scenes of constant conflict and are packed with unreliable information. Creationists crowd cyberspace every bit as effectively as evolutionists, and extend their minds just as fully. Our trouble is not the over-all absence of smartness but the intractable power of pure stupidity, and no machine, or mind, seems extended enough to cure that.
  • “The medium does matter,” Carr has written. “As a technology, a book focuses our attention, isolates us from the myriad distractions that fill our everyday lives. A networked computer does precisely the opposite. It is designed to scatter our attention. . . . Knowing that the depth of our thought is tied directly to the intensity of our attentiveness, it’s hard not to conclude that as we adapt to the intellectual environment of the Net our thinking becomes shallower.”
  • when people struggle to describe the state that the Internet puts them in they arrive at a remarkably familiar picture of disassociation and fragmentation. Life was once whole, continuous, stable; now it is fragmented, multi-part, shimmering around us, unstable and impossible to fix.
  • The odd thing is that this complaint, though deeply felt by our contemporary Better-Nevers, is identical to Baudelaire’s perception about modern Paris in 1855, or Walter Benjamin’s about Berlin in 1930, or Marshall McLuhan’s in the face of three-channel television (and Canadian television, at that) in 1965.
  • If all you have is a hammer, the saying goes, everything looks like a nail; and, if you think the world is broken, every machine looks like the hammer that broke it.
  • What we live in is not the age of the extended mind but the age of the inverted self. The things that have usually lived in the darker recesses or mad corners of our mind—sexual obsessions and conspiracy theories, paranoid fixations and fetishes—are now out there: you click once and you can read about the Kennedy autopsy or the Nazi salute or hog-tied Swedish flight attendants. But things that were once external and subject to the social rules of caution and embarrassment—above all, our interactions with other people—are now easily internalized, made to feel like mere workings of the id left on its own.
  • Anyway, the crucial revolution was not of print but of paper: “During the later Middle Ages a staggering growth in the production of manuscripts, facilitated by the use of paper, accompanied a great expansion of readers outside the monastic and scholastic contexts.” For that matter, our minds were altered less by books than by index slips. Activities that seem quite twenty-first century, she shows, began when people cut and pasted from one manuscript to another; made aggregated news in compendiums; passed around précis. “Early modern finding devices” were forced into existence: lists of authorities, lists of headings.
  • The book index was the search engine of its era, and needed to be explained at length to puzzled researchers—as, for that matter, did the Hermione-like idea of “looking things up.” That uniquely evil and necessary thing the comprehensive review of many different books on a related subject, with the necessary oversimplification of their ideas that it demanded, was already around in 1500, and already being accused of missing all the points.
  • at any given moment, our most complicated machine will be taken as a model of human intelligence, and whatever media kids favor will be identified as the cause of our stupidity. When there were automatic looms, the mind was like an automatic loom; and, since young people in the loom period liked novels, it was the cheap novel that was degrading our minds. When there were telephone exchanges, the mind was like a telephone exchange, and, in the same period, since the nickelodeon reigned, moving pictures were making us dumb. When mainframe computers arrived and television was what kids liked, the mind was like a mainframe and television was the engine of our idiocy. Some machine is always showing us Mind; some entertainment derived from the machine is always showing us Non-Mind.
  • Blair argues that the sense of “information overload” was not the consequence of Gutenberg but already in place before printing began.
  • A social network is crucially different from a social circle, since the function of a social circle is to curb our appetites and of a network to extend them.
  • And so the peacefulness, the serenity that we feel away from the Internet, and which all the Better-Nevers rightly testify to, has less to do with being no longer harried by others than with being less oppressed by the force of your own inner life. Shut off your computer, and your self stops raging quite as much or quite as loud.
  • Now television is the harmless little fireplace over in the corner, where the family gathers to watch “Entourage.” TV isn’t just docile; it’s positively benevolent. This makes you think that what made television so evil back when it was evil was not its essence but its omnipresence. Once it is not everything, it can be merely something. The real demon in the machine is the tirelessness of the user.
  • the Internet screen has always been like the palantír in Tolkien’s “Lord of the Rings”—the “seeing stone” that lets the wizards see the entire world. Its gift is great; the wizard can see it all. Its risk is real: evil things will register more vividly than the great mass of dull good. The peril isn’t that users lose their knowledge of the world. It’s that they can lose all sense of proportion. You can come to think that the armies of Mordor are not just vast and scary, which they are, but limitless and undefeatable, which they aren’t.
Javier E

How To Look Smart, Ctd - The Daily Dish | By Andrew Sullivan - 0 views

  • these questions tend to overlook the way IQ tests are designed. As a neuropsychologist who has administered hundreds of these measures, I can tell you that their structures reflect a deeply embedded bias toward intelligence as a function of reading skills
Javier E

How Meditation May Change the Brain - NYTimes.com - 0 views

  • scientists say that meditators like my husband may be benefiting from changes in their brains. The researchers report that those who meditated for about 30 minutes a day for eight weeks had measurable changes in gray-matter density in parts of the brain associated with memory, sense of self, empathy and stress
  • M.R.I. brain scans taken before and after the participants’ meditation regimen found increased gray matter in the hippocampus, an area important for learning and memory. The images also showed a reduction of gray matter in the amygdala, a region connected to anxiety and stress.
  • “The main idea is to use different objects to focus one’s attention, and it could be a focus on sensations of breathing, or emotions or thoughts, or observing any type of body sensations,” she said. “But it’s about bringing the mind back to the here and now, as opposed to letting the mind drift.”
Javier E

A gentler and more logical economics « Blog Archive « Dan Ariely - 0 views

  • When it comes to designing things in our physical world, we all understand how flawed we are and design the physical world around us accordingly.
  • What I find amazing is that when it comes to designing the mental and cognitive realm, we somehow assume that human beings are without bounds. We cling to the idea that we are fully rational beings, and that, like mental Supermen, we can figure out anything. Why are we so readily willing to admit to our physical limitations but are unwilling to take our cognitive limitations into account?
  • To start with, our physical limitations stare us in the face all the time; but our cognitive limitations are not as obvious. A second reason is that we have a desire to see ourselves as perfectly capable — an impossibility in the physical domain. And perhaps a final reason why we don’t see our cognitive limitations is that maybe we have all bought into standard economics a little too much.
  • ...1 more annotation...
  • If we’re going to try to understand human behavior and use this knowledge to design the world around us—including institutions such as taxes, education systems, and financial markets—we need to use additional tools and other disciplines, including psychology, sociology, and philosophy. Rational economics is useful, but it offers just one type of input
Ranger Ruffins

Wild Talk - 0 views

  •  
    This is a link my dad sent me. It covers a little bit of what we talked about in our Language unit, and our debate on whether animals have language. I hope you like it!
Javier E

Book Review - Examined Lives - By James Miller - NYTimes.com - 0 views

  • Miller has now had the superb idea of taking Diogenes Laertius as a model, while simultaneously using this model to test whether such an approach can still offer us anything of value. He covers 12 philosophers: Socrates, Plato, Diogenes the Cynic (not to be confused with Laertius), Aristotle, Seneca, Augustine, Montaigne, Descartes, Rousseau, Kant, Emerson and Nietzsche. In each case, he explores the life selectively, looking for “crux” points and investigating how ideas of the philosophical life have changed.
  • Miller concludes that his 12 philosophical lives offer a moral that is “neither simple nor uniformly edifying.” It amounts mainly to the idea that philosophy can offer little or no consolation, and that the examined life is, if anything, “harder and less potentially rewarding” for us than it was for Socrates.
Javier E

Using Google to Tell Real Science from Fads | Mind Matters | Big Think - 0 views

  • a database of words from millions of books digitized by Google—4 percent of all books ever printed—could be one of the big ones. It's a fabulous source of ideas and evidence for theories about society, and it's fabulously democratic. Google offers a handy analyzer, the Ngram Viewer, which anyone can use to test an idea. A case in point: Yesterday, the social psychologist Rob Kurzban argued that the tool can distinguish between genuine scientific theories and intellectual fads.
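Kurzban's test can be sketched in code: a fad's frequency curve rises and then collapses back toward zero, while a durable scientific concept holds or grows its share of the corpus. A minimal, hypothetical classifier (the threshold and the toy time series below are illustrative assumptions, not real Ngram data or anything from the article):

```python
def looks_like_fad(freq_by_year, collapse_ratio=0.25):
    """Crude heuristic: call a term a 'fad' if its current usage frequency
    has fallen below collapse_ratio times its historical peak."""
    peak = max(freq_by_year)
    latest = freq_by_year[-1]
    return latest < collapse_ratio * peak

# Toy frequency series (made up for illustration):
phrenology = [1, 5, 20, 40, 30, 10, 4, 2]      # rises, then collapses
natural_selection = [1, 3, 6, 10, 14, 18, 22]  # sustained growth
print(looks_like_fad(phrenology))          # True
print(looks_like_fad(natural_selection))   # False
```

With real data, the input would be the per-year frequencies the Ngram Viewer plots for a phrase, and the threshold would need tuning against known fads.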
Javier E

Science Closes In On the Reason Rich People Are Jerks | Mind Matters | Big Think - 0 views

  • Wilson's student Dan O'Brien was researching cooperative behavior in a local primate species called the Binghamton, N.Y. high-school student. The higher a neighborhood's median income, O'Brien found, the less cooperative were its teen-agers.
  • The fact that cooperativeness varies from culture to culture, Wilson writes, suggests an explanation: Human nature doesn't have a single default setting for helpfulness and respect. Instead, we have the capacity to learn how trusting, how open, and how generous to be with others. If you hunt whales in a tightly cooperating team, you learn to cooperate readily. If you farm a hardscrabble patch of dirt with only your near relatives to help, you're much more likely to want to screw over your fellow man.
  • Wilson suggests that the comforts of affluence are atrophying people's propensity to band with others to work for the common good. If you don't practice this social skill, he argues, it will go away. "Those of us who can pay with our credit cards don’t need to cooperate," he writes, "and so we forget how."
  • ...1 more annotation...
  • It seems there's an association between spending money on one's self and selfish conduct, and it doesn't require actual spending. In this 2009 paper Roy Y.J. Chua and Xi Zou, both professors of management, found that just getting people to think about that kind of spending was sufficient to make their decisions more selfish. The pair showed 87 university students pictures of shoes and watches and had them complete a survey about the products. Then they answered questions about how they would behave as a chief executive in each of three hypothetical business decisions. Half the group had seen pictures of simple, functional shoes and watches. The others had viewed, and then described, top-end luxury goods. Those who saw the luxury versions were significantly more likely to choose the selfish path in the business decisions. They were more inclined to OK the production of a car that would pollute the environment, the release of bug-riddled software, and the marketing of a videogame that would prompt kids to bash each other. That suggests, write Chua and Zou, that "mere exposure to luxury caused people to think more about themselves than others."
Javier E

Big Think Interview With Nicholas Carr | Nicholas Carr | Big Think - 0 views

  • Neurologically, how does our brain adapt itself to new technologies? Nicholas Carr: A couple of types of adaptations take place in your brain. One is a strengthening of the synaptical connections between the neurons involved in using that tool. Basically these are neurochemical changes: cells in our brain communicate by transmitting electrical signals between them, and those electrical signals are activated by the exchange of chemicals, neurotransmitters, in our synapses. So when you begin to use a tool, you have much stronger electrochemical signals being processed through those synaptical connections. And the second, and even more interesting, adaptation is in actual physical, anatomical changes. You may grow new neurons that are then recruited into these circuits, or your existing neurons may grow new synaptical terminals. Again, that also serves to strengthen the activity in the particular pathways that are being used. On the other hand, the brain likes to be efficient, and so even as it’s strengthening the pathways you’re exercising, it’s weakening the connections between the cells that supported old ways of thinking or working or behaving that you’re not exercising so much.
  • And it was only in around the year 800 or 900 that we saw the introduction of word spaces. Suddenly reading became, in a sense, easier, and you had the arrival of silent reading, which changed the act of reading from a transcription of speech to something that every individual did on their own. Suddenly you had this ideal of the silent solitary reader who was improving their mind, expanding their horizons, and so forth. And when Gutenberg invented the printing press around 1450, what that served to do was take this new, very attentive, very deep form of reading, which had been limited to monasteries and universities, and, by making books much cheaper and much more available, spread that way of reading out to a much larger mass audience. And so we saw, for the last 500 years or so, that one of the central facts of culture was deep solitary reading.
  • What the book does as a technology is shield us from distraction. The only thing going on is the, you know, the progression of words and sentences across page after page, and so suddenly we see this immersive kind of very attentive thinking, whether you are paying attention to a story or to an argument, or whatever. And what we know about the brain is the brain adapts to these types of tools.
  • ...12 more annotations...
  • we adapt to the environment of the internet, which is an environment of kind of constant immersion and information and constant distractions, interruptions, juggling lots of messages, lots of bits of information.
  • Because it’s no longer just a matter of personal choice, of personal discipline, though obviously those things are always important, but what we’re seeing, and we see this over and over again in the history of technology, is that the technology – the technology of the web, the technology of digital media – gets entwined very, very deeply into social processes, into expectations. So more and more, for instance, in our work lives. You know, if our boss and all our colleagues are constantly exchanging messages, constantly checking email on their Blackberry or iPhone or their Droid or whatever, then it becomes very difficult to say, I’m not going to be as connected, because you feel like your career is going to take a hit.
  • With the arrival – with the transfer now of text more and more onto screens, we see, I think, a new and in some ways more primitive way of reading. In order to take in information off a screen, when you are also being bombarded with all sorts of other information, and when there are links in the text where you have to think, even for just a fraction of a second, you know, do I click on this link or not – suddenly reading again becomes a more cognitively intensive act, the way it was back when there were no spaces between words.
  • If all your friends are planning their social lives through texts and Facebook and Twitter and so forth, then to back away from that means to feel socially isolated. And of course for all people, particularly for young people, there’s kind of nothing worse than feeling socially isolated – that your friends are, you know, having these conversations and you’re not involved. So it’s easy to state the solution, which is to, you know, become a little bit more disconnected. What’s hard is actually doing that.
  • if you want to change your brain, you change your habits. You change your habits of thinking. And that means, you know, setting aside time to engage in more contemplative, more reflective ways of thinking, to be – to screen out distractions. And that means retreating from digital media and from the web and from Smart Phones and texting and Facebook and Tweeting and everything else.
  • The Thinker was, you know, in a contemplative pose and was concentrating deeply, and wasn’t, you know, multi-tasking. And because that is something that, until recently anyway, people always thought was the deepest and most distinctly human way of thinking.
  • we may end up finding that those are actually the most valuable ways of thinking that are available to us as human beings.
  • the ability to pay attention also is very important for our ability to build memories, to transfer information from our short-term memory to our long-term memory. And only when we do that do we weave new information into everything else we have stored in our brains. All the other facts we’ve learned, all the other experiences we’ve had, emotions we’ve felt. And that’s how you build, I think, a rich intellect and a rich intellectual life.
  • On the other hand, there is a cost. We lose – we begin to lose the faculties that we don’t exercise. So adaptation has both a very, very positive side, but also a potentially negative side, because ultimately our brain is qualitatively neutral. It doesn’t care what it’s strengthening or what it’s weakening; it just responds to the way we’re exercising our mind.
  • the book in some ways is the most interesting from our own present standpoint, particularly when we want to think about the way the internet is changing us. It’s interesting to think about how the book changed us.
  • So we become, after the arrival of the printing press in general, more attentive, more attuned to contemplative ways of thinking. And that’s a very unnatural way of using our mind. You know, paying attention, filtering out distractions.
  • what we lose is the ability to pay deep attention to one thing for a sustained period of time, to filter out distractions.
Javier E

A Most Valuable Democrat - NYTimes.com - 1 views

  • As Ezra Klein of The Washington Post noted recently, this turned out to be one of the most consequential decisions Obama and Reid made. If Lieberman had not been welcomed back by the Democrats, there might not have been a 60th vote for health care reform, and it would have failed. There certainly would have been no victory for “don’t ask, don’t tell” repeal without Lieberman’s tireless work and hawkish credentials. The Kerry-Lieberman climate bill came closer to passage than any other energy bill. Lieberman also provided crucial support or a swing vote for the Lilly Ledbetter Fair Pay Act, the stimulus bill, the banking bill, the unemployment extension and several other measures.
  • These policy makers are judging Lieberman by the criteria Max Weber called the “ethic of responsibility” — who will produce the best consequences. Some of the activists are judging him by what Weber called an “ethic of intention” — who has the purest and most uncompromising heart.
Javier E

"Generously Angry" - The Daily Dish | By Andrew Sullivan - 1 views

  • He is laughing, with a touch of anger in his laughter, but no triumph, no malignity. It is the face of a man who is always fighting against something, but who fights in the open and is not frightened, the face of a man who is generously angry — in other words, of a nineteenth-century liberal, a free intelligence, a type hated with equal hatred by all the smelly little orthodoxies which are now contending for our souls.
  • A blogger will feel anger from time to time - and should express it
  • The difficult task is summoning the right amount of anger with the right amount of generosity of spirit.
  • ...3 more annotations...
  • I mean keeping our anger at failures and misdemeanors in public life constantly in terms of finding ways to make things better for all of us, including the objects of our criticism.
  • when there are individuals in politics you have learned to distrust or oppose, it is always helpful from time to time to add a genuine compliment, not for the sake of it, or for credentializing, but because there are very few people who have no redeeming features and noting them is only fair.
  • Generous anger: a classically Orwellian term. Because it is a new phrase, a fresh idea, and yet instantly understandable. And necessary.