
TOK Friends: Group items tagged Capacity


Javier E

Scholarship and Politics - The Case of Noam Chomsky - NYTimes.com - 0 views

  • (1) The academy is a world of its own, complete with rules, protocols, systems of evaluation, recognized achievements, agreed-on goals, a roster of heroes and a list of tasks yet to be done.
  • (2) Academic work proceeds within the confines of that world, within, that is, a professional, not a public, space, although its performance may be, and often is, public.
  • (3) academic work is only tangentially, not essentially, political; politics may attend the formation of academic units and the selection of academic personnel, but political concerns and pressures have no place in the unfolding of academic argument, except as objects of its distinctive forms of attention
  • (4) The academic views of a professor are independent of his or her real-world political views; academic disputes don’t track partisan disputes or vice versa; you can’t reason from an academic’s disciplinary views to the positions he or she would take in the public sphere; they are independent variables.
  • The answer given in the first lecture — “What is Language?” — is that we are creatures with language, and that language as a uniquely human biological capacity appeared suddenly and quite late in the evolutionary story, perhaps 75,000 years ago.
  • Chomsky gave three lectures under the general title “What Kind of Creatures are We?”
  • Language, then, does not arise from the social/cultural environment, although the environment provides the stuff or input it works on. That input is “impoverished”; it can’t account for the creativity of language performance, which has its source not in the empirical world, but in an innate ability that is more powerful than the stimuli it utilizes and plays with. It follows that if you want to understand language, you shouldn’t look to linguistic behavior but to the internal mechanism — the Universal Grammar — of which particular linguistic behaviors are a non-exhaustive expression. (The capacity exceeds the empirical resources it might deploy.)
  • In his second lecture (“What Can We Understand?”), Chomsky took up the question of what humans are capable of understanding and his answer, generally, was that we can understand what we can understand, and that means that we can’t understand what is beyond our innate mental capacities
  • This does not mean, he said, that what we can’t understand is not real: “What is mysterious to me is not an argument that it does not exist.” It’s just that while language is powerful and creative, its power and creativity have limits; and since language is thought rather than an addition to or clothing of thought, the limits of language are the limits of what we can fruitfully think about
  • This is as good as it gets. There is “no evolution in our capacity for language.”
  • These assertions are offered as a counter to what Chomsky sees as the over-optimistic Enlightenment belief — common to many empiricist philosophies — that ours is a “limitless explanatory power” and that “we can do anything.”
  • In the third lecture (“What is the Common Good?”) Chomsky turned from the philosophy of mind and language to political philosophy and the question of what constitutes a truly democratic society
  • He likened dogmatic intellectual structures that interfere with free inquiry to coercive political structures that stifle the individual’s creative independence and fail to encourage humanity’s “richest diversity.”
  • He asserted that any institution marked by domination and hierarchy must rise to the challenge of justifying itself, and if it cannot meet the challenge, it should be dismantled.
  • He contrasted two accounts of democracy: one — associated by him with James Madison — distrusts the “unwashed” populace and puts its faith in representative government where those doing the representing (and the voting and the distributing of goods) constitute a moneyed and propertied elite
  • the other — associated by him with Adam Smith (in one of his moods), J. S. Mill, the 1960s and a tradition of anarchist writing — seeks to expand the franchise and multiply choices in the realms of thought, politics and economics. The impulse of this second, libertarian, strain of democracy, is “to free society from economic or theological guardianship,” and by “theological” Chomsky meant not formal religion as such but any assumed and frozen ideology that blocked inquiry and limited participation. There can’t, in short, be “too much democracy.”
  • It was thought of the highest order performed by a thinker, now 85 years old, who by and large eschewed rhetorical flourishes (he has called his own speaking style “boring” and says he likes it that way) and just did it, where “it” was the patient exploration of deep issues that had been explored before him by a succession of predecessors, fully acknowledged, in a conversation that is forever being continued and forever being replenished.
  • Yes, I said to myself, this is what we — those of us who bought a ticket on this particular train — do; we think about problems and puzzles and try to advance the understanding of them; and we do that kind of thinking because its pleasures are, in a strong sense, athletic and provide for us, at least on occasion, the experience of fully realizing whatever capabilities we might have. And we do it in order to have that experience, and to share it with colleagues and students of like mind, and not to make a moral or political point.
  • The term “master class” is a bit overused, but I feel no hesitation in using it here. It was a master class taught by a master, and if someone were to ask me what exactly is it that academics do, I would point to these lectures and say, simply, here it is, the thing itself.
manhefnawi

Your Brain's Memory Capacity May Be as Big as the World Wide Web | Mental Floss - 0 views

  • In an attempt to understand and measure the brain’s synapses, whose shape and size have remained mysterious to scientists, researchers at the University of Texas, Austin and the Salk Institute worked together to determine that the brain’s memory capacity is much larger than previously understood. The results, published in the journal eLife, estimate that an individual human brain may store as much as a petabyte of information—perhaps 10 times more than previously estimated, and about the equivalent of the World Wide Web.
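The eLife study behind this estimate reported that individual synapses can occupy roughly 26 distinguishable size states, which works out to about 4.7 bits of storage each. A back-of-envelope sketch of how such a figure scales to whole-brain capacity (the synapse count below is an assumed round order-of-magnitude figure, not taken from the article):

```python
import math

# Reported in the eLife study: synapses occupy ~26 distinguishable
# size states, i.e. log2(26) ~ 4.7 bits per synapse.
states_per_synapse = 26
bits_per_synapse = math.log2(states_per_synapse)

# Assumed round figure for total synapses in a human brain
# (a commonly cited order of magnitude; not from the article).
synapse_count = 1e15

total_bits = bits_per_synapse * synapse_count
total_petabytes = total_bits / (8 * 1e15)  # 1 petabyte = 8e15 bits

print(f"{bits_per_synapse:.1f} bits/synapse -> ~{total_petabytes:.1f} PB")
```

Under these assumptions the total lands on the order of a petabyte, the same ballpark as the article's figure.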
Javier E

Think Less, Think Better - The New York Times - 1 views

  • the capacity for original and creative thinking is markedly stymied by stray thoughts, obsessive ruminations and other forms of “mental load.”
  • Many psychologists assume that the mind, left to its own devices, is inclined to follow a well-worn path of familiar associations. But our findings suggest that innovative thinking, not routine ideation, is our default cognitive mode when our minds are clear.
  • We found that a high mental load consistently diminished the originality and creativity of the response: Participants with seven digits to recall resorted to the most statistically common responses (e.g., white/black), whereas participants with two digits gave less typical, more varied pairings (e.g., white/cloud).
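The "statistically common" measure described above can be sketched as a simple rarity score: a response's originality is one minus its frequency among all responses to the same cue word. The data and scoring rule here are illustrative assumptions, not the study's actual materials:

```python
from collections import Counter

def originality_scores(responses):
    """Map each response to a rarity score: 1 minus its share of
    all responses given to the same cue word."""
    counts = Counter(responses)
    total = len(responses)
    return {r: 1 - counts[r] / total for r in counts}

def mean_originality(responses):
    """Average rarity score across a group's responses."""
    scores = originality_scores(responses)
    return sum(scores[r] for r in responses) / len(responses)

# Toy associations to the cue "white" (hypothetical, not study data):
high_load = ["black", "black", "black", "snow", "black"]   # 7-digit group
low_load  = ["cloud", "dove", "rice", "black", "ghost"]    # 2-digit group

# The high-load group converges on the most common response,
# so its mean rarity score comes out lower.
print(mean_originality(high_load), mean_originality(low_load))
```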
  • In another experiment, we found that longer response times were correlated with less diverse responses, ruling out the possibility that participants with low mental loads simply took more time to generate an interesting response.
  • it seems that with a high mental load, you need more time to generate even a conventional thought. These experiments suggest that the mind’s natural tendency is to explore and to favor novelty, but when occupied it looks for the most familiar and inevitably least interesting solution.
  • In general, there is a tension in our brains between exploration and exploitation. When we are exploratory, we attend to things with a wide scope, curious and desiring to learn. Other times, we rely on, or “exploit,” what we already know, leaning on our expectations, trusting the comfort of a predictable environment
  • Much of our lives are spent somewhere between those extremes. There are functional benefits to both modes: If we were not exploratory, we would never have ventured out of the caves; if we did not exploit the certainty of the familiar, we would have taken too many risks and gone extinct. But there needs to be a healthy balance
  • All these loads can consume mental capacity, leading to dull thought and anhedonia — a flattened ability to experience pleasure.
  • ancient meditative practice helps free the mind to have richer experiences of the present
  • your life leaves too much room for your mind to wander. As a result, only a small fraction of your mental capacity remains engaged in what is before it, and mind-wandering and ruminations become a tax on the quality of your life
  • Honing an ability to unburden the load on your mind, be it through meditation or some other practice, can bring with it a wonderfully magnified experience of the world — and, as our study suggests, of your own mind.
Javier E

The Practical and the Theoretical - NYTimes.com - 1 views

  • Our society is divided into castes based upon a supposed division between theoretical knowledge and practical skill. The college professor holds forth on television, as the plumber fumes about detached ivory tower intellectuals.
  • There is a natural temptation to view these activities as requiring distinct capacities.
  • If these are distinct cognitive capacities, then knowing how to do something is not knowledge of a fact — that is, there is a distinction between practical and theoretical knowledge.
  • According to the model suggested by this supposed dichotomy, exercises of theoretical knowledge involve active reflection, engagement with the propositions or rules of the theory in question that guides the subsequent exercise of the knowledge. Think of the chess player following an instruction she has learned for an opening move in chess. In contrast, practical knowledge is exercised automatically and without reflection.
  • Additionally, the fact that exercises of theoretical knowledge are guided by propositions or rules seems to entail that they involve instructions that are universally applicable
  • when one reflects upon any exercise of knowledge, whether practical or theoretical, it appears to have the characteristics that would naïvely be ascribed to the exercise of both practical and intellectual capacities
  • Perhaps one way to distinguish practical knowledge and theoretical knowledge is by talking. When we acquire knowledge of how to do something, we may not be able to express our knowledge in words. But when we acquire knowledge of a truth, we are able to express this knowledge in words.
  • once one bears down on the supposed distinction between practical knowledge and knowledge of truths, it breaks down. The plumber’s or electrician’s activities are a manifestation of the same kind of intelligence as the scientist’s or historian’s latest articles — knowledge of truths.
  • these are distinctions along a continuum, rather than distinctions in kind, as the folk distinction between practical and theoretical pursuits is intended to be.
Javier E

Young Minds in Critical Condition - NYTimes.com - 1 views

  • Our best college students are very good at being critical. In fact being smart, for many, means being critical. Having strong critical skills shows that you will not be easily fooled. It is a sign of sophistication, especially when coupled with an acknowledgment of one’s own “privilege.”
  • The combination of resistance to influence and deflection of responsibility by confessing to one’s advantages is a sure sign of one’s ability to negotiate the politics of learning on campus.
  • Taking things apart, or taking people down, can provide the satisfactions of cynicism. But this is thin gruel.
  • In overdeveloping the capacity to show how texts, institutions or people fail to accomplish what they set out to do, we may be depriving students of the chance to learn as much as possible from what they study.
  • As debunkers, they contribute to a cultural climate that has little tolerance for finding or making meaning — a culture whose intellectuals and cultural commentators get “liked” by showing that somebody else just can’t be believed.
  • Liberal education in America has long been characterized by the intertwining of two traditions: of critical inquiry in pursuit of truth and exuberant performance in pursuit of excellence. In the last half-century, though, emphasis on inquiry has become dominant, and it has often been reduced to the ability to expose error and undermine belief.
  • fetishizing disbelief as a sign of intelligence has contributed to depleting our cultural resources. Creative work, in whatever field, depends upon commitment, the energy of participation and the ability to become absorbed in works of literature, art and science. That type of absorption is becoming an endangered species of cultural life, as our nonstop, increasingly fractured technological existence wears down our receptive capacities.
  • Liberal learning depends on absorption in compelling work. It is a way to open ourselves to the various forms of life in which we might actively participate. When we learn to read or look or listen intensively, we are, at least temporarily, overcoming our own blindness by trying to understand an experience from another’s point of view.
  • we are learning to activate potential, and often to instigate new possibilities.
  • Liberal education must not limit itself to critical thinking and problem solving; it must also foster openness, participation and opportunity. It should be designed to take us beyond the campus to a life of ongoing, pragmatic learning that finds inspiration in unexpected sources, and increases our capacity to understand and contribute to the world
Javier E

On the Road to Humankind With Leon Festinger - The New York Times - 1 views

  • another special capacity of the dominant left brain. We called this device the “interpreter” and have come to realize it is the storyteller, the system that builds our narrative and gives our many actions that pour out of us, frequently outside of our conscious control, a centrality, a story — our personal story.
  • It is so powerful an addition to humankind that it masks the reality: We are, in fact, a confederation of relatively independent agents, each struggling to be part of our narrative that is our story. It turns out the left brain has another capacity potentially more important than language itself. The interpreter is the thing that sticks all of those parts together.
  • My threaded interpretation, however, could be different from yours. For stories (beliefs) to be useful as a technology to control groups of people, it is necessary to standardize our interpretations, something we know has occurred almost from the beginning of recorded human history.
  • In Graham Swift’s novel “Waterland,” the narrator, a history teacher named Tom Crick, defines the human as “the storytelling animal” who “wants to leave behind not a chaotic wake, not an empty space,” but the “comforting trail signs of stories: As long as there’s a story, it’s all right.”
  • This is why the historian Yuval Harari, in his book “Sapiens: A Brief History of Humankind,” has proposed that in addition to our personal narratives, we produce collective fictions that are a uniquely human capacity.
  • “if you examine any large-scale human cooperation, you will always find that it is based on some fiction like the nation, like money, like human rights. These are all things that do not exist objectively, but they exist only in the stories that we tell and that we spread around. This is something very unique to us, perhaps the most unique feature of our species.”
  • As the novelist captures the personal, the historian captures the social story within which most of us are embedded and uniquely thrive. It is the inventive interpretive mind first applying itself to our personal life and then to our social existence that is our core skill.
Javier E

ROUGH TYPE | Nicholas Carr's blog - 0 views

  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Social skills and relationships seem to suffer as well.
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
Javier E

Thieves of experience: On the rise of surveillance capitalism - 1 views

  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience.To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
  • Social skills and relationships seem to suffer as well.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Google’s once-patient investors grew restive, demanding that the founders figure out a way to make money, preferably lots of it.
  • Under pressure, Page and Brin authorized the launch of an auction system for selling advertisements tied to search queries. The system was designed so that the company would get paid by an advertiser only when a user clicked on an ad. This feature gave Google a huge financial incentive to make accurate predictions about how users would respond to ads and other online content. Even tiny increases in click rates would bring big gains in income. And so the company began deploying its stores of behavioral data not for the benefit of users but to aid advertisers — and to juice its own profits. Surveillance capitalism had arrived.
  • Google’s business now hinged on what Zuboff calls “the extraction imperative.” To improve its predictions, it had to mine as much information as possible from web users. It aggressively expanded its online services to widen the scope of its surveillance.
  • Through Gmail, it secured access to the contents of people’s emails and address books. Through Google Maps, it gained a bead on people’s whereabouts and movements. Through Google Calendar, it learned what people were doing at different moments during the day and whom they were doing it with. Through Google News, it got a readout of people’s interests and political leanings. Through Google Shopping, it opened a window onto people’s wish lists,
  • The company gave all these services away for free to ensure they’d be used by as many people as possible. It knew the money lay in the data.
  • the organization grew insular and secretive. Seeking to keep the true nature of its work from the public, it adopted what its CEO at the time, Eric Schmidt, called a “hiding strategy” — a kind of corporate omerta backed up by stringent nondisclosure agreements.
  • Page and Brin further shielded themselves from outside oversight by establishing a stock structure that guaranteed their power could never be challenged, neither by investors nor by directors.
  • What’s most remarkable about the birth of surveillance capitalism is the speed and audacity with which Google overturned social conventions and norms about data and privacy. Without permission, without compensation, and with little in the way of resistance, the company seized and declared ownership over everyone’s information
  • The companies that followed Google presumed that they too had an unfettered right to collect, parse, and sell personal data in pretty much any way they pleased. In the smart homes being built today, it’s understood that any and all data will be beamed up to corporate clouds.
  • Google conducted its great data heist under the cover of novelty. The web was an exciting frontier — something new in the world — and few people understood or cared about what they were revealing as they searched and surfed. In those innocent days, data was there for the taking, and Google took it
  • Google also benefited from decisions made by lawmakers, regulators, and judges — decisions that granted internet companies free use of a vast taxpayer-funded communication infrastructure, relieved them of legal and ethical responsibility for the information and messages they distributed, and gave them carte blanche to collect and exploit user data.
  • Consider the terms-of-service agreements that govern the division of rights and the delegation of ownership online. Non-negotiable, subject to emendation and extension at the company’s whim, and requiring only a casual click to bind the user, TOS agreements are parodies of contracts, yet they have been granted legal legitimacy by the courts
  • Law professors, writes Zuboff, “call these ‘contracts of adhesion’ because they impose take-it-or-leave-it conditions on users that stick to them whether they like it or not.” Fundamentally undemocratic, the ubiquitous agreements helped Google and other firms commandeer personal data as if by fiat.
  • In the choices we make as consumers and private citizens, we have always traded some of our autonomy to gain other rewards. Many people, it seems clear, experience surveillance capitalism less as a prison, where their agency is restricted in a noxious way, than as an all-inclusive resort, where their agency is restricted in a pleasing way
  • Zuboff makes a convincing case that this is a short-sighted and dangerous view — that the bargain we’ve struck with the internet giants is a Faustian one
  • but her case would have been stronger still had she more fully addressed the benefits side of the ledger.
  • there’s a piece missing. While Zuboff’s assessment of the costs that people incur under surveillance capitalism is exhaustive, she largely ignores the benefits people receive in return — convenience, customization, savings, entertainment, social connection, and so on
  • What the industries of the future will seek to manufacture is the self.
  • Behavior modification is the thread that ties today’s search engines, social networks, and smartphone trackers to tomorrow’s facial-recognition systems, emotion-detection sensors, and artificial-intelligence bots.
  • All of Facebook’s information wrangling and algorithmic fine-tuning, she writes, “is aimed at solving one problem: how and when to intervene in the state of play that is your daily life in order to modify your behavior and thus sharply increase the predictability of your actions now, soon, and later.”
  • “The goal of everything we do is to change people’s actual behavior at scale,” a top Silicon Valley data scientist told her in an interview. “We can test how actionable our cues are for them and how profitable certain behaviors are for us.”
  • This goal, she suggests, is not limited to Facebook. It is coming to guide much of the economy, as financial and social power shifts to the surveillance capitalists
  • Combining rich information on individuals’ behavioral triggers with the ability to deliver precisely tailored and timed messages turns out to be a recipe for behavior modification on an unprecedented scale.
  • it was Facebook, with its incredibly detailed data on people’s social lives, that grasped digital media’s full potential for behavior modification. By using what it called its “social graph” to map the intentions, desires, and interactions of literally billions of individuals, it saw that it could turn its network into a worldwide Skinner box, employing psychological triggers and rewards to program not only what people see but how they react.
  • spying on the populace is not the end game. The real prize lies in figuring out ways to use the data to shape how people think and act. “The best way to predict the future is to invent it,” the computer scientist Alan Kay once observed. And the best way to predict behavior is to script it.
  • competition for personal data intensified. It was no longer enough to monitor people online; making better predictions required that surveillance be extended into homes, stores, schools, workplaces, and the public squares of cities and towns. Much of the recent innovation in the tech industry has entailed the creation of products and services designed to vacuum up data from every corner of our lives
  • “The typical complaint is that privacy is eroded, but that is misleading,” Zuboff writes. “In the larger societal pattern, privacy is not eroded but redistributed . . . . Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism.” The transfer of decision rights is also a transfer of autonomy and agency, from the citizen to the corporation.
  • What we lose under this regime is something more fundamental than privacy. It’s the right to make our own decisions about privacy — to draw our own lines between those aspects of our lives we are comfortable sharing and those we are not
  • Other possible ways of organizing online markets, such as through paid subscriptions for apps and services, never even got a chance to be tested.
  • Online surveillance came to be viewed as normal and even necessary by politicians, government bureaucrats, and the general public
  • Google and other Silicon Valley companies benefited directly from the government’s new stress on digital surveillance. They earned millions through contracts to share their data collection and analysis techniques with the National Security Agency
  • As much as the dot-com crash, the horrors of 9/11 set the stage for the rise of surveillance capitalism. Zuboff notes that, in 2000, members of the Federal Trade Commission, frustrated by internet companies’ lack of progress in adopting privacy protections, began formulating legislation to secure people’s control over their online information and severely restrict the companies’ ability to collect and store it. It seemed obvious to the regulators that ownership of personal data should by default lie in the hands of private citizens, not corporations.
  • The 9/11 attacks changed the calculus. The centralized collection and analysis of online data, on a vast scale, came to be seen as essential to national security. “The privacy provisions debated just months earlier vanished from the conversation more or less overnight,”
Javier E

Opinion | Imagination Is More Important Than You Think - The New York Times - 0 views

  • Plato and Aristotle disagreed about the imagination
  • Plato gave the impression that imagination is a somewhat airy-fairy luxury good. It deals with illusions and make-believe and distracts us from reality and our capacity to coolly reason about it. Aristotle countered that imagination is one of the foundations of all knowledge.
  • What is imagination?
  • ...14 more annotations...
  • Imagination is the capacity to make associations among all these bits of information and to synthesize them into patterns and concepts.
  • When you walk, say, into a coffee shop you don’t see an array of surfaces, lights and angles. Your imagination instantly coalesces all that into an image: “coffee shop.”
  • Neuroscientists have come to appreciate how fantastically complicated and subjective this process of creating mental images really is. You may think perception is a simple “objective” process of taking in the world and cognition is a complicated process of thinking about it. But that’s wrong.
  • Perception — the fast process of selecting, putting together, interpreting and experiencing facts, thoughts and emotions — is the essential poetic act that makes you you.
  • For example, you don’t see the naked concept “coffee shop.” The image you create is coated with personal feelings, memories and evaluations. You see: “slightly upscale suburban coffee shop trying and failing to send off a hipster vibe.” The imagination, Charles Darwin wrote, “unites former images and ideas, independently of the will, and thus creates brilliant and novel results.”
  • Imagination helps you perceive reality, try on other realities, predict possible futures, experience other viewpoints. And yet how much do schools prioritize the cultivation of this essential ability?
  • “A fool sees not the same tree that a wise man sees,” William Blake observed.
  • Can you improve your imagination? Yes. By creating complex and varied lenses through which to see the world
  • A person who feeds his or her imagination with a fuller repertoire of thoughts and experiences has the ability not only to see reality more richly but also — even more rare — to imagine the world through the imaginations of others.
  • This is the skill we see in Shakespeare to such a miraculous degree — his ability to disappear into his characters and inhabit their points of view without ever pretending to explain them.
  • Different people have different kinds of imagination. Some people mainly focus on the parts of the world that can be quantified.
  • it often doesn’t see the subjective way people coat the world with values and emotions and aspirations, which is exactly what we want to see if we want to glimpse how they experience their experience.
  • Furthermore, imagination can get richer over time. When you go to Thanksgiving dinner, your image of Uncle Frank contains the memories of past Thanksgivings, the arguments and the jokes, and the whole sum of your common experiences. The guy you once saw as an insufferable blowhard you now see — as your range of associations has widened and deepened — as a decent soul struggling with his wounds.
  • What happens to a society that lets so much of its imaginative capacity lie fallow? Perhaps you wind up in a society in which people are strangers to one another and themselves.
Emily Horwitz

The Country That Stopped Reading - NYTimes.com - 0 views

  • EARLIER this week, I spotted, among the job listings in the newspaper Reforma, an ad from a restaurant in Mexico City looking to hire dishwashers. The requirement: a secondary school diploma.
  • Years ago, school was not for everyone. Classrooms were places for discipline, study. Teachers were respected figures. Parents actually gave them permission to punish their children by slapping them or tugging their ears. But at least in those days, schools aimed to offer a more dignified life.
  • During a strike in 2008 in Oaxaca, I remember walking through the temporary campground in search of a teacher reading a book. Among tens of thousands, I found not one. I did find people listening to disco-decibel music, watching television, playing cards or dominoes, vegetating. I saw some gossip magazines, too.
  • ...10 more annotations...
  • Despite recent gains in industrial development and increasing numbers of engineering graduates, Mexico is floundering socially, politically and economically because so many of its citizens do not read. Upon taking office in December, our new president, Enrique Peña Nieto, immediately announced a program to improve education. This is typical. All presidents do this upon taking office.
  • Put the leader of the teachers’ union, Elba Esther Gordillo, in jail — which he did last week. Ms. Gordillo, who has led the 1.5 million-member union for 23 years, is suspected of embezzling about $200 million.
  • Nobody in Mexico organizes as many strikes as the teachers’ union. And, sadly, many teachers, who often buy or inherit their jobs, are lacking in education themselves.
  • they learn much less. They learn almost nothing. The proportion of the Mexican population that is literate is going up, but in absolute numbers, there are more illiterate people in Mexico now than there were 12 years ago
  • I picked out five of the ignorant majority and asked them to tell me why they didn’t like reading. The result was predictable: they stuttered, grumbled, grew impatient. None was able to articulate a sentence, express an idea.
  • In 2002, President Vicente Fox began a national reading plan; he chose as a spokesman Jorge Campos, a popular soccer player, ordered millions of books printed and built an immense library. Unfortunately, teachers were not properly trained and children were not given time for reading in school. The plan focused on the book instead of the reader. I have seen warehouses filled with hundreds of thousands of forgotten books, intended for schools and libraries, simply waiting for the dust and humidity to render them garbage.
  • When my daughter was 15, her literature teacher banished all fiction from her classroom. “We’re going to read history and biology textbooks,” she said, “because that way you’ll read and learn at the same time.” In our schools, children are being taught what is easy to teach rather than what they need to learn. It is for this reason that in Mexico — and many other countries — the humanities have been pushed aside.
  • it is natural that in secondary school we are training chauffeurs, waiters and dishwashers.
  • The educational machine does not need fine-tuning; it needs a complete change of direction. It needs to make students read, read and read.
  • But perhaps the Mexican government is not ready for its people to be truly educated. We know that books give people ambitions, expectations, a sense of dignity. If tomorrow we were to wake up as educated as the Finnish people, the streets would be filled with indignant citizens and our frightened government would be asking itself where these people got more than a dishwasher’s training.
  •  
    This article claimed that the more we read (not just textbooks, but fiction), the greater capacity we have to know. It also said that many of the students in Mexico do not learn much because their teachers are ill-educated. This made me think of the knowledge question: how much can we know if we rely on inaccurate knowledge by authority?
Emily Horwitz

Upside of Distraction - NYTimes.com - 0 views

  • Writing a book consists largely of avoiding distractions. If you can forget your real circumstances and submerge yourself in your subject for hours every day, characters become more human, sentences become clearer and prettier. But utter devotion to the principle that distraction is Satan and writing is paramount can be just as poisonous as an excess of diversion.
  • Monomania is what it sounds like: a pathologically intense focus on one thing.
  • It’s the opposite of the problem you have, in other words, if you are a normal, contemporary, non-agrarian 30-something.
  • ...8 more annotations...
  • There was nothing to do besides read, write, reflect on God and drink. It was a circumstance favorable to writing fiction. But it was also conducive to depravity, the old Calvinist definition thereof: a warping of the spirit.
  • When I socialized, it was often with poets, who confirmed by their very existence that I had landed in a better, vanished time. Even their physical ailments were of the 19th century. One day, in the depths of winter, I came upon one of them picking his way across the snow and ice on crutches, pausing to drag on his cigarette.
  • The disaster unfolded slowly. The professors and students were diplomatic, but a pall of boredom fell over the seminar table when my work was under discussion. I could see everyone struggling to care. And then, trying feverishly to write something that would engage people, I got worse. First my writing became overthought, and then it went rank with the odor of desperation. It got to the point that every chapter, short story, every essay was trash.
  • It took me a long time to realize that the utter domination of my consciousness by the desire to write well was itself the problem.
  • When good writing was my only goal, I made the quality of my work the measure of my worth. For this reason, I wasn’t able to read my own writing well. I couldn’t tell whether something I had just written was good or bad, because I needed it to be good in order to feel sane.
  • I purged myself of monomania — slowly, and somewhat unwittingly. I fell in love, an overpowering diversion, and began to spend more time at my girlfriend’s place, where she had Wi-Fi, a flat-screen TV and a DVD player.
  • One morning, after I diversified my mania, my writing no longer stank of decay.
  • I’m glad I went to 19th-century Russia. But I wish I had been more careful, more humble, and kept one foot in modernity. The thing about 19th-century Russia is that if you race in, heedless of all but conquest and glory, you get stuck.
  •  
    An interesting article about the need for distractions - if we focus too much on one thing at a time, we lose the capacity to tell whether it is good or not.
Javier E

History News Network - 0 views

  • A myth is a narrative that people tell to express their most basic views about what the world is like and how they should live in it. The myth serves that purpose whether it’s totally false, totally true, or (as is usually the case) some mixture of the two.
  • Fact-checking the myth is irrelevant to its role in the lives of the people who tell it.  They do not judge it by whether it can be proven factually true. Rather, it shapes their view of truth; it tells them what they can accept as factually true and what they must consider false. So they act out their myth in a ritual to reinforce their commitment to truth as the myth teaches them to see it -- or so the old theory goes.
  • What happens when fact-checking itself becomes a ritual? I don’t have quantitative data, but it seems to me that we have much more fact-checking in this presidential election than in any election before
  • ...12 more annotations...
  • Why is fact-checking so popular? The traditional American view of democracy has a ready answer: The people know that, to be responsible voters, they must know the facts.
  • There’s a complex myth of democracy packed into that little story. There’s a basic premise: Democracy can work because we humans are rational animals. We are built to be fact-checkers; we all have the capacity to separate true facts from lies. And once we have true facts, we know how to analyze them logically to come to reasonable conclusions. If that weren’t true, democracy would be a foolish experiment, indeed.
  • But, the myth goes on to say, a capacity is useless unless it is developed through training. That’s why democracy demands universal access to education
  • Only educated people can be responsible citizens because only the educated have actualized their potential for fact-checking and rational thinking.
  • The myth of democracy says that citizens must be educated enough to know which policies are best for their community. But good citizens must also bring their rationality into the polling booth.
  • they must have honesty from their leaders and transparency from their government.
  • Hence, the need for fact-checkers at every step on the campaign trail. It’s only logical. Except that there’s no evidence all the fact-checking has any measurable impact on the voters’ choices.
  • ideas hardly mattered any more than facts in the outcome of the first debate. Romney won on style points alone.
  • The “theater state” is a performance art. Every candidate is judged, above all, on their performance. Good theatrical performers know how to create satisfying illusory images of truth. It’s one of their highest skills. Mitt Romney proved that in the first debate.
  • When the final book is written on this campaign, one-sided deception will still have played a central role. As it stands, the very notions of fact and truth are employed in American politics as much to distort as to reveal. And until the voting public demands something else, not just from the politicians they oppose but also from the ones they support, there is little reason to suspect that will change.
  • The fact-checkers, too, are seasoned performers skilled in the art of creating satisfying illusory images of truth.
  • Above all, they create the illusion that American democracy is alive and well because the public is apparently being informed of the facts and the veracity of each candidate is apparently being carefully evaluated and widely reported. Fact-checking, then, is the ritual enactment of our myth of democracy. As long as the myth keeps getting acted out, we can trust that it is alive and well.
Javier E

Why It's OK to Let Apps Make You a Better Person - Evan Selinger - Technology - The Atl... - 0 views

  • one theme emerges from the media coverage of people's relationships with our current set of technologies: Consumers want digital willpower. App designers in touch with the latest trends in behavioral modification--nudging, the quantified self, and gamification--and good old-fashioned financial incentive manipulation, are tackling weakness of will. They're harnessing the power of payouts, cognitive biases, social networking, and biofeedback. The quantified self becomes the programmable self.
  • the trend still has multiple interesting dimensions
  • Individuals are turning ever more aspects of their lives into managerial problems that require technological solutions. We have access to an ever-increasing array of free and inexpensive technologies that harness incredible computational power that effectively allows us to self-police behavior everywhere we go. As pervasiveness expands, so does trust.
  • ...20 more annotations...
  • Some embrace networked, data-driven lives and are comfortable volunteering embarrassing, real time information about what we're doing, whom we're doing it with, and how we feel about our monitored activities.
  • Put it all together and we can see that our conception of what it means to be human has become "design space." We're now Humanity 2.0, primed for optimization through commercial upgrades. And today's apps are more harbinger than endpoint.
  • philosophers have had much to say about the enticing and seemingly inevitable dispersion of technological mental prosthetics that promise to substitute for or enhance some of our motivational powers.
  • beyond the practical issues lie a constellation of central ethical concerns.
  • they should cause us to pause as we think about a possible future that significantly increases the scale and effectiveness of willpower-enhancing apps. Let's call this hypothetical future Digital Willpower World and characterize the ethical traps we're about to discuss as potential general pitfalls
  • it is antithetical to the ideal of "resolute choice." Some may find the norm overly perfectionist, Spartan, or puritanical. However, it is not uncommon for folks to defend the idea that mature adults should strive to develop internal willpower strong enough to avoid external temptations, whatever they are, and wherever they are encountered.
  • In part, resolute choosing is prized out of concern for consistency, as some worry that a lapse of willpower in any context indicates a generally weak character.
  • Fragmented selves behave one way while under the influence of digital willpower, but another when making decisions without such assistance. In these instances, inconsistent preferences are exhibited and we risk underestimating the extent of our technological dependency.
  • It simply means that when it comes to digital willpower, we should be on our guard to avoid confusing situational with integrated behaviors.
  • the problem of inauthenticity, a staple of the neuroethics debates, might arise. People might start asking themselves: Has the problem of fragmentation gone away only because devices are choreographing our behavior so powerfully that we are no longer in touch with our so-called real selves -- the selves who used to exist before Digital Willpower World was formed?
  • Infantilized subjects are morally lazy, quick to have others take responsibility for their welfare. They do not view the capacity to assume personal responsibility for selecting means and ends as a fundamental life goal that validates the effort required to remain committed to the ongoing project of maintaining willpower and self-control.
  • Michael Sandel's Atlantic essay, "The Case Against Perfection." He notes that technological enhancement can diminish people's sense of achievement when their accomplishments become attributable to human-technology systems and not an individual's use of human agency.
  • Borgmann worries that this environment, which habituates us to be on auto-pilot and delegate deliberation, threatens to harm the powers of reason, the most central component of willpower (according to the rationalist tradition).
  • In several books, including Technology and the Character of Contemporary Life, he expresses concern about technologies that seem to enhance willpower but only do so through distraction. Borgmann's paradigmatic example of the non-distracted, focally centered person is a serious runner. This person finds the practice of running maximally fulfilling, replete with the rewarding "flow" that can only come when mind/body and means/ends are unified, while skill gets pushed to the limit.
  • Perhaps the very conception of a resolute self was flawed. What if, as psychologist Roy Baumeister suggests, willpower is more a "staple of folk psychology" than a real way of thinking about our brain processes?
  • novel approaches suggest the will is a flexible mesh of different capacities and cognitive mechanisms that can expand and contract, depending on the agent's particular setting and needs. Contrary to the traditional view that identifies the unified and cognitively transparent self as the source of willed actions, the new picture embraces a rather diffused, extended, and opaque self who is often guided by irrational trains of thought. What actually keeps the self and its will together are the given boundaries offered by biology, a coherent self narrative created by shared memories and experiences, and society. If this view of the will as an expanding and contracting system with porous and dynamic boundaries is correct, then it might seem that the new motivating technologies and devices can only increase our reach and further empower our willing selves.
  • "It's a mistake to think of the will as some interior faculty that belongs to an individual--the thing that pushes the motor control processes that cause my action," Gallagher says. "Rather, the will is both embodied and embedded: social and physical environment enhance or impoverish our ability to decide and carry out our intentions; often our intentions themselves are shaped by social and physical aspects of the environment."
  • It makes perfect sense to think of the will as something that can be supported or assisted by technology. Technologies, like environments and institutions can facilitate action or block it. Imagine I have the inclination to go to a concert. If I can get my ticket by pressing some buttons on my iPhone, I find myself going to the concert. If I have to fill out an application form and carry it to a location several miles away and wait in line to pick up my ticket, then forget it.
  • Perhaps the best way forward is to put a digital spin on the Socratic dictum of knowing myself and submit to the new freedom: the freedom of consuming digital willpower to guide me past the sirens.
Javier E

Atul Gawande: Failure and Rescue : The New Yorker - 0 views

  • the critical skills of the best surgeons I saw involved the ability to handle complexity and uncertainty. They had developed judgment, mastery of teamwork, and willingness to accept responsibility for the consequences of their choices. In this respect, I realized, surgery turns out to be no different than a life in teaching, public service, business, or almost anything you may decide to pursue. We all face complexity and uncertainty no matter where our path takes us. That means we all face the risk of failure. So along the way, we all are forced to develop these critical capacities—of judgment, teamwork, and acceptance of responsibility.
  • people admonish us: take risks; be willing to fail. But this has always puzzled me. Do you want a surgeon whose motto is “I like taking risks”? We do in fact want people to take risks, to strive for difficult goals even when the possibility of failure looms. Progress cannot happen otherwise. But how they do it is what seems to matter. The key to reducing death after surgery was the introduction of ways to reduce the risk of things going wrong—through specialization, better planning, and technology.
  • there continue to be huge differences between hospitals in the outcomes of their care. Some places still have far higher death rates than others. And an interesting line of research has opened up asking why.
  • ...8 more annotations...
  • I thought that the best places simply did a better job at controlling and minimizing risks—that they did a better job of preventing things from going wrong. But, to my surprise, they didn’t. Their complication rates after surgery were almost the same as others. Instead, what they proved to be really great at was rescuing people when they had a complication, preventing failures from becoming a catastrophe.
  • this is what distinguished the great from the mediocre. They didn’t fail less. They rescued more.
  • This may in fact be the real story of human and societal improvement. We talk a lot about “risk management”—a nice hygienic phrase. But in the end, risk is necessary. Things can and will go wrong. Yet some have a better capacity to prepare for the possibility, to limit the damage, and to sometimes even retrieve success from failure.
  • When things go wrong, there seem to be three main pitfalls to avoid, three ways to fail to rescue. You could choose a wrong plan, an inadequate plan, or no plan at all. Say you’re cooking and you inadvertently set a grease pan on fire. Throwing gasoline on the fire would be a completely wrong plan. Trying to blow the fire out would be inadequate. And ignoring it—“Fire? What fire?”—would be no plan at all.
  • All policies court failure—our war in Iraq, for instance, or the effort to stimulate our struggling economy. But when you refuse to even acknowledge that things aren’t going as expected, failure can become a humanitarian disaster. The sooner you’re able to see clearly that your best hopes and intentions have gone awry, the better. You have more room to pivot and adjust. You have more of a chance to rescue.
  • But recognizing that your expectations are proving wrong—accepting that you need a new plan—is commonly the hardest thing to do. We have this problem called confidence. To take a risk, you must have confidence in yourself
  • Yet you cannot blind yourself to failure, either. Indeed, you must prepare for it. For, strangely enough, only then is success possible.
  • So you will take risks, and you will have failures. But it’s what happens afterward that is defining. A failure often does not have to be a failure at all. However, you have to be ready for it—will you admit when things go wrong? Will you take steps to set them right?—because the difference between triumph and defeat, you’ll find, isn’t about willingness to take risks. It’s about mastery of rescue.
Javier E

The psychology of hate: How we deny human beings their humanity - Salon.com - 0 views

  • The cross-cultural psychologist Gustav Jahoda catalogued how Europeans since the time of the ancient Greeks viewed those living in relatively primitive cultures as lacking a mind in one of two ways: either lacking self-control and emotions, like an animal, or lacking reason and intellect, like a child. So foreign in appearance, language, and manner, “they” did not simply become other people, they became lesser people. More specifically, they were seen as having lesser minds, diminished capacities to either reason or feel.
  • In the early 1990s, California State Police commonly referred to crimes involving young black men as NHI—No Humans Involved.
  • The essence of dehumanization is, therefore, failing to recognize the fully human mind of another person. Those who fight against dehumanization typically deal with extreme cases that can make it seem like a relatively rare phenomenon. It is not. Subtle versions are all around us.
  • ...15 more annotations...
  • Even doctors—those whose business is to treat others humanely— can remain disengaged from the minds of their patients, particularly when those patients are easily seen as different from the doctors themselves. Until the early 1990s, for instance, it was routine practice for infants to undergo surgery without anesthesia. Why? Because at the time, doctors did not believe that infants were able to experience pain, a fundamental capacity of the human mind.
  • Your sixth sense functions only when you engage it. When you do not, you may fail to recognize a fully human mind that is right before your eyes.
  • Although it is indeed true that the ability to read the minds of others exists along a spectrum with stable individual differences, I believe that the more useful knowledge comes from understanding the moment-to-moment, situational influences that can lead even the most social person—yes, even you and me—to treat others as mindless animals or objects.
  • None of the cases described in this chapter so far involve people with chronic and stable personality disorders. Instead, they all come from predictable contexts in which people’s sixth sense remained disengaged for one fundamental reason: distance.
  • This three-part chain—sharing attention, imitating action, and imitation creating experience—shows one way in which your sixth sense works through your physical senses. More important, it also shows how your sixth sense could remain disengaged, leaving you disconnected from the minds of others. Close your eyes, look away, plug your ears, stand too far away to see or hear, or simply focus your attention elsewhere, and your sixth sense may not be triggered.
  • Distance keeps your sixth sense disengaged for at least two reasons. First, your ability to understand the minds of others can be triggered by your physical senses. When you’re too far away in physical space, those triggers do not get pulled. Second, your ability to understand the minds of others is also engaged by your cognitive inferences. Too far away in psychological space—too different, too foreign, too other—and those triggers, again, do not get pulled
  • For psychologists, distance is not just physical space. It is also psychological space, the degree to which you feel closely connected to someone else. You are describing psychological distance when you say that you feel “distant” from your spouse, “out of touch” with your kids’ lives, “worlds apart” from a neighbor’s politics, or “separated” from your employees. You don’t mean that you are physically distant from other people; you mean that you feel psychologically distant from them in some way
  • Interviews with U.S. soldiers in World War II found that only 15 to 20 percent were able to discharge their weapons at the enemy in close firefights. Even when they did shoot, soldiers found it hard to hit their human targets. In the U.S. Civil War, muskets were capable of hitting a pie plate at 70 yards and soldiers could typically reload anywhere from 4 to 5 times per minute. Theoretically, a regiment of 200 soldiers firing at a wall of enemy soldiers 100 feet wide should be able to kill 120 on the first volley. And yet the kill rate during the Civil War was closer to 1 to 2 men per minute, with the average distance of engagement being only 30 yards.
  • Modern armies now know that they have to overcome these empathic urges, so soldiers undergo relentless training that desensitizes them to close combat, so that they can do their jobs. Modern technology also allows armies to kill more easily because it enables killing at such a great physical distance. Much of the killing by U.S. soldiers now comes through the hands of drone pilots watching a screen from a trailer in Nevada, with their sixth sense almost completely disengaged.
  • Other people obviously do not need to be standing right in front of you for you to imagine what they are thinking or feeling or planning. You can simply close your eyes and imagine it.
  • The MPFC and a handful of other brain regions undergird the inferential component of your sixth sense. When this network of brain regions is engaged, you are thinking about others’ minds. Failing to engage this region when thinking about other people is then a solid indication that you’re overlooking their minds.
  • Research confirms that the MPFC is engaged more when you’re thinking about yourself, your close friends and family, and others who have beliefs similar to your own. It is activated when you care enough about others to care what they are thinking, and not when you are indifferent to others
  • As people become more and more different from us, or more distant from our immediate social networks, they become less and less likely to engage our MPFC. When we don’t engage this region, others appear relatively mindless, something less than fully human.
  • The mistake that can arise when you fail to engage with the minds of others is that you may come to think of them as relatively mindless. That is, you may come to think that these others have less going on between their ears than, say, you do.
  • It’s not only free will that other minds might seem to lack. This lesser minds effect has many manifestations, including what appears to be a universal tendency to assume that others’ minds are less sophisticated and more superficial than one’s own. Members of distant out-groups, ranging from terrorists to poor hurricane victims to political opponents, are also rated as less able to experience complicated emotions, such as shame, pride, embarrassment, and guilt than close members of one’s own group.
Javier E

Delay Kindergarten at Your Child's Peril - NYTimes.com - 2 views

  • THIS fall, one in 11 kindergarten-age children in the United States will not be going to class. Parents of these children often delay school entry in an attempt to give them a leg up on peers, but this strategy is likely to be counterproductive.
  • Teachers may encourage redshirting because more mature children are easier to handle in the classroom and initially produce better test scores than their younger classmates.
  • This advantage fades by the end of elementary school, though, and disadvantages start to accumulate. In high school, redshirted children are less motivated and perform less well. By adulthood, they are no better off in wages or educational attainment — in fact, their lifetime earnings are reduced by one year.
  • ...9 more annotations...
  • The benefits of being younger are even greater for those who skip a grade, an option available to many high-achieving children. Compared with nonskippers of similar talent and motivation, these youngsters pursue advanced degrees and enter professional school more often. Acceleration is a powerful intervention, with effects on achievement that are twice as large as programs for the gifted.
  • Parents who want to give their young children an academic advantage have a powerful tool: school itself. In a large-scale study at 26 Canadian elementary schools, first graders who were young for their year made considerably more progress in reading and math than kindergartners who were old for their year
  • school makes children smarter.
  • The question we should ask instead is: What approach gives children the greatest opportunity to learn?
  • These differences may come from the increased challenges of a demanding environment. Learning is maximized not by getting all the answers right, but by making errors and correcting them quickly.
  • Some children, especially boys, are slow to mature emotionally, a process that may be aided by the presence of older children.
  • The benefits of interacting with older children may extend to empathetic abilities. Empathy requires the ability to reason about the beliefs of others. This capacity relies on brain maturation, but it is also influenced by interactions with other children. Having an older (but not younger) sibling speeds the onset of this capacity in 3- to 5-year-olds. The acceleration is large: up to half a year per sibling.
  • children are not on a fixed trajectory but learn actively from teachers — and classmates. It matters very much who a child’s peers are. Redshirted children begin school with others who are a little further behind them. Because learning is social, the real winners in that situation are their classmates.
  •  
    I had never realized how incredibly critical the first years of a child's life were. This situation seems almost like a win-lose one; the younger children are more challenged and thus more prepared later on in life while the older ones will always be less motivated and all-around strong. Does this mean that we must set up our classrooms to have some students be statistically advantaged in life while others might potentially suffer? ARE WE GONNA DO THAT?!
Javier E

Stop Googling. Let's Talk. - The New York Times - 3 views

  • In a 2015 study by the Pew Research Center, 89 percent of cellphone owners said they had used their phones during the last social gathering they attended. But they weren’t happy about it; 82 percent of adults felt that the way they used their phones in social settings hurt the conversation.
  • I’ve been studying the psychology of online connectivity for more than 30 years. For the past five, I’ve had a special focus: What has happened to face-to-face conversation in a world where so many people say they would rather text than talk?
  • Young people spoke to me enthusiastically about the good things that flow from a life lived by the rule of three, which you can follow not only during meals but all the time. First of all, there is the magic of the always available elsewhere. You can put your attention wherever you want it to be. You can always be heard. You never have to be bored.
  • ...23 more annotations...
  • But the students also described a sense of loss.
  • A 15-year-old boy told me that someday he wanted to raise a family, not the way his parents are raising him (with phones out during meals and in the park and during his school sports events) but the way his parents think they are raising him — with no phones at meals and plentiful family conversation. One college junior tried to capture what is wrong about life in his generation. “Our texts are fine,” he said. “It’s what texting does to our conversations when we are together that’s the problem.”
  • One teacher observed that the students “sit in the dining hall and look at their phones. When they share things together, what they are sharing is what is on their phones.” Is this the new conversation? If so, it is not doing the work of the old conversation. The old conversation taught empathy. These students seem to understand each other less.
  • In 2010, a team at the University of Michigan led by the psychologist Sara Konrath put together the findings of 72 studies that were conducted over a 30-year period. They found a 40 percent decline in empathy among college students, with most of the decline taking place after 2000.
  • We’ve gotten used to being connected all the time, but we have found ways around conversation — at least from conversation that is open-ended and spontaneous, in which we play with ideas and allow ourselves to be fully present and vulnerable. But it is in this type of conversation — where we learn to make eye contact, to become aware of another person’s posture and tone, to comfort one another and respectfully challenge one another — that empathy and intimacy flourish. In these conversations, we learn who we are.
  • the trend line is clear. It’s not only that we turn away from talking face to face to chat online. It’s that we don’t allow these conversations to happen in the first place because we keep our phones in the landscape.
  • It’s a powerful insight. Studies of conversation both in the laboratory and in natural settings show that when two people are talking, the mere presence of a phone on a table between them or in the periphery of their vision changes both what they talk about and the degree of connection they feel. People keep the conversation on topics where they won’t mind being interrupted. They don’t feel as invested in each other. Even a silent phone disconnects us.
  • Yalda T. Uhls was the lead author on a 2014 study of children at a device-free outdoor camp. After five days without phones or tablets, these campers were able to read facial emotions and correctly identify the emotions of actors in videotaped scenes significantly better than a control group. What fostered these new empathic responses? They talked to one another. In conversation, things go best if you pay close attention and learn how to put yourself in someone else’s shoes. This is easier to do without your phone in hand. Conversation is the most human and humanizing thing that we do.
  • At a nightly cabin chat, a group of 14-year-old boys spoke about a recent three-day wilderness hike. Not that many years ago, the most exciting aspect of that hike might have been the idea of roughing it or the beauty of unspoiled nature. These days, what made the biggest impression was being phoneless. One boy called it “time where you have nothing to do but think quietly and talk to your friends.” The campers also spoke about their new taste for life away from the online feed. Their embrace of the virtue of disconnection suggests a crucial connection: The capacity for empathic conversation goes hand in hand with the capacity for solitude.
  • In solitude we find ourselves; we prepare ourselves to come to conversation with something to say that is authentic, ours. If we can’t gather ourselves, we can’t recognize other people for who they are. If we are not content to be alone, we turn others into the people we need them to be. If we don’t know how to be alone, we’ll only know how to be lonely.
  • we have put this virtuous circle in peril. We turn time alone into a problem that needs to be solved with technology.
  • People sometimes say to me that they can see how one might be disturbed when people turn to their phones when they are together. But surely there is no harm when people turn to their phones when they are by themselves? If anything, it’s our new form of being together.
  • But this way of dividing things up misses the essential connection between solitude and conversation. In solitude we learn to concentrate and imagine, to listen to ourselves. We need these skills to be fully present in conversation.
  • One start toward reclaiming conversation is to reclaim solitude. Some of the most crucial conversations you will ever have will be with yourself. Slow down sufficiently to make this possible. And make a practice of doing one thing at a time. Think of unitasking as the next big thing. In every domain of life, it will increase performance and decrease stress.
  • Multitasking comes with its own high, but when we chase after this feeling, we pursue an illusion. Conversation is a human way to practice unitasking.
  • Our phones are not accessories, but psychologically potent devices that change not just what we do but who we are. A second path toward conversation involves recognizing the degree to which we are vulnerable to all that connection offers. We have to commit ourselves to designing our products and our lives to take that vulnerability into account.
  • We can choose not to carry our phones all the time. We can park our phones in a room and go to them every hour or two while we work on other things or talk to other people. We can carve out spaces at home or work that are device-free, sacred spaces for the paired virtues of conversation and solitude.
  • Families can find these spaces in the day to day — no devices at dinner, in the kitchen and in the car.
  • Engineers are ready with more ideas: What if our phones were not designed to keep us attached, but to do a task and then release us? What if the communications industry began to measure the success of devices not by how much time consumers spend on them but by whether it is time well spent?
  • The young woman who is so clear about the seven minutes that it takes to see where a conversation is going admits that she often doesn’t have the patience to wait for anything near that kind of time before going to her phone. In this she is characteristic of what the psychologists Howard Gardner and Katie Davis called the “app generation,” which grew up with phones in hand and apps at the ready. It tends toward impatience, expecting the world to respond like an app, quickly and efficiently. The app way of thinking starts with the idea that actions in the world will work like algorithms: Certain actions will lead to predictable results.
  • This attitude can show up in friendship as a lack of empathy. Friendships become things to manage; you have a lot of them, and you come to them with tools
  • here is a first step: To reclaim conversation for yourself, your friendships and society, push back against viewing the world as one giant app. It works the other way, too: Conversation is the antidote to the algorithmic way of looking at life because it teaches you about fluidity, contingency and personality.
  • We have time to make corrections and remember who we are — creatures of history, of deep psychology, of complex relationships, of conversations, artless, risky and face to face.
Javier E

Does Google Make Us Stupid? - Pew Research Center - 0 views

  • Carr argued that the ease of online searching and distractions of browsing through the web were possibly limiting his capacity to concentrate. "I'm not thinking the way I used to," he wrote, in part because he is becoming a skimming, browsing reader, rather than a deep and engaged reader. "The kind of deep reading that a sequence of printed pages promotes is valuable not just for the knowledge we acquire from the author's words but for the intellectual vibrations those words set off within our own minds. In the quiet spaces opened up by the sustained, undistracted reading of a book, or by any other act of contemplation, for that matter, we make our own associations, draw our own inferences and analogies, foster our own ideas.... If we lose those quiet spaces, or fill them up with ‘content,' we will sacrifice something important not only in our selves but in our culture."
  • force us to get smarter if we are to survive. "Most people don't realize that this process is already under way," he wrote. "In fact, it's happening all around us, across the full spectrum of how we understand intelligence. It's visible in the hive mind of the Internet, in the powerful tools for simulation and visualization that are jump-starting new scientific disciplines, and in the development of drugs that some people (myself included) have discovered let them study harder, focus better, and stay awake longer with full clarity." He argued that while the proliferation of technology and media can challenge humans' capacity to concentrate there were signs that we are developing "fluid intelligence-the ability to find meaning in confusion and solve new problems, independent of acquired knowledge." He also expressed hope that techies will develop tools to help people find and assess information smartly.
  • 76% of the experts agreed with the statement, "By 2020, people's use of the internet has enhanced human intelligence; as people are allowed unprecedented access to more information they become smarter and make better choices. Nicholas Carr was wrong: Google does not make us stupid."
Javier E

Zachary Stockill: The Want for Privacy: Facebook's Assault on Friendship - 1 views

  • privacy is in turn the basis of a person's capacity for friendship and intimacy. [People] who lose the guarantee of privacy also eventually lose the capacity for making friends.
  • What is unsettling is that so many of us are voluntarily declining this right to privacy, and opening up our lives to a vast consortium of various, and often spurious, acquaintances: "Facebook friends."
  • Aside from the basics -- relationship status (whether listed or unlisted, have a look at the photo albums -- you'll know), age, school, and other categories such as employment -- by reading between the lines you will discover a wealth of information about poor Joe's hapless existence: his income, the details of his social life, if he got fat(ter), if his Grandma/dog/dealer died, what he's eating, the movies he likes, the movies he doesn't like, if he got dumb(er), if he's getting any, if he's a drunkard, if he drives a Camaro, if he voted for Obama (he didn't), if he watches Glenn Beck (he does), etc. etc. etc. It is likely that you will be able to determine, in a very real sense, the nature of Joe's current existence, warts and all.
  • what does it say about our society when we pass about freely the details of our personal lives with an audience of several hundred -- in some cases, thousands -- of onlookers, many of whom we barely like or even know? Indeed, many of these Facebook "friends" are genuine friends, lovers, family. Surely worthy of our trust. But how many of your Facebook "friends" are opportunistic voyeurs who remain your "friend" only to retain access to your world, far removed from any direct, meaningful, personal interaction?
  • I fear for the day when your dissociation from the physical exposes the fact that your online "community" is no substitute for genuine, human companionship and intimacy.
Javier E

Why Elders Smile - NYTimes.com - 1 views

  • When researchers ask people to assess their own well-being, people in their 20s rate themselves highly. Then there’s a decline as people get sadder in middle age, bottoming out around age 50. But then happiness levels shoot up, so that old people are happier than young people. The people who rate themselves most highly are those ages 82 to 85.
  • Older people are more relaxed, on average. They are spared some of the burden of thinking about the future. As a result, they get more pleasure out of present, ordinary activities.
  • I’d rather think that elder happiness is an accomplishment, not a condition, that people get better at living through effort, by mastering specific skills. I’d like to think that people get steadily better at handling life’s challenges. In middle age, they are confronted by stressful challenges they can’t control, like having teenage children. But, in old age, they have more control over the challenges they will tackle and they get even better at addressing them.
  • Aristotle teaches us that being a good person is not mainly about learning moral rules and following them. It is about performing social roles well — being a good parent or teacher or lawyer or friend.
  • First, there’s bifocalism, the ability to see the same situation from multiple perspectives.
  • “Anyone who has worn bifocal lenses knows that it takes time to learn to shift smoothly between perspectives and to combine them in a single field of vision. The same is true of deliberation. It is difficult to be compassionate, and often just as difficult to be detached, but what is most difficult of all is to be both at once.”
  • Only with experience can a person learn to see a fraught situation both close up, with emotional intensity, and far away, with detached perspective.
  • Then there’s lightness, the ability to be at ease with the downsides of life.
  • while older people lose memory, they also learn that most setbacks are not the end of the world. Anxiety is the biggest waste in life. If you know that you'll recover, you can save time and get on with it sooner.
  • Then there is the ability to balance tensions. In “Practical Wisdom,” Barry Schwartz and Kenneth Sharpe argue that performing many social roles means balancing competing demands. A doctor has to be honest but also kind. A teacher has to instruct but also inspire.
  • You can’t find the right balance in each context by memorizing a rule book. This form of wisdom can only be earned by acquiring a repertoire of similar experiences.
  • Finally, experienced heads have intuitive awareness of the landscape of reality, a feel for what other people are thinking and feeling, an instinct for how events will flow.
  • a lifetime of intellectual effort can lead to empathy and pattern awareness. “What I have lost with age in my capacity for hard mental work,” Goldberg writes, “I seem to have gained in my capacity for instantaneous, almost unfairly easy insight.”