TOK Friends: Group items matching "Skills" in title, tags, annotations or url

Critical Thinking Skills - 0 views

  • There are as many definitions of critical thinking as there are writers about it.
  • "The purpose of critical thinking is, therefore, to achieve understanding, evaluate view points, and solve problems
  • In addition, it stresses the need for instruction and student activity to progress from lower to higher levels of critical analysis.

Critical Thinking: Identifying the Targets - 0 views

  •  National assessments in virtually every subject indicate that, although our students can perform basic skills pretty well, they are not doing well on thinking and reasoning.
  • Textbooks in this country typically pay scant attention to big ideas, offer no analysis, and pose no challenging questions
  • Critical thinking is based on two assumptions: first, that the quality of our thinking affects the quality of our lives, and second, that everyone can learn how to continually improve the quality of his or her thinking.

Is the 'gig economy' turning us all into freelancers? - BBC News - 0 views

  • Well, thanks to the rise of on-demand talent marketplaces, the so-called "gig economy" is fast becoming a reality.
  • Cloud-based platforms are making it easier for firms to find the people they need from a global talent pool, and for freelancers to advertise their skills.
  • And in the US, around 54 million people are now freelance, roughly a third of all workers.

The scientific mystery of why humans love music - Vox - 0 views

  • From an evolutionary perspective, it makes no sense whatsoever that music makes us feel emotions. Why would our ancestors have cared about music?
  • Why does something as abstract as music provoke such consistent emotions?
  • Studies have shown that when we listen to music, our brains release dopamine, which in turn makes us happy
  • ...12 more annotations...
  • It's quite possible that our love of music was simply an accident. We originally evolved emotions to help us navigate dangerous worlds (fear) and social situations (joy). And somehow, the tones and beats of musical composition activate similar brain areas.
  • In a study published in Nature Neuroscience led by Zatorre, researchers found that dopamine release is strongest when a piece of music reaches an emotional peak and the listener feels "chills," the spine-tingling sensation of excitement and awe.
  • "Music engages the same [reward] system, even though it is not biologically necessary for survival," says Zatorre.
  • Presumably, we evolved to recognize patterns because it's an essential skill for survival. Does a rustling in the trees mean a dangerous animal is about to attack? Does the smell of smoke mean I should run, because a fire may be coming my way?
  • Music is a pattern. As we listen, we're constantly anticipating what melodies, harmonies, and rhythms may come next.
  • That's why we typically don't like styles of music we're not familiar with. When we're unfamiliar with a style of music, we don't have a basis to predict its patterns
  • We learn through our cultures what sounds constitute music. The rest is random noise.
  • When we hear a piece of music, its rhythm latches onto us in a process called entrainment. If the music is fast-paced, our heartbeats and breathing patterns will accelerate to match the beat.
  • Another hypothesis is that music latches onto the regions of the brain attuned to speech — which convey all of our emotions.
  • "It makes sense that our brains are really good at picking up emotions in speech," the French Institute of Science's Aucouturier says. It's essential to understand if those around us are happy, sad, angry, or scared. Much of that information is contained in the tone of a person's speech. Higher-pitched voices sound happier. More warbled voices are scared.
  • Music may then be an exaggerated version of speech.
  • And because we tend to mirror the emotions we hear in others, if the music is mimicking happy speech, then the listener will become happy too.

Belief versus Knowledge - 2 views

  • Knowledge has been defined as "A clear perception of a truth or fact, erudition; skill from practice." Also "to know, viz.; To perceive with certainty, to understand clearly, to have experience of." On the other hand, Belief is an "Assent to anything proposed or declared, and its acceptance as fact by reason of the authority from whence it proceeds, apart from personal knowledge; faith; the whole body of tenets held by any faith; a creed; a conviction."
  •  
    This is an interesting article because sometimes I have trouble wrapping my head around the difference, especially when I think of personal knowledge vs. belief. - Ryan (9/14/16)

Adam Kirsch: Art Over Biology | The New Republic - 1 views

  • Nietzsche, who wrote in Human, All Too Human, under the rubric “Art dangerous for the artist,” about the particular ill-suitedness of the artist to flourishing in a modern scientific age: When art seizes an individual powerfully, it draws him back to the views of those times when art flowered most vigorously.... The artist comes more and more to revere sudden excitements, believes in gods and demons, imbues nature with a soul, hates science, becomes unchangeable in his moods like the men of antiquity, and desires the overthrow of all conditions that are not favorable to art.... Thus between him and the other men of his period who are the same age a vehement antagonism is finally generated, and a sad end
  • What is modern is the sense of the superiority of the artist’s inferiority, which is only possible when the artist and the intellectual come to see the values of ordinary life—prosperity, family, worldly success, and happiness—as inherently contemptible.
  • Art, according to a modern understanding that has not wholly vanished today, is meant to be a criticism of life, especially of life in a materialist, positivist civilization such as our own. If this means the artist does not share in civilization’s boons, then his suffering will be a badge of honor.
  • ...18 more annotations...
  • The iron law of Darwinian evolution is that everything that exists strives with all its power to reproduce, to extend life into the future, and that every feature of every creature can be explained as an adaptation toward this end. For the artist to deny any connection with the enterprise of life, then, is to assert his freedom from this universal imperative; to reclaim negatively the autonomy that evolution seems to deny to human beings. It is only because we can freely choose our own ends that we can decide not to live for life, but for some other value that we posit. The artist’s decision to produce spiritual offspring rather than physical ones is thus allied to the monk’s celibacy and the warrior’s death for his country, as gestures that deny the empire of mere life.
  • Animals produce beauty on their bodies; humans can also produce it in their artifacts. The natural inference, then, would be that art is a human form of sexual display, a way for males to impress females with spectacularly redundant creations.
  • For Darwin, the human sense of beauty was not different in kind from the bird’s.
  • Still, Darwin recognized that the human sense of beauty was mediated by “complex ideas and trains of thought,” which make it impossible to explain in terms as straightforward as a bird’s:
  • Put more positively, one might say that any given work of art can be discussed critically and historically, but not deduced from the laws of evolution.
  • with the rise of evolutionary psychology, it was only a matter of time before the attempt was made to explain art in Darwinian terms. After all, if ethics and politics can be explained by game theory and reciprocal altruism, there is no reason why aesthetics should be different: in each case, what appears to be a realm of human autonomy can be reduced to the covert expression of biological imperatives
  • Still, there is an unmistakable sense in discussions of Darwinian aesthetics that by linking art to fitness, we can secure it against charges of irrelevance or frivolousness—that mattering to reproduction is what makes art, or anything, really matter.
  • The first popular effort in this direction was the late Denis Dutton’s much-discussed book The Art Instinct, which appeared in 2009.
  • Dutton’s Darwinism was aesthetically conservative: “Darwinian aesthetics,” he wrote, “can restore the vital place of beauty, skill, and pleasure as high artistic values.” Dutton’s argument has recently been reiterated and refined by a number of new books,
  • “The universality of art and artistic behaviors, their spontaneous appearance everywhere across the globe ... and the fact that in most cases they can be easily recognized as artistic across cultures suggest that they derive from a natural, innate source: a universal human psychology.”
  • Again like language, art is universal in the sense that any local expression of it can be “learned” by anyone.
  • Yet earlier theorists of evolution were reluctant to say that art was an evolutionary adaptation like language, for the simple reason that it does not appear to be evolutionarily adaptive.
  • Stephen Jay Gould suggested that art was not an evolutionary adaptation but what he called a “spandrel”—that is, a showy but accidental by-product of other adaptations that were truly functional.
  • the very words “success” and “failure,” despite themselves, bring an emotive and ethical dimension into the discussion, so impossible is it for human beings to inhabit a valueless world. In the nineteenth century, the idea that fitness for survival was a positive good motivated social Darwinism and eugenics. Proponents of these ideas thought that in some way they were serving progress by promoting the flourishing of the human race, when the basic premise of Darwinism is that there is no such thing as progress or regress, only differential rates of reproduction
  • In particular, Darwin suggests that it is impossible to explain the history or the conventions of any art by the general imperatives of evolution
  • Boyd begins with the premise that human beings are pattern-seeking animals: both our physical perceptions and our social interactions are determined by our brain’s innate need to find and to make coherent patterns.
  • Art, then, can be defined as the calisthenics of pattern-finding. “Just as animal physical play refines performance, flexibility, and efficiency in key behaviors,” Boyd writes, “so human art refines our performance in our key perceptual and cognitive modes, in sight (the visual arts), sound (music), and social cognition (story). These three modes of art, I propose, are adaptations ... they show evidence of special design in humans, design that offers survival and especially reproductive advantages.”

Why Are Some People So Smart? The Answer Could Spawn a Generation of Superbabies | WIRED - 0 views

  • use those machines to examine the genetic underpinnings of genius like his own. He wants nothing less than to crack the code for intelligence by studying the genomes of thousands of prodigies, not just from China but around the world.
  • fully expect they will succeed in identifying a genetic basis for IQ. They also expect that within a decade their research will be used to screen embryos during in vitro fertilization, boosting the IQ of unborn children by up to 20 points. In theory, that’s the difference between a kid who struggles through high school and one who sails into college.
  • studies make it clear that IQ is strongly correlated with the ability to solve all sorts of abstract problems, whether they involve language, math, or visual patterns. The frightening upshot is that IQ remains by far the most powerful predictor of the life outcomes that people care most about in the modern world. Tell me your IQ and I can make a decently accurate prediction of your occupational attainment, how many kids you’ll have, your chances of being arrested for a crime, even how long you’ll live.
  • ...6 more annotations...
  • Dozens of popular books by nonexperts have filled the void, many claiming that IQ—which after more than a century remains the dominant metric for intelligence—predicts nothing important or that intelligence is simply too complex and subtle to be measured.
  • evidence points toward a strong genetic component in IQ. Based on studies of twins, siblings, and adoption, contemporary estimates put the heritability of IQ at 50 to 80 percent
  • intelligence has a genetic recipe
  • “Do you know any Perl?” Li asked him. Perl is a programming language often used to analyze genomic data. Zhao admitted he did not; in fact, he had no programming skills at all. Li handed him a massive textbook, Programming Perl. There were only two weeks left in the camp, so this would get rid of the kid for good. A few days later, Zhao returned. “I finished it,” he said. “The problems are kind of boring. Do you have anything harder?” Perl is a famously complicated language that takes university students a full year to learn.
  • So Li gave him a large DNA data set and a complicated statistical problem. That should do it. But Zhao returned later that day. “Finished.” Not only was it finished—and correct—but Zhao had even built a slick interface on top of the data.
  • driven by a fascination with kids who are born smart; he wants to know what makes them—and by extension, himself—the way they are.
  •  
    This is a really interesting article about using science to improve intelligence.
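
A side note on the heritability figures quoted above ("50 to 80 percent" from studies of twins, siblings, and adoption): one classical back-of-the-envelope method is Falconer's formula, which doubles the gap between identical-twin and fraternal-twin correlations. The sketch below is illustrative only; the correlation values are made up for the example and are not taken from the article.

```python
def falconer_heritability(r_mz: float, r_dz: float) -> float:
    """Rough heritability estimate from twin-study correlations.

    r_mz: trait correlation between identical (monozygotic) twins
    r_dz: trait correlation between fraternal (dizygotic) twins
    Falconer's formula: h^2 ~= 2 * (r_mz - r_dz)
    """
    return 2 * (r_mz - r_dz)

# Hypothetical correlations, chosen only to show the arithmetic:
# if identical twins' IQ scores correlate at 0.85 and fraternal twins' at 0.55,
# the estimated heritability is 2 * (0.85 - 0.55) = 0.6, i.e. about 60%.
print(round(falconer_heritability(0.85, 0.55), 2))  # 0.6
```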

Teaching kids philosophy makes them smarter in math and English - 0 views

  •  
    Schools face relentless pressure to up their offerings in the STEM fields (science, technology, engineering, and math). Few are making the case for philosophy. Maybe they should. Nine- and 10-year-old children in England who participated in a philosophy class once a week over the course of a year significantly boosted their math and literacy skills.

Do Political Experts Know What They're Talking About? | Wired Science | Wired... - 1 views

  • I often joke that every cable news show should be forced to display a disclaimer, streaming in a loop at the bottom of the screen. The disclaimer would read: “These talking heads have been scientifically proven to not know what they are talking about. Their blather is for entertainment purposes only.” The viewer would then be referred to Tetlock’s most famous research project, which began in 1984.
  • He picked a few hundred political experts – people who made their living “commenting or offering advice on political and economic trends” – and began asking them to make predictions about future events. He had a long list of pertinent questions. Would George Bush be re-elected? Would there be a peaceful end to apartheid in South Africa? Would Quebec secede from Canada? Would the dot-com bubble burst? In each case, the pundits were asked to rate the probability of several possible outcomes. Tetlock then interrogated the pundits about their thought process, so that he could better understand how they made up their minds.
  • Most of Tetlock’s questions had three possible answers; the pundits, on average, selected the right answer less than 33 percent of the time. In other words, a dart-throwing chimp would have beaten the vast majority of professionals. These results are summarized in his excellent Expert Political Judgment.
  • ...5 more annotations...
  • Some experts displayed a top-down style of reasoning: politics as a deductive art. They started with a big-idea premise about human nature, society, or economics and applied it to the specifics of the case. They tended to reach more confident conclusions about the future. And the positions they reached were easier to classify ideologically: that is the Keynesian prediction and that is the free-market fundamentalist prediction and that is the worst-case environmentalist prediction and that is the best case technology-driven growth prediction etc. Other experts displayed a bottom-up style of reasoning: politics as a much messier inductive art. They reached less confident conclusions and they are more likely to draw on a seemingly contradictory mix of ideas in reaching those conclusions (sometimes from the left, sometimes from the right). We called the big-idea experts “hedgehogs” (they know one big thing) and the more eclectic experts “foxes” (they know many, not so big things).
  • The most consistent predictor of consistently more accurate forecasts was “style of reasoning”: experts with the more eclectic, self-critical, and modest cognitive styles tended to outperform the big-idea people (foxes tended to outperform hedgehogs).
  • Lehrer: Can non-experts do anything to encourage a more effective punditocracy?
  • Tetlock: Yes, non-experts can encourage more accountability in the punditocracy. Pundits are remarkably skillful at appearing to go out on a limb in their claims about the future, without actually going out on one. For instance, they often “predict” continued instability and turmoil in the Middle East (predicting the present) but they virtually never get around to telling you exactly what would have to happen to disconfirm their expectations. They are essentially impossible to pin down. If pundits felt that their public credibility hinged on participating in level playing field forecasting exercises in which they must pit their wits against an extremely difficult-to-predict world, I suspect they would learn, quite quickly, to be more flexible and foxlike in their policy pronouncements.
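
A note on the "less than 33 percent" benchmark in the excerpts above: with three possible answers per question, blind guessing is expected to be right about one time in three, which is why scoring below that is worse than a dart-throwing chimp. A minimal simulation of that chance baseline (illustrative only, not Tetlock's data):

```python
import random

# Each question has three possible outcomes, exactly one of which comes true.
# A "dart-throwing chimp" picks one of the three options uniformly at random.
def chimp_accuracy(n_questions: int, n_options: int = 3) -> float:
    correct = 0
    for _ in range(n_questions):
        truth = random.randrange(n_options)   # which outcome actually happens
        guess = random.randrange(n_options)   # the chimp's blind pick
        correct += (guess == truth)
    return correct / n_questions

# Over many questions the chimp converges on 1/3, i.e. ~33% accuracy, so an
# expert averaging below 33% is doing worse than pure chance.
print(f"chimp baseline: {chimp_accuracy(100_000):.3f}")  # ~0.333
```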

Scientists Figure Out When Different Cognitive Abilities Peak Throughout Life | Big Think - 0 views

  • Such skills come from accumulated knowledge which benefits from a lifetime of experience. 
  • Vocabulary, in fact, peaked even later, in the late 60s to early 70s. So now you know why grandpa is so good at crosswords.
  • And here’s a win for the 40+ folks - the below representation of a test of 10,000 visitors to TestMyBrain.org shows that older subjects did better than the young on the vocabulary test.
  • ...4 more annotations...
  • The under-30 group did much better on memory-related tasks, however.
  • Is there one age when all of your mental powers are at their maximum? The researchers don’t think so.  
  • In general, the researchers found 24 to be a key age, after which player abilities slowly declined, losing about 15% of the speed every 15 years. 
  • Older players did perform better in some aspects, making up for the slower brain processing by using simpler strategies and being more efficient. They were, in other words, wiser.
  •  
    It is really surprising to me that cognitive abilities are directly related to age. But it is understandable, since there also feels like a gulf between seniors and teenagers. There is always something we are especially good at at a certain age. I think this aligns with the logic of evolution: society consists of people of different ages, so they will cooperate well and reach the maximum benefit by working together. Society is really diverse, and having people of different ages on the same team can cover up the cognitive disadvantages of others. --Sissi (4/4/2017)
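
To make the "about 15% of the speed every 15 years" figure quoted above concrete: if the loss compounds, a 54-year-old would retain roughly 0.85 × 0.85 ≈ 72% of their age-24 processing speed. A small sketch of that arithmetic (the 15%-per-15-years rate is the article's; the compounding assumption is mine, not the researchers'):

```python
def relative_speed(age: float, peak_age: float = 24,
                   loss: float = 0.15, period: float = 15) -> float:
    """Processing speed relative to its age-24 peak, assuming the quoted
    ~15% loss per 15 years compounds multiplicatively after the peak."""
    if age <= peak_age:
        return 1.0
    return (1 - loss) ** ((age - peak_age) / period)

# e.g. 39 -> 0.85, 54 -> ~0.72, 69 -> ~0.61 of peak speed
for age in (24, 39, 54, 69):
    print(age, round(relative_speed(age), 2))
```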

The Flight From Conversation - NYTimes.com - 0 views

  • we have sacrificed conversation for mere connection.
  • the little devices most of us carry around are so powerful that they change not only what we do, but also who we are.
  • A businessman laments that he no longer has colleagues at work. He doesn’t stop by to talk; he doesn’t call. He says that he doesn’t want to interrupt them. He says they’re “too busy on their e-mail.”
  • ...19 more annotations...
  • We want to customize our lives. We want to move in and out of where we are because the thing we value most is control over where we focus our attention. We have gotten used to the idea of being in a tribe of one, loyal to our own party.
  • We are tempted to think that our little “sips” of online connection add up to a big gulp of real conversation. But they don’t.
  • “Someday, someday, but certainly not now, I’d like to learn how to have a conversation.”
  • We can’t get enough of one another if we can use technology to keep one another at distances we can control: not too close, not too far, just right. I think of it as a Goldilocks effect. Texting and e-mail and posting let us present the self we want to be. This means we can edit. And if we wish to, we can delete. Or retouch: the voice, the flesh, the face, the body. Not too much, not too little — just right.
  • Human relationships are rich; they’re messy and demanding. We have learned the habit of cleaning them up with technology.
  • I have often heard the sentiment “No one is listening to me.” I believe this feeling helps explain why it is so appealing to have a Facebook page or a Twitter feed — each provides so many automatic listeners. And it helps explain why — against all reason — so many of us are willing to talk to machines that seem to care about us. Researchers around the world are busy inventing sociable robots, designed to be companions to the elderly, to children, to all of us.
  • Connecting in sips may work for gathering discrete bits of information or for saying, “I am thinking about you.” Or even for saying, “I love you.” But connecting in sips doesn’t work as well when it comes to understanding and knowing one another. In conversation we tend to one another.
  • We can attend to tone and nuance. In conversation, we are called upon to see things from another’s point of view.
  • I’m the one who doesn’t want to be interrupted. I think I should. But I’d rather just do things on my BlackBerry.
  • And we use conversation with others to learn to converse with ourselves. So our flight from conversation can mean diminished chances to learn skills of self-reflection
  • we have little motivation to say something truly self-reflective. Self-reflection in conversation requires trust. It’s hard to do anything with 3,000 Facebook friends except connect.
  • we seem almost willing to dispense with people altogether. Serious people muse about the future of computer programs as psychiatrists. A high school sophomore confides to me that he wishes he could talk to an artificial intelligence program instead of his dad about dating; he says the A.I. would have so much more in its database. Indeed, many people tell me they hope that as Siri, the digital assistant on Apple’s iPhone, becomes more advanced, “she” will be more and more like a best friend — one who will listen when others won’t.
  • FACE-TO-FACE conversation unfolds slowly. It teaches patience. When we communicate on our digital devices, we learn different habits. As we ramp up the volume and velocity of online connections, we start to expect faster answers. To get these, we ask one another simpler questions; we dumb down our communications, even on the most important matters.
  • WE expect more from technology and less from one another and seem increasingly drawn to technologies that provide the illusion of companionship without the demands of relationship. Always-on/always-on-you devices provide three powerful fantasies: that we will always be heard; that we can put our attention wherever we want it to be; and that we never have to be alone. Indeed our new devices have turned being alone into a problem that can be solved.
  • When people are alone, even for a few moments, they fidget and reach for a device. Here connection works like a symptom, not a cure, and our constant, reflexive impulse to connect shapes a new way of being.
  • Think of it as “I share, therefore I am.” We use technology to define ourselves by sharing our thoughts and feelings as we’re having them. We used to think, “I have a feeling; I want to make a call.” Now our impulse is, “I want to have a feeling; I need to send a text.”
  • Lacking the capacity for solitude, we turn to other people but don’t experience them as they are. It is as though we use them, need them as spare parts to support our increasingly fragile selves.
  • If we are unable to be alone, we are far more likely to be lonely. If we don’t teach our children to be alone, they will know only how to be lonely.
  • I am a partisan for conversation. To make room for it, I see some first, deliberate steps. At home, we can create sacred spaces: the kitchen, the dining room. We can make our cars “device-free zones.”

The Practical and the Theoretical - NYTimes.com - 1 views

  • Our society is divided into castes based upon a supposed division between theoretical knowledge and practical skill. The college professor holds forth on television, as the plumber fumes about detached ivory tower intellectuals.
  • There is a natural temptation to view these activities as requiring distinct capacities.
  • If these are distinct cognitive capacities, then knowing how to do something is not knowledge of a fact — that is, there is a distinction between practical and theoretical knowledge.
  • ...6 more annotations...
  • According to the model suggested by this supposed dichotomy, exercises of theoretical knowledge involve active reflection, engagement with the propositions or rules of the theory in question that guides the subsequent exercise of the knowledge. Think of the chess player following an instruction she has learned for an opening move in chess. In contrast, practical knowledge is exercised automatically and without reflection.
  • Additionally, the fact that exercises of theoretical knowledge are guided by propositions or rules seems to entail that they involve instructions that are universally applicable
  • when one reflects upon any exercise of knowledge, whether practical or theoretical, it appears to have the characteristics that would naïvely be ascribed to the exercise of both practical and intellectual capacities
  • Perhaps one way to distinguish practical knowledge and theoretical knowledge is by talking. When we acquire knowledge of how to do something, we may not be able to express our knowledge in words. But when we acquire knowledge of a truth, we are able to express this knowledge in words.
  • once one bears down on the supposed distinction between practical knowledge and knowledge of truths, it breaks down. The plumber’s or electrician’s activities are a manifestation of the same kind of intelligence as the scientist’s or historian’s latest articles — knowledge of truths.
  • these are distinctions along a continuum, rather than distinctions in kind, as the folk distinction between practical and theoretical pursuits is intended to be.

A Crush on God | Commonweal magazine - 0 views

  • Ignatius taught the Jesuits to end each day doing something called the Examen. You start by acknowledging that God is there with you; then you give thanks for the good parts of your day (mine usually include food); and finally, you run through the events of the day from morning to the moment you sat down to pray, stopping to consider when you felt consolation, the closeness of God, or desolation, when you ignored God or when you felt like God bailed on you. Then you ask for forgiveness for anything shitty you did, and for guidance tomorrow. I realize I’ve spent most of my life saying “thanks” to people in a perfunctory, whatever kind of way. Now when I say it I really mean it, even if it’s to the guy who makes those lattes I love getting in the morning, because I stopped and appreciated his latte-making skills the night before. If you are lucky and prone to belief, the Examen will also help you start really feeling God in your life.
  • My church hosts a monthly dinner for the homeless. Serious work is involved; volunteers pull multiple shifts shopping, prepping, cooking, serving food, and cleaning. I show up for the first time and am shuttled into the kitchen by a harried young woman with a pen stuck into her ponytail, who asks me if I can lift heavy weights before putting me in front of two bins of potato salad and handing me an ice cream scoop. For three hours, I scoop potato salad onto plates, heft vats of potato salad, and scrape leftover potato salad into the compost cans. I never want to eat potato salad again, but I learn something about the homeless people I’ve been avoiding for years: some are mentally a mess, many—judging from the smell—are drunk off their asses, but on the whole, they are polite, intelligent, and, more than anything else, grateful. As I walk back to my car, I’m stopped several times by many of them who want to thank me, saying how good the food was, how much they enjoyed it. “I didn’t do anything,” I say in return. “You were there,” one of them replies. It’s enough to make me go back the next month, and the month after that. And in between, when I see people I feed on the street, instead of focusing my eyes in the sidewalk and hoping they go away, we have conversations. It’s those conversations that move me from intellectual distance toward a greater sense of gratitude for the work of God.

Why It's OK to Let Apps Make You a Better Person - Evan Selinger - Technology - The Atl... - 0 views

  • one theme emerges from the media coverage of people's relationships with our current set of technologies: Consumers want digital willpower. App designers in touch with the latest trends in behavioral modification--nudging, the quantified self, and gamification--and good old-fashioned financial incentive manipulation, are tackling weakness of will. They're harnessing the power of payouts, cognitive biases, social networking, and biofeedback. The quantified self becomes the programmable self.
  • the trend still has multiple interesting dimensions
  • Individuals are turning ever more aspects of their lives into managerial problems that require technological solutions. We have access to an ever-increasing array of free and inexpensive technologies that harness incredible computational power that effectively allows us to self-police behavior everywhere we go. As pervasiveness expands, so does trust.
  • ...20 more annotations...
  • Some embrace networked, data-driven lives and are comfortable volunteering embarrassing, real time information about what we're doing, whom we're doing it with, and how we feel about our monitored activities.
  • Put it all together and we can see that our conception of what it means to be human has become "design space." We're now Humanity 2.0, primed for optimization through commercial upgrades. And today's apps are more harbinger than endpoint.
  • philosophers have had much to say about the enticing and seemingly inevitable dispersion of technological mental prosthetics that promise to substitute or enhance some of our motivational powers.
  • beyond the practical issues lie a constellation of central ethical concerns.
  • they should cause us to pause as we think about a possible future that significantly increases the scale and effectiveness of willpower-enhancing apps. Let's call this hypothetical future Digital Willpower World and characterize the ethical traps we're about to discuss as potential general pitfalls
  • it is antithetical to the ideal of " resolute choice." Some may find the norm overly perfectionist, Spartan, or puritanical. However, it is not uncommon for folks to defend the idea that mature adults should strive to develop internal willpower strong enough to avoid external temptations, whatever they are, and wherever they are encountered.
  • In part, resolute choosing is prized out of concern for consistency, as some worry that lapse of willpower in any context indicates a generally weak character.
  • Fragmented selves behave one way while under the influence of digital willpower, but another when making decisions without such assistance. In these instances, inconsistent preferences are exhibited and we risk underestimating the extent of our technological dependency.
  • It simply means that when it comes to digital willpower, we should be on our guard to avoid confusing situational with integrated behaviors.
  • the problem of inauthenticity, a staple of the neuroethics debates, might arise. People might start asking themselves: Has the problem of fragmentation gone away only because devices are choreographing our behavior so powerfully that we are no longer in touch with our so-called real selves -- the selves who used to exist before Digital Willpower World was formed?
  • Infantilized subjects are morally lazy, quick to have others take responsibility for their welfare. They do not view the capacity to assume personal responsibility for selecting means and ends as a fundamental life goal that validates the effort required to remain committed to the ongoing project of maintaining willpower and self-control.
  • Michael Sandel's Atlantic essay, "The Case Against Perfection." He notes that technological enhancement can diminish people's sense of achievement when their accomplishments become attributable to human-technology systems and not an individual's use of human agency.
  • Borgmann worries that this environment, which habituates us to be on auto-pilot and delegate deliberation, threatens to harm the powers of reason, the most central component of willpower (according to the rationalist tradition).
  • In several books, including Technology and the Character of Contemporary Life, he expresses concern about technologies that seem to enhance willpower but only do so through distraction. Borgmann's paradigmatic example of the non-distracted, focally centered person is a serious runner. This person finds the practice of running maximally fulfilling, replete with the rewarding "flow" that can only come when mind/body and means/ends are unified, while skill gets pushed to the limit.
  • Perhaps the very conception of a resolute self was flawed. What if, as psychologist Roy Baumeister suggests, willpower is more "staple of folk psychology" than real way of thinking about our brain processes?
  • novel approaches suggest the will is a flexible mesh of different capacities and cognitive mechanisms that can expand and contract, depending on the agent's particular setting and needs. Contrary to the traditional view that identifies the unified and cognitively transparent self as the source of willed actions, the new picture embraces a rather diffused, extended, and opaque self who is often guided by irrational trains of thought. What actually keeps the self and its will together are the given boundaries offered by biology, a coherent self narrative created by shared memories and experiences, and society. If this view of the will as an expanding and contracting system with porous and dynamic boundaries is correct, then it might seem that the new motivating technologies and devices can only increase our reach and further empower our willing selves.
  • "It's a mistake to think of the will as some interior faculty that belongs to an individual--the thing that pushes the motor control processes that cause my action," Gallagher says. "Rather, the will is both embodied and embedded: social and physical environment enhance or impoverish our ability to decide and carry out our intentions; often our intentions themselves are shaped by social and physical aspects of the environment."
  • It makes perfect sense to think of the will as something that can be supported or assisted by technology. Technologies, like environments and institutions can facilitate action or block it. Imagine I have the inclination to go to a concert. If I can get my ticket by pressing some buttons on my iPhone, I find myself going to the concert. If I have to fill out an application form and carry it to a location several miles away and wait in line to pick up my ticket, then forget it.
  • Perhaps the best way forward is to put a digital spin on the Socratic dictum of knowing myself and submit to the new freedom: the freedom of consuming digital willpower to guide me past the sirens.

The Benefits of Bilingualism - NYTimes.com - 2 views

  • Being bilingual, it turns out, makes you smarter. It can have a profound effect on your brain, improving cognitive skills not related to language and even shielding against dementia in old age.
  • in a bilingual’s brain both language systems are active even when he is using only one language, thus creating situations in which one system obstructs the other. But this interference, researchers are finding out, isn’t so much a handicap as a blessing in disguise. It forces the brain to resolve internal conflict, giving the mind a workout that strengthens its cognitive muscles.
  • the bilingual experience improves the brain’s so-called executive function — a command system that directs the attention processes that we use for planning, solving problems and performing various other mentally demanding tasks. These processes include ignoring distractions to stay focused, switching attention willfully from one thing to another and holding information in mind — like remembering a sequence of directions while driving.
  • ...2 more annotations...
  • The key difference between bilinguals and monolinguals may be more basic: a heightened ability to monitor the environment. “Bilinguals have to switch languages quite often — you may talk to your father in one language and to your mother in another language,” says Albert Costa, a researcher at the University of Pompeu Fabra in Spain. “It requires keeping track of changes around you in the same way that we monitor our surroundings when driving.”
  • individuals with a higher degree of bilingualism — measured through a comparative evaluation of proficiency in each language — were more resistant than others to the onset of dementia and other symptoms of Alzheimer’s disease: the higher the degree of bilingualism, the later the age of onset.

Atul Gawande: Failure and Rescue : The New Yorker - 0 views

  • the critical skills of the best surgeons I saw involved the ability to handle complexity and uncertainty. They had developed judgment, mastery of teamwork, and willingness to accept responsibility for the consequences of their choices. In this respect, I realized, surgery turns out to be no different than a life in teaching, public service, business, or almost anything you may decide to pursue. We all face complexity and uncertainty no matter where our path takes us. That means we all face the risk of failure. So along the way, we all are forced to develop these critical capacities—of judgment, teamwork, and acceptance of responsibility.
  • people admonish us: take risks; be willing to fail. But this has always puzzled me. Do you want a surgeon whose motto is “I like taking risks”? We do in fact want people to take risks, to strive for difficult goals even when the possibility of failure looms. Progress cannot happen otherwise. But how they do it is what seems to matter. The key to reducing death after surgery was the introduction of ways to reduce the risk of things going wrong—through specialization, better planning, and technology.
  • there continue to be huge differences between hospitals in the outcomes of their care. Some places still have far higher death rates than others. And an interesting line of research has opened up asking why.
  • ...8 more annotations...
  • I thought that the best places simply did a better job at controlling and minimizing risks—that they did a better job of preventing things from going wrong. But, to my surprise, they didn’t. Their complication rates after surgery were almost the same as others. Instead, what they proved to be really great at was rescuing people when they had a complication, preventing failures from becoming a catastrophe.
  • this is what distinguished the great from the mediocre. They didn’t fail less. They rescued more.
  • This may in fact be the real story of human and societal improvement. We talk a lot about “risk management”—a nice hygienic phrase. But in the end, risk is necessary. Things can and will go wrong. Yet some have a better capacity to prepare for the possibility, to limit the damage, and to sometimes even retrieve success from failure.
  • When things go wrong, there seem to be three main pitfalls to avoid, three ways to fail to rescue. You could choose a wrong plan, an inadequate plan, or no plan at all. Say you’re cooking and you inadvertently set a grease pan on fire. Throwing gasoline on the fire would be a completely wrong plan. Trying to blow the fire out would be inadequate. And ignoring it—“Fire? What fire?”—would be no plan at all.
  • All policies court failure—our war in Iraq, for instance, or the effort to stimulate our struggling economy. But when you refuse to even acknowledge that things aren’t going as expected, failure can become a humanitarian disaster. The sooner you’re able to see clearly that your best hopes and intentions have gone awry, the better. You have more room to pivot and adjust. You have more of a chance to rescue.
  • But recognizing that your expectations are proving wrong—accepting that you need a new plan—is commonly the hardest thing to do. We have this problem called confidence. To take a risk, you must have confidence in yourself
  • Yet you cannot blind yourself to failure, either. Indeed, you must prepare for it. For, strangely enough, only then is success possible.
  • So you will take risks, and you will have failures. But it’s what happens afterward that is defining. A failure often does not have to be a failure at all. However, you have to be ready for it—will you admit when things go wrong? Will you take steps to set them right?—because the difference between triumph and defeat, you’ll find, isn’t about willingness to take risks. It’s about mastery of rescue.

Apple, America and a Squeezed Middle Class - NYTimes.com - 0 views

  • When Barack Obama joined Silicon Valley’s top luminaries for dinner in California last February, each guest was asked to come with a question for the president.
  • what would it take to make iPhones in the United States?
  • Not long ago, Apple boasted that its products were made in America. Today, few are. Almost all of the 70 million iPhones, 30 million iPads and 59 million other products Apple sold last year were manufactured overseas. Why can’t that work come home? Mr. Obama asked. Mr. Jobs’s reply was unambiguous. “Those jobs aren’t coming back,” he said, according to another dinner guest. The president’s question touched upon a central conviction at Apple. It isn’t just that workers are cheaper abroad. Rather, Apple’s executives believe the vast scale of overseas factories as well as the flexibility, diligence and industrial skills of foreign workers have so outpaced their American counterparts that “Made in the U.S.A.” is no longer a viable option for most Apple products. Apple has become one of the best-known, most admired and most imitated companies on earth, in part through an unrelenting mastery of global operations. Last year, it earned over $400,000 in profit per employee, more than Goldman Sachs, Exxon Mobil or Google. However, what has vexed Mr. Obama as well as economists and policy makers is that Apple — and many of its high-technology peers — are not nearly as avid in creating American jobs as other famous companies were in their heydays.

New Statesman - All machine and no ghost? - 0 views

  • More subtly, there are many who insist that consciousness just reduces to brain states - a pang of regret, say, is just a surge of chemicals across a synapse. They are collapsers rather than deniers. Though not avowedly eliminative, this kind of view is tacitly a rejection of the very existence of consciousness
  • it occurred to me that the problem might lie not in nature but in ourselves: we just don't have the faculties of comprehension that would enable us to remove the sense of mystery. Ontologically, matter and consciousness are woven intelligibly together but epistemologically we are precluded from seeing how. I used Noam Chomsky's notion of "mysteries of nature" to describe the situation as I saw it. Soon, I was being labelled (by Owen Flanagan) a "mysterian"
  • Dualism makes the mind too separate, thereby precluding intelligible interaction and dependence.
  • ...11 more annotations...
  • At this point the idealist swooshes in: ladies and gentlemen, there is nothing but mind! There is no problem of interaction with matter because matter is mere illusion
  • idealism has its charms but taking it seriously requires an antipathy to matter bordering on the maniacal. Are we to suppose that material reality is just a dream, a baseless fantasy, and that the Big Bang was nothing but the cosmic spirit having a mental sneezing fit?
  • panpsychism: even the lowliest of material things has a streak of sentience running through it, like veins in marble. Not just parcels of organic matter, such as lizards and worms, but also plants and bacteria and water molecules and even electrons. Everything has its primitive feelings and minute allotment of sensation.
  • The trouble with panpsychism is that there just isn't any evidence of the universal distribution of consciousness in the material world.
  • The dualist, by contrast, freely admits that consciousness exists, as well as matter, holding that reality falls into two giant spheres. There is the physical brain, on the one hand, and the conscious mind, on the other: the twain may meet at some point but they remain distinct entities.
  • The more we know of the brain, the less it looks like a device for creating consciousness: it's just a big collection of biological cells and a blur of electrical activity - all machine and no ghost.
  • mystery is quite pervasive, even in the hardest of sciences. Physics is a hotbed of mystery: space, time, matter and motion - none of it is free of mysterious elements. The puzzles of quantum theory are just a symptom of this widespread lack of understanding
  • The human intellect grasps the natural world obliquely and glancingly, using mathematics to construct abstract representations of concrete phenomena, but what the ultimate nature of things really is remains obscure and hidden. How everything fits together is particularly elusive, perhaps reflecting the disparate cognitive faculties we bring to bear on the world (the senses, introspection, mathematical description). We are far from obtaining a unified theory of all being and there is no guarantee that such a theory is accessible by finite human intelligence.
  • real naturalism begins with a proper perspective on our specifically human intelligence. Palaeoanthropologists have taught us that the human brain gradually evolved from ancestral brains, particularly in concert with practical toolmaking, centring on the anatomy of the human hand. This history shaped and constrained the form of intelligence now housed in our skulls (as the lifestyle of other species form their set of cognitive skills). What chance is there that an intelligence geared to making stone tools and grounded in the contingent peculiarities of the human hand can aspire to uncover all the mysteries of the universe? Can omniscience spring from an opposable thumb? It seems unlikely, so why presume that the mysteries of consciousness will be revealed to a thumb-shaped brain like ours?
  • The "mysterianism" I advocate is really nothing more than the acknowledgment that human intelligence is a local, contingent, temporal, practical and expendable feature of life on earth - an incremental adaptation based on earlier forms of intelligence that no one would regard as faintly omniscient. The current state of the philosophy of mind, from my point of view, is just a reflection of one evolutionary time-slice of a particular bipedal species on a particular humid planet at this fleeting moment in cosmic history - as is everything else about the human animal. There is more ignorance in it than knowledge.

Elon studies future of "Generation Always-On" - 1 views

  • Elon studies the future of "Generation Always-On"
  • By the year 2020, it is expected that youth of the “always-on generation,” brought up from childhood with a continuous connection to each other and to information, will be nimble, quick-acting multitaskers who count on the Internet as their external brain and who approach problems in a different way from their elders. "There is no doubt that brains are being rewired,"
  • the Internet Center, refers to the teens-to-20s age group born since the turn of the century as Generation AO, for “always-on." “They have grown up in a world that has come to offer them instant access to nearly the entirety of human knowledge, and incredible opportunities to connect, create and collaborate,"
  • ...10 more annotations...
  • some said they are already witnessing deficiencies in young peoples’ abilities to focus their attention, be patient and think deeply. Some experts expressed concerns that trends are leading to a future in which most people become shallow consumers of information, endangering society."
  • Many of the respondents in this survey predict that Gen AO will exhibit a thirst for instant gratification and quick fixes and a lack of patience and deep-thinking ability due to what one referred to as “fast-twitch wiring.”
  • “The replacement of memorization by analysis will be the biggest boon to society since the coming of mass literacy in the late 19th to early 20th century.” — Paul Jones, University of North Carolina-Chapel Hill
  • “Teens find distraction while working, distraction while driving, distraction while talking to the neighbours. Parents and teachers will have to invest major time and efforts into solving this issue – silence zones, time-out zones, meditation classes without mobile, lessons in ignoring people.”
  • “Society is becoming conditioned into dependence on technology in ways that, if that technology suddenly disappears or breaks down, will render people functionally useless. What does that mean for individual and social resiliency?
  • “Short attention spans resulting from quick interactions will be detrimental to focusing on the harder problems and we will probably see a stagnation in many areas: technology, even social venues such as literature. The people who will strive and lead the charge will be the ones able to disconnect themselves to focus.”
  • “The underlying issue is that they will become dependent on the Internet in order to solve problems and conduct their personal, professional, and civic lives. Thus centralized powers that can control access to the Internet will be able to significantly control future generations. It will be much as in Orwell's 1984, where control was achieved by using language to shape and limit thought, so future regimes may use control of access to the Internet to shape and limit thought.”
  • “Increasingly, teens and young adults rely on the first bit of information they find on a topic, assuming that they have found the ‘right’ answer, rather than using context and vetting/questioning the sources of information to gain a holistic view of a topic.”
  • “Parents and kids will spend less time developing meaningful and bonded relationships in deference to the pursuit and processing of more and more segmented information competing for space in their heads, slowly changing their connection to humanity.”
  • “It’s simply not possible to discuss, let alone form societal consensus around major problems without lengthy, messy conversations about those problems. A generation that expects to spend 140 or fewer characters on a topic and rejects nuance is incapable of tackling these problems.”

The American Scholar: The Decline of the English Department - William M. Chace - 1 views

  • The number of young men and women majoring in English has dropped dramatically; the same is true of philosophy, foreign languages, art history, and kindred fields, including history. As someone who has taught in four university English departments over the last 40 years, I am dismayed by this shift, as are my colleagues here and there across the land. And because it is probably irreversible, it is important to attempt to sort out the reasons—the many reasons—for what has happened.
  • English: from 7.6 percent of the majors to 3.9 percent
  • In one generation, then, the numbers of those majoring in the humanities dropped from a total of 30 percent to a total of less than 16 percent; during that same generation, business majors climbed from 14 percent to 22 percent.
  • ...23 more annotations...
  • History: from 18.5 percent to 10.7 percent
  • But the deeper explanation resides not in something that has happened to it, but in what it has done to itself. English has become less and less coherent as a discipline and, worse, has come near exhaustion as a scholarly pursuit.
  • The twin focus, then, was on the philological nature of the enterprise and the canon of great works to be studied in their historical evolution.
  • Studying English taught us how to write and think better, and to make articulate many of the inchoate impulses and confusions of our post-adolescent minds. We began to see, as we had not before, how such books could shape and refine our thinking. We began to understand why generations of people coming before us had kept them in libraries and bookstores and in classes such as ours. There was, we got to know, a tradition, a historical culture, that had been assembled around these books. Shakespeare had indeed made a difference—to people before us, now to us, and forever to the language of English-speaking people.
  • today there are stunning changes in the student population: there are more and more gifted and enterprising students coming from immigrant backgrounds, students with only slender connections to Western culture and to the assumption that the “great books” of England and the United States should enjoy a fixed centrality in the world. What was once the heart of the matter now seems provincial. Why throw yourself into a study of something not emblematic of the world but representative of a special national interest? As the campus reflects the cultural, racial, and religious complexities of the world around it, reading British and American literature looks more and more marginal. From a global perspective, the books look smaller.
  • With the cost of a college degree surging upward during the last quarter century—tuition itself increasing far beyond any measure of inflation—and with consequent growth in loan debt after graduation, parents have become anxious about the relative earning power of a humanities degree. Their college-age children doubtless share such anxiety. When college costs were lower, anxiety could be kept at bay. (Berkeley in the early ’60s cost me about $100 a year, about $700 in today’s dollars.)
  • Economists, chemists, biologists, psychologists, computer scientists, and almost everyone in the medical sciences win sponsored research, grants, and federal dollars. By and large, humanists don’t, and so they find themselves as direct employees of the institution, consuming money in salaries, pensions, and operating needs—not external money but institutional money.
  • These, then, are some of the external causes of the decline of English: the rise of public education; the relative youth and instability (despite its apparent mature solidity) of English as a discipline; the impact of money; and the pressures upon departments within the modern university to attract financial resources rather than simply use them up.
  • several of my colleagues around the country have called for a return to the aesthetic wellsprings of literature, the rock-solid fact, often neglected, that it can indeed amuse, delight, and educate. They urge the teaching of English, or French, or Russian literature, and the like, in terms of the intrinsic value of the works themselves, in all their range and multiplicity, as well-crafted and appealing artifacts of human wisdom. Second, we should redefine our own standards for granting tenure, placing more emphasis on the classroom and less on published research, and we should prepare to contest our decisions with administrators whose science-based model is not an appropriate means of evaluation.
  • “It may be that what has happened to the profession is not the consequence of social or philosophical changes, but simply the consequence of a tank now empty.” His homely metaphor pointed to the absence of genuinely new frontiers of knowledge and understanding for English professors to explore.
  • In this country and in England, the study of English literature began in the latter part of the 19th century as an exercise in the scientific pursuit of philological research, and those who taught it subscribed to the notion that literature was best understood as a product of language.
  • no one has come forward in years to assert that the study of English (or comparative literature or similar undertakings in other languages) is coherent, does have self-limiting boundaries, and can be described as this but not that.
  • to teach English today is to do, intellectually, what one pleases. No sense of duty remains toward works of English or American literature; amateur sociology or anthropology or philosophy or comic books or studies of trauma among soldiers or survivors of the Holocaust will do. You need not even believe that works of literature have intelligible meaning; you can announce that they bear no relationship at all to the world beyond the text.
  • With everything on the table, and with foundational principles abandoned, everyone is free, in the classroom or in prose, to exercise intellectual laissez-faire in the largest possible way—I won’t interfere with what you do and am happy to see that you will return the favor
  • Consider the English department at Harvard University. It has now agreed to remove its survey of English literature for undergraduates, replacing it and much else with four new “affinity groups”
  • there would be no one book, or family of books, that every English major at Harvard would have read by the time he or she graduates. The direction to which Harvard would lead its students in this “clean slate” or “trickle down” experiment is to suspend literary history, thrusting into the hands of undergraduates the job of cobbling together intellectual coherence for themselves
  • Those who once strove to give order to the curriculum will have learned, from Harvard, that terms like core knowledge and foundational experience only trigger acrimony, turf protection, and faculty mutinies. No one has the stomach anymore to refight the Western culture wars. Let the students find their own way to knowledge.
  • In English, the average number of years spent earning a doctoral degree is almost 11. After passing that milestone, only half of new Ph.D.’s find teaching jobs, the number of new positions having declined over the last year by more than 20 percent; many of those jobs are part-time or come with no possibility of tenure. News like that, moving through student networks, can be matched against, at least until recently, the reputed earning power of recent graduates of business schools, law schools, and medical schools. The comparison is akin to what young people growing up in Rust Belt cities are forced to see: the work isn’t here anymore; our technology is obsolete.
  • unlike other members of the university community, they might well have been plying their trade without proper credentials: “Whereas economists or physicists, geologists or climatologists, physicians or lawyers must master a body of knowledge before they can even think of being licensed to practice,” she said, “we literary scholars, it is tacitly assumed, have no definable expertise.”
  • English departments need not refight the Western culture wars. But they need to fight their own book wars. They must agree on which texts to teach and argue out the choices and the principles of making them if they are to claim the respect due a department of study.
  • They can teach their students to write well, to use rhetoric. They should place their courses in composition and rhetoric at the forefront of their activities. They should announce that the teaching of composition is a skill their instructors have mastered and that students majoring in English will be certified, upon graduation, as possessing rigorously tested competence in prose expression.
  • The study of literature will then take on the profile now held, with moderate dignity, by the study of the classics, Greek and Latin.
  • But we can, we must, do better. At stake are the books themselves and what they can mean to the young. Yes, it is just a literary tradition. That’s all. But without such traditions, civil societies have no compass to guide them.