Javier E

When bias beats logic: why the US can't have a reasoned gun debate | US news | The Guar... - 1 views

  • Jon Stokes, a writer and software developer, said he is frustrated after each mass shooting by “the sentiment among very smart people, who are used to detail and nuance and doing a lot of research, that this is cut and dried, this is black and white”.
  • Stokes has lived on both sides of America’s gun culture war, growing up in rural Louisiana, where he got his first gun at age nine, and later studying at Harvard and the University of Chicago, where he adopted some of a big-city resident’s skepticism about guns. He’s written articles about the gun geek culture behind the popularity of the AR-15, why he owns a military-style rifle, and why gun owners are so skeptical of tech-enhanced “smart guns”.
  • Even to suggest that the debate is more complicated – that learning something about guns, by taking a course on how to safely carry a concealed weapon, or learning how to fire a gun, might shift their perspective on whichever solution they have just heard about on TV – “just upsets them, and they basically say you’re trying to obscure the issue”.
  • In early 2013, a few months after the mass shooting at Sandy Hook elementary school, a Yale psychologist created an experiment to test how political bias affects our reasoning skills. Dan Kahan was attempting to understand why public debates over social problems remain deadlocked, even when good scientific evidence is available. He decided to test a question about gun control.
  • Then Kahan ran the same test again. This time, instead of evaluating skin cream trials, participants were asked to evaluate whether a law banning citizens from carrying concealed firearms in public made crime go up or down. The result: when liberals and conservatives were confronted with a set of results that contradicted their political assumptions, the smartest people were barely more likely to arrive at the correct answer than the people with no math skills at all. Political bias had erased the advantages of stronger reasoning skills.
  • The reason that measurable facts were sidelined in political debates was not that people have poor reasoning skills, Kahan concluded. Presented with a conflict between holding to their beliefs or finding the correct answer to a problem, people simply went with their tribe.
  • It was a reasonable strategy on the individual level – and a “disastrous” one for tackling social change, he concluded.
  • But the biggest distortion in the gun control debate is the dramatic empathy gap between different kinds of victims. It’s striking how puritanical the American imagination is, how narrow its range of sympathy. Mass shootings, in which the perpetrator kills complete strangers at random in a public place, prompt an outpouring of grief for the innocent lives lost. These shootings are undoubtedly horrifying, but they account for a tiny percentage of America’s overall gun deaths each year.
  • The roughly 60 gun suicides each day, the 19 black men and boys lost each day to homicide, do not inspire the same reaction, even though they represent the majority of gun violence victims. Yet there are meaningful measures which could save lives here – targeted interventions by frontline workers in neighborhoods where the gun homicide rate is 400 times higher than in other developed countries, awareness campaigns to help gun owners in rural states learn about how to identify suicide risk and intervene with friends in trouble.
  • When it comes to suicide, “there is so much shame about that conversation … and where there is shame there is also denial,”
  • When young men of color are killed, “you have disdain and aggression,” fueled by the type of white supremacist argument which equates blackness with criminality.
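The covariance puzzle at the heart of Kahan's experiment can be sketched numerically. The counts below follow the widely reported skin-cream version of the problem and should be treated as illustrative rather than an exact reproduction of the study materials; the point is that comparing raw counts and comparing rates give opposite answers:

```python
# Sketch of the 2x2 covariance problem behind Kahan's experiment.
# Counts follow the commonly cited skin-cream version of the task;
# treat them as illustrative, not an exact copy of the study stimuli.

table = {
    "used cream": {"improved": 223, "got worse": 75},
    "no cream":   {"improved": 107, "got worse": 21},
}

def improvement_rate(group: str) -> float:
    """Fraction of the group whose rash improved."""
    counts = table[group]
    return counts["improved"] / (counts["improved"] + counts["got worse"])

cream_rate = improvement_rate("used cream")   # 223/298, about 0.748
no_cream_rate = improvement_rate("no cream")  # 107/128, about 0.836

# The intuitive shortcut -- "223 improved with the cream, far more than
# 107 without" -- compares raw counts. Comparing rates flips the
# conclusion: the untreated group actually did better.
print(f"improved with cream:    {cream_rate:.1%}")
print(f"improved without cream: {no_cream_rate:.1%}")
print("cream appears to", "help" if cream_rate > no_cream_rate else "hurt")
```

Getting this right requires overriding the salient raw numbers and doing the ratio arithmetic, which is exactly the step Kahan found that politically motivated subjects skipped when the (relabeled) data contradicted their priors.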
Javier E

ROUGH TYPE | Nicholas Carr's blog - 0 views

  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Social skills and relationships seem to suffer as well.
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
Javier E

The Story Behind the SAT Overhaul - NYTimes.com - 2 views

  • “When you cover too many topics,” Coleman said, “the assessments designed to measure those standards are inevitably superficial.” He pointed to research showing that more students entering college weren’t prepared and were forced into “remediation programs from which they never escape.” In math, for example, if you examined data from top-performing countries, you found an approach that emphasized “far fewer topics, far deeper,” the opposite of the curriculums he found in the United States, which he described as “a mile wide and an inch deep.”
  • The lessons he brought with him from thinking about the Common Core were evident — that American education needed to be more focused and less superficial, and that it should be possible to test the success of the newly defined standards through an exam that reflected the material being taught in the classroom.
  • she and her team had extensive conversations with students, teachers, parents, counselors, admissions officers and college instructors, asking each group to tell them in detail what they wanted from the test. What they arrived at above all was that a test should reflect the most important skills that were imparted by the best teachers
  • for example, a good instructor would teach Martin Luther King Jr.’s “I Have a Dream” speech by encouraging a conversation that involved analyzing the text and identifying the evidence, both factual and rhetorical, that makes it persuasive. “The opposite of what we’d want is a classroom where a teacher might ask only: ‘What was the year the speech was given? Where was it given?’ ”
  • in the past, assembling the SAT focused on making sure the questions performed on technical grounds, meaning: Were they appropriately easy or difficult among a wide range of students, and were they free of bias when tested across ethnic, racial and religious subgroups? The goal was “maximizing differentiation” among kids, which meant finding items that were answered correctly by those students who were expected to get them right and incorrectly by the weaker students. A simple way of achieving this, Coleman said, was to test the kind of obscure vocabulary words for which the SAT was famous
  • In redesigning the test, the College Board shifted its emphasis. It prioritized content, measuring each question against a set of specifications that reflect the kind of reading and math that students would encounter in college and their work lives. Schmeiser and others then spent much of early last year watching students as they answered a set of 20 or so problems, discussing the questions with the students afterward. “The predictive validity is going to come out the same,” she said of the redesigned test. “But in the new test, we have much more control over the content and skills that are being measured.”
  • Evidence-based reading and writing, he said, will replace the current sections on reading and writing. It will use as its source materials pieces of writing — from science articles to historical documents to literature excerpts — which research suggests are important for educated Americans to know and understand deeply. “The Declaration of Independence, the Constitution, the Bill of Rights and the Federalist Papers,” Coleman said, “have managed to inspire an enduring great conversation about freedom, justice, human dignity in this country and the world” — therefore every SAT will contain a passage from either a founding document or from a text (like Lincoln’s Gettysburg Address) that is part of the “great global conversation” the founding documents inspired.
  • The Barbara Jordan vocabulary question would have a follow-up — “How do you know your answer is correct?” — to which students would respond by identifying lines in the passage that supported their answer.
  • The idea is that the test will emphasize words students should be encountering, like “synthesis,” which can have several meanings depending on their context. Instead of encouraging students to memorize flashcards, the test should promote the idea that they must read widely throughout their high-school years.
  • “No longer will it be good enough to focus on tricks and trying to eliminate answer choices. We are not interested in students just picking an answer, but justifying their answers.”
  • the essay portion of the test will also be reformulated so that it will always be the same, some version of: “As you read the passage in front of you, consider how the author uses evidence such as facts or examples; reasoning to develop ideas and to connect claims and evidence; and stylistic or persuasive elements to add power to the ideas expressed. Write an essay in which you explain how the author builds an argument to persuade an audience.”
  • The math section, too, will be predicated on research that shows that there are “a few areas of math that are a prerequisite for a wide range of college courses” and careers. Coleman conceded that some might treat the news that they were shifting away from more obscure math problems to these fewer fundamental skills as a dumbing down of the test, but he was adamant that this was not the case. He explained that there will be three areas of focus: problem solving and data analysis, which will include ratios and percentages and other mathematical reasoning used to solve problems in the real world; the “heart of algebra,” which will test how well students can work with linear equations (“a powerful set of tools that echo throughout many fields of study”); and what will be called the “passport to advanced math,” which will focus on the student’s familiarity with complex equations and their applications in science and social science.
  • “Sometimes in the past, there’s been a feeling that tests were measuring some sort of ineffable entity such as intelligence, whatever that might mean. Or ability, whatever that might mean. What this is is a clear message that good hard work is going to pay off and achievement is going to pay off. This is one of the most significant developments that I have seen in the 40-plus years that I’ve been working in admissions in higher education.”
  • The idea of creating a transparent test and then providing a free website that any student could use — not to learn gimmicks but to get a better grounding and additional practice in the core knowledge that would be tested — was appealing to Coleman.
  • (The College Board won’t pay Khan Academy.) They talked about a hypothetical test-prep experience in which students would log on to a personal dashboard, indicate that they wanted to prepare for the SAT and then work through a series of preliminary questions to demonstrate their initial skill level and identify the gaps in their knowledge. Khan said he could foresee a way to estimate the amount of time it would take to achieve certain benchmarks. “It might go something like, ‘O.K., we think you’ll be able to get to this level within the next month and this level within the next two months if you put in 30 minutes a day,’ ” he said. And he saw no reason the site couldn’t predict for anyone, anywhere the score he or she might hope to achieve with a commitment to a prescribed amount of work.
Javier E

Thieves of experience: On the rise of surveillance capitalism - 1 views

  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Google’s once-patient investors grew restive, demanding that the founders figure out a way to make money, preferably lots of it.
  • Under pressure, Page and Brin authorized the launch of an auction system for selling advertisements tied to search queries. The system was designed so that the company would get paid by an advertiser only when a user clicked on an ad. This feature gave Google a huge financial incentive to make accurate predictions about how users would respond to ads and other online content. Even tiny increases in click rates would bring big gains in income. And so the company began deploying its stores of behavioral data not for the benefit of users but to aid advertisers — and to juice its own profits. Surveillance capitalism had arrived.
  • Google’s business now hinged on what Zuboff calls “the extraction imperative.” To improve its predictions, it had to mine as much information as possible from web users. It aggressively expanded its online services to widen the scope of its surveillance.
  • Through Gmail, it secured access to the contents of people’s emails and address books. Through Google Maps, it gained a bead on people’s whereabouts and movements. Through Google Calendar, it learned what people were doing at different moments during the day and whom they were doing it with. Through Google News, it got a readout of people’s interests and political leanings. Through Google Shopping, it opened a window onto people’s wish lists,
  • The company gave all these services away for free to ensure they’d be used by as many people as possible. It knew the money lay in the data.
  • the organization grew insular and secretive. Seeking to keep the true nature of its work from the public, it adopted what its CEO at the time, Eric Schmidt, called a “hiding strategy” — a kind of corporate omerta backed up by stringent nondisclosure agreements.
  • Page and Brin further shielded themselves from outside oversight by establishing a stock structure that guaranteed their power could never be challenged, neither by investors nor by directors.
  • What’s most remarkable about the birth of surveillance capitalism is the speed and audacity with which Google overturned social conventions and norms about data and privacy. Without permission, without compensation, and with little in the way of resistance, the company seized and declared ownership over everyone’s information
  • The companies that followed Google presumed that they too had an unfettered right to collect, parse, and sell personal data in pretty much any way they pleased. In the smart homes being built today, it’s understood that any and all data will be beamed up to corporate clouds.
  • Google conducted its great data heist under the cover of novelty. The web was an exciting frontier — something new in the world — and few people understood or cared about what they were revealing as they searched and surfed. In those innocent days, data was there for the taking, and Google took it
  • Google also benefited from decisions made by lawmakers, regulators, and judges — decisions that granted internet companies free use of a vast taxpayer-funded communication infrastructure, relieved them of legal and ethical responsibility for the information and messages they distributed, and gave them carte blanche to collect and exploit user data.
  • Consider the terms-of-service agreements that govern the division of rights and the delegation of ownership online. Non-negotiable, subject to emendation and extension at the company’s whim, and requiring only a casual click to bind the user, TOS agreements are parodies of contracts, yet they have been granted legal legitimacy by the courts
  • Law professors, writes Zuboff, “call these ‘contracts of adhesion’ because they impose take-it-or-leave-it conditions on users that stick to them whether they like it or not.” Fundamentally undemocratic, the ubiquitous agreements helped Google and other firms commandeer personal data as if by fiat.
  • In the choices we make as consumers and private citizens, we have always traded some of our autonomy to gain other rewards. Many people, it seems clear, experience surveillance capitalism less as a prison, where their agency is restricted in a noxious way, than as an all-inclusive resort, where their agency is restricted in a pleasing way
  • Zuboff makes a convincing case that this is a short-sighted and dangerous view — that the bargain we’ve struck with the internet giants is a Faustian one
  • but her case would have been stronger still had she more fully addressed the benefits side of the ledger.
  • there’s a piece missing. While Zuboff’s assessment of the costs that people incur under surveillance capitalism is exhaustive, she largely ignores the benefits people receive in return — convenience, customization, savings, entertainment, social connection, and so on
  • What the industries of the future will seek to manufacture is the self.
  • Behavior modification is the thread that ties today’s search engines, social networks, and smartphone trackers to tomorrow’s facial-recognition systems, emotion-detection sensors, and artificial-intelligence bots.
  • All of Facebook’s information wrangling and algorithmic fine-tuning, she writes, “is aimed at solving one problem: how and when to intervene in the state of play that is your daily life in order to modify your behavior and thus sharply increase the predictability of your actions now, soon, and later.”
  • “The goal of everything we do is to change people’s actual behavior at scale,” a top Silicon Valley data scientist told her in an interview. “We can test how actionable our cues are for them and how profitable certain behaviors are for us.”
  • This goal, she suggests, is not limited to Facebook. It is coming to guide much of the economy, as financial and social power shifts to the surveillance capitalists
  • Combining rich information on individuals’ behavioral triggers with the ability to deliver precisely tailored and timed messages turns out to be a recipe for behavior modification on an unprecedented scale.
  • it was Facebook, with its incredibly detailed data on people’s social lives, that grasped digital media’s full potential for behavior modification. By using what it called its “social graph” to map the intentions, desires, and interactions of literally billions of individuals, it saw that it could turn its network into a worldwide Skinner box, employing psychological triggers and rewards to program not only what people see but how they react.
  • spying on the populace is not the end game. The real prize lies in figuring out ways to use the data to shape how people think and act. “The best way to predict the future is to invent it,” the computer scientist Alan Kay once observed. And the best way to predict behavior is to script it.
  • competition for personal data intensified. It was no longer enough to monitor people online; making better predictions required that surveillance be extended into homes, stores, schools, workplaces, and the public squares of cities and towns. Much of the recent innovation in the tech industry has entailed the creation of products and services designed to vacuum up data from every corner of our lives
  • “The typical complaint is that privacy is eroded, but that is misleading,” Zuboff writes. “In the larger societal pattern, privacy is not eroded but redistributed . . . . Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism.” The transfer of decision rights is also a transfer of autonomy and agency, from the citizen to the corporation.
  • What we lose under this regime is something more fundamental than privacy. It’s the right to make our own decisions about privacy — to draw our own lines between those aspects of our lives we are comfortable sharing and those we are not
  • Other possible ways of organizing online markets, such as through paid subscriptions for apps and services, never even got a chance to be tested.
  • Online surveillance came to be viewed as normal and even necessary by politicians, government bureaucrats, and the general public
  • Google and other Silicon Valley companies benefited directly from the government’s new stress on digital surveillance. They earned millions through contracts to share their data collection and analysis techniques with the National Security Agency
  • As much as the dot-com crash, the horrors of 9/11 set the stage for the rise of surveillance capitalism. Zuboff notes that, in 2000, members of the Federal Trade Commission, frustrated by internet companies’ lack of progress in adopting privacy protections, began formulating legislation to secure people’s control over their online information and severely restrict the companies’ ability to collect and store it. It seemed obvious to the regulators that ownership of personal data should by default lie in the hands of private citizens, not corporations.
  • The 9/11 attacks changed the calculus. The centralized collection and analysis of online data, on a vast scale, came to be seen as essential to national security. “The privacy provisions debated just months earlier vanished from the conversation more or less overnight,”
sanderk

Is Pain a Construct of the Mind? - Scientific American - 0 views

  • Clear Lake found that Rogers could recruit an abnormally high number of muscle fibers. But was this ability because of a freak genetic mutation? Another possibility, which Rogers thinks is more likely, is the way he processes pain when he strains those muscles.
  • What if, instead of superpowered muscles, Rogers has a normal—though extremely well exercised—body, and his abilities arise because he can withstand more pain than most mere mortals?
  • Rogers reasons that, unlike in the dentist's office—where he has no control over the pain that is inflicted on him—he has direct executive control over pain that he inflicts on himself. “I know it's coming, I have an idea of what to expect and I can decide to ignore it,” he says. Confronted with severe pain, most people fear that they will damage their body permanently if they persist, so they stop well before they are in real danger, Rogers explains.
  • ...2 more annotations...
  • Maybe Rogers's muscle cells are normal, and he experiences pain as most of us do but chooses to disregard it when he feels in command.
  • An illusion is a perception that does not match the physical reality. Is pain, then, as with illusions, a mind construct that some people can decide to turn off? As you will see in the studies that follow, pain varies as a function of mood, attentiveness and circumstances, lending support to the theory that pain is an emotion.
  •  
    During practice, my coaches always say that I need to overcome pain but I never knew how. I found this article interesting because it says that pain is an emotion. I have never thought that pain could even be close to an emotion. It is interesting how the world's strongest man can just ignore the pain and keep doing these incredible feats. Controlling one's pain must either be a skill or just a natural-born gift. Like in TOK, we must practice skills such as controlling our emotions. If pain is an emotion, then theoretically, I could control it. I am going to test this out in my life and see if I can control my own pain during exercise.
Javier E

A New Kind of Tutoring Aims to Make Students Smarter - NYTimes.com - 1 views

  • the goal is to improve cognitive skills. LearningRx is one of a growing number of such commercial services — some online, others offered by psychologists. Unlike traditional tutoring services that seek to help students master a subject, brain training purports to enhance comprehension and the ability to analyze and mentally manipulate concepts, images, sounds and instructions. In a word, it seeks to make students smarter.
  • “The average gain on I.Q. is 15 points after 24 weeks of training, and 20 points in less than 32 weeks.”
  • “Our users have reported profound benefits that include: clearer and quicker thinking; faster problem-solving skills; increased alertness and awareness; better concentration at work or while driving; sharper memory for names, numbers and directions.”
  • ...8 more annotations...
  • “It used to take me an hour to memorize 20 words. Now I can learn, like, 40 new words in 20 minutes.”
  • “I don’t know if it makes you smarter. But when you get to each new level on the math and reading tasks, it definitely builds up your self-confidence.”
  • “What you care about is not an intelligence test score, but whether your ability to do an important task has really improved. That’s a chain of evidence that would be really great to have. I haven’t seen it.”
  • Still, a new and growing body of scientific evidence indicates that cognitive training can be effective, including that offered by commercial services.
  • He looked at 340 middle-school students who spent two hours a week for a semester using LearningRx exercises in their schools’ computer labs and an equal number of students who received no such training. Those who played the online games, Dr. Hill found, not only improved significantly on measures of cognitive abilities compared to their peers, but also on Virginia’s annual Standards of Learning exam.
  • I’ve had some kids who not only reported that they had very big changes in the classroom, but when we bring them back in the laboratory to do neuropsychological testing, we also see great changes. They show increases that would be highly unlikely to happen just by chance.”
  • where crosswords and Sudoku are intended to be a diversion, the games here give that same kind of reward, only they’re designed to improve your brain, your memory, your problem-solving skills.”
  • More than 40 games are offered by Lumosity. One, the N-back, is based on a task developed decades ago by psychologists. Created to test working memory, the N-back challenges users to keep track of a continuously updated list and remember which item appeared “n” times ago.
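The N-back task described above has a simple scoring rule: at each step, decide whether the current item matches the one shown n steps earlier. As a rough illustration only (a hypothetical sketch, not Lumosity's actual implementation), that logic can be expressed in a few lines of Python:

```python
def n_back_matches(sequence, n):
    """For each position i >= n, report whether sequence[i]
    matches the item shown n steps earlier (sequence[i - n])."""
    return [sequence[i] == sequence[i - n] for i in range(n, len(sequence))]

# Example: a 2-back over a short letter stream.
# A player should respond "match" whenever the current letter
# equals the letter from two steps back.
stream = ["A", "B", "A", "C", "C", "B", "C"]
print(n_back_matches(stream, 2))  # [True, False, False, False, True]
```

A working-memory trainer would present the stream one item at a time and compare the player's responses against this ground truth; raising n is what makes the task harder.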
sissij

Pregnancy Changes the Brain in Ways That May Help Mothering - The New York Times - 0 views

  • Pregnancy changes a woman’s brain, altering the size and structure of areas involved in perceiving the feelings and perspectives of others, according to a first-of-its-kind study published Monday.
  • The results were remarkable: loss of gray matter in several brain areas involved in a process called social cognition or “theory of mind,” the ability to register and consider how other people perceive things.
  • A third possibility is that the loss is “part of the brain’s program for dealing with the future,” he said. Hormone surges in pregnancy might cause “pruning or cellular adaptation that is helpful,” he said, streamlining certain brain areas to be more efficient at mothering skills “from nurturing to extra vigilance to teaching.”
  • ...4 more annotations...
  • Pregnancy, she explained, may help a woman’s brain specialize in “a mother’s ability to recognize the needs of her infant, to recognize social threats or to promote mother-infant bonding.”
  • Researchers wanted to see if the women’s brain changes affected anything related to mothering. They found that relevant brain regions in mothers showed more activity when women looked at photos of their own babies than with photos of other children.
  • During another period of roiling hormonal change — adolescence — gray matter decreases in several brain regions that are believed to provide fine-tuning for the social, emotional and cognitive territory of being a teenager.
  • evidence against the common myth of ‘mommy brain.’
  •  
    Our brain changes during our lifetime to better fit our need. The decrease in gray matter in brain during pregnancy enables mothers to learn mothering skills faster and be more focused on their own child. This aligns with the logic of evolution because newborns need a lot of attention and care from their mother. I am also very surprised to see that the similar thing also happens to teenager. The decrease in gray matter gives plasticity for teenagers to absorb new knowledge. It's so amazing that our brain is actually adjusting itself in different stages of life. --Sissi (12/20/2016)
Javier E

The Superior Social Skills of Bilinguals - The New York Times - 2 views

  • We found that bilingual children were better than monolingual children at this task. If you think about it, this makes intuitive sense. Interpreting someone’s utterance often requires attending not just to its content, but also to the surrounding context. What does a speaker know or not know? What did she intend to convey? Children in multilingual environments have social experiences that provide routine practice in considering the perspectives of others: They have to think about who speaks which language to whom, who understands which content, and the times and places in which different languages are spoken.
  • children who were effectively monolingual yet regularly exposed to another language — for example, those who had grandparents who spoke another language — were just as talented as the bilingual children at this task. It seems that being raised in an environment in which multiple languages are spoken, rather than being bilingual per se, is the driving factor.
Javier E

Practice And The Self - The Dish | By Andrew Sullivan - The Daily Beast - 0 views

  • Just about everything we do modifies connections between brain cells—learning and memory are dependent on this flexibility. When we improve a skill through practice, we strengthen connections between neurons involved in that skill
  • it reflects what one might call a conservative truth: we become what we do. We are not Etch-A-Sketches, blank slates on whom a new abstract idea can simply and easily be applied to turn our lives around. We are constantly evolving organisms, each choice leading to another fate and another choice and all of these creating us, slowly, by will and habit.
  • I'm also reminded of Pascal's rather controversial dictum that merely practicing faith will instill it. Acts become thoughts which become acts, and habits become personality which becomes character. There can be no total rupture - which is why I am not a fan of "born-again" Christianity. It only takes if it reorients practice.
  • ...2 more annotations...
  • what exercise and diet and sleep do for the body, thought and practice and sleep do for the brain. And when this kind of practice of something becomes effortless, when it becomes second nature, instinctual, it becomes part of you and you of it. You simply cannot describe the great skill of a craftsman, or a cook, or a priest, or an artist except by observing how he or she has become what she creates and does.
  • where the idea and the practice and the person simply become one, human activity takes flight. It becomes integral.
Javier E

Welcome, Robot Overlords. Please Don't Fire Us? | Mother Jones - 0 views

  • There will be no place to go but the unemployment line.
  • at this point our tale takes a darker turn. What do we do over the next few decades as robots become steadily more capable and steadily begin taking away all our jobs?
  • ...34 more annotations...
  • The economics community just hasn't spent much time over the past couple of decades focusing on the effect that machine intelligence is likely to have on the labor market
  • The Digital Revolution is different because computers can perform cognitive tasks too, and that means machines will eventually be able to run themselves. When that happens, they won't just put individuals out of work temporarily. Entire classes of workers will be out of work permanently. In other words, the Luddites weren't wrong. They were just 200 years too early
  • Slowly but steadily, labor's share of total national income has gone down, while the share going to capital owners has gone up. The most obvious effect of this is the skyrocketing wealth of the top 1 percent, due mostly to huge increases in capital gains and investment income.
  • Robotic pets are growing so popular that Sherry Turkle, an MIT professor who studies the way we interact with technology, is uneasy about it: "The idea of some kind of artificial companionship," she says, "is already becoming the new normal."
  • robots will take over more and more jobs. And guess who will own all these robots? People with money, of course. As this happens, capital will become ever more powerful and labor will become ever more worthless. Those without money—most of us—will live on whatever crumbs the owners of capital allow us.
  • Economist Paul Krugman recently remarked that our long-standing belief in skills and education as the keys to financial success may well be outdated. In a blog post titled "Rise of the Robots," he reviewed some recent economic data and predicted that we're entering an era where the prime cause of income inequality will be something else entirely: capital vs. labor.
  • while it's easy to believe that some jobs can never be done by machines—do the elderly really want to be tended by robots?—that may not be true.
  • Third, as more people compete for fewer jobs, we'd expect to see middle-class incomes flatten in a race to the bottom.
  • The question we want to answer is simple: If CBTC is already happening—not a lot, but just a little bit—what trends would we expect to see? What are the signs of a computer-driven economy?
  • if automation were displacing labor, we'd expect to see a steady decline in the share of the population that's employed.
  • Second, we'd expect to see fewer job openings than in the past.
  • In the economics literature, the increase in the share of income going to capital owners is known as capital-biased technological change
  • Fourth, with consumption stagnant, we'd expect to see corporations stockpile more cash and, fearing weaker sales, invest less in new products and new factories
  • Fifth, as a result of all this, we'd expect to see labor's share of national income decline and capital's share rise.
  • We're already seeing them, and not just because of the crash of 2008. They started showing up in the statistics more than a decade ago. For a while, though, they were masked by the dot-com and housing bubbles, so when the financial crisis hit, years' worth of decline was compressed into 24 months. The trend lines dropped off the cliff.
  • Corporate executives should worry too. For a while, everything will seem great for them: Falling labor costs will produce heftier profits and bigger bonuses. But then it will all come crashing down. After all, robots might be able to produce goods and services, but they can't consume them
  • in another sense, we should be very alarmed. It's one thing to suggest that robots are going to cause mass unemployment starting in 2030 or so. We'd have some time to come to grips with that. But the evidence suggests that—slowly, haltingly—it's happening already, and we're simply not prepared for it.
  • the first jobs to go will be middle-skill jobs. Despite impressive advances, robots still don't have the dexterity to perform many common kinds of manual labor that are simple for humans—digging ditches, changing bedpans. Nor are they any good at jobs that require a lot of cognitive skill—teaching classes, writing magazine articles
  • in the middle you have jobs that are both fairly routine and require no manual dexterity. So that may be where the hollowing out starts: with desk jobs in places like accounting or customer support.
  • In fact, there's even a digital sports writer. It's true that a human being wrote this story—ask my mother if you're not sure—but in a decade or two I might be out of a job too
  • Doctors should probably be worried as well. Remember Watson, the Jeopardy!-playing computer? It's now being fed millions of pages of medical information so that it can help physicians do a better job of diagnosing diseases. In another decade, there's a good chance that Watson will be able to do this without any human help at all.
  • Take driverless cars.
  • The next step might be passenger vehicles on fixed routes, like airport shuttles. Then long-haul trucks. Then buses and taxis. There are 2.5 million workers who drive trucks, buses, and taxis for a living, and there's a good chance that, one by one, all of them will be displaced
  • There will be no place to go but the unemployment line
  • we'll need to let go of some familiar convictions. Left-leaning observers may continue to think that stagnating incomes can be improved with better education and equality of opportunity. Conservatives will continue to insist that people without jobs are lazy bums who shouldn't be coddled. They'll both be wrong.
  • The modern economy is complex, and most of these trends have multiple causes.
  • we'll probably have only a few options open to us. The simplest, because it's relatively familiar, is to tax capital at high rates and use the money to support displaced workers. In other words, as The Economist's Ryan Avent puts it, "redistribution, and a lot of it."
  • would we be happy in a society that offers real work to a dwindling few and bread and circuses for the rest?
  • Most likely, owners of capital would strongly resist higher taxes, as they always have, while workers would be unhappy with their enforced idleness. Still, the ancient Romans managed to get used to it—with slave labor playing the role of robots—and we might have to, as well.
  •  economist Noah Smith suggests that we might have to fundamentally change the way we think about how we share economic growth. Right now, he points out, everyone is born with an endowment of labor by virtue of having a body and a brain that can be traded for income. But what to do when that endowment is worth a fraction of what it is today? Smith's suggestion: "Why not also an endowment of capital? What if, when each citizen turns 18, the government bought him or her a diversified portfolio of equity?"
  • In simple terms, if owners of capital are capturing an increasing fraction of national income, then that capital needs to be shared more widely if we want to maintain a middle-class society.
  • it's time to start thinking about our automated future in earnest. The history of mass economic displacement isn't encouraging—fascists in the '20s, Nazis in the '30s—and recent high levels of unemployment in Greece and Italy have already produced rioting in the streets and larger followings for right-wing populist parties. And that's after only a few years of misery.
  • When the robot revolution finally starts to happen, it's going to happen fast, and it's going to turn our world upside down. It's easy to joke about our future robot overlords—R2-D2 or the Terminator?—but the challenge that machine intelligence presents really isn't science fiction anymore. Like Lake Michigan with an inch of water in it, it's happening around us right now even if it's hard to see
  • A robotic paradise of leisure and contemplation eventually awaits us, but we have a long and dimly lit tunnel to navigate before we get there.
Javier E

Creativity Becomes an Academic Discipline - NYTimes.com - 0 views

  • Once considered the product of genius or divine inspiration, creativity — the ability to spot problems and devise smart solutions — is being recast as a prized and teachable skill.
  • “The reality is that to survive in a fast-changing world you need to be creative,”
  • “That is why you are seeing more attention to creativity at universities,” he says. “The marketplace is demanding it.”
  • ...16 more annotations...
  • Creativity moves beyond mere synthesis and evaluation and is, he says, “the higher order skill.” This has not been a sudden development. Nearly 20 years ago “creating” replaced “evaluation” at the top of Bloom’s Taxonomy of learning objectives. In 2010 “creativity” was the factor most crucial for success found in an I.B.M. survey of 1,500 chief executives in 33 industries. These days “creative” is the most used buzzword in LinkedIn profiles two years running.
  • The method, which is used in Buffalo State classrooms, has four steps: clarifying, ideating, developing and implementing. People tend to gravitate to particular steps, suggesting their primary thinking style.
  • What’s igniting campuses, though, is the conviction that everyone is creative, and can learn to be more so.
  • Just about every pedagogical toolbox taps similar strategies, employing divergent thinking (generating multiple ideas) and convergent thinking (finding what works). The real genius, of course, is in the how.
  • as content knowledge evolves at lightning speed, educators are talking more and more about “process skills,” strategies to reframe challenges and extrapolate and transform information, and to accept and deal with ambiguity.
  • Ideating is brainstorming and calls for getting rid of your inner naysayer to let your imagination fly.
  • Clarifying — asking the right question — is critical because people often misstate or misperceive a problem. “If you don’t have the right frame for the situation, it’s difficult to come up with a breakthrough,
  • Developing is building out a solution, and maybe finding that it doesn’t work and having to start over
  • Implementing calls for convincing others that your idea has value.
  • “the frequency and intensity of failures is an implicit principle of the course. Getting into a creative mind-set involves a lot of trial and error.”
  • His favorite assignments? Construct a résumé based on things that didn’t work out and find the meaning and influence these have had on your choices.
  • “Examine what in the culture is preventing you from creating something new or different. And what is it like to look like a fool because a lot of things won’t work out and you will look foolish? So how do you handle that?”
  • Because academics run from failure, Mr. Keywell says, universities are “way too often shapers of formulaic minds,” and encourage students to repeat and internalize fail-safe ideas.
  • “The new people who will be creative will sit at the juxtaposition of two or more fields,” she says. When ideas from different fields collide, Dr. Cramond says, fresh ones are generated.
  • Basic creativity tools used at the Torrance Center include thinking by analogy, looking for and making patterns, playing, literally, to encourage ideas, and learning to abstract problems to their essence.
  • students explore definitions of creativity, characteristics of creative people and strategies to enhance their own creativity. These include rephrasing problems as questions, learning not to instinctively shoot down a new idea (first find three positives), and categorizing problems as needing a solution that requires either action, planning or invention.
charlottedonoho

How Technology Can Help Language Learning | Suren Ramasubbu - 0 views

  • Intelligence, according to Gardner, is of eight types - verbal-linguistic, logical-mathematical, musical-rhythmic, visual-spatial, bodily-kinesthetic, interpersonal, intrapersonal, and naturalistic; existential and moral intelligence were added as afterthoughts in the definition of Intelligence. This is the first in a series of posts that explore and understand how each of the above forms of intelligence is affected by technology-mediated education.
  • Verbal-linguistic Intelligence involves sensitivity to spoken and written language, the ability to learn languages, and the capacity to use language to accomplish goals. Such intelligence is fostered by three specific activities: reading, writing and interpersonal communication - both written and oral.
  • Technology allows addition of multisensory elements that provide meaningful contexts to facilitate comprehension, thus expanding the learning ground of language and linguistics.
  • ...8 more annotations...
  • Research into the effect of technology on the development of the language and literacy skills vis-à-vis reading activities of children has offered evidence for favorable effects of digital-form books.
  • E-books are also being increasingly used to teach reading among beginners and children with reading difficulties.
  • Technology can be used to improve reading ability in many ways. It can enhance and sustain the interest levels for digital natives by allowing immediate feedback on performance and providing added practice when necessary.
  • Technology can also help in improvement of writing skills. Word processing software promotes not only composition but also editing and revising in ways that streamline the task of writing.
  • However, the web cannot be discounted as being "bad for language", considering that it also offers very useful tools such as blogging and microblogging that can help the student improve her writing skills with dynamic feedback. The possibility of incorporating other media into a written document (e.g. figures, graphics, videos etc.) can enhance the joy of writing using technology.
  • Technology enhanced oral communication is indeed useful in that it allows students from remote locations, or from all over the world to communicate orally through video and audio conferencing tools.
  • As with anything to do with technology, there are also detractors who propose negative influence of features like animation, sound, music and other multimedia effects possible in digital media, which may distract young readers from the story content.
  • Such complaints notwithstanding, the symbiotic ties between linguistics and technology cannot be ignored.
jongardner04

To Improve a Memory, Consider Chocolate - NYTimes.com - 1 views

  • Science edged closer on Sunday to showing that an antioxidant in chocolate appears to improve some memory skills that people lose with age.
Javier E

People Argue Just to Win, Scholars Assert - NYTimes.com - 0 views

  • For centuries thinkers have assumed that the uniquely human capacity for reasoning has existed to let people reach beyond mere perception and reflex in the search for truth.
  • Now some researchers are suggesting that reason evolved for a completely different purpose: to win arguments. Rationality, by this yardstick (and irrationality too, but we’ll get to that) is nothing more or less than a servant of the hard-wired compulsion to triumph in the debating arena. According to this view, bias, lack of logic and other supposed flaws that pollute the stream of reason are instead social adaptations that enable one group to persuade (and defeat) another.
  • the argumentative theory of reasoning
  • ...11 more annotations...
  • It was a purely social phenomenon. It evolved to help us convince others and to be careful when others try to convince us.” Truth and accuracy were beside the point.
  • Mr. Sperber wanted to figure out why people persisted in picking out evidence that supported their views and ignored the rest — what is known as confirmation bias — leading them to hold on to a belief doggedly in the face of overwhelming contrary evidence.
  • Other scholars have previously argued that reasoning and irrationality are both products of evolution. But they usually assume that the purpose of reasoning is to help an individual arrive at the truth, and that irrationality is a kink in that process, a sort of mental myopia.
  • distortions in reasoning are unintended side effects of blind evolution. They are a result of the way that the brain, a Rube Goldberg mental contraption, processes memory. People are more likely to remember items they are familiar with, like their own beliefs, rather than those of others.
  • What is revolutionary about argumentative theory is that it presumes that since reason has a different purpose — to win over an opposing group — flawed reasoning is an adaptation in itself, useful for bolstering debating skills.
  • attempts to rid people of biases have failed because reasoning does exactly what it is supposed to do: help win an argument.
  • To Ms. Narvaez, “reasoning is something that develops from experience; it’s a subset of what we really know.” And much of what we know cannot be put into words, she explained, pointing out that language evolved relatively late in human development.
  • Mr. Sperber and Mr. Mercier contend that as people became better at producing and picking apart arguments, their assessment skills evolved as well.
  • “At least in some cultural contexts, this results in a kind of arms race towards greater sophistication in the production and evaluation of arguments,” they write. “When people are motivated to reason, they do a better job at accepting only sound arguments, which is quite generally to their advantage.” Groups are more likely than individuals to come up with better results, they say, because they will be exposed to the best arguments
  • In a new paper, he and Hélène Landemore, an assistant professor of political science at Yale, propose that the arguing and assessment skills employed by groups make democratic debate the best form of government for evolutionary reasons, regardless of philosophical or moral rationales.
  • Mr. Mercier and Ms. Landemore, as a practical matter, endorse the theory of deliberative democracy, an approach that arose in the 1980s, which envisions cooperative town-hall-style deliberations. Championed by the philosophers John Rawls and Jürgen Habermas, this sort of collaborative forum can overcome the tendency of groups to polarize at the extremes and deadlock,
Javier E

Americans Think We Have the World's Best Colleges. We Don't. - NYTimes.com - 1 views

  • When President Obama has said, “We have the best universities,” he has not meant: “Our universities are, on average, the best” — even though that’s what many people hear. He means, “Of the best universities, most are ours.” The distinction is important.
  • We see K-12 schools and colleges differently because we’re looking at two different yardsticks: the academic performance of the whole population of students in one case, the research performance of a small number of institutions in the other.
  • The fair way to compare the two systems, to each other and to systems in other countries, would be to conduct something like a PISA for higher education. That had never been done until late 2013, when the O.E.C.D. published exactly such a study.
  • ...7 more annotations...
  • The project is called the Program for the International Assessment of Adult Competencies (known as Piaac, sometimes called “pee-ack”). In 2011 and 2012, 166,000 adults ages 16 to 65 were tested in the O.E.C.D. countries
  • Like PISA, Piaac tests people’s literacy and math skills. Because the test takers were adults, they were asked to use those skills in real-world contexts.
  • As with the measures of K-12 education, the United States battles it out for last place, this time with Italy and Spain.
  • Only 18 percent of American adults with bachelor’s degrees score at the top two levels of numeracy, compared with the international average of 24 percent. Over one-third of American bachelor’s degree holders failed to reach Level 3 on the five-level Piaac scale, which means that they cannot perform math-related tasks that “require several steps and may involve the choice of problem-solving strategies.”
  • American results on the literacy and technology tests were somewhat better, in the sense that they were only mediocre. American adults were eighth from the bottom in literacy, for instance. And recent college graduates look no better than older ones. Among people ages 16 to 29 with a bachelor’s degree or better, America ranks 16th out of 24 in numeracy.
  • There is no reason to believe that American colleges are, on average, the best in the world.
  • Instead, Piaac suggests that the wide disparities of knowledge and skill present among American schoolchildren are not ameliorated by higher education. If anything, they are magnified. In 2000, American 15-year-olds scored slightly above the international average. Twelve years later, Americans who were about 12 years older scored below the international average.
Javier E

Turning Negative Thinkers Into Positive Ones - The New York Times - 2 views

  • “The results suggest that taking time to learn the skills to self-generate positive emotions can help us become healthier, more social, more resilient versions of ourselves,”
  • as little as two weeks’ training in compassion and kindness meditation generated changes in brain circuitry linked to an increase in positive social behaviors like generosity.
  • Dr. Fredrickson’s team found that six weeks of training in a form of meditation focused on compassion and kindness resulted in an increase in positive emotions and social connectedness and improved function of one of the main nerves that helps to control heart rate. The result is a more variable heart rate that, she said in an interview, is associated with objective health benefits like better control of blood glucose, less inflammation and faster recovery from a heart attack.
  • ...13 more annotations...
  • he and Dr. Fredrickson and their colleagues have demonstrated that the brain is “plastic,” or capable of generating new cells and pathways, and it is possible to train the circuitry in the brain to promote more positive responses. That is, a person can learn to be more positive by practicing certain skills that foster positivity.
  • Negative feelings activate a region of the brain called the amygdala, which is involved in processing fear and anxiety and other emotions. Dr. Richard J. Davidson, a neuroscientist and founder of the Center for Healthy Minds at the University of Wisconsin — Madison, has shown that people in whom the amygdala recovers slowly from a threat are at greater risk for a variety of health problems than those in whom it recovers quickly.
  • Worry, sadness, anger and other such “downers” have their place in any normal life. But chronically viewing the glass as half-empty is detrimental both mentally and physically and inhibits one’s ability to bounce back from life’s inevitable stresses.
  • In other words, Dr. Davidson said, “well-being can be considered a life skill. If you practice, you can actually get better at it.”
  • Activities Dr. Fredrickson and others endorse to foster positive emotions include:
  • Do good things for other people
  • Appreciate the world around you
  • Develop and bolster relationships.
  • Establish goals that can be accomplished.
  • Learn something new.
  • Choose to accept yourself, flaws and all.
  • Practice resilience.
  • Practice mindfulness
Javier E

Look At Me by Patricia Snow | Articles | First Things - 0 views

  • Maurice stumbles upon what is still the gold standard for the treatment of infantile autism: an intensive course of behavioral therapy called applied behavioral analysis that was developed by psychologist O. Ivar Lovaas at UCLA in the 1970s
  • in a little over a year’s time she recovers her daughter to the point that she is indistinguishable from her peers.
  • Let Me Hear Your Voice is not a particularly religious or pious work. It is not the story of a miracle or a faith healing
  • ...54 more annotations...
  • Maurice discloses her Catholicism, and the reader is aware that prayer undergirds the therapy, but the book is about the therapy, not the prayer. Specifically, it is about the importance of choosing methods of treatment that are supported by scientific data. Applied behavioral analysis is all about data: its daily collection and interpretation. The method is empirical, hard-headed, and results-oriented.
  • on a deeper level, the book is profoundly religious, more religious perhaps than its author intended. In this reading of the book, autism is not only a developmental disorder afflicting particular individuals, but a metaphor for the spiritual condition of fallen man.
  • Maurice’s autistic daughter is indifferent to her mother
  • In this reading of the book, the mother is God, watching a child of his wander away from him into darkness: a heartbroken but also a determined God, determined at any cost to bring the child back
  • the mother doesn’t turn back, concedes nothing to the condition that has overtaken her daughter. There is no political correctness in Maurice’s attitude to autism; no nod to “neurodiversity.” Like the God in Donne’s sonnet, “Batter my heart, three-personed God,” she storms the walls of her daughter’s condition
  • Like God, she sets her sights high, commits both herself and her child to a demanding, sometimes painful therapy (life!), and receives back in the end a fully alive, loving, talking, and laughing child
  • the reader realizes that for God, the harrowing drama of recovery is never a singular, or even a twice-told tale, but a perennial one. Every child of his, every child of Adam and Eve, wanders away from him into darkness
  • we have an epidemic of autism, or “autism spectrum disorder,” which includes classic autism (Maurice’s children’s diagnosis); atypical autism, which exhibits some but not all of the defects of autism; and Asperger’s syndrome, which is much more common in boys than in girls and is characterized by average or above average language skills but impaired social skills.
  • At the same time, all around us, we have an epidemic of something else. On the street and in the office, at the dinner table and on a remote hiking trail, in line at the deli and pushing a stroller through the park, people go about their business bent over a small glowing screen, as if praying.
  • This latter epidemic, or experiment, has been going on long enough that people are beginning to worry about its effects.
  • for a comprehensive survey of the emerging situation on the ground, the interested reader might look at Sherry Turkle’s recent book, Reclaiming Conversation: The Power of Talk in a Digital Age.
  • she also describes in exhaustive, chilling detail the mostly horrifying effects recent technology has had on families and workplaces, educational institutions, friendships and romance.
  • many of the promises of technology have not only not been realized, they have backfired. If technology promised greater connection, it has delivered greater alienation. If it promised greater cohesion, it has led to greater fragmentation, both on a communal and individual level.
  • If thinking that the grass is always greener somewhere else used to be a marker of human foolishness and a temptation to be resisted, today it is simply a possibility to be checked out. The new phones, especially, turn out to be portable Pied Pipers, irresistibly pulling people away from the people in front of them and the tasks at hand.
  • all it takes is a single phone on a table, even if that phone is turned off, for the conversations in the room to fade in number, duration, and emotional depth.
  • an infinitely malleable screen isn’t an invitation to stability, but to restlessness
  • Current media, and the fear of missing out that they foster (a motivator now so common it has its own acronym, FOMO), drive lives of continual interruption and distraction, of virtual rather than real relationships, and of “little” rather than “big” talk
  • if you may be interrupted at any time, it makes sense, as a student explains to Turkle, to “keep things light.”
  • we are reaping deficits in emotional intelligence and empathy; loneliness, but also fears of unrehearsed conversations and intimacy; difficulties forming attachments but also difficulties tolerating solitude and boredom
  • consider the testimony of the faculty at a reputable middle school where Turkle is called in as a consultant
  • The teachers tell Turkle that their students don’t make eye contact or read body language, have trouble listening, and don’t seem interested in each other, all markers of autism spectrum disorder
  • Like much younger children, they engage in parallel play, usually on their phones. Like autistic savants, they can call up endless information on their phones, but have no larger context or overarching narrative in which to situate it
  • Students are so caught up in their phones, one teacher says, “they don’t know how to pay attention to class or to themselves or to another person or to look in each other’s eyes and see what is going on.
  • “It is as though they all have some signs of being on an Asperger’s spectrum. But that’s impossible. We are talking about a schoolwide problem.”
  • Can technology cause Asperger's?
  • “It is not necessary to settle this debate to state the obvious. If we don’t look at our children and engage them in conversation, it is not surprising if they grow up awkward and withdrawn.”
  • In the protocols developed by Ivar Lovaas for treating autism spectrum disorder, every discrete trial in the therapy, every drill, every interaction with the child, however seemingly innocuous, is prefaced by this clear command: “Look at me!”
  • If absence of relationship is a defining feature of autism, connecting with the child is both the means and the whole goal of the therapy. Applied behavioral analysis does not concern itself with when exactly, how, or why a child becomes autistic, but tries instead to correct, do over, and even perhaps actually rewire what went wrong, by going back to the beginning
  • Eye contact—which we know is essential for brain development, emotional stability, and social fluency—is the indispensable prerequisite of the therapy, the sine qua non of everything that happens.
  • There are no shortcuts to this method; no medications or apps to speed things up; no machines that can do the work for us. This is work that only human beings can do
  • it must not only be started early and be sufficiently intensive, but it must also be carried out in large part by parents themselves. Parents must be trained and involved, so that the treatment carries over into the home and continues for most of the child’s waking hours.
  • there are foundational relationships that are templates for all other relationships, and for learning itself.
  • Maurice’s book, in other words, is not fundamentally the story of a child acquiring skills, though she acquires them perforce. It is the story of the restoration of a child’s relationship with her parents
  • it is also impossible to overstate the time and commitment that were required to bring it about, especially today, when we have so little time, and such a faltering, diminished capacity for sustained engagement with small children
  • The very qualities that such engagement requires, whether our children are sick or well, are the same qualities being bred out of us by technologies that condition us to crave stimulation and distraction, and by a culture that, through a perverse alchemy, has changed what was supposed to be the freedom to work anywhere into an obligation to work everywhere.
  • In this world of total work (the phrase is Josef Pieper’s), the work of helping another person become fully human may be work that is passing beyond our reach, as our priorities, and the technologies that enable and reinforce them, steadily unfit us for the work of raising our own young.
  • in Turkle’s book, as often as not, it is young people who are distressed because their parents are unreachable. Some of the most painful testimony in Reclaiming Conversation is the testimony of teenagers who hope to do things differently when they have children, who hope someday to learn to have a real conversation, and so o
  • it was an older generation that first fell under technology’s spell. At the middle school Turkle visits, as at many other schools across the country, it is the grown-ups who decide to give every child a computer and deliver all course content electronically, meaning that they require their students to work from the very medium that distracts them, a decision the grown-ups are unwilling to reverse, even as they lament its consequences.
  • we have approached what Turkle calls the robotic moment, when we will have made ourselves into the kind of people who are ready for what robots have to offer. When people give each other less, machines seem less inhuman.
  • robot babysitters may not seem so bad. The robots, at least, will be reliable!
  • If human conversations are endangered, what of prayer, a conversation like no other? All of the qualities that human conversation requires—patience and commitment, an ability to listen and a tolerance for aridity—prayer requires in greater measure.
  • this conversation—the Church exists to restore. Everything in the traditional Church is there to facilitate and nourish this relationship. Everything breathes, “Look at me!”
  • there is a second path to God, equally enjoined by the Church, and that is the way of charity to the neighbor, but not the neighbor in the abstract.
  • “Who is my neighbor?” a lawyer asks Jesus in the Gospel of Luke. Jesus’s answer is, the one you encounter on the way.
  • Virtue is either concrete or it is nothing. Man’s path to God, like Jesus’s path on the earth, always passes through what the Jesuit Jean Pierre de Caussade called “the sacrament of the present moment,” which we could equally call “the sacrament of the present person,” the way of the Incarnation, the way of humility, or the Way of the Cross.
  • The tradition of Zen Buddhism expresses the same idea in positive terms: Be here now.
  • Both of these privileged paths to God, equally dependent on a quality of undivided attention and real presence, are vulnerable to the distracting eye-candy of our technologies
  • Turkle is at pains to show that multitasking is a myth, that anyone trying to do more than one thing at a time is doing nothing well. We could also call what she was doing multi-relating, another temptation or illusion widespread in the digital age. Turkle’s book is full of people who are online at the same time that they are with friends, who are texting other potential partners while they are on dates, and so on.
  • This is the situation in which many people find themselves today: thinking that they are special to someone because of something that transpired, only to discover that the other person is spread so thin, the interaction was meaningless. There is a new kind of promiscuity in the world, in other words, that turns out to be as hurtful as the old kind.
  • Who can actually multitask and multi-relate? Who can love everyone without diluting or cheapening the quality of love given to each individual? Who can love everyone without fomenting insecurity and jealousy? Only God can do this.
  • When an individual needs to be healed of the effects of screens and machines, it is real presence that he needs: real people in a real world, ideally a world of God’s own making
  • Nature is restorative, but it is conversation itself, unfolding in real time, that strikes these boys with the force of revelation. More even than the physical vistas surrounding them on a wilderness hike, unrehearsed conversation opens up for them new territory, open-ended adventures. “It was like a stream,” one boy says, “very ongoing. It wouldn’t break apart.”
  • in the waters of baptism, the new man is born, restored to his true parent, and a conversation begins that over the course of his whole life reminds man of who he is, that he is loved, and that someone watches over him always.
  • Even if the Church could keep screens out of her sanctuaries, people strongly attached to them would still be people poorly positioned to take advantage of what the Church has to offer. Anxious people, unable to sit alone with their thoughts. Compulsive people, accustomed to checking their phones, on average, every five and a half minutes. As these behaviors increase in the Church, what is at stake is man’s relationship with truth itself.
Javier E

How Tech Can Turn Doctors Into Clerical Workers - The New York Times - 0 views

  • what I see in my colleague is disillusionment, and it has come too early, and I am seeing too much of it.
  • In America today, the patient in the hospital bed is just the icon, a place holder for the real patient who is not in the bed but in the computer. That virtual entity gets all our attention. Old-fashioned “bedside” rounds conducted by the attending physician too often take place nowhere near the bed but have become “card flip” rounds
  • My young colleague slumping in the chair in my office survived the student years, then three years of internship and residency and is now a full-time practitioner and teacher. The despair I hear comes from being the highest-paid clerical worker in the hospital: For every one hour we spend cumulatively with patients, studies have shown, we spend nearly two hours on our primitive Electronic Health Records, or “E.H.R.s,” and another hour or two during sacred personal time.
  • ...23 more annotations...
  • The living, breathing source of the data and images we juggle, meanwhile, is in the bed and left wondering: Where is everyone? What are they doing? Hello! It’s my body, you know
  • Our $3.4 trillion health care system is responsible for more than a quarter of a million deaths per year because of medical error, the rough equivalent of, say, a jumbo jet’s crashing every day.
  • I can get cash and account details all over America and beyond. Yet I can’t reliably get a patient record from across town, let alone from a hospital in the same state, even if both places use the same brand of E.H.R.
  • the leading E.H.R.s were never built with any understanding of the rituals of care or the user experience of physicians or nurses. A clinician will make roughly 4,000 keyboard clicks during a busy 10-hour emergency-room shift
  • In the process, our daily progress notes have become bloated cut-and-paste monsters that are inaccurate and hard to wade through. A half-page, handwritten progress note of the paper era might in a few lines tell you what a physician really thought
  • so much of the E.H.R., but particularly the physical exam it encodes, is a marvel of fiction, because we humans don’t want to leave a check box empty or leave gaps in a template.
  • For a study, my colleagues and I at Stanford solicited anecdotes from physicians nationwide about patients for whom an oversight in the exam (a “miss”) had resulted in real consequences, like diagnostic delay, radiation exposure, therapeutic or surgical misadventure, even death. They were the sorts of things that would leave no trace in the E.H.R. because the recorded exam always seems complete — and yet the omission would be glaring and memorable to other physicians involved in the subsequent care. We got more than 200 such anecdotes.
  • The reason for these errors? Most of them resulted from exams that simply weren’t done as claimed. “Food poisoning” was diagnosed because the strangulated hernia in the groin was overlooked, or patients were sent to the catheterization lab for chest pain because no one saw the shingles rash on the left chest.
  • I worry that such mistakes come because we’ve gotten trapped in the bunker of machine medicine. It is a preventable kind of failure
  • How we salivated at the idea of searchable records, of being able to graph fever trends, or white blood counts, or share records at a keystroke with another institution — “interoperability”
  • The seriously ill patient has entered another kingdom, an alternate universe, a place and a process that is frightening, infantilizing; that patient’s greatest need is both scientific state-of-the-art knowledge and genuine caring from another human being. Caring is expressed in listening, in the time-honored ritual of the skilled bedside exam — reading the body — in touching and looking at where it hurts and ultimately in localizing the disease for patients not on a screen, not on an image, not on a biopsy report, but on their bodies.
  • What if the computer gave the nurse the big picture of who he was both medically and as a person?
  • a professor at M.I.T. whose current interest in biomedical engineering is “bedside informatics,” marvels at the fact that in an I.C.U., a blizzard of monitors from disparate manufacturers display EKG, heart rate, respiratory rate, oxygen saturation, blood pressure, temperature and more, and yet none of this is pulled together, summarized and synthesized anywhere for the clinical staff to use
  • What these monitors do exceedingly well is sound alarms, an average of one alarm every eight minutes, or more than 180 per patient per day. What is our most common response to an alarm? We look for the button to silence the nuisance because, unlike those in a Boeing cockpit, say, our alarms are rarely diagnosing genuine danger.
  • By some estimates, more than 50 percent of physicians in the United States have at least one symptom of burnout, defined as a syndrome of emotional exhaustion, cynicism and decreased efficacy at work
  • It is on the increase, up by 9 percent from 2011 to 2014 in one national study. This is clearly not an individual problem but a systemic one, a 4,000-key-clicks-a-day problem.
  • The E.H.R. is only part of the issue: Other factors include rapid patient turnover, decreased autonomy, merging hospital systems, an aging population, the increasing medical complexity of patients. Even if the E.H.R. is not the sole cause of what ails us, believe me, it has become the symbol of burnout
  • burnout is one of the largest predictors of physician attrition from the work force. The total cost of recruiting a physician can be nearly $90,000, but the lost revenue per physician who leaves is between $500,000 and $1 million, even more in high-paying specialties.
  • I hold out hope that artificial intelligence and machine-learning algorithms will transform our experience, particularly if natural-language processing and video technology allow us to capture what is actually said and done in the exam room.
  • as with any lab test, what A.I. will provide is at best a recommendation that a physician using clinical judgment must decide how to apply.
  • True clinical judgment is more than addressing the avalanche of blood work, imaging and lab tests; it is about using human skills to understand where the patient is in the trajectory of a life and the disease, what the nature of the patient’s family and social circumstances is and how much they want done.
  • Much of that is a result of poorly coordinated care, poor communication, patients falling through the cracks, knowledge not being transferred and so on, but some part of it is surely from failing to listen to the story and diminishing skill in reading the body as a text.
  • As he was nearing death, Avedis Donabedian, a guru of health care metrics, was asked by an interviewer about the commercialization of health care. “The secret of quality,” he replied, “is love.”
caelengrubb

Distribution of Income - Econlib - 0 views

  • The distribution of income lies at the heart of an enduring issue in political economy—the extent to which government should redistribute income from those with more income to those with less.
  • The term “income distribution” is a statistical concept. No one person is distributing income. Rather, the income distribution arises from people’s decisions about work, saving, and investment as they interact through markets and are affected by the tax system.
  • In the longer view, the path of income inequality over the twentieth century is marked by two main events: a sharp fall in inequality around the outbreak of World War II and an extended rise in inequality that began in the mid-1970s and accelerated in the 1980s. Income inequality today is about as large as it was in the 1920s.
  • ...11 more annotations...
  • Over multiple years, family income fluctuates, and so the distribution of multiyear income is moderately more equal than the distribution of single-year income.
  • In one sense, the growth of inequality in the last part of the twentieth century comes as a surprise. In the 1950s, the bottom part of the income distribution contained large concentrations of two kinds of families: farm families whose in-kind income was not counted in Census data, and elderly families, many of whom were ineligible for the new Social Security program
  • Over subsequent decades, farm families declined as a proportion of the population while increased Social Security benefits and an expanding private pension system lifted elderly incomes. Both trends favored greater income equality but were outweighed by four main factors.
  • Family structure. Over time, the two-parent, one-earner family was increasingly replaced by low-income single-parent families and higher-income two-parent, two-earner families
  • Trade and technology increasingly shifted demand away from less-educated and less-skilled workers toward workers with higher education or particular skills. The result was a growing earnings gap between more- and less-educated/skilled workers.
  • With improved communications and transportation, people increasingly functioned in national, rather than local, markets. In these broader markets, persons with unique talents could command particularly high salaries.
  • In 2002, immigrants who had entered the country since 1980 constituted nearly 11 percent of the labor force (see immigration). A relatively high proportion of these immigrants had low levels of education and increased the number of workers competing for low-paid work.
  • A second offset to estimated inequality is economic mobility. Because most family incomes increase as people’s careers develop, long-run incomes are more equal than standard single-year statistics suggest
  • Is inequality of wages and incomes bad? The question seems ludicrous. Of course inequality is bad, isn’t it? Actually, no. What matters crucially is how the inequality came about.
  • Inequality of wages and incomes is clearly bad if it results from government privileges. Many people would find such an outcome unjust, but even more important to many economists is that such inequality sets up perverse incentives.
  • But inequality in wages and incomes in relatively free economies serves two important social functions.
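The mobility point in the excerpts above — that multiyear income is distributed more equally than single-year income because yearly fluctuations average out — can be illustrated with a small simulation. This sketch is not from the Econlib article: the lognormal income draws, the size of the year-to-year noise, and the use of the Gini coefficient as the inequality measure are all illustrative assumptions.

```python
import random

def gini(values):
    """Gini coefficient of a list of incomes: 0 = perfect equality, 1 = maximal inequality."""
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    # Standard formula for sorted data: G = 2*sum(i*x_i)/(n*total) - (n+1)/n
    weighted = sum(i * x for i, x in enumerate(xs, start=1))
    return 2.0 * weighted / (n * total) - (n + 1.0) / n

random.seed(42)
n_families, n_years = 10_000, 5

# Each family has a persistent income level plus independent yearly fluctuation.
base = [random.lognormvariate(10.0, 0.5) for _ in range(n_families)]
yearly = [[b * random.lognormvariate(0.0, 0.3) for _ in range(n_years)]
          for b in base]

single_year = [y[0] for y in yearly]              # one-year snapshot
multi_year = [sum(y) / n_years for y in yearly]   # five-year average

# Averaging smooths transitory ups and downs, so measured inequality falls.
print(gini(single_year), gini(multi_year))
```

With these parameters the five-year Gini comes out lower than the single-year Gini, mirroring the article's claim that the distribution of multiyear income is "moderately more equal" than the distribution of single-year income.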
margogramiak

Memory may be preserved in condition with brain changes similar to Alzheimer's disease ... - 0 views

  • About 40% of people with the condition have underlying Alzheimer's disease.
    • margogramiak
       
      Based on that brief description, the two sound similar, so this makes sense.
  • While we knew that the memories of people with primary progressive aphasia were not affected at first, we did not know if they maintained their memory functioning over years,
    • margogramiak
       
      It's interesting to think about what a big deal losing your memory would be. We've learned so much about how unreliable our memories are, but we really do need them.
  • ...6 more annotations...
  • language skills.
    • margogramiak
       
      Language is another topic we've covered in class.
  • They were tested once and then again an average of 1.7 years later.
    • margogramiak
       
      Again, that's a long time. The research must be important for scientists to devote so much time to it.
  • Researchers tested memory skills of the people with primary progressive aphasia by showing them pictures of common objects. After waiting 10 minutes, they were shown the same pictures along with others and had to indicate whether they had seen the picture before. This test was given once and then again an average of 2.4 years later.
    • margogramiak
       
      2.4 years? This research has been a long time coming...
  • their verbal memory and language skills declined with equal severity during the study.
    • margogramiak
       
      So there are lots of similarities between the two.
  • Left sided asymmetry of brain shrinkage and a lower incidence of brain proteins known as ApoE4 and TDP-43 were identified as potential contributors to the preservation of memory in this rare type of Alzheimer's disease.
    • margogramiak
       
      the term "brain shrinkage" is very impactful.
  • Limitations of the study are the relatively small sample size and that autopsies were not available for all of the primary progressive aphasia cases.
    • margogramiak
       
      Noting limitations is part of the scientific method that we covered in class!