
Home/ TOK Friends/ Group items tagged pioneer


Javier E

How Memory Works: Interview with Psychologist Daniel L. Schacter | History News Network - 2 views

  • knowledge, from a scientific perspective, of how human memory works can be instructive to historians.
  • Memory is much more than a simple retrieval system, as Dr. Schacter has demonstrated in his research. Rather, the nature of memory is constructive and influenced by a person’s current state as well as intervening emotions, beliefs, events and other factors since a recalled event.
  • Dr. Schacter is William R. Kenan, Jr. Professor of Psychology at Harvard University. His books include Searching for Memory: The Brain, The Mind, and The Past, and The Seven Sins of Memory: How the Mind Forgets and Remembers, both winners of the American Psychological Association’s William James Book Award, and Forgotten Ideas, Neglected Pioneers: Richard Semon and the Story of Memory. He also has written hundreds of articles on memory and related matters. He was elected a Fellow of the American Academy of Arts and Sciences in 1996 and the National Academy of Sciences in 2013.
  • ...16 more annotations...
  • that memory is not a video recorder [but that] it’s a constructive activity that is in many ways accurate but prone to interesting errors and distortions. It’s the constructive side of memory that is most relevant to historians.
  • Is it the case then that our memories constantly change every time we access them?
  • That certainly can happen depending on how you recount a memory. What you emphasize. What you exaggerate. What you don’t talk about. All of those things will shape and sculpt the memory for future use. Certainly the potential is there.
  • Research on memory shows that the more distant the event in time, the more prone the memory is to inaccuracy. There are several experiments in which subjects recorded impressions of an event soon afterward, then a year later, and then a few years later, and the memory changed. Yes. It’s not that the information is lost but, as the memory weakens, you become more prone to incorporating other kinds of information or mixing up elements of other events. This has been seen, for example, in the study of flashbulb memories. Where were you when Kennedy was shot? Where were you when you heard about 9/11?
  • Isn’t there a tendency to add details or information later that may make the story more convincing or interesting? Yes. That’s more a social function of memory. It may be that you draw on your general knowledge and probable information from your memory in a social context where there may be social demands that lead you to distort the memory.
  • What are the different memory systems?
  • What is the difference between working memory and permanent memory? Working memory is really a temporary memory buffer where you hold onto information, manipulate it, and use it. It’s partly a gateway to long-term memory, and also a buffer you use when you’re retrieving information from long-term memory; that retrieved information temporarily resides in working memory, so to speak.
  • Your discussion of the testimony of White House Counsel John Dean about Watergate is illuminating. There was a perception that Dean had a photographic memory and he testified in rich detail about events. Yet later studies of White House tape recordings revealed that he was often inaccurate.
  • Because of all the detail with which he reported events, and his great confidence, he was perceived as something analogous to a human tape recorder. Yet interesting work was done by psychologist Ulric Neisser, who went back and compared what Dean said at the hearings with the available information on the White House taping system, and found many significant discrepancies between what Dean remembered and what was actually said. He usually had the gist, the meaning, and the overall significance right, but the exact details in his memory were often quite different from what was actually said.
  • That seems to get into the area of false memories and how they present problems in the legal system. We know from DNA exonerations of people wrongfully convicted of crimes that a large majority of those cases involved faulty eyewitness memory -- one of the more recent estimates is that among the first 250 DNA exoneration cases, as of 2011, roughly 70 to 75 percent of those individuals were convicted on the basis of faulty eyewitness memory.
  • One of the interesting recent lines of research that my lab has been involved in over the past few years has been looking at similarities between what goes on between the brain and mind when we remember past events on the one hand and imagine events that might occur in the future or might have occurred in the past. What we have found, particularly with brain scanning studies, is that you get very similar brain networks coming online when you remember past events and imagine future events, for example. Many of the same brain regions or network of structures come online, and this has helped us understand more why, for example, imagining events that might have occurred can be so harmful to memory accuracy because when you imagine, you’re recruiting many of the same brain regions as accessed when you actually remember. So it’s not surprising that some of these imagined events can actually turn into false memories under the right circumstances.
  • One reasonably well accepted distinction involves episodic memory, the memory for personal experience; semantic memory, the memory for general knowledge; and procedural memory, the memory for skills and unconscious forms of memory. Those are three of the major kinds of memory and they all have different neural substrates.
  • One of the points from that Ross Perot study is that his supporters often misremembered what they felt like at the time he reported he had dropped out of the race. The nature of that misremembering depended on their state at the time they were remembering, and what decisions they had made about Perot in the interim affected how they reconstructed their earlier memories. Again, that nicely makes the point that our current emotions and current appraisals of a situation can feed back into our reconstruction of the past and sometimes lead us to distort our memories so that they better support our current emotions and our current selves. We’re often using memories to justify what we currently know, believe and feel.
  • memory doesn’t work like a video camera or tape recorder. That is the main point. Our latest thinking on this is the idea that one of the major functions of memory is to support our ability to plan for the future, to imagine the future, and to use our past experiences in a flexible way to simulate different outcomes of events.
  • flexibility of memory is something that makes it useful to support this very important ability to run simulations of future events. But that very flexibility might be something that contributes to some of the memory distortion we talked about. That has been prominent in the last few years in my thinking about the constructive nature of memory.
  • The historian Daniel Aaron told his students “we remember what’s important.” What do you think of that comment? I think that generally holds true. Certainly, again, more important memories tend to be more significant with more emotional arousal and may elicit “deeper processing,” as we call it in cognitive psychology.
Javier E

ROUGH TYPE | Nicholas Carr's blog - 0 views

  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • ...39 more annotations...
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Social skills and relationships seem to suffer as well.
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
Javier E

Thieves of experience: On the rise of surveillance capitalism - 1 views

  • ...72 more annotations...
  • Google’s once-patient investors grew restive, demanding that the founders figure out a way to make money, preferably lots of it.
  • Under pressure, Page and Brin authorized the launch of an auction system for selling advertisements tied to search queries. The system was designed so that the company would get paid by an advertiser only when a user clicked on an ad. This feature gave Google a huge financial incentive to make accurate predictions about how users would respond to ads and other online content. Even tiny increases in click rates would bring big gains in income. And so the company began deploying its stores of behavioral data not for the benefit of users but to aid advertisers — and to juice its own profits. Surveillance capitalism had arrived.
  • Google’s business now hinged on what Zuboff calls “the extraction imperative.” To improve its predictions, it had to mine as much information as possible from web users. It aggressively expanded its online services to widen the scope of its surveillance.
  • Through Gmail, it secured access to the contents of people’s emails and address books. Through Google Maps, it gained a bead on people’s whereabouts and movements. Through Google Calendar, it learned what people were doing at different moments during the day and whom they were doing it with. Through Google News, it got a readout of people’s interests and political leanings. Through Google Shopping, it opened a window onto people’s wish lists,
  • The company gave all these services away for free to ensure they’d be used by as many people as possible. It knew the money lay in the data.
  • the organization grew insular and secretive. Seeking to keep the true nature of its work from the public, it adopted what its CEO at the time, Eric Schmidt, called a “hiding strategy” — a kind of corporate omerta backed up by stringent nondisclosure agreements.
  • Page and Brin further shielded themselves from outside oversight by establishing a stock structure that guaranteed their power could never be challenged, neither by investors nor by directors.
  • What’s most remarkable about the birth of surveillance capitalism is the speed and audacity with which Google overturned social conventions and norms about data and privacy. Without permission, without compensation, and with little in the way of resistance, the company seized and declared ownership over everyone’s information
  • The companies that followed Google presumed that they too had an unfettered right to collect, parse, and sell personal data in pretty much any way they pleased. In the smart homes being built today, it’s understood that any and all data will be beamed up to corporate clouds.
  • Google conducted its great data heist under the cover of novelty. The web was an exciting frontier — something new in the world — and few people understood or cared about what they were revealing as they searched and surfed. In those innocent days, data was there for the taking, and Google took it
  • Google also benefited from decisions made by lawmakers, regulators, and judges — decisions that granted internet companies free use of a vast taxpayer-funded communication infrastructure, relieved them of legal and ethical responsibility for the information and messages they distributed, and gave them carte blanche to collect and exploit user data.
  • Consider the terms-of-service agreements that govern the division of rights and the delegation of ownership online. Non-negotiable, subject to emendation and extension at the company’s whim, and requiring only a casual click to bind the user, TOS agreements are parodies of contracts, yet they have been granted legal legitimacy by the courts
  • Law professors, writes Zuboff, “call these ‘contracts of adhesion’ because they impose take-it-or-leave-it conditions on users that stick to them whether they like it or not.” Fundamentally undemocratic, the ubiquitous agreements helped Google and other firms commandeer personal data as if by fiat.
  • In the choices we make as consumers and private citizens, we have always traded some of our autonomy to gain other rewards. Many people, it seems clear, experience surveillance capitalism less as a prison, where their agency is restricted in a noxious way, than as an all-inclusive resort, where their agency is restricted in a pleasing way
  • Zuboff makes a convincing case that this is a short-sighted and dangerous view — that the bargain we’ve struck with the internet giants is a Faustian one
  • but her case would have been stronger still had she more fully addressed the benefits side of the ledger.
  • there’s a piece missing. While Zuboff’s assessment of the costs that people incur under surveillance capitalism is exhaustive, she largely ignores the benefits people receive in return — convenience, customization, savings, entertainment, social connection, and so on
  • What the industries of the future will seek to manufacture is the self.
  • Behavior modification is the thread that ties today’s search engines, social networks, and smartphone trackers to tomorrow’s facial-recognition systems, emotion-detection sensors, and artificial-intelligence bots.
  • All of Facebook’s information wrangling and algorithmic fine-tuning, she writes, “is aimed at solving one problem: how and when to intervene in the state of play that is your daily life in order to modify your behavior and thus sharply increase the predictability of your actions now, soon, and later.”
  • “The goal of everything we do is to change people’s actual behavior at scale,” a top Silicon Valley data scientist told her in an interview. “We can test how actionable our cues are for them and how profitable certain behaviors are for us.”
  • This goal, she suggests, is not limited to Facebook. It is coming to guide much of the economy, as financial and social power shifts to the surveillance capitalists
  • Combining rich information on individuals’ behavioral triggers with the ability to deliver precisely tailored and timed messages turns out to be a recipe for behavior modification on an unprecedented scale.
  • it was Facebook, with its incredibly detailed data on people’s social lives, that grasped digital media’s full potential for behavior modification. By using what it called its “social graph” to map the intentions, desires, and interactions of literally billions of individuals, it saw that it could turn its network into a worldwide Skinner box, employing psychological triggers and rewards to program not only what people see but how they react.
  • spying on the populace is not the end game. The real prize lies in figuring out ways to use the data to shape how people think and act. “The best way to predict the future is to invent it,” the computer scientist Alan Kay once observed. And the best way to predict behavior is to script it.
  • competition for personal data intensified. It was no longer enough to monitor people online; making better predictions required that surveillance be extended into homes, stores, schools, workplaces, and the public squares of cities and towns. Much of the recent innovation in the tech industry has entailed the creation of products and services designed to vacuum up data from every corner of our lives
  • “The typical complaint is that privacy is eroded, but that is misleading,” Zuboff writes. “In the larger societal pattern, privacy is not eroded but redistributed . . . . Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism.” The transfer of decision rights is also a transfer of autonomy and agency, from the citizen to the corporation.
  • What we lose under this regime is something more fundamental than privacy. It’s the right to make our own decisions about privacy — to draw our own lines between those aspects of our lives we are comfortable sharing and those we are not
  • Other possible ways of organizing online markets, such as through paid subscriptions for apps and services, never even got a chance to be tested.
  • Online surveillance came to be viewed as normal and even necessary by politicians, government bureaucrats, and the general public
  • Google and other Silicon Valley companies benefited directly from the government’s new stress on digital surveillance. They earned millions through contracts to share their data collection and analysis techniques with the National Security Agency
  • As much as the dot-com crash, the horrors of 9/11 set the stage for the rise of surveillance capitalism. Zuboff notes that, in 2000, members of the Federal Trade Commission, frustrated by internet companies’ lack of progress in adopting privacy protections, began formulating legislation to secure people’s control over their online information and severely restrict the companies’ ability to collect and store it. It seemed obvious to the regulators that ownership of personal data should by default lie in the hands of private citizens, not corporations.
  • The 9/11 attacks changed the calculus. The centralized collection and analysis of online data, on a vast scale, came to be seen as essential to national security. “The privacy provisions debated just months earlier vanished from the conversation more or less overnight,”
Javier E

Nate Silver, Artist of Uncertainty - 0 views

  • In 2008, Nate Silver correctly predicted the results of all 35 Senate races and the presidential results in 49 out of 50 states. Since then, his website, fivethirtyeight.com (now central to The New York Times’s political coverage), has become an essential source of rigorous, objective analysis of voter surveys to predict the Electoral College outcome of presidential campaigns. 
  • Political junkies, activists, strategists, and journalists will gain a deeper and more sobering sense of Silver’s methods in The Signal and the Noise: Why So Many Predictions Fail—But Some Don’t (Penguin Press). A brilliant analysis of forecasting in finance, geology, politics, sports, weather, and other domains, Silver’s book is also an original fusion of cognitive psychology and modern statistical theory.
  • Its most important message is that the first step toward improving our predictions is learning how to live with uncertainty.
  • ...7 more annotations...
  • The second step is starting to understand why it is that big data, super computers, and mathematical sophistication haven’t made us better at separating signals (information with true predictive value) from noise (misleading information). 
  • Silver’s background in sports and poker turns out to be invaluable. Successful analysts in gambling and sports are different from fans and partisans—far more aware that “sure things” are likely to be illusions,
  • he blends the best of modern statistical analysis with research on cognition biases pioneered by Princeton psychologist and Nobel laureate in economics  Daniel Kahneman and the late Stanford psychologist Amos Tversky. 
  • One of the biggest problems we have in separating signal from noise is that when we look too hard for certainty that isn’t there, we often end up attracted to noise, either because it is more prominent or because it confirms what we would like to believe.
  • In discipline after discipline, Silver shows in his book that when you look at even the best single forecast, the average of all independent forecasts is 15 to 20 percent more accurate. 
  • Silver has taken the next major step: constantly incorporating both state polls and national polls into Bayesian models that also incorporate economic data.
  • Silver explains why we will be misled if we only consider significance tests—i.e., statements that the margin of error for the results is, for example, plus or minus four points, meaning there is one chance in 20 that the percentages reported are off by more than four. Calculations like these assume the only source of error is sampling error—the irreducible error—while ignoring errors attributable to house effects, like the proportion of cell-phone users, one of the complex set of assumptions every pollster must make about who will actually vote. In other words, such an approach ignores context in order to avoid having to justify and defend judgments. 
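The sampling-error arithmetic in the excerpt above can be sketched in a few lines of Python (an illustrative calculation, not Silver's model): for a simple random sample, the 95 percent margin of error from sampling alone is z·√(p(1−p)/n), and a poll of roughly 600 respondents at p = 0.5 yields the familiar "plus or minus four points" — the irreducible error that, as Silver stresses, ignores house effects and turnout assumptions entirely.

```python
import math

def sampling_margin_of_error(p, n, z=1.96):
    """95% margin of error from sampling error alone (simple random sample)."""
    return z * math.sqrt(p * (1 - p) / n)

# A poll of ~600 respondents at p = 0.5 gives roughly the
# "plus or minus four points" margin the excerpt describes.
moe = sampling_margin_of_error(0.5, 600)
print(round(100 * moe, 1))  # ≈ 4.0 percentage points
```

Quadrupling the sample size only halves this margin, which is why averaging many independent polls (each with its own house effects) tends to beat even the best single survey.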
Javier E

Coursera Plans to Announce University Partners for Online Classes - NYTimes.com - 0 views

  • John Doerr, a Kleiner investment partner, said via e-mail that he saw a clear business model: “Yes. Even with free courses. From a community of millions of learners some should ‘opt in’ for valuable, premium services. Those revenues should fund investment in tools, technology and royalties to faculty and universities.”
  • Previously he said he had been involved with Stanford’s effort to put academic lectures online for viewing. But he noted that there was evidence that the newer interactive systems provided much more effective learning experiences.
  • Coursera and Udacity are not alone in the rush to offer mostly free online educational alternatives. Start-up companies like Minerva and Udemy, and, separately, the Massachusetts Institute of Technology, have recently announced similar platforms.
  • ...4 more annotations...
  • Unlike previous video lectures, which offered a “static” learning model, the Coursera system breaks lectures into segments as short as 10 minutes and offers quick online quizzes as part of each segment.
  • Where essays are required, especially in the humanities and social sciences, the system relies on the students themselves to grade their fellow students’ work, in effect turning them into teaching assistants.
  • The Coursera system also offers an online feature that allows students to get support from a global student community. Dr. Ng said an early test of the system found that questions were typically answered within 22 minutes.
  • Dr. Koller said the educational approach was similar to that of the “flipped classroom,” pioneered by the Khan Academy, a creation of the educator Salman Khan. Students watch lectures at home and then work on problem-solving or “homework” in the classroom, either one-on-one with the teacher or in small groups.
Javier E

untitled - 0 views

  • Scientists at Stanford University and the J. Craig Venter Institute have developed the first software simulation of an entire organism, a humble single-cell bacterium that lives in the human genital and respiratory tracts.
  • the work was a giant step toward developing computerized laboratories that could carry out many thousands of experiments much faster than is possible now, helping scientists penetrate the mysteries of diseases like cancer and Alzheimer’s.
  • cancer is not a one-gene problem; it’s a many-thousands-of-factors problem.”
  • ...7 more annotations...
  • This kind of modeling is already in use to study individual cellular processes like metabolism. But Dr. Covert said: “Where I think our work is different is that we explicitly include all of the genes and every known gene function. There’s no one else out there who has been able to include more than a handful of functions or more than, say, one-third of the genes.”
  • The simulation, which runs on a cluster of 128 computers, models the complete life span of the cell at the molecular level, charting the interactions of 28 categories of molecules — including DNA, RNA, proteins and small molecules known as metabolites, which are generated by cell processes.
  • They called the simulation an important advance in the new field of computational biology, which has recently yielded such achievements as the creation of a synthetic life form — an entire bacterial genome created by a team led by the genome pioneer J. Craig Venter. The scientists used it to take over an existing cell.
  • A decade ago, scientists developed simulations of metabolism that are now being used to study a wide array of cells, including bacteria, yeast and photosynthetic organisms. Other models exist for processes like protein synthesis.
  • “Right now, running a simulation for a single cell to divide only one time takes around 10 hours and generates half a gigabyte of data,” Dr. Covert wrote. “I find this fact completely fascinating, because I don’t know that anyone has ever asked how much data a living thing truly holds. We often think of the DNA as the storage medium, but clearly there is more to it than that.”
  • scientists chose an approach called object-oriented programming, which parallels the design of modern software systems. Software designers organize their programs in modules, which communicate with one another by passing data and instructions back and forth.
  • “The major modeling insight we had a few years ago was to break up the functionality of the cell into subgroups, which we could model individually, each with its own mathematics, and then to integrate these submodels together into a whole,”
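The modular design described in the last two excerpts can be sketched as a toy in Python (an invented illustration, not the actual Stanford/Venter code; the class names and kinetics here are hypothetical): each cellular process is a submodel with its own mathematics, and a whole-cell driver integrates them by advancing a shared state.

```python
class SubModel:
    """One cellular process, modeled with its own mathematics."""
    def step(self, state, dt):
        raise NotImplementedError

class Metabolism(SubModel):
    def step(self, state, dt):
        state["metabolites"] += 1.0 * dt  # placeholder production rate
        return state

class ProteinSynthesis(SubModel):
    def step(self, state, dt):
        made = min(state["metabolites"], 0.5 * dt)  # consume metabolites
        state["metabolites"] -= made
        state["proteins"] += made
        return state

class WholeCell:
    """Integrates independently built submodels into one simulation."""
    def __init__(self, submodels):
        self.submodels = submodels

    def run(self, state, dt, steps):
        for _ in range(steps):
            for sub in self.submodels:  # each submodel updates the shared state
                state = sub.step(state, dt)
        return state

cell = WholeCell([Metabolism(), ProteinSynthesis()])
final = cell.run({"metabolites": 0.0, "proteins": 0.0}, dt=1.0, steps=10)
print(final)  # {'metabolites': 5.0, 'proteins': 5.0}
```

The design choice mirrors the article's point: each process can be developed and validated in isolation, then composed, which is what let the team scale from a handful of gene functions to every known one.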
Javier E

The Crowd Pleaser - NYTimes.com - 0 views

  • Obama seems self-sufficient while Romney seems other-directed.
  • I’m borrowing the phrase “other-directed” from David Riesman’s 1950 classic, “The Lonely Crowd.”
  • Riesman argued that different eras nurture different personality types. The agricultural economy nurtured tradition-directed individuals. People lived according to the ancient cycles, customs and beliefs. Children grew up and performed the same roles as their parents.
  • ...2 more annotations...
  • The industrial era favored the inner-directed personality type. The inner-directed person was guided by a set of strong internal convictions, like Victorian morality. The inner-directed person was a hardy pioneer, the stolid engineer or the resilient steelworker — working on physical things. This person was often rigid, but also steadfast.
  • The other-directed personality type emerges in a service or information age economy. In this sort of economy, most workers are not working with physical things; they are manipulating people. The other-directed person becomes adept at pleasing others, at selling him or herself. The other-directed person is attuned to what other people want him to be. The other-directed person is a pliable member of a team and yearns for acceptance. He or she is less notable for having a rigid character than for having a smooth personality.
Javier E

Young Women Often Trendsetters in Vocal Patterns - NYTimes.com - 0 views

  • vocal trends associated with young women are often seen as markers of immaturity or even stupidity.
  • such thinking is outmoded. Girls and women in their teens and 20s deserve credit for pioneering vocal trends and popular slang, they say, adding that young women use these embellishments in much more sophisticated ways than people tend to realize.
  • they’re not just using them because they’re girls. They’re using them to achieve some kind of interactional and stylistic end.”
  • ...7 more annotations...
  • “The truth is this: Young women take linguistic features and use them as power tools for building relationships.”
  • women tend to be maybe half a generation ahead of males on average.”
  • Less clear is why. Some linguists suggest that women are more sensitive to social interactions and hence more likely to adopt subtle vocal cues. Others say women use language to assert their power in a culture that, at least in days gone by, asked them to be sedate and decorous. Another theory is that young women are simply given more leeway by society to speak flamboyantly.
  • Several studies have shown that uptalk can be used for any number of purposes, even to dominate a listener.
  • by far the most common uptalkers were fathers of young women. For them, it was “a way of showing themselves to be friendly and not asserting power in the situation,” she said.
  • So what does the use of vocal fry denote?
  • a natural result of women’s lowering their voices to sound more authoritative. It can also be used to communicate disinterest, something teenage girls are notoriously fond of doing.
anonymous

Young Women Often Trendsetters in Vocal Patterns - NYTimes.com - 0 views

  • Whether it be uptalk (pronouncing statements as if they were questions? Like this?), creating slang words like “bitchin’ ” and “ridic,” or the incessant use of “like” as a conversation filler, vocal trends associated with young women are often seen as markers of immaturity or even stupidity. Right? But linguists — many of whom once promoted theories consistent with that attitude — now say such thinking is outmoded. Girls and women in their teens and 20s deserve credit for pioneering vocal trends and popular slang, they say, adding that young women use these embellishments in much more sophisticated ways than people tend to realize. “A lot of these really flamboyant things you hear are cute, and girls are supposed to be cute,” said Penny Eckert, a professor of linguistics at Stanford University. “But they’re not just using them because they’re girls. They’re using them to achieve some kind of interactional and stylistic end.”
Javier E

Breathing In vs. Spacing Out - NYTimes.com - 0 views

  • Although pioneers like Jon Kabat-Zinn, now emeritus professor at the University of Massachusetts Medical Center, began teaching mindfulness meditation as a means of reducing stress as far back as the 1970s, all but a dozen or so of the nearly 100 randomized clinical trials have been published since 2005.
  • Michael Posner, of the University of Oregon, and Yi-Yuan Tang, of Texas Tech University, used functional M.R.I.’s before and after participants spent a combined 11 hours over two weeks practicing a form of mindfulness meditation developed by Tang. They found that it enhanced the integrity and efficiency of the brain’s white matter, the tissue that connects and protects neurons emanating from the anterior cingulate cortex, a region of particular importance for rational decision-making and effortful problem-solving.
  • Perhaps that is why mindfulness has proved beneficial to prospective graduate students. In May, the journal Psychological Science published the results of a randomized trial showing that undergraduates instructed to spend a mere 10 minutes a day for two weeks practicing mindfulness made significant improvement on the verbal portion of the Graduate Record Exam — a gain of 16 percentile points. They also significantly increased their working memory capacity, the ability to maintain and manipulate multiple items of attention.
  • ...7 more annotations...
  • By emphasizing a focus on the here and now, it trains the mind to stay on task and avoid distraction.
  • “Your ability to recognize what your mind is engaging with, and control that, is really a core strength,” said Peter Malinowski, a psychologist and neuroscientist at Liverpool John Moores University in England. “For some people who begin mindfulness training, it’s the first time in their life where they realize that a thought or emotion is not their only reality, that they have the ability to stay focused on something else, for instance their breathing, and let that emotion or thought just pass by.”
  • the higher adults scored on a measurement of mindfulness, the worse they performed on tests of implicit learning — the kind that underlies all sorts of acquired skills and habits but that occurs without conscious awareness.
  • he found that having participants spend a brief period of time on an undemanding task that maximizes mind wandering improved their subsequent performance on a test of creativity. In a follow-up study, he reported that physicists and writers alike came up with their most insightful ideas while spacing out.
  • The trick is knowing when mindfulness is called for and when it’s not.
  • one of the most surprising findings of recent mindfulness studies is that it could have unwanted side effects. Raising roadblocks to the mind’s peregrinations could, after all, prevent the very sort of mental vacations that lead to epiphanies.
  • “There’s so much our brain is doing when we’re not aware of it,” said the study’s leader, Chelsea Stillman, a doctoral candidate. “We know that being mindful is really good for a lot of explicit cognitive functions. But it might not be so useful when you want to form new habits.” Learning to ride a bicycle, speak grammatically or interpret the meaning of people’s facial expressions are three examples of knowledge we acquire through implicit learning
Javier E

The Age of 'Infopolitics' - NYTimes.com - 0 views

  • we need a new way of thinking about our informational milieu. What we need is a concept of infopolitics that would help us understand the increasingly dense ties between politics and information
  • Infopolitics encompasses not only traditional state surveillance and data surveillance, but also “data analytics” (the techniques that enable marketers at companies like Target to detect, for instance, if you are pregnant), digital rights movements (promoted by organizations like the Electronic Frontier Foundation), online-only crypto-currencies (like Bitcoin or Litecoin), algorithmic finance (like automated micro-trading) and digital property disputes (from peer-to-peer file sharing to property claims in the virtual world of Second Life)
  • Surveying this iceberg is crucial because atop it sits a new kind of person: the informational person. Politically and culturally, we are increasingly defined through an array of information architectures: highly designed environments of data, like our social media profiles, into which we often have to squeeze ourselves
  • ...12 more annotations...
  • We have become what the privacy theorist Daniel Solove calls “digital persons.” As such we are subject to infopolitics (or what the philosopher Grégoire Chamayou calls “datapower,” the political theorist Davide Panagia “datapolitik” and the pioneering thinker Donna Haraway “informatics of domination”).
  • Once fingerprints, biometrics, birth certificates and standardized names were operational, it became possible to implement an international passport system, a social security number and all other manner of paperwork that tells us who someone is. When all that paper ultimately went digital, the reams of data about us became radically more assessable and subject to manipulation,
  • We like to think of ourselves as somehow apart from all this information. We are real — the information is merely about us.
  • But what is it that is real? What would be left of you if someone took away all your numbers, cards, accounts, dossiers and other informational prostheses? Information is not just about you — it also constitutes who you are.
  • We understandably do not want to see ourselves as bits and bytes. But unless we begin conceptualizing ourselves in this way, we leave it to others to do it for us
  • agencies and corporations will continue producing new visions of you and me, and they will do so without our input if we remain stubbornly attached to antiquated conceptions of selfhood that keep us from admitting how informational we already are.
  • What should we do about our Internet and phone patterns’ being fastidiously harvested and stored away in remote databanks where they await inspection by future algorithms developed at the National Security Agency, Facebook, credit reporting firms like Experian and other new institutions of information and control that will come into existence in future decades?
  • What bits of the informational you will fall under scrutiny? The political you? The sexual you? What next-generation McCarthyisms await your informational self? And will those excesses of oversight be found in some Senate subcommittee against which we democratic citizens might hope to rise up in revolt — or will they lurk among algorithmic automatons that silently seal our fates in digital filing systems?
  • Despite their decidedly different political sensibilities, what links together the likes of Senator Wyden and the international hacker network known as Anonymous is that they respect the severity of what is at stake in our information.
  • information is a site for the call of justice today, alongside more quintessential battlefields like liberty of thought and equality of opportunity.
  • we lack the intellectual framework to grasp the new kinds of political injustices characteristic of today’s information society.
  • though nearly all of us have a vague sense that something is wrong with the new regimes of data surveillance, it is difficult for us to specify exactly what is happening and why it raises serious concern
grayton downing

BBC News - First human trial of new bone-marrow transplant method - 0 views

  • Doctors at London's Great Ormond Street Hospital have carried out a pioneering bone-marrow transplant technique.
  • Mohammed Ahmed, who is nearly five years old, was among the first three children in the world to try out the new treatment.
  • Mohammed's doctors then modified these donated immune cells, called "T-cells", in the lab to engineer a safety switch - a self-destruct message that could be activated if Mohammed's body should start to reject them once transplanted.
  • ...2 more annotations...
  • "We waited for a full match but it did not come. By the grace of God, we took the decision to have the treatment.
  • There are currently about 1,600 people in the UK waiting for a bone-marrow transplant and 37,000 worldwide.
Javier E

Arianna Huffington's Improbable, Insatiable Content Machine - The New York Times - 0 views

  • Display advertising — wherein advertisers pay each time an ad is shown to a reader — still dominates the market. But native advertising, designed to match the look and feel of the editorial content it runs alongside, has been on the rise for years.
  • the ethical debate in the media world is over. Socintel360, a research firm, predicts that spending on native advertising in the United States will more than double in the next four years to $18.4 billion.
  • news start-ups today are like cable-television networks in the early ’80s: small, pioneering companies that will be handsomely rewarded for figuring out how to monetize your attention through a new medium. If this is so, the size of The Huffington Post’s audience could one day justify that $1 billion valuation.
sgardner35

Edward Snowden: The World Says No to Surveillance - NYTimes.com - 0 views

  • MOSCOW — TWO years ago today, three journalists and I worked nervously in a Hong Kong hotel room, waiting to see how the world would react to the revelation that the National Security Agency had been making records of nearly every phone call in the United States. In the days that followed, those journalists and others published documents revealing that democratic governments had been monitoring the private activities of ordinary citizens who had done nothing wrong.
  • Privately, there were moments when I worried that we might have put our privileged lives at risk for nothing — that the public would react with indifference, or practiced cynicism, to the revelations.
  • Since 2013, institutions across Europe have ruled similar laws and operations illegal and imposed new restrictions on future activities. The United Nations declared mass surveillance an unambiguous violation of human rights. In Latin America, the efforts of citizens in Brazil led to the Marco Civil, an Internet Bill of Rights. Recognizing the critical role of informed citizens in correcting the excesses of government, the Council of Europe called for new laws to protect whistle-blowers.
  • are now enabled by default in the products of pioneering companies like Apple, ensuring that even if your phone is stolen, your private life remains private. Such structural technological changes can ensure access to basic privacies beyond borders, insulating ordinary citizens from the arbitrary passage of anti-privacy laws, such as those now descending upon Russia.
  • Spymasters in Australia, Canada and France have exploited recent tragedies to seek intrusive new powers despite evidence such programs would not have prevented attacks. Prime Minister David Cameron of Britain recently mused, “Do we want to allow a means of communication between people which we cannot read?” He soon found his answer, proclaiming that “for too long, we have been a passively tolerant society, saying to our citizens: As long as you obey the law, we will leave you alone.”
carolinewren

Laser-Controlled And See-Through Brains Get Biomedical Prize | Popular Science - 0 views

  • The mouse brain above has undergone a process called CLARITY
  • Through a series of chemical reactions, CLARITY stabilizes organs taken from an animal or human and makes them transparent to the naked eye.
  • allows scientists to look into organs in a whole new way.
  • The rodent at the top of this story is being studied with a technique called optogenetics, which Deisseroth pioneered.
  • genetically engineered the mouse so that its brain cells turn certain genes on or off when scientists shine laser light onto them. The light enters the mouse's brain through that optical fiber you see in the photo.
  • For example, say 20 percent of people with autism don't have Gene A, but scientists aren't sure what Gene A does. They could turn off Gene A in a mouse's brain and see what happens next. The mouse's reaction could provide a clue about what Gene A does in people and why it's missing in certain patients
carolinewren

Book Review: 'A New History of Life' by Peter Ward and Joe Kirschvink - WSJ - 0 views

  • I imagine that physicists are similarly deluged with revelations about how to build a perpetual-motion machine or about the hitherto secret truth behind relativity. And so I didn’t view the arrival of “A New History of Life” with great enthusiasm.
  • subtitle breathlessly promises “radical new discoveries about the origins and evolution of life on earth,” while the jacket copy avers that “our current paradigm for understanding the history of life on Earth dates back to Charles Darwin’s time, yet scientific advances of the last few decades have radically reshaped that aging picture.”
  • authors Peter Ward and Joe Kirschvink are genuine scientists—paleontologists, to be exact. And they can write.
  • even genuine scientists are human and as such susceptible to the allure of offering up new paradigms (as the historian of science Thomas Kuhn put it)
  • paleontologist Stephen Jay Gould insisted that his conception of “punctuated equilibria” (a kind of Marxist biology that blurred the lines between evolution and revolution), which he developed along with fellow paleontologist Niles Eldredge, upended the traditional Darwinian understanding of how natural selection works.
  • This notion doesn’t constitute a fundamental departure from plain old evolution by natural selection; it simply italicizes that sometimes the process is comparatively rapid, other times slower.
  • In addition, they have long had a peculiar perspective on evolution, because of the limitations of the fossil record
  • Darwin was a pioneering geologist as well as the greatest of all biologists, and his insights were backgrounded by the key concept of uniformitarianism, as advocated by Charles Lyell, his friend and mentor
  • previously regnant paradigm among geologists had been “catastrophism
  • fossil record was therefore seen as reflecting the creation and extinction of new species by an array of dramatic and “unnatural” dei ex machina.
  • Of late, however, uniformitarianism has been on a losing streak. Catastrophism is back, with a bang . . . or a flood, or a burst of extraterrestrial radiation, or an onslaught of unpleasant, previously submerged chemicals
  • This emphasis on catastrophes is the first of a triad of novelties on which “A New History of Life” is based. The second involves an enhanced role for some common but insufficiently appreciated inorganic molecules, notably carbon dioxide, oxygen and hydrogen sulfide.
  • Life didn’t so much unfold smoothly over hundreds of millions of years as lurch chaotically in response to diverse crises and opportunities: too much oxygen, too little carbon dioxide, too little oxygen, too much carbon dioxide, too hot, too cold
  • So far, so good, except that in their eagerness to emphasize what is new and different, the authors teeter on the verge of the same trap as Gould: exaggerating the novelty of their own ideas.
  • Things begin to unravel when it comes to the third leg of Messrs. Ward and Kirschvink’s purported paradigmatic novelty: a supposed role for ecosystems—rain forests, deserts, rivers, coral reefs, deep-sea vents—as units of evolutionary change
  • “While the history of life may be populated by species,” they write, “it has been the evolution of ecosystems that has been the most influential factor in arriving at the modern-day assemblage of life. . . . [W]e know that on occasion in the deep past entirely new ecosystems appear, populated by new kinds of life.” True enough, but it is those “new kinds of life,” not whole ecosystems, upon which natural selection acts.
  • One of the most common popular misconceptions about evolution is that it proceeds “for the good of the species.”
  • The problem is that smaller, nimbler units are far more likely to reproduce differentially than are larger, clumsier, more heterogeneous ones. Insofar as ecosystems are consequential for evolution—and doubtless they are—it is because, like occasional catastrophes, they provide the immediate environment within which something not-so-new is acted out.
  • This is natural selection doing its same-old, same-old thing: acting by a statistically potent process of variation combined with selective retention and differential reproduction, a process that necessarily operates within the particular ecosystem that a given lineage occupies.
Javier E

Learning How Little We Know About the Brain - NYTimes.com - 0 views

  • So many large and small questions remain unanswered. How is information encoded and transferred from cell to cell or from network to network of cells?
  • Science found a genetic code but there is no brain-wide neural code; no electrical or chemical alphabet exists that can be recombined to say “red” or “fear” or “wink” or “run.” And no one knows whether information is encoded differently in various parts of the brain.
  • Single neurons, he said, are fairly well understood, as are small circuits of neurons. The question now on his mind, and that of many neuroscientists, is how larger groups, thousands of neurons, work together — whether to produce an action, like reaching for a cup, or to perceive something, like a flower.
  • A decade ago, he moved from Brandeis to Columbia, which now has one of the biggest groups of theoretical neuroscientists in the world, he says, and which has a new university-wide focus on integrating brain science with other disciplines.
  • a “pioneer of computational neuroscience.” Mr. Abbott brought the mathematical skills of a physicist to the field, but he is able to plunge right into the difficulties of dealing with actual brain experiments
  • the goal is to discover the physiological mechanism in the data.
  • For example, he asks why one pattern of neurons firing will “make you jump off the couch and run out the door” while others make you just sit there and do nothing. It could be, Dr. Abbott says, that simultaneous firing of all the neurons causes you to take action. Or it could be that it is the number of neurons firing that prompts an action.
  • “We’ve looked at the nervous system from the two ends in,” Dr. Abbott said, meaning sensations that flow into the brain and actions that are initiated there. “Somewhere in the middle is really intelligence, right? That’s where the action is.”
  • In the brain, somehow, stored memories and desires like hunger or thirst are added to information about the world, and actions are the result. This is the case for all sorts of animals, not just humans. It is thinking, at the most basic level.
kushnerha

American 'space pioneers' deserve asteroid rights, Congress says | Science | The Guardian - 0 views

  • In a rare bipartisan moment US lawmakers opened up the possibility of mining on other worlds despite an international treaty barring sovereign claims in space
  • The US Senate passed the Space Act of 2015 this week, sending its revisions of the bill back to the House for an expected approval, after which it would land on the president’s desk. The bill has a slew of provisions to encourage commercial companies that want to explore space and exploit its resources, granting “asteroid resource” and “space resource” rights to US citizens who managed to acquire the resource themselves.
  • lawmakers defined “space resource” as “an abiotic resource in situ in outer space” that would include water and minerals but not life.
  • The company’s president, Chris Lewicki, compared the bill to the Homestead Act, which distributed public land to Americans heading west and helped reshape the United States. “The Homestead Act of 1862 advocated for the search for gold and timber, and today, HR 2262 fuels a new economy,” Lewicki said in a statement. “This off-planet economy will forever change our lives for the better here on Earth.”
  • obstacle to space mining is a 1967 international treaty known as the Outer Space Treaty, to which the US is a signatory. The treaty holds that no “celestial body” is subject to “national appropriation by claim of sovereignty, by means of use or occupation, or by any other means”.
  • careful to add in their bill that they grant rights only to citizens who act under the law, “including the international obligations of the United States”.
  • added a “disclaimer of extraterritorial sovereignty”, saying the US does not thereby assert ownership, exclusive rights or jurisdiction “of any celestial body”.
  • bill asserts certain rights for US citizens, it disavows any national claim – sending a mixed message on asteroid rights
  • “They’re trying to dance around the issue. I tend to think it doesn’t create any rights because it conflicts with international law. The bottom line is before you can give somebody the right to harvest a resource you have to have ownership.”
  • Asteroids vary in their makeup, but some are rich in platinum and other valuable metals. Nasa has run missions to explore the possibilities of mining asteroids
  • solidifies America’s leading role in the commercial space sector
kushnerha

Are scientists blocking their own progress? - The Washington Post - 1 views

  • Max Planck won a Nobel prize for his revolutionary work in quantum mechanics, but it was his interest in the philosophy of science that led to what is now called “Planck’s Principle.” Planck argued that science was an evolving system of thought which changes slowly over time, fueled by the deaths of old ideas. As he wrote in his 1968 autobiography: “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”
  • Is our understanding of the world based in pure objective reason, or are the theories that underpin it shaped by generational biases? Do our most famous thinkers actually block new ideas from gaining ground?
  • A new paper published by the National Bureau of Economic Research suggests that fame does play a significant role in deciding when and whether new scientific ideas can gain traction. When a prominent scientist dies, the paper’s authors found, the number of articles published by his or her collaborators tends to fall “precipitously” in the years following the death — those supporters tend not to continue advocating for a once-famous scientist’s ideas once the scientist is gone.
  • the number of research articles written by other scientists — including those with opposing ideas — increases by 8 percent on average, implying that the work of these scientists had been stifled before, but that after the death of a ubiquitous figure, the field becomes more open to new ideas. The study also found that these new articles are less likely to cite previous research and are more likely to be cited by others in the field. Death signifies a changing of the guard
  • Our instinct is often to view science as a concrete tower, growing ever upward and built upon the immovable foundations of earlier pioneers.  Sir Isaac Newton famously characterized this as “standing on the shoulders of giants.”
  • Mid-20th century philosopher Thomas Kuhn was among the first to come to this conclusion, in his 1962 book “The Structure of Scientific Revolutions.” He argued that scientific theories appeared in punctuated “paradigm shifts,” in which the underlying assumptions of a field are questioned and eventually overthrown
  • Kuhn’s book was, to some extent, a paradigm shift in its own right. According to his logic, commonly held notions in science were bound to change and become outdated. What we believe today will tomorrow be revised, rewritten — and in the most extreme cases ridiculed.
  • the journal Nature earlier this year said scientific data is prone to bias because researchers design experiments and make observations in ways that support hypotheses
  • equally as important are simple shifts in perspective. It only takes one researcher seeing an accepted scientific model in a new light for a solidified paradigm to enter what Kuhn called a “crisis phase” and beg for alternative explanations
  • The NBER study shows that those who questioned consensus ought to be given the opportunity to make their case, not ignored, silenced or pushed to the back of the line.
  • We’re likely to see these “paradigm shifts” happen at a much faster rate as data and research become easier to share worldwide. For some, this reality might seem chaotic; for the truly curious, it is exhilarating. The result may be a more democratic version of science — one in which the progress of ideas doesn’t have to wait until the funeral of a great mind.
kushnerha

Philosophy's True Home - The New York Times - 0 views

  • We’ve all heard the argument that philosophy is isolated, an “ivory tower” discipline cut off from virtually every other progress-making pursuit of knowledge, including math and the sciences, as well as from the actual concerns of daily life. The reasons given for this are many. In a widely read essay in this series, “When Philosophy Lost Its Way,” Robert Frodeman and Adam Briggle claim that it was philosophy’s institutionalization in the university in the late 19th century that separated it from the study of humanity and nature, now the province of social and natural sciences.
  • This institutionalization, the authors claim, led it to betray its central aim of articulating the knowledge needed to live virtuous and rewarding lives. I have a different view: Philosophy isn’t separated from the social, natural or mathematical sciences, nor is it neglecting the study of goodness, justice and virtue, which was never its central aim.
  • identified philosophy with informal linguistic analysis. Fortunately, this narrow view didn’t stop them from contributing to the science of language and the study of law. Now long gone, neither movement defined the philosophy of its day and neither arose from locating it in universities.
  • The authors claim that philosophy abandoned its relationship to other disciplines by creating its own purified domain, accessible only to credentialed professionals. It is true that from roughly 1930 to 1950, some philosophers — logical empiricists, in particular — did speak of philosophy having its own exclusive subject matter. But since that subject matter was logical analysis aimed at unifying all of science, interdisciplinarity was front and center.
  • Philosophy also played a role in 20th-century physics, influencing the great physicists Albert Einstein, Niels Bohr and Werner Heisenberg. The philosophers Moritz Schlick and Hans Reichenbach reciprocated that interest by assimilating the new physics into their philosophies.
  • developed ideas relating logic to linguistic meaning that provided a framework for studying meaning in all human languages. Others, including Paul Grice and J.L. Austin, explained how linguistic meaning mixes with contextual information to enrich communicative contents and how certain linguistic performances change social facts. Today a new philosophical conception of the relationship between meaning and cognition adds a further dimension to linguistic science.
  • Decision theory — the science of rational norms governing action, belief and decision under uncertainty — was developed by the 20th-century philosophers Frank Ramsey, Rudolf Carnap, Richard Jeffrey and others. It plays a foundational role in political science and economics by telling us what rationality requires, given our evidence, priorities and the strength of our beliefs. Today, no area of philosophy is more successful in attracting top young minds.
  • Philosophy also assisted psychology in its long march away from narrow behaviorism and speculative Freudianism. The mid-20th-century functionalist perspective pioneered by Hilary Putnam was particularly important. According to it, pain, pleasure and belief are neither behavioral dispositions nor bare neurological states. They are interacting internal causes, capable of very different physical realizations, that serve the goals of individuals in specific ways. This view is now embedded in cognitive psychology and neuroscience.
  • philosopher-mathematicians Gottlob Frege, Bertrand Russell, Kurt Gödel, Alonzo Church and Alan Turing invented symbolic logic, helped establish the set-theoretic foundations of mathematics, and gave us the formal theory of computation that ushered in the digital age
  • Philosophy of biology is following a similar path. Today’s philosophy of science is less accessible than Aristotle’s natural philosophy chiefly because it systematizes a larger, more technically sophisticated body of knowledge.
  • Philosophy’s interaction with mathematics, linguistics, economics, political science, psychology and physics requires specialization. Far from fostering isolation, this specialization makes communication and cooperation among disciplines possible. This has always been so.
  • Nor did scientific progress rob philosophy of its former scientific subject matter, leaving it to concentrate on the broadly moral. In fact, philosophy thrives when enough is known to make progress conceivable, but it remains unachieved because of methodological confusion. Philosophy helps break the impasse by articulating new questions, posing possible solutions and forging new conceptual tools.
  • Our knowledge of the universe and ourselves expands like a ripple surrounding a pebble dropped in a pool. As we move away from the center of the spreading circle, its area, representing our secure knowledge, grows. But so does its circumference, representing the border where knowledge blurs into uncertainty and speculation, and methodological confusion returns. Philosophy patrols the border, trying to understand how we got there and to conceptualize our next move.  Its job is unending.
  • Although progress in ethics, political philosophy and the illumination of life’s meaning has been less impressive than advances in some other areas, it is accelerating.
  • the advances in our understanding because of careful formulation and critical evaluation of theories of goodness, rightness, justice and human flourishing by philosophers since 1970 compare well to the advances made by philosophers from Aristotle to 1970
  • The knowledge required to maintain philosophy’s continuing task, including its vital connection to other disciplines, is too vast to be held in one mind. Despite the often-repeated idea that philosophy’s true calling can only be fulfilled in the public square, philosophers actually function best in universities, where they acquire and share knowledge with their colleagues in other disciplines. It is also vital for philosophers to engage students — both those who major in the subject, and those who do not. Although philosophy has never had a mass audience, it remains remarkably accessible to the average student; unlike the natural sciences, its frontiers can be reached in a few undergraduate courses.