Group items matching "research" in title, tags, annotations or url
anonymous

Why you should starve yourself a little bit each day - 0 views

  • While some might be inclined to cynically dismiss intermittent fasting as just another fad diet, the scientific evidence in support of daily fasting (or any fasting for that matter) is compelling.
  •  
    "We've been told since we were children that we need to eat three square meals a day. But new research shows that we don't need to be eating throughout the course of the day. And in fact, it might even be undermining our health. These insights have given rise to what's known as "intermittent fasting" - the daily restriction of meals and caloric intake. Here's why some health experts believe you should starve yourself just a little bit each day."
anonymous

Time and the End of History Illusion - 0 views

  • “Middle-aged people – like me – often look back on our teenage selves with some mixture of amusement and chagrin,” said one of the authors, Daniel T. Gilbert, a psychologist at Harvard. “What we never seem to realize is that our future selves will look back and think the very same thing about us. At every age we think we’re having the last laugh, and at every age we’re wrong.”
  • There are several ways to explain these findings. It’s more difficult to predict the future than to recall the past; perhaps participants simply weren’t willing to speculate on something they felt uncertain about. It’s also possible that study participants overestimated how much they had changed in the past, making it seem as though they were underestimating their change in the future. However, the psychologists suggest that the end of history illusion is most probably explained by the fact that it just makes us feel better about ourselves:
  • On the other hand, French postmodern philosopher Jean Baudrillard contends that Fukuyama’s modernist theory is no more than an illusion caused by our particular relationship with time. He writes that contemporary civilization has simply “lost” its sense of history:
  • ...5 more annotations...
  • … one might suppose that the acceleration of modernity, of technology, events and media, of all exchanges – economic, political, and sexual – has propelled us to ‘escape velocity’, with the result that we have flown free of the referential sphere of the real and of history. … A degree of slowness (that is, a certain speed, but not too much), a degree of distance, but not too much, and a degree of liberation (an energy for rupture and change), but not too much, are needed to bring about the kind of condensation or significant crystallization of events we call history, the kind of coherent unfolding of causes and effects we call reality.
  • Once beyond this gravitational effect, which keeps bodies in orbit, all the atoms of meaning get lost in space. Each atom pursues its own trajectory to infinity and is lost in space. This is precisely what we are seeing in our present-day societies, intent as they are on accelerating all bodies, messages and processes in all directions and which, with modern media, have created for every event, story and image a simulation of an infinite trajectory.
  • Every political, historical and cultural fact possesses a kinetic energy which wrenches it from its own space and propels it into a hyperspace where, since it will never return, it loses all meaning. No need for science fiction here: already, here and now – in the shape of our computers, circuits and networks – we have the particle accelerator which has smashed the referential orbit of things once and for all.
  • Illusion or not, the Harvard study shows that a sense of being at the end of history has real-world consequences: underestimating how differently we’ll feel about things in the future, we sometimes make decisions we later come to regret.
  • In other words, the end of history illusion could be thought of as a lack of long-term thinking.
  •  
    "In a paper published last week in Science, these researchers report on a study that asked participants to estimate how much their personality, tastes, and values had changed over the last decade, and how much they expected they would change in the next. Statistical analysis reveals what these psychologists call an "End of History Illusion": while we remember our past selves to be quite different from who we are today, we nevertheless believe that we won't change much at all in the future. The New York Times quotes:"
anonymous

Massive espionage malware targeting governments undetected for 5 years | Ars Technica - 0 views

  •  
    "Researchers have uncovered an ongoing, large-scale computer espionage network that's targeting hundreds of diplomatic, governmental, and scientific organizations in at least 39 countries, including the Russian Federation, Iran, and the United States."
anonymous

When a Calorie Is Not a Calorie - 0 views

  • In a wide-ranging discussion of how food is digested in everything from humans to rats to pythons, the panel reviewed a new spate of studies showing that foods are processed differently as they move from our gullet to our guts and beyond. They agreed that net caloric counts for many foods are flawed because they don’t take into account the energy used to digest food; the bite that oral and gut bacteria take out of various foods; or the properties of different foods themselves that speed up or slow down their journey through the intestines, such as whether they are cooked or resistant to digestion.
  • The process used to estimate calories for food was developed at the turn of the 19th to 20th century by Wilbur Atwater. It was a simple system of calculating four calories for each gram of protein, nine calories for each gram of fat, and four calories for each gram of carbohydrate (modified later by others to add two calories for a gram of fiber). Although it has been useful for approximating the energetic costs of metabolizing many foods, its shortcomings have been known for decades—and some nations, such as Australia, have dropped the system because it is “inaccurate and impractical,” said panelist Geoffrey Livesey, a nutritional biochemist and director of Independent Nutrition Logic Ltd. in Wymondham, U.K. [A worked example of this arithmetic follows this entry.]
  • One key area where the system is inaccurate, Wrangham reported, is in estimating the calories for cooked food.
  • ...3 more annotations...
  • The way foods are processed can also make them easier to digest.
  • New studies also are finding that bacteria in the gut respond differently to processed foods and cooked foods. Carmody reported that she and Peter Turnbaugh of Harvard University are finding “key differences in the type of bacterial communities” in the guts of mice, depending on whether they were fed chow or cooked meat.
  • Why does all of this matter? Because we’re in the midst of an obesity epidemic and counting calories has been misleading, said David Ludwig, a pediatric endocrinologist at Children’s Hospital Boston and Harvard Medical School.
  •  
    "When it comes to weight loss, a calorie is a calorie is a calorie. That's been the mantra of nutritionists, dietitians, and food regulators in the United States and Europe for more than a century. But when it comes to comparing raw food with cooked food, or beans with breakfast cereals, that thinking may be incorrect. That was the consensus of a panel of researchers who listed the many ways that the math doesn't always add up correctly on food labels"
anonymous

The Walking Dead, Mirror Neurons, and Empathy - 1 views

  • Suddenly the researcher noticed that according to the equipment hooked up to the monkey’s brain, neurons were firing that were associated with grasping motions, even though the animal had only SEEN something being grasped. This was odd, because normally brain cells are very specialized and nobody knew of any neurons that would activate both when performing an action and when seeing someone else perform the same action. Yet here the monkey was, blithely firing neurons previously associated only with performing motor actions while just sitting still and watching.
  • Thus was the first observation of a mirror neuron in action: a brain cell set apart from many of its peers, and one that is also present in delicious human brains. It turns out that many researchers, like the aforementioned Dr. Marco Iacoboni, Professor of Psychiatry and Behavioral Sciences at UCLA, believe that mirror neurons are important for our ability to empathize with things we see, like the plight of poor Lee and Clementine in The Walking Dead.
  • I think this is one of the reasons why The Walking Dead is so good at eliciting emotions: it frequently shows us the faces of the characters and lets us see all the work put into creating easily recognizable and convincing facial expressions. And so it’s not the zombies that elicit dread in us. Instead it’s things like the face that Kenny makes when Lee tells him to make a hard decision about his family.
  •  
    "And the amazing thing is that the game gets me to feel all those emotions too. I'm glad that the game comes in monthly installments, because I need the time between episodes to recover. But why is that? By what psychological, neurological, and biological mechanisms do video games like The Walking Dead get us to not only empathize with characters onscreen, but also share their emotions?"
anonymous

Abstract Science - 5 views

  •  
    "Scientific abstracts are the hooks attempting to capture a discerning reader's attention, the shortcuts saving the busy reader some time and the keys unlocking scientific knowledge for those lacking a portfolio of academic journal subscriptions. But don't be dismayed if you're still confused after reading an abstract multiple times. When writing this leading, summarizing paragraph of a scientific manuscript, researchers often make mistakes. Some authors include too much information about the experimental methods while forgetting to announce what they actually discovered. Others forget to include any methodology at all. Sometimes the scientists fail to divulge why they even conducted the study in the first place, yet feel comfortable boldly speculating with a loose-fitting claim of general importance. Nevertheless, the abstract serves a critical importance and every science enthusiast needs to become comfortable with reading them."
  • ...4 more comments...
  •  
    Took a class (well, more than one) with the UChicago professional writing program (http://writing-program.uchicago.edu/courses/index.htm). There was a lot of hammering this home to the writers of those abstracts, too. We've got all these forms, and it's not always clear to the reader or writer what's expected of those forms. This does not lead to effective communication.
  •  
    Too true. Sadly, it's a lesson that's still lost on some pretty senior P.I.'s, who think that 'lay summary' means simply spelling out all their acronyms.
  •  
    Honestly, this can be really hard and time-intensive work for some people. Some people understand what they need to do, but end up taking (usually very jargon-filled) shortcuts. I understand that, but I also know that it gets faster and easier with practice.
  •  
    Or hire an editor.
  •  
    It would be interesting to see how much purchase a suggestion like that receives. I suspect more than a few PI's would find the notion insulting because they've been doing it for years, and some of these really technical publications have been tolerating it for so long. For my part as an Admin, I would review the lay summary and give my impressions, which would then get (mostly) completely ignored. :)
  •  
    A _lot_ of people don't think they need professional writing and editing help. After all, they learned to write years ago.
anonymous

Misinformation and Its Correction - 1 views

shared by anonymous on 17 Oct 12
  •  
    "Abstract The widespread prevalence and persistence of misinformation in contemporary societies, such as the false belief that there is a link between childhood vaccinations and autism, is a matter of public concern. For example, the myths surrounding vaccinations, which prompted some parents to withhold immunization from their children, have led to a marked increase in vaccine-preventable disease, as well as unnecessary public expenditure on research and public-information campaigns aimed at rectifying the situation. We first examine the mechanisms by which such misinformation is disseminated in society, both inadvertently and purposely. Misinformation can originate from rumors but also from works of fiction, governments and politicians, and vested interests. Moreover, changes in the media landscape, including the arrival of the Internet, have fundamentally influenced the ways in which information is communicated and misinformation is spread. We next move to misinformation at the level of the individual, and review the cognitive factors that often render misinformation resistant to correction. We consider how people assess the truth of statements and what makes people believe certain things but not others. We look at people's memory for misinformation and answer the questions of why retractions of misinformation are so ineffective in memory updating and why efforts to retract misinformation can even backfire and, ironically, increase misbelief. Though ideology and personal worldviews can be major obstacles for debiasing, there nonetheless are a number of effective techniques for reducing the impact of misinformation, and we pay special attention to these factors that aid in debiasing. We conclude by providing specific recommendations for the debunking of misinformation. These recommendations pertain to the ways in which corrections should be designed, structured, and applied in order to maximize their impact. Grounded in cognitive psychological theory, these rec
anonymous

Rising Share of Americans See Conflict Between Rich and Poor | Pew Social & Demographic Trends - 0 views

  • Not only have perceptions of class conflict grown more prevalent; so, too, has the belief that these disputes are intense. According to the new survey, three-in-ten Americans (30%) say there are “very strong conflicts” between poor people and rich people. That is double the proportion that offered a similar view in July 2009 and the largest share expressing this opinion since the question was first asked in 1987.
  •  
    "The Occupy Wall Street movement no longer occupies Wall Street, but the issue of class conflict has captured a growing share of the national consciousness. A new Pew Research Center survey of 2,048 adults finds that about two-thirds of the public (66%) believes there are "very strong" or "strong" conflicts between the rich and the poor-an increase of 19 percentage points since 2009."
anonymous

Jonah Lehrer and the Problems with "Pithy" Science Writing - 1 views

  • The world economy is crumbling and unemployment is soaring. But let me talk to you about an intangible tipping point that could change your life forever or tell you what happens in your brain when that proverbial light bulb goes off in the cartoon equivalent of a thought bubble. Because talking about the actual economy is much too real and depressing.
  • Science writers have always had to try harder to be interesting. In trying to entice the general public with the tedious, sometimes boring work that goes on in a research lab, they often reduce the nuances and complexities of science—workings of intricate systems like evolution and the human body, the mathematics of financial bubbles, and the inevitable warming of the earth— to interesting tales that combine a tiny bit of data with copious amounts of speculation without context or background.
  • Pop-science writers like Gladwell, Lehrer, Dan Ariely, and Charles Duhigg take a slightly different approach—they combine decades of scientific research with hearsay and speculation, metaphysical analysis and societal trends, and offer it to the audience in bite-size palatable pieces.
  • ...4 more annotations...
  • Lehrer’s neuroscience in Imagine contains some obvious elementary errors—arguably more dangerous than a couple of manufactured Bob Dylan quotes. While Gladwell talks about our amazing powers of cognition in Blink, he doesn’t venture to give a detailed account of how these processes occur in the brain.
  • Our blogging culture is partly to blame for this. The demand of our 24/7 news cycle, first created by cable television, and now carried on by minute-by-minute updates on the Internet creates constant demand for new information that never quite satisfies the insatiable appetite of the limitless Web.
  • What a newspaper or magazine would call ‘A model to help cure cancer,’ for instance, could realistically only be “an adaptation of a previous model to simulate cancer tissue in order to determine if it can be used to study cancer cells and eventually help find a cure.” Want to try that for a headline? Exactly. Confirming a hypothesis or a hunch with empirical evidence is the very essence of science, whereas in journalism—like much of the humanities—theories and schools of thought can rest on their own. However, science journalism, like science, needs to be rooted in fact and observation, without which it would lose its basis.
  • The problem with these examples is not that they are untrue, but that the advice they offer is helpless and futile. What are you to do to make these “breakthrough” moments happen? Nothing, apparently, except wait for them. In a journalistic equivalent of motivational speeches, these erudite writers hail subconscious processes in the brain that we have almost no control over, stopping just short of saying, “it will happen if you believe.”
  •  
    "The really troubling aspect of the Jonah Lehrer story is not so much that the media allowed his self-plagiarisms and misquotes to slip through the cracks, but that it placed him on such a high pedestal in the first place."
anonymous

Ayn Rand & Human Nature 19 - 0 views

  • In the first place, it is logically fallacious to reason from two *is* premises to an *ought* conclusion, something Rand appears not to have understood. Secondly, it is psychologically impossible to derive an end from reason. Reason is a method, a means for attaining an end. But an end must be wished for its own sake, because it satisfies some sentiment or desire.
  • And finally, there exists an immense body of research demonstrating that reason is not used to make moral decisions; on the contrary, where reason comes in is after the decision has been made. The role of reason is not to make moral choices, but to defend them after the fact.
  • If reasoning played a central role in moral judgments, we would expect better reasoners to arrive at different conclusions from inferior reasoners. But this is not what the research finds. Smarter, more educated people don't reach different conclusions, they just provide more reasons to support their side of the issue. When people reason about issues of morality, they are blinded by confirmation bias.
  • ...2 more annotations...
  • Reason, as Nietzsche warned us, is a whore. She will sleep with any premises you throw at her, no matter how anti-empirical or absurd.
  • "skilled arguers ... are not after the truth butafter arguments supporting their views." This explains why the confirmation biasis so powerful, and so ineradicable. How hard could it be to teach students toalways look on the other side, to always look for evidence against their favoredviews? Yet, in fact, it's really hard, and nobody has yet found a way to do it.It's hard because the confirmation bias is a built-in feature..., not a bug thatcan be removed...
  •  
    Rand places enormous stress on individual conscious reasoning. "Reason" is her chief moral virtue and is considered a necessity to man's survival. Not surprisingly, Rand regarded "reason" as particularly important in ethics. Rand regarded any attempt to derive ethical behavior from intuition or gut feelings or emotion as mere "whim worship," which she denounced in fierce, vigorous language.
anonymous

A New Thermodynamics Theory of the Origin of Life - 1 views

  • From the standpoint of physics, there is one essential difference between living things and inanimate clumps of carbon atoms: The former tend to be much better at capturing energy from their environment and dissipating that energy as heat.
  • Jeremy England, a 31-year-old assistant professor at the Massachusetts Institute of Technology, has derived a mathematical formula that he believes explains this capacity.
  • “You start with a random clump of atoms, and if you shine light on it for long enough, it should not be so surprising that you get a plant,” England said.
  • ...19 more annotations...
  • “I am certainly not saying that Darwinian ideas are wrong,” he explained. “On the contrary, I am just saying that from the perspective of the physics, you might call Darwinian evolution a special case of a more general phenomenon.”
  • The formula, based on established physics, indicates that when a group of atoms is driven by an external source of energy (like the sun or chemical fuel) and surrounded by a heat bath (like the ocean or atmosphere), it will often gradually restructure itself in order to dissipate increasingly more energy. This could mean that under certain conditions, matter inexorably acquires the key physical attribute associated with life.
  • His idea, detailed in a recent paper and further elaborated in a talk he is delivering at universities around the world, has sparked controversy among his colleagues, who see it as either tenuous or a potential breakthrough, or both.
  • Eugene Shakhnovich, a professor of chemistry, chemical biology and biophysics at Harvard University, is not convinced. “Jeremy’s ideas are interesting and potentially promising, but at this point are extremely speculative, especially as applied to life phenomena,” Shakhnovich said.
  • England’s theoretical results are generally considered valid. It is his interpretation — that his formula represents the driving force behind a class of phenomena in nature that includes life — that remains unproven. But already, there are ideas about how to test that interpretation in the lab.
  • “He’s trying something radically different,” said Mara Prentiss, a professor of physics at Harvard who is contemplating such an experiment after learning about England’s work. “As an organizing lens, I think he has a fabulous idea. Right or wrong, it’s going to be very much worth the investigation.”
  • At the heart of England’s idea is the second law of thermodynamics, also known as the law of increasing entropy or the “arrow of time.”
  • Hot things cool down, gas diffuses through air, eggs scramble but never spontaneously unscramble; in short, energy tends to disperse or spread out as time progresses.
  • It increases as a simple matter of probability: There are more ways for energy to be spread out than for it to be concentrated.
  • A cup of coffee and the room it sits in become the same temperature, for example. As long as the cup and the room are left alone, this process is irreversible. The coffee never spontaneously heats up again because the odds are overwhelmingly stacked against so much of the room’s energy randomly concentrating in its atoms.
  • A plant, for example, absorbs extremely energetic sunlight, uses it to build sugars, and ejects infrared light, a much less concentrated form of energy. The overall entropy of the universe increases during photosynthesis as the sunlight dissipates, even as the plant prevents itself from decaying by maintaining an orderly internal structure.
  • Life does not violate the second law of thermodynamics, but until recently, physicists were unable to use thermodynamics to explain why it should arise in the first place.
  • In Schrödinger’s day, they could solve the equations of thermodynamics only for closed systems in equilibrium.
  • Jarzynski and Crooks showed that the entropy produced by a thermodynamic process, such as the cooling of a cup of coffee, corresponds to a simple ratio: the probability that the atoms will undergo that process divided by their probability of undergoing the reverse process (that is, spontaneously interacting in such a way that the coffee warms up). [This ratio is written out in symbols after this entry.]
  • Using Jarzynski and Crooks’ formulation, he derived a generalization of the second law of thermodynamics that holds for systems of particles with certain characteristics: The systems are strongly driven by an external energy source such as an electromagnetic wave, and they can dump heat into a surrounding bath.
  • This class of systems includes all living things.
  • Having an overarching principle of life and evolution would give researchers a broader perspective on the emergence of structure and function in living things, many of the researchers said. “Natural selection doesn’t explain certain characteristics,” said Ard Louis, a biophysicist at Oxford University, in an email. These characteristics include a heritable change to gene expression called methylation, increases in complexity in the absence of natural selection, and certain molecular changes Louis has recently studied.
  • If England’s approach stands up to more testing, it could further liberate biologists from seeking a Darwinian explanation for every adaptation and allow them to think more generally in terms of dissipation-driven organization.
  • They might find, for example, that “the reason that an organism shows characteristic X rather than Y may not be because X is more fit than Y, but because physical constraints make it easier for X to evolve than for Y to evolve,” Louis said.
  •  
    Why does life exist? Popular hypotheses credit a primordial soup, a bolt of lightning and a colossal stroke of luck. But if a provocative new theory is correct, luck may have little to do with it. Instead, according to the physicist proposing the idea, the origin and subsequent evolution of life follow from the fundamental laws of nature and "should be as unsurprising as rocks rolling downhill."
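The Jarzynski-Crooks result paraphrased in the annotations above can be written out compactly. This is a standard statement of the Crooks fluctuation theorem; the notation here is assumed for illustration, not taken from England's paper:

      % Crooks fluctuation theorem: entropy production equals the log-ratio
      % of a trajectory's probability to that of its time reverse.
      \[
        \frac{P[x(t)]}{P[\tilde{x}(t)]} \;=\; e^{\Delta S / k_B}
        \qquad\Longleftrightarrow\qquad
        \Delta S \;=\; k_B \ln \frac{P[x(t)]}{P[\tilde{x}(t)]}
      \]

Here x(t) is a trajectory such as the coffee cooling, x̃(t) is its time reverse (the coffee spontaneously warming), and ΔS is the entropy produced. Forward, entropy-producing trajectories are exponentially more probable than their reverses, which is why the coffee never un-cools.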
anonymous

You Broke Peer Review. Yes, I Mean You | Code and Culture - 0 views

  • no more anonymous co-authors making your paper worse with a bunch of non sequiturs or footnotes with hedging disclaimers.
  • The thing is though that optimistic as I am about the new journal, I don’t think it will replace the incumbent journals overnight and so we still need to fix review at the incumbent journals.
  • So fixing peer review doesn’t begin with you, the author, yelling at your computer “FFS reviewer #10, maybe that’s how you would have done it, but it’s not your paper”
  • ...32 more annotations...
  • Nor, realistically, can fixing peer review happen from the editors telling you to go ahead and ignore comments 2, 5, and 6 of reviewer #6.
  • First, it would be an absurd amount of work
  • Second, from the editor’s perspective the chief practical problem is recruiting reviewers
  • they don’t want to alienate the reviewers by telling them that half their advice sucks in their cover letter
  • Rather, fixing peer review has to begin with you, the reviewer, telling yourself “maybe I would have done it another way myself, but it’s not my paper.”
  • You need to adopt a mentality of “is it good how the author did it” rather than “how could this paper be made better” (read: how would I have done it). That is the whole of being a good reviewer, the rest is commentary. That said, here’s the commentary.
  • Do not brainstorm
  • Responding to a research question by brainstorming possibly relevant citations or methods
  • First, many brainstormed ideas are bad.
  • When I give you advice as a peer reviewer there is a strong presumption that you take the advice even if it’s mediocre
  • Second, many brainstormed ideas are confusing.
  • When I give you advice in my office you can ask follow-up questions
  • When I give advice as a peer reviewer it’s up to you to hope that you read the entrails in a way that correctly augurs the will of the peer reviewers.
  • Being specific has the ancillary benefit that it’s costly to the reviewer which should help you maintain the discipline to thin the mindfart herd stampeding into the authors’ revisions.
  • Third, ideas are more valuable at the beginning of a project than at the end of it.
  • When I give you advice about your new project you can use it to shape the way the project develops organically. When I give it to you as a reviewer you can only graft it on after the fact.
  • it is essential to keep in mind that no matter how highly you think of your own expertise and opinions, the author doesn’t want to hear it.
  • time is money. It usually takes me an amount of time that is at least the equivalent of a course release to turn around an R&R, and at most schools a course release in turn is worth about $10,000 to $30,000 if you’re lucky enough to raise the grants to buy them.
  • Distinguish demands versus suggestions versus synapses that happened to fire as you were reading the paper
  • A lot of review comments ultimately boil down to some variation on “this reminds me of this citation” or “this research agenda could go in this direction.” OK, great. Now ask yourself, is it a problem that this paper does not yet do these things or are these just possibilities you want to share with the author?
  • As a related issue, demonstrate some rhetorical humility.
  • There’s wrong and then there’s difference of opinion
  • On quite a few methodological and theoretical issues there is a reasonable range of opinion. Don’t force the author to weigh in on your side.
  • For instance, consider Petev ASR 2013. The article relies heavily on McPherson et al ASR 2006, which is an extremely controversial article (see here, here, and here).
  • One reaction to this would be to say the McPherson et al paper is refuted and ought not be cited. However Petev summarizes the controversy in footnote 10 and then in footnote 17 explains why his own data is a semi-independent (same dataset, different variables) corroboration of McPherson et al.
  • These footnotes acknowledge a nontrivial debate about one of the article’s literature antecedents and then situates the paper within the debate.
  • Theoretical debates are rarely an issue of decisive refutation or strictly cumulative knowledge but rather at any given time there’s a reasonable range of opinions and you shouldn’t demand that the author go with your view but at most that they explore its implications if they were to.
  • There are cases where you fall on one side of a theoretical or methodological gulf and the author on another to the extent that you feel that you can’t really be fair.
  • you as the reviewer have to decide if you’re going to engage in what philosophers of science call “the demarcation problem” and sociologists of science call “boundary work” or you’re going to recuse yourself from the review.
  • Don’t try to turn the author’s theory section into a lit review.
  • The theory section is not about demonstrating basic competence or reciting a creedal confession and so it does not need to discuss every book or article ever published on the subject or even just the things important enough to appear on your graduate syllabus or field exam reading list.
  • If the submission reminds you of a citation that’s relevant to the author’s subject matter, think about whether it would materially affect the argument.
  •  
    "I'm as excited as anybody about Sociological Science as it promises a clean break from the "developmental" model of peer review by moving towards an entirely evaluative model. That is, no more anonymous co-authors making your paper worse with a bunch of non sequiturs or footnotes with hedging disclaimers. (The journal will feature frequent comment and replies, which makes debate about the paper a public dialog rather than a secret hostage negotiation). The thing is though that optimistic as I am about the new journal, I don't think it will replace the incumbent journals overnight and so we still need to fix review at the incumbent journals."
anonymous

This Is the Man Bill Gates Thinks You Absolutely Should Be Reading - 0 views

  • Let’s talk about manufacturing. You say a country that stops doing mass manufacturing falls apart. Why? In every society, manufacturing builds the lower middle class. If you give up manufacturing, you end up with haves and have-nots and you get social polarization. The whole lower middle class sinks.
  • You also say that manufacturing is crucial to innovation. Most innovation is not done by research institutes and national laboratories. It comes from manufacturing—from companies that want to extend their product reach, improve their costs, increase their returns. What’s very important is in-house research. Innovation usually arises from somebody taking a product already in production and making it better: better glass, better aluminum, a better chip. Innovation always starts with a product.
  • American companies do still innovate, though. They just outsource the manufacturing. What’s wrong with that? Look at the crown jewel of Boeing now, the 787 Dreamliner. The plane had so many problems—it was like three years late. And why? Because large parts of it were subcontracted around the world. The 787 is not a plane made in the USA; it’s a plane assembled in the USA. They subcontracted composite materials to Italians and batteries to the Japanese, and the batteries started to burn in-flight. The quality control is not there.
  • ...11 more annotations...
  • Restoring manufacturing would mean training Americans again to build things. Only two countries have done this well: Germany and Switzerland. They’ve both maintained strong manufacturing sectors and they share a key thing: Kids go into apprentice programs at age 14 or 15. You spend a few years, depending on the skill, and you can make BMWs. And because you started young and learned from the older people, your products can’t be matched in quality. This is where it all starts.
  • You claim Apple could assemble the iPhone in the US and still make a huge profit. It’s no secret! Apple has tremendous profit margins. They could easily do everything at home. The iPhone isn’t manufactured in China—it’s assembled in China from parts made in the US, Germany, Japan, Malaysia, South Korea, and so on. The cost there isn’t labor. But laborers must be sufficiently dedicated and skilled to sit on their ass for eight hours and solder little pieces together so they fit perfectly.
  • But Apple is supposed to be a giant innovator. Apple! Boy, what a story. No taxes paid, everything made abroad—yet everyone worships them. This new iPhone, there’s nothing new in it. Just a golden color. What the hell, right? When people start playing with color, you know they’re played out.
  • Let’s talk about energy. You say alternative energy can’t scale. Is there no role for renewables? I like renewables, but they move slowly. There’s an inherent inertia, a slowness in energy transitions. It would be easier if we were still consuming 66,615 kilowatt-hours per capita, as in 1950. But in 1950 few people had air-conditioning. We’re a society that demands electricity 24/7. This is very difficult with sun and wind.
  • What about nuclear? The Chinese are building it, the Indians are building it, the Russians have some intention to build. But as you know, the US is not. The last big power plant was ordered in 1974. Germany is out, Italy has vowed never to build one, and even France is delaying new construction. Is it a nice thought that the future of nuclear energy is now in the hands of North Korea, Pakistan, India, and Iran? It’s a depressing thought, isn’t it?
  • You call this Moore’s curse—the idea that if we’re innovative enough, everything can have yearly efficiency gains. It’s a categorical mistake. You just cannot increase the efficiency of power plants like that. You have your combustion machines—the best one in the lab now is about 40 percent efficient. In the field they’re about 15 or 20 percent efficient. Well, you can’t quintuple it, because that would be 100 percent efficient. Impossible, right? There are limits. It’s not a microchip.
  • So what’s left? Making products more energy-efficient? Innovation is making products more energy-efficient — but then we consume so many more products that there’s been no absolute dematerialization of anything. We still consume more steel, more aluminum, more glass, and so on. As long as we’re on this endless material cycle, this merry-go-round, well, technical innovation cannot keep pace.
  • What is the simplest way to make your house super-efficient? Insulation!
  • Right. I have 50 percent more insulation in my walls. It adds very little to the cost. And you insulate your basement from the outside—I have about 20 inches of Styrofoam on the outside of that concrete wall. We were the first people building on our cul-de-sac, so I saw all the other houses after us—much bigger, 3,500 square feet. None of them were built properly. I pay in a year for electricity what they pay in January. You can have a super-efficient house; you can have a super-efficient car, a little Honda Civic, 40 miles per gallon.
  • Your other big subject is food. You’re a pretty grim thinker, but this is your most optimistic area. You actually think we can feed a planet of 10 billion people—if we eat less meat and waste less food. We pour all this energy into growing corn and soybeans, and then we put all that into rearing animals while feeding them antibiotics. And then we throw away 40 percent of the food we produce.
  • So the answers are not technological but political: better economic policies, better education, better trade policies. Right. Today, as you know, everything is “innovation.” We have problems, and people are looking for fairy-tale solutions—innovation like manna from heaven falling on the Israelites and saving them from the desert. It’s like, “Let’s not reform the education system, the tax system. Let’s not improve our dysfunctional government. Just wait for this innovation manna from a little group of people in Silicon Valley, preferably of Indian origin.”
  •  
    ""There is no author whose books I look forward to more than Vaclav Smil," Bill Gates wrote this summer. That's quite an endorsement-and it gave a jolt of fame to Smil, a professor emeritus of environment and geography at the University of Manitoba. In a world of specialized intellectuals, Smil is an ambitious and astonishing polymath who swings for fences. His nearly three dozen books have analyzed the world's biggest challenges-the future of energy, food production, and manufacturing-with nuance and detail. They're among the most data-heavy books you'll find, with a remarkable way of framing basic facts. (Sample nugget: Humans will consume 17 percent of what the biosphere produces this year.)"
anonymous

Watch America's looming age imbalance unfold in 4 seconds - 0 views

  • one such visualization shows the dreaded age distribution phenomenon that's projected to occur over the coming decades as Baby Boomers become the oldest generation. When charted with the oldest folks up top and the youngest people at the bottom, societies tend to have age distributions shaped like pyramids. [The chart itself isn't captured here; a minimal plotting sketch follows this entry.]
  • But all those births a couple of generations ago threw the pyramid out of whack, which will have a lasting, noticeable impact. Pew made a handy gif showing that problem playing out over a century (not reproduced here).
  • Really, go check out the whole report.
  •  
    "The Pew Research Center has an amazing new report about America's shifting demographics that shows how the country is getting older, less white, and more liberal. The entire report is worth your time, and it includes some beautiful animated visualizations to put the demographic changes in context."
anonymous

Are You Really Gluten-Intolerant? Maybe Not. - 0 views

  • Many of the people who pursue a gluten-free diet out of choice believe themselves to be gluten-sensitive, a far less serious condition in which limited symptoms of celiac's manifest without any damage to the small intestine.
  • According to the National Foundation for Celiac Awareness, as many as 18 million Americans may have non-celiac gluten sensitivity (NCGS). Since the condition has only been recently described and is poorly understood, it's currently diagnosed via a process of exclusion. If a patient's test for celiac disease comes back negative, but symptoms improve on a gluten-free diet, then he or she is diagnosed with NCGS.
  • Instead of receiving a proper diagnosis, however, many people are self-diagnosing as gluten-sensitive and eating gluten-free by choice. Noticing this trend, Jessica Biesiekierski, a gastroenterologist at Monash University and a leading researcher into the effects of gluten, sought adults who believed they had NCGS to participate in a survey and a clinical trial.
  • ...5 more annotations...
  • First, the survey results: The average age of the respondents was 43.5 years and 130 (88%) were women. These numbers are likely a result of sampling bias, but could reflect the demographics of those who engage in a gluten-free diet by choice.
  • For 63% of respondents, the gluten-free diet was either self-initiated or started at the recommendation of an alternative health professional. Inadequate investigation of celiac disease was common (62%), particularly by individuals who self-diagnosed their sensitivity or sought guidance from an alternative health professional.
  • Moreover, 24% of respondents had uncontrolled symptoms despite restricting their gluten intake, and 27% weren't even following a gluten-free diet.
  • "Indeed, patients who believe they have NCGS are likely to benefit from lowering their dietary intake of FODMAPs," Biesiekierski says. While the underlying causes for non-celiac gluten sensitivity aren't yet understood, it is well known why FODMAPs produce adverse gastrointestinal symptoms. They are not easily digested and absorbed in the small intestine, but bacteria in the large intestine are more than happy to ferment them, producing gas, which results in bloating and flatulence.
  • There are three big takeaways from Biesiekierski's research:
    1. If you think you're sensitive to gluten, get tested for celiac disease -- it's a serious condition that's almost certainly underdiagnosed. For each diagnosed celiac patient, at least seven more are undiagnosed.
    2. If you don't have celiac's but are still experiencing its symptoms after eating gluten-containing foods, your problems may result from FODMAPs, not gluten sensitivity. Gluten-free diets can be deficient in fiber and a host of other vitamins and minerals, while simply reducing FODMAP intake can be much healthier and less restrictive.
    3. Non-Celiac Gluten Sensitivity (a.k.a. gluten intolerance) may not actually exist. More on that next week.
  •  
    "Celiac disease is an autoimmune disorder that affects less than 1% of the population of the United States (PDF). The ingestion of gluten, a protein found in grains like wheat, rye, and barley, gives rise to antibodies that attack the small intestine. At first, the symptoms are annoying: stomachaches, gas, and diarrhea. Over time, they can grow to be debilitating. The autoimmune assault corrodes the small intestine's ability to absorb nutrients, which can prompt anemia, chronic fatigue, and weight loss. There is one treatment for celiac's: strict, lifelong adherence to a diet that's devoid of gluten."
anonymous

Non-Celiac Gluten Sensitivity May Not Exist - 0 views

  • like any meticulous scientist, Gibson wasn't satisfied with his first study.
  • His research turned up no clues to what actually might be causing subjects' adverse reactions to gluten.
  • Moreover, there were many more variables to control! What if some hidden confounder was mucking up the results? He resolved to repeat the trial with a level of rigor lacking in most nutritional research.
  • ...3 more annotations...
  • 37 subjects took part, all with self-reported gluten sensitivity who were confirmed to not have celiac's disease. They were first fed a diet low in FODMAPs for two weeks, then were given one of three diets for a week with either 16 grams per day of added gluten (high-gluten), 2 grams of gluten and 14 grams of whey protein isolate (low-gluten), or 16 grams of whey protein isolate (placebo). Each subject shuffled through every single diet so that they could serve as their own controls, and none ever knew what specific diet he or she was eating.
  • After the main experiment, a second was conducted to ensure that the whey protein placebo was suitable. In this one, 22 of the original subjects shuffled through three different diets -- 16 grams of added gluten, 16 grams of added whey protein isolate, or the baseline diet -- for three days each.
  • Analyzing the data, Gibson found that each treatment diet, whether it included gluten or not, prompted subjects to report a worsening of gastrointestinal symptoms to similar degrees.
  •  
    "In 2011, Peter Gibson, a professor of gastroenterology at Monash University and director of the GI Unit at The Alfred Hospital in Melbourne, Australia, published a study that found gluten, a protein found in grains like wheat, rye, and barley, to cause gastrointestinal distress in patients without celiac disease, an autoimmune disorder unequivocally triggered by gluten. Double-blinded, randomized, and placebo-controlled, the experiment was one of the strongest pieces of evidence to date that non-celiac gluten sensitivity (NCGS), more commonly known as gluten intolerance, is a genuine condition."
anonymous

Coordinated Punishment Leads to Increased Cooperation in Large Groups - 0 views

  • Humans are incredibly cooperative, but why do people cooperate and how is cooperation maintained? A new research study by UCLA anthropology professor Robert Boyd and his colleagues from the Santa Fe Institute in New Mexico suggests cooperation in large groups is maintained by punishment.
  • Group members cooperate because they do not want to hurt their friends by not participating in group efforts, and also because they may want help in the future.
  • in a larger group, like a tribe, those mechanisms for maintaining cooperation are lost. All group members experience the benefits of the large group, even those members who stop cooperating and become "free-riders."
  • ...6 more annotations...
  • Boyd and his colleagues suggest cooperation is maintained by punishment, which reduces the benefits to free riding.
  • To address the problem, Boyd and his colleagues changed the assumptions built into previous cooperation/punishment models.
  • First, they allowed for punishment to be coordinated among group members.
  • Second, the researchers allowed for the cost of punishing a free-rider to decline as the number of punishers increased.
  • Their model had three stages in which a large group of unrelated individuals interacted repeatedly.
  • The first stage was a signaling stage where group members could signal their intent to punish. In the second stage, group members could choose to cooperate or not. The final stage was a punishment stage when group members could punish other group members. [A toy simulation of these three stages follows this entry.]
  •  
    By PhysOrg on May 1, 2010. Found on my uncle's Facebook page.
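The three-stage structure described above lends itself to a toy simulation, sketched below. The group size, payoffs, signaling probability, threshold rule, and cost-sharing formula are all invented assumptions for illustration, not parameters from Boyd and colleagues' model:

      import random

      # Toy version of the three stages: signal intent to punish,
      # choose whether to cooperate, then punish free-riders.
      N = 50                # group size (invented)
      BENEFIT = 0.5         # per-member benefit from each cooperator
      COOP_COST = 1.0       # private cost of cooperating
      FINE = 3.0            # fine paid by a punished free-rider
      PUNISH_COST = 2.0     # total cost of punishing one free-rider,
                            # split among punishers, so each punisher's
                            # share declines as their numbers grow

      def play_round(signal_prob=0.3, threshold=5):
          # Stage 1: members signal intent to punish.
          signals = [random.random() < signal_prob for _ in range(N)]
          n_punishers = sum(signals)
          # Stage 2: members cooperate when punishment looks credible.
          cooperates = [signals[i] or n_punishers >= threshold for i in range(N)]
          n_free_riders = N - sum(cooperates)
          public_good = BENEFIT * sum(cooperates)  # enjoyed by everyone
          # Stage 3: punishers fine free-riders and split the cost.
          share = PUNISH_COST / n_punishers if n_punishers else 0.0
          payoffs = []
          for i in range(N):
              p = public_good
              p -= COOP_COST if cooperates[i] else (FINE if n_punishers else 0.0)
              if signals[i]:
                  p -= share * n_free_riders
              payoffs.append(p)
          return sum(payoffs) / N

      print(play_round())  # mean payoff for one round of the toy model

The coordination assumptions do the work here: free-riding only pays when too few members signal, and each additional punisher makes punishment cheaper for the rest, which is why cooperation can hold together even in a large group.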
anonymous

Europeans Bury 'Digital DNA' Inside Mountain - 0 views

  • In a secret bunker known as the Swiss Fort Knox deep in the Swiss Alps, European researchers recently deposited a “digital genome” that will provide the blueprint for future generations to read data stored using defunct technology.
  • The capsule is the culmination of the four-year “Planets” project, a 15 million-euro ($18.49 million) effort which draws on the expertise of 16 European libraries, archives and research institutions to preserve the world’s digital assets as hardware and software become obsolete.
  • “Unlike hieroglyphics carved in stone or ink on parchment, digital data has a shelf life of years not millennia,” said Andreas Rauber, a professor at the University of Technology of Vienna, which is a partner in the project.
  • ...1 more annotation...
  • People will be puzzled at what they find when they open the time capsule, said Rauber. “In 25 years people will be astonished to see how little time must pass to render data carriers unusable because they break or because you don’t have the devices anymore,” he said. “The second shock will probably be what fraction of the objects we can’t use or access in 25 years and that’s hard to predict.”
  •  
    At Sputnik Laboratory on June 15, 2010
anonymous

Artificial meat? Food for thought by 2050 | Environment | The Guardian - 0 views

  • Artificial meat grown in vats may be needed if the 9 billion people expected to be alive in 2050 are to be adequately fed without destroying the earth, some of the world's leading scientists report today.
  • A team of scientists at Rothamsted, the UK's largest agricultural research centre, suggests that extra carbon dioxide in the air from global warming, along with better fertilisers and chemicals to protect arable crops, could hugely increase yields and reduce water consumption.
  • Instead, says Dr Philip Thornton, a scientist with the International Livestock Research Institute in Nairobi, two "wild cards" could transform global meat and milk production. "One is artificial meat, which is made in a giant vat, and the other is nanotechnology, which is expected to become more important as a vehicle for delivering medication to livestock."
  • ...1 more annotation...
  • seven multinational corporations, led by Monsanto, now dominate the global technology field. “These companies are accumulating intellectual property to an extent that the public and international institutions are disadvantaged. This represents a threat to the global commons in agricultural technology on which the green revolution has depended,” says the paper by Professor Jenifer Piesse at King’s College, London.
  •  
    "Leading scientists say meat grown in vats may be necessary to feed 9 billion people expected to be alive by middle of century" By John Vidal at The Guardian on August 16, 2010.
anonymous

Psychedelic Drugs Show Promise as Anti-Depressants - 0 views

  • Ketamine—a powerful anesthetic for humans and animals that lists hallucinations among its side effects and therefore is often abused under the name Special K—delivers rapid relief to chronically depressed patients, and researchers may now have discovered why. In fact, the latest evidence reinforces the idea that the psychedelic drug could be the first new drug in decades to lift the fog of depression.
  • More specifically, as the researchers report in the August 20 issue of Science, ketamine seems to stimulate a biochemical pathway in the brain (known as mTOR) to strengthen synapses in a rat's prefrontal cortex—the region of the brain associated with thinking and personality in humans.
  • In fact, ketamine has shown promise at reducing the risk of suicide and is currently being tested in humans for effectiveness in treating bipolar disorder and addiction.
  • ...1 more annotation...
  • Regardless, it is unlikely that ketamine, psilocybin or any of these psychedelics would be used directly, because of their hallucinogenic and other side effects. According to Duman, several pharmaceutical companies have already begun the search for alternative compounds that target the same biochemistry or brain function, including some that his lab is testing.
    • anonymous
       
      A commenter wryly points out that it is probably the hallucinatory effects that are the *reason* for the decreased depression.
  •  
    "Scientists suggest that some psychedelics are remarkably good at treating disorders like depression-and may now have a clue as to why." By David Biello at Scientific American on August 19, 2010.