
Long Game: group items tagged "knowledge"


anonymous

Why Americans Are the Weirdest People in the World - 0 views

  • For instance, the different ways people perceive the Müller-Lyer illusion likely reflect lifetimes spent in different physical environments. American children, for the most part, grow up in box-shaped rooms of varying dimensions. Surrounded by carpentered corners, their visual perception adapts to this strange new environment (strange and new in terms of human history, that is) by learning to perceive converging lines in three dimensions.
  • As the three continued their work, they noticed something else that was remarkable: again and again one group of people appeared to be particularly unusual when compared to other populations—with perceptions, behaviors, and motivations that were almost always sliding down one end of the human bell curve.
  • In the end they titled their paper “The Weirdest People in the World?” By “weird” they meant both unusual and Western, Educated, Industrialized, Rich, and Democratic. It is not just our Western habits and cultural preferences that are different from the rest of the world, it appears. The very way we think about ourselves and others—and even the way we perceive reality—makes us distinct from other humans on the planet, not to mention from the vast majority of our ancestors. Among Westerners, the data showed that Americans were often the most unusual, leading the researchers to conclude that “American participants are exceptional even within the unusual population of Westerners—outliers among outliers.”
  • The trio of researchers are young—as professors go—good-humored family men. They recalled that they were nervous as the publication time approached. The paper basically suggested that much of what social scientists thought they knew about fundamental aspects of human cognition was likely only true of one small slice of humanity. They were making such a broadside challenge to whole libraries of research that they steeled themselves to the possibility of becoming outcasts in their own fields.
  • “We were scared,” admitted Henrich. “We were warned that a lot of people were going to be upset.” “We were told we were going to get spit on,” interjected Norenzayan. “Yes,” Henrich said. “That we’d go to conferences and no one was going to sit next to us at lunchtime.”
  • Still, I had to wonder whether describing the Western mind, and the American mind in particular, as weird suggested that our cognition is not just different but somehow malformed or twisted. In their paper the trio pointed out cross-cultural studies that suggest that the “weird” Western mind is the most self-aggrandizing and egotistical on the planet: we are more likely to promote ourselves as individuals versus advancing as a group. WEIRD minds are also more analytic, possessing the tendency to telescope in on an object of interest rather than understanding that object in the context of what is around it.
  • The WEIRD mind also appears to be unique in terms of how it comes to understand and interact with the natural world. Studies show that Western urban children grow up so closed off in man-made environments that their brains never form a deep or complex connection to the natural world.
  • Children who grow up constantly interacting with the natural world are much less likely to anthropomorphize other living things into late childhood.
    • anonymous
       
      I did a shit ton of this. I was very internal, didn't have many friends, and came to identify with 'things' as though they were people.
  • Given that people living in WEIRD societies don’t routinely encounter or interact with animals other than humans or pets, it’s not surprising that they end up with a rather cartoonish understanding of the natural world. “Indeed,” the report concluded, “studying the cognitive development of folkbiology in urban children would seem the equivalent of studying ‘normal’ physical growth in malnourished children.”
  • The three insisted that their goal was not to say that one culturally shaped psychology was better or worse than another—only that we’ll never truly understand human behavior and cognition until we expand the sample pool beyond its current small slice of humanity.
  • Despite these assurances, however, I found it hard not to read a message between the lines of their research. When they write, for example, that weird children develop their understanding of the natural world in a “culturally and experientially impoverished environment” and that they are in this way the equivalent of “malnourished children,” it’s difficult to see this as a good thing.
  • THE TURN THAT HENRICH, Heine, and Norenzayan are asking social scientists to make is not an easy one: accounting for the influence of culture on cognition will be a herculean task. Cultures are not monolithic; they can be endlessly parsed. Ethnic backgrounds, religious beliefs, economic status, parenting styles, rural upbringing versus urban or suburban—there are hundreds of cultural differences that individually and in endless combinations influence our conceptions of fairness, how we categorize things, our method of judging and decision making, and our deeply held beliefs about the nature of the self, among other aspects of our psychological makeup.
    • anonymous
       
      This is another place where my love of long-term thinking rears its head. So modern as we imagine ourselves, with all our fancy machines, we are still bare infants when it comes to reckoning about ourselves.
  • Recent research has shown that people in “tight” cultures, those with strong norms and low tolerance for deviant behavior (think India, Malaysia, and Pakistan), develop higher impulse control and more self-monitoring abilities than those from other places.
  • Men raised in the honor culture of the American South have been shown to experience much larger surges of testosterone after insults than do Northerners.
  • As Norenzayan sees it, the last few generations of psychologists have suffered from “physics envy,” and they need to get over it.
  • The job, experimental psychologists often assumed, was to push past the content of people’s thoughts and see the underlying universal hardware at work. “This is a deeply flawed way of studying human nature,” Norenzayan told me, “because the content of our thoughts and their process are intertwined.” In other words, if human cognition is shaped by cultural ideas and behavior, it can’t be studied without taking into account what those ideas and behaviors are and how they are different from place to place.
  • This new approach suggests the possibility of reverse-engineering psychological research: look at cultural content first; cognition and behavior second. Norenzayan’s recent work on religious belief is perhaps the best example of the intellectual landscape that is now open for study.
  • “I remember opening textbook after textbook and turning to the index and looking for the word ‘religion,’ ” he told me. “Again and again the very word wouldn’t be listed. This was shocking. How could psychology be the science of human behavior and have nothing to say about religion? Where I grew up you’d have to be in a coma not to notice the importance of religion on how people perceive themselves and the world around them.”
  • He has suggested that there may be a connection between the growth of religions that believe in “morally concerned deities”—that is, a god or gods who care if people are good or bad—and the evolution of large cities and nations.
  • If religion was necessary in the development of large-scale societies, can large-scale societies survive without religion? Norenzayan points to parts of Scandinavia with atheist majorities that seem to be doing just fine. They may have climbed the ladder of religion and effectively kicked it away. Or perhaps, after a thousand years of religious belief, the idea of an unseen entity always watching your behavior remains in our culturally shaped thinking even after the belief in God dissipates or disappears.
  • almost every major theorist on human behavior in the last 100 years predicted that it was just a matter of time before religion was a vestige of the past. But the world persists in being a very religious place.
  • HENRICH, HEINE, AND NORENZAYAN’S FEAR of being ostracized after the publication of the WEIRD paper turned out to be misplaced. Response to the paper, both published and otherwise, has been nearly universally positive, with more than a few of their colleagues suggesting that the work will spark fundamental changes. “I have no doubt that this paper is going to change the social sciences,” said Richard Nisbett, an eminent psychologist at the University of Michigan. “It just puts it all in one place and makes such a bold statement.”
  • At its heart, the challenge of the WEIRD paper is not simply to the field of experimental human research (do more cross-cultural studies!); it is a challenge to our Western conception of human nature. For some time now, the most widely accepted answer to the question of why humans, among all animals, have so successfully adapted to environments across the globe is that we have big brains with the ability to learn, improvise, and problem-solve.
  • Henrich has challenged this “cognitive niche” hypothesis with the “cultural niche” hypothesis. He notes that the amount of knowledge in any culture is far greater than the capacity of individuals to learn or figure it all out on their own.
  • He suggests that individuals tap that cultural storehouse of knowledge simply by mimicking (often unconsciously) the behavior and ways of thinking of those around them. We shape a tool in a certain manner, adhere to a food taboo, or think about fairness in a particular way, not because we individually have figured out that behavior’s adaptive value, but because we instinctively trust our culture to show us the way.
    • anonymous
       
      Goodness, though! I'm in TOTAL control of everything! :P
  • The unique trick of human psychology, these researchers suggest, might be this: our big brains are evolved to let local culture lead us in life’s dance.
  • People are not “plug and play,” as he puts it, and you cannot expect to drop a Western court system or form of government into another culture and expect it to work as it does back home.
  • Because of our peculiarly Western way of thinking of ourselves as independent of others, this idea of the culturally shaped mind doesn’t go down very easily.
  • That we in the West develop brains that are wired to see ourselves as separate from others may also be connected to differences in how we reason, Heine argues. Unlike the vast majority of the world, Westerners (and Americans in particular) tend to reason analytically as opposed to holistically.
  • That is, the American mind strives to figure out the world by taking it apart and examining its pieces.
  • Shown another way, in a different test, analytic Americans will do better on something called the “rod and frame” task, where one has to judge whether a line is vertical even though the frame around it is skewed. Americans see the line as apart from the frame, just as they see themselves as apart from the group.
  • Heine and others suggest that such differences may be the echoes of cultural activities and trends going back thousands of years. Whether you think of yourself as interdependent or independent may depend on whether your distant ancestors farmed rice (which required a great deal of shared labor and group cooperation) or herded animals (which rewarded individualism and aggression).
  • These psychological trends and tendencies may echo down generations, hundreds of years after the activity or situation that brought them into existence has disappeared or fundamentally changed.
  • And here is the rub: the culturally shaped analytic/individualistic mind-sets may partly explain why Western researchers have so dramatically failed to take into account the interplay between culture and cognition. In the end, the goal of boiling down human psychology to hardwiring is not surprising given the type of mind that has been designing the studies. Taking an object (in this case the human mind) out of its context is, after all, what distinguishes the analytic reasoning style prevalent in the West. Similarly, we may have underestimated the impact of culture because the very ideas of being subject to the will of larger historical currents and of unconsciously mimicking the cognition of those around us challenges our Western conception of the self as independent and self-determined. The historical missteps of Western researchers, in other words, have been the predictable consequences of the WEIRD mind doing the thinking.
  •  
    "The growing body of cross-cultural research that the three researchers were compiling suggested that the mind's capacity to mold itself to cultural and environmental settings was far greater than had been assumed. The most interesting thing about cultures may not be in the observable things they do (the rituals, eating preferences, codes of behavior, and the like) but in the way they mold our most fundamental conscious and unconscious thinking and perception."
anonymous

Geopolitical Journey: Europe, the Glorious and the Banal - 1 views

  • How prosaic business opportunities generate the most risky and grandiose undertakings has come to interest me. This school arose with the specific goal of training sailors to go farther and farther south along the African coast in search of a sea route to India.
  • The Portuguese sought this route to cut out the middleman in the spice trade. Spices were wealth in Europe; they preserved and seasoned food, and were considered medicinal and even aphrodisiacs. But they were fiendishly expensive
  • The more I learn about Henry, the more his program reminds me of NASA and of Tom Wolfe's classic, The Right Stuff, about America's space program. Like NASA, each mission built on the last, trying out new methods in an incremental fashion. Henry didn't try to shoot to the moon, as they say. He was no Columbus, risking everything for glory, but rather a methodical engineer, pushing the limits a little at a time and collecting data.
  • Europe wasn't kind to the world it discovered. But over time it did force each culture to become aware of all the others; after centuries, a Mongol student might learn about the Aztecs. Instead of a number of isolated worlds, each believing itself to be the center of the Earth, each new discovery fed the concept of a single world.
  • On this cape, early in the 15th century, well before Columbus sailed, Henry planned Europe's assault on the world. In the process, he laid the foundation of the modern world and modern Europe.
  • If Henry created his school solely for knowledge, then perhaps sending messages in a bottle and waiting for a reply would have done that. But Henry, the prince who became a monk, also acted for wealth, God's glory and to claim his place in history.
  • Today, we have entered a phase of history where the buccaneering spirit has left us. The desire for knowledge has separated itself from the hunger we have for wealth and glory. Glory is not big today, cool is. Cool does not challenge the gates of heaven, it accepts what is and conforms to it.
  • This is a passing phase, however. Humans will return to space to own it, discover unknown wealth and bring glory.
  • Out in West Texas and other desolate places, private companies -- privateers -- are reinventing the space program. They are searching for what Henry sought -- namely, wealth and glory.
  • Certainly, European imperialism brought misery to the world. But the world was making itself miserable before, and has since: One group of people has always been stealing land from other groups in a constant flow of history. What culture did not live on land stolen from another culture, either annihilated or absorbed? Ours has always been a brutal world. And the Europe Henry founded did not merely oppress and exploit, although it surely did those things. It also left as its legacy something extraordinary: a world that knew itself and all of its parts.
  • Only the dead leave legacies, and Europe is not dead. Yet something in it has died. The swagger and confidence of a great civilization is simply not there, at least not on the European peninsula.
  • Instead, there is caution and fear. You get the sense in Europe -- and here I think of conversations I had on previous trips in the last year or so -- of a fear that any decisive action will tear the place apart.
  • The European search for comfort and safety is not trivial, not after the horrors of the 20th century. The British and French have given up empires, Russia has given up communism, Germany and Italy have given up fascism and racism. The world is better off without these things. But what follows, what is left?
  • I am not talking here of the economic crisis that is gripping Europe, leaving Portugal with 17 percent unemployment and Spain with 26 percent. These are agonizing realities for those living through them. But Europeans have lived through more and worse.
  • Instead, I am speaking of a crisis in the European soul, the death of hubris and of risk-taking. Yes, these resulted in the Europeans trying to convert the world to Christianity and commerce, in Russia trying to create a new man and in Germany becoming willing to annihilate what it thought of as inferior men.
  • The Europeans are content to put all that behind them. Their great search for the holy grail is now reduced to finding a way to resume the comforts of the unexceptional. There is something to be said for the unexceptional life. But it cannot be all there is. 
  • We humans are caught between the hunger for glory and the price you pay and the crimes you commit in pursuing it. To me, the tension between the hunger for ordinary comforts and the need for transcendence seems to lie at the heart of the human condition. Europe has chosen comfort, and now has lost it. It sought transcendence and tore itself apart. The latter might have been Henry's legacy, but ah, to have gone to his school with da Gama and Magellan.
  •  
    "We flew into Lisbon and immediately rented a car to drive to the edge of the Earth and the beginning of the world. This edge has a name: Cabo de Sao Vicente. A small cape jutting into the Atlantic Ocean, it is the bitter end of Europe. Beyond this point, the world was once unknown to Europeans, becoming a realm inhabited by legends of sea monsters and fantastic civilizations. Cabo de Sao Vicente still makes you feel these fantasies are more than realistic. Even on a bright sunny day, the sea is forbidding and the wind howls at you, while on a gloomy day you peer into the abyss. Just 3 miles east of Cabo de Sao Vicente at the bas"
anonymous

Abstract Science - 5 views

  •  
    "Scientific abstracts are the hooks attempting to capture a discerning reader's attention, the shortcuts saving the busy reader some time and the keys unlocking scientific knowledge for those lacking a portfolio of academic journal subscriptions. But don't be dismayed if you're still confused after reading an abstract multiple times. When writing this leading, summarizing paragraph of a scientific manuscript, researchers often make mistakes. Some authors include too much information about the experimental methods while forgetting to announce what they actually discovered. Others forget to include any methodology at all. Sometimes the scientists fail to divulge why they even conducted the study in the first place, yet feel comfortable boldly speculating with a loose-fitting claim of general importance. Nevertheless, the abstract serves a critical purpose and every science enthusiast needs to become comfortable with reading them."
  •  
    Took a class (well, more than one) with the UChicago professional writing program (http://writing-program.uchicago.edu/courses/index.htm). There was a lot of hammering this home to the writers of those abstracts, too. We've got all these forms, and it's not always clear to the reader or writer what's expected of those forms. This does not lead to effective communication.
  •  
    Too true. Sadly, it's a lesson that's still lost on some pretty senior P.I.'s, who think that 'lay summary' means simply spelling out all their acronyms.
  •  
    Honestly, this can be really hard and time-intensive work for some people. Some people understand what they need to do, but end up taking (usually very jargon-filled) shortcuts. I understand that, but I also know that it gets faster and easier with practice.
  •  
    Or hire an editor.
  •  
    It would be interesting to see how much purchase a suggestion like that receives. I suspect more than a few PI's would find the notion insulting because they've been doing it for years, and some of these really technical publications have been tolerating it for so long. For my part as an Admin, I would review the lay summary and give my impressions, which would then get (mostly) completely ignored. :)
  •  
    A _lot_ of people don't think they need professional writing and editing help. After all, they learned to write years ago.
anonymous

A Bet is a Tax on Bullshit - 0 views

  •  
    "Overall, I am for betting because I am against bullshit. Bullshit is polluting our discourse and drowning the facts. A bet costs the bullshitter more than the non-bullshitter so the willingness to bet signals honest belief. A bet is a tax on bullshit; and it is a just tax, tribute paid by the bullshitters to those with genuine knowledge."
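The quoted argument can be put in expected-value terms. A small sketch (illustrative toy model with made-up numbers, not from the excerpt): in an even-odds bet, someone whose honest probability of being right is high expects to gain, while a bullshitter whose private confidence is low expects to lose — and that expected loss is the "tax."

```python
# Illustrative sketch (numbers are made up): the expected payoff of an
# even-odds bet of `stake` for someone whose honest probability of being
# right is p. A loud claim backed by weak private belief has low p, so
# the willingness to bet costs the bullshitter in expectation.

def expected_payoff(p: float, stake: float = 100.0) -> float:
    """Expected gain from an even-odds bet: win `stake` with probability p, lose it otherwise."""
    return p * stake - (1.0 - p) * stake

honest = expected_payoff(0.8)    # genuine knowledge: privately believes p = 0.8
bullshit = expected_payoff(0.3)  # confident talk, weak belief: privately p = 0.3
print(f"honest expert:  {honest:+.0f}")
print(f"bullshitter:    {bullshit:+.0f}")
```

The sketch shows why offering to bet is a credible signal: it is costless in expectation only to those whose stated belief matches their private one.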
anonymous

In Praise of Idleness By Bertrand Russell - 0 views

shared by anonymous on 08 Oct 13
  • Work is of two kinds: first, altering the position of matter at or near the earth's surface relatively to other such matter; second, telling other people to do so.
  • Usually two opposite kinds of advice are given simultaneously by two organized bodies of men; this is called politics. The skill required for this kind of work is not knowledge of the subjects as to which advice is given, but knowledge of the art of persuasive speaking and writing, i.e. of advertising.
  • A system which lasted so long and ended so recently has naturally left a profound impress upon men's thoughts and opinions. Much that we take for granted about the desirability of work is derived from this system, and, being pre-industrial, is not adapted to the modern world.
  • The morality of work is the morality of slaves, and the modern world has no need of slavery.
  • Leisure is essential to civilization, and in former times leisure for the few was only rendered possible by the labors of the many.
  • But their labors were valuable, not because work is good, but because leisure is good.
  • And with modern technique it would be possible to distribute leisure justly without injury to civilization.
  • Why? Because work is a duty, and a man should not receive wages in proportion to what he has produced, but in proportion to his virtue as exemplified by his industry.
  • This is the morality of the Slave State, applied in circumstances totally unlike those in which it arose.
  • The idea that the poor should have leisure has always been shocking to the rich. In England, in the early nineteenth century, fifteen hours was the ordinary day's work for a man; children sometimes did as much, and very commonly did twelve hours a day.
  • When meddlesome busybodies suggested that perhaps these hours were rather long, they were told that work kept adults from drink and children from mischief. When I was a child, shortly after urban working men had acquired the vote, certain public holidays were established by law, to the great indignation of the upper classes. I remember hearing an old Duchess say: 'What do the poor want with holidays? They ought to work.' People nowadays are less frank, but the sentiment persists, and is the source of much of our economic confusion.
  • We have no attempt at economic justice, so that a large proportion of the total produce goes to a small minority of the population, many of whom do no work at all.
  • By a combination of all these devices we manage, though with difficulty, to keep alive the notion that a great deal of severe manual work must be the lot of the average man.
  •  
    "I want to say, in all seriousness, that a great deal of harm is being done in the modern world by belief in the virtuousness of work, and that the road to happiness and prosperity lies in an organized diminution of work."
anonymous

Problems with scientific research: How science goes wrong - 0 views

  • Too many of the findings that fill the academic ether are the result of shoddy experiments or poor analysis.
  • A rule of thumb among biotechnology venture-capitalists is that half of published research cannot be replicated. Even that may be optimistic.
  • Even when flawed research does not put people’s lives at risk—and much of it is too far from the market to do so—it squanders money and the efforts of some of the world’s best minds.
  • In the 1950s, when modern academic research took shape after its successes in the second world war, it was still a rarefied pastime.
  • Nowadays verification (the replication of other people’s results) does little to advance a researcher’s career. And without verification, dubious findings live on to mislead.
  • In order to safeguard their exclusivity, the leading journals impose high rejection rates: in excess of 90% of submitted manuscripts. The most striking findings have the greatest chance of making it onto the page.
  • And as more research teams around the world work on a problem, the odds shorten that at least one will fall prey to an honest confusion between the sweet signal of a genuine discovery and a freak of the statistical noise.
  • “Negative results” now account for only 14% of published papers, down from 30% in 1990.
  • The failure to report failures means that researchers waste money and effort exploring blind alleys already investigated by other scientists.
  • When a prominent medical journal ran research past other experts in the field, it found that most of the reviewers failed to spot mistakes it had deliberately inserted into papers, even after being told they were being tested.
  • What might be done to shore it up?
  • One priority should be for all disciplines to follow the example of those that have done most to tighten standards. A start would be getting to grips with statistics, especially in the growing number of fields that sift through untold oodles of data looking for patterns.
  • Geneticists have done this, and turned an early torrent of specious results from genome sequencing into a trickle of truly significant ones.
  • Ideally, research protocols should be registered in advance and monitored in virtual notebooks. This would curb the temptation to fiddle with the experiment’s design midstream so as to make the results look more substantial than they are.
  • (It is already meant to happen in clinical trials of drugs, but compliance is patchy.) Where possible, trial data also should be open for other researchers to inspect and test.
  • Some government funding agencies, including America’s National Institutes of Health, which dish out $30 billion on research each year, are working out how best to encourage replication.
  • Journals should allocate space for “uninteresting” work, and grant-givers should set aside money to pay for it.
  • Peer review should be tightened—or perhaps dispensed with altogether, in favour of post-publication evaluation in the form of appended comments. That system has worked well in recent years in physics and mathematics. Lastly, policymakers should ensure that institutions using public money also respect the rules.
  • Science still commands enormous—if sometimes bemused—respect. But its privileged status is founded on the capacity to be right most of the time and to correct its mistakes when it gets things wrong.
  •  
    "A SIMPLE idea underpins science: "trust, but verify". Results should always be subject to challenge from experiment. That simple but powerful idea has generated a vast body of knowledge. Since its birth in the 17th century, modern science has changed the world beyond recognition, and overwhelmingly for the better."
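The point above about many teams and statistical noise can be made concrete with a short sketch (illustrative arithmetic, not from the article): if k independent teams each test a hypothesis that is actually false, using the conventional p < 0.05 threshold, the probability that at least one reports a spurious "discovery" is 1 - 0.95^k.

```python
# Illustrative sketch (not from the article): with k independent teams each
# testing a true-null hypothesis at significance level alpha, the chance
# that at least one crosses the threshold by luck alone is 1 - (1 - alpha)^k.

def prob_any_false_positive(k: int, alpha: float = 0.05) -> float:
    """Probability that at least one of k independent null tests 'succeeds'."""
    return 1.0 - (1.0 - alpha) ** k

for k in (1, 5, 20, 50):
    share = prob_any_false_positive(k)
    print(f"{k:3d} independent teams -> {share:.0%} chance of a spurious finding")
```

With just 20 teams chasing the same question, a spurious positive becomes more likely than not — which is the sense in which "the odds shorten" as more groups pile onto a problem, and why journals that favor the most striking findings tend to amplify noise.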
anonymous

Russian Spies and Strategic Intelligence - 0 views

  • The way the media has reported on the issue falls into three groups: that the Cold War is back; that, given that the Cold War is over, the point of such outmoded intelligence operations is questionable; and that the Russian spy ring was spending its time aimlessly nosing around in think tanks and open meetings in an archaic and incompetent effort.
  • First, it needs to know what other nations are capable of doing.
  • Second, the nation needs to know what other nations intend to do.
  • The more powerful a nation is, the more important it is to understand what it is doing.
  • Knowing what the United States will do, and shifting policy based on that, can save countries from difficulties and even disaster.
  • What they excelled at, however, was placing undetectable operatives in key positions. Soviet talent scouts would range around left-wing meetings to discover potential recruits. These would be young people with impeccable backgrounds and only limited contact with the left. They would be recruited based on ideology, and less often via money, sex or blackmail. They would never again be in contact with communists or fellow travelers.
  • Recruiting people who were not yet agents, creating psychological and material bonds over long years of management and allowing them to mature into senior intelligence or ministry officials allowed ample time for testing loyalty and positioning. The Soviets not only got more reliable information this way but also the ability to influence the other country’s decision-making.
  • There were four phases: identifying likely candidates; evaluating and recruiting them; placing them and managing their rise in the organization; and exploiting them.
  • It is difficult to know what the Russian team was up to in the United States from news reports, but there are two things we know about the Russians: They are not stupid, and they are extremely patient.
  • If we were to guess — and we are guessing — this was a team of talent scouts.
  • One of the Russian operatives, Don Heathfield, once approached a STRATFOR employee in a series of five meetings.
  • We would guess that Anna Chapman was brought in as part of the recruitment phase of talent scouting.
  • Each of the phases of the operatives’ tasks required a tremendous amount of time, patience and, above all, cover. The operatives had to blend in (in this case, they didn’t do so well enough).
  • Were the Americans to try the same thing, they would have to convince people to spend years learning Russian to near-native perfection and then to spend 20-30 years of their lives in Russia. Some would be willing to do so, but not nearly as many as there are Russians prepared to spend that amount of time in the United States or Western Europe.
  • The United States has substituted technical intelligence for this process. Thus, the most important U.S. intelligence-collection agency is not the CIA; it is the National Security Agency (NSA).
  • In many ways, this provides better and faster intelligence than the placement of agents, except that this does not provide influence.
  • it assumes that what senior (and other) individuals say, write or even think reveals the most important things about the country in question.
  • The fall of the Shah of Iran and the collapse of the Soviet empire were events of towering importance for the United States.
  • Neither of those scenarios would have made any difference to how events played out. This is because, in the end, the respective senior leadership didn’t know how events were going to play out. Partly this is because they were in denial, but mostly this is because they didn’t have the facts and they didn’t interpret the facts they did have properly. At these critical turning points in history, the most thorough penetration using either American or Russian techniques would have failed to provide warning of the change ahead.
  • The people being spied on and penetrated simply didn’t understand their own capabilities — i.e., the reality on the ground in their respective countries — and therefore their intentions about what to do were irrelevant and actually misleading.
  • if we regard anticipating systemic changes as one of the most important categories of intelligence, then these are cases where the targets of intelligence may well know the least and know it last.
  • We started with three classes of intelligence: capabilities, intentions and what will actually happen.
  • The first is an objective measure that can sometimes be seen directly but more frequently is obtained through data held by someone in the target country. The most important issue is not what this data says but how accurate it is.
  • For example, George W. Bush did not intend to get bogged down in a guerrilla war in Iraq. What he intended and what happened were two different things because his view of American and Iraqi capabilities was not tied to reality.
  • But in the end, the most important question to ask is whether the most highly placed source has any clue as to what is going to happen.
  • Knowledge of what is being thought is essential. But gaming out how the objective and impersonal forces will interact and play out is the most important thing of all.
  • The events of the past few weeks show intelligence doing the necessary work of recruiting and rescuing agents. The measure of all of this activity is not whether one has penetrated the other side, but in the end, whether your intelligence organization knew what was going to happen and told you regardless of what well-placed sources believed. Sometimes sources are indispensable. Sometimes they are misleading. And sometimes they are the way an intelligence organization justifies being wrong.
    • anonymous
       
      This feels like that old saying, amateurs study tactics but experts study logistics. Perhaps that's the angle on this spying stuff that we haven't taken because we subconsciously imagine the crap of popular culture where knowledge should be.
    • anonymous
       
      It certainly makes my thoughts here (http://longgame.org/2010/07/spies-like-them/) feel pretty damned quaint.
  • There appeared to be no goal of recruitment; rather, the Russian operative tried to get the STRATFOR employee to try out software he said his company had developed. We suspect that had this been done, our servers would be outputting to Moscow. We did not know at the time who he was.
  •  
    Some amount of spying is the cost of doing business for any power. By George Friedman at StratFor on July 13, 2010.
anonymous

The Roadmap to a High-Speed Recovery - 0 views

  • Let me say first that the bailouts and stimulus programs of the last two years were not a complete mistake. Economic policymakers don’t have the luxury of hindsight in the heat of a crisis; there is tremendous pressure on them to do something. It would have been suicidal not to give the banks the capital infusions they needed when the whole financial system was on the brink of meltdown or to refuse to help states avoid laying off thousands of teachers and police and other workers.
  • this is no bump in the business cycle that we are going through; it is an epochal event, comparable in magnitude and scope to the Great Depression of the 1930s, and even more so, as historian Scott Reynolds Nelson has observed, to the decades-long crisis that began in 1873. Back then our economy was undergoing a fundamental shift from agriculture to industry. We are in the midst of an equally tectonic transition today, as our industrial economy gives way to a post-industrial knowledge economy—but by focusing all our attention on whether we need a bigger stimulus or a smaller deficit, we’re flying blind.
  • More R&D labs opened in the first four years of the Great Depression than in the entire preceding decade, 73 compared to 66. By 1940, the number of people employed in R&D had quadrupled, increasing from fewer than 7,000 in 1929 to nearly 28,000 by 1940.
  • ...7 more annotations...
  • Between 1980 and 2006, the U.S. economy added some 20 million new jobs in its creative, professional, and knowledge sectors. Even today, unemployment in this sector of the economy has remained relatively low, and according to Bureau of Labor Statistics projections, is likely to add another seven million jobs in the next decade. By contrast, the manufacturing sector added only one million jobs from 1980 to 2006, and, according to the BLS, will lose 1.2 million by 2020.
  • Our whole education system needs a drastic overhaul to make its teaching styles less rote and more dynamic, to encourage more hands-on, interactive creativity.
  • Home ownership provided a powerful form of geographic Keynesianism. But that system has reached the end of its useful life. It has led to overinvestment in housing, autos, and energy and contributed to the crises we are trying so hard to extricate ourselves from today. It’s also no longer an engine of economic growth. With the rise of a globalized economy, many if not most of the products that filled those suburban homes are made abroad. Home ownership worked well for a nation whose workers had secure, long-term jobs. But now it impedes the flexibility of a labor market that requires people to move around.
  • Federal policy needs to encourage less home ownership and a greater density of development
  • Concentration and clustering are the underlying motor forces of real economic development. As Jane Jacobs identified and the Nobel Prize-winning economist Robert Lucas later formalized, clustering speeds the transmission of new ideas, increases the underlying productivity of people and firms, and generates the diversity required for new ideas to fertilize and turn into new innovations and new industries.
  • the key to understanding America’s historic ability to respond to great economic crises lies in what economic geographers call the “spatial fix”—the creation of new development patterns, new ways of living and working, and new economic landscapes that simultaneously expand space and intensify our use of it.
  • That means high-speed rail, which is the only infrastructure fix that promises to speed the velocity of moving people, goods, and ideas while also expanding and intensifying our development patterns. If the government is truly looking for a shovel-ready infrastructure project to invest in that will create short-term jobs across the country while laying a foundation for lasting prosperity, high-speed rail works perfectly. It is central to the redevelopment of cities and the growth of mega-regions and will do more than anything to wean us from our dependency on cars. High-speed rail may be our best hope for revitalizing the once-great industrial cities of the Great Lakes. By connecting declining places to thriving ones—Milwaukee and Detroit to Chicago, Buffalo to Toronto—it will greatly expand the economic options and opportunities available to their residents. And by providing the connective fibers within and between America’s emerging mega-regions, it will allow them to function as truly integrated economic units.
anonymous

Achieving Techno-Literacy - 0 views

  • • Every new technology will bite back. The more powerful its gifts, the more powerfully it can be abused. Look for its costs.
    • Technologies improve so fast you should postpone getting anything you need until the last second. Get comfortable with the fact that anything you buy is already obsolete.
    • Before you can master a device, program or invention, it will be superseded; you will always be a beginner. Get good at it.
    • Be suspicious of any technology that requires walls. If you can fix it, modify it or hack it yourself, that is a good sign.
    • The proper response to a stupid technology is to make a better one, just as the proper response to a stupid idea is not to outlaw it but to replace it with a better idea.
    • Every technology is biased by its embedded defaults: what does it assume?
    • Nobody has any idea of what a new invention will really be good for. The crucial question is, what happens when everyone has one?
    • The older the technology, the more likely it will continue to be useful.
    • Find the minimum amount of technology that will maximize your options.
  •  
    "Technology will change faster than we can teach it. My son studied the popular programming language C++ in his home-school year; that knowledge could be economically useless soon. The accelerating pace of technology means his eventual adult career does not exist yet. Of course it won't be taught in school. But technological smartness can be. Here is the kind of literacy that we tried to impart:" By Kevin Kelly at The New York Times on September 16, 2010.
anonymous

Bacteria 'R' Us - 0 views

  • Regardless of the scale at which we explore the biosphere — whether we delve into the global ocean or the internal seas of individual organisms — bacteria are now known to be larger players than humans ever imagined.
  • Strictly by the numbers, the vast majority — estimated by many scientists at 90 percent — of the cells in what you think of as your body are actually bacteria, not human cells.
  • The number of bacterial species in the human gut is estimated to be about 40,000, according to Daniel Frank and Norman Pace, writing in the January 2008 Current Opinion in Gastroenterology. The total number of individual bacterial cells in the gut is projected to be on the order of 100 trillion, according to Xing Yang and colleagues at the Shanghai Center for Bioinformation Technology, reporting in the June 2009 issue of PLoS One, a peer-reviewed online science journal. Xing calculated a ballpark figure for the number of unique bacterial genes in a human gut at about 9 million.
  • ...22 more annotations...
  • These facts by themselves may trigger existential shock: People are partly made of pond scum.
  • For the purposes of this article, we’ll focus on the fundamental difference between two major types of life-forms: those that have a cell wall but few or no internal subdivisions, and those that possess cells containing a nucleus, mitochondria, chloroplasts and other smaller substructures, or organelles.
  • The former life-forms — often termed prokaryotes — include bacteria and the most ancient of Earth’s life-forms, the archaea.
  • The tree-of-life notion remains a reasonable fit for the eukaryotes, but emerging knowledge about bacteria suggests that the micro-biosphere is much more like a web, with information of all kinds, including genes, traveling in all directions simultaneously.
  • In principle, every bacterium can exchange genes with every other bacterium on the planet. A side effect of this reality: The notion of separate bacterial species is somewhat shaky, although the term is still in use for lack of a better alternative.
  • Even before quorum sensing was discovered in V. fischeri, scientists had noted many examples of coordinated action, such as “swarming,” in which a colony of bacteria moves as a unit across a surface, and the development of “fruiting bodies,” in which bacteria glom together to form inert spores as a means of surviving severe environmental conditions.
  • Bacteria can live solitary lives, of course, but they prefer to aggregate in biofilms, also known as “slime cities.” Biofilms usually form on a surface, whether it’s the inner lining of the intestines or inside water pipes or on your teeth. In these close-knit colonies, bacteria coordinate group production of a slimy translucent coating and fibers called “curli” and “pili” that attach the colony to something else. Biofilms can harbor multiple types of bacteria as well as fungi and protists (microscopic eukaryotes). A complex vascular system for transporting nutrients and chemical signals through a biofilm may also develop. As Tim Friend described in his book The Third Domain, explorers diving to the wreck of the Titanic found these features in “rusticles” — draped colonies of microbes — feeding on the iron in the Titanic‘s hull and skeleton, more than 2 miles under the surface.
  • The import of this distribution of microorganisms is unclear, but its existence reinforces the notion that humans should start thinking of themselves as ecosystems, rather than discrete individuals.
  • A microbe’s effects on the human body can depend on conditions. And if you approach the human body as an ecosystem, some researchers are finding, it may be possible to tune that system and prevent many diseases — from acute infections to chronic debilitating conditions — and even to foster mental health, through bacteria.
  • in practice, the medical notion of friendly microbes has yet to extend much past the idea that eating yogurt is good for you. For most doctors and medical microbiologists, microbes are enemies in a permanent war. Medicine certainly has good reason to view microbes as dangerous, since the germ theory of disease and the subsequent development of antibiotics are two of medical science’s greatest accomplishments.
  • When threatened, bacteria become defensive, often producing toxins that make the host even sicker. They also tend to speed up their acquisition of and purging of genes when under external selection pressure, of which antibiotics are an obvious and powerful example.
  • Gut bacteria play a role in obesity, which affects about a third of American adults.
  • Research in animals supports the idea that gut bacteria play a role in weight regulation.
  • bacteria produce some of the same types of neurotransmitters that regulate the function of the human brain.
  • it’s been known for a while that sick people get depressed and anxious. This seems so obvious as to be a no-brainer, but research suggests that some of the fear and fatigue associated with infections stems from immune responses affecting the brain.
  • As it turns out, however, very few bacteria can be grown in the relatively austere conditions of laboratories. In fact, only about 0.1 percent of all bacteria are currently culturable. Many bacteria don’t do well in monoculture, preferring to live in mixed communities of microorganisms. Those living in extreme temperatures and pressures require very specialized equipment to grow in a typical lab.
  • In fact, they wrote, the genes that enable these processes today “may have been distributed across a common global gene pool, before cellular differentiation and vertical genetic transmission evolved as we know it today.”
  • In other words, bacteria are supreme code monkeys that probably perfected the packages of genes and the regulation necessary to produce just about every form of life, trading genetic information among themselves long before there was anything resembling a eukaryotic cell, let alone the masters of the universe that humans believe humans to be.
  • Giovannoni stops short of claiming that bacteria are actually thinking. But the litany of bacterial talents does nibble at conventional assumptions about thinking: Bacteria can distinguish “self” from “other,” and between their relatives and strangers; they can sense how big a space they’re in; they can move as a unit; they can produce a wide variety of signaling compounds, including at least one human neurotransmitter; they can also engage in numerous mutually beneficial relationships with their host’s cells. Even more impressive, some bacteria, such as Myxococcus xanthus, practice predation in packs, swarming as a group over prey microbes such as E. coli and dissolving their cell walls.
  • These phenomena, Herbert Levine’s group argues, reveal a capacity for language long considered unique to humans.
  • That bacteria-centric argument is, of course, a hazy, metaphysical Gaian fantasy worthy of Avatar. In a more down-to-earth assessment, it is clear that bacteria are not what the general run of humans thought they were, and neither are humans.
  • The grand story of human exceptionalism — the idea that humans are separate from and superior to everything else in the biosphere — has taken a terminal blow from the new knowledge about bacteria. Whether humanity decides to sanctify them in some way or merely admire them and learn what they’re really doing, there’s no going back.
  •  
    "Emerging research shows that bacteria have powers to engineer the environment, to communicate and to affect human well-being. They may even think." By Valerie Brown at Miller-McCune on October 18, 2010.
anonymous

Objectivism & "Metaphysics," Part 17 - 0 views

  • Rand explicates her views on this issue as follows: The primacy of existence (of reality) is the axiom that existence exists, i.e., that the universe exists independent of consciousness (of any consciousness), that things are what they are, that they possess a specific nature, an identity. The epistemological corollary is the axiom that consciousness is the faculty of perceiving that which exists—and that man gains knowledge of reality by looking outward. The rejection of these axioms represents a reversal: the primacy of consciousness—the notion that the universe has no independent existence, that it is the product of a consciousness (either human or divine or both). The epistemological corollary is the notion that man gains knowledge of reality by looking inward (either at his own consciousness or at the revelations it receives from another, superior consciousness).
  • 1. To begin with, how meaningful is it to declare the “primacy of existence” against the “primacy of consciousness,” given that consciousness is itself a part of existence?
  • apologists for Rand will declare that the primacy of existence simply means that consciousness does not create reality, as various forms of idealism imply. But if so, why didn’t Rand just say that consciousness doesn’t create “existence” and be done with it?
  • ...10 more annotations...
  • 2. Rand’s primacy of existence construct is a rather confusing way of stating realism. However, Rand’s equation of the primacy of existence (i.e., realism) with the axiom existence exists leads to a palpable contradiction.
  • The term existence, when Rand first introduces it, is equated with the content of consciousness.
  • Both the elephant at the zoo perceived by onlookers and the pink elephant perceived by the drunken sot “exist” according to the logic of Rand’s statement.
  • The only way to escape this conclusion is through equivocation, i.e., by claiming that the drunk does not in fact “perceive” the pink elephant, but merely hallucinates it.
  • To sum up: it is illogical, even on Rand’s premises, to deduce or infer realism (i.e., the primacy of existence) from the axiom existence exists.
  • 3. Rand’s “primacy of existence” and its opposition to the “primacy of consciousness” implies materialism, which contradicts the predominant anti-materialistic tone of Objectivism.
  • It's also amusing to see how Rand unashamedly chose the viewpoint of the primacy of consciousness when she quoted approvingly the line "I will not die, it's the world that will end", a nice illustration of solipsism...
  • What's even more annoying is that when you try to engage an Objectivist in an intelligent discussion of the axioms and why they are not what they are hyped up to be, the usual result is having a bundle of epithets tossed in your direction related to your capacity for rationality.
  • Now what exactly is this “existence” that Rand declares as “primary”? Rand’s equation of the term existence with the term reality hardly improves matters, since Rand does not go on to define very clearly what she means by the term “reality."
  • The real problem here is not that Rand is an unwitting materialist, but that her metaphysical speculations are so loose that one can draw whatever conclusion one likes from them.
  •  
    By Greg Nyquist at Ayn Rand Contra Human Nature on October 25, 2010.
anonymous

Intellectual Sources of the Latest Objectischism 2 - 0 views

  • Rand never considered the implications of this principle in other venues, such as a voluntary organization like ARI.
  • The fact that such a conflict exists at all indicates that one (if not both) of the parties are "irrational."
  • Indeed, the fact that conflicts exist within orthodox Objectivism -- conflicts so intense and irresolvable that they can only be ended by one of the parties exiting the scene -- suggests something profoundly amiss.
  • ...23 more annotations...
  • I have suggested in previous posts on this blog that "reason" is a mythical faculty. None of its champions have ever provided empirical evidence demonstrating its reported efficacy. It's merely a term used by those seeking to justify contentions based on insufficient evidence.
  • Seeking justification for a theory in "reason" is merely an invitation for rationalization, which is the bane of rational inquiry.
  • Nothing could be more to the purpose along these lines than an empirical examination of how reason works to solve disputes within an organization run by leading Objectivists.
  • Differences of opinion can be settled by "reasoned" discussion.
  • Peikoff admits, for example, that "Ultimately, someone has to decide who is qualified to hold such positions [on the ARI board] and where the line is to be drawn."
  • Someone has to decide? Shouldn't "reason" decide? Since reality is objective and "reason" the only "valid" means of knowing reality, what need is there for an individual to decide these things at all?
  • Within the Objectivist ideology, the idea of context is used as a kind of conceptual escape hatch to explain, for instance, why a moral absolute may not apply in all instances (because moral absolutes are "contextual") or why an individual may be certain yet wrong (because certainty is "contextual").
  • Those with a wider context of knowledge will (presumably) achieve a higher level of "certainty." They will know more and will hence be in a better position to make rational decisions.
  • If differing contexts of knowledge cause rational men to arrive at different conclusions, then Rand's contention about "no conflicts of interest" among rational men must be dropped.
  • Different contexts lead to different assessments of interests, even among rational men; and differing assessment of interests will inevitably lead to conflicts.
  • Neither Rand nor any of her disciples have ever provided us with a detailed description of how to distinguish a rational interest from a non-rational interest. If we go by Objectivist writings, a "rational" interest is merely any interest that Rand and her disciples approve of, while a non-rational (or "irrational") interest is any interest they disapprove of.
  • Conflicts of interest are therefore a built-in feature of the human condition. To deny this is to live in a fairy-tale world.
  • Objectivists are not supposed to be concerned with status. It is a product of that horror of horrors, social metaphysics. It reeks of authoritarianism and the appeal to faith. Yet status can no more be exorcised from man's "emotional mechanism" than sex or hunger can.
  • The "formal" meaning is the literal, conscious meaning; it's the rationalized meaning, meant to persuade and deceive both the rationalizer and his audience. The "real" meaning accords with the unconscious motives that are prompting the whole business.
  • It's not enough to conceal one's motives; one must also believe in the "truth" of one's deception. In short, one must accept one's own lies and become, if you will, a sincere hypocrite.
  • Hence their inability to engage in reasoned discourse with those who disagree with them. Hence their inability to even understand, let alone refute, their critics. Hence their inability to use "reason" to resolve differences among themselves.
  • When people are forced to repress and conceal their true motives under a veneer of logic, rationalization becomes the order of the day.
  • Rand actually never bothers to explain, in a clear, detailed, empirically testable fashion, how one goes about using "reason." About as detailed as she gets is the following:
  • Rand's inclusion of concept-formation in her conception of reason is deeply problematical.
  • Concept-formation is an extremely complex process involving unconscious processes that cannot be directed by the conscious mind.
  • without an articulable, formalized technique, reason cannot be "followed."
  • Rand's "reason" is therefore mythical. No such technique exists or is possible. What is possible, instead, is rational and empirical criticism.
  • If Leonard Peikoff did not exist, Objectivists would be forced to invent him. Without a central authority, Objectivism would splinter into hundreds of fragments, each claiming to follow "reason" and crying anathema on all other fragments. The Objectivist movement, precisely because it follows "reason," which is entirely mythical faculty, must be authoritarian at its core. It cannot exist on any other basis.
  •  
    "According to Rand, the Objectivist Ethics "holds that the rational interests of men do not clash-that there is no conflict of interests among men who do not desire the unearned, who do not make sacrifices nor accept them, who deal with one another as traders, giving value for value." Now it seems likely that this principle was devised primarily (and perhaps solely) to convince herself and her followers that it is never in an individual's rational self-interest to violate the rights of another person. Rand never considered the implications of this principle in other venues, such as a voluntary organization like ARI." By Greg Nyquist at Ayn Rand Contra Human Nature on November 22, 2010.
anonymous

Bin Laden's Death and the Implications for Jihadism - 0 views

  • a deep-seated thirst for vengeance led the United States to invade Afghanistan in October 2001 and to declare a “global war on terrorism.”
  • In spite of the sense of justice and closure the killing of bin Laden brings, however, his death will likely have very little practical impact on the jihadist movement.
  • the phenomenon of jihadism is far wider than just the al Qaeda core leadership of bin Laden and his closest followers.
  • ...3 more annotations...
  • All of this has caused the al Qaeda core to become primarily an organization that produces propaganda and provides guidance and inspiration to the other jihadist elements rather than an organization focused on conducting operations.
  • As STRATFOR has analyzed the war between the jihadist movement and the rest of the world, we have come to view the battlefield as being divided into two distinct parts, the physical battlefield and the ideological battlefield.
  • While the al Qaeda core has been marginalized recently, it has practiced good operational security and has been able to protect its apex leadership for nearly 10 years from one of the most intense manhunts in human history.
  •  
    "U.S. President Barack Obama appeared in a hastily arranged televised address the night of May 1, 2011, to inform the world that U.S. counterterrorism forces had located and killed Osama bin Laden. The operation, which reportedly happened in the early hours of May 2 local time, targeted a compound in Abbottabad, a city located some 31 miles north of Islamabad, Pakistan's capital. The nighttime raid resulted in a brief firefight that left bin Laden and several others dead. A U.S. helicopter reportedly was damaged in the raid and later destroyed by U.S. forces. Obama reported that no U.S. personnel were lost in the operation. After a brief search of the compound, the U.S. forces left with bin Laden's body and presumably anything else that appeared to have intelligence value. From Obama's carefully scripted speech, it would appear that the U.S. conducted the operation unilaterally with no Pakistani assistance - or even knowledge."
anonymous

Klip.me - Google Reader to Kindle - 6 views

shared by anonymous on 13 Oct 12
Erik Hanson liked it
  •  
    "It allows you to send subscriptions from a specific folder of Google Reader to your Kindle in periodical format, including an article index"
  • ...3 more comments...
  •  
    Blowing my mind here. I'll have to check up on this.
  •  
    I'm going nuts trying to figure out how to make a Kindle Paperwhite happen. It gets so much right and it's focused - like a laser - on what I want: reading and writing and notetaking. I spent (wasted) most of this past weekend trying to get my netbook to become somewhat useful in this regard. What junk. I actually have deep regrets about a seven-hour hole of failure yesterday as I tried all these linux builds that are so desperate for the cloud that you can't make use of them offline. So that leaves me with... Windows: which is the whole thing I left because it was too slow on the netbook. But it's the best option given how much offline stuff I need to do. Now, the KINDLE, on the other hand... the thing is perfect. Sooo many ways to get data onto it now.
  •  
    A lot of the stuff you shared won't work for me, since I'm on the Kindle App mostly (and a Kindle 1). The app on iOS is good, but I don't have a point of comparison against more recent Kindles.
  •  
    Totally. The app version (I have it on Android) doesn't begin to cut it. Which is funny, to me, because you'd think the straight-up software version of something would be easier... but I'm clueless. In the communications department, we have a very strong need to have a "book" that contains all the relevant information about X right at hand. The scientists could benefit from this, as well, since they're printing 20 reams of paper's worth of stuff they barely read, probably weekly (and that's conservative). I just see the thing on the cusp of becoming a ubiquitous tool. I know that tablets are awesome, but I'm coming at this from a single (or few) purpose device. All that happens when we roll out tablets for people is they start playing Angry Birds.
  •  
    I vote for online knowledge sites with dynamic interfaces.
anonymous

Obama's Second Term - 1 views

  • The foreign policy story of U.S. President Barack Obama's first term could be told through three personalities: former Defense Secretary Robert Gates, Secretary of State Hillary Clinton and former Special Representative to Afghanistan and Pakistan Richard Holbrooke.
  • Because of Gates, Obama did not go "soft" as Democrats are supposedly liable to do. Guantanamo Bay prison remained open, there was no initial rush to the exits in Iraq, a robust campaign of assassinations against al Qaeda proceeded apace, and so forth.
  • In other words, rhetoric aside, Obama's first two years were not much different from George W. Bush's last two.
  • ...16 more annotations...
  • Holbrooke, though, may be the most significant member of the Obama story thus far because of his negative value: He was a larger-than-life personality who was crucially ignored.
  • By thwarting Holbrooke, White House advisers like Tom Donilon signaled that while practical and hard-edged, Obama was not a risk taker with a grand strategy like Richard Nixon or George H.W. Bush.
  • Judging by his new appointees, Obama's second term will be like his first, only more so. Pragmatism will reign supreme, even as there will be little appetite to take authentically risky initiatives, whether diplomatic, military or otherwise.
  • Some in the media have celebrated Secretary of State-designate John Kerry as bold. Nonsense. Boldness is not necessarily about diplomacy for diplomacy's sake, which is all Kerry seems to be about thus far. Rather, boldness is often about backing up diplomacy with the threat or use of some kind of force in creative combinations toward a larger strategy.
  • Hagel is essentially a moderate Republican who is now closer to Democrats (he is distinguished by the fact that -- unusual for Washington -- he actually speaks his mind).
  • the emphasis at the Pentagon will be on smart cost-cutting; withdrawing from a high-maintenance, low-payoff conflict in Afghanistan; and avoiding -- unless absolutely necessary -- a military strike against Iran.
  • people extremely hesitant to embark on any adventures.
  • Indeed, the East Coast knowledge elite essentially believes that foreign policy is a branch of Holocaust studies, in which a president is judged by his willingness to intervene on behalf of innocent civilians in times of conflict. While it is true that the memory of the Holocaust -- less than a lifetime removed -- must play a role in foreign policy, at the same time it cannot define it.
  • Foreign policy is primarily about the battle of space and power, in which order takes precedence over freedom, and interests take precedence over values.
    • anonymous
       
      I hate that this is right.
  • Such a realist mindset is rejected by the media and academia, even as it is quietly practiced throughout government and, especially, by successful foreign policy administrations. Obama's new appointees will practice realism, even as idealism will infuse their remarks at press conferences.
  • Yes, Obama intervened largely for humanitarian considerations in Libya. But it was a hesitant, unenthusiastic intervention in which no boots were on the ground beyond some Special Operations Forces, ensuring that the United States did not own the security situation of post-Gadhafi Libya.
  • Even if the new secretaries of state and defense are less cautious than they appear, they will steer away from anything that smells of a large-scale, boots-on-the-ground operation, unless it is within an international coalition enjoying near-global consensus.
  • Instead, Obama will want to beat his chest in the Pacific, not in the Middle East.
  • One of the unstated reasons why Obama is intent on continuing his emphasis on the Pacific into his second term is because it allows for a demonstration of American military power without the significant risk of war erupting.
  • foreign policy during his administration is in safe hands, no great initiatives or schemes have been -- or will be -- attempted, and any threats or challenges that arise will be addressed efficiently through procedural responses.
  • The media may turn out to be severely disappointed with Kerry and Hagel, and that might actually -- much of the time, at least -- turn out for the good.
  •  
    "Presidents define themselves by whom they appoint: At the very top of the Washington food chain, personalities matter much more than bureaucratic systems. This is particularly true in a second term, when the need to follow opinion polls is far less intense, allowing the president and his new appointees a freer hand."
anonymous

U.S.: What the Sequester Will Do to the Military - 0 views

  • The current continuing resolution that Congress is using to fund the entire government until March 27 has already affected U.S. forces.
  • Although Stratfor typically does not examine domestic U.S. issues, this one is geopolitically significant.
  • The U.S. military, and particularly the Navy, is the most powerful force projection instrument in the world. When the sequester takes effect, it will immediately reduce military spending by 8 percent, with more than $500 billion in cuts to defense spending over 10 years divided equally among the military branches.
  • ...12 more annotations...
  • It is not the overall amount of the reductions that is damaging, necessarily; it is the way in which the cuts will be implemented. The across-the-board cuts required by the sequestration coupled with the limits set by the continuing resolution are constraining budget planners' options in how to absorb the spending reductions and thus are damaging all the military branches, programs, training, deployments and procurement.
  • Just the threat of continued budget reductions has had an immediate effect on the military's readiness. The Navy decided not to deploy a second carrier to the Persian Gulf, backing down from its standard of two carriers in the region. Instead, the second carrier will serve in a surge capacity for the immediate future. The other branches have extended the deployments of units already in theaters and delayed others from rotating in as replacements since it is relatively less expensive to have units stay in place than move them and their equipment intercontinentally.
  • Maintenance budgets across the forces have been reduced or suspended in anticipation of cuts. Training of all non-deploying forces who are not critical to the national strategic forces is also being heavily curtailed.
  • These options were chosen because they are immediate cost-saving measures that can be reversed quickly as opposed to the big-budget procurement programs, in which changes can cause delays for years.
  • Any given military platform, from a Stryker armored vehicle to an aircraft carrier, requires a lot of money in order to be ready for use at any time at its intended level of performance. These platforms require consistent use to maintain a certain readiness level because machines cannot sit idle for months to years and then operate effectively, if at all, especially if called on for immediate action.
  • Moreover, the people that operate this equipment need to maintain their working knowledge and operational skill through continued use. This use causes wear and tear on the platform and requires consistent maintenance. All of this is necessary just to maintain the status quo. In the end, there must be a balance between a platform's readiness level and the amount of funding required for operations and maintenance, but if the money is no longer available there is no choice but to reduce readiness.
  • For example, the Navy has said it is considering suspending operations of four of its nine carrier air wings while shutting down four of its carriers in various stages of the operations and maintenance process. This would essentially give the United States one carrier deployed with one on call for years. This will be sufficient if the world remains relatively quiet, but one large emergency or multiple small ones would leave the United States able to project limited force compared to previous levels.
  • Procurement cycles are very slow and take decades to implement; for instance, the Navy that the United States wants to have in 20 years is being planned now.
  • The U.S. military has a global presence, and sequestration would have appreciable effects on this in certain areas. Potentially, the hardest hit region will be the Pacific, which has been the focus of the United States' new strategy.
  • The single biggest capability gap that will develop will be the U.S. military's surge capacity. If the Syria-Iraq-Lebanon corridor were to become more unstable, the United States will not be able to respond with the same force structure it had in the past. The U.S. military can still shift its assets to different regions to attain its strategic goals, but those assets will come from a smaller resource pool, and shifting them will lessen the presence in some other region. The military's ability to use one of its softer political tools -- joint military exercises -- will also be at risk.
  • This is not to say that the U.S. military will be wrecked immediately or that its condition is anywhere near that of the Russian military in the 1990s. A military's effectiveness is measured against its potential opponents, and the United States has enjoyed a large gap for decades.
  • Funding cuts are not necessarily abnormal for the United States while winding down into a postwar stance. Historically, the pattern has been a reduction in spending and retrenchment of a large volume of forces from abroad. However, Pentagon planners typically go into a postwar period with the stated goal of not damaging the force through these cuts and reductions. 
  •  
    "Sequestration, the automatic spending reductions scheduled to take effect March 1, will affect the U.S. military's ability to project force around the world. The current continuing resolution that Congress is using to fund the entire government until March 27 has already affected U.S. forces. The longer these funding cuts continue, the more degradation the U.S. military will incur, with longer-lasting effects. "
anonymous

A Woman's De-Liberation: There Never Was a Sexual Revolution - SynaptIQ+ Social Era Kno... - 0 views

  • At the end of my Feminist Reverie, my attention was drawn to two books next to the abridged section on feminist books––Dreams from my Father and The Audacity of Hope, both by Barack Obama. He seemed to me to be the only public person who has talked about the issues that women face in a non-intellectual, non ivory-tower way.
  • When running for President, he often spoke about the challenges his own mother faced while raising her children and pursuing her education and career. I wondered whether he had ever read A Room of One’s Own. I felt sure he would have understood Woolf’s words about a woman poet’s unfulfilled creative destiny, “She died young—alas, she never wrote a word. She lies buried where the omnibuses now stop, opposite the Elephant and Castle. Now my belief is that this poet who never wrote a word and was buried at the cross–roads still lives. She lives in you and in me, and in many other women who are not here to–night, for they are washing up the dishes and putting the children to bed.”
  • Perhaps the modern wave of feminism should have as its champion a man who understands working mothers…because he was raised by, and is married to, a working mother.
  •  
    "When I asked him if they would raise my salary to match my predecessor's if I dumped my boyfriend, got married and had children and bought an apartment, I was told that I should be grateful that I was working in what had always been considered a "man's job." In other words, if a woman was working she should count her blessings and not ask for a decent salary and respect on top of it.  When I took my complaint to the female head of Human Resources her advice was that if I wanted to keep my job, I should keep my mouth shut."
anonymous

How the internet is making us poor - Quartz - 2 views

  • Sixty percent of the jobs in the US are information-processing jobs, notes Erik Brynjolfsson, co-author of a recent book about this disruption, Race Against the Machine. It’s safe to assume that almost all of these jobs are aided by machines that perform routine tasks. These machines make some workers more productive. They make others less essential.
  • The turn of the new millennium is when the automation of middle-class information processing tasks really got under way, according to an analysis by the Associated Press based on data from the Bureau of Labor Statistics. Between 2000 and 2010, the jobs of 1.1 million secretaries were eliminated, replaced by internet services that made everything from maintaining a calendar to planning trips easier than ever.
  • Economist Andrew McAfee, Brynjolfsson’s co-author, has called these displaced people “routine cognitive workers.” Technology, he says, is now smart enough to automate their often repetitive, programmatic tasks. “We are in a desperate, serious competition with these machines,” concurs Larry Kotlikoff, a professor of economics at Boston University. “It seems like the machines are taking over all possible jobs.”
  • ...23 more annotations...
  • In the early 1800s, nine out of ten Americans worked in agriculture—now it’s around 2%. At its peak, about a third of the US population was employed in manufacturing—now it’s less than 10%. How many decades until the figures are similar for the information-processing tasks that typify rich countries’ post-industrial economies?
  • To see how the internet has disproportionately affected the jobs of people who process information, check out the gray bars dipping below the 0% line on the chart, below. (I’ve adapted this chart to show just the types of employment that lost jobs in the US during the great recession. Every other category continued to add jobs or was nearly flat.)
  • Here’s another clue about what’s been going on in the past ten years. “Return on capital” measures the return firms get when they spend money on capital goods like robots, factories, software—anything aside from people. (If this were a graph of return on people hired, it would be called “Return on labor”.)
  • Notice: the only industry where the return on capital is as great as manufacturing is “other industries”—a grab bag which includes all the service and information industries, as well as entertainment, health care and education. In short, you don’t have to be a tech company for investing in technology to be worthwhile.
  • For many years, the question of whether or not spending on information technology (IT) made companies more productive was highly controversial. Many studies found that IT spending either had no effect on productivity or was even counter-productive. But now a clear trend is emerging. More recent studies show that IT—and the organizational changes that go with it—are doing firms, especially multinationals (pdf), a great deal of good.
  • Winner-take-all and the power of capital to exacerbate inequality
  • One thing all our machines have accomplished, and especially the internet, is the ability to reproduce and distribute good work in record time. Barring market distortions like monopolies, the best software, media, business processes and, increasingly, hardware, can be copied and sold seemingly everywhere at once. This benefits “superstars”—the most skilled engineers or content creators. And it benefits the consumer, who can expect a higher average quality of goods.
  • But it can also exacerbate income inequality, says Brynjolfsson. This contributes to a phenomenon called “skill-biased technological [or technical] change.” “The idea is that technology in the past 30 years has tended to favor more skilled and educated workers versus less educated workers,” says Brynjolfsson. “It has been a complement for more skilled workers. It makes their labor more valuable. But for less skilled workers, it makes them less necessary—especially those who do routine, repetitive tasks.”
  • “Certainly the labor market has never been better for very highly-educated workers in the United States, and when I say never, I mean never,” MIT labor economist David Autor told American Public Media’s Marketplace.
  • The other winners in this scenario are anyone who owns capital.
  • As Paul Krugman wrote, “This is an old concern in economics; it’s “capital-biased technological change”, which tends to shift the distribution of income away from workers to the owners of capital.”
  • Computers are more disruptive than, say, the looms smashed by the Luddites, because they are “general-purpose technologies,” noted Peter Lindert, an economist at the University of California-Davis.
  • “The spread of computers and the Internet will put jobs in two categories,” said Andreessen. “People who tell computers what to do, and people who are told by computers what to do.” It’s a glib remark—but increasingly true.
  • In March 2012, Amazon acquired Kiva Systems, a warehouse robotics and automation company. In partnership with a company called Quiet Logistics, Kiva’s combination of mobile shelving and robots has already automated a warehouse in Andover, Massachusetts.
  • This time it’s fasterHistory is littered with technological transitions. Many of them seemed at the time to threaten mass unemployment of one type of worker or another, whether it was buggy whip makers or, more recently, travel agents. But here’s what’s different about information-processing jobs: The takeover by technology is happening much faster.
  • From 2000 to 2007, in the years leading up to the great recession, GDP and productivity in the US grew faster than at any point since the 1960s, but job creation did not keep pace.
  • Brynjolfsson thinks he knows why: More and more people were doing work aided by software. And during the great recession, employment growth didn’t just slow. As we saw above, in both manufacturing and information processing, the economy shed jobs, even as employment in the service sector and professional fields remained flat.
  • Especially in the past ten years, economists have seen a reversal of what they call “the great compression”—that period from the second world war through the 1970s when, in the US at least, more people were crowded into the ranks of the middle class than ever before.
  • There are many reasons why the economy has reversed this “compression,” transforming into an “hourglass economy” with many fewer workers in the middle class and more at either the high or the low end of the income spectrum.
  • The hourglass represents an income distribution that has been more nearly the norm for most of the history of the US. That it’s coming back should worry anyone who believes that a healthy middle class is an inevitable outcome of economic progress, a mainstay of democracy and a healthy society, or a driver of further economic development.
    • anonymous
       
      This is the meaty center. It's what I worry about. The "Middle Class" may just be an anomaly.
  • Indeed, some have argued that as technology aids the gutting of the middle class, it destroys the very market required to sustain it—that we’ll see “less of the type of innovation we associate with Steve Jobs, and more of the type you would find at Goldman Sachs.”
  • So how do we deal with this trend? The possible solutions to the problems of disruption by thinking machines are beyond the scope of this piece. As I’ve mentioned in other pieces published at Quartz, there are plenty of optimists ready to declare that the rise of the machines will ultimately enable higher standards of living, or at least forms of unemployment as foreign to us as “big data scientist” would be to a scribe of the 17th century.
  • But that’s only as long as you’re one of the ones telling machines what to do, not being told by them. And that will require self-teaching, creativity, entrepreneurialism and other traits that may or may not be latent in children, as well as retraining adults who aspire to middle class living. For now, sadly, your safest bet is to be a technologist and/or own capital, and use all this automation to grab a bigger-than-ever share of a pie that continues to expand.
  •  
    "Everyone knows the story of how robots replaced humans on the factory floor. But in the broader sweep of automation versus labor, a trend with far greater significance for the middle class-in rich countries, at any rate-has been relatively overlooked: the replacement of knowledge workers with software. One reason for the neglect is that this trend is at most thirty years old, and has become apparent in economic data only in perhaps the past ten years. The first all-in-one commercial microprocessor went on sale in 1971, and like all inventions, it took decades for it to become an ecosystem of technologies pervasive and powerful enough to have a measurable impact on the way we work."
anonymous

How Bayes' Rule Can Make You A Better Thinker - 1 views

  • To find out more about this topic, we spoke to mathematician Spencer Greenberg, co-founder of Rebellion Research and a contributing member of AskAMathematician where he answers questions on math and physics. He has also created a free Bayesian thinking module that's available online.
  • Bayes’s Rule is a theorem in probability theory that answers the question, "When you encounter new information, how much should it change your confidence in a belief?" It’s essentially about making decisions under uncertainty, and how we should update or revise our theories as new evidence emerges. It can also be used to help us reach decisions in those circumstances when very few observations or pieces of evidence are available. And it can also be used to help us avoid common mistakes and fallacies in our thinking.
  • The key to Bayesianism is in understanding the power of probabilistic reasoning. But unlike games of chance, in which there’s no ambiguity and everyone agrees on what’s going on (like the roll of a die), Bayesians use probability to express their degree of belief about something.
  • ...11 more annotations...
  • When it comes to the confidence we have in our beliefs — which can be expressed in terms of probability — we can’t just make up any number we want. There’s only one consistent way to handle those degrees of belief.
  • In the strictest sense, of course, this requires a bit of mathematical knowledge. But Greenberg says there’s still an easy way to use this principle in daily life — and one that can be converted to plain English.
  • Greenberg says it’s the question of evidence which he should apply, which goes like this:: Assuming that our hypothesis is true, how much more plausible, or likely, is the evidence compared to the hypothesis if it was not true?
  • “It’s important to note that the idea here is not to answer the question in a precise way — like saying that it’s 3.2 times more likely — rather, it’s to get a rough sense. Is it a high number, a modest number, or a small number?”
  • To make Bayes practical, we have to start with the belief of how likely something is. Then we need to ask the question of evidence, and whether or not we should increase the confidence in our beliefs by a lot, a little, and so on.
  • “Much of the time people will automatically try to shoot down evidence, but you can get evidence for things that are not true. Just because you have evidence doesn’t mean you should change your mind. But it does mean that you should change your degree of belief.”
  • Greenberg also describes the representativeness heuristic, in which people tend to judge likelihood by how similar things are.
  • Greenberg also says that we should shy away from phrases like, “I believe,” or “I don’t believe.” “That’s the wrong way to frame it,” he says. “We should think about things in terms of how probable they are. You almost never have anything close to perfect certainty.”
  • “Let’s say you believe that your nutrition supplement works,” he told us, “Then you get a small amount of evidence against it working, and you completely write that evidence off because you say, ‘well, I still believe it works because it’s just a small amount of evidence.’ But then you get more evidence that it doesn’t work. If you were an ideal reasoner, you’d see that accumulation of evidence, and every time you get that evidence, you should believe less and less that the nutritional supplements are actually working.” Eventually, says Greenberg, you end up tipping things so that you no longer believe. But instead, we end up never changing our mind.
  • “You should never say that you have absolute certainty, because it closes the door to being able to revise your certainty in light of new information,” Greenberg told io9. “And the same thing can be said for having zero percent certainty about something happening. If you’re at 100% certainty, then the correct way of updating is to stay at 100% forever, and no amount of evidence can tip you.”
  • Lastly, he also says that probabilities can depend on the observer, a kind of probability relativity. We all have access to different information, so different people should assign different probabilities to the same things based on their different sets of evidence.
  •  
    "Having a strong opinion about an issue can make it hard to take in new information about it, or to consider other options when they're presented. Thankfully, there's an old rule that can help us avoid this problem - and even help us make good decisions when we're uncertain. Here's how Bayesian Reasoning works, and why it can make you a better thinker."
anonymous

Tapeworm Logic - 0 views

  • A mature tapeworm has a very simple lifestyle. It lives in the gut of a host animal, anchoring itself to the wall of the intestine with its scolex (or head), from which trails a long string of segments (proglottids) that contain reproductive structures. The tapeworm absorbs nutrients through its skin and gradually extrudes more proglottids, from the head down; as they reach the end of the tape they mature into a sac of fertilized eggs and break off.
  • The adult tapeworm has no knowledge of what happens to its egg sacs after they detach; nor does it know where it came from. It simply finds itself attached to a warm, pulsing wall, surrounded by a rich nutrient flow. Its experience of the human being is limited to this: that the human surrounds it and provides it with a constant stream of nutrients and energy.
  • Welcome to the Fermi paradox, mired in shit. Shall we itemize the errors that the tapeworm is making in its analysis?
  • ...4 more annotations...
  • The first and most grievous offense our tapeworm logician has committed is that of anthropocentrism (or rather, of cestodacentrism); it thinks everything revolves around tapeworms.
  • In reality, the human is unaware of the existence of the tapeworm. This would be a good thing, from the worm's point of view, if it had any grasp of the broader context of its existence: it ought by rights to be doing the wormy equivalent of hiding under the bed covers, gibbering in fear.
  • There are vast, ancient, alien intellects in the macrocosm beyond the well-known human, and they are unsympathetic to tapeworms.
  • Some of the tapeworm's descendants might be able to find another new human to claim as their home, but the same constraints will apply. Only if the tapeworm transcends its tapewormanity and grows legs, lungs, and other organs that essentially turn it into something other than a tapeworm will it be able to make itself at home outside the human.
  •  
    "What use is a human being - to a tapeworm?"