
Home/ Long Game/ Group items tagged architecture


anonymous

Bin Laden's special complaint with the World Trade Center - 0 views

  • At the base of the towers, Yamasaki used implied pointed arches—derived from the characteristically pointed arches of Islam—as a transition between the wide column spacing below and the dense structural mesh above.
  • After the attack, Grabar spoke of how these towers related to the architecture of Islam, where "the entire surface is meaningful" and "every part is both construction and ornament." A number of designers from the Middle East agreed, describing the entire façade as a giant "mashrabiya," the tracery that fills the windows of mosques.
  • Having rejected modernism and the Saudi royal family, it's no surprise that Bin Laden would turn against Yamasaki's work in particular. He must have seen how Yamasaki had clothed the World Trade Center, a monument of Western capitalism, in the raiment of Islamic spirituality. Such mixing of the sacred and the profane is old hat to us—after all, Cass Gilbert's classic Woolworth Building, dubbed the Cathedral to Commerce, is decked out in extravagant Gothic regalia. But to someone who wants to purify Islam from commercialism, Yamasaki's implicit Mosque to Commerce would be anathema. To Bin Laden, the World Trade Center was probably not only an international landmark but also a false idol.
  •  
    "We all know the basic reasons why Osama Bin Laden chose to attack the World Trade Center, out of all the buildings in New York. Its towers were the two tallest in the city, synonymous with its skyline. They were richly stocked with potential victims. And as the complex's name declared, it was designed to be a center of American and global commerce. But Bin Laden may have had another, more personal motivation. The World Trade Center's architect, Minoru Yamasaki, was a favorite designer of the Binladin family's patrons, the Saudi royal family, and a leading practitioner of an architectural style that merged modernism with Islamic influences." By Laurie Kerr at Slate Magazine on December 28, 2001.
anonymous

BioShock Infinite: an intelligent, violent videogame? - Read - ABC Arts | Australian co... - 1 views

  • Infinite has the difficulty of an inherited legacy: people like to point to the first BioShock (2007) as an example of how videogames made in studios by hundreds of people and financed by corporations can be artistic. It was, in a way, a beacon of hope for those who dreamed that the sheer industrial scale at the peak of the videogames business could translate into something worth taking seriously.
  • BioShock Infinite is a videogame with ideas. Set in 1912, it’s in part inspired by The Devil In The White City, Erik Larson’s 2003 novelistic account of the 1893 Chicago World’s Fair.
  • The city is beautiful, and possibly unparalleled in terms of visual design in a videogame: along with the expected white American neo-classical architecture, we get an astounding array of poster art and fashion, taking in both the decline of the strong silhouettes and Gibson Girl aesthetics of the 1910s, and the Art Nouveau movement, as well as Kinetoscopes similar to the illusionistic films of Georges Melies.
  • ...20 more annotations...
  • Columbia, according to Infinite, is said to have set sail at the 1893 Fair, thus opening up a ripe array of potential themes stemming from real world history and politics, all of which get at least lip service in the game: Manifest Destiny, American Exceptionalism, racism, and religious conflict.
  • This all occurs, as with the first BioShock, within the framework of a first person shooter.
  • The first major choice that players of BioShock Infinite are presented with is whether they would like to publicly punish an interracial couple or not. You may choose to throw a ball at the couple, who are tied up in front of a crowd at a fair, or you may choose to throw the ball at the man who is asking you to do so. The outcome of your choice is mostly the same.
  • Let’s think about that for a moment. BioShock Infinite, the game that many would hope to point to as an example of how art and subtlety might be found in expensive, mainstream videogames, sets up its moral stakes by asking the player if they would like to be a violent bigot.
  • Would you like to be for or against?
  • This is thunderously stupid, and an insipid example of how terrifyingly low the bar is set for ‘intelligence’ in mainstream videogames.
  • In taking the game seriously, I want to be as clear as possible: BioShock Infinite uses racism for no other reason than to make itself seem clever. Worse, it uses racism and real events in an incredibly superficial way—BioShock Infinite seeks not to make any meaningful statement about history or racism or America, but instead seeks to use an aesthetics of ‘racism’ and ‘history’ as a barrier to point to and claim importance.
  • puts the lie to the claim that by engaging with these themes, BioShock Infinite is the place to find substance in mainstream videogames.
  • At the real Wounded Knee, over three hundred Native Americans—the Lakota Sioux—were massacred. Many of them were unarmed. Some of them were children. These were real people, with real lives and real families. The victims were buried in a mass grave, and many of the US Cavalry who led the massacre were later awarded the Medal of Honor, a decision that remains shameful today.
  • I am certainly not saying that a videogame has no right to engage with such events. What I am saying is that when you use such a horrific historic event in art—in any media—you have a responsibility to get it right, to use it to say something worthwhile, to make the invocation count.
  • Wounded Knee, I believe, is not something you get to invoke in 2013 without also making a statement of sorts. The idea of publicly punishing interracial relationships, something that of course has happened in reality, is also not something you get to invoke in 2013 without making a statement.
  • “[Letting] the player decide how they feel,” is not respecting your audience’s intelligence in these situations; it is a cop-out of the highest order.
  • For a game that so explicitly aimed to take on racism through its 1912 setting, the politics of BioShock Infinite are defined by evasion.
  • Such nihilistic disapproval is the absence of a political position masquerading as shrewd criticism. It may seem worldly, but it allows BioShock Infinite to be controversial to no-one by treating everyone with equal contempt.
  • Let us get one thing straight, then: despite its desperation to be taken seriously, BioShock Infinite is not an intelligent work of art. It is a history-themed first person shooter, and it deserves no more or less respect than any other first person shooter.
  • You can argue that the faults of BioShock Infinite are the latest and most unfortunate result of the first-person genre that found bedrock in both Doom (reflexes and gore) and Myst (architecture and mystery) in the mid-1990s, two sharply different trajectories that have been bound into problematic convergence ever since. While the two genres remain fruitfully exploited in separation, all attempts at marrying the two—and thus discovering the elusive union of the shooter’s popularity and the exploration game’s more literary aspirations—have remained ill considered. In a way, mainstream videogames are still completely dumbfounded by Edge magazine’s famous 1994 criticism of Doom: “If only you could talk to these creatures.”
  • Maybe this is really the central problem of the game—how do you merge any kind of intelligent thematic exploration with taking unrestrained pleasure in shooting people in the face?
  • Where do those two circles converge in a Venn diagram?
  • By its conclusion, BioShock Infinite quickly forgets that it ever engaged with ideas of racism and American Exceptionalism in favour of a tangled Christopher Nolan puzzle plot about time travel. This is the sound of a thousand popguns going off, taking up the silent report of a giant cannon that failed to fire.
  • it remains difficult to point to a single videogame that is artful, subtle and a successful mainstream videogame, and BioShock Infinite only muddies the waters further.
  •  
    "Can mainstream videogame makers present an artful, intelligent thematic exploration about real world history within a game dominated by scenes of unrestrained violence, asks Daniel Golding."
anonymous

Once Upon a Time in Syria - 0 views

  • Though the term "Arabist" has been used far more broadly, during the Cold War in Washington it often came to refer to people at the State Department.
  • They tended to have come of age during World War II, were educated at the best private schools in New England, were in some cases descendants of American missionary families in the Middle East in the 19th and early 20th centuries, and by mastering Arabic in their twenties and thirties, spent their entire foreign service careers in one Arab country after another.
  • Syria was often their lodestar: the essential Arab country.
  • ...15 more annotations...
  • Syria was the throbbing heart of Arabism, the most steadfast country in its refusal to compromise with what, in Syrian eyes, was the post-colonial monstrosity known as Israel.
  • For a U.S. State Department Arabist, a posting to Syria was, in a thematic sense, fundamental to a successful career.
  • Given the fact of almost two dozen Arabic-speaking countries and only one Hebrew-speaking country to which one could be posted, an American diplomat's professional lifetime might be spent among Arabs much more easily than among Israelis. Not to develop sympathies would be inhuman.
  • The Arabists knew that it was a myth that Syria did not experiment with democracy like Israel. Syria had held three relatively free elections in 1947, 1949 and 1954, and all broke down more or less along ethnic, sectarian, tribal or regional lines, with military rule resulting after each failure.
  • The Arabists understood better than anyone else (except, that is, for the locals) that Syria was an artificial state built on a mass of contradictions.
  • It is tempting to deride the old-time Arabists of the Cold War era from the vantage point of 20/20 hindsight. After all, they dutifully communicated the diplomatic positions of one Syrian dictator after another to Washington, and especially so during the three-decade-long rule of Hafez al Assad.
  • But one is forced to argue: What else were they supposed to do?
  • the Arabists dealt with the political reality as they found it.
  • The tragedy of al Assad family rule in Syria is not that it produced tyranny: That tyranny, remember, produced sustained domestic peace after 21 changes of government in the 24 years preceding the elder al Assad's coup.
  • The tragedy is that the al Assads did nothing useful with the domestic peace they had established.
  • Citizens rise above sectarianism, whereas subjects have only sectarianism to fall back on.
  • The Arabists were in Syria and other countries not to plan American grand strategy or to tell dictators how to behave in the midst of a titanic struggle with the Soviet Union (in which America, perforce, supported many dictatorships) but to report perceptively on what was going on in their parcel of the world.
  • The Middle East, wracked by clan, tribal, ethnic and sectarian unrest, will always require area specialists with years of experience living in the region to help Washington make sense of it all.
    • anonymous
       
      See also: Internet friend Ed Webb.
  • Supporters of the Iraq War lauded Crocker's efforts to help stabilize Iraq, even as Crocker himself had warned in a 2002 memo that an American invasion of Iraq would unleash internal and regional chaos.
  • The 21st century, in other words, demands individuals with a 19th century sense of the world: people who think in terms of geography, indigenous cultures and local traditions.
  •  
    "Once upon a time, Syria was among the most enthralling and beautiful countries on earth, without the forest of hideous concrete architecture that came to deface the outskirts of its cities, without the pollution, and, of course, without the violence and lawless roads of today. Syria was a stage set for the merger of the Bible and the Mediterranean: wind-ransacked plains, littered with archaeological ruins in all the earthen colors of a rich palette, in places like Palmyra and Qala'at Samaan."
anonymous

Science-Based Medicine » It's a part of my paleo fantasy, it's a part of my p... - 0 views

  • If I had to pick one fallacy that rules above all among proponents of CAM/IM, it would have to be either the naturalistic fallacy (i.e., that if it’s natural—whatever that means—it must be better) or the fallacy of antiquity (i.e., that if it’s really old, it must be better).
  • Basically, it’s a rejection of modernity, and from it flow the interest in herbalism, various religious practices rebranded as treatments
  • there is a definite belief underlying much of CAM that technology and pharmaceuticals are automatically bad and that “natural” must be better.
  • ...33 more annotations...
  • it’s hard not to note that cancer and heart disease are primarily diseases of aging, and life expectancy was so much lower back in the day that a much smaller percentage of the population lived to advanced ages than is the case today.
  • Even so, an implicit assumption among many CAM advocates is that cardiovascular disease is largely a disease of modern lifestyle and diet and that, if modern humans could somehow mimic preindustrial or, according to some, even preagricultural, lifestyles, that cardiovascular disease could be avoided.
  • Over the last decade, Cordain has become the most prominent promoter of the so-called “Paleo diet,” having written The Paleo Diet: Lose Weight and Get Healthy by Eating the Foods You Were Designed to Eat and multiple other books advocating a paleolithic-mimetic diet as the cure for what ails modern humans.
  • But how does one determine what the prevalence of cardiovascular disease was in the ancient past?
  • there have been indications that the idea that ancient humans didn’t suffer from atherosclerosis is a comforting myth, the most recent of which is a study published a week ago online in The Lancet by Prof. Randall C. Thompson of Saint Luke’s Mid America Heart Institute and an international team of investigators entitled Atherosclerosis across 4000 years of human history: the Horus study of four ancient populations.
  • Basically, it was a study of 137 different mummies from four different geographic locations spanning 4,000 years.
  • So, although there was a fair amount of evidence from studies of Egyptian mummies that atherosclerosis was not uncommon, in Egypt it was mainly the wealthy and powerful who were mummified after their deaths. Conceivably, they could have lived a very different lifestyle and consumed a very different diet than the average Egyptian living around that time.
  • So the authors obtained whole-body CT scans of the 137 mummies, either pre-existing scans or scans prospectively done, and analyzed them for calcifications.
  • The mummies to be included in the study were chosen primarily based on two factors, being in a good state of preservation with identifiable vascular tissue, and being adults.
  • The authors obtained identifying information from an extensive search of museum and other databases by a team of archeologists and experts in mummy restoration, and sex was determined either by analysis of the genitals and reproductive organs when present or by pelvic morphology when they were not present.
  • Age was estimated by standard analysis of architectural changes in the clavicle, femur, and humerus.
  • Finally, multiple anthropological and archeological sources were used in an attempt to estimate likely risk factors for the mummies.
  • Figure 2 summarizes the findings nicely: There’s also this video featured in a Nature report on the study showing the reconstructed scan of one of the mummies with atherosclerotic plaques in the coronary arteries.
  • As expected, more atherosclerosis correlates with advanced age, and the amount of atherosclerosis in the young and middle-aged (although in the eras in which these people lived, age 50 was old) was less.
  • Although the sample number was far too small to draw definitive conclusions (as is often the case in archeological research), the prevalence of atherosclerotic disease in these mummies did not appear to correlate with the cultures in which the mummies lived.
  • As is noted in Thompson’s article, ancient Egyptians and Peruvians were agricultural cultures with farms and domesticated animals, Ancestral Puebloans were forager-farmers, and the Unangans were hunter-gatherers without agriculture. Indeed, the Peruvians and Ancestral Puebloans predated the written word and were thus prehistoric cultures.
  • One notes that no one, including the authors of this study, is saying that lifestyle and diet are not important factors for the development of atherosclerotic heart disease.
  • What they are saying is that atherosclerosis appears to be associated with aging and that the claims made for mimicking paleolithic diets (which, one notes, were definitely not vegan) are overblown. In other words, there is a certain inherent risk of atherosclerosis related to aging that is likely not possible to lower further
  • I actually think that the authors probably went too far with that last statement in that, while they might be correct that atherosclerosis is an inherent component of human aging, it is quite well established that this inherent component of aging can at least be worsened by sedentary lifestyle and probably certain diets.
  • One notes that, although the Paleo Diet is not, strictly speaking, always sold as CAM/IM, the ideas behind it are popular among CAM advocates, and the diet is frequently included as part of “integrative medicine,” for example, here at the University of Connecticut website, where it’s under integrative nutrition.
  • In particular, the appeal to ancient wisdom and ancient civilizations as yet untouched by the evil of modernity is the same sort of arguments that are made in favor of various CAM modalities ranging from herbalism to vegan diets rebranded as being somehow CAM to the appeal to “natural” cures.
  • Indeed, the fetish for the “natural” in CAM is such that even a treatment like Stanislaw Burzynski’s antineoplaston therapy is represented as “natural” when in fact, if it were ever shown to work against cancer, it would be chemotherapy and has toxicities greater than that of some of our current chemotherapy drugs.
  • The book is by Marlene Zuk and entitled Paleofantasy: What Evolution Really Tells Us About Sex, Diet, and How We Live. Zuk is an evolutionary biologist, and in particular she points out how the evolutionary arguments favored by advocates of the Paleo diet don’t stand up to scrutiny.
  • The interview begins with Zuk confronting Cordain at a conference on evolution and diseases of modern environments. At his lecture, Cordain pronounced several foods to be the cause of fatal conditions in people carrying certain genes.
  • These foods included, predictably, cultivated foods such as bread (made from grain), rice, and potatoes. Zuk couldn’t resist asking a question, namely why the inability to digest so many common foods would persist in the population, observing, “Surely it would have been selected out of the population.” Cordain’s response? That humans had not had time to adapt to these foods, to which Zuk retorted, “Plenty of time.” Apparently, in her book, Zuk produces numerous examples of evolution in humans occurring in a time frame of less than 10,000 years, including:
  • Blue eyes arose 6,000 to 10,000 years ago
  • Rapid selection for the CCR5-Δ32 gene variant that makes some people immune to HIV
  • Lactase persistence (production past the age of weaning of the lactase enzyme that digests lactose in milk) probably dates back only around 7,500 to 10,000 years, around the time that cattle were domesticated
  • there is no one diet or climate that predominated among our Paleolithic ancestors:
  • Zuk detects an unspoken, barely formed assumption that humanity essentially stopped evolving in the Stone Age and that our bodies are “stuck” in a state that was perfectly adapted to survive in the paleolithic environment. Sometimes you hear that the intervention of “culture” has halted the process of natural selection. This, “Paleofantasy” points out, flies in the face of facts. Living things are always and continuously in the process of adapting to the changing conditions of their environment, and the emergence of lactase persistence indicates that culture (in this case, the practice of keeping livestock for meat and hides) simply becomes another one of those conditions.
  • For this reason, generalizations about the typical hunter-gatherer lifestyle are spurious; it doesn’t exist. With respect to what people ate (especially how much meat), the only safe assumption was “whatever they could get,” something that to this day varies greatly depending on where they live. Recently, researchers discovered evidence that people in Europe were grinding and cooking grain (a paleo-diet bugaboo) as far back as 30,000 years ago, even if they weren’t actually cultivating it. “A strong body of evidence,” Zuk writes, “points to many changes in our genome since humans spread across the planet and developed agriculture, making it difficult at best to point to a single way of eating to which we were, and remain, best suited.”
  • Oh, and, as Zuk tells us, paleolithic people got cancer, too.
  • we humans have long been known to abuse and despoil our environment, even back in those “paleo” days. Indeed, when I took a prehistoric archeology course, which was largely dedicated to the period of the hunter-gatherers, one thing I remember my professor pointing out was that what he did was largely the study of prehistoric garbage, and that humans have always produced a lot of it.
  •  
    "There are many fallacies that undergird alternative medicine, which evolved into "complementary and alternative medicine" (CAM), and for which the preferred term among its advocates is now "integrative medicine," meant to imply the "best of both worlds.""
anonymous

The Insanity of Our Food Policy - NYTimes.com - 0 views

  • The House has proposed cutting food stamp benefits by $40 billion over 10 years — that’s on top of $5 billion in cuts that already came into effect this month with the expiration of increases to the food stamp program that were included in the 2009 stimulus law.
  • Meanwhile, House Republicans appear satisfied to allow farm subsidies, which totaled some $14.9 billion last year, to continue apace.
  • The proposal is a perfect example of how growing inequality has been fed by what economists call rent-seeking. As small numbers of Americans have grown extremely wealthy, their political power has also ballooned to a disproportionate size.
  • ...20 more annotations...
  • While the money that they’ve picked from each individual American’s pocket is small, the aggregate is huge for the rent-seeker. And this in turn deepens inequality.
  • FARM subsidies were much more sensible when they began eight decades ago, in 1933, at a time when more than 40 percent of Americans lived in rural areas. Farm incomes had fallen by about a half in the first three years of the Great Depression. In that context, the subsidies were an anti-poverty program.
  • Some three-quarters of the subsidies went to just 10 percent of farms. These farms received an average of more than $30,000 a year — about 20 times the amount received by the average individual beneficiary last year from the federal Supplemental Nutrition Assistance Program, or SNAP, commonly called food stamps.
  • More than 80 percent of the 45 million or so Americans who participated in SNAP in 2011, the last year for which there is comprehensive data from the United States Department of Agriculture, had gross household incomes below the poverty level.
  • Historically, food stamp programs and agricultural subsidies have been tied together.
  • The Nobel Prize winning economist Amartya Sen has reminded us that even famines are not necessarily caused by a lack of supply, but by a failure to get the food that exists to the people who need it. This was true in the Bengal famine of 1943 and in the Irish potato famine a century earlier: Ireland, controlled by its British masters, was exporting food even as its citizens died of starvation.
  • A similar dynamic is playing out in the United States. American farmers are heralded as among the most efficient in the world. Our country is the largest producer and exporter of corn and soybeans, to name just two of its biggest crops. And yet millions of Americans still suffer from hunger, and millions more would, were it not for the vital programs that government provides to prevent hunger and malnutrition — the programs that the Republicans are now seeking to cut back.
  • While they encourage overproduction, they pay little attention to the quality and diversity of foods our farms produce. The heavy subsidization of corn, for instance, means that many unhealthful foods are relatively cheap.
  • This is part of the reason that Americans face the paradox of hunger out of proportion to their wealth, along with some of the world’s highest obesity rates, and a high incidence of Type 2 diabetes. Poor Americans are especially at risk for obesity.
    • anonymous
       
      This is such a raw example of Unintended Consequences. The intention of policy architecture just can't account for ingenious manipulation.
  • Indian friends I met that day and in the following week were puzzled by this news: How could it be that in the richest country of the world there was still hunger?
  • Their puzzlement was understandable: Hunger in this rich land is unnecessary. What my Indian friends didn’t understand is that 15 percent of Americans — and 22 percent of America’s children — live in poverty.
  • Someone working full time (2,080 hours a year) at the minimum wage of $7.25 would earn about $15,000 a year, far less than the poverty threshold for a family of four ($23,492 in 2012), and even less than the poverty level of a family of three.
  • In his famous 1941 “four freedoms” speech, Franklin D. Roosevelt enunciated the principle that all Americans should have certain basic economic rights, including “freedom from want.”
  • And those numbers increased drastically with the onset of the Great Recession. The number of Americans on food stamps went up by more than 80 percent between 2007 and 2013.
  • In 2012, for example, two in five SNAP recipients had gross incomes that were less than half of the poverty line.
  • The amount they get from the program is very small — $4.39 a day per recipient.
  • The Center on Budget and Policy Priorities estimates that SNAP lifted four million Americans out of poverty in 2010.
  • with American consumption diminished from what it otherwise would be and production increased, food exports will inevitably increase.
  • By cutting back on food stamps, we are ensuring the perpetuation of inequality, and at that, one of its worst manifestations: the inequality of opportunity.
  • All of this exposes the Republicans’ argument in favor of these food policies — a concern for our future, particularly the impact of the national debt on our children — as a dishonest and deeply cynical pretense.
  •  
    "American food policy has long been rife with head-scratching illogic. We spend billions every year on farm subsidies, many of which help wealthy commercial operations to plant more crops than we need. The glut depresses world crop prices, harming farmers in developing countries. Meanwhile, millions of Americans live tenuously close to hunger, which is barely kept at bay by a food stamp program that gives most beneficiaries just a little more than $4 a day."
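The minimum-wage arithmetic quoted in the excerpts above is easy to check. A minimal sketch, using only the figures as quoted in the article (2,080 hours of full-time work, the $7.25 federal minimum wage, and the 2012 poverty threshold of $23,492 for a family of four):

```python
# Full-time annual earnings at the federal minimum wage, per the figures quoted above.
hours_per_year = 2080                # 40 hours/week * 52 weeks
minimum_wage = 7.25                  # USD per hour
poverty_line_family_of_four = 23492  # 2012 threshold cited in the article

annual_earnings = hours_per_year * minimum_wage
print(annual_earnings)  # 15080.0 -- the "about $15,000 a year" in the excerpt

# The excerpt's claim: full-time minimum-wage work pays far less than
# the poverty threshold for a family of four.
print(annual_earnings < poverty_line_family_of_four)  # True
```

This confirms the article's comparison: even before taxes, a full-time minimum-wage income falls roughly $8,400 short of the quoted four-person poverty threshold.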
anonymous

The World's First Printed Building - 0 views

  • In a small shed on an industrial park near Pisa is a machine that can print buildings. The machine itself looks like a prototype for the automotive industry.
  • Four columns independently support a frame with a single armature on it. Driven by CAD software installed on a dust-covered computer terminal, the armature moves just millimetres above a pile of sand, expressing a magnesium-based solution from hundreds of nozzles on its lower side.
  • Not that Dini shows much respect for his invention. His brother Ricardo is a talented mechanical engineer who also works on the project and proposed some of its defining features – the single armature for example. Today though he is beating recalcitrant parts of it with a hammer. Enrico refers to a pin system for calibrating the height of the frame as ‘this fucking device’. He is exasperated by its limitations. ‘My machine is stupid,’ he fumes. Perhaps there is a certain dumbness to the binary logic of its on/off secretions compared to the complexity of the robots he once made for the shoe industry.
  •  
    "In a small shed on an industrial park near Pisa is a machine that can print buildings." By Tim Abrahams at Blueprint Magazine on March 8, 2010.
anonymous

Sorkin vs. Zuckerberg - 0 views

  • What is important in Zuckerberg’s story is not that he’s a boy genius. He plainly is, but many are. It’s not that he’s a socially clumsy (relative to the Harvard elite) boy genius. Every one of them is. And it’s not that he invented an amazing product through hard work and insight that millions love. The history of American entrepreneurism is just that history, told with different technologies at different times and places. Instead, what’s important here is that Zuckerberg’s genius could be embraced by half-a-billion people within six years of its first being launched, without (and here is the critical bit) asking permission of anyone. The real story is not the invention. It is the platform that makes the invention sing. Zuckerberg didn’t invent that platform. He was a hacker (a term of praise) who built for it. And as much as Zuckerberg deserves endless respect from every decent soul for his success, the real hero in this story doesn’t even get a credit. It’s something Sorkin doesn’t even notice.

    For comparison’s sake, consider another pair of Massachusetts entrepreneurs, Tom First and Tom Scott. After graduating from Brown in 1989, they started a delivery service to boats on Nantucket Sound. During their first winter, they invented a juice drink. People liked their juice. Slowly, it dawned on First and Scott that maybe there was a business here. Nantucket Nectars was born. The two Toms started the long slog of getting distribution. Ocean Spray bought the company. It later sold the business to Cadbury Schweppes. At each step after the first, along the way to giving their customers what they wanted, the two Toms had to ask permission from someone. They needed permission from a manufacturer to get into his plant. Permission from a distributor to get into her network. And permission from stores to get before the customer. Each step between the idea and the customer was a slog. They made the slog, and succeeded. But many try to make that slog and fail. Sometimes for good reasons. Sometimes not.

    Zuckerberg faced no such barrier. For less than $1,000, he could get his idea onto the Internet. He needed no permission from the network provider. He needed no clearance from Harvard to offer it to Harvard students. Neither with Yale, or Princeton, or Stanford. Nor with every other community he invited in. Because the platform of the Internet is open and free, or in the language of the day, because it is a “neutral network,” a billion Mark Zuckerbergs have the opportunity to invent for the platform.
    • anonymous
       
      This is akin to the conceit that genius and innovation happen in compartments instead of in an ecosystem. For instance: we value and treasure our freedom, but the freedom we idolize (self-determination) exists only because a plethora of institutions (laws, courts, and federal and local governments) makes that freedom possible. In other words, there's a bit of social architecture that drives opportunity. Tangent: Libertarians and Objectivists dream of how much *better* it would be if the government simply rolled out of key industries and responsibilities. But instead of Galt's Gulch, we'd get Somalia.
  • Zuckerberg is a rightful hero of our time. I want my kids to admire him. To his credit, Sorkin gives him the only lines of true insight in the film: In response to the twins’ lawsuit, he asks, does “a guy who makes a really good chair owe money to anyone who ever made a chair?”
  •  
    "'The Social Network' is wonderful entertainment, but its message is actually kind of evil." By Lawrence Lessig at The New Republic on October 1, 2010.
anonymous

Geithner's Speech on the Global Economy | STRATFOR - 0 views

  • three points where the United States sees dangers to the global system.
  • The first is maintaining growth.
  • Next, Geithner pointed to differences in exchange rate systems.
  • Third, Geithner spoke about the reformation of the global financial architecture and evoked the framework agreement signed at the September 2009 G-20 summit.
  • The problem for China is that while Germany and Japan are U.S. allies, firmly lashed to the American-dominated international system, and have already been forced to change in response to American demands — such as the 1985 Plaza Accord in which Washington forced them to adopt market-oriented exchange rate policies — China is not.
  • If the United States is serious about enforcing such a policy, it will require changes to the next three biggest economies — China, Japan and Germany — as well as to those who have grown accustomed to the status quo, that is, in some way, almost everyone else in the world.
  •  
    "U.S. Treasury Secretary Timothy Geithner spoke at the Brookings Institution on Oct. 6 and outlined Washington's economic and financial goals for a series of major upcoming international meetings. He called for G-20 countries to continue working together on global economic and financial challenges, and presented three points where the United States sees dangers to the global system. " At Stratfor on October 7, 2010.
anonymous

Obama: After the Election - 0 views

  • That means it is entirely possible that a slew of miscalculations are being made today. One of the most widespread misconceptions about the U.S. political system is that a president who is weak at home is by default weak abroad. This is a belief primarily promulgated by Americans themselves. After all, if one cannot get behind one’s leader, what business does that leader have engaging in global affairs? But in reality, a president who is weak at home often wields remarkable power abroad. The U.S. Constitution forces the American president to share domestic power with Congress, so a split government leads to domestic policy gridlock. However, the Constitution also expressly reserves all foreign policy — particularly military policy — for the presidency. In fact, a weak president often has no options before him except foreign policy. This is something that the rest of the world repeatedly has failed to grasp. Domestically weakened American presidents have often done more than engage in foreign policy: They have overturned entire international orders. Former U.S. President George W. Bush defied expectations after his 2006 midterm electoral defeat and launched the surge in Iraq, utterly changing the calculus of that war. Clinton launched the Kosovo War, which undid what remained of the Cold War security architecture. Most famously, John Kennedy, whom the Soviets had written off as a weak and naive dilettante who had surrounded himself with incompetent advisers (sound familiar?), gave the Russians their biggest Cold War diplomatic defeat in the Cuban Missile Crisis. The United States might be distracted and its president domestically weakened, and undoubtedly most of the world will assume that they know what this means. But history tells a very different story, and this president — like his predecessors — is not done just yet.
  •  
    "Nov. 2 marked midterm elections in the United States with more than 600 electoral contests, enough of which were resolved in favor of the Republicans to deny the Democrats full control of Congress. The country will be digesting the results and their implications for weeks. What STRATFOR will do now is address this simple fact: U.S. President Barack Obama, whose time in office began with a supportive Congress, has lost his ability to dictate the domestic policy agenda." At StratFor on November 3, 2010.
anonymous

The United States, Europe and Bretton Woods II - 0 views

  • The conventional wisdom is that Bretton Woods crafted the modern international economic architecture, lashing the trading and currency systems to the gold standard to achieve global stability.
  • The origin of Bretton Woods lies in the Great Depression.
  • Economically, World War II was a godsend. The military effort generated demand for goods and labor. The goods part is pretty straightforward, but the labor issue is what really allowed the global economy to turn the corner.
  • The war removed tens of millions of men from the labor force, shipping them off to — economically speaking — nonproductive endeavors.
  • Policymakers of the time realized that the prosecution of the war had suspended the depression, but few were confident that the war had actually ended the conditions that made the depression possible.
  • When all was said and done, the delegates agreed to a system of exchangeable currencies and broadly open rules of trade. The system would be based on the gold standard to prevent currency fluctuations, and a pair of institutions — what would become known as the International Monetary Fund (IMF) and the World Bank — would serve as guardians of the system’s financial and fiduciary particulars.
  • In fact, we are still using Bretton Woods, and while nothing that has been discussed to this point is wrong exactly, it is only part of the story.
  • Think back to July 1944. The Normandy invasion was in its first month. The United Kingdom served as the staging ground, but with London exhausted, its military commitment to the operation was modest.
  • The shape of the Cold War was already beginning to unfold. Between the United States and the Soviet Union, the rest of the modern world — namely, Europe — was going to either experience Soviet occupation or become a U.S. protectorate.
  • The Continental states — and even the United Kingdom — were not simply economically spent and indebted but were, to be perfectly blunt, destitute.
  • This was not World War I, where most of the fighting had occurred along a single series of trenches. This was blitzkrieg and saturation bombings, which left the Continent in ruins, and there was almost nothing left from which to rebuild.
  • For the United States, the issue was one of seizing a historic opportunity.
  • The United States entered World War II late and the war did not occur on U.S. soil. So — uniquely among all the world’s major powers of the day — U.S. infrastructure and industrial capacity would emerge from the war larger (far, far larger) than when it entered.
  • The United States had to have not just the participation of the Western Europeans in holding back the Soviet tide, it needed the Europeans to defer to American political and military demands — and to do so willingly. Considering the desperation and destitution of the Europeans, and the unprecedented and unparalleled U.S. economic strength, economic carrots were the obvious way to go.
  • Put another way, Bretton Woods was part of a broader American effort to extend the wartime alliance — sans the Soviets — beyond Germany’s surrender.
  • The United States would allow Europe near tariff-free access to its markets, and turn a blind eye to Europe’s own tariffs so long as they did not become too egregious — something that at least in part flew in the face of the Great Depression’s lessons.
  • The “free world” alliance would not consist of a series of equal states. Instead, it would consist of the United States and everyone else.
  • When loans to fund Western Europe’s redevelopment failed to stimulate growth, those loans became grants, aka the Marshall Plan.
  • And fast-forwarding to when the world went off of the gold standard and Bretton Woods supposedly died, gold was actually replaced by the U.S. dollar. Far from dying, the political/military understanding that underpinned Bretton Woods had only become more entrenched.
  • For many of the states that will be attending what is already being dubbed Bretton Woods II, having this American centrality as such a key pillar of the system is the core of the problem.
  • a crisis in the U.S. economy becomes global.
  • The U.S. economy remains the largest, and dysfunctions there affect the world. That is the reality of the international system, and that is ultimately what the French call for a new Bretton Woods is about.
  • Relying on a currency that is not in the hands of a sovereign taxing power, but dependent on the political will of (so far) 15 countries with very different interests, does not make for a reliable reserve currency.
  • The French in particular look at the current crisis as the result of a failure in the U.S. regulatory system.
  • The Bretton Woods institutions — specifically the IMF, which is supposed to serve the role of financial lighthouse and crisis manager — proved irrelevant to the problems the world is currently passing through.
  • Fundamentally, the Europeans are not simply hoping to modernize Bretton Woods, but instead to Europeanize the American financial markets. This is ultimately not a financial question, but a political one.
  • Far more important, any international system that oversees aspects of American finance would, by definition, not be under full American control, but under some sort of quasi-Brussels-like organization. And no American president is going to engage gleefully on that sort of topic.
  •  
    An article about this misunderstood institution. From StratFor back on October 20, 2008.
anonymous

Obama Is Making Bush's Big Mistake on Russia - 0 views

  • Putin's treatment of Clinton raises doubts about the Barack Obama administration's strategy toward Russia, which has focused on building up the supposedly moderate President Dmitri Medvedev, reportedly one of the few foreign leaders Obama has bonded with, as a counterweight to Putin.
    • anonymous
       
      If true, this could be a grievous mistake, as Russia has shown a historic knack for tightly managed foreign policy under strong leaders (which Putin is).
  • After his first meeting with then-President Putin in June 2001, George W. Bush famously said: "I looked the man in the eye. I was able to get a sense of his soul."
    • anonymous
       
      That was hilarious, even at the time. My sincere hope was that the statement was intended for the domestic audience (to give comfort), because if it was for the international audience, then Bush very likely came off as very, very naive.
  • And now, we're hearing that Obama believes he has a different and promising relationship with Medvedev -- one independent of Putin.
    • anonymous
       
      My hope is that *this* is a conservative, careful way to say that Obama will give the benefit of the doubt. While I have only ephemeral reasons to think this, Obama seems a bit shrewder than Bush.
  • For all his talk of reform -- and so far it is just that, talk -- Medvedev still claims that Russia is a working democracy that protects the liberties of individual Russians despite overwhelming evidence to the contrary.
    • anonymous
       
      Which is as laughable as that earlier Bush quote about "sensing his soul."
  • On Medvedev's watch, Georgia has been invaded and Abkhazia and South Ossetia effectively annexed, and Russia has continued to threaten its neighbors and put forward a "new security architecture" whose obvious goal is to undermine NATO's role in Europe.
    • anonymous
       
      Aggressively reclaiming Russia's near abroad is still their aim. Can you blame them? What's important here is that Medvedev really *is* tightly in line with Putin. It's best to think of his presidency as the continuation of the Putin administration, not a thing that's distinct from it.
  • In short, there is little reason to believe that basing a "reset" of U.S.-Russian relations on increased personal ties between presidents Medvedev and Obama will buy Obama any particular advantage. If anything, doing so reinforces Moscow's incentive to continue the "good cop, bad cop" routine.
  •  
    Tagline: "Remember when George W. Bush thought he could get things done by making nice with Vladimir Putin? Barack Obama is repeating the same error with Dmitry Medvedev. " By Jamie Fly and Gary Schmitt in Foreign Policy on March 22, 2010
anonymous

Forget Anonymous: Evidence Suggests GOP Hacked, Stole 2004 Election - 1 views

  • "A new filing in the King Lincoln Bronzeville v. Blackwell case includes a copy of the Ohio Secretary of State election production system configuration that was in use in Ohio's 2004 presidential election when there was a sudden and unexpected shift in votes for George W. Bush," according to Bob Fitrakis, columnist at http://www.freepress.org and co-counsel in the litigation and investigation.
  • Ohio was the battleground state that provided George Bush with the electoral votes needed to win re-election. Had Senator John Kerry won Ohio's electoral votes, he would have been elected instead.
  • SmarTech, a private company, had the ability in the 2004 election to add or subtract votes without anyone knowing they did so.
  • The filing today shows how, detailing the computer network system's design structure, including a map of how the data moved from one unit to the next. Right smack in the middle of that structure? Inexplicably, it was SmarTech.
  • A "man in the middle" is not just an accidental happenstance of computing. It is a deliberate computer hacking setup, one where the hacker sits, literally, in the middle of the communication stream, intercepting and (when desired, as in this case) altering the data.
  • Until now, the architectural maps and contracts from the Ohio 2004 election were never made public, which may indicate that the entire system was designed for fraud.
  • SmarTech was part of three computer companies brought in to manage the elections process for Ohio Secretary of State Ken Blackwell, a Republican. The other two were Triad and GovTech Solutions. All three companies have extensive ties to the Republican party and Republican causes.
  • Connell was outed as the one who stole the 2004 election by Spoonamore, who, despite being a conservative Republican himself, came forward to blow the whistle on the stolen election scandal. Connell gave a deposition on the matter, but stonewalled. After the deposition, and fearing perjury/obstruction charges for withholding information, Connell expressed an interest in testifying further as to the extent of the scandal.
  • Connell was so scared for his security that he asked for protection from the attorney general, then Attorney General Michael Mukasey. Connell told close friends that he was expecting to get thrown under the bus by the Rove team, because Connell had evidence linking the GOP operative to the scandal and the stolen election, including knowledge of where Rove's missing emails disappeared to.
  • Before he could testify, Connell died in a plane crash.
  • "The 2004 election was stolen. There is absolutely no doubt about it. A 6.7% shift in exit polls does not happen by chance. And, you know, so finally, we have irrefutable confirmation that what we were saying was true and that every piece of the puzzle in the Ohio 2004 election was flawed," Wasserman said.
  • There were three phases of chicanery.
  • First, there was a pre-election period, during which the Secretary of State in Ohio, Ken Blackwell (who was also co-chair of the Bush-Cheney campaign in Ohio, which is in itself mind-boggling), engaged in all sorts of bureaucratic and legal tricks to cut down on the number of people who could register
  • On Election Day, there was clearly a systematic undersupply of working voting machines in Democratic areas, primarily inner city and student towns, you know, college towns. And the Conyers people found that in some of the most undersupplied places, there were scores of perfectly good voting machines held back and kept in warehouses, you know, and there are many similar stories to this.
  • After Election Day, there is explicit evidence that a company called Triad, which manufactures all of the tabulators, the vote-counting tabulators that were used in Ohio in the last election, was systematically going around from county to county in Ohio and subverting the recount, which was court ordered and which never did take place.
  •  
    "Three generations from now, when our great-grandchildren are sitting barefoot in their shanties and wondering how in the hell America turned from the high-point of civilization to a third-world banana republic, they will shake their fists and mutter one name: George Effin' Bush." If this is true, it's incredibly depressing...
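    The "man in the middle" configuration described in these annotations is a standard concept in network security. A minimal sketch of the idea, with hypothetical names and numbers (not the actual SmarTech configuration or real vote counts), might look like this:

    ```python
    # Minimal illustration of a "man in the middle" relay: a hop that sits
    # between sender and receiver and can silently rewrite data in transit.
    # All names and numbers here are hypothetical.

    def county_tally():
        # Data as produced at the source (e.g., a county tabulator).
        return {"candidate_a": 5000, "candidate_b": 4800}

    def relay(tally, tamper=None):
        # A pass-through hop. If `tamper` is supplied, it rewrites the
        # payload before forwarding -- the receiver cannot tell.
        forwarded = dict(tally)
        if tamper:
            forwarded = tamper(forwarded)
        return forwarded

    def shift_votes(tally):
        # Example tampering: move 300 votes from one candidate to the other.
        tally["candidate_a"] -= 300
        tally["candidate_b"] += 300
        return tally

    honest = relay(county_tally())
    tampered = relay(county_tally(), tamper=shift_votes)

    # The grand total is unchanged, so a naive turnout check passes
    # even though the per-candidate numbers were altered in transit.
    assert sum(honest.values()) == sum(tampered.values())
    ```

    The point of the sketch is that the receiving end sees a well-formed tally either way; detecting the alteration requires comparing against records the relay never handled, such as precinct-level paper totals.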
anonymous

USENIX 2011 Keynote: Network Security in the Medium Term, 2061-2561 AD - 1 views

  • if we should meet up in 2061, much less in the 26th century, you’re welcome to rib me about this talk. Because I’ll be happy to still be alive to be ribbed.
  • The question I’m going to spin entertaining lies around is this: what is network security going to be about once we get past the current sigmoid curve of accelerating progress and into a steady state, when Moore’s first law is long since burned out, and networked computing appliances have been around for as long as steam engines?
  • a few basic assumptions about the future
  • it’s not immediately obvious that I can say anything useful about a civilization run by beings vastly more intelligent than us. I’d be like an australopithecine trying to visualize daytime cable TV.
  • The idea of an AI singularity
  • the whole idea of artificial general intelligence strikes me as being as questionable as 19th century fantasies about steam-powered tin men.
  • if you start trying to visualize a coherent future that includes aliens, telepathy, faster than light travel, or time machines, your futurology is going to rapidly run off the road and go crashing around in the blank bits of the map that say HERE BE DRAGONS.
  • at least one barkingly implausible innovation will come along between now and 2061 and turn everything we do upside down
  • My crystal ball is currently predicting that base load electricity will come from a mix of advanced nuclear fission reactor designs and predictable renewables such as tidal and hydroelectric power.
  • We are, I think, going to have molecular nanotechnology and atomic scale integrated circuitry.
  • engineered solutions that work a bit like biological systems
  • Mature nanotechnology is going to resemble organic life forms the way a Boeing 737 resembles thirty tons of seagull biomass.
  • without a technological civilization questions of network security take second place to where to get a new flint arrowhead.
  • if we’re still alive in the 26th century you’re welcome to remind me of what I got wrong in this talk.
  • we’re living through the early days of a revolution in genomics and biology
  • We haven’t yet managed to raise the upper limit on human life expectancy (it’s currently around 120 years), but an increasing number of us are going to get close to it.
  • it’s quite likely that within another century the mechanisms underlying cellular senescence will be understood and treatable like other inborn errors of metabolism
  • another prediction: something outwardly resembling democracy everywhere.
  • Since 1911, democratic government by a republic has gone from being an eccentric minority practice to the default system of government world-wide
  • Democracy is a lousy form of government in some respects – it is particularly bad at long-term planning, for no event that lies beyond the electoral event horizon can compel a politician to pay attention to it
  • but it has two gigantic benefits: it handles transfers of power peacefully, and provides a pressure relief valve for internal social dissent.
  • there are problems
  • In general, democratically elected politicians are forced to focus on short-term solutions to long-term problems because their performance is evaluated by elections held on a time scale of single-digit years
  • Democratic systems are prone to capture by special interest groups that exploit the information asymmetry that’s endemic in complex societies
  • The adversarial two-party model is a very bad tool for generating consensus on how to tackle difficult problems with no precedents
  • Finally, representative democracy scales up badly
  • Nor are governments as important as they used to be.
  • the US government, the largest superpower on the block right now, is tightly constrained by the international trade system it promoted in the wake of the second world war.
  • we have democratic forms of government, without the transparency and accountability.
  • At least, until we invent something better – which I expect will become an urgent priority before the end of the century.
  • The good news is, we’re a lot richer than our ancestors. Relative decline is not tragic in a positive-sum world.
  • Assuming that they survive the obstacles on the road to development, this process is going to end fairly predictably: both India and China will eventually converge with a developed world standard of living, while undergoing the demographic transition to stable or slowly declining populations that appears to be an inevitable correlate of development.
  • a quiet economic revolution is sweeping Africa
  • In 2006, for the first time, more than half of the planet’s human population lived in cities. And by 2061 I expect more than half of the planet’s human population will live in conditions that correspond to the middle class citizens of developed nations.
  • by 2061 we or our children are going to be living on an urban middle-class planet, with a globalized economic and financial infrastructure recognizably descended from today’s system, and governments that at least try to pay lip service to democratic norms.
  • And let me say, before I do, that the picture I just painted – of the world circa 2061, which is to say of the starting point from which the world of 2561 will evolve – is bunk.
  • It’s a normative projection
  • I’m pretty certain that something utterly unexpected will come along and up-end all these projections – something as weird as the world wide web would have looked in 1961.
  • And while the outer forms of that comfortable, middle-class urban developed-world planetary experience might look familiar to us, the internal architecture will be unbelievably different.
  • Let’s imagine that, circa 1961 – just fifty years ago – a budding Nikola Tesla or Bill Packard somewhere in big-city USA is tinkering in his garage and succeeds in building a time machine. Being adventurous – but not too adventurous – he sets the controls for fifty years in the future, and arrives in downtown San Francisco. What will he see, and how will he interpret it?
  • a lot of the buildings are going to be familiar
  • Automobiles are automobiles, even if the ones he sees look kind of melted
  • Fashion? Hats are out, clothing has mutated in strange directions
  • He may be thrown by the number of pedestrians walking around with wires in their ears, or holding these cigarette-pack-sized boxes with glowing screens.
  • But there seem to be an awful lot of mad people walking around with bits of plastic clipped to their ears, talking to themselves
  • The outward shape of the future contains the present and the past, embedded within it like flies in amber.
  • Our visitor from 1961 is familiar with cars and clothes and buildings
  • But he hasn’t heard of packet switched networks
  • Our time traveller from 1961 has a steep learning curve if he wants to understand the technology the folks with the cordless headsets are using.
  • The social consequences of a new technology are almost always impossible to guess in advance.
  • Let me take mobile phones as an example. They let people talk to one another – that much is obvious. What is less obvious is that for the first time the telephone network connects people, not places
  • For example, we’re currently raising the first generation of kids who won’t know what it means to be lost – everywhere they go, they have GPS service and a moving map that will helpfully show them how to get wherever they want to go.
  • to our time traveller from 1961, it’s magic: you have a little glowing box, and if you tell it “I want to visit my cousin Bill, wherever he is,” a taxi will pull up and take you to Bill’s house
  • The whole question of whether a mature technosphere needs three or four billion full-time employees is an open one, as is the question of what we’re all going to do if it turns out that the future can’t deliver jobs.
  • We’re still in the first decade of mass mobile internet uptake, and we still haven’t seen what it really means when the internet becomes a pervasive part of our social environment, rather than something we have to specifically sit down and plug ourselves in to, usually at a desk.
  • So let me start by trying to predict the mobile internet of 2061.
  • the shape of the future depends on whether whoever provides the basic service of communication
  • funds their service by charging for bandwidth or charging for a fixed infrastructure cost.
  • These two models for pricing imply very different network topologies.
  • This leaves aside a third model, that of peer to peer mesh networks with no actual cellcos as such – just lots of folks with cheap routers. I’m going to provisionally assume that this one is hopelessly utopian
  • the security problems of a home-brew mesh network are enormous and gnarly; when any enterprising gang of scammers can set up a public router, who can you trust?
  • Let’s hypothesize a very high density, non-volatile serial storage medium that might be manufactured using molecular nanotechnology: I call it memory diamond.
  • wireless bandwidth appears to be constrained fundamentally by the transparency of air to electromagnetic radiation. I’ve seen some estimates that we may be able to punch as much as 2 tb/sec through air; then we run into problems.
  • What can you do with 2 terabits per second per human being on the planet?
  • One thing you can do trivially with that kind of capacity is full lifelogging for everyone. Lifelogging today is in its infancy, but it’s going to be a major disruptive technology within two decades.
  • the resulting search technology essentially gives you a prosthetic memory.
  • Lifelogging offers the promise of indexing and retrieving the unwritten and undocumented. And this is both a huge promise and an enormous threat.
  • Lifelogging raises huge privacy concerns, of course.
  • The security implications are monstrous: if you rely on lifelogging for your memory or your ability to do your job, then the importance of security is pushed down Maslow’s hierarchy of needs.
  • if done right, widespread lifelogging to cloud based storage would have immense advantages for combating crime and preventing identity theft.
  • whether lifelogging becomes a big social issue depends partly on the nature of our pricing model for bandwidth, and how we hammer out the security issues surrounding the idea of our sensory inputs being logged for posterity.
  • at least until the self-driving automobile matches and then exceeds human driver safety.
  • We’re currently living through a period in genomics research that is roughly equivalent to the early 1960s in computing.
  • In particular, there’s a huge boom in new technologies for high speed gene sequencing.
  • full genome sequencing for individuals now available for around US $30,000, and expected to drop to around $1000–3000 within a couple of years.
  • Each of us is carrying around a cargo of 1–3 kilograms of bacteria and other unicellular organisms, which collectively outnumber the cells of our own bodies by a thousand to one.
  • These are for the most part commensal organisms – they live in our guts and predigest our food, or on our skin – and they play a significant role in the functioning of our immune system.
  • Only the rapid development of DNA assays for SARS – it was sequenced within 48 hours of its identification as a new pathogenic virus – made it possible to build and enforce the strict quarantine regime that saved us from somewhere between two hundred million and a billion deaths.
  • A second crisis we face is that of cancer
  • we can expect eventually to see home genome monitoring – both looking for indicators of precancerous conditions or immune disorders within our bodies, and performing metagenomic analysis on our environment.
  • If our metagenomic environment is routinely included in lifelogs, we have the holy grail of epidemiology within reach; the ability to exhaustively track the spread of pathogens and identify how they adapt to their host environment, right down to the level of individual victims.
  • In each of these three examples of situations where personal privacy may be invaded, there exists a strong argument for doing so in the name of the common good – for prevention of epidemics, for prevention of crime, and for prevention of traffic accidents. They differ fundamentally from the currently familiar arguments for invasion of our data privacy by law enforcement – for example, to read our email or to look for evidence of copyright violation. Reading our email involves our public and private speech, and looking for warez involves our public and private assertion of intellectual property rights …. but eavesdropping on our metagenomic environment and our sensory environment impinges directly on the very core of our identities.
  • With lifelogging and other forms of ubiquitous computing mediated by wireless broadband, securing our personal data will become as important to individuals as securing our physical bodies.
  • the shifting sands of software obsolescence have for the most part buried our ancient learning mistakes.
  • So, to summarize: we’re moving towards an age where we may have enough bandwidth to capture pretty much the totality of a human lifespan, everything except for what’s going on inside our skulls.
  •  
    "Good afternoon, and thank you for inviting me to speak at USENIX Security." A fun read by Charlie Stross.
  •  
    I feel like cancer may be a bit played up. I freak out more about dementia.
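    The storage and bandwidth claims in the talk are easy to sanity-check with back-of-the-envelope arithmetic. The sketch below uses my own assumed stream rate of 10 Mbit/s for a full lifelog (roughly HD video plus audio and sensor data); only the ~2 Tb/s through-air ceiling comes from the talk itself:

    ```python
    # Back-of-the-envelope: storage consumed by continuous lifelogging.
    # The 10 Mbit/s stream rate is an assumption, not a figure from the talk.

    SECONDS_PER_YEAR = 365 * 24 * 3600  # 31,536,000

    def lifelog_bytes(bitrate_bps, years):
        """Total bytes captured at a constant bit rate over `years` years."""
        return bitrate_bps / 8 * SECONDS_PER_YEAR * years

    per_year_tb = lifelog_bytes(10e6, 1) / 1e12      # ~39 TB per year
    lifetime_pb = lifelog_bytes(10e6, 100) / 1e15    # ~4 PB per century

    # Ceiling case: the ~2 Tb/s per-person limit through air quoted in
    # the talk, if it were somehow saturated around the clock.
    ceiling_eb_per_year = lifelog_bytes(2e12, 1) / 1e18  # ~8 EB per year
    ```

    At the assumed rate, a century of continuous capture is on the order of 4 petabytes per person, vanishingly small next to the per-person wireless ceiling, which is part of why Stross can treat full lifelogging for everyone as a trivial use of that capacity.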
anonymous

Eurozone Crisis: Not a Greek Drama - 0 views

  • Lost in the coverage is the fact that Greece constitutes 2.5 percent of Eurozone GDP and Eurozone member states’ direct exposure to Greece is manageable.
  • After a year and a half of watching the Eurozone sovereign debt crisis unfold, we should put one notion to rest: no one event, crisis or decision will cause the Eurozone to collapse. Such a complex system of financial and monetary relationships will not unravel in a day, a month or a year.
  • Eurozone member states have proven highly flexible in their handling of the crisis.
  • Skeptics contend that because the Eurozone was primarily a political creation, its economic logic is fundamentally flawed. A singular economic or political shock — such as the collapse of the Greek government — could therefore unravel the entire bloc by exposing a slew of economic problems.
  • Precisely because the Eurozone is a political creation, however, fundamental changes in the geopolitics of Europe are required to undermine it. Furthermore, the greater the imminent financial crisis, the greater the likelihood that Eurozone member states will find flexible means to resolve it. This resourcefulness has been evidenced throughout the crisis.
  • Therefore if all else fails, the ECB will print money.
  • The idea that the ECB would participate in its own dissolution because it is committed to its independence, or to maintaining 2 percent inflation, is a theoretical assumption that takes little account of the ECB’s behavior over the last 24 months.
  • This analysis leads us to two conclusions.
  • First, the Eurozone is not going to collapse in the middle of the sovereign debt crisis.
  • Second, fundamental political changes underway in Europe — such as the weakening of the NATO alliance, the regionalization of security alliances, and especially the developing Russian-German relationship — are far more important to the future of the Eurozone than a Greek confidence vote.
  • Because the Eurozone is fundamentally a political project, the weakening of the political bonds that tie Eurozone member states into a currency union is what will ultimately lead to its dissolution or modification.
  • Monumental shifts are underway in Europe. We have no reason to believe that Greece is at the center of them. What is most interesting is that the focus, both in terms of risks and solutions, continues to be on both short-term effects and singular events. This myopia is in part because Eurozone member states, in particular Germany, have not offered a long-term solution or plan.
  • The question that needs to be asked is: what do Europeans, and specifically the Germans, plan to do with Europe’s security and political architecture in the long term? The answer to that question cannot be found in the financial databases of Eurostat or the Bank for International Settlements, nor in the coverage of 24-hour investor-news stations.
  •  
    "It has been 2,000 years since Athenian legislators last received the kind of global attention fixed upon them Tuesday. News coverage of the Greek parliament's June 21 confidence vote captivated the global financial sector. The vote was carried live on most global 24-hour investment-news stations and links to live online feeds of the Greek vote were posted across the world wide web. The vote passed, giving Greek Prime Minister George Papandreou the political authority to try to pass further austerity measures mandated by the Eurozone in another vote on June 28."