
Home / Long Game / Group items tagged internet


anonymous

We Are All Hayekians Now: The Internet Generation and Knowledge Problems - 1 views

  • Primarily in his The Use of Knowledge in Society but also in his other contributions to the socialist calculation debate, Hayek crafted a brilliant statement of a perennial problem.
  • In the world of human endeavor, we have two types of problems: economic and technological.
  • Technological problems involve effectively allocating given resources to accomplish a single valuable goal.
  • The choice to build the bridge is a choice between this bridge or that skyscraper as well as any other alternative use of those resources. Each alternative use would have different benefits (and unseen costs).
  • This is not a mere question of engineering the strongest or even the most cost-effective structure to get across the Hudson, this is a question of what is the strongest or most cost-effective possible future version of New York City.
  • “We are building the world’s 20th search engine at a time when most of the others have been abandoned as being commoditized money losers. We’ll strip out all of the ad-supported news and portal features so you won’t be distracted from using the free search stuff.”
  • But, of course, Google survived, prospered, and continues towards its apparent goal of eating the entire internet (while also making cars drive themselves, putting cameras on everyone’s heads, and generally making Steve Ballmer very very angry). So, why did Google win? The answer is, perhaps surprisingly, in Hayek’s theory.
    • anonymous
       
      Very embarrassing videos.
  • “Our goal always has been to index all the world’s data.” Talk about anemic goals, come on Google, show some ambition!
  • So, is this one of Hayek’s technical problems or is this an economic one?
  • Our gut might first tell us that it is technical.
  • Sure, all this data is now hanging out in one place for free, but to make a useful index you need to determine how much people value different data. We need data about the data.
  • In Soviet Russia, failed attempts at arranging resources destroyed the information about the resources. The free market is the best way to figure out how individual people value individual resources. When left to trade voluntarily, people reveal their preferences with their willingness to pay. By arranging resources through coercion you’ve blinded yourself to the emergent value of the resources because you’ve forbidden voluntary arrangement in the economy.
  • This is different on the internet.
  • The data resources are not rivalrous
  • Search used to be really bad. Why? Because search companies were using either (a) content-producer willingness to pay for indexing, (b) mere keyword search or (c) some combination of editorial centralized decision-making to organize lists of sites.
  • These methods only work if you think that the best site about ducks is either (a) the site that has the most money to pay Altavista for prime “duck” listing, (b) the site that has the most “ducks” in its text, or (c) the site that was most appealing to your employees tasked with finding duck sites.
  • If 999 other websites linked to one website about ducks, you can bet that most people think that this site is better at explaining ducks than a site with only one link to it (even if that link was horse-sized).
  • So Google uses the decentralized Hayekian knowledge of the masses to function. Why does this mean we’re all Hayekians?
  • All of the questions of organizing activity on the internet are solved (when they are, in fact, solved successfully) using Hayekian decentralized knowledge.
  • Amazon customer reviews are how we find good products. Ebay feedback is how we find good individual sellers. And, moreover, whole brick and mortar services are moving to a crowd-sourced model, with sites like AirBnB for lodging and RelayRides for car rental.
  • the giant firms of tomorrow will be those that empower people to freely share their knowledge and resources in a vibrant marketplace.
  • Today, the central challenge for a firm is not to develop careful internal management but rather the non-trivial task of building marketplaces and forums to encourage decentralized knowledge production and cooperation.
  • Our generation already understands this on a gut level. We Google everything. We defend freedom on the internet as if it was our own personal real-world liberty at stake. We mock the antiquated central planners of the early web (looking at you, AOL and Prodigy) for their ineffectual obviousness and denial of crowd-sourced knowledge.
  • We all know where the best economic knowledge lies, in the many and never the few.
  •  
    "We are all Hayekians now. Specifically, the "we all" is not quite everyone. The "all" to which I'm referring is people of the internet-people who've grown up with the net and use it for a majority of their day-to-day activities. And, the "Hayekian" to which I'm referring is not his theories on capital, or the rule of law, but, specifically his vision of knowledge."
anonymous

How the internet is making us poor - Quartz - 2 views

  • Sixty percent of the jobs in the US are information-processing jobs, notes Erik Brynjolfsson, co-author of a recent book about this disruption, Race Against the Machine. It’s safe to assume that almost all of these jobs are aided by machines that perform routine tasks. These machines make some workers more productive. They make others less essential.
  • The turn of the new millennium is when the automation of middle-class information processing tasks really got under way, according to an analysis by the Associated Press based on data from the Bureau of Labor Statistics. Between 2000 and 2010, the jobs of 1.1 million secretaries were eliminated, replaced by internet services that made everything from maintaining a calendar to planning trips easier than ever.
  • Economist Andrew McAfee, Brynjolfsson’s co-author, has called these displaced people “routine cognitive workers.” Technology, he says, is now smart enough to automate their often repetitive, programmatic tasks. “We are in a desperate, serious competition with these machines,” concurs Larry Kotlikoff, a professor of economics at Boston University. “It seems like the machines are taking over all possible jobs.”
  • In the early 1800s, nine out of ten Americans worked in agriculture—now it’s around 2%. At its peak, about a third of the US population was employed in manufacturing—now it’s less than 10%. How many decades until the figures are similar for the information-processing tasks that typify rich countries’ post-industrial economies?
  • To see how the internet has disproportionately affected the jobs of people who process information, check out the gray bars dipping below the 0% line on the chart, below. (I’ve adapted this chart to show just the types of employment that lost jobs in the US during the great recession. Every other category continued to add jobs or was nearly flat.)
  • Here’s another clue about what’s been going on in the past ten years. “Return on capital” measures the return firms get when they spend money on capital goods like robots, factories, software—anything aside from people. (If this were a graph of return on people hired, it would be called “Return on labor”.)
  • Notice: the only industry where the return on capital is as great as manufacturing is “other industries”—a grab bag which includes all the service and information industries, as well as entertainment, health care and education. In short, you don’t have to be a tech company for investing in technology to be worthwhile.
  • For many years, the question of whether or not spending on information technology (IT) made companies more productive was highly controversial. Many studies found that IT spending either had no effect on productivity or was even counter-productive. But now a clear trend is emerging. More recent studies show that IT—and the organizational changes that go with it—are doing firms, especially multinationals (pdf), a great deal of good.
  • Winner-take-all and the power of capital to exacerbate inequality
  • One thing all our machines have accomplished, and especially the internet, is the ability to reproduce and distribute good work in record time. Barring market distortions like monopolies, the best software, media, business processes and, increasingly, hardware, can be copied and sold seemingly everywhere at once. This benefits “superstars”—the most skilled engineers or content creators. And it benefits the consumer, who can expect a higher average quality of goods.
  • But it can also exacerbate income inequality, says Brynjolfsson. This contributes to a phenomenon called “skill-biased technological [or technical] change.” “The idea is that technology in the past 30 years has tended to favor more skilled and educated workers versus less educated workers,” says Brynjolfsson. “It has been a complement for more skilled workers. It makes their labor more valuable. But for less skilled workers, it makes them less necessary—especially those who do routine, repetitive tasks.”
  • “Certainly the labor market has never been better for very highly-educated workers in the United States, and when I say never, I mean never,” MIT labor economist David Autor told American Public Media’s Marketplace.
  • The other winners in this scenario are anyone who owns capital.
  • As Paul Krugman wrote, "This is an old concern in economics; it's 'capital-biased technological change', which tends to shift the distribution of income away from workers to the owners of capital."
  • Computers are more disruptive than, say, the looms smashed by the Luddites, because they are “general-purpose technologies,” noted Peter Lindert, an economist at the University of California, Davis.
  • “The spread of computers and the Internet will put jobs in two categories,” said Andreessen. “People who tell computers what to do, and people who are told by computers what to do.” It’s a glib remark—but increasingly true.
  • In March 2012, Amazon acquired Kiva Systems, a warehouse robotics and automation company. In partnership with a company called Quiet Logistics, Kiva’s combination of mobile shelving and robots has already automated a warehouse in Andover, Massachusetts.
  • This time it’s faster. History is littered with technological transitions. Many of them seemed at the time to threaten mass unemployment of one type of worker or another, whether it was buggy whip makers or, more recently, travel agents. But here’s what’s different about information-processing jobs: The takeover by technology is happening much faster.
  • From 2000 to 2007, in the years leading up to the great recession, GDP and productivity in the US grew faster than at any point since the 1960s, but job creation did not keep pace.
  • Brynjolfsson thinks he knows why: More and more people were doing work aided by software. And during the great recession, employment growth didn’t just slow. As we saw above, in both manufacturing and information processing, the economy shed jobs, even as employment in the service sector and professional fields remained flat.
  • Especially in the past ten years, economists have seen a reversal of what they call “the great compression”—that period from the second world war through the 1970s when, in the US at least, more people were crowded into the ranks of the middle class than ever before.
  • There are many reasons why the economy has reversed this “compression,” transforming into an “hourglass economy” with many fewer workers in the middle class and more at either the high or the low end of the income spectrum.
  • The hourglass represents an income distribution that has been more nearly the norm for most of the history of the US. That it’s coming back should worry anyone who believes that a healthy middle class is an inevitable outcome of economic progress, a mainstay of democracy and a healthy society, or a driver of further economic development.
    • anonymous
       
      This is the meaty center. It's what I worry about. The "Middle Class" may just be an anomaly.
  • Indeed, some have argued that as technology aids the gutting of the middle class, it destroys the very market required to sustain it—that we’ll see “less of the type of innovation we associate with Steve Jobs, and more of the type you would find at Goldman Sachs.”
  • So how do we deal with this trend? The possible solutions to the problems of disruption by thinking machines are beyond the scope of this piece. As I’ve mentioned in other pieces published at Quartz, there are plenty of optimists ready to declare that the rise of the machines will ultimately enable higher standards of living, or at least forms of employment as foreign to us as “big data scientist” would be to a scribe of the 17th century.
  • But that’s only as long as you’re one of the ones telling machines what to do, not being told by them. And that will require self-teaching, creativity, entrepreneurialism and other traits that may or may not be latent in children, as well as retraining adults who aspire to middle class living. For now, sadly, your safest bet is to be a technologist and/or own capital, and use all this automation to grab a bigger-than-ever share of a pie that continues to expand.
  •  
    "Everyone knows the story of how robots replaced humans on the factory floor. But in the broader sweep of automation versus labor, a trend with far greater significance for the middle class-in rich countries, at any rate-has been relatively overlooked: the replacement of knowledge workers with software. One reason for the neglect is that this trend is at most thirty years old, and has become apparent in economic data only in perhaps the past ten years. The first all-in-one commercial microprocessor went on sale in 1971, and like all inventions, it took decades for it to become an ecosystem of technologies pervasive and powerful enough to have a measurable impact on the way we work."
anonymous

Information Consumerism: The Price of Hypocrisy - 0 views

  • let us not pass over America’s surveillance addiction in silence. It is real; it has consequences; and the world would do itself a service by sending America to a Big Data rehab. But there’s more to learn from the Snowden affair.
  • It has also busted a number of myths that are only peripherally related to surveillance: myths about the supposed benefits of decentralized and commercially-operated digital infrastructure, about the current state of technologically-mediated geopolitics, about the existence of a separate realm known as “cyberspace.”
  • First of all, many Europeans are finally grasping, to their great dismay, that the word “cloud” in “cloud computing” is just a euphemism for “some dark bunker in Idaho or Utah.”
  • Second, ideas that once looked silly suddenly look wise. Just a few months ago, it was customary to make fun of Iranians, Russians and Chinese who, with their automatic distrust of all things American, spoke the bizarre language of “information sovereignty.”
  • Look who’s laughing now: Iran’s national email system launched a few weeks ago. Granted, the Iranians want their own national email system, in part, so that they can shut it down during protests and spy on their own people at other times. Still, they got the geopolitics exactly right: over-reliance on foreign communications infrastructure is no way to boost one’s sovereignty. If you wouldn’t want another nation to run your postal system, why surrender control over electronic communications?
    • anonymous
       
      This could have been written by StratFor.
  • Third, the sense of unconditional victory that civil society in both Europe and America felt over the defeat of the Total Information Awareness program – a much earlier effort to establish comprehensive surveillance – was premature.
  • The problem with Total Information Awareness was that it was too big, too flashy, too dependent on government bureaucracy. What we got instead, a decade later, is a much nimbler, leaner, more decentralized system, run by the private sector and enabled by a social contract between Silicon Valley and Washington
  • This is today’s America in full splendor: what cannot be accomplished through controversial legislation will be accomplished through privatization, only with far less oversight and public control.
  • From privately-run healthcare providers to privately-run prisons to privately-run militias dispatched to war zones, this is the public-private partnership model on which much of American infrastructure operates these days.
  • Communications is no exception. Decentralization is liberating only if there’s no powerful actor that can rip off the benefits after the network has been put in place.
  • Fourth, the idea that digitization has ushered in a new world, where the good old rules of realpolitik no longer apply, has proved to be bunk. There’s no separate realm that gives rise to a new brand of “digital” power; it’s one world, one power, with America at the helm.
    • anonymous
       
      THIS right here, is crucial.
  • The sheer naivete of statements like this – predicated on the assumption that somehow one can “live” online the way one lives in the physical world and that virtual politics works on a logic different from regular politics – is illustrated by the sad case of Edward Snowden, a man with a noble mission and awful trip-planning skills.
  • Fifth, the once powerful myth that there exists a separate, virtual space where one can have more privacy and independence from social and political institutions is dead.
  • Microsoft’s general counsel wrote that “looking forward, as Internet-based voice and video communications increase, it is clear that governments will have an interest in using (or establishing) legal powers to secure access to this kind of content to investigate crimes or tackle terrorism. We therefore assume that all calls, whether over the Internet or by fixed line or mobile phone, will offer similar levels of privacy and security.”
  • Read this again: here’s a senior Microsoft executive arguing that making new forms of communication less secure is inevitable – and probably a good thing.
  • Convergence did happen – we weren’t fooled! – but, miraculously, technologies converged on the least secure and most wiretap-friendly option available.
  • This has disastrous implications for anyone living in dictatorships. Once Microsoft and its peers start building software that is insecure by design, it turbocharges the already comprehensive spying schemes of authoritarian governments. What neither NSA nor elected officials seem to grasp is that, on matters of digital infrastructure, domestic policy is also foreign policy; it’s futile to address them in isolation.
  • This brings us to the most problematic consequence of Snowden’s revelations. As bad as the situation is for Europeans, it’s the users in authoritarian states who will suffer the most.
  • And not from American surveillance, but from domestic censorship. How so? The already mentioned push towards “information sovereignty” by Russia, China or Iran would involve much more than protecting their citizens from American surveillance. It would also trigger an aggressive push to shift public communication among these citizens – which, to a large extent, still happens on Facebook and Twitter – to domestic equivalents of such services.
  • It’s probably not a coincidence that LiveJournal, Russia’s favorite platform, suddenly had maintenance issues – and was thus unavailable for general use – at the very same time that a Russian court announced its verdict to the popular blogger-activist Alexei Navalny.
  • For all the concerns about Americanization and surveillance, US-based services like Facebook or Twitter still offer better protection for freedom of expression than their Russian, Chinese or Iranian counterparts.
  • This is the real tragedy of America’s “Internet freedom agenda”: it’s going to be the dissidents in China and Iran who will pay for the hypocrisy that drove it from the very beginning.
  • On matters of “Internet freedom” – democracy promotion rebranded under a sexier name – America enjoyed some legitimacy as it claimed that it didn’t engage in the kinds of surveillance that it itself condemned in China or Iran. Likewise, on matters of cyberattacks, it could go after China’s cyber-espionage or Iran’s cyber-attacks because it assured the world that it engaged in neither.
  • Both statements were demonstrably false but lack of specific evidence has allowed America to buy some time and influence.
  • What is to be done? Let’s start with surveillance. So far, most European politicians have reached for the low-hanging fruit – law – thinking that if only they can better regulate American companies – for example, by forcing them to disclose how much data and when they share with NSA – this problem will go away.
  • This is a rather short-sighted, naïve view that reduces a gigantic philosophical problem – the future of privacy – to the seemingly manageable size of data retention directives.
  • Our current predicaments start at the level of ideology, not bad policies or their poor implementation.
  • As our gadgets and previously analog objects become “smart,” this Gmail model will spread everywhere. One set of business models will supply us with gadgets and objects that will either be free or be priced at a fraction of their real cost.
  • In other words, you get your smart toothbrush for free – but, in exchange, you allow it to collect data on how you use the toothbrush.
  • If this is, indeed, the future that we are heading towards, it’s obvious that laws won’t be of much help, as citizens would voluntarily opt for such transactions – the way we already opt for free (but monitorable) email and cheaper (but advertising-funded) ereaders.
  • In short, what is now collected through subpoenas and court orders could be collected entirely through commercial transactions alone.
  • Policymakers who think that laws can stop this commodification of information are deluding themselves. Such commodification is not happening against the wishes of ordinary citizens but because this is what ordinary citizen-consumers want.
  • Look no further than Google’s email and Amazon’s Kindle to see that no one is forced to use them: people do it willingly. Forget laws: it’s only through political activism and a robust intellectual critique of the very ideology of “information consumerism” that underpins such aspirations that we would be able to avert the inevitable disaster.
  • Where could such critique begin? Consider what might, initially, seem like a bizarre parallel: climate change.
  • For much of the 20th century, we assumed that our energy use was priced correctly and that it existed solely in the consumer paradigm of “I can use as much energy as I can pay for.” Under that paradigm, there was no ethics attached to our energy use: market logic has replaced morality – which is precisely what has enabled fast rates of economic growth and the proliferation of consumer devices that have made our households electronic paradises free from tiresome household work.
  • But as we have discovered in the last decade, such thinking rested on a powerful illusion that our energy use was priced correctly – that we in fact paid our fair share.
  • But of course we had never priced our energy use correctly because we never factored in the possibility that life on Earth might end even if we balance all of our financial statements.
  • The point is that, partly due to successful campaigns by the environmental movement, a set of purely rational, market-based decisions have suddenly acquired political latency, which has given us differently designed cars, lights that go off if no one is in the room, and so forth.
  • It has also produced citizens who – at least in theory – are encouraged to think of implications that extend far beyond the ability to pay their electricity bill.
  • Right now, your decision to buy a smart toothbrush with a sensor in it – and then to sell the data that it generates – is presented to us as just a purely commercial decision that affects no one but us.
  • But this is so only because we cannot imagine an information disaster as easily as we can imagine an environmental disaster.
  • there are profound political and moral consequences to information consumerism– and they are comparable to energy consumerism in scope and importance.
  • We should do our best to suspend the seeming economic normalcy of information sharing. An attitude of “just business!” will no longer suffice. Information sharing might have a vibrant market around it but it has no ethical framework to back it up.
  • NSA surveillance, Big Brother, Prism: all of this is important stuff. But it’s as important to focus on the bigger picture -- and in that bigger picture, what must be subjected to scrutiny is information consumerism itself – and not just the parts of the military-industrial complex responsible for surveillance.
  • As long as we have no good explanation as to why a piece of data shouldn’t be on the market, we should forget about protecting it from the NSA, for, even with tighter regulation, intelligence agencies would simply buy – on the open market – what today they secretly get from programs like Prism.
  • Some might say: If only we could have a digital party modeled on the Green Party but for all things digital. A greater mistake is harder to come by.
  • What we need is the mainstreaming of “digital” topics – not their ghettoization in the hands and agendas of the Pirate Parties or whoever will come to succeed them. We can no longer treat the “Internet” as just another domain – like, say, “the economy” or the “environment” – and hope that we can develop a set of competencies around it.
  • Forget an ambiguous goal like “Internet freedom” – it’s an illusion and it’s not worth pursuing. What we must focus on is creating environments where actual freedom can still be nurtured and preserved.
  • The Pirates’ tragic miscalculation was trying to do too much: they wanted to change both the process of politics and its content. That project was so ambitious that it was doomed to failure from the very beginning.
  • whatever reforms the Pirates have been advancing did not seem to stem from some long critical reflections of the pitfalls of the current political system but, rather, from their belief that the political system, incompatible with the most successful digital platforms from Wikipedia to Facebook, must be reshaped in their image. This was – and is – nonsense.
  • A parliament is, in fact, different from Wikipedia – but the success of the latter tells us absolutely nothing about the viability of the Wikipedia model as a template for remodeling our political institutions
  • In as much as the Snowden affair has forced us to confront these issues, it’s been a good thing for democracy. Let’s face it: most of us would rather not think about the ethical implications of smart toothbrushes or the hypocrisy involved in Western rhetoric towards Iran or the genuflection that more and more European leaders show in front of Silicon Valley and its awful, brain-damaging language, the Siliconese.
  • The least we can do is to acknowledge that the crisis is much deeper and that it stems from intellectual causes as much as from legal ones. Information consumerism, like its older sibling energy consumerism, is a much more dangerous threat to democracy than the NSA.
  •  
    "The problem with the sick, obsessive superpower revealed to us by Edward Snowden is that it cannot bring itself to utter the one line it absolutely must utter before it can move on: "My name is America and I'm a dataholic.""
anonymous

DNA/How to Stop Worrying and Learn to Love the Internet - 0 views

  • I suppose earlier generations had to sit through all this huffing and puffing with the invention of television, the phone, cinema, radio, the car, the bicycle, printing, the wheel and so on, but you would think we would learn the way these things work, which is this: 1) everything that’s already in the world when you’re born is just normal; 2) anything that gets invented between then and before you turn thirty is incredibly exciting and creative and with any luck you can make a career out of it; 3) anything that gets invented after you’re thirty is against the natural order of things and the beginning of the end of civilisation as we know it until it’s been around for about ten years when it gradually turns out to be alright really.
  • Because the Internet is so new we still don’t really understand what it is. We mistake it for a type of publishing or broadcasting, because that’s what we’re used to. So people complain that there’s a lot of rubbish online, or that it’s dominated by Americans, or that you can’t necessarily trust what you read on the web.
  • ‘carved in stone.’
    • anonymous
       
      Add: You can carve lies in stone.
  • Another problem with the net is that it’s still ‘technology’, and ‘technology’, as the computer scientist Bran Ferren memorably defined it, is ‘stuff that doesn’t work yet.’
  • In ‘The Language Instinct’, Steven Pinker explains the generational difference between pidgin and creole languages. A pidgin language is what you get when you put together a bunch of people – typically slaves – who have already grown up with their own language but don’t know each other’s. They manage to cobble together a rough and ready lingo made up of bits of each. It lets them get on with things, but has almost no grammatical structure at all. However, the first generation of children born to the community takes these fractured lumps of language and transforms them into something new, with a rich and organic grammar and vocabulary, which is what we call a Creole. Grammar is just a natural function of children’s brains, and they apply it to whatever they find.
  • We are natural villagers. For most of mankind’s history we have lived in very small communities in which we knew everybody and everybody knew us. But gradually there grew to be far too many of us, and our communities became too large and disparate for us to be able to feel a part of them, and our technologies were unequal to the task of drawing us together. But that is changing.
  •  
    "...the change is real. I don't think anybody would argue now that the Internet isn't becoming a major factor in our lives. However, it's very new to us. Newsreaders still feel it is worth a special and rather worrying mention if, for instance, a crime was planned by people 'over the Internet.' They don't bother to mention when criminals use the telephone or the M4, or discuss their dastardly plans 'over a cup of tea,' though each of these was new and controversial in their day." By Douglas Adams at The Sunday Times on August 29, 1999.
anonymous

Know Your Meme - 0 views

  •  
    Part of the Internet Meme Database. "Documenting Internet phenomena: viral videos, image macros, catchphrases, web celebs and more."
anonymous

Keeping Terrorism in Perspective - 0 views

  • By design, terrorist attacks are intended to have a psychological impact far outweighing the physical damage the attack causes. As their name suggests, they are meant to cause terror that amplifies the actual attack. A target population responding to a terrorist attack with panic and hysteria allows the perpetrators to obtain a maximum return on their physical effort.
  • One way to mitigate the psychological impact of terrorism is to remove the mystique and hype associated with it. The first step in this demystification is recognizing that terrorism is a tactic used by a variety of actors and that it will not go away, something we discussed at length in our first analysis in this series.
  • Another way to mitigate the impact of terrorism is recognizing that those who conduct terrorist attacks are not some kind of Hollywood superninja commandos who can conjure attacks out of thin air. Terrorist attacks follow a discernable, predictable planning process that can be detected if it is looked for.
  • A third important component in the demystification process is recognizing and resisting the terror magnifiers terrorist planners use in their efforts to maximize the impact of their attacks.
  • let's first examine the objective of terrorist planners.
  • In the late 1960s and early 1970s, modern terrorist organizations began to conduct operations designed to serve as terrorist theater, an undertaking greatly aided by the advent and spread of broadcast media.
  • Today, the proliferation of 24-hour television news networks and Internet news sites magnifies such media exposure.
  • Such theatrical attacks exert a strange hold over the human imagination. The sense of terror they create can dwarf the reaction to natural disasters many times greater in magnitude. For example, more than 227,000 people died in the 2004 Indian Ocean tsunami compared to fewer than 3,000 people on 9/11. Yet the 9/11 attacks spawned a global sense of terror and a geopolitical reaction that had a profound and unparalleled impact upon world events over the past decade.
  • As noted, the media magnifies this anxiety and terror. Television news, whether broadcast on the airwaves or over the Internet, allows people to experience a terrorist event remotely and vicariously, and the print media reinforces this. While part of this magnification results merely from the nature of television as a medium and the 24-hour news cycle, bad reporting and misunderstanding can build hype and terror.
  • The traditional news media are not alone in the role of terror magnifier. The Internet has become an increasingly effective conduit for panic and alarm. From hysterical (and false) claims in 2005 that al Qaeda had pre-positioned nuclear weapons in the United States and was preparing to attack nine U.S. cities and kill 4 million Americans in operation "American Hiroshima" to 2010 claims that Mexican drug cartels were smuggling nuclear weapons into the United States for Osama bin Laden, a great deal of fearmongering can spread rapidly over the Internet.
  • Website operators who earn advertising revenue based on the number of unique site visitors have an obvious financial incentive to publish outlandish and startling terrorism stories.
  • Sometimes even governments act as terror magnifiers. Certainly, in the early 2000s the media and the American public became fearful every time the U.S. Department of Homeland Security (DHS) raised its color-coded threat level. Politicians' statements also can scare people. Such was the case in 2007 when DHS secretary Michael Chertoff said his gut screamed that a major terrorist attack was imminent and in 2010 when the head of French internal intelligence noted that the threat of terrorism in France was never higher.
  • The world is a dangerous place. Everyone is going to die, and some people are certain to die in a manner that is brutal or painful. Recognizing that terrorist attacks, like car crashes and cancer and natural disasters, are part of the human condition permits people to take prudent, measured actions to prepare for such contingencies and avoid becoming victims (vicarious or otherwise). It is the resilience of the population and their perseverance that determine how much a terrorist attack is allowed to terrorize. By separating terror from terrorism, citizens can deny the practitioners of terror the ability to magnify their reach and power.
  •  
    "As we conclude our series on the fundamentals of terrorism, it is only fitting that we do so with a discussion of the importance of keeping terrorism in perspective."
anonymous

Gapminder Desktop: Explore the World of Data from your own Computer - 0 views

  •  
    "To overcome the online requirement, Gapminder Desktop [gapminder.org has recently been released for all operating systems. Based on Adobe AIR technology, this "No Internet Required" software allows people to explore the same data from their own computer, even when there is no Internet connectivity available. In particular, Gapminder Desktop is aimed to teachers and students to bookmark and present global trends in all sort of situations. It comes preloaded with 600+ indicators on health, environment, economy, education, poverty, technology, and so on."
anonymous

Why I spoofed science journalism, and how to fix it - 0 views

  • The formula I outlined – using a few randomly picked BBC science articles as a guide – isn't necessarily an example of bad journalism; but science reporting is predictable enough that you can write a formula for it that everyone recognises, and once the formula has been seen it's very hard to un-see, like a faint watermark at the edge of your vision.
  • A science journalist should be capable of, at a minimum, reading a scientific paper and being able to venture a decent opinion.
  • If you are not actually providing any analysis, if you're not effectively 'taking a side', then you are just a messenger, a middleman, a megaphone with ears. If that's your idea of journalism, then my RSS reader is a journalist.
  • thanks to the BBC's multi-platform publishing guidelines, the first few paragraphs of any news story need to be written in such a way that they can be cut and pasted into a Ceefax page.
  • Another issue affecting style is the need to reach a diverse audience. This puts pressure on commercial media groups who need to secure page views to generate advertising revenue
  • As a writer, word limits are both a blessing and a curse. Many bloggers would have their writing immeasurably improved if they stuck to a word limit – doing that forces you to plan, to organise your thoughts, and to avoid redundancy and repetition. On the other hand, some stories need more time to tell, and sticking dogmatically to an arbitrary 800-word limit for stuff that's published on the internet doesn't make a lot of sense. The internet is not running out of space.
  • Science is all about process, context and community, but reporting concentrates on single people, projects and events.
  • Hundreds of interesting things happen in science every week, and yet journalists from all over the media seem driven by a herd mentality that ensures only a handful of stories are covered. And they're not even the most interesting stories in many cases.
  • Members of the public could be forgiven for believing that science involves occasional discoveries interspersed with long periods of 'not very much happening right now'. The reality of science is almost the complete opposite of this.
  • One of the biggest failures of science reporting is the media's belief that a scientific paper or research finding represents a conclusion of some kind. Scientists know that this simply isn't true. A new paper is the start or continuance of a discussion or debate that will often rumble on for years or even decades.
  •  
    What's wrong with science journalism, and how do we fix it? By Martin Robbins at The Guardian on October 5, 2010.
anonymous

Clive Thompson on How Tweets and Texts Nurture In-Depth Analysis - 0 views

  • The long take is the opposite: It’s a deeply considered report and analysis, and it often takes weeks, months, or years to produce. It used to be that only traditional media, like magazines or documentaries or books, delivered the long take. But now, some of the most in-depth stuff I read comes from academics or businesspeople penning big blog essays, Dexter fans writing 5,000-word exegeses of the show, and nonprofits like the Pew Charitable Trusts producing exhaustively researched reports on American life.
  • The real loser here is the middle take.
  • This is what the weeklies like Time and Newsweek have historically offered: reportage and essays produced a few days after major events, with a bit of analysis sprinkled on top. They’re neither fast enough to be conversational nor slow enough to be truly deep. The Internet has essentially demonstrated how unsatisfying that sort of thinking can be.
  •  
    "We're often told that the Internet has destroyed people's patience for long, well-thought-out arguments. After all, the ascendant discussions of our day are text messages, tweets, and status updates. The popularity of this endless fire hose of teensy utterances means we've lost our appetite for consuming-and creating-slower, reasoned contemplation. Right? I'm not so sure. In fact, I think something much more complex and interesting is happening: The torrent of short-form thinking is actually a catalyst for more long-form meditation."
anonymous

5 Myths About the Chinese Communist Party - 0 views

  • "China Is Communist in Name Only." Wrong. If Vladimir Lenin were reincarnated in 21st-century Beijing and managed to avert his eyes from the city's glittering skyscrapers and conspicuous consumption, he would instantly recognize in the ruling Chinese Communist Party a replica of the system he designed nearly a century ago for the victors of the Bolshevik Revolution. One need only look at the party's structure to see how communist -- and Leninist -- China's political system remains.
  • As in the Soviet Union, the party controls the media through its Propaganda Department, which issues daily directives, both formally on paper and in emails and text messages, and informally over the phone, to the media. The directives set out, often in detail, how news considered sensitive by the party -- such as the awarding of the Nobel Peace Prize to Liu Xiaobo -- should be handled or whether it should be run at all.
  • Perhaps most importantly, the party dictates all senior personnel appointments in ministries and companies, universities and the media, through a shadowy and little-known body called the Organization Department. Through the department, the party oversees just about every significant position in every field in the country. Clearly, the Chinese remember Stalin's dictate that the cadres decide everything.
  • "The Party Controls All Aspects of Life in China." Not anymore. No question, China was a totalitarian state under Mao Zedong's rule from 1949 until his death in 1976. In those bad old days, ordinary workers had to ask their supervisors' permission not only to get married, but to move in with their spouses. Even the precise timing for starting a family relied on a nod from on high.
  • "The Internet Will Topple the Party." Nope. Bill Clinton famously remarked a decade ago that the efforts of Chinese leaders to control the Internet were doomed, akin to "nailing Jell-O to a wall." It turns out the former president was right, but not in the way he thought. Far from being a conveyor belt for Western democratic values, the Internet in China has largely done the opposite. The "Great Firewall" works well in keeping out or at least filtering Western ideas. Behind the firewall, however, hypernationalist netizens have a much freer hand.
  • "Other Countries Want to Follow the China Model." Good Luck. Of course, many developing countries are envious of China's rise. Which poor country wouldn't want three decades of 10 percent annual growth? And which despot wouldn't want 10 percent growth and an assurance that he or she would meanwhile stay in power for the long haul? China undoubtedly has important lessons to teach other countries about how to manage development, from fine-tuning reforms by testing them in different parts of the country to managing urbanization so that large cities are not overrun by slums and shantytowns.
  • "The Party Can't Rule Forever." Yes it can. Or at least for the foreseeable future. Unlike in Taiwan and South Korea, China's middle class has not emerged with any clear demand for Western-style democracy. There are some obvious reasons why. All three of China's close Asian neighbors, including Japan, became democracies at different times and in different circumstances. But all were effectively U.S. protectorates, and Washington was crucial in forcing through democratic change or institutionalizing it.
  •  
    "5 Myths About the Chinese Communist Party" - an interesting look at some assumptions that Westerners tend to make, and how they are classically wrong (like so many things we take a magnifying glass to).
anonymous

Cyber Command: We Don't Wanna Defend the Internet (We Just Might Have To) - 0 views

  • Members of the military’s new Cyber Command insist that they’ve got no interest in taking over civilian Internet security – or even in becoming the Pentagon’s primary information protectors. But the push to intertwine military and civilian network defenses is gaining momentum, nevertheless. At a gathering this week of top cybersecurity officials and defense contractors, the Pentagon’s number two floated the idea that the Defense Department might start a protective program for civilian networks, based on a deeply controversial effort to keep hackers out of the government’s pipes.
  • Privacy rights organizations and military insiders also wonder whether CYBERCOM is just another way to extend the NSA’s reach. After all, both organizations are headquartered at Ft. Meade. And both are headed by Gen. Keith Alexander. The CYBERCOM official swears that won’t happen. “It’s not NSA taking over military cyber,” he said. “And it’s not military cyber taking over NSA.”
  •  
    By Noah Shachtman at Danger Room (Wired.com) on May 28, 2010. Thanks to http://alexkessinger.posterous.com/cyber-command-we-dont-wanna-defend-the-intern-2
anonymous

USENIX 2011 Keynote: Network Security in the Medium Term, 2061-2561 AD - 1 views

  • if we should meet up in 2061, much less in the 26th century, you’re welcome to rib me about this talk. Because I’ll be happy to still be alive to rib.
  • The question I’m going to spin entertaining lies around is this: what is network security going to be about once we get past the current sigmoid curve of accelerating progress and into a steady state, when Moore’s first law is long since burned out, and networked computing appliances have been around for as long as steam engines?
  • a few basic assumptions about the future
  • it’s not immediately obvious that I can say anything useful about a civilization run by beings vastly more intelligent than us. I’d be like an australopithecine trying to visualize daytime cable TV.
  • The idea of an AI singularity
  • the whole idea of artificial general intelligence strikes me as being as questionable as 19th century fantasies about steam-powered tin men.
  • if you start trying to visualize a coherent future that includes aliens, telepathy, faster than light travel, or time machines, your futurology is going to rapidly run off the road and go crashing around in the blank bits of the map that say HERE BE DRAGONS.
  • at least one barkingly implausible innovation will come along between now and 2061 and turn everything we do upside down
  • My crystal ball is currently predicting that base load electricity will come from a mix of advanced nuclear fission reactor designs and predictable renewables such as tidal and hydroelectric power.
  • We are, I think, going to have molecular nanotechnology and atomic scale integrated circuitry.
  • engineered solutions that work a bit like biological systems
  • Mature nanotechnology is going to resemble organic life forms the way a Boeing 737 resembles thirty tons of seagull biomass.
  • without a technological civilization questions of network security take second place to where to get a new flint arrowhead.
  • if we’re still alive in the 26th century you’re welcome to remind me of what I got wrong in this talk.
  • we’re living through the early days of a revolution in genomics and biology
  • We haven’t yet managed to raise the upper limit on human life expectancy (it’s currently around 120 years), but an increasing number of us are going to get close to it.
  • it’s quite likely that within another century the mechanisms underlying cellular senescence will be understood and treatable like other inborn errors of metabolism
  • another prediction: something outwardly resembling democracy everywhere.
  • Since 1911, democratic government by a republic has gone from being an eccentric minority practice to the default system of government world-wide
  • Democracy is a lousy form of government in some respects – it is particularly bad at long-term planning, for no event that lies beyond the electoral event horizon can compel a politician to pay attention to it
  • but it has two gigantic benefits: it handles transfers of power peacefully, and provides a pressure relief valve for internal social dissent.
  • there are problems
  • In general, democratically elected politicians are forced to focus on short-term solutions to long-term problems because their performance is evaluated by elections held on a time scale of single-digit years
  • Democratic systems are prone to capture by special interest groups that exploit the information asymmetry that’s endemic in complex societies
  • The adversarial two-party model is a very bad tool for generating consensus on how to tackle difficult problems with no precedents
  • Finally, representative democracy scales up badly
  • Nor are governments as important as they used to be.
  • the US government, the largest superpower on the block right now, is tightly constrained by the international trade system it promoted in the wake of the second world war.
  • we have democratic forms of government, without the transparency and accountability.
  • At least, until we invent something better – which I expect will become an urgent priority before the end of the century.
  • The good news is, we’re a lot richer than our ancestors. Relative decline is not tragic in a positive-sum world.
  • Assuming that they survive the obstacles on the road to development, this process is going to end fairly predictably: both India and China will eventually converge with a developed world standard of living, while undergoing the demographic transition to stable or slowly declining populations that appears to be an inevitable correlate of development.
  • a quiet economic revolution is sweeping Africa
  • In 2006, for the first time, more than half of the planet’s human population lived in cities. And by 2061 I expect more than half of the planet’s human population will live in conditions that correspond to the middle class citizens of developed nations.
  • by 2061 we or our children are going to be living on an urban middle-class planet, with a globalized economic and financial infrastructure recognizably descended from today’s system, and governments that at least try to pay lip service to democratic norms.
  • And let me say, before I do, that the picture I just painted – of the world circa 2061, which is to say of the starting point from which the world of 2561 will evolve – is bunk.
  • It’s a normative projection
  • I’m pretty certain that something utterly unexpected will come along and up-end all these projections – something as weird as the world wide web would have looked in 1961.
  • And while the outer forms of that comfortable, middle-class urban developed-world planetary experience might look familiar to us, the internal architecture will be unbelievably different.
  • Let’s imagine that, circa 1961 – just fifty years ago – a budding Nikola Tesla or Bill Packard somewhere in big-city USA is tinkering in his garage and succeeds in building a time machine. Being adventurous – but not too adventurous – he sets the controls for fifty years in the future, and arrives in downtown San Francisco. What will he see, and how will he interpret it?
  • a lot of the buildings are going to be familiar
  • Automobiles are automobiles, even if the ones he sees look kind of melted
  • Fashion? Hats are out, clothing has mutated in strange directions
  • He may be thrown by the number of pedestrians walking around with wires in their ears, or holding these cigarette-pack-sized boxes with glowing screens.
  • But there seem to be an awful lot of mad people walking around with bits of plastic clipped to their ears, talking to themselves
  • The outward shape of the future contains the present and the past, embedded within it like flies in amber.
  • Our visitor from 1961 is familiar with cars and clothes and buildings
  • But he hasn’t heard of packet switched networks
  • Our time traveller from 1961 has a steep learning curve if he wants to understand the technology the folks with the cordless headsets are using.
  • The social consequences of a new technology are almost always impossible to guess in advance.
  • Let me take mobile phones as an example. They let people talk to one another – that much is obvious. What is less obvious is that for the first time the telephone network connects people, not places
  • For example, we’re currently raising the first generation of kids who won’t know what it means to be lost – everywhere they go, they have GPS service and a moving map that will helpfully show them how to get wherever they want to go.
  • to our time traveller from 1961, it’s magic: you have a little glowing box, and if you tell it “I want to visit my cousin Bill, wherever he is,” a taxi will pull up and take you to Bill’s house
  • The whole question of whether a mature technosphere needs three or four billion full-time employees is an open one, as is the question of what we’re all going to do if it turns out that the future can’t deliver jobs.
  • We’re still in the first decade of mass mobile internet uptake, and we still haven’t seen what it really means when the internet becomes a pervasive part of our social environment, rather than something we have to specifically sit down and plug ourselves in to, usually at a desk.
  • So let me start by trying to predict the mobile internet of 2061.
  • the shape of the future depends on whether whoever provides the basic service of communication funds their service by charging for bandwidth or charging for a fixed infrastructure cost.
  • These two models for pricing imply very different network topologies.
  • This leaves aside a third model, that of peer to peer mesh networks with no actual cellcos as such – just lots of folks with cheap routers. I’m going to provisionally assume that this one is hopelessly utopian
  • the security problems of a home-brew mesh network are enormous and gnarly; when any enterprising gang of scammers can set up a public router, who can you trust?
  • Let’s hypothesize a very high density, non-volatile serial storage medium that might be manufactured using molecular nanotechnology: I call it memory diamond.
  • wireless bandwidth appears to be constrained fundamentally by the transparency of air to electromagnetic radiation. I’ve seen some estimates that we may be able to punch as much as 2 tb/sec through air; then we run into problems.
  • What can you do with 2 terabits per second per human being on the planet?
  • One thing you can do trivially with that kind of capacity is full lifelogging for everyone. Lifelogging today is in its infancy, but it’s going to be a major disruptive technology within two decades.
  • the resulting search technology essentially gives you a prosthetic memory.
  • Lifelogging offers the promise of indexing and retrieving the unwritten and undocumented. And this is both a huge promise and an enormous threat.
  • Lifelogging raises huge privacy concerns, of course.
  • The security implications are monstrous: if you rely on lifelogging for your memory or your ability to do your job, then the importance of security is pushed down Maslow’s hierarchy of needs.
  • if done right, widespread lifelogging to cloud based storage would have immense advantages for combating crime and preventing identity theft.
  • whether lifelogging becomes a big social issue depends partly on the nature of our pricing model for bandwidth, and how we hammer out the security issues surrounding the idea of our sensory inputs being logged for posterity.
  • at least until the self-driving automobile matches and then exceeds human driver safety.
  • We’re currently living through a period in genomics research that is roughly equivalent to the early 1960s in computing.
  • In particular, there’s a huge boom in new technologies for high speed gene sequencing.
  • full genome sequencing for individuals now available for around US $30,000, and expected to drop to around $1000–3000 within a couple of years.
  • Each of us is carrying around a cargo of 1–3 kilograms of bacteria and other unicellular organisms, which collectively outnumber the cells of our own bodies by a thousand to one.
  • These are for the most part commensal organisms – they live in our guts and predigest our food, or on our skin – and they play a significant role in the functioning of our immune system.
  • Only the rapid development of DNA assays for SARS – it was sequenced within 48 hours of its identification as a new pathogenic virus – made it possible to build and enforce the strict quarantine regime that saved us from somewhere between two hundred million and a billion deaths.
  • A second crisis we face is that of cancer
  • we can expect eventually to see home genome monitoring – both looking for indicators of precancerous conditions or immune disorders within our bodies, and performing metagenomic analysis on our environment.
  • If our metagenomic environment is routinely included in lifelogs, we have the holy grail of epidemiology within reach; the ability to exhaustively track the spread of pathogens and identify how they adapt to their host environment, right down to the level of individual victims.
  • In each of these three examples of situations where personal privacy may be invaded, there exists a strong argument for doing so in the name of the common good – for prevention of epidemics, for prevention of crime, and for prevention of traffic accidents. They differ fundamentally from the currently familiar arguments for invasion of our data privacy by law enforcement – for example, to read our email or to look for evidence of copyright violation. Reading our email involves our public and private speech, and looking for warez involves our public and private assertion of intellectual property rights …. but eavesdropping on our metagenomic environment and our sensory environment impinges directly on the very core of our identities.
  • With lifelogging and other forms of ubiquitous computing mediated by wireless broadband, securing our personal data will become as important to individuals as securing our physical bodies.
  • the shifting sands of software obsolescence have for the most part buried our ancient learning mistakes.
  • So, to summarize: we’re moving towards an age where we may have enough bandwidth to capture pretty much the totality of a human lifespan, everything except for what’s going on inside our skulls.
  •  
    "Good afternoon, and thank you for inviting me to speak at USENIX Security." A fun read by Charlie Stoss."
  •  
    I feel like cancer may be a bit played up. I freak out more about dementia.
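  •  
    A rough back-of-envelope sketch (mine, not from the talk; every bitrate below is an illustrative assumption) of how much of that 2 Tb/s per-person budget continuous audiovisual lifelogging would actually use, and how much raw storage a lifetime of it would need:

        # Back-of-envelope: share of an assumed 2 Tb/s per-person wireless budget
        # consumed by continuous lifelogging, plus raw storage over an 80-year life.
        # Every bitrate here is an illustrative assumption, not a figure from the talk.

        TBPS = 2e12  # assumed per-person ceiling: 2 terabits per second

        streams_bps = {
            "two-camera 1080p video": 2 * 8e6,  # ~8 Mb/s per camera (assumed)
            "audio": 256e3,                     # ~256 kb/s (assumed)
            "biometrics + location": 50e3,      # heart rate, GPS, etc. (assumed)
        }

        total_bps = sum(streams_bps.values())
        print(f"lifelog uplink: {total_bps / 1e6:.1f} Mb/s "
              f"({total_bps / TBPS:.6%} of the assumed ceiling)")

        # Raw storage for 80 years of round-the-clock logging:
        seconds = 80 * 365.25 * 24 * 3600
        petabytes = total_bps * seconds / 8 / 1e15
        print(f"~{petabytes:.0f} PB per lifetime before compression")

    Even with generous stream rates, the uplink is a rounding error against the 2 Tb/s estimate, and a whole lifetime fits in a handful of petabytes of raw storage, which is why full lifelogging for everyone looks trivial at that bandwidth.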
anonymous

Jaron Lanier: The Internet destroyed the middle class - 2 views

  • His book continues his war on digital utopianism and his assertion of humanist and individualistic values in a hive-mind world. But Lanier still sees potential in digital technology: He just wants it reoriented away from its main role so far, which involves “spying” on citizens, creating a winner-take-all society, eroding professions and, in exchange, throwing bonbons to the crowd.
  • This week sees the publication of “Who Owns the Future?,” which digs into technology, economics and culture in unconventional ways.
  • Much of the book looks at the way Internet technology threatens to destroy the middle class by first eroding employment and job security, along with various “levees” that give the economic middle stability.
  • ...55 more annotations...
  • “Here’s a current example of the challenge we face,” he writes in the book’s prelude: “At the height of its power, the photography company Kodak employed more than 140,000 people and was worth $28 billion. They even invented the first digital camera. But today Kodak is bankrupt, and the new face of digital photography has become Instagram. When Instagram was sold to Facebook for a billion dollars in 2012, it employed only 13 people. Where did all those jobs disappear? And what happened to the wealth that all those middle-class jobs created?”
  • But more important than Lanier’s hopes for a cure is his diagnosis of the digital disease. Eccentric as it is, “Future” is one of the best skeptical books about the online world, alongside Nicholas Carr’s “The Shallows,” Robert Levine’s “Free Ride” and Lanier’s own “You Are Not a Gadget.”
  • One is that the number of people who are contributing to the system to make it viable is probably the same.
  • And furthermore, many people kind of have to use social networks for them to be functional besides being valuable.
  • So there’s still a lot of human effort, but the difference is that whereas before when people made contributions to the system that they used, they received formal benefits, which means not only salary but pensions and certain kinds of social safety nets. Now, instead, they receive benefits on an informal basis. And what an informal economy is like is the economy in a developing country slum. It’s reputation, it’s barter, it’s that kind of stuff.
  • Yeah, and I remember there was this fascination with the idea of the informal economy about 10 years ago. Stewart Brand was talking about how brilliant it is that people get by in slums on an informal economy. He’s a friend so I don’t want to rag on him too much. But he was talking about how wonderful it is to live in an informal economy and how beautiful trust is and all that.
  • And you know, that’s all kind of true when you’re young and if you’re not sick, but if you look at the infant mortality rate and the life expectancy and the education of the people who live in those slums, you really see what the benefit of the formal economy is if you’re a person in the West, in the developed world.
  • So Kodak has 140,000 really good middle-class employees, and Instagram has 13 employees, period. You have this intense concentration of the formal benefits, and that winner-take-all feeling is not just for the people who are on the computers but also from the people who are using them. So there’s this tiny token number of people who will get by from using YouTube or Kickstarter, and everybody else lives on hope. There’s not a middle-class hump. It’s an all-or-nothing society.
  • the person who lost his job at Kodak still has to pay rent with old-fashioned money he or she is no longer earning. He can’t pay his rent with cultural capital that’s replaced it.
  • The informal way of getting by doesn’t tide you over when you’re sick and it doesn’t let you raise kids and it doesn’t let you grow old. It’s not biologically real.
  • If we go back to the 19th century, photography was kind of born as a labor-saving device, although we don’t think of it that way.
  • And then, you know, along a similar vein at that time early audio recordings, which today would sound horrible to us, were indistinguishable from real music to people who did double blind tests and whatnot.
  • So in the beginning photography was kind of a labor saving device. And whenever you have a technological advance that’s less hassle than the previous thing, there’s still a choice to make. And the choice is, do you still get paid for doing the thing that’s easier?
  • And so you could make the argument that a transition to cars should create a world where drivers don’t get paid, because, after all, it’s fun to drive.
  • We kind of made a bargain, a social contract, in the 20th century that even if jobs were pleasant people could still get paid for them. Because otherwise we would have had a massive unemployment. And so to my mind, the right question to ask is, why are we abandoning that bargain that worked so well?
    • anonymous
       
      I think that's a worthy question considering the high-speed with which we adopt every possible technology; to hell with foresight.
  • Of course jobs become obsolete. But the only reason that new jobs were created was because there was a social contract in which a more pleasant, less boring job was still considered a job that you could be paid for. That’s the only reason it worked. If we decided that driving was such an easy thing [compared to] dealing with horses that no one should be paid for it, then there wouldn’t be all of those people being paid to be Teamsters or to drive cabs. It was a decision that it was OK to have jobs that weren’t terrible.
  • I mean, the whole idea of a job is entirely a social construct. The United States was built on slave labor. Those people didn’t have jobs, they were just slaves. The idea of a job is that you can participate in a formal economy even if you’re not a baron. That there can be, that everybody can participate in the formal economy and the benefit of having everybody participate in the formal economy, there are annoyances with the formal economy because capitalism is really annoying sometimes.
  • But the benefits are really huge, which is you get a middle-class distribution of wealth and clout so the mass of people can outspend the top, and if you don’t have that you can’t really have democracy. Democracy is destabilized if there isn’t a broad distribution of wealth.
  • And then the other thing is that if you like market capitalism, if you’re an Ayn Rand person, you have to admit that markets can only function if there are customers and customers can only come if there’s a middle hump. So you have to have a broad distribution of wealth.
    • anonymous
       
      Ha ha. Ayn Rand people don't have to admit to *anything,* trust me, dude.
  • It was all a social construct to begin with, so what changed, to get to your question, is that at the turn of the [21st] century it was really Sergey Brin at Google who just had the thought of, well, if we give away all the information services, but we make money from advertising, we can make information free and still have capitalism.
  • But the problem with that is it reneges on the social contract where people still participate in the formal economy. And it’s a kind of capitalism that’s totally self-defeating because it’s so narrow. It’s a winner-take-all capitalism that’s not sustaining.
    • anonymous
       
      This makes me curious. Is he arguing that there are fewer *nodes* because the information access closes them?
  • You argue that the middle class, unlike the rich and the poor, is not a natural class but was built and sustained through some kind of intervention.
    • anonymous
       
      My understanding was that the U.S. heads of business got the nod to go ahead and start manufacturing things *other* than weapons, because our industrial capabilities weren't annihilated relative to so many others.
  • There’s always academic tenure, or a taxi medallion, or a cosmetology license, or a pension. There’s often some kind of license or some kind of ratcheting scheme that allows people to keep their middle-class status.
  • In a raw kind of capitalism there tend to be unstable events that wipe away the middle and tend to separate people into rich and poor. So these mechanisms are undone by a particular kind of style that is called the digital open network.
  • Music is a great example where value is copied. And so once you have it, again it’s this winner-take-all thing where the people who really win are the people who run the biggest computers. And a few tokens, an incredibly tiny number of token people who will get very successful YouTube videos, and everybody else lives on hope or lives with their parents or something.
  • I guess all orthodoxies are built on lies. But there’s this idea that there must be tens of thousands of people who are making a great living as freelance musicians because you can market yourself on social media.
  • And whenever I look for these people – I mean when I wrote “Gadget” I looked around and found a handful – and at this point three years later, I went around to everybody I could to get actual lists of people who are doing this and to verify them, and there are more now. But like in the hip-hop world I counted them all and I could find about 50. And I really talked to everybody I could. The reason I mention hip-hop is because that’s where it happens the most right now.
  • The interesting thing about it is that people advertise, “Oh, what an incredible life. She’s this incredibly lucky person who’s worked really hard.” And that’s all true. She’s in her 20s, and it’s great that she’s found this success, but what this success is, is that she makes maybe $250,000 a year, and she rents a house that’s worth $1.1 million in L.A. And this is all breathlessly reported as this great success.
  • And that’s good for a 20-year-old, but she’s at the very top of, I mean, the people at the very top of the game now and doing as well as what used to be considered good for a middle-class life.
    • anonymous
       
      Quite true. She's obviously not rolling in solid gold cadillacs.
  • But for someone who’s out there, a star with a billion views, that’s a crazy low expectation. She’s not even in the 1 percent. For the tiny token number of people who make it to the top of YouTube, they’re not even making it into the 1 percent.
  • The issue is if we’re going to have a middle class anymore, and if that’s our expectation, we won’t. And then we won’t have democracy.
  • I think in the total of music in America, there are a low number of hundreds. It’s really small. I wish all of those people my deepest blessings, and I celebrate the success they find, but it’s just not a way you can build a society.
  • The other problem is they would have to self-fund. This is getting back to the informal economy where you’re living in the slum or something, so you’re desperate to get out so you impress the boss man with your music skills or your basketball skills. And the idea of doing that for the whole of society is not progress. It should be the reverse. What we should be doing is bringing all the people who are in that into the formal economy. That’s what’s called development. But this is the opposite of that. It’s taking all the people from the developed world and putting them into a cycle of the developing world of the informal economy.
  • We don’t realize that our society and our democracy ultimately rest on the stability of middle-class jobs. When I talk to libertarians and socialists, they have this weird belief that everybody’s this abstract robot that won’t ever get sick or have kids or get old. It’s like everybody’s this eternal freelancer who can afford downtime and can self-fund until they find their magic moment or something.
  • The way society actually works is there’s some mechanism of basic stability so that the majority of people can outspend the elite so we can have a democracy. That’s the thing we’re destroying, and that’s really the thing I’m hoping to preserve. So we can look at musicians and artists and journalists as the canaries in the coal mine, and is this the precedent that we want to follow for our doctors and lawyers and nurses and everybody else? Because technology will get to everybody eventually.
  • I have 14-year-old kids who come to my talks who say, “But isn’t open source software the best thing in life? Isn’t it the future?” It’s a perfect thought system. It reminds me of communists I knew when growing up or Ayn Rand libertarians.
  • It’s one of these things where you have a simplistic model that suggests this perfect society so you just believe in it totally. These perfect societies don’t work. We’ve already seen hyper-communism come to tears. And hyper-capitalism come to tears. And I just don’t want to have to see that for cyber-hacker culture. We should have learned that these perfect simple systems are illusions.
  • You’re concerned with equality and a shrinking middle class. And yet you don’t seem to consider yourself a progressive or a man of the left — why not?
  • I am culturally a man on the left. I get a lot of people on the left. I live in Berkeley and everything. I want to live in a world where outcomes for people are not predetermined in advance.
  • The problem I have with socialist utopias is there’s some kind of committees trying to soften outcomes for people. I think that imposes models of outcomes for other people’s lives. So in a spiritual sense there’s some bit of libertarian in me. But the critical thing for me is moderation. And if you let that go too far you do end up with a winner-take-all society that ultimately crushes everybody even worse. So it has to be moderated.
  • I think seeking perfection in human affairs is a perfect way to destroy them.
  • All of these things are magisterial, where the people who become involved in them tend to wish they could be the only ones.
  • Libertarians tend to think the economy can totally close its own loops, that you can get rid of government. And I ridicule that in the book. There are other people who believe that if you could get everybody to talk over social networks, if we could just cooperate, we wouldn’t need money anymore. And I recommend they try living in a group house and then they’ll see it’s not true.
    • anonymous
       
      Group House. HAH!
  • So what we have to demand of digital technology is that it not try to be a perfect system that takes over everything. That it balances the excess of the other magisteria.
  • And that it doesn’t concentrate power too much, and if we can just get to that point, then we’ll really be fine. I’m actually modest. People have been accusing me of being super-ambitious lately, but I feel like in a way I’m the most modest person in the conversation.
  • I’m just trying to avoid total dysfunction.
    • anonymous
       
      See, now I like this guy. This is like the political equivalent of aiming for the realist view in geopolitics. We separate what is likely from what is unlikely and aim not for "the best" situation, but a situation where the worst aspects have been mitigated. It's backwards thinking that both parties would have a hard time integrating into their (ughhh) brand.
  • Let’s stick with politics for one more. Is there something dissonant about the fact that the greatest fortunes in human history have been created with a system developed largely by taxpayer dollars?
  • Yeah, no kidding. I was there. I gotta say, every little step of this thing was really funded by either the military or public research agencies. If you look at something like Facebook, Facebook is adding the tiniest little rind of value over the basic structure that’s there anyway. In fact, it’s even worse than that. The original designs for networking, going back to Ted Nelson, kept track of everything everybody was pointing at so that you would know who was pointing at your website. In a way Facebook is just recovering information that was deliberately lost because of the fetish for being anonymous. That’s also true of Google.
  • I don’t hate anything about e-books or e-book readers or tablets. There’s a lot of discussion about that, and I think it’s misplaced. The problem I have is whether we believe in the book itself.
  • Books are really, really hard to write. They represent a kind of a summit of grappling with what one really has to say. And what I’m concerned with is when Silicon Valley looks at books, they often think of them really differently, as just data points that you can mush together. They’re divorcing books from their role in personhood.
    • anonymous
       
      Again, a take I rarely encounter.
  • I was in a cafe this morning where I heard some stuff I was interested in, and nobody could figure out what it was. It was Spotify or one of these … so they knew what stream they were getting, but they didn’t know what music it was. Then it changed to other music, and they didn’t know what that was. And I tried to use one of the services that determines what music you’re listening to, but it was a noisy place and that didn’t work. So what’s supposed to be an open information system serves to obscure the source of the musician. It serves as a closed information system. It actually loses the information.
    • anonymous
       
      I have had this very thing happen to me, too. I didn't get to have my moment of discovery. I think Google Glass is going to fix that. Hah. :)
  • And if we start to see that with books in general – and I say if – if you look at the approach that Google has taken to the Google library project, they do have the tendency to want to move things together. You see the thing decontextualized.
  • I have sort of resisted putting my music out lately because I know it just turns into these mushes. Without context, what does my music mean? I make very novel sounds, but I don’t see any value in me sharing novel sounds that are decontextualized. Why would I write if people are just going to get weird snippets that are just mushed together and they don’t know the overall position or the history of the writer or anything? What would be the point in that? The day books become mush is the day I stop writing.
  • Realizing how much better musical instruments were to use as human interfaces helped me to be skeptical about the whole digital enterprise. Which I think helped me be a better computer scientist, actually.
  • Sure. If you go way back I was one of the people who started the whole music-should-be-free thing. You can find the fire-breathing essays where I was trying to articulate the thing that’s now the orthodoxy. Oh, we should free ourselves from the labels and the middleman and this will be better. I believed it at the time because it sounds better, it really does. I know a lot of these musicians, and I could see that it wasn’t actually working. I think fundamentally you have to be an empiricist. I just saw that in the real lives I know — both older and younger people coming up — I just saw that it was not as good as what it had once been. So that there must be something wrong with our theory, as good as it sounded. It was really that simple.
  •  
    "Kodak employed 140,000 people. Instagram, 13. A digital visionary says the Web kills jobs, wealth -- even democracy"
anonymous

The Declining Relevance of Generation Gaps - 1 views

  • In terms of cultural artifacts, we are shifting to an on-demand system, in which all the media from all of the ages just exists in a giant pile on the internet for anyone to peruse at any time.
  • The increasing fragmentation of entertainment outlets suggests that what will matter most is not so much what generation you’re from, but what micro niche you belong to.
  • Computer interfaces are getting easier to use and increasingly dumbed down.
  • ...3 more annotations...
  • Relatively fast adoption of new technologies is already pretty much a necessity
  • Better health and medical technology will make the physical differences between the young and the old increasingly less salient.
  • The increasing difficulty of finding a job, the growing impermanence of jobs that exist, the inevitable transformation of higher education, and the continued decoupling of education from work
  •  
    "Something I think is already happening and will accelerate in the future, is that traditional generation gaps are going to stop being relevant."
  •  
    My comment to the post: What I'd add is that the more traditional elements of generation gaps - namely the cohort/group you identify with - will remain. I'm thinking here of "You were in *this* age group when *that* global event happened." Still, on the surface I can't see anything to disagree with. Surely, the maturation of IT is definitely levelling the operational playing field quite a lot. When I started using PC's, it was considered more akin to, say, having a "chemistry set." Now, my son, my parents, and my grandparents all use the computer as a productivity device in a variety of overlapping fashions. I suppose one could argue against this, claiming (correctly) that all generations have enjoyed TV, but that's a consumption device, a small but very important distinction. As for education, you ain't kidding. In fact, noticing how my son and his peers use or do not use the internet with sufficient interest gives rise to an INTEREST gap. Namely: If you care to invest the effort, you can excel. If not, you don't have too many excuses. Regarding point #5, that's (at least) true for Gen-X'ers and younger. The idea of workplace stability seems almost anachronistic at this point. :) Great post!
anonymous

Why Office 365 and Office 2013 may not be right for you - 1 views

  • Unlike Office 2010, Office 2013 does not work with Windows XP or Windows Vista. Yet the latest data from NetApplications shows that roughly 45 percent of all Internet users still rock those two aging operating systems.
  • One of the big draws of an Office 365 subscription is Office on Demand, a full-fledged, Internet-streamed version of the productivity suite that Microsoft calls "Your Office away from home."
  • And it really, truly is—if the host computer meets the suite's fairly stringent requirements.
  • ...5 more annotations...
  • Sync is Google's implementation of Microsoft's Exchange ActiveSync protocol. Without it, you can't natively sync your Google Calendar or Contacts to the Outlook 2013 mail client
  • Office 365 Home Premium sounds like a killer deal for SMBs.
  • The licenses for Office 365 Home Premium and Office 2013 Home & Student prohibit using the software for commercial purposes.
  • Don't despair, though: Microsoft plans to launch Office 365 Small Business Premium on February 27, at a cost of $150 per user per year.
  • between Skype, Office on Demand, and SkyDrive storage enabled by default, Office 365 definitely has its head in the cloud—but its feet are firmly planted on the desktop.
  •  
    "The next generation of Office is here, and while it's not necessarily an essential upgrade for Office 2010 users, it's easily the best Office suite to date. Editing complicated financial spreadsheets has never been so semi-seamless! That said, with this particular $100-plus investment, you'll want to look before you leap. Whether you're opting for a straightforward Office 2013 installation or the multi-PC, cloud-connected ubiquity of an Office 365 subscription, there are four potentially crippling gotchas to consider before you plunk down your hard-earned cash. I've also identified a supposed gotcha that you can actually ignore entirely."
anonymous

This EULA Will Make You Rethink Every App and Online Service You Use - 1 views

  • Can we compare the internet to the road that must precede a lemonade stand?
    • anonymous
       
      On the face of it? No. Unless you're, I dunno, high...
  • The government built the road.
  • The whole idea of a public road is to push entrepreneurship up to a higher level. Without the government there would have most likely been a set of incompatible digital networks, mostly private, instead of a prominent unified internet.
  • ...4 more annotations...
  • (Al Gore actually played a crucial role in bringing that unity about when he was a senator, following in the footsteps of his father, who had facilitated the national system of interstate highways.)
  • Without the public road, and utterly unencumbered access to it, a child’s lemonade stand would never turn a profit. The real business opportunity would be in privatizing other people’s roads.
    • anonymous
       
      This looks to be the start of another classic example of why markets are not Perfect. Again, the notion that you can somehow decouple politics *from the near-only way that people have a voice in it* - MONEY - seems a quaint libertarian fantasy.
  • Here’s the EULA no one would read in the utopia they pine for:
  • Dear parents or legal guardians of ___________ As you may be aware, your daughter is one of ______ children in your neighborhood who recently applied for a jointly operated StreetApp® of the category “Lemonade Stand.”
  •  
    "We aren't creating enough opportunity for enough people online. The proof is simple. The wide adoption of transformative connecting technology should create a middle-class wealth boom, as happened when the Interstate Highway System gave rise to a world of new jobs in transportation and tourism, for instance, and generally widened commercial prospects. Instead we've seen recession, unemployment, and austerity."
anonymous

In Japan, the Fax Machine Is Anything but a Relic - 0 views

  • The Japanese government’s Cabinet Office said that almost 100 percent of business offices and 45 percent of private homes had a fax machine as of 2011.
  • “There is still something in Japanese culture that demands the warm, personal feelings that you get with a handwritten fax,” said Mr. Sugahara, 43.
  • Japan’s reluctance to give up its fax machines offers a revealing glimpse into an aging nation that can often seem quietly determined to stick to its tried-and-true ways, even if the rest of the world seems to be passing it rapidly by. The fax addiction helps explain why Japan, which once revolutionized consumer electronics with its hand-held calculators, Walkmans and, yes, fax machines, has become a latecomer in the digital age, and has allowed itself to fall behind nimbler competitors like South Korea and China.
    • anonymous
       
      This would sure explain Nintendo's baffling lag in the online arena.
  •  
    "Japan is renowned for its robots and bullet trains, and has some of the world's fastest broadband networks. But it also remains firmly wedded to a pre-Internet technology - the fax machine - that in most other developed nations has joined answering machines, eight-tracks and cassette tapes in the dustbin of outmoded technologies."
anonymous

Misinformation and Its Correction - 1 views

shared by anonymous on 17 Oct 12 - Cached
  •  
    "Abstract The widespread prevalence and persistence of misinformation in contemporary societies, such as the false belief that there is a link between childhood vaccinations and autism, is a matter of public concern. For example, the myths surrounding vaccinations, which prompted some parents to withhold immunization from their children, have led to a marked increase in vaccine-preventable disease, as well as unnecessary public expenditure on research and public-information campaigns aimed at rectifying the situation. We first examine the mechanisms by which such misinformation is disseminated in society, both inadvertently and purposely. Misinformation can originate from rumors but also from works of fiction, governments and politicians, and vested interests. Moreover, changes in the media landscape, including the arrival of the Internet, have fundamentally influenced the ways in which information is communicated and misinformation is spread. We next move to misinformation at the level of the individual, and review the cognitive factors that often render misinformation resistant to correction. We consider how people assess the truth of statements and what makes people believe certain things but not others. We look at people's memory for misinformation and answer the questions of why retractions of misinformation are so ineffective in memory updating and why efforts to retract misinformation can even backfire and, ironically, increase misbelief. Though ideology and personal worldviews can be major obstacles for debiasing, there nonetheless are a number of effective techniques for reducing the impact of misinformation, and we pay special attention to these factors that aid in debiasing. We conclude by providing specific recommendations for the debunking of misinformation. These recommendations pertain to the ways in which corrections should be designed, structured, and applied in order to maximize their impact. Grounded in cognitive psychological theory, these rec
anonymous

There Is No God, There Is No Devil, And Innovation Is The Work Of Multitudes - 0 views

shared by anonymous on 21 Jun 12 - Cached
  • Nationalism, tribalism, us-vs-them-ism, the perpetual aggrievement of this or that identity culture – these are the real villains, if we must have villains, because this state of mind can only see competition as a threat rather than as a challenge.
  •  
    After all, when it comes to the Ire of the Geeks, no controversy is too small. As Freddie deBoer puts it, in likening geek culture to the Tea Party, geeks are "so invested in certain grievances [...] that they seem completely incapable of judging whether those grievances are rational." Thanks to Erik Hanson for the pointer. Which might help explain the tone of The Oatmeal's response to Alex's piece. Freddie calls it "a whiny, petulant reply" and there's no doubt, that for all the creativity of a drawn response, it was petulant. Good faith is in short enough supply on the internet, of course, but still, one can't help but wonder what's at the root of such a response.
anonymous

Tech startups: A Cambrian moment | The Economist - 1 views

  • Digital startups are bubbling up in an astonishing variety of services and products, penetrating every nook and cranny of the economy. They are reshaping entire industries and even changing the very notion of the firm. “Software is eating the world,” says Marc Andreessen, a Silicon Valley venture capitalist.
  • “Anyone who writes code can become an entrepreneur—anywhere in the world,” says Simon Levene, a venture capitalist in London. Here we go again, you may think: yet another dotcom bubble that is bound to pop. Indeed, the number of pure software startups may have peaked already. And many new offerings are simply iterations on existing ones.
  • The danger is that once again too much money is being pumped into startups, warns Mr Andreessen, who as co-founder of Netscape saw the bubble from close by: “When things popped last time it took ten years to reset the psychology.” And even without another internet bust, more than 90% of startups will crash and burn.
  • ...9 more annotations...
  • But this time is also different, in an important way. Today’s entrepreneurial boom is based on more solid foundations than the 1990s internet bubble, which makes it more likely to continue for the foreseeable future.
  • One explanation for the Cambrian explosion of 540m years ago is that at that time the basic building blocks of life had just been perfected, allowing more complex organisms to be assembled more rapidly. Similarly, the basic building blocks for digital services and products—the “technologies of startup production”, in the words of Josh Lerner of Harvard Business School—have become so evolved, cheap and ubiquitous that they can be easily combined and recombined.
  • Some will work out, many will not. Hal Varian, Google’s chief economist, calls this “combinatorial innovation”. In a way, these startups are doing what humans have always done: apply known techniques to new problems.
  • Economic and social shifts have provided added momentum for startups. The prolonged economic crisis that began in 2008 has caused many millennials—people born since the early 1980s—to abandon hope of finding a conventional job, so it makes sense for them to strike out on their own or join a startup.
  • startups are a big part of a new movement back to the city.
  • In essence, software (which is at the heart of these startups) is eating away at the structures established in the analogue age. LinkedIn, a social network, for instance, has fundamentally changed the recruitment business. Airbnb, a website on which private owners offer rooms and flats for short-term rent, is disrupting the hotel industry. And Uber, a service that connects would-be passengers with drivers, is doing the same for the taxi business.
  • It is a story of technological change creating a set of new institutions which governments around the world are increasingly supporting.
  • Startups run on hype; things are always “awesome” and people “super-excited”. But this world has its dark side as well. Failure can be devastating. Being an entrepreneur often means having no private life, getting little sleep and living on noodles, which may be one reason why few women are interested. More ominously, startups may destroy more jobs than they create, at least in the shorter term.
  • Yet this report will argue that the world of startups today offers a preview of how large swathes of the economy will be organised tomorrow. The prevailing model will be platforms with small, innovative firms operating on top of them. This pattern is already emerging in such sectors as banking, telecommunications, electricity and even government. As Archimedes, the leading scientist of classical antiquity, once said: “Give me a place to stand on, and I will move the Earth.”
  •  
    "Cheap and ubiquitous building blocks for digital products and services have caused an explosion in startups. Ludwig Siegele weighs its significance"