
New Media Ethics 2009 course: Group items tagged Crowd-Sourcing


Weiye Loh

Sharing Information Corrupts Wisdom of Crowds | Wired Science | Wired.com

  • The effect — perhaps better described as the accuracy of crowds, since it best applies to questions involving quantifiable estimates — has been described for decades, beginning with Francis Galton’s 1907 account of fairgoers guessing an ox’s weight. It reached mainstream prominence with journalist James Surowiecki’s 2004 bestseller, The Wisdom of Crowds.
  • As Surowiecki explained, certain conditions must be met for crowd wisdom to emerge. Members of the crowd ought to have a variety of opinions, and to arrive at those opinions independently.
  • Take those away, and crowd intelligence fails, as evidenced in some market bubbles. Computer modeling of crowd behavior also hints at dynamics underlying crowd breakdowns, with the balance between information flow and diverse opinions becoming skewed (a toy simulation appears after this item’s summary).
  •  
    When people can learn what others think, the wisdom of crowds may veer towards ignorance. In a new study of crowd wisdom - the statistical phenomenon by which individual biases cancel each other out, distilling hundreds or thousands of individual guesses into uncannily accurate average answers - researchers told test participants about their peers' guesses. As a result, their group insight went awry.
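
The mechanism the excerpt describes is easy to see in a toy Monte Carlo sketch. All numbers below are hypothetical and the simulation is illustrative, not the researchers' model: independent errors cancel in the average, while guessers who drift toward the running consensus inherit the noise of the earliest answers.

```python
import random

random.seed(42)

TRUE_VALUE = 1000   # hypothetical quantity to estimate (say, an ox's weight in pounds)
N = 500             # guessers per crowd
NOISE_SD = 200      # spread of individual errors

def independent_crowd():
    # Every guess is truth plus an independent personal error.
    return [random.gauss(TRUE_VALUE, NOISE_SD) for _ in range(N)]

def influenced_crowd(pull=0.8):
    # Each guesser sees the running average of earlier guesses and
    # drifts toward it, so opinions are no longer independent.
    guesses, total = [], 0.0
    for _ in range(N):
        guess = random.gauss(TRUE_VALUE, NOISE_SD)
        if guesses:
            guess = (1 - pull) * guess + pull * (total / len(guesses))
        guesses.append(guess)
        total += guess
    return guesses

def rmse_of_crowd_mean(make_crowd, trials=300):
    # Typical distance between the crowd's mean guess and the truth.
    sq = 0.0
    for _ in range(trials):
        crowd = make_crowd()
        sq += (sum(crowd) / len(crowd) - TRUE_VALUE) ** 2
    return (sq / trials) ** 0.5

print("independent crowd, typical error of mean:",
      round(rmse_of_crowd_mean(independent_crowd), 1))
print("influenced crowd,  typical error of mean:",
      round(rmse_of_crowd_mean(influenced_crowd), 1))
```

In this toy setup the independent crowd's mean lands within a few pounds of the truth, while the influenced crowd's mean wanders roughly an order of magnitude further: social feedback shrinks the diversity of opinions without adding any information.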
Weiye Loh

Cash-Strapped Scientists Turn to Public Crowd-Funding | The Utopianist - Think Bigger

  •  
    These websites started primarily as a way to provide funding for creative projects involving film, music and other art mediums. Scientists using crowd funding is a relatively new phenomenon. Here's how The New York Times described it: "As research budgets tighten at universities and federal financing agencies, a new crop of Web-savvy scientists is hoping the wisdom - and generosity - of the crowds will come to the rescue." "Most crowd funding platforms thrive on transparency and a healthy dose of self-promotion but lack the safeguards and expert assessment of a traditional review process. Instead, money talks: the public decides which projects are worth pursuing by fully financing them."
Weiye Loh

Jonathan Stray » Measuring and improving accuracy in journalism

  • Accuracy is a hard thing to measure because it’s a hard thing to define. There are subjective and objective errors, and no standard way of determining whether a reported fact is true or false.
  • The last big study of mainstream reporting accuracy found errors (defined below) in 59% of 4,800 stories across 14 metro newspapers. This level of inaccuracy — where about one in every two articles contains an error — has persisted for as long as news accuracy has been studied, over seven decades now.
  • With the explosion of available information, more than ever it’s time to get serious about accuracy, about knowing which sources can be trusted. Fortunately, there are emerging techniques that might help us to measure media accuracy cheaply, and then increase it.
  • We could continuously sample a news source’s output to produce ongoing accuracy estimates, and build social software to help the audience report and filter errors. Meticulously applied, this approach would give a measure of the accuracy of each information source, and a measure of the efficiency of their corrections process (currently only about 3% of all errors are corrected).
  • Real world reporting isn’t always clearly “right” or “wrong,” so it will often be hard to decide whether something is an error or not. But we’re not going for ultimate Truth here, just a general way of measuring some important aspect of the idea we call “accuracy.” In practice it’s important that the error counting method is simple, clear and repeatable, so that you can compare error rates across different times and sources.
  • Subjective errors, though by definition involving judgment, should not be dismissed as merely differences in opinion. Sources found such errors to be about as common as factual errors and often more egregious (as rated by the sources). But subjective errors are a very complex category.
  • One of the major problems with previous news accuracy metrics is the effort and time required to produce them. In short, existing accuracy measurement methods are expensive and slow. I’ve been wondering if we can do better, and a simple idea comes to mind: sampling. The core idea is this: news sources could take an ongoing random sample of their output and check it for accuracy — a fact-check spot check.
  • Standard statistical theory tells us what the error on that estimate will be for any given number of samples (If I’ve got this right, the relevant formula is standard error of a population proportion estimate without replacement.) At a sample rate of a few stories per day, daily estimates of error rate won’t be worth much. But weekly and monthly aggregates will start to produce useful accuracy estimates (a worked example appears after this item’s summary).
  • the first step would be admitting how inaccurate journalism has historically been. Then we have to come up with standardized accuracy evaluation procedures, in pursuit of metrics that capture enough of what we mean by “true” to be worth optimizing. Meanwhile, we can ramp up the efficiency of our online corrections processes until we find as many useful, legitimate errors as possible with as little staff time as possible. It might also be possible to do data mining on types of errors and types of stories to figure out if there are patterns in how an organization fails to get facts right.
  • I’d love to live in a world where I could compare the accuracy of information sources, where errors got found and fixed with crowd-sourced ease, and where news organizations weren’t shy about telling me what they did and did not know. Basic factual accuracy is far from the only measure of good journalism, but perhaps it’s an improvement over the current sad state of affairs.
  •  
    Professional journalism is supposed to be "factual," "accurate," or just plain true. Is it? Has news accuracy been getting better or worse in the last decade? How does it vary between news organizations, and how do other information sources rate? Is professional journalism more or less accurate than everything else on the internet? These all seem like important questions, so I've been poking around, trying to figure out what we know and don't know about the accuracy of our news sources. Meanwhile, the online news corrections process continues to evolve, which gives us hope that the news will become more accurate in the future.
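
As a concrete sketch of the sampling arithmetic Stray points to (the standard error of a population proportion, sampled without replacement): the story volumes below are invented for illustration; only the 59% error rate comes from the research cited above.

```python
import math

def stderr_proportion(p_hat, n, N=None):
    """Standard error of an estimated error rate p_hat from a random
    sample of n stories; if the total story count N is known, apply the
    finite-population correction for sampling without replacement."""
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    if N is not None and N > 1:
        se *= math.sqrt((N - n) / (N - 1))
    return se

# Hypothetical newsroom volumes: ~15 stories/day, 3 spot-checked daily.
# Only the 59% error rate comes from the research cited above.
p_hat = 0.59
for label, n, N in [("daily", 3, 15), ("weekly", 21, 105), ("monthly", 90, 450)]:
    half_width = 1.96 * stderr_proportion(p_hat, n, N)
    print(f"{label:8s} n={n:3d}: estimated error rate {p_hat:.0%} ± {half_width:.0%}")
```

With these made-up volumes the daily estimate is useless (roughly ±52 percentage points, where the normal approximation barely applies anyway), while weekly and monthly aggregates tighten to about ±19 and ±9 points, which is exactly the pattern the excerpt predicts.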
Weiye Loh

Iceland Crowd-Sources its New Constitution on Facebook | The Utopianist - Think Bigger

  • The Guardian reports: In creating the new document, the council has been posting draft clauses on its website every week since the project launched in April. The public can comment underneath or join a discussion on the council’s Facebook page. The council also has a Twitter account, a YouTube page where interviews with its members are regularly posted, and a Flickr account containing pictures of the 25 members at work, all intended to maximise interaction with citizens. Meetings of the council are open to the public and streamed live on to the website and Facebook page. The latter has more than 1,300 likes in a country of 320,000 people. The crowdsourcing follows a national forum last year where 950 randomly selected people spent a day discussing the constitution. If the committee has its way the draft bill, due to be ready at the end of July, will be put to a referendum without any changes imposed by parliament – so it will genuinely be a document by the people, for the people.
Weiye Loh

How wise are crowds?

  • In the past, economists trying to model the propagation of information through a population would allow any given member of the population to observe the decisions of all the other members, or of a random sampling of them. That made the models easier to deal with mathematically, but it also made them less representative of the real world.
    • Weiye Loh
       
      Random sampling is not representative
  • “What this paper does is add the important component that this process is typically happening in a social network where you can’t observe what everyone has done, nor can you randomly sample the population to find out what a random sample has done, but rather you see what your particular friends in the network have done,” says Jon Kleinberg, Tisch University Professor in the Cornell University Department of Computer Science, who was not involved in the research. “That introduces a much more complex structure to the problem, but arguably one that’s representative of what typically happens in real settings.”
    • Weiye Loh
       
      So random sampling is actually more accurate?
  • Earlier models, Kleinberg explains, indicated the danger of what economists call information cascades. “If you have a few crucial ingredients — namely, that people are making decisions in order, that they can observe the past actions of other people but they can’t know what those people actually knew — then you have the potential for information cascades to occur, in which large groups of people abandon whatever private information they have and actually, for perfectly rational reasons, follow the crowd.” (A toy simulation of this cascade dynamic appears after this item’s summary.)
  • The MIT researchers’ paper, however, suggests that the danger of information cascades may not be as dire as it previously seemed.
  • a mathematical model that describes attempts by members of a social network to make binary decisions — such as which of two brands of cell phone to buy — on the basis of decisions made by their neighbors. The model assumes that for all members of the population, there is a single right decision: one of the cell phones is intrinsically better than the other. But some members of the network have bad information about which is which.
  • The MIT researchers analyzed the propagation of information under two different conditions. In one case, there’s a cap on how much any one person can know about the state of the world: even if one cell phone is intrinsically better than the other, no one can determine that with 100 percent certainty. In the other case, there’s no such cap. There’s debate among economists and information theorists about which of these two conditions better reflects reality, and Kleinberg suggests that the answer may vary depending on the type of information propagating through the network. But previous models had suggested that, if there is a cap, information cascades are almost inevitable.
  • if there’s no cap on certainty, an expanding social network will eventually converge on an accurate representation of the state of the world; that wasn’t a big surprise. But they also showed that in many common types of networks, even if there is a cap on certainty, convergence will still occur.
  • “People in the past have looked at it using more myopic models,” says Acemoglu. “They would be averaging type of models: so my opinion is an average of the opinions of my neighbors’.” In such a model, Acemoglu says, the views of people who are “oversampled” — who are connected with a large enough number of other people — will end up distorting the conclusions of the group as a whole.
  • “What we’re doing is looking at it in a much more game-theoretic manner, where individuals are realizing where the information comes from. So there will be some correction factor,” Acemoglu says. “If I’m seeing you, your action, and I’m seeing Munzer’s action, and I also know that there is some probability that you might have observed Munzer, then I discount his opinion appropriately, because I know that I don’t want to overweight it. And that’s the reason why, even though you have these influential agents — it might be that Munzer is everywhere, and everybody observes him — that still doesn’t create a herd on his opinion.”
  • the new paper leaves a few salient questions unanswered, such as how quickly the network will converge on the correct answer, and what happens when the model of agents’ knowledge becomes more complex.
  • the MIT researchers begin to address both questions. One paper examines rate of convergence, although Dahleh and Acemoglu note that its results are “somewhat weaker” than those about the conditions for convergence. Another paper examines cases in which different agents make different decisions given the same information: some people might prefer one type of cell phone, others another. In such cases, “if you know the percentage of people that are of one type, it’s enough — at least in certain networks — to guarantee learning,” Dahleh says. “I don’t need to know, for every individual, whether they’re for it or against it; I just need to know that one-third of the people are for it, and two-thirds are against it.” For instance, he says, if you notice that a Chinese restaurant in your neighborhood is always half-empty, and a nearby Indian restaurant is always crowded, then information about what percentages of people prefer Chinese or Indian food will tell you which restaurant, if either, is of above-average or below-average quality.
  •  
    By melding economics and engineering, researchers show that as social networks get larger, they usually get better at sorting fact from fiction.
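
The cascade mechanism Kleinberg describes can be reproduced in a few lines. The sketch below implements the classic sequential-herding setup (bounded private signals, all earlier actions observable), which is the baseline the MIT paper relaxes, not the researchers' own network model; the parameters are illustrative.

```python
import random

random.seed(1)

def run_market(n_agents=100, q=0.6):
    """One sequential market in the classic cascade setup: product A is
    truly better, and each agent's private signal points to A with
    probability q. Agents act in order, see every earlier choice, and
    follow the running lead of past choices plus their own signal.
    Once the lead reaches 2 either way, one signal can no longer tip
    the balance, so rational agents ignore it: an information cascade."""
    lead = 0  # (# who chose A) - (# who chose B) so far
    for _ in range(n_agents):
        signal = 1 if random.random() < q else -1
        if lead >= 2:
            choice = 1            # cascade on A; own signal ignored
        elif lead <= -2:
            choice = -1           # cascade on B; own signal ignored
        elif lead + signal != 0:
            choice = 1 if lead + signal > 0 else -1
        else:
            choice = signal       # evidence balanced: follow own signal
        lead += choice
    return lead > 0               # did the crowd settle on the better product?

trials = 10_000
wins = sum(run_market() for _ in range(trials))
print(f"crowd picks the better product in {wins / trials:.0%} of runs")
```

With a 60%-accurate signal the crowd herds onto the wrong product in roughly three runs out of ten, even though a majority vote over 100 independent signals would pick correctly well over 97% of the time: once a small early lead forms, later agents rationally discard their own information. This is the failure mode the MIT results show can be escaped when private beliefs are unbounded or the network has the right structure.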
Weiye Loh

McKinsey & Company - Clouds, big data, and smart assets: Ten tech-enabled business tren...

  • 1. Distributed cocreation moves into the mainstream. In the past few years, the ability to organise communities of Web participants to develop, market, and support products and services has moved from the margins of business practice to the mainstream. Wikipedia and a handful of open-source software developers were the pioneers. But in signs of the steady march forward, 70 per cent of the executives we recently surveyed said that their companies regularly created value through Web communities. Similarly, more than 68m bloggers post reviews and recommendations about products and services.
  • for every success in tapping communities to create value, there are still many failures. Some companies neglect the up-front research needed to identify potential participants who have the right skill sets and will be motivated to participate over the longer term. Since cocreation is a two-way process, companies must also provide feedback to stimulate continuing participation and commitment. Getting incentives right is important as well: cocreators often value reputation more than money. Finally, an organisation must gain a high level of trust within a Web community to earn the engagement of top participants.
  • 2. Making the network the organisation. In earlier research, we noted that the Web was starting to force open the boundaries of organisations, allowing nonemployees to offer their expertise in novel ways. We called this phenomenon "tapping into a world of talent." Now many companies are pushing substantially beyond that starting point, building and managing flexible networks that extend across internal and often even external borders. The recession underscored the value of such flexibility in managing volatility. We believe that the more porous, networked organisations of the future will need to organise work around critical tasks rather than molding it to constraints imposed by corporate structures.
  • 3. Collaboration at scale. Across many economies, the number of people who undertake knowledge work has grown much more quickly than the number of production or transactions workers. Knowledge workers typically are paid more than others, so increasing their productivity is critical. As a result, there is broad interest in collaboration technologies that promise to improve these workers' efficiency and effectiveness. While the body of knowledge around the best use of such technologies is still developing, a number of companies have conducted experiments, as we see in the rapid growth rates of video and Web conferencing, expected to top 20 per cent annually during the next few years.
  • 4. The growing ‘Internet of Things’. The adoption of RFID (radio-frequency identification) and related technologies was the basis of a trend we first recognised as "expanding the frontiers of automation." But these methods are rudimentary compared with what emerges when assets themselves become elements of an information system, with the ability to capture, compute, communicate, and collaborate around information—something that has come to be known as the "Internet of Things." Embedded with sensors, actuators, and communications capabilities, such objects will soon be able to absorb and transmit information on a massive scale and, in some cases, to adapt and react to changes in the environment automatically. These "smart" assets can make processes more efficient, give products new capabilities, and spark novel business models. Auto insurers in Europe and the United States are testing these waters with offers to install sensors in customers' vehicles. The result is new pricing models that base charges for risk on driving behavior rather than on a driver's demographic characteristics. Luxury-auto manufacturers are equipping vehicles with networked sensors that can automatically take evasive action when accidents are about to happen. In medicine, sensors embedded in or worn by patients continuously report changes in health conditions to physicians, who can adjust treatments when necessary. Sensors in manufacturing lines for products as diverse as computer chips and pulp and paper take detailed readings on process conditions and automatically make adjustments to reduce waste, downtime, and costly human interventions.
  • 5. Experimentation and big data. Could the enterprise become a full-time laboratory? What if you could analyse every transaction, capture insights from every customer interaction, and didn't have to wait for months to get data from the field? What if…? Data are flooding in at rates never seen before—doubling every 18 months—as a result of greater access to customer data from public, proprietary, and purchased sources, as well as new information gathered from Web communities and newly deployed smart assets. These trends are broadly known as "big data." Technology for capturing and analysing information is widely available at ever-lower price points. But many companies are taking data use to new levels, using IT to support rigorous, constant business experimentation that guides decisions and to test new products, business models, and innovations in customer experience. In some cases, the new approaches help companies make decisions in real time. This trend has the potential to drive a radical transformation in research, innovation, and marketing.
  • Using experimentation and big data as essential components of management decision making requires new capabilities, as well as organisational and cultural change. Most companies are far from accessing all the available data. Some haven't even mastered the technologies needed to capture and analyse the valuable information they can access. More commonly, they don't have the right talent and processes to design experiments and extract business value from big data, which require changes in the way many executives now make decisions: trusting instincts and experience over experimentation and rigorous analysis. To get managers at all echelons to accept the value of experimentation, senior leaders must buy into a "test and learn" mind-set and then serve as role models for their teams.
  • 6. Wiring for a sustainable world. Even as regulatory frameworks continue to evolve, environmental stewardship and sustainability clearly are C-level agenda topics. What's more, sustainability is fast becoming an important corporate-performance metric—one that stakeholders, outside influencers, and even financial markets have begun to track. Information technology plays a dual role in this debate: it is both a significant source of environmental emissions and a key enabler of many strategies to mitigate environmental damage. At present, information technology's share of the world's environmental footprint is growing because of the ever-increasing demand for IT capacity and services. Electricity produced to power the world's data centers generates greenhouse gases on the scale of countries such as Argentina or the Netherlands, and these emissions could increase fourfold by 2020. McKinsey research has shown, however, that the use of IT in areas such as smart power grids, efficient buildings, and better logistics planning could eliminate five times the carbon emissions that the IT industry produces.
  • 7. Imagining anything as a service. Technology now enables companies to monitor, measure, customise, and bill for asset use at a much more fine-grained level than ever before. Asset owners can therefore create services around what have traditionally been sold as products. Business-to-business (B2B) customers like these service offerings because they allow companies to purchase units of a service and to account for them as a variable cost rather than undertake large capital investments. Consumers also like this "paying only for what you use" model, which helps them avoid large expenditures, as well as the hassles of buying and maintaining a product.
  • In the IT industry, the growth of "cloud computing" (accessing computer resources provided through networks rather than running software or storing data on a local computer) exemplifies this shift. Consumer acceptance of Web-based cloud services for everything from e-mail to video is of course becoming universal, and companies are following suit. Software as a service (SaaS), which enables organisations to access services such as customer relationship management, is growing at a 17 per cent annual rate. The biotechnology company Genentech, for example, uses Google Apps for e-mail and to create documents and spreadsheets, bypassing capital investments in servers and software licenses. This development has created a wave of computing capabilities delivered as a service, including infrastructure, platform, applications, and content. And vendors are competing, with innovation and new business models, to match the needs of different customers.
  • 8. The age of the multisided business model. Multisided business models create value through interactions among multiple players rather than traditional one-on-one transactions or information exchanges. In the media industry, advertising is a classic example of how these models work. Newspapers, magazines, and television stations offer content to their audiences while generating a significant portion of their revenues from third parties: advertisers. Other revenue, often through subscriptions, comes directly from consumers. More recently, this advertising-supported model has proliferated on the Internet, underwriting Web content sites, as well as services such as search and e-mail (see trend number seven, "Imagining anything as a service," earlier in this article). It is now spreading to new markets, such as enterprise software: Spiceworks offers IT-management applications to 950,000 users at no cost, while it collects advertising from B2B companies that want access to IT professionals.
  • 9. Innovating from the bottom of the pyramid. The adoption of technology is a global phenomenon, and the intensity of its usage is particularly impressive in emerging markets. Our research has shown that disruptive business models arise when technology combines with extreme market conditions, such as customer demand for very low price points, poor infrastructure, hard-to-access suppliers, and low cost curves for talent. With an economic recovery beginning to take hold in some parts of the world, high rates of growth have resumed in many developing nations, and we're seeing companies built around the new models emerging as global players. Many multinationals, meanwhile, are only starting to think about developing markets as wellsprings of technology-enabled innovation rather than as traditional manufacturing hubs.
  • 10. Producing public good on the grid. The role of governments in shaping global economic policy will expand in coming years. Technology will be an important factor in this evolution by facilitating the creation of new types of public goods while helping to manage them more effectively. This last trend is broad in scope and draws upon many of the other trends described above.
Weiye Loh

The future of customer support: Outsourcing is so last year | The Economist

  • Gartner, the research company, estimates that using communities to solve support issues can reduce costs by up to 50%. When TomTom, a maker of satellite-navigation systems, switched on social support, members handled 20,000 cases in the first two weeks and saved it around $150,000. Best Buy, an American gadget retailer, values its 600,000 users at $5m annually.
  •  
    "Unsourcing", as the new trend has been dubbed, involves companies setting up online communities to enable peer-to-peer support among users. Instead of speaking with a faceless person thousands of miles away, customers' problems are answered by individuals in the same country who have bought and used the same products. This happens either on the company's own website or on social networks like Facebook and Twitter, and the helpers are generally not paid anything for their efforts.
Weiye Loh

UNICEF - India - Children map their community using innovative technology in India

  • After data were collected, the children drew the map’s first draft on a big sheet of paper. It clearly labelled and colour-coded each detail, from houses to street lamps. Now, the map and survey – which identified 71 sources of water but not one clean enough for drinking – can also be used as a powerful advocacy tool.
  • Ms. Das says improvements have already been made. Pointing to a lamp post in her crowded alley, she observes, “Things are already better. We have more light here.” The children also use survey data to target households during polio immunization campaigns. In teams armed with handmade paper megaphones and signs, they regularly march about shouting: “Shunun, shunun (listen),” imploring neighbours to bring children for polio drops. They also take toddlers to polio booths themselves. The children also mobilize for malaria information drives, to check on children who drop out of school, or to teach proper hand washing techniques. They tackle tough topics, like child marriage and human trafficking, with puppets and street plays at each community festival.
Weiye Loh

Smithsonian's Crowdsourced "The Art Of Video Games" Exhibition Comes Under Fire | The C...

  • My initial concern about the current show was its lack of perspective. The strength of a curated show comes from the choice and arrangement of the works, and I worried that with a crowdsourced show like this, it would be hard to form a central thesis. What makes each of these games influential and how will those qualities come together to paint a moving picture of games as an art medium? I wasn’t sure this list particularly answered those questions.
  • They’ve avoided directly addressing the question of why are video games art, and instead danced around it, showing a number of wonderful games and explaining why each is great. Despite this success though, I feel that the show was still damaged by the crowdsourced curation approach. While I agree that the player is a major component of games (as Abe Stein recently posted to his blog, “A game not played is no game at all”), the argument that because games are played by the public they should be publicly curated doesn’t necessarily follow for me, especially when the resultant list is so muddled.
  • Despite Chris’ apparent love for the games, the show doesn’t feel as strongly curated as it could have been, overly heavy in some places, and completely missing in others, and I think that is a result of the crowdsourcing. Although I’m sure Chris has a fantastic perspective that will tie this all together beautifully and the resulting show will be enjoyable and successful, I wish that he had just selected a strong list of games on his own and been confident with his picks.
  • perhaps it would have been nice to not side-step the question of why are these games, as a whole, important as art. Considering this is the first major American art institution to put on a video game show, I would have liked to see a more powerful statement about the medium.
Weiye Loh

TODAYonline | Commentary | For the info-rich and time-poor, digital curators to the res...

  • digital "curators" choose and present things related to a specific topic and context. They "curate", as opposed to "aggregate", which implies plain collecting with little or no value add. Viewed in this context, Google search does the latter, not the former. So, who curates? The Huffington Post, or HuffPo, is one high-profile example and, it appears, a highly-valued one too, going by AOL numbers-crunchers who forked out US$315 million (S$396.9 million) to acquire it. Accolades have also come in for Arianna Huffington's team of contributors and more than 3,000 bloggers - from politicians to celebrities to think-tankers. The website was named second among the 25 best blogs of 2009 by Time magazine, and most powerful blog in the world by The Observer.
  • By sifting, sorting and presenting news and views - yes, "curating" - HuffPo makes itself useful in an age of too much information and too many opinions. (Strictly speaking, HuffPo is both a creator and curator.) If what HuffPo is doing seems deja vu, it is hardly surprising. Remember the good old "curated" news of the pre-Internet days when newspapers decided what news was published and what we read? Then, the Editor was the Curator with the capital "C".
  • But with the arrival of the Internet and the uploading of news and views by organisations and netizens, the bits and bytes have turned into a tsunami. Aggregators like Google search threw us some life buoys, using text and popularity to filter the content. But with millions of new articles and videos added to the Internet daily, the "right" content has become that proverbial needle in the haystack. Hence the need for curation.
  •  
    Inundated by the deluge of information, and with little time on our hands, some of us turn to social media networks. Sometimes, postings by friends are useful. But often, the typically self-indulgent musings are not. It's "curators" to the rescue.
Weiye Loh

Meet Science: What is "peer review"? - Boing Boing

  • Scientists do complain about peer review. But let me set one thing straight: The biggest complaints scientists have about peer review are not that it stifles unpopular ideas. You've heard this truthy factoid from countless climate-change deniers and purveyors of quack medicine. And peer review is a convenient scapegoat for their conspiracy theories. There's just enough truth to make the claims sound plausible.
  • Peer review is flawed. Peer review can be biased. In fact, really new, unpopular ideas might well have a hard time getting published in the biggest journals right at first. You saw an example of that in my interview with sociologist Harry Collins. But those sorts of findings will often be published by smaller, more obscure journals. And, if a scientist keeps finding more evidence to support her claims, and keeps submitting her work to peer review, more often than not she's going to eventually convince people that she's right. Plenty of scientists, including Harry Collins, have seen their once-shunned ideas published widely.
  • So what do scientists complain about? This shouldn't be too much of a surprise. It's the lack of training, the lack of feedback, the time constraints, and the fact that, the more specific your research gets, the fewer people there are with the expertise to accurately and thoroughly review your work.
  • Scientists are frustrated that most journals don't like to publish research that is solid, but not ground-breaking. They're frustrated that most journals don't like to publish studies where the scientist's hypothesis turned out to be wrong.
  • Some scientists would prefer that peer review not be anonymous—though plenty of others like that feature. Journals like the British Medical Journal have started requiring reviewers to sign their comments, and have produced evidence that this practice doesn't diminish the quality of the reviews.
  • There are also scientists who want to see more crowd-sourced, post-publication review of research papers. Because peer review is flawed, they say, it would be helpful to have centralized places where scientists can go to find critiques of papers, written by scientists other than the official peer-reviewers. Maybe the crowd can catch things the reviewers miss. We certainly saw that happen earlier this year, when microbiologist Rosie Redfield took a high-profile peer-reviewed paper about arsenic-based life to task on her blog. The website Faculty of 1000 is attempting to do something like this. You can go to that site, look up a previously published peer-reviewed paper, and see what other scientists are saying about it. And the Astrophysics Archive has been doing this same basic thing for years.
  • you shouldn't canonize everything a peer-reviewed journal article says just because it is a peer-reviewed journal article.
  • at the same time, being peer reviewed is a sign that the paper's author has done some level of due diligence in their work. Peer review is flawed, but it has value. There are improvements that could be made. But, like the old joke about democracy, peer review is the worst possible system except for every other system we've ever come up with.
  •  
    Being peer reviewed doesn't mean your results are accurate. Not being peer reviewed doesn't mean you're a crank. But the fact that peer review exists does weed out a lot of cranks, simply by saying, "There is a standard." Journals that don't have peer review do tend to be ones with an obvious agenda. White papers, which are not peer reviewed, do tend to contain more bias and self-promotion than peer-reviewed journal articles.
Weiye Loh

"Stem Cell City" To Make All Research Available To The Public | The Utopianist - Think ... - 0 views

  •  
    A new website launched in Toronto allows the public to peruse all the current research on stem cells, as well as take a tour of a lab and stay updated on any specific disease - all in the hopes of educating us about a line of research that has huge potential to save a lot of lives. The ethical and political controversy hovering over work with stem cells, particularly embryonic cells - which have the biggest potential but pose the greatest ethical problems - has made work in the field particularly jittery; stop-and-go funding, as well as confusion about the concept in the public sector, hasn't made for the most ideal working conditions. Stem Cell City - an online portal launched yesterday - may significantly contribute to the cause, its founding scientists hope.
Weiye Loh

Does "Inclusion" Matter for Open Government? (The Answer Is, Very Much Indeed... - 0 views

  • But in the context of the Open Government Partnership and the 70 or so countries that have already committed themselves to this or are in the process, I’m not sure that the world can afford to wait to see whether this correlation is direct, indirect or spurious, especially if we can recognize that in the world of OGP, the currency of accumulation and concentration is not raw economic wealth but rather raw political power.
  • in the same way as there appears to be an association between the rise of the Internet and increasing concentrations of wealth one might anticipate that the rise of Internet enabled structures of government might be associated with the increasing concentration of political power in fewer and fewer hands and particularly the hands of those most adept at manipulating the artifacts and symbols of the new Internet age.
  • I am struck by the fact that while the OGP over and over talks about the importance and value and need for Open Government there is no similar or even partial call for Inclusive Government. I’ve argued elsewhere how “Open”, in the absence of attention being paid to ensuring the pre-conditions for the broadest base of participation, will almost inevitably lead to the empowerment of the powerful. What I fear with the OGP is that by not paying even a modicum of attention to the issue of inclusion or inclusive development and participation that all of the idealism and energy that is displayed today in Brasilia is being directed towards the creation of the Governance equivalents of the Internet billionaires whatever that might look like.
  • crowd sourced public policy
  •  
    alongside the rise of the Internet and the empowerment of the Internet generation have emerged the greatest inequalities of wealth and privilege that any of the increasingly Internet enabled economies/societies have experienced at least since the Great Depression and perhaps since the beginnings of systematic economic record keeping. The association between the rise of inequality and the rise of the Internet has not yet been explained and it may simply be a coincidence, but somehow I'm doubtful, and we await a newer generation of rather more critical and less dewy-eyed economists to give us the models and explanations for this co-evolution.
Weiye Loh

Eben Moglen Is Reshaping Internet With a Freedom Box - NYTimes.com

  • As Secretary of State Hillary Rodham Clinton spoke in Washington about the Internet and human liberty, a Columbia law professor in Manhattan, Eben Moglen, was putting together a shopping list to rebuild the Internet — this time, without governments and big companies able to watch every twitch of our fingers.
  • The list begins with “cheap, small, low-power plug servers,” Mr. Moglen said. “A small device the size of a cellphone charger, running on a low-power chip. You plug it into the wall and forget about it.”
  • Almost anyone could have one of these tiny servers, which are now produced for limited purposes but could be adapted to a full range of Internet applications, he said. “They will get very cheap, very quick,” Mr. Moglen said. “They’re $99; they will go to $69. Once everyone is getting them, they will cost $29.”
  • The missing ingredients are software packages, which are available at no cost but have to be made easy to use. “You would have a whole system with privacy and security built in for the civil world we are living in,” he said. “It stores everything you care about.” Put free software into the little plug server in the wall, and you would have a Freedom Box that would decentralize information and power, Mr. Moglen said. This month, he created the Freedom Box Foundation to organize the software.
  • In the first days of the personal computer era, many scoffed at the idea that free software could have an important place in the modern world. Today, it is the digital genome for millions of phones, printers, cameras, MP3 players, televisions, the Pentagon, the New York Stock Exchange and the computers that underpin Google’s empire.
  • Social networking has changed the balance of political power, he said, “but everything we know about technology tells us that the current forms of social network communication, despite their enormous current value for politics, are also intensely dangerous to use. They are too centralized; they are too vulnerable to state retaliation and control.”
  • investors were said to have put a value of about $50 billion on Facebook, the social network founded by Mark Zuckerberg. If revolutions for freedom rest on the shoulders of Facebook, Mr. Moglen said, the revolutionaries will have to count on individuals who have huge stakes in keeping the powerful happy.
  • “It is not hard, when everybody is just in one big database controlled by Mr. Zuckerberg, to decapitate a revolution by sending an order to Mr. Zuckerberg that he cannot afford to refuse,” Mr. Moglen said. By contrast, with tens of thousands of individual encrypted servers, there would be no one place where a repressive government could find out who was publishing or reading “subversive” material.
Weiye Loh

How Crowdsourcing Is Improving Global Communities

  •  
    You can crowdsource almost anything these days - news, music videos, fashion advice, your love life or even your entire life. While these examples are all very useful (or just plain amusing), there are a plethora of examples of how innovative entrepreneurs and eager philanthropists are using crowdsourcing techniques to improve local and global communities in real, substantive ways.
Weiye Loh

Google's War on Nonsense - NYTimes.com

  • As a verbal artifact, farmed content exhibits neither style nor substance.
  • The insultingly vacuous and frankly bizarre prose of the content farms — it seems ripped from Wikipedia and translated from the Romanian — cheapens all online information.
  • These prose-widgets are not hammered out by robots, surprisingly. But they are written by writers who work like robots. As recent accounts of life in these words-are-money mills make clear, some content-farm writers have deadlines as frequently as every 25 minutes. Others are expected to turn around reported pieces, containing interviews with several experts, in an hour. Some compose, edit, format and publish 10 articles in a single shift. Many with decades of experience in journalism work 70-hour weeks for salaries of $40,000 with no vacation time. The content farms have taken journalism hackwork to a whole new level.
  • So who produces all this bulk jive? Business Insider, the business-news site, has provided a forum to a half dozen low-paid content farmers, especially several who work at AOL’s enormous Seed and Patch ventures. They describe exhausting and sometimes exploitative writing conditions. Oliver Miller, a journalist with an MFA in fiction from Sarah Lawrence who once believed he’d write the Great American Novel, told me AOL paid him about $28,000 for writing 300,000 words about television, all based on fragments of shows he’d never seen, filed in half-hour intervals, on a graveyard shift that ran from 11 p.m. to 7 or 8 in the morning.
  • Mr. Miller’s job, as he made clear in an article last week in The Faster Times, an online newspaper, was to cram together words that someone’s research had suggested might be in demand on Google, position these strings as titles and headlines, embellish them with other inoffensive words and make the whole confection vaguely resemble an article. AOL would put “Rick Fox mustache” in a headline, betting that some number of people would put “Rick Fox mustache” into Google, and retrieve Mr. Miller’s article. Readers coming to AOL, expecting information, might discover a subliterate wasteland. But before bouncing out, they might watch a video clip with ads on it. Their visits would also register as page views, which AOL could then sell to advertisers.
  • commodify writing: you pay little or nothing to writers, and make readers pay a lot — in the form of their “eyeballs.” But readers get zero back, no useful content.
  • You can’t mess with Google forever. In February, the corporation concocted what it concocts best: an algorithm. The algorithm, called Panda, affects some 12 percent of searches, and it has — slowly and imperfectly — been improving things. Just a short time ago, the Web seemed ungovernable; bad content was driving out good. But Google asserted itself, and credit is due: Panda represents good cyber-governance. It has allowed Google to send untrustworthy, repetitive and unsatisfying content to the back of the class. No more A’s for cheaters.
  • the goal, according to Amit Singhal and Matt Cutts, who worked on Panda, is to “provide better rankings for high-quality sites — sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.”
  • Google officially rolled out Panda 2.2. Put “Whitey Bulger” into Google, and where you might once have found dozens of content farms, today you get links to useful articles from sites ranging from The Boston Globe, The Los Angeles Times, the F.B.I. and even Mashable, doing original analysis of how federal agents used social media to find Bulger. Last month, Demand Media, once the most notorious of the content farms, announced plans to improve quality by publishing more feature articles by hired writers, and fewer by “users” — code for unpaid freelancers. Amazing. Demand Media is stepping up its game.
  •  
    Content farms, which have flourished on the Web in the past 18 months, are massive news sites that use headlines, keywords and other tricks to lure Web-users into looking at ads. These sites confound and embarrass Google by gaming its ranking system. As a business proposition, they once seemed exciting. Last year, The Economist admiringly described Associated Content and Demand Media as cleverly cynical operations that "aim to produce content at a price so low that even meager advertising revenue can support it."
Weiye Loh

Is Crowdfunding the Future of Book Publishing? | The Utopianist - Think Bigger

  • And just like Kickstarter, if the book doesn’t reach its targeted goal, your donation is refunded to you.
  • One thing is for sure: traditional publishing isn’t getting any more lucrative. That means under the current system, if you’re not selling a sure thing, publishers probably aren’t going to buy it. Hopefully crowdfunding sites like Unbound can change all of that - it sure beats waiting for a wealthy benefactor, anyway.
  •  
    writers now have a different option: crowdfunding their next novel. That's the idea behind Unbound, a new site from the U.K. that allows donors to pledge cash to authors in exchange for things like signed hard copies of the book, goodie bags and invites to the launch party. You can even choose to fund the entire project, in which case you get … I don't know, say, a back massage and a chicken dinner. The point is, you're directly involved in the process. And just like Kickstarter, if the book doesn't reach its targeted goal, your donation is refunded to you.