
New Media Ethics 2009 course: Group items tagged "support"


Weiye Loh

Roger Pielke Jr.'s Blog: Flawed Food Narrative in the New York Times - 0 views

  • The article relies heavily on empty appeals to authority. For example, it makes an unsupported assertion about what "scientists believe": "Many of the failed harvests of the past decade were a consequence of weather disasters, like floods in the United States, drought in Australia and blistering heat waves in Europe and Russia. Scientists believe some, though not all, of those events were caused or worsened by human-induced global warming." Completely unmentioned are the many (most?) scientists who believe that evidence is lacking to connect recent floods and heat waves to "human-induced global warming."
  • Some important issues beyond carbon dioxide are raised in the article, but are presented as secondary to the carbon narrative. Other important issues are completely ignored -- for example, wheat rust goes unmentioned, and it probably poses a greater risk to food supplies in the short term than anything to do with carbon dioxide. The carbon dioxide-centric focus of the article provides a nice illustration of how an obsession with "global warming" can serve to distract attention from factors that actually matter more for issues of human and environmental concern.
  • The central thesis of the NYT article is the following statement: "The rapid growth in farm output that defined the late 20th century has slowed to the point that it is failing to keep up with the demand for food, driven by population increases and rising affluence in once-poor countries." But this claim of slowing output is shown to be completely false by the graphic that accompanies the article, shown below. Far from slowing, farm output has increased dramatically over the past half-century (left panel) and on a per capita basis in 2009 was higher than at any point since the early 1980s (right panel).
  •  
    Today's New York Times has an article by Justin Gillis on global food production that strains itself to the breaking point to make a story fit a narrative. The narrative, of course, is that climate change "is helping to destabilize the food system." The problem with the article is that the data it presents don't support this narrative. Before proceeding, let me reiterate that human-caused climate change is a threat and one that we should be taking seriously. But taking climate change seriously does not mean shoehorning every global concern into that narrative, and especially not conflating concerns about the future with what has been observed in the past. The risk, of course, of putting a carbon-centric spin on every issue is that other important dimensions are neglected.
Weiye Loh

How drug companies' PR tactics skew the presentation of medical research | Science | gu... - 0 views

  • Drug companies exert this hold on knowledge through publication planning agencies, an obscure subsection of the pharmaceutical industry that has ballooned in size in recent years, and is now a key lever in the commercial machinery that gets drugs sold. The planning companies are paid to implement high-impact publication strategies for specific drugs. They target the most influential academics to act as authors, draft the articles, and ensure that these include clearly-defined branding messages and appear in the most prestigious journals.
  • In selling their services to drug companies, the agencies explain their work in frank language. Current Medical Directions, a medical communications company based in New York, promises to create "scientific content in support of our clients' messages". A rival firm from Macclesfield, Complete HealthVizion, describes what it does as "a fusion of evidence and inspiration."
  • There are now at least 250 different companies engaged in the business of planning clinical publications for the pharmaceutical industry, according to the International Society for Medical Publication Professionals, which said it has over 1000 individual members. Many firms are based in the UK and on the east coast of the United States in traditional "pharma" centres like Pennsylvania and New Jersey. Precise figures are hard to pin down because publication planning is widely dispersed and is only beginning to be recognized as something like a discrete profession.
  • The standard approach to article preparation is for planners to work hand-in-glove with drug companies to create a first draft. "Key messages" laid out by the drug companies are accommodated to the extent that they can be supported by available data. Planners combine scientific information about a drug with two kinds of message that help create a "drug narrative". "Environmental" messages are intended to forge the sense of a gap in available medicine within a specific clinical field, while "product" messages show how the new drug meets this need.
  • In a flow-chart drawn up by Eric Crown, publications manager at Merck (the company that sold the controversial painkiller Vioxx), the determination of authorship appears as the fourth stage of the article preparation procedure. That is, only after company employees have presented clinical study data, discussed the findings, finalised "tactical plans" and identified where the article should be published. Perhaps surprisingly to the casual observer, under guidelines tightened up in recent years by the International Committee of Medical Journal Editors (ICMJE), Crown's approach, typical among pharmaceutical companies, does not constitute ghostwriting.
  • What publication planners understand by the term is precise but it is also quite distinct from the popular interpretation.
  • "We may have written a paper, but the people we work with have to have some input and approve it."
  • "I feel that we're doing something good for mankind in the long-run," said Kimberly Goldin, head of the International Society for Medical Publication Professionals (ISMPP). "We want to influence healthcare in a very positive, scientifically sound way." "The profession grew out of a marketing umbrella, but has moved under the science umbrella," she said. But without the window of court documents to show how publication planning is being carried out today, the public simply cannot know if reforms the industry says it has made are genuine.
  • Dr Leemon McHenry, a medical ethicist at California State University, says nothing has changed. "They've just found more clever ways of concealing their activities. There's a whole army of hidden scribes. It's an epistemological morass where you can't trust anything." Alastair Matheson is a British medical writer who has worked extensively for medical communication agencies. He dismisses the planners' claims to having reformed as "bullshit". "The new guidelines work very nicely to permit the current system to continue as it has been," he said. "The whole thing is a big lie. They are promoting a product."
Weiye Loh

Roger Pielke Jr.'s Blog: Wanted: Less Spin, More Informed Debate - 0 views

  • The rejection of proposals that suggest starting with a low carbon price is thus a pretty good guarantee against any carbon pricing at all. It is rather remarkable to see advocates for climate action arguing against a policy that recommends implementing a carbon price, simply because it does not start high enough for their tastes. For some, idealism trumps pragmatism, even if it means no action at all.
  • Ward writes: . . . climate change is the result of a number of market failures, the largest of which arises from the fact that the prices of products and services involving emissions of greenhouse gases do not reflect the true costs of the damage caused through impacts on the climate. . . All serious economic analyses of how to tackle climate change identify the need to correct this market failure through a carbon price, which can be implemented, for instance, through cap and trade schemes or carbon taxes. . . A carbon price can be usefully supplemented by improvements in innovation policies, but it needs to be at the core of action on climate change, as this paper by Carolyn Fischer and Richard Newell points out.
  • First, the criticism is off target. A low and rising carbon price is in fact a central element to the policy recommendations advanced by the Hartwell Group in Climate Pragmatism, the Hartwell Paper, and as well, in my book The Climate Fix.  In Climate Pragmatism, we approvingly cite Japan's low-but-rising fossil fuels tax and discuss a range of possible fees or taxes on fossil fuels, implemented, not to penalize energy use or price fossil fuels out of the market, but rather to ensure that as we benefit from today’s energy resources we are setting aside the funds necessary to accelerate energy innovation and secure the nation’s energy future.
  • Here is another debating lesson -- before engaging in public, one should read not only the materials one is critiquing but also the materials one cites in support of one's own arguments. This is not the first time that Bob Ward has put out misleading information related to my work. Ever since we debated in public at the Royal Institution, Bob has adopted guerrilla tactics, lobbing nonsense into the public arena and then hiding when challenged to support or defend his views. As readers here know, I am all for open and respectful debate over these important topics. Why is it that, instead, all we get is poorly informed misdirection and spin? Despite the attempts at spin, I'd welcome Bob's informed engagement on this topic. Perhaps he might start by explaining which of the 10 statements that I put up on the mathematics and logic underlying climate pragmatism is incorrect.
  • In comments to another blog, I've identified Bob as a PR flack. I see no reason to change that assessment. In fact, his actions only confirm it. Where does he fit into a scientific debate?
  • Thanks for the comment, but I'll take the other side ;-) First, this is a policy debate that involves various scientific, economic and political analyses coupled with various values commitments, including monied interests -- and as such, PR guys are as welcome as anyone else. That said, the problem here is not that Ward is a PR guy, but that he is trying to make his case via spin and misrepresentation. That gets noticed pretty quickly by anyone paying attention and is easily shot down.
Weiye Loh

Would You Donate Your Facebook Account to Al Gore For a Day? | The Utopianist - Think B... - 0 views

  •  
    On September 14, Al Gore will launch the Climate Reality Project, or "24 Hours of Reality" - this most recent project will have 24 presentations, done by 24 presenters in 13 languages, each broadcast one hour after the other, representing all the time zones of the world. Al Gore will be connecting the dots of climate change, extreme weather and pollution, among other things - but the innovative thing is that he wants to harness the power of his followers' social media accounts in order to reach more people. Gore is asking his supporters to lend him their accounts - the Project will be posting status updates in their name, trying to reach many more people as well as start a dialogue on the subject.
Weiye Loh

When Science Trumps Policy: The Triumph of Insite « British Columbia « Canada... - 0 views

  • As skeptics we obviously want to see science-based medicine and effective methods to improve public health. What this means is that we, as skeptics, want to see medicine like vaccines promoted instead of homeopathy; but we also want to see science-based policy. What Insite has proven is that the harm reduction policy is working, in fact working better than the “war on drugs” policy that the Conservative government has been supporting. Since the evidence is pointing to harm reduction being a more effective method of controlling the harmful effects of drug addiction in society, it should follow that harm reduction as a policy should gain the support of our government and health care providers.
  • What was really distressing was that the Harper Government wasn’t just arguing against the evidence (saying, for instance, that it was either wrong or misguided) but actually arguing in spite of the evidence. What they were saying was: yes, harm reduction appears to be working…but that’s irrelevant, because that isn’t the policy we want to use.
Weiye Loh

Author Paulo Coelho supports piracy: "share to get revenue" - 0 views

  • A year ago, exciting news about publishing 2.0 reached the blogosphere. Thriller writer Paulo Coelho had started to tell people how he was using filesharing networks as a way to promote his books. Coelho thinks that giving people the possibility to swap his books for free actually has a positive effect on sales. In a keynote speech at the Digital, Life, Design conference in Munich he gave some strikingly good examples. When he uploaded the Russian translation of “The Alchemist”, sales in Russia went from around 1,000 books per year to 100,000, and then to a million and more.
  • His American publisher wasn’t too pleased though. After a rather intimidating call from CEO Jane Friedman, Coelho chose a middle way and made the book viewable – but not downloadable. The torrent links are still up there though. Why? Coelho: “You’ll have to share in order to get some revenue”.
  • “At the end of the day, it doesn’t hurt your sales. People download the book but don’t read it. They wait for the hard copy anyway”, Coelho continued. “Don’t be fooled by the publishers who say that piracy costs authors money.”
    • Weiye Loh
       
      Reminds me of music downloads... some people, including those I know, actually download music for free and if they like it, they'll buy the CD. 
  •  
    Allowing people to download his books for free actually increases Coelho's book sales. 
Weiye Loh

Skepticblog » Global Warming Skeptic Changes His Tune - by Doing the Science ... - 0 views

  • To the global warming deniers, Muller had been an important scientific figure with good credentials who had expressed doubt about the temperature data used to track the last few decades of global warming. Muller was influenced by Anthony Watts, a former TV weatherman (not a trained climate scientist) and blogger who has argued that the data set is mostly from large cities, where the “urban heat island” effect might bias the overall pool of worldwide temperature data. Climate scientists have pointed out that they have accounted for this possible effect already, but Watts and Muller were unconvinced. With $150,000 (25% of their funding) from the Koch brothers (the nation’s largest supporters of climate denial research), as well as the Getty Foundation (their wealth largely based on oil money) and other funding sources, Muller set out to reanalyze all the temperature data by setting up the Berkeley Earth Surface Temperature Project.
  • Although only 2% of the data had been analyzed by last month, the Republican climate deniers in Congress called him to testify in their March 31 hearing to attack global warming science, expecting him to give them scientific data supporting their biases. To their dismay, Muller behaved like a real scientist and not an ideologue—he followed his data and told them the truth, not what they wanted to hear. Muller pointed out that his analysis of the data set almost exactly tracked what the National Oceanic and Atmospheric Administration (NOAA), the Goddard Institute for Space Studies (GISS), and the Hadley Climate Research Unit at the University of East Anglia in the UK had already published (see figure).
  • Muller testified before the House Committee that: The Berkeley Earth Surface Temperature project was created to make the best possible estimate of global temperature change using as complete a record of measurements as possible and by applying novel methods for the estimation and elimination of systematic biases. We see a global warming trend that is very similar to that previously reported by the other groups. The world temperature data has sufficient integrity to be used to determine global temperature trends. Despite potential biases in the data, methods of analysis can be used to reduce bias effects well enough to enable us to measure long-term Earth temperature changes. Data integrity is adequate. Based on our initial work at Berkeley Earth, I believe that some of the most worrisome biases are less of a problem than I had previously thought.
  • The right-wing ideologues were sorely disappointed, and reacted viciously in the political sphere by attacking their own scientist, but Muller’s scientific integrity overcame any biases he might have harbored at the beginning. He “called ‘em as he saw ‘em” and told truth to power.
  • It speaks well of the scientific process when a prominent skeptic like Muller does his job properly and admits that his original biases were wrong. As reported in the Los Angeles Times: Ken Caldeira, an atmospheric scientist at the Carnegie Institution for Science, which contributed some funding to the Berkeley effort, said Muller’s statement to Congress was “honorable” in recognizing that “previous temperature reconstructions basically got it right…. Willingness to revise views in the face of empirical data is the hallmark of the good scientific process.”
  • This is the essence of the scientific method at its best. There may be biases in our perceptions, and we may want to find data that fits our preconceptions about the world, but if science is done properly, we get a real answer, often one we did not expect or didn’t want to hear. That’s the true test of when science is giving us a reality check: when it tells us “an inconvenient truth”, something we do not like, but is inescapable if one follows the scientific method and analyzes the data honestly.
  • Sit down before fact as a little child, be prepared to give up every preconceived notion, follow humbly wherever and to whatever abysses nature leads, or you shall learn nothing.
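The "urban heat island" dispute above is, at bottom, a question about trend estimates. A toy sketch (synthetic data and a plain least-squares fit, not Berkeley Earth's actual method) of why a constant urban offset shifts the level of a temperature series without changing its fitted trend:

```python
# Hedged illustration with made-up numbers: fit a linear trend to annual
# temperature anomalies and compare an "urban" station against a "rural"
# one. A constant warm offset moves the curve up but not its slope.
def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys against xs (here, degrees C per year)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

years = list(range(1960, 2010))
# Synthetic anomalies: a 0.02 C/yr warming trend, plus a fixed 0.5 C
# urban offset for the city station.
rural = [0.02 * (y - 1960) for y in years]
urban = [0.02 * (y - 1960) + 0.5 for y in years]

print(round(ols_slope(years, rural), 4))  # → 0.02
print(round(ols_slope(years, urban), 4))  # → 0.02
```

The real worry Watts raised is a *growing* urban effect, which would inflate the slope; detecting and removing that kind of systematic bias is what the Berkeley project's statistical methods were built for, and what Muller's testimony says they found to be "less of a problem than I had previously thought."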
Weiye Loh

Scientist Beloved by Climate Deniers Pulls Rug Out from Their Argument - Environment - ... - 0 views

  • One of the scientists was Richard Muller from University of California, Berkeley. Muller has been working on an independent project to better estimate the planet's surface temperatures over time. Because he is willing to say publicly that he has some doubts about the accuracy of the temperature stations that most climate models are based on, he has been embraced by the science denying crowd.
  • A Koch brothers charity, for example, has donated nearly 25 percent of the financial support provided to Muller's project.
  • Skeptics of climate science have been licking their lips waiting for his latest research, which they hoped would undermine the data behind basic theories of anthropogenic climate change. At the hearing today, however, Muller threw them for a loop with this graph:
  • Muller's data (black line) tracks pretty well with the three established data sets. This is just an initial sampling of Muller's data—just 2 percent of the 1.6 billion records he's working with—but these early findings are incredibly consistent with the previous findings.
  • In his testimony, Muller made these points (emphasis mine): The Berkeley Earth Surface Temperature project was created to make the best possible estimate of global temperature change using as complete a record of measurements as possible and by applying novel methods for the estimation and elimination of systematic biases. We see a global warming trend that is very similar to that previously reported by the other groups. The world temperature data has sufficient integrity to be used to determine global temperature trends. Despite potential biases in the data, methods of analysis can be used to reduce bias effects well enough to enable us to measure long-term Earth temperature changes. Data integrity is adequate. Based on our initial work at Berkeley Earth, I believe that some of the most worrisome biases are less of a problem than I had previously thought.
  • For the many climate deniers who hang their arguments on Muller's "doubts," this is a severe blow. Of course, when the hard scientific truths are inconvenient, climate denying House leaders can always call a lawyer, a marketing professor, and an economist into the scientific hearing.
  •  
    Today, there was a climate science hearing in the House Committee on Science, Space, and Technology. Of the six "expert" witnesses, only three were scientists. The others were an economist, a lawyer, and a professor of marketing.
Weiye Loh

Roger Pielke Jr.'s Blog: Blind Spots in Australian Flood Policies - 0 views

  • Better management of flood risks in Australia will depend upon better data on flood risk. However, collecting such data has proven problematic.
  • As many Queenslanders affected by January’s floods are realising, riverine flood damage is commonly excluded from household insurance policies. And this is unlikely to change until councils – especially in Queensland – stop dragging their feet and actively assist in developing comprehensive data insurance companies can use.
  • Why? Because there is often little available information that would allow an insurer to adequately price this flood risk. Without this, there is little economic incentive for insurers to accept this risk. It would be irresponsible for insurers to cover riverine flood without quantifying and pricing the risk accordingly.
  • The first step in establishing risk-adjusted premiums is to know the likelihood of the depth of flooding at each address. This information has to be address-specific because the severity of flooding can vary widely over small distances, for example, from one side of a road to the other.
  • A litany of reasons is given for withholding data. At times it seems that refusal stems from a view that insurance is innately evil. This is ironic in view of the gratuitous advice sometimes offered by politicians and commentators in the aftermath of extreme events, exhorting insurers to pay claims even when no legal liability exists and riverine flood is explicitly excluded from policies.
  • Risk Frontiers is involved in jointly developing the National Flood Information Database (NFID) for the Insurance Council of Australia with Willis Re, a reinsurance broking intermediary. NFID is a five year project aiming to integrate flood information from all city councils in a consistent insurance-relevant form. The aim of NFID is to help insurers understand and quantify their risk. Unfortunately, obtaining the base data for NFID from some local councils is difficult and sometimes impossible despite the support of all state governments for the development of NFID. Councils have an obligation to assess their flood risk and to establish rules for safe land development. However, many are antipathetic to the idea of insurance. Some states and councils have been very supportive – in New South Wales and Victoria, particularly. Some states have a central repository – a library of all flood studies and digital terrain models (digital elevation data). Council reluctance to release data is most prevalent in Queensland, where, unfortunately, no central repository exists.
  • Second, models of flood risk are sometimes misused:
  • Many councils only undertake flood modelling in order to create a single design flood level, usually the so-called one-in-100 year flood. (For reasons given later, a better term is the flood with a 1% annual likelihood of being exceeded.)
  • Inundation maps showing the extent of the flood with a 1% annual likelihood of exceedance are increasingly common on council websites, even in Queensland. Unfortunately these maps say little about the depth of water at an address or, importantly, how depth varies for less probable floods. Insurance claims usually begin when the ground is flooded and increase rapidly as water rises above the floor level. At Windsor in NSW, for example, the difference in the water depth between the flood with a 1% annual chance of exceedance and the maximum possible flood is nine metres. In other catchments this difference may be as small as ten centimetres. The risk of damage is quite different in both cases and an insurer needs this information if they are to provide coverage in these areas.
  • The ‘one-in-100 year flood’ term is misleading. To many it is something that happens regularly once every 100 years — with the reliability of a bus timetable. It is in fact possible, though unlikely, that a flood of similar or even greater magnitude could happen twice in one year, or three times in successive years.
  • The calculations underpinning this are not straightforward but the probability that an address exposed to a 1-in-100 year flood will experience such an event or greater over the lifetime of the house – 50 years say – is around 40%. Over the lifetime of a typical home mortgage – 25 years – the probability of occurrence is 22%. These are not good odds.
  •  
    John McAneney of Risk Frontiers at Macquarie University in Sydney identifies some opportunities for better flood policies in Australia.
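The return-period arithmetic quoted above (roughly 40% over the 50-year life of a house, 22% over a 25-year mortgage) follows from treating each year as an independent trial with a 1% chance of exceedance. A minimal sketch of the calculation:

```python
# Converting a flood "return period" into the chance of at least one
# exceedance over a holding period, assuming independent years (a
# simplification; real wet years can cluster).
def exceedance_probability(annual_chance: float, years: int) -> float:
    """Probability of at least one flood of this size or larger in `years` years."""
    return 1.0 - (1.0 - annual_chance) ** years

# The article's "1-in-100 year" flood has a 1% annual chance.
for years in (25, 50):
    p = exceedance_probability(0.01, years)
    print(f"{years} years: {p:.1%}")  # → 25 years: 22.2%, 50 years: 39.5%
```

This is exactly why the ‘one-in-100 year’ label misleads: a 1% annual chance compounds into far-from-negligible odds over the life of a home or a mortgage.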
Weiye Loh

A Data Divide? Data "Haves" and "Have Nots" and Open (Government) Data « Gurs... - 0 views

  • Researchers have extensively explored the range of social, economic, geographical and other barriers which underlie and to a considerable degree “explain” (cause) the Digital Divide.  My own contribution has been to argue that “access is not enough”, it is whether opportunities and pre-conditions are in place for the “effective use” of the technology particularly for those at the grassroots.
  • The idea of a possible parallel “Data Divide” between those who have access and the opportunity to make effective use of data and particularly “open data” and those who do not, began to occur to me.  I was attending several planning/recruitment events for the Open Data “movement” here in Vancouver and the socio-demographics and some of the underlying political assumptions seemed to be somewhat at odds with the expressed advocacy position of “data for all”.
  • Thus the “open data” which was being argued for would not likely be accessible and usable to the groups and individuals with which Community Informatics has largely been concerned – the grassroots, the poor and marginalized, indigenous people, rural people and slum dwellers in Less Developed countries. It was/is hard to see, given the explanations provided to date, how these folks could use this data in any effective way to help them respond to the opportunities for advance and social betterment which open data advocates have been indicating as the outcome of their efforts.
  • Many involved in “open data” saw their interests and activities as being confined to making data “legally” and “technically” accessible — what happened to it after that was somebody else’s responsibility.
  • while the Digital Divide deals with, for the most part “infrastructure” issues, the Data Divide is concerned with “content” issues.
  • Where a Digital Divide might exist, for example, as a result of geographical or policy considerations, and thus have uniform effects on all those on the wrong side of the “divide” whatever their socio-demographic situation, a Data Divide (and particularly one of the most significant current components of the Open Data movement, i.e. OGD) would have particularly damaging negative effects and result in particularly significant lost opportunities for the most vulnerable groups and individuals in society and globally. (I’ve discussed some examples here at length in a previous blogpost.)
  • The Data Divide would thus be the gap between those who have access to, and are able to use, Open (Government) Data and those who are not so enabled.
  • Barriers to effective use of OGD include:
    1. infrastructure—being on the wrong side of the “Digital Divide” and thus not having access to the basic infrastructure supporting the availability of OGD.
    2. devices—OGD that is not universally accessible and device independent (that only runs on iPhones, for example).
    3. software—“accessible” OGD that requires specialized technical software/training to become “usable”.
    4. content—OGD not designed for use by those with handicaps, non-English speakers, or those with low levels of functional literacy, for example.
    5. interpretation/sense-making—OGD that is only accessible for use through a technical intermediary and/or is useful only if “interpreted” by a professional intermediary.
    6. advocacy—whether the OGD is in a form and context that is supportive for use in advocacy (or other purposes) on behalf of marginalized and other groups and individuals.
    7. governance—whether the OGD process includes representation from the broad public in its overall policy development and governance (not just lawyers, techies and public servants).
Weiye Loh

Google's War on Nonsense - NYTimes.com - 0 views

  • As a verbal artifact, farmed content exhibits neither style nor substance.
  • The insultingly vacuous and frankly bizarre prose of the content farms — it seems ripped from Wikipedia and translated from the Romanian — cheapens all online information.
  • Surprisingly, these prose-widgets are not hammered out by robots; they are written by writers who work like robots. As recent accounts of life in these words-are-money mills make clear, some content-farm writers have deadlines as frequently as every 25 minutes. Others are expected to turn around reported pieces, containing interviews with several experts, in an hour. Some compose, edit, format and publish 10 articles in a single shift. Many with decades of experience in journalism work 70-hour weeks for salaries of $40,000 with no vacation time. The content farms have taken journalism hackwork to a whole new level.
  • So who produces all this bulk jive? Business Insider, the business-news site, has provided a forum to a half dozen low-paid content farmers, especially several who work at AOL’s enormous Seed and Patch ventures. They describe exhausting and sometimes exploitative writing conditions. Oliver Miller, a journalist with an MFA in fiction from Sarah Lawrence who once believed he’d write the Great American Novel, told me AOL paid him about $28,000 for writing 300,000 words about television, all based on fragments of shows he’d never seen, filed in half-hour intervals, on a graveyard shift that ran from 11 p.m. to 7 or 8 in the morning.
  • Mr. Miller’s job, as he made clear in an article last week in The Faster Times, an online newspaper, was to cram together words that someone’s research had suggested might be in demand on Google, position these strings as titles and headlines, embellish them with other inoffensive words and make the whole confection vaguely resemble an article. AOL would put “Rick Fox mustache” in a headline, betting that some number of people would put “Rick Fox mustache” into Google, and retrieve Mr. Miller’s article. Readers coming to AOL, expecting information, might discover a subliterate wasteland. But before bouncing out, they might watch a video clip with ads on it. Their visits would also register as page views, which AOL could then sell to advertisers.
  • commodify writing: you pay little or nothing to writers, and make readers pay a lot — in the form of their “eyeballs.” But readers get zero back, no useful content.
  • You can’t mess with Google forever. In February, the corporation concocted what it concocts best: an algorithm. The algorithm, called Panda, affects some 12 percent of searches, and it has — slowly and imperfectly — been improving things. Just a short time ago, the Web seemed ungovernable; bad content was driving out good. But Google asserted itself, and credit is due: Panda represents good cyber-governance. It has allowed Google to send untrustworthy, repetitive and unsatisfying content to the back of the class. No more A’s for cheaters.
  • the goal, according to Amit Singhal and Matt Cutts, who worked on Panda, is to “provide better rankings for high-quality sites — sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.”
  • Google officially rolled out Panda 2.2. Put “Whitey Bulger” into Google, and where you might once have found dozens of content farms, today you get links to useful articles from sites ranging from The Boston Globe, The Los Angeles Times, the F.B.I. and even Mashable, doing original analysis of how federal agents used social media to find Bulger. Last month, Demand Media, once the most notorious of the content farms, announced plans to improve quality by publishing more feature articles by hired writers, and fewer by “users” — code for unpaid freelancers. Amazing. Demand Media is stepping up its game.
  •  
    Content farms, which have flourished on the Web in the past 18 months, are massive news sites that use headlines, keywords and other tricks to lure Web-users into looking at ads. These sites confound and embarrass Google by gaming its ranking system. As a business proposition, they once seemed exciting. Last year, The Economist admiringly described Associated Content and Demand Media as cleverly cynical operations that "aim to produce content at a price so low that even meager advertising revenue can support it."
Weiye Loh

New Statesman - Johann Hari and media standards - 0 views

  • Consistency is a virtue. One cannot attack - in any principled terms - the reactionary and the credulous, the knavish and the foolish, for a casual approach to sources, data, and evidence, or for disregarding normal journalistic standards, if when it is a leading liberal writer that is caught out it is somehow exceptional. It simply smacks of shallow partisanship.
  • inconsistency also undermines the normative claims for the superiority of a liberal and critical approach. How can one sensibly call out the "other side" on any given issue in terms which one would not apply to one's "own side"?
  •  
    now that Johann Hari has apologised, one wonders if many who rushed to his support should apologise too. There were many liberal, rational, and atheistic writers and pundits who defended him on Twitter on terms they would never have extended to a conservative, religious, or quack writer or pundit exposed as making a similar sort of mistake. Naming names would be inflammatory; and they, and their followers, know who they are. What is important here is the basic principle of consistency and its value. Just imagine had it been, say, Peter Hitchens, Garry Bushell, Richard Littlejohn, Rod Liddle, Toby Young, Guido Fawkes, Melanie Phillips, Damian Thompson, Daniel Hannan, Christopher Booker, Andrew Roberts, Nadine Dorries, and so on, who had been caught out indulging in some similar malpractice. Would the many liberal or atheistic writers and pundits who sought to defend (or "put into perspective") Hari have been so charitable? Of course not.
Weiye Loh

Open data, democracy and public sector reform - 0 views

  •  
    Governments are increasingly making their data available online in standard formats and under licenses that permit the free re-use of data. The justifications advanced for this include claims regarding the economic potential of open government data (OGD), the potential for OGD to promote transparency and accountability of government, and the role of OGD in supporting the reform and reshaping of public services. This paper takes a pragmatic mixed-methods approach to exploring uses of data from the UK national open government data portal, data.gov.uk, and identifies how the emerging practices of OGD use are developing. It sets out five 'processes' of data use, and describes a series of embedded cases of education OGD use and public-spending OGD use. Drawing upon quantitative and qualitative data it presents an outline account of the motivations driving different individuals to engage with open government data, and it identifies a range of connections between open government data use and processes of civic change. It argues that a "data for developers" narrative that assumes OGD use will primarily be mediated by technology developers is misplaced, and that whilst innovation-based routes to OGD-driven public sector reform are evident, the relationship between OGD and democracy is less clear. As strategic research it highlights a number of emerging policy issues for developing OGD provision and use, and makes a contribution towards theoretical understandings of OGD use in practice.
Weiye Loh

Gleick apology over Heartland leak stirs ethics debate among climate scientists | Envir... - 0 views

  • For some campaigners, such as Naomi Klein, Gleick was an unalloyed hero, who should be sent some "Twitter love", she wrote on Tuesday. "Heartland has been subverting well-understood science for years," wrote Scott Mandia, co-founder of the climate science rapid response team. "They also subvert the education of our schoolchildren by trying to 'teach the controversy' where none exists." Mandia went on: "Peter Gleick, a scientist who is also a journalist, just used the same tricks that any investigative reporter uses to uncover the truth. He is the hero and Heartland remains the villain. He will have many people lining up to support him."
  • Others acknowledged Gleick's wrongdoing, but said it should be viewed in the context of the work of Heartland and other entities devoted to spreading disinformation about science. "What Peter Gleick did was unethical. He acknowledges that from a point of view of professional ethics there is no defending those actions," said Dale Jamieson, an expert on ethics who heads the environmental studies programme at New York University. "But relative to what has been going on on the climate denial side this is a fairly small breach of ethics." He also rejected the suggestion that Gleick's wrongdoing could hurt the cause of climate change, or undermine the credibility of scientists. "Whatever moral high ground there is in science comes from doing science," he said. "The failing that Peter Gleick engaged in is not a scientific failing. It is just a personal failure."
Weiye Loh

Community Informatics and Older Persons: The Necessary Connection « Gurstein'... - 0 views

  • we know instinctively, and research evidence itself is beginning to emerge, that particularly in the case of older persons, individualized, medicalized, institutionally focused care may be precisely what is not needed, and may have the effect, over the span of the final decades of life, of not only reducing the quality of life but even the length of life itself.
  • individuals including or even especially older persons are happiest and healthiest and thus less likely to need interventions from the formal medical system if they are living surrounded by family and friends and firmly embedded in communities where they have support, friendship and love.
Weiye Loh

When big pharma pays a publisher to publish a fake journal... : Respectful Insolence - 0 views

  • pharmaceutical company Merck, Sharp & Dohme paid Elsevier to produce a fake medical journal that, to any superficial examination, looked like a real medical journal but was in reality nothing more than advertising for Merck
  • As reported by The Scientist: Merck paid an undisclosed sum to Elsevier to produce several volumes of a publication that had the look of a peer-reviewed medical journal, but contained only reprinted or summarized articles--most of which presented data favorable to Merck products--that appeared to act solely as marketing tools with no disclosure of company sponsorship. "I've seen no shortage of creativity emanating from the marketing departments of drug companies," Peter Lurie, deputy director of the public health research group at the consumer advocacy nonprofit Public Citizen, said, after reviewing two issues of the publication obtained by The Scientist. "But even for someone as jaded as me, this is a new wrinkle." The Australasian Journal of Bone and Joint Medicine, which was published by Excerpta Medica, a division of scientific publishing juggernaut Elsevier, is not indexed in the MEDLINE database, and has no website (not even a defunct one). The Scientist obtained two issues of the journal: Volume 2, Issues 1 and 2, both dated 2003. The issues contained little in the way of advertisements apart from ads for Fosamax, a Merck drug for osteoporosis, and Vioxx.
  • there are numerous "throwaway" journals out there. "Throwaway" journals tend to be defined as journals that are provided free of charge, have a lot of advertising (a high "advertising-to-text" ratio, as it is often described), and contain no original investigations. Other relevant characteristics include: support that comes virtually entirely from advertising revenue, with ads placed within article pages, interrupting the articles, rather than between articles, as is the case with most medical journals that accept ads; content consisting almost entirely of reviews of existing material of variable (and often dubious) quality; parasitism, in that throwaways often summarize peer-reviewed research from real journals; questionable (at best) peer review, catering to an uninvolved and uncritical readership; and no original work.
Weiye Loh

The Creativity Crisis - Newsweek - 0 views

  • The accepted definition of creativity is production of something original and useful, and that’s what’s reflected in the tests. There is never one right answer. To be creative requires divergent thinking (generating many unique ideas) and then convergent thinking (combining those ideas into the best result).
  • Torrance’s tasks, which have become the gold standard in creativity assessment, measure creativity perfectly. What’s shocking is how incredibly well Torrance’s creativity index predicted those kids’ creative accomplishments as adults.
  • The correlation to lifetime creative accomplishment was more than three times stronger for childhood creativity than childhood IQ.
  • there is one crucial difference between IQ and CQ scores. With intelligence, there is a phenomenon called the Flynn effect—each generation, scores go up about 10 points. Enriched environments are making kids smarter. With creativity, a reverse trend has just been identified and is being reported for the first time here: American creativity scores are falling.
  • creativity scores had been steadily rising, just like IQ scores, until 1990. Since then, creativity scores have consistently inched downward.
  • It is the scores of younger children in America—from kindergarten through sixth grade—for whom the decline is “most serious.”
  • It’s too early to determine conclusively why U.S. creativity scores are declining. One likely culprit is the number of hours kids now spend in front of the TV and playing videogames rather than engaging in creative activities. Another is the lack of creativity development in our schools. In effect, it’s left to the luck of the draw who becomes creative: there’s no concerted effort to nurture the creativity of all children.
  • Around the world, though, other countries are making creativity development a national priority.
  • In China there has been widespread education reform to extinguish the drill-and-kill teaching style. Instead, Chinese schools are also adopting a problem-based learning approach.
  • When faculty of a major Chinese university asked Plucker to identify trends in American education, he described our focus on standardized curriculum, rote memorization, and nationalized testing.
  • Overwhelmed by curriculum standards, American teachers warn there’s no room in the day for a creativity class.
  • The age-old belief that the arts have a special claim to creativity is unfounded. When scholars gave creativity tasks to both engineering majors and music majors, their scores laid down on an identical spectrum, with the same high averages and standard deviations.
  • The argument that we can’t teach creativity because kids already have too much to learn is a false trade-off. Creativity isn’t about freedom from concrete facts. Rather, fact-finding and deep research are vital stages in the creative process.
  • The lore of pop psychology is that creativity occurs on the right side of the brain. But we now know that if you tried to be creative using only the right side of your brain, it’d be like living with ideas perpetually at the tip of your tongue, just beyond reach.
  • Creativity requires constant shifting, blender pulses of both divergent thinking and convergent thinking, to combine new information with old and forgotten ideas. Highly creative people are very good at marshaling their brains into bilateral mode, and the more creative they are, the more they dual-activate.
  • “Creativity can be taught,” says James C. Kaufman, professor at California State University, San Bernardino. What’s common about successful programs is they alternate maximum divergent thinking with bouts of intense convergent thinking, through several stages. Real improvement doesn’t happen in a weekend workshop. But when applied to the everyday process of work or school, brain function improves.
  • highly creative adults tended to grow up in families embodying opposites. Parents encouraged uniqueness, yet provided stability. They were highly responsive to kids’ needs, yet challenged kids to develop skills. This resulted in a sort of adaptability: in times of anxiousness, clear rules could reduce chaos—yet when kids were bored, they could seek change, too. In the space between anxiety and boredom was where creativity flourished.
  • highly creative adults frequently grew up with hardship. Hardship by itself doesn’t lead to creativity, but it does force kids to become more flexible—and flexibility helps with creativity.
  • In early childhood, distinct types of free play are associated with high creativity. Preschoolers who spend more time in role-play (acting out characters) have higher measures of creativity: voicing someone else’s point of view helps develop their ability to analyze situations from different perspectives. When playing alone, highly creative first graders may act out strong negative emotions: they’ll be angry, hostile, anguished.
  • In middle childhood, kids sometimes create paracosms—fantasies of entire alternative worlds. Kids revisit their paracosms repeatedly, sometimes for months, and even create languages spoken there. This type of play peaks at age 9 or 10, and it’s a very strong sign of future creativity.
  • From fourth grade on, creativity no longer occurs in a vacuum; researching and studying become an integral part of coming up with useful solutions. But this transition isn’t easy. As school stuffs more complex information into their heads, kids get overloaded, and creativity suffers. When creative children have a supportive teacher—someone tolerant of unconventional answers, occasional disruptions, or detours of curiosity—they tend to excel. When they don’t, they tend to underperform and drop out of high school or don’t finish college at high rates.
  • They’re quitting because they’re discouraged and bored, not because they’re dark, depressed, anxious, or neurotic. It’s a myth that creative people have these traits. (Those traits actually shut down creativity; they make people less open to experience and less interested in novelty.) Rather, creative people, for the most part, exhibit active moods and positive affect. They’re not particularly happy—contentment is a kind of complacency creative people rarely have. But they’re engaged, motivated, and open to the world.
  • A similar study of 1,500 middle schoolers found that those high in creative self-efficacy had more confidence about their future and ability to succeed. They were sure that their ability to come up with alternatives would aid them, no matter what problems would arise.
  •  
    The Creativity Crisis For the first time, research shows that American creativity is declining. What went wrong-and how we can fix it.
Weiman Kow

Think you're a good employee? Office snooping software can tell - CNN.com - 1 views

  • More than that, Killock believes using such software can have a negative psychological impact on a workplace. "It is a powerful signal that you do not fully trust the people you are paying or perhaps don't invest the time and care to properly manage them," he says.
    • Weiman Kow
       
      the presentation group brought up this point.. =)
  • Ultimately, true privacy only begins outside the workplace -- and the law supports that. In the United States, at least, all email and other electronic content created on the employer's equipment belongs to the employer, not the employee. Slackers would do well to remember that.
  • But Charnock is keen to stress Cataphora isn't only about bosses spying on their team -- it works both ways.
    • Weiman Kow
       
      Is that really true?
  • the emails they send, the calls they make and the documents they write.
  • Our software builds a multi-dimensional model of normal behavior,
  • [We can tell] who is really being consulted by other employees, and on which topics; who is really making decisions
  • The software began as a tool to assist lawyers with the huge corporate databases often subpoenaed as evidence in trials but has now moved into human resources.
  • We do have extensive filters to try to weed out people who are highly productive in areas such as sports banter and knowledge of local bars,
  •  
    Just a link on advances in extensive office surveillance - this program is supposed to "separate the good employees from the bad by analyzing workers 'electronic footprints' -- the emails they send, the calls they make and the documents they write"
Weiye Loh
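Charnock's description of software that "builds a multi-dimensional model of normal behavior" from emails, calls and documents is, at heart, anomaly detection. Cataphora's actual methods are proprietary, so the sketch below is only a deliberately simplified illustration of the general idea; the function name, the use of weekly email counts, and the z-score threshold are all assumptions for the example, not anything from the product itself.

```python
# Illustrative sketch only: "model normal behavior, then flag deviations."
# This is a generic one-dimensional anomaly detector, not Cataphora's method.
from statistics import mean, stdev

def flag_outliers(weekly_counts, threshold=2.0):
    """Flag weeks whose activity deviates from an employee's own baseline.

    weekly_counts: e.g. number of emails sent per week by one employee.
    Returns the indices of weeks lying more than `threshold` standard
    deviations from that employee's mean.
    """
    if len(weekly_counts) < 2:
        return []  # not enough history to establish a baseline
    mu = mean(weekly_counts)
    sigma = stdev(weekly_counts)
    if sigma == 0:
        return []  # perfectly uniform history: nothing deviates
    return [i for i, c in enumerate(weekly_counts)
            if abs(c - mu) / sigma > threshold]

# A sudden spike in week 5 stands out against an otherwise steady baseline.
print(flag_outliers([40, 42, 38, 41, 39, 120]))  # prints [5]
```

A real system of the kind described would model many signals jointly (recipients, topics, timing, documents) rather than a single count, which is presumably what makes the model "multi-dimensional".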

Liberal Democrat conference | Libel laws silence scientists | Richard Dawkins | Comment... - 0 views

  • Scientists often disagree with one another, sometimes passionately. But they don't go to court to sort out their differences, they go into the lab, repeat the experiments, carefully examine the controls and the statistical analysis. We care about whether something is true, supported by the evidence. We are not interested in whether somebody sincerely believes he is right.
    • Weiye Loh
       
      Exactly the reason why appeals to faith cannot work in secularism!!! Unfortunately, people who are unable to prove their point usually resort to underhand straw-in-nose methods; throw enough shit and hopefully some will stay.
  • Why doesn't it submit its case to the higher court of scientific test? I think we all know the answer.
Weiye Loh

Balderdash - 0 views

  • A letter Paul wrote to complain about the "Dead Sea Scrolls" exhibition at the Arts House:

    To Ms. Amira Osman (Marketing and Communications Manager)
    cc. Colin Goh, General Manager; Florence Lee, Deputy General Manager

    Dear Ms. Osman,

    I visited the Dead Sea Scrolls "exhibition" today with my wife. Thinking that it was from a legitimate scholarly institute or (how naïve of me!) the Israel Antiquities Authority, I was looking forward to a day of education and entertainment. Yet when I got in, much of the exhibition (and booklets) merely espouses an evangelical (fundamentalist) view of the Bible – there are booklets on the inerrancy of the Bible, on how archaeology has proven the Bible to be true, etc.

    Apart from these there are many blatant misrepresentations of the state of archaeology and mainstream biblical scholarship:

    a) There was an initial screening upon entry of a 5-10 minute pseudo-documentary on the Dead Sea Scrolls. A presenter (I can't remember the name) was described as a "biblical archaeologist" – a term that no serious archaeologist working in the Levant would apply to him or herself. (Some prefer the term "Syro-Palestinian archaeologist", but almost all reject the term "biblical archaeologist".) See the book by Thomas W. Davis, "Shifting Sands: The Rise and Fall of Biblical Archaeology", Oxford, New York 2004. Davis is an actual archaeologist working in the field, and the book explains why the term "biblical archaeologist" is not considered legitimate by serious archaeologists.

    b) In the same presentation, the presenter made the erroneous statement that the entire Old Testament was translated into Greek in the third century BCE. This is a mistake – only the Pentateuch (the first five books of the Old Testament) was translated during that time. Note that this 'error' is not inadvertent but is a familiar claim by evangelical apologists who try to argue for an early date for all the books of the Old Testament: if all the books had been translated by the third century BCE, obviously they must all have been written before then! This flies against modern scholarship, which shows that some books of the Old Testament, such as the Book of Daniel, were written only in the second century BCE. The actual state of scholarship on the Septuagint [the Greek translation of the Bible] is accurately given in the book by Ernst Würthwein, "The Text of the Old Testament", Eerdmans 1988, pp. 52-54.

    c) Perhaps the most blatant error was one which claimed that the "Magdalene fragments" – which contain the 26th chapter of the Gospel of Matthew – are dated to 50 AD!!! Scholars are unanimous in dating these fragments to 200 AD. The only 'scholar' cited who dated these fragments to 50 AD was the German papyrologist Carsten Thiede – a well-known fundamentalist. This is what Burton Mack (a critical – legitimate – NT scholar) has to say about Thiede's eccentric dating: "From a critical scholar's point of view, Thiede's proposal is an example of just how desperate the Christian imagination can become in the quest to argue for the literal facticity of the Christian gospels" [Mack, Burton L., "Who Wrote the New Testament?: The Making of the Christian Myth", HarperCollins, San Francisco 1995]. Yet the dating of 50 AD is presented as though it were a scholarly consensus position!

    In fact the last point was so blatant that I confronted the exhibitors. (Tak Boleh Tahan!!) One American exhibitor told me that "Yes, it could have been worded differently, but then we would have to change the whole display" (!!). When I told him that this was not a typo but a blatant attempt to deceive, he mentioned that Thiede's views are supported by "The Dallas Theological Seminary" – another well-known evangelical institute!

    I have no issue with the religious strengthening their faith by having their own internal exhibitions on historical artifacts etc. But when it is presented to the public as a scholarly exhibition – this is quite close to being dishonest. I felt cheated of the $36 I paid for the tickets, and of the hour that I spent there before realizing what type of exhibition it was. I am disappointed with the Arts House for showcasing this without warning potential visitors of its clear religious bias.

    Yours sincerely,
    Paul Tobin

    To their credit, the Arts House speedily replied.
    • Weiye Loh
       
       The issue of truth is indeed maddening. Certainly, the 'production' of truth has been widely researched and debated by scholars. Spivak, for example, argued for deconstruction by means of questioning the privilege of identity through which someone is believed to have the truth. And along the same line, albeit somewhat misunderstood I feel, it was mentioned in class that somehow people who are oppressed know better.