New Media Ethics 2009 course: Group items tagged "first"

Weiye Loh

Rationally Speaking: What do I think of Wikipedia? - 0 views

  • Scholarpedia. I know, this is probably the first time you've heard of it, and I must admit that I never use it myself, but it is an open access peer reviewed encyclopedia, curated by Dr. Eugene M. Izhikevich, associated with an outlet called the Brain Corporation, out in San Diego, CA. Don’t know anything more about it (even Wikipedia doesn’t have an article on that!).
  • I go to Wikipedia at least some of the times to use it as a starting point, a convenient trampoline to be used — together with Google (and, increasingly, Google Scholar) — to get an initial foothold into areas with which I am a bit less familiar. However, I don’t use that information for my writings (professional or for the general public) unless I actually check the sources and/or have independent confirmation of whatever it is that I found potentially interesting in the relevant Wikipedia article. This isn’t a matter of academic snobbism, it’s rather a question of sensibly covering your ass — which is the same advice I give to my undergraduate students (my graduate students better not be using Wiki for anything substantial at all, the peer reviewed Stanford Encyclopedia of Philosophy being a manyfold better source across the board).
Weiye Loh

Debating the Value of College in America : The New Yorker - 0 views

  • Society needs a mechanism for sorting out its more intelligent members from its less intelligent ones
  • Society wants to identify intelligent people early on so that it can funnel them into careers that maximize their talents. It wants to get the most out of its human resources. College is a process that is sufficiently multifaceted and fine-grained to do this. College is, essentially, a four-year intelligence test. Students have to demonstrate intellectual ability over time and across a range of subjects. If they’re sloppy or inflexible or obnoxious—no matter how smart they might be in the I.Q. sense—those negatives will get picked up in their grades.
  • college also sorts people according to aptitude. It separates the math types from the poetry types. At the end of the process, graduates get a score, the G.P.A., that professional schools and employers can trust as a measure of intellectual capacity and productive potential. It’s important, therefore, that everyone is taking more or less the same test.
  • College exposes future citizens to material that enlightens and empowers them, whatever careers they end up choosing. In performing this function, college also socializes. It takes people with disparate backgrounds and beliefs and brings them into line with mainstream norms of reason and taste. Independence of mind is tolerated in college, and even honored, but students have to master the accepted ways of doing things before they are permitted to deviate. Ideally, we want everyone to go to college, because college gets everyone on the same page. It’s a way of producing a society of like-minded grownups.
  • If you like the first theory, then it doesn’t matter which courses students take, or even what is taught in them, as long as they’re rigorous enough for the sorting mechanism to do its work. All that matters is the grades. If you prefer the second theory, then you might consider grades a useful instrument of positive or negative reinforcement, but the only thing that matters is what students actually learn. There is stuff that every adult ought to know, and college is the best delivery system for getting that stuff into people’s heads.
Weiye Loh

Major reform for climate body : Nature News - 0 views

  • The first major test of these changes will be towards the end of this year, with the release of a report assessing whether climate change is increasing the likelihood of extreme weather events. Despite much speculation, there is scant scientific evidence for such a link — particularly between climate warming, storm frequency and economic losses — and the report is expected to spark renewed controversy. "It'll be interesting to see how the IPCC will handle this hot potato where stakes are high but solid peer-reviewed results are few," says Silke Beck, a policy expert at the Helmholtz Centre for Environmental Research in Leipzig, Germany.
  • A new conflict-of-interest policy will require all IPCC officials and authors to disclose financial and other interests relevant to their work (Pachauri had been harshly criticized in 2009 for alleged conflicts of interest). The meeting also adopted a detailed protocol for addressing errors in existing and future IPCC reports, along with guidelines to ensure that descriptions of scientific uncertainties remain consistent across reports. "This is a heartening and encouraging outcome of the review we started one year ago," Pachauri told Nature. "It will strengthen the IPCC and help restore public trust in the climate sciences."
Weiye Loh

Hamlet and the region of death - The Boston Globe - 0 views

  • To many readers — and to some of Moretti’s fellow academics — the very notion of quantitative literary studies can seem like an offense to that which made literature worth studying in the first place: its meaning and beauty. For Moretti, however, moving literary scholarship beyond reading is the key to producing new knowledge about old texts — even ones we’ve been studying for centuries.
  • Franco Moretti, however, often doesn't read the books he studies. Instead, he analyzes them as data. Working with a small group of graduate students, the Stanford University English professor has fed thousands of digitized texts into databases and then mined the accumulated information for new answers to new questions. How far, on average, do characters in 19th-century English novels walk over the course of a book? How frequently are new genres of popular fiction invented? How many words does the average novel's protagonist speak? By posing these and other questions, Moretti has become the unofficial leader of a new, more quantitative kind of literary study.
Weiye Loh

Jonathan Stray » Measuring and improving accuracy in journalism - 0 views

  • Accuracy is a hard thing to measure because it’s a hard thing to define. There are subjective and objective errors, and no standard way of determining whether a reported fact is true or false
  • The last big study of mainstream reporting accuracy found errors (defined below) in 59% of 4,800 stories across 14 metro newspapers. This level of inaccuracy — where about one in every two articles contains an error — has persisted for as long as news accuracy has been studied, over seven decades now.
  • With the explosion of available information, more than ever it’s time to get serious about accuracy, about knowing which sources can be trusted. Fortunately, there are emerging techniques that might help us to measure media accuracy cheaply, and then increase it.
  • We could continuously sample a news source’s output to produce ongoing accuracy estimates, and build social software to help the audience report and filter errors. Meticulously applied, this approach would give a measure of the accuracy of each information source, and a measure of the efficiency of their corrections process (currently only about 3% of all errors are corrected.)
  • Real world reporting isn’t always clearly “right” or “wrong,” so it will often be hard to decide whether something is an error or not. But we’re not going for ultimate Truth here, just a general way of measuring some important aspect of the idea we call “accuracy.” In practice it’s important that the error counting method is simple, clear and repeatable, so that you can compare error rates of different times and sources.
  • Subjective errors, though by definition involving judgment, should not be dismissed as merely differences in opinion. Sources found such errors to be about as common as factual errors and often more egregious [as rated by the sources.] But subjective errors are a very complex category
  • One of the major problems with previous news accuracy metrics is the effort and time required to produce them. In short, existing accuracy measurement methods are expensive and slow. I’ve been wondering if we can do better, and a simple idea comes to mind: sampling. The core idea is this: news sources could take an ongoing random sample of their output and check it for accuracy — a fact check spot check
  • Standard statistical theory tells us what the error on that estimate will be for any given number of samples (If I’ve got this right, the relevant formula is standard error of a population proportion estimate without replacement.) At a sample rate of a few stories per day, daily estimates of error rate won’t be worth much. But weekly and monthly aggregates will start to produce useful accuracy estimates [a minimal worked sketch of this calculation follows these annotations]
  • the first step would be admitting how inaccurate journalism has historically been. Then we have to come up with standardized accuracy evaluation procedures, in pursuit of metrics that capture enough of what we mean by “true” to be worth optimizing. Meanwhile, we can ramp up the efficiency of our online corrections processes until we find as many useful, legitimate errors as possible with as little staff time as possible. It might also be possible to do data mining on types of errors and types of stories to figure out if there are patterns in how an organization fails to get facts right.
  • I’d love to live in a world where I could compare the accuracy of information sources, where errors got found and fixed with crowd-sourced ease, and where news organizations weren’t shy about telling me what they did and did not know. Basic factual accuracy is far from the only measure of good journalism, but perhaps it’s an improvement over the current sad state of affairs
  • Professional journalism is supposed to be "factual," "accurate," or just plain true. Is it? Has news accuracy been getting better or worse in the last decade? How does it vary between news organizations, and how do other information sources rate? Is professional journalism more or less accurate than everything else on the internet? These all seem like important questions, so I've been poking around, trying to figure out what we know and don't know about the accuracy of our news sources. Meanwhile, the online news corrections process continues to evolve, which gives us hope that the news will become more accurate in the future.
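Stray's sampling idea can be made concrete. Below is a minimal sketch in Python, assuming an invented newsroom (the story counts, error count, and function name are all illustrative); the only piece taken from the post is the formula itself, the standard error of a population proportion estimated without replacement:

```python
import math

def error_rate_estimate(errors_found, sample_size, population):
    """Point estimate and standard error of a news source's error rate,
    estimated from a random spot-check sample of its stories."""
    p = errors_found / sample_size
    # Standard error of a proportion, with the finite-population
    # correction for sampling without replacement.
    se = math.sqrt(p * (1 - p) / sample_size)
    se *= math.sqrt((population - sample_size) / (population - 1))
    return p, se

# Hypothetical week: 5 stories a day sampled from a 700-story output,
# with 21 of the 35 sampled stories found to contain an error.
p, se = error_rate_estimate(errors_found=21, sample_size=35, population=700)
print(f"estimated error rate: {p:.0%} +/- {1.96 * se:.0%} (approx. 95% CI)")
# -> estimated error rate: 60% +/- 16% (approx. 95% CI)
```

At a few sampled stories per day the interval is wide, which is exactly the post's point: daily estimates are nearly worthless, while weekly and monthly aggregates tighten enough to be useful.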
Weiye Loh

Freakonomics » Why Is Failure a Sign of a Healthy Economy? A Guest Post by Ti... - 0 views

  • Governments often fall down on all three: they have a particular ideology and so push a single-minded policy; they bet big; and they don’t bother to evaluate the results too carefully, perhaps through overconfidence. But markets can fail badly too, and for much the same reason. Just think about the subprime crisis. It failed the same three tests. First, many big banks and insurance companies were taking similar bets at similar times, so that when subprime loans started to go bad, much of Wall Street started struggling simultaneously. Second, the bets were gigantic. Fancy derivatives such as credit default swaps and complex mortgage-backed securities were new, rapidly growing, and largely untested. And third, many investment bankers were being paid large bonuses on the assumption that their performance could be measured properly – and it couldn’t, because profitable-seeming bets concealed large risks.
  • a study by Kathy Fogel, Randall Morck, and Bernard Yeung, found statistical evidence that economies with more churn in the corporate sector also had faster economic growth. The relationship even seems causal: churn today is correlated with fast economic growth tomorrow. The real benefit of this creative destruction, say Fogel and her colleagues, is not the appearance of “rising stars” but the disappearance of old, inefficient companies. Failure is not only common and unpredictable, it’s healthy.
Weiye Loh

Google's in-house philosopher: Technologists need a "moral operating system" | VentureBeat - 0 views

  • technology-makers aren’t supposed to think about the morality of their products — they just build stuff and let other people worry about the ethics. But Horowitz pointed to the Manhattan Project, where physicists developed the nuclear bomb, as an obvious example where technologists should have thought carefully about the moral dimensions of their work. To put it another way, he argued that technology makers should be thinking as much about their “moral operating system” as their mobile operating system.
  • most of the evil in the world comes not from bad intentions, but rather from “not thinking.”
  • “Ethics is hard,” Horowitz said. “Ethics requires thinking.”
  • try to articulate how they decided what was right and wrong. “That’s the first step towards taking responsibility towards what we should do with all of our power,” Horowitz said, later adding, “We have so much power today. It is up to us to figure out what to do.”
  • To illustrate how ethics are getting short shrift in the tech world, Horowitz asked attendees whether they prefer the iPhone or Android. (When the majority voted for the iPhone, he joked that they were "suckers" who just chose the prettier device.) Then he asked whether it was a good idea to take data from an audience member's phone in order to provide various (and mostly beneficial) services, or whether he should be left alone, and the majority of the audience voted to leave him alone. Finally, Horowitz wanted to know whether audience members would use the ideas proposed by John Stuart Mill or by Immanuel Kant to make that decision. Not surprisingly, barely anyone knew what he was talking about. "That's a terrifying result," Horowitz said. "We have stronger opinions about our handheld devices than about the moral framework we should use to guide our decisions."
Weiye Loh

Don't dumb me down | Science | The Guardian - 0 views

  • Science stories usually fall into three families: wacky stories, scare stories and "breakthrough" stories.
  • these stories are invariably written by the science correspondents, and hotly followed, to universal jubilation, with comment pieces, by humanities graduates, on how bonkers and irrelevant scientists are.
  • A close relative of the wacky story is the paradoxical health story. Every Christmas and Easter, regular as clockwork, you can read that chocolate is good for you (www.badscience.net/?p=67), just like red wine is, and with the same monotonous regularity
  • At the other end of the spectrum, scare stories are - of course - a stalwart of media science. Based on minimal evidence and expanded with poor understanding of its significance, they help perform the most crucial function for the media, which is selling you, the reader, to their advertisers. The MMR disaster was a fantasy entirely of the media's making (www.badscience.net/?p=23), which failed to go away. In fact the Daily Mail is still publishing hysterical anti-immunisation stories, including one calling the pneumococcus vaccine a "triple jab", presumably because they misunderstood that the meningitis, pneumonia, and septicaemia it protects against are all caused by the same pneumococcus bacteria (www.badscience.net/?p=118).
  • people periodically come up to me and say, isn't it funny how that Wakefield MMR paper turned out to be Bad Science after all? And I say: no. The paper always was and still remains a perfectly good small case series report, but it was systematically misrepresented as being more than that, by media that are incapable of interpreting and reporting scientific data.
  • Once journalists get their teeth into what they think is a scare story, trivial increases in risk are presented, often out of context, but always using one single way of expressing risk, the "relative risk increase", that makes the danger appear disproportionately large (www.badscience.net/?p=8). [A worked relative-versus-absolute-risk example follows these annotations.]
  • The media obsession with "new breakthroughs": a more subtly destructive category of science story. It's quite understandable that newspapers should feel it's their job to write about new stuff. But in the aggregate, these stories sell the idea that science, and indeed the whole empirical world view, is only about tenuous, new, hotly-contested data
  • Articles about robustly-supported emerging themes and ideas would be more stimulating, of course, than most single experimental results, and these themes are, most people would agree, the real developments in science. But they emerge over months and several bits of evidence, not single rejiggable press releases. Often, a front page science story will emerge from a press release alone, and the formal academic paper may never appear, or appear much later, and then not even show what the press reports claimed it would (www.badscience.net/?p=159).
  • there was an interesting essay in the journal PLoS Medicine, about how most brand new research findings will turn out to be false (www.tinyurl.com/ceq33). It predictably generated a small flurry of ecstatic pieces from humanities graduates in the media, along the lines of science is made-up, self-aggrandising, hegemony-maintaining, transient fad nonsense; and this is the perfect example of the parody hypothesis that we'll see later. Scientists know how to read a paper. That's what they do for a living: read papers, pick them apart, pull out what's good and bad.
  • Scientists never said that tenuous small new findings were important headline news - journalists did.
  • there is no useful information in most science stories. A piece in the Independent on Sunday from January 11 2004 suggested that mail-order Viagra is a rip-off because it does not contain the "correct form" of the drug. I don't use the stuff, but there were 1,147 words in that piece. Just tell me: was it a different salt, a different preparation, a different isomer, a related molecule, a completely different drug? No idea. No room for that one bit of information.
  • Remember all those stories about the danger of mobile phones? I was on holiday at the time, and not looking things up obsessively on PubMed; but off in the sunshine I must have read 15 newspaper articles on the subject. Not one told me what the experiment flagging up the danger was. What was the exposure, the measured outcome, was it human or animal data? Figures? Anything? Nothing. I've never bothered to look it up for myself, and so I'm still as much in the dark as you.
  • Because papers think you won't understand the "science bit", all stories involving science must be dumbed down, leaving pieces without enough content to stimulate the only people who are actually going to read them - that is, the people who know a bit about science.
  • Compare this with the book review section, in any newspaper. The more obscure references to Russian novelists and French philosophers you can bang in, the better writer everyone thinks you are. Nobody dumbs down the finance pages.
  • Statistics are what causes the most fear for reporters, and so they are usually just edited out, with interesting consequences. Because science isn't about something being true or not true: that's a humanities graduate parody. It's about the error bar, statistical significance, it's about how reliable and valid the experiment was, it's about coming to a verdict, about a hypothesis, on the back of lots of bits of evidence.
  • science journalists somehow don't understand the difference between the evidence and the hypothesis. The Times's health editor Nigel Hawkes recently covered an experiment which showed that having younger siblings was associated with a lower incidence of multiple sclerosis. MS is caused by the immune system turning on the body. "This is more likely to happen if a child at a key stage of development is not exposed to infections from younger siblings, says the study." That's what Hawkes said. Wrong! That's the "Hygiene Hypothesis", that's not what the study showed: the study just found that having younger siblings seemed to be somewhat protective against MS: it didn't say, couldn't say, what the mechanism was, like whether it happened through greater exposure to infections. He confused evidence with hypothesis (www.badscience.net/?p=112), and he is a "science communicator".
  • how do the media work around their inability to deliver scientific evidence? They use authority figures, the very antithesis of what science is about, as if they were priests, or politicians, or parent figures. "Scientists today said ... scientists revealed ... scientists warned." And if they want balance, you'll get two scientists disagreeing, although with no explanation of why (an approach at its most dangerous with the myth that scientists were "divided" over the safety of MMR). One scientist will "reveal" something, and then another will "challenge" it
  • The danger of authority figure coverage, in the absence of real evidence, is that it leaves the field wide open for questionable authority figures to waltz in. Gillian McKeith, Andrew Wakefield, Kevin Warwick and the rest can all get a whole lot further, in an environment where their authority is taken as read, because their reasoning and evidence is rarely publicly examined.
  • it also reinforces the humanities graduate journalists' parody of science, for which we now have all the ingredients: science is about groundless, incomprehensible, didactic truth statements from scientists, who themselves are socially powerful, arbitrary, unelected authority figures. They are detached from reality: they do work that is either wacky, or dangerous, but either way, everything in science is tenuous, contradictory and, most ridiculously, "hard to understand".
  • This misrepresentation of science is a direct descendant of the reaction, in the Romantic movement, against the birth of science and empiricism more than 200 years ago; it's exactly the same paranoid fantasy as Mary Shelley's Frankenstein, only not as well written. We say descendant, but of course, the humanities haven't really moved forward at all, except to invent cultural relativism, which exists largely as a pooh-pooh reaction against science. And humanities graduates in the media, who suspect themselves to be intellectuals, desperately need to reinforce the idea that science is nonsense: because they've denied themselves access to the most significant developments in the history of western thought for 200 years, and secretly, deep down, they're angry with themselves over that.
  • had a good spirited row with an eminent science journalist, who kept telling me that scientists needed to face up to the fact that they had to get better at communicating to a lay audience. She is a humanities graduate. "Since you describe yourself as a science communicator," I would invariably say, to the sound of derisory laughter: "isn't that your job?" But no, for there is a popular and grand idea about, that scientific ignorance is a useful tool: if even they can understand it, they think to themselves, the reader will. What kind of a communicator does that make you?
  • Science is done by scientists, who write it up. Then a press release is written by a non-scientist, who runs it by their non-scientist boss, who then sends it to journalists without a science education who try to convey difficult new ideas to an audience of either lay people, or more likely - since they'll be the ones interested in reading the stuff - people who know their way around a t-test a lot better than any of these intermediaries. Finally, it's edited by a whole team of people who don't understand it. You can be sure that at least one person in any given "science communication" chain is just juggling words about on a page, without having the first clue what they mean, pretending they've got a proper job, their pens all lined up neatly on the desk.
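Goldacre's complaint about the "relative risk increase" is easiest to see with numbers. A small worked example (the risks below are invented for illustration, not drawn from any study):

```python
# Invented figures: a baseline risk of 2 in 10,000 rising to 3 in 10,000
# under some exposure.
baseline = 2 / 10_000
exposed = 3 / 10_000

relative_increase = (exposed - baseline) / baseline  # the headline number
absolute_increase = exposed - baseline               # the informative number

print(f"relative risk increase: {relative_increase:.0%}")                  # 50%
print(f"absolute risk increase: {absolute_increase:.4%}")                  # 0.0100%
print(f"extra cases per 10,000 people: {absolute_increase * 10_000:.0f}")  # 1
```

The same data support a "risk up 50%" headline and a one-extra-case-in-ten-thousand reality, which is precisely the disproportion Goldacre describes.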
Weiye Loh

TODAYonline | World | On BBC: Assisted suicide of terminally ill man - 0 views

  • Mr Smedley was such a private man that none of those friends had known in advance that he had planned his own assisted suicide at the Dignitas clinic in Switzerland, where he drank poison and died on Dec 10 last year. A few days after Mr Smedley's death, his close friends found individually-written letters from him in their post, telling each one how much they had meant to him.
  • They were in for a greater surprise when they were joined at his memorial service by Sir Terry, the author and campaigner, and the BBC crew that had filmed Mr Smedley's final moments. Unknown to all but his closest family, Mr Smedley invited Sir Terry to accompany him and his wife Christine, 60, to the clinic in Switzerland. Moments before he died, said Sir Terry: "I shook hands with Peter and he said to me 'Have a good life', and he added 'I know I have'." When a clinic worker asked him if he was ready to drink the poison that would end his life, Mr Smedley said: "Yes" and added: "I'd like to thank you all." After Mr Smedley died, said Sir Terry: "I was spinning, not because anything bad had happened, but something was saying, 'A man is dead... that's a bad thing,' but somehow the second part of the clause chimes in with, 'but he had an incurable disease that was dragging him down, so he's decided of his own free will to leave before he was dragged'. So it's not a bad thing."
High Syn

The Best Thing Happened to Party People Like Me - 1 views

I really had my doubts when I first heard about herbal highs and legal weed. I said "there is no such thing", but everything changed when I tried Kronic original. True to its promise, it gives the ...

legal pot

started by High Syn on 13 Jun 11 no follow-up yet
Weiye Loh

How the net traps us all in our own little bubbles | Technology | The Observer - 0 views

  • Google would use 57 signals – everything from where you were logging in from to what browser you were using to what you had searched for before – to make guesses about who you were and what kinds of sites you'd like. Even if you were logged out, it would customise its results, showing you the pages it predicted you were most likely to click on.
  • Most of us assume that when we google a term, we all see the same results – the ones that the company's famous PageRank algorithm suggests are the most authoritative based on other pages' links. But since December 2009, this is no longer true. Now you get the result that Google's algorithm suggests is best for you in particular – and someone else may see something entirely different. In other words, there is no standard Google any more.
  • In the spring of 2010, while the remains of the Deepwater Horizon oil rig were spewing oil into the Gulf of Mexico, I asked two friends to search for the term "BP". They're pretty similar – educated white left-leaning women who live in the north-east. But the results they saw were quite different. One saw investment information about BP. The other saw news. [A toy re-ranking sketch follows these annotations.]
  • the query "stem cells" might produce diametrically opposed results for scientists who support stem-cell research and activists who oppose it.
  • "Proof of climate change" might turn up different results for an environmental activist and an oil-company executive.
  • majority of us assume search engines are unbiased. But that may be just because they're increasingly biased to share our own views. More and more, your computer monitor is a kind of one-way mirror, reflecting your own interests while algorithmic observers watch what you click. Google's announcement marked the turning point of an important but nearly invisible revolution in how we consume information. You could say that on 4 December 2009 the era of personalisation began.
  • We are predisposed to respond to a pretty narrow set of stimuli – if a piece of news is about sex, power, gossip, violence, celebrity or humour, we are likely to read it first. This is the content that most easily makes it into the filter bubble. It's easy to push "Like" and increase the visibility of a friend's post about finishing a marathon or an instructional article about how to make onion soup. It's harder to push the "Like" button on an article titled "Darfur sees bloodiest month in two years". In a personalised world, important but complex or unpleasant issues – the rising prison population, for example, or homelessness – are less likely to come to our attention at all.
  • As a consumer, it's hard to argue with blotting out the irrelevant and unlikable. But what is good for consumers is not necessarily good for citizens. What I seem to like may not be what I actually want, let alone what I need to know to be an informed member of my community or country. "It's a civic virtue to be exposed to things that appear to be outside your interest," technology journalist Clive Thompson told me. Cultural critic Lee Siegel puts it a different way: "Customers are always right, but people aren't."
  • Personalisation is based on a bargain. In exchange for the service of filtering, you hand large companies an enormous amount of data about your daily life – much of which you might not trust friends with.
  • To be the author of your life, professor Yochai Benkler argues, you have to be aware of a diverse array of options and lifestyles. When you enter a filter bubble, you're letting the companies that construct it choose which options you're aware of. You may think you're the captain of your own destiny, but personalisation can lead you down a road to a kind of informational determinism in which what you've clicked on in the past determines what you see next – a web history you're doomed to repeat. You can get stuck in a static, ever-narrowing version of yourself – an endless you-loop.
  • An invisible revolution has taken place in the way we use the net, but the increasing personalisation of information by search engines such as Google threatens to limit our access to information and enclose us in a self-reinforcing world view, writes Eli Pariser in an extract from The Filter Bubble.
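Pariser's extract counts Google's signals but does not name them or say how they are combined. The mechanics of the re-ranking he describes can still be sketched with a toy model; every signal, weight, and score below is invented for illustration and is not Google's algorithm:

```python
# Toy personalised ranking: the same query, two users, two orderings.
def personalised_rank(results, user_signals):
    def score(result):
        base = result["authority"]  # stand-in for a link-based rank
        # Boost topics this user has engaged with before.
        boost = sum(user_signals.get(t, 0.0) for t in result["topics"])
        return base + boost
    return sorted(results, key=score, reverse=True)

results = [
    {"url": "bp.com/investors",   "authority": 0.9, "topics": ["finance"]},
    {"url": "news.example/spill", "authority": 0.8, "topics": ["news", "environment"]},
]

investor = {"finance": 0.5}                   # history of finance clicks
activist = {"environment": 0.5, "news": 0.2}  # history of news and green clicks

print([r["url"] for r in personalised_rank(results, investor)])
# -> ['bp.com/investors', 'news.example/spill']
print([r["url"] for r in personalised_rank(results, activist)])
# -> ['news.example/spill', 'bp.com/investors']
```

The filter bubble arises when each click feeds back into user_signals, so yesterday's clicks tilt today's results: the "you-loop" of the extract.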
Weiye Loh

UNICEF - India - Children map their community using innovative technology in India - 0 views

  • After data were collected, the children drew the map’s first draft on a big sheet of paper. It clearly labelled and colour-coded each detail, from houses to street lamps. Now, the map and survey – which identified 71 sources of water but not one clean enough for drinking – can also be used as a powerful advocacy tool.
  • Ms. Das says improvements have already been made. Pointing to a lamp post in her crowded alley, she observes, “Things are already better. We have more light here.” The children also use survey data to target households during polio immunization campaigns. In teams armed with handmade paper megaphones and signs, they regularly march about shouting: “Shunun, shunun (listen),” imploring neighbours to bring children for polio drops. They also take toddlers to polio booths themselves. The children also mobilize for malaria information drives, to check on children who drop out of school, or to teach proper hand washing techniques. They tackle tough topics, like child marriage and human trafficking, with puppets and street plays at each community festival.
Weiye Loh

How drug companies' PR tactics skew the presentation of medical research | Science | gu... - 0 views

  • Drug companies exert this hold on knowledge through publication planning agencies, an obscure subsection of the pharmaceutical industry that has ballooned in size in recent years, and is now a key lever in the commercial machinery that gets drugs sold. The planning companies are paid to implement high-impact publication strategies for specific drugs. They target the most influential academics to act as authors, draft the articles, and ensure that these include clearly-defined branding messages and appear in the most prestigious journals.
  • In selling their services to drug companies, the agencies explain their work in frank language. Current Medical Directions, a medical communications company based in New York, promises to create "scientific content in support of our clients' messages". A rival firm from Macclesfield, Complete HealthVizion, describes what it does as "a fusion of evidence and inspiration."
  • There are now at least 250 different companies engaged in the business of planning clinical publications for the pharmaceutical industry, according to the International Society for Medical Publication Professionals, which said it has over 1000 individual members. Many firms are based in the UK and the east coast of the United States in traditional "pharma" centres like Pennsylvania and New Jersey. Precise figures are hard to pin down because publication planning is widely dispersed and is only beginning to be recognized as something like a discrete profession.
  • the standard approach to article preparation is for planners to work hand-in-glove with drug companies to create a first draft. "Key messages" laid out by the drug companies are accommodated to the extent that they can be supported by available data. Planners combine scientific information about a drug with two kinds of message that help create a "drug narrative". "Environmental" messages are intended to forge the sense of a gap in available medicine within a specific clinical field, while "product" messages show how the new drug meets this need.
  • In a flow-chart drawn up by Eric Crown, publications manager at Merck (the company that sold the controversial painkiller Vioxx), the determination of authorship appears as the fourth stage of the article preparation procedure. That is, only after company employees have presented clinical study data, discussed the findings, finalised "tactical plans" and identified where the article should be published. Perhaps surprisingly to the casual observer, under guidelines tightened up in recent years by the International Committee of Medical Journal Editors (ICMJE), Crown's approach, typical among pharmaceutical companies, does not constitute ghostwriting.
  • What publication planners understand by the term is precise but it is also quite distinct from the popular interpretation.
  • "We may have written a paper, but the people we work with have to have some input and approve it."
  • "I feel that we're doing something good for mankind in the long-run," said Kimberly Goldin, head of the International Society for Medical Publication Professionals (ISMPP). "We want to influence healthcare in a very positive, scientifically sound way.""The profession grew out of a marketing umbrella, but has moved under the science umbrella," she said.But without the window of court documents to show how publication planning is being carried out today, the public simply cannot know if reforms the industry says it has made are genuine.
  • Dr Leemon McHenry, a medical ethicist at California State University, says nothing has changed. "They've just found more clever ways of concealing their activities. There's a whole army of hidden scribes. It's an epistemological morass where you can't trust anything." Alastair Matheson is a British medical writer who has worked extensively for medical communication agencies. He dismisses the planners' claims to having reformed as "bullshit". "The new guidelines work very nicely to permit the current system to continue as it has been," he said. "The whole thing is a big lie. They are promoting a product."
Weiye Loh

Media Reacts to News That Norwegian Terror Suspect Isn't Muslim - Global - The Atlantic... - 0 views

  • The editorial remains up on the Post, "sixteen hours after its claims were shown to be false and hysterical, it's still there, with no correction or apology," according to James Fallows at The Atlantic. Fallows responded to Rubin's piece, in a blog post titled, "The Washington Post Owes the World an Apology for this Item," writing that: No, this is a sobering reminder for those who think it's too tedious to reserve judgment about horrifying events rather than instantly turning them into talking points for pre-conceived views. On a per capita basis, Norway lost twice as many people today as the U.S. did on 9/11. Imagine the political repercussions through the world if double-9/11-scale damage had been done by an al-Qaeda offshoot. The unbelievably sweeping damage is there in either case.
  • Ta-Nehisi Coates, in another Comment at The Atlantic, echoed Fallows's comments on Rubin's piece: As for this case, my golden rule is that as terrible as it is to be wrong, it is many times more terrible to pretend that wrong is right. As of this writing, Rubin has issued no correction in any form. That is shameful.
  • In an op-ed at Jadaliyya, Shiva Balaghi calls the events a "Tragic Day for Norway; Shameful Day for Journalism." She summarizes her own view of the reports: I read a story in the New York Times that squarely pointed to jihadi groups angered at the war in Afghanistan...The Financial Times was no better. From the start, it reported allegations of Islamic terrorism, continuing with this view well into its evening reporting by which time an arrest had already been made in the case... Judy Woodruff’s interview with a Norwegian journalist that aired on PBS’s Newshour followed a similar scenario. In this 24/7 news cycle driven even more mad by terror experts who conduct research using google and tweet a mile a minute, journalists should exercise caution... Perhaps today the neo-Nazis in Europe count Muslims among the problems that drive their madness. But to a large degree, these right wing extremist views shaped twentieth century Europe.
  • Ibrahim Hewitt writes an editorial at Al-Jazeera, where he observes that once media outlets noted that the suspect was not Muslim, they disassociated connections between the suspect's beliefs and his alleged violent actions. ...the perpetrator was a "blond, blue-eyed Norwegian" with "political traits towards the right, and anti-Muslim views." Not surprisingly, the man's intentions were neither linked to these "traits," nor to his postings on "websites with Christian fundamentalist tendencies." Any influence "remains to be seen"; echoes of Oklahoma 1995. Interestingly, this criminal is described by one unnamed Norwegian official as a "madman."
  • ...Anyone who claims therefore, that the perpetrator's "right-wing traits" and "anti-Muslim views," or even links with "Christian fundamentalist" websites are irrelevant is trying to draw a veil over the unacceptable truths of such "traits" and expecting us to believe that right-wing ideology is incapable of prompting someone towards such criminality. Of course, that idea is nonsensical. Right-wing ideology was behind the Holocaust; it has been behind most anti-Semitism and other racism around the world; the notion of Europe's and Europeans' racial superiority - giving cultural credibility to the far-right - gave rise to the slave trade and the scramble for Africa leading to untold atrocities against "the Other"; ditto in the Middle and Far East.
  • Jennifer Rubin
Weiye Loh

Some Scientists Fear Computer Chips Will Soon Hit a Wall - NYTimes.com - 0 views

  • The problem has the potential to counteract an important principle in computing that has held true for decades: Moore’s Law. It was Gordon Moore, a founder of Intel, who first predicted that the number of transistors that could be nestled comfortably and inexpensively on an integrated circuit chip would double roughly every two years, bringing exponential improvements in consumer electronics.
  • In their paper, Dr. Burger and fellow researchers simulated the electricity used by more than 150 popular microprocessors and estimated that by 2024 computing speed would increase only 7.9 times, on average. By contrast, if there were no limits on the capabilities of the transistors, the maximum potential speedup would be nearly 47 times, the researchers said. [A back-of-the-envelope comparison follows these annotations.]
  • Some scientists disagree, if only because new ideas and designs have repeatedly come along to preserve the computer industry’s rapid pace of improvement. Dr. Dally of Nvidia, for instance, is sanguine about the future of chip design. “The good news is that the old designs are really inefficient, leaving lots of room for innovation,” he said.
  • Shekhar Y. Borkar, a fellow at Intel Labs, called Dr. Burger’s analysis “right on the dot,” but added: “His conclusions are a little different than what my conclusions would have been. The future is not as golden as it used to be, but it’s not bleak either.” Dr. Borkar cited a variety of new design ideas that he said would help ease the limits identified in the paper. Intel recently developed a way to vary the power consumed by different parts of a processor, making it possible to have both slower, lower-power transistors as well as faster-switching ones that consume more power. Increasingly, today’s processor chips contain two or more cores, or central processing units, that make it possible to use multiple programs simultaneously. In the future, Intel computers will have different kinds of cores optimized for different kinds of problems, only some of which require high power.
  • And while Intel announced in May that it had found a way to use 3-D design to crowd more transistors onto a single chip, that technology does not solve the energy problem described in the paper about dark silicon. The authors of the paper said they had tried to account for some of the promised innovation, and they argued that the question was how far innovators could go in overcoming the power limits.
  • “It’s one of those ‘If we don’t innovate, we’re all going to die’ papers,” Dr. Patterson said in an e-mail. “I’m pretty sure it means we need to innovate, since we don’t want to die!”
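For rough scale, the doubling rule in the first annotation can be turned into arithmetic. A back-of-the-envelope sketch: the two-year doubling period and the 7.9x/47x estimates come from the article, while the 2011 starting year (the article's date) is our assumption:

```python
# Moore's law: transistor budgets double roughly every two years.
years = 2024 - 2011
doublings = years / 2
print(f"transistor budget grows ~{2 ** doublings:.0f}x over {years} years")
# -> transistor budget grows ~91x over 13 years

# The dark-silicon argument: power limits mean the realised speedup
# (est. 7.9x) captures only a fraction of the unconstrained potential (47x).
print(f"realised fraction of potential speedup: {7.9 / 47:.0%}")
# -> realised fraction of potential speedup: 17%
```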
Weiye Loh

Roger Pielke Jr.'s Blog: Faith-Based Education and a Return to Shop Class - 0 views

  • In the United States, nearly a half century of research, application of new technologies and development of new methods and policies has failed to translate into improved reading abilities for the nation’s children.
  • the reasons why progress has been so uneven point to three simple rules for anticipating when more research and development (R&D) could help to yield rapid social progress. In a world of limited resources, the trick is distinguishing problems amenable to technological fixes from those that are not. Our rules provide guidance in making this distinction...
  • unlike vaccines, the textbooks and software used in education do not embody the essence of what needs to be done. That is, they don’t provide the basic ‘go’ of teaching and learning. That depends on the skills of teachers and on the attributes of classrooms and students. Most importantly, the effectiveness of a vaccine is largely independent of who gives or receives it, and of the setting in which it is given.
  • The three rules for a technological fix proposed by Sarewitz and Nelson are: I. The technology must largely embody the cause–effect relationship connecting problem to solution. II. The effects of the technological fix must be assessable using relatively unambiguous or uncontroversial criteria. III. Research and development is most likely to contribute decisively to solving a social problem when it focuses on improving a standardized technical core that already exists.
  • technology in the classroom fails with respect to each of the three criteria: (a) technology is not a causal factor in learning in the sense that more technology means more learning, (b) assessment of educational outcomes is itself difficult and contested, much less disentangling various causal factors, and (c) the lack of evidence that technology leads to improved educational outcomes means that there is no such standardized technological core.
  • This conundrum calls into question one of the most significant contemporary educational movements. Advocates for giving schools a major technological upgrade — which include powerful educators, Silicon Valley titans and White House appointees — say digital devices let students learn at their own pace, teach skills needed in a modern economy and hold the attention of a generation weaned on gadgets. Some backers of this idea say standardized tests, the most widely used measure of student performance, don’t capture the breadth of skills that computers can help develop. But they also concede that for now there is no better way to gauge the educational value of expensive technology investments.
  • absent clear proof, schools are being motivated by a blind faith in technology and an overemphasis on digital skills — like using PowerPoint and multimedia tools — at the expense of math, reading and writing fundamentals. They say the technology advocates have it backward when they press to upgrade first and ask questions later.
  • [D]emand for educated labour is being reconfigured by technology, in much the same way that the demand for agricultural labour was reconfigured in the 19th century and that for factory labour in the 20th. Computers can not only perform repetitive mental tasks much faster than human beings. They can also empower amateurs to do what professionals once did: why hire a flesh-and-blood accountant to complete your tax return when Turbotax (a software package) will do the job at a fraction of the cost? And the variety of jobs that computers can do is multiplying as programmers teach them to deal with tone and linguistic ambiguity. Several economists, including Paul Krugman, have begun to argue that post-industrial societies will be characterised not by a relentless rise in demand for the educated but by a great “hollowing out”, as mid-level jobs are destroyed by smart machines and high-level job growth slows. David Autor, of the Massachusetts Institute of Technology (MIT), points out that the main effect of automation in the computer era is not that it destroys blue-collar jobs but that it destroys any job that can be reduced to a routine. Alan Blinder, of Princeton University, argues that the jobs graduates have traditionally performed are if anything more “offshorable” than low-wage ones. A plumber or lorry-driver’s job cannot be outsourced to India.
  • In 2008 Dick Nelson and Dan Sarewitz had a commentary in Nature (here in PDF) that eloquently summarized why we should not expect technology in the classroom to result in better educational outcomes, as they suggest we should in the case of a technology like vaccines.
Weiye Loh

Smithsonian's Crowdsourced "The Art Of Video Games" Exhibition Comes Under Fire | The C... - 0 views

  • My initial concerns about the current show were its sort of lack of perspective. The strength of a curated show comes from the choice and arrangement of the works, and I worried that with a crowdsourced show like this, it would be hard to form a central thesis. What makes each of these games influential and how will those qualities come together to paint a moving picture of games as an art medium? I wasn’t sure this list particularly answered those questions.
  • They’ve avoided directly addressing the question of why are video games art, and instead danced around it, showing a number of wonderful games and explaining why each great. Despite this success though, I feel that the show was still damaged by the crowdsourced curation approach. While I agree that the player is a major component of games (as Abe Stein recently posted to his blog, “A game not played is no game at all”), the argument that because games are played by the public they should be publicly curated doesn’t necessarily follow for me, especially when the resultant list is so muddled.
  • Despite Chris’ apparent love for the games, the show doesn’t feel as strongly curated as it could have been, overly heavy in some places, and completely missing in others, and I think that is a result of the crowdsourcing. Although I’m sure Chris has a fantastic perspective that will tie this all together beautifully and the resulting show will be enjoyable and successful, I wish that he had just selected a strong list of games on his own and been confident with his picks.
  • perhaps it would have been nice not to side-step the question of why these games, as a whole, are important as art. Considering this is the first major American art institution to put on a video game show, I would have liked to see a more powerful statement about the medium.
Weiye Loh

First principles of justice: Rights and wrongs | The Economist - 0 views

  • Mr Sandel illustrates the old classroom chestnut—is it ever right to kill one innocent person to save the lives of several others?—with a horrifying dilemma from Afghanistan in 2005. A four-man American unit on reconnaissance behind lines stumbled on a shepherd likely, if let go, to betray them to the Taliban. They could not hold him prisoner. Nor, on moral grounds, would the serviceman in charge kill him. Released, the shepherd alerted the Taliban, who surrounded the unit. Three were killed along with 16 Americans in a rescue helicopter. The soldier in command, who lived, called his decision “stupid, lamebrained and southern-fried”. Which was right, his earlier refusal or his later regret?
  • He returns also to an old charge against the late John Rawls. In “Liberalism and the Limits of Justice” (1982) Mr Sandel argued that Rawls’s celebrated account of social justice downplayed the moral weight of family feeling, group loyalties and community attachments. He repeats those “communitarian” charges here.
Weiye Loh

A Singapore Taxi Driver's Diary: May 6, 2009. Wednesday: The countERProductive erp system - 0 views

  • As far as I know, ERP is not a mere toll system that cares about nothing but money. It is actually designed for the greater good: a smooth traffic flow characteristic of the efficiency-minded Singapore. Money is only a means to an end, so to speak. The ERP system, as I remembered, is a world first, a uniquely Singapore invention for fighting traffic jams, and has been hailed as a genius answer to a common problem in large metropolitan centers around the world. A wonder remedy for a disease brought about by advancement of civilization, very much like the cholesterol-clogged blood circulation in human bodies.
  • However, I can’t help being perplexed this time. How could a highly celebrated system like ERP become so hopelessly impotent in regulating traffic nowadays?
  • What happened to “money is only a means to an end”?
    • Weiye Loh: Does Kantian ethics include the treatment of non-subjects, i.e. objects, as means to an end? What about the exploitation of the environment? Hmm...