
Home / New Media Ethics 2009 course / Group items tagged Road


Weiye Loh

Privacy in Singapore - 9 views

http://unpan1.un.org/intradoc/groups/public/documents/APCITY/UNPAN002553.pdf There is no general data protection or privacy law in Singapore. The government has been aggressive in using surveill...

Singapore Privacy Electronic Road Pricing Surveillance

Weiye Loh

Hunch Blog | Blog Archive | You've got mail: What your email domain says about you - 0 views

  • AOL users are most likely to be overweight women ages 35-64 who have a high school diploma and are spiritual, but not religious. They tend to be politically middle of the road, in a relationship of 10+ years, and have children. AOL users live in the suburbs and haven’t traveled outside their own country. Family is their first priority. AOL users mostly read magazines, have a desktop computer, listen to the radio, and watch TV on 1-3 DVRs in their home. At home, they lounge around in sweats. AOL users are optimistic extroverts who prefer sweet snacks and like working on a team.
  • Gmail users are most likely to be thin young men ages 18-34 who are college-educated and not religious. Like other young Hunch users, they tend to be politically liberal, single (and ready to mingle), and childless. Gmail users live in cities and have traveled to five or more countries. They’re career-focused and plugged in — they mostly read blogs, have an iPhone and laptop, and listen to music via MP3s and computers (but they don’t have a DVR). At home, they lounge around in a t-shirt and jeans. Gmail users prefer salty snacks and are introverted and entrepreneurial. They are optimistic or pessimistic, depending on the situation.
  • Hotmail users are most likely to be young women of average build ages 18-34 (and younger) who have a high school diploma and are not religious. They tend to be politically middle of the road, single, and childless. Hotmail users live in the suburbs, perhaps still with their parents, and have traveled to up to five countries. They mostly read magazines and contemporary fiction, have a laptop, and listen to music via MP3s and computers (but they don’t have a DVR). At home, Hotmail users lounge around in a t-shirt and jeans. They’re introverts who prefer sweet snacks and like working on a team. They consider themselves more pessimistic, but sometimes it depends on the situation.
  • Yahoo! users are most likely to be overweight women ages 18-49 who have a high school diploma and are spiritual, but not religious. They tend to be politically middle of the road, in a relationship of 1-5 years, and have children. Yahoo! users live in the suburbs or in rural areas and haven’t traveled outside their own country. Family is their first priority. They mostly read magazines, are almost equally likely to have a laptop or desktop computer, listen to the radio and CDs, and watch TV on 1-2 DVRs in their home. At home, Yahoo! users lounge around in pajamas. They’re extroverts who prefer sweet snacks and like working on a team. Yahoo! users are optimistic or pessimistic, depending on the situation.
    What your email domain says about you
Weiye Loh

Analysis: Midterm election results to limit Internet regulation | Reuters - 0 views

  • Republicans are traditionally against onerous regulation of private industry, and many campaigned on promises to rein in government before the midterm elections, which saw the GOP pick up 60 House seats to secure the majority.
  • As a result, Verizon Communications Inc, AT&T Inc and Comcast Corp have gained the upper hand in the long-fought battle over net neutrality rules.
  • The underlying idea of net neutrality is that high-speed and mobile Internet providers should not be allowed to give preferential treatment to content providers that pay for faster transmission. Companies like Verizon, AT&T and Comcast have lobbied against such regulations, saying they could crimp profits and lessen investments. At stake is how quickly handheld devices, like Research in Motion Ltd's BlackBerry and Apple Inc's iPhone, can receive and download videos and other content.
    The Republican takeover of the House of Representatives will mean fewer regulations for technology and telecommunications companies and a tough road ahead for the Federal Communications Commission.
Weiye Loh

The Medium Is Not The Message: 3 Handwritten Newspapers | Brain Pickings - 0 views

  • Handwritten newspapers.
  • Since 1927, The Musalman has been quietly churning out its evening edition of four pages, all of which are hand-written by Indian calligraphers in the shadow of the Wallajah Mosque in the city of Chennai. According to Wired, it might just be the last remaining hand-written newspaper in the world. It’s also India’s oldest daily newspaper in Urdu, the Hindustani language typically spoken by Muslims in South Asia. The Musalman: Preservation of a Dream is a wonderful short film by Ishani K. Dutta, telling the story of the unusual publication and its writers’ dedication to the ancient art of Urdu calligraphy.

  • I mentioned a fascinating reversal of the-medium-is-the-message as one Japanese newspaper reverted to hand-written editions once the earthquake-and-tsunami disaster destroyed all power in the city of Ishinomaki in Miyagi Prefecture. For the next six days, the editors of the Ishinomaki Hibi Shimbun “printed” the daily newspaper’s disaster coverage the only way possible: by hand, with pen and paper. Using flashlights and marker pens, the reporters wrote the stories on poster-size paper and pinned the dailies to the entrance doors of relief centers around the city. Six staffers collected stories, which another three digested, spending an hour and a half per day composing the newspapers by hand.
    Minuscule literacy rates and prevailing poverty may not be conditions particularly conducive to publishing entrepreneurship, but they were no hindrance for Monrovia's The Daily Talk, a clever concept by Alfred Sirleaf that reaches thousands of Liberians every day by printing just one copy. That copy just happens to reside on a large blackboard on the side of one of the capital's busiest roads. Sirleaf started the project in 2000, at the peak of Liberia's civil war, but its cultural resonance and open access sustained it long after the war was over. To this day, he runs this remarkable one-man show as the editor, reporter, production manager, designer, fact-checker and publicist of The Daily Talk. For an added layer of thoughtfulness and sophistication, Sirleaf uses symbols to indicate specific topics for those who struggle to read. "The common man in society can't afford a newspaper, can't afford to buy a generator to get on the internet - you know, power shortage - and people are caught up in a city where they have no access to information. And all of these things motivated me to come up with a kind of free media system for people to get informed." ~ Alfred Sirleaf
kenneth yang

SD ballot measure would ease restrictions on stem cell research - 1 views

PIERRE, S.D. (AP) - A proposed ballot issue to ease restrictions on stem cell research will strike a chord with South Dakotans because nearly everyone has had a serious disease or knows someone who...

ethics rights stem cell

started by kenneth yang on 21 Oct 09 no follow-up yet
joanne ye

TJC Stomp Scandal - 34 views

This is a very interesting topic. Thanks, Weiman! From the replies for this topic, I would say two general questions surfaced. Firstly, is STOMP liable for misinformation? Secondly, is it right for...

Weiye Loh

Glowing trees could light up city streets - environment - 25 November 2010 - New Scientist - 0 views

  • If work by a team of undergraduates at the University of Cambridge pans out, bioluminescent trees could one day be giving our streets this dreamlike look. The students have taken the first step on this road by developing genetic tools that allow bioluminescence traits to be easily transferred into an organism.
  • Nature is full of glow-in-the-dark critters, but their shine is feeble - far too weak to read by, for example. To boost this light, the team, who were participating in the annual International Genetically Engineered Machines competition (iGEM), modified genetic material from fireflies and the luminescent marine bacterium Vibrio fischeri to boost the production and activity of light-yielding enzymes. They then made further modifications to create genetic components or "BioBricks" that can be inserted into a genome.
  • So are glowing trees coming soon to a street near you? It's unlikely, says Alexandra Daisy Ginsberg, a designer and artist who advised the Cambridge team. "We already have light bulbs," she says. "We're not going to spend our money and time engineering a replacement for something that works very well." However, she adds that "bio-light" has a distinctive allure. "There's something much more visceral about a living light. If you have to feed the light and look after it, then it becomes more precious."
Weiye Loh

A lesson in citing irrelevant statistics | The Online Citizen - 0 views

  • Statistics that are quoted, by themselves, may be quite meaningless, unless they are on a comparative basis. To illustrate this, if we want to say that Group A (poorer kids) is not significantly worse off than Group B (richer kids), then it may be pointless to just cite the statistics for Group A, without Group B’s.
  • “How children from the bottom one-third by socio-economic background fare: One in two scores in the top two-thirds at PSLE” “One in six scores in the top one-third at PSLE” What we need to know, for comparative purposes, is the percentage of richer kids who score in the top two-thirds too.
  • “… one in five scores in the top 30% at O and A levels… One in five goes to university and polys” What’s the data for richer kids? Since the proportion of the entire population going to university and polys has increased substantially, this clearly shows that poorer kids are worse off!
  • The Minister was quoted as saying: “My parents had six children. My first home as a young boy was a rental flat in Zion Road. We shared it as tenants with other families.” Citing individuals who made it may be of no “statistical” relevance, as what we need are the statistics as to the proportion of poorer kids to richer kids who get scholarships, proportional to their representation in the population.
  • “More spent on primary and secondary/JC schools. This means having significantly more and better teachers, and having more programmes to meet children’s specific needs” What has spending more money, which is what most countries do, got to do with the argument about whether poorer kids are disadvantaged?
  • Straits Times journalist Li XueYing put the crux of the debate in the right perspective: “Dr Ng had noted that ensuring social mobility “cannot mean equal outcomes, because students are inherently different”. But can it be that those from low-income families are consistently “inherently different” to such an extent?”
  • Relevant statistics: Perhaps the most damning statistic showing that poorer kids are disadvantaged was the chart from the Ministry of Education (provided by the Straits Times), which showed that the percentage of Primary 1 pupils who lived in 1 to 3-room HDB flats and subsequently progressed to University and/or Polytechnic has been declining since around 1986.
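The comparative-baseline point can be made concrete with a small calculation. The Python sketch below is my own illustration, not from the article: only the two quoted proportions for the poorer kids come from the source, and the 2/3 and 1/3 baselines are the rates any group would hit if PSLE outcomes were independent of socio-economic background.

```python
# Illustrative sketch: compare the quoted outcomes for pupils from the bottom
# one-third by socio-economic background against the uniform baseline we would
# expect if background made no difference to PSLE results.
observed = {
    "top two-thirds at PSLE": 1 / 2,  # "One in two scores in the top two-thirds"
    "top one-third at PSLE": 1 / 6,   # "One in six scores in the top one-third"
}
baseline = {
    "top two-thirds at PSLE": 2 / 3,  # base rate for any pupil, absent disadvantage
    "top one-third at PSLE": 1 / 3,
}
gaps = {band: baseline[band] - observed[band] for band in observed}
for band, gap in gaps.items():
    print(f"{band}: shortfall of {gap:.1%} relative to the no-disadvantage baseline")
```

Both figures fall short of the no-disadvantage baseline by about one-sixth of the cohort, which is exactly the comparison the article says the raw "one in two" and "one in six" numbers hide.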
Weiye Loh

Science, Strong Inference -- Proper Scientific Method - 0 views

  • Scientists these days tend to keep up a polite fiction that all science is equal. Except for the work of the misguided opponent whose arguments we happen to be refuting at the time, we speak as though every scientist's field and methods of study are as good as every other scientist's and perhaps a little better. This keeps us all cordial when it comes to recommending each other for government grants.
  • Why should there be such rapid advances in some fields and not in others? I think the usual explanations that we tend to think of - such as the tractability of the subject, or the quality or education of the men drawn into it, or the size of research contracts - are important but inadequate. I have begun to believe that the primary factor in scientific advance is an intellectual one. These rapidly moving fields are fields where a particular method of doing scientific research is systematically used and taught, an accumulative method of inductive inference that is so effective that I think it should be given the name of "strong inference." I believe it is important to examine this method, its use and history and rationale, and to see whether other groups and individuals might learn to adopt it profitably in their own scientific and intellectual work. In its separate elements, strong inference is just the simple and old-fashioned method of inductive inference that goes back to Francis Bacon. The steps are familiar to every college student and are practiced, off and on, by every scientist. The difference comes in their systematic application. Strong inference consists of applying the following steps to every problem in science, formally and explicitly and regularly: 1) Devising alternative hypotheses; 2) Devising a crucial experiment (or several of them), with alternative possible outcomes, each of which will, as nearly as possible, exclude one or more of the hypotheses; 3) Carrying out the experiment so as to get a clean result; 4) Recycling the procedure, making subhypotheses or sequential hypotheses to refine the possibilities that remain, and so on.
  • On any new problem, of course, inductive inference is not as simple and certain as deduction, because it involves reaching out into the unknown. Steps 1 and 2 require intellectual inventions, which must be cleverly chosen so that hypothesis, experiment, outcome, and exclusion will be related in a rigorous syllogism; and the question of how to generate such inventions is one which has been extensively discussed elsewhere (2, 3). What the formal schema reminds us to do is to try to make these inventions, to take the next step, to proceed to the next fork, without dawdling or getting tied up in irrelevancies.
  • It is clear why this makes for rapid and powerful progress. For exploring the unknown, there is no faster method; this is the minimum sequence of steps. Any conclusion that is not an exclusion is insecure and must be rechecked. Any delay in recycling to the next set of hypotheses is only a delay. Strong inference, and the logical tree it generates, are to inductive reasoning what the syllogism is to deductive reasoning in that it offers a regular method for reaching firm inductive conclusions one after the other as rapidly as possible.
  • "But what is so novel about this?" someone will say. This is the method of science and always has been; why give it a special name? The reason is that many of us have almost forgotten it. Science is now an everyday business. Equipment, calculations, lectures become ends in themselves. How many of us write down our alternatives and crucial experiments every day, focusing on the exclusion of a hypothesis? We may write our scientific papers so that it looks as if we had steps 1, 2, and 3 in mind all along. But in between, we do busywork. We become "method-oriented" rather than "problem-oriented." We say we prefer to "feel our way" toward generalizations. We fail to teach our students how to sharpen up their inductive inferences. And we do not realize the added power that the regular and explicit use of alternative hypotheses and sharp exclusion could give us at every step of our research.
  • A distinguished cell biologist rose and said, "No two cells give the same properties. Biology is the science of heterogeneous systems." And he added privately: "You know there are scientists, and there are people in science who are just working with these over-simplified model systems - DNA chains and in vitro systems - who are not doing science at all. We need their auxiliary work: they build apparatus, they make minor studies, but they are not scientists." To which Cy Levinthal replied: "Well, there are two kinds of biologists, those who are looking to see if there is one thing that can be understood and those who keep saying it is very complicated and that nothing can be understood. . . . You must study the simplest system you think has the properties you are interested in."
  • At the 1958 Conference on Biophysics, at Boulder, there was a dramatic confrontation between the two points of view. Leo Szilard said: "The problems of how enzymes are induced, of how proteins are synthesized, of how antibodies are formed, are closer to solution than is generally believed. If you do stupid experiments, and finish one a year, it can take 50 years. But if you stop doing experiments for a little while and think how proteins can possibly be synthesized, there are only about 5 different ways, not 50! And it will take only a few experiments to distinguish these." One of the young men added: "It is essentially the old question: How small and elegant an experiment can you perform?" These comments upset a number of those present. An electron microscopist said, "Gentlemen, this is off the track. This is philosophy of science." Szilard retorted, "I was not quarreling with third-rate scientists: I was quarreling with first-rate scientists."
  • Any criticism or challenge to consider changing our methods strikes of course at all our ego-defenses. But in this case the analytical method offers the possibility of such great increases in effectiveness that it is unfortunate that it cannot be regarded more often as a challenge to learning rather than as a challenge to combat. Many of the recent triumphs in molecular biology have in fact been achieved on just such "oversimplified model systems," very much along the analytical lines laid down in the 1958 discussion. They have not fallen to the kind of men who justify themselves by saying "No two cells are alike," regardless of how true that may ultimately be. The triumphs are in fact triumphs of a new way of thinking.
  • the emphasis on strong inference
  • is also partly due to the nature of the fields themselves. Biology, with its vast informational detail and complexity, is a "high-information" field, where years and decades can easily be wasted on the usual type of "low-information" observations or experiments if one does not think carefully in advance about what the most important and conclusive experiments would be. And in high-energy physics, both the "information flux" of particles from the new accelerators and the million-dollar costs of operation have forced a similar analytical approach. It pays to have a top-notch group debate every experiment ahead of time; and the habit spreads throughout the field.
  • Historically, I think, there have been two main contributions to the development of a satisfactory strong-inference method. The first is that of Francis Bacon (13). He wanted a "surer method" of "finding out nature" than either the logic-chopping or all-inclusive theories of the time or the laudable but crude attempts to make inductions "by simple enumeration." He did not merely urge experiments, as some suppose; he showed the fruitfulness of interconnecting theory and experiment so that the one checked the other. Of the many inductive procedures he suggested, the most important, I think, was the conditional inductive tree, which proceeded from alternative hypotheses (possible "causes," as he calls them), through crucial experiments ("Instances of the Fingerpost"), to exclusion of some alternatives and adoption of what is left ("establishing axioms"). His Instances of the Fingerpost are explicitly at the forks in the logical tree, the term being borrowed "from the fingerposts which are set up where roads part, to indicate the several directions."
  • Here was a method that could separate off the empty theories! Bacon said the inductive method could be learned by anybody, just like learning to "draw a straighter line or more perfect circle . . . with the help of a ruler or a pair of compasses." "My way of discovering sciences goes far to level men's wit and leaves but little to individual excellence, because it performs everything by the surest rules and demonstrations." Even occasional mistakes would not be fatal. "Truth will sooner come out from error than from confusion."
  • Nevertheless there is a difficulty with this method. As Bacon emphasizes, it is necessary to make "exclusions." He says, "The induction which is to be available for the discovery and demonstration of sciences and arts, must analyze nature by proper rejections and exclusions, and then, after a sufficient number of negatives, come to a conclusion on the affirmative instances." "[To man] it is granted only to proceed at first by negatives, and at last to end in affirmatives after exclusion has been exhausted." Or, as the philosopher Karl Popper says today, there is no such thing as proof in science - because some later alternative explanation may be as good or better - so that science advances only by disproofs. There is no point in making hypotheses that are not falsifiable, because such hypotheses do not say anything; "it must be possible for an empirical scientific system to be refuted by experience" (14).
  • The difficulty is that disproof is a hard doctrine. If you have a hypothesis and I have another hypothesis, evidently one of them must be eliminated. The scientist seems to have no choice but to be either soft-headed or disputatious. Perhaps this is why so many tend to resist the strong analytical approach and why some great scientists are so disputatious.
  • Fortunately, it seems to me, this difficulty can be removed by the use of a second great intellectual invention, the "method of multiple hypotheses," which is what was needed to round out the Baconian scheme. This is a method that was put forward by T.C. Chamberlin (15), a geologist at Chicago at the turn of the century, who is best known for his contribution to the Chamberlin-Moulton hypothesis of the origin of the solar system.
  • Chamberlin says our trouble is that when we make a single hypothesis, we become attached to it. "The moment one has offered an original explanation for a phenomenon which seems satisfactory, that moment affection for his intellectual child springs into existence, and as the explanation grows into a definite theory his parental affections cluster about his offspring and it grows more and more dear to him. . . . There springs up also unwittingly a pressing of the theory to make it fit the facts and a pressing of the facts to make them fit the theory..." "To avoid this grave danger, the method of multiple working hypotheses is urged. It differs from the simple working hypothesis in that it distributes the effort and divides the affections. . . . Each hypothesis suggests its own criteria, its own method of proof, its own method of developing the truth, and if a group of hypotheses encompass the subject on all sides, the total outcome of means and of methods is full and rich."
  • The conflict and exclusion of alternatives that is necessary to sharp inductive inference has been all too often a conflict between men, each with his single Ruling Theory. But whenever each man begins to have multiple working hypotheses, it becomes purely a conflict between ideas. It becomes much easier then for each of us to aim every day at conclusive disproofs - at strong inference - without either reluctance or combativeness. In fact, when there are multiple hypotheses, which are not anyone's "personal property," and when there are crucial experiments to test them, the daily life in the laboratory takes on an interest and excitement it never had, and the students can hardly wait to get to work to see how the detective story will come out. It seems to me that this is the reason for the development of those distinctive habits of mind and the "complex thought" that Chamberlin described, the reason for the sharpness, the excitement, the zeal, the teamwork - yes, even international teamwork - in molecular biology and high-energy physics today. What else could be so effective?
  • Unfortunately, I think, there are other areas of science today that are sick by comparison, because they have forgotten the necessity for alternative hypotheses and disproof. Each man has only one branch - or none - on the logical tree, and it twists at random without ever coming to the need for a crucial decision at any point. We can see from the external symptoms that there is something scientifically wrong. The Frozen Method, The Eternal Surveyor, The Never Finished, The Great Man With a Single Hypothesis, The Little Club of Dependents, The Vendetta, The All-Encompassing Theory Which Can Never Be Falsified.
  • a "theory" of this sort is not a theory at all, because it does not exclude anything. It predicts everything, and therefore does not predict anything. It becomes simply a verbal formula which the graduate student repeats and believes because the professor has said it so often. This is not science, but faith; not theory, but theology. Whether it is hand-waving or number-waving, or equation-waving, a theory is not a theory unless it can be disproved. That is, unless it can be falsified by some possible experimental outcome.
  • the work methods of a number of scientists have been testimony to the power of strong inference. Is success not due in many cases to systematic use of Bacon's "surest rules and demonstrations" as much as to rare and unattainable intellectual power? Faraday's famous diary (16), or Fermi's notebooks (3, 17), show how these men believed in the effectiveness of daily steps in applying formal inductive methods to one problem after another.
  • Surveys, taxonomy, design of equipment, systematic measurements and tables, theoretical computations - all have their proper and honored place, provided they are parts of a chain of precise induction of how nature works. Unfortunately, all too often they become ends in themselves, mere time-serving from the point of view of real scientific advance, a hypertrophied methodology that justifies itself as a lore of respectability.
  • We speak piously of taking measurements and making small studies that will "add another brick to the temple of science." Most such bricks just lie around the brickyard (20). Tables of constants have their place and value, but the study of one spectrum after another, if not frequently re-evaluated, may become a substitute for thinking, a sad waste of intelligence in a research laboratory, and a mistraining whose crippling effects may last a lifetime.
  • Beware of the man of one method or one instrument, either experimental or theoretical. He tends to become method-oriented rather than problem-oriented. The method-oriented man is shackled; the problem-oriented man is at least reaching freely toward what is most important. Strong inference redirects a man to problem-orientation, but it requires him to be willing repeatedly to put aside his last methods and teach himself new ones.
  • anyone who asks the question about scientific effectiveness will also conclude that much of the mathematizing in physics and chemistry today is irrelevant if not misleading. The great value of mathematical formulation is that when an experiment agrees with a calculation to five decimal places, a great many alternative hypotheses are pretty well excluded (though the Bohr theory and the Schrödinger theory both predict exactly the same Rydberg constant!). But when the fit is only to two decimal places, or one, it may be a trap for the unwary; it may be no better than any rule-of-thumb extrapolation, and some other kind of qualitative exclusion might be more rigorous for testing the assumptions and more important to scientific understanding than the quantitative fit.
  • Today we preach that science is not science unless it is quantitative. We substitute correlations for causal studies, and physical equations for organic reasoning. Measurements and equations are supposed to sharpen thinking, but, in my observation, they more often tend to make the thinking noncausal and fuzzy. They tend to become the object of scientific manipulation instead of auxiliary tests of crucial inferences.
  • Many - perhaps most - of the great issues of science are qualitative, not quantitative, even in physics and chemistry. Equations and measurements are useful when and only when they are related to proof; but proof or disproof comes first and is in fact strongest when it is absolutely convincing without any quantitative measurement.
  • you can catch phenomena in a logical box or in a mathematical box. The logical box is coarse but strong. The mathematical box is fine-grained but flimsy. The mathematical box is a beautiful way of wrapping up a problem, but it will not hold the phenomena unless they have been caught in a logical box to begin with.
  • Of course it is easy - and all too common - for one scientist to call the others unscientific. My point is not that my particular conclusions here are necessarily correct, but that we have long needed some absolute standard of possible scientific effectiveness by which to measure how well we are succeeding in various areas - a standard that many could agree on and one that would be undistorted by the scientific pressures and fashions of the times and the vested interests and busywork that they develop. It is not public evaluation I am interested in so much as a private measure by which to compare one's own scientific performance with what it might be. I believe that strong inference provides this kind of standard of what the maximum possible scientific effectiveness could be - as well as a recipe for reaching it.
  • The strong-inference point of view is so resolutely critical of methods of work and values in science that any attempt to compare specific cases is likely to sound both smug and destructive. Mainly one should try to teach it by example and by exhorting to self-analysis and self-improvement only in general terms.
  • one severe but useful private test - a touchstone of strong inference - that removes the necessity for third-person criticism, because it is a test that anyone can learn to carry with him for use as needed. It is our old friend the Baconian "exclusion," but I call it "The Question." Obviously it should be applied as much to one's own thinking as to others'. It consists of asking in your own mind, on hearing any scientific explanation or theory put forward, "But sir, what experiment could disprove your hypothesis?"; or, on hearing a scientific experiment described, "But sir, what hypothesis does your experiment disprove?"
  • It is not true that all science is equal; or that we cannot justly compare the effectiveness of scientists by any method other than a mutual-recommendation system. The man to watch, the man to put your money on, is not the man who wants to make "a survey" or a "more detailed study" but the man with the notebook, the man with the alternative hypotheses and the crucial experiments, the man who knows how to answer your Question of disproof and is already working on it.
    There is so much bad science and bad statistics in media reports, publications, and everyday conversation that I think it is important to understand facts and proofs and their associated pitfalls.
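Platt's four-step loop from the highlights above can be caricatured computationally. The Python sketch below is my own toy illustration, not Platt's formulation: hypotheses are plain values, and each "crucial experiment" is a predicate whose outcome excludes the hypotheses inconsistent with it.

```python
# Toy sketch of strong inference: maintain a pool of alternative hypotheses
# and let each crucial experiment exclude those inconsistent with its result.
def strong_inference(hypotheses, experiments):
    pool = set(hypotheses)
    for experiment in experiments:
        # Keep only the hypotheses compatible with this experiment's outcome.
        pool = {h for h in pool if experiment(h)}
        if len(pool) <= 1:  # inference complete (or every alternative excluded)
            break
    return pool

# Invented example: narrow down an unknown number in 1..20.
surviving = strong_inference(
    range(1, 21),
    [
        lambda h: h % 2 == 0,  # first experiment excludes the odd numbers
        lambda h: h > 10,      # second excludes the small ones
        lambda h: h % 3 == 0,  # third excludes non-multiples of 3
    ],
)
print(sorted(surviving))  # → [12, 18]
```

That more than one hypothesis survives is the point of Platt's recycling step: the remaining alternatives call for a further crucial experiment, since "any conclusion that is not an exclusion is insecure."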
Weiye Loh

How the net traps us all in our own little bubbles | Technology | The Observer - 0 views

  • Google would use 57 signals – everything from where you were logging in from to what browser you were using to what you had searched for before – to make guesses about who you were and what kinds of sites you'd like. Even if you were logged out, it would customise its results, showing you the pages it predicted you were most likely to click on.
  • Most of us assume that when we google a term, we all see the same results – the ones that the company's famous Page Rank algorithm suggests are the most authoritative based on other pages' links. But since December 2009, this is no longer true. Now you get the result that Google's algorithm suggests is best for you in particular – and someone else may see something entirely different. In other words, there is no standard Google any more.
  • In the spring of 2010, while the remains of the Deepwater Horizon oil rig were spewing oil into the Gulf of Mexico, I asked two friends to search for the term "BP". They're pretty similar – educated white left-leaning women who live in the north-east. But the results they saw were quite different. One saw investment information about BP. The other saw news.
  • ...7 more annotations...
  • the query "stem cells" might produce diametrically opposed results for scientists who support stem-cell research and activists who oppose it.
  • "Proof of climate change" might turn up different results for an environmental activist and an oil-company executive.
  • majority of us assume search engines are unbiased. But that may be just because they're increasingly biased to share our own views. More and more, your computer monitor is a kind of one-way mirror, reflecting your own interests while algorithmic observers watch what you click. Google's announcement marked the turning point of an important but nearly invisible revolution in how we consume information. You could say that on 4 December 2009 the era of personalisation began.
  • We are predisposed to respond to a pretty narrow set of stimuli – if a piece of news is about sex, power, gossip, violence, celebrity or humour, we are likely to read it first. This is the content that most easily makes it into the filter bubble. It's easy to push "Like" and increase the visibility of a friend's post about finishing a marathon or an instructional article about how to make onion soup. It's harder to push the "Like" button on an article titled "Darfur sees bloodiest month in two years". In a personalised world, important but complex or unpleasant issues – the rising prison population, for example, or homelessness – are less likely to come to our attention at all.
  • As a consumer, it's hard to argue with blotting out the irrelevant and unlikable. But what is good for consumers is not necessarily good for citizens. What I seem to like may not be what I actually want, let alone what I need to know to be an informed member of my community or country. "It's a civic virtue to be exposed to things that appear to be outside your interest," technology journalist Clive Thompson told me. Cultural critic Lee Siegel puts it a different way: "Customers are always right, but people aren't."
  • Personalisation is based on a bargain. In exchange for the service of filtering, you hand large companies an enormous amount of data about your daily life – much of which you might not trust friends with.
  • To be the author of your life, professor Yochai Benkler argues, you have to be aware of a diverse array of options and lifestyles. When you enter a filter bubble, you're letting the companies that construct it choose which options you're aware of. You may think you're the captain of your own destiny, but personalisation can lead you down a road to a kind of informational determinism in which what you've clicked on in the past determines what you see next – a web history you're doomed to repeat. You can get stuck in a static, ever- narrowing version of yourself – an endless you-loop.
  •  
    An invisible revolution has taken place in the way we use the net: the increasing personalisation of information by search engines such as Google threatens to limit our access to information and enclose us in a self-reinforcing world view, writes Eli Pariser in an extract from The Filter Bubble
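The personalisation the extract describes can be sketched as a toy re-ranking step. Everything here (the topic labels, the 50/50 blend, the scores) is hypothetical for illustration; the actual signals and weights Google uses are not public:

```python
# Toy sketch of personalised search ranking (hypothetical weights and
# scores; the real signals search engines use are not public).

def personalised_ranking(results, click_history, personal_weight=0.5):
    """Re-rank results by blending a global authority score with how
    often this user has clicked pages in each topic before."""
    total_clicks = sum(click_history.values()) or 1

    def score(result):
        title, topic, authority = result
        affinity = click_history.get(topic, 0) / total_clicks
        return (1 - personal_weight) * authority + personal_weight * affinity

    return sorted(results, key=score, reverse=True)

# Same query, "BP", for two users with different histories
# (as in the Deepwater Horizon anecdote above).
results = [("BP investor relations", "finance", 0.60),
           ("BP oil spill news", "news", 0.55)]

investor = personalised_ranking(results, {"finance": 40, "news": 5})
activist = personalised_ranking(results, {"finance": 2, "news": 30})

print(investor[0][0])  # the finance-heavy user sees investment info first
print(activist[0][0])  # the news-heavy user sees spill coverage first
```

The point of the sketch is that neither user ever sees "the" standard result list; each ordering is a function of that user's own past clicks.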
Weiye Loh

flaneurose: The KK Chemo Misdosage Incident - 0 views

  • Labelling the pump that dispenses in ml/hr in a different color from the pump that dispenses in ml/day would be an obvious remedy that would have addressed the KK incident. It's the common-sensical solution that anyone can think of.
  • Sometimes, design flaws like that really do occur because engineers can't see the wood for the trees.
  • But sometimes the team is aware of these issues and highlights them to management, but the manufacturer still proceeds as before. Why is that? Because in addition to design principles, one must be mindful that there are always business considerations at play as well. Manufacturing two (or more) separate designs for pumps incurs greater costs, eliminates the ability to standardize across pumps, increases holding inventory, and overall increases complexity of business and manufacturing processes, and decreases economies of scale. All this naturally reduces profitability. It's not just pumps. Even medicines are typically sold in identical-looking vials with identically colored vial caps, with only the text on the vial labels differentiating them in both drug type and concentration. You can imagine what kinds of accidents can potentially happen there.
  • ...2 more annotations...
  • Legally, the manufacturer has clearly labelled on the pump (in text) the appropriate dosing regime, or for a medicine vial, the type of drug and concentration. The manufacturer has hence fulfilled its duty. Therefore, if there are any mistakes in dosing, the liability for the error lies with the hospital and not the manufacturer of the product. The victim of such a dosing error can be said to be an "externalized cost"; the beneficiaries of the victim's suffering are the manufacturer, who enjoys greater profitability, the hospital, which enjoys greater cost-savings, and the public, who save on healthcare. Is it ethical of the manufacturer, to "pass on" liability to the hospital? To make it difficult (or at least not easy) for the hospital to administer the right dosage? Maybe the manufacturer is at fault, but IMHO, it's very hard to say.
  • When a chemo incident like the one that happened in KK occurs, there are cries of public remonstration, and the pendulum may swing the other way. Hospitals might make the decision to purchase more expensive and better designed pumps (that is, if they are available). Then years down the road, when a bureaucrat (or a management consultant) with an eye to trim costs looks through the hospital purchasing orders, they may make the suggestion that $XXX could be saved by buying the generic version of such-and-such a product, instead of the more expensive version. And they would not be wrong, just... myopic. Then the cycle starts again. Sometimes it's not only about human factors. It could be about policy, or human nature, or business fundamentals, or just the plain old, dysfunctional way the world works.
    • Weiye Loh
       
      Interesting article. Explains clearly why our 'ethical' considerations are always limited to a particular context and specific considerations.
Weiye Loh

Roger Pielke Jr.'s Blog: Blind Spots in Australian Flood Policies - 0 views

  • better management of flood risks in Australia will depend upon better data on flood risk. However, collecting such data has proven problematic
  • As many Queenslanders affected by January’s floods are realising, riverine flood damage is commonly excluded from household insurance policies. And this is unlikely to change until councils – especially in Queensland – stop dragging their feet and actively assist in developing comprehensive data insurance companies can use.
  • Why? Because there is often little available information that would allow an insurer to adequately price this flood risk. Without this, there is little economic incentive for insurers to accept this risk. It would be irresponsible for insurers to cover riverine flood without quantifying and pricing the risk accordingly.
  • ...8 more annotations...
  • The first step in establishing risk-adjusted premiums is to know the likelihood of the depth of flooding at each address. This information has to be address-specific because the severity of flooding can vary widely over small distances, for example, from one side of a road to the other.
  • A litany of reasons is given for withholding data. At times it seems that refusal stems from a view that insurance is innately evil. This is ironic in view of the gratuitous advice sometimes offered by politicians and commentators in the aftermath of extreme events, exhorting insurers to pay claims even when no legal liability exists and riverine flood is explicitly excluded from policies.
  • Risk Frontiers is involved in jointly developing the National Flood Information Database (NFID) for the Insurance Council of Australia with Willis Re, a reinsurance broking intermediary. NFID is a five year project aiming to integrate flood information from all city councils in a consistent insurance-relevant form. The aim of NFID is to help insurers understand and quantify their risk. Unfortunately, obtaining the base data for NFID from some local councils is difficult and sometimes impossible despite the support of all state governments for the development of NFID. Councils have an obligation to assess their flood risk and to establish rules for safe land development. However, many are antipathetic to the idea of insurance. Some states and councils have been very supportive – in New South Wales and Victoria, particularly. Some states have a central repository – a library of all flood studies and digital terrain models (digital elevation data). Council reluctance to release data is most prevalent in Queensland, where, unfortunately, no central repository exists.
  • Second, models of flood risk are sometimes misused:
  • many councils only undertake flood modelling in order to create a single design flood level, usually the so-called one-in-100 year flood. (For reasons given later, a better term is the flood with a 1% annual likelihood of being exceeded.)
  • Inundation maps showing the extent of the flood with a 1% annual likelihood of exceedance are increasingly common on council websites, even in Queensland. Unfortunately these maps say little about the depth of water at an address or, importantly, how depth varies for less probable floods. Insurance claims usually begin when the ground is flooded and increase rapidly as water rises above the floor level. At Windsor in NSW, for example, the difference in the water depth between the flood with a 1% annual chance of exceedance and the maximum possible flood is nine metres. In other catchments this difference may be as small as ten centimetres. The risk of damage is quite different in both cases and an insurer needs this information if they are to provide coverage in these areas.
  • The ‘one-in-100 year flood’ term is misleading. To many it is something that happens regularly once every 100 years — with the reliability of a bus timetable. It is still possible, though unlikely, that a flood of similar or even greater magnitude could happen twice in one year or three times in successive years.
  • The calculations underpinning this are not straightforward but the probability that an address exposed to a 1-in-100 year flood will experience such an event or greater over the lifetime of the house – 50 years say – is around 40%. Over the lifetime of a typical home mortgage – 25 years – the probability of occurrence is 22%. These are not good odds.
  •  
    John McAneney of Risk Frontiers at Macquarie University in Sydney identifies some opportunities for better flood policies in Australia.
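The lifetime-exceedance figures quoted above (around 40% over 50 years, 22% over 25 years) follow from simple binomial arithmetic, assuming independent years:

```python
# Probability that a flood with annual exceedance probability p
# occurs at least once in n years: 1 - (1 - p)**n.
# Assumes flood years are independent, as the quoted figures do.

def lifetime_exceedance(p_annual, years):
    return 1 - (1 - p_annual) ** years

# The 1%-annual-chance ("one-in-100 year") flood from the article:
print(round(lifetime_exceedance(0.01, 50), 3))  # 0.395 -- the "around 40%"
print(round(lifetime_exceedance(0.01, 25), 3))  # 0.222 -- over a mortgage
```

This is also why "one-in-100 year flood" misleads: the event is far from guaranteed to wait a century, and over any ordinary planning horizon the odds of experiencing it are substantial.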
Weiye Loh

Freakonomics » "Conspicuous Conservation" and the Prius Effect - 0 views

  • Two young economists, Steve and Alison Sexton, have been looking into this question. (Not only are the Sextons twins, but their parents are also economists, and Steve is a competitive triathlete.) The result is an interesting draft paper called “Conspicuous Conservation: The Prius Effect and WTP [Willingness to Pay] for Environmental Bona Fides.” When you drive a Prius, the Sextons argue, there’s a “green halo” around you. You make new friends; you get new business opportunities. In an especially “green” place like Boulder, Colo., the effect could be worth as much as $7,000.
  • The Sextons focused on the distinctive design of the Prius — which was no accident. Honda, Ford, Nissan and other car makers sell hybrids, but you can’t pick them out on the road (the Civic hybrid, for instance, looks just like a Civic). The Prius is unmistakable. It marks whoever is driving it as someone who cares about the environment; it’s an act of “conspicuous conservation,” an update of Thorstein Veblen’s “conspicuous consumption.” Here’s how Steve Sexton describes it: SEXTON: A sort of “keeping up with the Joneses”-type concept but applied to efforts to make society better. I will be competing with my neighbors to donate to a charity, for instance, or to reduce energy consumption or environmental impacts.
  •  
    when people make environmentally sound choices, how much are those choices driven by the consumers' desire to show off their green bona fides?
Weiye Loh

The Black Swan of Cairo | Foreign Affairs - 0 views

  • It is both misguided and dangerous to push unobserved risks further into the statistical tails of the probability distribution of outcomes and allow these high-impact, low-probability "tail risks" to disappear from policymakers' fields of observation.
  • Such environments eventually experience massive blowups, catching everyone off-guard and undoing years of stability or, in some cases, ending up far worse than they were in their initial volatile state. Indeed, the longer it takes for the blowup to occur, the worse the resulting harm in both economic and political systems.
  • Seeking to restrict variability seems to be good policy (who does not prefer stability to chaos?), so it is with very good intentions that policymakers unwittingly increase the risk of major blowups. And it is the same misperception of the properties of natural systems that led to both the economic crisis of 2007-8 and the current turmoil in the Arab world. The policy implications are identical: to make systems robust, all risks must be visible and out in the open -- fluctuat nec mergitur (it fluctuates but does not sink) goes the Latin saying.
  • ...21 more annotations...
  • Just as a robust economic system is one that encourages early failures (the concepts of "fail small" and "fail fast"), the U.S. government should stop supporting dictatorial regimes for the sake of pseudostability and instead allow political noise to rise to the surface. Making an economy robust in the face of business swings requires allowing risk to be visible; the same is true in politics.
  • Both the recent financial crisis and the current political crisis in the Middle East are grounded in the rise of complexity, interdependence, and unpredictability. Policymakers in the United Kingdom and the United States have long promoted policies aimed at eliminating fluctuation -- no more booms and busts in the economy, no more "Iranian surprises" in foreign policy. These policies have almost always produced undesirable outcomes. For example, the U.S. banking system became very fragile following a succession of progressively larger bailouts and government interventions, particularly after the 1983 rescue of major banks (ironically, by the same Reagan administration that trumpeted free markets). In the United States, promoting these bad policies has been a bipartisan effort throughout. Republicans have been good at fragilizing large corporations through bailouts, and Democrats have been good at fragilizing the government. At the same time, the financial system as a whole exhibited little volatility; it kept getting weaker while providing policymakers with the illusion of stability, illustrated most notably when Ben Bernanke, who was then a member of the Board of Governors of the U.S. Federal Reserve, declared the era of "the great moderation" in 2004.
  • Washington stabilized the market with bailouts and by allowing certain companies to grow "too big to fail." Because policymakers believed it was better to do something than to do nothing, they felt obligated to heal the economy rather than wait and see if it healed on its own.
  • The foreign policy equivalent is to support the incumbent no matter what. And just as banks took wild risks thanks to Greenspan's implicit insurance policy, client governments such as Hosni Mubarak's in Egypt for years engaged in overt plunder thanks to similarly reliable U.S. support.
  • Those who seek to prevent volatility on the grounds that any and all bumps in the road must be avoided paradoxically increase the probability that a tail risk will cause a major explosion.
  • In the realm of economics, price controls are designed to constrain volatility on the grounds that stable prices are a good thing. But although these controls might work in some rare situations, the long-term effect of any such system is an eventual and extremely costly blowup whose cleanup costs can far exceed the benefits accrued. The risks of a dictatorship, no matter how seemingly stable, are no different, in the long run, from those of an artificially controlled price.
  • Such attempts to institutionally engineer the world come in two types: those that conform to the world as it is and those that attempt to reform the world. The nature of humans, quite reasonably, is to intervene in an effort to alter their world and the outcomes it produces. But government interventions are laden with unintended -- and unforeseen -- consequences, particularly in complex systems, so humans must work with nature by tolerating systems that absorb human imperfections rather than seek to change them.
  • What is needed is a system that can prevent the harm done to citizens by the dishonesty of business elites; the limited competence of forecasters, economists, and statisticians; and the imperfections of regulation, not one that aims to eliminate these flaws. Humans must try to resist the illusion of control: just as foreign policy should be intelligence-proof (it should minimize its reliance on the competence of information-gathering organizations and the predictions of "experts" in what are inherently unpredictable domains), the economy should be regulator-proof, given that some regulations simply make the system itself more fragile. Due to the complexity of markets, intricate regulations simply serve to generate fees for lawyers and profits for sophisticated derivatives traders who can build complicated financial products that skirt those regulations.
  • The life of a turkey before Thanksgiving is illustrative: the turkey is fed for 1,000 days and every day seems to confirm that the farmer cares for it -- until the last day, when confidence is maximal. The "turkey problem" occurs when a naive analysis of stability is derived from the absence of past variations. Likewise, confidence in stability was maximal at the onset of the financial crisis in 2007.
  • The turkey problem for humans is the result of mistaking one environment for another. Humans simultaneously inhabit two systems: the linear and the complex. The linear domain is characterized by its predictability and the low degree of interaction among its components, which allows the use of mathematical methods that make forecasts reliable. In complex systems, there is an absence of visible causal links between the elements, masking a high degree of interdependence and extremely low predictability. Nonlinear elements are also present, such as those commonly known, and generally misunderstood, as "tipping points." Imagine someone who keeps adding sand to a sand pile without any visible consequence, until suddenly the entire pile crumbles. It would be foolish to blame the collapse on the last grain of sand rather than the structure of the pile, but that is what people do consistently, and that is the policy error.
  • Engineering, architecture, astronomy, most of physics, and much of common science are linear domains. The complex domain is the realm of the social world, epidemics, and economics. Crucially, the linear domain delivers mild variations without large shocks, whereas the complex domain delivers massive jumps and gaps. Complex systems are misunderstood, mostly because humans' sophistication, obtained over the history of human knowledge in the linear domain, does not transfer properly to the complex domain. Humans can predict a solar eclipse and the trajectory of a space vessel, but not the stock market or Egyptian political events. All man-made complex systems have commonalities and even universalities. Sadly, deceptive calm (followed by Black Swan surprises) seems to be one of those properties.
  • The system is responsible, not the components. But after the financial crisis of 2007-8, many people thought that predicting the subprime meltdown would have helped. It would not have, since it was a symptom of the crisis, not its underlying cause. Likewise, Obama's blaming "bad intelligence" for his administration's failure to predict the crisis in Egypt is symptomatic of both the misunderstanding of complex systems and the bad policies involved.
  • Obama's mistake illustrates the illusion of local causal chains -- that is, confusing catalysts for causes and assuming that one can know which catalyst will produce which effect. The final episode of the upheaval in Egypt was unpredictable for all observers, especially those involved. As such, blaming the CIA is as foolish as funding it to forecast such events. Governments are wasting billions of dollars on attempting to predict events that are produced by interdependent systems and are therefore not statistically understandable at the individual level.
  • Political and economic "tail events" are unpredictable, and their probabilities are not scientifically measurable. No matter how many dollars are spent on research, predicting revolutions is not the same as counting cards; humans will never be able to turn politics into the tractable randomness of blackjack.
  • Most explanations being offered for the current turmoil in the Middle East follow the "catalysts as causes" confusion. The riots in Tunisia and Egypt were initially attributed to rising commodity prices, not to stifling and unpopular dictatorships. But Bahrain and Libya are countries with high GDPs that can afford to import grain and other commodities. Again, the focus is wrong even if the logic is comforting. It is the system and its fragility, not events, that must be studied -- what physicists call "percolation theory," in which the properties of the terrain are studied rather than those of a single element of the terrain.
  • When dealing with a system that is inherently unpredictable, what should be done? Differentiating between two types of countries is useful. In the first, changes in government do not lead to meaningful differences in political outcomes (since political tensions are out in the open). In the second type, changes in government lead to both drastic and deeply unpredictable changes.
  • Humans fear randomness -- a healthy ancestral trait inherited from a different environment. Whereas in the past, which was a more linear world, this trait enhanced fitness and increased chances of survival, it can have the reverse effect in today's complex world, making volatility take the shape of nasty Black Swans hiding behind deceptive periods of "great moderation." This is not to say that any and all volatility should be embraced. Insurance should not be banned, for example.
  • But alongside the "catalysts as causes" confusion sit two mental biases: the illusion of control and the action bias (the illusion that doing something is always better than doing nothing). This leads to the desire to impose man-made solutions
  • Variation is information. When there is no variation, there is no information. This explains the CIA's failure to predict the Egyptian revolution and, a generation before, the Iranian Revolution -- in both cases, the revolutionaries themselves did not have a clear idea of their relative strength with respect to the regime they were hoping to topple. So rather than subsidize and praise as a "force for stability" every tin-pot potentate on the planet, the U.S. government should encourage countries to let information flow upward through the transparency that comes with political agitation. It should not fear fluctuations per se, since allowing them to be in the open, as Italy and Lebanon both show in different ways, creates the stability of small jumps.
  • As Seneca wrote in De clementia, "Repeated punishment, while it crushes the hatred of a few, stirs the hatred of all . . . just as trees that have been trimmed throw out again countless branches." The imposition of peace through repeated punishment lies at the heart of many seemingly intractable conflicts, including the Israeli-Palestinian stalemate. Furthermore, dealing with seemingly reliable high-level officials rather than the people themselves prevents any peace treaty signed from being robust. The Romans were wise enough to know that only a free man under Roman law could be trusted to engage in a contract; by extension, only a free people can be trusted to abide by a treaty. Treaties that are negotiated with the consent of a broad swath of the populations on both sides of a conflict tend to survive. Just as no central bank is powerful enough to dictate stability, no superpower can be powerful enough to guarantee solid peace alone.
  • As Jean-Jacques Rousseau put it, "A little bit of agitation gives motivation to the soul, and what really makes the species prosper is not peace so much as freedom." With freedom comes some unpredictable fluctuation. This is one of life's packages: there is no freedom without noise -- and no stability without volatility.
Weiye Loh

Response to Guardian's Article on Singapore Elections | the kent ridge common - 0 views

  • Further, grumblings on Facebook accounts are hardly ‘anonymous’. Lastly, how anonymous can bloggers be, when every now and then a racist blogger gets arrested by the state? Think about it. These sorts of cases prove that the state does screen, survey and monitor the online community, and as all of us know there are many vehement anti-PAP comments and articles, much of which are outright slander and defamation.
  • Yet at the end of the day, it is the racist blogger, not the anti-government or anti-PAP blogger that gets arrested. The Singaporean model is a much more complex and sophisticated phenomenon than this Guardian writer gives it credit.
  • Why did this Guardian writer, anyway, pander to a favourite Western stereotype of that “far-off Asian undemocratic, repressive regime”? Is she really in Singapore as the Guardian claims? (“Kate Hodal in Singapore” is written at the top) Can the Guardian be any more predictable and trite?
  • ...1 more annotation...
  • Can any Singaporean honestly say that she/he can conceive of a fellow Singaporean setting himself or herself on fire along Orchard Road or Shenton Way, as a result of desperate economic pressures or financial constraints? Can we even fathom the social and economic pressures that mobilized a whole people to protest and overthrow a corrupt, US-backed regime? (that is, not during elections time) Singapore has real problems, the People’s Action Party has its real problems, and there is indeed much room for improvement. Yet such irresponsible reporting by one of the esteemed newspapers from the UK is utterly disappointing, not constructive in the least sense, and utterly misrepresents our political situation (and may potentially provoke more irrationality in our society, leading people to ‘believe’ their affinity with their Arab peers which leads to more radicalism).