New Media Ethics 2009 course: Group items tagged Habits

Weiye Loh

Bad Health Habits Blamed on Genetics - Newsweek - 0 views

  • A new study shows just how alluring “My DNA did it!” is to some people.
  • There are serious scientific concerns about the reliability and value of many of the genes linked to disease. And now we have another reason why the hype is worrisome: people who engage in the riskiest-for-health behaviors, and who therefore most need to change, are more likely to blame their genes for their diseases, finds a new study published online in the journal Annals of Behavioral Medicine.
  • Worse, the more behavioral risk factors people have—smoking and eating a high-fat diet and not exercising, for instance—the less likely they are to be interested in information about living healthier.
  • The unhealthier people’s habits were, the more they latched on to genetic explanations for diseases
    My Alleles Made Me Do It: The Folly of Blaming Bad Behavior on Wonky DNA
Weiye Loh

The internet: is it changing the way we think? | Technology | The Observer - 0 views

  • The American magazine the Atlantic periodically lobs an intellectual grenade into our culture. In the summer of 1945, for example, it published an essay by the Massachusetts Institute of Technology (MIT) engineer Vannevar Bush entitled "As We May Think". It turned out to be the blueprint for what eventually emerged as the world wide web. Two summers ago, the Atlantic published an essay by Nicholas Carr, one of the blogosphere's most prominent (and thoughtful) contrarians, under the headline "Is Google Making Us Stupid?".
  • Carr wrote, "I've had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn't going – so far as I can tell – but it's changing. I'm not thinking the way I used to think. I can feel it most strongly when I'm reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument and I'd spend hours strolling through long stretches of prose. That's rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I'm always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle."
  • Carr's target was not really the world's leading search engine, but the impact that ubiquitous, always-on networking is having on our cognitive processes. His argument was that our deepening dependence on networking technology is indeed changing not only the way we think, but also the structure of our brains.
  • Carr's article touched a nerve and has provoked a lively, ongoing debate on the net and in print (he has now expanded it into a book, The Shallows: What the Internet Is Doing to Our Brains). This is partly because he's an engaging writer who has vividly articulated the unease that many adults feel about the way their modi operandi have changed in response to ubiquitous networking.
  • Who bothers to write down or memorise detailed information any more, for example, when they know that Google will always retrieve it if it's needed again? The web has become, in a way, a global prosthesis for our collective memory.
  • It is easy to dismiss Carr's concern as just the latest episode of the moral panic that always accompanies the arrival of a new communications technology. People fretted about printing, photography, the telephone and television in analogous ways. It even bothered Plato, who argued that the technology of writing would destroy the art of remembering.
  • Many commentators who accept the thrust of his argument seem not only untroubled by its far-reaching implications but are positively enthusiastic about them. When the Pew Research Center's Internet & American Life Project asked its panel of more than 370 internet experts for their reaction, 81% of them agreed with the proposition that "people's use of the internet has enhanced human intelligence".
  • As a writer, thinker, researcher and teacher, what I can attest to is that the internet is changing our habits of thinking, which isn't the same thing as changing our brains. The brain is like any other muscle – if you don't stretch it, it gets both stiff and flabby. But if you exercise it regularly, and cross-train, your brain will be flexible, quick, strong and versatile.
  • The internet is analogous to a weight-training machine for the brain, as compared with the free weights provided by libraries and books. Each method has its advantage, but used properly one works you harder. Weight machines are directive and enabling: they encourage you to think you've worked hard without necessarily challenging yourself. The internet can be the same: it often tells us what we think we know, spreading misinformation and nonsense while it's at it. It can substitute surface for depth, imitation for originality, and its passion for recycling would surpass the most committed environmentalist.
  • I've seen students' thinking habits change dramatically: if information is not immediately available via a Google search, students are often stymied. But of course what a Google search provides is not the best, wisest or most accurate answer, but the most popular one.
  • But knowledge is not the same thing as information, and there is no question to my mind that the access to raw information provided by the internet is unparalleled and democratising. Admittance to elite private university libraries and archives is no longer required, as they increasingly digitise their archives. We've all read the jeremiads that the internet sounds the death knell of reading, but people read online constantly – we just call it surfing now. What they are reading is changing, often for the worse; but it is also true that the internet increasingly provides a treasure trove of rare books, documents and images, and as long as we have free access to it, then the internet can certainly be a force for education and wisdom, and not just for lies, damned lies, and false statistics.
  • In the end, the medium is not the message, and the internet is just a medium, a repository and an archive. Its greatest virtue is also its greatest weakness: it is unselective. This means that it is undiscriminating, in both senses of the word. It is indiscriminate in its principles of inclusion: anything at all can get into it. But it also – at least so far – doesn't discriminate against anyone with access to it. This is changing rapidly, of course, as corporations and governments seek to exert control over it. Knowledge may not be the same thing as power, but it is unquestionably a means to power. The question is, will we use the internet's power for good, or for evil? The jury is very much out. The internet itself is disinterested: but what we use it for is not.
    The internet: is it changing the way we think? American writer Nicholas Carr's claim that the internet is not only shaping our lives but physically altering our brains has sparked a lively and ongoing debate, says John Naughton. Below, a selection of writers and experts offer their opinion
Weiye Loh

Hayek, The Use of Knowledge in Society | Library of Economics and Liberty - 0 views

  • the "data" from which the economic calculus starts are never for the whole society "given" to a single mind which could work out the implications and can never be so given.
  • The peculiar character of the problem of a rational economic order is determined precisely by the fact that the knowledge of the circumstances of which we must make use never exists in concentrated or integrated form but solely as the dispersed bits of incomplete and frequently contradictory knowledge which all the separate individuals possess.
  • The economic problem of society is a problem of the utilization of knowledge which is not given to anyone in its totality.
  • who is to do the planning. It is about this question that all the dispute about "economic planning" centers. This is not a dispute about whether planning is to be done or not. It is a dispute as to whether planning is to be done centrally, by one authority for the whole economic system, or is to be divided among many individuals. Planning in the specific sense in which the term is used in contemporary controversy necessarily means central planning—direction of the whole economic system according to one unified plan. Competition, on the other hand, means decentralized planning by many separate persons. The halfway house between the two, about which many people talk but which few like when they see it, is the delegation of planning to organized industries, or, in other words, monopoly.
  • Which of these systems is likely to be more efficient depends mainly on the question under which of them we can expect that fuller use will be made of the existing knowledge.
  • It may be admitted that, as far as scientific knowledge is concerned, a body of suitably chosen experts may be in the best position to command all the best knowledge available—though this is of course merely shifting the difficulty to the problem of selecting the experts.
  • Today it is almost heresy to suggest that scientific knowledge is not the sum of all knowledge. But a little reflection will show that there is beyond question a body of very important but unorganized knowledge which cannot possibly be called scientific in the sense of knowledge of general rules: the knowledge of the particular circumstances of time and place. It is with respect to this that practically every individual has some advantage over all others because he possesses unique information of which beneficial use might be made, but of which use can be made only if the decisions depending on it are left to him or are made with his active coöperation.
  • the relative importance of the different kinds of knowledge; those more likely to be at the disposal of particular individuals and those which we should with greater confidence expect to find in the possession of an authority made up of suitably chosen experts. If it is today so widely assumed that the latter will be in a better position, this is because one kind of knowledge, namely, scientific knowledge, occupies now so prominent a place in public imagination that we tend to forget that it is not the only kind that is relevant.
  • It is a curious fact that this sort of knowledge should today be generally regarded with a kind of contempt and that anyone who by such knowledge gains an advantage over somebody better equipped with theoretical or technical knowledge is thought to have acted almost disreputably. To gain an advantage from better knowledge of facilities of communication or transport is sometimes regarded as almost dishonest, although it is quite as important that society make use of the best opportunities in this respect as in using the latest scientific discoveries.
  • The common idea now seems to be that all such knowledge should as a matter of course be readily at the command of everybody, and the reproach of irrationality leveled against the existing economic order is frequently based on the fact that it is not so available. This view disregards the fact that the method by which such knowledge can be made as widely available as possible is precisely the problem to which we have to find an answer.
  • One reason why economists are increasingly apt to forget about the constant small changes which make up the whole economic picture is probably their growing preoccupation with statistical aggregates, which show a very much greater stability than the movements of the detail. The comparative stability of the aggregates cannot, however, be accounted for—as the statisticians occasionally seem to be inclined to do—by the "law of large numbers" or the mutual compensation of random changes.
  • the sort of knowledge with which I have been concerned is knowledge of the kind which by its nature cannot enter into statistics and therefore cannot be conveyed to any central authority in statistical form. The statistics which such a central authority would have to use would have to be arrived at precisely by abstracting from minor differences between the things, by lumping together, as resources of one kind, items which differ as regards location, quality, and other particulars, in a way which may be very significant for the specific decision. It follows from this that central planning based on statistical information by its nature cannot take direct account of these circumstances of time and place and that the central planner will have to find some way or other in which the decisions depending on them can be left to the "man on the spot."
  • We need decentralization because only thus can we insure that the knowledge of the particular circumstances of time and place will be promptly used. But the "man on the spot" cannot decide solely on the basis of his limited but intimate knowledge of the facts of his immediate surroundings. There still remains the problem of communicating to him such further information as he needs to fit his decisions into the whole pattern of changes of the larger economic system.
  • The problem which we meet here is by no means peculiar to economics but arises in connection with nearly all truly social phenomena, with language and with most of our cultural inheritance, and constitutes really the central theoretical problem of all social science. As Alfred Whitehead has said in another connection, "It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking what we are doing. The precise opposite is the case. Civilization advances by extending the number of important operations which we can perform without thinking about them." This is of profound significance in the social field. We make constant use of formulas, symbols, and rules whose meaning we do not understand and through the use of which we avail ourselves of the assistance of knowledge which individually we do not possess. We have developed these practices and institutions by building upon habits and institutions which have proved successful in their own sphere and which have in turn become the foundation of the civilization we have built up.
  • To assume all the knowledge to be given to a single mind in the same manner in which we assume it to be given to us as the explaining economists is to assume the problem away and to disregard everything that is important and significant in the real world.
  • That an economist of Professor Schumpeter's standing should thus have fallen into a trap which the ambiguity of the term "datum" sets to the unwary can hardly be explained as a simple error. It suggests rather that there is something fundamentally wrong with an approach which habitually disregards an essential part of the phenomena with which we have to deal: the unavoidable imperfection of man's knowledge and the consequent need for a process by which knowledge is constantly communicated and acquired. Any approach, such as that of much of mathematical economics with its simultaneous equations, which in effect starts from the assumption that people's knowledge corresponds with the objective facts of the situation, systematically leaves out what is our main task to explain. I am far from denying that in our system equilibrium analysis has a useful function to perform. But when it comes to the point where it misleads some of our leading thinkers into believing that the situation which it describes has direct relevance to the solution of practical problems, it is high time that we remember that it does not deal with the social process at all and that it is no more than a useful preliminary to the study of the main problem.
    The Use of Knowledge in Society, by Friedrich A. Hayek (1899-1992)
Weiye Loh

Science, Strong Inference -- Proper Scientific Method - 0 views

  • Scientists these days tend to keep up a polite fiction that all science is equal. Except for the work of the misguided opponent whose arguments we happen to be refuting at the time, we speak as though every scientist's field and methods of study are as good as every other scientist's and perhaps a little better. This keeps us all cordial when it comes to recommending each other for government grants.
  • Why should there be such rapid advances in some fields and not in others? I think the usual explanations that we tend to think of - such as the tractability of the subject, or the quality or education of the men drawn into it, or the size of research contracts - are important but inadequate. I have begun to believe that the primary factor in scientific advance is an intellectual one. These rapidly moving fields are fields where a particular method of doing scientific research is systematically used and taught, an accumulative method of inductive inference that is so effective that I think it should be given the name of "strong inference." I believe it is important to examine this method, its use and history and rationale, and to see whether other groups and individuals might learn to adopt it profitably in their own scientific and intellectual work. In its separate elements, strong inference is just the simple and old-fashioned method of inductive inference that goes back to Francis Bacon. The steps are familiar to every college student and are practiced, off and on, by every scientist. The difference comes in their systematic application. Strong inference consists of applying the following steps to every problem in science, formally and explicitly and regularly: (1) devising alternative hypotheses; (2) devising a crucial experiment (or several of them), with alternative possible outcomes, each of which will, as nearly as possible, exclude one or more of the hypotheses; (3) carrying out the experiment so as to get a clean result; (4) recycling the procedure, making subhypotheses or sequential hypotheses to refine the possibilities that remain, and so on.
  • On any new problem, of course, inductive inference is not as simple and certain as deduction, because it involves reaching out into the unknown. Steps 1 and 2 require intellectual inventions, which must be cleverly chosen so that hypothesis, experiment, outcome, and exclusion will be related in a rigorous syllogism; and the question of how to generate such inventions is one which has been extensively discussed elsewhere (2, 3). What the formal schema reminds us to do is to try to make these inventions, to take the next step, to proceed to the next fork, without dawdling or getting tied up in irrelevancies.
  • It is clear why this makes for rapid and powerful progress. For exploring the unknown, there is no faster method; this is the minimum sequence of steps. Any conclusion that is not an exclusion is insecure and must be rechecked. Any delay in recycling to the next set of hypotheses is only a delay. Strong inference, and the logical tree it generates, are to inductive reasoning what the syllogism is to deductive reasoning in that it offers a regular method for reaching firm inductive conclusions one after the other as rapidly as possible.
  • "But what is so novel about this?" someone will say. This is the method of science and always has been, why give it a special name? The reason is that many of us have almost forgotten it. Science is now an everyday business. Equipment, calculations, lectures become ends in themselves. How many of us write down our alternatives and crucial experiments every day, focusing on the exclusion of a hypothesis? We may write our scientific papers so that it looks as if we had steps 1, 2, and 3 in mind all along. But in between, we do busywork. We become "method- oriented" rather than "problem-oriented." We say we prefer to "feel our way" toward generalizations. We fail to teach our students how to sharpen up their inductive inferences. And we do not realize the added power that the regular and explicit use of alternative hypothesis and sharp exclusion could give us at every step of our research.
  • A distinguished cell biologist rose and said, "No two cells give the same properties. Biology is the science of heterogeneous systems." And he added privately: "You know there are scientists, and there are people in science who are just working with these over-simplified model systems - DNA chains and in vitro systems - who are not doing science at all. We need their auxiliary work: they build apparatus, they make minor studies, but they are not scientists." To which Cy Levinthal replied: "Well, there are two kinds of biologists, those who are looking to see if there is one thing that can be understood and those who keep saying it is very complicated and that nothing can be understood. . . . You must study the simplest system you think has the properties you are interested in."
  • At the 1958 Conference on Biophysics, at Boulder, there was a dramatic confrontation between the two points of view. Leo Szilard said: "The problems of how enzymes are induced, of how proteins are synthesized, of how antibodies are formed, are closer to solution than is generally believed. If you do stupid experiments, and finish one a year, it can take 50 years. But if you stop doing experiments for a little while and think how proteins can possibly be synthesized, there are only about 5 different ways, not 50! And it will take only a few experiments to distinguish these." One of the young men added: "It is essentially the old question: How small and elegant an experiment can you perform?" These comments upset a number of those present. An electron microscopist said: "Gentlemen, this is off the track. This is philosophy of science." Szilard retorted: "I was not quarreling with third-rate scientists: I was quarreling with first-rate scientists."
  • Any criticism or challenge to consider changing our methods strikes of course at all our ego-defenses. But in this case the analytical method offers the possibility of such great increases in effectiveness that it is unfortunate that it cannot be regarded more often as a challenge to learning rather than as a challenge to combat. Many of the recent triumphs in molecular biology have in fact been achieved on just such "oversimplified model systems," very much along the analytical lines laid down in the 1958 discussion. They have not fallen to the kind of men who justify themselves by saying "No two cells are alike," regardless of how true that may ultimately be. The triumphs are in fact triumphs of a new way of thinking.
  • the emphasis on strong inference is also partly due to the nature of the fields themselves. Biology, with its vast informational detail and complexity, is a "high-information" field, where years and decades can easily be wasted on the usual type of "low-information" observations or experiments if one does not think carefully in advance about what the most important and conclusive experiments would be. And in high-energy physics, both the "information flux" of particles from the new accelerators and the million-dollar costs of operation have forced a similar analytical approach. It pays to have a top-notch group debate every experiment ahead of time; and the habit spreads throughout the field.
  • Historically, I think, there have been two main contributions to the development of a satisfactory strong-inference method. The first is that of Francis Bacon (13). He wanted a "surer method" of "finding out nature" than either the logic-chopping or all-inclusive theories of the time or the laudable but crude attempts to make inductions "by simple enumeration." He did not merely urge experiments as some suppose; he showed the fruitfulness of interconnecting theory and experiment so that the one checked the other. Of the many inductive procedures he suggested, the most important, I think, was the conditional inductive tree, which proceeded from alternative hypotheses (possible "causes," as he calls them), through crucial experiments ("Instances of the Fingerpost"), to exclusion of some alternatives and adoption of what is left ("establishing axioms"). His Instances of the Fingerpost are explicitly at the forks in the logical tree, the term being borrowed "from the fingerposts which are set up where roads part, to indicate the several directions."
  • Here was a method that could separate off the empty theories! Bacon said the inductive method could be learned by anybody, just like learning to "draw a straighter line or more perfect circle . . . with the help of a ruler or a pair of compasses." "My way of discovering sciences goes far to level men's wit and leaves but little to individual excellence, because it performs everything by the surest rules and demonstrations." Even occasional mistakes would not be fatal. "Truth will sooner come out from error than from confusion."
  • Nevertheless there is a difficulty with this method. As Bacon emphasizes, it is necessary to make "exclusions." He says, "The induction which is to be available for the discovery and demonstration of sciences and arts, must analyze nature by proper rejections and exclusions, and then, after a sufficient number of negatives come to a conclusion on the affirmative instances." "[To man] it is granted only to proceed at first by negatives, and at last to end in affirmatives after exclusion has been exhausted." Or, as the philosopher Karl Popper says today, there is no such thing as proof in science - because some later alternative explanation may be as good or better - so that science advances only by disproofs. There is no point in making hypotheses that are not falsifiable because such hypotheses do not say anything; "it must be possible for an empirical scientific system to be refuted by experience" (14).
  • The difficulty is that disproof is a hard doctrine. If you have a hypothesis and I have another hypothesis, evidently one of them must be eliminated. The scientist seems to have no choice but to be either soft-headed or disputatious. Perhaps this is why so many tend to resist the strong analytical approach and why some great scientists are so disputatious.
  • Fortunately, it seems to me, this difficulty can be removed by the use of a second great intellectual invention, the "method of multiple hypotheses," which is what was needed to round out the Baconian scheme. This is a method that was put forward by T.C. Chamberlin (15), a geologist at Chicago at the turn of the century, who is best known for his contribution to the Chamberlin-Moulton hypothesis of the origin of the solar system.
  • Chamberlin says our trouble is that when we make a single hypothesis, we become attached to it. "The moment one has offered an original explanation for a phenomenon which seems satisfactory, that moment affection for his intellectual child springs into existence, and as the explanation grows into a definite theory his parental affections cluster about his offspring and it grows more and more dear to him. . . . There springs up also unwittingly a pressing of the theory to make it fit the facts and a pressing of the facts to make them fit the theory..." "To avoid this grave danger, the method of multiple working hypotheses is urged. It differs from the simple working hypothesis in that it distributes the effort and divides the affections. . . . Each hypothesis suggests its own criteria, its own method of proof, its own method of developing the truth, and if a group of hypotheses encompass the subject on all sides, the total outcome of means and of methods is full and rich."
  • The conflict and exclusion of alternatives that is necessary to sharp inductive inference has been all too often a conflict between men, each with his single Ruling Theory. But whenever each man begins to have multiple working hypotheses, it becomes purely a conflict between ideas. It becomes much easier then for each of us to aim every day at conclusive disproofs - at strong inference - without either reluctance or combativeness. In fact, when there are multiple hypotheses, which are not anyone's "personal property," and when there are crucial experiments to test them, the daily life in the laboratory takes on an interest and excitement it never had, and the students can hardly wait to get to work to see how the detective story will come out. It seems to me that this is the reason for the development of those distinctive habits of mind and the "complex thought" that Chamberlin described, the reason for the sharpness, the excitement, the zeal, the teamwork - yes, even international teamwork - in molecular biology and high- energy physics today. What else could be so effective?
  • Unfortunately, I think, there are other areas of science today that are sick by comparison, because they have forgotten the necessity for alternative hypotheses and disproof. Each man has only one branch - or none - on the logical tree, and it twists at random without ever coming to the need for a crucial decision at any point. We can see from the external symptoms that there is something scientifically wrong. The Frozen Method, The Eternal Surveyor, The Never Finished, The Great Man With a Single Hypothesis, The Little Club of Dependents, The Vendetta, The All-Encompassing Theory Which Can Never Be Falsified.
  • a "theory" of this sort is not a theory at all, because it does not exclude anything. It predicts everything, and therefore does not predict anything. It becomes simply a verbal formula which the graduate student repeats and believes because the professor has said it so often. This is not science, but faith; not theory, but theology. Whether it is hand-waving or number-waving, or equation-waving, a theory is not a theory unless it can be disproved. That is, unless it can be falsified by some possible experimental outcome.
  • the work methods of a number of scientists have been testimony to the power of strong inference. Is success not due in many cases to systematic use of Bacon's "surest rules and demonstrations" as much as to rare and unattainable intellectual power? Faraday's famous diary (16), or Fermi's notebooks (3, 17), show how these men believed in the effectiveness of daily steps in applying formal inductive methods to one problem after another.
  • Surveys, taxonomy, design of equipment, systematic measurements and tables, theoretical computations - all have their proper and honored place, provided they are parts of a chain of precise induction of how nature works. Unfortunately, all too often they become ends in themselves, mere time-serving from the point of view of real scientific advance, a hypertrophied methodology that justifies itself as a lore of respectability.
  • We speak piously of taking measurements and making small studies that will "add another brick to the temple of science." Most such bricks just lie around the brickyard (20). Tables of constants have their place and value, but the study of one spectrum after another, if not frequently re-evaluated, may become a substitute for thinking, a sad waste of intelligence in a research laboratory, and a mistraining whose crippling effects may last a lifetime.
  • Beware of the man of one method or one instrument, either experimental or theoretical. He tends to become method-oriented rather than problem-oriented. The method-oriented man is shackled; the problem-oriented man is at least reaching freely toward what is most important. Strong inference redirects a man to problem-orientation, but it requires him to be willing repeatedly to put aside his last methods and teach himself new ones.
  • anyone who asks the question about scientific effectiveness will also conclude that much of the mathematizing in physics and chemistry today is irrelevant if not misleading. The great value of mathematical formulation is that when an experiment agrees with a calculation to five decimal places, a great many alternative hypotheses are pretty well excluded (though the Bohr theory and the Schrödinger theory both predict exactly the same Rydberg constant!). But when the fit is only to two decimal places, or one, it may be a trap for the unwary; it may be no better than any rule-of-thumb extrapolation, and some other kind of qualitative exclusion might be more rigorous for testing the assumptions and more important to scientific understanding than the quantitative fit.
  • Today we preach that science is not science unless it is quantitative. We substitute correlations for causal studies, and physical equations for organic reasoning. Measurements and equations are supposed to sharpen thinking, but, in my observation, they more often tend to make the thinking noncausal and fuzzy. They tend to become the object of scientific manipulation instead of auxiliary tests of crucial inferences.
  • Many - perhaps most - of the great issues of science are qualitative, not quantitative, even in physics and chemistry. Equations and measurements are useful when and only when they are related to proof; but proof or disproof comes first and is in fact strongest when it is absolutely convincing without any quantitative measurement.
  • you can catch phenomena in a logical box or in a mathematical box. The logical box is coarse but strong. The mathematical box is fine-grained but flimsy. The mathematical box is a beautiful way of wrapping up a problem, but it will not hold the phenomena unless they have been caught in a logical box to begin with.
  • Of course it is easy - and all too common - for one scientist to call the others unscientific. My point is not that my particular conclusions here are necessarily correct, but that we have long needed some absolute standard of possible scientific effectiveness by which to measure how well we are succeeding in various areas - a standard that many could agree on and one that would be undistorted by the scientific pressures and fashions of the times and the vested interests and busywork that they develop. It is not public evaluation I am interested in so much as a private measure by which to compare one's own scientific performance with what it might be. I believe that strong inference provides this kind of standard of what the maximum possible scientific effectiveness could be - as well as a recipe for reaching it.
  • The strong-inference point of view is so resolutely critical of methods of work and values in science that any attempt to compare specific cases is likely to sound both smug and destructive. Mainly one should try to teach it by example and by exhorting to self-analysis and self-improvement only in general terms.
  • one severe but useful private test - a touchstone of strong inference - that removes the necessity for third-person criticism, because it is a test that anyone can learn to carry with him for use as needed. It is our old friend the Baconian "exclusion," but I call it "The Question." Obviously it should be applied as much to one's own thinking as to others'. It consists of asking in your own mind, on hearing any scientific explanation or theory put forward, "But sir, what experiment could disprove your hypothesis?"; or, on hearing a scientific experiment described, "But sir, what hypothesis does your experiment disprove?"
  • It is not true that all science is equal; or that we cannot justly compare the effectiveness of scientists by any method other than a mutual-recommendation system. The man to watch, the man to put your money on, is not the man who wants to make "a survey" or a "more detailed study" but the man with the notebook, the man with the alternative hypotheses and the crucial experiments, the man who knows how to answer your Question of disproof and is already working on it.
    There is so much bad science and bad statistics in media reports, publications, and everyday conversation that I think it is important to understand facts, proofs, and the associated pitfalls. (A toy code sketch of the strong-inference loop follows below.)
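A minimal, hypothetical Python sketch of the strong-inference loop described in the annotations above (steps 1-4). Hypotheses are plain labels, and each "crucial experiment" is modeled as a function whose observed outcome excludes some of them; all names and the five-hypothesis example are illustrative inventions, not anything from Platt's paper.

```python
# Toy model of strong inference: hold a set of live hypotheses,
# run experiments whose outcomes exclude some of them, and recycle
# until at most one survives.

def strong_inference(hypotheses, experiments):
    live = set(hypotheses)            # step 1: alternative hypotheses
    for run in experiments:           # steps 2-3: crucial experiments
        if len(live) <= 1:
            break                     # nothing left to exclude
        excluded = run(live)          # the outcome rules some out
        live -= excluded              # Baconian "exclusion"
    return live                       # step 4: recycle on what remains

# Echoing Szilard's remark: about 5 candidate mechanisms, and a few
# sharp experiments distinguish them.
hypotheses = {"H1", "H2", "H3", "H4", "H5"}
experiments = [
    lambda live: {"H1", "H4"} & live,  # outcome incompatible with H1, H4
    lambda live: {"H2"} & live,
    lambda live: {"H5"} & live,
]
print(strong_inference(hypotheses, experiments))  # -> {'H3'}
```

An experiment that excludes nothing advances nothing, which is the sketch's version of Platt's point that "any delay in recycling to the next set of hypotheses is only a delay."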
Weiye Loh

Free Speech under Siege - Robert Skidelsky - Project Syndicate - 0 views

  • Breaking the cultural code damages a person’s reputation, and perhaps one’s career. Britain’s Justice Secretary Kenneth Clarke recently had to apologize for saying that some rapes were less serious than others, implying the need for legal discrimination. The parade of gaffes and subsequent groveling apologies has become a regular feature of public life. In his classic essay On Liberty, John Stuart Mill defended free speech on the ground that free inquiry was necessary to advance knowledge. Restrictions on certain areas of historical inquiry are based on the opposite premise: the truth is known, and it is impious to question it. This is absurd; every historian knows that there is no such thing as final historical truth.
  • It is not the task of history to defend public order or morals, but to establish what happened. Legally protected history ensures that historians will play safe. To be sure, living by Mill’s principle often requires protecting the rights of unsavory characters. David Irving writes mendacious history, but his prosecution and imprisonment in Austria for “Holocaust denial” would have horrified Mill.
  • the pressure for “political correctness” rests on the argument that the truth is unknowable. Statements about the human condition are essentially matters of opinion.  Because a statement of opinion by some individuals is almost certain to offend others, and since such statements make no contribution to the discovery of truth, their degree of offensiveness becomes the sole criterion for judging their admissibility. Hence the taboo on certain words, phrases, and arguments that imply that certain individuals, groups, or practices are superior or inferior, normal or abnormal; hence the search for ever more neutral ways to label social phenomena, thereby draining language of its vigor and interest.
  • A classic example is the way that “family” has replaced “marriage” in public discourse, with the implication that all “lifestyles” are equally valuable, despite the fact that most people persist in wanting to get married. It has become taboo to describe homosexuality as a “perversion,” though this was precisely the word used in the 1960’s by the radical philosopher Herbert Marcuse (who was praising homosexuality as an expression of dissent). In today’s atmosphere of what Marcuse would call “repressive tolerance,” such language would be considered “stigmatizing.”
  • The sociological imperative behind the spread of “political correctness” is the fact that we no longer live in patriarchal, hierarchical, mono-cultural societies, which exhibit general, if unreflective, agreement on basic values. The pathetic efforts to inculcate a common sense of “Britishness” or “Dutchness” in multi-cultural societies, however well-intentioned, attest to the breakdown of a common identity.
  • The defense of free speech is made no easier by the abuses of the popular press. We need free media to expose abuses of power. But investigative journalism becomes discredited when it is suborned to “expose” the private lives of the famous when no issue of public interest is involved. Entertaining gossip has mutated into an assault on privacy, with newspapers claiming that any attempt to keep them out of people’s bedrooms is an assault on free speech. You know that a doctrine is in trouble when not even those claiming to defend it understand what it means. By that standard, the classic doctrine of free speech is in crisis. We had better sort it out quickly – legally, morally, and culturally – if we are to retain a proper sense of what it means to live in a free society.
    Yet freedom of speech in the West is under strain. Traditionally, British law imposed two main limitations on the "right to free speech." The first prohibited the use of words or expressions likely to disrupt public order; the second was the law against libel. There are good grounds for both - to preserve the peace, and to protect individuals' reputations from lies. Most free societies accept such limits as reasonable. But the law has recently become more restrictive. "Incitement to religious and racial hatred" and "incitement to hatred on the basis of sexual orientation" are now illegal in most European countries, independent of any threat to public order. The law has shifted from proscribing language likely to cause violence to prohibiting language intended to give offense. A blatant example of this is the law against Holocaust denial. To deny or minimize the Holocaust is a crime in 15 European countries and Israel. It may be argued that the Holocaust was a crime so uniquely abhorrent as to qualify as a special case. But special cases have a habit of multiplying.
Weiye Loh

journalism.sg » Racial and religious offence: Why censorship doesn't cut it - 1 views

  • All societies use a mix of approaches to address offensive speech. In international law, as at the European Court of Human Rights and in a growing number of jurisdictions, there is a growing feeling that the law should really be a last resort, used only for the most extreme speech – speech that incites violence in a very direct way, or that is part of a campaign that violates the rights of minorities to live free of discrimination. In contrast, simply insulting and offending others, even if feelings are very hurt, is not seen as something that should invite a legal response. Using the law to protect feelings is too great an encroachment on freedom of speech.
  • Our laws are written very broadly, such that any sort of offence, even if it does not threaten imminent violence, is seen as deserving of strict regulation. This probably reflects a very strong social consensus that race and religion should be handled delicately. So we tend to rely on strong government. The state protects racial and religious sensibilities from offence, using censorship when there’s a danger of words and actions causing hurt.
  • in almost all cases, state action was instigated by complaints from members of the public. This is quite unlike political censorship, where action is initiated by the government, often with great resistance and opposition from netizens. In a string of cases involving racial and religious offence, however, it’s the netizens who tend to demand action, sometimes acting like a lynch mob.
  • in many cases, the offensive messages were spread further by those reporting the offence.
  • What is the justification for strong police action against any form of speech? Why do we sometimes feel that it may not be enough to counter bad speech with good speech in free and open debate, and that we must instead use the law to stop the bad speech? Surely, it must be because we think the bad speech is so dangerous that it can cause immediate harm; or because we don’t trust the public to respond rationally, so we don’t know if good speech would indeed triumph in open debate. Usually, if we call in the authorities, it must be because we have a mental picture of offensive speech being like lighting a match in a combustible atmosphere. It is dangerous and there’s no time to debate the merits of that match – we just have to put it out. The irony of most of the cases that we have seen in the past few years is that the people demanding government action, as if the offensive words were explosive, were also those who helped to spread them. It is like helping to spread a fire while calling for the fire brigade.
  • their act of spreading the offensive content must mean that they did not actually believe that the expression was really that dangerous in the sense of prompting violence through reprisal attacks or riots. In reposting the offensive words or pictures, they showed that they actually trusted the public enough to respond sympathetically – they had faith that enough people would add their voices to the outrage that they themselves felt when they saw the offensive images or videos or words.
  • This then raises the question, why the need to involve the police at all? If Singaporeans are grown-up enough to defend their society against offensive speech, why have calls for prosecution and censorship become the automatic response? I wonder if this is an example of the well-known habit of unthinkingly relying on government to solve all our problems even when a little bit of effort in the form of grassroots action could do the job.
  • The next time people encounter racist or religiously offensive speech, it would be nice to see swift responses from credible and respected civil society groups, Members of Parliament, and other ordinary citizens. If the speaker doesn’t get the message, organise boycotts, for example, and give him or her the clear message that our society isn’t going to take such offence lying down. The more we can respond ourselves through open debate and grassroots action, without the need to ask law and order to step in, the stronger our society will be.
    No matter how hard we work at developing media literacy, we should not expect to be rid of all racially offensive speech online. There are two broad ways to respond to these breaches. We can reach out horizontally and together with our fellow citizens repair the damage by persuading others to reject harmful ideas. Or, we can reach up vertically to government, getting the authorities to act against irresponsible speech by using the law. The advantage of the latter is that it seems more efficient, punishing those who cross the line of acceptability and violate social norms, and deterring others from doing the same. The horizontal approach works through persuasion rather than the law, so it is slower and not foolproof.
Weiye Loh

Research integrity: Sabotage! : Nature News - 0 views

  • University of Michigan in Ann Arbor
  • Vipul Bhrigu, a former postdoc at the university's Comprehensive Cancer Center, wears a dark-blue three-buttoned suit and a pinched expression as he cups his pregnant wife's hand in both of his. When Pollard Hines calls Bhrigu's case to order, she has stern words for him: "I was inclined to send you to jail when I came out here this morning."
  • Bhrigu, over the course of several months at Michigan, had meticulously and systematically sabotaged the work of Heather Ames, a graduate student in his lab, by tampering with her experiments and poisoning her cell-culture media. Captured on hidden camera, Bhrigu confessed to university police in April and pleaded guilty to malicious destruction of personal property, a misdemeanour that apparently usually involves cars: in the spaces for make and model on the police report, the arresting officer wrote "lab research" and "cells". Bhrigu has said on multiple occasions that he was compelled by "internal pressure" and had hoped to slow down Ames's work. Speaking earlier this month, he was contrite. "It was a complete lack of moral judgement on my part," he said.
  • Bhrigu's actions are surprising, but probably not unique. There are few firm numbers showing the prevalence of research sabotage, but conversations with graduate students, postdocs and research-misconduct experts suggest that such misdeeds occur elsewhere, and that most go unreported or unpoliced. In this case, the episode set back research, wasted potentially tens of thousands of dollars and terrorized a young student. More broadly, acts such as Bhrigu's — along with more subtle actions to hold back or derail colleagues' work — have a toxic effect on science and scientists. They are an affront to the implicit trust between scientists that is necessary for research endeavours to exist and thrive.
  • Despite all this, there is little to prevent perpetrators re-entering science.
  • federal bodies that provide research funding have limited ability and inclination to take action in sabotage cases because they aren't interpreted as fitting the federal definition of research misconduct, which is limited to plagiarism, fabrication and falsification of research data.
  • In Bhrigu's case, administrators at the University of Michigan worked with police to investigate, thanks in part to the persistence of Ames and her supervisor, Theo Ross. "The question is, how many universities have such procedures in place that scientists can go and get that kind of support?" says Christine Boesz, former inspector-general for the US National Science Foundation in Arlington, Virginia, and now a consultant on scientific accountability. "Most universities I was familiar with would not necessarily be so responsive."
  • Some labs are known to be hyper-competitive, with principal investigators pitting postdocs against each other. But Ross's lab is a small, collegial place. At the time that Ames was noticing problems, it housed just one other graduate student, a few undergraduates doing projects, and the lab manager, Katherine Oravecz-Wilson, a nine-year veteran of the lab whom Ross calls her "eyes and ears". And then there was Bhrigu, an amiable postdoc who had joined the lab in April 2009.
  • Some people whom Ross consulted with tried to convince her that Ames was hitting a rough patch in her work and looking for someone else to blame. But Ames was persistent, so Ross took the matter to the university's office of regulatory affairs, which advises on a wide variety of rules and regulations pertaining to research and clinical care. Ray Hutchinson, associate dean of the office, and Patricia Ward, its director, had never dealt with anything like it before. After several meetings and two more instances of alcohol in the media, Ward contacted the department of public safety — the university's police force — on 9 March. They immediately launched an investigation — into Ames herself. She endured two interrogations and a lie-detector test before investigators decided to look elsewhere.
  • At 4:00 a.m. on Sunday 18 April, officers installed two cameras in the lab: one in the cold room where Ames's blots had been contaminated, and one above the refrigerator where she stored her media. Ames came in that day and worked until 5:00 p.m. On Monday morning at around 10:15, she found that her medium had been spiked again. When Ross reviewed the tapes of the intervening hours with Richard Zavala, the officer assigned to the case, she says that her heart sank. Bhrigu entered the lab at 9:00 a.m. on Monday and pulled out the culture media that he would use for the day. He then returned to the fridge with a spray bottle of ethanol, usually used to sterilize lab benches. With his back to the camera, he rummaged through the fridge for 46 seconds. Ross couldn't be sure what he was doing, but it didn't look good. Zavala escorted Bhrigu to the campus police department for questioning. When he told Bhrigu about the cameras in the lab, the postdoc asked for a drink of water and then confessed. He said that he had been sabotaging Ames's work since February. (He denies involvement in the December and January incidents.)
  • Misbehaviour in science is nothing new — but its frequency is difficult to measure. Daniele Fanelli at the University of Edinburgh, UK, who studies research misconduct, says that overtly malicious offences such as Bhrigu's are probably infrequent, but other forms of indecency and sabotage are likely to be more common. "A lot more would be the kind of thing you couldn't capture on camera," he says. Vindictive peer review, dishonest reference letters and withholding key aspects of protocols from colleagues or competitors can do just as much to derail a career or a research project as vandalizing experiments. These are just a few of the questionable practices that seem quite widespread in science, but are not technically considered misconduct. In a meta-analysis of misconduct surveys, published last year (D. Fanelli PLoS ONE 4, e5738; 2009), Fanelli found that up to one-third of scientists admit to offences that fall into this grey area, and up to 70% say that they have observed them.
  • Some say that the structure of the scientific enterprise is to blame. The big rewards — tenured positions, grants, papers in stellar journals — are won through competition. To get ahead, researchers need only be better than those they are competing with. That ethos, says Brian Martinson, a sociologist at HealthPartners Research Foundation in Minneapolis, Minnesota, can lead to sabotage. He and others have suggested that universities and funders need to acknowledge the pressures in the research system and try to ease them by means of education and rehabilitation, rather than simply punishing perpetrators after the fact.
  • Bhrigu says that he felt pressure in moving from the small college at Toledo to the much bigger one in Michigan. He says that some criticisms he received from Ross about his incomplete training and his work habits frustrated him, but he doesn't blame his actions on that. "In any kind of workplace there is bound to be some pressure," he says. "I just got jealous of others moving ahead and I wanted to slow them down."
  • At Washtenaw County Courthouse in July, having reviewed the case files, Pollard Hines delivered Bhrigu's sentence. She ordered him to pay around US$8,800 for reagents and experimental materials, plus $600 in court fees and fines — and to serve six months' probation, perform 40 hours of community service and undergo a psychiatric evaluation.
  • But the threat of a worse sentence hung over Bhrigu's head. At the request of the prosecutor, Ross had prepared a more detailed list of damages, including Bhrigu's entire salary, half of Ames's, six months' salary for a technician to help Ames get back up to speed, and a quarter of the lab's reagents. The court arrived at a possible figure of $72,000, with the final amount to be decided upon at a restitution hearing in September.
  • Ross, though, is happy that the ordeal is largely over. For the month-and-a-half of the investigation, she became reluctant to take on new students or to hire personnel. She says she considered packing up her research programme. She even questioned her own sanity, worrying that she was the one sabotaging Ames's work via "an alternate personality". Ross now wonders if she was too trusting, and urges other lab heads to "realize that the whole spectrum of humanity is in your lab. So, when someone complains to you, take it seriously."
  • She also urges others to speak up when wrongdoing is discovered. After Bhrigu pleaded guilty in June, Ross called Trempe at the University of Toledo. He was shocked, of course, and for more than one reason. His department at Toledo had actually re-hired Bhrigu. Bhrigu says that he lied about the reason he left Michigan, blaming it on disagreements with Ross. Toledo let Bhrigu go in July, not long after Ross's call.
  • Now that Bhrigu is in India, there is little to prevent him from getting back into science. And even if he were in the United States, there wouldn't be much to stop him. The US Office of Research Integrity, part of the Department of Health and Human Services, will sometimes bar an individual from receiving federal research funds for a time if they are found guilty of misconduct. But Bhrigu probably won't face that prospect because his actions don't fit the federal definition of misconduct, a situation Ross finds strange. "All scientists will tell you that it's scientific misconduct because it's tampering with data," she says.
  • Ames says that the experience shook her trust in her chosen profession. "I did have doubts about continuing with science. It hurt my idea of science as a community that works together, builds upon each other's work and collaborates."
    Research integrity: Sabotage! Postdoc Vipul Bhrigu destroyed the experiments of a colleague in order to get ahead.
Weiye Loh

7 Essential Skills You Didn't Learn in College | Magazine - 0 views

shared by Weiye Loh on 15 Oct 10
  • Statistical Literacy Why take this course? We are misled by numbers and by our misunderstanding of probability.
  • Our world is shaped by widespread statistical illiteracy. We fear things that probably won’t kill us (terrorist attacks) and ignore things that probably will (texting while driving). We buy lottery tickets. We fall prey to misleading gut instincts, which lead to biases like loss aversion—an inability to gauge risk against potential gain. The effects play out in the grocery store, the office, and the voting booth (not to mention the bedroom: People who are more risk-averse are less successful in love).
  • We are now 53 percent more likely than our parents to trust polls of dubious merit. (That figure is totally made up. See?) Where do all these numbers that we remember so easily and cite so readily come from? How are they calculated, and by whom? How do we misuse them to make them say what we want them to? We’ll explore all of these questions in a sequence on sourcing statistics.
  • ...9 more annotations...
  • Probabilistic intuition: We’ll learn to judge what’s likely and unlikely—and what’s impossible to know. We’ll learn about distorting habits of mind like selection bias—and how to guard against them. We’ll gamble. We’ll read The Art of Probability for Scientists and Engineers by Richard Hamming, Expert Political Judgment by Philip Tetlock, and How to Cheat Your Friends at Poker by Penn Jillette and Mickey Lynn. (A toy probability simulation follows this item.)
  • Post-State Diplomacy. Why take this course? As the world becomes ever more atomized, understanding the new leaders and constituencies becomes increasingly important.
  • From tribal insurgents to multinational corporations, private charities to pirate gangs, religious movements to armies for hire, a range of organizations now compete with (and sometimes eclipse) the nation-states in which they reside. Without capitals or traditional constituencies, they can’t be persuaded or deterred by traditional tactics.
  • But that doesn’t mean diplomacy is dead; quite the opposite. Negotiating with these parties requires the same skills as dealing with belligerent nations—understanding the stakeholders and alliances they must answer to, the cultures that inform how they behave, and the religious, economic, and political interests they must address.
  • Power has always depended on who can provide justice, commerce, and stability.
  • Remix Culture. Why take this course? Modern artists don’t start with a blank page or empty canvas. They start with preexisting works. What you’ll learn: How to analyze—and create—artworks made out of other artworks.
  • We’ll explore the philosophical roots of remix culture and study seminal works like Robert Rauschenberg’s Monogram and Jorge Luis Borges’ Pierre Menard, Author of Don Quixote. And we’ll examine modern-day exemplars from DJ Shadow’s Endtroducing to Auto-Tune the News.
  • Applied Cognition. Why take this course? You have to know the brain to train the brain. What you’ll learn: How the mind works and how you can make it work for you.
  • Writing for New Forms. Why take this course? You can write a cogent essay, but can you write it in 140 characters or less? What you’ll learn: How to adapt your message to multiple formats and audiences—human and machine.
  •  
    7 Essential Skills You Didn't Learn in College
Weiye Loh
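Not part of the Wired piece, but to make the statistical-literacy themes above concrete, here is a minimal Python sketch of two classic exercises: the expected value of a lottery ticket, and the distortion introduced by selection bias. All odds and numbers are invented for illustration.

```python
import random

# Expected value of a hypothetical $2 lottery ticket with a
# 1-in-10,000,000 chance of a $1,000,000 jackpot (made-up odds).
p_win, prize, price = 1 / 10_000_000, 1_000_000, 2
expected_value = p_win * prize - price
print(f"Expected value per ticket: ${expected_value:.2f}")  # about -$1.90

# Selection bias: estimate average commute time, but survey only
# people with long commutes (e.g., those still waiting at the stop).
random.seed(0)
population = [random.gauss(30, 10) for _ in range(100_000)]  # true mean ~30 min
surveyed = [t for t in population if t > 35]                 # biased sample
print(f"True mean:   {sum(population) / len(population):.1f} min")
print(f"Biased mean: {sum(surveyed) / len(surveyed):.1f} min")  # much higher
```

The second half makes the course description's point: the biased sample looks perfectly respectable on its own, and only comparison with the full population reveals the distortion.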

Twitter, Facebook Won't Make You Immoral - But TV News Might | Wired Science | Wired.com - 1 views

  • It’s too soon to say that Twitter and Facebook destroy the mental foundations of morality, but not too soon to ask what they’re doing.
  • In the paper, published Monday in the Proceedings of the National Academy of Sciences, 13 people were shown documentary-style multimedia narratives designed to arouse empathy. Researchers recorded their brain activity and found that empathy is as deeply rooted in the human psyche as fear and anger.
  • They also noticed that empathic brain systems took an average of six to eight seconds to start up. The researchers didn’t connect this to media consumption habits, but the study’s press release fueled speculation that the Facebook generation could turn into sociopaths.
  • ...6 more annotations...
  • Entitled "Can Twitter Make You Amoral? Rapid-fire Media May Confuse Your Moral Compass," it claimed that the research "raises questions about the emotional cost —particularly for the developing brain — of heavy reliance on a rapid stream of news snippets obtained through television, online feeds or social networks such as Twitter."
  • Compared to in-depth news coverage, first-person Tweets of on-the-ground events, such as the 2008 Mumbai bombings, are generally unmoving. But in those situations, Twitter’s primary use is in gathering useful, immediate facts, not storytelling.
  • Most people who read a handful of words about a friend’s heartache, or see a link to a tragic story, would likely follow it up. But when they follow links to a video news story, the possibility of a short-circuited neurobiology of compassion becomes more real. Research suggests that people are far more empathic when stories are told in a linear way, without quick shot-to-shot edits. In a 1996 Empirical Studies of the Arts paper, researchers showed three versions of an ostensibly tear-jerking story to 120 test subjects. "Subjects had significantly more favorable impressions of the victimized female protagonist than of her male opponent only when the story structure was linear," they concluded.
  • A review of tabloid news formats in the Journal of Broadcasting & Electronic Media found that jarring, rapid-fire visual storytelling produced physiological arousal that led to better recall of what was seen, but only if the original subject matter was dull. If it was already arousing, tabloid storytelling appeared to produce a cognitive overload that actually prevented stories from sinking in.
  • "Quick cuts will draw and retain a viewer’s focus even if the content is uninteresting," said freelance video producer Jill Bauerle. "MTV-like jump cuts, which have become the standard for many editors, serve as a sort of eye candy to keep eyeballs peeled to screen."
  • If compassion can only be activated by sustained attention, which is prevented by fast-cut editing, then the ability to be genuinely moved by another’s story could atrophy. It might even fail to properly develop in children, whose brains are being formed in ways that will last a lifetime. More research is clearly needed, including a replication of the original empathy findings, but the hypothesis is plausible.
  •  
    Twitter, Facebook Won't Make You Immoral - But TV News Might
Weiye Loh

James Lovelock is an example to every scientist « Prospect Magazine - 0 views

  • Lovelock, creator of the controversial Gaia hypothesis, is certainly still capable of original thinking, and it was his verdict on recent environmental controversies—such as the leaked emails from the University of East Anglia—that his audience wanted to hear.
  • The Gaia hypothesis, which made Lovelock the darling of the emerging green movement of the 1960s, proposes that life on earth is closely coupled with the surface, ocean and atmosphere. Each element co-operates to keep conditions relatively constant—at least in the absence of exceptional external forces. Initially, he was ridiculed: the idea that inanimate objects such as rocks are active participants in a super-organism was understandably controversial, and rejected by most scientists. But the idea that life exerts a strong influence on the environment has come to be widely accepted.
  • Lovelock has tended to be at the apocalyptic end of the climate change spectrum. He predicted in 2006 that average temperatures would rise by 8°C in temperate regions by the end of the 21st century, leading to billions of deaths and leaving only the polar regions habitable.
  • ...4 more annotations...
  • He recently and rightly slated Ed Miliband, secretary of state for energy and climate change, for his ludicrous assertion that “opposition to wind farms should be as unacceptable as failing to wear a seatbelt,” describing this as political or environmental correctness veering towards fascism.
  • He has also retreated considerably from his extreme position of 2006, just as he earlier disassociated himself from some of the weirder extensions of the Gaia hypothesis, like the idea that the Earth as a whole is part of a universal consciousness.
  • Lovelock’s central point was that climate change models are not yet fit to make predictions even 40 years ahead. His position that continued release of carbon into the atmosphere constitutes a grave threat was unaltered, but he seemed to concede that the changes might not be as severe or rapid as he had earlier predicted.
  • It could be argued that Lovelock was over-hasty with his predictions of near extinction, but instead we should take heart that, almost half a century after developing his original hypothesis, he is still willing and able to modify his views on the basis of evidence. It is a good example not just for many younger scientists, but for everyone. On another level, it will perhaps raise hopes that some form of climate consensus can emerge out of the recent controversies. What we desperately need now is a more balanced and sustainable long-term energy strategy.
  •  
    James Lovelock is an example to every scientist
Weiye Loh

LRB · Jim Holt · Smarter, Happier, More Productive - 0 views

  • There are two ways that computers might add to our wellbeing. First, they could do so indirectly, by increasing our ability to produce other goods and services. In this they have proved something of a disappointment. In the early 1970s, American businesses began to invest heavily in computer hardware and software, but for decades this enormous investment seemed to pay no dividends. As the economist Robert Solow put it in 1987, ‘You can see the computer age everywhere but in the productivity statistics.’ Perhaps too much time was wasted in training employees to use computers; perhaps the sorts of activity that computers make more efficient, like word processing, don’t really add all that much to productivity; perhaps information becomes less valuable when it’s more widely available. Whatever the case, it wasn’t until the late 1990s that some of the productivity gains promised by the computer-driven ‘new economy’ began to show up – in the United States, at any rate. So far, Europe appears to have missed out on them.
  • The other way computers could benefit us is more direct. They might make us smarter, or even happier. They promise to bring us such primary goods as pleasure, friendship, sex and knowledge. If some lotus-eating visionaries are to be believed, computers may even have a spiritual dimension: as they grow ever more powerful, they have the potential to become our ‘mind children’. At some point – the ‘singularity’ – in the not-so-distant future, we humans will merge with these silicon creatures, thereby transcending our biology and achieving immortality. It is all of this that Woody Allen is missing out on.
  • But there are also sceptics who maintain that computers are having the opposite effect on us: they are making us less happy, and perhaps even stupider. Among the first to raise this possibility was the American literary critic Sven Birkerts. In his book The Gutenberg Elegies (1994), Birkerts argued that the computer and other electronic media were destroying our capacity for ‘deep reading’. His writing students, thanks to their digital devices, had become mere skimmers and scanners and scrollers. They couldn’t lose themselves in a novel the way he could. This didn’t bode well, Birkerts thought, for the future of literary culture.
  • ...6 more annotations...
  • Suppose we found that computers are diminishing our capacity for certain pleasures, or making us worse off in other ways. Why couldn’t we simply spend less time in front of the screen and more time doing the things we used to do before computers came along – like burying our noses in novels? Well, it may be that computers are affecting us in a more insidious fashion than we realise. They may be reshaping our brains – and not for the better. That was the drift of ‘Is Google Making Us Stupid?’, a 2008 cover story by Nicholas Carr in the Atlantic.
  • Carr thinks that he was himself an unwitting victim of the computer’s mind-altering powers. Now in his early fifties, he describes his life as a ‘two-act play’, ‘Analogue Youth’ followed by ‘Digital Adulthood’. In 1986, five years out of college, he dismayed his wife by spending nearly all their savings on an early version of the Apple Mac. Soon afterwards, he says, he lost the ability to edit or revise on paper. Around 1990, he acquired a modem and an AOL subscription, which entitled him to spend five hours a week online sending email, visiting ‘chat rooms’ and reading old newspaper articles. It was around this time that the programmer Tim Berners-Lee wrote the code for the World Wide Web, which, in due course, Carr would be restlessly exploring with the aid of his new Netscape browser.
  • Carr launches into a brief history of brain science, which culminates in a discussion of ‘neuroplasticity’: the idea that experience affects the structure of the brain. Scientific orthodoxy used to hold that the adult brain was fixed and immutable: experience could alter the strengths of the connections among its neurons, it was believed, but not its overall architecture. By the late 1960s, however, striking evidence of brain plasticity began to emerge. In one series of experiments, researchers cut nerves in the hands of monkeys, and then, using microelectrode probes, observed that the monkeys’ brains reorganised themselves to compensate for the peripheral damage. Later, tests on people who had lost an arm or a leg revealed something similar: the brain areas that used to receive sensory input from the lost limbs seemed to get taken over by circuits that register sensations from other parts of the body (which may account for the ‘phantom limb’ phenomenon). Signs of brain plasticity have been observed in healthy people, too. Violinists, for instance, tend to have larger cortical areas devoted to processing signals from their fingering hands than do non-violinists. And brain scans of London cab drivers taken in the 1990s revealed that they had larger than normal posterior hippocampuses – a part of the brain that stores spatial representations – and that the increase in size was proportional to the number of years they had been in the job.
  • The brain’s ability to change its own structure, as Carr sees it, is nothing less than ‘a loophole for free thought and free will’. But, he hastens to add, ‘bad habits can be ingrained in our neurons as easily as good ones.’ Indeed, neuroplasticity has been invoked to explain depression, tinnitus, pornography addiction and masochistic self-mutilation (this last is supposedly a result of pain pathways getting rewired to the brain’s pleasure centres). Once new neural circuits become established in our brains, they demand to be fed, and they can hijack brain areas devoted to valuable mental skills. Thus, Carr writes: ‘The possibility of intellectual decay is inherent in the malleability of our brains.’ And the internet ‘delivers precisely the kind of sensory and cognitive stimuli – repetitive, intensive, interactive, addictive – that have been shown to result in strong and rapid alterations in brain circuits and functions’. He quotes the brain scientist Michael Merzenich, a pioneer of neuroplasticity and the man behind the monkey experiments in the 1960s, to the effect that the brain can be ‘massively remodelled’ by exposure to the internet and online tools like Google. ‘THEIR HEAVY USE HAS NEUROLOGICAL CONSEQUENCES,’ Merzenich warns in caps – in a blog post, no less.
  • It’s not that the web is making us less intelligent; if anything, the evidence suggests it sharpens more cognitive skills than it dulls. It’s not that the web is making us less happy, although there are certainly those who, like Carr, feel enslaved by its rhythms and cheated by the quality of its pleasures. It’s that the web may be an enemy of creativity. Which is why Woody Allen might be wise in avoiding it altogether.
  • Empirical support for Carr’s conclusion is both slim and equivocal. To begin with, there is evidence that web surfing can increase the capacity of working memory. And while some studies have indeed shown that ‘hypertexts’ impede retention – in a 2001 Canadian study, for instance, people who read a version of Elizabeth Bowen’s story ‘The Demon Lover’ festooned with clickable links took longer and reported more confusion about the plot than did those who read it in an old-fashioned ‘linear’ text – others have failed to substantiate this claim. No study has shown that internet use degrades the ability to learn from a book, though that doesn’t stop people feeling that this is so – one medical blogger quoted by Carr laments, ‘I can’t read War and Peace any more.’
Weiye Loh

Digital's Great Teenage Misunderstanding | ClickZ - 0 views

  • To quote, "most noteworthy was the shift in e-mail usage, particularly among young people. Total Web-based e-mail use was down eight percent last year, led by a walloping 59 percent drop among 12 to 17 year olds." I must reemphasize, the data is only for Web-based e-mail usage (think Hotmail, Yahoo Mail, Gmail, etc.) and that's an important distinction. A decline is a decline, but this certainly doesn't fully cover how e-mail is consumed in today's digital world.
  • Mark Zuckerberg offered this at the Facebook Messaging announcement: "High school kids don't use e-mail, they use SMS a lot. People want lighter weight things like SMS and IM to message each other."
  • There are two significant issues that must be added to the conversation, though. First, mobile's impact: The typical smartphone user spends almost half of her time on e-mail. This makes comScore's metrics marginal, since it evaluated only Web-based e-mail usage. As e-Dialog CEO John Rizzi thoughtfully points out in a recent blog post: "In the 18-24 age group, unique visits increased 9%, while time spent decreased 10%. To me this points to the increasing use of mobile to triage inboxes on the go, and the desktop inbox being used to access specific e-mails and perform tasks like getting a code for a sale, or composing an e-mail reply that would be too onerous on a mobile phone. In fact, comScore found that 30% of respondents are viewing e-mail on their mobile phone, a 36% increase from 2009, and those using mobile e-mail daily increased 40% on average." Pew Internet recently evaluated how Internet users of different age groups spent their time online. Guess what? Fully 90 to 100 percent of Millennials (ages 18-33) used e-mail. As you can see in the chart, below, e-mail was the top activity across all age groups.*
  • ...1 more annotation...
  • Second, teenagers become adults: I may not win any scientific breakthrough awards for this statement, but people are missing the boat on this piece of the puzzle. What happens when a teenager becomes an adult in the workplace? Not only do they dress, speak, and act differently - they use different approaches to communicate, too. The first thing a new employee typically gets is…an e-mail address. And guess what? They use it, even if they have been reliant on social, IM, and texting as their primary communication channels. They will correspond for work via e-mail and opt in to e-mails from their favorite brands (including brands that they certainly did not like as a teenager). They likely will also "Like" their favorite companies on Facebook, follow them on Twitter, and opt in to SMS offers as well. They will also expect different value and information in each of these channels.
Weiye Loh

Want your opinions distorted and misrepresented? Write in to The Straits Time... - 0 views

  • Letter sent by my good friend Samuel C. Wee to ST on the 8th of March, quoting statistics from their Page One infographic: (Read this closely!) I read with keen interest the news that social mobility in Singapore’s education system is still alive and well (“School system still ‘best way to move up’”; Monday). It is indeed heartwarming to learn that only 90% of children from one-to-three-room flats do not make it to university. I firmly agree with our Education Minister Dr Ng Eng Hen, who declared that “education remains the great social leveller in Singaporean society”. His statement is backed up with the statistic that 50% of children from the bottom third of the socio-economic ladder score in the bottom third of the Primary School Leaving Examination. In recent years, there has been much debate about elitism and the impact that a family’s financial background has on a child’s educational prospects. Therefore, it was greatly reassuring to read about Dr Ng’s great faith in our “unique, meritocratic Singapore system”, which ensures that good, able students from the middle-and-high income groups are not circumscribed or restricted in any way in the name of helping financially disadvantaged students. I would like to commend Ms Rachel Chang on her outstanding article. On behalf of the financially disadvantaged students of Singapore, I thank the fine journalists of the Straits Times for their tireless work in bringing to Singaporeans accurate and objective reporting.
  • What was actually published last Friday, March 18th 2011 A reassuring experience of meritocratic system I READ with keen interest the news that social mobility in Singapore’s education system is still alive and well (‘School system still ‘best way to move up”; March 8). It is indeed heartwarming to learn that almost 50 per cent of children from one- to three-room flats make it to university and polytechnics. I firmly agree with Education Minister Ng Eng Hen, who said that education remains the great social leveller in Singapore society. His statement is backed by the statistic that about 50 per cent of children from the bottom third of the socio-economic bracket score within the top two-thirds of their Primary School Leaving Examination cohort. There has been much debate about elitism and the impact that a family’s financial background has on a child’s educational prospects. Therefore, it was reassuring to read about Dr Ng’s own experience of the ‘unique, meritocratic Singapore system’: he grew up in a three-room flat with five other siblings, and his medical studies at the National University of Singapore were heavily subsidised; later, he trained as a cancer surgeon in the United States using a government scholarship. The system also ensures that good, able students from the middle- and high-income groups are not circumscribed or restricted in any way in the name of helping financially disadvantaged students.
  • To give me the byline would be an outrageous flattery and a gross injustice to the forum editors of ST, who took the liberty of taking my observations about the statistics and subtly replacing them with more politically correct (but significantly and essentially different) statistics.
  • ...3 more annotations...
  • Of course, ST reserves the right to edit my letter for clarity and length. When the statistics in question were taken directly from their original article, though, one has to wonder if there hasn’t been a breakdown in communication over there. I’m dreadfully sorry, forum editors, I should have double-checked my original source (your journalist Ms Rachel Chang) before sending my letter. (A short arithmetic sketch after this item shows how the same numbers can be made to support both framings.)
  • Take a look at how my pride in our meritocratic system in my original letter has been transfigured into awe at Dr Ng’s background, for example! Dear friends, when an editor takes the time and effort to not just paraphrase but completely and utterly transform your piece in both intent and meaning, then what can we say but bravo.
  • There are surely no lazy slackers over at the Straits Times; instead we evidently have men and women who dedicate time and effort to correcting their misguided readers, and to protecting them from the shame of having their real opinions published.
Weiye Loh
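To make the letter-editing episode concrete, here is a small Python sketch (mine, not the letter writer's) showing how the statistics quoted in the two versions relate. The 40% polytechnic share is an invented figure used only to reconcile the two published numbers.

```python
# Hypothetical cohort of 1,000 children from one- to three-room flats.
cohort = 1000

# Original letter: "only 90% ... do not make it to university."
university = cohort - int(0.90 * cohort)              # 100 children, i.e. 10%
print(f"Make it to university: {university / cohort:.0%}")

# Published version: "almost 50 per cent ... make it to university and
# polytechnics" -- the same 10%, plus an assumed 40% entering polytechnics.
polytechnic = int(0.40 * cohort)
print(f"University or polytechnic: {(university + polytechnic) / cohort:.0%}")

# PSLE framing: "50% score in the bottom third" and "50% score within the
# top two-thirds" describe the SAME distribution from opposite ends.
bottom_third = cohort // 2
print(f"Bottom third: {bottom_third / cohort:.0%}; "
      f"top two-thirds: {(cohort - bottom_third) / cohort:.0%}")
```

Each pair of figures is arithmetically consistent; only the rhetorical direction changes, which is precisely the letter writer's complaint.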

A Data State of Mind | Think Quarterly - 0 views

  • Rosling has maintained a fact-based worldview – an understanding of how global health trends act as a signifier for economic development based on hard data. Today, he argues, countries and corporations alike need to adopt that same data-driven understanding of the world if they are to make sense of the changes we are experiencing in this new century, and the opportunities and challenges that lie ahead.
  • The world has changed so much that what people need isn’t more data but a new mindset. They need a new storage system that can handle this new information. But what I have found over the years is that the CEOs of the biggest companies are actually those that already have the most fact-based worldview, more so than in media, academia or politics. Those CEOs that haven’t grasped the reality of the world have already failed in business. If they don’t understand what is happening in terms of potential new markets in the Middle East, Africa and so on, they are out. So the bigger and more international the organisation, the more fact-based the CEO’s worldview is likely to be. The problem is that they are slow in getting their organisation to follow.
  • Companies as a whole are stuck in the rut of an old mindset. They think in outworn categories and follow habits and assumptions that are not, or only rarely, based on fact.
  • ...10 more annotations...
  • For instance, in terms of education levels, we no longer live in a world that is divided into the West and the rest; our world today stretches from Canada to Yemen with all the other countries somewhere in between. There’s a broad spectrum of levels
  • Even when people act within a fact-based worldview, they are used to talking with sterile figures. They are used to standing on a podium, clicking through slide shows in PowerPoint rather than interacting with their presentation. The problem is that companies have a strict separation between their IT department, where datasets are produced, and the design department, so hardly any presenters are proficient in both. Yet this is what we need. Getting people used to talking with animated data is, to my mind, a literacy project. (A toy animation sketch follows this item.)
  • What’s important today is not just financial data but child mortality rates, the number of children per women, education levels, etc. In the world today, it’s not money that drags people into modern times, it’s people that drag money into modern times.
  • I can demonstrate human resources successes in Asia through health being improved, family size decreasing and then education levels increasing. That makes sense: when more children survive, parents accept that there is less need for multiple births, and they can afford to put their children through school. So Pfizer have moved their research and development of drugs to Asia, where there are brilliant young people who are amazing at developing drugs. It’s realising this kind of change that’s important.
  • The problem isn’t that specialised companies lack the data they need, it’s that they don’t go and look for it, they don’t understand how to handle it.
  • What is so strong with animation is that it provides that mindset shift in market segmentation. We can see where there are highly developed countries with a good economy and a healthy and well-educated staff.
  • At the moment, I’m quarrelling with Sweden’s Minister of Foreign Affairs. He says that the West has to make sure its lead over the rest of the world doesn’t erode. This is a completely wrong attitude. Western Europe and other high-income countries have to integrate themselves into the world in the same way big companies are doing. They have to look at the advantages, resources and markets that exist in different places around the world.
  • And some organisations aren’t willing to share their data, even though it would be a win-win situation for everybody and we would do much better in tackling the problems we need to tackle. Last April, the World Bank caved in and finally embraced an open data policy, but the OECD uses tax money to compile data and then sells it in a monopolistic way. The Chinese Statistical Bureau provides data more easily than the OECD. The richest countries in the world don’t have the vision to change.
  • ‘database hugging disorder’
  • We have to instil a clear division of labour between those who provide the datasets – like the World Bank, the World Health Organisation or companies themselves – those who provide new technologies to access or process them, like Google or Microsoft, and those who ‘play’ with them and give data meaning. It’s like a great concert: you need a Mozart or a Chopin to write wonderful music, then you need the instruments and finally the musicians.
Weiye Loh
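As a rough illustration of what "talking with animated data" can mean in practice, here is a minimal matplotlib sketch, not Rosling's Gapminder software, that moves two countries through a fertility-versus-life-expectancy chart over time. All data points are invented.

```python
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

# Invented (fertility rate, life expectancy) per decade for two countries.
years = [1960, 1980, 2000, 2020]
country_a = [(6.0, 45), (4.5, 55), (2.5, 68), (1.8, 75)]  # hypothetical
country_b = [(3.0, 68), (2.2, 73), (1.9, 78), (1.7, 82)]  # hypothetical

fig, ax = plt.subplots()
ax.set_xlim(0, 7)
ax.set_ylim(40, 90)
ax.set_xlabel("Children per woman")
ax.set_ylabel("Life expectancy (years)")
dot_a, = ax.plot([], [], "ro", label="Country A")
dot_b, = ax.plot([], [], "bo", label="Country B")
ax.legend()

def update(i):
    # Move each country's dot to its position in decade i.
    dot_a.set_data([country_a[i][0]], [country_a[i][1]])
    dot_b.set_data([country_b[i][0]], [country_b[i][1]])
    ax.set_title(f"Year {years[i]}")
    return dot_a, dot_b

anim = FuncAnimation(fig, update, frames=len(years), interval=800)
plt.show()
```

Watching the dots converge toward the low-fertility, high-longevity corner makes Rosling's point about a continuous spectrum of development far more vividly than a static table would.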

Geeks at the Beach: 10 Summer Reads About Technology and Your Life - Technology - The C... - 0 views

  • We're so excited about checking e-mail and Facebook that we're neglecting face-to-face relationships, but it's not too late to make some "corrections" to our high-tech habits. It's time to turn off the BlackBerry for a few minutes and set some ground rules for blending cyberspace with personal space.
  • The book cites examples such as Wikipedia and a ride-sharing Web site as proof that "the harnessing of our cognitive surplus allows people to behave in increasingly generous, public, and social ways."
  • It celebrates the transformative potential of the Internet, as more people use their free time in active, collaborative projects rather than watching television.
  • ...5 more annotations...
  • Mr. Vaidhyanathan, a professor of media studies and law at the University of Virginia and frequent contributor to The Chronicle Review, reminds readers that they aren't consumers of Google's offerings. Rather, their use of Google's services is the product it sells to advertisers. Both books look at the continuing evolution of the Google Books settlement as a key test of how far the company's reach could extend and a sign of how the perception of Google has changed from that of scrappy upstart with a clever motto, "Don't be evil," to global behemoth accused by some of being just that.
  • Is the Internet on its way to getting monopolized? That question underlies Tim Wu's The Master Switch. The eccentric Columbia Law School professor—he's known to dress up as a blue bear at the annual Burning Man festival—recounts how ruthless companies consolidated their power over earlier information industries like the telephone, radio, and film. So which tech giant seems likely to grab control of the net?
  • It feels like we're perpetually on the verge of a tipping point, when e-books will overtake print books as a source of revenue for publishers. John B. Thompson, a sociologist at the University of Cambridge, analyzes the inner workings of the contemporary trade-publishing industry. (He did the same for scholarly publishing in an earlier work, Books in the Digital Age.) Mr. Thompson examines the roles played by agents, editors, and authors as well as differences among small, medium, and large publishing operations, and he probes under the surface of the great digital shift. We're too hung up on the form of the book, he argues: "A revolution has taken place in publishing, but it is a revolution in the process rather than a revolution in the product."
  • Technology is actually doing far more to bolster authoritarian regimes than to overturn them, writes Evgeny Morozov in this sharp reality check on the media-fueled notion that information is making everybody free. Mr. Morozov, a visiting scholar at Stanford University, points out that the Iranian government posted "most wanted" pictures of protesters on the Web, leading to several arrests. The Muslim Brotherhood blogs actively in Egypt. And China pays people a few cents per endorsement to make pro-authority statements on the Internet. The Twitter revolution, in this book, is "overblown and completely unsubstantiated rhetoric."
  • The Internet is rewiring our brains and short-circuiting our ability to think. And that has big consequences for teaching, he told The Chronicle last year: "The assumption that the more media, the more messaging, the more social networking you can bring in will lead to better educational outcomes is not only dubious but in many cases is probably just wrong."
Weiye Loh

Some groups having a cow over Marge Simpson's Playboy debut | Breaking Midstate News wi... - 0 views

  • The issue of Playboy magazine that will start hitting newsstands today bears an image of a semi-nude Marge Simpson, Bart's mom.
  • No matter that Marge Simpson is neither really nude nor really, well, REAL, some people are not happy to see her on the racy magazine's cover. Yesterday, the conservative American Family Association, or AFA, called on 7-Eleven stores to reconsider their decision to sell the issue in their stores. A 7-Eleven spokeswoman said company-owned stores do not typically carry Playboy, but will be able to order this one issue as a "nice collectible." Some franchise 7-Eleven stores do carry Playboy on a regular basis.
  • “It’s irresponsible of 7-Eleven to display porn in front of boys who pop into 7-11s for a hot dog or a Slurpee,” Randy Sharp, AFA special projects director, said in a prepared statement Thursday.
  • ...1 more annotation...
  • “The cover ... can easily lead them into an addictive porn habit,” he said.
    • Weiye Loh
       
      This argument has long been debunked by the Japanese. -_-"
  •  
    Marge Simpson's debut on Playboy. Nuff said. lol
Weiye Loh

Singapore does not have Third World Living Standards | the kent ridge common - 0 views

  • I apologise for this long-overdue article, which highlights the erroneous insinuations in my fellow KRC writer’s post, UBS: Singapore has Third World Living Standards.
  • The Satay Club post’s title was “UBS: Singapore has Russian Standard of Living”. The original UBS report was even less suggestive, and in fact hardly made any value judgment at all; it just presented a whole list of statistics, according to whichever esoteric mathematical calculation it used.
  • As my JC economics teacher quipped, “If you abuse the statistics long enough, it will confess.” On one hand, UBS has not suggested that Singapore has third world living standards. On the other hand, I think it is justified to question how my KRC writer has managed to conclude from these statistics that Singapore has “Third World Living Standards”.
  • ...2 more annotations...
  • The terminology of “Third World” and “First World” is also problematic. The more “politically correct” terms used now are “developing” and “developed”. Whatever the charge, whatever your choice of terminology, Moscow and Tallinn are hardly “Third World” or “developing”. I have never been there myself, and unfortunately have no personal account to give, but a brief look at the countries listed below Singapore in the Wage Levels index – Beijing, Shanghai, Santiago de Chile, Buenos Aires, Delhi, even Mexico City – would make me cautious about abstracting from these statistics any indication at all about “living standards”.
  • The living “habits” and rhythms of life in all these various cities are as heterogeneous as these statistics are homogenizing, by placing them all on the same scale of measurement. This is not to say that we cannot have fruitful comparisons across societies – but these statistics are not sufficient for such a venture. At the very least, UBS’ mathematical methodology requires a deeper analysis than was provided in the previous KRC article. The burden of proof here is really on my fellow KRC writer to show that Singapore has Third World living standards, and the analysis that has been offered needs more work.
Weiye Loh

Art and Attribution: Who is an "Artist"? » Sociological Images - 0 views

  • An NPR short profiles artist Liu Bolin.  Bolin, we are told, “has a habit of painting himself” so as to disappear into his surroundings.  The idea is to illustrate the way in which humans are increasingly “merged” with their environment.
  • So how does he do it?  Well, it turns out that he doesn’t.  Instead, “assistants” spend hours painting him.  And someone else photographs him.  He just stands there.  Watch how the process is described in this one minute clip:

    So what makes an artist?

  • One might argue that it was Bolin who had the idea to illustrate the contemporary human condition in this way. That the “art” in this work is really in his inspiration, while the “work” in this art is what is being done by the assistants. Yet clearly there is “art” in their work, too, given that they are to be credited for creating the eerie illusions with paint. Yet it is Bolin who is named as the artist; his assistants aren’t named at all.  What is it about the art world — or our world more generally — that makes this asymmetrical attribution go unnoticed so much of the time?
  • ...3 more annotations...
  • Historically, it probably goes back to the master/apprentice, atelier setup of the Renaissance era and earlier. And then with the cult of the “genius” that surrounds artists nowadays, it’s no wonder that assistants would be invisible.
  • In my art history classes about Renaissance and other classical painting, we talked about how often the “artist” would be the master painter, but had a lot of help from one or more assistants when executing the painting. Every now and then one of those assistants/apprentices would be considered good enough to go off and be recognized as an artist on his own, but in general, those guys were pretty nameless despite sometimes decades of service.
  • It’s similar to the way that businesses and organizations have public faces – CEOs, etc. – and the efforts of everyone who works for them are often credited to the CEOs themselves, for better or for worse, whether they deserve the accolades or not. There’s some asymmetrical attribution for you!
Weiye Loh

Turning Privacy "Threats" Into Opportunities - Esther Dyson - Project Syndicate - 0 views

  • Most disclosure statements are not designed to be read; they are designed to be clicked on. But some companies actually want their customers to read and understand the statements. They don’t want customers who might sue, and, just in case, they want to be able to prove that the customers did understand the risks. So the leaders in disclosure statements right now tend to be financial and health-care companies – and also space-travel and extreme-sports vendors. They sincerely want to let their customers know what they are getting into, because a regretful customer is a vengeful one. That means making disclosure statements readable. I would suggest turning them into a quiz. The user would not simply click a single button, but would have to select the right button for each question. For example: What are my chances of dying in space? A) 5% B) 30% C) 1-4% (the correct answer, based on experience so far; current spacecraft are believed to be safer.) Now imagine: Who can see my data? A) I can. B) XYZ Corporation. C) XYZ Corporation’s marketing partners. (Click here to see the list.) D) XYZ Corporation’s affiliates and anyone it chooses. As the customer picks answers, she gets a good idea of what is going on. In fact, if you're a marketer, why not dispense with a single right answer and let the consumer specify what she wants to have happen with her data (and corresponding privileges/access rights if necessary)? That’s much more useful than vague policy statements. Suddenly, the disclosure statement becomes a consumer application that adds value to the vendor-consumer relationship. (A minimal code sketch of such a quiz follows this item.)
  • And show the data themselves rather than a description.
  • This is all very easy if you are the site with which the user communicates directly; it is more difficult if you are in the background, a third party collecting information surreptitiously. But that practice should be stopped, anyway.
  • ...4 more annotations...
  • Just as they have with Facebook, users will become more familiar with the idea of setting their own privacy preferences and managing their own data. Smart vendors will learn from Facebook; the rest will lose out to competitors. Visualizing the user's information and providing an intelligible interface is an opportunity for competitive advantage.
  • I see this happening already with a number of companies, including some with which I am involved. For example, in its research surveys, 23andMe asks people questions such as how often they have headaches or whether they have ever been exposed to pesticides, and lets them see (in percentages) how other 23andMe users answer the question. This kind of information is fascinating to most people. TripIt lets you compare and match your own travel plans with those of friends. Earndit lets you compete with others to exercise more and win points and prizes.
  • Consumers increasingly expect to be able to see themselves both as individuals and in context. They will feel more comfortable about sharing data if they feel confident that they know what is shared and what is not. The online world will feel like a well-lighted place with shops, newsstands, and the like, where you can see other people and they can see you. Right now, it more often feels like lurking in a spooky alley with a surveillance camera overlooking the scene.
  • Of course, there will be “useful” data that an individual might not want to share – say, how much alcohol they buy, which diseases they have, or certain of their online searches. They will know how to keep such information discreet, just as they might close the curtains to get undressed in their hotel room after enjoying the view from the balcony. Yes, living online takes a little more thought than living offline. But it is not quite as complex once Internet-based services provide the right tools – and once awareness and control of one’s own data become a habit.
  •  
    Companies see consumer data as something that they can use to target ads or offers, or perhaps that they can sell to third parties, but not as something that consumers themselves might want. Of course, this is not an entirely new idea, but most pundits on both sides - privacy advocates and marketers - don't realize that rather than protecting consumers or hiding from them, companies should be bringing them into the game. I believe that successful companies will turn personal data into an asset by giving it back to their customers in an enhanced form. I am not sure exactly how this will happen, but current players will either join this revolution or lose out.
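A minimal sketch of Dyson's quiz idea in Python. The data-privacy question and its options come from her example above; the program structure, scoring, and messages are my assumptions about how such a disclosure quiz might work.

```python
# Hypothetical quiz-style disclosure: the user must identify what the
# policy actually allows before the sign-up flow continues.
QUESTIONS = [
    {
        "prompt": "Who can see my data?",
        "options": ["Only I can",
                    "XYZ Corporation",
                    "XYZ Corporation's marketing partners",
                    "XYZ Corporation's affiliates and anyone it chooses"],
        "answer": 3,  # index of the option the (invented) policy allows
    },
]

def run_disclosure_quiz(questions):
    """Return True only if the user answers every question correctly."""
    for q in questions:
        print(q["prompt"])
        for i, option in enumerate(q["options"]):
            print(f"  {chr(ord('A') + i)}) {option}")
        correct = chr(ord('A') + q["answer"])
        if input("Your answer: ").strip().upper() != correct:
            print(f"Incorrect. The policy's answer is {correct}; please re-read the terms.")
            return False
    return True

if __name__ == "__main__":
    if run_disclosure_quiz(QUESTIONS):
        print("You understood the terms; proceeding with sign-up.")
```

As Dyson suggests, the same mechanism could be inverted so that, instead of one right answer, the customer's selection becomes her stated preference for how the data may be used.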