
New Media Ethics 2009 course / Group items tagged "Reliability"


Weiye Loh

Freakonomics » The Revolution Will Not Be Televised. But It Will Be Tweeted

  • information alone does not destabilize an oppressive regime. In fact, more information (and the control of that information) is a major source of political strength for any ruling party. The state-controlled media of North Korea is a current example of the power of propaganda, much as it was in the Soviet Union and in Nazi Germany, where the state heavily subsidized the diffusion of radios during the 1930s to help spread Nazi propaganda.
  • changes in technology do not by themselves weaken the state. While Twitter played a role in the Iranian protests in 2009, the medium was used effectively by the Iranian regime to spread rumors and disinformation. But, if information becomes not just more widespread but more reliable, the regime’s chances of survival are significantly diminished. In this sense, though social media like Twitter and Facebook appear to be a scattered mess, they are more reliable than state-controlled messages.
  • The model predicts that a given percentage increase in information reliability has exactly twice as large an effect on the regime’s chances as the same percentage increase in information quantity, so, overall, an information revolution that leads to roughly equal-sized percentage increases in both these characteristics will reduce a regime’s chances of surviving.
  • If the quantity of information available to citizens is sufficiently high, then the regime has a better chance of surviving. However, an increase in the reliability of information can reduce the regime's chances. These two effects are always in tension: a regime benefits from an increase in information quantity if and only if an increase in information reliability reduces its chances. The model allows for two kinds of information revolutions. In the first, associated with radio and mass newspapers under the totalitarian regimes of the early twentieth century, an increase in information quantity coincides with a shift towards media institutions more accommodative of the regime and, in this sense, a decrease in information reliability. In this case, both effects help the regime. In the second kind, associated with diffuse technologies like modern social media, an increase in information quantity coincides with a shift towards sources of information less accommodative of the regime and an increase in information reliability. This makes the quantity and reliability effects work against each other.
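The claimed 2:1 elasticity can be illustrated with a toy functional form. This is my own illustrative assumption, not the authors' actual model: suppose the regime's survival odds scale as information quantity q divided by the square of reliability r, so reliability's percentage effect is twice quantity's.

```python
# Toy illustration (assumed functional form, not the paper's model):
# survival odds ~ q / r**2, so a 1% rise in reliability r has roughly
# twice the (negative) effect of a 1% rise in quantity q.

def survival_odds(q, r):
    """Hypothetical regime-survival odds: increasing in information
    quantity q, decreasing in reliability r, with twice the elasticity."""
    return q / r**2

base = survival_odds(100.0, 1.0)

# 1% more information quantity alone helps the regime (~ +1%)
up_q = survival_odds(101.0, 1.0)

# 1% more reliability alone hurts it about twice as much (~ -2%)
up_r = survival_odds(100.0, 1.01)

# equal 1% increases in both: the reliability effect dominates
up_both = survival_odds(101.0, 1.01)

print(up_q / base)     # ~1.01
print(up_r / base)     # ~0.98
print(up_both / base)  # ~0.99, a net reduction in survival odds
```

Under this assumed form, an "information revolution" that raises quantity and reliability by equal percentages always nets out against the regime, matching the excerpt's claim.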
Weiye Loh

A Brief Primer on Criminal Statistics « Canada « Skeptic North

  • Occurrences of crime are properly expressed as the number of incidents per 100,000 people. Total numbers are not informative on their own, and it is very easy to manipulate an argument by cherry-picking between a total number and a rate. Beware of claims about crime that use raw incident counts. When a change in raw incident numbers is observed, it might have no bearing on crime levels at all, because levels of crime depend on population size.
  • Whole Numbers versus Rates
  • Reliability: Not every criminal statistic is equally reliable. Even though we have measures of incidents of crime across types and subtypes, not every one of these statistics samples the actual incidence of these crimes in the same way. Indeed, very few measure the total incidence very reliably at all. The crime rates that you are most likely to encounter capture only crimes known to and substantiated by police. These numbers are vulnerable to variances in how crimes become known and verified by police in the first place. Crimes very often go unreported or undiscovered. Some crimes are more likely to go unreported than others (such as sexual assaults and drug possession), and some crimes are more difficult to substantiate as having occurred than others.
  • Complicating matters further is the fact that these reporting patterns vary over time and are reflected in observed trends.   So, when a change in the police reported crime rate is observed from year to year or across a span of time we may be observing a “real” change, we may be observing a change in how these crimes come to the attention of police, or we may be seeing a mixture of both.
  • Generally, the most reliable criminal statistic is the homicide rate – it’s very difficult, though not impossible, to miss a dead body. In fact, homicides in Canada are counted in the year that they become known to police and not in the year that they occurred.  Our most reliable number is very, very close, but not infallible.
  • Crimes known to the police nearly always understate the true incidence of crime, so other measures are needed to better complete our understanding. The reported-crimes measure is reported every year to Statistics Canada as part of the Uniform Crime Reporting Survey. This is a very rich data set that measures police data very accurately but tells us nothing about unreported crime.
  • We do have some data on unreported crime available. Victims are interviewed (after self-identifying) via the General Social Survey. The survey is conducted every five years
  • This measure captures information in eight crime categories both reported, and not reported to police. It has its own set of interpretation problems and pathways to misuse. The survey relies on self-reporting, so the accuracy of the information will be open to errors due to faulty memories, willingness to report, recording errors etc.
  • From the last data set available, self-identified victims did not report 69% of violent victimizations (sexual assault, robbery and physical assault), 62% of household victimizations (break and enter, motor vehicle/parts theft, household property theft and vandalism), and 71% of personal property theft victimizations.
  • while people generally understand that crimes go unreported and unknown to police, they tend to be surprised and perhaps even shocked at how much actually goes unreported. These numbers sound scary. However, the most common reasons reported by victims of violent and household crime for not reporting were: believing the incident was not important enough (68%), believing the police couldn’t do anything about the incident (59%), and stating that the incident was dealt with in another way (42%).
  • Also, note that the survey indicated that 82% of violent incidents did not result in injuries to the victims. Do claims that we should do something about all this hidden crime make sense in light of what this crime looks like in the limited way we can understand it? How could you be reasonably certain that whatever intervention proposed would in fact reduce the actual amount of crime and not just reduce the amount that goes unreported?
  • Data is collected at all levels of the crime continuum with differing levels of accuracy and applicability. This is nicely reflected in the concept of “the crime funnel”. All criminal incidents ever committed are at the opening of the funnel. There is “loss” all along the way to the bottom, where only a small sample of incidents become known, with charges laid, prosecuted successfully, and responded to by the justice system. What goes into the top levels of the funnel affects what we can know at any other point later.
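The whole-numbers-versus-rates caveat above is simple arithmetic. With hypothetical figures, raw incident counts can rise while the rate per 100,000 falls, because the population grew faster than the incident count:

```python
# Hypothetical figures: raw incidents rise while the crime rate falls,
# because population growth outpaces the growth in incidents.

def rate_per_100k(incidents, population):
    return incidents / population * 100_000

# year:        (incidents, population)
year_1 = (5_000, 1_000_000)
year_2 = (5_200, 1_100_000)   # +200 incidents, but +100,000 people

r1 = rate_per_100k(*year_1)   # 500.0 per 100,000
r2 = rate_per_100k(*year_2)   # ~472.7 per 100,000

print(f"raw change:  {year_2[0] - year_1[0]:+d}")      # looks worse
print(f"rate change: {r2 - r1:+.1f} per 100k")         # crime actually fell
```

A headline citing only the raw +200 would tell the opposite story from the rate, which is exactly the cherry-picking the excerpt warns about.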
Weiye Loh

Skepticblog » The Decline Effect

  • The first group are those with an overly simplistic or naive sense of how science functions. This is a view of science similar to those films created in the 1950s and meant to be watched by students, with the jaunty music playing in the background. This view generally respects science, but has a significant underappreciation for the flaws and complexity of science as a human endeavor. Those with this view are easily scandalized by revelations of the messiness of science.
  • The second cluster is what I would call scientific skepticism – which combines a respect for science and empiricism as a method (really “the” method) for understanding the natural world, with a deep appreciation for all the myriad ways in which the endeavor of science can go wrong. Scientific skeptics, in fact, seek to formally understand the process of science as a human endeavor with all its flaws. It is therefore often skeptics pointing out phenomena such as publication bias, the placebo effect, the need for rigorous controls and blinding, and the many vagaries of statistical analysis. But at the end of the day, as complex and messy as the process of science is, a reliable picture of reality is slowly ground out.
  • The third group, often frustrating to scientific skeptics, are the science-deniers (for lack of a better term). They may take a postmodernist approach to science – science is just one narrative with no special relationship to the truth. Whatever you call it, what the science-deniers in essence do is describe all of the features of science that the skeptics do (sometimes annoyingly pretending that they are pointing these features out to skeptics) but then come to a different conclusion at the end – that science (essentially) does not work.
  • this third group – the science deniers – started out in the naive group, and then were so scandalized by the realization that science is a messy human endeavor that they leapt right to the nihilistic conclusion that science must therefore be bunk.
  • The article by Lehrer falls generally into this third category. He is discussing what has been called “the decline effect” – the fact that effect sizes in scientific studies tend to decrease over time, sometimes to nothing.
  • This term was first applied to the parapsychological literature, and was in fact proposed as a real phenomenon of ESP – that ESP effects literally decline over time. Skeptics have criticized this view as magical thinking and hopelessly naive – Occam’s razor favors the conclusion that it is the flawed measurement of ESP, not ESP itself, that is declining over time.
  • Lehrer, however, applies this idea to all of science, not just parapsychology. He writes: And this is why the decline effect is so troubling. Not because it reveals the human fallibility of science, in which data are tweaked and beliefs shape perceptions. (Such shortcomings aren’t surprising, at least for scientists.) And not because it reveals that many of our most exciting theories are fleeting fads and will soon be rejected. (That idea has been around since Thomas Kuhn.) The decline effect is troubling because it reminds us how difficult it is to prove anything. We like to pretend that our experiments define the truth for us. But that’s often not the case. Just because an idea is true doesn’t mean it can be proved. And just because an idea can be proved doesn’t mean it’s true. When the experiments are done, we still have to choose what to believe.
  • Lehrer is ultimately referring to aspects of science that skeptics have been pointing out for years (as a way of discerning science from pseudoscience), but Lehrer takes it to the nihilistic conclusion that it is difficult to prove anything, and that ultimately “we still have to choose what to believe.” Bollocks!
  • Lehrer is describing the cutting edge or the fringe of science, and then acting as if it applies all the way down to the core. I think the problem is that there is so much scientific knowledge that we take for granted – so much so that we forget it is knowledge derived from the scientific method, and at one point was not known.
  • It is telling that Lehrer uses as his primary examples of the decline effect studies from medicine, psychology, and ecology – areas where the signal to noise ratio is lowest in the sciences, because of the highly variable and complex human element. We don’t see as much of a decline effect in physics, for example, where phenomena are more objective and concrete.
  • If the truth itself does not “wear off”, as the headline of Lehrer’s article provocatively states, then what is responsible for this decline effect?
  • it is no surprise that effect sizes in preliminary studies tend to be positive. This can be explained on the basis of experimenter bias – scientists want to find positive results, and initial experiments are often flawed or less than rigorous. It takes time to figure out how to rigorously study a question, and so early studies will tend not to control for all the necessary variables. There is further publication bias, in which positive studies tend to be published more than negative studies.
  • Further, some preliminary research may be based upon chance observations – a false pattern based upon a quirky cluster of events. If these initial observations are used in the preliminary studies, then the statistical fluke will be carried forward. Later studies are then likely to exhibit a regression to the mean, or a return to more statistically likely results (which is exactly why you shouldn’t use initial data when replicating a result, but should use entirely fresh data – a mistake for which astrologers are infamous).
  • skeptics are frequently cautioning against new or preliminary scientific research. Don’t get excited by every new study touted in the lay press, or even by a university’s press release. Most new findings turn out to be wrong. In science, replication is king. Consensus and reliable conclusions are built upon multiple independent lines of evidence, replicated over time, all converging on one conclusion.
  • Lehrer does make some good points in his article, but they are points that skeptics are fond of making. In order to have a  mature and functional appreciation for the process and findings of science, it is necessary to understand how science works in the real world, as practiced by flawed scientists and scientific institutions. This is the skeptical message.
  • But at the same time reliable findings in science are possible, and happen frequently – when results can be replicated and when they fit into the expanding intricate weave of the picture of the natural world being generated by scientific investigation.
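The mundane explanation of the decline effect given above (publication bias plus regression to the mean) is easy to simulate. A sketch with purely illustrative parameters: early small studies are "published" only when they clear an excitement threshold, inflating the average reported effect, while later replications report everything and regress toward the true value.

```python
# Sketch of the decline effect: early studies are published only when
# "exciting", inflating the reported effect; later replications publish
# regardless and regress toward the true effect.
# All parameters below are illustrative assumptions.
import random
import statistics

random.seed(0)
TRUE_EFFECT = 0.2      # the real underlying effect size
NOISE = 0.5            # sampling noise of a small, sloppy study
THRESHOLD = 0.6        # early results only get published above this

def run_study():
    """One noisy measurement of the true effect."""
    return random.gauss(TRUE_EFFECT, NOISE)

# Early literature: keep only the "exciting" positive results.
early = [e for e in (run_study() for _ in range(2000)) if e > THRESHOLD]

# Later replications: everything gets reported.
later = [run_study() for _ in range(2000)]

print(statistics.mean(early))  # well above 0.2: inflated by selection
print(statistics.mean(later))  # close to 0.2: the apparent "decline"
```

Nothing about the truth "wears off" in this simulation; only the selection filter changes, which is the skeptical point the post is making.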
Weiye Loh

Measuring Social Media: Who Has Access to the Firehose?

  • The question that the audience member asked — and one that we tried to touch on a bit in the panel itself — was who has access to this raw data. Twitter doesn’t comment on who has full access to its firehose, but to Weil’s credit he was at least forthcoming with some of the names, including stalwarts like Microsoft, Google and Yahoo — plus a number of smaller companies.
  • In the case of Twitter, the company offers free access to its API for developers. The API can provide access and insight into information about tweets, replies and keyword searches, but as developers who work with Twitter — or any large-scale social network — know, that data isn’t always 100% reliable. Unreliable data is a problem when talking about measurements and analytics, where the data is helping to influence decisions related to social media marketing strategies and allocations of resources.
  • One of the companies that has access to Twitter’s data firehose is Gnip. As we discussed in November, Twitter has entered into a partnership with Gnip that allows the social data provider to resell access to the Twitter firehose.This is great on one level, because it means that businesses and services can access the data. The problem, as noted by panelist Raj Kadam, the CEO of Viralheat, is that Gnip’s access can be prohibitively expensive.
  • The problem of reliable access to analytics and measurement information is by no means limited to Twitter. Facebook data is also tightly controlled. With Facebook, privacy controls built into the API are designed to prevent mass data scraping. This is absolutely the right decision. However, a reality of social media measurement is that Facebook Insights isn’t always reachable, and the data collected from the tool is sometimes inaccurate. It’s no surprise there’s a disconnect between the data that marketers and community managers want and the data that can be reliably accessed. Twitter and Facebook were both designed as tools for consumers. It’s only been in the last two years that a platform ecosystem aimed at serving large brands and companies has begun to emerge.
  • The data that companies like Twitter, Facebook and Foursquare collect are some of their most valuable assets. It isn’t fair to expect a free ride or first-class access to the data by anyone who wants it. Having said that, more transparency about what data is available to services and brands is needed and necessary. We’re just scratching the surface of what social media monitoring, measurement and management tools can do. To get to the next level, it’s important that we all question who has access to the firehose.
  • We Need More Transparency for How to Access and Connect with Data
Weiye Loh

Skepticblog » Litigation gone wild! A geologist's take on the Italian seismol...

  • Apparently, an Italian lab technician named Giampaolo Giuliani made a prediction about a month before the quake, based on elevated levels of radon gas. However, seismologists have known for a long time that radon levels, like any other “magic bullet” precursor, are unreliable because no two quakes are alike, and no two quakes give the same precursors. Nevertheless, his prediction caused a furor before the quake actually happened. The Director of the Civil Defence, Guido Bertolaso, forced him to remove his findings from the Internet (old versions are still on line). Giuliani was also reported to the police for “causing fear” with his predictions about a quake near Sulmona, which was far from where the quake actually struck. Enzo Boschi, the head of the Italian National Geophysics Institute, declared: “Every time there is an earthquake there are people who claim to have predicted it. As far as I know nobody predicted this earthquake with precision. It is not possible to predict earthquakes.” Most of the geological and geophysical organizations around the world made similar statements in support of the proper scientific procedures adopted by the Italian geophysical community. They condemned Giuliani for scaring people using a method that has not been shown to be reliable.
  • most of the press coverage I have read (including many cited above) took the sensationalist approach, and cast Giuliani as the little “David” fighting against the “Goliath” of “Big Science”
  • none of the reporters bothered to do any real background research, or consult with any other legitimate seismologist who would confirm that there is no reliable way to predict earthquakes in the short term, and that Giuliani is misleading people when he claims otherwise. Giuliani’s “prediction” was sheer luck, and if he had failed, no one would have mentioned it again.
  • Even though he believes in his method, he ignores the huge body of evidence that shows radon gas is no more reliable than any other “predictor”.
  • If the victims insist on suing someone, they should leave the seismologists alone and look into the construction of some of those buildings. The stories out of L’Aquila suggest that the death toll was much higher because of official corruption and shoddy construction, as happens in many countries both before and after big quakes.
  • much of the construction is apparently Mafia-controlled in that area—good luck suing them! Sadly, the ancient medieval buildings that crumbled were the most vulnerable because they were made of unreinforced masonry, the worst possible construction material in earthquake country
  • what does this imply for scientists who are working in a field that might have predictive power? In a litigious society like Italy or the U.S., this is a serious question. If a reputable seismologist does make a prediction and fails, he’s liable, because people will panic and make foolish decisions and then blame the seismologist for their losses. Now the Italian courts are saying that (despite world scientific consensus) seismologists are liable if they don’t predict quakes. They’re damned if they do, and damned if they don’t. In some societies where seismologists work hard at prediction and preparation (such as China and Japan), there is no precedent for suing scientists for doing their jobs properly, and society and the court system do not encourage people to file frivolous suits. But in litigious societies, the system is counterproductive, and stifles research that we would like to see developed. What seismologist would want to work on earthquake prediction if they can be sued? I know of many earth scientists with brilliant ideas not only about earthquake prediction but even ways to defuse earthquakes, slow down global warming, or many other incredible but risky brainstorms—but they dare not propose the idea seriously or begin to implement it for fear of being sued.
  • In the case of most natural disasters, people usually regard such events as "acts of God" and try to get on with their lives as best they can. No human cause is responsible for great earthquakes, tsunamis, volcanic eruptions, tornadoes, hurricanes, or floods. But in the bizarre world of the Italian legal system, six seismologists and a public official have been charged with manslaughter for NOT predicting the quake! My colleagues in the earth science community were incredulous and staggered at this news. Seismologists and geologists have been saying for decades (at least since the 1970s) that short-term earthquake prediction (within minutes to hours of the event) is impossible, and anyone who claims otherwise is lying. As Charles Richter himself said, "Only fools, liars, and charlatans predict earthquakes." How could anyone then go to court and sue seismologists for following proper scientific procedures?
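Why an "unreliable precursor" like radon is even worse than it sounds follows from base rates. A Bayes sketch with entirely made-up illustrative numbers: even a precursor that fires before most big quakes yields almost nothing but false alarms, because big quakes are rare on any given day.

```python
# Bayes sketch (all numbers are illustrative assumptions, not measured
# properties of radon): a precursor that fires before most big quakes
# still produces mostly false alarms when quakes are rare.

P_QUAKE = 1 / 10_000          # daily chance of a damaging quake nearby
P_SPIKE_GIVEN_QUAKE = 0.8     # precursor fires before 80% of quakes
P_SPIKE_GIVEN_NONE = 0.05     # ...but also on 5% of ordinary days

# total probability of seeing a radon spike on a given day
p_spike = (P_SPIKE_GIVEN_QUAKE * P_QUAKE
           + P_SPIKE_GIVEN_NONE * (1 - P_QUAKE))

# posterior probability of a quake given a radon spike (Bayes' theorem)
p_quake_given_spike = P_SPIKE_GIVEN_QUAKE * P_QUAKE / p_spike

print(p_quake_given_spike)  # ~0.0016: over 99% of alarms are false
```

Under these assumed numbers a public alert on every spike would cry wolf hundreds of times per real event, which is the seismologists' core objection to acting on such precursors.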
Building Inspectors Adelaide

Reliable Adelaide Building Inspector

I always wanted to have a house of my own. I have been eyeing a property for sale nearby which is really nice. I am planning to buy the property, but, I also wanted to make sure that the price matc...

Building Inspectors Adelaide

started by Building Inspectors Adelaide on 04 Oct 11 no follow-up yet
Weiye Loh

The Black Swan of Cairo | Foreign Affairs

  • It is both misguided and dangerous to push unobserved risks further into the statistical tails of the probability distribution of outcomes and allow these high-impact, low-probability "tail risks" to disappear from policymakers' fields of observation.
  • Such environments eventually experience massive blowups, catching everyone off-guard and undoing years of stability or, in some cases, ending up far worse than they were in their initial volatile state. Indeed, the longer it takes for the blowup to occur, the worse the resulting harm in both economic and political systems.
  • Seeking to restrict variability seems to be good policy (who does not prefer stability to chaos?), so it is with very good intentions that policymakers unwittingly increase the risk of major blowups. And it is the same misperception of the properties of natural systems that led to both the economic crisis of 2007-8 and the current turmoil in the Arab world. The policy implications are identical: to make systems robust, all risks must be visible and out in the open -- fluctuat nec mergitur (it fluctuates but does not sink) goes the Latin saying.
  • Just as a robust economic system is one that encourages early failures (the concepts of "fail small" and "fail fast"), the U.S. government should stop supporting dictatorial regimes for the sake of pseudostability and instead allow political noise to rise to the surface. Making an economy robust in the face of business swings requires allowing risk to be visible; the same is true in politics.
  • Both the recent financial crisis and the current political crisis in the Middle East are grounded in the rise of complexity, interdependence, and unpredictability. Policymakers in the United Kingdom and the United States have long promoted policies aimed at eliminating fluctuation -- no more booms and busts in the economy, no more "Iranian surprises" in foreign policy. These policies have almost always produced undesirable outcomes. For example, the U.S. banking system became very fragile following a succession of progressively larger bailouts and government interventions, particularly after the 1983 rescue of major banks (ironically, by the same Reagan administration that trumpeted free markets). In the United States, promoting these bad policies has been a bipartisan effort throughout. Republicans have been good at fragilizing large corporations through bailouts, and Democrats have been good at fragilizing the government. At the same time, the financial system as a whole exhibited little volatility; it kept getting weaker while providing policymakers with the illusion of stability, illustrated most notably when Ben Bernanke, who was then a member of the Board of Governors of the U.S. Federal Reserve, declared the era of "the great moderation" in 2004.
  • Washington stabilized the market with bailouts and by allowing certain companies to grow "too big to fail." Because policymakers believed it was better to do something than to do nothing, they felt obligated to heal the economy rather than wait and see if it healed on its own.
  • The foreign policy equivalent is to support the incumbent no matter what. And just as banks took wild risks thanks to Greenspan's implicit insurance policy, client governments such as Hosni Mubarak's in Egypt for years engaged in overt plunder thanks to similarly reliable U.S. support.
  • Those who seek to prevent volatility on the grounds that any and all bumps in the road must be avoided paradoxically increase the probability that a tail risk will cause a major explosion.
  • In the realm of economics, price controls are designed to constrain volatility on the grounds that stable prices are a good thing. But although these controls might work in some rare situations, the long-term effect of any such system is an eventual and extremely costly blowup whose cleanup costs can far exceed the benefits accrued. The risks of a dictatorship, no matter how seemingly stable, are no different, in the long run, from those of an artificially controlled price.
  • Such attempts to institutionally engineer the world come in two types: those that conform to the world as it is and those that attempt to reform the world. The nature of humans, quite reasonably, is to intervene in an effort to alter their world and the outcomes it produces. But government interventions are laden with unintended -- and unforeseen -- consequences, particularly in complex systems, so humans must work with nature by tolerating systems that absorb human imperfections rather than seek to change them.
  • What is needed is a system that can prevent the harm done to citizens by the dishonesty of business elites; the limited competence of forecasters, economists, and statisticians; and the imperfections of regulation, not one that aims to eliminate these flaws. Humans must try to resist the illusion of control: just as foreign policy should be intelligence-proof (it should minimize its reliance on the competence of information-gathering organizations and the predictions of "experts" in what are inherently unpredictable domains), the economy should be regulator-proof, given that some regulations simply make the system itself more fragile. Due to the complexity of markets, intricate regulations simply serve to generate fees for lawyers and profits for sophisticated derivatives traders who can build complicated financial products that skirt those regulations.
  • The life of a turkey before Thanksgiving is illustrative: the turkey is fed for 1,000 days and every day seems to confirm that the farmer cares for it -- until the last day, when confidence is maximal. The "turkey problem" occurs when a naive analysis of stability is derived from the absence of past variations. Likewise, confidence in stability was maximal at the onset of the financial crisis in 2007.
  • The turkey problem for humans is the result of mistaking one environment for another. Humans simultaneously inhabit two systems: the linear and the complex. The linear domain is characterized by its predictability and the low degree of interaction among its components, which allows the use of mathematical methods that make forecasts reliable. In complex systems, there is an absence of visible causal links between the elements, masking a high degree of interdependence and extremely low predictability. Nonlinear elements are also present, such as those commonly known, and generally misunderstood, as "tipping points." Imagine someone who keeps adding sand to a sand pile without any visible consequence, until suddenly the entire pile crumbles. It would be foolish to blame the collapse on the last grain of sand rather than the structure of the pile, but that is what people do consistently, and that is the policy error.
  • Engineering, architecture, astronomy, most of physics, and much of common science are linear domains. The complex domain is the realm of the social world, epidemics, and economics. Crucially, the linear domain delivers mild variations without large shocks, whereas the complex domain delivers massive jumps and gaps. Complex systems are misunderstood, mostly because humans' sophistication, obtained over the history of human knowledge in the linear domain, does not transfer properly to the complex domain. Humans can predict a solar eclipse and the trajectory of a space vessel, but not the stock market or Egyptian political events. All man-made complex systems have commonalities and even universalities. Sadly, deceptive calm (followed by Black Swan surprises) seems to be one of those properties.
  • The system is responsible, not the components. But after the financial crisis of 2007-8, many people thought that predicting the subprime meltdown would have helped. It would not have, since it was a symptom of the crisis, not its underlying cause. Likewise, Obama's blaming "bad intelligence" for his administration's failure to predict the crisis in Egypt is symptomatic of both the misunderstanding of complex systems and the bad policies involved.
  • Obama's mistake illustrates the illusion of local causal chains -- that is, confusing catalysts for causes and assuming that one can know which catalyst will produce which effect. The final episode of the upheaval in Egypt was unpredictable for all observers, especially those involved. As such, blaming the CIA is as foolish as funding it to forecast such events. Governments are wasting billions of dollars on attempting to predict events that are produced by interdependent systems and are therefore not statistically understandable at the individual level.
  • Political and economic "tail events" are unpredictable, and their probabilities are not scientifically measurable. No matter how many dollars are spent on research, predicting revolutions is not the same as counting cards; humans will never be able to turn politics into the tractable randomness of blackjack.
  • Most explanations being offered for the current turmoil in the Middle East follow the "catalysts as causes" confusion. The riots in Tunisia and Egypt were initially attributed to rising commodity prices, not to stifling and unpopular dictatorships. But Bahrain and Libya are countries with high GDPs that can afford to import grain and other commodities. Again, the focus is wrong even if the logic is comforting. It is the system and its fragility, not events, that must be studied -- what physicists call "percolation theory," in which the properties of the terrain are studied rather than those of a single element of the terrain.
  • When dealing with a system that is inherently unpredictable, what should be done? Differentiating between two types of countries is useful. In the first, changes in government do not lead to meaningful differences in political outcomes (since political tensions are out in the open). In the second type, changes in government lead to both drastic and deeply unpredictable changes.
  • Humans fear randomness -- a healthy ancestral trait inherited from a different environment. Whereas in the past, which was a more linear world, this trait enhanced fitness and increased chances of survival, it can have the reverse effect in today's complex world, making volatility take the shape of nasty Black Swans hiding behind deceptive periods of "great moderation." This is not to say that any and all volatility should be embraced. Insurance should not be banned, for example.
  • But alongside the "catalysts as causes" confusion sit two mental biases: the illusion of control and the action bias (the illusion that doing something is always better than doing nothing). This leads to the desire to impose man-made solutions.
  • Variation is information. When there is no variation, there is no information. This explains the CIA's failure to predict the Egyptian revolution and, a generation before, the Iranian Revolution -- in both cases, the revolutionaries themselves did not have a clear idea of their relative strength with respect to the regime they were hoping to topple. So rather than subsidize and praise as a "force for stability" every tin-pot potentate on the planet, the U.S. government should encourage countries to let information flow upward through the transparency that comes with political agitation. It should not fear fluctuations per se, since allowing them to be in the open, as Italy and Lebanon both show in different ways, creates the stability of small jumps.
  • As Seneca wrote in De clementia, "Repeated punishment, while it crushes the hatred of a few, stirs the hatred of all . . . just as trees that have been trimmed throw out again countless branches." The imposition of peace through repeated punishment lies at the heart of many seemingly intractable conflicts, including the Israeli-Palestinian stalemate. Furthermore, dealing with seemingly reliable high-level officials rather than the people themselves prevents any peace treaty signed from being robust. The Romans were wise enough to know that only a free man under Roman law could be trusted to engage in a contract; by extension, only a free people can be trusted to abide by a treaty. Treaties that are negotiated with the consent of a broad swath of the populations on both sides of a conflict tend to survive. Just as no central bank is powerful enough to dictate stability, no superpower can be powerful enough to guarantee solid peace alone.
  • As Jean-Jacques Rousseau put it, "A little bit of agitation gives motivation to the soul, and what really makes the species prosper is not peace so much as freedom." With freedom comes some unpredictable fluctuation. This is one of life's packages: there is no freedom without noise -- and no stability without volatility.
Building Inspectors Adelaide

Reliable Pre-purchase Building Inspection - 1 views

I am planning to buy the property located near our office. I like the location of the building because it is very accessible by public transportation from three compass points. It also gives me...

Building Inspectors Adelaide

started by Building Inspectors Adelaide on 03 Nov 11 no follow-up yet
Weiye Loh

Checking how fact-checkers check? - Marginal REVOLUTION - 0 views

  •  
    "Fact-checking has gained prominence as a reformist movement to revitalize truth-seeking ideals in journalism. While fact-checkers are often assumed to code facts accurately, no studies have formally assessed fact-checkers' performance. I evaluate the performance of two major online fact-checkers, PolitiFact at Tampa Bay Times and Fact Checker at Washington Post, comparing their interrater reliability using a method that is regularly utilized across the social sciences. I show that fact-checkers rarely fact-check the same statement, and when they do, there is little agreement in their ratings. Approximately 1 in 10 statements is fact-checked by both fact-checking outlets, and among claims that both outlets check, their factual ratings have a Cohen's κ of 0.52, an agreement rate much lower than what is acceptable for social scientific coding. The results suggest that difficulties in fact-checking elites' statements may limit the ability of journalistic fact-checking to hold politicians accountable."
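The Cohen's κ statistic quoted in the abstract is straightforward to compute by hand. A minimal sketch in Python; the ten statements and the True/Half/False labels below are made-up illustrations, not data from the paper:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items where the raters gave the same label.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: probability both raters independently pick the same category.
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical ratings of ten statements (T = true, H = half true, F = false).
a = ["T", "T", "F", "H", "F", "T", "H", "F", "T", "F"]
b = ["T", "H", "F", "H", "T", "T", "F", "F", "T", "F"]
print(round(cohens_kappa(a, b), 2))  # → 0.53
```

Here the raters agree on 7 of 10 items, yet κ is only about 0.53, because some of that agreement is what chance alone would produce -- which is why a raw agreement rate overstates how consistently two fact-checkers actually code.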
Weiye Loh

The American Spectator : Can't Live With Them… - 1 views

  • Commentators have repeatedly told us in recent years that the gap between rich and poor has been widening. It is true, if you compare the income of those in the top fifth of earners with the income of those in the bottom fifth, that the spread between them increased between 1996 and 2005. But, as Sowell points out, this frequently cited figure is not counting the same people. If you look at individual taxpayers, Sowell notes, those who happened to be in the bottom fifth in 1996 saw their incomes nearly double over the decade, while those who happened to be in the top fifth in 1996 saw gains of only 10 percent on average and those in the top 5 percent actually experienced a decline in their incomes. Similar distortions are perpetrated by those bewailing "stagnation" in average household incomes -- without taking into account that households have been getting smaller, as rising wealth allows people to move out of large family homes.
  • Sometimes the distortion seems to be deliberate. Sowell gives the example of an ABC news report in the 1980s focusing on five states where "unemployment is most severe" -- without mentioning that unemployment was actually declining in all the other 45 states. Sometimes there seems to be willful incomprehension. Journalists have earnestly reported that "prisons are ineffective" because two-thirds of prisoners are rearrested within three years of their release. As Sowell comments: "By this kind of reasoning, food is ineffective as a response to hunger because it is only a matter of time after eating before you get hungry again. Like many other things, incarceration only works when it is done."
  • Why do intellectuals often seem so lacking in common sense? Sowell thinks it goes with the job -- literally: He defines "intellectuals" as "an occupational category [Sowell's emphasis], people whose occupations deal primarily with ideas -- writers, academics and the like." Medical researchers or engineers or even "financial wizards" may apply specialized knowledge in ways that require great intellectual skill, but that does not make them "intellectuals," in Sowell's view: "An intellectual's work begins and ends with ideas [Sowell's emphasis]." So an engineer "is ruined" if his bridges or buildings collapse and so with a financier who "goes broke… the proof of the pudding is ultimately in the eating…. but the ultimate test of a [literary] deconstructionist's ideas is whether other deconstructionists find those ideas interesting, original, persuasive, elegant or ingenious. There is no external test." The ideas dispensed by intellectuals aren't subject to "external" checks or exposed to the test of "verifiability" (apart from what "like-minded individuals" find "plausible") and so intellectuals are not really "accountable" in the same way as people in other occupations.
  • it is not quite true, even among tenured professors in the humanities, that idea-mongers can entirely ignore "external" checks. Even academics want to be respectable, which means they can't entirely ignore the realities that others notice. There were lots of academics talking about the achievements of socialism in the 1970s (I can remember them) but very few talking that way after China and Russia repudiated these fantasies.
  • THE MOST DISTORTING ASPECT of Sowell's account is that, in focusing so much on the delusions of intellectuals, he leaves us more confused about what motivates the rest of society. In a characteristic passage, Sowell protests that "intellectuals...have sought to replace the groups into which people have sorted themselves with groupings created and imposed by the intelligentsia. Ties of family, religion, and patriotism, for example, have long been rated as suspect or detrimental by the intelligentsia, and new ties that intellectuals have created, such as class -- and more recently 'gender' -- have been projected as either more real or more important."
  • There's no disputing the claim that most "intellectuals" -- surely most professors in the humanities-are down on "patriotism" and "religion" and probably even "family." But how did people get to be patriotic and religious in the first place? In Sowell's account, they just "sorted themselves" -- as if by the invisible hand of the market.
  • Let's put aside all the violence and intimidation that went into building so many nations and so many faiths in the past. What is it, even today, that makes people revere this country (or some other); what makes people adhere to a particular faith or church? Don't inspiring words often move people? And those who arrange these words -- aren't they doing something similar to what Sowell says intellectuals do? Is it really true, when it comes to embracing national or religious loyalties, that "the proof of the pudding is in the eating"?
  • Even when it comes to commercial products, people don't always want to be guided by mundane considerations of reliable performance. People like glamour, prestige, associations between the product and things they otherwise admire. That's why companies spend so much on advertising. And that's part of the reason people are willing to pay more for brand names -- to enjoy the associations generated by advertising. Even advertising plays on assumptions about what is admirable and enticing -- assumptions that may change from decade to decade, as background opinions change. How many products now flaunt themselves as "green" -- and how many did so 20 years ago?
  • If we closed down universities and stopped subsidizing intellectual publications, would people really judge every proposed policy by external results? Intellectuals tend to see what they expect to see, as Sowell's examples show -- but that's true of almost everyone. We have background notions about how the world works that help us make sense of what we experience. We might have distorted and confused notions, but we don't just perceive isolated facts. People can improve in their understanding, developing background understandings that are more defined or more reliable. That's part of what makes people interested in the ideas of intellectuals -- the hope of improving their own understanding.
  • On Sowell's account, we wouldn't need the contributions of a Friedrich Hayek -- or a Thomas Sowell -- if we didn't have so many intellectuals peddling so many wrong-headed ideas. But the wealthier the society, the more it liberates individuals to make different choices and the more it can afford to indulge even wasteful or foolish choices. I'd say that means not that we have less need of intellectuals, but more need of better ones. 
Weiye Loh

Learn to love uncertainty and failure, say leading thinkers | Edge question | Science |... - 0 views

  • Being comfortable with uncertainty, knowing the limits of what science can tell us, and understanding the worth of failure are all valuable tools that would improve people's lives, according to some of the world's leading thinkers.
  • The ideas were submitted as part of an annual exercise by the web magazine Edge, which invites scientists, philosophers and artists to opine on a major question of the moment. This year it was, "What scientific concept would improve everybody's cognitive toolkit?"
  • the public often misunderstands the scientific process and the nature of scientific doubt. This can fuel public rows over the significance of disagreements between scientists about controversial issues such as climate change and vaccine safety.
  • Carlo Rovelli, a physicist at the University of Aix-Marseille, emphasised the uselessness of certainty. He said that the idea of something being "scientifically proven" was practically an oxymoron and that the very foundation of science is to keep the door open to doubt.
  • "A good scientist is never 'certain'. Lack of certainty is precisely what makes conclusions more reliable than the conclusions of those who are certain: because the good scientist will be ready to shift to a different point of view if better elements of evidence, or novel arguments emerge. Therefore certainty is not only something of no use, but is in fact damaging, if we value reliability."
  • physicist Lawrence Krauss of Arizona State University agreed. "In the public parlance, uncertainty is a bad thing, implying a lack of rigour and predictability. The fact that global warming estimates are uncertain, for example, has been used by many to argue against any action at the present time," he said.
  • however, uncertainty is a central component of what makes science successful. Being able to quantify uncertainty, and incorporate it into models, is what makes science quantitative, rather than qualitative. Indeed, no number, no measurement, no observable in science is exact. Quoting numbers without attaching an uncertainty to them implies they have, in essence, no meaning."
  • Neil Gershenfeld, director of the Massachusetts Institute of Technology's Centre for Bits and Atoms wants everyone to know that "truth" is just a model. "The most common misunderstanding about science is that scientists seek and find truth. They don't – they make and test models," he said.
  • Building models is very different from proclaiming truths. It's a never-ending process of discovery and refinement, not a war to win or destination to reach. Uncertainty is intrinsic to the process of finding out what you don't know, not a weakness to avoid. Bugs are features – violations of expectations are opportunities to refine them. And decisions are made by evaluating what works better, not by invoking received wisdom."
  • writer and web commentator Clay Shirky suggested that people should think more carefully about how they see the world. His suggestion was the Pareto principle, a pattern whereby the top 1% of the population control 35% of the wealth or, on Twitter, the top 2% of users send 60% of the messages. Sometimes known as the "80/20 rule", the Pareto principle means that the average is far from the middle. It is applicable to many complex systems. "And yet, despite a century of scientific familiarity, samples drawn from Pareto distributions are routinely presented to the public as anomalies, which prevents us from thinking clearly about the world," said Shirky. "We should stop thinking that average family income and the income of the median family have anything to do with one another, or that enthusiastic and normal users of communications tools are doing similar things, or that extroverts should be only moderately more connected than normal people. We should stop thinking that the largest future earthquake or market panic will be as large as the largest historical one; the longer a system persists, the likelier it is that an event twice as large as all previous ones is coming."
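Shirky's point that "the average is far from the middle" can be checked analytically for a Pareto distribution. A quick sketch; the shape parameter α ≈ 1.16, a standard textbook value that yields roughly an 80/20 split, is an assumption, not a figure from the article:

```python
# Pareto distribution with minimum value x_m = 1 and shape parameter alpha.
alpha = 1.16                         # roughly reproduces the "80/20 rule"
mean = alpha / (alpha - 1)           # finite only when alpha > 1
median = 2 ** (1 / alpha)
top_share = 0.2 ** (1 - 1 / alpha)   # fraction of the total held by the top 20%
print(round(mean, 2), round(median, 2), round(top_share, 2))  # → 7.25 1.82 0.8
```

The mean is about four times the median, so "average" statistics tell you very little about the typical member of such a system -- exactly the trap Shirky warns against.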
  • Kevin Kelly, editor-at-large of Wired, pointed to the value of negative results. "We can learn nearly as much from an experiment that does not work as from one that does. Failure is not something to be avoided but rather something to be cultivated. That's a lesson from science that benefits not only laboratory research, but design, sport, engineering, art, entrepreneurship, and even daily life itself. All creative avenues yield the maximum when failures are embraced."
  • Michael Shermer, publisher of the Skeptic Magazine, wrote about the importance of thinking "bottom up not top down", since almost everything in nature and society happens this way.
  • But most people don't see things that way, said Shermer. "Bottom up reasoning is counterintuitive. This is why so many people believe that life was designed from the top down, and why so many think that economies must be designed and that countries should be ruled from the top down."
  • Roger Schank, a psychologist and computer scientist, proposed that we should all know the true meaning of "experimentation", which he said had been ruined by bad schooling, where pupils learn that scientists conduct experiments and if we copy exactly what they did in our high school labs we will get the results they got. "In effect we learn that experimentation is boring, is something done by scientists and has nothing to do with our daily lives." Instead, he said, proper experiments are all about assessing and gathering evidence. "In other words, the scientific activity that surrounds experimentation is about thinking clearly in the face of evidence obtained as the result of an experiment. But people who don't see their actions as experiments, and those who don't know how to reason carefully from data, will continue to learn less well from their own experiences than those who do."
  • Lisa Randall, a physicist at Harvard University, argued that perhaps "science" itself would be a useful concept for wider appreciation. "The idea that we can systematically understand certain aspects of the world and make predictions based on what we've learned – while appreciating and categorising the extent and limitations of what we know – plays a big role in how we think.
  • "Many words that summarise the nature of science such as 'cause and effect', 'predictions', and 'experiments', as well as words that describe probabilistic results such as 'mean', 'median', 'standard deviation', and the notion of 'probability' itself help us understand more specifically what this means and how to interpret the world and behaviour within it."
Weiye Loh

How We Know by Freeman Dyson | The New York Review of Books - 0 views

  • Another example illustrating the central dogma is the French optical telegraph.
  • The telegraph was an optical communication system with stations consisting of large movable pointers mounted on the tops of sixty-foot towers. Each station was manned by an operator who could read a message transmitted by a neighboring station and transmit the same message to the next station in the transmission line.
  • The distance between neighbors was about seven miles. Along the transmission lines, optical messages in France could travel faster than drum messages in Africa. When Napoleon took charge of the French Republic in 1799, he ordered the completion of the optical telegraph system to link all the major cities of France from Calais and Paris to Toulon and onward to Milan. The telegraph became, as Claude Chappe had intended, an important instrument of national power. Napoleon made sure that it was not available to private users.
  • Unlike the drum language, which was based on spoken language, the optical telegraph was based on written French. Chappe invented an elaborate coding system to translate written messages into optical signals. Chappe had the opposite problem from the drummers. The drummers had a fast transmission system with ambiguous messages. They needed to slow down the transmission to make the messages unambiguous. Chappe had a painfully slow transmission system with redundant messages. The French language, like most alphabetic languages, is highly redundant, using many more letters than are needed to convey the meaning of a message. Chappe’s coding system allowed messages to be transmitted faster. Many common phrases and proper names were encoded by only two optical symbols, with a substantial gain in speed of transmission. The composer and the reader of the message had code books listing the message codes for eight thousand phrases and names. For Napoleon it was an advantage to have a code that was effectively cryptographic, keeping the content of the messages secret from citizens along the route.
  • After these two historical examples of rapid communication in Africa and France, the rest of Gleick’s book is about the modern development of information technolog
  • The modern history is dominated by two Americans, Samuel Morse and Claude Shannon. Samuel Morse was the inventor of Morse Code. He was also one of the pioneers who built a telegraph system using electricity conducted through wires instead of optical pointers deployed on towers. Morse launched his electric telegraph in 1838 and perfected the code in 1844. His code used short and long pulses of electric current to represent letters of the alphabet.
  • Morse was ideologically at the opposite pole from Chappe. He was not interested in secrecy or in creating an instrument of government power. The Morse system was designed to be a profit-making enterprise, fast and cheap and available to everybody. At the beginning the price of a message was a quarter of a cent per letter. The most important users of the system were newspaper correspondents spreading news of local events to readers all over the world. Morse Code was simple enough that anyone could learn it. The system provided no secrecy to the users. If users wanted secrecy, they could invent their own secret codes and encipher their messages themselves. The price of a message in cipher was higher than the price of a message in plain text, because the telegraph operators could transcribe plain text faster. It was much easier to correct errors in plain text than in cipher.
  • Claude Shannon was the founding father of information theory. For a hundred years after the electric telegraph, other communication systems such as the telephone, radio, and television were invented and developed by engineers without any need for higher mathematics. Then Shannon supplied the theory to understand all of these systems together, defining information as an abstract quantity inherent in a telephone message or a television picture. Shannon brought higher mathematics into the game.
  • When Shannon was a boy growing up on a farm in Michigan, he built a homemade telegraph system using Morse Code. Messages were transmitted to friends on neighboring farms, using the barbed wire of their fences to conduct electric signals. When World War II began, Shannon became one of the pioneers of scientific cryptography, working on the high-level cryptographic telephone system that allowed Roosevelt and Churchill to talk to each other over a secure channel. Shannon’s friend Alan Turing was also working as a cryptographer at the same time, in the famous British Enigma project that successfully deciphered German military codes. The two pioneers met frequently when Turing visited New York in 1943, but they belonged to separate secret worlds and could not exchange ideas about cryptography.
  • In 1945 Shannon wrote a paper, “A Mathematical Theory of Cryptography,” which was stamped SECRET and never saw the light of day. He published in 1948 an expurgated version of the 1945 paper with the title “A Mathematical Theory of Communication.” The 1948 version appeared in the Bell System Technical Journal, the house journal of the Bell Telephone Laboratories, and became an instant classic. It is the founding document for the modern science of information. After Shannon, the technology of information raced ahead, with electronic computers, digital cameras, the Internet, and the World Wide Web.
  • According to Gleick, the impact of information on human affairs came in three installments: first the history, the thousands of years during which people created and exchanged information without the concept of measuring it; second the theory, first formulated by Shannon; third the flood, in which we now live.
  • The event that made the flood plainly visible occurred in 1965, when Gordon Moore stated Moore’s Law. Moore was an electrical engineer, founder of the Intel Corporation, a company that manufactured components for computers and other electronic gadgets. His law said that the price of electronic components would decrease and their numbers would increase by a factor of two every eighteen months. This implied that the price would decrease and the numbers would increase by a factor of a hundred every decade. Moore’s prediction of continued growth has turned out to be astonishingly accurate during the forty-five years since he announced it. In these four and a half decades, the price has decreased and the numbers have increased by a factor of a billion, nine powers of ten. Nine powers of ten are enough to turn a trickle into a flood.
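The arithmetic in the Moore's Law passage above is easy to verify: a doubling every eighteen months compounds to roughly a hundredfold per decade, and to about nine powers of ten over the forty-five years described. A two-line check:

```python
# Doubling every 18 months: how much growth per decade and over 45 years?
per_decade = 2 ** (120 / 18)           # 120 months in a decade / 18-month doubling
over_45_years = 2 ** (45 * 12 / 18)    # 30 doublings in 45 years
print(round(per_decade), f"{over_45_years:.2e}")  # → 102 1.07e+09
```

So "a factor of a hundred every decade" and "a factor of a billion" over four and a half decades are both faithful round-offs of the same compounding rule.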
  • Gordon Moore was in the hardware business, making hardware components for electronic machines, and he stated his law as a law of growth for hardware. But the law applies also to the information that the hardware is designed to embody. The purpose of the hardware is to store and process information. The storage of information is called memory, and the processing of information is called computing. The consequence of Moore’s Law for information is that the price of memory and computing decreases and the available amount of memory and computing increases by a factor of a hundred every decade. The flood of hardware becomes a flood of information.
  • In 1949, one year after Shannon published the rules of information theory, he drew up a table of the various stores of memory that then existed. The biggest memory in his table was the US Library of Congress, which he estimated to contain one hundred trillion bits of information. That was at the time a fair guess at the sum total of recorded human knowledge. Today a memory disc drive storing that amount of information weighs a few pounds and can be bought for about a thousand dollars. Information, otherwise known as data, pours into memories of that size or larger, in government and business offices and scientific laboratories all over the world. Gleick quotes the computer scientist Jaron Lanier describing the effect of the flood: “It’s as if you kneel to plant the seed of a tree and it grows so fast that it swallows your whole town before you can even rise to your feet.”
  • On December 8, 2010, Gleick published on The New York Review's blog an illuminating essay, "The Information Palace." It was written too late to be included in his book. It describes the historical changes of meaning of the word "information," as recorded in the latest quarterly online revision of the Oxford English Dictionary. The word first appears in 1386 in a parliamentary report with the meaning "denunciation." The history ends with the modern usage, "information fatigue," defined as "apathy, indifference or mental exhaustion arising from exposure to too much information."
  • The consequences of the information flood are not all bad. One of the creative enterprises made possible by the flood is Wikipedia, started ten years ago by Jimmy Wales. Among my friends and acquaintances, everybody distrusts Wikipedia and everybody uses it. Distrust and productive use are not incompatible. Wikipedia is the ultimate open source repository of information. Everyone is free to read it and everyone is free to write it. It contains articles in 262 languages written by several million authors. The information that it contains is totally unreliable and surprisingly accurate. It is often unreliable because many of the authors are ignorant or careless. It is often accurate because the articles are edited and corrected by readers who are better informed than the authors.
  • Jimmy Wales hoped when he started Wikipedia that the combination of enthusiastic volunteer writers with open source information technology would cause a revolution in human access to knowledge. The rate of growth of Wikipedia exceeded his wildest dreams. Within ten years it has become the biggest storehouse of information on the planet and the noisiest battleground of conflicting opinions. It illustrates Shannon’s law of reliable communication. Shannon’s law says that accurate transmission of information is possible in a communication system with a high level of noise. Even in the noisiest system, errors can be reliably corrected and accurate information transmitted, provided that the transmission is sufficiently redundant. That is, in a nutshell, how Wikipedia works.
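Shannon's result that redundancy defeats noise can be illustrated with the simplest possible scheme: a triple-repetition code with majority-vote decoding. This is a toy sketch of the principle, far cruder than the codes Shannon's theorem actually guarantees:

```python
def encode(bits, r=3):
    # Repetition code: transmit each bit r times.
    return [b for b in bits for _ in range(r)]

def decode(received, r=3):
    # Majority vote over each block of r copies;
    # corrects up to (r - 1) // 2 flipped copies per block.
    return [int(sum(received[i:i + r]) > r // 2)
            for i in range(0, len(received), r)]

message = [1, 0, 1, 1]
sent = encode(message)            # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
noisy = sent[:]
noisy[1] ^= 1                     # noise flips one copy in the first block
noisy[5] ^= 1                     # and one copy in the second
print(decode(noisy) == message)   # → True: the redundancy absorbs both errors
```

Wikipedia works analogously: each fact is "transmitted" by many authors and editors, and as long as the careful ones outvote the careless ones, the corrected article is more accurate than any single contribution.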
  • The information flood has also brought enormous benefits to science. The public has a distorted view of science, because children are taught in school that science is a collection of firmly established truths. In fact, science is not a collection of truths. It is a continuing exploration of mysteries. Wherever we go exploring in the world around us, we find mysteries. Our planet is covered by continents and oceans whose origin we cannot explain. Our atmosphere is constantly stirred by poorly understood disturbances that we call weather and climate. The visible matter in the universe is outweighed by a much larger quantity of dark invisible matter that we do not understand at all. The origin of life is a total mystery, and so is the existence of human consciousness. We have no clear idea how the electrical discharges occurring in nerve cells in our brains are connected with our feelings and desires and actions.
  • Even physics, the most exact and most firmly established branch of science, is still full of mysteries. We do not know how much of Shannon’s theory of information will remain valid when quantum devices replace classical electric circuits as the carriers of information. Quantum devices may be made of single atoms or microscopic magnetic circuits. All that we know for sure is that they can theoretically do certain jobs that are beyond the reach of classical devices. Quantum computing is still an unexplored mystery on the frontier of information theory. Science is the sum total of a great multitude of mysteries. It is an unending argument between a great multitude of voices. It resembles Wikipedia much more than it resembles the Encyclopaedia Britannica.
  • The rapid growth of the flood of information in the last ten years made Wikipedia possible, and the same flood made twenty-first-century science possible. Twenty-first-century science is dominated by huge stores of information that we call databases. The information flood has made it easy and cheap to build databases. One example of a twenty-first-century database is the collection of genome sequences of living creatures belonging to various species from microbes to humans. Each genome contains the complete genetic information that shaped the creature to which it belongs. The genome database is rapidly growing and is available for scientists all over the world to explore. Its origin can be traced to the year 1939, when Shannon wrote his Ph.D. thesis with the title "An Algebra for Theoretical Genetics."
  • Shannon was then a graduate student in the mathematics department at MIT. He was only dimly aware of the possible physical embodiment of genetic information. The true physical embodiment of the genome is the double helix structure of DNA molecules, discovered by Francis Crick and James Watson fourteen years later. In 1939 Shannon understood that the basis of genetics must be information, and that the information must be coded in some abstract algebra independent of its physical embodiment. Without any knowledge of the double helix, he could not hope to guess the detailed structure of the genetic code. He could only imagine that in some distant future the genetic information would be decoded and collected in a giant database that would define the total diversity of living creatures. It took only sixty years for his dream to come true.
  • In the twentieth century, genomes of humans and other species were laboriously decoded and translated into sequences of letters in computer memories. The decoding and translation became cheaper and faster as time went on, the price decreasing and the speed increasing according to Moore’s Law. The first human genome took fifteen years to decode and cost about a billion dollars. Now a human genome can be decoded in a few weeks and costs a few thousand dollars. Around the year 2000, a turning point was reached, when it became cheaper to produce genetic information than to understand it. Now we can pass a piece of human DNA through a machine and rapidly read out the genetic information, but we cannot read out the meaning of the information. We shall not fully understand the information until we understand in detail the processes of embryonic development that the DNA orchestrated to make us what we are.
  • The explosive growth of information in our human society is a part of the slower growth of ordered structures in the evolution of life as a whole. Life has for billions of years been evolving with organisms and ecosystems embodying increasing amounts of information. The evolution of life is a part of the evolution of the universe, which also evolves with increasing amounts of information embodied in ordered structures, galaxies and stars and planetary systems. In the living and in the nonliving world, we see a growth of order, starting from the featureless and uniform gas of the early universe and producing the magnificent diversity of weird objects that we see in the sky and in the rain forest. Everywhere around us, wherever we look, we see evidence of increasing order and increasing information. The technology arising from Shannon’s discoveries is only a local acceleration of the natural growth of information.
  • Lord Kelvin, one of the leading physicists of the nineteenth century, promoted the heat death dogma, predicting that the flow of heat from warmer to cooler objects will result in a decrease of temperature differences everywhere, until all temperatures ultimately become equal. Life needs temperature differences, to avoid being stifled by its waste heat. So life will ultimately disappear.
  • Thanks to the discoveries of astronomers in the twentieth century, we now know that the heat death is a myth. The heat death can never happen, and there is no paradox. The best popular account of the disappearance of the paradox is a chapter, “How Order Was Born of Chaos,” in the book Creation of the Universe, by Fang Lizhi and his wife Li Shuxian.2 Fang Lizhi is doubly famous as a leading Chinese astronomer and a leading political dissident. He is now pursuing his double career at the University of Arizona.
  • The belief in a heat death was based on an idea that I call the cooking rule. The cooking rule says that a piece of steak gets warmer when we put it on a hot grill. More generally, the rule says that any object gets warmer when it gains energy, and gets cooler when it loses energy. Humans have been cooking steaks for thousands of years, and nobody ever saw a steak get colder while cooking on a fire. The cooking rule is true for objects small enough for us to handle. If the cooking rule is always true, then Lord Kelvin’s argument for the heat death is correct.
  • the cooking rule is not true for objects of astronomical size, for which gravitation is the dominant form of energy. The sun is a familiar example. As the sun loses energy by radiation, it becomes hotter and not cooler. Since the sun is made of compressible gas squeezed by its own gravitation, loss of energy causes it to become smaller and denser, and the compression causes it to become hotter. For almost all astronomical objects, gravitation dominates, and they have the same unexpected behavior. Gravitation reverses the usual relation between energy and temperature. In the domain of astronomy, when heat flows from hotter to cooler objects, the hot objects get hotter and the cool objects get cooler. As a result, temperature differences in the astronomical universe tend to increase rather than decrease as time goes on. There is no final state of uniform temperature, and there is no heat death. Gravitation gives us a universe hospitable to life. Information and order can continue to grow for billions of years in the future, as they have evidently grown in the past.
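The reversal described above can be made quantitative with a standard argument (a sketch added here, not part of the original review): for a bound, self-gravitating system in equilibrium, the virial theorem implies a negative heat capacity.

```latex
% Virial theorem for a bound, self-gravitating system in equilibrium:
\[
2K + U = 0 \quad\Rightarrow\quad E = K + U = -K .
\]
% Modeling the interior as an ideal gas, $K = \tfrac{3}{2} N k_B T$, so
\[
E = -\tfrac{3}{2} N k_B T
\quad\Rightarrow\quad
C \equiv \frac{dE}{dT} = -\tfrac{3}{2} N k_B < 0 .
\]
```

A negative heat capacity means that radiating energy away ($dE < 0$) forces $dT > 0$: the star heats up as it loses energy, contrary to the cooking rule that holds for laboratory-sized objects.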
  • The vision of the future as an infinite playground, with an unending sequence of mysteries to be understood by an unending sequence of players exploring an unending supply of information, is a glorious vision for scientists. Scientists find the vision attractive, since it gives them a purpose for their existence and an unending supply of jobs. The vision is less attractive to artists and writers and ordinary people. Ordinary people are more interested in friends and family than in science. Ordinary people may not welcome a future spent swimming in an unending flood of information.
  • A darker view of the information-dominated universe was described in a famous story, “The Library of Babel,” by Jorge Luis Borges in 1941.3 Borges imagined his library, with an infinite array of books and shelves and mirrors, as a metaphor for the universe.
  • Gleick’s book has an epilogue entitled “The Return of Meaning,” expressing the concerns of people who feel alienated from the prevailing scientific culture. The enormous success of information theory came from Shannon’s decision to separate information from meaning. His central dogma, “Meaning is irrelevant,” declared that information could be handled with greater freedom if it was treated as a mathematical abstraction independent of meaning. The consequence of this freedom is the flood of information in which we are drowning. The immense size of modern databases gives us a feeling of meaninglessness. Information in such quantities reminds us of Borges’s library extending infinitely in all directions. It is our task as humans to bring meaning back into this wasteland. As finite creatures who think and feel, we can create islands of meaning in the sea of information. Gleick ends his book with Borges’s image of the human condition: “We walk the corridors, searching the shelves and rearranging them, looking for lines of meaning amid leagues of cacophony and incoherence, reading the history of the past and of the future, collecting our thoughts and collecting the thoughts of others, and every so often glimpsing mirrors, in which we may recognize creatures of the information.”
Weiye Loh

Random Thoughts Of A Free Thinker: The TCM vs. Western medicine debate -- a philosophic... - 0 views

  • there is a sub-field within the study of philosophy that looks at what should qualify as valid or certain knowledge. And one main divide in this sub-field would perhaps be the divide between empiricism and rationalism. Proponents of the former generally argue that only what can be observed by the senses should qualify as valid knowledge while proponents of the latter are more sceptical about sensory data since such data can be "false" (for example, optical illusions) and instead argue that valid knowledge should be knowledge that is congruent with reason.
  • Another significant divide in this sub-field is the divide between positivism/scientism and non-positivism/scientism. Essentially, proponents of the former argue that only knowledge that is congruent with scientific reasoning, or that can be scientifically proven, should qualify as valid knowledge. In contrast, proponents of non-positivism/scientism take the stance that although scientific knowledge may indeed be a form of valid knowledge, it is not the only form; knowledge derived from other sources or methods may be just as valid.
  • Evidently, the latter divide is relevant to this debate over the validity of TCM, or alternative medicine in general, as a form of medical treatment vis-a-vis Western medicine: the general impression is perhaps that while Western medicine is scientifically proven, TCM is not. Thus, to those who abide by the stance of positivism/scientism, TCM, or alternative medicine in general, is not as valid or reliable a form of medical treatment as Western medicine. On the other hand, as can be seen from the letters written in to the ST Forum to defend TCM, there are those who will argue that although TCM may not be as scientifically proven, this does not imply that it is not a valid or reliable form of medical treatment.
  • ...6 more annotations...
  • Of course, while there are similarities between the positions adopted in the "positivism/scientism versus non-positivism/scientism" and "Western medicine versus alternative medicine" debates, I suppose that one main difference is that the latter is not just a theoretical debate but involves people's health and lives.
  • As was mentioned earlier, the general impression is perhaps that while Western medicine, which generally has its roots in Western societies, is scientifically proven, TCM, or alternative medicine, is however not as scientifically proven. The former is thus regarded as the dominant mainstream model of medical treatment while non-Western medical knowledge or treatment is regarded as "alternative medicine".
  • The process by which the above impression was created was, according to postcolonial theorists, a highly political one. Essentially, it may be argued that along with their political colonisation of non-European territories in the past, the European/Western colonialists also colonised the minds of those living in those territories. This means that along with colonisation, traditional forms of knowledge, including medical knowledge, and cultures in the colonised territories were relegated to a non-dominant, if not inferior, position vis-a-vis Western knowledge and culture. And as postcolonial theorists may argue, the legacy and aftermath of this process is still felt today, and efforts should be made to reverse it.
  • In light of the above, the increased push to have non-Western forms of medical treatment be recognised as an equally valid model of medical treatment besides that of Western medicine may be seen as part of the effort to reverse the dominance of Western knowledge and culture set in place during the colonial period. Of course, this push to reverse Western dominance is especially relevant in recent times, in light of the economic and political rise of non-Western powers such as China and India (interestingly enough, to the best of my knowledge, when talking about "alternative medicine", people are usually referring to traditional Indian or Chinese medical treatments and not really traditional African medical treatment).
  • Here, it is worthwhile to pause and think for a while: if it is recognised that Western and non-Western medicine are different but equally valid models of medical treatment, would they be complementary or competing models? Or would they be just different models?
  • Moving on, so far it would seem that, for at least the foreseeable future, Western medicine will retain its dominant "mainstream" position. But who knows what the future may hold?
Weiye Loh

Resources for Learning More About Climate Science - NYTimes.com - 0 views

  • hundreds of books about climate change have been published, but not that many of them lay out the basics of the problem in a clear, understandable way. Still fewer provide any rich sense of the history of how the science came to exist in its present form.
  • The Web does have some excellent resources, to be sure. I often send people to Climate Central, a fine site based in Princeton that works to translate climate science into understandable prose. For people starting from a contrarian bent, nothing beats Skeptical Science, a Web site that directly answers various skeptic talking points, with links to some of the original science. And Real Climate is a must-read, since it includes some of the world’s top climate scientists translating their research into layman’s language.
  • many of us want to flee the Web and curl up with a good book. So I was enthused recently when “The Warming Papers” came to my attention.
  • ...1 more annotation...
  • A hefty new volume published by Wiley-Blackwell and edited by the climate scientists David Archer and Raymond Pierrehumbert at the University of Chicago, it’s a rich feast for anyone who wants to trace the history of climate science from its earliest origins to the present.
  •  
    The Web is chockablock with blog posts and other material about climate change, of course, but picking your way through that to the actual science, or even to reliable write-ups on what the science means, is no easy task.
Weiye Loh

The Ashtray: The Ultimatum (Part 1) - NYTimes.com - 0 views

  • “Under no circumstances are you to go to those lectures. Do you hear me?” Kuhn, the head of the Program in the History and Philosophy of Science at Princeton where I was a graduate student, had issued an ultimatum. It concerned the philosopher Saul Kripke’s lectures — later to be called “Naming and Necessity” — which he had originally given at Princeton in 1970 and planned to give again in the fall of 1972.
  • Whiggishness — in history of science, the tendency to evaluate and interpret past scientific theories not on their own terms, but in the context of current knowledge. The term comes from Herbert Butterfield’s “The Whig Interpretation of History,” written when Butterfield, a future Regius professor of history at Cambridge, was only 31 years old. Butterfield had complained about Whiggishness, describing it as “…the study of the past with direct and perpetual reference to the present” – the tendency to see all history as progressive, and in an extreme form, as an inexorable march to greater liberty and enlightenment. [3] For Butterfield, on the other hand, “…real historical understanding” can be achieved only by “attempting to see life with the eyes of another century than our own.” [4][5].
  • Kuhn had attacked my Whiggish use of the term “displacement current.” [6] I had failed, in his view, to put myself in the mindset of Maxwell’s first attempts at creating a theory of electricity and magnetism. I felt that Kuhn had misinterpreted my paper, and that he — not me — had provided a Whiggish interpretation of Maxwell. I said, “You refuse to look through my telescope.” And he said, “It’s not a telescope, Errol. It’s a kaleidoscope.” (In this respect, he was probably right.) [7].
  • ...9 more annotations...
  • I asked him, “If paradigms are really incommensurable, how is history of science possible? Wouldn’t we be merely interpreting the past in the light of the present? Wouldn’t the past be inaccessible to us? Wouldn’t it be ‘incommensurable’?” [8] He started moaning. He put his head in his hands and was muttering, “He’s trying to kill me. He’s trying to kill me.” And then I added, “…except for someone who imagines himself to be God.” It was at this point that Kuhn threw the ashtray at me.
  • I call Kuhn’s reply “The Ashtray Argument.” If someone says something you don’t like, you throw something at him. Preferably something large, heavy, and with sharp edges. Perhaps we were engaged in a debate on the nature of language, meaning and truth. But maybe we just wanted to kill each other.
  • That's the problem with relativism: Who's to say who's right and who's wrong? Somehow I'm not surprised to hear Kuhn was an ashtray-hurler. In the end, what other argument could he make?
  • For us to have a conversation and come to an agreement about the meaning of some word without having to refer to some outside authority like a dictionary, we would of necessity have to be satisfied that our agreement was genuine and not just a polite acknowledgement of each other's right to their opinion, can you agree with that? If so, then let's see if we can agree on the meaning of the word 'know' because that may be the crux of the matter. When I use the word 'know' I mean more than the capacity to apprehend some aspect of the world through language or some other representational symbolism. Included in the word 'know' is the direct sensorial perception of some aspect of the world. For example, I sense the floor that my feet are now resting upon. I 'know' the floor is really there, I can sense it. Perhaps I don't 'know' what the floor is made of, who put it there, and other incidental facts one could know through the usual symbolism such as language as in a story someone tells me. Nevertheless, the reality I need to 'know' is that the floor, or whatever you may wish to call the solid - relative to my body - flat and level surface supported by more structure, then the earth, is really there and reliably capable of supporting me. This is true and useful knowledge that goes directly from the floor itself to my knowing about it - via sensation - that has nothing to do with my interpretive system.
  • Now I am interested in 'knowing' my feet in the same way that my feet and the whole body they are connected to 'know' the floor. I sense my feet sensing the floor. My feet are as real as the floor and I know they are there, sensing the floor, because I can sense them. Furthermore, now I 'know' that it is 'I' sensing my feet, sensing the floor. Do you see where I am going with this line of thought? I am including in the word 'know' more meaning than it is commonly given by everyday language. Perhaps it sounds as if I want to expand on the Cartesian formula of cogito ergo sum, and in truth I prefer to say I sense therefore I am. It is through my sensations of the world, first and foremost, that my awareness, such as it is, is actively engaged with reality. Now, any healthy normal animal senses the world, but we can't 'know' if they experience reality as we do, since we can't have a conversation with them to arrive at agreement. But we humans can have this conversation and possibly agree that we can 'know' the world through sensation. We can even know what is 'I' through sensation. In fact, there is no other way to know 'I' except through sensation. Thought is symbolic representation, not direct sensing, so even though the thoughtful modality of regarding the world may be a far more reliable modality than sensation in predicting what might happen next, its very capacity for such accurate prediction is its biggest weakness, which is its capacity for error.
  • Sensation cannot be 'wrong' unless it is used to predict outcomes. Thought can be wrong for both predicting outcomes and for 'knowing' reality. Sensation alone can 'know' reality even though it is relatively unreliable, useless even, for making predictions.
  • If we prioritize our interests by placing predictability over pure knowing through sensation, then of course we will not value the 'knowledge' to be gained through sensation. But if we can switch the priorities - out of sheer curiosity perhaps - then we can enter a realm of knowledge through sensation that is unbelievably spectacular. Our bodies are 'made of' reality, and by methodically exercising our nascent capacity for self sensing, we can connect our knowing 'I' to reality directly. We will not be able to 'know' what it is that we are experiencing in the way we might wish, which is to be able to predict what will happen next or to represent to ourselves symbolically what we might experience when we turn our attention to that sensation. But we can arrive at a depth and breadth of 'knowing' that is utterly unprecedented in our lives by operating that modality.
  • One of the impressions that comes from a sustained practice of self sensing is a clearer feeling for what "I" is and why we have a word for that self referential phenomenon, seemingly located somewhere behind our eyes and between our ears. The thing we call "I" or "me" depending on the context, turns out to be a moving point, a convergence vector for a variety of images, feelings and sensations. It is a reference point into which certain impressions flow and out of which certain impulses to act diverge and which may or may not animate certain muscle groups into action. Following this tricky exercise in attention and sensation, we can quickly see for ourselves that attention is more like a focused beam and awareness is more like a diffuse cloud, but both are composed of energy, and like all energy they vibrate, they oscillate with a certain frequency. That's it for now.
  • I loved the writer's efforts to find a fixed definition of “Incommensurability;” there was of course never a concrete meaning behind the word. Smoke and mirrors.
Weiye Loh

Roger Pielke Jr.'s Blog: Mike Daisey and Higher Truths - 0 views

  • Real life is messy. And as a general rule, the more theatrical the story you hear, and the more it divides the world into goodies vs baddies, the less reliable that story is going to be.
  • some people do feel that certain issues are so important that there should be cause in political debates to overlook lies or misrepresentations in service of a "larger truth" (Yellow cake, anyone?). I have seen this attitude for years in the climate change debate (hey look, just today), and often condoned by scientists and journalists alike.
  • the "global warming: yes or no?" debate has become an obstacle to effective policy action related to climate. Several of these colleagues suggested that I should downplay the policy implications of my work showing that for a range of phenomena and places, future climate impacts depend much more on growing human vulnerability to climate than on projected changes in climate itself (under the assumptions of the Intergovernmental Panel on Climate Change). One colleague wrote, "I think we have a professional (or moral?) obligation to be very careful what we say and how we say it when the stakes are so high." In effect, some of these colleagues were intimating that ends justify means or, in other words, doing the "right thing" for the wrong reasons is OK.
  • ...3 more annotations...
  • When science is used (and misused) in political advocacy, there are frequent opportunities for such situations to arise.
  • I don't think you're being fair to Mike Lemonick. In the article by him that you cite, Mike's provocative question was framed in the context of an analogy he was making to the risks of smoking. For example, in that article, he also says: "So should the overall message be that nobody knows anything? I don’t think so. We would never want to pretend the uncertainty isn’t there, since that would be dishonest. But featuring it prominently is dishonest, too, just as trumpeting uncertainty in the smoking-cancer connection would have been." Thus, I think you're reading way too much into Mike's piece. That said, I do agree with you that there are implications of the Daisey case for climate communicators and climate journalism. My own related post is here: http://www.collide-a-scape.com/2012/03/19/the-seduction-of-narrative/
  • I don't want journalists shading the truth in a desire to be "effective" in some way. That is Daisey's tradeoff too.
  •  
    Recall that in the aftermath of initial revelations about Peter Gleick's phishing of the Heartland Institute, we heard defenses of his action that ranged from claims that he was only doing the same thing that journalists do, to the importance of looking beyond Gleick's misdeeds at the "larger truth." Consider also what was described in the UEA emails as "pressure to present a nice tidy story" related to climate science, as well as the IPCC's outright falsification related to disasters and climate change. Such shenanigans are so endemic in the climate change debate that when a journalist openly asks whether the media should tell the whole truth about climate change, no one even bats an eye.
test and tagging

Excellent Test and Tagging in Adelaide - 1 views

I have been looking for a reliable electrical safety specialist to check on my electrical equipment which we have been using in my restaurant in Adelaide. After a week of searching, I finally found...

test and tagging

started by test and tagging on 24 Nov 11 no follow-up yet
Weiye Loh

Rationally Speaking: Truth from fiction: truth or fiction? - 0 views

  • Literature teaches us about life. Literature helps us understand the world.
  • this belief in truth-from-fiction is the party line for those who champion the merits of literature. Eminent English professor and critic Harold Bloom proclaims, in his bestselling How to Read and Why, that one of the main reasons to read literature is because "we require knowledge, not just of self and others, but of the way things are."
  • why would we expect literature to be a reliable source of knowledge about "the way things are"? After all, the narratives which are the most gripping and satisfying to read are not the most representative of how the world actually works. They have dramatic resolutions, foreshadowing, conflict, climax, and surprise. People tend to get their comeuppance after they misbehave. People who pursue their dream passionately tend to succeed. Disaster tends to strike when you least expect it. These narratives are over-represented in literature because they're more gratifying to read; why would we expect to learn from them about "the way things are"?
  • ...2 more annotations...
  • even if authors were all trying to faithfully represent the world as they perceived it, why would we expect their perceptions to be any more universally true than anyone else's?
  • I can't see any reason to give any more weight to the implicit arguments of a novel than we would give to the explicit arguments of any individual person. And yet when we read a novel or study it in school, especially if it's a hallowed classic, we tend to treat its arguments as truths.
  •  
    FRIDAY, JUNE 18, 2010 Truth from fiction: truth or fiction?
Weiye Loh

Bad Health Habits Blamed on Genetics - Newsweek - 0 views

  • A new study shows just how alluring “My DNA did it!” is to some people.
  • There are serious scientific concerns about the reliability and value of many of the genes linked to disease. And now we have another reason why the hype is worrisome: people who engage in the riskiest-for-health behaviors, and who therefore most need to change, are more likely to blame their genes for their diseases, finds a new study published online in the journal Annals of Behavioral Medicine.
  • Worse, the more behavioral risk factors people have—smoking and eating a high-fat diet and not exercising, for instance—the less likely they are to be interested in information about living healthier.
  • ...1 more annotation...
  • The unhealthier people’s habits were, the more they latched on to genetic explanations for diseases
  •  
    My Alleles Made Me Do It: The Folly of Blaming Bad Behavior on Wonky DNA
Low Yunying

China's Green Dam Internet Filter - 6 views

Article: http://edition.cnn.com/2009/TECH/06/30/china.green.dam/index.html Summary: China has passed a mandate requiring all personal computers sold in the country to be accompanied by a contro...

China pornography filter

started by Low Yunying on 02 Sep 09 no follow-up yet