
UTS-AEI: Group items tagged "measurement"


Simon Knight

Why we're moving beyond GDP as a measure of human progress | UTS News Room - 0 views

  •  
    How we track our economy influences everything from government spending and taxes to home lending and business investment. The Conversation series The Way We Measure takes a close look at economic indicators to better understand what's going on. Ever since 1944, Gross Domestic Product (GDP) has been a primary measure of economic growth. It's in the news regularly and, even though few can define what it means, there is general acceptance that when GDP is growing, things are good. There are problems with this simplistic formulation.
Simon Knight

The Tangled Story Behind Trump's False Claims Of Voter Fraud | FiveThirtyEight - 0 views

  •  
    Say you have a 3,000-person presidential election survey from a state where 3 percent of the population is black. If your survey is exactly representative of reality, you'd end up with 90 black people out of that 3,000. Then you ask them who they plan to vote for (for our purposes, we're assuming they're all voting). History suggests the vast majority will go with the Democrat. Over the last five presidential elections, Republicans have earned an average of only 7 percent of the black vote nationwide. However, your survey comes back with 19.5 percent of black voters leaning Republican. Now, that's the sort of unexpected result that's likely to draw the attention of a social scientist (or a curious journalist). But it should also make them suspicious. That's because when you're focusing on a tiny population like the black voters of a state with few black citizens, even a measurement error rate of 1 percent can produce an outcome that's wildly different from reality. That error could come from white voters who clicked the wrong box and misidentified their race. It could come from black voters who meant to say they were voting Democratic. In any event, the combination of an imbalanced sample ratio and measurement error can be deadly to attempts at deriving meaning from numbers - a grand piano dangling from a rope above a crenulated, four-tiered wedding cake. Just a handful of miscategorized people and - crash! - your beautiful, fascinating insight collapses into a messy disaster.
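The arithmetic the article describes can be sketched in a few lines. The 3,000-person sample, 3% black population share and 7% black Republican vote come from the piece; the 55% white Republican share and the 1% misclassification rate are illustrative assumptions, not figures from the article:

```python
# How a tiny misclassification rate distorts a small subgroup.
n = 3000
true_black = int(n * 0.03)           # 90 genuinely black respondents
whites = n - true_black              # 2910 white respondents

black_gop_rate = 0.07                # historical black Republican share (from article)
white_gop_rate = 0.55                # assumed white Republican share (illustrative)

misclass_rate = 0.01                 # 1% of whites tick the wrong race box (assumed)
misclassified = whites * misclass_rate        # ~29 whites recorded as black

observed_black = true_black + misclassified   # ~119 "black" respondents
observed_gop = true_black * black_gop_rate + misclassified * white_gop_rate

share = observed_gop / observed_black
print(f"Apparent black Republican share: {share:.1%}")
```

With these assumptions the apparent share comes out near 19 percent — far from the true 7 percent, and produced by only about 29 miscategorized people.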
Simon Knight

Cash in hand: how big is Australia's black economy? | Australia news | The Guardian - 0 views

  •  
    How do we measure illegal activity? How do we estimate the size of the 'black economy'? Some nice visualisations in this report. The Australian government has announced a taskforce to "crack down on the black economy", with a panel reportedly to consider measures such as removing the $100 note from circulation and capping cash transactions above a certain threshold. One estimate of the underground economy from 1999, which only considered cash transactions and excluded illegal activities, put its size at around 15% of gross domestic product. However, a more recent estimate by the Australian Bureau of Statistics (ABS) in 2013, which encompassed proceeds from illegal activities as well as other areas, put it far smaller, at only 2.1% of GDP.
Simon Knight

When the numbers aren't enough: how different data work together in research - 0 views

  •  
    As an epidemiologist, I am interested in disease - and more specifically, who in a population currently has or might get that disease. What is their age, sex, or socioeconomic status? Where do they live? What can people do to limit their chances of getting sick? Questions exploring whether something is likely to happen or not can be answered with quantitative research. By counting and measuring, we quantify a phenomenon in our world and present the results through percentages and averages. We use statistics to help interpret the significance of the results. While this approach is very important, it can't tell us everything about a disease and people's experiences of it. That's where qualitative data becomes important.
Simon Knight

Fitness trackers' calorie measurements are prone to error - Health News - NHS Choices - 0 views

  •  
    "Fitness trackers out of step when measuring calories, research shows," The Guardian reports. An independent analysis of a number of leading brands found they were all prone to inaccurate recording of energy expenditure.
Simon Knight

Study: What Instagram Can Teach Us About Food Deserts - The Atlantic - 0 views

  •  
    Cool study showing the potential of data analyses to give us new kinds of insights! What does Instagram tell us about people's nutrition decisions? Food deserts, or places where people have limited access to fresh food, are usually measured by the distance people have to travel to get to a large grocery store. What's harder to measure is what the residents of these areas are actually eating day to day. To do so, researchers typically have to rely on surveys. ... In a recent study, De Choudhury and her colleagues propose another method: mining Instagram. All those artfully arranged plates, all that latte art, just waiting for someone to analyze it!
Simon Knight

Global Health - Our World in Data - 0 views

  •  
    Our World in Data is a great website discussing lots of different datasets about global issues. This example data-story discusses the issue of global health, giving an overview (and lots of great visualisations) and discussing how we actually measure 'health' (life expectancy, quality-of-life measures, etc.).
Simon Knight

What happens when misinformation is corrected? Understanding the labeling of content - 0 views

  •  
    What happens once misinformation is corrected? Is it effective at all? A major problem for social media platforms is the difficulty of reducing the spread of misinformation. In response, measures such as the labeling of false content and related articles have been created to correct users' perceptions and accuracy assessments. Although this may seem a clever initiative from social media platforms, helping users understand which information can be trusted, such restrictive measures also raise pivotal questions. What happens to those posts which are false but do not display any tag flagging their untruthfulness? Will we be able to discern them?
Simon Knight

Public attitudes to inequality | From Poverty to Power - 0 views

  •  
    When it comes to inequality, a growing body of evidence shows that people across countries underestimate the size of the gap between the rich and poor, including their wages. This can undermine support for policies to tackle inequality and even lead to apathy that consolidates the gap. But how exactly are existing perceptions of inequality measured by social scientists?
Simon Knight

Cluster of UK companies reports highly improbable gender pay gap - ProQuest Central - P... - 0 views

  •  
    Excellent analysis from the FT (you'll need to log in to view via the link) that uses knowledge of the mean and median to show that some companies have reported incorrect (fabricated?) pay-gap information! One in 20 UK companies that have submitted gender pay gap data to the government have reported numbers that are statistically improbable and therefore almost certainly inaccurate, a Financial Times analysis has found. Sixteen companies, each with more than 250 employees, reported that they paid their male and female staff exactly the same; that is, they had a zero average gender pay gap measured by both the mean and median. Experts on pay said that it was highly anomalous for companies of that size to have median and mean pay gaps that were identical, because the two statistics measure different things. The mean gap measures the difference between the average male and female salary, while the median gap is calculated using the midpoint salary for each gender.
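A toy payroll (salaries invented for illustration) shows why identical mean and median gaps are anomalous: a single outlier salary moves the mean but leaves the median untouched:

```python
from statistics import mean, median

# Hypothetical salaries: identical except for one highly paid male executive.
male = [50_000, 52_000, 54_000, 56_000, 250_000]
female = [50_000, 52_000, 54_000, 56_000, 58_000]

mean_gap = (mean(male) - mean(female)) / mean(male)
median_gap = (median(male) - median(female)) / median(male)

print(f"mean gap:   {mean_gap:.1%}")    # driven up by the one outlier
print(f"median gap: {median_gap:.1%}")  # midpoint salaries are identical
```

Here the median gap is exactly zero while the mean gap exceeds 40%, so a real payroll producing a zero for both statistics at once is a red flag.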
Simon Knight

Average measures of effects can be misleading - Students 4 Best Evidence - 0 views

  •  
    Uses the example of health treatments to illustrate some of the problems with relying on the average.
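A minimal sketch of the problem, with invented numbers: a treatment that helps one subgroup and harms another by the same amount shows an average effect of zero:

```python
from statistics import mean

# Hypothetical change in symptom score for two patient subgroups
responders = [8, 10, 12, 9, 11]        # clearly better on treatment
non_responders = [-9, -11, -8, -12, -10]  # clearly worse on treatment

everyone = responders + non_responders
avg_effect = mean(everyone)
print(avg_effect)  # the average hides both real effects
```

Reporting only the pooled average would suggest the treatment does nothing, when in fact it strongly affects everyone.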
Simon Knight

The margin of error: 7 tips for journalists writing about polls and surveys - 0 views

  •  
    Journalists often make mistakes when reporting on data such as opinion poll results, federal jobs reports and census surveys because they don't quite understand - or they ignore - the data's margin of error. Data collected from a sample of the population will never perfectly represent the population as a whole. The margin of error, which depends primarily on sample size, is a measure of how precise the estimate is. The margin of error for an opinion poll indicates how close the match is likely to be between the responses of the people in the poll and those of the population as a whole. To help journalists understand margin of error and how to correctly interpret data from polls and surveys, we've put together a list of seven tips:
    1. Look for the margin of error - and report it. It tells you and your audience how much the results can vary.
    2. Remember that the larger the margin of error, the greater the likelihood the survey estimate will be inaccurate.
    3. Make sure a political candidate really has the lead before you report it.
    4. Note that there are real trends, and then there are mistaken claims of a trend.
    5. Watch your adjectives. (And it might be best to avoid them altogether.)
    6. Keep in mind that the margin of error for subgroups of a sample will always be larger than the margin of error for the sample.
    7. Use caution when comparing results from different polls and surveys, especially those conducted by different organizations.
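For a simple random sample, the familiar 95% margin of error for a proportion is 1.96·√(p(1−p)/n), which also shows why subgroup margins are always larger. The sample sizes below are illustrative:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion from a simple random sample."""
    return z * math.sqrt(p * (1 - p) / n)

# Full sample of 1,000 respondents, candidate polling at 50%
print(f"full sample: +/-{margin_of_error(0.5, 1000):.1%}")
# Subgroup of only 100 respondents: margin of error roughly triples
print(f"subgroup:    +/-{margin_of_error(0.5, 100):.1%}")
```

Because n sits under the square root, cutting the sample to a tenth of its size makes the margin about three times wider, which is why subgroup results are so much noisier than topline numbers.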
Simon Knight

A Million Children Didn't Show Up In The 2010 Census. How Many Will Be Missing In 2020?... - 0 views

  •  
    Since the census is the ultimate measure of population in the U.S., one might wonder how we could even know if its count was off. In other words, who recounts the count? Well, the Census Bureau itself, but using a different data source. After each modern census, the bureau carries out research to gauge the accuracy of the most recent count and to improve the survey for the next time around. The best method for determining the scope of the undercount is refreshingly simple: the bureau takes the total number of recorded births for people of each birth year, subtracts recorded deaths, then adds in an estimate of net international migration and … that's it. With that number, the bureau can vet the census - which missed 4.6 percent of kids under 5 in 2010, according to this check.
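The bureau's demographic-analysis check amounts to simple bookkeeping. A sketch with invented figures (these are not Census Bureau numbers):

```python
# Demographic-analysis check for one birth-year cohort (invented figures)
recorded_births = 4_000_000
recorded_deaths = 50_000          # deaths within the cohort since birth
net_migration = 120_000           # estimated net international migration

expected_count = recorded_births - recorded_deaths + net_migration

census_count = 3_880_000          # what the census actually tallied (invented)
undercount = (expected_count - census_count) / expected_count
print(f"estimated undercount: {undercount:.1%}")
```

The expected count from vital records and migration becomes the benchmark; the gap between it and the census tally is the estimated undercount.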
Simon Knight

California, Coffee and Cancer: One of These Doesn't Belong - The New York Times - 0 views

  •  
    The more serious problem with California's law is one of effect size. Health, and cancer, aren't binary. Consumers can't just be concerned with whether a danger exists; they also need to be concerned about the magnitude of that risk. Even if there's a statistically significant risk between huge quantities of coffee and some cancer (and that's not proven), it's very, very small. Cigarettes have a clear and easily measured negative impact on people's health. Acrylamide, especially the acrylamide in coffee, isn't even close. Warning labels should be applied when a danger is clear, a danger is large and a danger is avoidable. It's not clear that, with respect to acrylamide, any of these criteria are met. It's certainly not the case regarding coffee. Whatever the intentions of Proposition 65, this latest development could do more harm than good.
Simon Knight

WHO global air quality figures reveal 7m die from pollution each year - 0 views

  •  
    Nine in ten people around the world breathe air containing high levels of pollution, according to the latest data from the World Health Organization (WHO). The agency estimates that pollution causes 7 million deaths each year. The latest WHO figures measure the amount of pollutants in the air in more than 4,300 cities, towns and other settlements in 108 countries around the world. More cities than ever are now monitoring their air quality.
Simon Knight

11 questions journalists should ask about public opinion polls - 0 views

  •  
    Journalists often write about public opinion polls, which are designed to measure the public's attitudes about an issue or idea. Some of the most high-profile polls center on elections and politics. Newsrooms tend to follow these polls closely to see which candidates are ahead, who's most likely to win and what issues voters feel most strongly about. Other polls also offer insights into how people think. For example, a government agency might commission a poll to get a sense of whether local voters would support a sales tax increase to help fund school construction. Researchers frequently conduct national polls to better understand how Americans feel about public policy topics such as gun control, immigration reform and decriminalizing drug use. When covering polls, it's important for journalists to try to gauge the quality of a poll and make sure claims made about the results actually match the data collected. Sometimes, pollsters overgeneralize or exaggerate their findings. Sometimes, flaws in the way they choose participants or collect data make it tough to tell what the results really mean. Below are 11 questions we suggest journalists ask before reporting on poll results. While most of this information probably won't make it into a story or broadcast, the answers will help journalists decide how to frame a poll's findings - or whether to cover them at all.
Simon Knight

Should newspapers be adding confidence intervals to their graphics? - Storybench - 1 views

  •  
    Should newspapers be adding confidence intervals to their graphics? Why, she asked, are newspapers like hers hesitant to print confidence intervals, a statistical measure of uncertainty? With the exception of noting sampling error in polling data, newspapers like the Times only show uncertainty when they're forced to - and often to prove the opposite of what point data might show.
Simon Knight

Is inequality going up or down? | From Poverty to Power - 0 views

  •  
    Duncan Green (an advisor for Oxfam) writes a great blog on the use of evidence in international development and aid. This one (a guest post) is really interesting on how we measure inequality... "You would think a question like 'Is inequality going up or down?' would be relatively easy to answer, but sadly it is not. At Oxfam we have identified the growing gap between rich and poor and the impact of high inequality as a serious crisis. But how serious is it really?"
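Part of why the question is hard is that 'inequality' can be measured in several ways. The most common single number is the Gini coefficient; a minimal sketch with invented incomes:

```python
def gini(incomes):
    """Gini coefficient via mean absolute difference (0 = perfect equality,
    values near 1 = one person holds nearly everything)."""
    n = len(incomes)
    total = sum(incomes)
    # Sum of absolute differences over all ordered pairs
    diff_sum = sum(abs(x - y) for x in incomes for y in incomes)
    return diff_sum / (2 * n * total)

print(gini([1, 1, 1, 1]))      # perfect equality
print(gini([0, 0, 0, 100]))    # one person holds all the income
```

Different measures (Gini, top-income shares, ratios like the Palma) can move in different directions over the same period, which is one reason "is inequality going up or down?" has no single answer.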
Simon Knight

What's most likely to kill you? Measuring how deadly our daily activities are - 1 views

  •  
    Interesting discussion of how we perceive risk, and the risks of everyday activities! So let's answer the first question: how likely is a fatal shark attack for an Australian? To get a crude estimate, averaged across the whole population, you would divide the number of people who die in shark attacks each year (on average three to four, based on recent data) by the population of Australia (approximately 24 million). This yields a risk of approximately one in eight million per year, which is thankfully very low. Does this assuage your fear? If not, the reason is probably that the imagery of a shark attack is so terrifying. Any unusual and dramatic event has a huge impact on our psyche, and this distorts our perception. Also, it's not that easy for us to interpret what a risk expressed as a relative frequency truly means.
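The crude estimate described above, using the lower end of the article's three-to-four-deaths range:

```python
# Crude annual risk of a fatal shark attack, using the article's figures
deaths_per_year = 3            # lower end of the 3-4 average
population = 24_000_000        # approximate Australian population

odds = population // deaths_per_year
print(f"about 1 in {odds:,} per year")
```

Using 4 deaths instead of 3 gives roughly 1 in 6 million; either way the annual risk is tiny compared with everyday hazards like road travel.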
Simon Knight

The biggest stats lesson of 2016 - Sense About Science USA - 0 views

  •  
    Data aren't dead, contrary to what some pundits claimed post-election; rather, the limitations of data are not always well reported. While pollsters will be reworking their models following the election, what can journalists do to improve their overall coverage of statistical issues in the future? First, discuss possible statistical biases, such as errors in sampling and polling, and what impact these might have on the results. Second, always provide measures of uncertainty, and root these uncertainties in real-world examples.