UTS-AEI / Group items tagged "poll"


Simon Knight

11 questions journalists should ask about public opinion polls

  • Journalists often write about public opinion polls, which are designed to measure the public's attitudes about an issue or idea. Some of the most high-profile polls center on elections and politics. Newsrooms tend to follow these polls closely to see which candidates are ahead, who's most likely to win and what issues voters feel most strongly about. Other polls also offer insights into how people think. For example, a government agency might commission a poll to get a sense of whether local voters would support a sales tax increase to help fund school construction. Researchers frequently conduct national polls to better understand how Americans feel about public policy topics such as gun control, immigration reform and decriminalizing drug use. When covering polls, it's important for journalists to gauge the quality of a poll and make sure claims made about the results actually match the data collected. Sometimes pollsters overgeneralize or exaggerate their findings; sometimes flaws in the way they choose participants or collect data make it hard to tell what the results really mean. Below are 11 questions we suggest journalists ask before reporting on poll results. While most of this information probably won't make it into a story or broadcast, the answers will help journalists decide how to frame a poll's findings - or whether to cover them at all.

Why polls seem to struggle to get it right - on elections and everything else | News & ...

  • The public understandably focuses on polling results and how much these results seem to vary. Take two presidential approval polls from March 21. Polling firm Rasmussen Reports reported that 50 percent of Americans approve of President Donald Trump's performance, while, that same day, Gallup stated that only 37 percent do. In late February, the website FiveThirtyEight listed 18 other presidential approval polls in which Trump's approval ratings ranged from 39 percent to 55 percent. Some of these pollsters queried likely voters, some registered voters and others adults, regardless of their voting status. Almost half of the polls relied on phone calls, another half on online polling and a few used a mix of the two. Further complicating matters, it's not entirely clear how calling cellphones or landlines affects a poll's results. Each of these choices has a consequence, and the range of results attests to the degree that these choices can influence results.
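Even before these methodological choices enter, pure sampling error produces a visible spread across polls. A toy simulation (ours, not from the article), assuming a fixed "true" approval of 45% and 18 independent polls of 1,500 respondents each:

```python
import random

random.seed(42)  # reproducible illustration

TRUE_APPROVAL = 0.45   # hypothetical true approval rate
POLL_SIZE = 1500       # a typical national sample
NUM_POLLS = 18

results = []
for _ in range(NUM_POLLS):
    # Each respondent approves with probability TRUE_APPROVAL
    approvals = sum(random.random() < TRUE_APPROVAL for _ in range(POLL_SIZE))
    results.append(approvals / POLL_SIZE)

print(f"lowest poll:  {min(results):.1%}")
print(f"highest poll: {max(results):.1%}")
```

Sampling error alone typically spreads these identical hypothetical polls across a few percentage points; the 16-point range in the real polls above reflects the additional effect of who was sampled and how.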

The margin of error: 7 tips for journalists writing about polls and surveys

  • Journalists often make mistakes when reporting on data such as opinion poll results, federal jobs reports and census surveys because they don't quite understand - or they ignore - the data's margin of error. Data collected from a sample of the population will never perfectly represent the population as a whole. The margin of error, which depends primarily on sample size, is a measure of how precise the estimate is. The margin of error for an opinion poll indicates how close the match is likely to be between the responses of the people in the poll and those of the population as a whole. To help journalists understand margin of error and how to correctly interpret data from polls and surveys, we've put together a list of seven tips:
    1. Look for the margin of error - and report it. It tells you and your audience how much the results can vary.
    2. Remember that the larger the margin of error, the greater the likelihood the survey estimate will be inaccurate.
    3. Make sure a political candidate really has the lead before you report it.
    4. Note that there are real trends, and then there are mistaken claims of a trend.
    5. Watch your adjectives. (And it might be best to avoid them altogether.)
    6. Keep in mind that the margin of error for subgroups of a sample will always be larger than the margin of error for the full sample.
    7. Use caution when comparing results from different polls and surveys, especially those conducted by different organizations.
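The arithmetic behind these tips - a margin of error that shrinks with sample size, and subgroups always carrying a larger margin than the full sample - can be sketched in a few lines. A minimal illustration (ours) of the standard formula for a proportion at a 95% confidence level, assuming simple random sampling:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a proportion p estimated from n people."""
    return z * math.sqrt(p * (1 - p) / n)

# Full sample: 1,000 respondents, result at 50%
full = margin_of_error(0.5, 1000)   # roughly +/-3.1 points

# Subgroup: only 100 of those respondents - always a larger margin
sub = margin_of_error(0.5, 100)     # roughly +/-9.8 points

print(f"full sample: +/-{full:.1%}")
print(f"subgroup:    +/-{sub:.1%}")
```

On these assumptions, a 52-48 "lead" in a 1,000-person poll sits inside the roughly ±3-point margin, which is why the tips warn against declaring a front-runner from a gap that small.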

National poll vs sample survey: how to know what we really think on marriage equality

  • The plan to use the Australian Bureau of Statistics to conduct the federal government's postal plebiscite on marriage reform raises an interesting question: wouldn't it be easier, and just as accurate, to ask the ABS to poll a representative sample of the Australian population rather than everyone?
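Sampling theory suggests the answer is yes: once a sample reaches a few thousand people, the size of the full population barely affects precision. A sketch (ours) using the standard formula with a finite population correction and an illustrative electorate of 16 million:

```python
import math

def moe(p, n, N=None, z=1.96):
    """95% margin of error for proportion p from n respondents.

    If a population size N is given, apply the finite population correction.
    """
    se = math.sqrt(p * (1 - p) / n)
    if N is not None:
        se *= math.sqrt((N - n) / (N - 1))
    return z * se

n, N = 2000, 16_000_000
print(f"infinite population: +/-{moe(0.5, n):.2%}")
print(f"with correction:     +/-{moe(0.5, n, N):.2%}")
# Both come out near +/-2.2 points: sampling 2,000 of 16 million
# is essentially as precise as sampling 2,000 of an endless population.
```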

Who caused the Bay Area's housing shortage?

  • Everyone has a theory about who's to blame for the housing shortage that's driving up prices and chasing Bay Area families out of the region. A new poll offers surprising insights into where most of us point the finger: not at the government officials who control what homes are built where, but at the tech companies that have flooded this region with jobs and the real estate developers trying to maximize profits.

ABC Q&A on Twitter: "How do you avoid conducting research to only prove that you are ri...

  • Mona Chalabi on the perils of polling data and the importance of official statistics.

The biggest stats lesson of 2016 - Sense About Science USA

  • Data aren't dead, contrary to what some pundits stated post-election; rather, the limitations of data are not always well reported. While pollsters will be reworking their models following the election, what can media journalists do to improve their overall coverage of statistical issues in the future? First, discuss possible statistical biases, such as errors in sampling and polling, and what impact these might have on the results. Second, always provide measures of uncertainty, and root these uncertainties in real-world examples.

Data journalism on radio, audio and podcasts - Online Journalism Blog

  • Examples of data journalism in audio/podcast form, including: Right To Remain Silent is one particularly good example, because it's about bad data: specifically, police who manipulated official statistics. You might also listen to Choosing Wrong, which includes a section about polling. Another favourite of mine is an audio story by The Economist about the prostitution industry, based on data scraped from sex trade websites: More bang for your buck (there are even worse puns in the charts). David Rhodes, a BBC data journalist, has a range of stories on his Audioboom account, including pieces on Radio 4, Radio 5 Live, and a piece discussing "Did Greece really not pay 89.5% of their taxes in 2010" from the excellent fact-checking radio programme More or Less.

Should newspapers be adding confidence intervals to their graphics? - Storybench

  • Why, she asked, are newspapers like hers hesitant to print confidence intervals, a statistical measure of uncertainty? With the exception of noting sampling error in polling data, newspapers like the Times only show uncertainty when they're forced to - and often to prove the opposite of what point data might show.

The Tangled Story Behind Trump's False Claims Of Voter Fraud | FiveThirtyEight

  • Say you have a 3,000-person presidential election survey from a state where 3 percent of the population is black. If your survey is exactly representative of reality, you'd end up with 90 black people out of that 3,000. Then you ask them who they plan to vote for (for our purposes, we're assuming they're all voting). History suggests the vast majority will go with the Democrat. Over the last five presidential elections, Republicans have earned an average of only 7 percent of the black vote nationwide. However, your survey comes back with 19.5 percent of black voters leaning Republican. Now, that's the sort of unexpected result that's likely to draw the attention of a social scientist (or a curious journalist). But it should also make them suspicious. That's because when you're focusing on a tiny population like the black voters of a state with few black citizens, even a measurement error rate of 1 percent can produce an outcome that's wildly different from reality. That error could come from white voters who clicked the wrong box and misidentified their race. It could come from black voters who meant to say they were voting Democratic. In any event, the combination of an imbalanced sample ratio and measurement error can be deadly to attempts at deriving meaning from numbers - a grand piano dangling from a rope above a crenulated, four-tiered wedding cake. Just a handful of miscategorized people and - crash! - your beautiful, fascinating insight collapses into a messy disaster.
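The excerpt's arithmetic can be reproduced in a few lines, using its own figures (3,000 respondents, 3% black, a true 7% black-Republican share, a 1% misclassification rate); the assumption that misclassified white respondents split about evenly between the parties is ours:

```python
n = 3000
black = 0.03 * n                    # 90 genuinely black respondents
misclassified = 0.01 * (n - black)  # ~29 white respondents coded as black
white_rep_share = 0.5               # assumed split among the misclassified

rep = 0.07 * black + white_rep_share * misclassified
observed_share = rep / (black + misclassified)
print(f"apparent black-Republican share: {observed_share:.1%}")
```

On these assumptions a true 7% share shows up near 17-18%, in the neighborhood of the hypothetical survey's 19.5%: the handful of miscategorized respondents swamps the small subgroup.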

The Media Has A Probability Problem | FiveThirtyEight

  • The media's demand for certainty - and its lack of statistical rigor - is a bad match for our complex world.