Home / Education Links / Group items tagged "analysis"

Jeff Bernstein

Florida DOE: Student Achievement in Florida's Charter Schools

  •  
    Section 1002.33(23), Florida Statutes, requires the Florida Department of Education to prepare an annual statewide analysis of student achievement in charter schools versus the achievement of comparable students in traditional public schools. This report of charter school student performance fulfills the statutory requirement for the 2010-11 school year. The analysis examines the average performance of charter school students and traditional public school students using eight years of Florida Comprehensive Assessment Test (FCAT) reading and math scores, as well as the FCAT science test scores that were added to the school grading calculation in 2007-08. Only students who were enrolled in a charter school or a traditional public school for an entire school year are included in the analysis. Limiting the analysis to include only full-year students is consistent with the state's school accountability system for awarding school grades under the A+ Plan. In addition, the report compares charter and traditional public schools in terms of achievement gaps and student learning gains.
Jeff Bernstein

Update of "Failed Promises: Assessing Charter Schools in the Twin Cities"

  •  
    The Institute on Race and Poverty's 2008 analysis of charter schools in the Twin Cities metro found that charter schools had failed to deliver on the promises made by their proponents. The study showed that charter schools were far more segregated than traditional public schools in the metro, even in school districts where traditional public schools already had high levels of racial segregation. The analysis also showed that charter schools performed worse than traditional public schools. The findings made it clear that, at that time, charter schools offered a poor choice to low-income students and students of color: one between low-performing public schools and charters that fared even worse. Compared to charter schools, other public school choice programs, such as the Choice is Yours program, offered much better schools to low-income students and students of color. Finally, the report found that charter schools hurt public education in the metro by encouraging racial segregation in the traditional public school system. This work updates the 2008 study with more recent data, extending the analysis from the 2007-08 school year to 2010-11 in most cases. The results show that, despite significant changes to the state's charter law during the period, little has changed in the comparison between charters and traditional schools. Charter school students of all races are still much more likely to be attending a segregated school than traditional school students, and the trends are largely negative. Charter schools are also still outperformed by their traditional equivalents. Analysis of 2010-11 test score data that controls for other school characteristics shows that charters still lag behind traditional schools, especially the schools available to Choice is Yours participants.
Jeff Bernstein

Recent State Action on Teacher Effectiveness | Bellwether Education Partners

  •  
    "During the 2010, 2011, and 2012 legislative sessions, a combination of federal policy incentives and newly elected governors and legislative majorities in many states following the 2010 elections sparked a wave of legislation addressing teacher effectiveness. More than 20 states passed legislation designed to address educator effectiveness by mandating annual evaluations based in part on student learning and linking evaluation results to key personnel decisions, including tenure, reductions in force, dismissal of underperforming teachers, and retention. In many cases states passed multiple laws, with later laws building on previous legislation, and also promulgated regulations to implement legislation. A few states acted through regulation only. In an effort to help policymakers, educators, and the public better understand how this flurry of legislative activity shifted the landscape on teacher effectiveness issues, both nationally and at the state level, Bellwether Education Partners analyzed recent teacher effectiveness legislation, regulation, and supporting policy documents from 21 states that took major legislative or regulatory action on teacher effectiveness in the past three years. This analysis builds on a previous analysis of teacher effectiveness legislation in five states that Bellwether published in 2011. Our expanded analysis includes nearly all states that took major legislative action on teacher effectiveness over the past three years."
Jeff Bernstein

Triangulating Principal Effectiveness: How Perspectives of Parents, Teachers, and Assis...

  •  
    While the importance of effective principals is undisputed, few studies have addressed what specific skills principals need to promote school success. This study draws on unique data combining survey responses from principals, assistant principals, teachers and parents with rich administrative data to identify which principal skills matter most for school outcomes. Factor analysis of a 42-item task inventory distinguishes five skill categories, yet only one of them, the principals' organization management skills, consistently predicts student achievement growth and other success measures. Analysis of evaluations of principals by assistant principals confirms this central result. Our analysis argues for a broad view of instructional leadership that includes general organizational management skills as a key complement to the work of supporting curriculum and instruction.
Jeff Bernstein

Shanker Blog » Living In The Tails Of The Rhetorical And Teacher Quality Dist...

  •  
    "A few weeks ago, Students First NY (SFNY) released a report, in which they presented a very simple analysis of the distribution of "unsatisfactory" teacher evaluation ratings ("U-ratings") across New York City schools in the 2011-12 school year. The report finds that U-ratings are distributed unequally. In particular, they are more common in schools with higher poverty, more minorities, and lower proficiency rates. Thus, the authors conclude, the students who are most in need of help are getting the worst teachers. There is good reason to believe that schools serving larger proportions of disadvantaged students have a tougher time attracting, developing and retaining good teachers, and there is evidence of this, even based on value-added estimates, which adjust for these characteristics (also see here). However, the assumptions upon which this Students First analysis is based are better seen as empirical questions, and, perhaps more importantly, the recommendations they offer are a rather crude, narrow manifestation of market-based reform principles."
Jeff Bernstein

New York City Fair Student Funding reform? Not so fair: exclusive analysis - NY Daily News

  •  
    "New schools founded in the last three years get more money per student than schools the city began shutting down this year, a Daily News analysis finds. Under a reform, ironically called Fair Student Funding, the city distributes the bulk of school funding based on the enrollment and demographics of each school. The reform, introduced in 2007, hasn't been fully funded because of budget cuts in recent years, but all 30 new schools opening this year get their full share of the money to which they're entitled, while the struggling schools remain badly underfunded."
Jeff Bernstein

Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review o...

  •  
    "A systematic search of the research literature from 1996 through July 2008 identified more than a thousand empirical studies of online learning. Analysts screened these studies to find those that (a) contrasted an online to a face-to-face condition, (b) measured student learning outcomes, (c) used a rigorous research design, and (d) provided adequate information to calculate an effect size. As a result of this screening, 50 independent effects were identified that could be subjected to meta-analysis. The meta-analysis found that, on average, students in online learning conditions performed modestly better than those receiving face-to-face instruction. The difference between student outcomes for online and face-to-face classes, measured as the difference between treatment and control means divided by the pooled standard deviation, was larger in those studies contrasting conditions that blended elements of online and face-to-face instruction with conditions taught entirely face-to-face. Analysts noted that these blended conditions often included additional learning time and instructional elements not received by students in control conditions. This finding suggests that the positive effects associated with blended learning should not be attributed to the media, per se. An unexpected finding was the small number of rigorous published studies contrasting online and face-to-face learning conditions for K-12 students. In light of this small corpus, caution is required in generalizing to the K-12 population because the results are derived for the most part from studies in other settings (e.g., medical training, higher education)."
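The effect-size metric the abstract describes, the difference between treatment and control means divided by the pooled standard deviation, is the standardized mean difference (Cohen's d). A minimal sketch with invented scores; the numbers are illustrative, not from the report:

```python
import math

def cohens_d(treatment, control):
    """Standardized mean difference: (mean_t - mean_c) / pooled SD."""
    n_t, n_c = len(treatment), len(control)
    mean_t = sum(treatment) / n_t
    mean_c = sum(control) / n_c
    # Sample variances (n - 1 in the denominator)
    var_t = sum((x - mean_t) ** 2 for x in treatment) / (n_t - 1)
    var_c = sum((x - mean_c) ** 2 for x in control) / (n_c - 1)
    pooled_sd = math.sqrt(((n_t - 1) * var_t + (n_c - 1) * var_c) / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

# Hypothetical scores for an online (treatment) and a face-to-face (control) group
online = [78, 82, 85, 74, 90, 88]
face_to_face = [75, 80, 79, 70, 84, 83]
print(round(cohens_d(online, face_to_face), 2))
```

A positive d means the online group outperformed the control group; the meta-analysis pooled 50 such effects across studies.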
Jeff Bernstein

The Cost of Stupid: Families for Excellent Schools Totally Bogus Analysis of NYC School...

  •  
    "Families for Excellent Schools of New York (the Don't Steal Possible folks) has just released an impossibly stupid analysis in which they claim that New York City is simply throwing money at failure, spending double on failing schools what it spends on totally awesome ones (if it really has any awesome ones)."
Jeff Bernstein

Shanker Blog » The Allure Of Teacher Quality

  •  
    Fueled by the ever-increasing availability of detailed test score datasets linking teachers to students, the research literature on teachers' test-based effectiveness has grown rapidly, in both size and sophistication. Analysis after analysis finds that, all else being equal, the variation in teachers' estimated effects on students' test growth - the difference between the "top" and "bottom" teachers - is very large. In any given year, some teachers' students make huge progress, others' very little. Even if part of this estimated variation is attributable to confounding factors, the discrepancies are still larger than almost any other measurable "input" within the jurisdiction of education policy. The underlying assumption here is that "true" teacher quality varies to a degree that is at least somewhat comparable in magnitude to the spread of the test-based estimates. Perhaps that's the case, but it does not, by itself, help much. The key question is whether and how we can measure teacher performance at the individual level and, more importantly, influence the distribution - that is, raise the ceiling, the middle, and/or the floor. The variation hangs out there like a drug to which we're addicted, but haven't really figured out how to administer.
Jeff Bernstein

What You See May Not Be What You Get: A Brief, Nontechnical Introduction to Overfitting...

  •  
    Statistical models, such as linear or logistic regression or survival analysis, are frequently used as a means to answer scientific questions in psychosomatic research. Many who use these techniques, however, apparently fail to appreciate fully the problem of overfitting, i.e., capitalizing on the idiosyncrasies of the sample at hand. Overfitted models will fail to replicate in future samples, thus creating considerable uncertainty about the scientific merit of the finding. The present article is a nontechnical discussion of the concept of overfitting and is intended to be accessible to readers with varying levels of statistical expertise. The notion of overfitting is presented in terms of asking too much from the available data. Given a certain number of observations in a data set, there is an upper limit to the complexity of the model that can be derived with any acceptable degree of uncertainty. Complexity arises as a function of the number of degrees of freedom expended (the number of predictors, including complex terms such as interactions and nonlinear terms) against the same data set during any stage of the data analysis. Theoretical and empirical evidence, with a special focus on the results of computer simulation studies, is presented to demonstrate the practical consequences of overfitting with respect to scientific inference. Three common practices (automated variable selection, pretesting of candidate predictors, and dichotomization of continuous variables) are shown to pose a considerable risk of spurious findings in models. The dilemma between overfitting and exploring candidate confounders is also discussed. Alternative means of guarding against overfitting are discussed, including variable aggregation and the fixing of coefficients a priori. Techniques that account for and correct complexity, including shrinkage and penalization, are also introduced.
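The overfitting problem the abstract describes can be demonstrated in a few lines: fit a model with many degrees of freedom to a small sample of pure noise, and the in-sample fit looks respectable while performance on a fresh sample collapses. A sketch using NumPy; the data are simulated and the polynomial degree is an arbitrary illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Pure noise: by construction there is no true relationship between x and y
x = rng.uniform(0, 1, 15)
y = rng.normal(0, 1, 15)

def r_squared(y_true, y_pred):
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1 - ss_res / ss_tot

# Spend many degrees of freedom: a 6th-degree polynomial on only 15 points
coeffs = np.polyfit(x, y, deg=6)
in_sample = r_squared(y, np.polyval(coeffs, x))

# The same fitted model applied to a fresh sample from the same (null) process
x_new = rng.uniform(0, 1, 15)
y_new = rng.normal(0, 1, 15)
out_of_sample = r_squared(y_new, np.polyval(coeffs, x_new))

print(f"in-sample R^2: {in_sample:.2f}, out-of-sample R^2: {out_of_sample:.2f}")
```

The in-sample R-squared is inflated purely by capitalizing on the sample's idiosyncrasies; on new data the "finding" fails to replicate, which is exactly the article's point.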
Jeff Bernstein

Testing mandates flunk cost-benefit analysis - The Answer Sheet - The Washington Post

  •  
    According to Wikipedia, cost-benefit analysis "is a systematic process for calculating and comparing benefits and costs of a project, decision or government policy (hereafter, 'project'). CBA has two purposes: 1. To determine if it is a sound investment/decision (justification/feasibility); 2. To provide a basis for comparing projects. It involves comparing the total expected cost of each option against the total expected benefits, to see whether the benefits outweigh the costs, and by how much." I believe it would be prudent to apply this process to the accountability movement currently being administered in public education, primarily in the form of testing mandates such as No Child Left Behind and Race to the Top.
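The two purposes quoted above reduce to simple arithmetic: net benefit for the justification test, and a ranking of options for the comparison. A toy sketch; the option names and dollar figures are entirely hypothetical:

```python
def net_benefit(option):
    """Purpose 1: is it a sound investment? A positive net benefit says yes."""
    return option["benefit"] - option["cost"]

# Hypothetical policy options; total expected costs and benefits in $ millions
options = {
    "testing_mandates": {"cost": 120.0, "benefit": 90.0},
    "smaller_classes":  {"cost": 150.0, "benefit": 210.0},
}

# Purpose 2: rank the options against each other by net benefit
ranked = sorted(options, key=lambda name: net_benefit(options[name]), reverse=True)

for name in ranked:
    o = options[name]
    print(f"{name}: net benefit = {net_benefit(o):+.1f}M, "
          f"benefit/cost ratio = {o['benefit'] / o['cost']:.2f}")
```

The author's argument is that testing mandates have never been run through even this elementary calculation.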
Jeff Bernstein

Leaps of Logic and Sleights of Hand: The Misuse of Educational Research In Policy Debat...

  •  
    Did the New York Times sensationalize its account of an analysis of value-added measures of teacher performance it recently featured on its front page, misleading its readers about its policy implications? Have commentators such as the Times' own Nicholas Kristof and bloggers such as Ed Sector's Kevin Carey seized upon the Times' misleading narrative to confirm pre-existing policy biases, rather than do their own careful reading of what is universally acknowledged to be a rather complex study? Was Mayor Bloomberg's cynical use of the analysis and Kristof's column in his State of the City address to bash teachers and their union, citing them to justify his mass closure of PLA schools and his refusal to negotiate meaningful appeals of ineffective ratings, not the logical conclusion of this misrepresentation of educational research? An email exchange I had with one of the co-authors of the study, Raj Chetty of Harvard, provides interesting evidence that the answer to all of these questions is yes.
Jeff Bernstein

MET Project: Gathering Feedback for Teaching - Combining High-Quality Observations with...

  •  
    This report is intended for policymakers and practitioners wanting to understand the implications of the Measures of Effective Teaching (MET) project's interim analysis of classroom observations. Those wanting to explore all the technical aspects of the study and analysis also should read the companion research report, available at www.metproject.org.
Jeff Bernstein

Is School Funding Fair? National Report Card

  •  
    "Is School Funding Fair? A National Report Card" posits that fairness depends not only on a sufficient level of funding for all students, but also on the provision of additional resources to districts where there are more students with greater needs. The National Report Card rates the 50 states on the basis of four separate, but interrelated, "fairness indicators": funding level, funding distribution, state fiscal effort, and public school coverage. Using a thorough statistical analysis, the report provides the most in-depth examination to date of state education finance systems and school funding fairness across the nation. The results show that many states do not fairly allocate education funding to address the needs of their most disadvantaged students, or of the schools serving high numbers of those students.
Jeff Bernstein

Dallas Value Added Study - More Analysis | Gary Rubinstein's TFA Blog

  •  
    A few weeks ago, I wrote about how I did my own analysis of the 1997 study that is always quoted by Rhee about how three effective teachers in a row vs. three ineffective teachers in a row is life-changing. Now, as someone who considers himself an effective teacher, and someone who has been taught by both effective and ineffective teachers, I'm very aware that there is a difference. The question is whether this difference really shows up in standardized test scores accurately enough that districts can reliably use them in evaluations that can lead to teachers being fired.
Jeff Bernstein

Louisiana skipped key standardized testing analysis in 2009-2010, cites budget woes | T...

  •  
    The Louisiana Department of Education (LDOE) did not conduct an erasure analysis of the state's standardized test scores for the 2009-2010 academic year due to budget cuts, The American Independent has learned through Freedom Of Information Act requests.
Jeff Bernstein

Third Way Responds but Still Doesn't Get It! « School Finance 101

  •  
    Third Way has posted a response to my critique in which they argue that their analysis does not suffer the egregious flaws my review identifies. Specifically, they address my point that whenever they use a "district" level of analysis, they include the Detroit City Schools in their entirety in their sample of "middle class" districts. They argue that they did not do this, but rather included only the middle-class schools in Detroit.
Jeff Bernstein

In Reversal, New York State Says It Used Erasure Analysis to Detect Cheating - NYTimes.com

  •  
    ...officials revealed this week that the State Education Department had quietly been conducting erasure analysis on some high school Regents exams for more than three years, a process that red-flagged 64 instances of possible problems, including one that led to the ouster of an assistant principal in the Bronx.
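The article does not describe the department's actual procedure, but erasure analysis is commonly explained as flagging classrooms or schools whose wrong-to-right (WTR) erasure counts are statistical outliers relative to everyone else's. A hypothetical sketch of such a screen; the room names, counts, and threshold are all invented:

```python
import statistics

def flag_erasure_outliers(wtr_counts, threshold=4.0):
    """Flag units whose average WTR erasure count sits far above the mean
    and standard deviation of all *other* units (leave-one-out z-score).
    A hypothetical screen, not the state's actual method."""
    flagged = {}
    for unit, count in wtr_counts.items():
        others = [c for u, c in wtr_counts.items() if u != unit]
        mean = statistics.mean(others)
        sd = statistics.stdev(others)
        if sd > 0 and (count - mean) / sd > threshold:
            flagged[unit] = count
    return flagged

# Hypothetical per-classroom average WTR erasures per answer sheet
counts = {"room_101": 1.2, "room_102": 0.8, "room_103": 1.1,
          "room_104": 6.5, "room_105": 0.9, "room_106": 1.0}
print(flag_erasure_outliers(counts))
```

A red flag is only a trigger for investigation; extreme erasure counts can have innocent explanations, which is why the article speaks of "possible problems."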
Jeff Bernstein

The impact of No Child Left Behind on student achievement

  •  
    The No Child Left Behind (NCLB) Act compelled states to design school accountability systems based on annual student assessments. The effect of this federal legislation on the distribution of student achievement is a highly controversial but centrally important question. This study presents evidence on whether NCLB has influenced student achievement based on an analysis of state-level panel data on student test scores from the National Assessment of Educational Progress (NAEP). The impact of NCLB is identified using a comparative interrupted time series analysis that relies on comparisons of the test-score changes across states that already had school accountability policies in place prior to NCLB and those that did not. Our results indicate that NCLB generated statistically significant increases in the average math performance of fourth graders (effect size = 0.23 by 2007) as well as improvements at the lower and top percentiles. There is also evidence of improvements in eighth-grade math achievement, particularly among traditionally low-achieving groups and at the lower percentiles. However, we find no evidence that NCLB increased fourth-grade reading achievement.
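The comparative interrupted time series design described above amounts to comparing score changes after NCLB in states where the law changed policy (no prior accountability system) against states where it changed little (accountability already in place). A stripped-down sketch using a simple pre/post difference-in-differences of group means rather than the study's full trend model; every number is invented:

```python
# Hypothetical state-level NAEP math means, before and after NCLB took effect
no_prior_accountability = {"pre": [224, 226, 225], "post": [231, 233, 234]}  # NCLB binds
prior_accountability    = {"pre": [228, 229, 230], "post": [232, 233, 234]}  # little change

def mean(xs):
    return sum(xs) / len(xs)

# Pre-to-post change in each group of states
change_treated = mean(no_prior_accountability["post"]) - mean(no_prior_accountability["pre"])
change_comparison = mean(prior_accountability["post"]) - mean(prior_accountability["pre"])

# The comparison group's change nets out nationwide trends unrelated to NCLB
nclb_effect = change_treated - change_comparison
print(f"Estimated NCLB effect: {nclb_effect:.1f} NAEP points")
```

The published study additionally models pre-existing trends and interrupts them at NCLB's start date, but the identifying contrast is the same: treated-state changes minus comparison-state changes.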
Jeff Bernstein

Separating fact from fiction in 21 claims about charter schools - The Washington Post

  •  
    "The National Alliance for Public Charter Schools released a report last year titled "Separating Fact & Fiction: What You Need to Know About Charter Schools," which takes 21 statements that it calls "myths" about charters and attempts to debunk them, one by one. Now three education researchers have completed a fact-checking analysis of the charter report, coming to some different conclusions about each myth. Following is part of the new analysis, which was published by the National Education Policy Center at the University of Colorado Boulder, and which you can find in full, complete with extensive footnotes, on the NEPC website."