Home/ CTLT and Friends/ Group items tagged assessment

Joshua Yeidel

Why Your Boss Is Wrong About You - NYTimes.com - 4 views

  •  
    A brief take on the "performance _pre_view", and why the standard performance _re_view tends to diminish, not enhance, performance.
Joshua Yeidel

The Answer Sheet - A principal on standardized vs. teacher-written tests - 0 views

  •  
    High school principal George Wood eloquently contrasts standardized NCLB-style testing with his school's performance assessments.
Joshua Yeidel

Performance Assessment | The Alternative to High Stakes Testing - 0 views

  •  
    " The New York Performance Standards Consortium represents 28 schools across New York State. Formed in 1997, the Consortium opposes high stakes tests arguing that "one size does not fit all." Despite skepticism that an alternative to high stakes tests could work, the New York Performance Standards Consortium has done just that...developed an assessment system that leads to quality teaching, that enhances rather than compromises our students' education. Consortium school graduates go on to college and are successful."
Joshua Yeidel

Higher Education: Assessment & Process Improvement Group News | LinkedIn - 0 views

  •  
    High School Principal George Wood eloquently contrasts standardized NCLB-style testing and his school's term-end performance testing.
Joshua Yeidel

Refining the Recipe for a Degree, Ingredient by Ingredient - Government - The Chronicle... - 1 views

  • Supporters of the Lumina project say it holds the promise of turning educational assessment from a process that some academics might view as a threat into one that holds a solution, while also creating more-rigorous expectations for student learning. Mr. Jones, the Utah State history-department chairman, recounted in an essay published in the American Historical Association's Perspectives on History how he once blithely told an accreditation team that "historians do not measure their effectiveness in outcomes." But he has changed his mind. The Lumina project, and others, help define what learning is achieved in the process of earning a degree, he said, moving beyond Americans' heavy reliance on the standardized student credit hour as the measure of an education. "The demand for outcomes assessment should be seized as an opportunity for us to actually talk about the habits of mind our discipline needs to instill in our students," Mr. Jones wrote. "It will do us a world of good, and it will save us from the spreadsheets of bureaucrats."
  •  
    Lumina Foundation pushes a European-style process to define education goals state- and nation-wide, with mixed success. "Chemistry, history, math, and physics have been among the most successful", while others have had a hard time beginning.
Joshua Yeidel

Cross-Disciplinary Grading Techniques - ProfHacker - The Chronicle of Higher Education - 0 views

  •  
    "So far, the most useful tool to me, in physics, has been the rubric, which is used widely in grading open-ended assessments in the humanities. "
  •  
    A focus on improving the grading experience, rather than the learning experience, but still a big step forward for (some) hard scientists.
Joshua Yeidel

Students Know Good Teaching When They Get It, Survey Finds - NYTimes.com - 2 views

  •  
    ... as measured by student evals and "value-added modeling".  Note some of the student eval items, though... e.g., students agree or disagree with "In this class, we learn to correct our mistakes."
Joshua Yeidel

Evaluating Teachers: The Important Role of Value-Added [pdf] - 1 views

  •  
    "We conclude that value-added data has an important role to play in teacher evaluation systems, but that there is much to be learned about how best to use value-added information in human resource decisions." No mention of the role of assessment in improvement.
Gary Brown

Measuring Student Learning: Many Tools - Measuring Stick - The Chronicle of Higher Educ... - 2 views

  • The issue that needs to be addressed and spectacularly has been avoided is whether controlled studies (one group does the articulation of and then measurement of outcomes, and a control group does what we have been doing before this mania took hold) can demonstrate or falsify the claim that outcomes assessment results in better-educated students. So far as I can tell, we instead gather data on whether we have in fact been doing outcomes assessment. Not the issue, people. jwp
  •  
    The challenge--not the control study this person calls for, but the perception that outcomes assessment produces outcomes....
Gary Brown

Duncan: Rewarding Teachers for Master's Degrees Is Waste of Money - The Ticker - The Ch... - 1 views

  • Arne Duncan, said state and local governments should rethink their policies of giving pay raises to teachers who have master’s degrees because evidence suggests that the degree alone does not improve student achievement.
  •  
    Distinguishes between outcome and impact, and/or illustrates the problem of grades/degrees as a credible outcome.
Gary Brown

Home - Journal of Assessment and Accountability Systems in Educator Preparation - 1 views

  •  
    a new journal to note
Theron DesRosier

The Future of Work: As Gartner Sees It - 3 views

  •  
    "Gartner points out that the world of work will probably witness ten major changes in the next ten years. Interesting in that it will change how learning happens in the workplace as well. The eLearning industry will need to account for the coming change and have a strategy in place to deal with the changes."
Joshua Yeidel

Higher Education: Assessment & Process Improvement Group News | LinkedIn - 2 views

  •  
    So here it is: by definition, the value-added component of the D.C. IMPACT evaluation system defines 50 percent of all teachers in grades four through eight as ineffective or minimally effective in influencing their students' learning. And given the imprecision of the value-added scores, just by chance some teachers will be categorized as ineffective or minimally effective two years in a row. The system is rigged to label teachers as ineffective or minimally effective as a precursor to firing them.
  •  
    How assessment of value-added actually works in one setting: the Washington, D.C. public schools. This article actually works the numbers to show that the system is set up to put teachers in the firing zone. Note the tyranny of numerical ratings (some of them subjective) converted into meanings like "minimally effective".
Nils Peterson

Jeff Sheldon on the Readiness for Organizational Learning and Evaluation instrument | A... - 4 views

shared by Nils Peterson on 01 Nov 10
  • The ROLE consists of 78 items grouped into six major constructs: 1) Culture, 2) Leadership, 3) Systems and Structures, 4) Communication, 5) Teams, and 6) Evaluation.
    • Nils Peterson
       
      You can look up the book in Amazon and then view inside and search for Appendix A and read the items in the survey. http://www.amazon.com/Evaluation-Organizations-Systematic-Enhancing-Performance/dp/0738202681#reader_0738202681 This might be useful to OAI in assessing readiness (or understanding what in the university culture challenges readiness) OR it might inform our revision (or justify staying out) of our rubric. An initial glance would indicate that there are some cultural constructs in the university that are counter-indicated by the analysis of the ROLE instrument.
  •  
    " Readiness for Organizational Learning and Evaluation (ROLE). The ROLE (Preskill & Torres, 2000) was designed to help us determine the level of readiness for implementing organizational learning, evaluation practices, and supporting processes"
  •  
    An interesting possibility for a Skylight survey (but more reading needed)
Gary Brown

Disciplines Follow Their Own Paths to Quality - Faculty - The Chronicle of Higher Educa... - 2 views

  • But when it comes to the fundamentals of measuring and improving student learning, engineering professors naturally have more to talk about with their counterparts at, say, Georgia Tech than with the humanities professors at Villanova
    • Gary Brown
       
      Perhaps this is too bad....
  • But there is no nationally normed way to measure the particular kind of critical thinking that students of classics acquire
  • er colleagues have created discipline-specific critical-reasoning tests for classics and political science
  • Political science cultivates skills that are substantially different from those in classics, and in each case those skills can't be measured with a general-education test.
  • he wants to use tests of reasoning that are appropriate for each discipline
  • I believe Richard Paul has spent a lifetime articulating the characteristics of discipline-based critical thinking. But anyway, I think it is interesting that an attempt is being made to develop (perhaps) a "national standard" for critical thinking in classics. In order to assess anything effectively we need a standard. Without a standard there are no criteria and therefore no basis from which to assess. But standards do not necessarily have to be established at the national level. This raises the issue of scale. What is the appropriate scale from which to measure the quality and effectiveness of an educational experience? Any valid approach to quality assurance has to be multi-scaled and requires multiple measures over time. But to be honest the issues of standards and scale are really just the tip of the outcomes iceberg.
    • Gary Brown
       
      Missing the notion that the variance is in the activity more than the criteria.  We hear little of embedding nationally normed and weighted assignments and then assessing the implementation and facilitation variables.... mirror, not lens.
  • the UW Study of Undergraduate Learning (UW SOUL). Results from the UW SOUL show that learning in college is disciplinary; therefore, real assessment of learning must occur (with central support and resources) in the academic departments. Generic approaches to assessing thinking, writing, research, quantitative reasoning, and other areas of learning may be measuring something, but they cannot measure learning in college.
  • It turns out there is a six-week, or 210+ hour, serious reading exposure to two or more domains outside one's own that "turns on" cross-domain mapping as a robust capability. Some people just happen to have accumulated, usually by unseen and unsensed happenstance involvements (rooming with an engineer, son of a dad changing domains/careers, etc.), this minimum level of basics that allows robust metaphor-based mapping.
Judy Rumph

Views: Why Are We Assessing? - Inside Higher Ed - 1 views

  • Amid all this progress, however, we seem to have lost our way. Too many of us have focused on the route we’re traveling: whether assessment should be value-added; the improvement versus accountability debate; entering assessment data into a database; pulling together a report for an accreditor. We’ve been so focused on the details of our route that we’ve lost sight of our destination.
  • Our destination, which is what we should be focusing on, is the purpose of assessment. Over the last decades, we've consistently talked about two purposes of assessment: improvement and accountability. The thinking has been that improvement means using assessment to identify problems — things that need improvement — while accountability means using assessment to show that we're already doing a great job and need no improvement. A great deal has been written about the need to reconcile these two seemingly disparate purposes.
  • The most important purpose of assessment should be not improvement or accountability but their common aim: everyone wants students to get the best possible education
  • Our second common purpose of assessment should be making sure not only that students learn what’s important, but that their learning is of appropriate scope, depth, and rigor.
  • Third, we need to accept how good we already are, so we can recognize success when we see it.
  • And we haven’t figured out a way to tell the story of our effectiveness in 25 words or less, which is what busy people want and need.
  • Because we're not telling the stories of our successful outcomes in simple, understandable terms, the public continues to define quality using the outdated concept of inputs like faculty credentials, student aptitude, and institutional wealth — things that by themselves don’t say a whole lot about student learning.
  • And people like to invest in success. Because the public doesn't know how good we are at helping students learn, it doesn't yet give us all the support we need in our quest to give our students the best possible education.
  • But while virtually every college and university has had to make draconian budget cuts in the last couple of years, with more to come, I wonder how many are using solid, systematic evidence — including assessment evidence — to inform those decisions.
  • Now is the time to move our focus from the road we are traveling to our destination: a point at which we all are prudent, informed stewards of our resources… a point at which we each have clear, appropriate, justifiable, and externally-informed standards for student learning. Most importantly, now is the time to move our focus from assessment to learning, and to keeping our promises. Only then can we make higher education as great as it needs to be.
  •  
    Yes, this article resonated with me too. Especially connecting assessment to teaching and learning. The most important purpose of assessment should be not improvement or accountability but their common aim: everyone wants students to get the best possible education.... today we seem to be devoting more time, money, thought, and effort to assessment than to helping faculty help students learn as effectively as possible. When our colleagues have disappointing assessment results, and they don't know what to do to improve them, I wonder how many have been made aware that, in some respects, we are living in a golden age of higher education, coming off a quarter-century of solid research on practices that promote deep, lasting learning. I wonder how many are pointed to the many excellent resources we now have on good teaching practices, including books, journals, conferences and, increasingly, teaching-learning centers right on campus. I wonder how many of the graduate programs they attended include the study and practice of contemporary research on effective higher education pedagogies. No wonder so many of us are struggling to make sense of our assessment results! Too many of us are separating work on assessment from work on improving teaching and learning, when they should be two sides of the same coin. We need to bring our work on teaching, learning, and assessment together.
Gary Brown

Home | AALHE - 2 views

shared by Gary Brown on 22 Oct 10
  • The Association for Assessment of Learning in Higher Education, Inc. (AALHE) is an organization of practitioners interested in using effective assessment practice to document and improve student learning.
  • it is designed to be a resource by all who are interested in the improvement of learning,
  •  
    Our membership begins November 1
Joshua Yeidel

News: Measuring 2-Year Students' Success - Inside Higher Ed - 0 views

  •  
    Measuring student success for federal purposes in community colleges -- and only one passing mention of learning. The factory model prevails -- stamp that widget (student) and send it out the door (they sell themselves!)
Joshua Yeidel

Systems Week: Glenda Eoyang on Complexity Demands Simplicity | AEA365 - 2 views

  •  
    "Complex systems tend to exist in one of three states. Each state needs a different evaluation design."