Home/ CTLT and Friends/ Group items tagged evaluation


Gary Brown

Sincerity in evaluation - highlights and lowlights « Genuine Evaluation - 3 views

  • Principles of Genuine Evaluation. When we set out to explore the notion of 'Genuine Evaluation', we identified 5 important aspects of it:
    - VALUE-BASED – transparent and defensible values (criteria of merit and worth, and standards of performance)
    - EMPIRICAL – credible evidence about what has happened and what has caused this
    - USABLE – reported in such a way that it can be understood and used by those who can and should use it (which doesn't necessarily mean it's used or used well, of course)
    - SINCERE – a commitment by those commissioning evaluation to respond to information about both success and failure (those doing evaluation can influence this but not control it)
    - HUMBLE – acknowledges its limitations
    From now until the end of the year, we're looking at each of these principles and collecting some of the highlights and lowlights from 2010 (and previously).
  • Sincerity of evaluation is something that is rarely talked about in evaluation reports, scholarly papers, or formal presentations; it is discussed only in the corridors and bars afterwards. And yet it poses perhaps the greatest threat to the success of individual evaluations and to the whole enterprise of evaluation.
Gary Brown

Empowerment Evaluation - 1 views

  • Empowerment Evaluation in Stanford University's School of Medicine
  • Empowerment evaluation provides a method for gathering, analyzing, and sharing data about a program and its outcomes and encourages faculty, students, and support personnel to actively participate in system changes.
  • It assumes that the more closely stakeholders are involved in reflecting on evaluation findings, the more likely they are to take ownership of the results and to guide curricular decision making and reform.
  • The steps of empowerment evaluation
  • designating a "critical friend" to communicate areas of potential improvement
  • collecting evaluation data
  • encouraging a cycle of reflection and action
  • establishing a culture of evidence
  • developing reflective educational practitioners
  • cultivating a community of learners
  • yearly cycles of improvement at the Stanford University School of Medicine
  • The findings were presented in Academic Medicine, a medical education journal, earlier this year.
Gary Brown

SAGE Journals Online - 2 views

shared by Gary Brown on 23 Sep 10
  • SAGE Journals Online for the next 3 weeks.
Gary Brown

How to analyze anything. - 5 views

  • Shared before, sharing again as it struck me anew.
Gary Brown

http://genuineevaluation.com/ - 1 views

shared by Gary Brown on 27 Apr 10
  • A reminder to note this resource from AEA.
Theron DesRosier

CDC Evaluation Working Group: Framework - 2 views

  • Framework for Program Evaluation
  • Purposes. The framework was developed to:
    - Summarize and organize the essential elements of program evaluation
    - Provide a common frame of reference for conducting evaluations
    - Clarify the steps in program evaluation
    - Review standards for effective program evaluation
    - Address misconceptions about the purposes and methods of program evaluation
  • Assigning value and making judgments regarding a program on the basis of evidence requires answering the following questions:
    - What will be evaluated? (i.e., what is "the program" and in what context does it exist?)
    - What aspects of the program will be considered when judging program performance?
    - What standards (i.e., type or level of performance) must be reached for the program to be considered successful?
    - What evidence will be used to indicate how the program has performed?
    - What conclusions regarding program performance are justified by comparing the available evidence to the selected standards?
    - How will the lessons learned from the inquiry be used to improve public health effectiveness?
  • These questions should be addressed at the beginning of a program and revisited throughout its implementation. The framework provides a systematic approach for answering these questions.
  • Steps in Evaluation Practice:
    - Engage stakeholders – those involved, those affected, primary intended users
    - Describe the program – need, expected effects, activities, resources, stage, context, logic model
    - Focus the evaluation design – purpose, users, uses, questions, methods, agreements
    - Gather credible evidence – indicators, sources, quality, quantity, logistics
    - Justify conclusions – standards, analysis/synthesis, interpretation, judgment, recommendations
    - Ensure use and share lessons learned – design, preparation, feedback, follow-up, dissemination
  • Standards for "Effective" Evaluation:
    - Utility – serve the information needs of intended users
    - Feasibility – be realistic, prudent, diplomatic, and frugal
    - Propriety – behave legally, ethically, and with due regard for the welfare of those involved and those affected
    - Accuracy – reveal and convey technically accurate information
  • The challenge is to devise an optimal — as opposed to an ideal — strategy.
  • Framework for Program Evaluation by the CDC. This is a good resource for program evaluation. Click through "Steps and Standards" for information on collecting credible evidence and engaging stakeholders.
Gary Brown

Read methods online for free - Methodspace - home of the Research Methods community - 1 views

  • Read methods online
  • Book of the month What Counts as Credible Evidence in Applied Research and Evaluation Practice?
  • This site may be valuable for professional development. We have reason to explore what the evaluation community holds as "credible" evidence, which is the chapter the group is reading this month.