
CTLT and Friends
Joshua Yeidel

Why Your Boss Is Wrong About You - NYTimes.com - 4 views

  •  
    A brief take on the "performance _pre_view", and why the standard performance _re_view tends to diminish, not enhance, performance.
Peggy Collins

FERPA and social media - 1 views

  •  
    FERPA is one of the most misunderstood regulations in education. It is commonly assumed that FERPA requires all student coursework to be kept private at all times, and thus prevents the use of social media in the classroom, but this is wrong. FERPA does not prevent instructors from assigning students to create public content as part of their course requirements. If it did, then video documentaries produced in a communications class and shown on TV or the Web, or public art shows of student work from an art class, would be illegal. As one higher education lawyer put it
Joshua Yeidel

The Answer Sheet - A principal on standardized vs. teacher-written tests - 0 views

  •  
    High school principal George Wood eloquently contrasts standardized NCLB-style testing with his school's performance assessments.
Joshua Yeidel

Performance Assessment | The Alternative to High Stakes Testing - 0 views

  •  
    " The New York Performance Standards Consortium represents 28 schools across New York State. Formed in 1997, the Consortium opposes high stakes tests arguing that "one size does not fit all." Despite skepticism that an alternative to high stakes tests could work, the New York Performance Standards Consortium has done just that...developed an assessment system that leads to quality teaching, that enhances rather than compromises our students' education. Consortium school graduates go on to college and are successful."
Joshua Yeidel

Higher Education: Assessment & Process Improvement Group News | LinkedIn - 0 views

  •  
    High school principal George Wood eloquently contrasts standardized NCLB-style testing with his school's term-end performance testing.
Peggy Collins

SoTL Resources - 1 views

  •  
    Vanderbilt University Center for Teaching: descriptions of what SoTL is and links to many samples and resources.
Joshua Yeidel

Refining the Recipe for a Degree, Ingredient by Ingredient - Government - The Chronicle... - 1 views

  • Supporters of the Lumina project say it holds the promise of turning educational assessment from a process that some academics might view as a threat into one that holds a solution, while also creating more-rigorous expectations for student learning. Mr. Jones, the Utah State history-department chairman, recounted in an essay published in the American Historical Association's Perspectives on History how he once blithely told an accreditation team that "historians do not measure their effectiveness in outcomes." But he has changed his mind. The Lumina project, and others, help define what learning is achieved in the process of earning a degree, he said, moving beyond Americans' heavy reliance on the standardized student credit hour as the measure of an education. "The demand for outcomes assessment should be seized as an opportunity for us to actually talk about the habits of mind our discipline needs to instill in our students," Mr. Jones wrote. "It will do us a world of good, and it will save us from the spreadsheets of bureaucrats."
  •  
    Lumina Foundation pushes a European-style process to define education goals state- and nation-wide, with mixed success. "Chemistry, history, math, and physics have been among the most successful", while others have had a hard time beginning.
Joshua Yeidel

Cross-Disciplinary Grading Techniques - ProfHacker - The Chronicle of Higher Education - 0 views

  •  
    "So far, the most useful tool to me, in physics, has been the rubric, which is used widely in grading open-ended assessments in the humanities. "
  •  
    A focus on improving the grading experience, rather than the learning experience, but still a big step forward for (some) hard scientists.
Judy Rumph

Measure or Perish - Commentary - The Chronicle of Higher Education - 3 views

shared by Judy Rumph on 14 Dec 10
  •  
    I found this apropos.
Joshua Yeidel

Using Clickers to Facilitate Peer Review in a Writing Seminar - ProfHacker - The Chroni... - 0 views

  •  
    "Teaching a writing seminar has also provided me with an opportunity to use clickers in ways that are new to me. "
Joshua Yeidel

Students Know Good Teaching When They Get It, Survey Finds - NYTimes.com - 2 views

  •  
    ... as measured by student evals and "value-added modeling".  Note some of the student eval items, though... e.g., students agree or disagree with "In this class, we learn to correct our mistakes."
Joshua Yeidel

Evaluating Teachers: The Important Role of Value-Added [pdf] - 1 views

  •  
    "We conclude that value-added data has an important role to play in teacher evaluation systems, but that there is much to be learned about how best to use value-added information in human resource decisions." No mention of the role of assessment in improvement.
Nils Peterson

U. of Phoenix Reports on Students' Academic Progress - Measuring Stick - The Chronicle ... - 0 views

  • In comparisons of seniors versus freshmen within the university, the 2,428 seniors slightly outperformed 4,003 freshmen in all categories except natural sciences, in which they were equivalent.
    • Nils Peterson
       
      This is the value-added measure.
  • The University of Phoenix has released its third “Academic Annual Report,” a document that continues to be notable not so much for the depth of information it provides on its students’ academic progress but for its existence at all.
    • Nils Peterson
       
      Provides a range of measures, including demographics, satisfaction, indirect measures of perceived utility, and direct measures using national tests.
  • The Phoenix academic report also includes findings on students’ performance relative to hundreds of thousands of students at nearly 400 peer institutions on two standardized tests
  • University of Phoenix seniors slightly underperformed a comparison group of 42,649 seniors at peer institutions in critical thinking, humanities, social sciences, and natural sciences, and moderately underperformed the peer group in reading, writing, and mathematics.
Gary Brown

Sincerity in evaluation - highlights and lowlights « Genuine Evaluation - 3 views

  • Principles of Genuine Evaluation: When we set out to explore the notion of ‘Genuine Evaluation’, we identified 5 important aspects of it:
    • VALUE-BASED – transparent and defensible values (criteria of merit and worth and standards of performance)
    • EMPIRICAL – credible evidence about what has happened and what has caused this
    • USABLE – reported in such a way that it can be understood and used by those who can and should use it (which doesn’t necessarily mean it’s used or used well, of course)
    • SINCERE – a commitment by those commissioning evaluation to respond to information about both success and failure (those doing evaluation can influence this but not control it)
    • HUMBLE – acknowledges its limitations
    From now until the end of the year, we’re looking at each of these principles and collecting some of the highlights and lowlights from 2010 (and previously).
  • Sincerity of evaluation is something that is often not talked about in evaluation reports, scholarly papers, or formal presentations, only discussed in the corridors and bars afterwards.  And yet it poses perhaps the greatest threat to the success of individual evaluations and to the whole enterprise of evaluation.
Gary Brown

Measuring Student Learning: Many Tools - Measuring Stick - The Chronicle of Higher Educ... - 2 views

  • The issue that needs to be addressed and spectacularly has been avoided is whether controlled studies (one group does the articulation of and then measurement of outcomes, and a control group does what we have been doing before this mania took hold) can demonstrate or falsify the claim that outcomes assessment results in better-educated students. So far as I can tell, we instead gather data on whether we have in fact been doing outcomes assessment. Not the issue, people. jwp
  •  
    The challenge is not the controlled study this person calls for, but the perception that outcomes assessment produces outcomes...
Nils Peterson

BBC News - McDonald's to launch own degree - 2 views

  • The two-year foundation degree in managing business operations is a demonstration of how seriously the company takes the training of its staff
    • Nils Peterson
       
      Tying the degree to a stakeholder's needs. Wonder what McDonald's has as learning outcomes.
Gary Brown

Cross-Disciplinary Grading Techniques - ProfHacker - The Chronicle of Higher Education - 1 views

  • So far, the most useful tool to me, in physics, has been the rubric, which is used widely in grading open-ended assessments in the humanities.
  • This method has revolutionized the way I grade. No longer do I have to keep track of how many points are deducted from which type of misstep on what problem for how many students. In the past, I often would get through several tests before I realized that I wasn’t being consistent with the deduction of points, and then I’d have to go through and re-grade all the previous tests. Additionally, the rubric method encourages students to refer to a solution, which I post after the test is administered, and they are motivated to meet with me in person to discuss why they got a 2 versus a 3 on a given problem, for example.
  • This opens up the opportunity to talk with them personally about their problem-solving skills and how they can better them. The emphasis is moved away from point-by-point deductions and is redirected to a more holistic view of problem solving.
  •  
    In the heart of the home of the concept inventory: physics.
Gary Brown

A Final Word on the Presidents' Student-Learning Alliance - Measuring Stick - The Chron... - 1 views

  • I was very pleased to see the responses to the announcement of the Presidents’ Alliance as generally welcoming (“commendable,” “laudatory initiative,” “applaud”) the shared commitment of these 71 founding institutions to do more—and do it publicly and cooperatively—with regard to gathering, reporting, and using evidence of student learning.
  • establishing institutional indicators of educational progress that could be valuable in increasing transparency may not suggest what needs changing to improve results
  • As Adelman’s implied critique of the CLA indicates, we may end up with an indicator without connections to practice.
  • The Presidents’ Alliance’s focus on and encouragement of institutional efforts is important to making these connections and steps in a direct way supporting improvement.
  • Second, it is hard to disagree with the notion that ultimately evidence-based improvement will occur only if faculty members are appropriately trained and encouraged to improve their classroom work with undergraduates.
  • Certainly there has to be some connection between and among various levels of assessment—classroom, program, department, and institution—in order to have evidence that serves both to aid improvement and to provide transparency and accountability.
  • Presidents’ Alliance is setting forth a common framework of “critical dimensions” that institutions can use to evaluate and extend their own efforts, efforts that would include better reporting for transparency and accountability and greater involvement of faculty.
  • there is wide variation in where institutions are in their efforts, and we have a long way to go. But what is critical here is the public commitment of these institutions to work on their campuses and together to improve the gathering and reporting of evidence of student learning and, in turn, using evidence to improve outcomes.
  • The involvement of institutions of all types will make it possible to build a more coherent and cohesive professional community in which evidence-based improvement of student learning is tangible, visible, and ongoing.
Joshua Yeidel

Teaching for America - NYTimes.com - 1 views

  •  
    Tom Friedman reports (approvingly) on Arne Duncan's plans to upgrade the teaching profession. Any guesses how this might apply to "higher" education?
Gary Brown

Duncan: Rewarding Teachers for Master's Degrees Is Waste of Money - The Ticker - The Ch... - 1 views

  • Arne Duncan said state and local governments should rethink their policies of giving pay raises to teachers who have master’s degrees because evidence suggests that the degree alone does not improve student achievement.
  •  
    Distinguishes between outcome and impact, and/or illustrates the problems of grades/degrees as a credible outcome.