
CTLT and Friends: Group items tagged VSA


Theron DesRosier

Assessing Learning Outcomes at the University of Cincinnati: Comparing Rubric Assessmen... - 2 views

  •  "When the CLA results arrived eight months later, the UC team compared the outcomes of the two assessments. "We found no statistically significant correlation between the CLA scores and the portfolio scores," Escoe says. "In some ways, it's a disappointing finding. If we'd found a correlation, we could tell faculty that the CLA, as an instrument, is measuring the same things that we value and that the CLA can be embedded in a course. But that didn't happen." There were many factors that may have contributed to the lack of correlation, she says, including the fact that the CLA is timed, while the rubric assignments are not; and that the rubric scores were diagnostic and included specific feedback, while the CLA awarded points "in a black box": if a student referred to a specific piece of evidence in a critical-thinking question, he or she simply received one point. In addition, she says, faculty members may have had exceptionally high expectations of their honors students and assessed the e-portfolios with those high expectations in mind, leading to results that would not correlate to a computer-scored test. In the end, Escoe says, the two assessments are both useful, but for different things. The CLA can provide broad institutional data that satisfies VSA requirements, while rubric-based assessment provides better information to facilitate continuous program improvement."
  •  Another institution trying to make sense of the CLA. This study compared students' CLA scores with criteria-based scores of their e-portfolios. The study used a modified version of the VALUE rubrics developed by the AAC&U. Our own Gary Brown was on the team that developed the critical thinking rubric for the VALUE project.
  •  "The CLA can provide broad institutional data that satisfies VSA requirements, while rubric-based assessment provides better information to facilitate continuous program improvement." This raises some questions: what meaning can we attach to these two non-correlated measures? What VSA requirements can rubric-based assessment NOT satisfy? Are those "requirements" really useful?
Nils Peterson

AAC&U News | April 2010 | Feature - 1 views

  • Comparing Rubric Assessments to Standardized Tests
  • First, the university, a public institution of about 40,000 students in Ohio, needed to comply with the Voluntary System of Accountability (VSA), which requires that state institutions provide data about graduation rates, tuition, student characteristics, and student learning outcomes, among other measures, in the consistent format developed by its two sponsoring organizations, the Association of Public and Land-grant Universities (APLU) and the American Association of State Colleges and Universities (AASCU).
  • And finally, UC was accepted in 2008 as a member of the fifth cohort of the Inter/National Coalition for Electronic Portfolio Research, a collaborative body with the goal of advancing knowledge about the effect of electronic portfolio use on student learning outcomes.  
  • ...13 more annotations...
  • outcomes required of all UC students—including critical thinking, knowledge integration, social responsibility, and effective communication
  • “The wonderful thing about this approach is that full-time faculty across the university  are gathering data about how their  students are doing, and since they’ll be teaching their courses in the future, they’re really invested in rubric assessment—they really care,” Escoe says. In one case, the capstone survey data revealed that students weren’t doing as well as expected in writing, and faculty from that program adjusted their pedagogy to include more writing assignments and writing assessments throughout the program, not just at the capstone level. As the university prepares to switch from a quarter system to semester system in two years, faculty members are using the capstone survey data to assist their course redesigns, Escoe says.
  • the university planned a “dual pilot” study examining the applicability of electronic portfolio assessment of writing and critical thinking alongside the Collegiate Learning Assessment,
  • The rubrics the UC team used were slightly modified versions of those developed by AAC&U’s Valid Assessment of Learning in Undergraduate Education (VALUE) project. 
  • In the critical thinking rubric assessment, for example, faculty evaluated student proposals for experiential honors projects that they could potentially complete in upcoming years.  The faculty assessors were trained and their rubric assessments “normed” to ensure that interrater reliability was suitably high.
  • “It’s not some nitpicky, onerous administrative add-on. It’s what we do as we teach our courses, and it really helps close that assessment loop.”
  • There were many factors that may have contributed to the lack of correlation, she says, including the fact that the CLA is timed, while the rubric assignments are not; and that the rubric scores were diagnostic and included specific feedback, while the CLA awarded points “in a black box”:
  • faculty members may have had exceptionally high expectations of their honors students and assessed the e-portfolios with those high expectations in mind—leading to results that would not correlate to a computer-scored test. 
  • “The CLA provides scores at the institutional level. It doesn’t give me a picture of how I can affect those specific students’ learning. So that’s where rubric assessment comes in—you can use it to look at data that’s compiled over time.”
  • Their portfolios are now more like real learning portfolios, not just a few artifacts, and we want to look at them as they go into their third and fourth years to see what they can tell us about students’ whole program of study.”  Hall and Robles are also looking into the possibility of forming relationships with other schools from NCEPR to exchange student e-portfolios and do a larger study on the value of rubric assessment of student learning.
  • “We’re really trying to stress that assessment is pedagogy,”
  • “We found no statistically significant correlation between the CLA scores and the portfolio scores,”
  • In the end, Escoe says, the two assessments are both useful, but for different things. The CLA can provide broad institutional data that satisfies VSA requirements, while rubric-based assessment provides better information to facilitate continuous program improvement.
    • Nils Peterson: CLA did not provide information for continuous program improvement; we've heard this argument before.
  •  The lack of correlation might be rephrased: there appears to be no correlation between what is useful for faculty who teach and what is useful for the VSA. A corollary question: of what use is the VSA?
Gary Brown

News: Assessing the Assessments - Inside Higher Ed - 2 views

  • The validity of a measure is based on evidence regarding the inferences and assumptions that are intended to be made and the uses to which the measure will be put. Showing that the three tests in question are comparable does not support Shulenburger's assertion regarding the value-added measure as a valid indicator of institutional effectiveness. The claim that public university groups have previously judged the value-added measure as appropriate does not tell us anything about the evidence upon which this judgment was based nor the conditions under which the judgment was reached. As someone familiar with the process, I would assert that there was no compelling evidence presented that these instruments and the value-added measure were validated for making this assertion (no such evidence was available at the time), which is the intended use in the VSA.
  • (however much the sellers of these tests tell you that those samples are "representative"), they provide an easy way out for academic administrators who want to avoid the time-and-effort consuming but incredibly valuable task of developing detailed major program learning outcome statements (even the specialized accrediting bodies don't get down to the level of discrete, operational statements that guide faculty toward appropriate assessment design)
  • If somebody really cared about "value added," they could look at each student's first essay in this course, and compare it with that same student's last essay in this course. This person could then evaluate each individual student's increased mastery of the subject-matter in the course (there's a lot) and also the increased writing skill, if any.
  • ...1 more annotation...
  • These skills cannot be separated out from student success in learning sophisticated subject-matter, because understanding anthropology, or history of science, or organic chemistry, or Japanese painting, is not a matter of absorbing individual facts, but learning facts and ways of thinking about them in a seamless, synthetic way. No assessment scheme that neglects these obvious facts about higher education is going to do anybody any good, and we'll be wasting valuable intellectual and financial resources if we try to design one.
  •  Ongoing discussion of these tools. Note Longanecker's comment and ask me why.
Nils Peterson

The New Muscle: 5 Quality-of-Learning Projects That Didn't Exist 5 Years Ago - Special ... - 0 views

shared by Nils Peterson on 30 Aug 10
  • The New Muscle: 5 Quality-of-Learning Projects That Didn't Exist 5 Years Ago. Lumina Foundation for Education's Tuning USA (year started: 2009). What it does: supports statewide, faculty-led discussions, meetings, and surveys to define discipline-specific knowledge and skills that college and state officials, students, alumni, and employers can expect graduates of particular degree programs to have.
    • Nils Peterson: That they lump the VSA in here with the others suggests to me that the Chronicle's author doesn't grasp the distinction.
Gary Brown

2 Efforts to Provide Data on Colleges to Consumers Fall Short, Report Says - Administra... - 2 views

  • Higher education will have to be more accountable for its performance and more open to consumers about the actual cost of attending a college, and help people make easier comparisons among institutions, in order to succeed as the nation's economic engine, says a new report from two nonprofit think tanks here.
  • too little information to make informed choices about where they will get the most from their tuition dollars, say researchers at the two organizations, the libertarian-leaning American Enterprise Institute, and Education Sector, which is a proponent of reforming higher education
  • And without a more thorough and open form of accountability, institutions will not have any incentive to make the changes that will improve students' success,
  • ...7 more annotations...
  • "If existing flaws are not resolved, the nation runs the risk of ending up in the worst of all worlds: the appearance of higher education accountability without the reality," the authors say.
  • The two voluntary systems criticized in the study are the University and College Accountability Network, begun in September 2007 by the National Association of Independent Colleges and Universities to provide information about private colleges, and the Voluntary System of Accountability,
  • it does not obligate institutions to gather or reveal any data that are not already available elsewhere,"
  • associations are beginning to offer workshops and other opportunities for system participants to learn how to use the data they're collecting to improve the college experience for students, she said
  • VSA has the testing lobby written all over it
  • We may all appreciate the cultural context inhibiting public accountability but it is also important to understand that this same accountability is lacking internally where it effectively thwarts attempts to manage the institution rationally; i.e., informed with a continuous flow of mission-critical performance information. With the scant objective information at their command, college presidents and their associates must perform as shamans, reading the tea leaves of opinion and passion among stakeholders.
  • On balance, America's institutions of higher education function in a managerial vacuum.
  •  More of the same, and the discussion is familiar; our challenge is to bring this topic to the attention of our points.
Gary Brown

Accountability Effort for Community Colleges Pushes Forward, and Other Meeting Notes - ... - 1 views

  • A project led by the American Association of Community Colleges to develop common, voluntary standards of accountability for two-year institutions is moving forward, and specific performance measures are being developed, an official at the association said.
  • financed by the Lumina Foundation for Education and the Bill & Melinda Gates Foundation, is now in its second phase
  • The project's advocates have begun pushing a public-relations campaign to build support for the accountability effort among colleges.
  • ...2 more annotations...
  • common reporting formats and measures that are appropriate to their institutions
  • Mr. Phillippe said one area of college performance the voluntary accountability system will measure is student persistence and completion, including retention and transfer rates. Student progress toward completion may also be measured by tracking how many students reach certain credit milestones. Other areas that will be measured include colleges' contributions to the work force and economic and community development.
  •  Footsteps...
Gary Brown

News: Assessing the Assessments - Inside Higher Ed - 0 views

  • In other words, a college that ranked in the 95th percentile for critical thinking using one of the tests would rank in roughly the same place using the critical thinking component of one of the other two tests, and vice versa.
    • Gary Brown: A stellar example of critical thinking, this sentence.
  • "diversity in measurement" to satisfy faculty