
Home/ CTLT and Friends/ Group items tagged CLA


Gary Brown

A Measure of Learning Is Put to the Test - Faculty - The Chronicle of Higher Education - 1 views

  • Others say those who take the test have little motivation to do well, which makes it tough to draw conclusions from their performance.
  • "Everything that No Child Left Behind signified during the Bush administration—we operate 180 degrees away from that," says Roger Benjamin, president of the Council for Aid to Education, which developed and promotes the CLA. "We don't want this to be a high-stakes test. We're putting a stake in the ground on classic liberal-arts issues. I'm willing to rest my oar there. These core abilities, these higher-order skills, are very important, and they're even more important in a knowledge economy where everyone needs to deal with a surplus of information." Only an essay test, like the CLA, he says, can really get at those skills.
  • "The CLA is really an authentic assessment process," says Pedro Reyes, associate vice chancellor for academic planning and assessment at the University of Texas system.
  • ...20 more annotations...
  • "The Board of Regents here saw that it would be an important test because it measures analytical ability, problem-solving ability, critical thinking, and communication. Those are the skills that you want every undergraduate to walk away with." (Other large systems that have embraced the CLA include California State University and the West Virginia system.)
  • value added
  • We began by administering a retired CLA question, a task that had to do with analyzing crime-reduction strategies,
  • performance task that mirrors the CLA
  • Mr. Ernsting and Ms. McConnell are perfectly sincere about using CLA-style tasks to improve instruction on their campuses. But at the same time, colleges have a less high-minded motive for familiarizing students with the CLA style: It just might improve their scores when it comes time to take the actual test.
  • by 2012, the CLA scores of more than 100 colleges will be posted, for all the world to see, on the "College Portrait" Web site of the Voluntary System of Accountability, an effort by more than 300 public colleges and universities to provide information about life and learning on their campuses.
  • If familiarizing students with CLA-style tasks does raise their scores, then the CLA might not be a pure, unmediated reflection of the full range of liberal-arts skills. How exactly should the public interpret the scores of colleges that do not use such training exercises?
  • Trudy W. Banta, a professor of higher education and senior adviser to the chancellor for academic planning and evaluation at Indiana University-Purdue University at Indianapolis, believes it is a serious mistake to publicly release and compare scores on the test. There is too much risk, she says, that policy makers and the public will misinterpret the numbers.
  • most colleges do not use a true longitudinal model: That is, the students who take the CLA in their first year do not take it again in their senior year. The test's value-added model is therefore based on a potentially apples-and-oranges comparison.
  • freshman test-takers' scores are assessed relative to their SAT and ACT scores, and so are senior test-takers' scores. For that reason, colleges cannot game the test by recruiting an academically weak pool of freshmen and a strong pool of seniors.
  • students do not always have much motivation to take the test seriously
  • seniors, who are typically recruited to take the CLA toward the end of their final semester, when they can already taste the graduation champagne.
  • Of the few dozen universities that had already chosen to publish CLA data on that site, roughly a quarter of the reports appeared to include erroneous descriptions of the year-to-year value-added scores.
  • It is clear that CLA scores do reflect some broad properties of a college education.
  • Students' CLA scores improved if they took courses that required a substantial amount of reading and writing. Many students didn't take such courses, and their CLA scores tended to stay flat.
  • Colleges that make demands on students can actually develop their skills on the kinds of things measured by the CLA.
  • Mr. Shavelson believes the CLA's essays and "performance tasks" offer an unusually sophisticated way of measuring what colleges do, without relying too heavily on factual knowledge from any one academic field.
  • Politicians and consumers want easily interpretable scores, while colleges need subtler and more detailed data to make internal improvements.
  • The CLA is used at more than 400 colleges
  • Since its debut a decade ago, it has been widely praised as a sophisticated alternative to multiple-choice tests
Theron DesRosier

Assessing Learning Outcomes at the University of Cincinnati: Comparing Rubric Assessmen... - 2 views

  • "When the CLA results arrived eight months later, the UC team compared the outcomes of the two assessments. "We found no statistically significant correlation between the CLA scores and the portfolio scores," Escoe says. "In some ways, it's a disappointing finding. If we'd found a correlation, we could tell faculty that the CLA, as an instrument, is measuring the same things that we value and that the CLA can be embedded in a course. But that didn't happen." There were many factors that may have contributed to the lack of correlation, she says, including the fact that the CLA is timed, while the rubric assignments are not; and that the rubric scores were diagnostic and included specific feedback, while the CLA awarded points "in a black box": if a student referred to a specific piece of evidence in a critical-thinking question, he or she simply received one point. In addition, she says, faculty members may have had exceptionally high expectations of their honors students and assessed the e-portfolios with those high expectations in mind, leading to results that would not correlate to a computer-scored test. In the end, Escoe says, the two assessments are both useful, but for different things. The CLA can provide broad institutional data that satisfies VSA requirements, while rubric-based assessment provides better information to facilitate continuous program improvement."
  • Another institution trying to make sense of the CLA. This study compared students' CLA scores with criteria-based scores of their e-portfolios. The study used a modified version of the VALUE rubrics developed by the AAC&U. Our own Gary Brown was on the team that developed the critical thinking rubric for the VALUE project.
  • "The CLA can provide broad institutional data that satisfies VSA requirements, while rubric-based assessment provides better information to facilitate continuous program improvement." This raises some questions: what meaning can we attach to these two non-correlated measures? What VSA requirements can rubric-based assessment NOT satisfy? Are those "requirements" really useful?
Joshua Yeidel

A Measure of Learning Is Put to the Test - Faculty - The Chronicle of Higher Education - 1 views

  • "The CLA is really an authentic assessment process,"
    • Joshua Yeidel: What is the meaning of "authentic" in this statement? It certainly isn't "situated in the real world" or "of intrinsic value".
  • it measures analytical ability, problem-solving ability, critical thinking, and communication.
  • the CLA typically reports scores on a "value added" basis, controlling for the scores that students earned on the SAT or ACT while in high school.
    • Joshua Yeidel: If SAT and ACT are measuring the same things as the CLA, then why not just use them? If they are measuring different things, why "control for" their scores?
  • ...5 more annotations...
  • improved models of instruction.
  • add CLA-style assignments to their liberal-arts courses.
    • Joshua Yeidel: Maybe the best way to prepare for the test, but is it the best way to develop analytical ability and the other skills?
  • "If a college pays attention to learning and helps students develop their skills—whether they do that by participating in our programs or by doing things on their own—they probably should do better on the CLA,"
    • Joshua Yeidel: Just in case anyone missed the message: pay attention to learning, and you'll _probably_ do better on the CLA. Get students to practice CLA tasks, and you _will_ do better on the CLA.
  • "Standardized tests of generic skills—I'm not talking about testing in the major—are so much a measure of what students bring to college with them that there is very little variance left out of which we might tease the effects of college," says Ms. Banta, who is a longtime critic of the CLA. "There's just not enough variance there to make comparative judgments about the comparative quality of institutions."
    • Joshua Yeidel: It's not clear what "standardized tests" means in this comment. Does the "lack of variance" apply to all assessments (including, e.g., e-portfolios)?
  • Can the CLA fill both of those roles?
  • A summary of the current state of "thinking" with regard to the CLA. Many fallacies and contradictions are (unintentionally) exposed. At least the CLA appears to be more about skills than content (though the question of how it is graded isn't even raised), but the "performance task" approach is the smallest possible step in that direction.
Nils Peterson

AAC&U News | April 2010 | Feature - 1 views

  • Comparing Rubric Assessments to Standardized Tests
  • First, the university, a public institution of about 40,000 students in Ohio, needed to comply with the Voluntary System of Accountability (VSA), which requires that state institutions provide data about graduation rates, tuition, student characteristics, and student learning outcomes, among other measures, in the consistent format developed by its two sponsoring organizations, the Association of Public and Land-grant Universities (APLU), and the Association of State Colleges and Universities (AASCU).
  • And finally, UC was accepted in 2008 as a member of the fifth cohort of the Inter/National Coalition for Electronic Portfolio Research, a collaborative body with the goal of advancing knowledge about the effect of electronic portfolio use on student learning outcomes.  
  • ...13 more annotations...
  • outcomes required of all UC students—including critical thinking, knowledge integration, social responsibility, and effective communication
  • “The wonderful thing about this approach is that full-time faculty across the university  are gathering data about how their  students are doing, and since they’ll be teaching their courses in the future, they’re really invested in rubric assessment—they really care,” Escoe says. In one case, the capstone survey data revealed that students weren’t doing as well as expected in writing, and faculty from that program adjusted their pedagogy to include more writing assignments and writing assessments throughout the program, not just at the capstone level. As the university prepares to switch from a quarter system to semester system in two years, faculty members are using the capstone survey data to assist their course redesigns, Escoe says.
  • the university planned a “dual pilot” study examining the applicability of electronic portfolio assessment of writing and critical thinking alongside the Collegiate Learning Assessment,
  • The rubrics the UC team used were slightly modified versions of those developed by AAC&U’s Valid Assessment of Learning in Undergraduate Education (VALUE) project. 
  • In the critical thinking rubric assessment, for example, faculty evaluated student proposals for experiential honors projects that they could potentially complete in upcoming years.  The faculty assessors were trained and their rubric assessments “normed” to ensure that interrater reliability was suitably high.
  • “We found no statistically significant correlation between the CLA scores and the portfolio scores,”
  • There were many factors that may have contributed to the lack of correlation, she says, including the fact that the CLA is timed, while the rubric assignments are not; and that the rubric scores were diagnostic and included specific feedback, while the CLA awarded points “in a black box”:
  • faculty members may have had exceptionally high expectations of their honors students and assessed the e-portfolios with those high expectations in mind—leading to results that would not correlate to a computer-scored test. 
  • “The CLA provides scores at the institutional level. It doesn’t give me a picture of how I can affect those specific students’ learning. So that’s where rubric assessment comes in—you can use it to look at data that’s compiled over time.”
  • Their portfolios are now more like real learning portfolios, not just a few artifacts, and we want to look at them as they go into their third and fourth years to see what they can tell us about students’ whole program of study.”  Hall and Robles are also looking into the possibility of forming relationships with other schools from NCEPR to exchange student e-portfolios and do a larger study on the value of rubric assessment of student learning.
  • “We’re really trying to stress that assessment is pedagogy,”
  • “It’s not some nitpicky, onerous administrative add-on. It’s what we do as we teach our courses, and it really helps close that assessment loop.”
  • In the end, Escoe says, the two assessments are both useful, but for different things. The CLA can provide broad institutional data that satisfies VSA requirements, while rubric-based assessment provides better information to facilitate continuous program improvement.
    • Nils Peterson: The CLA did not provide information for continuous program improvement -- we've heard this argument before.
  • The lack of correlation might be rephrased: there appears to be no correlation between what is useful for faculty who teach and what is useful for the VSA. A corollary question: of what use is the VSA?
Joshua Yeidel

Scholar Raises Doubts About the Value of a Test of Student Learning - Research - The Ch... - 3 views

  • Beginning in 2011, the 331 universities that participate in the Voluntary System of Accountability will be expected to publicly report their students' performance on one of three national tests of college-level learning.
  • But at least one of those three tests—the Collegiate Learning Assessment, or CLA—isn't quite ready to be used as a tool of public accountability, a scholar suggested here on Tuesday during the annual meeting of the Association for Institutional Research.
  • Students' performance on the test was strongly correlated with how long they spent taking it.
  • ...6 more annotations...
  • Besides the CLA, which is sponsored by the Council for Aid to Education, other tests that participants in the voluntary system may use are the Collegiate Assessment of Academic Proficiency, from ACT Inc., and the Measure of Academic Proficiency and Progress, offered by the Educational Testing Service.
  • The test has sometimes been criticized for relying on a cross-sectional system rather than a longitudinal model, in which the same students would be tested in their first and fourth years of college.
  • there have long been concerns about just how motivated students are to perform well on the CLA.
  • Mr. Hosch suggested that small groups of similar colleges should create consortia for measuring student learning. For example, five liberal-arts colleges might create a common pool of faculty members that would evaluate senior theses from all five colleges. "That wouldn't be a national measure," Mr. Hosch said, "but it would be much more authentic."
  • Mr. Shavelson said. "The challenge confronting higher education is for institutions to address the recruitment and motivation issues if they are to get useful data. From my perspective, we need to integrate assessment into teaching and learning as part of students' programs of study, thereby raising the stakes a bit while enhancing motivation of both students and faculty
  • "I do agree with his central point that it would not be prudent to move to an accountability system based on cross-sectional assessments of freshmen and seniors at an institution," said Mr. Arum, who is an author, with Josipa Roksa, of Academically Adrift: Limited Learning on College Campuses, forthcoming from the University of Chicago Press
  • CLA debunking, but the best item may be the forthcoming book on limited learning on college campuses.
  • "Michael Scriven and I spent more than a few years trying to apply his multiple-ranking item tool (a very robust and creative tool, I recommend it to others when the alternative is multiple-choice items) to the assessment of critical thinking in health care professionals. The result might be deemed partially successful, at best. I eventually abandoned the test after about 10,000 administrations because the scoring was so complex we could not place it in non-technical hands."
  • In comments on an article about the CLA, Scriven's name comes up...
Gary Brown

Want Students to Take an Optional Test? Wave 25 Bucks at Them - Students - The Chronicl... - 0 views

  • cash, appears to be the single best approach for colleges trying to recruit students to volunteer for institutional assessments and other low-stakes tests with no bearing on their grades.
  • American Educational Research Association
  • A college's choice of which incentive to offer does not appear to have a significant effect on how students end up performing, but it can have a big impact on colleges' ability to round up enough students for the assessments, the study found.
  • ...6 more annotations...
  • "I cannot provide you with the magic bullet that will help you recruit your students and make sure they are performing to the maximum of their ability," Mr. Steedle acknowledged to his audience at the Denver Convention Center. But, he said, his study results make clear that some recruitment strategies are more effective than others, and also offer some notes of caution for those examining students' scores.
  • The study focused on the council's Collegiate Learning Assessment, or CLA, an open-ended test of critical thinking and writing skills which is annually administered by several hundred colleges. Most of the colleges that use the test try to recruit 100 freshmen and 100 seniors to take it, but doing so can be daunting, especially for colleges that administer it in the spring, right when the seniors are focused on wrapping up their work and graduating.
  • The incentives that spurred students the least were the opportunity to help their college as an institution assess student learning, the opportunity to compare themselves to other students, a promise they would be recognized in some college publication, and the opportunity to put participation in the test on their resume.
  • The incentives which students preferred appeared to have no significant bearing on their performance. Those who appeared most inspired by a chance to earn 25 dollars did not perform better on the CLA than those whose responses suggested they would leap at the chance to help out a professor.
  • What accounted for differences in test scores? Students' academic ability going into the test, as measured by characteristics such as their SAT scores, accounted for 34 percent of the variation in CLA scores among individual students. But motivation, independent of ability, accounted for 5 percent of the variation in test scores—a finding that, the paper says, suggests it is "sensible" for colleges to be concerned that students with low motivation are not posting scores that can allow valid comparisons with other students or valid assessments of their individual strengths and weaknesses.
  • A major limitation of the study was that Mr. Steedle had no way of knowing how the students who took the test were recruited. "If many of them were recruited using cash and prizes, it would not be surprising if these students reported cash and prizes as the most preferable incentives," his paper concedes.
  • Since it is not clear how the students in this study were recruited, it remains unclear whether the incentives that drew them to participate also correlate with their performance.
Gary Brown

A Final Word on the Presidents' Student-Learning Alliance - Measuring Stick - The Chron... - 1 views

  • I was very pleased to see the responses to the announcement of the Presidents’ Alliance as generally welcoming (“commendable,” “laudatory initiative,” “applaud”) the shared commitment of these 71 founding institutions to do more—and do it publicly and cooperatively—with regard to gathering, reporting, and using evidence of student learning.
  • establishing institutional indicators of educational progress that could be valuable in increasing transparency may not suggest what needs changing to improve results
  • As Adelman’s implied critique of the CLA indicates, we may end up with an indicator without connections to practice.
  • ...6 more annotations...
  • The Presidents’ Alliance’s focus on and encouragement of institutional efforts is important to making these connections and steps in a direct way supporting improvement.
  • Second, it is hard to disagree with the notion that ultimately evidence-based improvement will occur only if faculty members are appropriately trained and encouraged to improve their classroom work with undergraduates.
  • Certainly there has to be some connection between and among various levels of assessment—classroom, program, department, and institution—in order to have evidence that serves both to aid improvement and to provide transparency and accountability.
  • Presidents’ Alliance is setting forth a common framework of “critical dimensions” that institutions can use to evaluate and extend their own efforts, efforts that would include better reporting for transparency and accountability and greater involvement of faculty.
  • there is wide variation in where institutions are in their efforts, and we have a long way to go. But what is critical here is the public commitment of these institutions to work on their campuses and together to improve the gathering and reporting of evidence of student learning and, in turn, using evidence to improve outcomes.
  • The involvement of institutions of all types will make it possible to build a more coherent and cohesive professional community in which evidence-based improvement of student learning is tangible, visible, and ongoing.
Nils Peterson

E-Portfolios for Learning: Limitations of Portfolios - 1 views

  • Today, Shavelson, Klein & Benjamin published an online article on Inside Higher Ed entitled, "The Limitations of Portfolios." The comments to that article are even more illuminating, and highlight the debate about electronic portfolios vs. accountability systems... assessment vs. evaluation. These arguments highlight what I think is a clash in philosophies of learning and assessment, between traditional, behaviorist models and more progressive, cognitive/constructivist models. How do we build assessment strategies that bridge these two approaches? Or is the divide too wide? Do these different perspectives support the need for multiple measures and triangulation?
    • Nils Peterson: Helen responds to CLA proponents.
Theron DesRosier

An Expert Surveys the Assessment Landscape - The Chronicle of Higher Education - 2 views

  • What we want is for assessment to become a public, shared responsibility, so there should be departmental leadership.
  • "What we want is for assessment to become a public, shared responsibility, so there should be departmental leadership." - George Kuh, director of the National Institute for Learning Outcomes Assessment.
  • Kuh also says, "So we're going to spend some time looking at the impact of the Voluntary System of Accountability. It's one thing for schools to sign up, it's another to post the information and to show that they're actually doing something with it. It's not about posting a score on a Web site; it's about doing something with the data." He doesn't take the next step and ask if it is even possible for schools to actually do anything with the data collected from the CLA, or ask who has access to the criteria: Students? Faculty? Anyone?
Gary Brown

News: Assessing the Assessments - Inside Higher Ed - 2 views

  • The validity of a measure is based on evidence regarding the inferences and assumptions that are intended to be made and the uses to which the measure will be put. Showing that the three tests in question are comparable does not support Shulenburger's assertion regarding the value-added measure as a valid indicator of institutional effectiveness. The claim that public university groups have previously judged the value-added measure as appropriate does not tell us anything about the evidence upon which this judgment was based nor the conditions under which the judgment was reached. As someone familiar with the process, I would assert that there was no compelling evidence presented that these instruments and the value-added measure were validated for making this assertion (no such evidence was available at the time), which is the intended use in the VSA.
  • (however much the sellers of these tests tell you that those samples are "representative"), they provide an easy way out for academic administrators who want to avoid the time-and-effort consuming but incredibly valuable task of developing detailed major program learning outcome statements (even the specialized accrediting bodies don't get down to the level of discrete, operational statements that guide faculty toward appropriate assessment design)
  • If somebody really cared about "value added," they could look at each student's first essay in this course, and compare it with that same student's last essay in this course. This person could then evaluate each individual student's increased mastery of the subject-matter in the course (there's a lot) and also the increased writing skill, if any.
  • ...1 more annotation...
  • These skills cannot be separated out from student success in learning sophisticated subject-matter, because understanding anthropology, or history of science, or organic chemistry, or Japanese painting, is not a matter of absorbing individual facts, but learning facts and ways of thinking about them in a seamless, synthetic way. No assessment scheme that neglects these obvious facts about higher education is going to do anybody any good, and we'll be wasting valuable intellectual and financial resources if we try to design one.
  • Ongoing discussion of these tools. Note Longanecker's comment and ask me why.
Gary Brown

Why Liberal Arts Need Career Services - Commentary - The Chronicle of Higher Education - 1 views

shared by Gary Brown on 04 Nov 09
  • I have consulted with deans who say they really want to improve their career-services programs—but no, they can't offer career courses for credit, their professors aren't interested in supervising internships, and they must tread lightly around anything that might be seen by the faculty as encroaching vocationalism.
  • I've also heard from numerous professors, "Our good students go to graduate school. We don't need to focus on those who are looking for jobs."
  • I have also been told, "The professors are too busy teaching to worry about how the students will use their knowledge. It's not their job." And I've had more than one faculty member confess to me that they really aren't sure how what they teach applies in the nonacademic world.
  • ...6 more annotations...
  • There's no way to predict which moments of a liberal-arts education will be directly relevant in the workplace, but it's imperative that students know such moments occur frequently, and that the skills and knowledge they're learning are far from obscure and irrelevant.
  • We use visual-thinking techniques to help students connect the dots between their academic experience and the workplace.
  • But the opportunity to teach a career course that directly draws from a liberal-arts curriculum is not offered at many institutions. Instead, if courses are offered at all, they typically focus on basic job-finding skills like résumé writing and networking, serving to reinforce professors' worst beliefs about career advising: that it distracts and detracts from the educational process.
  • Professors, academic deans, and career-center staff members must work together. Learn what is happening in each other's shops. And don't have just a superficial conversation about services—instead, engage in a conversation about what is truly distinct about the curriculum, what students are learning, and how to make employers care.
  • Career-center personnel should find out what employers are seeking and what they say about your students.
  • If more liberal-arts faculties and career experts get together, watch out—the results could be amazing.
  • As we work with the CLA, perhaps inviting Career Services to join as independent reviewers will help the effort.
  • What seems to be missing here is inclusion of employers directly into the learning community. If professors "aren't sure how what they teach applies in the nonacademic world", get somebody directly involved who is, as one voice in the conversation, anyway. Of course this will be challenging to the professoriate, but it could also be revitalizing.