CTLT and Friends: Group items tagged assessments

Gary Brown

News: Assessing the Assessments - Inside Higher Ed - 2 views

  • The validity of a measure is based on evidence regarding the inferences and assumptions that are intended to be made and the uses to which the measure will be put. Showing that the three tests in question are comparable does not support Shulenburger's assertion regarding the value-added measure as a valid indicator of institutional effectiveness. The claim that public university groups have previously judged the value-added measure as appropriate does not tell us anything about the evidence upon which this judgment was based nor the conditions under which the judgment was reached. As someone familiar with the process, I would assert that there was no compelling evidence presented that these instruments and the value-added measure were validated for making this assertion (no such evidence was available at the time), which is the intended use in the VSA.
  • (however much the sellers of these tests tell you that those samples are "representative"), they provide an easy way out for academic administrators who want to avoid the time-and-effort consuming but incredibly valuable task of developing detailed major program learning outcome statements (even the specialized accrediting bodies don't get down to the level of discrete, operational statements that guide faculty toward appropriate assessment design)
  • If somebody really cared about "value added," they could look at each student's first essay in this course, and compare it with that same student's last essay in this course. This person could then evaluate each individual student's increased mastery of the subject-matter in the course (there's a lot) and also the increased writing skill, if any.
  • These skills cannot be separated out from student success in learning sophisticated subject-matter, because understanding anthropology, or history of science, or organic chemistry, or Japanese painting, is not a matter of absorbing individual facts, but learning facts and ways of thinking about them in a seamless, synthetic way. No assessment scheme that neglects these obvious facts about higher education is going to do anybody any good, and we'll be wasting valuable intellectual and financial resources if we try to design one.
  •  
    An ongoing discussion of these tools. Note Longanecker's comment and ask me why.
Gary Brown

Program Assessment of Student Learning: July 2010 - 3 views

  • There are lots of considerations when evaluating a technology solution to the outcomes assessment process.  The first thing is to be very clear about what a system can and cannot do.  It CANNOT do your program assessment and evaluation for you!  The institution or program must first define the intended outcomes and performance indicators.  Without a doubt, that is the most difficult part of the process.  Once the indicators have been defined you need to be clear about the role of students and faculty in the use of the technology.  Also, who is the technology "owner"--who will maintain it, keep the outcomes/indicators current, generate reports, etc.?
  •  
    This question keeps coming back to us, so here is a resource and key we can point to.
Theron DesRosier

Engaging Departments: Assessing Student Learning, Peer Review single issue - 1 views

  •  
    "Description This issue explores how departments are developing assessment approaches that deepen student learning. Recognizing that most faculty identify strongly with their discipline and that students are engaged in more complex and sophisticated practice of liberal learning as they complete their majors, the issue presents articles that advance integrative and engaged learning in and across disciplines. The features draw on sessions and presentations from AAC&U's 2009 Engaging Departments Institute. " A pdf download is available on this page.
Gary Brown

A Measure of Learning Is Put to the Test - Faculty - The Chronicle of Higher Education - 1 views

  • Others say those who take the test have little motivation to do well, which makes it tough to draw conclusions from their performance.
  • "Everything that No Child Left Behind signified during the Bush administration—we operate 180 degrees away from that," says Roger Benjamin, president of the Council for Aid to Education, which developed and promotes the CLA. "We don't want this to be a high-stakes test. We're putting a stake in the ground on classic liberal-arts issues. I'm willing to rest my oar there. These core abilities, these higher-order skills, are very important, and they're even more important in a knowledge economy where everyone needs to deal with a surplus of information." Only an essay test, like the CLA, he says, can really get at those skills.
  • "The CLA is really an authentic assessment process," says Pedro Reyes, associate vice chancellor for academic planning and assessment at the University of Texas system.
  • "The Board of Regents here saw that it would be an important test because it measures analytical ability, problem-solving ability, critical thinking, and communication. Those are the skills that you want every undergraduate to walk away with." (Other large systems that have embraced the CLA include California State University and the West Virginia system.)
  • value added
  • We began by administering a retired CLA question, a task that had to do with analyzing crime-reduction strategies,
  • performance task that mirrors the CLA
  • Mr. Ernsting and Ms. McConnell are perfectly sincere about using CLA-style tasks to improve instruction on their campuses. But at the same time, colleges have a less high-minded motive for familiarizing students with the CLA style: It just might improve their scores when it comes time to take the actual test.
  • by 2012, the CLA scores of more than 100 colleges will be posted, for all the world to see, on the "College Portrait" Web site of the Voluntary System of Accountability, an effort by more than 300 public colleges and universities to provide information about life and learning on their campuses.
  • If familiarizing students with CLA-style tasks does raise their scores, then the CLA might not be a pure, unmediated reflection of the full range of liberal-arts skills. How exactly should the public interpret the scores of colleges that do not use such training exercises?
  • Trudy W. Banta, a professor of higher education and senior adviser to the chancellor for academic planning and evaluation at Indiana University-Purdue University at Indianapolis, believes it is a serious mistake to publicly release and compare scores on the test. There is too much risk, she says, that policy makers and the public will misinterpret the numbers.
  • most colleges do not use a true longitudinal model: That is, the students who take the CLA in their first year do not take it again in their senior year. The test's value-added model is therefore based on a potentially apples-and-oranges comparison.
  • freshman test-takers' scores are assessed relative to their SAT and ACT scores, and so are senior test-takers' scores. For that reason, colleges cannot game the test by recruiting an academically weak pool of freshmen and a strong pool of seniors. (A sketch of this kind of adjustment appears after these notes.)
  • students do not always have much motivation to take the test seriously
  • seniors, who are typically recruited to take the CLA toward the end of their final semester, when they can already taste the graduation champagne.
  • Of the few dozen universities that had already chosen to publish CLA data on that site, roughly a quarter of the reports appeared to include erroneous descriptions of the year-to-year value-added scores.
  • It is clear that CLA scores do reflect some broad properties of a college education.
  • Students' CLA scores improved if they took courses that required a substantial amount of reading and writing. Many students didn't take such courses, and their CLA scores tended to stay flat.
  • Colleges that make demands on students can actually develop their skills on the kinds of things measured by the CLA.
  • Mr. Shavelson believes the CLA's essays and "performance tasks" offer an unusually sophisticated way of measuring what colleges do, without relying too heavily on factual knowledge from any one academic field.
  • Politicians and consumers want easily interpretable scores, while colleges need subtler and more detailed data to make internal improvements.
  • The CLA is used at more than 400 colleges
  • Since its debut a decade ago, it has been widely praised as a sophisticated alternative to multiple-choice tests
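
The value-added adjustment described in these notes can be made concrete with a small example. What follows is a minimal sketch of a cross-sectional, SAT-adjusted comparison of the kind the article describes; it is an illustration only, not the CLA's actual (proprietary) scoring model, and all data and variable names are hypothetical.

```python
import numpy as np

def value_added(fr_cla, fr_sat, sr_cla, sr_sat):
    """Cross-sectional value added: the gap between seniors' and
    freshmen's deviations from the CLA score their SATs predict."""
    # Fit the SAT -> CLA relationship on the pooled sample; a real
    # model would use a national norming sample instead.
    sat = np.concatenate([fr_sat, sr_sat])
    cla = np.concatenate([fr_cla, sr_cla])
    slope, intercept = np.polyfit(sat, cla, 1)
    # Each cohort's mean performance relative to prediction.
    fr_dev = np.mean(fr_cla - (intercept + slope * fr_sat))
    sr_dev = np.mean(sr_cla - (intercept + slope * sr_sat))
    # A positive gap reads as institutional "value added", but the
    # freshmen and seniors are different students, which is exactly
    # the apples-and-oranges worry raised above.
    return sr_dev - fr_dev

# Hypothetical cohorts: seniors score about 80 points above what
# their SATs alone would predict.
rng = np.random.default_rng(0)
fr_sat = rng.normal(1100, 150, 200)
sr_sat = rng.normal(1100, 150, 180)
fr_cla = 0.5 * fr_sat + rng.normal(0, 60, 200)
sr_cla = 0.5 * sr_sat + 80 + rng.normal(0, 60, 180)
print(round(value_added(fr_cla, fr_sat, sr_cla, sr_sat)))  # roughly 80
```

Because scores are adjusted against entering ability rather than tracked for the same students over time, the result depends entirely on how comparable the two samples are, which is the criticism the longitudinal-model note above raises.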
Gary Brown

Many College Boards Are at Sea in Assessing Student Learning, Survey Finds - Leadership... - 0 views

  • While oversight of educational quality is a critical responsibility of college boards of trustees, a majority of trustees and chief academic officers say boards do not spend enough time discussing student-learning outcomes, and more than a third say boards do not understand how student learning is assessed, says a report issued on Thursday by the Association of Governing Boards of Universities and Colleges.
  • While boards should not get involved in the details of teaching or ways to improve student-learning outcomes, they must hold the administration accountable for identifying needs in the academic programs and then meeting them, the report says. Boards should also make decisions on where to allocate resources based on what works or what should improve.
  • The information boards most commonly received was college-ranking data
  • Boards should expect to receive useful high-level information on learning outcomes, the report says, and should make comparisons over time and to other institutions. Training in how to understand academic and learning assessments should also be part of orientation for new board members.
  •  
    This piece, coupled with the usual commentary, reveals again the profound identity crisis shaking education in this country.
Gary Brown

Grazing: Criteria for great assessment tools - 1 views

  •  
    Perhaps these sum to utility, but number 5, generativity, would benefit from some unpacking.
Nils Peterson

It's Time to Improve Academic, Not Just Administrative, Productivity - Chronicle.com - 0 views

  •  
    Kimberly said of this: "The focus on activity deals directly with the learning process - one that pushes students to take a more active role - while assessment supplies faculty members with the feedback necessary to diagnose and correct learning problems. Technology allows such active learning processes to be expanded to large courses and, as learning software and databases become better, to use faculty time more effectively." Relates to clickers and Skylight learning activities/assessments in the large-class context, as well as the elusive LMS.
Nils Peterson

From Knowledgable to Knowledge-able: Learning in New Media Environments | Academic Commons - 0 views

  • Many faculty may hope to subvert the system, but a variety of social structures work against them. Radical experiments in teaching carry no guarantees and even fewer rewards in most tenure and promotion systems, even if they are successful. In many cases faculty are required to assess their students in a standardized way to fulfill requirements for the curriculum. Nothing is easier to assess than information recall on multiple-choice exams, and the concise and “objective” numbers satisfy committee members busy with their own teaching and research.
    • Nils Peterson
       
      Do we think this is true? Many?
  • In a world of nearly infinite information, we must first address why, facilitate how, and let the what generate naturally from there.
  •  
    "Most university classrooms have gone through a massive transformation in the past ten years. I'm not talking about the numerous initiatives for multiple plasma screens, moveable chairs, round tables, or digital whiteboards. The change is visually more subtle, yet potentially much more transformative."
  •  
    Connect this to the 10-point self-assessment we did for AAC&U comparing institutional vs. community-based learning: https://teamsite.oue.wsu.edu/ctlt/home/Anonymous%20Access%20Documents/AACU%202009/inst%20vs%20comm%20based%20spectrum.pdf
Gary Brown

Educators Mull How to Motivate Professors to Improve Teaching - Curriculum - The Chroni... - 4 views

  • "Without an unrelenting focus on quality—on defining and measuring and ensuring the learning outcomes of students—any effort to increase college-completion rates would be a hollow effort indeed."
  • If colleges are going to provide high-quality educations to millions of additional students, they said, the institutions will need to develop measures of student learning that can assure parents, employers, and taxpayers that no one's time and money are being wasted.
  • "Effective assessment is critical to ensure that our colleges and universities are delivering the kinds of educational experiences that we believe we actually provide for students," said Ronald A. Crutcher, president of Wheaton College, in Massachusetts, during the opening plenary. "That data is also vital to addressing the skepticism that society has about the value of a liberal education."
  • But many speakers insisted that colleges should go ahead and take drastic steps to improve the quality of their instruction, without using rigid faculty-incentive structures or the fiscal crisis as excuses for inaction.
  • Handing out "teacher of the year" awards may not do much for a college
  • As W. E. Deming argued, quality has to be designed into the entire system and supported by top management (that is, every decision made by CEOs and Presidents, and support systems as well as operations) rather than being made the responsibility solely of those delivering 'at the coal face'.
  • I see a certain cluelessness among those who think one can create substantial change based on volunteerism
  • Current approaches to broaden the instructional repertoires of faculty members include faculty workshops, summer leave, and individual consultations, but these approaches work only for those relatively few faculty members who seek out opportunities to broaden their instructional methods.
  • The approach that makes sense to me is to engage faculty members at the departmental level in a discussion of the future and the implications of the future for their field, their college, their students, and themselves. You are invited to join an ongoing discussion of this issue at http://innovate-ideagora.ning.com/forum/topics/addressing-the-problem-of
  • Putting pressure on professors to improve teaching will not result in better education. The primary reason is that they do not know how to make real improvements. The problem is that in many fields of education there is either not enough research, or they do not have good ways of evaluating the results of their teaching.
  • Then there needs to be a research-based assessment that can be used by individual professors, NOT by the administration.
  • Humanities educators either have to learn enough statistics and cognitive science so they can make valid scientific comparisons of different strategies, or they have to work with cognitive scientists and statisticians
  • good teaching takes time
  • On the measurement side, about half of the assessments constructed by faculty fail to meet reasonable minimum standards for validity. (Interestingly, these failures leave the door open to a class action lawsuit. Physicians are successfully sued for failing to apply scientific findings correctly; commerce is replete with lawsuits based on measurement errors.)
  • The elephant in the corner of the room --still-- is that we refuse to measure learning outcomes and impact, especially proficiencies generalized to one's life outside the classroom.
  • until universities stop playing games to make themselves look better because they want to maintain their comfortable positions, and actually look at what they can do to improve, nothing is going to change.
  •  
    our work, our friends (Ken and Jim), and more context that shapes our strategy.
  •  
    How about using examples of highly motivational lecture and teaching techniques, like the Richard Dawkins video I presented on this forum recently? Even if teachers do not consciously try to adopt good working techniques, there is at least a strong subconscious human tendency to mimic behaviors. I think that if teachers see more effective techniques, they will automatically begin to adopt them.
Gary Brown

National Institute for Learning Outcomes Assessment - 2 views

  • Three promising alternatives for assessing college students' knowledge and skills. (NILOA Occasional Paper No. 2). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.
  •  
    Banta and team
Gary Brown

electronic portfolios and student learning outcomes assessment - 0 views

  • There has been an upsurge in reports and press coverage concerning accountability issues and student learning outcomes assessment (SLOA) in higher education. This paper is a brief overview of that upsurge, citing and synthesizing some of the most recent information published about accountability and SLOA.
  •  
    A resource
Joshua Yeidel

Program Assessment of Student Learning - 3 views

  •  
    "It is hoped that, in some small way, this blog can both engage and challenge faculty and administrators alike to become more intentional in their program assessment efforts, creating systematic and efficient processes that actually have the likelihood of improving student learning while honoring faculty time."
  •  
    As recommended by Ashley. Apparently Dr. Rogers' blog is just starting up, so you can "get in on the ground floor".
Joshua Yeidel

Performance Assessment | The Alternative to High Stakes Testing - 0 views

  •  
    " The New York Performance Standards Consortium represents 28 schools across New York State. Formed in 1997, the Consortium opposes high stakes tests arguing that "one size does not fit all." Despite skepticism that an alternative to high stakes tests could work, the New York Performance Standards Consortium has done just that...developed an assessment system that leads to quality teaching, that enhances rather than compromises our students' education. Consortium school graduates go on to college and are successful."
Nils Peterson

How Web-Savvy Edupunks Are Transforming American Higher Education | Page 3 | Fast Company - 0 views

  • If open courseware is about applying technology to sharing knowledge, and Peer2Peer is about social networking for teaching and learning, Bob Mendenhall, president of the online Western Governors University, is proudest of his college's innovation in the third, hardest-to-crack dimension of education: accreditation and assessment.
    • Nils Peterson
       
      Spoke too soon
  •  
    "We said, 'Let's create a university that actually measures learning,' " Mendenhall says. "We do not have credit hours, we do not have grades. We simply have a series of assessments that measure competencies, and on that basis, award the degree." WGU began by convening a national advisory board of employers, including Google and Tenet Healthcare. "We asked them, 'What is it the graduates you're hiring can't do that you wish they could?' We've never had a silence after that question." Then assessments were created to measure each competency area. Mendenhall recalls one student who had been self-employed in IT for 15 years but never earned a degree; he passed all the required assessments in six months and took home his bachelor's without taking a course.
Joshua Yeidel

Fair Assessment Practices - Giving Students Equitable Opportunities to Demonstrate Learning - 0 views

  •  
    Seven principles for conducting assessment of student learning. Basic stuff well summarized.
Judy Rumph

about | outcomes_assessment | planning | NYIT - 1 views

shared by Judy Rumph on 17 Aug 10
  • The Assessment Committee of NYIT's Academic Senate is the institutional unit that brings together all program assessment activities at the university - for programs with and without professional accreditation, for programs at all locations, for programs given through all delivery mechanisms. The committee members come from all academic schools and numerous support departments. Its meetings are open and minutes are posted on the web site of the Academic Senate.
  •  
    This page made me think about the public face of our own assessment process and how that can influence perceptions about our process.
Nils Peterson

CITE Journal -- Volume 2, Issue 4 - 0 views

  • The ability to aggregate data for assessment is counted as a plus for CS and a minus for GT
    • Nils Peterson
       
      This analysis precedes the Harvesting concept.
  • The map includes the portfolio's ability to aid learners in planning, setting goals, and navigating the artifacts learners create and collect.
    • Nils Peterson
       
      Recently, when I have been thinking about program assessment, I've been thinking about how students might assess courses (before adding the course to their transcript, aka portfolio) in terms of the student's learning needs for developing proficiency in the 6 WSU goals. Students might also do a course evaluation relative to the 6 goals to give instructors and fellow students guideposts. So the notion here, portfolio as map, would be that the portfolio had a way for the learner to track/map progress toward a goal. Perhaps a series of radar charts associated with a series of artifacts. Learner reflection would lead to conclusions about which aspects of the rubric needed more practice in the creation of the next artifacts going into the portfolio. (A minimal sketch of this idea follows.)
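
A minimal sketch of that "portfolio as map" idea. The goal names, artifact names, and 1-4 rating scale below are placeholders, not WSU's actual six goals or rubric; this illustrates the data a series of radar charts could be drawn from, not an existing system.

```python
from statistics import mean

# Placeholder labels for the six WSU goals (the real names are not
# reproduced here); ratings use an assumed 1-4 rubric scale.
GOALS = ["goal_1", "goal_2", "goal_3", "goal_4", "goal_5", "goal_6"]

# Each artifact in the portfolio carries a rubric rating on every goal.
portfolio = [
    {"artifact": "essay_1",
     "ratings": {"goal_1": 2, "goal_2": 3, "goal_3": 1,
                 "goal_4": 2, "goal_5": 4, "goal_6": 2}},
    {"artifact": "project_1",
     "ratings": {"goal_1": 3, "goal_2": 3, "goal_3": 2,
                 "goal_4": 2, "goal_5": 4, "goal_6": 3}},
]

def progress_map(portfolio):
    """The 'map': average rating per goal across all artifacts,
    the kind of summary a radar chart would plot."""
    return {g: mean(a["ratings"][g] for a in portfolio) for g in GOALS}

def next_focus(portfolio):
    """The goal with the lowest average rating, i.e., where reflection
    suggests the next artifact should aim."""
    progress = progress_map(portfolio)
    return min(progress, key=progress.get)

print(next_focus(portfolio))  # -> 'goal_3'
```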
Joshua Yeidel

Blog U.: The Challenge of Value-Added - Digital Tweed - Inside Higher Ed - 0 views

  •  
    Quoting a 1984 study, "higher education should ensure that the mounds of data already collected on students are converted into useful information and fed back [to campus officials and faculty] in ways that enhance student learning and lead to improvement in programs, teaching practices, and the environment in which teaching and learning take place." The example given is an analysis of test scores in the Los Angeles Unified School District by the LA Times.
  •  
    It's going to take some assessment (and political) smarts to deflect the notion that existing data can be re-purposed easily to assess "value-added".
Gary Brown

Details | LinkedIn - 0 views

  • Although different members of the academic hierarchy take on different roles regarding student learning, student learning is everyone’s concern in an academic setting. As I specified in my article comments, universities would do well to use their academic support units, which often have evaluation teams (or a designated evaluator) to assist in providing boards the information they need for decision making. Perhaps boards are not aware of those serving in evaluation roles at the university or how those staff members can assist boards in their endeavors.
  • Gary Brown • We have been using the Internet to post program assessment plans and reports (the programs that support this initiative at least), our criteria (rubric) for reviewing them, and then inviting external stakeholders to join in the review process.
Joshua Yeidel

Using Outcome Information: Making Data Pay Off - 1 views

  •  
    Sixth in a series on outcome management for nonprofits. Grist for the mill for any Assessment Handbook we might make. "Systematic use of outcome data pays off. In an independent survey of nearly 400 health and human service organizations, program directors agreed or strongly agreed that implementing program outcome measurement had helped their programs
    * focus staff on shared goals (88%);
    * communicate results to stakeholders (88%);
    * clarify program purpose (86%);
    * identify effective practices (84%);
    * compete for resources (83%);
    * enhance record keeping (80%); and
    * improve service delivery (76%)."