CTLT and Friends: Group items tagged "outcomes"

Using Outcome Information: Making Data Pay Off - 1 views

  •  
    Sixth in a series on outcome management for nonprofits. Grist for the mill for any Assessment Handbook we might make. "Systematic use of outcome data pays off. In an independent survey of nearly 400 health and human service organizations, program directors agreed or strongly agreed that implementing program outcome measurement had helped their programs:
    * focus staff on shared goals (88%);
    * communicate results to stakeholders (88%);
    * clarify program purpose (86%);
    * identify effective practices (84%);
    * compete for resources (83%);
    * enhance record keeping (80%); and
    * improve service delivery (76%)."

Comments on the report - GEVC Report Comments - University College - Washington State U... - 2 views

  • My primary concern rests with the heavy emphasis on "outcomes based" learning. First, I find it difficult to imagine teaching to outcomes as separate from teaching my content -- I do not consider "content" and "outcomes" as discrete entities; rather, they overlap. This overlap may partly be the reason for the thin and somewhat unconvincing literature on "outcomes based learning." I would therefore like to see in this process a thorough and detailed analysis of the literature on "outcomes" vs content-based learning, followed by thoughtful discussion as to whether the need to focus our energies in a different direction is in fact warranted (and for what reasons). Also, perhaps that same literature can provide guidance on how to create an outcomes driven learning environment while maintaining the spirit of the academic (as opposed to technocratically-oriented) enterprise.
  • Outcomes are simply more refined ways of talking about fundamental purposes of education (on the need for positing our purposes in educating undergraduates, see Derek Bok, Our Underachieving Colleges, ch. 3). Without stating our educational purposes clearly, we can't know whether we are achieving them.
  • I've clicked just about every link on this website. I still have no idea what the empirical basis is for recommending a "learning goals" based approach over other approaches. The references in the GEVC report, which is where I expected to find the relevant studies, were instead all to other reports. So far as I could tell, there were no direct references to peer-reviewed research.
  • ...1 more annotation...
  • I do not want to read the "three volumes of Pascarella and Terenzini." Instead, I would appreciate a concise, but thorough, summary of the empirical findings. This would include the sample of institutions studied and how this sample was chosen, the way that student outcomes were measured, and the results. I now understand that many people believe that a "learning goals" approach is desirable, but I still don't understand the empirical basis for their beliefs.

Virtual-TA - 2 views

  • We also developed a technology platform that allows our TAs to electronically insert detailed, actionable feedback directly into student assignments
  • Your instructors give us the schedule of assignments, when student assignments are due, when we might expect to receive them electronically, when the scored assignments will be returned, the learning outcomes on which to score the assignments, the rubrics to be used and the weights to be applied to different learning outcomes. We can use your rubrics to score assignments or design rubrics for sign-off by your faculty members.
  • review and embed feedback using color-coded pushpins (each color corresponds to a specific learning outcome) directly onto the electronic assignments. Color-coded pushpins provide a powerful visual diagnostic.
  • ...5 more annotations...
  • We do not have any contact with your students. Instructors retain full control of the process, from designing the assignments in the first place, to specifying learning outcomes and attaching weights to each outcome. Instructors also review the work of our TAs through a step called the Interim Check, which happens after 10% of the assignments have been completed. Faculty provide feedback, offer any further instructions, and eventually sign off on the work done before our TAs continue with the remainder of the assignments.
  • Finally, upon the request of the instructor, the weights he/she specified for the learning outcomes will be applied to the rubric-based scores, which are used to generate a composite score for each student assignment.
  • As an added bonus, our Virtual-TAs provide a detailed, summative report for the instructor on the overall class performance on the given assignment, which includes a look at how the class fared on each outcome, where the students did well, where they stumbled and what concepts, if any, need reinforcing in class the following week.
  • We can also, upon request, generate reports by Student Learning Outcomes (SLOs). This report can be used by the instructor to immediately address gaps in learning at the individual or classroom level.
  • Think of this as a micro-closing-of-the-loop that happens each week. Contrast this with the broader closing of the loop that accompanies program-level assessment of learning, which might happen at the end of a whole academic year or later!
  •  
    I went to Virtual TA and highlighted their language describing how it works; the weighted composite scoring they describe is sketched below.
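A minimal sketch of the weighted composite scoring described in the highlights above. The outcome names, weights, and scores are invented for illustration; this is not Virtual-TA's actual implementation.

```python
# Hypothetical example: combine per-outcome rubric scores into one composite
# score using instructor-specified weights.

# Instructor-specified weights for each learning outcome (sum to 1.0 here).
weights = {"analysis": 0.40, "evidence": 0.35, "writing": 0.25}

# Rubric-based scores (e.g., on a 1-4 scale) for one student's assignment.
rubric_scores = {"analysis": 3, "evidence": 4, "writing": 2}

def composite_score(scores, weights):
    """Weighted average of per-outcome rubric scores."""
    return sum(weights[outcome] * scores[outcome] for outcome in weights)

print(round(composite_score(rubric_scores, weights), 2))  # 3.1
```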

Measuring Student Learning: Many Tools - Measuring Stick - The Chronicle of Higher Educ... - 2 views

  • The issue that needs to be addressed and spectacularly has been avoided is whether controlled studies (one group does the articulation of and then measurement of outcomes, and a control group does what we have been doing before this mania took hold) can demonstrate or falsify the claim that outcomes assessment results in better-educated students. So far as I can tell, we instead gather data on whether we have in fact been doing outcomes assessment. Not the issue, people. jwp
  •  
    The challenge is not the controlled study this person calls for, but the perception that outcomes assessment produces outcomes....

2009 Annual Meeting | Conference Program - 0 views

  • This session explores the notion that assessment for transformational learning is best utilized as a learning tool. By providing timely, transparent, and appropriate feedback, both to students and to the institution itself, learning is enhanced – a far different motive for assessment than is external accountability.
    • Nils Peterson
       
      need to get to these guys with our harvesting gradebook ideas...
    • Nils Peterson
       
      decided to attend another session. Hersh was OK before lunch, but the talk by Pan looks more promising
  • Academic and corporate communities agree on the urgent need for contemporary, research-based pedagogies of engagement in STEM fields. Participants will learn how leaders from academic departments and institutions have collaborated with leaders from the corporate and business community in regional networks to ensure that graduates meet the expectations of prospective employers and the public.
    • Nils Peterson
       
      here is another session with links to CTLT work, both harvesting gradebook and the ABET work
  • Professor Pan will discuss the reflective teaching methods used to prepare students to recognize and mobilize community assets as they design, implement, and evaluate projects to improve public health.
    • Nils Peterson
       
      Students are tasked to learn about a community: ride the bus, make a doctor's appointment. They are then tasked to do a non-clinical health project in that community (they do plenty of clinical work elsewhere in the program). The project must build capacity in the community to survive after the student leaves. Example: working with Hispanic parents in Sacramento on a parenting issue, e.g. getting kids to sleep on time. The student had identified a problem in the community, but the first project idea was to show a video, which was not capacity building. Rather than showing the video, they used the video as a template and made a new video, with the families as actors. The result was a Spanish DVD that the community could own. Pan thinks this is increased capacity in the community.
  • ...17 more annotations...
  • Freshman Survey annually examines the academic habits of mind of entering first-year students.  Along with academic involvement, the survey examines diversity, civic engagement, college admissions and expectations of college. 
  • The project aims to promote faculty and student assessment of undergraduate research products in relation to outcomes associated with basic research skills and general undergraduate learning principles (communication and quantitative reasoning, critical thinking, and integration and application of knowledge).
  • They focus educators on the magnitude of the challenge to prepare an ever-increasingly diverse, globally-connected student body with the knowledge, ability, processes, and confidence to adapt to diverse environments and respond creatively to the enormous issues facing humankind.
  • One challenge of civic engagement in the co-curriculum is the merging of cost and outcome: creating meaningful experiences for students and the community with small staffs, on small budgets, while still having significant, purposeful impact. 
  • a) claims that faculty are the sole arbiters of what constitutes a liberal education and b) counterclaims that student life professionals also possess the knowledge and expertise critical to defining students’ total learning experiences.
    • Nils Peterson
       
      also, how many angels can dance on the head of a pin?
  • This session introduces a three-year national effort to document how colleges and universities are using assessment data to improve teaching and learning and to facilitate the dissemination and adoption of best practices in the assessment of college learning outcomes.
  • Exciting pedagogies of engagement abound, including undergraduate research, community-engaged learning, interdisciplinary exploration, and international study.  However, such experiences are typically optional and non-credit-bearing for students, and/or “on top of” the workload for faculty. This session explores strategies for integrating engaged learning into the institutional fabric (curriculum, student role, faculty role) and increasing access to these transformative experiences.
  • hands-on experiential learning, especially in collaboration with other students, is a superior pedagogy but how can this be provided in increasingly larger introductory classes? 
  • As educators seek innovative ways to manage knowledge and expand interdisciplinary attention to pressing global issues, as students and parents look for assurances that their tuition investment will pay professional dividends, and as alumni look for meaningful ways to give back to the institutions that nurtured and prepared them, colleges and universities can integrate these disparate goals through the Guilds, intergenerational membership networks that draw strength from the contributions of all of their members.
    • Nils Peterson
       
      see Theron's ideas for COMM.
  • Civic engagement learning derives its power from the engagement of students with real communities—local, national, and global. This panel explores the relationship between student learning and the contexts in which that learning unfolds by examining programs that place students in diverse contexts close to campus and far afield.
  • For institutional assessment to make a difference for student learning its results must result in changes in classroom practice. This session explores ways in which the institutional assessment of student learning, such as the Wabash National Study of Liberal Arts Education and the Collegiate Learning Assessment, can be connected to our classrooms.
  • Interdisciplinary Teaching and Object-Based Learning in Campus Museums
  • To address pressing needs of their communities, government and non-profit agencies are requesting higher education to provide education in an array of human and social services. To serve these needs effectively, higher education needs to broaden and deepen its consultation with practitioners in designing new curricula. Colleges and universities would do well to consider a curriculum development model that requires consultation not only with potential employers, but also with practitioners and supervisors of practitioners.
  • Should Academics be Active? Campuses and Cutting Edge Civic Engagement
  • If transformational liberal education requires engaging the whole student across the educational experience, how can colleges and universities renew strategy and allocate resources effectively to support it?  How can assessment be used to improve student learning and strengthen a transformational learning environment? 
    • Nils Peterson
       
      Purpose of the university is not to grant degrees; it has something to do with learning. Keeling's perspective is that the learning should be transformative: changing perspective, liberating and emancipatory. Learning is a complex interaction among the student and others, new knowledge and experience, events, and the learner's own aspirations; learners construct meaning from these elements. "We change our minds," altering the brain at the micro level. Brain imaging research demonstrates that analogical learning (abstract) demands more from more areas of the brain than semantic (concrete) learning. Mind is not an abstraction; it is based in the brain, a working physical organ. Learner and environment matter to the learning. See Seeds magazine, current issue, on brain imaging and learning. Segue from brain research to the need for the university to educate the whole student. Keeling uses the term 'transformative learning' to mean transforming the learning (re-wiring the brain) but does not use transformative assessment (see Wikipedia).
  • But as public debates roil, higher education has been more reactive than proactive on the question of how best to ensure that today’s students are fully prepared for a fast-paced future.
    • Nils Peterson
       
      Bologna process is being adopted (slowly) in the EU; the idea is to make academic degrees more interchangeable and understandable across the EU. Three elements:
      * Qualification Frameworks (transnational, national, disciplinary). Frameworks are graduated, with increasing expertise and autonomy required for the upper levels. They sound like broad skills that we might recognize in the WSU CITR. Not clear how they are assessed.
      * Tuning (benchmarking) process.
      * Diploma Supplements (licensure, thesis, other capstone activities): these extend the information in the transcript. A US equivalent might be the Kuali Student system for extending the transcript.
      Emerging dialog on American capability: this dialog is coming from two directions, on campus and from employers. Connect to the Greater Expectations (2000-2005) initiative, which concluded that American HE has islands of innovation and led to the LEAP (Liberal Education and America's Promise) initiative (2005-2015). The dialog is converging because of several forces:
      * Changes in the balance of economic and political power: "the rise of the rest (of the world)."
      * A global economy in which innovation is key to growth and prosperity.
      LEAP attempts to frame the dialog (look for LEAP on the AAC&U website). Miami-Dade CC has announced a LEAP-derived covenant; the goals must span all aspects of their programs. LEAP defines liberal education as knowledge of human cultures and the physical and natural world, intellectual and practical skills, responsibility, and integrative skills. The marker of success (here is where the Transformative Gradebook fits in) is evidence that students can apply the essential learning outcomes to complex, unscripted problems and real-world settings. Current failure -- we have not tracked our progress, or have found that we are not doing well. See the AAC&U employer survey: 5-10% of current graduates are taking courses that would meet the global competencies (transcript analysis). See NSSE on personal and social responsibility gains, less tha
  • Dr. Pan will also talk about strategies for breaking down cultural barriers.
    • Nils Peterson
       
      Pan found a non-profit agency to be a conduit and coordinator to level the power between the university and grass-roots organizations; it helped with the cultural gaps.

Capella University to Receive 2010 CHEA Award - 2 views

  • The Council for Higher Education Accreditation, a national advocate and institutional voice for self-regulation of academic quality through accreditation, has awarded the 2010 CHEA Award for Outstanding Institutional Practice in Student Learning Outcomes to Capella University (MN), one of four institutions that will receive the award in 2010. Capella University is the first online university to receive the award.
  • Capella University’s faculty have developed an outcomes-based curricular model
  • “Capella University is a leader in accountability in higher education. Their work in student learning outcomes exemplifies the progress that institutions are making through the implementation of comprehensive, relevant and effective initiatives,” said CHEA President Judith Eaton. “We are pleased to recognize this institution with the CHEA Award.”
  • ...2 more annotations...
  • our award criteria: 1) articulation and evidence of outcomes; 2) success with regard to outcomes; 3) information to the public about outcomes; and 4) use of outcomes for educational improvement.
  • In addition to Capella University, Portland State University (OR), St. Olaf College (MN) and the University of Arkansas - Fort Smith (AR) also will receive the 2010 CHEA Award. The award will be presented at the 2010 CHEA Annual Conference, which will be held January 25-28 in Washington, D.C.
  •  
    Capella has a mandatory faculty training program, and then it selects from the training program those who will teach. Candidates also pay their own tuition for the "try-out" or training.

Changing Higher Education: An Interview with Lloyd Armstrong, USC « Higher Ed... - 1 views

  • There are obviously real concerns about whether outcomes measures are measuring the right outcomes. However, those expressing those concerns are seldom ready to jump in and try to figure out how to measure what they think is important -- a position that is ultimately untenable.
  • Learning outcomes risk changing the rules of the game by actually looking at learning itself, rather than using the surrogates of wealth, history, and research.  Since we have considerable data that show that these surrogates do not correlate particularly well with learning outcomes (see e.g. Derek Bok’s Our Underachieving Colleges),
  •  As Bok pointed out, to improve learning outcomes, the faculty would have to learn to teach in new ways.  Most academic leaders would prefer not to get into a game that would require that kind of change!  In fact, at this point I believe that the real, critical, disruptive innovation in higher education is transparent learning outcomes measures.  Such measures are likely to enable the innovations discussed in the first question to transform from sustaining to disruptive.
  •  
    another executive source, but notes the critical underpinning reason we NEED to do our work.

Refining the Recipe for a Degree, Ingredient by Ingredient - Government - The Chronicle... - 1 views

  • Supporters of the Lumina project say it holds the promise of turning educational assessment from a process that some academics might view as a threat into one that holds a solution, while also creating more-rigorous expectations for student learning. Mr. Jones, the Utah State history-department chairman, recounted in an essay published in the American Historical Association's Perspectives on History how he once blithely told an accreditation team that "historians do not measure their effectiveness in outcomes." But he has changed his mind. The Lumina project, and others, help define what learning is achieved in the process of earning a degree, he said, moving beyond Americans' heavy reliance on the standardized student credit hour as the measure of an education. "The demand for outcomes assessment should be seized as an opportunity for us to actually talk about the habits of mind our discipline needs to instill in our students," Mr. Jones wrote. "It will do us a world of good, and it will save us from the spreadsheets of bureaucrats."
  •  
    Lumina Foundation pushes a European-style process to define education goals state- and nation-wide, with mixed success. "Chemistry, history, math, and physics have been among the most successful," while others have had a hard time beginning.

OECD Feasibility Study for the International Assessment of Higher Education Learning Ou... - 3 views

  •  
    "What is AHELO? The OECD Assessment of Higher Education Learning Outcomes (AHELO) is a ground-breaking initiative to assess learning outcomes on an international scale by creating measures that would be valid for all cultures and languages. Between ten and thirty-thousand higher education students in over ten different countries will take part in a feasibility study to determine the bounds of this ambitious project, with an eye to the possible creation of a full-scale AHELO upon its completion."
2More

Outcomes and Distributions in Program Evaluation - 2 views

  •  
    "The key here is to understand that looking only at the total outcome of a program limits your ability to use evaluation data for program improvement."
  •  
    Eric Graig discusses the need to slice and dice the data.
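A minimal sketch of the "slice and dice" point above: a program-wide average can mask very different results across subgroups. The sites and scores below are invented for illustration.

```python
# Hypothetical outcome records for one program, tagged by service site.
outcomes = [
    {"site": "downtown", "score": 85},
    {"site": "downtown", "score": 90},
    {"site": "eastside", "score": 55},
    {"site": "eastside", "score": 60},
]

# Total (program-wide) view: looks middling.
overall = sum(r["score"] for r in outcomes) / len(outcomes)
print(f"overall mean: {overall:.1f}")  # 72.5

# Sliced view: the gap between sites is where improvement effort belongs.
by_site = {}
for r in outcomes:
    by_site.setdefault(r["site"], []).append(r["score"])

for site, scores in sorted(by_site.items()):
    print(f"{site}: mean {sum(scores) / len(scores):.1f}")
# downtown: mean 87.5
# eastside: mean 57.5
```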

Analyzing Outcome Information: Getting the Most from Data - 1 views

  •  
    Fifth in a series on outcome management. "This guide is unique in offering suggestions to nonprofits for analyzing regularly collected outcome data. The guide focuses on those basic analysis activities that nearly all programs, whether large or small, can do themselves. It offers straightforward, common-sense suggestions. "

Change Magazine - The New Guys in Assessment Town - 0 views

  • if one of the institution’s general education goals is critical thinking, the system makes it possible to call up all the courses and programs that assess student performance on that outcome.
  • bringing together student learning outcomes data at the level of the institution, program, course, and throughout student support services so that “the data flows between and among these levels”
  • Like its competitors, eLumen maps outcomes vertically across courses and programs, but its distinctiveness lies in its capacity to capture what goes on in the classroom. Student names are entered into the system, and faculty use a rubric-like template to record assessment results for every student on every goal. The result is a running record for each student available only to the course instructor (and in some cases to the students themselves, who can go to the system to get feedback on recent assessments).
    • Nils Peterson
       
      sounds like harvesting gradebook: assess student work and roll up (a minimal roll-up of this sort is sketched at the end of this item)
    • Joshua Yeidel
       
      This system has some potential for formative use at the per-student level.
  • ...7 more annotations...
  • “I’m a little wary.  It seems as if, in addition to the assessment feedback we are already giving to students, we might soon be asked to add a data-entry step of filling in boxes in a centralized database for all the student learning outcomes. This is worrisome to those of us already struggling under the weight of all that commenting and essay grading.”
    • Nils Peterson
       
      It's either double work, or not understanding that the grading and the assessment can be the same activity. I suspect the former -- grading is being done with different metrics.
    • Joshua Yeidel
       
      I am in the unusual position of seeing many papers _after_ they have been graded by a wide variety of teachers. Many of these contain little "assessment feedback" -- many teachers focus on "correcting" the papers and finding some letter or number to assign as a value.
  • “This is where we see many institutions struggling,” Galvin says. “Faculty simply don’t have the time for a deeper involvement in the mechanics of assessment.” Many have never seen a rubric or worked with one, “so generating accurate, objective data for analysis is a challenge.”  
    • Nils Peterson
       
      Rather than faculty using the community to help with assessment, they are outsourcing to a paid assessor -- this is the result of undertaking this thinking while also remaining in the institution-centric end of the spectrum we developed
  • I asked about faculty pushback. “Not so much,” Galvin says, “not after faculty understand that the process is not intended to evaluate their work.”
    • Nils Peterson
       
      red flag
  • the annual reports required by this process were producing “heaps of paper” while failing to track trends and developments over time. “It’s like our departments were starting anew every year,” Chaplot says. “We wanted to find a way to house the data that gave us access to what was done in the past,” which meant moving from discrete paper reports to an electronic database.
    • Joshua Yeidel
       
      It's not clear whether the "database" is housing measurements, narratives and reflections, or all of the above.
  • Can eLumen represent student learning in language? No, but it can quantify the number of boxes checked against the number of boxes not checked.
  • developing a national repository of resources, rubrics, outcomes statements, and the like that can be reviewed and downloaded by users
    • Nils Peterson
       
      in building our repository we could well open-source these tools, no need to lock them up
  • “These solutions cement the idea that assessment is an administrative rather than an educational enterprise, focused largely on accountability. They increasingly remove assessment decision making from the everyday rhythm of teaching and learning and the realm of the faculty.
    • Nils Peterson
       
      Over the wall assessment, see Transformative Assessment rubric for more detail
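A minimal sketch of the per-outcome roll-up described in the eLumen highlights above: record whether each student met each goal, then tally "boxes checked" against "boxes not checked." Student names, outcome labels, and results are invented; this is not eLumen's actual data model.

```python
from collections import defaultdict

# Hypothetical (student, outcome, met_criterion) records entered per assessment.
records = [
    ("ana",  "critical_thinking", True),
    ("ana",  "communication",     False),
    ("ben",  "critical_thinking", False),
    ("ben",  "communication",     True),
    ("cara", "critical_thinking", True),
]

# Roll up: for each outcome, count how many assessments met the criterion.
tally = defaultdict(lambda: {"met": 0, "not_met": 0})
for _student, outcome, met in records:
    tally[outcome]["met" if met else "not_met"] += 1

for outcome, counts in sorted(tally.items()):
    total = counts["met"] + counts["not_met"]
    print(f"{outcome}: {counts['met']}/{total} met the criterion")
# communication: 1/2 met the criterion
# critical_thinking: 2/3 met the criterion
```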

It's the Learning, Stupid - Lumina Foundation: Helping People Achieve Their Potential - 3 views

  • My thesis is this. We live in a world where much is changing, quickly. Economic crises, technology, ideological division, and a host of other factors have all had a profound influence on who we are and what we do in higher education. But when all is said and done, it is imperative that we not lose sight of what matters most. To paraphrase the oft-used maxim of the famous political consultant James Carville, it's the learning, stupid.
  • We believe that, to significantly increase higher education attainment rates, three intermediate outcomes must first occur:
    * Higher education must use proven strategies to move students to completion.
    * Quality data must be used to improve student performance and inform policy and decision-making at all levels.
    * The outcomes of student learning must be defined, measured, and aligned with workforce needs.
    To achieve these outcomes (and thus improve success rates), Lumina has decided to pursue several specific strategies. I'll cite just a few of these many different strategies: We will advocate for the redesign, rebranding and improvement of developmental education. We will explore the development of alternative pathways to degrees and credentials. We will push for smoother systems of transferring credit so students can move more easily between institutions, including from community colleges to bachelor's degree programs.
  • "Lumina defines high-quality credentials as degrees and certificates that have well-defined and transparent learning outcomes which provide clear pathways to further education and employment."
  • ...4 more annotations...
  • And—as Footnote One softly but incessantly reminds us—quality, at its core, must be a measure of what students actually learn and are able to do with the knowledge and skills they gain.
  • and yet we seem reluctant or unable to discuss higher education's true purpose: equipping students for success in life.
  • Research has already shown that higher education institutions vary significantly in the value they add to students in terms of what those students actually learn. Various tools and instruments tell us that some institutions add much more value than others, even when looking at students with similar backgrounds and abilities.
  • The idea with tuning is to take various programs within a specific discipline—chemistry, history, psychology, whatever—and agree on a set of learning outcomes that a degree in the field represents. The goal is not for the various programs to teach exactly the same thing in the same way or even for all of the programs to offer the same courses. Rather, programs can employ whatever techniques they prefer, so long as their students can demonstrate mastery of an agreed-upon body of knowledge and set of skills. To use the musical terminology, the various programs are not expected to play the same notes, but to be "tuned" to the same key.

For Accreditation, a Narrow Window of Opportunity - Commentary - The Chronicle of Highe... - 4 views

  • After two years as president of the American Council on Education, I feel compelled to send a wake-up call to campus executives: If federal policy makers are now willing to bail out the nation's leading banks and buy equity stakes in auto makers because those companies are "too big to fail," they will probably have few reservations about regulating an education system that they now understand is "too important to fail."
  • Regardless of party, policy makers are clearly aware of the importance of education and are demanding improved performance and more information, from preschool to graduate school. In this environment, we should expect college accreditation to come under significant scrutiny.
  • It has also clearly signaled its interest in using data to measure institutional performance and student outcomes, and it has invested in state efforts to create student-data systems from pre-kindergarten through graduate school.
  • ...8 more annotations...
  • Higher education has so far navigated its way through the environment of increased regulatory interest without substantial changes to our system of quality assurance or federally mandated outcomes assessment. But that has only bought us time. As we look ahead, we must keep three facts in mind: Interest in accountability is bipartisan, and the pendulum has swung toward more regulation in virtually all sectors. The economic crisis is likely to spur increased calls from policy makers to control college prices and demonstrate that students are getting value for the dollar. The size of the federal budget deficit will force everyone who receives federal support to produce more and better evidence that an investment of federal funds will pay dividends for individuals and society.
  • If we do not seize the opportunity to strengthen voluntary peer accreditation as a rigorous test of institutional quality, grounded in appropriate measures of student learning, we place at risk a precious bulwark against excessive government intervention, a bulwark that has allowed American higher education to flourish. When it comes to safeguarding the quality, diversity, and independence of American higher education, accreditors hold the keys to the kingdom.
  • all accreditors now require colleges and universities to put more emphasis on measuring student-learning outcomes. They should be equally vigilant about ensuring that those data are used to achieve improvements in outcomes
  • share plain-language results of accreditation reviews with the public.
  • It takes very little close reading to see through the self-serving statements here: namely that higher education institutions must do a better PR job pretending they are interested in meaningful reform so as to head off any real reform that might come from the federal authorities.
  • THEREFORE, let me voice a wakeup call for those who are really interested in reform -- not that there are many. 1. There will never be any meaningful reform unless we have a centralized and nationalized higher educational system. Leaving higher education in the hands of individual institutions is no longer effective and is in fact what has led to the present state we find ourselves in. Year after countless year we have been promised changes in higher education and year after year nothing changes. IF CHANGE IS TO COME IT MUST BE FORCED ONTO HIGHER EDUCATION FROM THE OUTSIDE.
  • Higher education in America can no longer afford to be organized around the useless market capitalism that forces too many financially marginalized institutions to compete for less and less.
  • Keeping Quiet by Pablo Neruda:
    If we were not so single-minded
    about keeping our lives moving,
    and for once could do nothing,
    perhaps a huge silence
    might interrupt this sadness
    of never understanding ourselves
    and of threatening ourselves with death.
  •  
    It is heating up again

The Future of Wannabe U. - The Chronicle Review - The Chronicle of Higher Education - 2 views

  • Alice didn't tell me about the topics of her research; instead she listed the number of articles she had written, where they had been submitted and accepted, the reputation of the journals, the data sets she was constructing, and how many articles she could milk from each data set.
  • colleges and universities have transformed themselves from participants in an audit culture to accomplices in an accountability regime.
  • higher education has inaugurated an accountability regime—a politics of surveillance, control, and market management that disguises itself as value-neutral and scientific administration.
  • ...7 more annotations...
  • A Wannabe administrator noted that the recipient had published well more than 100 articles. He never said why those articles mattered.
  • And all we have are numbers about teaching. And we don't know what the difference is between a [summary measure of] 7.3 and a 7.7 or an 8.2 and an 8.5."
  • The problem is that such numbers have no meaning. They cannot indicate the quality of a student's education.
  • Nor can the many metrics that commonly appear in academic (strategic) plans, like student credit hours per full-time-equivalent faculty member, or the percentage of classes with more than 50 students. Those productivity measures (for they are indeed productivity measures) might as well apply to the assembly-line workers who fabricate the proverbial widget, for one cannot tell what the metrics have to do with the supposed purpose of institutions of higher education -- to create and transmit knowledge. That includes leading students to the possibility of a fuller life and an appreciation of the world around them and expanding their horizons.
  • But, like the fitness club's expensive cardio machines, a significant increase in faculty research, in the quality of student experiences (including learning), in the institution's service to its state, or in its standing among its peers may cost more than a university can afford to invest or would even dream of paying.
  • Such metrics are a speedup of the academic assembly line, not an intensification or improvement of student learning. Indeed, sometimes a boost in some measures, like an increase in the number of first-year students participating in "living and learning communities," may even detract from what students learn. (Wan U.'s pre-pharmacy living-and-learning community is so competitive that students keep track of one another's grades more than they help one another study. Last year one student turned off her roommate's alarm clock so that she would miss an exam and thus no longer compete for admission to the School of Pharmacy.)
  • Even metrics intended to indicate what students may have learned seem to have more to do with controlling faculty members than with gauging education. Take student-outcomes assessments, meant to be evaluations of whether courses have achieved their goals. They search for fault where earlier researchers would not have dreamed to look. When parents in the 1950s asked why Johnny couldn't read, teachers may have responded that it was Johnny's fault; they had prepared detailed lesson plans. Today student-outcomes assessment does not even try to discover whether Johnny attended class; instead it produces metrics about outcomes without considering Johnny's input.
  •  
    A good one to wrestle with.  It may be worth formulating distinctions we hold, and steering accordingly.

Duncan: Rewarding Teachers for Master's Degrees Is Waste of Money - The Ticker - The Ch... - 1 views

  • Arne Duncan said state and local governments should rethink their policies of giving pay raises to teachers who have master’s degrees, because evidence suggests that the degree alone does not improve student achievement.
  •  
    distinguishes between outcome and impact and/or illustrates the problems of grades/degrees as a credible outcome.

News: 'You Can't Measure What We Teach' - Inside Higher Ed - 0 views

  •  
    "Despite those diverging starting points, the discussion revealed quite a bit more common ground than any of the panelists probably would have predicted. Let's be clear: Where they ended up was hardly a breakthrough on the scale of solving the Middle East puzzle. But there was general agreement among them that: * Any effort to try to measure learning in the humanities through what McCulloch-Lovell deemed "[Margaret] Spellings-type assessment" -- defined as tests or other types of measures that could be easily compared across colleges and neatly sum up many of the learning outcomes one would seek in humanities students -- was doomed to fail, and should. * It might be possible, and could be valuable, for humanists to reach broad agreement on the skills, abilities, and knowledge they might seek to instill in their students, and that agreement on those goals might be a starting point for identifying effective ways to measure how well students have mastered those outcomes. * It is incumbent on humanities professors and academics generally to decide for themselves how to assess whether their students are learning, less to satisfy external calls for accountability than because it is the right thing for academics, as professionals who care about their students, to do. "
  •  
    Assessment meeting at the accreditors -- driven by expectations of a demand for accountability, with not one mention of improvement.

IJ-SoTL - A Method for Collaboratively Developing and Validating a Rubric - 1 views

  •  
    "Assessing student learning outcomes relative to a valid and reliable standard that is academically-sound and employer-relevant presents a challenge to the scholarship of teaching and learning. In this paper, readers are guided through a method for collaboratively developing and validating a rubric that integrates baseline data collected from academics and professionals. The method addresses two additional goals: (1) to formulate and test a rubric as a teaching and learning protocol for a multi-section course taught by various instructors; and (2) to assure that students' learning outcomes are consistently assessed against the rubric regardless of teacher or section. Steps in the process include formulating the rubric, collecting data, and sequentially analyzing the techniques used to validate the rubric and to insure precision in grading papers in multiple sections of a course."

News: The Challenge of Comparability - Inside Higher Ed - 0 views

  •  
    But when it came to defining sets of common learning outcomes for specific degree programs -- Transparency by Design's most distinguishing characteristic -- commonality was hard to come by. Questions to apply to any institution could be: 1) For any given program, what specific student learning outcomes are graduates expected to demonstrate? 2) By what standards and measurements are students being evaluated? 3) How well have graduating students done relative to these expectations? Comparability of results (the 3rd question) depends on transparency of goals and expectations (the 1st question) and transparency of measures (the 2nd question).

Program Assessment of Student Learning: July 2010 - 3 views

  • There are lots of considerations when considering a technology solution to the outcomes assessment process.  The first thing is to be very clear about what a system can and cannot do.  It CANNOT do your program assessment and evaluation for you!  The institution or program must first define the intended outcomes and performance indicators.  Without a doubt, that is the most difficult part of the process.  Once the indicators have been defined you need to be clear about the role of students and faculty in the use of the technology.  Also, who is the technology "owner"--who will maintain it, keep the outcomes/indicators current, generate reports, etc. etc.
  •  
    This question returns to us, so here is a resource and key to be able to point to.