
CTLT and Friends: Group items tagged programming


Theron DesRosier

CDC Evaluation Working Group: Framework - 2 views

  • Framework for Program Evaluation
  • Purposes: the framework was developed to summarize and organize the essential elements of program evaluation; provide a common frame of reference for conducting evaluations; clarify the steps in program evaluation; review standards for effective program evaluation; and address misconceptions about the purposes and methods of program evaluation.
  • Assigning value and making judgments regarding a program on the basis of evidence requires answering the following questions:
    What will be evaluated? (i.e., what is "the program," and in what context does it exist?)
    What aspects of the program will be considered when judging program performance?
    What standards (i.e., type or level of performance) must be reached for the program to be considered successful?
    What evidence will be used to indicate how the program has performed?
    What conclusions regarding program performance are justified by comparing the available evidence to the selected standards?
    How will the lessons learned from the inquiry be used to improve public health effectiveness?
  • ...3 more annotations...
  • These questions should be addressed at the beginning of a program and revisited throughout its implementation. The framework provides a systematic approach for answering these questions.
  • Steps in Evaluation Practice:
    Engage stakeholders: those involved, those affected, primary intended users
    Describe the program: need, expected effects, activities, resources, stage, context, logic model
    Focus the evaluation design: purpose, users, uses, questions, methods, agreements
    Gather credible evidence: indicators, sources, quality, quantity, logistics
    Justify conclusions: standards, analysis/synthesis, interpretation, judgment, recommendations
    Ensure use and share lessons learned: design, preparation, feedback, follow-up, dissemination
    Standards for "Effective" Evaluation:
    Utility: serve the information needs of intended users
    Feasibility: be realistic, prudent, diplomatic, and frugal
    Propriety: behave legally, ethically, and with due regard for the welfare of those involved and those affected
    Accuracy: reveal and convey technically accurate information
  • The challenge is to devise an optimal — as opposed to an ideal — strategy.
  •  
    Framework for Program Evaluation by the CDC This is a good resource for program evaluation. Click through "Steps and Standards" for information on collecting credible evidence and engaging stakeholders.
Joshua Yeidel

Using Outcome Information: Making Data Pay Off - 1 views

  •  
    Sixth in a series on outcome management for nonprofits. Grist for the mill for any Assessment Handbook we might make. "Systematic use of outcome data pays off. In an independent survey of nearly 400 health and human service organizations, program directors agreed or strongly agreed that implementing program outcome measurement had helped their programs:
    * focus staff on shared goals (88%);
    * communicate results to stakeholders (88%);
    * clarify program purpose (86%);
    * identify effective practices (84%);
    * compete for resources (83%);
    * enhance record keeping (80%); and
    * improve service delivery (76%)."
Theron DesRosier

Government Innovators Network: A Portal for Democratic Governance and Innovation - 0 views

  •  
    A Portal for Innovative Ideas This portal is produced by the Ash Institute for Democratic Governance and Innovation at Harvard Kennedy School, and is a marketplace of ideas and examples of government innovation. Browse or search to access news, documents, descriptions of award-winning programs, and information on events in your area of interest related to innovation. * RSS Feeds are available for each individual topic area. * We invite you to register to access online events, and to receive the biweekly Innovators Insights newsletter. * And, we encourage you to visit the Ash Institute YouTube Channel. The Ash Institute's Innovations in American Government Awards Program, and its affiliated international programs, are integral to the Government Innovators Network. Learn about the IAG program and how to apply.
Kimberly Green

Strategic National Arts Alumni Project (SNAAP) - 0 views

  •  
    WSU is participating in this survey. Looks interesting: it follows up on students who graduate with an arts degree. Could be useful in program assessment in a number of ways (as a model, for sample questions, and as a way to leverage nationally collected data). "Welcome to the Strategic National Arts Alumni Project (SNAAP), an annual online survey, data management, and institutional improvement system designed to enhance the impact of arts-school education. SNAAP partners with arts high schools, art and design colleges, conservatories, and arts programs within colleges and universities to administer the survey to their graduates. SNAAP is a project of the Indiana University Center for Postsecondary Research in collaboration with the Vanderbilt University Curb Center for Art, Enterprise, and Public Policy. Lead funding is provided by the Surdna Foundation, with major partnership support from the Houston Endowment, Barr Foundation, Cleveland Foundation, Educational Foundation of America, and the National Endowment for the Arts."
Joshua Yeidel

Jim Dudley on Letting Go of Rigid Adherence to What Evaluation Should Look Like | AEA365 - 1 views

  •  
    "Recently, in working with a board of directors of a grassroots organization, I was reminded of how important it is to "let go" of rigid adherence to typologies and other traditional notions of what an evaluation should look like. For example, I completed an evaluation that incorporated elements of all of the stages of program development - a needs assessment (e.g., how much do board members know about their programs and budget), a process evaluation (e.g., how well do the board members communicate with each other when they meet), and an outcome evaluation (e.g., how effective is their marketing plan for recruiting children and families for its programs)."
  •  
    Needs evaluation, process evaluation, outcomes evaluation -- all useful for improvement.
Nils Peterson

2009 Annual Meeting | Conference Program - 0 views

  • This session explores the notion that assessment for transformational learning is best utilized as a learning tool. By providing timely, transparent, and appropriate feedback, both to students and to the institution itself, learning is enhanced – a far different motive for assessment than is external accountability.
    • Nils Peterson
       
      need to get to these guys with our harvesting gradebook ideas...
    • Nils Peterson
       
      decided to attend another session. Hersh was OK before lunch, but the talk by Pan looks more promising
  • Academic and corporate communities agree on the urgent need for contemporary, research-based pedagogies of engagement in STEM fields. Participants will learn how leaders from academic departments and institutions have collaborated with leaders from the corporate and business community in regional networks to ensure that graduates meet the expectations of prospective employers and the public.
    • Nils Peterson
       
      here is another session with links to CTLT work, both harvesting gradebook and the ABET work
  • Professor Pan will discuss the reflective teaching methods used to prepare students to recognize and mobilize community assets as they design, implement, and evaluate projects to improve public health.
    • Nils Peterson
       
      Students are tasked to learn about a community: ride the bus, make a doctor's appointment. Then they are tasked to do a non-clinical health project in that community (they do plenty of clinical work elsewhere in the program). The project must build capacity in the community so it survives after the student leaves. Example: working with Hispanic parents in Sacramento on a parenting issue, e.g., getting kids to sleep on time. The student had identified a problem in the community, but the first project idea was to show a video, which was not capacity building. Rather than showing the video, the student used it as a template and made a new video with the families as actors. The result was a Spanish DVD that the community could own. Pan considers this increased capacity in the community.
  • ...17 more annotations...
  • Freshman Survey annually examines the academic habits of mind of entering first-year students.  Along with academic involvement, the survey examines diversity, civic engagement, college admissions and expectations of college. 
  • The project aims to promote faculty and student assessment of undergraduate research products in relation to outcomes associated with basic research skills and general undergraduate learning principles (communication and quantitative reasoning, critical thinking, and integration and application of knowledge).
  • They focus educators on the magnitude of the challenge to prepare an ever-increasingly diverse, globally-connected student body with the knowledge, ability, processes, and confidence to adapt to diverse environments and respond creatively to the enormous issues facing humankind.
  • One challenge of civic engagement in the co-curriculum is the merging of cost and outcome: creating meaningful experiences for students and the community with small staffs, on small budgets, while still having significant, purposeful impact. 
  • a) claims that faculty are the sole arbiters of what constitutes a liberal education, and b) counter-claims that student life professionals also possess the knowledge and expertise critical to defining students' total learning experiences.
    • Nils Peterson
       
      also, how many angels can dance on the head of a pin?
  • This session introduces a three-year national effort to document how colleges and universities are using assessment data to improve teaching and learning and to facilitate the dissemination and adoption of best practices in the assessment of college learning outcomes.
  • Exciting pedagogies of engagement abound, including undergraduate research, community-engaged learning, interdisciplinary exploration, and international study.  However, such experiences are typically optional and non-credit-bearing for students, and/or “on top of” the workload for faculty. This session explores strategies for integrating engaged learning into the institutional fabric (curriculum, student role, faculty role) and increasing access to these transformative experiences.
  • hands-on experiential learning, especially in collaboration with other students, is a superior pedagogy but how can this be provided in increasingly larger introductory classes? 
  • As educators seek innovative ways to manage knowledge and expand interdisciplinary attention to pressing global issues, as students and parents look for assurances that their tuition investment will pay professional dividends, and as alumni look for meaningful ways to give back to the institutions that nurtured and prepared them, colleges and universities can integrate these disparate goals through the Guilds, intergenerational membership networks that draw strength from the contributions of all of their members.
    • Nils Peterson
       
      see Theron's ideas for COMM.
  • Civic engagement learning derives its power from the engagement of students with real communities—local, national, and global. This panel explores the relationship between student learning and the contexts in which that learning unfolds by examining programs that place students in diverse contexts close to campus and far afield.
  • For institutional assessment to make a difference for student learning its results must result in changes in classroom practice. This session explores ways in which the institutional assessment of student learning, such as the Wabash National Study of Liberal Arts Education and the Collegiate Learning Assessment, can be connected to our classrooms.
  • Interdisciplinary Teaching and Object-Based Learning in Campus Museums
  • To address pressing needs of their communities, government and non-profit agencies are requesting higher education to provide education in an array of human and social services. To serve these needs effectively, higher education needs to broaden and deepen its consultation with practitioners in designing new curricula. Colleges and universities would do well to consider a curriculum development model that requires consultation not only with potential employers, but also with practitioners and supervisors of practitioners.
  • Should Academics be Active? Campuses and Cutting Edge Civic Engagement
  • If transformational liberal education requires engaging the whole student across the educational experience, how can colleges and universities renew strategy and allocate resources effectively to support it?  How can assessment be used to improve student learning and strengthen a transformational learning environment? 
    • Nils Peterson
       
      Purpose of the university is not to grant degrees; it has something to do with learning. Keeling's perspective is that the learning should be transformative: changing perspective, liberating and emancipatory. Learning is a complex interaction among the student and others, new knowledge and experience, events, and the learner's own aspirations; learners construct meaning from these elements. "We change our minds," altering the brain at the micro-level. Brain imaging research demonstrates that analogical (abstract) learning demands more from more areas of the brain than semantic (concrete) learning. The mind is not an abstraction; it is based in the brain, a working physical organ. Learner and environment matter to the learning. See Seeds magazine, current issue, on brain imaging and learning. Segue from brain research to the need for the university to educate the whole student. Uses the term 'transformative learning' to mean transforming the learner (re-wiring the brain) but does not use transformative assessment (see Wikipedia).
  • But as public debates roil, higher education has been more reactive than proactive on the question of how best to ensure that today’s students are fully prepared for a fast-paced future.
    • Nils Peterson
       
      Bologna process being adopted (slowly) in the EU; the idea is to make academic degrees more interchangeable and understandable across the EU. Three elements:
      * Qualification Frameworks (transnational, national, disciplinary). Frameworks are graduated, with increasing expertise and autonomy required for the upper levels. They sound like broad skills that we might recognize in the WSU CITR. Not clear how they are assessed.
      * Tuning (benchmarking) process.
      * Diploma Supplements (licensure, thesis, other capstone activities): these extend the information in the transcript. A US equivalent might be the Kuali Student system for extending the transcript.
      Emerging dialog on American capability: this dialog is coming from two directions, on campus and from employers. Connect to the Greater Expectations (2000-2005) initiative, which concluded that American HE has islands of innovation and led to the LEAP (Liberal Education and America's Promise) initiative (2005-2015). The dialog is converging because of several forces:
      * Changes in the balance of economic and political power: "the rise of the rest (of the world)."
      * A global economy in which innovation is key to growth and prosperity.
      LEAP attempts to frame the dialog (look for LEAP on the AACU website). Miami-Dade CC has announced a LEAP-derived covenant; the goals must span all aspects of their programs. Define liberal education: knowledge of human cultures and the physical and natural world; intellectual and practical skills; responsibility; integrative skills. The marker of success (here is where the Transformative Gradebook fits in): evidence that students can apply the essential learning outcomes to complex, unscripted problems and real-world settings. Current failure: we have not tracked our progress, or have found that we are not doing well. See the AACU employer survey: 5-10% of current graduates are taking courses that would meet the global competencies (transcript analysis). See NSSE on personal and social responsibility gains, less tha
  • Dr. Pan will also talk about strategies for breaking down cultural barriers.
    • Nils Peterson
       
      Pan found a non-profit agency to act as a conduit and coordinator to level the power between the university and grassroots organizations; this helped with cultural gaps.
Joshua Yeidel

Digication :: NCCC Art Department Program Evaluation :: Purpose of Evaluation - 0 views

  •  
    An eportfolio for program evaluation by the Northwest Connecticut Community College Art Department. Slick, well-organized, and pretty, using Digication as platform and host. A fine portfolio, which could well be a model for our programs, except that there is not a single direct measure of student learning outcomes.
Gary Brown

It's the Learning, Stupid - Lumina Foundation: Helping People Achieve Their Potential - 3 views

  • My thesis is this. We live in a world where much is changing, quickly. Economic crises, technology, ideological division, and a host of other factors have all had a profound influence on who we are and what we do in higher education. But when all is said and done, it is imperative that we not lose sight of what matters most. To paraphrase the oft-used maxim of the famous political consultant James Carville, it's the learning, stupid.
  • We believe that, to significantly increase higher education attainment rates, three intermediate outcomes must first occur: (1) higher education must use proven strategies to move students to completion; (2) quality data must be used to improve student performance and inform policy and decision-making at all levels; (3) the outcomes of student learning must be defined, measured, and aligned with workforce needs. To achieve these outcomes (and thus improve success rates), Lumina has decided to pursue several specific strategies. I'll cite just a few of these many different strategies: we will advocate for the redesign, rebranding, and improvement of developmental education; we will explore the development of alternative pathways to degrees and credentials; and we will push for smoother systems of transferring credit so students can move more easily between institutions, including from community colleges to bachelor's degree programs.
  • "Lumina defines high-quality credentials as degrees and certificates that have well-defined and transparent learning outcomes which provide clear pathways to further education and employment."
  • ...4 more annotations...
  • And—as Footnote One softly but incessantly reminds us—quality, at its core, must be a measure of what students actually learn and are able to do with the knowledge and skills they gain.
  • and yet we seem reluctant or unable to discuss higher education's true purpose: equipping students for success in life.
  • Research has already shown that higher education institutions vary significantly in the value they add to students in terms of what those students actually learn. Various tools and instruments tell us that some institutions add much more value than others, even when looking at students with similar backgrounds and abilities.
  • The idea with tuning is to take various programs within a specific discipline—chemistry, history, psychology, whatever—and agree on a set of learning outcomes that a degree in the field represents. The goal is not for the various programs to teach exactly the same thing in the same way or even for all of the programs to offer the same courses. Rather, programs can employ whatever techniques they prefer, so long as their students can demonstrate mastery of an agreed-upon body of knowledge and set of skills. To use the musical terminology, the various programs are not expected to play the same notes, but to be "tuned" to the same key.
Joshua Yeidel

Outcomes and Distributions in Program Evaluation - 2 views

  •  
    "The key here is to understand that looking only at the total outcome of a program limits your ability to use evaluation data for program improvement."
  •  
    Eric Graig discusses the need to slice and dice the data.
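The point the author makes, that a program-wide total can hide actionable variation, can be sketched in a few lines (hypothetical sites and numbers, purely for illustration):

```python
# Hypothetical outcome records for one program: (site, participant succeeded?)
records = [
    ("Site A", True), ("Site A", True), ("Site A", True), ("Site A", False),
    ("Site B", True), ("Site B", False), ("Site B", False), ("Site B", False),
]

# The total outcome alone: a middling 50% success rate.
total_rate = sum(ok for _, ok in records) / len(records)

# Disaggregated ("sliced and diced") by site: Site A succeeds at 75%,
# Site B at 25% -- a difference the program-wide total cannot reveal.
by_site = {}
for site, ok in records:
    by_site.setdefault(site, []).append(ok)
site_rates = {site: sum(oks) / len(oks) for site, oks in by_site.items()}

print(total_rate)   # 0.5
print(site_rates)   # {'Site A': 0.75, 'Site B': 0.25}
```

The same slicing applies to any subgroup (cohort, demographic, delivery mode) for which the evaluation records an attribute.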
Nils Peterson

AAC&U News | April 2010 | Feature - 1 views

  • Comparing Rubric Assessments to Standardized Tests
  • First, the university, a public institution of about 40,000 students in Ohio, needed to comply with the Voluntary System of Accountability (VSA), which requires that state institutions provide data about graduation rates, tuition, student characteristics, and student learning outcomes, among other measures, in the consistent format developed by its two sponsoring organizations, the Association of Public and Land-grant Universities (APLU), and the Association of State Colleges and Universities (AASCU).
  • And finally, UC was accepted in 2008 as a member of the fifth cohort of the Inter/National Coalition for Electronic Portfolio Research, a collaborative body with the goal of advancing knowledge about the effect of electronic portfolio use on student learning outcomes.  
  • ...13 more annotations...
  • outcomes required of all UC students—including critical thinking, knowledge integration, social responsibility, and effective communication
  • “The wonderful thing about this approach is that full-time faculty across the university  are gathering data about how their  students are doing, and since they’ll be teaching their courses in the future, they’re really invested in rubric assessment—they really care,” Escoe says. In one case, the capstone survey data revealed that students weren’t doing as well as expected in writing, and faculty from that program adjusted their pedagogy to include more writing assignments and writing assessments throughout the program, not just at the capstone level. As the university prepares to switch from a quarter system to semester system in two years, faculty members are using the capstone survey data to assist their course redesigns, Escoe says.
  • the university planned a “dual pilot” study examining the applicability of electronic portfolio assessment of writing and critical thinking alongside the Collegiate Learning Assessment,
  • The rubrics the UC team used were slightly modified versions of those developed by AAC&U’s Valid Assessment of Learning in Undergraduate Education (VALUE) project. 
  • In the critical thinking rubric assessment, for example, faculty evaluated student proposals for experiential honors projects that they could potentially complete in upcoming years.  The faculty assessors were trained and their rubric assessments “normed” to ensure that interrater reliability was suitably high.
  • “We found no statistically significant correlation between the CLA scores and the portfolio scores,”
  • There were many factors that may have contributed to the lack of correlation, she says, including the fact that the CLA is timed, while the rubric assignments are not; and that the rubric scores were diagnostic and included specific feedback, while the CLA awarded points “in a black box”:
  • faculty members may have had exceptionally high expectations of their honors students and assessed the e-portfolios with those high expectations in mind—leading to results that would not correlate to a computer-scored test. 
  • “The CLA provides scores at the institutional level. It doesn’t give me a picture of how I can affect those specific students’ learning. So that’s where rubric assessment comes in—you can use it to look at data that’s compiled over time.”
  • Their portfolios are now more like real learning portfolios, not just a few artifacts, and we want to look at them as they go into their third and fourth years to see what they can tell us about students’ whole program of study.”  Hall and Robles are also looking into the possibility of forming relationships with other schools from NCEPR to exchange student e-portfolios and do a larger study on the value of rubric assessment of student learning.
  • “We’re really trying to stress that assessment is pedagogy,”
  • “It’s not some nitpicky, onerous administrative add-on. It’s what we do as we teach our courses, and it really helps close that assessment loop.”
  • In the end, Escoe says, the two assessments are both useful, but for different things. The CLA can provide broad institutional data that satisfies VSA requirements, while rubric-based assessment provides better information to facilitate continuous program improvement.
    • Nils Peterson
       
      CLA did not provide information for continuous program improvement -- we've heard this argument before
  •  
    The lack of correlation might be rephrased: there appears to be no correlation between what is useful for faculty who teach and what is useful for the VSA. A corollary question: of what use is the VSA?
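The UC finding turns on the correlation between two sets of scores for the same students. A minimal sketch of that check, with invented scores (not UC's data) and a hand-rolled Pearson coefficient:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical scores for the same ten students on the two instruments:
cla_scores    = [1180, 1250, 1100, 1320, 1210, 1150, 1280, 1190, 1230, 1170]
rubric_scores = [3.2, 2.8, 3.5, 3.0, 2.9, 3.4, 3.1, 2.7, 3.3, 3.6]

r = pearson_r(cla_scores, rubric_scores)
print(round(r, 2))  # far from +1: the two instruments rank these students differently
```

A real analysis would also test the coefficient for statistical significance (e.g., a t-test on r with n - 2 degrees of freedom) before concluding there is "no correlation."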
Nils Peterson

Education Department Proposes To End Federal Funding To For-Profit Colleges Whose Stude... - 2 views

  • The Education Department proposed much-anticipated regulations Friday that would cut off federal aid to for-profit college programs if too many of their students default on loans or don't earn enough after graduation to repay them.
  • To qualify for federal student aid programs, career college programs must prepare students for "gainful employment."
  • But shares were mixed among companies such as ITT Educational Services Inc., Corinthian Colleges Inc., Education Management Corp. and Career Education Corp. Those companies operate career colleges focusing more on two-year programs or lower-income students and may need to make big changes
Nils Peterson

How would you design an ICT/education program for impact? | A World Bank Blog on ICT us... - 0 views

  • Country x has, in various ways, been host to numerous initiatives to introduce computers into its schools and, to lesser extents, to train teachers and students on their use, and schools have piloted a variety of digital learning materials and education software applications.  It is now ready, country leaders say, to invest in a rigorous, randomized trial of an educational technology initiative as a prelude to a very ambitious, large-scale roll-out of the use of educational technologies nationwide. It asks: What programs or specific interventions should we consider?
    • Nils Peterson
       
      World Bank Sr. Policy Wonk asking for help thinking through this question in a WB-branded blog.
  • What would be a useful response to such inquiries?  How would you design a program for measurable impact in a way that is immediately policy-relevant for decisionmakers contemplating large investments in the use of technology in the education sector, and what would this program look like?
Gary Brown

Teacher-Education Programs Are Unaccountable and Undemanding, Report Says - Government ... - 2 views

  • Most states are doing little or nothing to hold teacher-education programs accountable for the quality of their graduates, according to a new report that also criticizes colleges for setting low standards for education majors.
  • Colleges, by contrast, are largely not selective enough in accepting students for education programs, lack a rigorous curriculum, and don't give teaching candidates enough classroom training.
  • the American Association of Colleges for Teacher Education, said that the report was timely and that her association was working to unify its members on the theme of accountability.
  •  
    Apparently NCATE is not sufficient according to some.
Judy Rumph

about | outcomes_assessment | planning | NYIT - 1 views

shared by Judy Rumph on 17 Aug 10 - Cached
  • The Assessment Committee of NYIT's Academic Senate is the institutional unit that brings together all program assessment activities at the university - for programs with and without professional accreditation, for programs at all locations, for programs given through all delivery mechanisms. The committee members come from all academic schools and numerous support departments. Its meetings are open and minutes are posted on the web site of the Academic Senate.
  •  
    This page made me think about the public face of our own assessment process and how that can influence perceptions about our process.
Gary Brown

Details | LinkedIn - 0 views

  • Although different members of the academic hierarchy take on different roles regarding student learning, student learning is everyone’s concern in an academic setting. As I specified in my article comments, universities would do well to use their academic support units, which often have evaluation teams (or a designated evaluator) to assist in providing boards the information they need for decision making. Perhaps boards are not aware of those serving in evaluation roles at the university or how those staff members can assist boards in their endeavors.
  • Gary Brown • We have been using the Internet to post program assessment plans and reports (the programs that support this initiative at least), our criteria (rubric) for reviewing them, and then inviting external stakeholders to join in the review process.
Gary Brown

A Critic Sees Deep Problems in the Doctoral Rankings - Faculty - The Chronicle of Highe... - 1 views

  • This week he posted a public critique of the NRC study on his university's Web site.
  • "Little credence should be given" to the NRC's ranges of rankings.
  • There's not very much real information about quality in the simple measures they've got."
  • ...4 more annotations...
  • The NRC project's directors say that those small samples are not a problem, because the reputational scores were not converted directly into program assessments. Instead, the scores were used to develop a profile of the kinds of traits that faculty members value in doctoral programs in their field.
  • For one thing, Mr. Stigler says, the relationships between programs' reputations and the various program traits are probably not simple and linear.
  • if these correlations between reputation and citations were plotted on a graph, the most accurate representation would be a curved line, not a straight line. (The curve would occur at the tipping point where high citation levels make reputations go sky-high.)
  • Mr. Stigler says that it was a mistake for the NRC to so thoroughly abandon the reputational measures it used in its previous doctoral studies, in 1982 and 1995. Reputational surveys are widely criticized, he says, but they do provide a check on certain kinds of qualitative measures.
  •  
    What is not challenged is the validity and utility of the construct itself--reputation rankings.
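Stigler's curved-line point can be made concrete: for a monotone but nonlinear reputation-citation relationship, a rank correlation stays perfect while the linear (Pearson) correlation drops. A toy sketch with invented numbers, not NRC data:

```python
import math

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def spearman(xs, ys):
    # Rank correlation: Pearson applied to ranks (no ties in this toy data).
    rank = lambda v: [sorted(v).index(x) + 1 for x in v]
    return pearson(rank(xs), rank(ys))

# Hypothetical: reputation rises slowly, then goes "sky-high" past a
# citation tipping point -- a curved, monotone relationship.
citations  = [10, 20, 30, 40, 50, 60]
reputation = [1.0, 1.1, 1.3, 1.8, 3.5, 9.0]

print(round(pearson(citations, reputation), 2))   # noticeably below 1: a straight line misfits
print(round(spearman(citations, reputation), 2))  # 1.0: the ranks agree perfectly
```

The gap between the two coefficients is one simple diagnostic for the kind of nonlinearity Stigler describes.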
S Spaeth

Matthews et al: Selecting influential members of social networks - 0 views

  •  
    Opinion leaders are influential members of their social networks, strategically selected for their ability to sway community norms. The aims of the study were to assess: 1) whether it is feasible to identify student opinion leaders (SOLs) and their social networks among Grade 11 students at two high schools in Cape Town, South Africa; and 2) whether these opinion leaders would be willing to be involved in an HIV/AIDS prevention program in their school. The students (N = 412) completed a semi-structured, anonymous, self-administered questionnaire. ... Of these, all but two at each school were willing and available to participate in an HIV/AIDS prevention program. ---------- Focuses on HIV/AIDS prevention, but can we use the principles in other contexts, and Facebook recommendation tools to support the process?
  •  
    I've been thinking about how to support the development and visibility of SOLs using technology, without creating a creepy treehouse. How do we make them more visible and accessible?
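One simple, commonly used proxy for identifying opinion leaders from a nomination survey is in-degree: how often a student is named by peers. A toy sketch with invented names and ties (not from the Cape Town study):

```python
from collections import Counter

# Hypothetical friendship nominations from a classroom questionnaire:
nominations = {
    "Amy":  ["Ben", "Cara", "Dee"],
    "Ben":  ["Amy", "Cara"],
    "Cara": ["Amy", "Ben", "Dee", "Eli"],
    "Dee":  ["Cara"],
    "Eli":  ["Cara"],
}

# In-degree: how many classmates name each student. Frequently nominated
# students are candidate opinion leaders (SOLs).
in_degree = Counter(name for friends in nominations.values() for name in friends)

leader, count = in_degree.most_common(1)[0]
print(leader, count)  # Cara 4 -- named by four classmates
```

Real selection procedures typically combine such network measures with willingness to participate, as the study's second aim suggests.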
Joshua Yeidel

Program Assessment of Student Learning - 3 views

  •  
    "It is hoped that, in some small way, this blog can both engage and challenge faculty and administrators alike to become more intentional in their program assessment efforts, creating systematic and efficient processes that actually have the likelihood of improving student learning while honoring faculty time."
  •  
    As recommended by Ashley. Apparently Dr. Rogers' blog is just starting up, so you can "get in on the ground floor".
Gary Brown

Better Monitoring of Teacher-Training Programs Is Recommended - Government - The Chroni... - 0 views

shared by Gary Brown on 30 Apr 10 - Cached
    • Gary Brown
       
      Weird, since most Education programs focus on... research methods...
  • "There's a lot of talk out there about alternative routes into teaching being very different from traditional routes, and we found that that distinction just is not meaningful," Ms. Lagemann said. Colleges of education vary widely in their methods, and many alternative-certification programs require participants to take classes at colleges and universities, she pointed out. In some cases, students pursuing a traditional education degree and students in an alternative-certification program are in the same class.
Corinna Lo

News: The Challenge of Comparability - Inside Higher Ed - 0 views

  •  
    But when it came to defining sets of common learning outcomes for specific degree programs -- Transparency by Design's most distinguishing characteristic -- commonality was hard to come by. Questions to apply to any institution could be: 1) For any given program, what specific student learning outcomes are graduates expected to demonstrate? 2) By what standards and measurements are students being evaluated? 3) How well have graduating students done relative to these expectations? Comparability of results (the 3rd question) depends on transparency of goals and expectations (the 1st question) and transparency of measures (the 2nd question).