Home/ CTLT and Friends/ Group items tagged programming

Google and WPP Marketing Research Awards - 0 views

  •  
    "Google and the WPP Group have teamed up to create a new research program to improve understanding and practices in online marketing, and to better understand the relationship between online and offline media. The Google and WPP Marketing Research Awards Program expects to support up to 12 awards in the range from $50,000 to $70,000. Awards will be in the form of unrestricted gifts to academic institutions, under the names of the researchers who submitted the proposal. Award recipients will be invited to participate in a meeting highlighting work in this area and will be encouraged to make their results available online and in professional publications."
News: Sophie's Choice for 2-Year Colleges - Inside Higher Ed - 0 views

  •  
    "I am afraid that if we continue to get cuts at the level we are seeing, we may see a very quiet and disturbing transition from comprehensive, open door community colleges to niche colleges that are not comprehensive in their missions." Delta is also eliminating academic programs that don't fit into the two missions that are being protected: pre-transfer programs and job training. What will go? A lot of remedial education. The college will keep remedial courses for those who need just a course or two to be ready for college level work. But for the courses that enroll hundreds of students a semester who need years of remedial education to get ready for college, Delta is going to say no. Includes basic math, English as a second language (for beginners, newly arrived immigrants), courses aimed at senior citizens
The Wired Campus - Online Programs: Profits are There, Technological Innovation Is Not ... - 0 views

  •  
    "Online programs are generally profitable. But despite the buzz about Web 2.0, the education they provide is still dominated by rudimentary, text-based technology." "Any innovation... [is] really to supplement what is still a pretty rudimentary core."
At Colleges, Assessment Satisfies Only Accreditors - Letters to the Editor - The Chroni... - 2 views

  • Some of that is due to the influence of the traditional academic freedom that faculty members have enjoyed. Some of it is ego. And some of it is lack of understanding of how it can work. There is also a huge disconnect between satisfying outside parties, like accreditors and the government, and using assessment as a quality-improvement system.
  • We are driven by regional accreditation and program-level accreditation, not by quality improvement. At our institution, we talk about assessment a lot, and do just enough to satisfy the requirements of our outside reviewers.
  • Standardized direct measures, like the Major Field Test for M.B.A. graduates?
  • The problem with the test is that it does not directly align with our program's learning outcomes and it does not yield useful information for closing the loop. So why do we use it? Because it is accepted by accreditors as a direct measure and it is less expensive and time-consuming than more useful tools.
  • Without exception, the most useful information for improving the program and student learning comes from the anecdotal and indirect information.
  • We don't have the time and the resources to do what we really want to do to continuously improve the quality of our programs and instruction. We don't have a culture of continuous improvement. We don't make changes on a regular basis, because we are trapped by the catalog publishing cycle, accreditation visits, and the entrenched misunderstanding of the purposes of assessment.
  • The institutions that use it are ones that have adequate resources to do so. The time necessary for training, whole-system involvement, and developing the programs for improvement is daunting. And it is only being used by one regional accrediting body, as far as I know.
  • Until higher education as a whole is willing to look at changing its approach to assessment, I don't think it will happen
  •  
    The challenge, and another piece of evidence, that the nuances of assessment as they relate to teaching and learning remain elusive.
Office of the President: Perspectives Home - 1 views

  • Clearly, a world-class research university cannot long stand on such a shaky IT foundation. In fact, in the generally glowing accreditation report filed by the Northwest Commission on Colleges and Universities about our university this summer, one recommendation read: “The Committee recommends that Washington State University provide contemporary information management systems that will address the needs of the future for its student, academic and management support requirements.”
    • Nils Peterson
       
      Perhaps the President recalls the Spring preliminary accreditation report more clearly than the final report sent to him in the summer and linked at accreditation.wsu.edu, which does not have the "glowing" comments but does say "...the Commission finds that Recommendations 1, 2, and 3 of the Spring 2009 Comprehensive Evaluation Report are areas where Washington State University is substantially in compliance with Commission criteria for accreditation, but in need of improvement. The two additional Recommendations follow below. Recommendation 2 states that the implementation of the educational assessment plan remains inconsistent across the University despite promising starts and a number of exemplary successes in selected programs. The Commission therefore recommends that the University continue to enhance and strengthen its assessment process. This process needs to be extended to all of the University's educational programs, including graduate programs, and programs offered at the branch campuses (Standard 2.8)."
Program Assessment of Student Learning: July 2010 - 3 views

  • There are many considerations when evaluating a technology solution to the outcomes assessment process. The first thing is to be very clear about what a system can and cannot do. It CANNOT do your program assessment and evaluation for you! The institution or program must first define the intended outcomes and performance indicators. Without a doubt, that is the most difficult part of the process. Once the indicators have been defined, you need to be clear about the role of students and faculty in the use of the technology. Also, who is the technology "owner"--who will maintain it, keep the outcomes/indicators current, generate reports, etc.?
  •  
    This question returns to us periodically, so here is a resource and a key passage we can point to.
GAO - Generally Accepted Government Auditing Standards - 1 views

  • Our evaluator colleagues who work at GAO, and many others working in agencies and organizations that are responsible for oversight of, and focus on accountability for, government programs, often refer to the Yellow Book Standards. These agencies or organizations emphasize the importance of their independence from program officials and enjoy significant protections for their independence through statutory provisions, organizational location apart from program offices, direct reporting channels to the highest level official in their agency and governing legislative bodies, heightened tenure protections, and traditions emphasizing their independence.
  •  
    Good to have on the radar as DOE challenges the efficacy of accreditation, and not incidentally underpinning a principle of good evaluation.
Assessing Learning Outcomes at the University of Cincinnati: Comparing Rubric Assessmen... - 2 views

  •  
    "When the CLA results arrived eight months later, the UC team compared the outcomes of the two assessments. "We found no statistically significant correlation between the CLA scores and the portfolio scores," Escoe says. "In some ways, it's a disappointing finding. If we'd found a correlation, we could tell faculty that the CLA, as an instrument, is measuring the same things that we value and that the CLA can be embedded in a course. But that didn't happen." There were many factors that may have contributed to the lack of correlation, she says, including the fact that the CLA is timed, while the rubric assignments are not; and that the rubric scores were diagnostic and included specific feedback, while the CLA awarded points "in a black box": if a student referred to a specific piece of evidence in a critical-thinking question, he or she simply received one point. In addition, she says, faculty members may have had exceptionally high expectations of their honors students and assessed the e-portfolios with those high expectations in mind, leading to results that would not correlate to a computer-scored test. In the end, Escoe says, the two assessments are both useful, but for different things. The CLA can provide broad institutional data that satisfies VSA requirements, while rubric-based assessment provides better information to facilitate continuous program improvement."
  •  
    Another institution trying to make sense of the CLA. This study compared students' CLA scores with criteria-based scores of their eportfolios. The study used a modified version of the VALUE rubrics developed by the AAC&U. Our own Gary Brown was on the team that developed the critical thinking rubric for the VALUE project.
  •  
    "The CLA can provide broad institutional data that satisfies VSA requirements, while rubric-based assessment provides better information to facilitate continuous program improvement." This raises some questions: what meaning can we attach to these two non-correlated measures? What VSA requirements can rubric-based assessment NOT satisfy? Are those "requirements" really useful?
Let's Make Rankings That Matter - Commentary - The Chronicle of Higher Education - 3 views

  • By outsourcing evaluation of our doctoral programs to an external agency, we allow ourselves to play the double game of insulating ourselves from the criticisms they may raise by questioning their accuracy, while embracing the praise they bestow.
  • The solution to the problem is obvious: Universities should provide relevant information to potential students and faculty members themselves, instead of relying on an outside body to do it for them, years too late. How? By carrying out yearly audits of their doctoral programs.
  • The ubiquitous rise of social networking and open access to information via electronic media facilitate this approach to self-evaluation of academic departments. There is no need to depend on an obsolete system that irregularly publishes rankings when all of the necessary tools—e-mail, databases, Web sites—are available at all institutions of higher learning.
  • A great paradox of modern academe is that our institutions take pride in being on the cutting edge of new ideas and innovations, yet remain resistant and even hostile to the openness made possible by technology
  • We should not hide our departments' deficiencies in debatable rankings, but rather be honest about those limitations in order to aggressively pursue solutions that will strengthen doctoral programs and the institutions in which they play a vital role.
Program on Networked Governance - John F. Kennedy School of Government - 0 views

  •  
    "The traditional notion of hierarchical, top down, government has always been an imperfect match for the decentralized governance system of the US. However, much of what government does requires co-production of policy among agencies that have no formal authority over each other, fundamentally undermining the traditional Weberian image of bureaucracy. Networked governance refers to a growing body of research on the interconnectedness of essentially sovereign units, which examines how those interconnections facilitate or inhibit the functioning of the overall system. The objective of this program is two-fold: (1) to foster research on networked governance and (2) to provide a forum to discuss the challenges of networked governance."
Views: Changing the Equation - Inside Higher Ed - 1 views

  • But each year, after some gnashing of teeth, we opted to set tuition and institutional aid at levels that would maximize our net tuition revenue. Why? We were following conventional wisdom that said that investing more resources translates into higher quality and higher quality attracts more resources
  • those who control influential rating systems of the sort published by U.S. News & World Report -- define academic quality as small classes taught by distinguished faculty, grand campuses with impressive libraries and laboratories, and bright students heavily recruited. Since all of these indicators of quality are costly, my college’s pursuit of quality, like that of so many others, led us to seek more revenue to spend on quality improvements. And the strategy worked.
  • Based on those concerns, and informed by the literature on the “teaching to learning” paradigm shift, we began to change our focus from what we were teaching to what and how our students were learning.
  • No one wants to cut costs if their reputation for quality will suffer, yet no one wants to fall off the cliff.
  • When quality is defined by those things that require substantial resources, efforts to reduce costs are doomed to failure
  • some of the best thinkers in higher education have urged us to define the quality in terms of student outcomes.
  • Faculty said they wanted to move away from giving lectures and then having students parrot the information back to them on tests. They said they were tired of complaining that students couldn’t write well or think critically, but not having the time to address those problems because there was so much material to cover. And they were concerned when they read that employers had reported in national surveys that, while graduates knew a lot about the subjects they studied, they didn’t know how to apply what they had learned to practical problems or work in teams or with people from different racial and ethnic backgrounds.
  • Our applications have doubled over the last decade and now, for the first time in our 134-year history, we receive the majority of our applications from out-of-state students.
  • We established what we call college-wide learning goals that focus on "essential" skills and attributes that are critical for success in our increasingly complex world. These include critical and analytical thinking, creativity, writing and other communication skills, leadership, collaboration and teamwork, and global consciousness, social responsibility and ethical awareness.
  • despite claims to the contrary, many of the factors that drive up costs add little value. Research conducted by Dennis Jones and Jane Wellman found that “there is no consistent relationship between spending and performance, whether that is measured by spending against degree production, measures of student engagement, evidence of high impact practices, students’ satisfaction with their education, or future earnings.” Indeed, they concluded that “the absolute level of resources is less important than the way those resources are used.”
  • After more than a year, the group had developed what we now describe as a low-residency, project- and competency-based program. Here students don’t take courses or earn grades. The requirements for the degree are for students to complete a series of projects, captured in an electronic portfolio,
  • students must acquire and apply specific competencies
  • Faculty spend their time coaching students, providing them with feedback on their projects and running two-day residencies that bring students to campus periodically to learn through intensive face-to-face interaction
  • After a year and a half, the evidence suggests that students are learning as much as, if not more than, those enrolled in our traditional business program
  • As the campus learns more about the demonstration project, other faculty are expressing interest in applying its design principles to courses and degree programs in their fields. They created a Learning Coalition as a forum to explore different ways to capitalize on the potential of the learning paradigm.
  • a problem-based general education curriculum
  • At the very least, finding innovative ways to lower costs without compromising student learning is wise competitive positioning for an uncertain future
  • the focus of student evaluations has changed noticeably. Instead of focusing almost 100% on the instructor and whether he/she was good, bad, or indifferent, our students' evaluations are now focusing on the students themselves - as to what they learned, how much they have learned, and how much fun they had learning.
    • Nils Peterson
       
      Gary diigoed this article. This comment shines another light: the focus of the course eval shifted from faculty member to course & student learning when the focus shifted from teaching to learning.
  •  
    A must read spotted by Jane Sherman--I've highlighted, as usual, much of it.
In Many States, Public Higher Education Is Hitting a Point of 'Peril' - Government - Th... - 0 views

  • Nevada universities are preparing to close colleges, departments, and programs; demoralized professors are fleeing the state; and thousands of students are being shut out of classes at community colleges. The prospect of shutting down an entire institution remains a "distinct possibility" for the future, the chancellor says.
  • the resiliency of public financial support for American higher education is threatened, putting quality, capacity, and the underlying ability to meet student and societal needs at risk
  • "Higher education is changing by virtue of 1,000 painful cuts," said Stephen R. Portch, a former chancellor of the University System of Georgia. If public colleges cannot revamp their structures—such as by creating ways to measure learning more effectively and allowing capable students to earn degrees more quickly­—state tax systems will continue to limit spending on colleges in ways that will erode quality, Mr. Portch said, leaving faculty members to teach more and more students and take more and more unpaid furlough days, alongside fewer and fewer colleagues. "Business isn't coming back to normal this time," he says.
  •  
    Washington is not the hardest-hit state. Our work can be seen as having a direct bearing on this crisis.
War News Radio | Academic Commons - 0 views

  • War News Radio (WNR) is an award winning, student-run radio show produced by Swarthmore College in Swarthmore, Pennsylvania. It is carried by over thirty-seven radio stations across the United States, Canada and Italy, and podcasts are available through our Web site. It attempts to fill the gaps in the media's coverage of the conflicts in Iraq and Afghanistan by providing balanced and in-depth reporting, historical perspective, and personal stories.
    • Nils Peterson
       
      Interesting piece about students working on an authentic problem within the College, but outside its credit-awarding structure.
  • Robert Fisk, one of the best journalists covering conflicts in the Middle East, described this as a kind of "hotel journalism." "More and more Western reporters in Baghdad" he writes in a survey of media coverage in Iraq, "are reporting from their hotels rather than the streets of Iraq's towns and cities."1 If the journalist in Iraq could prepare his or her reports by relying on phone interviews, Swarthmore students could do that as well.
    • Nils Peterson
       
      Theron brought this work to my attention a couple years ago. They end up using Skype as one of their tools
  • Initially college administrators and faculty explored the idea of incorporating War News Radio into the college curriculum, where students involved in the program could receive credit for their broadcast work. Students took courses through the film and media studies department and completed required readings on the Middle East. However, it was hard to do both things at the same time and the college stopped giving credit, which made the show more focused on reporting. And then it became clear that an experienced journalist was needed to guide the students.
    • Nils Peterson
       
      A couple of threads connect here. One is Daniel Pink's Autonomy, Mastery and Purpose (intrinsic rewards) being more important in a creative endeavor than extrinsic rewards (course grades). The other idea is a mentor from the Community of Practice rather than from inside the university.
  • students were becoming better reporters and the show became more professional as it moved to a weekly format. Stations throughout the U.S. began to take interest in what WNR was covering as the shows were uploaded to Public Radio Exchange (PRX), a Web-based platform for digital distribution, review, and licensing of radio programs. Students' reports were now being heard by thousands of people in the U.S. and abroad. With this publicity, students felt increasingly responsible for meeting weekly deadlines and producing a high quality program. Currently staff members contribute more than twenty hours of work into every show
  • In addition to placing Swarthmore on the map, it has boosted the number of applicants. WNR is “one of two or three things that have influenced applicants to the college, so that people who want to come to Swarthmore and have to write the essay: "Why Swarthmore?" one of the most frequently cited things in the last few years has been War News Radio,”
Designing Effective Assessments: Q&A with Trudy Banta - 0 views

  • One-hundred forty-six assessment examples were sent to us, and we used all of those in one way or another in the book. I think it’s a pretty fair sample of what’s going on in higher education assessment. Yet most of the programs that we looked at had only been underway for two, three, or four years. When we asked what the long-term impact of doing assessment and using the findings to improve programs had been, in only six percent of the cases were the authors able to say that student learning had been improved.
  •  
    Though an advertisement for a workshop, Trudy Banta confirms our own suspicions. The blurb here further confirms that we need not look far for models--our energy will be better spent making our work at WSU a model.
Change Magazine - The New Guys in Assessment Town - 0 views

  • if one of the institution’s general education goals is critical thinking, the system makes it possible to call up all the courses and programs that assess student performance on that outcome.
  • bringing together student learning outcomes data at the level of the institution, program, course, and throughout student support services so that “the data flows between and among these levels”
  • Like its competitors, eLumen maps outcomes vertically across courses and programs, but its distinctiveness lies in its capacity to capture what goes on in the classroom. Student names are entered into the system, and faculty use a rubric-like template to record assessment results for every student on every goal. The result is a running record for each student available only to the course instructor (and in some cases to the students themselves, who can go to the system to get feedback on recent assessments).
    • Nils Peterson
       
      Sounds like the harvesting gradebook: assess student work and roll up.
    • Joshua Yeidel
       
      This system has some potential for formative use at the per-student level.
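The per-student, per-outcome record-keeping these comments describe ("assess student work and roll up") can be sketched in a few lines. This is a hypothetical illustration of the roll-up pattern only; the function and record shapes are assumptions, not eLumen's actual data model or API.

```python
# Hypothetical sketch of the "record and roll up" pattern: each assessment
# is a (student, outcome, score) record; program-level reporting aggregates
# the scores per outcome. Names are illustrative, not eLumen's API.
from collections import defaultdict
from statistics import mean

def roll_up(records):
    """Aggregate per-student rubric scores into per-outcome averages."""
    by_outcome = defaultdict(list)
    for student, outcome, score in records:
        by_outcome[outcome].append(score)
    return {outcome: mean(scores) for outcome, scores in by_outcome.items()}

records = [
    ("alice", "critical_thinking", 3),
    ("bob",   "critical_thinking", 4),
    ("alice", "written_comm",      2),
    ("bob",   "written_comm",      3),
]
print(roll_up(records))  # {'critical_thinking': 3.5, 'written_comm': 2.5}
```

Keeping the raw per-student records, rather than only the rolled-up averages, is what would allow such a system to serve formative, per-student feedback as well as program-level reporting.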
  • “I’m a little wary.  It seems as if, in addition to the assessment feedback we are already giving to students, we might soon be asked to add a data-entry step of filling in boxes in a centralized database for all the student learning outcomes. This is worrisome to those of us already struggling under the weight of all that commenting and essay grading.”
    • Nils Peterson
       
      It's either double work, or a failure to see that the grading and the assessment can be the same activity. I suspect the former -- grading is being done with different metrics.
    • Joshua Yeidel
       
      I am in the unusual position of seeing many papers _after_ they have been graded by a wide variety of teachers. Many of these contain little "assessment feedback" -- many teachers focus on "correcting" the papers and finding some letter or number to assign as a value.
  • “This is where we see many institutions struggling,” Galvin says. “Faculty simply don’t have the time for a deeper involvement in the mechanics of assessment.” Many have never seen a rubric or worked with one, “so generating accurate, objective data for analysis is a challenge.”  
    • Nils Peterson
       
      Rather than faculty using the community to help with assessment, they are outsourcing to a paid assessor -- this is the result of undertaking this thinking while also remaining in the institution-centric end of the spectrum we developed
  • I asked about faculty pushback. “Not so much,” Galvin says, “not after faculty understand that the process is not intended to evaluate their work.”
    • Nils Peterson
       
      red flag
  • the annual reports required by this process were producing “heaps of paper” while failing to track trends and developments over time. “It’s like our departments were starting anew every year,” Chaplot says. “We wanted to find a way to house the data that gave us access to what was done in the past,” which meant moving from discrete paper reports to an electronic database.
    • Joshua Yeidel
       
      It's not clear whether the "database" is housing measurements, narratives and reflections, or all of the above.
  • Can eLumen represent student learning in language? No, but it can quantify the number of boxes checked against number of boxes not checked.”
  • developing a national repository of resources, rubrics, outcomes statements, and the like that can be reviewed and downloaded by users
    • Nils Peterson
       
      In building our repository we could well open-source these tools; no need to lock them up.
  • “These solutions cement the idea that assessment is an administrative rather than an educational enterprise, focused largely on accountability. They increasingly remove assessment decision making from the everyday rhythm of teaching and learning and the realm of the faculty.
    • Nils Peterson
       
      Over-the-wall assessment; see the Transformative Assessment rubric for more detail.
WSU Today Online - Current Article List - 0 views

  • the goal of the program is for students to submit their portfolios at the start of their junior year, and only about 34 percent are managing to do that.
  • Writing Assessment Program received the 2009 “Writing Program Certificate of Excellence”
  • If students delay completing their portfolio until late in their junior year, or into their senior year, she said, “it undermines the instructional integrity of the assessment.”
  • 70 percent of students submitted a paper as part of their portfolio that had been completed in a non-WSU course
  •  
    I ponder these highlights
Movie Clips and Copyright - 0 views

  •  
    Video clips -- sometimes the copyright question comes up, so this green light is good news. Video clips may lend themselves to scenario-based assessments -- instead of reading a long article, students could look at a digitally presented case to analyze and critique -- which might open up a lot of possibilities for assessment activities. The latest round of rule changes, issued Monday by the U.S. Copyright Office, deals with what is legal and what is not as far as decrypting and repurposing copyrighted content. One change in particular is making waves in academe: an exemption that allows professors in all fields and "film and media studies students" to hack encrypted DVD content and clip "short portions" into documentary films and "non-commercial videos." (The agency does not define "short portions.") This means that any professor can legally extract movie clips and incorporate them into lectures, as long as they are willing to decrypt them - a task made relatively easy by widely available programs known as "DVD rippers." The exemption also permits professors to use ripped content in non-classroom settings that are similarly protected under "fair use" - such as presentations at academic conferences.
Clemson University e-portfolio winners - 3 views

  •  
    Students used different technologies, not one set mandated system for the e-portfolios. In 2006, Clemson University implemented the ePortfolio Program that requires all undergraduates to create and submit a digital portfolio as evidence of academic and experiential mastery of Clemson's core competencies. Students collect work from their classes and elsewhere, connecting (tagging) it to the competencies (Written and Oral Communication; Reasoning, Critical Thinking and Problem Solving; Mathematical, Scientific and Technological Literacy; Social Science and Cross-Cultural Awareness; Arts and Humanities; and Ethical Judgment) throughout their undergraduate experience.
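The collect-and-tag workflow described above can be sketched simply. This is a hypothetical illustration assuming a plain list of tagged artifacts; the competency names paraphrase Clemson's list, and the data shapes are not Clemson's actual ePortfolio system.

```python
# Hypothetical sketch: artifacts tagged with core competencies, plus a check
# for which competencies a student has not yet evidenced. Names and data
# shapes are illustrative only.
CORE_COMPETENCIES = {
    "written_oral_communication",
    "critical_thinking",
    "math_science_tech_literacy",
    "cross_cultural_awareness",
    "arts_humanities",
    "ethical_judgment",
}

def missing_competencies(portfolio):
    """Return the core competencies with no tagged artifact yet."""
    tagged = set()
    for artifact in portfolio:
        tagged.update(artifact["tags"])
    return CORE_COMPETENCIES - tagged

portfolio = [
    {"title": "Essay on research ethics",
     "tags": {"written_oral_communication", "ethical_judgment"}},
    {"title": "Physics lab report",
     "tags": {"math_science_tech_literacy"}},
]
print(sorted(missing_competencies(portfolio)))
# ['arts_humanities', 'critical_thinking', 'cross_cultural_awareness']
```

Because students used different technologies rather than one mandated system, the interesting design problem is exactly this layer: the tagging and coverage-checking, not any particular portfolio tool.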
Views: Accreditation's Accidental Transformation - Inside Higher Ed - 0 views

  • Why the national attention? Why the second-guessing of the accreditation decisions? It is part of the accidental transformation of accreditation.
  • Academic quality assurance and collegiality -- the defining features of traditional accreditation -- are, at least for now, taking a backseat to consumer protection and compliance with law and regulation. Government and the public expect accreditation to essentially provide a guarantee that students are getting what they pay for in terms of the education they seek.
  • Blame the powerful demand that, above all, colleges and universities provide credentials that lead directly to employment or advancement of employment. Driven by public concerns about the difficult job market and the persistent rise in the price of tuition, accrediting organizations are now expected to assure that the colleges, universities and programs they accredit will produce these pragmatic results.
  • The worth of higher education is determined less and less through the professional judgments made by the academic community. The deference at one time accorded accrediting organizations to decide the worth of colleges and universities is diminished and perhaps disappearing.
  • Do we know the consequences of this accidental transformation? Are we prepared to accept them? These changes may be unintended, but they are dramatic and far-reaching. Is this how we want to proceed? Judith S. Eaton is president of the Council for Higher Education Accreditation.
  •  
    It is this discussion that programs approaching accreditation perfunctorily need to attend to.
Blog U.: The Challenge of Value-Added - Digital Tweed - Inside Higher Ed - 0 views

  •  
    Quoting a 1984 study, "higher education should ensure that the mounds of data already collected on students are converted into useful information and fed back [to campus officials and faculty] in ways that enhance student learning and lead to improvement in programs, teaching practices, and the environment in which teaching and learning take place." The example given is an analysis of test scores in the Los Angeles Unified School District by the LA Times.
  •  
    It's going to take some assessment (and political) smarts to deflect the notion that existing data can be re-purposed easily to assess "value-added".