
CTLT and Friends: Group items tagged "portfolios"


Theron DesRosier

How-To (Portfolio) - 0 views

  •  
    "Creating an Interactive Portfolio with Google Sites" -- the process of creating an electronic portfolio, using examples from a Google Sites portfolio, developed by Helen C. Barrett, Ph.D.
Nils Peterson

AAC&U News | April 2010 | Feature - 1 views

  • Comparing Rubric Assessments to Standardized Tests
  • First, the university, a public institution of about 40,000 students in Ohio, needed to comply with the Voluntary System of Accountability (VSA), which requires that state institutions provide data about graduation rates, tuition, student characteristics, and student learning outcomes, among other measures, in the consistent format developed by its two sponsoring organizations, the Association of Public and Land-grant Universities (APLU) and the American Association of State Colleges and Universities (AASCU).
  • And finally, UC was accepted in 2008 as a member of the fifth cohort of the Inter/National Coalition for Electronic Portfolio Research, a collaborative body with the goal of advancing knowledge about the effect of electronic portfolio use on student learning outcomes.  
  • ...13 more annotations...
  • outcomes required of all UC students—including critical thinking, knowledge integration, social responsibility, and effective communication
  • “The wonderful thing about this approach is that full-time faculty across the university  are gathering data about how their  students are doing, and since they’ll be teaching their courses in the future, they’re really invested in rubric assessment—they really care,” Escoe says. In one case, the capstone survey data revealed that students weren’t doing as well as expected in writing, and faculty from that program adjusted their pedagogy to include more writing assignments and writing assessments throughout the program, not just at the capstone level. As the university prepares to switch from a quarter system to semester system in two years, faculty members are using the capstone survey data to assist their course redesigns, Escoe says.
  • the university planned a “dual pilot” study examining the applicability of electronic portfolio assessment of writing and critical thinking alongside the Collegiate Learning Assessment,
  • The rubrics the UC team used were slightly modified versions of those developed by AAC&U’s Valid Assessment of Learning in Undergraduate Education (VALUE) project. 
  • In the critical thinking rubric assessment, for example, faculty evaluated student proposals for experiential honors projects that they could potentially complete in upcoming years.  The faculty assessors were trained and their rubric assessments “normed” to ensure that interrater reliability was suitably high.
  • “We found no statistically significant correlation between the CLA scores and the portfolio scores,”
  • There were many factors that may have contributed to the lack of correlation, she says, including the fact that the CLA is timed, while the rubric assignments are not; and that the rubric scores were diagnostic and included specific feedback, while the CLA awarded points “in a black box”:
  • faculty members may have had exceptionally high expectations of their honors students and assessed the e-portfolios with those high expectations in mind—leading to results that would not correlate to a computer-scored test. 
  • “The CLA provides scores at the institutional level. It doesn’t give me a picture of how I can affect those specific students’ learning. So that’s where rubric assessment comes in—you can use it to look at data that’s compiled over time.”
  • Their portfolios are now more like real learning portfolios, not just a few artifacts, and we want to look at them as they go into their third and fourth years to see what they can tell us about students’ whole program of study.”  Hall and Robles are also looking into the possibility of forming relationships with other schools from NCEPR to exchange student e-portfolios and do a larger study on the value of rubric assessment of student learning.
  • “We’re really trying to stress that assessment is pedagogy,”
  • “It’s not some nitpicky, onerous administrative add-on. It’s what we do as we teach our courses, and it really helps close that assessment loop.”
  • In the end, Escoe says, the two assessments are both useful, but for different things. The CLA can provide broad institutional data that satisfies VSA requirements, while rubric-based assessment provides better information to facilitate continuous program improvement.
    • Nils Peterson
       
      CLA did not provide information for continuous program improvement -- we've heard this argument before
  •  
    The lack of correlation might be rephrased--there appears to be no correlation between what is useful for faculty who teach and what is useful for the VSA. A corollary question: Of what use is the VSA?
Theron DesRosier

Course Portfolio Initiative - 0 views

  •  
    Examples of course portfolios from Indiana University Bloomington. All of these link to the Pew Course Portfolio Peer Review of Teaching Project: http://www.courseportfolio.org/peer/pages/index.jsp
Gary Brown

News: More Meaningful Accreditation - Inside Higher Ed - 0 views

  • Its most distinctive feature is that it would clearly separate "compliance" from "improvement." Colleges would be required to build "portfolios" of data and materials, documenting (through more frequent peer reviews) their compliance with the association's many standards, with much of the information being made public. On a parallel track, or "pathway," colleges would have the flexibility to propose their own projects or themes as the focus of the self-improvement piece of their accreditation review, and would be judged (once the projects were approved by a peer team) by how well they carried out the plan. (Colleges the commission deems to be troubled would have a "pathway" chosen for them, to address their shortcomings.)
  • reduce the paperwork burden on institutions (by making the portfolio electronic and limiting the written report for the portfolio to 50 pages), and make the process more valuable for colleges by letting them largely define for themselves where they want to improve and what they want to accomplish.
  • "We want to make accreditation so valuable to institutions that they would do it without Title IV," she said in an interview after the presentation. "The only way we can protect the improvement piece, and make it valuable to institutions to aim high, is if we separate it from the compliance piece."
  • ...3 more annotations...
  • Mainly what happens in the current structure, she said, is that the compliance role is so onerous and so dominates the process that, in too many cases, colleges fail to get anything meaningful out of the improvement portion. That, she said, is why separating the two is so essential.
  • As initially conceptualized, the commission's revised process would have institutions build electronic portfolios made up of (1) an annual institutional data update the accreditor already uses, (2) a collection of "evidence of quality and capacity" drawn from existing sources (other accrediting reports, federal surveys and audits), and (3) a "50-page, evidence-based report that demonstrates fulfillment of the criteria for accreditation," based largely on the information in (1) and (2), commission documents say. A panel of peer reviewers would "rigorously" review the data (without a site visit) at various intervals -- how much more frequently than the current 10-year accreditation review would probably depend on the perceived health of the college -- and make a recommendation on whether to approve the institution for re-accreditation.
  • "The portfolio portion really should be what's tied to continued accreditation," said one member of the audience. "As soon as you tie the pathway portion into that, you make it a very different exercise, as we're going to want to make a good case, to make ourselves look good."
Joshua Yeidel

Digication e-Portfolios: Highered - Assessment - 0 views

  •  
    "Our web-based assessment solution for tracking, comparing, and reporting on student progress and performance gives faculty and administrators the tools they need to assess a class, department, or institution based on your standards, goals, or objectives. The Digication AMS integrates tightly with our award winning e-Portfolio system, enabling students to record and showcase learning outcomes within customizable, media friendly templates."
  •  
    Could this start out with program portfolios, and grow to include student work?
Peggy Collins

Official Google Docs Blog: Electronic Portfolios with Google Apps - 0 views

  •  
    looks like google has officially adopted Helen Barrett's method of e-portfolios with Google apps. Posted on "Google Docs Blog"
Nils Peterson

E-Portfolios for Learning: Limitations of Portfolios - 1 views

  • Today, Shavelson, Klein & Benjamin published an online article on Inside Higher Ed entitled, "The Limitations of Portfolios." The comments to that article are even more illuminating, and highlight the debate about electronic portfolios vs. accountability systems... assessment vs. evaluation. These arguments highlight what I think is a clash in philosophies of learning and assessment, between traditional, behaviorist models and more progressive, cognitive/constructivist models. How do we build assessment strategies that bridge these two approaches? Or is the divide too wide? Do these different perspectives support the need for multiple measures and triangulation?
    • Nils Peterson
       
      Helen responds to CLA proponents
Nils Peterson

CITE Journal -- Volume 2, Issue 4 - 0 views

  • The ability to aggregate data for assessment is counted as a plus for CS and a minus for GT
    • Nils Peterson
       
      This analysis precedes the Harvesting concept.
  • The map includes the portfolio's ability to aid learners in planning, setting goals, and navigating the artifacts learners create and collect.
    • Nils Peterson
       
      Recently, when I have been thinking about program assessment, I've been thinking about how students might assess courses (before adding the course to their transcript, aka portfolio) in terms of the student's learning needs for developing proficiency in the 6 WSU goals. Students might also do a course evaluation relative to the 6 goals to give instructors and fellow students guideposts. So, the notion here, portfolio as map, would be that the portfolio had a way for the learner to track/map progress toward a goal. Perhaps a series of radar charts associated with a series of artifacts. Learner reflection would lead to conclusions about which aspects of the rubric needed more practice in the creation of the next artifacts going into the portfolio.
Theron DesRosier

Assessing Learning Outcomes at the University of Cincinnati: Comparing Rubric Assessmen... - 2 views

  •  
    "When the CLA results arrived eight months later, the UC team compared the outcomes of the two assessments. "We found no statistically significant correlation between the CLA scores and the portfolio scores," Escoe says. "In some ways, it's a disappointing finding. If we'd found a correlation, we could tell faculty that the CLA, as an instrument, is measuring the same things that we value and that the CLA can be embedded in a course. But that didn't happen." There were many factors that may have contributed to the lack of correlation, she says, including the fact that the CLA is timed, while the rubric assignments are not; and that the rubric scores were diagnostic and included specific feedback, while the CLA awarded points "in a black box": if a student referred to a specific piece of evidence in a critical-thinking question, he or she simply received one point. In addition, she says, faculty members may have had exceptionally high expectations of their honors students and assessed the e-portfolios with those high expectations in mind-leading to results that would not correlate to a computer-scored test. In the end, Escoe says, the two assessments are both useful, but for different things. The CLA can provide broad institutional data that satisfies VSA requirements, while rubric-based assessment provides better information to facilitate continuous program improvement. "
  •  
    Another institution trying to make sense of the CLA. This study compared students' CLA scores with criteria-based scores of their eportfolios. The study used a modified version of the VALUE rubrics developed by AAC&U. Our own Gary Brown was on the team that developed the critical thinking rubric for the VALUE project.
  •  
    "The CLA can provide broad institutional data that satisfies VSA requirements, while rubric-based assessment provides better information to facilitate continuous program improvement. " This begs some questions: what meaning can we attach to these two non-correlated measures? What VSA requirements can rubric-based assessment NOT satisfy? Are those "requirements" really useful?
Peggy Collins

Clemson University e-portfolio winners - 3 views

  •  
    Students used different technologies rather than a single mandated system for their e-portfolios. In 2006, Clemson University implemented the ePortfolio Program, which requires all undergraduates to create and submit a digital portfolio as evidence of academic and experiential mastery of Clemson's core competencies. Students collect work from their classes and elsewhere, connecting (tagging) it to the competencies (Written and Oral Communication; Reasoning, Critical Thinking and Problem Solving; Mathematical, Scientific and Technological Literacy; Social Science and Cross-Cultural Awareness; Arts and Humanities; and Ethical Judgment) throughout their undergraduate experience.
Joshua Yeidel

Web Slides on Harvesting Grade Book - 0 views

  •  
    A cool technique for portfolio-on-the-web
Nils Peterson

Balancing the Two Faces of ePortfolios - Researching Lifelong ePortfolios and Web 2.0 |... - 0 views

  •  
    Helen Barrett attempting to balance portfolio typologies
  •  
    diagram of relationship between workspace and showcase portfolio
Joshua Yeidel

Digication :: NCCC Art Department Program Evaluation :: Purpose of Evaluation - 0 views

  •  
    An eportfolio for program evaluation by the Northwest Connecticut Community College Art Department. Slick, well-organized, and pretty, using Digication as platform and host. A fine portfolio, which could well be a model for our programs, except that there is not a single direct measure of student learning outcomes.
Gary Brown

WSU Today Online - Current Article List - 0 views

  • the goal of the program is for students to submit their portfolios at the start of their junior year, and only about 34 percent are managing to do that.
  • Writing Assessment Program received the 2009 “Writing Program Certificate of Excellence”
  • If students delay completing their portfolio until late in their junior year, or into their senior year, she said, “it undermines the instructional integrity of the assessment.”
  • ...1 more annotation...
  • 70 percent of students submitted a paper as part of their portfolio that had been completed in a non-WSU course
  •  
    I ponder these highlights
Nils Peterson

Dave's Educational Blog - 0 views

  • If all of our students are remembering the same things, the things that they learned for their standards test, the collaborative work between those students will only differ insofar as they have lived different lives OUTSIDE of school. In this sense, the education system plays NO part whatsoever in contributing to the creative economy.
    • Nils Peterson
       
      Recalling Bransford and the amount of time in our lives we are learning vs. the amount of time in school
  •  
    portfolio implications: In the rhizomatic model of learning, curriculum is not driven by predefined inputs from experts; it is constructed and negotiated in real time by the contributions of those engaged in the learning process. This community acts as the curriculum.
Gary Brown

Reviewers Unhappy with Portfolio 'Stuff' Demand Evidence -- Campus Technology - 1 views

  • An e-mail comment from one reviewer: “In reviewing about 100-some-odd accreditation reports in the last few months, it has been useful in our work here at Washington State University to distinguish ‘stuff’ from evidence. We have adopted an understanding that evidence is material or data that has been analyzed and that can be used, as dictionary definitions state, as ‘proof.’ A student gathers ‘stuff’ in the ePortfolio, selects, reflects, etc., and presents evidence that makes a case (or not)… The use of this distinction has been indispensable here. An embarrassing amount of academic assessment work culminates in the presentation of ‘stuff’ that has not been analyzed--student evaluations, grades, pass rates, retention, etc. After reading these ‘self studies,’ we ask the stumping question--fine, but what have you learned? Much of the ‘evidence’ we review has been presented without thought or with the general assumption that it is somehow self-evident… But too often that kind of evidence has not focused on an issue or problem or question. It is evidence that provides proof of nothing.
  •  
    a bit of a context shift, but....
Theron DesRosier

Wired Campus: Electronic Portfolios: a Path to the Future of Learning - Chron... - 0 views

  •  
    First, ePortfolios can integrate student learning in an expanded range of media, literacies, and viable intellectual work. As the robust ePortfolio projects at Washington State, Clemson, and Pennsylvania State Universities illustrate, ePortfolios enable students to collect work and reflections on their learning through text, imagery, and multimedia artifacts. Given that we are already living in a culture where visual communication is as influential as written text, the ability to represent learning through integrated media will be essential.
Theron DesRosier

pagi: eLearning - 0 views

  • ePortfolio; ePortfolios, the Harvesting Gradebook, Accountability, and Community (!!!); Harvesting gradebook; Learning from the transformative grade book; Implementing the transformed grade book; Transformed gradebook worked example (!!); Best example: Calaboz ePortfolio (!!); Guide to Rating Integrative & Critical Thinking (!!!); Grant Wiggins, Authentic Education; Hub and spoke model of course design (!!!); ePortfolio as the core learning application; Case Studies of Electronic Portfolios for Learning
  •  
    Nils found this. It is a Spanish concept map on eLearning that includes CTLT and the Harvesting Gradebook.
Joshua Yeidel

Blogging as Pedagogic Practice Across the Curriculum - Serendipity35 - 0 views

  •  
    Teachers are using college-wide blogging tools or free blogging services for different disciplines as a way to address e-portfolios, audience, publishing practices, copyright and plagiarism, authentic writing and writing in a digital age with hypertext.
Gary Brown

News: Different Paths to Full Professor - Inside Higher Ed - 1 views

  • Ohio State is embarking on discussions on how to change the way professors are evaluated for promotion to full professor. University officials argue that, as in tenure reviews, research appears to be the dominant factor at that stage, despite official policies to weigh teaching and service as well.
  • The concept in play would end the myth that candidates for full professor (and maybe, someday, candidates for tenure) should be great in everything. Why? Because most professors aren't great at everything.
  • Once research eminence is verified, teaching and service must be found only to be "adequate."
  • ...3 more annotations...
  • "This approach is insidiously harmful," Alutto said. "First, it generates cynicism among productive faculty, as they realize the 'game' being played. Second, it frustrates productive faculty who contribute to their disciplines and the university in unique and powerful ways other than -- or in addition to -- traditional research. Third, it flies in the face of everything we know about the need for a balanced portfolio of skills to achieve institutional success."
  • "Measuring impact is always difficult, particularly when it comes to teaching and service," he said. "But it can be done if we focus on the significance of these activities as it extends beyond our own institution -- just as we expect such broad effects with traditional scholarship. Thus, indicators of impact on other institutions, recognition by professional associations, broad adoption of teaching materials (textbooks, software, etc.) by other institutions, evidence of effects on policy formulation and so on -- all these are appropriate independent indicators of effectiveness."
  • Gerber said, the idea of "counting" such contributions in faculty evaluations is an embrace of Ernest Boyer's ideas about "the scholarship of teaching," ideas that have had much more influence outside research universities than within them.
  •  
    Reconsidering SoTL at Ohio State
  •  
    Responding to this portion: "This approach is insidiously harmful," Alutto said. "First, it generates cynicism among productive faculty, as they realize the 'game' being played. Second, it frustrates productive faculty who contribute to their disciplines and the university in unique and powerful ways other than -- or in addition to -- traditional research. Third, it flies in the face of everything we know about the need for a balanced portfolio of skills to achieve institutional success." How does OAI navigate these real concerns/hurdles with our program assessment efforts? If we convince/force leadership to "value" teaching and SoTL but it carries little or no weight in terms of promotion and tenure (I give you Carol Anelli, for example), then don't we become part of that "game"?