
CTLT and Friends / Group items tagged assessment


Lorena O'English

http://teachpaperless.blogspot.com/2010/03/using-jing-to-assess-online-student.html - 2 views

  •  
    This is an interesting idea - and not limited to Jing - there are a bunch of free and easy to use screencasting tools...
Gary Brown

Learning to Hate Learning Objectives - The Chronicle Review - The Chronicle of Higher E... - 4 views

  • Brottman's essay is a dangerous display of educational malpractice. Those who argue that principles of good assessment intrude upon teaching and learning disclose the painful fact that many educators are not adequately prepared to teach.
  •  
    Read it and weep.
  •  
    I think this reader comment captures it: Right--it's not about the students learning anything--it's about YOUR learning, and you let them come along for the ride. How could you fit that into learning objectives? Please. This is why people think all of us are navel-gazing, self-indulgent mopes.
  •  
    Doesn't it depend on the nature of the learning objectives? I mean, you could list a set of facts and skills levels students should have attained. You could specify a number of discrete facts and skills to be attained within certain areas of the course curriculum. Or, you could do something more creative such as measure the number of claims with evidence in student writing that is within the subject matter of the course to demonstrate a level of articulation.

    At CTLT, I never did become fully settled on how this applies to certain subjects, though, like mathematics and the natural sciences. Depending on the subject matter, specific facts like natural laws and methods must be discretely learned and learned perfectly. And, indeed, in some subjects there is such a thing as perfect understanding, where anything even slightly less is failure to learn. This is rigid, yes. But I do not see the alternative in some subjects, and teachers of those subjects certainly don't either. I do think that sometimes there can be more flexibility in the order in which discrete fundamentals are learned. Learning out of order often convinced me of the importance of things skipped, causing me to go back and study more comprehensively on my own, in my own time, and according to my own interest.
Gary Brown

Encyclopedia of Educational Technology - 1 views

  • The revised taxonomy (Anderson and Krathwohl, 2001) incorporates both the kind of knowledge to be learned (knowledge dimension) and the process used to learn (cognitive process), allowing for the instructional designer to efficiently align objectives to assessment techniques. Both dimensions are illustrated in the following table that can be used to help write clear, focused objectives.
  • Teachers may also use the new taxonomy dimensions to examine current objectives in units, and to revise the objectives so that they will align with one another, and with assessments.
  • Anderson and Krathwohl also list specific verbs that can be used when writing objectives for each column of the cognitive process dimension.
  •  
    Bloom has not gone away, and this revision helps delimit the nominalist implications
Nils Peterson

News: Assessment Disconnect - Inside Higher Ed - 7 views

  •  
    Theron left an interesting comment on this; the whole piece is a timely read and connects to OAI's staff workshop on 1/28/10.
Theron DesRosier

The Atlantic Century: Benchmarking EU and U.S. Innovation and Competitiveness | The Inf... - 1 views

  •  
    "ITIF uses 16 indicators to assess the global innovation-based competitiveness of 36 countries and 4 regions. This report finds that while the U.S. still leads the EU in innovation-based competitiveness, it ranks sixth overall. Moreover, the U.S. ranks last in progress toward the new knowledge-based innovation economy over the last decade."
Gary Brown

Can We Afford Our State Colleges? - Brainstorm - The Chronicle of Higher Education - 0 views

  • Experts do not agree on the precise numbers, but over the past generation we have moved from an environment in which states paid for 70 percent of cost and students paid 30 percent, to a situation in which those numbers have exactly reversed. Increasingly, tuition accounts for the lion’s share of institutional budgets, with state appropriations playing a minority role.
  • The sense I got Friday was that higher-education professionals do not expect the ”good old days” to return
  • the apparent consensus that public education needs to be more productive, because there was no discussion of the definition of productivity. 
  • ...4 more annotations...
  • But even when instruction was (implicitly) assumed to be the measure of productivity, there was no discussion of measurable learning outcomes.
  • 5. It is true that public colleges do not measure learning outcomes. Neither does anyone else. U.S. universities resist this kind of accountability in every way they can think of. Since 1985, when the modern assessment movement gained traction, higher education can only be said to have been temporizing, getting ready to get ready through endless committees that go nowhere.
  • Most institutions continue to invoke apodictic notions of quality and refuse to define quality in modern terms (suitability to purpose; quality for whom and for what purpose) or to address the issue of value added, where career schools and community colleges will generally lead. At this time, there is virtually no institution-wide assessment system in place that would pass muster in a 501 measurement science course.
  • 6. Yes, public institutions need restructuring to make them more accountable and productive. Our independent colleges and universities need the same kind of restructuring and the agenda is rightfully one of public interest. The common perception that taxpayers do not support our private institutions is false.----------------------Robert W TuckerPresidentInterEd, Inc.www.InterEd.com
  •  
    Tucker's note in the comments again suggests the challenge and opportunity of the WSU model.
Gary Brown

YouTube - Assessment Quickies #1: What Are Student Learning Outcomes? - 3 views

shared by Gary Brown on 22 Apr 10
  • Assessment Quickies #1: What Are Student Learning Outcomes?
  •  
    a useful resource for our partners here at WSU from a new Cal State partner.
Theron DesRosier

Virtual-TA - 2 views

  • We also developed a technology platform that allows our TAs to electronically insert detailed, actionable feedback directly into student assignments
  • Your instructors give us the schedule of assignments, when student assignments are due, when we might expect to receive them electronically, when the scored assignments will be returned, the learning outcomes on which to score the assignments, the rubrics to be used and the weights to be applied to different learning outcomes. We can use your rubrics to score assignments or design rubrics for sign-off by your faculty members.
  • review and embed feedback using color-coded pushpins (each color corresponds to a specific learning outcome) directly onto the electronic assignments. Color-coded pushpins provide a powerful visual diagnostic.
  • ...5 more annotations...
  • We do not have any contact with your students. Instructors retain full control of the process, from designing the assignments in the first place, to specifying learning outcomes and attaching weights to each outcome. Instructors also review the work of our TAs through a step called the Interim Check, which happens after 10% of the assignments have been completed. Faculty provide feedback, offer any further instructions and eventually sign-off on the work done, before our TAs continue with the remainder of the assignments
  • Finally, upon the request of the instructor, the weights he/she specified for the learning outcomes are applied to the rubric-based scores to generate a composite score for each student assignment
  • As an added bonus, our Virtual-TAs provide a detailed, summative report for the instructor on the overall class performance on the given assignment, which includes a look at how the class fared on each outcome, where the students did well, where they stumbled and what concepts, if any, need reinforcing in class the following week.
  • We can also, upon request, generate reports by Student Learning Outcomes (SLOs). This report can be used by the instructor to immediately address gaps in learning at the individual or classroom level.
  • Think of this as a micro-closing-of-the-loop that happens each week. Contrast this with the broader closing of the loop that accompanies program-level assessment of learning, which might happen at the end of a whole academic year or later!
  •  
    I went to Virtual TA and highlighted their language describing how it works.
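The Virtual-TA description above mentions rubric scores per learning outcome, instructor-specified weights, a composite score per assignment, and a per-SLO class report, but it does not spell out the arithmetic. The following is a minimal sketch, assuming a simple weighted average over per-outcome rubric scores; the outcome names, weights, and scores are hypothetical illustrations, not Virtual-TA's actual data or implementation.

```python
# Sketch only (not Virtual-TA's code): combine per-outcome rubric scores into a
# weighted composite per assignment, and summarize class performance by outcome,
# as the description above suggests.

# Hypothetical weights an instructor might assign to each learning outcome.
WEIGHTS = {"claims_with_evidence": 0.5, "organization": 0.3, "mechanics": 0.2}

def composite_score(rubric_scores: dict[str, float]) -> float:
    """Weighted average of rubric scores (all assumed to share one 0-4 scale)."""
    total_weight = sum(WEIGHTS.values())
    return sum(WEIGHTS[o] * rubric_scores[o] for o in WEIGHTS) / total_weight

def class_summary(all_scores: list[dict[str, float]]) -> dict[str, float]:
    """Mean score per learning outcome across the class -- the 'per-SLO report'."""
    return {o: sum(s[o] for s in all_scores) / len(all_scores) for o in WEIGHTS}

if __name__ == "__main__":
    scores = [
        {"claims_with_evidence": 3.0, "organization": 2.5, "mechanics": 4.0},
        {"claims_with_evidence": 2.0, "organization": 3.0, "mechanics": 3.5},
    ]
    for s in scores:
        print(round(composite_score(s), 2))   # composite per assignment
    print(class_summary(scores))               # class-level view by outcome
```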
Gary Brown

Evaluations That Make the Grade: 4 Ways to Improve Rating the Faculty - Teaching - The ... - 1 views

  • For students, the act of filling out those forms is sometimes a fleeting, half-conscious moment. But for instructors whose careers can live and die by student evaluations, getting back the forms is an hour of high anxiety
  • "They have destroyed higher education." Mr. Crumbley believes the forms lead inexorably to grade inflation and the dumbing down of the curriculum.
  • Texas enacted a law that will require every public college to post each faculty member's student-evaluation scores on a public Web site.
  • ...10 more annotations...
  • The IDEA Center, an education research group based at Kansas State University, has been spreading its particular course-evaluation gospel since 1975. The central innovation of the IDEA system is that departments can tailor their evaluation forms to emphasize whichever learning objectives are most important in their discipline.
  • (Roughly 350 colleges use the IDEA Center's system, though in some cases only a single department or academic unit participates.)
  • The new North Texas instrument that came from these efforts tries to correct for biases that are beyond an instructor's control. The questionnaire asks students, for example, whether the classroom had an appropriate size and layout for the course. If students were unhappy with the classroom, and if it appears that their unhappiness inappropriately colored their evaluations of the instructor, the system can adjust the instructor's scores accordingly.
  • Elaine Seymour, who was then director of ethnography and evaluation research at the University of Colorado at Boulder, was assisting with a National Science Foundation project to improve the quality of science instruction at the college level. She found that many instructors were reluctant to try new teaching techniques because they feared their course-evaluation ratings might decline.
  • "So the ability to do some quantitative analysis of these comments really allows you to take a more nuanced and effective look at what these students are really saying."
  • Mr. Frick and his colleagues found that his new course-evaluation form was strongly correlated with both students' and instructors' own measures of how well the students had mastered each course's learning goals.
  • The survey instrument, known as SALG, for Student Assessment of their Learning Gains, is now used by instructors across the country. The project's Web site contains more than 900 templates, mostly for courses in the sciences.
  • "Students are the inventory," Mr. Crumbley says. "The real stakeholders in higher education are employers, society, the people who hire our graduates. But what we do is ask the inventory if a professor is good or bad. At General Motors," he says, "you don't ask the cars which factory workers are good at their jobs. You check the cars for defects, you ask the drivers, and that's how you know how the workers are doing."
  • William H. Pallett, president of the IDEA Center, says that when course rating surveys are well-designed and instructors make clear that they care about them, students will answer honestly and thoughtfully.
  • In Mr. Bain's view, student evaluations should be just one of several tools colleges use to assess teaching. Peers should regularly visit one another's classrooms, he argues. And professors should develop "teaching portfolios" that demonstrate their ability to do the kinds of instruction that are most important in their particular disciplines. "It's kind of ironic that we grab onto something that seems fixed and fast and absolute, rather than something that seems a little bit messy," he says. "Making decisions about the ability of someone to cultivate someone else's learning is inherently a messy process. It can't be reduced to a formula."
  •  
    Old friends at the Idea Center, and an old but persistent issue.
Joshua Yeidel

Higher Education: Assessment & Process Improvement Group News | LinkedIn - 0 views

  •  
    High School Principal George Wood eloquently contrasts standardized NCLB-style testing and his school's term-end performance testing.
Corinna Lo

Blackboard Outcomes Assessment Webcast - Moving Beyond Accreditation: Using Institution... - 0 views

  •  
    The first 12 minutes of the webcast are worth watching. He opened with the story of the investigation of a cholera outbreak in Victorian London, and connected it to student success. He then summarized the key methods of measurement and some lessons learned: an "interdisciplinary" approach led to unconventional yet innovative methods of investigation; the researchers relied on multiple forms of measurement to reach their conclusion; and the visualization of their data was important to proving their case to others.
Gary Brown

The Wired Campus - Duke Professor Uses 'Crowdsourcing' to Grade - The Chronicle of High... - 0 views

  • Learning is more than earning an A, says Cathy N. Davidson, the professor, who recently returned to teach English and interdisciplinary studies after eight years in administration. But students don't always see it that way. Vying for an A by trying to figure out what a professor wants or through the least amount of work has made the traditional grading scale superficial, she says. "You've got this real mismatch between the kind of participatory learning that’s happening online and outside of the classroom, and the top-down, hierarchical learning and rigid assessment schemes that we’re using in the classroom from grades K through 12 and all the way up to graduate school," Ms. Davidson says. "In school systems today, we’re putting more and more emphasis on quantitative assessment in an era when, out of the classroom, students are learning through an entirely different way of collaboration, customizing, and interacting."
  •  
    We need to contact Cathy Davidson and work together on this.
Theron DesRosier

Revolution in the Classroom - The Atlantic (August 12, 2009) - 0 views

  •  
    An article in The Atlantic today by Clayton Christensen discusses "Revolution in the Classroom." In a paragraph on data collection he says the following: "Creating effective methods for measuring student progress is crucial to ensuring that material is actually being learned. And implementing such assessments using an online system could be incredibly potent: rather than simply testing students all at once at the end of an instructional module, this would allow continuous verification of subject mastery as instruction was still underway. Teachers would be able to receive constant feedback about progress or the lack thereof and then make informed decisions about the best learning path for each student. Thus, individual students could spend more or less time, as needed, on certain modules. And as long as the end result - mastery - was the same for all, the process and time allotted for achieving it need not be uniform." The "module" focus is a little disturbing, but the rest is helpful.
Peggy Collins

The End in Mind » Assessment as a Social Activity - 0 views

  •  
    Jon Mott blog post on the harvesting gradebook, with our video.
Gary Brown

On Hiring - Redefining Faculty Roles - The Chronicle of Higher Education - 0 views

  • Faculty duties and expectations have diversified and become more complex, but there clearly has not been a concomitant change in the traditional expectations for faculty performance. To take one example: at many institutions, assessment programs have added substantial burdens to faculty members, who must both plan and execute them. I suspect, though I do not know, that such additional burdens are heavier at teaching-oriented colleges and universities that also have higher standard teaching loads than more research-oriented institutions. There's also increased pressure on faculty members to involve undergraduate students in research, an initiative that takes various forms at various institutions but that is prevalent across institutional types.
  •  
    Laments the increased burden involved in changing faculty roles, but misses the implications of SoTL and its synergies. It is not more work but different work; communicating that vantage is our challenge.
Nils Peterson

Accreditor for Teaching Programs Puts New Emphasis on Research and Real Life - Chronicl... - 0 views

  • “Learning these aspects of teaching in a contrived setting just isn’t doing the job.” Future teachers should be receiving this instruction and guidance from mentors who are working
    • Nils Peterson
       
      A call for learning in community -- what is missing is any discussion of how to harvest feedback. This would be a classic case for posting a lesson plan, its assessment, and its products, and asking teachers, peers, and parents to assess and comment.
Nils Peterson

Assessment and Teaching of 21st Century Skills ~ Stephen's Web ~ by Stephen Downes - 0 views

  • While people in contemporary business work with others and use subject knowledge and a variety of technological tools and resources to analyze and solve complex, ill-structured problems or to create products for authentic audiences
    • Nils Peterson
       
      another quote in the report: "The study found that as ICT is taken up by a firm, computers substitute for workers who perform routine physical and cognitive tasks but they complement workers who perform non-routine problem solving tasks."
  •  
    Item Gary emailed around
Nils Peterson

Edge: THE IMPENDING DEMISE OF THE UNIVERSITY By Don Tapscott - 1 views

  • For those of us like me who have been working on the Internet for years, it was very clear you couldn't encounter free software and you couldn't encounter Wikipedia and you couldn't encounter all of the wealth of cultural materials that people create and exchange, and the valuable actual software that people create, without an understanding that something much more complex is happening than the dominant ideology of the last 40 years or so. But you could if you weren't looking there, because we were used in the industrial system to think in these terms.
    • Nils Peterson
       
      Hard to read because of the double negatives. He's saying there is lots of evidence of a new model in operation
  • It's a model that is teacher-focused, one-way, one-size-fits-all and the student is isolated in the learning process. Yet the students, who have grown up in an interactive digital world, learn differently. Schooled on Google and Wikipedia, they want to inquire, not rely on the professor for a detailed roadmap. They want an animated conversation, not a lecture. They want an interactive education, not a broadcast one
    • Nils Peterson
       
      and it has implications for assessment and vehicles for assessment (portfolios)
  •  
    "In the industrial model of student mass production, the teacher is the broadcaster. A broadcast is by definition the transmission of information from transmitter to receiver in a one-way, linear fashion. The teacher is the transmitter and student is a receptor in the learning process. The formula goes like this: "I'm a professor and I have knowledge. You're a student, you're an empty vessel and you don't. Get ready, here it comes. Your goal is to take this data into your short-term memory and through practice and repetition build deeper cognitive structures so you can recall it to me when I test you."... The definition of a lecture has become the process in which the notes of the teacher go to the notes of the student without going through the brains of either. "
Nils Peterson

Washington State's Dilemma: How to Serve Up a Book Criticizing the Food Industry - Chro... - 0 views

  •  
    The last paragraph is a good one, and it is the core question I think we have been working on with the harvesting gradebook: make not only the student work but also the assignment and the assessment criteria open as community property. Why not have the community involved in the conversation about what is important to study?
Nils Peterson

Stimulus Spot Check | ProPublica: Stimulus Chase - 0 views

  • Below is a random sample we assembled of 520 of the 5,800 stimulus-funded transportation projects nationwide, showing how much money to date the federal Department of Transportation has disbursed to individual transportation projects nationwide. We're asking you to help us figure out the status of these projects — whether the project has been started or has been completed, what company got the contract, and how many jobs the company says it retained or created for its stimulus contract.
    • Nils Peterson
       
      A different approach to harvesting. In this case, the audit is being commissioned by a third party, and the auditors are the community. The assessment criteria are simple (another assessment should come from state & local inspectors). The interesting data are the presence or status of the projects compared to what is claimed by the funder.