Change Magazine - The New Guys in Assessment Town
-
if one of the institution’s general education goals is critical thinking, the system makes it possible to call up all the courses and programs that assess student performance on that outcome.
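The lookup described here amounts to an inverted index from outcomes to the courses that assess them. A minimal sketch of that idea in Python, assuming illustrative names and data (none of this is eLumen's actual API):

```python
# Hypothetical sketch: an outcome-to-courses lookup like the one described.
# Course names and outcome labels are illustrative assumptions.
from collections import defaultdict

# Each course declares which institutional outcomes it assesses.
courses = {
    "ENG 101": ["critical thinking", "written communication"],
    "PHIL 210": ["critical thinking"],
    "BIO 150": ["quantitative reasoning"],
}

# Build the inverted index: outcome -> courses that assess it.
outcome_map = defaultdict(list)
for course, outcomes in courses.items():
    for outcome in outcomes:
        outcome_map[outcome].append(course)

print(sorted(outcome_map["critical thinking"]))  # -> ['ENG 101', 'PHIL 210']
```

With such an index in place, "call up all the courses that assess critical thinking" is a single dictionary lookup rather than a search through annual reports.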
-
bringing together student learning outcomes data at the level of the institution, program, course, and throughout student support services so that “the data flows between and among these levels”
-
Like its competitors, eLumen maps outcomes vertically across courses and programs, but its distinctiveness lies in its capacity to capture what goes on in the classroom. Student names are entered into the system, and faculty use a rubric-like template to record assessment results for every student on every goal. The result is a running record for each student available only to the course instructor (and in some cases to the students themselves, who can go to the system to get feedback on recent assessments).
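The "running record" described above is, at its core, a per-student log of rubric scores keyed by outcome. A minimal sketch under assumed names (this is an illustration, not eLumen's actual data model):

```python
# Hypothetical sketch of a per-student running record: one rubric score
# per student per outcome per assessment. All names are illustrative.
from dataclasses import dataclass, field

@dataclass
class StudentRecord:
    name: str
    # outcome -> list of rubric scores recorded over the term
    scores: dict = field(default_factory=dict)

    def record(self, outcome: str, score: int) -> None:
        """Append a new rubric score for the given outcome."""
        self.scores.setdefault(outcome, []).append(score)

    def history(self, outcome: str) -> list:
        """The running record an instructor (or the student) could review."""
        return self.scores.get(outcome, [])

student = StudentRecord("A. Lee")
student.record("critical thinking", 3)
student.record("critical thinking", 4)
print(student.history("critical thinking"))  # -> [3, 4]
```

The data-entry burden the annotators worry about below comes from exactly this step: every `record` call is a box filled in by a faculty member, on top of the grading they already do.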
-
“I’m a little wary. It seems as if, in addition to the assessment feedback we are already giving to students, we might soon be asked to add a data-entry step of filling in boxes in a centralized database for all the student learning outcomes. This is worrisome to those of us already struggling under the weight of all that commenting and essay grading.”
-
It's either double work, or a failure to see that grading and assessment can be the same activity. I suspect the former -- grading is being done with different metrics.
-
I am in the unusual position of seeing many papers _after_ they have been graded by a wide variety of teachers. Many of these contain little "assessment feedback" -- many teachers focus on "correcting" the papers and finding some letter or number to assign as a value.
-
“This is where we see many institutions struggling,” Galvin says. “Faculty simply don’t have the time for a deeper involvement in the mechanics of assessment.” Many have never seen a rubric or worked with one, “so generating accurate, objective data for analysis is a challenge.”
-
I asked about faculty pushback. “Not so much,” Galvin says, “not after faculty understand that the process is not intended to evaluate their work.”
-
the annual reports required by this process were producing “heaps of paper” while failing to track trends and developments over time. “It’s like our departments were starting anew every year,” Chaplot says. “We wanted to find a way to house the data that gave us access to what was done in the past,” which meant moving from discrete paper reports to an electronic database.
-
“Can eLumen represent student learning in language? No, but it can quantify the number of boxes checked against the number of boxes not checked.”
-
developing a national repository of resources, rubrics, outcomes statements, and the like that can be reviewed and downloaded by users
-
“These solutions cement the idea that assessment is an administrative rather than an educational enterprise, focused largely on accountability. They increasingly remove assessment decision making from the everyday rhythm of teaching and learning and the realm of the faculty.