CTLT and Friends: Group items matching "faculty" in title, tags, annotations or url

Joshua Yeidel

Blog U.: The Challenge of Value-Added - Digital Tweed - Inside Higher Ed - 0 views

  •  
    Quoting a 1984 study, "higher education should ensure that the mounds of data already collected on students are converted into useful information and fed back [to campus officials and faculty] in ways that enhance student learning and lead to improvement in programs, teaching practices, and the environment in which teaching and learning take place." The example given is an analysis of test scores in the Los Angeles Unified School District by the LA Times.
  •  
    It's going to take some assessment (and political) smarts to deflect the notion that existing data can be re-purposed easily to assess "value-added".
Gary Brown

A Critic Sees Deep Problems in the Doctoral Rankings - Faculty - The Chronicle of Higher Education - 1 views

  • This week he posted a public critique of the NRC study on his university's Web site.
  • "Little credence should be given" to the NRC's ranges of rankings.
  • "There's not very much real information about quality in the simple measures they've got."
  • The NRC project's directors say that those small samples are not a problem, because the reputational scores were not converted directly into program assessments. Instead, the scores were used to develop a profile of the kinds of traits that faculty members value in doctoral programs in their field.
  • For one thing, Mr. Stigler says, the relationships between programs' reputations and the various program traits are probably not simple and linear.
  • if these correlations between reputation and citations were plotted on a graph, the most accurate representation would be a curved line, not a straight line. (The curve would occur at the tipping point where high citation levels make reputations go sky-high.)
  • Mr. Stigler says that it was a mistake for the NRC to so thoroughly abandon the reputational measures it used in its previous doctoral studies, in 1982 and 1995. Reputational surveys are widely criticized, he says, but they do provide a check on certain kinds of qualitative measures.
  •  
    What is not challenged is the validity and utility of the construct itself--reputation rankings.
Judy Rumph

Views: Why Are We Assessing? - Inside Higher Ed - 1 views

  • Amid all this progress, however, we seem to have lost our way. Too many of us have focused on the route we’re traveling: whether assessment should be value-added; the improvement versus accountability debate; entering assessment data into a database; pulling together a report for an accreditor. We’ve been so focused on the details of our route that we’ve lost sight of our destination.
  • Our destination, which is what we should be focusing on, is the purpose of assessment. Over the last decades, we've consistently talked about two purposes of assessment: improvement and accountability. The thinking has been that improvement means using assessment to identify problems — things that need improvement — while accountability means using assessment to show that we're already doing a great job and need no improvement. A great deal has been written about the need to reconcile these two seemingly disparate purposes.
  • The most important purpose of assessment should be not improvement or accountability but their common aim: everyone wants students to get the best possible education
  • Our second common purpose of assessment should be making sure not only that students learn what’s important, but that their learning is of appropriate scope, depth, and rigor.
  • Third, we need to accept how good we already are, so we can recognize success when we see it.
  • And we haven’t figured out a way to tell the story of our effectiveness in 25 words or less, which is what busy people want and need.
  • Because we're not telling the stories of our successful outcomes in simple, understandable terms, the public continues to define quality using the outdated concept of inputs like faculty credentials, student aptitude, and institutional wealth — things that by themselves don’t say a whole lot about student learning.
  • And people like to invest in success. Because the public doesn't know how good we are at helping students learn, it doesn't yet give us all the support we need in our quest to give our students the best possible education.
  • But while virtually every college and university has had to make draconian budget cuts in the last couple of years, with more to come, I wonder how many are using solid, systematic evidence — including assessment evidence — to inform those decisions.
  • Now is the time to move our focus from the road we are traveling to our destination: a point at which we all are prudent, informed stewards of our resources… a point at which we each have clear, appropriate, justifiable, and externally-informed standards for student learning. Most importantly, now is the time to move our focus from assessment to learning, and to keeping our promises. Only then can we make higher education as great as it needs to be.
  •  
    Yes, this article resonated with me too. Especially connecting assessment to teaching and learning. The most important purpose of assessment should be not improvement or accountability but their common aim: everyone wants students to get the best possible education.... today we seem to be devoting more time, money, thought, and effort to assessment than to helping faculty help students learn as effectively as possible. When our colleagues have disappointing assessment results, and they don't know what to do to improve them, I wonder how many have been made aware that, in some respects, we are living in a golden age of higher education, coming off a quarter-century of solid research on practices that promote deep, lasting learning. I wonder how many are pointed to the many excellent resources we now have on good teaching practices, including books, journals, conferences and, increasingly, teaching-learning centers right on campus. I wonder how many of the graduate programs they attended include the study and practice of contemporary research on effective higher education pedagogies. No wonder so many of us are struggling to make sense of our assessment results! Too many of us are separating work on assessment from work on improving teaching and learning, when they should be two sides of the same coin. We need to bring our work on teaching, learning, and assessment together.
Gary Brown

Cheating Scandal Snares Hundreds in U. of Central Florida Course - The Ticker - The Chronicle of Higher Education - 1 views

  • evidence of widespread cheating
  • business course on strategic management,
  • I don’t condone cheating. But I think it is equally pathetic that faculty are put in situations where they feel the only option for an examination is an easy to grade multiple choice or true/false test
  • Faculty all need to wake up, as virtually all test banks, and also all instructor’s manuals with homework answers, are widely available on the internet.
  • I think we need to question why a class has 600 students enrolled.
  • Perhaps they are the ones being cheated.
Gary Brown

A Final Word on the Presidents' Student-Learning Alliance - Measuring Stick - The Chronicle of Higher Education - 1 views

  • I was very pleased to see the responses to the announcement of the Presidents’ Alliance as generally welcoming (“commendable,” “laudatory initiative,” “applaud”) the shared commitment of these 71 founding institutions to do more—and do it publicly and cooperatively—with regard to gathering, reporting, and using evidence of student learning.
  • establishing institutional indicators of educational progress that could be valuable in increasing transparency may not suggest what needs changing to improve results
  • As Adelman’s implied critique of the CLA indicates, we may end up with an indicator without connections to practice.
  • The Presidents’ Alliance’s focus on and encouragement of institutional efforts is important to making these connections and steps in a direct way supporting improvement.
  • Second, it is hard to disagree with the notion that ultimately evidence-based improvement will occur only if faculty members are appropriately trained and encouraged to improve their classroom work with undergraduates.
  • Certainly there has to be some connection between and among various levels of assessment—classroom, program, department, and institution—in order to have evidence that serves both to aid improvement and to provide transparency and accountability.
  • Presidents’ Alliance is setting forth a common framework of “critical dimensions” that institutions can use to evaluate and extend their own efforts, efforts that would include better reporting for transparency and accountability and greater involvement of faculty.
  • there is wide variation in where institutions are in their efforts, and we have a long way to go. But what is critical here is the public commitment of these institutions to work on their campuses and together to improve the gathering and reporting of evidence of student learning and, in turn, using evidence to improve outcomes.
  • The involvement of institutions of all types will make it possible to build a more coherent and cohesive professional community in which evidence-based improvement of student learning is tangible, visible, and ongoing.
Nils Peterson

From Knowledgable to Knowledge-able: Learning in New Media Environments | Academic Commons - 0 views

  • Many faculty may hope to subvert the system, but a variety of social structures work against them. Radical experiments in teaching carry no guarantees and even fewer rewards in most tenure and promotion systems, even if they are successful. In many cases faculty are required to assess their students in a standardized way to fulfill requirements for the curriculum. Nothing is easier to assess than information recall on multiple-choice exams, and the concise and “objective” numbers satisfy committee members busy with their own teaching and research.
    • Nils Peterson
       
      Do we think this is true? Many?
  • In a world of nearly infinite information, we must first address why, facilitate how, and let the what generate naturally from there.
  •  
    "Most university classrooms have gone through a massive transformation in the past ten years. I'm not talking about the numerous initiatives for multiple plasma screens, moveable chairs, round tables, or digital whiteboards. The change is visually more subtle, yet potentially much more transformative."
  •  
    Connect this to the 10 point self assessment we did for AACU comparing institutional vs community-based learning https://teamsite.oue.wsu.edu/ctlt/home/Anonymous%20Access%20Documents/AACU%202009/inst%20vs%20comm%20based%20spectrum.pdf
S Spaeth

YouTube - Networked Student - 0 views

  •  
    The Networked Student was inspired by CCK08, a Connectivism course offered by George Siemens and Stephen Downes during fall 2008. It depicts an actual project completed by Wendy Drexler's high school students. The Networked Student concept map was inspired by Alec Couros' Networked Teacher. I hope that teachers will use it to help their colleagues, parents, and students understand networked learning in the 21st century. Anyone is free to use this video for educational purposes. You may download, translate, or use as part of another presentation. Please share.
  •  
    This video should be required viewing for incoming faculty. Especially the end of the video; it gives a good description of the new roles faculty can take when they leave the lecture stand. Thanks Stephen
Nils Peterson

One small step for man » Blog Archive » Advice to a Web 2.0 Learner - 0 views

  •  
    Written with an eye to advising a bright student who is home schooled, but also to capture my advice and strategy for Palouse Prairie. Since I see we are starting to develop a 'blogging' thread in this Diigo group, and such a tool could be part of a strategy for what to tell faculty, I decided to bookmark this into that stream.
Gary Brown

Wired Campus - The Chronicle of Higher Education - 0 views

  • colleges and universities can learn from for-profit colleges' approach to teaching.
  • "If disruptive technology allows them to serve new markets, or serve markets more efficiently and effectively in order to profit, then they are more likely to utilize them."
  • Some for-profit institutions emphasize instructor training in a way that more traditional institutions should emulate, according to the report. The University of Phoenix, for example, "has required faculty to participate in a four-week training program that includes adult learning theory," the report said.
  • The committee's largest sponsors include GE, Merrill Lynch and Company, IBM, McKinsey and Company, General Motors, and Pfizer.
  •  
    Minimally the advocates list suggests that higher ed might qualify for a bail out.
Gary Brown

Capella University to Receive 2010 CHEA Award - 2 views

  • The Council for Higher Education Accreditation, a national advocate and institutional voice for self-regulation of academic quality through accreditation, has awarded the 2010 CHEA Award for Outstanding Institutional Practice in Student Learning Outcomes to Capella University (MN), one of four institutions that will receive the award in 2010. Capella University is the first online university to receive the award.
  • Capella University’s faculty have developed an outcomes-based curricular model
  • “Capella University is a leader in accountability in higher education. Their work in student learning outcomes exemplifies the progress that institutions are making through the implementation of comprehensive, relevant and effective initiatives,” said CHEA President Judith Eaton. “We are pleased to recognize this institution with the CHEA Award.”
  • our award criteria: 1) articulation and evidence of outcomes; 2) success with regard to outcomes; 3) information to the public about outcomes; and 4) use of outcomes for educational improvement.
  • In addition to Capella University, Portland State University (OR), St. Olaf College (MN) and the University of Arkansas - Fort Smith (AR) also will receive the 2010 CHEA Award. The award will be presented at the 2010 CHEA Annual Conference, which will be held January 25-28 in Washington, D.C
  •  
    Capella has a mandatory faculty training program, and then they select from the training program those who will teach. Candidates also pay their own tuition for the "try-out" or training.
Theron DesRosier

Virtual-TA - 2 views

  • We also developed a technology platform that allows our TAs to electronically insert detailed, actionable feedback directly into student assignments
  • Your instructors give us the schedule of assignments, when student assignments are due, when we might expect to receive them electronically, when the scored assignments will be returned, the learning outcomes on which to score the assignments, the rubrics to be used and the weights to be applied to different learning outcomes. We can use your rubrics to score assignments or design rubrics for sign-off by your faculty members.
  • review and embed feedback using color-coded pushpins (each color corresponds to a specific learning outcome) directly onto the electronic assignments. Color-coded pushpins provide a powerful visual diagnostic.
  • We do not have any contact with your students. Instructors retain full control of the process, from designing the assignments in the first place, to specifying learning outcomes and attaching weights to each outcome. Instructors also review the work of our TAs through a step called the Interim Check, which happens after 10% of the assignments have been completed. Faculty provide feedback, offer any further instructions and eventually sign-off on the work done, before our TAs continue with the remainder of the assignments
  • Finally, upon the request of the instructor, the weights he/she specified for the learning outcomes will be applied to the rubric-based scores, which are used to generate a composite score for each student assignment
  • As an added bonus, our Virtual-TAs provide a detailed, summative report for the instructor on the overall class performance on the given assignment, which includes a look at how the class fared on each outcome, where the students did well, where they stumbled and what concepts, if any, need reinforcing in class the following week.
  • We can also, upon request, generate reports by Student Learning Outcomes (SLOs). This report can be used by the instructor to immediately address gaps in learning at the individual or classroom level.
  • Think of this as a micro-closing-of-the-loop that happens each week.  Contrast this with the broader, closing-the-loop that accompanies program-level assessment of learning, which might happen at the end of a whole academic year or later!
  •  
    I went to Virtual-TA and highlighted their language describing how it works; a sketch of the weighted composite scoring they describe follows below.
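
A minimal sketch of the weighted scoring Virtual-TA describes above, not their actual implementation: the outcome names, weights, and rubric scores are hypothetical, and the service does not publish its internals. It shows how instructor-specified weights per learning outcome could be combined with rubric scores into a per-assignment composite, plus the per-outcome class summary mentioned in the annotations.

from statistics import mean

# Instructor-specified weights per learning outcome (hypothetical; they sum to 1.0 here).
weights = {"analysis": 0.5, "evidence": 0.3, "writing": 0.2}

# Rubric-based scores (0-4 scale) recorded for each student's assignment (hypothetical).
rubric_scores = {
    "student_a": {"analysis": 3, "evidence": 4, "writing": 2},
    "student_b": {"analysis": 2, "evidence": 3, "writing": 4},
}

def composite(scores):
    """Weighted composite score for a single assignment."""
    return sum(weights[outcome] * score for outcome, score in scores.items())

# Per-student composite scores.
for student, scores in rubric_scores.items():
    print(student, round(composite(scores), 2))

# Per-outcome class averages: the kind of summary report an instructor could
# use to see which outcomes need reinforcing the following week.
for outcome in weights:
    print(outcome, round(mean(s[outcome] for s in rubric_scores.values()), 2))
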
Joshua Yeidel

Digication e-Portfolios: Highered - Assessment - 0 views

  •  
    "Our web-based assessment solution for tracking, comparing, and reporting on student progress and performance gives faculty and administrators the tools they need to assess a class, department, or institution based on your standards, goals, or objectives. The Digication AMS integrates tightly with our award winning e-Portfolio system, enabling students to record and showcase learning outcomes within customizable, media friendly templates."
  •  
    Could this start out with program portfolios, and grow to include student work?
Gary Brown

Evaluations That Make the Grade: 4 Ways to Improve Rating the Faculty - Teaching - The Chronicle of Higher Education - 1 views

  • For students, the act of filling out those forms is sometimes a fleeting, half-conscious moment. But for instructors whose careers can live and die by student evaluations, getting back the forms is an hour of high anxiety
  • "They have destroyed higher education." Mr. Crumbley believes the forms lead inexorably to grade inflation and the dumbing down of the curriculum.
  • Texas enacted a law that will require every public college to post each faculty member's student-evaluation scores on a public Web site.
  • The IDEA Center, an education research group based at Kansas State University, has been spreading its particular course-evaluation gospel since 1975. The central innovation of the IDEA system is that departments can tailor their evaluation forms to emphasize whichever learning objectives are most important in their discipline.
  • (Roughly 350 colleges use the IDEA Center's system, though in some cases only a single department or academic unit participates.)
  • The new North Texas instrument that came from these efforts tries to correct for biases that are beyond an instructor's control. The questionnaire asks students, for example, whether the classroom had an appropriate size and layout for the course. If students were unhappy with the classroom, and if it appears that their unhappiness inappropriately colored their evaluations of the instructor, the system can adjust the instructor's scores accordingly.
  • Elaine Seymour, who was then director of ethnography and evaluation research at the University of Colorado at Boulder, was assisting with a National Science Foundation project to improve the quality of science instruction at the college level. She found that many instructors were reluctant to try new teaching techniques because they feared their course-evaluation ratings might decline.
  • "So the ability to do some quantitative analysis of these comments really allows you to take a more nuanced and effective look at what these students are really saying."
  • Mr. Frick and his colleagues found that his new course-evaluation form was strongly correlated with both students' and instructors' own measures of how well the students had mastered each course's learning goals.
  • The survey instrument, known as SALG, for Student Assessment of their Learning Gains, is now used by instructors across the country. The project's Web site contains more than 900 templates, mostly for courses in the sciences.
  • "Students are the inventory," Mr. Crumbley says. "The real stakeholders in higher education are employers, society, the people who hire our graduates. But what we do is ask the inventory if a professor is good or bad. At General Motors," he says, "you don't ask the cars which factory workers are good at their jobs. You check the cars for defects, you ask the drivers, and that's how you know how the workers are doing."
  • William H. Pallett, president of the IDEA Center, says that when course rating surveys are well-designed and instructors make clear that they care about them, students will answer honestly and thoughtfully.
  • In Mr. Bain's view, student evaluations should be just one of several tools colleges use to assess teaching. Peers should regularly visit one another's classrooms, he argues. And professors should develop "teaching portfolios" that demonstrate their ability to do the kinds of instruction that are most important in their particular disciplines. "It's kind of ironic that we grab onto something that seems fixed and fast and absolute, rather than something that seems a little bit messy," he says. "Making decisions about the ability of someone to cultivate someone else's learning is inherently a messy process. It can't be reduced to a formula."
  •  
    Old friends at the Idea Center, and an old but persistent issue.
Gary Brown

Texas Law Requires Professors to Post Details of Their Teaching Online - Faculty - The Chronicle of Higher Education - 1 views

  • Faculty members and administrators in Texas are speaking out about a recent state law that requires them to post specific, detailed information about their classroom assignments, curricula vitae, department budgets, and the results of student evaluations.
  • Beginning this fall, universities will have to post online a syllabus for every undergraduate course, including major assignments and examinations, reading lists, and course descriptions.
  • All of the information must be no more than three clicks away from the college's home page.
  • "the worst example of government meddling at a huge cost to the public and for zero public good that I have ever seen."
  • "You get the feeling that the government sees us as slackers," she says. By requiring professors to list every assignment, she says the law interferes with her ability to respond to students' interests and current events and shift to different topics during the semester.
  •  
    another to watch--the politicization of the, well, everything
Theron DesRosier

How Group Dynamics May Be Killing Innovation - Knowledge@Wharton - 5 views

  • Christian Terwiesch and Karl Ulrich argue that group dynamics are the enemy of businesses trying to develop one-of-a-kind new products, unique ways to save money or distinctive marketing strategies.
  • Terwiesch, Ulrich and co-author Karan Girotra, a professor of technology and operations management at INSEAD, found that a hybrid process -- in which people are given time to brainstorm on their own before discussing ideas with their peers -- resulted in more and better quality ideas than a purely team-oriented process.
    • Theron DesRosier
       
      This happens naturally when collaboration is asynchronous.
    • Theron DesRosier
       
      They use the term "team oriented process" but what they mean, I think, is a synchronous, face to face, brainstorming session.
  • Although several existing experimental studies criticize the team brainstorming process due to the interference of group dynamics, the Wharton researchers believe their work stands out due to a focus on the quality, in addition to the number, of ideas generated by the different processes -- in particular, the quality of the best idea.
  • "The evaluation part is critical. No matter which process we used, whether it was the [team] or hybrid model, they all did significantly worse than we hoped [in the evaluation stage]," Terwiesch says. "It's no good generating a great idea if you don't recognize the idea as great. It's like me sitting here and saying I had the idea for Amazon. If I had the idea but didn't do anything about it, then it really doesn't matter that I had the idea."
  • He says an online system that creates a virtual "suggestion box" can accomplish the same goal as long as it is established to achieve a particular purpose.
  • Imposing structure doesn't replace or stifle the creativity of employees, Ulrich adds. In fact, the goal is to establish an idea generation process that helps to bring out the best in people. "We have found that, in the early phases of idea generation, providing very specific process guideposts for individuals [such as] 'Generate at least 10 ideas and submit them by Wednesday,' ensures that all members of a team contribute and that they devote sufficient creative energy to the problem."
  • The results of the experiment with the students showed that average quality of the ideas generated by the hybrid process were better than those that came from the team process by the equivalent of roughly 30 percentage points.
  • in about three times more ideas than the traditional method.
  • "We find huge differences in people's levels of creativity, and we just have to face it. We're not all good singers and we're not all good runners, so why should we expect that we all are good idea generators?
  • They found that ideas built around other ideas are not statistically better than any random suggestion.
  • "In innovation, variance is your friend. You want wacky stuff because you can afford to reject it if you don't like it. If you build on group norms, the group kills variance."
  •  
    Not as radical as it first seems, but pertains to much of our work and the work of others.
Gary Brown

In Hunt for Prestige, Colleges May Undermine Their Public Mission - Government - The Chronicle of Higher Education - 1 views

  • many large research universities are placing too much priority on activities that raise the profile and prestige of their institutions but do little to improve undergraduate education.
  • "In some of these places, undergraduate education has never been a top priority," says Jane V. Wellman, executive director of the Delta Project on Postsecondary Education Costs, Productivity, and Accountability.
  • While its grants and gifts have gone up, the percentage of money it spends on core teaching and student services has gone down. Many students, of course, benefit from the private support and research dollars, as the university has built better facilities and attracted world-class faculty members.
  • But the research aspirations of many large universities are in conflict with their founding principles, Ms. Wellman says, especially as undergraduate admissions has become more selective
  • another result of the chase for research dollars is that measures for faculty assessment and promotion rely too heavily on the research output and publication and too little on the quality of classroom teaching.
  • "I'm not pushing for banning research," he says, "but there should be more flexibility and balance in the criteria."
  •  
    Nothing new, but affirmation of our perceptions.
Gary Brown

Assumptions about Setting the Right Classroom Climate - 0 views

  • September 2, 2009: "Assumptions about Setting the Right Classroom Climate," by Maryellen Weimer, in Effective Classroom Management. For quite some time now I’ve been interested in a widely held set of assumptions faculty make about the need to assert control at the beginning of a course. The argument goes something like this: When a course starts, the teacher needs to set the rules and clearly establish who’s in charge. If the course goes well, meaning students abide by the rules and do not challenge the teacher’s authority, then the teacher can gradually ease up and be a bit looser about the rules.
  • If all potential challenges to authority are headed off at the pass, then the teacher can devote full attention to the content, and isn’t that where the teacher’s expertise really shines? And so the classroom becomes a place that showcases teaching more than learning? My suspicion is that most teachers overreact to potential threats.
  •  
    Our friend Mary Ellen Weimer sets the stage for addressing a critical bottleneck to innovation, suggesting faculty insecurity/inexperience result in exerting authority.
Nils Peterson

What Intrigues Me About Google Wave - 0 views

  • The basic idea was to make a radically editable learning environment in which students as well as faculty members could rearrange content, functionality, and navigation in the learning environment.
    • Nils Peterson
       
      What fraction of faculty will be excited by radical editability? It's a paradigm shift.
    • Joshua Yeidel
       
      Also, what fraction of _students_ will be excited by radical editability? Will a readiness assessment be needed?
Theron DesRosier

BCCC Faculty Learning Community - Faculty Learning Community Blog - 0 views

  •  
    blogpost and link to test drive of HGB
Corinna Lo

The End in Mind » An Open (Institutional) Learning Network - 0 views

shared by Corinna Lo on 15 Apr 09
  •  
    Jon said "I wrote a post last year exploring the spider-starfish tension between Personal Learning Environments and institutionally run CMSs. This is a fundamental challenge that institutions of higher learning need to resolve. On the one hand, we should promote open, flexible, learner-centric activities and tools that support them. On the other hand, legal, ethical and business constraints prevent us from opening up student information systems, online assessment tools, and online gradebooks. These tools have to be secure and, at least from a data management and integration perspective, proprietary. So what would an open learning network look like if facilitated and orchestrated by an institution? Is it possible to create a hybrid spider-starfish learning environment for faculty and students?"