
Educational Analytics: Group items tagged "impact"


George Bradford

Seeking Evidence of Impact: Opportunities and Needs (EDUCAUSE Review) | EDUCAUSE

  • Conversations with CIOs and other senior IT administrators reveal a keen interest in the results of evaluation in teaching and learning to guide fiscal, policy, and strategic decision-making. Yet those same conversations reveal that this need is not being met.
  • gain a wider and shared understanding of “evidence” and “impact” in teaching and learning
  • establish a community of practice
  • provide professional-development opportunities
  • explore successful institutional and political contexts
  • establish evidence-based practice
  • The most important reason is that in the absence of data, anecdote can become the primary basis for decision-making. Rarely does that work out very well.
  • autocatalytic evaluation process—one that builds its own synergy.
  • We live by three principles: uncollected data cannot be analyzed; the numbers are helped by a brief and coherent summary; and good graphs beat tables every time.
  • Reports and testimonies from faculty and students (57%)
    Measures of student and faculty satisfaction (50%)
    Measures of student mastery (learning outcomes) (41%)
    Changes in faculty teaching practice (35%)
    Measures of student and faculty engagement (32%)
    (These figures are charted in the Python sketch after these annotations.)
  • The survey results also indicate a need for support in undertaking impact-evaluation projects.
  • Knowing where to begin to measure the impact of technology-based innovations in teaching and learning
    Knowing which measurement and evaluation techniques are most appropriate
    Knowing the most effective way to analyze evidence
  • The challenge of persuasion is what ELI has been calling the last mile problem. There are two interrelated components to this issue: (1) influencing faculty members to improve instructional practices at the course level, and (2) providing evidence to help inform key strategic decisions at the institutional level.
  • Broadly summarized, our results reveal a disparity between the keen interest in research-based evaluation and the level of resources that are dedicated to it—prompting a grass-roots effort to support this work.
  •  
    The SEI program is working with the teaching and learning community to gather evidence of the impact of instructional innovations and current practices and to help evaluate the results. The calls for more accountability in higher education, the shrinking budgets that often force larger class sizes, and the pressures to increase degree-completion rates are all raising the stakes for colleges and universities today, especially with respect to the instructional enterprise. As resources shrink, teaching and learning is becoming the key point of accountability. The evaluation of instructional practice would thus seem to be an obvious response to such pressures, with institutions implementing systematic programs of evaluation in teaching and learning, especially of instructional innovations.
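The article's own maxims quoted above, that uncollected data cannot be analyzed and that "good graphs beat tables every time", apply directly to the survey percentages listed in these annotations. Below is a minimal Python sketch of such a graph, assuming matplotlib is available; the percentages are the five evidence types quoted above, while the labels and layout choices are illustrative.

```python
# A minimal sketch, assuming matplotlib is installed. The percentages are
# the five evidence types reported in the survey excerpted above.
import matplotlib.pyplot as plt

evidence = {
    "Reports and testimonies from faculty and students": 57,
    "Measures of student and faculty satisfaction": 50,
    "Measures of student mastery (learning outcomes)": 41,
    "Changes in faculty teaching practice": 35,
    "Measures of student and faculty engagement": 32,
}

labels = list(evidence)
values = list(evidence.values())

fig, ax = plt.subplots(figsize=(8, 4))
ax.barh(labels, values)   # horizontal bars keep the long labels legible
ax.invert_yaxis()         # most-cited evidence type on top
ax.set_xlabel("Respondents citing this evidence type (%)")
ax.set_title("Evidence used in impact evaluations (survey results)")
fig.tight_layout()
plt.show()
```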
George Bradford

Open Research Online - Learning dispositions and transferable competencies: pedagogy, m...

  •  
    Theoretical and empirical evidence in the learning sciences substantiates the view that deep engagement in learning is a function of a complex combination of learners' identities, dispositions, values, attitudes and skills. When these are fragile, learners struggle to achieve their potential in conventional assessments, and critically, are not prepared for the novelty and complexity of the challenges they will meet in the workplace, and the many other spheres of life which require personal qualities such as resilience, critical thinking and collaboration skills. To date, the learning analytics research and development communities have not addressed how these complex concepts can be modelled and analysed, and how more traditional social science data analysis can support and be enhanced by learning analytics. We report progress in the design and implementation of learning analytics based on a research validated multidimensional construct termed "learning power". We describe, for the first time, a learning analytics infrastructure for gathering data at scale, managing stakeholder permissions, the range of analytics that it supports from real time summaries to exploratory research, and a particular visual analytic which has been shown to have demonstrable impact on learners. We conclude by summarising the ongoing research and development programme and identifying the challenges of integrating traditional social science research, with learning analytics and modelling.
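The "particular visual analytic" mentioned in the abstract is not reproduced in this excerpt. As a hedged illustration of how a multidimensional construct such as "learning power" might be rendered for an individual learner, here is a minimal Python radar-chart sketch; the dimension names and scores are hypothetical placeholders, not the instrument's actual scales.

```python
# A minimal sketch of a spider/radar chart for a multidimensional learner
# profile. Dimension names and scores below are illustrative assumptions,
# not values from the paper's "learning power" instrument.
import numpy as np
import matplotlib.pyplot as plt

dimensions = ["Curiosity", "Meaning making", "Creativity",
              "Resilience", "Strategic awareness", "Learning relationships"]
scores = [0.7, 0.55, 0.8, 0.4, 0.6, 0.65]   # hypothetical 0-1 scale

# Spread the dimensions around the circle, then close the polygon by
# repeating the first point.
angles = np.linspace(0, 2 * np.pi, len(dimensions), endpoint=False).tolist()
angles += angles[:1]
scores_closed = scores + scores[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(angles, scores_closed)
ax.fill(angles, scores_closed, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(dimensions)
ax.set_ylim(0, 1)
ax.set_title("Illustrative learner profile across 'learning power' dimensions")
plt.show()
```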
George Bradford

Times Higher Education - Satisfaction and its discontents

  •  
    Satisfaction and its discontents
    8 March 2012
    The National Student Survey puts pressure on lecturers to provide 'enhanced' experiences. But, argues Frank Furedi, the results do not measure educational quality and the process infantilises students and corrodes academic integrity.
    One of the striking features of a highly centralised system of higher education, such as that of the UK, is that the introduction of new targets and modifications to the quality assurance framework can have a dramatic impact in a very short space of time. When the National Student Survey was introduced in 2005, few colleagues imagined that, just several years down the road, finessing and managing its implementation would require the employment of an entirely new group of quality-assurance operatives. At the time, the NSS was seen by many as a relatively pointless public-relations exercise that would have only a minimal effect on academics' lives. It is unlikely that even its advocates would have expected the NSS to acquire a life of its own and become one of the most powerful influences on the form and nature of the work done in universities.
George Bradford

Learning Dispositions and Transferable Competencies: Pedagogy, Modelling, and Learning ...

  •  
    Simon Buckingham Shum and Ruth Deakin Crick, 2012 (in review)
    Theoretical and empirical evidence in the learning sciences substantiates the view that deep engagement in learning is a function of a combination of learners' dispositions, values, attitudes and skills. When these are fragile, learners struggle to achieve their potential in conventional assessments, and critically, are not prepared for the novelty and complexity of the challenges they will meet in the workplace, and the many other spheres of life which require personal qualities such as resilience, critical thinking and collaboration skills. To date, the learning analytics research and development communities have not addressed how these complex concepts can be modelled and analysed. We report progress in the design and implementation of learning analytics based on an empirically validated multidimensional construct termed "learning power". We describe a learning analytics infrastructure for gathering data at scale, managing stakeholder permissions, the range of analytics that it supports from real time summaries to exploratory research, and a particular visual analytic which has been shown to have demonstrable impact on learners. We conclude by summarising the ongoing research and development programme.
George Bradford

ScienceDirect - The Internet and Higher Education : A course is a course is a course: F...

  •  
    "Abstract The authors compared the underlying student response patterns to an end-of-course rating instrument for large student samples in online, blended and face-to-face courses. For each modality, the solution produced a single factor that accounted for approximately 70% of the variance. The correlations among the factors across the class formats showed that they were identical. The authors concluded that course modality does not impact the dimensionality by which students evaluate their course experiences. The inability to verify multiple dimensions for student evaluation of instruction implies that the boundaries of a typical course are beginning to dissipate. As a result, the authors concluded that end-of-course evaluations now involve a much more complex network of interactions. Highlights ► The study models student satisfaction in the online, blended, and face-to-face course modalities. ► The course models vary technology involvement. ► Image analysis produced single dimension solutions. ► The solutions were identical across modalities. Keywords: Student rating of instruction; online learning; blended learning; factor analysis; student agency"