Educational Analytics: Group items tagged "establish"

George Bradford

Analytics in Higher Education: Establishing a Common Language | EDUCAUSE

  •  Title: Analytics in Higher Education: Establishing a Common Language (ID: ELI3026)
     Author(s): Angela van Barneveld, Kimberly Arnold, and John P. Campbell (Purdue University)
     Topics: Academic Analytics, Action Analytics, Analytics, Business Analytics, Decision Support Systems, Learning Analytics, Predictive Analytics, Scholarship of Teaching and Learning
     Origin: ELI White Papers, EDUCAUSE Learning Initiative (ELI), 01/24/2012
     Type: Articles, Briefs, Papers, and Reports
George Bradford

Sydney Learning Analytics Research Group (LARG)

  •  "SYDNEY LEARNING ANALYTICS RESEARCH GROUP
     About: The Sydney Learning Analytics Research Group (LARG) is a joint venture of the newly established Quality and Analytics Group within the Education Portfolio and the new Centre for Research on Learning and Innovation, connected to the Faculty of Education and Social Work. The key purposes in establishing the new research group are:
     - Capacity building in learning analytics for the benefit of the institution, its students, and staff
     - To generate interest and expertise in learning analytics at the University, and to build a new network of research colleagues
     - To build a profile for the University of Sydney as a national and international leader in learning analytics
     LARG was launched at ALASI in late November 2015. The leadership team is actively planning for the 2016 calendar year and beyond, with several community-building initiatives already in the pipeline: the first is a lecture by George Siemens, and the second is a new conference travel grant (see details below)."
George Bradford

Seeking Evidence of Impact: Opportunities and Needs (EDUCAUSE Review) | EDUCAUSE

  • Conversations with CIOs and other senior IT administrators reveal a keen interest in the results of evaluation in teaching and learning to guide fiscal, policy, and strategic decision-making. Yet those same conversations reveal that this need is not being met.
  • gain a wider and shared understanding of “evidence” and “impact” in teaching and learning
  • establish a community of practice
  • provide professional-development opportunities
  • explore successful institutional and political contexts
  • establish evidence-based practice
  • The most important reason is that in the absence of data, anecdote can become the primary basis for decision-making. Rarely does that work out very well.
  • autocatalytic evaluation process—one that builds its own synergy.
  • We live by three principles: uncollected data cannot be analyzed; the numbers are helped by a brief and coherent summary; and good graphs beat tables every time.
  • Reports and testimonies from faculty and students (57%); measures of student and faculty satisfaction (50%); measures of student mastery (learning outcomes) (41%); changes in faculty teaching practice (35%); measures of student and faculty engagement (32%)
  • The survey results also indicate a need for support in undertaking impact-evaluation projects.
  • Knowing where to begin to measure the impact of technology-based innovations in teaching and learning; knowing which measurement and evaluation techniques are most appropriate; knowing the most effective way to analyze evidence
  • The challenge of persuasion is what ELI has been calling the last mile problem. There are two interrelated components to this issue: (1) influencing faculty members to improve instructional practices at the course level, and (2) providing evidence to help inform key strategic decisions at the institutional level.
  • Broadly summarized, our results reveal a disparity between the keen interest in research-based evaluation and the level of resources that are dedicated to it—prompting a grass-roots effort to support this work.
  •  The SEI program is working with the teaching and learning community to gather evidence of the impact of instructional innovations and current practices and to help evaluate the results. The calls for more accountability in higher education, the shrinking budgets that often force larger class sizes, and the pressures to increase degree-completion rates are all raising the stakes for colleges and universities today, especially with respect to the instructional enterprise. As resources shrink, teaching and learning is becoming the key point of accountability. The evaluation of instructional practice would thus seem to be an obvious response to such pressures, with institutions implementing systematic programs of evaluation in teaching and learning, especially of instructional innovations.