Educational Analytics: Group items matching "data" in title, tags, annotations or url
George Bradford

IBM Solidifies Academic Analytics Investments - Datanami

  •  Datanami Staff, December 22, 2011. As their own detailed report in conjunction with MIT Sloan made clear, IBM is keenly aware of the dramatic talent shortfall that could keep the future of big data analytics in check. Accordingly, the company is stepping in to boost analytics-driven programs at universities around the world. A report out of India this week indicated that Big Blue is firming up its investments at a number of academic institutions worldwide in the hopes of readying a new generation of analytics graduates. This effort springs from the company's Academic Initiative, the IBM-led effort to partner with universities to extend institutions' capacity to provide functional IT training and research opportunities.
George Bradford

Analytics in Higher Education - Benefits, Barriers, Progress, and Recommendations (EDUCAUSE) 2012

  •  Jacqueline Bichsel, EDUCAUSE, 2012. Many colleges and universities have demonstrated that analytics can help significantly advance an institution in such strategic areas as resource allocation, student success, and finance. Higher education leaders hear about these transformations occurring at other institutions and wonder how their institutions can initiate or build upon their own analytics programs. Some question whether they have the resources, infrastructure, processes, or data for analytics. Some wonder whether their institutions are on par with others in their analytics endeavors. It is within that context that this study set out to assess the current state of analytics in higher education, outline the challenges and barriers to analytics, and provide a basis for benchmarking progress in analytics.
George Bradford

QUT | Learning and Teaching Unit | REFRAME

  •  REFRAME is a university-wide project fundamentally reconsidering QUT's overall approach to evaluating learning and teaching. Our aim is to develop a sophisticated risk-based system to gather, analyse and respond to data, along with a broader set of user-centered resources. The objective is to provide individuals and teams with the tools, support and reporting they need to meaningfully reflect upon, review and improve teaching, student learning and the curriculum. The approach will be informed by feedback from the university community, practices at other institutions and the literature, and will, as far as possible, be 'future-proofed' through awareness of emergent evaluation trends and tools. Central to REFRAME is consideration of the purpose of evaluation and the features that a future approach should include.
George Bradford

Seeking Evidence of Impact: Opportunities and Needs (EDUCAUSE Review) | EDUCAUSE

  • Conversations with CIOs and other senior IT administrators reveal a keen interest in the results of evaluation in teaching and learning to guide fiscal, policy, and strategic decision-making. Yet those same conversations reveal that this need is not being met.
  • gain a wider and shared understanding of “evidence” and “impact” in teaching and learning
  • establish a community of practice
  • provide professional-development opportunities
  • explore successful institutional and political contexts
  • establish evidence-based practice
  • The most important reason is that in the absence of data, anecdote can become the primary basis for decision-making. Rarely does that work out very well.
  • autocatalytic evaluation process—one that builds its own synergy.
  • We live by three principles: uncollected data cannot be analyzed; the numbers are helped by a brief and coherent summary; and good graphs beat tables every time.
  • Reports and testimonies from faculty and students (57%); measures of student and faculty satisfaction (50%); measures of student mastery (learning outcomes) (41%); changes in faculty teaching practice (35%); measures of student and faculty engagement (32%)
  • The survey results also indicate a need for support in undertaking impact-evaluation projects.
  • Knowing where to begin to measure the impact of technology-based innovations in teaching and learning; knowing which measurement and evaluation techniques are most appropriate; knowing the most effective way to analyze evidence
  • The challenge of persuasion is what ELI has been calling the last mile problem. There are two interrelated components to this issue: (1) influencing faculty members to improve instructional practices at the course level, and (2) providing evidence to help inform key strategic decisions at the institutional level.
  • Broadly summarized, our results reveal a disparity between the keen interest in research-based evaluation and the level of resources that are dedicated to it—prompting a grass-roots effort to support this work.
  •  The SEI program is working with the teaching and learning community to gather evidence of the impact of instructional innovations and current practices and to help evaluate the results. The calls for more accountability in higher education, the shrinking budgets that often force larger class sizes, and the pressures to increase degree-completion rates are all raising the stakes for colleges and universities today, especially with respect to the instructional enterprise. As resources shrink, teaching and learning is becoming the key point of accountability. The evaluation of instructional practice would thus seem to be an obvious response to such pressures, with institutions implementing systematic programs of evaluation in teaching and learning, especially of instructional innovations.
George Bradford

Learning Dispositions and Transferable Competencies: Pedagogy, Modelling, and Learning Analytics (PDF)

  •  Simon Buckingham Shum and Ruth Deakin Crick, 2012 (in review). Theoretical and empirical evidence in the learning sciences substantiates the view that deep engagement in learning is a function of a combination of learners' dispositions, values, attitudes and skills. When these are fragile, learners struggle to achieve their potential in conventional assessments and, critically, are not prepared for the novelty and complexity of the challenges they will meet in the workplace and the many other spheres of life which require personal qualities such as resilience, critical thinking and collaboration skills. To date, the learning analytics research and development communities have not addressed how these complex concepts can be modelled and analysed. We report progress in the design and implementation of learning analytics based on an empirically validated multidimensional construct termed "learning power". We describe a learning analytics infrastructure for gathering data at scale, managing stakeholder permissions, the range of analytics that it supports from real-time summaries to exploratory research, and a particular visual analytic which has been shown to have demonstrable impact on learners. We conclude by summarising the ongoing research and development programme.
George Bradford

SpringerLink - Abstract - Dr. Fox Rocks: Using Data-mining Techniques to Examine Student Ratings of Instruction

  •  Abstract: Few traditions in higher education evoke more controversy, ambivalence, criticism, and, at the same time, support than student evaluation of instruction (SEI). Ostensibly, results from these end-of-course survey instruments serve two main functions: they provide instructors with formative input for improving their teaching, and they serve as the basis for summative profiles of professors' effectiveness through the eyes of their students. In the academy, instructor evaluations also can play out in the high-stakes environments of tenure, promotion, and merit salary increases, making this information particularly important to the professional lives of faculty members. At the research level, the volume of the literature for student ratings impresses even the most casual observer, with well over 2,000 studies referenced in the Education Resources Information Center (ERIC) alone (Centra, 2003) and an untold number of additional studies published in educational, psychological, psychometric, and discipline-related journals. There have been numerous attempts at summarizing this work (Algozzine et al., 2004; Gump, 2007; Marsh & Roche, 1997; Pounder, 2007; Wachtel, 1998). Student ratings gained such notoriety that in November 1997 the American Psychologist devoted an entire issue to the topic (Greenwald, 1997). The issue included student ratings articles focusing on stability and reliability, validity, dimensionality, usefulness for improving teaching and learning, and sensitivity to biasing factors, such as the Dr. Fox phenomenon that describes eliciting high student ratings with strategies that reflect little or no relationship to effective teaching practice (Ware & Williams, 1975; Williams & Ware, 1976, 1977).
George Bradford

Dr Ruth Deakin Crick - Graduate School of Education

  •  First, the ongoing exploration of the reliability and validity of the psychometric assessment instrument designed to measure and stimulate change in learning power, for which I was one of three originators between 2000 and 2002. To date I have been able to collect large data sets (n ≥ 50,000) and have published reliability and validity statistics in four peer-reviewed journal articles. Second, the application of the concept and assessment of learning power in pedagogy in school, community and corporate sectors, and in particular its contribution to personalisation of learning through authentic enquiry. Third, the contribution of learning power and enquiry to what we know about complexity in education, particularly through the development of systems learning and leadership as a vehicle for organisational transformation. Finally, the application of learning power assessment strategies to the emerging field of learning analytics and agent-based modelling.
George Bradford

[!!!!] Social Learning Analytics - Technical Report (pdf)

  •  Technical Report KMI-11-01, June 2011. Simon Buckingham Shum and Rebecca Ferguson. Abstract: We propose that the design and implementation of effective Social Learning Analytics presents significant challenges and opportunities for both research and enterprise, in three important respects. The first is the challenge of implementing analytics that have pedagogical and ethical integrity, in a context where power and control over data is now of primary importance. The second challenge is that the educational landscape is extraordinarily turbulent at present, in no small part due to technological drivers. Online social learning is emerging as a significant phenomenon for a variety of reasons, which we review, in order to motivate the concept of social learning, and ways of conceiving social learning environments as distinct from other social platforms. This sets the context for the third challenge, namely, to understand different types of Social Learning Analytic, each of which has specific technical and pedagogical challenges. We propose an initial taxonomy of five types. We conclude by considering potential futures for Social Learning Analytics, if the drivers and trends reviewed continue, and the prospect of solutions to some of the concerns that institution-centric learning analytics may provoke.
George Bradford

Learning networks, crowds and communities

  •  Caroline Haythornthwaite, University of British Columbia, Vancouver, BC. Who we learn from, where and when is dramatically affected by the reach of the Internet. From learning for formal education to learning for pleasure, we look to the web early and often for our data and knowledge needs, but also for places and spaces where we can collaborate in, contribute to, and create learning and knowledge communities. Based on the keynote presentation given at the first Learning Analytics and Knowledge Conference held in 2011 in Banff, Alberta, this paper explores a social network perspective on learning with reference to social network principles and studies by the author. The paper explores the ways a social network perspective can be used to examine learning, with attention to the structure and dynamics of online learning networks and emerging configurations such as online crowds and communities.
George Bradford

LOCO-Analyst

  •  What is LOCO-Analyst? LOCO-Analyst is an educational tool aimed at providing teachers with feedback on the relevant aspects of the learning process taking place in a web-based learning environment, and thus helps them improve the content and the structure of their web-based courses. LOCO-Analyst aims at providing teachers with feedback regarding:
    * all kinds of activities their students performed and/or took part in during the learning process,
    * the usage and the comprehensibility of the learning content they had prepared and deployed in the LCMS,
    * contextualized social interactions among students (i.e., social networking) in the virtual learning environment.
    This Web site provides some basic information about LOCO-Analyst, its functionalities and implementation. In addition, you can watch videos illustrating the tool's functionalities. You can also learn about the LOCO (Learning Object Context Ontologies) ontological framework that lies beneath the LOCO-Analyst tool and download the ontologies of this framework.
George Bradford

Where everyone in the world is migrating-in one gorgeous chart - Quartz

  •  "Where everyone in the world is migrating-in one gorgeous chart." Nick Stockton (@StocktonSays), March 28, 2014.