Educational Analytics: group items tagged "resources"

George Bradford

Submissions for the Awards for Excellence in Learning Analytics « ascilite

  •  
    "Awards for Excellence in Learning Analytics: Submissions The LA SIG operates an Awards program recognising excellence in the practical application of LA to enhance learning and teaching. A key driver for the Awards program is to create and share resources about effective LA practices. We want to give a voice to all who are working with LA to improve learning and teaching - whatever the scale of their endeavours.  To this end, all presentations from Award applicants are available below for viewing. (Submissions closed on 11 September).  We believe these presentations will form an important resource library around the use of LA in tertiary education across Australasia and New Zealand. You may view the awards submission criteria and other background information on the awards in the SIG Awards Program Information (PDF) and to find out more about LA-SIG activities, go here."
George Bradford

Analytics in Higher Education - Benefits, Barriers, Progress, and Recommendations (EDUC...

  •  
    Jacqueline Bichsel (EDUCAUSE, 2012). Many colleges and universities have demonstrated that analytics can help significantly advance an institution in such strategic areas as resource allocation, student success, and finance. Higher education leaders hear about these transformations occurring at other institutions and wonder how their institutions can initiate or build upon their own analytics programs. Some question whether they have the resources, infrastructure, processes, or data for analytics. Some wonder whether their institutions are on par with others in their analytics endeavors. It is within that context that this study set out to assess the current state of analytics in higher education, outline the challenges and barriers to analytics, and provide a basis for benchmarking progress in analytics.
George Bradford

Selected outcomes assessment resources | ALA Accredited Programs

  •  
    "Selected outcomes assessment resources"
George Bradford

Assessment Commons - Internet Resources for Higher Education Outcomes Assessment

  •  
    "General Resources Discussion Lists, Forums, Archives of Articles, Lists of Links, etc. Principles of good outcomes assessment practice "
George Bradford

Seeking Evidence of Impact: Opportunities and Needs (EDUCAUSE Review) | EDUCAUSE

  • Conversations with CIOs and other senior IT administrators reveal a keen interest in the results of evaluation in teaching and learning to guide fiscal, policy, and strategic decision-making. Yet those same conversations reveal that this need is not being met.
  • gain a wider and shared understanding of “evidence” and “impact” in teaching and learning
  • establish a community of practice
  • provide professional-development opportunities
  • explore successful institutional and political contexts
  • establish evidence-based practice
  • The most important reason is that in the absence of data, anecdote can become the primary basis for decision-making. Rarely does that work out very well.
  • autocatalytic evaluation process—one that builds its own synergy.
  • We live by three principles: uncollected data cannot be analyzed; the numbers are helped by a brief and coherent summary; and good graphs beat tables every time.
  • Reports and testimonies from faculty and students (57%); measures of student and faculty satisfaction (50%); measures of student mastery (learning outcomes) (41%); changes in faculty teaching practice (35%); measures of student and faculty engagement (32%)
  • The survey results also indicate a need for support in undertaking impact-evaluation projects.
  • Knowing where to begin to measure the impact of technology-based innovations in teaching and learning; knowing which measurement and evaluation techniques are most appropriate; knowing the most effective way to analyze evidence
  • The challenge of persuasion is what ELI has been calling the last mile problem. There are two interrelated components to this issue: (1) influencing faculty members to improve instructional practices at the course level, and (2) providing evidence to help inform key strategic decisions at the institutional level.
  • Broadly summarized, our results reveal a disparity between the keen interest in research-based evaluation and the level of resources that are dedicated to it—prompting a grass-roots effort to support this work.
  •  
    The SEI program is working with the teaching and learning community to gather evidence of the impact of instructional innovations and current practices and to help evaluate the results. The calls for more accountability in higher education, the shrinking budgets that often force larger class sizes, and the pressures to increase degree-completion rates are all raising the stakes for colleges and universities today, especially with respect to the instructional enterprise. As resources shrink, teaching and learning is becoming the key point of accountability. The evaluation of instructional practice would thus seem to be an obvious response to such pressures, with institutions implementing systematic programs of evaluation in teaching and learning, especially of instructional innovations.
George Bradford

QUT | Learning and Teaching Unit | REFRAME

  •  
    REFRAME is a university-wide project reconceptualising QUT's overall approach to evaluating learning and teaching. Our aim is to develop a sophisticated risk-based system to gather, analyse and respond to data along with a broader set of user-centered resources. The objective is to provide individuals and teams with the tools, support and reporting they need to meaningfully reflect upon, review and improve teaching, student learning and the curriculum. The approach will be informed by feedback from the university community, practices in other institutions and the literature, and will, as far as possible, be 'future-proofed' through awareness of emergent evaluation trends and tools. Central to REFRAME is the consideration of the purpose of evaluation and the features that a future approach should consider.
George Bradford

NSSE Home

  •  
    National Survey of Student Engagement
    What is student engagement? Student engagement represents two critical features of collegiate quality. The first is the amount of time and effort students put into their studies and other educationally purposeful activities. The second is how the institution deploys its resources and organizes the curriculum and other learning opportunities to get students to participate in activities that decades of research studies show are linked to student learning.
    What does NSSE do? Through its student survey, The College Student Report, NSSE annually collects information at hundreds of four-year colleges and universities about student participation in programs and activities that institutions provide for their learning and personal development. The results provide an estimate of how undergraduates spend their time and what they gain from attending college. NSSE provides participating institutions a variety of reports that compare their students' responses with those of students at self-selected groups of comparison institutions. Comparisons are available for individual survey questions and the five NSSE Benchmarks of Effective Educational Practice. Each November, NSSE also publishes its Annual Results, which reports topical research and trends in student engagement results. NSSE researchers also present and publish research findings throughout the year.
George Bradford

SpringerLink - Abstract - Dr. Fox Rocks: Using Data-mining Techniques to Examine Studen...

  •  
    Abstract Few traditions in higher education evoke more controversy, ambivalence, criticism, and, at the same time, support than student evaluation of instruction (SEI). Ostensibly, results from these end-of-course survey instruments serve two main functions: they provide instructors with formative input for improving their teaching, and they serve as the basis for summative profiles of professors' effectiveness through the eyes of their students. In the academy, instructor evaluations also can play out in the high-stakes environments of tenure, promotion, and merit salary increases, making this information particularly important to the professional lives of faculty members. At the research level, the volume of the literature for student ratings impresses even the most casual observer with well over 2,000 studies referenced in the Education Resources Information Center (ERIC) alone (Centra, 2003) and an untold number of additional studies published in educational, psychological, psychometric, and discipline-related journals. There have been numerous attempts at summarizing this work (Algozzine et al., 2004; Gump, 2007; Marsh & Roche, 1997; Pounder, 2007; Wachtel, 1998). Student ratings gained such notoriety that in November 1997 the American Psychologist devoted an entire issue to the topic (Greenwald, 1997). The issue included student ratings articles focusing on stability and reliability, validity, dimensionality, usefulness for improving teaching and learning, and sensitivity to biasing factors, such as the Dr. Fox phenomenon that describes eliciting high student ratings with strategies that reflect little or no relationship to effective teaching practice (Ware & Williams, 1975; Williams & Ware, 1976, 1977).
George Bradford

Analytics in Higher Education: Establishing a Common Language | EDUCAUSE

  •  
    Title: Analytics in Higher Education: Establishing a Common Language (ID: ELI3026)
    Author(s): Angela van Barneveld (Purdue University), Kimberly Arnold (Purdue University) and John P. Campbell (Purdue University)
    Topics: Academic Analytics, Action Analytics, Analytics, Business Analytics, Decision Support Systems, Learning Analytics, Predictive Analytics, Scholarship of Teaching and Learning
    Origin: ELI White Papers, EDUCAUSE Learning Initiative (ELI) (01/24/2012)
    Type: Articles, Briefs, Papers, and Reports
George Bradford

[!!!] Penetrating the Fog: Analytics in Learning and Education (EDUCAUSE Review) | EDUC...

  • Continued growth in the amount of data creates an environment in which new or novel approaches are required to understand the patterns of value that exist within the data.
  • learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.
  • Academic analytics, in contrast, is the application of business intelligence in education and emphasizes analytics at institutional, regional, and international levels.
  • Course-level:
  • Educational data-mining
  • Intelligent curriculum
  • Adaptive content
  • the University of Maryland, Baltimore County (UMBC) Check My Activity tool allows learners to “compare their own activity . . . against an anonymous summary of their course peers” (see the peer-comparison sketch after this entry)
  • Mobile devices
  • social media monitoring tools (e.g., Radian6)
  • Analytics in education must be transformative, altering existing teaching, learning, and assessment processes, academic work, and administration.
    • George Bradford
       
      See Bradford - Brief vision of the semantic web as being used to support future learning: http://heybradfords.com/moonlight/research-resources/SemWeb_EducatorsVision 
    • George Bradford
       
      See Peter Goodyear's work on the Ecology of Sustainable e-Learning in Education.
  • How “real time” should analytics be in classroom settings?
  • Adaptive learning
  • EDUCAUSE Review, vol. 46, no. 5 (September/October 2011)
  • Penetrating the Fog: Analytics in Learning and Education
  •  
    Attempts to imagine the future of education often emphasize new technologies: ubiquitous computing devices, flexible classroom designs, and innovative visual displays. But the most dramatic factor shaping the future of higher education is something that we can't actually touch or see: big data and analytics. Basing decisions on data and evidence seems stunningly obvious, and indeed, research indicates that data-driven decision-making improves organizational output and productivity. For many leaders in higher education, however, experience and "gut instinct" have a stronger pull.
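    A minimal sketch of the kind of peer comparison the Check My Activity highlight above describes: a learner's own activity set against an anonymous summary of course peers. The activity counts, student IDs, and function name below are illustrative assumptions, not UMBC's actual data model or code.

        # Hypothetical LMS click counts per (anonymised) student in one course.
        # All values are invented for illustration.
        from statistics import mean, median

        activity = {"s01": 142, "s02": 87, "s03": 230, "s04": 55, "s05": 174}

        def peer_comparison(student_id: str, counts: dict) -> dict:
            """Compare one learner's activity against an anonymous peer summary."""
            own = counts[student_id]
            peers = [v for k, v in counts.items() if k != student_id]
            return {
                "own_clicks": own,
                "peer_mean": round(mean(peers), 1),
                "peer_median": median(peers),
                "above_median": own > median(peers),
            }

        print(peer_comparison("s02", activity))
        # -> {'own_clicks': 87, 'peer_mean': 150.2, 'peer_median': 158.0, 'above_median': False}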
George Bradford

Data Visualization: Modern Approaches - Smashing Magazine | Smashing Magazine

  • 2. Displaying News
  • Digg Stack: Digg stories arrange themselves as a stack as users digg them; the more diggs a story gets, the larger the stack.
  • Let’s take a look at the most interesting modern approaches to data visualization as well as related articles, resources and tools.
  • an application that visually reflects the constantly changing landscape of the Google News news aggregator. The size of data blocks is defined by their popularity at the moment.
  • a typographic book search that collects the information from Amazon and presents it in the form of the keyword you've provided.
  • uses visual hills (spikes) to emphasize the density of the American population on its map.
  • lets you explore the behavior of your visitors with a heat map: more popular sections, which are clicked more often, are highlighted as “warm” (red). See the heat-map sketch after this entry.
  • Eric Blue provides some references to unusual Data Visualization methods.
  •  
    Data presentation can be beautiful, elegant and descriptive. There is a variety of conventional ways to visualize data - tables, histograms, pie charts and bar graphs are being used every day, in every project and on every possible occasion. However, to convey a message to your readers effectively, sometimes you need more than just a simple pie chart of your results. In fact, there are much better, profound, creative and absolutely fascinating ways to visualize data. Many of them might become ubiquitous in the next few years.
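    Several of the tools highlighted above reduce to the same idea: bin event coordinates (clicks, stories, people) and colour cells by frequency. Below is a minimal heat-map sketch in that spirit; the simulated click data, cluster positions, and bin count are invented for illustration and do not come from any of the tools mentioned.

        # Render a click heat map from simulated (x, y) click coordinates.
        import numpy as np
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(0)
        # Simulated clicks clustered around two "popular" page regions (pixel coords)
        clicks = np.vstack([
            rng.normal(loc=(300, 200), scale=40, size=(500, 2)),
            rng.normal(loc=(700, 550), scale=60, size=(300, 2)),
        ])

        # Bin the clicks into a 2-D histogram and display it as a heat map
        heat, xedges, yedges = np.histogram2d(clicks[:, 0], clicks[:, 1], bins=50)
        plt.imshow(heat.T, origin="lower", cmap="hot",
                   extent=[xedges[0], xedges[-1], yedges[0], yedges[-1]])
        plt.colorbar(label="clicks per cell")
        plt.xlabel("x (px)")
        plt.ylabel("y (px)")
        plt.title("Simulated click heat map")
        plt.show()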
George Bradford

About | Learning Emergence

  •  
    Core ideas: We decided on the name Learning Emergence because we are very much learning about emergence and complex systems phenomena ourselves, even as we develop our thinking on learning as an emergent, systemic phenomenon in different contexts. We must shift to a new paradigm for learning in schools, universities and the workplace which addresses the challenges of the 21st Century. Society needs learners who can cope with intellectual, ethical and emotional complexity of an unprecedented nature. Learning Emergence partners share an overarching focus on deep, systemic learning and leadership - the pro-active engagement of learners and leaders in their own authentic learning journey, in the context of relationship and community. We work at the intersection of (1) deep learning and sensemaking, (2) leadership, (3) complex systems, and (4) technology.
George Bradford

Student Learning and Analytics at Michigan (SLAM) | CRLT

  •  
    "Student Learning and Analytics at Michigan (SLAM)"
George Bradford

Assessing learning dispositions/academic mindsets | Learning Emergence

  •  
    Assessing learning dispositions/academic mindsets (Mar 01 2014). A few years ago Ruth and I spent a couple of days with the remarkable Larry Rosenstock at High Tech High, and were blown away by the creativity and passion that he and his team bring to authentic learning. At that point they were just beginning to conceive the idea of a Graduate School of Education (er… run by a high school?!). Yes indeed. Now they're flying, running the Deeper Learning conference in a few weeks, and right now the Deeper Learning MOOC [DLMOOC] is doing a great job of bringing practitioners and researchers together, and that's just from the perspective of someone on the edge who has only managed to replay the late-night (in the UK) Hangouts and post a couple of stories. Huge thanks and congratulations to Larry, Rob Riordan and everyone else at High Tech High Grad School of Education, plus of course the other supporting organisations and funders who are making this happen. Here are two of my favourite sessions, in which we hear from students what it's like to be in schools where mindsets and authentic learning are taken seriously, and a panel of researcher/practitioners.
George Bradford

alpha lab research network - Persistence

  •  
    "PRODUCTIVE PERSISTENCE: A "PRACTICAL" THEORY OF COMMUNITY COLLEGE STUDENTS SUCCESS The Carnegie Foundation's Productive Persistence initiative is a practical theory of the causes of successfully completing coursework at a community college-or, in our terms, the "drivers" of successful course completion.  The term "Productive Persistence" refers to both the tenacity to persist, and also the ability to use good strategiesto productively engage with the course materials. "
George Bradford

Networked Improvement Communities: Bryk lectures Bristol 2014 | Learning Emergence

  •  
    "'Making Systems Work - whether in healthcare, education, climate change, or making a pathway out of poverty - is the great task of our generation as a whole' and at the heart of making systems work is the problem of complexity.  Prof Tony Bryk, President of the Carnegie Foundation for the Advancement of Teaching,  spent a week with people from the Learning Emergence network, leading a Master Class for practitioners, delivering two public lectures and participating in a consultation on Learning Analytics Hubs in Networked Improvement Communities  (background).  A key idea is that in order to engage in quality improvement in any system, we need to be able to 'see the system as a whole' and not just step in and meddle with one part of it."
George Bradford

Program Evaluation Standards « Joint Committee on Standards for Educational E...

  •  
    "   Welcome to the Program Evaluation Standards, 3rd Edition   Standards Names and Statements Errata Sheet for the book   After seven years of systematic effort and much study, the 3rd edition of the Program Evaluation Standards was published this fall by Sage Publishers: http://www.sagepub.com/booksProdDesc.nav?prodId=Book230597&_requestid=255617. The development process relied on formal and informal needs assessments, reviews of existing scholarship, and the involvement of more than 400 stakeholders in national and international reviews, field trials, and national hearings. It's the first revision of the standards in 17 years. This third edition is similar to the previous two editions (1981, 1994) in many respects, for example, the book is organized into the same four dimensions of evaluation quality (utility, feasibility, propriety, and accuracy). It also still includes the popular and useful "Functional Table of Standards," a glossary, extensive documentation, information about how to apply the standards, and numerous case applications."
George Bradford

National Institute for Learning Outcomes Assessment

  •  
    "Accrediting associations have expectations that call on institutions to collect and use evidence of student learning outcomes at the programmatic and institutional to confirm and improve student learning.  This section of the NILOA website lists both regional accrediting associations and specialized or programmatic accrediting organizations along with links to those groups."
George Bradford

Learning process analytics - EduTech Wiki

  •  
    "Introduction In this discussion paper, we define learning process analytics as a collection of methods that allow teachers and learners to understand what is going on in a' 'learning scenario, i.e. what participants work(ed) on, how they interact(ed), what they produced(ed), what tools they use(ed), in which physical and virtual location, etc. Learning analytics is most often aimed at generating predictive models of general student behavior. So-called academic analytics even aims to improve the system. We are trying to find a solution to a somewhat different problem. In this paper we will focus on improving project-oriented learner-centered designs, i.e. a family of educational designs that include any or some of knowledge-building, writing-to-learn, project-based learning, inquiry learning, problem-based learning and so forth. We will first provide a short literature review of learning process analytics and related frameworks that can help improve the quality of educational scenarios. We will then describe a few project-oriented educational scenarios that are implemented in various programs at the University of Geneva. These examples illustrate the kind of learning scenarios we have in mind and help define the different types of analytics both learners and teachers need. Finally, we present a provisional list of analytics desiderata divided into "wanted tomorrow" and "nice to have in the future"."