Educational Analytics: Group items tagged assessment

George Bradford

Assessment and Analytics in Institutional Transformation (EDUCAUSE Review) | EDUCAUSE

  • At the University of Maryland, Baltimore County (UMBC), we believe that process is an important factor in creating cultural change. We thus approach transformational initiatives by using the same scholarly rigor that we expect of any researcher. This involves (1) reviewing the literature and prior work in the area, (2) identifying critical factors and variables, (3) collecting data associated with these critical factors, (4) using rigorous statistical analysis and modeling of the question and factors, (5) developing hypotheses to influence the critical factors, and (6) collecting data based on the changes and assessing the results.
  • among predominantly white higher education institutions in the United States, UMBC has become the leading producer of African-American bachelor’s degree recipients who go on to earn Ph.D.’s in STEM fields. The program has been recognized by the National Science Foundation and the National Academies as a national model.
  • UMBC has recently begun a major effort focused on the success of transfer students in STEM majors. This effort, with pilot funding from the Bill and Melinda Gates Foundation, will look at how universities can partner with community colleges to prepare their graduates to successfully complete a bachelor’s degree in a STEM field.
  • Too often, IT organizations try to help by providing an analytics “dashboard” designed by a vendor that doesn’t know the institution. As a result, the dashboard indicators don’t focus on those key factors most needed at the institution and quickly become window-dressing.
  • IT organizations can support assessment by showing how data in separate systems can become very useful when captured and correlated. For example, UMBC has spent considerable effort to develop a reporting system based on our learning management system (LMS) data. This effort, led from within the IT organization, has helped the institution find new insights into the way faculty and students are using the LMS and has helped us improve the services we offer. We are now working to integrate this data into our institutional data warehouse and are leveraging access to important demographic data to better assess student risk factors and develop interventions.
  • the purpose of learning analytics is “to observe and understand learning behaviors in order to enable appropriate interventions.”
  • the 1st International Conference on Learning Analytics and Knowledge (LAK) was held in Banff, Alberta, Canada, in early 2011 (https://tekri.athabascau.ca/analytics/)
  • At UMBC, we are using analytics and assessment to shine a light on students’ performance and behavior and to support teaching effectiveness. What has made the use of analytics and assessment particularly effective on our campus has been the insistence that all groups—faculty, staff, and students—take ownership of the challenge involving student performance and persistence.
  •  
    Assessment and analytics, supported by information technology, can change institutional culture and drive the transformation in student retention, graduation, and success. U.S. higher education has an extraordinary record of accomplishment in preparing students for leadership, in serving as a wellspring of research and creative endeavor, and in providing public service. Despite this success, colleges and universities are facing an unprecedented set of challenges. To maintain the country's global preeminence, those of us in higher education are being called on to expand the number of students we educate, increase the proportion of students in science, technology, engineering, and mathematics (STEM), and address the pervasive and long-standing underrepresentation of minorities who earn college degrees, all at a time when budgets are being reduced and questions about institutional efficiency and effectiveness are being raised.
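A minimal sketch of the kind of cross-system correlation the UMBC annotations describe: LMS activity data joined with institutional records to flag students for possible intervention. All field names, the sample data, and the login threshold are illustrative assumptions, not UMBC's actual warehouse schema or risk criteria.

```python
# Hypothetical sketch: correlate LMS usage with enrollment/demographic
# records to surface at-risk students. Field names and the threshold
# are invented for illustration.

lms_activity = {            # student_id -> LMS logins this term
    "s001": 42,
    "s002": 3,
    "s003": 17,
}

enrollment = {              # student_id -> institutional record
    "s001": {"credits": 15, "first_generation": False},
    "s002": {"credits": 12, "first_generation": True},
    "s003": {"credits": 9, "first_generation": False},
}

def flag_at_risk(activity, records, login_threshold=10):
    """Combine LMS usage with enrollment data and flag students whose
    activity falls below a (hypothetical) login threshold."""
    flagged = []
    for sid, logins in activity.items():
        record = records.get(sid, {})
        if logins < login_threshold:
            flagged.append({"student": sid, "logins": logins, **record})
    return flagged

print(flag_at_risk(lms_activity, enrollment))
# → [{'student': 's002', 'logins': 3, 'credits': 12, 'first_generation': True}]
```

In a real data warehouse this join would run over database tables rather than in-memory dicts; the point is only that the risk signal comes from combining the two sources, as the annotation suggests.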
Dr Ruth Deakin Crick - Graduate School of Education

  •  
    First, the ongoing exploration of the reliability and validity of the psychometric assessment instrument designed to measure and stimulate change in learning power, for which I was one of three originators between 2000 and 2002. To date I have been able to collect large data sets (n > 50,000) and have published reliability and validity statistics in four peer-reviewed journal articles. Second, the application of the concept and assessment of learning power in pedagogy in school, community and corporate sectors, and in particular its contribution to personalisation of learning through authentic enquiry. Third, the contribution of learning power and enquiry to what we know about complexity in education, particularly through the development of systems learning and leadership as a vehicle for organisational transformation. Finally, the application of learning power assessment strategies to the emerging field of learning analytics and agent-based modelling.
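The reliability statistics mentioned for the learning power instrument are the kind usually summarised by an internal-consistency coefficient such as Cronbach's alpha. A generic, self-contained sketch follows; the published figures themselves are not reproduced here.

```python
# Cronbach's alpha for a multi-item scale: alpha = k/(k-1) * (1 - sum of
# item variances / variance of respondent totals). Data below is toy data,
# not from the learning power studies.

def cronbach_alpha(item_scores):
    """item_scores: one inner list per item, aligned by respondent."""
    k = len(item_scores)               # number of items
    n = len(item_scores[0])            # number of respondents

    def pvar(xs):                      # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var_sum = sum(pvar(item) for item in item_scores)
    totals = [sum(item[i] for item in item_scores) for i in range(n)]
    return (k / (k - 1)) * (1 - item_var_sum / pvar(totals))

# Three perfectly consistent items over four respondents → alpha ≈ 1.0
print(cronbach_alpha([[1, 2, 3, 4], [2, 3, 4, 5], [1, 2, 3, 4]]))
```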
Assessing learning dispositions/academic mindsets | Learning Emergence

  •  
    Assessing learning dispositions/academic mindsets Mar 01 2014 A few years ago Ruth and I spent a couple of days with the remarkable Larry Rosenstock at High Tech High, and were blown away by the creativity and passion that he and his team bring to authentic learning. At that point they were just beginning to conceive the idea of a Graduate School of Education (er… run by a high school?!). Yes indeed. Now they're flying, running the Deeper Learning conference in a few weeks, and right now, the Deeper Learning MOOC [DLMOOC] is doing a great job of bringing practitioners and researchers together, and that's just from the perspective of someone on the edge who has only managed to replay the late night (in the UK) Hangouts and post a couple of stories. Huge thanks and congratulations to Larry, Rob Riordan and everyone else at High Tech High Grad School of Education, plus of course the other supporting organisations and funders who are making this happen. Here are two of my favourite sessions, in which we hear from students what it's like to be in schools where mindsets and authentic learning are taken seriously, and a panel of researcher/practitioners.
Assessment | University of Wisconsin-Madison

  •  
    "Using Assessment for Academic Program Improvement, Revised April 2009"
Assessment Commons - Internet Resources for Higher Education Outcomes Assessment

  •  
    "General Resources: Discussion Lists, Forums, Archives of Articles, Lists of Links, etc. Principles of good outcomes assessment practice"
Selected outcomes assessment resources | ALA Accredited Programs

  •  
    "Selected outcomes assessment resources"
Office of Student Learning Assessment: Examples of Direct and Indirect Measures - Cleve...

  •  
    "Examples of Direct and Indirect Measures of Student Learning"
Open Research Online - Learning dispositions and transferable competencies: pedagogy, m...

  •  
    Theoretical and empirical evidence in the learning sciences substantiates the view that deep engagement in learning is a function of a complex combination of learners' identities, dispositions, values, attitudes and skills. When these are fragile, learners struggle to achieve their potential in conventional assessments, and critically, are not prepared for the novelty and complexity of the challenges they will meet in the workplace, and the many other spheres of life which require personal qualities such as resilience, critical thinking and collaboration skills. To date, the learning analytics research and development communities have not addressed how these complex concepts can be modelled and analysed, and how more traditional social science data analysis can support and be enhanced by learning analytics. We report progress in the design and implementation of learning analytics based on a research validated multidimensional construct termed "learning power". We describe, for the first time, a learning analytics infrastructure for gathering data at scale, managing stakeholder permissions, the range of analytics that it supports from real time summaries to exploratory research, and a particular visual analytic which has been shown to have demonstrable impact on learners. We conclude by summarising the ongoing research and development programme and identifying the challenges of integrating traditional social science research, with learning analytics and modelling.
Analytics in Higher Education - Benefits, Barriers, Progress, and Recommendations (EDUC...

  •  
    Jacqueline Bichsel, EDUCAUSE, 2012. Many colleges and universities have demonstrated that analytics can help significantly advance an institution in such strategic areas as resource allocation, student success, and finance. Higher education leaders hear about these transformations occurring at other institutions and wonder how their institutions can initiate or build upon their own analytics programs. Some question whether they have the resources, infrastructure, processes, or data for analytics. Some wonder whether their institutions are on par with others in their analytics endeavors. It is within that context that this study set out to assess the current state of analytics in higher education, outline the challenges and barriers to analytics, and provide a basis for benchmarking progress in analytics.
Learning Dispositions and Transferable Competencies: Pedagogy, Modelling, and Learning ...

  •  
    Simon Buckingham Shum and Ruth Deakin Crick, 2012 (in review). Theoretical and empirical evidence in the learning sciences substantiates the view that deep engagement in learning is a function of a combination of learners' dispositions, values, attitudes and skills. When these are fragile, learners struggle to achieve their potential in conventional assessments, and critically, are not prepared for the novelty and complexity of the challenges they will meet in the workplace, and the many other spheres of life which require personal qualities such as resilience, critical thinking and collaboration skills. To date, the learning analytics research and development communities have not addressed how these complex concepts can be modelled and analysed. We report progress in the design and implementation of learning analytics based on an empirically validated multidimensional construct termed "learning power". We describe a learning analytics infrastructure for gathering data at scale, managing stakeholder permissions, the range of analytics that it supports from real time summaries to exploratory research, and a particular visual analytic which has been shown to have demonstrable impact on learners. We conclude by summarising the ongoing research and development programme.
Program Evaluation Standards « Joint Committee on Standards for Educational E...

  •  
    "Welcome to the Program Evaluation Standards, 3rd Edition. After seven years of systematic effort and much study, the 3rd edition of the Program Evaluation Standards was published this fall by Sage Publishers: http://www.sagepub.com/booksProdDesc.nav?prodId=Book230597&_requestid=255617. The development process relied on formal and informal needs assessments, reviews of existing scholarship, and the involvement of more than 400 stakeholders in national and international reviews, field trials, and national hearings. It's the first revision of the standards in 17 years. This third edition is similar to the previous two editions (1981, 1994) in many respects; for example, the book is organized into the same four dimensions of evaluation quality (utility, feasibility, propriety, and accuracy). It also still includes the popular and useful "Functional Table of Standards," a glossary, extensive documentation, information about how to apply the standards, and numerous case applications."
National Institute for Learning Outcomes Assessment

  •  
    "Accrediting associations have expectations that call on institutions to collect and use evidence of student learning outcomes at the programmatic and institutional levels to confirm and improve student learning. This section of the NILOA website lists both regional accrediting associations and specialized or programmatic accrediting organizations along with links to those groups."
Developing Student Learning Outcomes - Tool Box - Assessment - CSU, Chico

  •  
    "Developing Student Learning Outcomes Student learning outcome (SLO) statements take the program learning goals and focus on how students can demonstrate that the goals are being met. In other words, SLOs answer the question: how can graduates from this program demonstrate they have the needed/stated knowledge, skills, and/or values? SLOs are clear, concise statements that describe how students can demonstrate their mastery of program learning goals. Each student learning outcome statement must be measurable. Measures are applied to student work and may include student assignments, work samples, tests, etc., measuring student ability/skill, knowledge, or attitude/value."
[!!!] Penetrating the Fog: Analytics in Learning and Education (EDUCAUSE Review) | EDUC...

  • Continued growth in the amount of data creates an environment in which new or novel approaches are required to understand the patterns of value that exist within the data.
  • learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.
  • Academic analytics, in contrast, is the application of business intelligence in education and emphasizes analytics at institutional, regional, and international levels.
  • Course-level:
  • Educational data-mining
  • Intelligent curriculum
  • Adaptive content
  • the University of Maryland, Baltimore County (UMBC) Check My Activity tool allows learners to “compare their own activity . . . against an anonymous summary of their course peers.”
  • Mobile devices
  • social media monitoring tools (e.g., Radian6)
  • Analytics in education must be transformative, altering existing teaching, learning, and assessment processes, academic work, and administration.
    • George Bradford
       
      See Bradford - Brief vision of the semantic web as being used to support future learning: http://heybradfords.com/moonlight/research-resources/SemWeb_EducatorsVision 
    • George Bradford
       
      See Peter Goodyear's work on the Ecology of Sustainable e-Learning in Education.
  • How “real time” should analytics be in classroom settings?
  • Adaptive learning
  • EDUCAUSE Review, vol. 46, no. 5 (September/October 2011)
  • Penetrating the Fog: Analytics in Learning and Education
  •  
    Attempts to imagine the future of education often emphasize new technologies: ubiquitous computing devices, flexible classroom designs, and innovative visual displays. But the most dramatic factor shaping the future of higher education is something that we can't actually touch or see: big data and analytics. Basing decisions on data and evidence seems stunningly obvious, and indeed, research indicates that data-driven decision-making improves organizational output and productivity.1 For many leaders in higher education, however, experience and "gut instinct" have a stronger pull.
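The Check My Activity idea highlighted above (a learner's own activity set against an anonymous peer summary) can be sketched as follows. The function name, return fields, and numbers are illustrative assumptions, not UMBC's actual implementation.

```python
# Sketch of an anonymized peer comparison: return the learner's own
# count, the class mean, and the share of peers at or below their level,
# without exposing any individual peer's data.

def peer_comparison(my_count, peer_counts):
    """Summarise my activity against an anonymous class summary."""
    mean = sum(peer_counts) / len(peer_counts)
    at_or_below = sum(1 for c in peer_counts if c <= my_count)
    percentile = 100.0 * at_or_below / len(peer_counts)
    return {"mine": my_count,
            "class_mean": round(mean, 1),
            "percentile": round(percentile, 1)}

print(peer_comparison(30, [10, 20, 25, 40, 55]))
# → {'mine': 30, 'class_mean': 30.0, 'percentile': 60.0}
```

Only aggregates leave the function, which is the privacy property the tool's description implies.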
AUSSE | ACER

  •  
    Australasian Survey of Student Engagement (AUSSE) Areas measured by the AUSSE The survey instruments used in the AUSSE collect information on around 100 specific learning activities and conditions, along with information on individual demographics and educational contexts. The instruments contain items that map onto six student engagement scales: Academic Challenge - the extent to which expectations and assessments challenge students to learn; Active Learning - students' efforts to actively construct knowledge; Student and Staff Interactions - the level and nature of students' contact and interaction with teaching staff; Enriching Educational Experiences - students' participation in broadening educational activities; Supportive Learning Environment - students' feelings of support within the university community; and Work Integrated Learning - integration of employment-focused work experiences into study. The instruments also contain items that map onto seven outcome measures. Average overall grade is captured in a single item, and the other six are composite measures which reflect responses to several items: Higher-Order Thinking - participation in higher-order forms of thinking; General Learning Outcomes - development of general competencies; General Development Outcomes - development of general forms of individual and social development; Career Readiness - preparation for participation in the professional workforce; Average Overall Grade - average overall grade so far in course; Departure Intention - non-graduating students' intentions on not returning to study in the following year; and Overall Satisfaction - students' overall satisfaction with their educational experience.
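The composite outcome measures described above "reflect responses to several items". One common construction, assumed here for illustration rather than taken from ACER's published scoring, rescales each item to a 0-100 range and averages the items belonging to a scale:

```python
# Hypothetical composite-scale scoring: rescale each Likert item to
# 0-100, then average. The scale bounds and example responses are
# illustrative, not AUSSE's actual scoring rules.

def composite_score(item_responses, scale_min=1, scale_max=5):
    """Average several Likert items onto a 0-100 composite scale."""
    span = scale_max - scale_min
    rescaled = [100.0 * (r - scale_min) / span for r in item_responses]
    return sum(rescaled) / len(rescaled)

# e.g. three "Higher-Order Thinking" items answered 4, 5, 3 on a 1-5 scale
print(composite_score([4, 5, 3]))
# → 75.0
```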