
Educational Analytics: Group items tagged "measures"

George Bradford

Measuring Teacher Effectiveness - DataQualityCampaign.Org - 0 views

  •  
    Measuring Teacher Effectiveness: Significant State Data Capacity Is Required to Measure and Improve Teacher Effectiveness.
    States Increasingly Focus on Improving Teacher Effectiveness: There is significant activity at the local, state, and federal levels to measure and improve teacher effectiveness, with an unprecedented focus on the use of student achievement as a primary indicator of effectiveness.
    > 23 states require that teacher evaluations include evidence of student learning in the form of student growth and/or value-added data (NCTQ, 2011).
    > 17 states and DC have adopted legislation or regulations that specifically require student achievement and/or student growth to "significantly" inform or be the primary criterion in teacher evaluations (NCTQ, 2011).
    States Need Significant Data Capacity to Do This Work: These policy changes have significant data implications.
    > The linchpin of all these efforts is that states must reliably link students and teachers in ways that capture the complex connections that exist in schools.
    > If such data is to be used for high-stakes decisions, such as hiring, firing, and tenure, it must be accepted as valid, reliable, and fair.
    > Teacher effectiveness data can be leveraged to target professional development, inform staffing assignments, tailor classroom instruction, reflect on practice, support research, and otherwise support teachers.
    Federal Policies Are Accelerating State and Local Efforts: Federal policies increasingly support states' efforts to use student achievement data to measure teacher effectiveness.
    > Various competitive grant funds, including the Race to the Top grants and the Teacher Incentive Fund, require states to implement teacher and principal evaluation systems that take student data into account.
    > States applying for NCLB waivers, including the 11 that submitted requests in November 2011, must commit to implementing teacher and principal evaluation and support systems.
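A minimal sketch of the kind of student-growth calculation these policies point to, assuming only a hypothetical roster that links each student's prior and current test score to a teacher ID (the field names and numbers are invented; real value-added models adjust for many more factors than a raw score gain):

    # Hypothetical sketch: a simple mean-gain "growth" measure per teacher.
    # Real value-added models control for student and classroom covariates;
    # this only averages raw score gains for students linked to each teacher.
    from collections import defaultdict

    # Invented records: (student_id, teacher_id, prior_score, current_score)
    roster = [
        ("s1", "t1", 480, 510),
        ("s2", "t1", 520, 525),
        ("s3", "t2", 450, 495),
        ("s4", "t2", 500, 505),
    ]

    gains = defaultdict(list)
    for student_id, teacher_id, prior, current in roster:
        gains[teacher_id].append(current - prior)

    for teacher_id, g in gains.items():
        print(teacher_id, sum(g) / len(g))  # average score gain per teacher

The student-teacher link is the fragile part the excerpt highlights: if the roster rows are wrong, every downstream growth figure is wrong.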
George Bradford

Developing Student Learning Outcomes - Tool Box - Assessment - CSU, Chico - 0 views

  •  
    "Developing Student Learning Outcomes Student learning outcome (SLO) statements take the program learning goals and focus on how students can demonstrate that the goals are being met. In other words, SLOs answer the question: how can graduates from this program demonstrate they have the needed/stated knowledge, skills, and/or values. SLOs are clear, concise statements that describe how students can demonstrate their mastery of program learning goals. Each student learning outcome statement must be measurable. Measures are applied to student work and may include student assignments, work samples, tests, etc. measuring student ability/skill, knowledge, or attitude/value."
George Bradford

Office of Student Learning Assessment: Examples of Direct and Indirect Measures - Cleve... - 0 views

  •  
    "Examples of Direct and Indirect Measures Examples of Direct Measures of Student Learning"
George Bradford

Seeking Evidence of Impact: Opportunities and Needs (EDUCAUSE Review) | EDUCAUSE - 0 views

  • Conversations with CIOs and other senior IT administrators reveal a keen interest in the results of evaluation in teaching and learning to guide fiscal, policy, and strategic decision-making. Yet those same conversations reveal that this need is not being met.
  • gain a wider and shared understanding of “evidence” and “impact” in teaching and learning
  • establish a community of practice
  • provide professional-development opportunities
  • explore successful institutional and political contexts
  • establish evidence-based practice
  • The most important reason is that in the absence of data, anecdote can become the primary basis for decision-making. Rarely does that work out very well.
  • autocatalytic evaluation process—one that builds its own synergy.
  • We live by three principles: uncollected data cannot be analyzed; the numbers are helped by a brief and coherent summary; and good graphs beat tables every time.
  • Reports and testimonies from faculty and students (57%); measures of student and faculty satisfaction (50%); measures of student mastery (learning outcomes) (41%); changes in faculty teaching practice (35%); measures of student and faculty engagement (32%)
  • The survey results also indicate a need for support in undertaking impact-evaluation projects.
  • Knowing where to begin to measure the impact of technology-based innovations in teaching and learning; knowing which measurement and evaluation techniques are most appropriate; knowing the most effective way to analyze evidence
  • The challenge of persuasion is what ELI has been calling the last mile problem. There are two interrelated components to this issue: (1) influencing faculty members to improve instructional practices at the course level, and (2) providing evidence to help inform key strategic decisions at the institutional level.
  • Broadly summarized, our results reveal a disparity between the keen interest in research-based evaluation and the level of resources that are dedicated to it—prompting a grass-roots effort to support this work.
  •  
    The SEI program is working with the teaching and learning community to gather evidence of the impact of instructional innovations and current practices and to help evaluate the results. The calls for more accountability in higher education, the shrinking budgets that often force larger class sizes, and the pressures to increase degree-completion rates are all raising the stakes for colleges and universities today, especially with respect to the instructional enterprise. As resources shrink, teaching and learning is becoming the key point of accountability. The evaluation of instructional practice would thus seem to be an obvious response to such pressures, with institutions implementing systematic programs of evaluation in teaching and learning, especially of instructional innovations.
George Bradford

Discussions - Learning Analytics | Google Groups - 0 views

  •  
    Flare at Purdue in October (Kelvin Bentley, May 14): "Hi everyone. Can someone provide more information for the upcoming SoLAR FLARE event at Purdue in October? Thanks, Kelvin Bentley"
    EDUCAUSE Survey on Analytics - Looking for International Input (John Campbell, Purdue, May 11): "Colleagues, EDUCAUSE is soliciting input on analytics in higher education. They have currently sent email to their current members, but are looking for additional participation from the international community. We would greatly appreciate if you could complete the survey below. -- john... more »"
    CFP: #Influence12: Symposium & Workshop on Measuring Influence on Social Media (Anatoliy Gruzd, May 11): "Hi Everyone, If you are interested in Learning Analytics and Social Media, I invite you to submit a short position paper or poster to the Symposium & Workshop on Measuring Influence on Social Media. The event is set for September 28-29, 2012 in beautiful Halifax, Nova Scotia, Canada. All submissions are due *June 15, 2012*.... more »"
    LA beginnings: "Learning Analytics isn't really new, it is just getting more publicity now as a result of the buzz word name change. Institutions have been collecting data about students for a long time, but only a few people dealt with the data. Instructors kept gradebooks and many tracked student progress locally - by hand. What's new about Learning... more »"
George Bradford

Using Big Data to Predict Online Student Success | Inside Higher Ed - 0 views

  • Researchers have created a database that measures 33 variables for the online coursework of 640,000 students – a whopping 3 million course-level records.
  • Project participants: American Public University System; Community College System of Colorado; Rio Salado College; University of Hawaii System; University of Illinois-Springfield; University of Phoenix
  • “What the data seem to suggest, however, is that for students who seem to have a high propensity of dropping out of an online course-based program, the fewer courses they take initially, the better-off they are.”
  • Phil Ice, vice president of research and development for the American Public University System and the project’s lead investigator.
  • Predictive Analytics Reporting Framework
  • Rio Salado, for example, has used the database to create a student performance tracking system.
  • The two-year college, which is based in Arizona, has a particularly strong online presence for a community college – 43,000 of its students are enrolled in online programs. The new tracking system allows instructors to see a red, yellow or green light for each student’s performance. And students can see their own tracking lights.
  • It measures student engagement through their Web interactions, how often they look at textbooks and whether they respond to feedback from instructors, all in addition to their performance on coursework.
  • The data set has the potential to give institutions sophisticated information about small subsets of students – such as which academic programs are best suited for a 25-year-old male Latino with strength in mathematics
  •  
    New students are more likely to drop out of online colleges if they take full courseloads than if they enroll part time, according to findings from a research project that is challenging conventional wisdom about student success. But perhaps more important than that potentially game-changing nugget, researchers said, is how the project has chipped away at skepticism in higher education about the power of "big data." Researchers have created a database that measures 33 variables for the online coursework of 640,000 students - a whopping 3 million course-level records. While the work is far from complete, the variables help track student performance and retention across a broad range of demographic factors. The data can show what works at a specific type of institution, and what doesn't. That sort of predictive analytics has long been embraced by corporations, but not so much by the academy. The ongoing data-mining effort, which was kicked off last year with a $1 million grant from the Bill and Melinda Gates Foundation, is being led by WCET, the WICHE Cooperative for Educational Technologies.
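As a rough illustration of how course-level records like these could feed a red/yellow/green indicator of the kind Rio Salado describes above, here is a toy sketch; the features, weights, and thresholds are invented for illustration and are not the PAR framework's actual model, which would be fit from historical records rather than hard-coded:

    # Toy risk score from a few invented course-level features. A real system
    # would estimate weights from historical data (e.g., a regression over
    # the 33 tracked variables) instead of hard-coding them.
    def risk_score(credits_attempted, logins_per_week, assignments_submitted_pct):
        score = 0.0
        score += 0.15 * max(credits_attempted - 6, 0)    # heavy initial course loads add risk
        score += 0.10 * max(3 - logins_per_week, 0)      # low engagement adds risk
        score += 0.01 * (100 - assignments_submitted_pct)
        return score

    def traffic_light(score):
        if score < 0.5:
            return "green"
        if score < 1.0:
            return "yellow"
        return "red"

    print(traffic_light(risk_score(12, 1, 60)))  # heavy load, low engagement -> "red"
    print(traffic_light(risk_score(6, 4, 95)))   # lighter load, engaged -> "green"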
George Bradford

AUSSE | ACER - 0 views

  •  
    Australasian Survey of Student Engagement (AUSSE)
    Areas measured by the AUSSE: The survey instruments used in the AUSSE collect information on around 100 specific learning activities and conditions, along with information on individual demographics and educational contexts. The instruments contain items that map onto six student engagement scales:
    - Academic Challenge - the extent to which expectations and assessments challenge students to learn;
    - Active Learning - students' efforts to actively construct knowledge;
    - Student and Staff Interactions - the level and nature of students' contact and interaction with teaching staff;
    - Enriching Educational Experiences - students' participation in broadening educational activities;
    - Supportive Learning Environment - students' feelings of support within the university community; and
    - Work Integrated Learning - integration of employment-focused work experiences into study.
    The instruments also contain items that map onto seven outcome measures. Average overall grade is captured in a single item, and the other six are composite measures which reflect responses to several items:
    - Higher-Order Thinking - participation in higher-order forms of thinking;
    - General Learning Outcomes - development of general competencies;
    - General Development Outcomes - development of general forms of individual and social development;
    - Career Readiness - preparation for participation in the professional workforce;
    - Average Overall Grade - average overall grade so far in course;
    - Departure Intention - non-graduating students' intentions on not returning to study in the following year; and
    - Overall Satisfaction - students' overall satisfaction with their educational experience.
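A minimal sketch of how a composite scale of this kind is typically scored, assuming each scale is simply the mean of its item responses rescaled to 0-100 (the item values below are invented, and the AUSSE's actual scoring rules may differ):

    # Hypothetical: score one composite scale as the rescaled mean of its items,
    # assuming items are answered on a 1-4 frequency scale.
    def scale_score(responses, low=1, high=4):
        mean = sum(responses) / len(responses)
        return 100 * (mean - low) / (high - low)  # rescale to the 0-100 range

    active_learning_items = [3, 4, 2, 3]  # invented responses for one student
    print(round(scale_score(active_learning_items), 1))  # 66.7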
George Bradford

American Statistical Association seeks to usher in new era of statistical significance - 0 views

  •  
    "The American Statistical Association seeks to embrace science's inherent complexity and push for more data transparency by rejecting a common, oversimplified measure of statistical significance. March 15, 2016 By Colleen Flaherty Is the tyrannical reign of the P value finally ending (if it was ever tyrannical at all)? An unprecedented statement from the American Statistical Association seeks to usher in a "post-P
George Bradford

Times Higher Education - Satisfaction and its discontents - 0 views

  •  
    Satisfaction and its discontents (8 March 2012)
    The National Student Survey puts pressure on lecturers to provide 'enhanced' experiences. But, argues Frank Furedi, the results do not measure educational quality and the process infantilises students and corrodes academic integrity.
    One of the striking features of a highly centralised system of higher education, such as that of the UK, is that the introduction of new targets and modifications to the quality assurance framework can have a dramatic impact in a very short space of time. When the National Student Survey was introduced in 2005, few colleagues imagined that, just several years down the road, finessing and managing its implementation would require the employment of an entirely new group of quality-assurance operatives. At the time, the NSS was seen by many as a relatively pointless public-relations exercise that would have only a minimal effect on academics' lives. It is unlikely that even its advocates would have expected the NSS to acquire a life of its own and become one of the most powerful influences on the form and nature of the work done in universities.
George Bradford

Dr Ruth Deakin Crick - Graduate School of Education - 0 views

  •  
    First, the ongoing exploration of the reliability and validity of the psychometric assessment instrument designed to measure and stimulate change in learning power, for which I was one of three originators between 2000 and 2002. To date I have been able to collect large data sets (n > 50,000) and have published reliability and validity statistics in four peer-reviewed journal articles.
    Second, the application of the concept and assessment of learning power in pedagogy in school, community and corporate sectors, and in particular its contribution to personalisation of learning through authentic enquiry.
    Third, the contribution of learning power and enquiry to what we know about complexity in education, particularly through the development of systems learning and leadership as a vehicle for organisational transformation.
    Finally, the application of learning power assessment strategies to the emerging field of learning analytics and agent-based modelling.
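For readers unfamiliar with the reliability statistics mentioned here, the most common one for a multi-item scale is Cronbach's alpha; the sketch below applies its standard formula to an invented response matrix (this is not Deakin Crick's instrument or data):

    # Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
    import statistics

    # Invented matrix: rows are respondents, columns are items of one scale.
    items = [
        [4, 3, 4, 5],
        [3, 3, 4, 4],
        [5, 4, 5, 5],
        [2, 2, 3, 3],
        [4, 4, 4, 5],
    ]

    k = len(items[0])
    item_vars = [statistics.variance(col) for col in zip(*items)]
    total_var = statistics.variance([sum(row) for row in items])
    alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
    print(round(alpha, 3))  # values near 0.9 or above suggest high internal consistency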
George Bradford

[!!!] Penetrating the Fog: Analytics in Learning and Education (EDUCAUSE Review) | EDUC... - 0 views

  • Continued growth in the amount of data creates an environment in which new or novel approaches are required to understand the patterns of value that exist within the data.
  • learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.
  • Academic analytics, in contrast, is the application of business intelligence in education and emphasizes analytics at institutional, regional, and international levels.
  • Course-level:
  • Educational data-mining
  • Intelligent curriculum
  • Adaptive content
  • the University of Maryland, Baltimore County (UMBC) Check My Activity tool allows learners to “compare their own activity . . . against an anonymous summary of their course peers.” (A rough sketch of this kind of comparison appears after this entry.)
  • Mobile devices
  • social media monitoring tools (e.g., Radian6)
  • Analytics in education must be transformative, altering existing teaching, learning, and assessment processes, academic work, and administration.
    • George Bradford: See Bradford - Brief vision of the semantic web as being used to support future learning: http://heybradfords.com/moonlight/research-resources/SemWeb_EducatorsVision
    • George Bradford: See Peter Goodyear's work on the Ecology of Sustainable e-Learning in Education.
  • How “real time” should analytics be in classroom settings?
  • Adaptive learning
  • EDUCAUSE Review, vol. 46, no. 5 (September/October 2011)
  • Penetrating the Fog: Analytics in Learning and Education
  •  
    Attempts to imagine the future of education often emphasize new technologies-ubiquitous computing devices, flexible classroom designs, and innovative visual displays. But the most dramatic factor shaping the future of higher education is something that we can't actually touch or see: big data and analytics. Basing decisions on data and evidence seems stunningly obvious, and indeed, research indicates that data-driven decision-making improves organizational output and productivity.1 For many leaders in higher education, however, experience and "gut instinct" have a stronger pull.
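The UMBC Check My Activity idea noted above (letting a learner compare their own activity against an anonymous summary of course peers) can be sketched roughly as follows; the event counts and field names are invented and this is not UMBC's implementation:

    # Toy sketch: compare one student's LMS activity count with an anonymous
    # summary (the median) of their course peers.
    import statistics

    # Invented weekly LMS event counts keyed by an anonymous student id.
    course_activity = {"a": 42, "b": 17, "c": 55, "d": 8, "e": 31, "f": 23}

    def compare_to_peers(student_id, activity):
        own = activity[student_id]
        peers = [count for sid, count in activity.items() if sid != student_id]
        peer_median = statistics.median(peers)
        return {"own_events": own, "peer_median": peer_median, "above_median": own > peer_median}

    print(compare_to_peers("b", course_activity))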