
Home/ Educational Analytics/ Group items tagged times


George Bradford

Times Higher Education - Satisfaction and its discontents - 0 views

  •  
    Satisfaction and its discontents 8 March 2012 The National Student Survey puts pressure on lecturers to provide 'enhanced' experiences. But, argues Frank Furedi, the results do not measure educational quality and the process infantilises students and corrodes academic integrity One of the striking features of a highly centralised system of higher education, such as that of the UK, is that the introduction of new targets and modifications to the quality assurance framework can have a dramatic impact in a very short space of time. When the National Student Survey was introduced in 2005, few colleagues imagined that, just several years down the road, finessing and managing its implementation would require the employment of an entirely new group of quality-assurance operatives. At the time, the NSS was seen by many as a relatively pointless public-relations exercise that would have only a minimal effect on academics' lives. It is unlikely that even its advocates would have expected the NSS to acquire a life of its own and become one of the most powerful influences on the form and nature of the work done in universities.
George Bradford

Open Research Online - Learning dispositions and transferable competencies: pedagogy, m... - 0 views

  •  
    Theoretical and empirical evidence in the learning sciences substantiates the view that deep engagement in learning is a function of a complex combination of learners' identities, dispositions, values, attitudes and skills. When these are fragile, learners struggle to achieve their potential in conventional assessments, and critically, are not prepared for the novelty and complexity of the challenges they will meet in the workplace, and the many other spheres of life which require personal qualities such as resilience, critical thinking and collaboration skills. To date, the learning analytics research and development communities have not addressed how these complex concepts can be modelled and analysed, and how more traditional social science data analysis can support and be enhanced by learning analytics. We report progress in the design and implementation of learning analytics based on a research validated multidimensional construct termed "learning power". We describe, for the first time, a learning analytics infrastructure for gathering data at scale, managing stakeholder permissions, the range of analytics that it supports from real time summaries to exploratory research, and a particular visual analytic which has been shown to have demonstrable impact on learners. We conclude by summarising the ongoing research and development programme and identifying the challenges of integrating traditional social science research, with learning analytics and modelling.
George Bradford

About | SNAPP - Social Networks Adapting Pedagogical Practice - 3 views

  •  
    "The Social Networks Adapting Pedagogical Practice (SNAPP) tool performs real-time social network analysis and visualization of discussion forum activity within popular commercial and open source Learning Management Systems (LMS). SNAPP essentially serves as a diagnostic instrument, allowing teaching staff to evaluate student behavioral patterns against learning activity design objectives and intervene as required a timely manner. Valuable interaction data is stored within a discussion forum but from the default threaded display of messages it is difficult to determine the level and direction of activity between participants. SNAPP infers relationship ties from the post-reply data and renders a social network diagram below the forum thread. The social network visualization can be filtered based upon user activity and social network data can be exported for further analysis in NetDraw. SNAPP integrates seamlessly with a variety of Learning Management Systems (Blackboard, Moodle and Desire2Learn) and must be triggered while a forum thread is displayed in a Web browser."
George Bradford

50 most stunning examples of data visualization and infographics | Richworks - 0 views

  • The terms "data visualization" and "infographics" are often used interchangeably: the former is the study of the visual representation of data, while the latter is the representation itself.
  • 42) Geological Time Spiral
  • 40) Map of online communities
  • 44) Global distribution of water?
  • 43) 1 hour in front of the TV
  • 36) Evolution of Storage
  • 33) The Life of a web article
  •  
    50 MOST STUNNING EXAMPLES OF DATA VISUALIZATION AND INFOGRAPHICS
    Posted by Richie on Thursday, April 15, 2010
    "A picture is worth a thousand words" - if I had a penny for every time I heard that! There is so much data in the world today that it has become impossible for us to analyze it all with patience. Data, as we perceive it, need not be boring, bland and cumbersome to remember. Making complex things seem simple is creativity, and using pictures to represent data has been an age-old method of analyzing data in a fun way. From navigating the web in an entirely new dimension to understanding how the human brain works, from peeking into how Google has evolved to analyzing the inner workings of the geeky mind, infographics have completely changed the way we view content and visualize data.
George Bradford

Open Research Online - Learning analytics to identify exploratory dialogue within synch... - 0 views

  •  
    While generic web analytics tend to focus on easily harvested quantitative data, Learning Analytics will often seek qualitative understanding of the context and meaning of this information. This is critical in the case of dialogue, which may be employed to share knowledge and jointly construct understandings, but which also involves many superficial exchanges. Previous studies have validated a particular pattern of "exploratory dialogue" in learning environments to signify sharing, challenge, evaluation and careful consideration by participants. This study investigates the use of sociocultural discourse analysis to analyse synchronous text chat during an online conference. Key words and phrases indicative of exploratory dialogue were identified in these exchanges, and peaks of exploratory dialogue were associated with periods set aside for discussion and keynote speakers. Fewer individuals posted at these times, but meaningful discussion outweighed trivial exchanges. If further analysis confirms the validity of these markers as learning analytics, they could be used by recommendation engines to support learners and teachers in locating dialogue exchanges where deeper learning appears to be taking place.
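The marker-detection step the abstract describes can be illustrated with a toy sketch. The phrases below are illustrative stand-ins, not the validated indicator set from the study:

```python
# Illustrative keyword-based detection of exploratory dialogue in chat
# transcripts. MARKERS is an assumed example list, not the study's set.
MARKERS = ["because", "i think", "what if", "do you agree", "for example"]

def exploratory_score(message):
    """Count exploratory-dialogue markers in one chat message."""
    text = message.lower()
    return sum(text.count(m) for m in MARKERS)

chat = [
    "I think this works because the data is aggregated hourly.",
    "lol",
    "What if we sampled per session instead? Do you agree?",
]
scores = [exploratory_score(m) for m in chat]
print(scores)  # higher scores flag candidate exploratory exchanges
```

Aggregating such scores over time windows is what would reveal the "peaks of exploratory dialogue" the study associates with discussion periods and keynotes.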
George Bradford

Discussions - Learning Analytics | Google Groups - 0 views

  •  
    Flare at Purdue in October
    Hi everyone. Can someone provide more information for the upcoming SoLAR FLARE event at Purdue in October? Thanks, Kelvin Bentley
    By Kelvin Bentley - May 14

    EDUCAUSE Survey on Analytics - Looking for International Input
    Colleagues, EDUCAUSE is soliciting input on analytics in higher education. They have sent email to their current members, but are looking for additional participation from the international community. We would greatly appreciate it if you could complete the survey below. -- john... more »
    By John Campbell - Purdue - May 11

    CFP: #Influence12: Symposium & Workshop on Measuring Influence on Social Media
    Hi Everyone, If you are interested in Learning Analytics and Social Media, I invite you to submit a short position paper or poster to the Symposium & Workshop on Measuring Influence on Social Media. The event is set for September 28-29, 2012 in beautiful Halifax, Nova Scotia, Canada. All submissions are due *June 15, 2012*.... more »
    By Anatoliy Gruzd - May 11

    LA beginnings
    Learning Analytics isn't really new; it is just getting more publicity now as a result of the buzzword name change. Institutions have been collecting data about students for a long time, but only a few people dealt with the data. Instructors kept gradebooks and many tracked student progress locally, by hand. What's new about Learning... more »
George Bradford

http://www.sinclair.edu/support/success/ea/ - 0 views

  •  
    Early Alert Program
    The Early Alert classroom assistance program at Sinclair Community College is an intervention program that teams faculty, counselors, and advisors together to promote the success of students facing challenges.
    An Overview:
    Early Alert is an intervention program that allows faculty to notify advisors/counselors of issues that may affect a student's success. It is a simple way of helping students in difficulty find the help they need, while taking very little time. Web-based Early Alert notifications are an easy way to promote the college's retention efforts and the success of students. Early Alert is currently utilized in all DEV courses, English 111, select Math courses, and SCC 101 courses.
George Bradford

Assessment and Analytics in Institutional Transformation (EDUCAUSE Review) | EDUCAUSE - 0 views

  • At the University of Maryland, Baltimore County (UMBC), we believe that process is an important factor in creating cultural change. We thus approach transformational initiatives by using the same scholarly rigor that we expect of any researcher. This involves (1) reviewing the literature and prior work in the area, (2) identifying critical factors and variables, (3) collecting data associated with these critical factors, (4) using rigorous statistical analysis and modeling of the question and factors, (5) developing hypotheses to influence the critical factors, and (6) collecting data based on the changes and assessing the results.
  • among predominantly white higher education institutions in the United States, UMBC has become the leading producer of African-American bachelor’s degree recipients who go on to earn Ph.D.’s in STEM fields. The program has been recognized by the National Science Foundation and the National Academies as a national model.
  • UMBC has recently begun a major effort focused on the success of transfer students in STEM majors. This effort, with pilot funding from the Bill and Melinda Gates Foundation, will look at how universities can partner with community colleges to prepare their graduates to successfully complete a bachelor’s degree in a STEM field.
  • Too often, IT organizations try to help by providing an analytics “dashboard” designed by a vendor that doesn’t know the institution. As a result, the dashboard indicators don’t focus on those key factors most needed at the institution and quickly become window-dressing.
  • IT organizations can support assessment by showing how data in separate systems can become very useful when captured and correlated. For example, UMBC has spent considerable effort to develop a reporting system based on our learning management system (LMS) data. This effort, led from within the IT organization, has helped the institution find new insights into the way faculty and students are using the LMS and has helped us improve the services we offer. We are now working to integrate this data into our institutional data warehouse and are leveraging access to important demographic data to better assess student risk factors and develop interventions.
  • the purpose of learning analytics is “to observe and understand learning behaviors in order to enable appropriate interventions.
  • the 1st International Conference on Learning Analytics and Knowledge (LAK) was held in Banff, Alberta, Canada, in early 2011 (https://tekri.athabascau.ca/analytics/)
  • At UMBC, we are using analytics and assessment to shine a light on students’ performance and behavior and to support teaching effectiveness. What has made the use of analytics and assessment particularly effective on our campus has been the insistence that all groups—faculty, staff, and students—take ownership of the challenge involving student performance and persistence.
  •  
    Assessment and analytics, supported by information technology, can change institutional culture and drive the transformation in student retention, graduation, and success. U.S. higher education has an extraordinary record of accomplishment in preparing students for leadership, in serving as a wellspring of research and creative endeavor, and in providing public service. Despite this success, colleges and universities are facing an unprecedented set of challenges. To maintain the country's global preeminence, those of us in higher education are being called on to expand the number of students we educate, increase the proportion of students in science, technology, engineering, and mathematics (STEM), and address the pervasive and long-standing underrepresentation of minorities who earn college degrees, all at a time when budgets are being reduced and questions about institutional efficiency and effectiveness are being raised.
George Bradford

Using Big Data to Predict Online Student Success | Inside Higher Ed - 0 views

  • Researchers have created a database that measures 33 variables for the online coursework of 640,000 students – a whopping 3 million course-level records.
  • Project participants: American Public University System, Community College System of Colorado, Rio Salado College, University of Hawaii System, University of Illinois-Springfield, University of Phoenix
  • “What the data seem to suggest, however, is that for students who seem to have a high propensity of dropping out of an online course-based program, the fewer courses they take initially, the better-off they are.”
  • Phil Ice, vice president of research and development for the American Public University System and the project’s lead investigator.
  • Predictive Analytics Reporting Framework
  • Rio Salado, for example, has used the database to create a student performance tracking system.
  • The two-year college, which is based in Arizona, has a particularly strong online presence for a community college – 43,000 of its students are enrolled in online programs. The new tracking system allows instructors to see a red, yellow or green light for each student’s performance. And students can see their own tracking lights.
  • It measures student engagement through their Web interactions, how often they look at textbooks and whether they respond to feedback from instructors, all in addition to their performance on coursework.
  • The data set has the potential to give institutions sophisticated information about small subsets of students, such as which academic programs are best suited for a 25-year-old male Latino with strength in mathematics.
  •  
    New students are more likely to drop out of online colleges if they take full courseloads than if they enroll part time, according to findings from a research project that is challenging conventional wisdom about student success. But perhaps more important than that potentially game-changing nugget, researchers said, is how the project has chipped away at skepticism in higher education about the power of "big data." Researchers have created a database that measures 33 variables for the online coursework of 640,000 students, a whopping 3 million course-level records. While the work is far from complete, the variables help track student performance and retention across a broad range of demographic factors. The data can show what works at a specific type of institution, and what doesn't. That sort of predictive analytics has long been embraced by corporations, but not so much by the academy. The ongoing data-mining effort, which was kicked off last year with a $1 million grant from the Bill and Melinda Gates Foundation, is being led by WCET, the WICHE Cooperative for Educational Technologies.
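The course-load finding above amounts to grouping dropout rates by initial enrollment intensity. A toy sketch with fabricated records; the Predictive Analytics Reporting framework's real models draw on 33 variables across millions of course-level rows:

```python
# Fabricated data illustrating dropout rate grouped by first-term course
# load. This is not the PAR framework's methodology, only the shape of
# the comparison the finding describes.
from collections import defaultdict

records = [  # (initial_course_count, dropped_out)
    (1, False), (1, False), (1, True),
    (4, True), (4, True), (4, False), (4, True),
]

def dropout_rate_by_load(records):
    totals = defaultdict(lambda: [0, 0])  # load -> [dropouts, students]
    for load, dropped in records:
        totals[load][0] += int(dropped)
        totals[load][1] += 1
    return {load: d / n for load, (d, n) in totals.items()}

rates = dropout_rate_by_load(records)
print(rates)  # in this toy data, heavier initial loads drop out more often
```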
George Bradford

NSSE Home - 0 views

  •  
    National Survey of Student Engagement What is student engagement? Student engagement represents two critical features of collegiate quality. The first is the amount of time and effort students put into their studies and other educationally purposeful activities. The second is how the institution deploys its resources and organizes the curriculum and other learning opportunities to get students to participate in activities that decades of research studies show are linked to student learning. What does NSSE do? Through its student survey, The College Student Report, NSSE annually collects information at hundreds of four-year colleges and universities about student participation in programs and activities that institutions provide for their learning and personal development. The results provide an estimate of how undergraduates spend their time and what they gain from attending college. NSSE provides participating institutions a variety of reports that compare their students' responses with those of students at self-selected groups of comparison institutions. Comparisons are available for individual survey questions and the five NSSE Benchmarks of Effective Educational Practice. Each November, NSSE also publishes its Annual Results, which reports topical research and trends in student engagement results. NSSE researchers also present and publish research findings throughout the year.
George Bradford

Learning Dispositions and Transferable Competencies: Pedagogy, Modelling, and Learning ... - 0 views

  •  
    Simon Buckingham Shum and Ruth Deakin Crick, 2012 (in review). Theoretical and empirical evidence in the learning sciences substantiates the view that deep engagement in learning is a function of a combination of learners' dispositions, values, attitudes and skills. When these are fragile, learners struggle to achieve their potential in conventional assessments, and critically, are not prepared for the novelty and complexity of the challenges they will meet in the workplace, and the many other spheres of life which require personal qualities such as resilience, critical thinking and collaboration skills. To date, the learning analytics research and development communities have not addressed how these complex concepts can be modelled and analysed. We report progress in the design and implementation of learning analytics based on an empirically validated multidimensional construct termed "learning power". We describe a learning analytics infrastructure for gathering data at scale, managing stakeholder permissions, the range of analytics that it supports from real-time summaries to exploratory research, and a particular visual analytic which has been shown to have demonstrable impact on learners. We conclude by summarising the ongoing research and development programme.
George Bradford

SpringerLink - Abstract - Dr. Fox Rocks: Using Data-mining Techniques to Examine Studen... - 0 views

  •  
    Abstract Few traditions in higher education evoke more controversy, ambivalence, criticism, and, at the same time, support than student evaluation of instruction (SEI). Ostensibly, results from these end-of-course survey instruments serve two main functions: they provide instructors with formative input for improving their teaching, and they serve as the basis for summative profiles of professors' effectiveness through the eyes of their students. In the academy, instructor evaluations also can play out in the high-stakes environments of tenure, promotion, and merit salary increases, making this information particularly important to the professional lives of faculty members. At the research level, the volume of the literature for student ratings impresses even the most casual observer with well over 2,000 studies referenced in the Education Resources Information Center (ERIC) alone (Centra, 2003) and an untold number of additional studies published in educational, psychological, psychometric, and discipline-related journals. There have been numerous attempts at summarizing this work (Algozzine et al., 2004; Gump, 2007; Marsh & Roche, 1997; Pounder, 2007; Wachtel, 1998). Student ratings gained such notoriety that in November 1997 the American Psychologist devoted an entire issue to the topic (Greenwald, 1997). The issue included student ratings articles focusing on stability and reliability, validity, dimensionality, usefulness for improving teaching and learning, and sensitivity to biasing factors, such as the Dr. Fox phenomenon that describes eliciting high student ratings with strategies that reflect little or no relationship to effective teaching practice (Ware & Williams, 1975; Williams & Ware, 1976, 1977).
George Bradford

Seeking Evidence of Impact: Opportunities and Needs (EDUCAUSE Review) | EDUCAUSE - 0 views

  • Conversations with CIOs and other senior IT administrators reveal a keen interest in the results of evaluation in teaching and learning to guide fiscal, policy, and strategic decision-making. Yet those same conversations reveal that this need is not being met.
  • gain a wider and shared understanding of “evidence” and “impact” in teaching and learning
  • establish a community of practice
  • provide professional-development opportunities
  • explore successful institutional and political contexts
  • establish evidence-based practice
  • The most important reason is that in the absence of data, anecdote can become the primary basis for decision-making. Rarely does that work out very well.
  • autocatalytic evaluation process—one that builds its own synergy.
  • We live by three principles: uncollected data cannot be analyzed; the numbers are helped by a brief and coherent summary; and good graphs beat tables every time.
  • Reports and testimonies from faculty and students (57%); measures of student and faculty satisfaction (50%); measures of student mastery (learning outcomes) (41%); changes in faculty teaching practice (35%); measures of student and faculty engagement (32%)
  • The survey results also indicate a need for support in undertaking impact-evaluation projects.
  • Knowing where to begin to measure the impact of technology-based innovations in teaching and learning; knowing which measurement and evaluation techniques are most appropriate; knowing the most effective way to analyze evidence
  • The challenge of persuasion is what ELI has been calling the last mile problem. There are two interrelated components to this issue: (1) influencing faculty members to improve instructional practices at the course level, and (2) providing evidence to help inform key strategic decisions at the institutional level.
  • Broadly summarized, our results reveal a disparity between the keen interest in research-based evaluation and the level of resources that are dedicated to it—prompting a grass-roots effort to support this work.
  •  
    The SEI program is working with the teaching and learning community to gather evidence of the impact of instructional innovations and current practices and to help evaluate the results. The calls for more accountability in higher education, the shrinking budgets that often force larger class sizes, and the pressures to increase degree-completion rates are all raising the stakes for colleges and universities today, especially with respect to the instructional enterprise. As resources shrink, teaching and learning is becoming the key point of accountability. The evaluation of instructional practice would thus seem to be an obvious response to such pressures, with institutions implementing systematic programs of evaluation in teaching and learning, especially of instructional innovations.
George Bradford

[!!!] Penetrating the Fog: Analytics in Learning and Education (EDUCAUSE Review) | EDUC... - 0 views

  • Continued growth in the amount of data creates an environment in which new or novel approaches are required to understand the patterns of value that exist within the data.
  • learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.
  • Academic analytics, in contrast, is the application of business intelligence in education and emphasizes analytics at institutional, regional, and international levels.
  • Course-level:
  • Educational data-mining
  • Intelligent curriculum
  • Adaptive content
  • the University of Maryland, Baltimore County (UMBC) Check My Activity tool allows learners to “compare their own activity . . . against an anonymous summary of their course peers.”
  • Mobile devices
  • social media monitoring tools (e.g., Radian6)
  • Analytics in education must be transformative, altering existing teaching, learning, and assessment processes, academic work, and administration.
    • George Bradford
       
      See Bradford - Brief vision of the semantic web as being used to support future learning: http://heybradfords.com/moonlight/research-resources/SemWeb_EducatorsVision 
    • George Bradford
       
      See Peter Goodyear's work on the Ecology of Sustainable e-Learning in Education.
  • How “real time” should analytics be in classroom settings?
  • Adaptive learning
  • EDUCAUSE Review, vol. 46, no. 5 (September/October 2011)
  • Penetrating the Fog: Analytics in Learning and Education
  •  
    Attempts to imagine the future of education often emphasize new technologies: ubiquitous computing devices, flexible classroom designs, and innovative visual displays. But the most dramatic factor shaping the future of higher education is something that we can't actually touch or see: big data and analytics. Basing decisions on data and evidence seems stunningly obvious, and indeed, research indicates that data-driven decision-making improves organizational output and productivity.1 For many leaders in higher education, however, experience and "gut instinct" have a stronger pull.
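Among the annotations above, UMBC's Check My Activity tool compares a learner's own LMS activity against an anonymized peer summary. A minimal sketch of that comparison, with fabricated numbers and assumed field meanings; the real tool works against live LMS event data:

```python
# Toy "Check My Activity"-style comparison: one learner's weekly LMS
# logins versus an anonymized summary of course peers. Data is invented.
from statistics import mean

peer_logins = [12, 30, 25, 18, 40, 22]  # weekly logins, anonymized peers
my_logins = 15

peer_avg = mean(peer_logins)
relative = my_logins / peer_avg
print(f"Peer average: {peer_avg:.1f} logins/week; you are at {relative:.0%}")
```

Surfacing only the anonymous aggregate, rather than individual peers' data, is the privacy-preserving design choice the tool's description implies.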
George Bradford

What Google Learned From Its Quest to Build the Perfect Team - The New York Times - 1 views

  •  
    "What Google Learned From Its Quest to Build the Perfect Team"