
Educational Analytics / Group items tagged university


George Bradford

Analytics in Higher Education: Establishing a Common Language | EDUCAUSE - 0 views

  •  
    Title: Analytics in Higher Education: Establishing a Common Language (ID: ELI3026)
    Author(s): Angela van Barneveld (Purdue University), Kimberly Arnold (Purdue University), and John P. Campbell (Purdue University)
    Topics: Academic Analytics, Action Analytics, Analytics, Business Analytics, Decision Support Systems, Learning Analytics, Predictive Analytics, Scholarship of Teaching and Learning
    Origin: ELI White Papers, EDUCAUSE Learning Initiative (ELI) (01/24/2012)
    Type: Articles, Briefs, Papers, and Reports
George Bradford

Using Big Data to Predict Online Student Success | Inside Higher Ed - 0 views

  • Researchers have created a database that measures 33 variables for the online coursework of 640,000 students – a whopping 3 million course-level records.
  • Project Participants American Public University System Community College System of Colorado Rio Salado College University of Hawaii System University of Illinois-Springfield University of Phoenix
  • “What the data seem to suggest, however, is that for students who seem to have a high propensity of dropping out of an online course-based program, the fewer courses they take initially, the better-off they are.”
  • ...6 more annotations...
  • Phil Ice, vice president of research and development for the American Public University System and the project’s lead investigator.
  • Predictive Analytics Reporting Framework
  • Rio Salado, for example, has used the database to create a student performance tracking system.
  • The two-year college, which is based in Arizona, has a particularly strong online presence for a community college – 43,000 of its students are enrolled in online programs. The new tracking system allows instructors to see a red, yellow or green light for each student’s performance. And students can see their own tracking lights.
  • It measures student engagement through their Web interactions, how often they look at textbooks and whether they respond to feedback from instructors, all in addition to their performance on coursework.
  • The data set has the potential to give institutions sophisticated information about small subsets of students – such as which academic programs are best suited for a 25-year-old male Latino with strength in mathematics
  •  
    New students are more likely to drop out of online colleges if they take full courseloads than if they enroll part time, according to findings from a research project that is challenging conventional wisdom about student success. But perhaps more important than that potentially game-changing nugget, researchers said, is how the project has chipped away at skepticism in higher education about the power of "big data." Researchers have created a database that measures 33 variables for the online coursework of 640,000 students - a whopping 3 million course-level records. While the work is far from complete, the variables help track student performance and retention across a broad range of demographic factors. The data can show what works at a specific type of institution, and what doesn't. That sort of predictive analytics has long been embraced by corporations, but not so much by the academy. The ongoing data-mining effort, which was kicked off last year with a $1 million grant from the Bill and Melinda Gates Foundation, is being led by WCET, the WICHE Cooperative for Educational Technologies.
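The red/yellow/green "tracking lights" Rio Salado gives instructors and students could be sketched as a simple rule-based scorer over the engagement signals the article mentions (logins, late assignments, response to feedback, coursework performance). The variables, weights, and thresholds below are illustrative assumptions, not the actual Predictive Analytics Reporting framework model:

```python
# Minimal sketch of a rule-based student "tracking light".
# All signals and thresholds here are hypothetical; the real PAR
# framework uses 33 variables and statistical modeling.

def tracking_light(logins_per_week, late_assignments,
                   responded_to_feedback, grade_pct):
    """Return 'red', 'yellow', or 'green' for a student's current standing."""
    risk = 0
    if logins_per_week < 2:        # infrequent logins often precede withdrawal
        risk += 2
    if late_assignments > 1:       # late work is a strong warning sign
        risk += 2
    if not responded_to_feedback:  # ignoring instructor feedback
        risk += 1
    if grade_pct < 70:             # struggling on the coursework itself
        risk += 2
    if risk >= 4:
        return "red"
    if risk >= 2:
        return "yellow"
    return "green"

# Example: an engaged, on-time student with a 90% average shows green.
print(tracking_light(5, 0, True, 90))   # green
```

A production system would replace the hand-set weights with coefficients fit to historical course-level records, which is exactly what the 640,000-student database enables.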
George Bradford

University builds 'course recommendation engine' to steer students toward completion | ... - 0 views

  •  
    March 16, 2012 | By Steve Kolowich
    Completing assignments and sitting through exams can be stressful. But when it comes to being graded, the waiting is often the hardest part. This is perhaps most true at the end of a semester, as students wait for their instructors to reduce months of work into a series of letter grades that will stay on the books forever. But at Austin Peay State University, students do not have to wait for the end of a semester to learn their grade averages. Thanks to a new technology, pioneered by the university's provost, they do not even have to wait for the semester to start.
George Bradford

IBM Solidifies Academic Analytics Investments - Datanami - 0 views

  •  
    December 22, 2011 | Datanami Staff
    As its own detailed report in conjunction with MIT Sloan made clear, IBM is keenly aware of the dramatic talent shortfall that could keep the future of big data analytics in check. Accordingly, the company is stepping in to boost analytics-driven programs at universities around the world. A report out of India this week indicated that Big Blue is firming up its investments at a number of academic institutions worldwide in the hopes of readying a new generation of analytics graduates. This effort springs from the company's Academic Initiative, the IBM-led effort to partner with universities to extend institutions' capabilities to provide functional IT training and research opportunities.
George Bradford

Assessment and Analytics in Institutional Transformation (EDUCAUSE Review) | EDUCAUSE - 0 views

  • At the University of Maryland, Baltimore County (UMBC), we believe that process is an important factor in creating cultural change. We thus approach transformational initiatives by using the same scholarly rigor that we expect of any researcher. This involves (1) reviewing the literature and prior work in the area, (2) identifying critical factors and variables, (3) collecting data associated with these critical factors, (4) using rigorous statistical analysis and modeling of the question and factors, (5) developing hypotheses to influence the critical factors, and (6) collecting data based on the changes and assessing the results.
  • among predominantly white higher education institutions in the United States, UMBC has become the leading producer of African-American bachelor’s degree recipients who go on to earn Ph.D.’s in STEM fields. The program has been recognized by the National Science Foundation and the National Academies as a national model.
  • UMBC has recently begun a major effort focused on the success of transfer students in STEM majors. This effort, with pilot funding from the Bill and Melinda Gates Foundation, will look at how universities can partner with community colleges to prepare their graduates to successfully complete a bachelor’s degree in a STEM field.
  • ...5 more annotations...
  • Too often, IT organizations try to help by providing an analytics “dashboard” designed by a vendor that doesn’t know the institution. As a result, the dashboard indicators don’t focus on those key factors most needed at the institution and quickly become window-dressing.
  • IT organizations can support assessment by showing how data in separate systems can become very useful when captured and correlated. For example, UMBC has spent considerable effort to develop a reporting system based on our learning management system (LMS) data. This effort, led from within the IT organization, has helped the institution find new insights into the way faculty and students are using the LMS and has helped us improve the services we offer. We are now working to integrate this data into our institutional data warehouse and are leveraging access to important demographic data to better assess student risk factors and develop interventions.
  • the purpose of learning analytics is “to observe and understand learning behaviors in order to enable appropriate interventions.
  • the 1st International Conference on Learning Analytics and Knowledge (LAK) was held in Banff, Alberta, Canada, in early 2011 (https://tekri.athabascau.ca/analytics/)
  • At UMBC, we are using analytics and assessment to shine a light on students’ performance and behavior and to support teaching effectiveness. What has made the use of analytics and assessment particularly effective on our campus has been the insistence that all groups—faculty, staff, and students—take ownership of the challenge involving student performance and persistence.
  •  
    Assessment and analytics, supported by information technology, can change institutional culture and drive the transformation in student retention, graduation, and success. U.S. higher education has an extraordinary record of accomplishment in preparing students for leadership, in serving as a wellspring of research and creative endeavor, and in providing public service. Despite this success, colleges and universities are facing an unprecedented set of challenges. To maintain the country's global preeminence, those of us in higher education are being called on to expand the number of students we educate, increase the proportion of students in science, technology, engineering, and mathematics (STEM), and address the pervasive and long-standing underrepresentation of minorities who earn college degrees, all at a time when budgets are being reduced and questions about institutional efficiency and effectiveness are being raised.
George Bradford

QUT | Learning and Teaching Unit | REFRAME - 0 views

  •  
    REFRAME
    REFRAME is a university-wide project that is fundamentally reconceptualising QUT's overall approach to evaluating learning and teaching. Our aim is to develop a sophisticated risk-based system to gather, analyse and respond to data, along with a broader set of user-centered resources. The objective is to provide individuals and teams with the tools, support and reporting they need to meaningfully reflect upon, review and improve teaching, student learning and the curriculum. The approach will be informed by feedback from the university community, practices in other institutions and the literature, and will, as far as possible, be 'future-proofed' through awareness of emergent evaluation trends and tools. Central to REFRAME is the consideration of the purpose of evaluation and the features that a future approach should consider.
George Bradford

Sydney Learning Analytics Research Group (LARG) - 0 views

  •  
    "SYDNEY LEARNING ANALYTICS RESEARCH GROUP
    The Sydney Learning Analytics Research Group (LARG) is a joint venture of the newly established Quality and Analytics Group within the Education Portfolio, and the new Centre for Research on Learning and Innovation connected to the Faculty of Education and Social Work. The key purposes in establishing the new research group are:
    - Capacity building in learning analytics for the benefit of the institution, its students and staff
    - To generate interest and expertise in learning analytics at the University, and build a new network of research colleagues
    - To build a profile for the University of Sydney as a national and international leader in learning analytics
    LARG was launched at ALASI in late November 2015. The leadership team is actively planning now for the 2016 calendar year and beyond, with several community-building initiatives already in the pipeline, the first being a lecture by George Siemens, and the second a new conference travel grant (see details below)."
George Bradford

Open Research Online - Social Learning Analytics: Five Approaches - 0 views

  •  
    This paper proposes that Social Learning Analytics (SLA) can be usefully thought of as a subset of learning analytics approaches. SLA focuses on how learners build knowledge together in their cultural and social settings. In the context of online social learning, it takes into account both formal and informal educational environments, including networks and communities. The paper introduces the broad rationale for SLA by reviewing some of the key drivers that make social learning so important today. Five forms of SLA are identified, including those which are inherently social, and others which have social dimensions. The paper goes on to describe early work towards implementing these analytics on SocialLearn, an online learning space in use at the UK's Open University, and the challenges that this is raising. This work takes an iterative approach to analytics, encouraging learners to respond to and help to shape not only the analytics but also their associated recommendations.
George Bradford

Times Higher Education - Satisfaction and its discontents - 0 views

  •  
    Satisfaction and its discontents
    8 March 2012
    The National Student Survey puts pressure on lecturers to provide 'enhanced' experiences. But, argues Frank Furedi, the results do not measure educational quality, and the process infantilises students and corrodes academic integrity.
    One of the striking features of a highly centralised system of higher education, such as that of the UK, is that the introduction of new targets and modifications to the quality assurance framework can have a dramatic impact in a very short space of time. When the National Student Survey was introduced in 2005, few colleagues imagined that, just several years down the road, finessing and managing its implementation would require the employment of an entirely new group of quality-assurance operatives. At the time, the NSS was seen by many as a relatively pointless public-relations exercise that would have only a minimal effect on academics' lives. It is unlikely that even its advocates would have expected the NSS to acquire a life of its own and become one of the most powerful influences on the form and nature of the work done in universities.
George Bradford

People | Knowledge Media Institute | The Open University - 0 views

  •  
    Simon Buckingham Shum, Senior Lecturer in Knowledge Media
    I am fundamentally interested in technologies for sensemaking, specifically those which structure discourse to assist reflection and analysis. Examples: D3E, Compendium, ClaiMaker and Cohere.
George Bradford

Analytics in Higher Education - Benefits, Barriers, Progress, and Recommendations (EDUC... - 0 views

  •  
    Jacqueline Bichsel, EDUCAUSE, 2012
    Many colleges and universities have demonstrated that analytics can help significantly advance an institution in such strategic areas as resource allocation, student success, and finance. Higher education leaders hear about these transformations occurring at other institutions and wonder how their institutions can initiate or build upon their own analytics programs. Some question whether they have the resources, infrastructure, processes, or data for analytics. Some wonder whether their institutions are on par with others in their analytics endeavors. It is within that context that this study set out to assess the current state of analytics in higher education, outline the challenges and barriers to analytics, and provide a basis for benchmarking progress in analytics.
George Bradford

Seeking Evidence of Impact: Opportunities and Needs (EDUCAUSE Review) | EDUCAUSE - 0 views

  • Conversations with CIOs and other senior IT administrators reveal a keen interest in the results of evaluation in teaching and learning to guide fiscal, policy, and strategic decision-making. Yet those same conversations reveal that this need is not being met.
  • gain a wider and shared understanding of “evidence” and “impact” in teaching and learning
  • establish a community of practice
  • ...11 more annotations...
  • provide professional-development opportunities
  • explore successful institutional and political contexts
  • establish evidence-based practice
  • The most important reason is that in the absence of data, anecdote can become the primary basis for decision-making. Rarely does that work out very well.
  • autocatalytic evaluation process—one that builds its own synergy.
  • We live by three principles: uncollected data cannot be analyzed; the numbers are helped by a brief and coherent summary; and good graphs beat tables every time.
  • Reports and testimonies from faculty and students (57%); measures of student and faculty satisfaction (50%); measures of student mastery (learning outcomes) (41%); changes in faculty teaching practice (35%); measures of student and faculty engagement (32%)
  • The survey results also indicate a need for support in undertaking impact-evaluation projects.
  • Knowing where to begin to measure the impact of technology-based innovations in teaching and learning; knowing which measurement and evaluation techniques are most appropriate; knowing the most effective way to analyze evidence
  • The challenge of persuasion is what ELI has been calling the last mile problem. There are two interrelated components to this issue: (1) influencing faculty members to improve instructional practices at the course level, and (2) providing evidence to help inform key strategic decisions at the institutional level.
  • Broadly summarized, our results reveal a disparity between the keen interest in research-based evaluation and the level of resources that are dedicated to it—prompting a grass-roots effort to support this work.
  •  
    The SEI program is working with the teaching and learning community to gather evidence of the impact of instructional innovations and current practices and to help evaluate the results. The calls for more accountability in higher education, the shrinking budgets that often force larger class sizes, and the pressures to increase degree-completion rates are all raising the stakes for colleges and universities today, especially with respect to the instructional enterprise. As resources shrink, teaching and learning is becoming the key point of accountability. The evaluation of instructional practice would thus seem to be an obvious response to such pressures, with institutions implementing systematic programs of evaluation in teaching and learning, especially of instructional innovations.
George Bradford

College Degrees, Designed by the Numbers - Technology - The Chronicle of Higher Education - 0 views

  • Arizona State's retention rate rose to 84 percent from 77 percent in recent years, a change that the provost credits largely to eAdvisor.
  • Mr. Lange and his colleagues had found that by the eighth day of class, they could predict, with 70-percent accuracy, whether a student would score a C or better. Mr. Lange built a system, rolled out in 2009, that sent professors frequently updated alerts about how well each student was predicted to do, based on course performance and online behavior.
  • Rio Salado knows from its database that students who hand in late assignments and don't log in frequently often fail or withdraw from a course. So the software is more likely to throw up a red flag for current students with those characteristics.
  • ...5 more annotations...
  • And in a cautionary tale about technical glitches, the college began sharing grade predictions with students last summer, hoping to encourage those lagging behind to step up, but had to shut the alerts down in the spring. Course revisions had skewed the calculations, and some predictions were found to be inaccurate. An internal analysis found no increase in the number of students dropping classes. An improved system is promised for the fall.
  • His software borrows a page from Netflix. It melds each student's transcript with thousands of past students' grades and standardized-test scores to make suggestions. When students log into the online portal, they see 10 "Course Suggestions for You," ranked on a five-star scale. For, say, a health-and-human-performance major, kinesiology might get five stars, as the next class needed for her major. Physics might also top the list, to satisfy a science requirement in the core curriculum.
  • Behind those recommendations is a complex algorithm, but the basics are simple enough. Degree requirements figure in the calculations. So do classes that can be used in many programs, like freshman writing. And the software bumps up courses for which a student might have a talent, by mining their records—grades, high-school grade-point average, ACT scores—and those of others who walked this path before.
  • The software sifts through a database of hundreds of thousands of grades other students have received. It analyzes the historical data to figure out how much weight to assign each piece of the health major's own academic record in forecasting how she will do in a particular course. Success in math is strongly predictive of success in physics, for example. So if her transcript and ACT score indicate a history of doing well in math, physics would probably be recommended over biology, though both satisfy the same core science requirement.
  • Every year, students in Tennessee lose their state scholarships because they fall a hair short of the GPA cutoff, Mr. Denley says, a financial swing that "massively changes their likelihood of graduating."
  •  
    July 18, 2012 | By Marc Parry | Illustration by Randy Lyhus for The Chronicle
    Campuses are places of intuition and serendipity: A professor senses confusion on a student's face and repeats his point; a student majors in psychology after a roommate takes a course; two freshmen meet on the quad and eventually become husband and wife. Now imagine hard data substituting for happenstance. As Katye Allisone, a freshman at Arizona State University, hunkers down in a computer lab for an 8:35 a.m. math class, the Web-based course watches her back. Answers, scores, pace, click paths: it hoovers up information, like Google. But rather than personalizing search results, data shape Ms. Allisone's class according to her understanding of the material.
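The "Course Suggestions for You" ranking described in the annotations (blend the student's own record with grades earned by similar past students, boost degree requirements, map to five stars) could be sketched roughly as below. The weights, field names, and star mapping are illustrative assumptions, not the university's actual algorithm:

```python
# Hedged sketch of a grade-forecasting course recommender in the spirit
# of the system the article describes. Weights and scales are invented
# for illustration; a real system fits them to historical records.

def predict_grade(student, course, history):
    """Blend the student's GPA, ACT score, and the average grade that
    past students earned in this course (all mapped to a 4.0 scale)."""
    grades = [g for c, g in history if c == course]
    peer_avg = sum(grades) / max(1, len(grades))  # 0.0 if no history (cold start)
    act_scaled = student["act"] / 36 * 4.0        # rescale ACT onto a 4.0 scale
    return 0.4 * student["gpa"] + 0.2 * act_scaled + 0.4 * peer_avg

def recommend(student, candidates, history, top_n=10):
    """Rank (course, is_required) candidates; required courses get a boost,
    and each score is mapped onto a 1-5 star scale."""
    scored = []
    for course, required in candidates:
        score = predict_grade(student, course, history)
        if required:                              # degree requirements figure in
            score += 0.5
        stars = max(1, min(5, round(score / 4.5 * 5)))
        scored.append((course, stars, round(score, 2)))
    scored.sort(key=lambda t: -t[2])              # highest forecast first
    return scored[:top_n]

# Example: a strong math record plus good peer outcomes in physics
# pushes physics above biology for the same core science requirement.
student = {"gpa": 3.5, "act": 30}
history = [("physics", 3.2), ("physics", 3.6), ("biology", 2.8)]
print(recommend(student, [("physics", True), ("biology", False)], history))
```

The "Netflix" aspect the article mentions would replace the flat `peer_avg` with grades from students whose transcripts most resemble this one, but the structure — forecast, boost requirements, rank — is the same.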
George Bradford

About | Learning Emergence - 0 views

  •  
    CORE IDEAS
    We decided on the name Learning Emergence because we are very much learning about emergence and complex systems phenomena ourselves, even as we develop our thinking on learning as an emergent, systemic phenomenon in different contexts. We must shift to a new paradigm for learning in schools, universities and the workplace which addresses the challenges of the 21st century. Society needs learners who can cope with intellectual, ethical and emotional complexity of an unprecedented nature. Learning Emergence partners share an overarching focus on deep, systemic learning and leadership: the pro-active engagement of learners and leaders in their own authentic learning journey, in the context of relationship and community. We work at the intersection of (1) deep learning and sensemaking, (2) leadership, (3) complex systems, and (4) technology.
George Bradford

Learning Analytics + NICs for Systemic Educational Improvement | Learning Emergence - 0 views

  •  
    "Personal reflections on 2 workshops and a lecture with Tony Bryk (Carnegie Foundation for the Advancement of Teaching), hosted last week by Ruth Deakin Crick at University of Bristol. What follows after a brief introduction to the concept of NICs, are my thoughts on the intersection of NICs with Learning Analytics. I made a number of connection points between the features of the DEED+NIC approach, and learning analytics, which I'll highlight in green."
George Bradford

Learning process analytics - EduTech Wiki - 1 views

  •  
    "Introduction
    In this discussion paper, we define learning process analytics as a collection of methods that allow teachers and learners to understand what is going on in a learning scenario, i.e. what participants work(ed) on, how they interact(ed), what they produce(d), what tools they use(d), in which physical and virtual location, etc. Learning analytics is most often aimed at generating predictive models of general student behavior. So-called academic analytics even aims to improve the system. We are trying to find a solution to a somewhat different problem. In this paper we will focus on improving project-oriented learner-centered designs, i.e. a family of educational designs that include any or some of knowledge-building, writing-to-learn, project-based learning, inquiry learning, problem-based learning and so forth. We will first provide a short literature review of learning process analytics and related frameworks that can help improve the quality of educational scenarios. We will then describe a few project-oriented educational scenarios that are implemented in various programs at the University of Geneva. These examples illustrate the kind of learning scenarios we have in mind and help define the different types of analytics both learners and teachers need. Finally, we present a provisional list of analytics desiderata divided into "wanted tomorrow" and "nice to have in the future"."
George Bradford

[!!!] Penetrating the Fog: Analytics in Learning and Education (EDUCAUSE Review) | EDUC... - 0 views

  • Continued growth in the amount of data creates an environment in which new or novel approaches are required to understand the patterns of value that exist within the data.
  • learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.
  • Academic analytics, in contrast, is the application of business intelligence in education and emphasizes analytics at institutional, regional, and international levels.
  • ...14 more annotations...
  • Course-level:
  • Educational data-mining
  • Intelligent curriculum
  • Adaptive content
  • the University of Maryland, Baltimore County (UMBC) Check My Activity tool allows learners to “compare their own activity . . . against an anonymous summary of their course peers.”
  • Mobile devices
  • social media monitoring tools (e.g., Radian6)
  • Analytics in education must be transformative, altering existing teaching, learning, and assessment processes, academic work, and administration.
    • George Bradford
       
      See Bradford - Brief vision of the semantic web as being used to support future learning: http://heybradfords.com/moonlight/research-resources/SemWeb_EducatorsVision 
    • George Bradford
       
      See Peter Goodyear's work on the Ecology of Sustainable e-Learning in Education.
  • How “real time” should analytics be in classroom settings?
  • Adaptive learning
  • EDUCAUSE Review, vol. 46, no. 5 (September/October 2011)
  • Penetrating the Fog: Analytics in Learning and Education
  •  
    Attempts to imagine the future of education often emphasize new technologies: ubiquitous computing devices, flexible classroom designs, and innovative visual displays. But the most dramatic factor shaping the future of higher education is something that we can't actually touch or see: big data and analytics. Basing decisions on data and evidence seems stunningly obvious, and indeed, research indicates that data-driven decision-making improves organizational output and productivity.[1] For many leaders in higher education, however, experience and "gut instinct" have a stronger pull.
George Bradford

NSSE Home - 0 views

  •  
    National Survey of Student Engagement
    What is student engagement? Student engagement represents two critical features of collegiate quality. The first is the amount of time and effort students put into their studies and other educationally purposeful activities. The second is how the institution deploys its resources and organizes the curriculum and other learning opportunities to get students to participate in activities that decades of research studies show are linked to student learning.
    What does NSSE do? Through its student survey, The College Student Report, NSSE annually collects information at hundreds of four-year colleges and universities about student participation in programs and activities that institutions provide for their learning and personal development. The results provide an estimate of how undergraduates spend their time and what they gain from attending college. NSSE provides participating institutions a variety of reports that compare their students' responses with those of students at self-selected groups of comparison institutions. Comparisons are available for individual survey questions and the five NSSE Benchmarks of Effective Educational Practice. Each November, NSSE also publishes its Annual Results, which reports topical research and trends in student engagement results. NSSE researchers also present and publish research findings throughout the year.
George Bradford

AUSSE | ACER - 0 views

  •  
    Australasian Survey of Student Engagement (AUSSE)
    Areas measured by the AUSSE: The survey instruments used in the AUSSE collect information on around 100 specific learning activities and conditions, along with information on individual demographics and educational contexts. The instruments contain items that map onto six student engagement scales:
    - Academic Challenge: the extent to which expectations and assessments challenge students to learn
    - Active Learning: students' efforts to actively construct knowledge
    - Student and Staff Interactions: the level and nature of students' contact and interaction with teaching staff
    - Enriching Educational Experiences: students' participation in broadening educational activities
    - Supportive Learning Environment: students' feelings of support within the university community
    - Work Integrated Learning: integration of employment-focused work experiences into study
    The instruments also contain items that map onto seven outcome measures. Average overall grade is captured in a single item; the other six are composite measures which reflect responses to several items:
    - Higher-Order Thinking: participation in higher-order forms of thinking
    - General Learning Outcomes: development of general competencies
    - General Development Outcomes: development of general forms of individual and social development
    - Career Readiness: preparation for participation in the professional workforce
    - Average Overall Grade: average overall grade so far in course
    - Departure Intention: non-graduating students' intentions on not returning to study in the following year
    - Overall Satisfaction: students' overall satisfaction with their educational experience
George Bradford

Learning networks, crowds and communities - 1 views

  •  
    Learning networks, crowds and communities
    Author: Caroline Haythornthwaite, University of British Columbia, Vancouver, BC
    Who we learn from, where and when is dramatically affected by the reach of the Internet. From learning for formal education to learning for pleasure, we look to the web early and often for our data and knowledge needs, but also for places and spaces where we can collaborate, contribute to, and create learning and knowledge communities. Based on the keynote presentation given at the first Learning Analytics and Knowledge Conference held in 2011 in Banff, Alberta, this paper explores a social network perspective on learning with reference to social network principles and studies by the author. The paper explores the ways a social network perspective can be used to examine learning, with attention to the structure and dynamics of online learning networks, and emerging configurations such as online crowds and communities.