Educational Analytics: Group items tagged "large"

George Bradford

Sydney Learning Analytics Research Group (LARG) - 0 views

  •  
    "SYDNEY LEARNING ANALYTICS RESEARCH GROUP

    About: The Sydney Learning Analytics Research Group (LARG) is a joint venture of the newly established Quality and Analytics Group within the Education Portfolio and the new Centre for Research on Learning and Innovation, connected to the Faculty of Education and Social Work. The key purposes in establishing the new research group are:
    - Capacity building in learning analytics for the benefit of the institution, its students and staff
    - Generating interest and expertise in learning analytics at the University, and building a new network of research colleagues
    - Building a profile for the University of Sydney as a national and international leader in learning analytics
    LARG was launched at ALASI in late November 2015. The leadership team is now actively planning for the 2016 calendar year and beyond, with several community-building initiatives already in the pipeline: the first is a lecture by George Siemens, and the second is a new conference travel grant (see details below)."
George Bradford

ScienceDirect - The Internet and Higher Education : A course is a course is a course: F... - 0 views

  •  
    "Abstract: The authors compared the underlying student response patterns to an end-of-course rating instrument for large student samples in online, blended, and face-to-face courses. For each modality, the solution produced a single factor that accounted for approximately 70% of the variance. The correlations among the factors across the class formats showed that they were identical. The authors concluded that course modality does not affect the dimensionality by which students evaluate their course experiences. The inability to verify multiple dimensions for student evaluation of instruction implies that the boundaries of a typical course are beginning to dissipate. As a result, the authors concluded that end-of-course evaluations now involve a much more complex network of interactions.

    Highlights: ► The study models student satisfaction in the online, blended, and face-to-face course modalities. ► The course models vary in technology involvement. ► Image analysis produced single-dimension solutions. ► The solutions were identical across modalities.

    Keywords: student rating of instruction; online learning; blended learning; factor analysis; student agency"
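The single-factor result in the abstract above (one factor explaining roughly 70% of rating variance) can be illustrated with an eigen-decomposition of a correlation matrix. This is a minimal sketch using simulated ratings driven by one latent satisfaction factor — not the study's instrument or data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate end-of-course ratings driven by a single latent "satisfaction"
# factor, mimicking the one-factor structure the study reports.
n_students, n_items = 1000, 8
latent = rng.normal(size=(n_students, 1))            # one underlying factor
loadings = rng.uniform(0.7, 0.9, size=(1, n_items))  # each item loads on it
noise = 0.4 * rng.normal(size=(n_students, n_items))
ratings = latent @ loadings + noise

# Eigen-decompose the item correlation matrix; a dominant first
# eigenvalue indicates a single-factor solution.
corr = np.corrcoef(ratings, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]
explained = eigvals[0] / eigvals.sum()
print(f"variance explained by first factor: {explained:.0%}")
```

With strong loadings and modest noise, the first eigenvalue dominates and the remaining ones stay small — the pattern the authors describe across all three modalities.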
George Bradford

Dr Ruth Deakin Crick - Graduate School of Education - 0 views

  •  
    First, the ongoing exploration of the reliability and validity of the psychometric assessment instrument designed to measure and stimulate change in learning power, for which I was one of three originators between 2000 and 2002. To date I have collected large data sets (n > 50,000) and have published reliability and validity statistics in four peer-reviewed journal articles. Second, the application of the concept and assessment of learning power to pedagogy in school, community, and corporate sectors, and in particular its contribution to the personalisation of learning through authentic enquiry. Third, the contribution of learning power and enquiry to what we know about complexity in education, particularly through the development of systems learning and leadership as a vehicle for organisational transformation. Finally, the application of learning power assessment strategies to the emerging field of learning analytics and agent-based modelling.
George Bradford

College Degrees, Designed by the Numbers - Technology - The Chronicle of Higher Education - 0 views

  • Arizona State's retention rate rose to 84 percent from 77 percent in recent years, a change that the provost credits largely to eAdvisor.
  • Mr. Lange and his colleagues had found that by the eighth day of class, they could predict, with 70-percent accuracy, whether a student would score a C or better. Mr. Lange built a system, rolled out in 2009, that sent professors frequently updated alerts about how well each student was predicted to do, based on course performance and online behavior.
  • Rio Salado knows from its database that students who hand in late assignments and don't log in frequently often fail or withdraw from a course. So the software is more likely to throw up a red flag for current students with those characteristics.
  • And in a cautionary tale about technical glitches, the college began sharing grade predictions with students last summer, hoping to encourage those lagging behind to step up, but had to shut the alerts down in the spring. Course revisions had skewed the calculations, and some predictions were found to be inaccurate. An internal analysis found no increase in the number of students dropping classes. An improved system is promised for the fall.
  • His software borrows a page from Netflix. It melds each student's transcript with thousands of past students' grades and standardized-test scores to make suggestions. When students log into the online portal, they see 10 "Course Suggestions for You," ranked on a five-star scale. For, say, a health-and-human-performance major, kinesiology might get five stars, as the next class needed for her major. Physics might also top the list, to satisfy a science requirement in the core curriculum.
  • Behind those recommendations is a complex algorithm, but the basics are simple enough. Degree requirements figure in the calculations. So do classes that can be used in many programs, like freshman writing. And the software bumps up courses for which a student might have a talent, by mining their records—grades, high-school grade-point average, ACT scores—and those of others who walked this path before.
  • The software sifts through a database of hundreds of thousands of grades other students have received. It analyzes the historical data to figure out how much weight to assign each piece of the health major's own academic record in forecasting how she will do in a particular course. Success in math is strongly predictive of success in physics, for example. So if her transcript and ACT score indicate a history of doing well in math, physics would probably be recommended over biology, though both satisfy the same core science requirement.
  • Every year, students in Tennessee lose their state scholarships because they fall a hair short of the GPA cutoff, Mr. Denley says, a financial swing that "massively changes their likelihood of graduating."
  •  
    "July 18, 2012 | College Degrees, Designed by the Numbers | By Marc Parry | Illustration by Randy Lyhus for The Chronicle

    Campuses are places of intuition and serendipity: A professor senses confusion on a student's face and repeats his point; a student majors in psychology after a roommate takes a course; two freshmen meet on the quad and eventually become husband and wife. Now imagine hard data substituting for happenstance. As Katye Allisone, a freshman at Arizona State University, hunkers down in a computer lab for an 8:35 a.m. math class, the Web-based course watches her back. Answers, scores, pace, click paths: it hoovers up information, like Google. But rather than personalizing search results, the data shape Ms. Allisone's class according to her understanding of the material."
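The early-warning behavior described in the annotations above — course performance and online activity feed a prediction, and late assignments plus infrequent logins raise a red flag — can be sketched as a simple rule. The thresholds and field names below are illustrative assumptions, not Rio Salado's actual model:

```python
from dataclasses import dataclass

@dataclass
class StudentActivity:
    """Behavioral signals of the kind the article says feed the alerts."""
    late_assignments: int
    logins_per_week: float

def at_risk(activity: StudentActivity,
            max_late: int = 2,
            min_logins: float = 3.0) -> bool:
    # Students who hand in late work or rarely log in historically
    # fail or withdraw more often, so either signal raises the flag.
    return (activity.late_assignments > max_late
            or activity.logins_per_week < min_logins)

print(at_risk(StudentActivity(late_assignments=4, logins_per_week=1.0)))  # True
print(at_risk(StudentActivity(late_assignments=0, logins_per_week=5.0)))  # False
```

A production system would replace these fixed thresholds with a model fitted to historical fail/withdraw outcomes, which is how the article says the 70-percent-accurate day-eight predictions were obtained.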
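The eAdvisor-style recommender described above scores candidate courses by weighting pieces of a student's record (grades, high-school GPA, ACT) with weights mined from past students' grades. A toy sketch of that scoring step — every course, weight, and record here is hypothetical:

```python
def predict_grade(student: dict, weights: dict, bias: float) -> float:
    """Linear forecast of grade points from pieces of the student's record."""
    return bias + sum(weights[k] * student[k] for k in weights)

# Hypothetical student record: prior math GPA, high-school GPA, ACT score.
student = {"math_gpa": 3.6, "hs_gpa": 3.4, "act": 27}

# In the real system these weights would be fitted from hundreds of
# thousands of past grades; here physics leans heavily on math history,
# echoing the article's "math predicts physics" example.
courses = {
    "physics": ({"math_gpa": 0.6, "hs_gpa": 0.2, "act": 0.02}, 0.3),
    "biology": ({"math_gpa": 0.2, "hs_gpa": 0.4, "act": 0.03}, 0.6),
}

# Rank candidate courses by forecast grade, strongest prediction first.
ranked = sorted(courses, key=lambda c: predict_grade(student, *courses[c]),
                reverse=True)
print(ranked)  # ['physics', 'biology']
```

For this student's strong math history, physics outranks biology even though both satisfy the same core science requirement — the behavior the article attributes to the recommender.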