Educational Analytics: group items tagged "framework"

George Bradford

Open Research Online - Contested Collective Intelligence: rationale, technologies, and ... - 0 views

  •  
    We propose the concept of Contested Collective Intelligence (CCI) as a distinctive subset of the broader Collective Intelligence design space. CCI is relevant to the many organizational contexts in which it is important to work with contested knowledge, for instance, due to different intellectual traditions, competing organizational objectives, information overload or ambiguous environmental signals. The CCI challenge is to design sociotechnical infrastructures to augment such organizational capability. Since documents are often the starting points for contested discourse, and discourse markers provide a powerful cue to the presence of claims, contrasting ideas and argumentation, discourse and rhetoric provide an annotation focus in our approach to CCI. Research in sensemaking, computer-supported discourse and rhetorical text analysis motivate a conceptual framework for the combined human and machine annotation of texts with this specific focus. This conception is explored through two tools: a social-semantic web application for human annotation and knowledge mapping (Cohere), plus the discourse analysis component in a textual analysis software tool (Xerox Incremental Parser: XIP). As a step towards an integrated platform, we report a case study in which a document corpus underwent independent human and machine analysis, providing quantitative and qualitative insight into their respective contributions. A promising finding is that significant contributions were signalled by authors via explicit rhetorical moves, which both human analysts and XIP could readily identify. Since working with contested knowledge is at the heart of CCI, the evidence that automatic detection of contrasting ideas in texts is possible through rhetorical discourse analysis is progress towards the effective use of automatic discourse analysis in the CCI framework.
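The abstract's key empirical finding is that contrasting ideas are often signalled by explicit rhetorical markers that machines can detect. As a minimal illustration of that idea only (XIP itself uses a far richer incremental-parsing rule set; the marker list and function below are invented for this sketch):

```python
import re

# Hypothetical contrast markers; XIP's actual discourse rules are much richer.
CONTRAST_MARKERS = ["however", "in contrast", "on the other hand",
                    "nevertheless", "contrary to"]

def find_contrast_sentences(text):
    """Return the sentences that contain an explicit contrast marker."""
    sentences = re.split(r"(?<=[.!?])\s+", text)
    return [s for s in sentences
            if any(m in s.lower() for m in CONTRAST_MARKERS)]
```

Run on a document corpus, a pass like this surfaces candidate "contested" passages for human analysts to confirm, mirroring the combined human and machine annotation the paper proposes.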
George Bradford

LOCO-Analyst - 0 views

  •  
    What is LOCO-Analyst? LOCO-Analyst is an educational tool aimed at providing teachers with feedback on the relevant aspects of the learning process taking place in a web-based learning environment, and thus helps them improve the content and the structure of their web-based courses. LOCO-Analyst aims at providing teachers with feedback regarding: *  all kinds of activities their students performed and/or took part in during the learning process, *  the usage and the comprehensibility of the learning content they had prepared and deployed in the LCMS, *  contextualized social interactions among students (i.e., social networking) in the virtual learning environment. This Web site provides some basic information about LOCO-Analyst, its functionalities and implementation. In addition, you can watch videos illustrating the tool's functionalities. You can also learn about the LOCO (Learning Object Context Ontologies) ontological framework that lies beneath the LOCO-Analyst tool and download the ontologies of this framework.
George Bradford

A unified framework for multi-level analysis of distributed learning - 0 views

  •  
    A unified framework for multi-level analysis of distributed learning
    Full Text: PDF
    Authors: Daniel Suthers (University of Hawaii, Honolulu, HI) and Devan Rosen (School of Communications, Ithaca College, Ithaca, NY)
    Learning and knowledge creation is often distributed across multiple media and sites in networked environments. Traces of such activity may be fragmented across multiple logs and may not match analytic needs. As a result, the coherence of distributed interaction and emergent phenomena are analytically cloaked. Understanding distributed learning and knowledge creation requires multi-level analysis of the situated accomplishments of individuals and small groups and of how this local activity gives rise to larger phenomena in a network. We have developed an abstract transcript representation that provides a unified analytic artifact of distributed activity, and an analytic hierarchy that supports multiple levels of analysis. Log files are abstracted to directed graphs that record observed relationships (contingencies) between events, which may be interpreted as evidence of interaction and other influences between actors. Contingency graphs are further abstracted to two-mode directed graphs that record how associations between actors are mediated by digital artifacts and summarize sequential patterns of interaction. Transitive closure of these associograms creates sociograms, to which existing network analytic techniques may be applied, yielding aggregate results that can then be interpreted by reference to the other levels of analysis. We discuss how the analytic hierarchy bridges between levels of analysis and theory.
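The abstract describes a pipeline from event logs to two-mode actor-artifact graphs (associograms) and on to actor-actor sociograms. A simplified sketch of the final step, projecting artifact-mediated associations onto actor ties (the event data and the one-step projection are illustrative; the paper's method uses full transitive closure over richer contingency graphs):

```python
from collections import defaultdict
from itertools import combinations

# Toy event log of (actor, artifact) pairs abstracted from raw log entries.
# Names are invented, not from the paper's corpus.
events = [("ann", "wiki:p1"), ("bob", "wiki:p1"),
          ("bob", "chat:t7"), ("cli", "chat:t7")]

def associogram(events):
    """Two-mode view: map each artifact to the set of actors who acted on it."""
    by_artifact = defaultdict(set)
    for actor, artifact in events:
        by_artifact[artifact].add(actor)
    return by_artifact

def sociogram(by_artifact):
    """Project actor-artifact associations onto undirected actor-actor ties."""
    ties = set()
    for actors in by_artifact.values():
        for a, b in combinations(sorted(actors), 2):
            ties.add((a, b))
    return ties
```

The resulting edge set can be handed to standard network-analysis tooling, which is the point of the abstraction: once logs are lifted to sociograms, existing techniques apply unchanged.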
George Bradford

Times Higher Education - Satisfaction and its discontents - 0 views

  •  
    Satisfaction and its discontents
    8 March 2012
    The National Student Survey puts pressure on lecturers to provide 'enhanced' experiences. But, argues Frank Furedi, the results do not measure educational quality, and the process infantilises students and corrodes academic integrity.
    One of the striking features of a highly centralised system of higher education, such as that of the UK, is that the introduction of new targets and modifications to the quality assurance framework can have a dramatic impact in a very short space of time. When the National Student Survey was introduced in 2005, few colleagues imagined that, just several years down the road, finessing and managing its implementation would require the employment of an entirely new group of quality-assurance operatives. At the time, the NSS was seen by many as a relatively pointless public-relations exercise that would have only a minimal effect on academics' lives. It is unlikely that even its advocates would have expected the NSS to acquire a life of its own and become one of the most powerful influences on the form and nature of the work done in universities.
George Bradford

Learning process analytics - EduTech Wiki - 1 views

  •  
    "Introduction In this discussion paper, we define learning process analytics as a collection of methods that allow teachers and learners to understand what is going on in a learning scenario, i.e. what participants work(ed) on, how they interact(ed), what they produce(d), what tools they use(d), in which physical and virtual location, etc. Learning analytics is most often aimed at generating predictive models of general student behavior. So-called academic analytics even aims to improve the system. We are trying to find a solution to a somewhat different problem. In this paper we will focus on improving project-oriented learner-centered designs, i.e. a family of educational designs that include any or some of knowledge-building, writing-to-learn, project-based learning, inquiry learning, problem-based learning and so forth. We will first provide a short literature review of learning process analytics and related frameworks that can help improve the quality of educational scenarios. We will then describe a few project-oriented educational scenarios that are implemented in various programs at the University of Geneva. These examples illustrate the kind of learning scenarios we have in mind and help define the different types of analytics both learners and teachers need. Finally, we present a provisional list of analytics desiderata divided into "wanted tomorrow" and "nice to have in the future"."
George Bradford

Using Big Data to Predict Online Student Success | Inside Higher Ed - 0 views

  • Researchers have created a database that measures 33 variables for the online coursework of 640,000 students – a whopping 3 million course-level records.
  • Project Participants American Public University System Community College System of Colorado Rio Salado College University of Hawaii System University of Illinois-Springfield University of Phoenix
  • “What the data seem to suggest, however, is that for students who seem to have a high propensity of dropping out of an online course-based program, the fewer courses they take initially, the better-off they are.”
  • Phil Ice, vice president of research and development for the American Public University System and the project’s lead investigator.
  • Predictive Analytics Reporting Framework
  • Rio Salado, for example, has used the database to create a student performance tracking system.
  • The two-year college, which is based in Arizona, has a particularly strong online presence for a community college – 43,000 of its students are enrolled in online programs. The new tracking system allows instructors to see a red, yellow or green light for each student’s performance. And students can see their own tracking lights.
  • It measures student engagement through their Web interactions, how often they look at textbooks and whether they respond to feedback from instructors, all in addition to their performance on coursework.
  • The data set has the potential to give institutions sophisticated information about small subsets of students – such as which academic programs are best suited for a 25-year-old male Latino with strength in mathematics
  •  
    New students are more likely to drop out of online colleges if they take full courseloads than if they enroll part time, according to findings from a research project that is challenging conventional wisdom about student success. But perhaps more important than that potentially game-changing nugget, researchers said, is how the project has chipped away at skepticism in higher education about the power of "big data." Researchers have created a database that measures 33 variables for the online coursework of 640,000 students - a whopping 3 million course-level records. While the work is far from complete, the variables help track student performance and retention across a broad range of demographic factors. The data can show what works at a specific type of institution, and what doesn't. That sort of predictive analytics has long been embraced by corporations, but not so much by the academy. The ongoing data-mining effort, which was kicked off last year with a $1 million grant from the Bill and Melinda Gates Foundation, is being led by WCET, the WICHE Cooperative for Educational Technologies.
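Rio Salado's tracking system, described above, reduces many engagement signals to a red, yellow, or green light per student. A minimal sketch of that idea, assuming invented inputs, weights, and thresholds (the actual Predictive Analytics Reporting Framework model draws on 33 variables):

```python
# Illustrative traffic-light indicator; the weights and cutoffs below are
# assumptions for this sketch, not Rio Salado's actual model.
def risk_light(logins_per_week, on_time_ratio):
    """Map two engagement signals to a red/yellow/green status light.

    logins_per_week: LMS logins, capped at 5 for scoring.
    on_time_ratio: fraction of assignments submitted on time (0.0-1.0).
    """
    score = min(logins_per_week / 5.0, 1.0) * 0.4 + on_time_ratio * 0.6
    if score >= 0.75:
        return "green"
    if score >= 0.5:
        return "yellow"
    return "red"
```

Even a toy model like this shows why the approach appeals to instructors: the composite score is opaque, but the three-colour output is immediately actionable for both teacher and student.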