Educational Analytics: Group items tagged cohere

George Bradford

Cohere >>> make the connection - 0 views

  •  
    About Cohere: The Web is about IDEAS + PEOPLE. Cohere is a visual tool to create, connect and share ideas: back them up with websites, support or challenge them, embed them to spread virally, and discover who, literally, connects with your thinking. Publish ideas and optionally add relevant websites, weave webs of meaningful connections between ideas (your own and the world's), and discover new ideas and people.
    We experience the information ocean as streams of media fragments, flowing past us in every modality. To make sense of these, learners, researchers and analysts must organise them into coherent patterns. Cohere is an idea-management tool for annotating URLs with ideas and weaving meaningful connections between those ideas, for personal, team or social use.
    Key features:
    - Annotate a URL with any number of Ideas, or vice versa, and visualize your network as it grows
    - Make connections between your Ideas, or Ideas that anyone else has made public or shared with you via a common Group
    - Use Groups to organise your Ideas and Connections by project, and to manage access rights
    - Import your data as RSS feeds (e.g. bookmarks or blog posts) to convert them to Ideas, ready for connecting
    - Use the RESTful API services to query, edit and mash up data from other tools
    Learn more: subscribe to the Cohere blog to track developments as they happen, and read the linked article on the design of Cohere to support dialogue and debate.
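For readers who want to script against this, the sketch below shows roughly what a query via the RESTful API could look like. It is a minimal illustration only: the endpoint path, query parameters and JSON field names are assumptions, not Cohere's documented interface.

```python
# Hypothetical sketch of querying a Cohere-style RESTful API for public
# connections. Endpoint, parameters and field names are illustrative only;
# consult the actual Cohere API documentation for the real interface.
import requests

BASE_URL = "http://cohere.open.ac.uk/api"  # assumed base path, not verified

def fetch_connections(user_id, limit=50):
    """Fetch up to `limit` public connections made by a user (hypothetical call)."""
    resp = requests.get(
        f"{BASE_URL}/connections",  # hypothetical endpoint name
        params={"userid": user_id, "format": "json", "limit": limit},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # assumed to return a list of connection records

if __name__ == "__main__":
    for conn in fetch_connections("example-user")[:5]:
        # Each record is assumed to link two ideas with a labelled relation,
        # e.g. "supports" or "challenges".
        print(conn.get("from_idea"), conn.get("link_type"), conn.get("to_idea"))
```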
George Bradford

Open Research Online - Discourse-centric learning analytics - 0 views

  •  
    Drawing on sociocultural discourse analysis and argumentation theory, we motivate a focus on learners' discourse as a promising site for identifying patterns of activity which correspond to meaningful learning and knowledge construction. However, software platforms must gain access to qualitative information about the rhetorical dimensions to discourse contributions to enable such analytics. This is difficult to extract from naturally occurring text, but the emergence of more-structured annotation and deliberation platforms for learning makes such information available. Using the Cohere web application as a research vehicle, we present examples of analytics at the level of individual learners and groups, showing conceptual and social network patterns, which we propose as indicators of meaningful learning.
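One simple way to see what "discourse-centric" means in practice: because tools like Cohere record the rhetorical link type a learner chooses when connecting ideas (e.g. "supports", "challenges"), a first-pass indicator can be tallied straight from the connection records. A minimal sketch, assuming the data has already been flattened into (learner, link_type) pairs rather than any particular export format:

```python
# Minimal sketch: tally rhetorical link types per learner from structured
# annotation data. The (learner, link_type) input format is an assumption;
# real Cohere exports would need to be mapped into this shape first.
from collections import Counter, defaultdict

def link_type_profile(connections):
    """Return per-learner counts of the connection types they authored."""
    profile = defaultdict(Counter)
    for learner, link_type in connections:
        profile[learner][link_type] += 1
    return profile

sample = [
    ("alice", "supports"), ("alice", "challenges"), ("alice", "supports"),
    ("bob", "is analogous to"), ("bob", "supports"),
]

for learner, counts in link_type_profile(sample).items():
    # A profile that includes challenges and contrasts, not only agreement,
    # is one crude signal of deeper engagement with the discourse.
    print(learner, dict(counts))
```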
George Bradford

Open Research Online - Contested Collective Intelligence: rationale, technologies, and ... - 0 views

  •  
    We propose the concept of Contested Collective Intelligence (CCI) as a distinctive subset of the broader Collective Intelligence design space. CCI is relevant to the many organizational contexts in which it is important to work with contested knowledge, for instance, due to different intellectual traditions, competing organizational objectives, information overload or ambiguous environmental signals. The CCI challenge is to design sociotechnical infrastructures to augment such organizational capability. Since documents are often the starting points for contested discourse, and discourse markers provide a powerful cue to the presence of claims, contrasting ideas and argumentation, discourse and rhetoric provide an annotation focus in our approach to CCI. Research in sensemaking, computer-supported discourse and rhetorical text analysis motivate a conceptual framework for the combined human and machine annotation of texts with this specific focus. This conception is explored through two tools: a social-semantic web application for human annotation and knowledge mapping (Cohere), plus the discourse analysis component in a textual analysis software tool (Xerox Incremental Parser: XIP). As a step towards an integrated platform, we report a case study in which a document corpus underwent independent human and machine analysis, providing quantitative and qualitative insight into their respective contributions. A promising finding is that significant contributions were signalled by authors via explicit rhetorical moves, which both human analysts and XIP could readily identify. Since working with contested knowledge is at the heart of CCI, the evidence that automatic detection of contrasting ideas in texts is possible through rhetorical discourse analysis is progress towards the effective use of automatic discourse analysis in the CCI framework.
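To make the rhetorical-marker idea concrete, the toy sketch below (emphatically not XIP) flags sentences containing common contrast and claim cue phrases, the kind of explicit rhetorical moves the study found both human analysts and XIP could readily identify. The cue lists and example text are invented for illustration.

```python
# Toy illustration of rhetorical-marker detection (not XIP): flag sentences
# containing cue phrases that often signal contrast or a knowledge claim.
# The cue lists and the example sentences are invented for this sketch.
import re

CONTRAST_CUES = ["however", "in contrast", "on the other hand", "nevertheless"]
CLAIM_CUES = ["we argue", "we propose", "our results show", "this suggests"]

def tag_sentence(sentence):
    """Return coarse rhetorical tags for one sentence based on cue phrases."""
    lowered = sentence.lower()
    tags = []
    if any(cue in lowered for cue in CONTRAST_CUES):
        tags.append("CONTRAST")
    if any(cue in lowered for cue in CLAIM_CUES):
        tags.append("CLAIM")
    return tags

text = ("Prior work treats annotation as an individual activity. "
        "However, we argue that contested knowledge requires collective analysis.")

for sentence in re.split(r"(?<=[.!?])\s+", text):
    print(tag_sentence(sentence) or ["-"], sentence)
```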
George Bradford

People | Knowledge Media Institute | The Open University - 0 views

  •  
    Simon Buckingham Shum, Senior Lecturer in Knowledge Media: "I am fundamentally interested in technologies for sensemaking, specifically those which structure discourse to assist reflection and analysis. Examples: D3E, Compendium, ClaiMaker and Cohere."
George Bradford

Seeking Evidence of Impact: Opportunities and Needs (EDUCAUSE Review) | EDUCAUSE - 0 views

  • Conversations with CIOs and other senior IT administrators reveal a keen interest in the results of evaluation in teaching and learning to guide fiscal, policy, and strategic decision-making. Yet those same conversations reveal that this need is not being met.
  • gain a wider and shared understanding of “evidence” and “impact” in teaching and learning
  • establish a community of practice
  • provide professional-development opportunities
  • explore successful institutional and political contexts
  • establish evidence-based practice
  • The most important reason is that in the absence of data, anecdote can become the primary basis for decision-making. Rarely does that work out very well.
  • autocatalytic evaluation process—one that builds its own synergy.
  • We live by three principles: uncollected data cannot be analyzed; the numbers are helped by a brief and coherent summary; and good graphs beat tables every time.
  • Reports and testimonies from faculty and students (57%); measures of student and faculty satisfaction (50%); measures of student mastery (learning outcomes) (41%); changes in faculty teaching practice (35%); measures of student and faculty engagement (32%)
  • The survey results also indicate a need for support in undertaking impact-evaluation projects.
  • Knowing where to begin to measure the impact of technology-based innovations in teaching and learning; knowing which measurement and evaluation techniques are most appropriate; knowing the most effective way to analyze evidence
  • The challenge of persuasion is what ELI has been calling the last mile problem. There are two interrelated components to this issue: (1) influencing faculty members to improve instructional practices at the course level, and (2) providing evidence to help inform key strategic decisions at the institutional level.
  • Broadly summarized, our results reveal a disparity between the keen interest in research-based evaluation and the level of resources that are dedicated to it—prompting a grass-roots effort to support this work.
  •  
    The SEI program is working with the teaching and learning community to gather evidence of the impact of instructional innovations and current practices and to help evaluate the results. The calls for more accountability in higher education, the shrinking budgets that often force larger class sizes, and the pressures to increase degree-completion rates are all raising the stakes for colleges and universities today, especially with respect to the instructional enterprise. As resources shrink, teaching and learning is becoming the key point of accountability. The evaluation of instructional practice would thus seem to be an obvious response to such pressures, with institutions implementing systematic programs of evaluation in teaching and learning, especially of instructional innovations.
George Bradford

A unified framework for multi-level analysis of distributed learning - 0 views

  •  
    A unified framework for multi-level analysis of distributed learning. Authors: Daniel Suthers (University of Hawaii, Honolulu, HI) and Devan Rosen (School of Communications, Ithaca College, Ithaca, NY).
    Learning and knowledge creation is often distributed across multiple media and sites in networked environments. Traces of such activity may be fragmented across multiple logs and may not match analytic needs. As a result, the coherence of distributed interaction and emergent phenomena are analytically cloaked. Understanding distributed learning and knowledge creation requires multi-level analysis of the situated accomplishments of individuals and small groups, and of how this local activity gives rise to larger phenomena in a network. We have developed an abstract transcript representation that provides a unified analytic artifact of distributed activity, and an analytic hierarchy that supports multiple levels of analysis. Log files are abstracted to directed graphs that record observed relationships (contingencies) between events, which may be interpreted as evidence of interaction and other influences between actors. Contingency graphs are further abstracted to two-mode directed graphs (associograms) that record how associations between actors are mediated by digital artifacts and summarize sequential patterns of interaction. Transitive closure of these associograms creates sociograms, to which existing network analytic techniques may be applied, yielding aggregate results that can then be interpreted by reference to the other levels of analysis. We discuss how the analytic hierarchy bridges between levels of analysis and theory.
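The abstraction chain the abstract describes (log events to contingency graph, to two-mode associogram, to sociogram) can be sketched minimally with NetworkX. The event records, field names and the "same artifact, later in time" contingency rule below are assumptions made for illustration, not the authors' actual operationalisation.

```python
# Minimal sketch of the log -> contingency graph -> associogram -> sociogram
# abstraction. Event records and the contingency rule are invented here.
import networkx as nx

# Each event: (event_id, actor, artifact, timestamp)
events = [
    ("e1", "alice", "note-1", 1),
    ("e2", "bob",   "note-1", 2),
    ("e3", "bob",   "note-2", 3),
    ("e4", "carol", "note-2", 4),
]

# 1. Contingency graph: directed edges between events when a later event
#    touches the same artifact as an earlier one.
contingency = nx.DiGraph()
contingency.add_nodes_from(e[0] for e in events)
for eid1, _, art1, t1 in events:
    for eid2, _, art2, t2 in events:
        if art1 == art2 and t1 < t2:
            contingency.add_edge(eid1, eid2)

# 2. Associogram: two-mode directed graph of actors and the artifacts that
#    mediate their associations.
associogram = nx.DiGraph()
for _, actor, artifact, _ in events:
    associogram.add_edge(actor, artifact)

# 3. Sociogram: project actor-artifact-actor paths onto actor-actor ties
#    (a crude stand-in for the transitive-closure step), then apply standard
#    network metrics.
actors = {e[1] for e in events}
sociogram = nx.Graph()
sociogram.add_nodes_from(actors)
for artifact in {e[2] for e in events}:
    touching = [a for a in actors if associogram.has_edge(a, artifact)]
    for i, a in enumerate(touching):
        for b in touching[i + 1:]:
            sociogram.add_edge(a, b)

print(nx.degree_centrality(sociogram))
```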