
Contents contributed and discussions participated by George Bradford

Wmatrix corpus analysis and comparison tool

    Wmatrix is a software tool for corpus analysis and comparison. It provides a web interface to the USAS and CLAWS corpus annotation tools and to standard corpus-linguistic methodologies such as frequency lists and concordances. It also extends the keywords method to key grammatical categories and key semantic domains. Wmatrix runs these tools through an ordinary web browser such as Opera, Firefox or Internet Explorer, and so works on any computer (Mac, Windows, Linux, Unix) with a browser and a network connection. Wmatrix was initially developed by Paul Rayson in the REVERE project, extended and applied to corpus linguistics during his PhD work, and is still updated regularly. Earlier versions were available for Unix via terminal-based command-line access (tmatrix) and via X Windows (Xmatrix), but these only offer retrieval of text pre-annotated with USAS and CLAWS.
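Wmatrix's keyness comparisons build on frequency profiling. As a rough illustration (not Wmatrix's actual implementation), a frequency list and the log-likelihood keyness statistic it popularized can be sketched in Python:

```python
import math
from collections import Counter

def frequency_list(tokens):
    """A basic corpus frequency list: word types ranked by raw frequency."""
    return Counter(tokens).most_common()

def log_likelihood(f1, n1, f2, n2):
    """Log-likelihood keyness of one item: observed frequencies f1 and f2
    in corpora of n1 and n2 tokens, compared against expected values under
    the assumption that the item is equally frequent in both corpora."""
    e1 = n1 * (f1 + f2) / (n1 + n2)
    e2 = n2 * (f1 + f2) / (n1 + n2)
    ll = 0.0
    if f1:
        ll += f1 * math.log(f1 / e1)
    if f2:
        ll += f2 * math.log(f2 / e2)
    return 2 * ll
```

A higher score means the item's frequency differs more sharply between the two corpora; the same statistic generalizes to grammatical categories and semantic domains by counting tags instead of words.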
Browse Maps - Places & Spaces: Mapping Science

    Data visualizations, organized by Katy Börner.
Cohere >>> make the connection

    About Cohere
    The Web is about IDEAS + PEOPLE. Cohere is a visual tool to create, connect and share ideas. Back them up with websites. Support or challenge them. Embed them to spread virally. Discover who - literally - connects with your thinking. Publish ideas and optionally add relevant websites, weave webs of meaningful connections between ideas (your own and the world's), and discover new ideas and people.
    We experience the information ocean as streams of media fragments, flowing past us in every modality. To make sense of these, learners, researchers and analysts must organise them into coherent patterns. Cohere is an idea management tool for you to annotate URLs with ideas, and weave meaningful connections between ideas for personal, team or social use.
    Key Features
    - Annotate a URL with any number of ideas, or vice versa, and visualize your network as it grows.
    - Make connections between your ideas, or ideas that anyone else has made public or shared with you via a common Group.
    - Use Groups to organise your ideas and connections by project, and to manage access rights.
    - Import your data as RSS feeds (e.g. bookmarks or blog posts) to convert them to ideas, ready for connecting.
    - Use the RESTful API services to query, edit and mash up data from other tools.
    Learn More
    Subscribe to our Blog to track developments as they happen. Read this article to learn more about the design of Cohere to support dialogue and debate.
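The idea-and-connection model described above can be sketched as a small data structure. This is a generic illustration in the spirit of Cohere; names such as `IdeaWeb` are hypothetical and do not reflect Cohere's actual API:

```python
from dataclasses import dataclass, field

@dataclass
class Idea:
    label: str
    urls: list = field(default_factory=list)  # websites backing the idea

class IdeaWeb:
    """A toy web of ideas joined by typed connections
    (e.g. "supports", "challenges")."""
    def __init__(self):
        self.connections = []  # (source idea, link type, target idea)

    def connect(self, src, link_type, dst):
        self.connections.append((src, link_type, dst))

    def neighbours(self, idea):
        """Ideas directly connected to `idea`, in either direction."""
        out = []
        for s, _, d in self.connections:
            if s is idea:
                out.append(d)
            elif d is idea:
                out.append(s)
        return out
```

Typed links are what distinguish this from plain bookmarking: "supports" and "challenges" edges carry the argumentative structure that Cohere visualizes.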
About | Learning Emergence

    CORE IDEAS
    We decided on the name Learning Emergence because we are very much learning about emergence and complex-systems phenomena ourselves, even as we develop our thinking on learning as an emergent, systemic phenomenon in different contexts. We must shift to a new paradigm for learning in schools, universities and the workplace which addresses the challenges of the 21st century. Society needs learners who can cope with intellectual, ethical and emotional complexity of an unprecedented nature. Learning Emergence partners share an overarching focus on deep, systemic learning and leadership - the pro-active engagement of learners and leaders in their own authentic learning journey, in the context of relationship and community. We work at the intersection of (1) deep learning and sensemaking, (2) leadership, (3) complex systems, and (4) technology.
LOCO-Analyst

    What is LOCO-Analyst?
    LOCO-Analyst is an educational tool that provides teachers with feedback on the relevant aspects of the learning process taking place in a web-based learning environment, and thus helps them improve the content and structure of their web-based courses. LOCO-Analyst aims to provide teachers with feedback regarding:
    - all kinds of activities their students performed and/or took part in during the learning process;
    - the usage and comprehensibility of the learning content they had prepared and deployed in the LCMS;
    - contextualized social interactions among students (i.e., social networking) in the virtual learning environment.
    This Web site provides basic information about LOCO-Analyst, its functionalities and implementation. In addition, you can watch videos illustrating the tool's functionalities. You can also learn about the LOCO (Learning Object Context Ontologies) ontological framework that underlies LOCO-Analyst and download the ontologies of this framework.
Features | Gephi, open source graph visualization software

    Features
    Gephi is a tool for people who need to explore and understand graphs. Like Photoshop, but for data: the user interacts with the representation and manipulates the structures, shapes and colors to reveal hidden properties. The goal is to help data analysts form hypotheses, intuitively discover patterns, and isolate structural singularities or faults during data sourcing. It is a complementary tool to traditional statistics, as visual thinking with interactive interfaces is now recognized to facilitate reasoning. This is software for exploratory data analysis, a paradigm that emerged in the visual analytics field of research.
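Graphs reach Gephi most easily as GEXF files, the XML graph format the tool opens directly. A minimal stdlib-only exporter might look like the following; this is a hand-rolled sketch of the GEXF 1.2 structure, not an official library:

```python
def to_gexf(nodes, edges):
    """Serialize node labels and (source, target) index pairs as a
    GEXF 1.2 document that Gephi can open."""
    lines = [
        '<?xml version="1.0" encoding="UTF-8"?>',
        '<gexf xmlns="http://www.gexf.net/1.2draft" version="1.2">',
        '  <graph defaultedgetype="undirected">',
        '    <nodes>',
    ]
    for i, label in enumerate(nodes):
        lines.append(f'      <node id="{i}" label="{label}"/>')
    lines.append('    </nodes>')
    lines.append('    <edges>')
    for i, (src, dst) in enumerate(edges):
        lines.append(f'      <edge id="{i}" source="{src}" target="{dst}"/>')
    lines.append('    </edges>')
    lines.append('  </graph>')
    lines.append('</gexf>')
    return "\n".join(lines)
```

Writing the file with `open("graph.gexf", "w")` and opening it in Gephi gives an interactive view of the same data.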
NSSE Survey Instruments

    Links to instruments from 2000 through 2011.
AUSSE | ACER

    Australasian Survey of Student Engagement (AUSSE)
    Areas measured by the AUSSE
    The survey instruments used in the AUSSE collect information on around 100 specific learning activities and conditions, along with information on individual demographics and educational contexts. The instruments contain items that map onto six student engagement scales:
    - Academic Challenge - the extent to which expectations and assessments challenge students to learn;
    - Active Learning - students' efforts to actively construct knowledge;
    - Student and Staff Interactions - the level and nature of students' contact and interaction with teaching staff;
    - Enriching Educational Experiences - students' participation in broadening educational activities;
    - Supportive Learning Environment - students' feelings of support within the university community; and
    - Work Integrated Learning - integration of employment-focused work experiences into study.
    The instruments also contain items that map onto seven outcome measures. Average Overall Grade is captured in a single item, and the other six are composite measures that reflect responses to several items:
    - Higher-Order Thinking - participation in higher-order forms of thinking;
    - General Learning Outcomes - development of general competencies;
    - General Development Outcomes - development of general forms of individual and social development;
    - Career Readiness - preparation for participation in the professional workforce;
    - Average Overall Grade - average overall grade so far in the course;
    - Departure Intention - non-graduating students' intentions not to return to study in the following year; and
    - Overall Satisfaction - students' overall satisfaction with their educational experience.
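Composite measures of the kind listed above are typically scored by averaging their items onto a common metric. As a generic illustration (not AUSSE's published scoring rules), assuming items answered on a 1-to-`max_value` response scale:

```python
def composite_score(responses, max_value=4):
    """Rescale the mean of several item responses (each 1..max_value)
    to a 0-100 metric, a common convention for composite engagement
    scales built from multiple survey items."""
    if not responses:
        raise ValueError("a composite needs at least one item response")
    span = max_value - 1
    return 100 * sum(r - 1 for r in responses) / (len(responses) * span)
```

On this convention, a student answering the minimum on every item scores 0 on the scale, and the maximum on every item scores 100.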
NSSE Home

    National Survey of Student Engagement
    What is student engagement?
    Student engagement represents two critical features of collegiate quality. The first is the amount of time and effort students put into their studies and other educationally purposeful activities. The second is how the institution deploys its resources and organizes the curriculum and other learning opportunities to get students to participate in activities that decades of research studies show are linked to student learning.
    What does NSSE do?
    Through its student survey, The College Student Report, NSSE annually collects information at hundreds of four-year colleges and universities about student participation in programs and activities that institutions provide for their learning and personal development. The results provide an estimate of how undergraduates spend their time and what they gain from attending college. NSSE provides participating institutions a variety of reports that compare their students' responses with those of students at self-selected groups of comparison institutions. Comparisons are available for individual survey questions and the five NSSE Benchmarks of Effective Educational Practice. Each November, NSSE also publishes its Annual Results, which reports topical research and trends in student engagement results. NSSE researchers also present and publish research findings throughout the year.
From the Semantic Web to social machines: A research challenge for AI on the World Wide...

    From the Semantic Web to social machines: A research challenge for AI on the World Wide Web
    Jim Hendler, Tim Berners-Lee
    Abstract: The advent of social computing on the Web has led to a new generation of Web applications that are powerful and world-changing. However, we argue that we are just at the beginning of this age of "social machines" and that their continued evolution and growth requires the cooperation of Web and AI researchers. In this paper, we show how the growing Semantic Web provides necessary support for these technologies, outline the challenges we see in bringing the technology to the next level, and propose some starting places for the research.
[!!!!] Social Learning Analytics - Technical Report (pdf)

    Technical Report KMI-11-01, June 2011
    Simon Buckingham Shum and Rebecca Ferguson
    Abstract: We propose that the design and implementation of effective Social Learning Analytics presents significant challenges and opportunities for both research and enterprise, in three important respects. The first is the challenge of implementing analytics that have pedagogical and ethical integrity, in a context where power and control over data is now of primary importance. The second challenge is that the educational landscape is extraordinarily turbulent at present, in no small part due to technological drivers. Online social learning is emerging as a significant phenomenon for a variety of reasons, which we review in order to motivate the concept of social learning, and ways of conceiving social learning environments as distinct from other social platforms. This sets the context for the third challenge, namely, to understand different types of Social Learning Analytic, each of which has specific technical and pedagogical challenges. We propose an initial taxonomy of five types. We conclude by considering potential futures for Social Learning Analytics, if the drivers and trends reviewed continue, and the prospect of solutions to some of the concerns that institution-centric learning analytics may provoke.
Learning networks, crowds and communities

    Learning networks, crowds and communities
    Author: Caroline Haythornthwaite, University of British Columbia, Vancouver, BC
    Who we learn from, where and when is dramatically affected by the reach of the Internet. From learning for formal education to learning for pleasure, we look to the web early and often for our data and knowledge needs, but also for places and spaces where we can collaborate, contribute to, and create learning and knowledge communities. Based on the keynote presentation given at the first Learning Analytics and Knowledge Conference held in 2011 in Banff, Alberta, this paper explores a social network perspective on learning with reference to social network principles and studies by the author. The paper explores the ways a social network perspective can be used to examine learning, with attention to the structure and dynamics of online learning networks, and emerging configurations such as online crowds and communities.
Attention please!

    Attention please!: learning analytics for visualization and recommendation
    Author: Erik Duval, Katholieke Universiteit Leuven, Leuven, Belgium
    This paper presents the general goal of and inspiration for our work on learning analytics, which relies on attention metadata for visualization and recommendation. Through information visualization techniques, we can provide a dashboard for learners and teachers, so that they no longer need to "drive blind". Moreover, recommendation can help to deal with the "paradox of choice" and turn abundance from a problem into an asset for learning.
A unified framework for multi-level analysis of distributed learning

    A unified framework for multi-level analysis of distributed learning
    Authors: Daniel Suthers, University of Hawaii, Honolulu, HI; Devan Rosen, School of Communications, Ithaca College, Ithaca, NY
    Learning and knowledge creation is often distributed across multiple media and sites in networked environments. Traces of such activity may be fragmented across multiple logs and may not match analytic needs. As a result, the coherence of distributed interaction and emergent phenomena are analytically cloaked. Understanding distributed learning and knowledge creation requires multi-level analysis of the situated accomplishments of individuals and small groups and of how this local activity gives rise to larger phenomena in a network. We have developed an abstract transcript representation that provides a unified analytic artifact of distributed activity, and an analytic hierarchy that supports multiple levels of analysis. Log files are abstracted to directed graphs that record observed relationships (contingencies) between events, which may be interpreted as evidence of interaction and other influences between actors. Contingency graphs are further abstracted to two-mode directed graphs that record how associations between actors are mediated by digital artifacts and summarize sequential patterns of interaction. Transitive closure of these associograms creates sociograms, to which existing network analytic techniques may be applied, yielding aggregate results that can then be interpreted by reference to the other levels of analysis. We discuss how the analytic hierarchy bridges between levels of analysis and theory.
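The projection from two-mode actor-artifact associations down to an actor-actor sociogram can be sketched in a few lines. This is an illustrative reduction of the analytic hierarchy the abstract describes, not the authors' code:

```python
from collections import defaultdict
from itertools import combinations

def associogram(events):
    """Two-mode view: for each digital artifact, the set of actors whose
    logged events touched it (events are (actor, artifact) pairs)."""
    by_artifact = defaultdict(set)
    for actor, artifact in events:
        by_artifact[artifact].add(actor)
    return by_artifact

def sociogram(events):
    """One-mode projection: two actors are tied whenever some artifact
    mediates between them, yielding a graph to which standard network
    analytic techniques can be applied."""
    ties = defaultdict(set)
    for actors in associogram(events).values():
        for a, b in combinations(sorted(actors), 2):
            ties[a].add(b)
            ties[b].add(a)
    return ties
```

Feeding the resulting sociogram into ordinary network measures (degree, centrality, components) is the step the paper describes as interpreting aggregate results by reference to the lower levels of analysis.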
Analytics: The Widening Divide

    Analytics: The Widening Divide
    By David Kiron, Rebecca Shockley, Nina Kruschwitz, Glenn Finch and Dr. Michael Haydock, November 7, 2011
    How companies are achieving competitive advantage through analytics
    In this second joint MIT Sloan Management Review and IBM Institute for Business Value study, we see a growing divide between companies that, on one side, see the value of business analytics and are transforming themselves to take advantage of these newfound opportunities and, on the other, those that have yet to embrace them. Using insights gathered from more than 4,500 managers and executives, Analytics: The Widening Divide identifies three key competencies that enable organizations to build competitive advantage using analytics. Further, the study identifies two distinct paths that organizations travel while gaining analytic sophistication, and provides recommendations to accelerate organizations on their own paths to analytic transformation.
Measuring Teacher Effectiveness - DataQualityCampaign.Org

    Measuring Teacher Effectiveness
    Significant State Data Capacity is Required to Measure and Improve Teacher Effectiveness
    States Increasingly Focus on Improving Teacher Effectiveness: There is significant activity at the local, state, and federal levels to measure and improve teacher effectiveness, with an unprecedented focus on the use of student achievement as a primary indicator of effectiveness.
    - 23 states require that teacher evaluations include evidence of student learning in the form of student growth and/or value-added data (NCTQ, 2011).
    - 17 states and DC have adopted legislation or regulations that specifically require student achievement and/or student growth to "significantly" inform or be the primary criterion in teacher evaluations (NCTQ, 2011).
    States Need Significant Data Capacity to Do This Work: These policy changes have significant data implications.
    - The linchpin of all these efforts is that states must reliably link students and teachers in ways that capture the complex connections that exist in schools.
    - If such data is to be used for high-stakes decisions - such as hiring, firing, and tenure - it must be accepted as valid, reliable, and fair.
    - Teacher effectiveness data can be leveraged to target professional development, inform staffing assignments, tailor classroom instruction, reflect on practice, support research, and otherwise support teachers.
    Federal Policies Are Accelerating State and Local Efforts: Federal policies increasingly support states' efforts to use student achievement data to measure teacher effectiveness.
    - Various competitive grant funds, including the Race to the Top grants and the Teacher Incentive Fund, require states to implement teacher and principal evaluation systems that take student data into account.
    - States applying for NCLB waivers, including the 11 that submitted requests in November 2011, must commit to implementing teacher and principal evaluation and support systems.
Learning and Knowledge Analytics - Analyzing what can be connected

    Learning and Knowledge Analytics: Analyzing what can be connected.
ScienceDirect - The Internet and Higher Education : A course is a course is a course: F...

    "Abstract
    The authors compared the underlying student response patterns to an end-of-course rating instrument for large student samples in online, blended and face-to-face courses. For each modality, the solution produced a single factor that accounted for approximately 70% of the variance. The correlations among the factors across the class formats showed that they were identical. The authors concluded that course modality does not impact the dimensionality by which students evaluate their course experiences. The inability to verify multiple dimensions for student evaluation of instruction implies that the boundaries of a typical course are beginning to dissipate. As a result, the authors concluded that end-of-course evaluations now involve a much more complex network of interactions.
    Highlights
    ► The study models student satisfaction in the online, blended, and face-to-face course modalities.
    ► The course models vary technology involvement.
    ► Image analysis produced single dimension solutions.
    ► The solutions were identical across modalities.
    Keywords: Student rating of instruction; online learning; blended learning; factor analysis; student agency"
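The "single factor accounting for approximately 70% of the variance" finding has a simple mechanical reading: for standardized items, the share of variance on the first component is the correlation matrix's dominant eigenvalue divided by its trace. A stdlib-only sketch using power iteration (illustrative, not the authors' image-analysis procedure):

```python
def first_factor_share(corr):
    """Approximate the dominant eigenvalue of a correlation matrix by
    power iteration, then divide by the trace (= number of variables)
    to get the share of variance on the first component."""
    n = len(corr)
    v = [1.0] * n
    for _ in range(200):
        w = [sum(corr[i][j] * v[j] for j in range(n)) for i in range(n)]
        scale = max(abs(x) for x in w) or 1.0
        v = [x / scale for x in w]
    # Rayleigh quotient of the converged vector approximates the eigenvalue.
    num = sum(v[i] * sum(corr[i][j] * v[j] for j in range(n)) for i in range(n))
    den = sum(x * x for x in v)
    return (num / den) / n
```

For instance, three items that all intercorrelate at r = 0.55 put exactly 70% of their variance on the first component, matching the magnitude the abstract reports.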
SpringerLink - Abstract - Dr. Fox Rocks: Using Data-mining Techniques to Examine Studen...

    Abstract Few traditions in higher education evoke more controversy, ambivalence, criticism, and, at the same time, support than student evaluation of instruction (SEI). Ostensibly, results from these end-of-course survey instruments serve two main functions: they provide instructors with formative input for improving their teaching, and they serve as the basis for summative profiles of professors' effectiveness through the eyes of their students. In the academy, instructor evaluations also can play out in the high-stakes environments of tenure, promotion, and merit salary increases, making this information particularly important to the professional lives of faculty members. At the research level, the volume of the literature for student ratings impresses even the most casual observer with well over 2,000 studies referenced in the Education Resources Information Center (ERIC) alone (Centra, 2003) and an untold number of additional studies published in educational, psychological, psychometric, and discipline-related journals. There have been numerous attempts at summarizing this work (Algozzine et al., 2004; Gump, 2007; Marsh & Roche, 1997; Pounder, 2007; Wachtel, 1998). Student ratings gained such notoriety that in November 1997 the American Psychologist devoted an entire issue to the topic (Greenwald, 1997). The issue included student ratings articles focusing on stability and reliability, validity, dimensionality, usefulness for improving teaching and learning, and sensitivity to biasing factors, such as the Dr. Fox phenomenon that describes eliciting high student ratings with strategies that reflect little or no relationship to effective teaching practice (Ware & Williams, 1975; Williams & Ware, 1976, 1977).