
Home / Groups / Educational Analytics
George Bradford

Analytics: The Widening Divide - 0 views

  •  
    Analytics: The Widening Divide
    By David Kiron, Rebecca Shockley, Nina Kruschwitz, Glenn Finch and Dr. Michael Haydock
    November 7, 2011
    How companies are achieving competitive advantage through analytics

    In this second joint MIT Sloan Management Review and IBM Institute for Business Value study, we see a growing divide between companies that see the value of business analytics and are transforming themselves to take advantage of these newfound opportunities, and those that have yet to embrace it. Using insights gathered from more than 4,500 managers and executives, Analytics: The Widening Divide identifies three key competencies that enable organizations to build competitive advantage using analytics. Further, the study identifies two distinct paths that organizations travel while gaining analytic sophistication, and provides recommendations to accelerate organizations on their own paths to analytic transformation.
George Bradford

Measuring Teacher Effectiveness - DataQualityCampaign.Org - 0 views

  •  
    Measuring Teacher Effectiveness
    Significant State Data Capacity is Required to Measure and Improve Teacher Effectiveness
     States Increasingly Focus on Improving Teacher Effectiveness: There is significant activity at the local, state, and federal levels to 
    measure and improve teacher effectiveness, with an unprecedented focus on the use of student achievement as a primary indicator of 
    effectiveness.
    > 23 states require that teacher evaluations include evidence of student learning in the form of student growth and/or value-added data
    (NCTQ, 2011).
    > 17 states and DC have adopted legislation or regulations that specifically require student achievement and/or student growth to
    "significantly" inform or be the primary criterion in teacher evaluations (NCTQ, 2011).
     States Need Significant Data Capacity to Do This Work: These policy changes have significant data implications.
    > The linchpin of all these efforts is that states must reliably link students and teachers in ways that capture the complex connections that 
    exist in schools.
    > If such data is to be used for high-stakes decisions, such as hiring, firing, and tenure, it must be accepted as valid, reliable, and fair.
    > Teacher effectiveness data can be leveraged to target professional development, inform staffing assignments, tailor classroom instruction, 
    reflect on practice, support research, and otherwise support teachers.
     Federal Policies Are Accelerating State and Local Efforts: Federal policies increasingly support states' efforts to use student 
    achievement data to measure teacher effectiveness.
    > Various competitive grant funds, including the Race to the Top grants and the Teacher Incentive Fund, require states to implement teacher 
    and principal evaluation systems that take student data into account. 
    > States applying for NCLB waivers, including the 11 that submitted requests in November 2011, must commit to implementing teacher and 
    principal evaluation and support systems.
    > P
George Bradford

Learning and Knowledge Analytics - Analyzing what can be connected - 0 views

  •  
    Learning and Knowledge Analytics
    Analyzing what can be connected
George Bradford

ScienceDirect - The Internet and Higher Education : A course is a course is a course: F... - 0 views

  •  
    "Abstract
    The authors compared the underlying student response patterns to an end-of-course rating instrument for large student samples in online, blended and face-to-face courses. For each modality, the solution produced a single factor that accounted for approximately 70% of the variance. The correlations among the factors across the class formats showed that they were identical. The authors concluded that course modality does not impact the dimensionality by which students evaluate their course experiences. The inability to verify multiple dimensions for student evaluation of instruction implies that the boundaries of a typical course are beginning to dissipate. As a result, the authors concluded that end-of-course evaluations now involve a much more complex network of
    interactions.

    Highlights
    ► The study models student satisfaction in the online, blended, and face-to-face course modalities.
    ► The course models vary technology involvement.
    ► Image analysis produced single dimension solutions.
    ► The solutions were identical across modalities.

    Keywords:
    Student rating of instruction; online learning;
    blended learning; factor analysis; student agency"
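The single-factor finding above can be illustrated with a minimal sketch using simulated data (the study's actual instrument, items, and responses are not reproduced here; the item count, loadings, and noise level below are assumptions). One latent satisfaction factor drives all rating items, and the first eigenvalue of the item correlation matrix then captures most of the variance:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate 500 students answering 8 rating items driven by one
# latent "overall satisfaction" factor plus item-specific noise.
n_students, n_items = 500, 8
latent = rng.normal(size=(n_students, 1))            # one underlying factor
loadings = rng.uniform(0.7, 0.9, size=(1, n_items))  # illustrative loadings
noise = rng.normal(scale=0.6, size=(n_students, n_items))
ratings = latent @ loadings + noise

# Principal-components view of the item correlation matrix: the share
# of variance on the first eigenvalue approximates the "single factor
# accounting for ~70% of the variance" pattern reported in the study.
corr = np.corrcoef(ratings, rowvar=False)
eigvals = np.linalg.eigvalsh(corr)[::-1]             # descending order
explained = eigvals / eigvals.sum()

print(f"first-component share of variance: {explained[0]:.2f}")
```

With these assumed loadings and noise, the first component dominates and the remaining eigenvalues are small, mirroring the single-dimension solutions the authors describe.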
George Bradford

SpringerLink - Abstract - Dr. Fox Rocks: Using Data-mining Techniques to Examine Studen... - 0 views

  •  
    Abstract
    Few traditions in higher education evoke more controversy, ambivalence, criticism, and, at the same time, support than student evaluation of instruction (SEI). Ostensibly, results from these end-of-course survey instruments serve two main functions: they provide instructors with formative input for improving their teaching, and they serve as the basis for summative profiles of professors' effectiveness through the eyes of their students. In the academy, instructor evaluations also can play out in the high-stakes environments of tenure, promotion, and merit salary increases, making this information particularly important to the professional lives of faculty members. At the research level, the volume of the literature for student ratings impresses even the most casual observer with well over 2,000 studies referenced in the Education Resources Information Center (ERIC) alone (Centra, 2003) and an untold number of additional studies published in educational, psychological, psychometric, and discipline-related journals.
    There have been numerous attempts at summarizing this work (Algozzine et al., 2004; Gump, 2007; Marsh & Roche, 1997; Pounder, 2007; Wachtel, 1998). Student ratings gained such notoriety that in November 1997 the American Psychologist devoted an entire issue to the topic (Greenwald, 1997). The issue included student ratings articles focusing on stability and reliability, validity, dimensionality, usefulness for improving teaching and learning, and sensitivity to biasing factors, such as the Dr. Fox phenomenon that describes eliciting high student ratings with strategies that reflect little or no relationship to effective teaching practice (Ware & Williams, 1975; Williams & Ware, 1976, 1977).
George Bradford

Many Eyes - 0 views

  •  
    Try it yourself:

    Explore
    :: Visualizations
    :: Data sets
    :: Comments
    :: Topic centers

    Participate
    :: Create a visualization
    :: Upload a data set
    :: Create a topic center

    Learn more
    :: Quick start
    :: Visualization types
    :: About Many Eyes
    :: Privacy
    :: Blog
George Bradford

Many Eyes : Browsing visualizations - 0 views

  •  
    A listing of visualizations of data. Many Eyes is an experiment by IBM Research and the IBM Cognos group.
George Bradford

Learning Dispositions and Transferable Competencies: Pedagogy, Modelling, and Learning ... - 0 views

  •  
    Simon Buckingham Shum
    Ruth Deakin Crick
    2012 (In review)

    Theoretical and empirical evidence in the learning sciences substantiates the view that deep engagement in learning is a function of a combination of learners' dispositions, values, attitudes and skills. When these are fragile, learners struggle to achieve their potential in conventional assessments, and critically, are not prepared for the novelty and complexity of the challenges they will meet in the workplace, and the many other spheres of life which require personal qualities such as resilience, critical thinking and collaboration skills. To date, the learning analytics research and development communities have not addressed how these complex concepts can be modelled and analysed. We report progress in the design and implementation of learning analytics based on an empirically validated multidimensional construct termed "learning power". We describe a learning analytics infrastructure for gathering data at scale, managing stakeholder permissions, the range of analytics that it supports from real time summaries to exploratory research, and a particular visual analytic which has been shown to have demonstrable impact on learners. We conclude by summarising the ongoing research and development programme.
George Bradford

Stunning Infographics and Data Visualization - Noupe - 1 views

  •  
    Creating an effective infographic requires both artistic sense and a clear vision of what to tell the audience. The following are some cool infographics we have collected. Some are colorful, some are simple, but all are informative and visually pleasing. Not only do they provide information in a format that is easy to understand, but they are also artistic creations in their own right.
George Bradford

Eric Blue's Blog » Dataesthetics: The Power and Beauty of Data Visualization - 0 views

  •  
    One of my areas of interest that has grown over the last couple years has been data visualization. I'm a visually-oriented learner, and I look forward to seeing any techniques, illustrations, or technologies that:

    1) Allow people to assimilate information as fast as possible.

    2) Deepen understanding of knowledge by visually illustrating data in new and interesting ways. There is nothing like having an intellectual epiphany after looking at a picture for a few seconds (pictures can definitely be worth a thousand words).

    3) Present information in an aesthetically pleasing way. Or, in extreme examples, inspire a sense of awe!
George Bradford

Data Visualization: Modern Approaches - Smashing Magazine | Smashing Magazine - 0 views

  • 2. Displaying News
  • Digg Stack 15: Digg stories arrange themselves as a stack as users digg them. The more diggs a story gets, the larger the stack.
  • Let’s take a look at the most interesting modern approaches to data visualization as well as related articles, resources and tools.
  • ...6 more annotations...
  • an application that visually reflects the constantly changing landscape of the Google News news aggregator. The size of data blocks is defined by their popularity at the moment.
  • a typographic book search, collects the information from Amazon and presents it in the form of the keyword you've provided.
  • uses visual hills (spikes) to emphasize the density of American population in its map.
  • lets you explore the behavior of your visitors with a heat map. More popular sections, which are clicked more often, are highlighted as “warm” – in red color.
  • Eric Blue provides some references to unusual Data Visualization methods.
  •  
    Data presentation can be beautiful, elegant and descriptive. There is a variety of conventional ways to visualize data - tables, histograms, pie charts and bar graphs are being used every day, in every project and on every possible occasion. However, to convey a message to your readers effectively, sometimes you need more than just a simple pie chart of your results. In fact, there are much better, profound, creative and absolutely fascinating ways to visualize data. Many of them might become ubiquitous in the next few years.
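The click heat map mentioned above (popular, frequently clicked regions rendered "warm") can be sketched with hypothetical data: bin simulated click coordinates into a grid and treat dense cells as the warm regions an overlay would shade red. The page size, click distribution, and grid resolution here are all assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical click coordinates on a 1024x768 page: a popular
# navigation area near the top-left attracts most of the clicks,
# with some background clicks spread uniformly across the page.
hot = rng.normal(loc=(200, 150), scale=40, size=(800, 2))
background = rng.uniform(low=(0, 0), high=(1024, 768), size=(200, 2))
clicks = np.vstack([hot, background])

# Bin clicks into a coarse grid; denser cells are the "warm" regions
# a heat-map overlay would highlight in red.
heat, _, _ = np.histogram2d(
    clicks[:, 0], clicks[:, 1],
    bins=(16, 12), range=[[0, 1024], [0, 768]],
)
hottest_cell = np.unravel_index(np.argmax(heat), heat.shape)
print("hottest grid cell (x_bin, y_bin):", hottest_cell)
```

A real tool would colour each cell by its count; the hottest cell lands where the simulated popular region sits.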
George Bradford

50 most stunning examples of data visualization and infographics | Richworks - 0 views

  • The terms data visualization and infographics are often used interchangeably; the former means the study of the visual representation of data, while the latter is the representation itself.
  • 42) Geological Time Spiral
  • 40) Map of online communities
  • ...4 more annotations...
  • 44) Global distribution of water?
  • 43) 1 hour in front of the TV
  • 36) Evolution of Storage
  • 33) The Life of a web article
  •  
    50 MOST STUNNING EXAMPLES OF DATA VISUALIZATION AND INFOGRAPHICS
    Posted by Richie on Thursday, April 15, 2010
    "A picture is worth a thousand words", if I had a penny for every time I heard that!! There is so much data in the world today that it has become impossible for us to analyze them with patience. Data as we perceive it, need not be boring, bland and cumbersome to remember. To make complex things seem simple, is Creativity and using pictures to represent data has been an age old method to analyze data in a fun way.
    From navigating the web in an entirely new dimension to understanding how the human brain works; from peeking into how Google has evolved to analyzing the inner working of the geeky mind, Infographics has completely changed the way we view content and visualize data.
George Bradford

Information graphics - Wikipedia, the free encyclopedia - 0 views

  •  
    Information graphics or infographics are graphic visual representations of information, data or knowledge. These graphics present complex information quickly and clearly,[1] such as in signs, maps, journalism, technical writing, and education. With an information graphic, computer scientists, mathematicians, and statisticians develop and communicate concepts using a single symbol to process information.
George Bradford

Learning Analytics: Ascilite 2011 Keynote - 0 views

  •  
    Learning Analytics: Dream, Nightmare, or Fairydust?

    From today's keynote at Ascilite 2011, here's the podcast plus the slides. I am grateful to Gary, Renee and everyone else at Ascilite for their understanding and flexibility, since after months of planning this trip, unfortunately I could not be there in person after my father passed away last weekend.

    For those of you who like to download and watch offline: podcast [Hi-Res version: 93.3Mb] + slides [PPTX/PDF]

    For detailed descriptions of work presented here, see other posts tagged learning analytics and the references below.
George Bradford

QUT | Learning and Teaching Unit | REFRAME - 0 views

  •  
    REFRAME

    REFRAME is a university-wide project reconceptualising QUT's evaluation of learning and teaching.

    REFRAME is fundamentally reconsidering QUT's overall approach to evaluating learning and teaching. Our aim is to develop a sophisticated risk-based system to gather, analyse and respond to data along with a broader set of user-centered resources. The objective is to provide individuals and teams with the tools, support and reporting they need to meaningfully reflect upon, review and improve teaching, student learning and the curriculum. The approach will be informed by feedback from the university community, practices in other institutions and the literature, and will, as far as possible, be 'future-proofed' through awareness of emergent evaluation trends and tools.

    Central to REFRAME is the consideration of the purpose of evaluation and the features that a future approach should consider.
George Bradford

Using Big Data to Predict Online Student Success | Inside Higher Ed - 0 views

  • Researchers have created a database that measures 33 variables for the online coursework of 640,000 students – a whopping 3 million course-level records.
  • Project Participants

    American Public University System

    Community College System of Colorado

    Rio Salado College

    University of Hawaii System

    University of Illinois-Springfield

    University of Phoenix

  • “What the data seem to suggest, however, is that for students who seem to have a high propensity of dropping out of an online course-based program, the fewer courses they take initially, the better-off they are.”
  • ...6 more annotations...
  • Phil Ice, vice president of research and development for the American Public University System and the project’s lead investigator.
  • Predictive Analytics Reporting Framework
  • Rio Salado, for example, has used the database to create a student performance tracking system.
  • The two-year college, which is based in Arizona, has a particularly strong online presence for a community college – 43,000 of its students are enrolled in online programs. The new tracking system allows instructors to see a red, yellow or green light for each student’s performance. And students can see their own tracking lights.
  • It measures student engagement through their Web interactions, how often they look at textbooks and whether they respond to feedback from instructors, all in addition to their performance on coursework.
  • The data set has the potential to give institutions sophisticated information about small subsets of students – such as which academic programs are best suited for a 25-year-old male Latino with strength in mathematics
  •  
    New students are more likely to drop out of online colleges if they take full courseloads than if they enroll part time, according to findings from a research project that is challenging conventional wisdom about student success.
    But perhaps more important than that potentially game-changing nugget, researchers said, is how the project has chipped away at skepticism in higher education about the power of "big data."
    Researchers have created a database that measures 33 variables for the online coursework of 640,000 students - a whopping 3 million course-level records. While the work is far from complete, the variables help track student performance and retention across a broad range of demographic factors. The data can show what works at a specific type of institution, and what doesn't.
    That sort of predictive analytics has long been embraced by corporations, but not so much by the academy.
    The ongoing data-mining effort, which was kicked off last year with a $1 million grant from the Bill and Melinda Gates Foundation, is being led by WCET, the WICHE Cooperative for Educational Technologies.
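The red/yellow/green tracking lights and the course-load finding described above can be sketched as a toy risk flag. The PAR framework's actual 33 variables and model are not public in this excerpt, so the inputs, weights, and thresholds below are purely illustrative assumptions, not the project's real scoring:

```python
import math

def risk_light(courses_first_term: int, logins_per_week: float) -> str:
    """Toy red/yellow/green dropout-risk flag in the spirit of the
    tracking lights described above. Weights and thresholds are
    illustrative assumptions, not the PAR framework's actual model."""
    # Higher initial course load and lower engagement raise risk,
    # echoing the finding that new online students fare better part time.
    z = -1.5 + 0.6 * courses_first_term - 0.4 * logins_per_week
    p = 1.0 / (1.0 + math.exp(-z))   # logistic risk score in (0, 1)
    if p >= 0.6:
        return "red"
    if p >= 0.3:
        return "yellow"
    return "green"

# A part-time, engaged student vs. a full-load, disengaged one.
print(risk_light(2, 5.0))  # green
print(risk_light(5, 1.0))  # red
```

A production system would fit such weights from historical records rather than set them by hand, then surface the resulting light to both instructors and students, as Rio Salado's tracking system does.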
George Bradford

Assessment and Analytics in Institutional Transformation (EDUCAUSE Review) | EDUCAUSE - 0 views

  • At the University of Maryland, Baltimore County (UMBC), we believe that process is an important factor in creating cultural change. We thus approach transformational initiatives by using the same scholarly rigor that we expect of any researcher. This involves (1) reviewing the literature and prior work in the area, (2) identifying critical factors and variables, (3) collecting data associated with these critical factors, (4) using rigorous statistical analysis and modeling of the question and factors, (5) developing hypotheses to influence the critical factors, and (6) collecting data based on the changes and assessing the results.
  • among predominantly white higher education institutions in the United States, UMBC has become the leading producer of African-American bachelor’s degree recipients who go on to earn Ph.D.’s in STEM fields. The program has been recognized by the National Science Foundation and the National Academies as a national model.
  • UMBC has recently begun a major effort focused on the success of transfer students in STEM majors. This effort, with pilot funding from the Bill and Melinda Gates Foundation, will look at how universities can partner with community colleges to prepare their graduates to successfully complete a bachelor’s degree in a STEM field.
  • ...5 more annotations...
  • Too often, IT organizations try to help by providing an analytics “dashboard” designed by a vendor that doesn’t know the institution. As a result, the dashboard indicators don’t focus on those key factors most needed at the institution and quickly become window-dressing.
  • IT organizations can support assessment by showing how data in separate systems can become very useful when captured and correlated. For example, UMBC has spent considerable effort to develop a reporting system based on our learning management system (LMS) data. This effort, led from within the IT organization, has helped the institution find new insights into the way faculty and students are using the LMS and has helped us improve the services we offer. We are now working to integrate this data into our institutional data warehouse and are leveraging access to important demographic data to better assess student risk factors and develop interventions.
  • the purpose of learning analytics is "to observe and understand learning behaviors in order to enable appropriate interventions."
  • the 1st International Conference on Learning Analytics and Knowledge (LAK) was held in Banff, Alberta, Canada, in early 2011 (https://tekri.athabascau.ca/analytics/)
  • At UMBC, we are using analytics and assessment to shine a light on students’ performance and behavior and to support teaching effectiveness. What has made the use of analytics and assessment particularly effective on our campus has been the insistence that all groups—faculty, staff, and students—take ownership of the challenge involving student performance and persistence.
  •  
    Assessment and analytics, supported by information technology, can change institutional culture and drive the transformation in student retention, graduation, and success.

    U.S. higher education has an extraordinary record of accomplishment in preparing students for leadership, in serving as a wellspring of research and creative endeavor, and in providing public service. Despite this success, colleges and universities are facing an unprecedented set of challenges. To maintain the country's global preeminence, those of us in higher education are being called on to expand the number of students we educate, increase the proportion of students in science, technology, engineering, and mathematics (STEM), and address the pervasive and long-standing underrepresentation of minorities who earn college degrees-all at a time when budgets are being reduced and questions about institutional efficiency and effectiveness are being raised.
George Bradford

Seeking Evidence of Impact: Opportunities and Needs (EDUCAUSE Review) | EDUCAUSE - 0 views

  • Conversations with CIOs and other senior IT administrators reveal a keen interest in the results of evaluation in teaching and learning to guide fiscal, policy, and strategic decision-making. Yet those same conversations reveal that this need is not being met.
  • gain a wider and shared understanding of “evidence” and “impact” in teaching and learning
  • establish a community of practice
  • ...11 more annotations...
  • provide professional-development opportunities
  • explore successful institutional and political contexts
  • establish evidence-based practice
  • The most important reason is that in the absence of data, anecdote can become the primary basis for decision-making. Rarely does that work out very well.
  • autocatalytic evaluation process—one that builds its own synergy.
  • We live by three principles: uncollected data cannot be analyzed; the numbers are helped by a brief and coherent summary; and good graphs beat tables every time.
    • Reports and testimonies from faculty and students (57%)
    • Measures of student and faculty satisfaction (50%)
    • Measures of student mastery (learning outcomes) (41%)
    • Changes in faculty teaching practice (35%)
    • Measures of student and faculty engagement (32%)
  • The survey results also indicate a need for support in undertaking impact-evaluation projects.
    • Knowing where to begin to measure the impact of technology-based innovations in teaching and learning
    • Knowing which measurement and evaluation techniques are most appropriate
    • Knowing the most effective way to analyze evidence 
  • The challenge of persuasion is what ELI has been calling the last mile problem. There are two interrelated components to this issue: (1) influencing faculty members to improve instructional practices at the course level, and (2) providing evidence to help inform key strategic decisions at the institutional level.
  • Broadly summarized, our results reveal a disparity between the keen interest in research-based evaluation and the level of resources that are dedicated to it—prompting a grass-roots effort to support this work.
  •  
    The SEI program is working with the teaching and learning community to gather evidence of the impact of instructional innovations and current practices and to help evaluate the results.
    The calls for more accountability in higher education, the shrinking budgets that often force larger class sizes, and the pressures to increase degree-completion rates are all raising the stakes for colleges and universities today, especially with respect to the instructional enterprise. As resources shrink, teaching and learning is becoming the key point of accountability. The evaluation of instructional practice would thus seem to be an obvious response to such pressures, with institutions implementing systematic programs of evaluation in teaching and learning, especially of instructional innovations.